
lcd advice

Status
Not open for further replies.

fenix

Technical User
Mar 29, 2001
436
US
Hi,

I'm putting a system together for a friend and wanted to get some advice on LCD monitors.

I'm looking at one that just has an analog input (15 pin), Dell E196FP 19" LCD Monitor. $159 seems to be a pretty good price.

The graphics card in the tower will most likely have both types of output, DVI & VGA.

I believe that LCDs are basically still analog devices (until the new generation of all-digital ones comes out), so is there any performance disadvantage if I just used the VGA out and the VGA input on the monitor, versus getting an LCD monitor with a DVI input and using the DVI output from the card?

Thank you
fenix
 
Performance-wise (as in speed), there is no difference. Image quality-wise, there is a difference. A monitor connected with a high-quality DVI cable, using the DVI interface on both the monitor and the video card, will usually have a better (sharper, more consistent) image than one connected over analog (VGA, D-sub, whatever you want to call it).

Maybe I misunderstand your point, but I'm fairly sure most LCDs actually are digital. The video card in your PC definitely computes the image in a digital format. Regardless of what format the monitor uses, if you use an analog connection, your video card converts the image from digital to analog and then sends it down the cable to the monitor. While the signal is travelling down the 3-foot (or longer) cable, it is susceptible to interference, which can affect image quality. The same interference isn't an issue when you use a digital cable, because the data bits being transferred are either on or off. Interference can't make them be partially on or distorted in the same way an analog signal can.
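To make the point concrete, here's a minimal Python sketch (my own toy model, not how DVI/TMDS actually encodes data) showing why the same amount of cable noise corrupts an analog brightness level but leaves a digitally transmitted one intact: the receiver thresholds each bit at 0.5, so small noise is simply discarded.

```python
import random

random.seed(0)

def transmit_analog(levels, noise=0.05):
    # Analog link: each brightness level (0.0-1.0) travels as a voltage,
    # and whatever noise is picked up stays in the received value.
    return [v + random.uniform(-noise, noise) for v in levels]

def transmit_digital(levels, noise=0.05):
    # Digital link (toy model): each level is sent as 8 separate bits.
    # Noise still hits every bit, but thresholding at 0.5 recovers the
    # original bit exactly, so the value arrives undamaged.
    received = []
    for v in levels:
        bits = [(int(v * 255) >> i) & 1 for i in range(8)]
        noisy = [b + random.uniform(-noise, noise) for b in bits]
        clean = [1 if x > 0.5 else 0 for x in noisy]
        received.append(sum(b << i for i, b in enumerate(clean)) / 255)
    return received

pixels = [0.0, 0.25, 0.5, 0.75, 1.0]
analog = transmit_analog(pixels)    # every value shifted by noise
digital = transmit_digital(pixels)  # values survive (up to 8-bit rounding)
```

The analog values come back slightly wrong on every pixel, while the digital ones match the original to within 8-bit quantization; only when noise exceeds the bit threshold would a digital link start to fail, and then it fails abruptly rather than gradually.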

So my thinking is that DVI is definitely better, but if you're not the sort to be bothered as much by the image quality then it's not really a concern either way. I use DVI on my home systems, but my work systems are analog only, and the difference isn't that big.
 
Thanks so much for your reply and the info; it was very useful.

I might have incorrect information, but as I mentioned in the first post, my understanding is that the current generation of LCD monitors is still basically analog. Even when you use a digital output to the LCD's digital input, there is still a conversion from digital to analog inside the monitor, and a new generation of totally digital monitors that don't need this conversion is about to be released.

Maybe someone else can tell me if what I'm thinking has any validity or if I have this totally misconstrued. I believe I read an article about this a few months ago in either PC World or Wired. I couldn't find any info to back up what I'm saying, so it wouldn't be the first time I had a misconception in the ever-evolving world of technology.
 
I have read in several places that LCD monitors are digital and that including a VGA input means the manufacturer has to add extra A/D converter hardware. In theory this means that monitors with only DVI inputs should be cheaper than ones with VGA, but that doesn't seem to be borne out in practice, as most cheap monitors only have VGA. I think this is because there are still many graphics cards/motherboards out there without DVI outputs, so manufacturers have to include VGA inputs to avoid cutting their customer base - a bit like why you still find parallel and PS/2 ports on most motherboards.

I expect the article you read was referring to a generation of monitors without VGA inputs. It should make them cheaper, but they'll just be normal LCDs without a/d converters.

I have a very high-quality LCD with both VGA and DVI inputs, and for a while I had both connected so that I could switch between the two. There is definitely a quality difference between them but it's very subtle. If I switched to VGA and forgot to switch back, I'd tend to notice a niggling slight lack of focus when doing something high-contrast, such as black text on a light background.

Regards

Nelviticus
 
Either it's the generation of monitors lacking VGA inputs, or they're talking about monitors coming out with a new digital interface (HDCP or something like that) that includes DRM functionality to help "fight piracy". Basically, not only will the signal be digital from end to end, but it will also be encrypted. This is supposed to be a requirement of Vista (at least for HD playback), but I haven't seen anything about it in the betas that I have tried.
 
Thanks again kmc and nelviticus
 
