griffen

asked on

DVI vs. VGA - recommendation

I currently have a home-built PC with a mid-line graphics card including a DVI out.  I also have a brand new Mac Mini (DVI out is standard).  My monitor is an old, but very nice, 17" Sony CRT.  I'm expecting to want to upgrade to a 17 or 19" LCD in the next year or two, but for now, I have the CRT.  I want to be able to control the two computers using a KVM switch.  I have priced KVM switches capable of using DVI and USB and have also priced those for VGA and USB.  The price difference is very significant to me.  I don't play games on my computer, but I do watch video and have a large collection of digital photos.  Assuming my new LCD has DVI support, am I going to notice enough of a difference to make it worthwhile to purchase a KVM with DVI capability, or can I save my money and get the VGA switch for use with the VGA connections?
Callandor

Go with VGA if the price is a consideration.  At desktop screen sizes, the differences are extremely small; only a projected image or a long cable run would show a clear advantage to DVI.
snerkel

Another option: if you get an LCD monitor with DVI, it will generally also have a VGA input. Most monitors will let you switch the displayed source at the press of a button; my own Dell UltraSharp is equipped with both, so I use DVI for my main PC and VGA for my second PC.

OK, I have to switch both my KVM and my monitor, but the quality of DVI makes this well worthwhile.

For those kinds of uses I would recommend staying with VGA.
DVI is a true digital connection, whereas with VGA the signal is converted from digital to analogue (because the VGA cable carries an analogue signal) and then reconverted to digital again at the monitor. There won't be a noticeable quality loss in most cases; unless you're looking for the ultimate visual experience usually seen in advanced DVD home theater/projector setups, I wouldn't sweat it. I do part-time graphics design and am more than happy with my 21" VGA setup.

You should see my home theater, though :D  That's a different story. I used DVI from the computer to my DLP projector. Very sharp image, but expensive!

Oh, and KVM switches are notorious for degrading quality. I would never use DVI with any kind of switch; it kind of defeats the whole purpose. Cable length does matter with DVI: the shorter, the better.
Callandor is right.  If your screen is smaller than 24" at 1920x1080, then you should opt for VGA; DVI won't give you anything more at that size.  Also, DVI has over/underscan issues with some broadcasts and SDTV recordings, which are annoying.

On one of my projectors, I use:
DVI for my HDTV and EDTV
VGA for my PC
Component for my XBOX and SDTV
S-Video for my DVD

I'd go with DVI, but the extent of the difference is variable; sometimes it's huge, sometimes it's near zero.

There are two factors that account for most of the difference between DVI and VGA:

The first difference is that with an analog (VGA) connection, the monitor has to "sample" the analog signal and convert it back to digital.  This sampling is controlled by a "dot clock" in the monitor.  The problem is getting this dot clock in the monitor to "line up", EXACTLY, with the dot clock in the video card of the computer.  In other words, the monitor has to sample the pixel in row 23, column 281, at exactly the same moment that the video card is generating and outputting row 23, column 281.  In order to do this, the monitor provides for adjustment of both the dot clock frequency and the dot clock phase, but it takes a bit of work to get the two dot clocks (in the monitor and in the video card) matched EXACTLY, and if they are not, you get a form of distortion, and sometimes a "shimmering" or "sparkling" of the pixels that can be very annoying (although most untrained users won't know or be able to pinpoint exactly what the distortion is or why the display is "uncomfortable").  And, with some combinations of monitor and video card, you CANNOT get the two dot clocks to match exactly.

The second difference is that the analog signal is subject to "ghosts" and "ringing".  This is almost entirely a cable problem, but many of the cables being used (even those supplied with the monitors) are not that good, and many monitors don't terminate them all that well.  The video cable is carrying square waves with frequencies in excess of 80 MHz, and the transmission line characteristics of such signals are very pronounced, and you will see ghosting and ringing if the lines are of low quality and/or improperly terminated.  This is most visible on fine detail, which happens to include character displays.  And, by the way, the inclusion of things like extension cables and KVM switches will make this form of image degradation MUCH worse.
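(As a rough illustration of the frequencies involved, using approximate VESA-style timings rather than figures from this thread, the analog pixel "dot" clock can be estimated as the total frame size, including blanking, times the refresh rate. A minimal Python sketch:

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) x refresh rate
    return h_total * v_total * refresh_hz / 1e6

# 1024x768 @ 75 Hz, roughly 1312 x 800 total pixels including blanking
print(pixel_clock_mhz(1312, 800, 75))   # ~78.7 MHz
# 1280x1024 @ 60 Hz, roughly 1688 x 1066 total pixels including blanking
print(pixel_clock_mhz(1688, 1066, 60))  # ~108 MHz

Even modest desktop modes put the analog signal near or above the 80 MHz figure mentioned above.)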

The bottom line is that a DVI interface COMPLETELY eliminates both of these issues.  It doesn't make them "better", it makes them "perfect".  Now, to be fair, on a really good analog monitor with a really good cable, with proper adjustment, there may be NO visible difference between that monitor and a DVI monitor.  However, this isn't the normal case; the normal case is that the DVI display is better, and what varies is by how much, anywhere from "not much" to "huge difference, night and day".  But in your case, there is no question that the inclusion of a KVM will make matters worse on the analog side, and, in any case, I spend too much time looking at my display to settle for anything but the best.
ViRoy, DVI is a digital interface.  As long as it works, there can be no degradation, not due to cable length, not due to KVM switches.  In fact (again, as long as it's working), DVI is a way of avoiding the degradation imposed by cables and KVM switches.  If you degrade the signal on a DVI cable enough, the interface will simply stop working.  But you are either going to get a perfect signal, or no signal at all.  That is the beauty of a digital transmission system like DVI.

I use a 50-ft cable on my VGA reverse-KVM switch to connect my server PC to my projector in my bedroom.  At first I had ghosting, but I got a better cable and now I have a crystal clear image both 6 ft from the video card and 50 ft from it.  The better cable was $75 more expensive, to give you some idea where your $$ will go.  If your displays will be 10 ft (3 m) or less from the video card, you won't notice a difference between DVI and VGA with standard cables.
griffen

ASKER

Thank you all for your comments.  All have been helpful, though I have to follow up to some degree:

1) Simkiss, interpreting Callandor, argued that a display under 24" wouldn't benefit from DVI b/c the resolution would be under 1920 x 1080.  I assume this statement relates to the resolution required for an HD signal.  But, given that I currently have a CRT, isn't it already capable of that resolution?  I admit I have never tried to set it that high, as the refresh rate drops off too drastically for my liking well below that, but I wasn't sure that the actual diagonal measurement of the screen mattered for CRT displays.  So, if this is only a concern when I move to an LCD, I was still under the impression that you could get an HD signal on a screen smaller than that.  Am I missing something?

2) Watzman explained that because the DVI signal is digital, there is no degradation due to the addition of a KVM switch.  If that is true, does the quality of the KVM affect the signal?  Why are the switches so expensive if there is no real need for signal processing in the KVM itself?
ASKER CERTIFIED SOLUTION
simkiss
It is very difficult to get resolutions like 1920 x 1080 on any but the best monitors, be they CRT or LCD (the Apple Cinema HD will do it, but it is very expensive), and I have watched 1080i on my Samsung 710N with analog VGA and liked it.  I have found in my use that a VGA KVM will work well with typical desktop setups, but if you are a purist, then what Watzman says about connections and switches degrading the signal applies; it depends on how demanding you are and how much you are willing to spend to get the best.

While I agree that a DVI signal is the best, long DVI cables cost more than long VGA cables, and cheap DVI cables, I have found to my chagrin, are subject to interference.  I have a 30-ft DVI cable that shows "green sparklies" due to inadequate shielding, so even though it's digital, it can suffer from picking up interference.  A good quality shielded cable will not do this, but they cost more.
griffen

ASKER

Really, thank you all very much.  I think Simkiss's response really provided the information I needed.  Basically it sounds like I will be missing out by not going DVI, but given the size of my screen (both now and my likely future LCD of 17 or 19"), chances are the quality difference would not be noticeable enough to justify an expenditure of an additional $100-$300.  Hopefully in the next few years I'll get an HD set and connect the Mac Mini directly to that via DVI, and then have a dedicated DVI connection for my PC/monitor.  When that happens, I won't need a KVM, especially if my Linux box (in production) can use the VGA input on the new LCD and the monitor can take care of the input switching.  Thanks again to all.
>>> I mention 1920x1080 because at, say, 1280x1024 you won't notice a difference between DVI and VGA: the attenuation (loss of signal voltage over distance) on a standard VGA cable is not significant enough to degrade your signal to the point where you could see anything different if the image is only displayed at a size of 19" or 21".  The higher your resolution, though, the more your analog signal is subject to attenuation; also, the longer the cable, the more your analog signal is subject to attenuation.  If both are 6 ft, though, you won't see any difference.
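(A quick sketch of why resolution matters on the analog side, using approximate standard pixel clocks as an assumption: each pixel gets a shorter time slot as resolution rises, so the same cable ringing and attenuation smears a larger fraction of every pixel.

def ns_per_pixel(pixel_clock_mhz):
    # Time available to settle each pixel on the analog cable
    return 1000.0 / pixel_clock_mhz

print(ns_per_pixel(108.0))   # 1280x1024 @ 60 Hz -> ~9.3 ns per pixel
print(ns_per_pixel(148.5))   # 1920x1080 @ 60 Hz -> ~6.7 ns per pixel
)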

I am sorry to disagree, but my 17" Dell is visibly better on DVI; VGA looks good until you see the DVI image.
Well, there are a lot of variables here: video card, resolution, video cable, monitor, condition of eyes, ambient lighting, distance from monitor, and personal finickiness.  What's good for one person is not always true for another, as arguments in home theater forums will demonstrate.

I have to disagree with some of the comments by Simkiss and also (and this is a rarity) Callandor.  I was a product manager and engineer for displays (both CRT and LCD) for 7 years, but I've been involved with displays and video since 1965 (40 years), as in addition to my computer work, I also did television broadcast engineering even as a teenager (I have an FCC license).  The difference between analog and DVI can be very pronounced even at 1024x768, and even with a cable length of 5 feet or less.  Analog quality is critically dependent on the quality of the cable, and also on the adjustment of the dot clock.  Conversely, it is very difficult to get quality degradation on a DVI interface that is continuing to work.  You do not get the same type of "pixelization" with loss of quality in a DVI cable that you get with dropouts in an MPEG data stream (DVD or satellite transmission).  MPEG is compressed, and uses a data stream with "key frames", which are complete, and between them only the changes are transmitted; that is nothing at all like DVI, which is an uncompressed, pure digital transmission of a sequential list of pixels.  Now, as to whether or not you will notice what you are missing, I can't say .... that will depend on your particular hardware, and it also depends on how critical a viewer you are.  But the difference may well end up being very, very substantial.  Even at 1024x768, and even with a cable length of only 5 feet or less.
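(To make the compressed-versus-uncompressed contrast concrete, a back-of-the-envelope sketch, with the bit depth and refresh rate as assumptions: DVI sends every pixel of every frame, while a DVD's MPEG-2 video stream tops out at roughly 9.8 Mbit/s.

def uncompressed_mbit_s(width, height, refresh_hz, bits_per_pixel=24):
    # Raw pixel data rate of an uncompressed digital link (active pixels only)
    return width * height * refresh_hz * bits_per_pixel / 1e6

print(uncompressed_mbit_s(1024, 768, 60))   # ~1130 Mbit/s, vs ~9.8 Mbit/s for DVD MPEG-2
)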
griffen

ASKER

This is why I love this forum.  I appreciate all this additional information even after I accepted an answer.  I think it is an important question on people's minds, so I would like to encourage further discussion; I don't think it is adequately covered on many sites.  Thank you all for sharing your wisdom.  In case you are interested in my decision aside from the present debate, I'm updating below.

__________________

The bottom line answer for me at this point is that I hope the KVM is only a temporary solution, and I simply can't afford a DVI/USB 4-port KVM unless everyone here unanimously told me it was of the utmost importance.  The lowest resolution I run is 1024x768, and I would be disappointed with noticeably poor image quality, but I haven't yet seen a DVI-connected monitor on my system and I am pretty happy with the quality.  As I've indicated, my plan will be as follows:

1) Today - buy a 4-port VGA/USB KVM for use with Windows/Linux/MacOS boxes.

2) 1 year from now - buy a 17" or 19" LCD monitor capable of switching between a DVI and a VGA input to use with my Windows and Linux boxes.  I will then choose one machine (Windows or MacOS) for the DVI interface and use the VGA KVM for the VGA interface.

3) 2 years from now - buy an HD set worthy of envy; connect the MacMini to that set via DVI, connect my Windows box to the DVI input on the monitor and the Linux box (if still running) to the VGA on the monitor.

Still, when I get the LCD with the DVI input, I'll test it out.  If I notice a big difference, I'll chalk up the $83.00 I just spent on my 4-port KVM to experience and buy a new one if need be.  With any luck, the Mac Mini will have made such inroads that KVM prices will drop.

Thank you all again.  
Watzman is correct, but only with shoddy and foreign cables (I have similar credentials :o).  I've been through this.  If you have the standard cables (e.g., the VGA cable that comes with a ViewSonic monitor vs. a cheap cable with an ESA or other no-name monitor), you will not see the difference at all.  I work on 1920x1080 video and I have LCDs, DLP projectors, and CRTs.  I believe you'll be very pleased with the VGA KVM.

Similarly, I use a 30-ft VGA to 5 BNC cable from RAM Electronics with an LCOS front projector on a 100" screen.  HDTV looks like a glass window from 10 feet away.
Isn't that the best?  I always recommend projectors.   I would never buy another TV.
As long as you have a light-controlled environment, projectors are more economical and produce a bigger picture.