
Difference in quality between VGA and DVI/HDMI

mougi

I primarily use a VGA input to my monitor. However, out of curiosity, and having been told that a digital input (particularly in the form of DVI) offers a better quality picture, I tried it. I found the difference to be indiscernible.

 

What I'm wondering is whether there is supposed to be a noticeable difference or not.

 


There may or may not be, but I personally have seen quite large differences on multiple monitors, especially with their color reproduction.


Emmm... day and night. Maybe you can't see the difference because your monitor is old and has a low resolution. Also, HDMI and especially dual-link DVI can reach higher refresh rates than VGA. Maybe your monitor doesn't have dual-link DVI but instead has DVI-A, which is electrically the same as VGA, or DVI-I, which carries both analog and digital signals (its digital side matches DVI-D, but only if you actually use a digital cable).

CPU: AMD FX-6100 Black Edition @ 3.9GHz GPU: XFX 7970 DD (1062/1520 MHz) MOBO: ASUS Sabertooth 990FXA (1st Revision) RAM: Corsair Vengeance 16GB @ 1333MHz Storage: Corsair Force 3 120GB (Boot) + WD Green 1TB (Storage) PSU: FSP AURUM 600W (80+ Gold) CPU Cooler: Cryorig M9a Case: NZXT Tempest 410 Elite (Mid-Tower) Mouse: Logitech G602 (Manufacturer Refurbished) Keyboard: Noppoo Choc Mini (Cherry MX Blue) Audio: Sennheiser HD 598 + ASUS Xonar DGX Monitor: LG M2280DF 21.5" 1080p (TN, 75Hz)


Image quality aside, I've had too many problems with VGA to go back... things turning funny colors, etc. Since it's susceptible to interference, it relies on the shielding of the cable to get a clear signal. If you don't have a good cable, the picture suffers, and it degrades further the more wireless devices you have around it. Digital standards are much less trouble.


How about DVI vs. DisplayPort?

CPU: Intel i7 4770K @ 4.4GHz | Case: Corsair 350D | Motherboard: Z87 Gryphon | RAM: Dominator Platinum 4x4 1866 | Video Card: SLI GTX 980 Ti | Power Supply: Seasonic 1000 Platinum | Monitor: Acer XB270HU | Keyboard: RK-9100 | Mouse: R.A.T. 7 | Headset: HD 8 DJ | Watercooled


Mate, DVI-D is so much nicer on my monitor in comparison to VGA! It's kind of crazy!

 


What do people even put in these things?


Linus has an error in his video.

DisplayPort can be converted into VGA, single-link DVI, and HDMI, much like DVI can.

The adapter that he got is for converting DisplayPort to dual-link DVI. The reason for this is that dual-link DVI is actually a hack done to support higher resolutions (or faster refresh rates) than normal DVI allows. It was done because DisplayPort and HDMI didn't exist back then. What it means is that dual-link DVI, as the name says, is really like having 2x DVI in 1 plug. So the adapter box, an active adapter, needs to take the DisplayPort signal and transform it into dual-link DVI for the monitor to process.
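To see why the dual-link hack was needed, here's a rough back-of-the-envelope check (a minimal sketch: the 165 MHz figure is single-link DVI's TMDS pixel-clock cap, and the blanking overheads are loose approximations in the spirit of reduced-blanking timings, not exact standard values):

```python
# Rough check of which display modes exceed single-link DVI's pixel clock.
SINGLE_LINK_DVI_MAX_MHZ = 165  # single-link TMDS pixel-clock limit

def pixel_clock_mhz(width, height, refresh_hz, h_blank=160, v_blank=46):
    """Estimate a mode's pixel clock, including approximate blanking."""
    total_pixels = (width + h_blank) * (height + v_blank)
    return total_pixels * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1600, 60), (1920, 1080, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = ("single-link is enough" if clk <= SINGLE_LINK_DVI_MAX_MHZ
               else "needs dual-link (or HDMI/DisplayPort)")
    print(f"{w}x{h}@{hz}Hz -> ~{clk:.0f} MHz: {verdict}")
```

1920x1080@60Hz fits in a single link (~141 MHz), but 2560x1600@60Hz (~269 MHz) and 1080p at 120Hz (~281 MHz) both blow past the cap, which is exactly what the second link is for.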

As most 2560x1440, 2560x1600, and 120Hz monitors already have DisplayPort, it is not an issue: you just use DisplayPort and call it a day.

Also, VGA has numerous downsides which Linus forgot to mention:

-> VGA doesn't carry timing information alongside the picture, so the monitor has to guess the resolution and image placement... that is why you have the "Auto-adjust" system on VGA. DVI, HDMI and DisplayPort allow the graphics card to read the monitor's specs (its EDID) and adjust the image output correctly (correct resolution, knowing all the supported resolutions, and so on); see the sketch after this list.

-> VGA is prone to interference. In a rural area, a crappy VGA cable will look just as good as DVI, especially at low resolutions. But go downtown, and that same cable will most likely give you a blurry image or visible static (mostly visible on solid colors), and the picture won't be great.
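To make the EDID point concrete, here's a minimal sketch of reading what a monitor reports back over a digital connection. It assumes a Linux machine that exposes connectors under /sys/class/drm (paths and connector names vary by driver); the parsing follows the standard EDID layout:

```python
# Read each connector's EDID and pull out the vendor and native mode.
import glob
import struct

EDID_HEADER = b"\x00\xff\xff\xff\xff\xff\xff\x00"

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        continue  # nothing connected, or no EDID reported
    # Bytes 8-9 pack the manufacturer ID as three 5-bit letters ('A' = 1).
    raw = struct.unpack(">H", edid[8:10])[0]
    vendor = "".join(chr(((raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # The first detailed timing descriptor (from byte 54) holds the native
    # mode; active pixel counts are split into a low byte and a high nibble.
    h = edid[56] | ((edid[58] & 0xF0) << 4)
    v = edid[59] | ((edid[61] & 0xF0) << 4)
    print(f"{path}: vendor {vendor}, native mode {h}x{v}")
```

Without that channel the monitor is left guessing at timings, which is the point of the first bullet above.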


LCD monitors are digital devices: the resolution is fixed, and they work best with a digital input at their native resolution. Any analog input must be converted to digital before being displayed.
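As a toy illustration of why a fixed pixel grid looks soft away from its native resolution (pure illustration with made-up numbers, not how any particular scaler chip works):

```python
# A sharp one-pixel feature smears across its neighbours once a row has to
# be resampled onto a grid of a different size.
def scale_line(src, dst_len):
    """Linearly interpolate a 1-D row of pixel values to dst_len samples."""
    out = []
    for i in range(dst_len):
        pos = i * (len(src) - 1) / (dst_len - 1)
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

row = [255] * 9   # white row...
row[4] = 0        # ...with a single sharp black pixel
print([round(p) for p in scale_line(row, 14)])  # the black pixel smears into several grey ones
```

At native resolution every source pixel maps 1:1 onto a physical pixel and stays sharp; at anything else the scaler has to blend, which is also what happens when a VGA signal is sampled slightly off.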

 

CRT monitors need an analog input; LCDs natively need a digital one, but they can accept analog input if they contain an ADC to convert the analog signal (VGA/component) to digital before displaying it.

 

VGA - Analog

DVI - Digital

