
DVI vs VGA

lilwic

So I have two connectors on the back of my monitor, a DVI and a VGA. I want to know: if I plug my Xbox One into the VGA, will it decrease the resolution compared to plugging it into the DVI? I have two choices: plug the Xbox One into the VGA, or get a DVI splitter and plug it into the DVI. Does VGA make a noticeable impact on resolution? By the way, I have a 20" 1600 x 900 monitor, an HP 2010i.

CPU Ryzen 5 3600 | Motherboard MSI B550 | RAM 32 GB Corsair Vengance Pro | GPU EVGA GTX 980ti | Case Corsair 4000X | Storage Seagate Barracuda 2TB HDD,  Team GX2 1TB SSD | PSU Sentey 850w 80+ Bronze | Cooling AMD Wraith RGB | PCPartPicker URL https://pcpartpicker.com/list/RN7kGc |


DVI !!!

end of discussion.

 

 

The VGA connector itself has no defined resolution limit. The limit comes from the GPU producing the analog signal, which is why spec sheets list a maximum VGA resolution for a GPU; it depends on the RAMDAC speed.

The DVI signal is clearer and you will get a sharper image (on some LCD screens).

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


DVI is digital, VGA is not.

 

VGA will NOT affect the resolution, but it is best to use DVI.

n0ah1897, on 05 Mar 2014 - 2:08 PM, said:  "Computers are like girls. It's whats in the inside that matters.  I don't know about you, but I like my girls like I like my cases. Just as beautiful on the inside as the outside."


The resolution will not change, as your monitor is below both VGA and DVI's maximum resolution. However, the quality of the signal can change with VGA cables, as it is an analogue signal, unlike DVI's digital signal. 

 

If you can, you should use DVI, it's a better interface in general. 


Frankly I would use VGA just so I can leave the DVI port open for other things in the future. (e.g. laptop, pc, etc)

 

As long as the VGA cable isn't too long, the only real difference is how the information is being carried. (Analog Vs. Digital)

//ccap

DVI !!!

end of discussion.

Stupid answer.

 

So I have two connectors on the back of my monitor, a DVI and a VGA. I want to know: if I plug my Xbox One into the VGA, will it decrease the resolution compared to plugging it into the DVI? I have two choices: plug the Xbox One into the VGA, or get a DVI splitter and plug it into the DVI. Does VGA make a noticeable impact on resolution? By the way, I have a 20" 1600 x 900 monitor, an HP 2010i.

VGA is an analog connector that supports Full HD at 60 Hz, so there shouldn't be any problem.

DVI is a newer connector that can be analog or digital, depending on the connector type and the graphics output of the device.

There is DVI-A (analog, essentially the same as VGA), DVI-D (digital, essentially the same as HDMI) and DVI-I, which supports both.

 

DVI is digital, VGA is not.

 

VGA will NOT affect the resolution, but it is best to use DVI.

The resolution will not change, as your monitor is below both VGA and DVI's maximum resolution. However, the quality of the signal can change with VGA cables, as it is an analogue signal, unlike DVI's digital signal. 

 

If you can, you should use DVI, it's a better interface in general.

No, DVI is not necessarily digital. That completely depends on the connector on the monitor and the graphics output of the device.

DVI can carry digital or analog signals (DVI-D, DVI-A and DVI-I).

[Image: DVI connector types — DVI-A, DVI-D and DVI-I pin layouts]

 

 

 

 


VGA is capable of running a 900p monitor just fine. Just use DVI and be done with it.

 

Also, VGA won't decrease the resolution on a 900p monitor Lol

I don't do signatures.


omg just plug any hole and run with it

try it out


 


The VGA connector has no limit.

Everything has a limit.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Nope, but DVI is better.

Why?



Everything has a limit.

well... VGA doesn't have a theoretical limit


 


Why?

DVI can transport both digital and analog signals, but VGA will still work for him.


well... VGA doesn't have a theoretical limit

It has a limit. Otherwise it would have infinite bandwidth, which it doesn't. If your argument is that you can keep increasing the clocks, you can do that with any piece of computer hardware, so by that logic no computer hardware has a limit either.



Actually, he is right in a sense: the more you push the resolution, the more you sacrifice signal reliability (interference increases), and the more you have to cut back on color depth and refresh rate (1-bit color at 0.01 Hz and 5K is technically possible, assuming the GPU allows it). Whether that is something anyone would enjoy is another matter. That is what DXMember meant by 'no limit'.
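To put some rough numbers on that "limit": here's a hypothetical back-of-envelope estimate of the pixel clock a 1600 x 900 @ 60 Hz mode needs. The 25% blanking overhead is an assumed round figure, not a real VESA timing (real modes come from the CVT/GTF formulas), so treat this as a sketch:

```python
# Rough pixel-clock estimate for a display mode.
# The 25% blanking overhead is an assumed round number, not a real VESA timing.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Approximate pixel clock in MHz, including blanking intervals."""
    return width * height * refresh_hz * blanking_overhead / 1e6

print(f"1600x900@60: ~{pixel_clock_mhz(1600, 900, 60):.0f} MHz")  # ~108 MHz
```

That works out to roughly 108 MHz, comfortably under single-link DVI's 165 MHz pixel clock ceiling, and well within what a decent VGA cable can carry cleanly over a short run.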


In short: it doesn't really matter which one you use. I used a VGA converter with my Xbox 360 once and it was completely fine, so use whatever seems easiest for you! Like CCap mentioned:

 

Frankly I would use VGA just so I can leave the DVI port open for other things in the future. (e.g. laptop, pc, etc)

I would personally go for VGA on your Xbox, so you can save the DVI input for something better suited to it, because frankly it really won't matter.

