
AMD R9-290X no DVI-I to VGA? NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO

lhikary
Solved by lhikary

ONLY REASON I CAN SPEND $600 ON A GPU IS BECAUSE I WORKED 3 MONTHS TO GATHER THE BLOODY MONEY

SO IT SHOULD TAKE YOU HALF A MONTH TO BUY A $100 1080P DISPLAY THAT HAS DVI.

SUCK IT UP. I CAN USE CAPS TOO! 

I .. AM A MAN!


† Christian Member †

For my pertinent links to guides, reviews, and anything similar, go here and look under the spoiler labeled as such. A brief history of Unix and its relation to OS X by Builder.

 

 


SO IT SHOULD TAKE YOU HALF A MONTH TO BUY A $100 1080P DISPLAY THAT HAS DVI.

SUCK IT UP. I CAN USE CAPS TOO! 

I .. AM A MAN!

Harsh but true

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7-5820K with Corsair H110 | 32GB DDR4 RAM @ 1600 MHz | XFX Radeon R9 290 @ 1.2 GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM too | 1.3TB HDD space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


Real programmers don't document, if it was hard to write, it should be hard to understand.
I've learned that something constructive comes from every defeat.


SO IT SHOULD TAKE YOU HALF A MONTH TO BUY A $100 1080P DISPLAY THAT HAS DVI.

SUCK IT UP. I CAN USE CAPS TOO! 

I .. AM A MAN!


I CAN USE CAPS AS WELL :D

/request close

Real programmers don't document, if it was hard to write, it should be hard to understand.
I've learned that something constructive comes from every defeat.


Who even uses VGA nowadays... :rolleyes:

| Corsair 900D | i7-4770K | ASUS Maximus VI Formula | SLI ASUS Geforce GTX780 | EK & Alphacool WaterCooling | 


| Corsair Vengeance Pro Red 32GB | | Two Corsair Neutron GTX Series 240GB SSD | Corsair AX1200i |


| Western Digital Caviar Black 1TB | Two Seagate Barracuda 4TB | Windows 8 Pro 64 Bit |
http://linustechtips.com/main/topic/189392-corsair-900d-sli-watercooled-refresh-and-new-setup/#entry2550514


Usually old LCD monitors like mine still have both a DVI port and a VGA port. I think only CRT monitors are limited to VGA alone, so it isn't necessary to have VGA now, IMO.

Hello and Welcome to LTT Forum!


If you are a new member, please read the rules located in "Forum News and Info". Thanks!  :)


Linus Tech Tips Forum Code of Conduct           FAQ           Privacy Policy & Legal Disclaimer


I don't really mind this; VGA is useless for a high-end card like that. If you are buying a high-end card and don't have a DVI-capable monitor, then you should get the monitor first.

-Corsair 750D - MSI MPower Z77 - 3570K @ 4.2 - Corsair H60 - 8GB Corsair Vengeance 2133 MHz - Corsair RM850 - MSI 780 Lightning - 256GB SSD - 2x 1TB HDDs - LG 23" Flatron - Razer DeathAdder - Ducky Shine 3 Cherry MX Blue - Logitech Z506 5.1-

Project - Bloody Lightning


I bought a new monitor when I bought a new GPU, and it wasn't a $600 card; I just didn't want to use VGA anyway...


Unless your motherboard supports Lucid Virtu, allowing the video output to switch from your discrete GPU to your APU.

 

The colours on my monitor are much richer over a DVI cable (yes, cable quality may be a factor), but having tested the other VGA cables available to me, I find that VGA sucks for graphic design, especially when your project involves a lot of rich colours.

 

It makes sense for AMD to remove the analogue signal on a premium high-end card. I wouldn't mind seeing AMD remove DVI entirely and opt for more modern connectors like full-size HDMI and/or DP.

THAT I would mind, because most people still have HDMI 1.4 or lower, which can't support more than 60 Hz as far as I know...

And DP cabling is expensive as hell...

Tor
Corsair Obsidian 650D - Intel 4770K CPU - Gigabyte G1 Sniper 5 - ASUS GTX 780 DirectCU II - Kingston HyperX Beast 16 GB RAM - Corsair AX 1200i PSU - Samsung EVO drive 750 GB - Corsair AF series 120mm fans - Corsair H100i - Razer BlackWidow Ultimate 2013 edition - Razer Ouroboros - Razer Manticor - Windows 7 - Beyerdynamic MMX 300


On a more trivial note: Welcome to the forums :D

Tor
Corsair Obsidian 650D - Intel 4770K CPU - Gigabyte G1 Sniper 5 - ASUS GTX 780 DirectCU II - Kingston HyperX Beast 16 GB RAM - Corsair AX 1200i PSU - Samsung EVO drive 750 GB - Corsair AF series 120mm fans - Corsair H100i - Razer BlackWidow Ultimate 2013 edition - Razer Ouroboros - Razer Manticor - Windows 7 - Beyerdynamic MMX 300


About time they got rid of ancient VGA connectivity. Technology should move forward not back. :P

 

 


Why not? There isn't any difference in image quality, and unless you want to run 2560x1600 or higher, there is no need for DVI, since VGA allows resolutions up to 2048x1536.

The only reason I can see is that DVI can carry sound and is needed to watch Blu-rays because it supports HDCP (which is pretty much the only reason I bought a DVI cable, since my monitor shipped with only a VGA cable and I couldn't watch Blu-rays with that).

Not quite; VGA is only limited by the RAMDAC on GPUs (99% of which are 400 MHz) and the horizontal/vertical scanning frequency of your CRT. I, for example, can run 2560x1600 @ 68 Hz on my CRT and it still looks OK.
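The 400 MHz RAMDAC claim above can be sanity-checked with a quick pixel-clock estimate. This is a rough sketch, assuming about 25% blanking overhead as a ballpark for GTF/CVT-style timings, not an exact standard calculation:

```python
# Rough pixel-clock estimate for an analog (VGA) display mode.
# Assumption: ~25% blanking overhead, a ballpark figure for
# GTF/CVT-style timings, not an exact standard computation.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock (MHz) needed to drive a mode."""
    active_pixels_per_sec = width * height * refresh_hz
    return active_pixels_per_sec * (1 + blanking_overhead) / 1e6

# 2560x1600 @ 68 Hz, the CRT mode mentioned above:
print(f"{pixel_clock_mhz(2560, 1600, 68):.0f} MHz")   # prints "348 MHz"

# 2048x1536 @ 85 Hz, the "VGA maximum" quoted earlier:
print(f"{pixel_clock_mhz(2048, 1536, 85):.0f} MHz")   # prints "334 MHz"
```

Both modes land comfortably under a 400 MHz RAMDAC, which is consistent with the poster's experience that the DAC, not the VGA connector itself, sets the practical ceiling.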

 

The removal of analogue support kinda sucks. I don't know how much space it needs on the PCB or how expensive it is (probably not much; GPU manufacturers don't pay much attention to their RAMDACs anyway).

I recently got a CRT and it is still the superior tech for motion (together with plasma, although that is not really usable as a monitor). I'm so mad that everything good about CRT got thrown out the window for LCDs, which sucked for a long time and were overpriced like hell, but everyone was like "look at my LCD (with 0.5 cd/m² blacks, bad viewing angles, 10 ms input lag and 20 ms response time), so light and thin; black levels? wtf are you talking about, look how thin it is".

LCD has come a long way, but after 10 years we still have 60 Hz and 1080p because consumers don't give a damn (and that is OK for the most part, but the fact that something like the LightBoost hack only exists because of 3D, and not because manufacturers actually tried to solve the sample-and-hold blur of LCDs, is so... depressing). And companies like BenQ sell 120 Hz panels for $300+ as if that is justified in any way. I'm glad we now have those Korean PLS 1440p monitors (overclockable to 100-120 Hz) for around $350, which show that a lot more is possible for a lot less money.

 

Uhm, so I kinda digressed, where was I... right: having VGA (even if only for a secondary screen that still works fine and can be used) is nice, and it's nice for the tiny minority that uses CRTs too; now those people have to buy something like an HDFury, which isn't exactly cheap. Not such a great move as some people here say it is, IMHO; heck, having a PS/2 port for easy n-key rollover is nice to have too.


Both of my 1920x1080 monitors are VGA, one using a DVI-I to VGA adapter and one using DisplayPort to VGA (you can get the adapter for £10 on eBay). I don't get what the fuss is over DVI and HDMI; my brother's monitor is a DVI version of mine and I actually find everything looks less crisp on his.

CPU: Intel Core i7-4790K @ 4.7 1.3v with a Corsair H80 w/ dual SP120s - Motherboard: MSI Z97 Gaming 5 - RAM: 4x4 G.Skill Ripjaws X @ 1600 - GPU: Dual PowerColor R9 290 - SSD: Samsung NVMe SM951 256GB - PSU: Corsair RM 1000 - Case: NZXT H440 Black/Red - Keyboard: Cooler Master CM Storm Quickfire TK, Cherry MX Blues - Mouse: Logitech G502 - Headphones: Beyerdynamic DT 770 - Monitors: 3x VE248H Eyefinity 1080p - Phone: iPhone 6S Plus               Please post your specifications in your post, signature or, even better, the system page on your profile!


If people were planning to use VGA with this card, they should feel ashamed of themselves.

I think this is a step in the right direction; we should be looking to future technologies, not the past.

Please use some sense before replying to me. This is not a personal attack so don't take it like one.

GW2: Vettexl.9726


People wouldn't put cheap headphones/headsets on a quality sound card (which these days is only ~$100),

 

but this is an issue for a ~$600 card?

 

:rolleyes:


I've got a BenQ monitor with DVI that I've had since 2002. There is no reason to have a card like that attached to a monitor that doesn't support digital input.

- Silverstone TJ08B-E - Gigabyte Z87M-D3H - i7 4770k @ 4.0GHZ 1.2v - 16gb Kingston HyperX Black 1600 - Gigabyte GTX 770 OC 4GB -


- Silverstone Fortress FT02 - MSI Z77 Mpower - i5 3570k @ 4.0GHZ 1.09v - 8gb Mushkin Blackline 1600 - MSI GTX 670 PE -


- Lenovo T430 (1600x900) - i5 3210m - 8GB DDR3 1333 - nVidia NVS5400M - 256GB mSATA OS - 320GB HDD-


So let me get this straight... you want to get a TOP-of-the-line enthusiast GPU when your monitor can't even support DVI?! Priorities, man.

CPU: i7-6700K @ 4.6 GHz | CASE: Corsair 780T White Edition | MB: Asus Z170 Deluxe | CPU Cooling: EK Predator 360 | GPU: NVIDIA Titan X Pascal w/ EKWB nickel waterblock | PSU: EVGA 850W P2 | RAM: 16GB DDR4 Corsair Dominator Platinum 2800 MHz | Storage: Samsung 850 EVO 500GB | OS: Win 10 Pro x64 | Monitor: Acer Predator X34/HTC Vive | Keyboard: CM Storm Trigger-Z | Mouse: Razer Taipan | Sound: Audio-Technica ATH-M50x / Klipsch ProMedia 2.1 Sound System

 


Ehh, you do know you can just wait for another vendor to make one that works with a VGA adapter?

I'm pretty sure the third-party manufacturers would need to make a custom PCB design for this to happen... ASUS might do it.

Everyone have a cool signature. I don't, so I thought I would write something.

- Cool right?


If you buy an R9-290X and use VGA, Linus will come and tell you a little story... :P

A little story?! He will have a full-on rant.


Why not? There isn't any difference in image quality, and unless you want to run 2560x1600 or higher, there is no need for DVI, since VGA allows resolutions up to 2048x1536.

The only reason I can see is that DVI can carry sound and is needed to watch BluRays because it supports HDCP (which is pretty much the only reason I bought a DVI cable, since my monitor shipped with only a VGA cable and I couldn't watch BluRays with that).

1. DVI doesn't support sound.

2. VGA quality IS worse, but not for the reasons you'd think. Given a good VGA monitor (of which there haven't been any for the past 7 years), quality isn't an issue. BUT monitors these days have horrible VGA converters in them, making quality bad. VGA is basically digital nowadays, with only the cable being analogue.

Finally my Santa hat doesn't look out of place


To be honest, who the f*ck would spend so much money on this kind of GPU and use VGA? WTF xD

I have a cheap monitor as a secondary monitor. It sucks if I need to upgrade a monitor just to read text/browse the internet, but I'm *kind of* glad for things like this (forcing the market/technology to advance rather than being limited by old standards).

Build Logs

 

Project "Oh No" | Ice Pack | Secondhand | MicroP

 


A little story?! He will have a full-on rant.

I don't know why Linus hates it, I mean,

I use it,

it runs at 1080p, so what's the problem?

If your grave doesn't say "rest in peace" on it, you are automatically drafted into the skeleton war.


Yes, I'm sure. It's possible that it's a European model; it actually has a sticker that says DVI right next to the VGA port, but there is no port. I even tried to disassemble it to see if I could solder on a connector...

 

What brand is it?


Lol, what.

It's like complaining that you can't put it in an AGP slot; one does not simply buy a high-end GPU and use it with an ancient monitor!


This topic is now closed to further replies.

