
Is it possible to use a pre-Pascal GPU as a video DAC?

chickennuggetstyle

I haven't owned a desktop for 3 or 4 years now, but I think I finally hate my laptop enough to make the change. I only ever use the thing at home plugged in to an external display anyways, and it's frankly ugly as all hell, so I'm not gonna be sad to see it go. Hoping to find someone out there who'll swap me for a desktop with comparable specs, but maybe that's wishful thinking.

 

Anyways, the issue I'm posting about is with my CRT monitor (no, I won't get a new monitor), which importantly only takes analog video in. My DisplayPort-to-VGA adapter has never had enough bandwidth for me to run the monitor at its max resolution and max refresh rate at the same time, even though I know the monitor itself is capable of it. But I've heard that one can use an older GPU (with DVI-I) as a sort of passthrough/glorified video DAC.

 

It seems like the obvious solution for me is to stick an old Maxwell card in the build I'm planning. Problem is that I can't find a tutorial anywhere online for setting this configuration up, but I think it should be relatively similar to what LMG did in their "Nvidia Said We Couldn't Game On This Crypto Mining Card..." video.

 

I really don't want to have to buy yet another dongly adapter (and risk it also not having enough bandwidth) if this much cleaner solution is an option.

EDIT: I can't write. To rephrase all this more clearly:

 

I like my CRT, but it needs native analog video in. Modern cards have jaw-dropping processing capabilities, but no native analog video out. Older cards have native analog video out. I'd like to put one new card and one old card in my upcoming system build, configured so that the older card can work as an overqualified DAC and display output. Videos like this indicate that it should be possible. Anyone know if it is, and if so, how to do it?
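For what it's worth, here's what I've pieced together so far from that LMG video and some reading, in case it helps anyone answer: on Windows 10/11 the cross-GPU part seems to be handled by the per-app GPU preference under Settings > System > Display > Graphics, which (as far as I can tell) just writes entries to a registry key. Below is a rough, untested Python sketch of that, with a placeholder exe path; I have no idea yet whether Windows will treat the new card as the "high performance" one when both GPUs are discrete NVIDIA cards.

# Untested sketch: flag an app as "high performance" so Windows renders it on the
# faster GPU and copies finished frames to whichever GPU the monitor is attached to.
# This mirrors what Settings > System > Display > Graphics does for a chosen app.
import winreg

app_exe = r"C:\Games\SomeGame\SomeGame.exe"  # placeholder path, not a real game

key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    # "GpuPreference=2;" = high performance, "1;" = power saving, "0;" = let Windows decide
    winreg.SetValueEx(key, app_exe, 0, winreg.REG_SZ, "GpuPreference=2;")
print("Set high-performance GPU preference for", app_exe)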

 

Any help or insight would be strongly 💪 appreciated.


1 minute ago, Mel0nMan said:

The video controller in whatever new GPU you get will be more than good enough to run whatever display that is. Seeing as they can handle, what, like 4K 240Hz now?

Well, the issue is actually hooking it up to the monitor, which only takes VGA.


DVI-I (the type of DVI that has the "+" with the 4 pins around it) is basically also a VGA output; all you need is one of those passive adapter plugs, or just a DVI-I to VGA cable.

 

And it's either gonna list the compatible modes for your CRT if it's modern enough for EDID, or some default modes.

Either way, on VGA the NVIDIA Control Panel lets you go WILD with custom resolutions.
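If you want to sanity-check the numbers before punching them into the custom resolution dialog (manual timing mode), something like the little script below does the math. The 1600x1200@85 figures in it are the standard VESA timings as far as I remember them, so treat them as a starting point and double-check against your monitor's manual.

# Quick sanity check for a custom CRT mode before entering it in the
# NVIDIA Control Panel "Create Custom Resolution" dialog (manual timing).
# The example timings are my recollection of the VESA values for 1600x1200@85;
# verify them against the monitor's documented scan-rate limits.

def mode_stats(h_active, h_front, h_sync, h_back,
               v_active, v_front, v_sync, v_back, refresh_hz):
    h_total = h_active + h_front + h_sync + h_back   # pixels per line, incl. blanking
    v_total = v_active + v_front + v_sync + v_back   # lines per frame, incl. blanking
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    h_scan_khz = v_total * refresh_hz / 1e3           # horizontal scan rate
    return h_total, v_total, pixel_clock_mhz, h_scan_khz

h_tot, v_tot, pclk, hscan = mode_stats(1600, 64, 192, 304,
                                       1200, 1, 3, 46, 85)
print(f"Totals: {h_tot} x {v_tot}")
print(f"Pixel clock: {pclk:.1f} MHz (the DVI-I RAMDAC on these cards is rated around 400 MHz)")
print(f"Horizontal scan rate: {hscan:.2f} kHz (must be within the CRT's spec)")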

1 minute ago, Mel0nMan said:

The video controller in whatever new GPU you get will be more than good enough to run whatever display that is. Seeing as they can handle, what, like 4K 240Hz now?

The problem OP is facing, but didn't quite know how to explain, is getting native VGA output, so he doesn't have to deal with some adapter going funky with oddball CRT modes.


1 minute ago, manikyath said:

DVI-I (the type of DVI that has the "+" with the 4 pins around it) is basically also a VGA output; all you need is one of those passive adapter plugs, or just a DVI-I to VGA cable.

 

And it's either gonna list the compatible modes for your CRT if it's modern enough for EDID, or some default modes.

Either way, on VGA the NVIDIA Control Panel lets you go WILD with custom resolutions.

The problem OP is facing, but didn't quite know how to explain, is getting native VGA output, so he doesn't have to deal with some adapter going funky with oddball CRT modes.

I think I need to get some sleep. Seems like I'm struggling hard to get this point across in writing, lol.

 

I know that DVI-I carries analog, but modern cards don't have DVI-I; only pre-Pascal stuff does. I'd like to be able to harness the power of a newer architecture without losing analog output, so my question was about having a secondary GPU in the system (one that actually has DVI-I) that would just work as a DAC.

 

Sorry for my lack of clarity. Going to edit my post now.


4 minutes ago, manikyath said:

DVI-I (the type of DVI that has the "+" with the 4 pins around it) is basically also a VGA output; all you need is one of those passive adapter plugs, or just a DVI-I to VGA cable.

 

And it's either gonna list the compatible modes for your CRT if it's modern enough for EDID, or some default modes.

Either way, on VGA the NVIDIA Control Panel lets you go WILD with custom resolutions.

The problem OP is facing, but didn't quite know how to explain, is getting native VGA output, so he doesn't have to deal with some adapter going funky with oddball CRT modes.

Ah, OK, that makes more sense. But yeah, a DVI-I output carries an analogue signal and should work fine with VGA.


Just now, chickennuggetstyle said:

I think I need to get some sleep. Seems like I'm struggling hard to get this point across in writing, lol.

 

I know that DVI-I carries analog, but modern cards don't have DVI-I; only pre-Pascal stuff does. I'd like to be able to harness the power of a newer architecture without losing analog output, so my question was about having a secondary GPU in the system (one that actually has DVI-I) that would just work as a DAC.

 

Sorry for my lack of clarity. Going to edit my post now.

tbh, for any resolution a CRT could game at, just buy a second-hand 980 (Ti) and call it a day.

 

you're gonna lose out on RTX, for the 'several' games that make it a worthwhile investment of 2000 of your dollars...


3 minutes ago, Caroline said:

What kind of adapter are you using?

 

I'm rocking a CRT and my adapter tops out at 1920x1440 and 97Hz, but I set it to 85 for daily use.

 

Oh, and make sure you're using a 15-pin cable to connect the monitor and adapter. That's important.

I'm using the official Apple adapter, but I didn't read the fine print (it only does 1080p60). All I need is 1600x1200@85Hz. Mind linking me to whatever it is you're using? 🙂
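For my own notes, here's the back-of-the-envelope math on why a 1080p60-rated dongle falls short of that mode (using the standard published VESA/CEA totals, which I'm taking on faith):

# Rough pixel-clock comparison: the CRT mode I want vs. the mode the Apple
# dongle is rated for. Totals (active + blanking) are the standard published
# values as far as I know.
modes = {
    "1600x1200@85 (what I want)": (2160, 1250, 85),          # VESA totals
    "1920x1080@60 (dongle's rated max)": (2200, 1125, 60),   # CEA-861 totals
}
for name, (h_total, v_total, hz) in modes.items():
    print(f"{name}: ~{h_total * v_total * hz / 1e6:.1f} MHz pixel clock")
# Prints ~229.5 MHz vs ~148.5 MHz, i.e. roughly 55% more than the dongle's DAC
# is specced for, which is presumably why it won't do the mode.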


11 minutes ago, manikyath said:

tbh, for any resolution a CRT could game at, just buy a second-hand 980 (Ti) and call it a day.

 

you're gonna lose out on RTX, for the 'several' games that make it a worthwhile investment of 2000 of your dollars...

The CRT I'm running does better resolutions and framerates than my 2017 gaming laptop's LCD. That gaming laptop has a GTX 1070 in it, which still doesn't run as smoothly as I'd like. Low public opinion of CRTs disappoints me once again 😞 But thanks for the help anyways.


28 minutes ago, chickennuggetstyle said:

Low public opinion of CRTs disappoints me once again

I mean, LCDs are better in almost all ways now: brighter, better contrast, higher resolution, no flicker, no burn-in. We got things like HDR recently since we're no longer limited by standards made with CRTs in mind.


19 minutes ago, Electronics Wizardy said:

I mean, LCDs are better in almost all ways now: brighter, better contrast, higher resolution, no flicker, no burn-in. We got things like HDR recently since we're no longer limited by standards made with CRTs in mind.

IMO this is a complete misconception.

 

I've seen a lot of "high end" LCDs at this point, and would have agreed with you not so long ago. But I got ahold of my first high-end CRT just a few weeks ago, and haven't looked back. The "make better contrast by making a brighter backlight" strategy from LCD manufacturers doesn't do it for me. Dark scenes in low light still look like crap on any LCD that doesn't have some crazy number of dimming zones (which cause crazy blooming and often just look like shit). 1440p is more than enough resolution for me at computer monitor sizes, and flicker isn't noticeable above 85Hz. The one thing I'll concede to you is burn-in, but how hard is it to just turn on a screen saver? Or just put your computer to sleep if you're not using it?

 

I don't mean to be inflammatory here, but CRT is just a superior technology for overall picture quality (in dark environments). Size and convenience are obviously massive drawbacks, but people who think consumer-level LCDs are ever gonna come close to "catching up" in picture clearly haven't yet seen a good CRT. And people who have and still prefer their LCD obviously have higher priorities than getting the absolute highest image quality achievable. I'm just a bit of a budget videophile 🙂


31 minutes ago, chickennuggetstyle said:

"make better contrast by make brighter backlight" strategy by LCD manufacturers doesn't do it for me

Even without playing with backlights, LCDs are much better than CRTs. CRTs only get about a 100-300:1 contrast ratio with a checkerboard pattern or similar, while LCDs can get about 1000:1. CRTs do well in full-on vs. full-off, but that's a pretty unrealistic use case.

 

31 minutes ago, chickennuggetstyle said:

1440p is more than enough resolution for me at computer monitor sizes,

LCDs are also normally much sharper. I have used many a good CRT, and text just isn't as sharp; on a large display, 4K/6K is a pretty noticeable improvement.

 

31 minutes ago, chickennuggetstyle said:

The one thing I'll concede to you is burn-in, but how hard is it to just turn on a screen saver? Or just put your computer to sleep if you're not using it?

This is still an issue with things like taskbars and other static elements, and lots of people use their monitors for 8+ hours a day, so a screen saver won't help here.

 

31 minutes ago, chickennuggetstyle said:

CRT is just a superior technology for overall picture quality

Not really; there's a good reason why everyone has moved on from CRTs by now.

 

 

But for your original question: people stopped caring about VGA a while ago. I'd just get a 980 Ti here if I were you; they're pretty cheap and still fast. I don't know of any good VGA converter, and there's no native way to have another GPU act as the analog output. Using an older GPU also has the issue of software support being dropped for it, forcing you onto an old driver.


6 hours ago, Electronics Wizardy said:

Even without playing with backlights, LCDs are much better than CRTs. CRTs only get about a 100-300:1 contrast ratio with a checkerboard pattern or similar, while LCDs can get about 1000:1. CRTs do well in full-on vs. full-off, but that's a pretty unrealistic use case.

Maybe this is the case on some really expensive calibrated LCD, but I've done side-by-side comparisons of those OLED test videos between my CRT and every other TV/monitor in my house (several of which cost $1000+), and I've never seen any LCD come close. Admittedly, some of my crappier CRT SDTVs only do well at full-on/full-off, though.

6 hours ago, Electronics Wizardy said:

LCDs are also normally much sharper. I have used many a good CRT, and text just isn't as sharp; on a large display, 4K/6K is a pretty noticeable improvement.

I prefer CRT smoothness to blocky LCD sharpness, but I can see how it could be annoying for some applications. Still, try running an LCD at any resolution besides its native one; you won't be impressed with the clarity. I don't doubt that LCDs are better for word processing, but I'd also be curious what "good CRTs" you've used. Good ones can show a checkerboard without any noticeable lightening of the black levels.

7 hours ago, Electronics Wizardy said:

This is still an issue with things like taskbars and other static elements, and lots of people use their monitors for 8+ hours a day, so a screen saver won't help here.

 

Not really; there's a good reason why everyone has moved on from CRTs by now.

Color accuracy, response time, motion clarity, and once again: black levels. Someone who writes word documents with their taskbar visible the entire day is probably not going to care about these things, but they're all objectively better on CRT. I'm not saying everyone should go buy a CRT (I already hate how difficult they are to get ahold of) but they produce superior images.

 

People tend to really hate the idea that they don't have "the best thing", because for some reason everyone wants to think of themselves as 100% pragmatic and objective. I think that's why everyone wants to remember CRTs so unfondly: because we ditched them for a more convenient product which, at the time, was objectively far inferior (seriously, try to use an LCD from 2006). But there's no shame in discarding an inconvenient product for a more convenient one, especially if the pros of the inconvenient product seem insignificant. What I can't get behind is when people ignore the limitations of the technology they use, just because it feels shitty to not have "the best thing", especially when I hear the shoddy justification "everyone else is using it". Display manufacturers are plenty complacent as it is.

7 hours ago, Electronics Wizardy said:

But for your original question: people stopped caring about VGA a while ago. I'd just get a 980 Ti here if I were you; they're pretty cheap and still fast. I don't know of any good VGA converter, and there's no native way to have another GPU act as the analog output.

One guy a few comments up apparently has an adapter with decent bandwidth.

7 hours ago, Electronics Wizardy said:

Using an older GPU also has the issue of software support being dropped for it, forcing you onto an old driver.

...which is exactly why I won't be getting a 980Ti. But your insight is appreciated regardless.

