chickennuggetstyle

Are there any analog GPUs left?

Recommended Posts

Posted · Original Poster (OP)

Planning on hooking up an analog TV to a sort of gaming/home theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my 2 best options really are adapters and used cards :(

1 minute ago, chickennuggetstyle said:

Planning on hooking up an analog TV to a sort of gaming/home theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my 2 best options really are adapters and used cards :(

I think used cards and adapters are your best bet.


Reminder⚠️

I'm just speaking from experience, so what I say may not work 100%.

Please try searching for the answer before you post here, but I'm always glad to help.

1 minute ago, chickennuggetstyle said:

Planning on hooking up an analog TV to a sort of gaming/home theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my 2 best options really are adapters and used cards :(

For a long time now, DVI on GPUs hasn't even been the analog variant. The last couple of generations (current one not included) had cards with DVI-D, not DVI-I.

 

For DVI-I you probably have to go back to the R9 200 series (although DVI-I was becoming uncommon there too) and the GTX 500 or 600 series, if memory serves me right.

 

Or you have to go even further back for native VGA or S-Video support.

Or, of course, you can get an adapter.
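To make the practical difference concrete, here is a rough Python sketch (not from the original post; the connector list and helper name are assumptions written from memory) of which output types carry a native analog signal and can feed a passive adapter, versus those that need an active digital-to-analog converter:

# Rough sketch: which GPU output types carry a native analog signal.
# The connector list and the analog/digital split are illustrative,
# not an exhaustive or authoritative table.
ANALOG_CAPABLE = {
    "VGA": True,          # analog only
    "S-Video": True,      # analog only
    "DVI-A": True,        # analog-only DVI, rarely seen
    "DVI-I": True,        # digital + analog pins; VGA works via a passive adapter
    "DVI-D": False,       # digital only
    "HDMI": False,
    "DisplayPort": False,
}

def needs_active_converter(connector: str) -> bool:
    """True if driving an analog TV from this output needs an active
    (digital-to-analog) converter rather than a passive pin adapter."""
    return not ANALOG_CAPABLE.get(connector, False)

if __name__ == "__main__":
    for port in ("DVI-I", "DVI-D", "HDMI"):
        kind = "active converter" if needs_active_converter(port) else "passive adapter"
        print(f"{port}: {kind}")

The short version: DVI-D, HDMI, and DisplayPort are digital-only, so driving an analog TV from a modern card always means some kind of converter.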


"We're all in this together, might as well be friends" Tom, Toonami.

Sorry if my post seemed rude, that is never my intention.

"Why do we suffer a lifetime for a moment of happiness?" - Anonymous

 

1 minute ago, minibois said:

For a long time now, DVI on GPUs hasn't even been the analog variant. The last couple of generations (current one not included) had cards with DVI-D, not DVI-I.

Dang, I was gonna say the same thing 🤣


Fun Fact: The Meshify C is the best case to ever exist.

Wārudobītā

Spoiler

3100.

B550M TUF Plus.

HyperX RGB 8 GB stick (MJR or CJR, I still don't know lol).

1650 Super.

CX450.

MX500.

NX500.

5 minutes ago, chickennuggetstyle said:

Planning on hooking up an analog TV to a sort of gaming/home theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my 2 best options really are adapters and used cards :(

Use an active adapter; you won't lose noticeable quality and you can use any modern GPU you want. And upgrade your TV in the future when you get the chance.

Posted · Original Poster (OP)
1 minute ago, minibois said:

For a long time now, DVI on GPUs hasn't even been the analog variant. The last couple of generations (current one not included) had cards with DVI-D, not DVI-I.

For DVI-I you probably have to go back to the R9 200 series (although DVI-I was becoming uncommon there too) and the GTX 500 or 600 series, if memory serves me right.

Or you have to go even further back for native VGA or S-Video support.

Or, of course, you can get an adapter.

Damn, so I really am fucked. Guess I'll just have to live with... latency 😭

Posted · Original Poster (OP)
3 minutes ago, 3rrant said:

Use an active adapter; you won't lose noticeable quality and you can use any modern GPU you want. And upgrade your TV in the future when you get the chance.

Any chance you could explain the difference to me between an active and a "standard" adapter? I assume all of them would require external power, right (given that there is a digital->analog conversion occurring)?

2 minutes ago, chickennuggetstyle said:

Any chance you could explain the difference to me between an active and a "standard" adapter? I assume all of them would require external power, right (given that there is a digital->analog conversion occurring)?

It might not be needed, depending on the card and output you're using. Anyway, a quick summary (with a rough sketch below the source link):

Active DisplayPort Adapters

• Use video converter chips inside the adapter for signal conversion
• More expensive due to the additional chips
• Necessary when the source does not support dual-mode DisplayPort (DP++)
• Convert both single-mode and dual-mode DisplayPort
• Analog VGA and dual-link DVI require powered active adapters to convert protocol and signal levels (VGA adapters are powered by the DisplayPort connector, while dual-link DVI adapters may rely on an external power source)
• May be necessary if the graphics card cannot support DP++ output on the maximum number of monitors

Passive DisplayPort Adapters

• Cheaper than active adapters
• Only a maximum of two passive adapters can be used per GPU, because each GPU only has two clock sync signals for passive adapters to utilize
• Rely on DisplayPort or Mini DisplayPort sources that support dual-mode (DP++)
• A DP++ source can use a passive adapter to convert DisplayPort signals to single-link DVI or HDMI
• The DP++ video source performs the conversion instead of the adapter

 

From: https://blog.exxactcorp.com/active-vs-passive-displayport-adapters-what-you-need-to-know/
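Not part of the quoted article; just a rough sketch of the decision those bullet points describe. The function name, parameters, and rules of thumb below are assumptions for illustration, not an official tool:

# Sketch of the passive-vs-active choice summarized above.
# The function name and parameters are hypothetical.
def adapter_type(source_supports_dpp: bool, target: str,
                 passive_adapters_in_use: int = 0) -> str:
    """Suggest an adapter type for driving `target` ('HDMI',
    'single-link DVI', 'dual-link DVI', or 'VGA') from DisplayPort."""
    # Analog VGA and dual-link DVI always need an active converter,
    # because the signal itself must be regenerated.
    if target in ("VGA", "dual-link DVI"):
        return "active adapter"
    # A DP++ source can emit HDMI / single-link DVI signalling itself,
    # so a passive adapter is enough, but only about two per GPU.
    if source_supports_dpp and passive_adapters_in_use < 2:
        return "passive adapter"
    return "active adapter"

print(adapter_type(True, "VGA"))       # active adapter
print(adapter_type(True, "HDMI"))      # passive adapter
print(adapter_type(True, "HDMI", 2))   # active adapter

For an analog TV specifically, the VGA branch is the one that matters, which is why an active adapter was suggested above.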

 

 

 

