
What GPU is good for the next 10 years?

Offical_Meep

Nothing too expensive, but something reasonable please!

 

 

Thanks

 

On a 1080p Display

 

probably dual display but mostly 1


A GTX 1070/1080 is the best I can think of. 


1 minute ago, Technicolors said:

depends on the resolution. 

updated the post! Thanks for reminding me


1 minute ago, luegnicl said:

Not a single one! Can you think of a card from 2007 that would be usable today?!

That's actually a great point, especially the way technology is progressing these days. But who knows?

i5 6600K and GTX 1070, but I play at 1600x900. 1440p BABY!

Still, don't put too much faith in my buying decisions. xD 


2 minutes ago, luegnicl said:

Not a single one! Can you think of a card from 2007 that would be usable today?!

Usable? Yes. 


1 minute ago, deXxterlab97 said:

Usable? Yes. 

Usable? I mean for 2016 titles


Even at 1080p, some games need way more VRAM now than they did just a few years ago. By default, no card would last that long. It can work, but you sure aren't going to be playing modern titles, or those ever-popular half-broken, unfinished titles like Ark. 

 

 


1 minute ago, SteveGrabowski0 said:

LOL you could buy a Titan X Pascal and it'll be pretty low end within 5 years most likely.

Yes. Moore's law still holds true for graphics cards, but not for CPUs; there is (and was) not enough need or competition.
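A rough back-of-the-envelope sketch of what that exponential pace means for a card you buy today (the ~2-year doubling period is an illustrative assumption, not a guarantee):

```python
# If GPU performance doubles roughly every `doubling_period` years
# (a Moore's-law-style assumption), estimate what fraction of the
# then-current flagship's performance today's card retains.

def relative_performance(years, doubling_period=2.0):
    """Fraction of flagship performance retained after `years` years."""
    return 1 / 2 ** (years / doubling_period)

for years in (2, 5, 10):
    print(f"after {years:2d} years: {relative_performance(years):.1%} of the flagship")
```

Under that assumption, a card is down to roughly 3% of flagship performance after a decade, which matches the thread's intuition that no card stays "good" for 10 years.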


No card will be good for 10 years. The way displays and in-game graphics have been progressing, nothing will cover that much time, but waiting for the 1080 Ti / Vega cards would be the best bet.

 | CPU: AMD FX 8350 + H100i | GPU: AMD R9 290X + NZXT Kraken | RAM: HyperX Beast 2033 16GB | PSU: EVGA G2 | MOBO: ASRock 970M |

| CASE: Corsair Carbide 88R |STORAGE: 1x WD Black | KEYBOARD: Corsair K70 | MOUSE: R.A.T 9 |

SOMETIMES LOSING THE BATTLE, MEANS YOU CAN WIN THE WAR

 


1 minute ago, Mooshi said:

Even at 1080p, some games need way more VRAM now than they did just a few years ago. By default, no card would last that long. It can work, but you sure aren't going to be playing modern titles, or those ever-popular half-broken, unfinished titles like Ark. 

It is not possible to run GTA 5 at 4K max settings because it needs 25/26 GB of VRAM, and it has no DX12 support.


Depends how far games advance and what resolutions you plan on playing.

I'd go as far as saying a Titan XP will only be relevant for five or so years before it can only handle 1080p, either due to the VRAM not being enough or it just not having enough power, assuming games keep developing and becoming more demanding at the rate they are now.

PC - CPU Ryzen 5 1600 - GPU Power Color Radeon 5700XT- Motherboard Gigabyte GA-AB350 Gaming - RAM 16GB Corsair Vengeance RGB - Storage 525GB Crucial MX300 SSD + 120GB Kingston SSD   PSU Corsair CX750M - Cooling Stock - Case White NZXT S340

 

Peripherals - Mouse Logitech G502 Wireless - Keyboard Logitech G915 TKL  Headset Razer Kraken Pro V2's - Displays 2x Acer 24" GF246(1080p, 75hz, Freesync) Steering Wheel & Pedals Logitech G29 & Shifter

 

         


I'd say with the way things are going in the CPU world right now (Moore's law slowing down), you may be better off with a Titan XP in 10 years than the 8800 GTX user is now. But it's also difficult to say what other factors in GPU architecture will make a difference in 10 years, so you should never really plan that far ahead. Building a PC means you're a builder for life, and you can't just stop being one for 10 years. :P

 

 


lol, the next 10 years... That is a huge timespan. 

For the next 2-3 years would be more reasonable.

My Setup: 
CPU: i7 4790 @3800 MHz, MB: MSI H87-G41, Grafik: Gigabyte GTX 1080TI, RAM: 2x 8GB DDR3 (1600), Storage: Samsung SSD 850 Evo


11 minutes ago, Biggerboot said:

I'd say with the way things are going in the CPU world right now (Moore's law slowing down), you may be better off with a Titan XP in 10 years than the 8800 GTX user is now. But it's also difficult to say what other factors in GPU architecture will make a difference in 10 years, so you should never really plan that far ahead. Building a PC means you're a builder for life, and you can't just stop being one for 10 years. :P

 

 

It is not about Moore's law. It is still possible; the manufacturers are just not willing to do more.


10 years?  I'm not even sure if that's possible, unless you're satisfied running future games at the absolute lowest settings.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440 


Since no one can tell you what games will require in the year 2027, no one can tell you what current card will work on games in 2027.

CPU: AMD Ryzen 7 7800x3d  Motherboard:  Gigabyte B650 AORUS Elite  RAM:  Vengeance 2 x 16GB DDR5 6000   GPU:  Zotac RTX 4090

Storage:  M.2 Samsung Evo 860 TB / Samsung Evo 840 500GB   Case:  be quiet Dark Base 900   PSU: Corsair SHIFT RM1000x  Display:  ASUS AW3423DW QD-OLED 34" 3440x1440 Ultrawide w/ GSYNC


Usually it's the VRAM that gives out to the demands of future games before the performance of the chip does, so if you're really serious, your only option would be the Titan X Pascal and its 12GB of VRAM.

 

Geometry improvements have slowed down, if not stopped. That was something the 2007 cards had to worry about more as time progressed, since it shortened their practicality. But with geometry being acceptable where it is now, it's really just lighting & effects and higher-quality textures that are going to tax your GPU as the years go by.
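To put rough numbers on the VRAM point: flagship VRAM has roughly doubled every ~3 years (the base figure and doubling period below are illustrative assumptions, not exact launch specs):

```python
# Extrapolate "typical flagship" VRAM forward, assuming it keeps
# doubling every `doubling_period` years. Purely a trend sketch.

def projected_vram(base_gb, years, doubling_period=3.0):
    """Projected flagship VRAM (GB) after `years` years."""
    return base_gb * 2 ** (years / doubling_period)

# Starting from a ~6GB flagship today:
for years in (3, 6, 10):
    print(f"in {years:2d} years: ~{projected_vram(6, years):.0f} GB")
```

Even if the doubling slows, a fixed 12GB card sits well under that projected curve within five or six years, which is why the VRAM tends to give out before the chip does.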

 

 

System: Intel Core i3 3240 @ 3.4GHz, EVGA GTX 960 SSC 2GB ACX 2.0, 8GB 1600MHz DDR3 Kingston HyperX RAM, ASRock B75M-DGS R2.0 Motherboard, Corsair CX430 W Power Supply


None.  Try running modern games at 1080p on an 8800gt...

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


2 hours ago, luegnicl said:

It is not about Moore's law. It is still possible; the manufacturers are just not willing to do more.

Sorta. You'll have to elaborate on that point, because at face value it just sounds like conspiracy talk. For video card makers, yes, because there are other factors they need to consider besides the processor. CPU/GPU manufacturers can be affected by a variety of factors that can be business/profit-driven, as well as physical limitations in an aging architecture. Then software optimization plays a role. If you're referring to things we're seeing at the consumer level, like Intel's stalling on 10nm CPUs, that doesn't cover the whole scope of it. 

 

Think of how processors and computer architecture are advancing. If somebody like Intel settles for another short-term solution similar to their optimization process, it will pale in comparison to the advancements we'll see in future architectures (perhaps a replacement for x86 or for silicon-based CPUs). At that point, the conventions of computers themselves may change, and we may not be considering hardware upgrades for the same reasons.

 

Regardless of your opinion on the matter, accurately predicting what computers will need in 10 years is impossible. Only recently have we been seeing more emphasis on better optimization (DX12) rather than pushing for more realism. So I think when you look at what is going to drive sales, more emphasis will go into superficial things like higher resolutions (4K Netflix is only the beginning). This will probably continue to be the case until we see that next major evolution, which is probably further away than 10 years.

 

(tl;dr) Regarding the OP: don't worry about 10 years. Regardless of how high-grade a video card you get now, they'll all reach obsolescence at roughly the same time (about 5 years), because computers have gotten faster exponentially. That's why Moore's law has had pretty strong relevance to this 5-ish-year cycle, but it's also why Sandy Bridge users can still get by. More emphasis being put on VRAM by newer engines adds to the unpredictability.

