
Are GPUs becoming outdated faster than they used to?

jaslion

Basically, what it comes down to is this: I've noticed that with a Vega 56 I got for cheap, I can run any game at 1440p high settings at a solid 60 fps, but the GPU is already at 80% usage in every game, and that feels wrong. It suggests that in a year or so, with the new consoles having launched, it will be at 100% and I'll already have to drop settings. That feels very quick compared to older cards.

 

For example, an HD 7970 my friend got in 2012 was a 1080p high-settings 60 fps card until about 2018 without any problem, and that card also lived through an entire new console generation. But I can't see my Vega lasting that long, even though it's positioned roughly where the 7970 was back in the day, especially with the Vega 64 BIOS unlock I have on it.

 

What it really comes down to is the feeling of buying a product that rapidly becomes outdated. I normally use my system for at least 5 years, and that has never really been a problem, even though I've always bought parts a generation or two old. Now it seems like this card will only just make the 5-year mark.

 

Another thing is that with all these big games I genuinely don't see a big difference in graphics. Sure, Red Dead Redemption looks good, but so does GTA V, and while playing I wouldn't notice if the graphics were at GTA V's quality level. I play games like Sea of Thieves, Guild Wars 2, Dishonored 1 & 2, Monster Hunter: World, Deep Rock Galactic, and so on, and those run great. But the moment I tried something newer that doesn't really look better, like DMC 5, the new AC game, or Sekiro, they all hit my system hard, and I feel like there's no need for it.

 

Am I seeing a pattern that isn't there, or have we really been getting a lot less GPU for the money these last 3 years, both in performance and in how long it lasts?


Technology is advancing more quickly, and yet it sure seems like we're going longer between GPU generations than we did back in the Tesla-to-Maxwell days.

Aerocool DS are the best fans you've never tried.


7 minutes ago, jaslion said:

Basically, what it comes down to is this: I've noticed that with a Vega 56 I got for cheap, I can run any game at 1440p high settings at a solid 60 fps, but the GPU is already at 80% usage in every game, and that feels wrong. It suggests that in a year or so, with the new consoles having launched, it will be at 100% and I'll already have to drop settings. That feels very quick compared to older cards.

Vega GPUs were already outdated when they launched, so they're not a realistic measure of the state of GPU performance improvements over the years. A 2016 GPU like the GTX 1070 performs about the same as a Vega 56, so you're effectively talking about a 4-year-old GPU at this point. AMD has only just caught up to NVIDIA with their RDNA architecture.

Quote or Tag people so they know that you've replied.


Most earlier generations of graphics cards were incremental improvements, but we are now seeing leaps and bounds in technology and manufacturing capability.


The 1060 3 GB and 6 GB were, and still are, the most popular GPUs on Steam. We effectively lost a generation with Turing, and this generation (even though it's roughly a 40% jump) has been affected by COVID. Lisa Su has already hinted that next gen will be another big jump. That should put some space between PC GPUs and the consoles.

 

I don't think GPUs are getting slower. Progress is actually accelerating toward 4K/144 Hz, IMHO; it just doesn't feel like it because this year has been a waiting game.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


My 3080 is usually at 95%+ usage at 1440p, so...

 

idk what you really want us to say, LOL

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


2 hours ago, Mister Woof said:

My 3080 is usually at 95%+ usage at 1440p, so...

 

idk what you really want us to say, LOL

More of a feeling, then, huh. Still, that's oddly high usage for a card I hoped would last me a long time.


5 minutes ago, jaslion said:

More of a feeling, then, huh. Still, that's oddly high usage for a card I hoped would last me a long time.

It means you're using most of your card's capability.

 

I could drop it into the 40s-50s if I capped FPS at 60, or played games with RTX off. But I won't, because I'd rather have more frames and features.

 

High usage is not really a bad thing. I'd be more upset if I spent $750 on a GPU and it sat idle most of the time, not delivering the frames I wanted.
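
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It assumes GPU utilization scales linearly with frame rate, which is a simplification (real scaling depends on the game), so treat the results as illustrative only:

```python
def estimate_headroom(current_fps: float, gpu_utilization: float) -> float:
    """Rough uncapped-fps estimate, assuming utilization scales
    linearly with frame rate (a simplification)."""
    return current_fps / gpu_utilization

# jaslion's Vega 56: a solid 60 fps at ~80% usage
print(f"Vega 56: ~{estimate_headroom(60, 0.80):.0f} fps estimated uncapped")

# Mister Woof's 3080: usage drops to the 40s-50s at a 60 fps cap,
# implying roughly double that uncapped
print(f"RTX 3080: ~{estimate_headroom(60, 0.45):.0f} fps estimated uncapped")
```

By that crude estimate, the Vega 56 still has about 25% headroom at 1440p/60, which is why 80% usage reads as "working as intended" rather than "about to fall over".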



1 minute ago, Mister Woof said:

It means you're using most of your card's capability.

 

I could drop it into the 40s-50s if I capped FPS at 60, or played games with RTX off. But I won't, because I'd rather have more frames and features.

 

hth

Oh yeah, I'm aware of that. I just like to use my stuff for a long time, so already being there at just 1440p/60 fps was a bit of an "oh hey, that was quick" moment. But as has been pointed out, the Vega 56 is basically a 4-year-old card in terms of performance, which I forgot to factor in, so that makes things a lot less surprising.


1 minute ago, jaslion said:

Oh yeah, I'm aware of that. I just like to use my stuff for a long time, so already being there at just 1440p/60 fps was a bit of an "oh hey, that was quick" moment. But as has been pointed out, the Vega 56 is basically a 4-year-old card in terms of performance, which I forgot to factor in, so that makes things a lot less surprising.

The beauty of PC parts is the inherent scalability of settings.

 

Here's an example:

 

[Image: Valhalla benchmark at ultra settings]

 

The benchmarks you see for Valhalla are usually run at ultra quality, which, like Odyssey, really kills performance (volumetric clouds, I'm looking at you). Tune it down just a bit and you get either way more framerate or, if you cap FPS, less load.

 

Theorycraft that each detail tier is worth a year of new games: today's ultra performance might represent high in a year, and medium in two. Either way, you're looking at good performance for a while, as long as you're willing to change some settings.
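
That theorycraft can be written down as a toy model. A minimal sketch, assuming one preset tier per year of new releases (the cadence is an assumption, not a measurement):

```python
# Toy model of the "one preset tier per year" theorycraft.
PRESETS = ["ultra", "high", "medium", "low"]

def likely_preset(years_of_new_games: int) -> str:
    """Preset you'd likely run after N years of new releases,
    dropping one tier per year (assumed cadence)."""
    tier = min(years_of_new_games, len(PRESETS) - 1)
    return PRESETS[tier]

for year in range(5):
    print(f"Year {year}: {likely_preset(year)}")
# Year 0: ultra -> Year 1: high -> Year 2: medium -> Years 3+: low
```

Even in this pessimistic model, a card that starts at ultra is still delivering playable medium settings three years in.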



Let me take you back in time to the early 2000s, when each new GPU generation was a monumental leap forward... and we got new architectures almost yearly.

 

This is what a GeForce4 Ti 4600 looked like (this was a high-end GPU):

[Image: PNY GeForce4 Ti 4600, April 2002]

 

 

 

A couple of years later, this is what a GeForce 6800 Ultra looked like (2004):

[Image: GeForce 6800 Ultra]

 

 

Then the 8800 GTX (2006)

 

[Image: NVIDIA GeForce 8800 GTX]

 

 

 

Let's fast forward to 2013 with the GTX Titan.

 

[Image: NVIDIA GeForce GTX Titan]

 

 

But wait... what about today? How about a 3-slot behemoth called the RTX 3090!!!

[Image: Nvidia GeForce RTX 3090 Founders Edition]



 

 

What does this mean, exactly?

Well, in roughly 20 years, we've gone from a tiny fan and no auxiliary power on a flagship GPU to a MASSIVE hunk of aluminum and copper that takes up 3 slots and needs a 300-watt power connector.

The fact is, technology continues to move forward, but with each passing generation we push the silicon further and further, until we need an absolutely insane cooler just to keep the thing running. We've also gone from a new line of chips every year to a new(ish) architecture every 2 years.



TL;DR: GPUs are NOT advancing as fast anymore, because we are literally bumping up against what is physically possible with silicon, and we keep having to move the goalposts on what's acceptable. 20 years ago, the RTX 3090 would have been improbable; nothing could have handled a 350-watt GPU: no case, no power supply, no motherboard. Today it's on the edge of what's possible, but for enthusiasts it is attainable. Looking at the big picture, GPUs are advancing very slowly. CPUs, however, are advancing even more slowly.
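
To put some hedged numbers behind that trend, here's a quick Python sketch using ballpark board-power figures for the flagships pictured above. The wattages are approximate TDPs from public spec listings, so treat the growth rates as rough:

```python
# Ballpark board power (watts) for the flagships discussed above.
# Approximate TDPs; exact figures vary by source and board partner.
flagships = [
    ("GeForce4 Ti 4600",   2002,  35),
    ("GeForce 6800 Ultra", 2004, 105),
    ("GeForce 8800 GTX",   2006, 145),
    ("GeForce GTX Titan",  2013, 250),
    ("GeForce RTX 3090",   2020, 350),
]

for (name_a, year_a, w_a), (name_b, year_b, w_b) in zip(flagships, flagships[1:]):
    yearly_growth = (w_b / w_a) ** (1 / (year_b - year_a)) - 1
    print(f"{name_a} -> {name_b}: {yearly_growth:+.0%} board power per year")
```

The takeaway matches the photos: the flagship power budget keeps climbing, and the coolers have had to grow to match.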

CPU: Ryzen 7 5800x3D || GPU: Gigabyte Windforce RTX 4090 || Memory: 32GB Corsair 3200mhz DDR4 || Motherboard: MSI B450 Tomahawk || SSD1: 500 GB Samsung 850 EVO M.2 (OS drive) || SSD2: 500 GB Samsung 860 EVO SATA (Cache Drive via PrimoCache) || Spinning Disks: 3 x 4TB Western Digital Blue HDD (RAID 0) || Monitor: LG CX 55" OLED TV || Sound: Schiit Stack (Modi 2/Magni 3) - Sennheiser HD 598, HiFiMan HE 400i || Keyboard: Logitech G915 TKL || Mouse: Logitech G502 Lightspeed || PSU: EVGA 1300-watt G+ PSU || Case: Fractal Design Pop XL Air
 

Link to comment
Share on other sites

Link to post
Share on other sites

2 minutes ago, MadPistol said:

Let me take you back in time to the early 2000s, when each new GPU generation was a monumental leap forward... and we got new architectures almost yearly.

...

TL;DR: GPUs are NOT advancing as fast anymore, because we are literally bumping up against what is physically possible with silicon, and we keep having to move the goalposts on what's acceptable. Looking at the big picture, GPUs are advancing very slowly. CPUs, however, are advancing even more slowly.

We had graphics that were "great" over a decade ago. You could ship a game on a 2010 graphics engine today, and if executed properly it would still look amazing, almost as good as today's graphics.

 

The issue we're facing is the continual push toward photo-realism, and that, as usual, is an endeavor of diminishing returns.

 

Instead of brute-forcing our way to more frames (which is what's happening right now), the burden should fall on software and hardware technologies that make better use of the resources we already have.

 

Stuff like DLSS/FidelityFX, adaptive resolution scaling, and the like really is the way forward to getting where we want to be.
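
For anyone curious what adaptive resolution scaling actually does, here's a simplified sketch of the control loop. This illustrates the general idea, not any engine's actual implementation; the thresholds and step sizes are made up:

```python
def adjust_render_scale(scale: float, frame_ms: float, target_ms: float = 16.7,
                        lo: float = 0.5, hi: float = 1.0) -> float:
    """Nudge the render-resolution scale so frame time converges on the
    target. Real engines filter frame times and step more conservatively."""
    if frame_ms > target_ms * 1.05:    # over budget: render fewer pixels
        scale *= 0.95
    elif frame_ms < target_ms * 0.90:  # lots of headroom: sharpen the image
        scale *= 1.02
    return max(lo, min(hi, scale))

# Example: a GPU missing 60 fps at native resolution trades pixels for frames.
scale = 1.0
for frame_ms in (22.0, 20.0, 18.0, 16.0, 15.0):
    scale = adjust_render_scale(scale, frame_ms)
    print(f"{frame_ms:4.1f} ms frame -> render scale {scale:.2f}")
```

The upscaler (DLSS, FidelityFX, or plain filtering) then stretches the reduced-resolution image back to native, which is exactly the "do the same with a whole lot less" trade being argued for here.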

 

It's already been proven that graphics cards with enough power pushed through them can pump out some amazing graphics, but what IMO needs to be focused on is efficiency.

 

Give us 2010-2015 graphics fidelity with 2020 technologies, and we can get away with the same (or more) using a whole lot less.



There will be a lot of ups and downs, but at some point it might become too costly and too advanced.

Or things might start to become more software-dependent.


12 minutes ago, Mister Woof said:

We had graphics that were "great" over a decade ago. You could ship a game on a 2010 graphics engine today, and if executed properly it would still look amazing, almost as good as today's graphics.

 

Give us 2010-2015 graphics fidelity with 2020 technologies, and we can get away with the same (or more) using a whole lot less.

 

I think you might be viewing this through rose-tinted glasses.
 


Lost Planet 2 is the most "modern"-looking game of that bunch, and it still falls extremely short of something like RDR2 or the new reigning champ, Cyberpunk 2077. We've reached a point where developers need some "help" with realistic lighting... and that's where ray tracing comes in. Ray tracing is the (extremely expensive) way forward.

Nvidia and AMD, along with developers, have found cool, innovative ways to make graphics look more realistic without exacting a huge toll on hardware, but every now and then we get a breakout technology that brings current hardware to its knees. 20 years ago, that was Transform and Lighting. 10 years ago, it was tessellation. Today, it's ray tracing.

Path-traced graphics are the holy grail of rendering tech. We'll see how long it takes to get to that point (Quake II RTX and Minecraft don't count, lol).
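
To get a feel for why path tracing is so expensive, a quick ray-count estimate helps. The sample and bounce counts below are assumptions for illustration; real renderers vary wildly:

```python
# Back-of-the-envelope ray budget for path tracing at 4K/60.
width, height = 3840, 2160
fps = 60
samples_per_pixel = 16  # assumption: modest quality, before denoising
bounces = 4             # assumption: path depth per sample

rays_per_second = width * height * fps * samples_per_pixel * bounces
print(f"~{rays_per_second / 1e9:.0f} billion rays per second")  # ~32 billion
```

For comparison, NVIDIA's marketing figure for the RTX 2080 Ti was around 10 gigarays per second, which is why real-time "path traced" titles lean on very low sample counts plus aggressive denoising.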



3 minutes ago, MadPistol said:

 

I think you might be viewing this through rose-tinted glasses.
 


Lost Planet 2 is the most "modern"-looking game of that bunch, and it still falls extremely short of something like RDR2 or the new reigning champ, Cyberpunk 2077. We've reached a point where developers need some "help" with realistic lighting... and that's where ray tracing comes in. Ray tracing is the (extremely expensive) way forward.

 

I mean, obviously newer engines look better. I'm just saying that games from 2010 could still look "good enough", and even with all the effort we put into photo-realism today, we still aren't there.

 

So do we keep trying to get there and still fall short, with all the costs that entails, or do we stick with what we know works?

 

It's kind of like how, despite the graphics we have available, people are still willing to play games on the Nintendo Switch's smartphone-level hardware.



7 minutes ago, Mister Woof said:

 

I mean, obviously newer engines look better. I'm just saying that games from 2010 could still look "good enough", and even with all the effort we put into photo-realism today, we still aren't there.

 

So do we keep trying to get there and still fall short, with all the costs that entails, or do we stick with what we know works?

 

It's kind of like how, despite the graphics we have available, people are still willing to play games on the Nintendo Switch's smartphone-level hardware.

A good example is Titanfall 1 and 2: they use the by-now-ancient Source engine and still look really good (and run amazingly well, too).


4 hours ago, jaslion said:

Am I seeing a pattern that isn't there, or have we really been getting a lot less GPU for the money these last 3 years, both in performance and in how long it lasts?

If you include scalper prices... yes.

:)

