
Do all newer GPUs have lower power consumption than older GPUs?

renchy_is_sketchy

Stupid question, but I just saw the confirmed specifications of the 5600 XT I was waiting to get, and it turns out that the estimated power consumption is 150W (as said here: https://wccftech.com/amd-radeon-rx-5600-xt-navi-gpu-specs-leak/), while my current RX 580's (https://www.sapphiretech.com/en/consumer/nitro-rx-580-4g-g5) power consumption is <235W?

Or are they just making the newer models far more efficient than the older ones?


6 minutes ago, renchy_is_sketchy said:

Do all newer GPUs have lower power consumption than older GPUs?

No, not even nearly. Performance per watt is higher, but manufacturers design their cards to target a similar range of wattages as always. Look at e.g. the RTX 2080 Ti -- it has a TDP of 250W.



Simply looking at power without comparing performance is a bit misleading, but in general I would expect performance per watt to go up over time, meaning you get more performance at the same power, or use less power for the same performance.
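
A minimal sketch of that comparison, assuming made-up frame rates and board power figures purely for illustration (none of these numbers are real benchmark results):

```python
# Toy performance-per-watt comparison; all fps and wattage figures are
# hypothetical placeholders, not measured results.

def perf_per_watt(avg_fps, board_power_w):
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

older_card = perf_per_watt(avg_fps=60, board_power_w=225)   # assumed RX 580-class numbers
newer_card = perf_per_watt(avg_fps=70, board_power_w=150)   # assumed RX 5600 XT-class numbers

print(f"Older card: {older_card:.2f} fps/W")
print(f"Newer card: {newer_card:.2f} fps/W")
```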



Top of the line cards have been around 250W for the last ten years or so. Power efficiency goes up over time, but AMD and Nvidia just use that to deliver higher performance at the high end, and at each step down in the hierarchy.

 

What you're doing is comparing a card higher up on the stack with a lower one. It would be a different story if you compared with an RX 570 or 560.


15 minutes ago, renchy_is_sketchy said:

Stupid question, but I just saw the confirmed specifications of the 5600 XT I was waiting to get, and it turns out that the estimated power consumption is 150W (as said here: https://wccftech.com/amd-radeon-rx-5600-xt-navi-gpu-specs-leak/), while my current RX 580's (https://www.sapphiretech.com/en/consumer/nitro-rx-580-4g-g5) power consumption is <235W?

Or are they just making the newer models far more efficient than the older ones?

There's no guarantee that ALL newer video cards will have lower power consumption.

As you go to a newer (smaller nm) manufacturing process, the transistors inside the chip (the building blocks that make up the cores inside the GPU) need less power to work and produce less heat, but the video card manufacturer may use MORE transistors for each core to increase the performance of those cores. If you took the same GPU chip (e.g. the RX 580's chip, made on 14nm) and remade it without any alterations on a smaller process (like the RX 5x00 series' 7nm process), you would absolutely get lower power consumption.

However, new designs often improve the cores, adding transistors and functional blocks, and the frequencies will be different, so the savings you get from the better manufacturing process may be lost to these improvements.
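
A back-of-the-envelope way to see that trade-off is the standard dynamic power relation P ≈ α·C·V²·f (switching activity, switched capacitance, voltage, frequency). The sketch below plugs in made-up values just to show how the lower voltage and capacitance from a shrink can be offset by more transistors and higher clocks; none of the numbers correspond to a real chip:

```python
# Dynamic power model: P = alpha * C * V^2 * f.
# All parameter values below are illustrative guesses, not real chip data.

def dynamic_power(alpha, capacitance_f, voltage_v, freq_hz):
    """Switching power in watts from activity factor, capacitance, voltage and clock."""
    return alpha * capacitance_f * voltage_v ** 2 * freq_hz

# Older, larger node: higher voltage and switched capacitance, lower clock.
older_node = dynamic_power(alpha=0.2, capacitance_f=5.0e-7, voltage_v=1.15, freq_hz=1.34e9)

# Newer node: lower voltage and per-transistor capacitance, but the design adds
# transistors and raises the clock, clawing back much of the saving.
newer_node = dynamic_power(alpha=0.2, capacitance_f=4.0e-7, voltage_v=1.00, freq_hz=1.70e9)

print(f"Older node: ~{older_node:.0f} W")   # ~177 W
print(f"Newer node: ~{newer_node:.0f} W")   # ~136 W
```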

This is done because moving to new manufacturing processes is expensive: just preparing the circuit layout and programming the machines that "print" the chip onto silicon can cost close to a million dollars at these small nodes (it can be a few hundred thousand for 28nm and above, which is why chipsets, SSD controllers and various other chips are still made on 28-65nm processes: less initial investment).

So it's not really worth spending months testing a new revision of the exact same chip, and spending close to a million dollars to shrink it onto a new process, just to make the chip maybe $1 cheaper; you'd rather increase performance or make a new product.

 

 

Power consumption also varies with frequency, the number of "cores" in the GPU chip, and the amount of memory and its specifications (e.g. a 256-bit bus vs a 128-bit bus means double the number of RAM chips and maybe 5-15W more power consumed by the memory).
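
To put that decomposition in concrete terms, here is a toy board-power budget; the per-component figures are guesses for illustration only, not measurements of any actual card:

```python
# Rough, made-up breakdown of where a mid-range card's board power might go.
components_w = {
    "GPU core (shaders, clocks, voltage)": 110,
    "memory (8 GDDR chips on a 256-bit bus)": 25,   # a 128-bit, 4-chip card might use ~10-15W less
    "VRM losses, fans, misc": 20,
}

for part, watts in components_w.items():
    print(f"{part}: {watts} W")
print(f"Estimated board power: {sum(components_w.values())} W")
```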


Power consumption increased over time up until about 2010, when it basically flatlined because the PCIe standard only allows so much power to be drawn, and that hasn't been updated. Not that it needs to be anyway.

 

But achieving the same results as yesteryear takes less power.


3 hours ago, Mira Yurizaki said:

Power consumption increased over time up until about 2010, when it basically flatlined because the PCIe standard only allows so much power to be drawn, and that hasn't been updated. Not that it needs to be anyway.

 

But achieving the same results as yesteryear takes less power.

The PCIe standard doesn't limit power consumption. You just add more power connectors as needed.

 

The limit is simply being able to cool a reasonably sized graphics card without ridiculous noise.


19 minutes ago, Sakkura said:

The PCIe standard doesn't limit power consumption. You just add more power connectors as needed.

 

The limit is simply being able to cool a reasonably sized graphics card without ridiculous noise.

While technically you could do that, the spec officially only allows for up to 300 watts of power. If you go beyond that, you can't brand your device with the official PCIe logo. Though looking into this more, PCIe 4.0 bumps up how much the external cables can deliver.
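
For reference, a minimal sketch of how that in-spec budget adds up, using the commonly cited PCIe CEM limits of 75W from the slot, 75W per 6-pin connector and 150W per 8-pin connector:

```python
# In-spec board power budget from the slot plus auxiliary power connectors.
SLOT_W = 75                                # x16 slot limit
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}  # commonly cited CEM connector limits

def board_power_budget(connectors):
    """Maximum in-spec power for a card with the given auxiliary connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["8-pin"]))            # 225 W
print(board_power_budget(["6-pin", "8-pin"]))   # 300 W, the usual ceiling
```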

