Undervolting 3080 to reduce TDP

varrys

I'm new to undervolting, never done it before, but the Aorus Xtreme is pretty power hungry with a 450 W maximum, so I wanted to reduce the load a little because I have a 750 W power supply. Also, I'm in FL, and the GPU heats up my office quite a bit while gaming! Here are my first attempts. I still need to test long-term with gaming, but it seems like 937 mV, 2000 MHz is a win for temps and TDP? Any other suggestions? I suppose I should run these tests while also monitoring the max watts via HWMonitor?
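If it helps anyone following the same process, here is a rough Python sketch of how the peak power and temperature could be logged during a benchmark run with nvidia-smi instead of HWMonitor (assuming a single GPU; the query fields are standard nvidia-smi ones):

# Rough sketch: poll nvidia-smi once per second during a benchmark run and
# report the peak power draw and temperature seen. Assumes a single GPU and
# that nvidia-smi is on PATH.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,temperature.gpu",
         "--format=csv,noheader,nounits"]

max_power_w = 0.0
max_temp_c = 0
try:
    while True:
        out = subprocess.check_output(QUERY, text=True).strip()
        power, temp = out.split(", ")
        max_power_w = max(max_power_w, float(power))
        max_temp_c = max(max_temp_c, int(temp))
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak power: {max_power_w:.1f} W, peak temp: {max_temp_c} °C")

Stop it with Ctrl+C once the benchmark finishes and it prints the peaks it saw.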

 

EDIT: Well, I played about 30 minutes of Wonderlands, and it crashed at the 937 mV, 2000 MHz settings. Should I go up to 950 mV, 2025 MHz next? I don't see that lowering temps and TDP very much, though.

 

Factory Settings, Run 1

Highest temp: 72 °C

Clock speed: 1980-2025 MHz (often 1990-2000)

Score: 4726

Min FPS: 51.3

Max FPS: 394.8

 

Factory Settings, Run 2

Highest temp: 72 °C

Clock speed: 1980-2025 MHz (often 1990-2000)

Score: 4711

Min FPS: 60.6

Max FPS: 391.7

 

950 mV, 2050 MHz, -200 MHz core clock offset to set the curve

Highest temp: 71 °C (often 69-70)

Clock speed: stayed at 2025 MHz nearly the whole time

Score: 4758

Min FPS: 60.0

Max FPS: 400.5

Performance Change: +0.6% from best factory score

Temperature difference: -1 °C

 

925 mV, 2025 MHz, -200 MHz core clock offset – Crashed

 

937 mV, 2000 MHz, -250 MHz core clock offset to set the curve

Highest temp: 67 °C

Clock speed: 1995-2010 MHz (mostly 1995)

Score: 4717

Min FPS: 49.0

Max FPS: 391.1

Performance Change: -0.2% from best factory score

Temperature difference: -5 °C
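For clarity, the Performance Change lines are just the score delta against the best factory run; a quick illustrative calc (Python only because it's handy):

# Percent change of each undervolted Heaven score vs. the best factory score above.
best_factory = 4726  # best of the two factory runs

def pct_change(score, baseline=best_factory):
    return (score - baseline) / baseline * 100

print(f"950 mV / 2050 MHz: {pct_change(4758):+.2f}%")
print(f"937 mV / 2000 MHz: {pct_change(4717):+.2f}%")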



That's pretty much how I did it with my 1070, but I haven't worked with a more recent card. I used HWiNFO to monitor temperatures and Heaven to stress the GPU.
What stress test tool did you use?


12 minutes ago, Devryd said:

That's pretty much how I did it with my 1070, but I haven't worked with a more recent card. I used HWiNFO to monitor temperatures and Heaven to stress the GPU.
What stress test tool did you use?

Used Unigine Heaven to run my tests. Ran it twice at factory settings to warm the card up, then started tinkering.



If you want to reduce power consumption, go to 1850-1950 MHz and drop the voltage further. You can probably get a pretty significant power reduction while losing little performance; if you're lucky, 100 W or so less with performance similar to stock, if the card manages to be stable at 1900 MHz+ at 850 mV.
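As a very rough back-of-envelope only (the stock voltage and the core-power split below are assumed placeholder numbers, not readings from any specific card): core power scales roughly with voltage squared times clock, so you can ballpark the savings before testing:

# Rough estimate only: GPU core power scales roughly with V^2 * f.
# The stock voltage and core-power figures are assumed placeholders.
stock_v, stock_mhz = 1.05, 2000
core_w_at_stock = 300            # assume ~300 W of the board power is core power

uv_v, uv_mhz = 0.850, 1900
scale = (uv_v / stock_v) ** 2 * (uv_mhz / stock_mhz)
est_core_w = core_w_at_stock * scale
print(f"Estimated core power: {est_core_w:.0f} W, "
      f"~{core_w_at_stock - est_core_w:.0f} W saved (memory and VRM losses extra)")

That lands in the same ~100 W ballpark, but the real number depends entirely on your chip.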

 

This video can probably give you some idea of the power consumption you can expect at a given voltage and clock. Naturally, due to unit-to-unit variation, you will still need to try it yourself.

 

Just now, varrys said:

Used Unigine Heaven to run my tests. Ran it twice at factory settings to warm the card up, then started tinkering.

I recommend that you run something heavier: Unigine Superposition, Time Spy, the Pure RayTracing benchmark, or games like SoTR that have good built-in benchmarks, so you can properly load the card and test performance. At 1440p, with Shaders and Textures set to High.


20 minutes ago, KaitouX said:

If you want to reduce power consumption, go to 1850-1950 MHz and drop the voltage further. You can probably get a pretty significant power reduction while losing little performance; if you're lucky, 100 W or so less with performance similar to stock, if the card manages to be stable at 1900 MHz+ at 850 mV.

 

This video can probably give you some idea of the power consumption you can expect at a given voltage and clock. Naturally, due to unit-to-unit variation, you will still need to try it yourself.

 

I recommend that you run something heavier: Unigine Superposition, Time Spy, the Pure RayTracing benchmark, or games like SoTR that have good built-in benchmarks, so you can properly load the card and test performance. At 1440p, with Shaders and Textures set to High.

Thanks for the advice. I'll run Superposition starting at 900 mV, 1900 MHz and see what happens.



What also helped in my case was increasing the memory speed. I could drop the core clock a little and make up for the loss with a higher memory frequency.


38 minutes ago, KaitouX said:

If you want to reduce power consumption, go to 1850-1950 MHz and drop the voltage further. You can probably get a pretty significant power reduction while losing little performance; if you're lucky, 100 W or so less with performance similar to stock, if the card manages to be stable at 1900 MHz+ at 850 mV.

 

This video can probably give you some idea of the power consumption you can expect at a given voltage and clock. Naturally, due to unit-to-unit variation, you will still need to try it yourself.

 

I recommend that you run something heavier: Unigine Superposition, Time Spy, the Pure RayTracing benchmark, or games like SoTR that have good built-in benchmarks, so you can properly load the card and test performance. At 1440p, with Shaders and Textures set to High.

Ran Superposition and these are the first results. Promising, if it holds up in gaming!

 

Superposition Factory Settings, Shaders/Textures High, Windowed

Highest temp: 70 °C

Score: 18672

Min FPS: 86.30

Max FPS: 157.11

Avg FPS: 139.66

HWMonitor TDP Max: 353.65 W (saw 372 W in MSI Afterburner monitoring)

 

900 mV, 1900 MHz, -250 MHz core clock offset to set the curve

Highest temp: 62 °C

Score: 18647

Min FPS: 104.25

Max FPS: 144.97

Avg FPS: 139.47

HWMonitor TDP Max: 273.99 W (saw 288 W in MSI Afterburner monitoring)

Performance Change: -0.24% Score, -0.24% FPS

Temperature difference: -8 °C
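Looked at as efficiency (using the HWMonitor peak power, so it's only a rough comparison), the two Superposition runs above work out to roughly:

# Points-per-watt comparison of the two Superposition runs above,
# using the HWMonitor peak power as a rough stand-in for average power.
stock_score, stock_w = 18672, 353.65
uv_score, uv_w = 18647, 273.99

stock_eff = stock_score / stock_w
uv_eff = uv_score / uv_w
gain_pct = (uv_eff / stock_eff - 1) * 100
print(f"Stock: {stock_eff:.1f} pts/W, undervolted: {uv_eff:.1f} pts/W (+{gain_pct:.0f}%)")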



Welp, I played Wonderlands for an hour at the new settings, and no crash. It ran at a max temp of 68 °C instead of the 74 °C I was sometimes getting, and HWMonitor showed a top TDP of 312 W instead of the 370 W I had seen before. That's roughly an 8% drop in temperature and a 16% drop in TDP for no real noticeable difference while playing the game.
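Since the office heat was half the reason for doing this, here's a rough sketch of what that delta means per session (treating the peak difference as if it were the average draw difference; the session length is just an example):

# Rough sketch of heat kept out of the room per gaming session.
# Treats the peak-TDP delta above as if it were the average draw difference.
watts_saved = 370 - 312     # before vs. after, from the gaming run above
hours = 2                   # hypothetical session length
kwh_saved = watts_saved * hours / 1000
print(f"~{watts_saved} W less heat into the room, about {kwh_saved:.2f} kWh less over {hours} h")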

 

Thanks for the advice!


