Majestic

Pascal GPU Boost 3.0 topic, something every Pascal owner should look at.

Recommended Posts

Posted · Original Poster

!!Make sure to check out the feedback comment below this for up-to-date acknowledgements and changes!!

 

Pre-amble and introduction

 

Hello everyone, I decided to make another topic to put in my signature and possibly help people with. This topic will be especially helpful for people with Founders Edition Pascal cards, or people having thermal issues in SFF cases. It is also an absolute must for people building a crypto-mining machine, as this will severely lower power consumption. Otherwise, it will just be interesting for people like me who like to tinker with tech. The subject this time is GPU Boost 3.0.

 

You can also limit the card's power usage by simply lowering the power target, but this won't actually make the card any more efficient, and it will invariably hurt performance, as the card will throttle more often to stay within the limit you set. If you really want to get the most out of your hardware without sacrificing performance, you have to work with Boost 3.0 and undervolt.
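As a rough illustration of the difference (my own back-of-the-envelope model, not an exact GPU power formula, and the voltages below are made up): dynamic power scales roughly with frequency times voltage squared, so an undervolt cuts power at the same clock, while a lower power target mostly trades clocks for watts.

```python
# Rough illustration of why undervolting helps: dynamic power scales
# roughly with frequency * voltage^2 (a simplification, not an exact model).

def relative_power(freq_mhz, voltage_v, ref_freq=1924, ref_volt=1.031):
    """Power relative to a reference operating point (1.0 == reference)."""
    return (freq_mhz / ref_freq) * (voltage_v / ref_volt) ** 2

# Stock: 1924 MHz @ 1.031 V (the numbers used later in this guide)
stock = relative_power(1924, 1.031)
# Undervolt: same clock at a hypothetical 0.900 V
undervolted = relative_power(1924, 0.900)
# Power-target cut: clock throttled to ~1700 MHz, voltage on the stock curve (~0.950 V)
throttled = relative_power(1700, 0.950)

print(f"stock: {stock:.2f}, undervolt: {undervolted:.2f}, power-limit: {throttled:.2f}")
```

Note how the power-limited case burns about as much power as the undervolt while running over 200 MHz slower.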

 

Other people have already written down just how Boost 3.0 works, and you can read it here:

http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15

 

A TL;DR would be that the GPU takes a couple of power-related parameters into consideration and boosts to the maximum allowable core clock whilst staying within the TDP and stability regions.
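As a toy sketch of that behaviour (my own model with made-up curve values, not NVIDIA's actual algorithm): walk up the voltage/frequency curve and boost to the highest point whose estimated power still fits under the TDP.

```python
# Toy sketch of the Boost 3.0 idea (not NVIDIA's real algorithm): pick the
# highest V/F point on the curve whose estimated power fits within the TDP.

# (voltage in V, clock in MHz) pairs, low to high - illustrative values only
VF_CURVE = [(0.800, 1607), (0.850, 1709), (0.900, 1809),
            (0.950, 1886), (1.000, 1911), (1.050, 1962)]

def boost_clock(tdp_watts, base_watts=180.0, ref=(1.050, 1962)):
    """Return the highest clock whose f*V^2-scaled power fits under tdp_watts."""
    ref_v, ref_f = ref
    best = VF_CURVE[0][1]
    for volt, freq in VF_CURVE:
        est_power = base_watts * (freq / ref_f) * (volt / ref_v) ** 2
        if est_power <= tdp_watts:
            best = freq
    return best

print(boost_clock(180))   # full TDP -> top of the curve
print(boost_clock(120))   # constrained TDP -> a lower point on the curve
```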

 

What do you need, and how does it work:

 

This topic will only be about undervolting with MSI Afterburner's frequency/voltage curve and how to potentially reduce your power consumption, to get lower noise and heat output, higher boost clocks, or both.

I will be using my own EVGA GTX 1070 FTW for this test, along with MSI Afterburner 4.3.0; get it here:

https://www.msi.com/page/afterburner

 

I've also selected the standard skin (v3) from the "user interface" options menu. Once you have MSI Afterburner installed, there are a few things you must do before starting. On the default skin, press the cogwheel to get into the settings. In the "General" tab, tick both "Unlock voltage control" and "Unlock voltage monitoring". If you want the same skin as me, go to the "User Interface" tab and select "Default MSI v3". In the "Monitoring" tab, select "Power", "GPU temperature", "Core clock" and "GPU voltage"; for each of those you can also tick "Show in On-Screen Display" down at the bottom. At the top, also make sure to set the polling rate to something like 100-200 ms, as that makes the dynamic behaviour easier to follow.

 

You can then use your favourite stability/benchmarking utility, but I suggest Unigine Heaven, as I've always found it both easy to launch and good at rooting out instabilities. Get it here:

https://benchmark.unigine.com/heaven

 

The big issue with GPU Boost 3.0 is that you don't have full control over it. The GPU for the most part has its own idea of what to boost to, and you have to compensate for its behaviour. For example, once you have found your bottom voltage during stress testing, you save the profile and let the card cool. The thermal window for Boost 3.0 has suddenly expanded, so the card will start at a higher boost clock than you were originally testing with and can potentially crash. Once it heats up again, it will return to the boost clock you did the stability test with. A safe bet is to look for the bottom voltage and add a few millivolts to compensate for the higher starting boost.

 

Undervolting (novice/adepts can start here)

 

Start by booting up Heaven; run it windowed, but at extreme settings. Alt-tab and put your MSI Afterburner window in front of it, and to avoid confusion, hit the "reset" button. Now let the card run for a good 10-15 minutes until it levels out in temperature. What we are interested in at this stage are the core temperature, the clock speed and the voltage. If you try to set a curve while the card is cooler, GPU Boost will auto-adjust the graph to hit higher clocks than you've set, and when the card heats back up, it will sit below where you want it. This goes a lot more smoothly whilst the GPU is being stressed and GPU Boost stops interfering.

 

Now, with MSI Afterburner in front, press CTRL+F to pull up the Boost 3.0 frequency/voltage curve; it will look something like this:

 

Spoiler

equilibrium.thumb.png.9dc397e7191555d1da8989b096718f46.png

So in my case, my 1070 FTW levels out at 69-70 degrees, at 1924 MHz and 1.031 V. These will now be your base numbers to work from. Now look on the graph for where this GPU Boost 3.0 operating point is; it should be in the middle of the cross. Set the white node above it to match these numbers, like so:

 

Spoiler

set_to_equilibrium.thumb.png.3eb7b1c725f02c9d29ce6236feb5ccba.png

So 1924 on the frequency (y-axis) and 1031 mV on the voltage (x-axis); you can use the up/down keys to fine-tune. Now comes the fiddly bit. You need to lower all the nodes to the right of this node to a frequency below or equal to it (so all at ≤1924). Afterburner will auto-adjust everything after you hit apply, so you don't need to set everything to exactly 1924, just make sure it's no higher. Hit apply after you've done so. If it looks like this, you're done:

 

EDIT: Thanks to @Darkseth for the keybindings:

Holding SHIFT moves the entire curve up or down, and holding CTRL tilts the curve in a direction. That makes it a lot easier to achieve the goal stated above. Thanks for that.

!!EDIT#2: For some reason those SHIFT and CTRL commands totally screw up stability on my EVGA unit. If your card keeps crashing even at conservative levels, do not use them! I think SHIFT offsets the card's low-end behaviour as well, making it unstable even at desktop usage; it's mostly SHIFT that causes the instability.

!!EDIT#3: On my GTX 1080 FTW this is no longer the case.

 

Spoiler

set_to_equilibrium_Level_out.thumb.png.8eb18b7c00b7d2a100f26f5e9c304919.png

 

If it doesn't, or if the first node has shifted away, lower everything to the left a bit as shown, then readjust the 1031 mV / 1924 MHz node to snap it into place. If the nodes are too close together, they can swap between predetermined slots, so lowering the left nodes makes it easier for the software to tell what you're doing. It should now still read the correct values whilst running Heaven.
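In list form, the "lower everything to the right" edit from the previous step amounts to clamping every node at or above your operating voltage; a sketch with hypothetical node values:

```python
# Sketch of the "flatten everything to the right" edit: every node at a
# voltage at or above the chosen operating point gets clamped to the target
# clock. Node values are illustrative, not read from any real card.

def clamp_right(curve, op_voltage, op_clock):
    """Clamp all nodes at or above op_voltage so none exceeds op_clock."""
    return [(v, min(f, op_clock)) if v >= op_voltage else (v, f)
            for v, f in curve]

curve = [(0.850, 1750), (0.950, 1860), (1.031, 1924), (1.062, 1949), (1.093, 1974)]
flattened = clamp_right(curve, 1.031, 1924)
print(flattened)
# the nodes at 1.062 V and 1.093 V are now capped at 1924 MHz
```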

 

Now all you need to do is level the nodes (dots) to the left of your operating voltage, one at a time. So raise the node to the left of (in my case) 1031 mV up to the same frequency. Each time you do so, you readjust a power state to a lower voltage (as you'll be moving left on the x-axis at the same frequency), and you should see your voltage drop in MSI Afterburner. Keep moving nodes up to 1924 until it fails, then add about 20-25 mV (so if it fails at 0.850 V, set it to 0.875 V or the nearest available node). Once you've found your voltage, give it a good stress test to make sure it's really stable. Save the profile by hitting "save" and clicking one of the 5 buttons. For added security, look at what your power target is doing once it has levelled out and set the maximum power target to about 10% above that. So if it's reading 75%, set it to 85%. That way, when the card is cold and you start up a 3D application, it won't try to run 2000 MHz and crash on the low voltage.
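The "add a margin and snap to the next node" step can be written out as a small helper. The ~6.25 mV node spacing is an assumption on my part; substitute whatever spacing your curve editor actually shows:

```python
# Sketch of the safety-margin step: take the lowest voltage that crashed,
# add ~25 mV of headroom, and round up to the next available curve node.
# The 6.25 mV node spacing is an assumption, not a documented value.
import math

def undervolt_target(fail_voltage_mv, margin_mv=25, step_mv=6.25):
    """Round fail voltage + safety margin up to the next curve-node step."""
    steps = math.ceil((fail_voltage_mv + margin_mv) / step_mv)
    return steps * step_mv

print(undervolt_target(850))  # crashed at 850 mV -> settle for 875.0 mV
```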

 

Now turn off the test, let the GPU settle to its idle temperature and restart the benchmark. It will now boost higher, and the card needs to stay stable at the selected voltage. If it fails, reset the card, let it heat back up to its equilibrium point, load the stable profile again and move one node down to raise the voltage a bit. (Doing this step at idle will mess up the graph again.)

 

Undervolting and yet Overclocking?

 

And now you should be done. You can also combine this with an overclock. To do so, reset the card and let it heat up again, set the +core clock slider to where you want your core clock to be, and start over from the beginning. It can look something like this:

 

Spoiler

undervolting.thumb.png.58decdba720713d0a68427e626433a1f.png

As you can see, the power % takes a significant hit, meaning your card is running much more efficiently (so long as the core clocks don't drop and you've done everything correctly). Note that the result can be significant: the core temperature can drop enough for the card to boost to higher clock speeds. This is why you also need to compensate a bit and not be too greedy on the voltage. Look for a nice sweet spot between stability and power saving.

 

Compensating for the performance loss

 

Thanks to @Darkseth for pointing this out. After some investigation, it appears that using aggressive deltas between the power states causes a performance delta as well. It can be anywhere up to 3-5%. Nothing major, but it can be avoided to a certain extent. Once you have your curve like in the last section above, you may want to make it less abrupt and adjust the power states left of your cut-off point into a more ramp-like function. This may require a bit of fine-tuning to figure out what works best. Fire up Heaven again and press "camera" to set it to "free". Pick a point of view without many moving objects to keep the framerate as steady as possible and start sculpting the curve ( ͡° ͜ʖ ͡°). To give you an idea of what to aim for, here are a few examples. Make the curve as smooth as possible up to the point where you level it out. Just make sure to leave the first node alone.

 

Spoiler

594118889c1ee_Adjustingforpowerstates.thumb.png.45c930e75792e0f66f6b9c6131953025.png
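The ramp-shaping above can be sketched as a linear interpolation from the first node up to the cut-off point, flat afterwards (node values hypothetical):

```python
# Sketch of the "ramp instead of cliff" adjustment: linearly interpolate
# clocks between the first node and the cut-off node, then hold flat.
# Voltages/clocks are illustrative, not from a real card.

def ramp_curve(voltages, v_start, f_start, v_cut, f_cut):
    """Linear ramp from (v_start, f_start) to (v_cut, f_cut); flat after v_cut."""
    curve = []
    for v in voltages:
        if v <= v_start:
            f = f_start
        elif v < v_cut:
            f = f_start + (f_cut - f_start) * (v - v_start) / (v_cut - v_start)
        else:
            f = f_cut
        curve.append((v, round(f)))
    return curve

volts = [0.700, 0.750, 0.800, 0.850, 0.900, 0.950]
print(ramp_curve(volts, 0.700, 1500, 0.900, 1924))
```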


Troubleshooting:

- Performance loss? Check above or the first comment.

 

Interested in your results, would appreciate if you left yours below.

Posted · Original Poster

Reserved.

 

Feedback

11 hours ago, Darkseth said:

Yo man^^ There's a German forum about this topic: https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11113302#post11113302

He found out that if you leave the curve untouched but ONLY pull up one dot, you get less performance than if you had pulled the whole curve up.

Can confirm it apparently has a very slight impact on performance. The Heaven benchmark score went from 95 to 91.3 at the same clock speeds.

That said, you can easily compensate by settling for a higher clock speed and still get a lower power envelope.

 

Using the CTRL and SHIFT commands makes my card entirely unstable.

 

EDIT: So I've been investigating this, and it appears the deltas between the power states cause the performance drop, as you said. When staring at a specific point in Heaven to get a steady fps, the default profile gives me 75 fps, a steep delta gives me 71 fps, but an edited curve gives me 74 fps. Maybe with a bit of fine-tuning I can remove the delta performance drop entirely. Here are a few examples:

 

 

Spoiler

59411731d0cee_Adjustingforpowerstates.thumb.png.a90b90160c39db5c55c9fc7a4e06041d.png

This did not have the effect of raising the power % again (not significantly), so it's probably just something that upsets GPU Boost 3.0 when too-aggressive deltas are used. I guess this simply requires a lot of fine-tuning.

_________________________________________________________________________________________________________________

 

Tagging:

@i_build_nanosuits @TheRandomness @Billy_Mays @AlwaysFSX

44 minutes ago, Majestic said:

Now whilst having MSI Afterburner in front, press CTRL+F to pull up the boost 3.0 boost table

WOW! :o

Honestly...you're a GENIUS...all the rest I would have figured out...but MAN, you have no idea how much time I spent editing stupid BIOS files for my 980 Ti to edit the voltage and frequency steps of GPU Boost...all I had to do was hit CTRL+F in Afterburner...AMAZING...why is this hidden like that!? How did you find it? Did you happen to hit both keys by mistake and it popped open?!

 


| CPU: Core i7-8700K @ 5.0ghz - 1.3v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI GTX 1080Ti Gaming X Trio 2ghz OC  RAM: 16GB T-Force Delta RGB 3000mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Rift S

 

Read: My opinions on VR in it's current state, should YOU buy into it?

1 minute ago, Majestic said:

Saw it on some video.

that's...INSANE...thanks for that, man! ...I had absolutely no idea.



Posted · Original Poster

Just tried it on some games. I was playing some Rise of the Tomb Raider; the stock profile was 1924 at 70 degrees and 75-80% TDP. Switched to the undervolted profile and it went down to 62 degrees, 50% TDP, and gained a boost level (1936). The system is making significantly less noise in-game now, as the case fans are spun up according to the GPU (as it uses the most power). Maybe I have to set them back to the CPU, as that uses more power now xD

2 hours ago, i_build_nanosuits said:

WOW! :o

Honestly...you're a GENIUS...all the rest I would have figured out...but MAN, you have no idea how much time I spent editing stupid BIOS files for my 980 Ti to edit the voltage and frequency steps of GPU Boost...all I had to do was hit CTRL+F in Afterburner...AMAZING...why is this hidden like that!? How did you find it? Did you happen to hit both keys by mistake and it popped open?!

 

I don't think the voltage/frequency curve works for Maxwell.  It was new with boost 3.0 and Pascal.


CPU: Ryzen 1600X @ 4.15ghz  MB: ASUS Crosshair VI Mem: 32GB GSkill TridenZ 3200
GPU: 1080 FTW PSU: EVGA SuperNova 1000P2 / EVGA SuperNova 750P2  SSD: 512GB Samsung 950 PRO
HD: 2 x 1TB WD Black in RAID 0  Cooling: Custom cooling loop on CPU and GPU  OS: Windows 10

 

Posted · Original Poster
Just now, Vellinious said:

I don't think the voltage/frequency curve works for Maxwell.  It was new with boost 3.0 and Pascal.

I missed that he said 980 Ti; you're right. It's definitely Boost 3.0 cards only.


Also, something to consider: the curve tends to work better the closer to stock it remains. Using a really aggressive, steep curve can lock in a voltage, but how well it runs will depend on the temps the core is running at.

For instance:
This curve, with a really aggressive, steep shape, works "ok":

Og1PpjK.png


But this curve, closer to the stock shape, works better:

cHLlI5C.png

Just something to consider.  I spent months fine tuning the curves for my 1080 FTWs for HWBot runs.



 

Posted · Original Poster
2 minutes ago, Vellinious said:

Also, something to consider: the curve tends to work better the closer to stock it remains. Using a really aggressive, steep curve can lock in a voltage, but how well it runs will depend on the temps the core is running at.

Well mine follows the stock curve, up until the undervolt. Since the voltage is so low, it looks aggressive by default.

curve.png.e8f0904a8f08c0868df4cb2a1326cd21.png

 

1 minute ago, Majestic said:

Well mine follows the stock curve, up until the undervolt. Since the voltage is so low, it looks aggressive by default.

curve.png.e8f0904a8f08c0868df4cb2a1326cd21.png

 

Different objectives I guess.  You're trying to make it run at a static clock, and give it as many options as possible in terms of voltage, to switch to, depending on core temp.  It's a good idea.

I was pushing the other direction.  lol



 

Posted · Original Poster
1 minute ago, Vellinious said:

Different objectives I guess.  You're trying to make it run at a static clock, and give it as many options as possible in terms of voltage, to switch to, depending on core temp.  It's a good idea.

I was pushing the other direction.  lol

To me, the extra 100 MHz isn't really worth the massive increase in power consumption. It would use double the power or more for an additional ~6% performance.

But you're right, that's the nice thing about this, you can push it either way. Thanks for the feedback anyway :)


There's more there than just 100 MHz if it's running cool enough and you're hunting for it. Of course, I'm not looking for efficiency. /shrug

JAK1te3.png

 



 


I'm curious how much control I'll have on my mobile GTX 1060. Time to install Heaven and give it a go.


hating popular things as a personality trait is infinitely more cringe than liking things unapologetically

Just now, Suika said:

I'm curious how much control I'll have on my mobile GTX 1060. Time to install Heaven and give it a go.

Hadn't thought about mobile cards (forgot they existed basically) but you could probably really control thermals. :o



Posted · Original Poster
10 minutes ago, AlwaysFSX said:

Hadn't thought about mobile cards (forgot they existed basically) but you could probably really control thermals. :o

It would probably be one of the best-case scenarios to use it on. Laptops are incredibly thermally constrained.

5 minutes ago, AlwaysFSX said:

Hadn't thought about mobile cards (forgot they existed basically) but you could probably really control thermals. :o

I'm hoping so. MSI Afterburner lets me fiddle with GPU boost so I'm assuming the functionality isn't limited like voltage control is, but it'll take me some time to report back on current thermal performance, tuning GPU boost, and then measuring the tuned thermals.



1 minute ago, Majestic said:

It would probably be one of the best-case scenarios to use it on. Laptops are incredibly thermally constrained.

You wouldn't happen to have a Pascal laptop, would you? :P

1 minute ago, Suika said:

I'm hoping so. MSI Afterburner lets me fiddle with GPU boost so I'm assuming the functionality isn't limited like voltage control is, but it'll take me some time to report back on current thermal performance, tuning GPU boost, and then measuring the tuned thermals.

Will be interesting to see, I'm sure it'll take time though with how long setting up controls takes.




Yo man^^ There's a German forum about this topic: https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11113302#post11113302

He found out that if you leave the curve untouched but ONLY pull up one dot, you get less performance than if you had pulled the whole curve up.


See here: https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11113697#post11113697

 

(Use Google Translate; it should be clear. The first picture is the wrong curve, the bottom one is the right one.)

 

So having such a jump apparently reduces performance, even if the boost clock is held perfectly fine.

But if you pull the whole curve up with CTRL + mouse, and then pull the dots to the right of your voltage/clock dot down so it never goes higher, the performance loss is minimal.

Multiple users have had this experience.

 

Can you maybe check this out for yourself and see if you come to the same conclusion?

Because if so, I think this is worth mentioning, since there is some strange stuff going on with Boost 3.0.

 

 

 

2nd point: also, some people noticed that if you set a certain clock speed via the curve, you end up with a few % less performance than if you use ONLY the offset.

One user in the Hardwareluxx forum (don't ask for the post, I can't find it right now... xD I will provide the link when I find it; the forum seems to be down right now) found a method where this performance loss is minimal:

1. Use the offset so that your desired clock speed matches the voltage (for example, you want 1850 MHz at 0.85 V).

2. Go into the curve and pull every dot to the right of it DOWN.

 

But I am not very sure about this one. At least the above should be worth mentioning.

1 hour ago, Majestic said:

Will check tomorrow, thanks.

to add to this thread:

- Hold SHIFT and click on a dot to move the entire curve up or down.

- Hold CTRL and click on a dot further to the left or right of the curve to tilt the curve up or down.

 



