
11 °C difference between FurMark and Fortnite

aren332

I have an EVGA RTX 2060 KO Ultra.

The GPU is deshrouded with two Arctic P12 fans.

It is overclocked +100 MHz on the core and +800 MHz on the memory in MSI Afterburner.

The card's TDP is 170 W (not 160 W like the regular 2060, for some reason), and the power limit slider is locked.

 

Running FurMark (with Extreme Burn-in enabled in the settings) for about 15 minutes, temps average around 75 °C and power consumption is about 145 W (according to MSI Afterburner).

Playing Fortnite (medium settings), temps average around 86 °C and often reach 87 °C. Power consumption is about 165 W, with a maximum of 169.3 W.

Inspecting an item in CS2, temps go up to 78 °C at about 150 W.

 

Why are the FurMark temps and power consumption so low? Isn't it supposed to be the most intensive GPU stress test?
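(For reference, the numbers above are read off the MSI Afterburner overlay. As a minimal sketch, assuming Python and the NVIDIA driver's nvidia-smi are available and using an example file name, something like this could log temperature, power and clocks to a CSV so the runs can be compared side by side:)

```python
# Minimal logging sketch: sample GPU temperature, power draw and clocks once
# per second via nvidia-smi and write the CSV rows to a file. Stop with Ctrl+C.
import subprocess

LOGFILE = "gpu_log.csv"  # example output path

cmd = [
    "nvidia-smi",
    "--query-gpu=timestamp,temperature.gpu,power.draw,clocks.sm,clocks.mem",
    "--format=csv",
    "-l", "1",  # repeat the query every 1 second
]

with open(LOGFILE, "w") as f:
    subprocess.run(cmd, stdout=f)
```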


Real games can tax parts of the GPU that FurMark doesn't, namely heavy, dynamic use of video memory. FurMark on its own also doesn't stress the CPU much, and a CPU exhausting heat into the chassis can really impact GPU thermals.


3 minutes ago, 8tg said:

a CPU exhausting heat into the chassis can really impact GPU thermals.

Running Prime95 and FurMark at the same time, GPU temps are around 76 °C and CPU temps are at 90 °C (that's normal since it's overclocked). I have a front-mounted AIO.


39 minutes ago, aren332 said:

Isn't it supposed to be the most intensive GPU stress test?

Nope, it's a bullshit placebo.



33 minutes ago, aren332 said:

FurMark

NVIDIA and AMD throttle as soon as they detect it; it's built into the VBIOS.

 

Just stop using this meaningless "benchmark"; it's only used by snake-oil sellers to gaslight their unassuming victims.


Another issue is that FurMark is a steady load that lets your PC and cooling settle in. Playing a game can make the demand jump all over the place, and a bad fan curve can get caught with its pants down when it's constantly jumping up and down trying to keep up.
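To picture that, here's a toy sketch (all numbers made up, not a real GPU thermal model): a simple thermal mass with a fan that can only ramp a few percent per second. A steady 145 W "FurMark-style" load settles at one temperature, while a load bouncing between 80 W and 170 W averages less power but peaks a bit hotter, because the fan is always a step behind:

```python
# Toy illustration only: invented thermal mass, fan curve and ramp rate.
def simulate(load_at, seconds=600, dt=1.0):
    temp, fan, peak = 35.0, 0.3, 35.0                  # °C, fan fraction, peak °C
    for t in range(int(seconds / dt)):
        target = min(1.0, max(0.3, (temp - 40) / 40))  # simple linear fan curve
        fan += max(-0.02, min(0.02, target - fan))     # fan can only ramp ~2 %/s
        cooling = fan * 4.0 * (temp - 25)              # watts removed by the cooler
        temp += (load_at(t) - cooling) * dt / 60.0     # 60 J/°C thermal mass
        peak = max(peak, temp)
    return round(temp, 1), round(peak, 1)

steady = lambda t: 145.0                                   # FurMark-style constant load
bursty = lambda t: 170.0 if (t // 20) % 2 == 0 else 80.0   # game-style bouncing load
print("steady (final, peak):", simulate(steady))
print("bursty (final, peak):", simulate(bursty))
```

The bouncing load ends up with a higher peak temperature even though its average power is lower, which is the "caught with its pants down" effect.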



FurMark is classed as a power virus on GPUs. When it's detected, the GPU enters a protective state so it can't be harmed, because in the past FurMark was able to wreck some GPUs.


5 hours ago, Mark Kaine said:

NVIDIA and AMD throttle as soon as they detect it; it's built into the VBIOS.

No, it's not that they throttle because they see that it's FurMark. Rather, FurMark is so demanding that the card runs into its power limit well before it can reach its clock limit, because the load draws so many amps, and cards throttle because of that. I think in the very early days cards didn't have enforced power limits yet, which is why you still see the warning message every time you start a burn-in test.
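As a toy sketch of that idea (made-up numbers, and not NVIDIA's actual GPU Boost algorithm): the card keeps stepping its clocks up until it hits either the clock ceiling or the power limit, and a FurMark-style load simply costs more watts per MHz, so the power limit is what stops it first:

```python
# Simplified illustration: boost until either the clock limit or the power
# limit (the 170 W TDP from the first post) is reached, whichever comes first.
POWER_LIMIT_W = 170.0
CLOCK_LIMIT_MHZ = 1995.0   # example boost ceiling, made up for illustration

def settled_clock(watts_per_mhz):
    clock = 300.0                                      # start from idle clocks
    while clock < CLOCK_LIMIT_MHZ:
        if (clock + 15.0) * watts_per_mhz > POWER_LIMIT_W:
            break                                      # next bin would exceed the TDP
        clock += 15.0                                  # step up one 15 MHz boost bin
    return clock, clock * watts_per_mhz

# The watts-per-MHz figures are invented purely to show the two regimes.
print("light load: ", settled_clock(0.085))   # stops at the clock limit
print("heavy load: ", settled_clock(0.120))   # stops at the power limit, lower clocks
```

Same power limit in both cases; the heavier load just reaches it at a much lower clock speed.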

