Oddity observed running Time Spy

Chris Fortune

Having recently finished my upgrade I was doing the usual benchmarking suite to convince myself that "yeah, it was fine before, but just look at it now" and that it was so worth it. A patently false assertion in my case when you look at the specs, but... well, you will see. Anyway, I noticed that on my primary screen I would get a significantly higher score than on my side screen.

 

System

Ryzen 9 5900X

Asus TUF Gaming X570-Pro

RTX 3090 F.E.

32 GB DDR4-3600 CL16 (T-Force Dark Pro)

 

Main screen: CRG9 49"

Side screen: 28" UE590

 

These two runs were done consecutively, with a short pause to allow the system to drop to idle temps before starting. GPU fully stock, CPU on PBO with a +200 MHz offset, and RAM using the DOCP profile.

 

Main screen 18195

Side screen 17228

 

Now I understand that run variance happens but this is way more than that. I watched clocks and temps in HWiNFO64, Afterburner and AI3, and whilst the details were different as expected, peak temps were identical and well below the throttle temp.

 

Is this normal? I would have expected there to be a screen resolution adjustment within the 3DMark software.

Time Spy = 1440p, Time Spy Extreme = 4K?

 

I would need more details than that to see why there is such a difference, but I expect that something didn't boost as high on the second run. I would suggest more runs to compare.

7 minutes ago, aDoomGuy said:

Time Spy = 1440p, Time Spy Extreme = 4K?

 

I would need more details than that to see why there is such a difference, but I expect that something didn't boost as high on the second run. I would suggest more runs to compare.

The CRG9 49" is effectively 1440p, as the benchmark doesn't use the full ultrawide width, but that aside, my surprise was that on exactly the same hardware there was such a big difference.

1 hour ago, Chris Fortune said:

Now I understand that run variance happens

You clearly don't but ok. 

25 minutes ago, Mark Kaine said:

You clearly don't but ok. 

A difference of about 900 points doesn't seem like run variance to me; that is a different score.

5 hours ago, Chris Fortune said:

A difference of about 900 points doesn't seem like run variance to me; that is a different score.

The only way to confirm this is to do (many) more runs and actually see what the spread is.

 

7 hours ago, Chris Fortune said:

Now I understand that run variance happens but this is way more than that. I watched clocks and temps in HWiNFO64, Afterburner and AI3, and whilst the details were different as expected, peak temps were identical and well below the throttle temp.

What details were different? Were you reaching identical clocks all the time? Temperature for example is an important factor in determining how high Nvidia GPUs boost.

7 hours ago, Chris Fortune said:

I would have expected there to be a screen resolution adjustment within the 3DMark software.

The resolution for 3D Mark benchmarks is fixed.

4 minutes ago, tikker said:

What details were different? Were you reaching identical clocks all the time? Temperature for example is an important factor in determining how high Nvidia GPUs boost.

The only hardware difference was which monitor I used for displaying the benchmark

 

Starting temps were within 2 °C of each other and ambient was the same.

 

I did not adjust any clocks or other overclocking settings between runs.

 

Observation of the clocks during the runs, whilst not identical, showed the same max levels and similar average levels, as determined by eye.

Just now, Chris Fortune said:

Starting temps were within 2 °C of each other and ambient was the same.

Starting temps don't matter, load temps do.

1 minute ago, Chris Fortune said:

Observation of the clocks during the runs, whilst not identical, showed the same max levels and similar average levels, as determined by eye.

"Similar average levels" meaning the reported/logged average is identical or they just look the same to you?

 

Small differences in the system can lead to differences in points. You can't really conclude anything meaningful from 1 run on the first monitor and 1 run on the second. In the second case it has to do more upscaling as well, for example, as you go from a 1440p render to a 4k screen. That should be peanuts for a 3090, but who knows.

 

I'd say if you really care about this do more runs. Do like 5-10 on the first monitor and then 5-10 on the second. Then see what your spread in scores is.
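To make "spread" concrete, here is a minimal sketch of how the two sets could be compared once they are recorded (the score lists below are made-up placeholders, not runs from this thread):

# Minimal sketch: compare two sets of Time Spy scores.
# The numbers are hypothetical placeholders; substitute your own recorded runs.
from statistics import mean, stdev

main_screen = [18195, 18100, 18230, 18150, 18210]   # hypothetical runs on the main monitor
side_screen = [17228, 17300, 17250, 17400, 17350]   # hypothetical runs on the side monitor

for label, scores in (("main", main_screen), ("side", side_screen)):
    print(f"{label}: mean={mean(scores):.0f}, stdev={stdev(scores):.0f}, "
          f"spread={max(scores) - min(scores)}")

# Rough rule of thumb: if the gap between the two means is several times larger
# than either set's run-to-run spread, it is probably not ordinary run variance.
print("gap between means:", round(mean(main_screen) - mean(side_screen)))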

3 minutes ago, tikker said:

Starting temps don't matter, load temps do.

"Similar average levels" meaning the reported/logged average is identical or they just look the same to you?

 

Small differences in the system can lead to differences in points. You can't really conclude anything meaningful from 1 run on the first monitor and 1 run on the second. In the second case it has to do more upscaling as well, for example, as you go from a 1440p render to a 4k screen. That should be peanuts for a 3090, but who knows.

 

I'd say if you really care about this do more runs. Do like 5-10 on the first monitor and then 5-10 on the second. Then see what your spread in scores is.

Max temps were 66/67 °C on both runs, fans were fixed and case cooling was also fixed. I tried to limit variation as much as possible.

 

By similar average levels I meant that during the runs the clocks appeared to follow the same up-and-down pattern in both runs, but there were differences in the exact timing of the changes.

 

Peak clocks for both runs were within 0.5% as reported by HWiNFO64

 

I intend to do more runs but was out of time today. I will most likely do a pattern similar to your suggestion, and also alternate for a total of 10 runs.

 

To get back to what started this off for me, I was surprised that changing the monitor resolution gave such a large difference (approx. 900 on a score of 18195) and was just looking for ideas as to what I may have missed in trying to eliminate user error.

12 minutes ago, Chris Fortune said:

Max temps were 66/67 °C on both runs, fans were fixed and case cooling was also fixed. I tried to limit variation as much as possible.

By similar average levels I meant that during the runs the clocks appeared to follow the same up-and-down pattern in both runs, but there were differences in the exact timing of the changes.

Peak clocks for both runs were within 0.5% as reported by HWiNFO64

I intend to do more runs but was out of time today. I will most likely do a pattern similar to your suggestion, and also alternate for a total of 10 runs.

To get back to what started this off for me, I was surprised that changing the monitor resolution gave such a large difference (approx. 900 on a score of 18195) and was just looking for ideas as to what I may have missed in trying to eliminate user error.

It could be a whole bunch of things. My immediate guess would be GPU Boost. IIRC GPU Boost 3.0 starts scaling down boost clocks at 60 °C already (note this is not thermal throttling).
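As a rough illustration of that idea only (a toy model, not Nvidia's actual boost algorithm; the 60 °C threshold and the ~15 MHz bin size are the figures that come up in this thread, everything else is assumed):

# Toy illustration only -- not Nvidia's actual GPU Boost algorithm.
# It just shows the idea being discussed: above a temperature threshold the card
# steps down one "boost bin" (~15 MHz) every few degrees, well before any
# thermal throttle limit is reached.
def effective_boost_clock(temp_c: float,
                          max_boost_mhz: float = 1950.0,   # assumed peak boost clock
                          threshold_c: float = 60.0,       # scaling starts here (per this post)
                          bin_step_mhz: float = 15.0,      # one boost bin
                          degrees_per_bin: float = 3.0) -> float:
    if temp_c <= threshold_c:
        return max_boost_mhz
    bins_dropped = int((temp_c - threshold_c) // degrees_per_bin)
    return max_boost_mhz - bins_dropped * bin_step_mhz

for t in (55, 60, 63, 66, 70):
    print(t, "C ->", effective_boost_clock(t), "MHz")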

 

It breaks down your score into GPU and CPU scores; which one differs the most?

7 minutes ago, tikker said:

It could be a whole bunch of things. My immediate guess would be GPU Boost. IIRC GPU Boost 3.0 starts scaling down boost clocks at 60 °C already (note this is not thermal throttling).

It breaks down your score into GPU and CPU scores; which one differs the most?

CPU scores were close enough to be considered variance, 14087 and 13996, whilst GPU scores were 19183 and 17961 respectively.

20 minutes ago, Chris Fortune said:

CPU scores were close enough to be considered variance, 14087 and 13996, whilst GPU scores were 19183 and 17961 respectively.

Ah yeah, that's a good drop in GPU score. My first guess is GPU Boost kicking in on your second run. As you mention the peak clocks were similar, I suspect temperatures. The card may have been cooler (perhaps the heatsink was still "cold", allowing a tad more heat to be soaked up) long enough for it to boost longer.

 

48 minutes ago, Chris Fortune said:

Peak clocks for both runs were within 0.5% as reported by HWiNFO64

Is this a 12 MHz or so difference by any chance?

2 minutes ago, tikker said:

Ah yeah, that's a good drop in GPU score. My first guess is GPU Boost kicking in on your second run. As you mention the peak clocks were similar, I suspect temperatures. The card may have been cooler (perhaps the heatsink was still "cold", allowing a tad more heat to be soaked up) long enough for it to boost longer.

 

Is this a 12 MHz or so difference by any chance?

In regards to the first point, I did allow time to get idle temps back as similar as possible, but that heatsink is huge and I may have been a bit quick. To the second point, I rounded down; it was actually 15.

8 minutes ago, Chris Fortune said:

In regards to the first point, I did allow time to get idle temps back as similar as possible, but that heatsink is huge and I may have been a bit quick. To the second point, I rounded down; it was actually 15.

Ah right, I couldn't remember if it was 15 or 12 and could only find 12 for Pascal. That's what I thought though. You're seeing GPU Boost at work :) The card's temperature pushes it to different boost bins. I don't know personally what a 15 MHz difference in clock would translate to in points, but seeing as your GPU score is the one that dropped significantly, I'd say temperatures are the cause of what you are seeing, or at least a significant contributor.
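As a quick back-of-the-envelope check (a sketch only; the ~1900 MHz boost clock is an assumed round figure, the other numbers are the ones posted in this thread): the difference in peak clock by itself is far too small to account for the GPU score gap, which fits the idea that it is the sustained clock, i.e. how long the card holds the higher bins, that actually differed.

# Back-of-the-envelope check. The ~1900 MHz boost clock is an assumption;
# the peak delta and GPU scores are the figures reported in this thread.
peak_delta_mhz = 15
assumed_boost_mhz = 1900                     # rough 3090 FE boost clock, assumed
gpu_scores = (19183, 17961)                  # main screen run vs side screen run

clock_delta_pct = 100 * peak_delta_mhz / assumed_boost_mhz
score_delta_pct = 100 * (gpu_scores[0] - gpu_scores[1]) / gpu_scores[0]

print(f"peak clock delta: ~{clock_delta_pct:.1f}%")   # ~0.8%
print(f"GPU score delta:  ~{score_delta_pct:.1f}%")   # ~6.4%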

 

TechPowerUp made a nice graph from a 3080:

[Image: gpu-clock-boost-vs-temperature.jpg, GPU Boost clock vs. temperature]

 

Two things are apparent from chatting with you (thanks for that, btw):

1) it is odd, at the very least

2) I definitely need more data

The next runs will be 5 of each, with a full cool-off and shutdown in between each set.

Is it me, or does a 19200 GPU score sound very low for a 3090 in general? My 3080 scores higher.

5 minutes ago, Dober83 said:

Is it me, or does a 19200 GPU score sound very low for a 3090 in general?

I was thinking that myself, but I have a kind of possible explanation.

 

People who bench their brand new GPU are likely to clock it up a little at least, so the scores recorded are inflated over stock. I have 19501 clocking it a bit on air, and looking at the numbers for that run I think maybe 250 more would be fairly easy. For reference, 19501 is top 50 in the UK and top 17% worldwide. These are combined scores, not GPU.

10 minutes ago, Chris Fortune said:

People who bench their brand new GPU are likely to clock it up a little at least, so the scores recorded are inflated over stock. I have 19501 clocking it a bit on air, and looking at the numbers for that run I think maybe 250 more would be fairly easy. For reference, 19501 is top 50 in the UK and top 17% worldwide. These are combined scores, not GPU.

A 17-19k GPU score does seem a little below average. You could check with GPU-Z, I think, whether you're e.g. hitting the power limit as well, besides being at the mercy of GPU Boost.

3 minutes ago, tikker said:

A 17-19k GPU score does seem a little below average. You could check with GPU-Z, I think, whether you're e.g. hitting the power limit as well, besides being at the mercy of GPU Boost.

3DMark quotes 19858 as the generic score for a 3090, but that includes AIB cards as well as FE, and some AIBs are clocked higher out of the box, so the numbers I am getting are not terrible at stock. The GPU score on my best run was 20862, so I don't think I am that far off for stock cooling solutions.

Do 15-20 runs of the *same* benchmark on each monitor, don't swap between monitors in between, write down each result, then compare. I think then you'll see a bit clearer.

 

I mean, you already know your GPU is performing within expectations, but this is what you have to do if you want anything conclusive.

 

PS: there should also be no random waiting time for "idle temps", that's just not how it works. Professional testers warm up their equipment before benchmarking so it's at "operational" temps, because that's what matters.

 

If you have issues with temps, that's another issue entirely (meaning there should be no big variations when you're running a bench continuously).

1 minute ago, Mark Kaine said:

Do 15-20 runs of the *same* benchmark on each monitor, don't swap between monitors in between, write down each result, then compare. I think then you'll see a bit clearer.

I mean, you already know your GPU is performing within expectations, but this is what you have to do if you want anything conclusive.

 

 

Pretty much the plan for the morning, although I was going to run until consecutive results are comparable, then let it chill for 20 mins on full fan, and then run the same for the other monitor.

 

In all seriousness, it is not the absolute numbers that I am getting which I am worried about; rather, I am trying to understand what appears to be a large difference when using only a different screen.

 

Thanks for all the ideas and suggestions given; they have been thoughtful and useful in considering what will give better information.

2 minutes ago, Chris Fortune said:

In all seriousness, it is not the absolute numbers that I am getting which I am worried about; rather, I am trying to understand what appears to be a large difference when using only a different screen.

Well, the tests should show that difference if there is one. I'm more inclined to say it doesn't have much to do with your monitor though, and that it's just thermals or some other fluctuation, but it'll be interesting to see what you find.

Just now, tikker said:

Well, the tests should show that difference if there is one. I'm more inclined to say it doesn't have much to do with your monitor though, and that it's just thermals or some other fluctuation, but it'll be interesting to see what you find.

I would hope that is the case, but 900 is a lot to see on a run-by-run change with some downtime in between, especially when temps are not high during the run, sub-70 at all times. That said, there are always oddities that are just blips for no apparent reason, and a longer set of runs will hopefully cover that possibility.

4 minutes ago, tikker said:

Well, the tests should show that difference if there is one. I'm more inclined to say it doesn't have much to do with your monitor though, and that it's just thermals or some other fluctuation, but it'll be interesting to see what you find.

Yeap, sounds like thermals, hence this cool-down phase is probably counterproductive... needs lots of runs in succession imo... and yes, as long as there's no upscaling involved or something, I don't expect a big variance between monitors...

 

 

idk does time spy even allow something like DSR etc? 

 

 

I just know they called me a cheater for using "clamp" mode lol... so it's a rather finicky test anyway.

 

 

 

9 minutes ago, Chris Fortune said:

chill for 20 mins on full fan and then run the same for the other monitor

Honestly, this is creating an "unnatural" situation and therefore isn't advisable; after ~20 tests your equipment will have a "perfect" operating temperature, there's no need to wait, it'll only fudge the results.

1 minute ago, Mark Kaine said:

Yeap, sounds like thermals, hence this cool-down phase is probably counterproductive... needs lots of runs in succession imo... and yes, as long as there's no upscaling involved or something, I don't expect a big variance between monitors...

 

 

idk does time spy even allow something like DSR etc? 

 

 

I just know they called me a cheater for using "clamp" mode lol... so it's a rather finicky test anyway.

I have no idea what you just said, all I do is select the monitor and hit run test

 

Will have more info tomorrow, as I find first thing in the morning is the most consistent time for steady ambient temps. With a bit more data maybe it will show something, and maybe not; I won't know till I have done the runs.
