
Why are Ryzen gaming benchmarks mainly done at 4K?

Running gaming benchmarks at 4K, which is much more GPU bound than CPU bound, will show results that are much closer between different CPUs. Run those same benchmarks at 1080p, which is much more CPU bound, to see the real gaming performance difference between the CPUs. And for those who would say that if you can afford a Ryzen-based system you wouldn't be gaming at 1080p, then bench at no higher than 1440p. Arguing that Ryzen is just as good a gaming CPU as the 7700K because the numbers are really close at 4K is misleading.


My wife is upgrading to Ryzen; the results seem good, especially for the games we play, and so does the price.


3 minutes ago, Gerr said:

Running gaming benchmarks at 4K, which is much more GPU bound than CPU bound, will show results that are much closer between different CPUs. Run those same benchmarks at 1080p, which is much more CPU bound, to see the real gaming performance difference between the CPUs. And for those who would say that if you can afford a Ryzen-based system you wouldn't be gaming at 1080p, then bench at no higher than 1440p. Arguing that Ryzen is just as good a gaming CPU as the 7700K because the numbers are really close at 4K is misleading.

 

That's exactly why they did it. It eliminates the performance gap between the CPUs being compared and makes them look equal in gaming.


Because anything about 4K = more clicks. - LinusClickTips 7350K review.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


If they are using the same GPU, then what is the problem? Could it be that 1080p does not stress the system enough?



 

Here are a couple more examples of what AMD did to skew the perception of gaming performance. Keep in mind, all companies do this to some degree; AMD just happens to be the best at it.

 

 


3 minutes ago, SCHISCHKA said:

If they are using the same GPU, then what is the problem? Could it be that 1080p does not stress the system enough?

Gaming at different resolutions places different stresses on different components. Lower-resolution gaming, like at 1080p, doesn't tax the GPU as much and thus places a greater dependency on the CPU, so you will see a greater difference in benchmark scores between different CPUs. The opposite is true at 4K, which stresses the GPU much more, so you won't see as much of a difference in CPU performance. This is how AMD is hiding its lower gaming performance versus the Intel 7700K: by benchmarking games only at 4K. If you benchmark those same games at 1080p, you will see a much bigger performance difference.

 

I would expect review sites to know this, yet I keep seeing many sites benchmarking Ryzen only at 4K, which plays right into their deception. Sadly, that included the LTT video, which only showed 4K gaming performance.

 

Don't get me wrong, I am not an AMD hater, and I think their Ryzen CPU is a great accomplishment. However, it is most competitive against Intel in productivity work and not so much in gaming. The problem is that many of the reviews are only testing at 4K and making claims about both, which is inaccurate.


If you want to stress test the CPU for gaming, you're supposed to do it at the lowest resolution and quality settings possible. This eliminates the GPU from the equation as much as possible.

 

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951-6.html

 

Tom's Hardware did it at 1080p. They mentioned AMD provided 4K benchmarks, so really, if anyone is blindly following what AMD did, it's AMD's fault.

 

EDIT: Okay, maybe not really AMD's fault, but AMD should know better.


4 minutes ago, Gerr said:

Lower-resolution gaming, like at 1080p, doesn't tax the GPU as much and thus places a greater dependency on the CPU, so you will see a greater difference in benchmark scores between different CPUs.

This does not make sense to me. Fewer pixels = less GPU work, sure, but how does that increase CPU work? Same game logic, same draw and update calls, same number of objects in game.



Steve from Gamers Nexus sums that up nicely...

 

When we approached AMD with these results pre-publication, the company defended its product by suggesting that intentionally creating a GPU bottleneck (read: no longer benchmarking the CPU’s performance) would serve as a great equalizer. AMD asked that we consider 4K benchmarks to more heavily load the GPU, thus reducing workload on the CPU and leveling the playing field. While we fundamentally disagree with this approach to testing, we decided to entertain a mid-step: 1440p, just out of respect for additional numbers driven by potentially realistic use cases. Of course, in some regard, benchmarking CPUs at 4K would be analogous to benchmarking GPUs at 720p: The conclusion would be that every GPU is “the same,” since they’d all choke on the CPU. Same idea here, just the inverse.


On 3/2/2017 at 2:14 PM, SCHISCHKA said:

This does not make sense to me. Fewer pixels = less GPU work, sure, but how does that increase CPU work? Same game logic, same draw and update calls, same number of objects in game.

Generally, each frame you render, whether at 4K or 1080p, has the same number of items on screen and the same physics calculations to be done.

 

These are done by the CPU.

 

Thus, when the GPU, which basically just renders the frame, can churn out frames as fast as possible, the CPU ends up with more work to do, since its workload scales per frame rather than per pixel (which varies with resolution).
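
To put rough numbers on that (the figures below are purely hypothetical, just to illustrate the scaling): the CPU's cost per frame stays roughly constant at any resolution, while the GPU's cost grows with the pixel count, and whichever stage is slower sets the frame time. A minimal Python sketch of that idea:

PIXELS = {"1080p": 1920 * 1080, "4K": 3840 * 2160}

def delivered_fps(cpu_ms_per_frame, gpu_ms_per_megapixel, resolution):
    # CPU work is roughly per-frame; GPU work is roughly per-pixel.
    gpu_ms = gpu_ms_per_megapixel * PIXELS[resolution] / 1_000_000
    # The slower stage sets the frame time.
    return 1000 / max(cpu_ms_per_frame, gpu_ms)

# Hypothetical numbers: one CPU needs 10 ms of work per frame, a faster one
# needs 7 ms, and the shared GPU needs about 3 ms per megapixel.
for res in PIXELS:
    print(res, round(delivered_fps(10, 3, res)), "fps vs",
          round(delivered_fps(7, 3, res)), "fps")

With these made-up numbers the faster CPU is roughly 40% ahead at 1080p, but the two are identical at 4K, because the GPU's per-pixel cost dominates the frame time there.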

System Specs:


CPU: Intel Core i5-6500 3.2GHz Quad-Core Processor 
CPU Cooler: Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler 
Motherboard: Gigabyte GA-Z170MX-Gaming 5 Micro ATX LGA1151 Motherboard  
Memory: G.Skill Ripjaws V Series 16GB (2 x 8GB) DDR4-2400 Memory 
Storage: A-Data Premier SP550 240GB 2.5" Solid State Drive 
Storage: Samsung 850 EVO-Series 500GB 2.5" Solid State Drive  
Storage: Western Digital Blue 3TB 3.5" 5400RPM Internal Hard Drive  
Video Card: EVGA GeForce GTX 1070 8GB FTW Gaming ACX 3.0 Video Card  
Case: Thermaltake Core V21 MicroATX Mini Tower Case 
Power Supply: EVGA SuperNOVA G2 550W 80+ Gold Certified Fully-Modular ATX Power Supply  

Displays: PlayStation® 3D Display (1080p)

Displays: VA1948M (900p)
Case Fan: Corsair CO-9050017-WLED 66.4 CFM  140mm Fan 
Case Fan: Corsair ML120 75.0 CFM  120mm Fans  
Mouse: Logitech G502 Wired Optical Mouse 

 


On 3/2/2017 at 11:14 AM, SCHISCHKA said:

This does not make sense to me. Fewer pixels = less GPU work, sure, but how does that increase CPU work? Same game logic, same draw and update calls, same number of objects in game.

It doesn't increase CPU work. But having a GPU bottleneck hides differences between CPUs. Take for example:

 

At 1080p:

CPU A can get up to 60 fps before maxing out

CPU B can get up to 80 fps before maxing out

GPU (same for both platforms) can get up to 120 fps before maxing out

 

When the tests are run here, the results will be 60 fps on the A platform and 80 fps on the B platform.

 

But, at 4K the GPU demand per frame is significantly higher while the CPU demand per frame is the same. So:

CPU A can get up to 60 fps before maxing out

CPU B can get up to 80 fps before maxing out

GPU can get up to 30 fps before maxing out

 

Now, both platforms A and B will get 30 fps, because that's the most that the GPU can do in both cases. However this tells us nothing about how powerful the CPUs are, other than they are powerful enough to get at least 30 fps. We don't know how much higher each one can go, so we don't know which one is more powerful and by how much.
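
Put differently (a minimal sketch using the example caps above, with the caps treated as given): the delivered framerate is simply the lower of the CPU's cap and the GPU's cap.

def delivered_fps(cpu_cap, gpu_cap):
    # Whichever component maxes out first sets the framerate.
    return min(cpu_cap, gpu_cap)

cpu_caps = {"CPU A": 60, "CPU B": 80}   # per-CPU limits from the example above
gpu_caps = {"1080p": 120, "4K": 30}     # same GPU, per-resolution limits

for res, gpu_limit in gpu_caps.items():
    for cpu, cpu_limit in cpu_caps.items():
        print(f"{res}: {cpu} -> {delivered_fps(cpu_limit, gpu_limit)} fps")

At 1080p the results separate (60 vs 80 fps); at 4K both read 30 fps, so the comparison stops telling you anything about the CPUs.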

 

It's effectively the same problem as, for example, benchmarking with V-Sync on, except in this case it's the GPU, not the monitor's refresh rate, that determines the frame cap. Either way, if anything besides the CPU is determining the framerate cap, then it isn't a valid benchmark of CPU capability.

 

People can argue all day long about "using realistic settings," and it's fine to run benchmarks at realistic settings, but if you do, you have to accept that you're not benchmarking CPU capability; you're benchmarking something else, and the results are not valid for comparing CPU capability.


On Thu Mar 02 2017 at 1:14 PM, SCHISCHKA said:

This does not make sense to me. Fewer pixels = less GPU work, sure, but how does that increase CPU work? Same game logic, same draw and update calls, same number of objects in game.

By creating a GPU bottleneck you hide CPU limitations. For example, I had an i3-4160 and a GTX 750 Ti. I upgraded to an i7-4790K but saw zero gaming performance improvement. My 750 Ti was bottlenecking the system. I could use those results to argue that an i7-4790K and an i3-4160 are identical when it comes to gaming performance, even though they aren't.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


Sure, but by that logic you could run the games at 480p/720p to stress the CPU even more. Chances are, people buying $300+ CPUs are also running higher-end GPUs at higher resolutions.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


57 minutes ago, Coaxialgamer said:

Sure, but by that logic you could run the games at 480p/720p to stress the CPU even more. Chances are, people buying $300+ CPUs are also running higher-end GPUs at higher resolutions.

Or, in Linus' words, "I didn't test FreeSync on the FreeSync monitor because people buying it won't buy AMD GPUs" (loose quote :P). At some point you are effectively running a synthetic benchmark and calling it a "real-world performance test", even if no one will actually use those settings. It's informative, but it's still not "real world", and it's often misleading.

On 3/2/2017 at 8:10 PM, M.Yurizaki said:

If you want to stress test the CPU for gaming, you're supposed to do it at the lowest resolution and quality settings possible. This eliminates the GPU from the equation as much as possible.

But it also leads to results like "CPU 1 gives you 800 FPS, while CPU 2 gives you 400 FPS. CPU 1 is 100% faster! CPU 2 sux!", while no one really should care which CPU they use at that point. Again, it's not false, but it's also not better than just running Firestrike or something.


1 hour ago, SpaceGhostC2C said:

But it also leads to results like "CPU 1 gives you 800 FPS, while CPU 2 gives you 400 FPS. CPU 1 is 100% faster! CPU 2 sux!", while no one really should care which CPU they use at that point. Again, it's not false, but it's also not better than just running Firestrike or something.

But that's more of a problem of people not knowing what to take from the results.

 

It's not doing CPU gaming benchmarks any service to include only 4K results, because the GPU is more often the bottleneck in that scenario, which can hide the deficiencies of the CPU.


Let's face it, AMD tried to have us over. Let's be honest, every company tries to do it. At 4K the results are very much the same because you are basically taking the CPU out of the equation and forcing a GPU bottleneck, whereas at 1080p you are forcing a CPU bottleneck, because the GPU could produce frames faster but the CPU can't push them out quickly enough. AMD Ryzen is a great CPU for productivity due to its price and core count. Also, AMD's excuse that "most gamers game at 1440p and 4K" is complete trash, as shown by Steam's hardware survey, where 1080p gaming is still way ahead in the percentages. One thing I would say to Ryzen fans who game is to maybe wait until the 4- and 6-core Ryzen parts come out. They could potentially run at higher clocks, which could very well put them on par with Intel, and let's face it, they will be a few pounds/dollars (or whatever your local currency is) cheaper.


15 hours ago, danrey84 said:

Let's face it, AMD tried to have us over. Let's be honest, every company tries to do it. At 4K the results are very much the same because you are basically taking the CPU out of the equation and forcing a GPU bottleneck, whereas at 1080p you are forcing a CPU bottleneck, because the GPU could produce frames faster but the CPU can't push them out quickly enough. AMD Ryzen is a great CPU for productivity due to its price and core count. Also, AMD's excuse that "most gamers game at 1440p and 4K" is complete trash, as shown by Steam's hardware survey, where 1080p gaming is still way ahead in the percentages. One thing I would say to Ryzen fans who game is to maybe wait until the 4- and 6-core Ryzen parts come out. They could potentially run at higher clocks, which could very well put them on par with Intel, and let's face it, they will be a few pounds/dollars (or whatever your local currency is) cheaper.

Then why does the GPU's performance get better with Ryzen at 4K?

 

If I'm buying for 4K and Ryzen does better than Intel for less... why would I buy an Intel? Why would I care about 1080p?

 

Think about it. I get better frames with Ryzen at 4K... shouldn't it be nearly the same or less? Why isn't it?

 

If people want to keep saying it is the same, why not show it? Instead of spewing....


21 hours ago, Coaxialgamer said:

Sure, but by that logic you could run the games at 480p/720p to stress the CPU even more. Chances are, people buying $300+ CPUs are also running higher-end GPUs at higher resolutions.

Yes, and that is a valid CPU benchmark. Mind you, not because it stresses the CPU more, but because it stresses the GPU less.

 

The point isn't to test "what framerate should you expect to see in games". For that you would test with a realistic setup. If you want to test how powerful CPUs are relative to each other, you need to test in a scenario in which they can leverage their full capabilities, even if that scenario isn't typical settings with typical hardware.

 

It would be like testing whether PSU A can output more power than PSU B by testing them on a "typical gaming system" (say, an i7 and a GTX 1070), measuring the output, and finding they both output 400 W under full load without problems, then concluding PSU A is just as powerful as PSU B. This is an invalid test for answering that question, so the conclusion does not follow from the results. In order to find out that PSU A goes up to 1000 W and PSU B goes up to 1200 W, you would need to engineer a scenario that very few people would actually experience, but yes, that is how you would test it, "realistic" or not. Whether or not the difference between 1000 and 1200 W power supplies actually matters for typical scenarios is a separate question, which you would answer by testing with a "typical system". But the results of that test should not be passed off as results for which PSU is more powerful.

 

Likewise, in CPU tests, you can test performance in a typical situation with normal settings; this is a valid test, but you need to be aware of what you're testing. In that situation, you are testing "is this CPU powerful enough to not be the limiting factor for a typical gaming system?" The results of this test, while valid in their own right, cannot be substituted as a test of how powerful each CPU is. To test that, you need to run tests at low resolution. That will show you which CPU is more powerful. Once you have those results, whether it matters that CPU A is 20% faster than CPU B is again a separate question. You need to determine how the CPUs stack up against each other, and then ask, "okay, where's the cutoff point where anything higher than X CPU will not matter for my system?"

 

In short, both of these tests provide valuable information, you need to test both, but you also need to be able to draw the right conclusions.

 

Testing at 1280×720 (or whatever) will tell you how powerful each CPU is in comparison to the others.

 

Testing at 4K (or whatever system and settings are "typical" for the hardware) will tell you which CPUs are good enough that anything above that level of CPU performance no longer matters for that system.


13 hours ago, kingj0n said:

Then why does the GPU's performance get better with Ryzen at 4K?

 

If I'm buying for 4K and Ryzen does better than Intel for less... why would I buy an Intel? Why would I care about 1080p?

 

Think about it. I get better frames with Ryzen at 4K... shouldn't it be nearly the same or less? Why isn't it?

 

If people want to keep saying it is the same, why not show it? Instead of spewing....

Because at 4K the GPU becomes the bottleneck, not the CPU, as there isn't a GPU out there that can push the 4K ultra preset to 144 Hz and above. Whereas at 1080p the CPU becomes the bottleneck, not the GPU, as the GPU can push out a ton of frames but the CPU can't keep up with it.


4 hours ago, danrey84 said:

Because at 4K the GPU becomes the bottleneck, not the CPU, as there isn't a GPU out there that can push the 4K ultra preset to 144 Hz and above. Whereas at 1080p the CPU becomes the bottleneck, not the GPU, as the GPU can push out a ton of frames but the CPU can't keep up with it.

This is almost entirely unrelated to Ryzen benchmarks. However, do you think this may be the reason why my i5 seems to be my bottleneck at 1080p with an RX 480? Everything I look up says an i5 won't even bottleneck a 1080, but my personal experience and the numbers I've gathered through Afterburner and Task Manager tell me something different. CPU usage is up near 100% while GPU usage is more often than not well below 80%.


7 hours ago, Ununhexium116 said:

This is almost entirely unrelated to Ryzen benchmarks. However, do you think this may be the reason why my i5 seems to be my bottleneck at 1080p with an RX 480? Everything I look up says an i5 won't even bottleneck a 1080, but my personal experience and the numbers I've gathered through Afterburner and Task Manager tell me something different. CPU usage is up near 100% while GPU usage is more often than not well below 80%.

Remember that these are synthetic benchmarks; in the real world, all your background programs will cause a CPU bottleneck on the CPU with fewer cores.

They really need to include tests with a real-world scenario, or at least with one core disabled.


I'd say testers should use multiple resolutions in these game benches: more at lower resolutions and a few example tests at 4K. Tests at 1080p are pretty useless to those who are considering playing at 4K, and I'd imagine that quite a few would go for 4K if they are buying a $400-$500 CPU. Someone with little experience could imagine that this difference in gaming performance would continue at 4K.

 

Or, if a tester doesn't want to do those example tests, it would be nice if they would clarify in their reviews that at 4K the differences in CPU power are almost nullified, and that if you game at 4K you shouldn't even look at the 1080p tests.


Testing at 1080p is far from useless. It shows the true gaming ability of the CPUs! This is because as you increase the resolution, the GPU works harder and harder and more often becomes the bottleneck. 4K benchmarks often show that even an i3 can perform just as well as an i7 or 1800X, so does that mean it's as good a gaming CPU as those others?

 

The opposite is true when you benchmark GPUs. You don't want to benchmark at 1080p with a low-end CPU, because the CPU becomes the bottleneck. But by many of the arguments I see here, at 1080p with a dual-core Pentium CPU a 1050 performs as well as a 1080, so that must mean a 1050 is as good a gaming GPU as a 1080, doesn't it?!?

 

The whole point of testing a CPU or GPU is to minimize the chance of other parts of the system being the bottleneck, to show what the device you are testing is really able to do. Yes, that means testing in a way a person might not normally use it, but that's why testers often add benchmarks at other configurations/resolutions so you can see real-world performance too.


Benchmarking at 1080p is totally meaningless, that is why. For me, 4K CPU and GPU benchmarking makes more sense and shows you real gaming performance. And on top of that, who is that stupid gamer who would buy an 8-core Ryzen and then play games at 1080p? Benchmarks at 1080p do not show the high-end gaming performance we want to see. I don't care if you have a 240Hz monitor; if you suck at CSGO, you are still dead meat.

 

Do NOT, I repeat, do NOT trust 1080p benchmarks on CPUs!!! It is absolutely MEANINGLESS to bench at 1080p or lower. They only do that to put down the company, and all of those benchmarks could not be more WRONG!

Judging by 4K benchmarks, Ryzen is equal to Intel's high end and beats the i7 7700K across the board. If you buy a 7700K, wait a year and you will be throwing it away; you will be disappointed with Intel! DO NOT BUY Kaby Lake, buy Ryzen, and if you are an Intel fanboy or trust 1080p benchmarks, well, I'm sorry for you, but you are stupid!

 

(I don't care if you ban me from this forum because I was offensive, but you should know the truth, and one day you will say, "This guy was right.")

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm

