
What I mean by the title is: between 3DMark, Unigine, PassMark, etc., is there one benchmark tool that most accurately gives you an idea of how your hardware is going to do playing modern games at specific settings? For instance, if you take two different processors, pair them with the same GPU, and then run a benchmark, which would be best for boiling the results down to say, "this system is going to get you X more/less FPS in modern games than the other system at these settings"?

 

Perhaps it is a stupid question, but with 3DMark, for example, the different tests produce large disparities in the combined test scores, which themselves don't seem to reflect real-world use. My i5-3570K at 4.2 GHz with an RX 580 8GB can pretty much run most games at max settings at 1080p at 60+ FPS. However, in Fire Strike, which is a DX11 benchmark rendered at 1080p, the two graphics tests average 69.52 FPS, while the combined score for that setup is only 23.78 FPS. Meanwhile, an old A8-7670K at 4.4 GHz I have (think an X4 860K) with an RX 580 averages 67.06 FPS in the graphics tests and 12.02 FPS in the combined test. Am I misunderstanding the results? Would the second processor only get me half the game performance of the first? I guess that makes sense given those two examples, but where would they diverge? Low? Medium? Max settings? I don't think you get that kind of information from what 3DMark tells you.
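For what it's worth, the gap between the two setups can be put in plain numbers using only the FPS figures quoted above; a quick sketch (the system labels are just for readability):

```python
# Fire Strike FPS figures quoted above: (graphics-test average, combined test)
systems = {
    "i5-3570K + RX 580": (69.52, 23.78),
    "A8-7670K + RX 580": (67.06, 12.02),
}

g1, c1 = systems["i5-3570K + RX 580"]
g2, c2 = systems["A8-7670K + RX 580"]

# When GPU-bound the two systems are nearly identical; in the CPU-heavy
# combined test the A8 setup delivers about half the frame rate.
print(f"graphics ratio: {g2 / g1:.2f}")  # → graphics ratio: 0.96
print(f"combined ratio: {c2 / c1:.2f}")  # → combined ratio: 0.51
```

That 0.96 vs. 0.51 split is exactly the GPU-limited vs. CPU-limited distinction: where a real game lands between those two extremes depends on the settings and the engine.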

 

TL;DR:  If you wanted to test CPU A vs CPU B, paired with the same GPU, and see which is going to get you better results playing a modern game at 1080p at different settings, what benchmark would be best to use?


17 minutes ago, skaughtz said:

TL;DR:  If you wanted to test CPU A vs CPU B, paired with the same GPU, and see which is going to get you better results playing a modern game at 1080p at different settings, what benchmark would be best to use?

 

 

3DMark is excellent: it is stable and repeatable, and because it validates scores it is arguably the best way to compare against other systems. It is also quite demanding on the hardware.

 

Unigine Heaven, Valley, and Superposition are also good; their workloads are closer to actual games, so the results are more representative of real-world performance.

 

PassMark is good for evaluating the whole system and comparing it to others.

i9 9900K @ 5.0 GHz, NH D15, 32 GB DDR4 3200 GSKILL Trident Z RGB, AORUS Z390 MASTER, EVGA RTX 3080 FTW3 Ultra, Samsung 970 EVO Plus 500GB, Samsung 860 EVO 1TB, Samsung 860 EVO 500GB, ASUS ROG Swift PG279Q 27", Steel Series APEX PRO, Logitech Gaming Pro Mouse, CM Master Case 5, Corsair AXI 1600W Titanium. 

 

i7 8086K, AORUS Z370 Gaming 5, 16GB GSKILL RJV DDR4 3200, EVGA 2080TI FTW3 Ultra, Samsung 970 EVO 250GB, (2)SAMSUNG 860 EVO 500 GB, Acer Predator XB1 XB271HU, Corsair HXI 850W.

 

i7 8700K, AORUS Z370 Ultra Gaming, 16GB DDR4 3000, EVGA 1080Ti FTW3 Ultra, Samsung 960 EVO 250GB, Corsair HX 850W.

 

 


20 minutes ago, skaughtz said:

TL;DR:  If you wanted to test CPU A vs CPU B, paired with the same GPU, and see which is going to get you better results playing a modern game at 1080p at different settings, what benchmark would be best to use?

If you're going to limit it to one benchmark suite, 3DMark would be it, since you can test a range of graphical-quality levels with mixed CPU/GPU workloads.

 

Otherwise it would be better to run games with a built-in benchmarking mode, like GTA V, Rise/Shadow of the Tomb Raider, or The Division (1 or 2).
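Built-in game benchmarks usually report an average FPS, but when comparing two CPUs the frame-time consistency (the "1% lows") often tells the real story, since a weaker CPU can stutter even when the averages look close. A minimal sketch, assuming frame times captured in milliseconds (the numbers here are made up for illustration; a real capture would come from a tool like PresentMon or the game's own benchmark log):

```python
# Hypothetical frame times in milliseconds from one benchmark pass
frame_times_ms = [16.7, 16.5, 17.0, 33.4, 16.6, 16.8, 16.4, 40.1, 16.9, 16.7]

# Average FPS: total frames divided by total time in seconds
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": average FPS over the slowest 1% of frames
# (with only 10 samples here, that is just the single worst frame)
worst = sorted(frame_times_ms, reverse=True)
n = max(1, len(worst) // 100)
low_1pct_fps = 1000 * n / sum(worst[:n])

print(f"average FPS: {avg_fps:.1f}")   # → average FPS: 48.3
print(f"1% low FPS: {low_1pct_fps:.1f}")  # → 1% low FPS: 24.9
```

Run the same benchmark pass on each CPU with the same GPU and settings, and compare both numbers; two CPUs with similar averages can have very different 1% lows.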


21 minutes ago, NunoLava1998 said:

The game itself

This. Some games favor the CPU or the GPU, some favor single-core CPU performance, some favor more threads, some prefer AMD over Nvidia, etc.

 

Each engine is different.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x

