Cinebench

I wonder if those Cinebench scores are good or not.

[Cinebench screenshot attachments]

I'd imagine that CPU is going to bottleneck the GPU though.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

31 minutes ago, Alex Atkin UK said:

I'd imagine that CPU is going to bottleneck the GPU though.

Depends on the GPU and game/settings 

Dirty Windows Peasants :P ?

38 minutes ago, Alex Atkin UK said:

I'd imagine that CPU is going to bottleneck the GPU though.

I doubt it.

You need far less CPU power than most people think for gaming.

 

The i3-9100F gets about the same gaming results as a Ryzen 7 2700X when you pair it with an overclocked 2080 Ti. They are within 1-2% of each other in gaming performance with a 2080 Ti, and the gap is even smaller with a weaker GPU. The difference shrinks further if you increase the resolution or graphics details.

In 99% of gaming cases, the 1660 GPU will be the bottleneck long before the i3-9100F is.

 

 

If you have an overclocked 2080 Ti and play games at 1440p, you only get a ~10% performance increase by going from an i3-9100F to an i9-9900K or a Ryzen 9 3900X.

The CPU matters very little for gaming.

If you increase the resolution from 1440p to 4K, there is only a ~2% difference between the i3 and the i9.
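A rough way to picture the bottleneck argument is a min() model: whichever side takes longer per frame caps the delivered frame rate. The FPS figures below are made up for illustration, not measurements:

```python
# Simple bottleneck model: delivered FPS is capped by whichever
# component (CPU or GPU) finishes its per-frame work last.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: the CPU can prepare 120 frames/s of draw
# calls while the GPU renders 90 frames/s at 1080p.
print(delivered_fps(cpu_fps=120, gpu_fps=90))   # 90 (GPU-bound)

# Raising the resolution slows the GPU but barely touches the CPU,
# so the gap widens and the CPU matters even less.
print(delivered_fps(cpu_fps=120, gpu_fps=50))   # 50 (still GPU-bound)
```

That's why upgrading the CPU does so little at 1440p and 4K: it only raises a ceiling you were never hitting.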

44 minutes ago, Alex Atkin UK said:

I'd imagine that CPU is going to bottleneck the GPU though.

No, it's not bottlenecking it; it's OK.

10 minutes ago, LAwLz said:

I doubt it.

You need far less CPU power than most people think for gaming.

 

The i3-9100F gets about the same gaming results as a Ryzen 7 2700X when you pair it with an overclocked 2080 Ti. They are within 1-2% of each other in gaming performance with a 2080 Ti, and the gap is even smaller with a weaker GPU. The difference shrinks further if you increase the resolution or graphics details.

In 99% of gaming cases, the 1660 GPU will be the bottleneck long before the i3-9100F is.

 

 

If you have an overclocked 2080 Ti and play games at 1440p, you only get a ~10% performance increase by going from an i3-9100F to an i9-9900K or a Ryzen 9 3900X.

The CPU matters very little for gaming.

If you increase the resolution from 1440p to 4K, there is only a ~2% difference between the i3 and the i9.

Thanks, I hadn't heard that.

38 minutes ago, Lord Vile said:

Depends on the GPU and game/settings 

True, but I tend to play a lot of open-world games, and moving from an i5 4690 to an i5 8600K to an i9 9900K, I've seen an improvement in smoothness each time.  It's not all about frame rates; it's about the frequency of hitches/stutters, and the really eye-opening change was fast-travel load times.


38 minutes ago, LAwLz said:

If you have an overlocked 2080 Ti, and play games at 1440p, then you only get a ~10% performance increase by going from a i3-9100F to an i9-9900K or a Ryzen 9 3900X.

CPU matters very little for gaming.

If you increase the resolution from 1440p to 4K, there is only a ~2% difference between the i3 and the i9.

Average FPS doesn't tell you the whole story, the same way the "average wage" is MUCH higher than the "common" wage.

What's important is the frame rate/frame time consistency.  Plus the fact that we're about to get a new generation of consoles that will push CPU utilisation much higher than it is today.


51 minutes ago, Alex Atkin UK said:

Average FPS doesn't tell you the whole story, the same way the "average wage" is MUCH higher than the "common" wage.

What's important is the frame rate/frame time consistency.  Plus the fact that we're about to get a new generation of consoles that will push CPU utilisation much higher than it is today.

The difference in consistency is bigger than in average FPS, but the difference still isn't THAT big. Here are some benchmarks if you don't believe me. Sadly, I couldn't find i3-9100F benchmarks, but the 8350K is more or less the same.

 

Far Cry 5 99th percentile:

i9-9900K: 114.7

i3-8350K: 83.9

Difference: i3 27% slower

 

Metro Exodus 99th percentile:

i9-9900K: 81.2

i3-8350K: 92.5

Difference: i3 14% faster

 

DOTA 2 99th percentile:

i9-9900K: 144.2

i3-8350K: 112.0

Difference: i3 22% slower

 

But please remember that this is with an RTX 2080, and it's at 1080p. The differences are smaller at higher resolutions.

 

 

Most of the benchmarks from AnandTech tell a similar story, except their test bed has some outliers like Civ 6, which is very CPU-heavy. But it's worth remembering that their benchmarks are also done with a very high-end GPU (1080 Ti), far more powerful than the 1660 the OP has.
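The percentage gaps quoted above can be sanity-checked from the raw 99th-percentile numbers (positive means the i3 is slower):

```python
# Relative gap between the i9 and i3 99th-percentile figures,
# using the numbers quoted above (positive = i3 slower).
def pct_slower(i9_fps: float, i3_fps: float) -> int:
    return round((i9_fps - i3_fps) / i9_fps * 100)

print(pct_slower(114.7, 83.9))   # 27  -> Far Cry 5
print(pct_slower(144.2, 112.0))  # 22  -> DOTA 2
print(pct_slower(81.2, 92.5))    # -14 -> Metro Exodus (i3 faster)
```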

1 hour ago, Alex Atkin UK said:

True, but I tend to play a lot of open-world games, and moving from an i5 4690 to an i5 8600K to an i9 9900K, I've seen an improvement in smoothness each time.  It's not all about frame rates; it's about the frequency of hitches/stutters, and the really eye-opening change was fast-travel load times.

Fast-travel load times are more to do with the storage, I would have thought. To be fair, I haven't upgraded my CPU that much; I didn't notice much going from a 6600K to a 3600X. The main issue with quad cores atm is aggressive DRM, like in Assassin's Creed.


12 hours ago, Lord Vile said:

Fast-travel load times are more to do with the storage, I would have thought. To be fair, I haven't upgraded my CPU that much; I didn't notice much going from a 6600K to a 3600X. The main issue with quad cores atm is aggressive DRM, like in Assassin's Creed.

I'm not entirely convinced it's the DRM in Assassin's Creed; I think it's the complex world simulation.  People underestimate how much is going on in those games.

I actually had a shock when I upgraded my CPU: some games loaded as fast from the HDD as they did from the SSD on the old CPU.  I have no idea why that happened.  But the CPU can certainly bottleneck loading speeds, as even on an SSD the data has to be decompressed; the faster the storage, the faster the CPU needs to decompress to see the benefit.
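The decompression point is easy to demonstrate: if the CPU can't inflate data faster than the drive reads it, the drive speed stops mattering. A minimal sketch with Python's zlib (synthetic data, so the absolute numbers mean nothing):

```python
import time
import zlib

# ~100 MB of highly compressible synthetic "asset" data.
raw = bytes(range(256)) * 400_000
blob = zlib.compress(raw)

start = time.perf_counter()
restored = zlib.decompress(blob)
elapsed = time.perf_counter() - start

mb = len(restored) / 1e6
print(f"decompressed {mb:.0f} MB at {mb / elapsed:.0f} MB/s")
# If this rate is below the drive's sequential read speed,
# loading is CPU-bound and a faster SSD won't help.
```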

 

12 hours ago, LAwLz said:

The difference in consistency is bigger than in average FPS, but the difference still isn't THAT big. Here are some benchmarks if you don't believe me. Sadly, I couldn't find i3-9100F benchmarks, but the 8350K is more or less the same.

Far Cry 5 99th percentile:

i9-9900K: 114.7

i3-8350K: 83.9

Difference: i3 27% slower

Metro Exodus 99th percentile:

i9-9900K: 81.2

i3-8350K: 92.5

Difference: i3 14% faster

DOTA 2 99th percentile:

i9-9900K: 144.2

i3-8350K: 112.0

Difference: i3 22% slower

But please remember that this is with an RTX 2080, and it's at 1080p. The differences are smaller at higher resolutions.

Most of the benchmarks from AnandTech tell a similar story, except their test bed has some outliers like Civ 6, which is very CPU-heavy. But it's worth remembering that their benchmarks are also done with a very high-end GPU (1080 Ti), far more powerful than the 1660 the OP has.

That doesn't address frame time. You can have terrible frame pacing that doesn't show in the frame rate at all: you still get X frames per second, but they are not being drawn to the screen at a consistent pace, so it feels bad.  Although they seem to downplay the CPU angle (as they are assuming you have a high-end CPU), other YouTubers have shown the CPU to be relevant.

Also https://forums.tomshardware.com/threads/i3-9100f-bottlenecking.3514488/
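The frame-pacing point can be shown with two synthetic frame-time traces that average out to the same FPS but feel completely different (the millisecond figures are invented for illustration):

```python
# Two one-second traces of frame times (ms) with identical totals:
smooth  = [16.7] * 60                # steady ~60 FPS
stutter = [9.0] * 54 + [86.0] * 6    # same average, six big hitches

def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_ms(frame_times_ms):
    return max(frame_times_ms)

print(round(avg_fps(smooth), 1), round(avg_fps(stutter), 1))  # 59.9 59.9
print(worst_frame_ms(smooth), worst_frame_ms(stutter))        # 16.7 86.0
```

Same average FPS, but the second trace spends six frames at 86 ms, which is exactly the kind of hitch an FPS counter hides and a percentile or frame-time graph reveals.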


8 hours ago, Alex Atkin UK said:

I'm not entirely convinced it's the DRM in Assassin's Creed; I think it's the complex world simulation.  People underestimate how much is going on in those games.

I actually had a shock when I upgraded my CPU: some games loaded as fast from the HDD as they did from the SSD on the old CPU.  I have no idea why that happened.  But the CPU can certainly bottleneck loading speeds, as even on an SSD the data has to be decompressed; the faster the storage, the faster the CPU needs to decompress to see the benefit.

 

That doesn't address frame time. You can have terrible frame pacing that doesn't show in the frame rate at all: you still get X frames per second, but they are not being drawn to the screen at a consistent pace, so it feels bad.  Although they seem to downplay the CPU angle (as they are assuming you have a high-end CPU), other YouTubers have shown the CPU to be relevant.

Also https://forums.tomshardware.com/threads/i3-9100f-bottlenecking.3514488/

Were you using the old version of SATA or something? 

 

I do think it's the DRM, because performance tanked when there wasn't much difference between the new game and the previous one, and it was when they first implemented that DRM.


  • 3 weeks later...

Well, mine is 7659. But don't go off the scores; go off what you do on your computer: does it run smoothly, is it clear? If you use it for new AAA games, then no, it's not good enough; but if you're playing games that are 10 or more years old, it's fine. Typing, it's fine. Streaming, no. If you're happy with what it does, then yes, it's good, but you won't be setting benchmark records. Cinebench is good for establishing a baseline for overclocking and seeing whether your overclock has made things better.
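For that overclocking-baseline use, comparing two runs is just a percentage change (the post-overclock score here is invented for illustration):

```python
# Percentage uplift between a stock Cinebench run and a later one.
def uplift_pct(before: float, after: float) -> float:
    return round((after - before) / before * 100, 1)

# e.g. the 7659 score above against a hypothetical post-overclock run:
print(uplift_pct(7659, 8100))  # 5.8 (% faster)
```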

[Cinebench screenshot attachment]
