
i7 8700k with a 3080?

Consul

I have an 8700K that I use STRICTLY for 3D benchmarking.

Well, AMD hasn't released anything that can take it on. 

 

AMD pipelines need a little work yet imo. 

The pipeline is short enough for decent IPC, but too short to push higher frequencies.

 

AMD's 7nm+ is probably going to be maybe 10% faster than the current chips.

That's roughly 200MHz (+/-) of frequency gains without changing anything else.

Or they change other features a bit: a better instruction set, stepping revisions (A/B/C, etc.).

 

Intel won't bottleneck a 3080, and if it does, not as badly as AMD.

AMD usually has lower 1% low FPS anyway.

 

But it's AMD's time to shine with that good price and heavy thread count.

People like big numbers at low cost. Yeah, 24 threads sounds great, even though more than 50% of them sit idle during gaming.

 

But in the end, it comes down to who's willing to overclock for net gains. Only a few, I think.

Link to comment
Share on other sites

Link to post
Share on other sites

2 hours ago, farmfowls said:

Here's the video the OP references. Jay noticed that when benchmarking the 3080, the 8700k was bottlenecking it by 20fps. This was at 1080p, and he recommended getting a 1440p monitor instead of a new CPU/MB for those in this situation.

But what about 1440p? I have an 8086k at 5GHz, but unlike the OP I have a 1440p monitor. And even I'm worried now that I'll have to upgrade my CPU/MB too.

I didn't watch the video, but was the 8700k overclocked?

 

A stock 8700k runs at 4.3GHz all-core, pretty tame by Intel 2020 standards.

 

If it's in the 5GHz range, then you can safely assume any 10600k results will be interchangeable with the 8700k.

 

The 3600-3900 data seems to indicate there's not much of a difference between a 6/12 and 12/24 configuration, and I'd presume that carries over to 8700k/10600k-10700k-10900k as well. 

 

The processor being an 8700k or having 12 bajillion cores won't matter if the software can't utilize them.

 

The actual difference between a stock 10900k and a stock 8700k is about 700MHz. That's probably what's driving his 1080p FPS delta.
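Rough arithmetic on that claim, purely as a sanity check: assume a fully CPU-bound case where FPS scales about linearly with core clock (an optimistic simplification), and take a ~130fps 1080p figure as a hypothetical stand-in.

```python
# Sanity check: can a ~700 MHz clock gap alone explain a ~20 fps gap at 1080p?
# Assumes a fully CPU-bound scenario where fps scales roughly linearly with core clock.
allcore_8700k = 4.3                    # GHz, stock all-core turbo
allcore_10900k = allcore_8700k + 0.7   # the "~700 MHz" gap mentioned above

fps_8700k = 130                        # hypothetical ballpark 1080p average, illustration only
fps_estimate = fps_8700k * allcore_10900k / allcore_8700k
print(f"clock-scaled estimate for the 10900k: ~{fps_estimate:.0f} fps")
# ~151 fps, i.e. roughly 20 fps more than the 8700k in the best case for clocks alone
```

So a ~20fps delta at 1080p is at least plausible from clocks alone, before you even get to cores or mitigations.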

 

Him saying the 8700k isn't enough might convince people to go out and get something "newer" like a 3900x, and probably get worse bottlenecking on a 3080 in most games that don't give a shit about cores past 6.

 

EDIT: okay, watched his video.

 

1. He states that his 2080Ti numbers from today's rebenching (run alongside the 3080) show big improvements over his original 2080Ti results.

2. He states that there are typically driver improvements that cause changes over time.

3. He attributes the differences solely to the original tests having used the 8700k.

4. He didn't retest the 8700k today (they used a 10900k) to see whether the 8700k was really the reason for the differences, or whether it was in fact simply driver optimizations.

 

5. So he made an assumption without proving it and put it on YouTube.

 

Given Gamers Nexus' review of the 10900k, which adamantly recommends the 10600k over the 10900k for gaming alone, I have to assume Jayztwocents is just talking out of his ass.

 

OR

 

Security mitigations hit the 8th-generation chips that much harder.

 

Those are the only two scenarios that make sense to me.

 



2 hours ago, Mister Woof said:

I have to assume Jayztwocents is just talking out of his ass.

Even though I love Jayz's content, my vote is on this.
Because when you look at gaming performance, there is barely any difference before and after the patches.
The performance hits are mostly on work-related (enterprise, rendering, etc.) programs.


7 minutes ago, Consul said:

Even though I love Jayz's content, my vote is on this.
Because when you look at gaming performance, there is barely any difference before and after the patches.
The performance hits are mostly on work-related (enterprise, rendering, etc.) programs.

Interestingly, last month someone DID revisit the 8700k, and here was the result.

 

 

 

Can't link the image for some reason but:

 

https://www.techspot.com/review/2068-intel-core-i7-8700k-revisited/

 

5GHz 8700k SotTR 1080p high:

130fps average, 106fps 1% low

 

10900k SotTR 1080p high:

132fps average, 117fps 1% low

 

~10% better minimums; average FPS is practically identical.

 

SotTR, one of the more core-friendly games, seems to perform JUST FINE with the 2080Ti and the overclocked 8700k...
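Putting those TechSpot numbers into plain percentages (just arithmetic on the figures quoted above):

```python
# SotTR 1080p high, figures quoted above from the TechSpot 8700K revisit.
avg_8700k, low_8700k   = 130, 106   # 5 GHz 8700K: average fps, 1% low
avg_10900k, low_10900k = 132, 117   # stock 10900K: average fps, 1% low

avg_gain = (avg_10900k / avg_8700k - 1) * 100   # ~1.5% -> within run-to-run noise
low_gain = (low_10900k / low_8700k - 1) * 100   # ~10.4% better 1% lows

print(f"10900k vs 5GHz 8700k: average +{avg_gain:.1f}%, 1% low +{low_gain:.1f}%")
```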



On 9/6/2020 at 2:24 PM, DarkSmith2 said:

 


Neither Intel nor AMD has managed to bring out a faster CPU for gaming yet. More cores/threads are only useful in a very limited number of games.
You would actually have to move up to a higher resolution, or wait until something faster comes out, to remove any CPU bottleneck at 1080p.

 

The trend even goes in favor of the 8700k, with more and more DX12 games being developed, lifting the CPU bottleneck. I mean, I have a 2080Ti, and even at 720p The Division 2 and Modern Warfare on small maps (deathmatch) are GPU limited, while gaining about 100fps compared to 1080p (all settings as low as possible).

 

So my guesstimate is that the 8700k will be fine for the next 3 years, because AMD isn't even on par with Intel in gaming yet and won't be on par with its next gen either.

(Way too far behind; google some 720p benchmarks to see how far. Spoiler: it's above 20% on average, and even for AMD a +20% increase is unrealistic.)

Actually I missed this. My man here already had it covered.



33 minutes ago, Mister Woof said:

5GHz 8700k SotTR 1080p high:

130fps average, 106fps 1% low

 

10900k SotTR 1080p high:

132fps average, 117fps 1% low

 

~10% better minimums; average FPS is practically identical.

 

SotTR, one of the more core-friendly games, seems to perform JUST FINE with the 2080Ti and the overclocked 8700k...

I watched that video a while ago, and now that I see what you've written too, here's my conclusion:

 

There is no way an 8700k will meaningfully bottleneck a 3080, especially when overclocked. Even if it does, the difference will be so small that it's unnoticeable.

(Mine is not overclocked, but I run my PC on the High Performance power plan and it boosts up to 4.5GHz, and I feel like that is enough for a 3080 at this point.)


Just now, Consul said:

I watched that video a while ago, and now that I see what you've written too, here's my conclusion:

 

There is no way an 8700k will meaningfully bottleneck a 3080, especially when overclocked. Even if it does, the difference will be so small that it's unnoticeable.

(Mine is not overclocked, but I run my PC on the High Performance power plan and it boosts up to 4.5GHz, and I feel like that is enough for a 3080 at this point.)

Lol, I mean it still probably will to some degree, but like I said, probably no worse than anything else, and unless some game comes along that needs more threads it will still be as viable as the rest of them.



Also I'm hyped to personally see the difference between a 3080 and my current 1060 6GB. It's going to be amazing.


3 hours ago, Mister Woof said:

I didn't watch the video, but was the 8700k overclocked?

 

A stock 8700k runs at 4.3GHz all-core, pretty tame by Intel 2020 standards.

 

If it's in the 5GHz range, then you can safely assume any 10600k results will be interchangeable with the 8700k.

 

The 3600-3900 data seems to indicate there's not much of a difference between a 6/12 and 12/24 configuration, and I'd presume that carries over to 8700k/10600k-10700k-10900k as well. 

 

The processor being an 8700k or having 12 bajillion cores won't matter if the software can't utilize them.

 

The actual difference between a stock 10900k and a stock 8700k is about 700MHz. That's probably what's driving his 1080p FPS delta.

 

Him saying the 8700k isn't enough might convince people to go out and get something "newer" like a 3900x, and probably get worse bottlenecking on a 3080 in most games that don't give a shit about cores past 6.

 

EDIT: okay, watched his video.

 

1. He states that his 2080Ti numbers from today's rebenching (run alongside the 3080) show big improvements over his original 2080Ti results.

2. He states that there are typically driver improvements that cause changes over time.

3. He attributes the differences solely to the original tests having used the 8700k.

4. He didn't retest the 8700k today (they used a 10900k) to see whether the 8700k was really the reason for the differences, or whether it was in fact simply driver optimizations.

 

5. So he made an assumption without proving it and put it on YouTube.

 

Given Gamers Nexus' review of the 10900k, which adamantly recommends the 10600k over the 10900k for gaming alone, I have to assume Jayztwocents is just talking out of his ass.

 

OR

 

Security mitigations hit the 8th-generation chips that much harder.

 

Those are the only two scenarios that make sense to me.

 

Ah okay, so he ran the benchmarks with the 10900k/3080 and 10900k/2080Ti, saw a 20 fps increase for the 2080Ti over his initial benchmarks, and concluded that the different CPU was the cause, without actually testing to confirm it or ruling out other factors, one of which being the driver improvements he himself said cause changes over time?

 

And let's say, just for shits and giggles, that he is right and it is the CPU. Given that it would be a 20 fps hit at 1080p, would there still be a hit at 1440p? And if so, are we talking 10 fps? 5? 1?


4 minutes ago, farmfowls said:

Ah okay, so he ran the benchmarks with the 10900k/3080 and 10900k/2080Ti, saw a 20 fps increase for the 2080Ti over his initial benchmarks, and concluded that the different CPU was the cause, without actually testing to confirm it or ruling out other factors, one of which being the driver improvements he himself said cause changes over time?

 

And let's say, just for shits and giggles, that he is right and it is the CPU. Given that it would be a 20 fps hit at 1080p, would there still be a hit at 1440p? And if so, are we talking 10 fps? 5? 1?

Well, other outlets just ran 2080Ti numbers with a 5GHz 8700k vs. a 10900k, and there was minimal difference.

 

That doesn't mean it won't bottleneck a 3080 to a degree, but it throws doubt on Jayztwocents' statement that the 2080Ti at launch was slower because it was run on an 8700k.



4 minutes ago, Mister Woof said:

Well, other outlets just ran 2080Ti numbers with a 5GHz 8700k vs. a 10900k, and there was minimal difference.

 

That doesn't mean it won't bottleneck a 3080 to a degree, but it throws doubt on Jayztwocents' statement that the 2080Ti at launch was slower because it was run on an 8700k.

When you say minimal, how minimal do you mean? 1-2 fps? If so, that's totally fine. Currently I'm running a 1080 Ti with my 8086k (5GHz) and a 1440p monitor (PG279Q, 165Hz). So I was hoping to get the 3080 (or 3080 Ti/Super, if it ever comes) to improve my FPS in games and prepare for newer releases like Cyberpunk, etc. I'd also like to run RDR2 with higher settings and extra frames, just because it was the first game I actually had to turn down a bit on my current build. I was counting on the 3080 boosting my frames quite a bit.


4 minutes ago, farmfowls said:

When you say minimal, how minimal do you mean? 1-2 fps? If so, that's totally fine. Currently I'm running a 1080 Ti with my 8086k (5GHz) and a 1440p monitor (PG279Q, 165Hz). So I was hoping to get the 3080 (or 3080 Ti/Super, if it ever comes) to improve my FPS in games and prepare for newer releases like Cyberpunk, etc. I'd also like to run RDR2 with higher settings and extra frames, just because it was the first game I actually had to turn down a bit on my current build. I was counting on the 3080 boosting my frames quite a bit.

https://www.techspot.com/review/2068-intel-core-i7-8700k-revisited/

 

The 5GHz 8700k either beat or matched a stock 10900k, and only "lost" in SotTR 1% minimums.

 

mostly indistinguishable irl 



6 minutes ago, Mister Woof said:

 

https://www.techspot.com/review/2068-intel-core-i7-8700k-revisited/

 

The 5GHz 8700k either beat or matched a stock 10900k, and only "lost" in SotTR 1% minimums.

 

mostly indistinguishable irl 

After reading this, I'm very happy that I went for an 8700k.
Even though it's just 2 FPS, in RDR2 it beats the i9 10900k when overclocked (and it's on exactly the same level in average FPS when not overclocked).


@Mister Woof @Consul

Here is some other empirical data which could help (720p):

 

Some things I have to mention: with an overclock of 5.2GHz on the 8700k, the person who did that benchmark didn't have fast enough RAM. For a deeper explanation you can watch the Digital Foundry 8700k review. Long story short: if you don't have the memory bandwidth to match your overclock, the overclock does nothing but waste power, and you actually want DDR4-4000+ for a 5.2GHz OC on an 8700k or it won't perform like 5.2GHz.

To show something about this from my own testing:

This is the 8700k clocked at 5.2GHz with DDR4-3600 CL15 and a 4500MHz cache:

[screenshot attachment]

This is the 8700k clocked at 5.2GHz with DDR4-4000 CL17 and a 4700MHz cache:

[Cinebench R15 screenshot attachment]

This is the same 5.2GHz / DDR4-4000 CL17 / 4700MHz cache setup in Cinebench R20 (notice the single-core score):

[Cinebench R20 screenshot attachment]

This is a 10900k at 5.3GHz with a 4.8GHz cache and DDR4-4400 CL16 (HT disabled):

[screenshot attachment]
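For anyone wondering why the RAM speed matters that much: theoretical peak bandwidth for dual-channel DDR4 is just transfer rate x 8 bytes x 2 channels. A quick back-of-envelope sketch (it ignores timings and real-world efficiency, so treat it as an upper bound only):

```python
# Theoretical peak bandwidth for DDR4: each channel has a 64-bit (8-byte) bus.
def ddr4_peak_gb_s(mt_per_s: int, channels: int = 2) -> float:
    """Peak transfer rate in GB/s for a given DDR4 speed and channel count."""
    return mt_per_s * 8 * channels / 1000

for speed in (3600, 4000, 4400):
    print(f"DDR4-{speed}: ~{ddr4_peak_gb_s(speed):.1f} GB/s peak, dual channel")
# DDR4-3600: ~57.6 GB/s
# DDR4-4000: ~64.0 GB/s
# DDR4-4400: ~70.4 GB/s
```

Real sustained bandwidth is lower and timings matter too, but the relative gap is what you're feeding a 5.2GHz core / 4.7GHz cache with.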


