
Quad Core CPUs are slowly becoming like the Dual Core?

17 hours ago, Bhav said:

I still don't see why I would want to build around a 6700k when a 5820k and a mid-range X99 board can be had for only slightly more.

Because the 6700k performs better in games?


19 hours ago, FPS-Russia said:

For the first time, people now actually recommend i7s with high-end graphics cards, so does this mean quad cores are phasing out?

No, I think this is just people suddenly becoming familiar with the term "bottlenecking" and freaking out over the idea that they're leaving performance on the table by going with an i5 instead of an i7. Instead, people need to set some concrete, realistic requirements. Do you want 60 FPS on maximum settings at 1080p? Even a Core i3 can do that just fine.

 

But no, more threads do not automagically equate to better performance. I have an i7-6700, the non-K part. When I had a GTX 980 paired with it, it performed slightly worse than when I paired that card with an overclocked i5-4670K, and it wasn't even that huge of an overclock at 4.0GHz. The i7-6700 did score higher whenever CPU-related tasks were involved, though. And when I got a GTX 1080, I still saw a massive improvement across the board.

 

Besides, have a look at AnandTech's Broadwell-E gaming benchmark run. An i5-4690 (not K) can almost keep up with an i7-5960X. By "almost", I mean within 2-3 FPS.

 

Four threads are not being "phased out" by 8+ thread processors anytime soon. And with DX12 and Vulkan, that will likely remain the case for a while.


47 minutes ago, i_build_nanosuits said:

You don't do video rendering or transcoding of video files, do you?

No, I do parallel programming in Fortran+MPICH2.

 

I didn't say "the kind of programs that benefit from multiple cores", I said "the kind of programs that benefit the most from multiple cores", a.k.a. near 1/n scaling. HT is in fact even harmful, as I found out in my use case, so I had to switch it off to increase performance.

 

And as I also explained, even something like Cinebench already shows a very limited benefit from HT compared to physical cores, even crippled ones.

 

Naturally, programs with lots of idle and in-between time (at the within-core level, even if Task Manager tells you they're busy) benefit a lot from more efficient use of existing cores before needing additional ones (I think I said that too). But I'm talking about programs that really hog those cores, the more the better (as in, I've used 40+ cores and it would keep scaling the computation time down).


Just now, SpaceGhostC2C said:

And as I also explained, even something like Cinebench already shows a very limited benefit from HT compared to physical cores, even crippled ones.

https://docs.google.com/spreadsheets/d/1sxzGshuqVtFe_2zgRhN3gXCraR7d8p-NazJ6z0nsGGc/edit#gid=0

| CPU: Core i7-8700K @ 4.89GHz - 1.21V  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR


3 minutes ago, i_build_nanosuits said:

Nice spreadsheet. Look at those physical cores doing better than fake cores. Thanks for proving my point.

 

Also, no MP-to-SP ratio, which is how you measure what you're getting in terms of scaling. But of course, I already explained that too in the original message, which you seem to have quoted without reading.


If you need to run your games at 144 FPS, CPUs are usually a problem. To some degree, even i7s. That's not new. It's only natural that if your video card can manage 144 FPS in the game you're playing, it's probably going to expose your CPU's limits.

 

I think it's something forum posters have become a lot more aware of lately because of the large jump in performance Pascal cards had over Maxwell, and it's exacerbated by the fact that most of us are still playing games that don't demand much, or are older than a year. Gaming at 120+ FPS has suddenly become quite a bit more accessible, especially if you're playing Doom or Overwatch, and those people will probably find their CPUs limiting performance before their 1070/1080 does.

 

I think the solution for most of us isn't a new CPU, it's new games. Try Deus Ex: Mankind Divided, where even a GTX 1080 can struggle for 60 FPS at 1080p if you turn up the settings. If an i5 gets in the way of that, there's something wrong with the game, not the i5.


46 minutes ago, SteveGrabowski0 said:

Because the 6700k performs better in games?

Marginally better. Negligibly so, I'd even say. I'd trade that any day for a higher-end chipset, which X99 is compared to Z170, and for a CPU that's 30%+ faster across all cores.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


3 minutes ago, Morgan MLGman said:

Marginally better. Negligibly so, I'd even say. I'd trade that for a higher-end chipset, which X99 is compared to Z170, and for a CPU that's 30%+ faster across all cores.

I'd pick the 5820K over the 6700K any day of the week as well... it's a much more capable CPU.

| CPU: Core i7-8700K @ 4.89GHz - 1.21V  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR


4 hours ago, porina said:

If you can do that on my laptop with its H170 chipset, I'd be impressed. Not every Skylake i7 is a 6700k.

Well, it's up to the user to pick an overclockable chipset and CPU if they want to do that.

Linus is my fetish.


2 hours ago, SteveGrabowski0 said:

Because the 6700k performs better in games?

Lol, no. And if so, barely. The 5820k and the better X99 chipset are vastly better for the money than building around a 6700k.

Linus is my fetish.


46 minutes ago, Bhav said:

Lol, no. And if so, barely. The 5820k and the better X99 chipset are vastly better for the money than building around a 6700k.

You guys are all twisting what I'm saying. I said the reason someone would consider a 6700k over a 5820k was that the 6700k is better for gaming. I would buy the 6700k over the 5820k because gaming is what I'd be using it for. I know the 5820k mops the floor with the 6700k in video editing, but I don't care because I don't have a YouTube channel.


The 6700k overclocks higher because it has fewer cores; everyone knows that.

 

In that case, would you buy an unlocked Pentium instead of an i7?

 

Also, 148 FPS on a 4.6 GHz 5820k vs 150 FPS on a 6700k?
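
For the record, that works out to 150/148 - 1 ≈ 1.4%.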

 

http://lmgtfy.com/?q=define%3A+barely

Linus is my fetish.


Just now, Bhav said:

The 6700k overclocks higher because it has fewer cores; everyone knows that.

 

In that case, would you buy an unlocked Pentium instead of an i7?

No, an unlocked Pentium is garbage for gaming.


Just now, SteveGrabowski0 said:

And that 6700k and 5820k were both at 4.6 GHz in the video I posted anyway.

WOW, 2 FPS MORE! 148 FPS TO 150 FPS!!!! OMG WOWOWOWOWOWOWOWOW!

Linus is my fetish.


20 minutes ago, Bhav said:

WOW, 2 FPS MORE! 148 FPS TO 150 FPS!!!! OMG WOWOWOWOWOWOWOWOW!

Sarcasm is always the retort of someone with nothing interesting to say.

 

GTA V: first part of the benchmark (in the country)

4.6 GHz 6700k beats 4.6 GHz 5820k by 5.0% average

[image: Rds3PBM.jpg]

 

 

GTA V: second part of the benchmark (in the city)

4.6 GHz 6700k beats 4.6 GHz 5820k by 6.6% average

[image: tuvkJrh.jpg]

 

 

AC Unity

4.6 GHz 6700k beats 4.6 GHz 5820k by 6.6% average

[image: JSrVue7.png]

 

 

Far Cry 4

4.6 GHz 6700k beats 4.6 GHz 5820k by 22.8% average

[image: BWOCYbu.png]

 

 

Crysis 3 (first part of benchmark)

4.6 GHz 6700k loses to 4.6 GHz 5820k by 17.9% average

[image: wuXwSBh.jpg]

 

 

Crysis 3 (final result of benchmark)

4.6 GHz 6700k beats 4.6 GHz 5820k by 7.2% average

[image: i7zGI2g.jpg]

 

 

Shadow of Mordor

4.6 GHz 6700k beats 4.6 GHz 5820k by 2.6% average

[image: uhTVMgc.jpg]

 

 

Witcher 3 (Novigrad)

4.6 GHz 6700k beats 4.6 GHz 5820k by 2.7% average

[image: 9CgRruZ.jpg]
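
Average those per-game deltas (counting both Crysis 3 runs, with negative meaning the 5820k wins) and the 6700k comes out roughly 4.5% ahead. A quick sketch in Python:

deltas = [5.0, 6.6, 6.6, 22.8, -17.9, 7.2, 2.6, 2.7]  # % lead for the 6700k in each chart above
print(sum(deltas) / len(deltas))  # ≈ 4.45, so about a 4.5% average lead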


Ok then, you go buy a 6700k.

 

I'll stick to having bought a 5820k, and then having been lucky with an almost free upgrade to a 6850k after RMA woes for other stuff.

Linus is my fetish.


1 minute ago, Bhav said:

Ok then, you go buy a 6700k.

 

I'll stick to having bought a 5820k, and then having been lucky with an almost free upgrade to a 6850k after RMA woes for other stuff.

I would buy a 6700k if I were in the market for a CPU, since all I care about is gaming. I know a 5820k destroys a 6700k if you're a content creator or doing other very CPU-intensive tasks, in which case the 5820k is still an amazing deal. It might end up aging better too, though I'm not as confident about that with the new generation of consoles driving AAA game development toward pushing 4K instead of higher framerates.


Just now, SteveGrabowski0 said:

I would buy a 6700k if I were in the market for a CPU, since all I care about is gaming. I know a 5820k destroys a 6700k if you're a content creator or doing other very CPU-intensive tasks, in which case the 5820k is still an amazing deal. It might end up aging better too, though I'm not as confident about that with the new generation of consoles driving AAA game development toward pushing 4K instead of higher framerates.

And do tell me how you think the 6700k vs 5820k comparisons you posted would hold up in an SLI system.

 

You should correct your stance to 'All I care about is gaming on a single GPU'. 

Linus is my fetish.


Just now, typographie said:

For at least $50 less, and as someone who doesn't have a compute-heavy workload, that sounds like a great deal. :P

As long as you're only using a single GPU, and until games start using more cores.

Linus is my fetish.


9 minutes ago, Bhav said:

And do tell me how you think the 6700k vs 5820k comparisons you posted would hold up in an SLI system.

 

You should correct your stance to 'All I care about is gaming on a single GPU'. 

God, just take the loss already. With SLI Titan X:

 

[image: uNt12Yz.jpg]

 

[image: FcC7q9K.jpg]

 


 

[image: Oqkftja.png]

 

[image: qOvt0vs.png]

 

[image: PoeCRoT.jpg]

 

[image: QuBHMKg.jpg]


3 minutes ago, Bhav said:

As long as you're only using a single GPU, and until games start using more cores.

Sure. Which will probably happen around the time you can buy a 6+ core CPU with acceptable IPC for $150–200. And then maybe a year or three after that, since not everyone upgrades right away, and game development usually lags a few years behind the CPU market. Once the gaming mainstream has that many cores to throw at a game, then I'll worry about anyone still on only 4-8 threads (if anyone still is by then).


4 minutes ago, VerticalDiscussions said:

What is MP/SP?

Multi-Processor to Single-Processor (I may be misquoting Cinebench off the top of my head). Essentially, you can run Cinebench in single-core mode and in multi-core mode, and after doing so it will tell you your ratio as well (it will show it as MP score / SP score, or SP time divided by MP time, I don't recall which).

 

More generally, you can take any program that runs in both single-core and multi-core mode and divide the time spent in one case by the other. In that sense, the best-case scenario is to get 1/n, where n is the number of processors/cores, so that having twice as many cores takes half the time. This is typically impossible, but the closer to 1/n, the better the program "scales" (this is basically what GPUs do with their zillion cores/shaders/texture units/whatever, as their workload is very scalable).
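
For example (made-up numbers, just to illustrate): if the single-core run takes 400 s and an 8-core run takes 60 s, the ratio is 60/400 = 0.15 against an ideal of 1/8 = 0.125, so the program scales well but not perfectly.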

 

Back to Cinebench: HT lets a quad-core i7 do better than 1/4 but not reach 1/5 (more than 4x but less than 5x speedup), while i5s stay around 1/3.5. An FX-8xxx would do something like 1/6.5. (I'm talking about quad-core i7s, of course; the 6- and 8-core ones go to 6x+ and so on.)
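
A minimal sketch of that calculation in Python (the scores are made-up placeholders in the ballpark above, not real Cinebench results):

# Scaling from single-core (SP) and multi-core (MP) benchmark scores.
# Higher score = faster, so speedup = MP / SP, and perfect 1/n time
# scaling would mean speedup equals the core count.
def scaling(sp_score, mp_score, n_cores):
    speedup = mp_score / sp_score
    efficiency = speedup / n_cores  # relative to physical cores; HT can push this above 1.0
    return speedup, efficiency

print(scaling(sp_score=100, mp_score=440, n_cores=4))  # HT quad-core i7: 4.4x, between 1/4 and 1/5
print(scaling(sp_score=100, mp_score=350, n_cores=4))  # i5, no HT: 3.5x, around 1/3.5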

