
The 8-core showdown and analysis thread.

ALXAndy

Still cool to see that in Crysis 3 an FX-9590 with a 290X is beating a 4930K with a 780 Ti.

Both at 1080p and 1440p.

 

But ssst, don't tell anyone. :D


Still cool to see that in Crysis 3 an FX-9590 with a 290X is beating a 4930K with a 780 Ti.

Both at 1080p and 1440p.

 

But ssst, don't tell anyone. :D

Like I told you before, educate yourself on what CPU or GPU bottlenecking means. You're comparing the performance of two GPUs from different brands, not the CPUs like you're trying to do. Google what the CPU and GPU each do in a game; the information is readily available. Run that test at 720p to remove the GPU cap and the 4930K will destroy it.

Let me make it simple:

1) GPU bottleneck: the GPU sits at 99% the whole time:

[image: GPU load holding at a constant 99%]

2) CPU bottleneck: the GPU is below 99%, e.g. at 50%:

[image: GPU load sitting around 50%]
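If you want to check this yourself rather than eyeball screenshots, here's a minimal sketch that samples GPU load with nvidia-smi while your game runs and guesses the likely limiter. It assumes an NVIDIA card with nvidia-smi on the PATH, and the 95% threshold is my own illustrative cut-off, not an official number:

```python
# Minimal sketch: sample GPU utilization via nvidia-smi while a game runs
# and classify the likely bottleneck. Assumes an NVIDIA GPU and nvidia-smi
# on the PATH; the 95% threshold is illustrative, not official.
import subprocess
import time

def gpu_utilization_percent() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])  # first GPU only

def classify_bottleneck(samples: int = 30, interval_s: float = 1.0) -> str:
    readings = []
    for _ in range(samples):
        readings.append(gpu_utilization_percent())
        time.sleep(interval_s)
    avg = sum(readings) / len(readings)
    # Sustained ~99% load means the GPU is the limiter; a noticeably lower
    # sustained load means the CPU can't feed it fast enough.
    return "GPU-bound" if avg >= 95 else "CPU-bound (or frame-capped)"

if __name__ == "__main__":
    print(classify_bottleneck())
```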

If we were to massively simplify the process of creating a frame in a 2-way SLI system and then measure how long each component took to complete its share of the work, it might look like the images below. Each individual frame is first prepared by the CPU and then handed off to a GPU to be rendered, as illustrated.

[image: simplified frame timeline, CPU prepare followed by GPU render]

How well SLI scales depends on how efficiently the game is able to handle multiple GPUs, how developed the SLI profile for that game is, and also how busy we can keep the GPUs. That last part means we have some responsibility in the scaling equation: making sure the GPUs are the bottleneck in our system.

[image: 2-way SLI frame timeline showing GPU idle gaps]

Introducing a second graphics card to our system has yielded a measurable performance gain of about +15%; however, our GPUs now have a lot of downtime that wasn't present before. More importantly, our CPU is now constantly busy preparing new frames, and with each GPU completing its frame long before the CPU can provide it with another, we're back to being CPU-limited. If we want to get more use out of our second graphics card, that means we have some tweaking to do.
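To put numbers on the hand-off just described, here's a toy model of alternate-frame rendering: the CPU prepares frames one at a time and passes each to the next GPU in round-robin. All the millisecond figures are invented purely for illustration:

```python
# Toy alternate-frame-rendering (AFR) timeline: the CPU prepares frames one
# at a time and hands each to the next GPU. Once CPU prepare time dominates,
# a second GPU adds little. All numbers are made up for illustration.
def afr_fps(cpu_ms: float, gpu_ms: float, num_gpus: int, frames: int = 1000) -> float:
    cpu_free_at = 0.0                 # when the CPU finishes preparing a frame
    gpu_free_at = [0.0] * num_gpus    # when each GPU finishes its last frame
    last_done = 0.0
    for i in range(frames):
        cpu_free_at += cpu_ms                     # CPU prepares the frame
        g = i % num_gpus                          # AFR: round-robin the GPUs
        start = max(cpu_free_at, gpu_free_at[g])  # wait if that GPU is busy
        gpu_free_at[g] = start + gpu_ms
        last_done = gpu_free_at[g]
    return frames / (last_done / 1000.0)

# GPU-heavy scene: a second GPU nearly doubles fps.
print(afr_fps(cpu_ms=5, gpu_ms=20, num_gpus=1))   # ~50 fps
print(afr_fps(cpu_ms=5, gpu_ms=20, num_gpus=2))   # ~100 fps (GPU-bound)
# CPU-heavy scene: a second GPU barely helps.
print(afr_fps(cpu_ms=15, gpu_ms=20, num_gpus=1))  # ~50 fps
print(afr_fps(cpu_ms=15, gpu_ms=20, num_gpus=2))  # ~67 fps (CPU-bound)
```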

Here, have a good read that covers GPU/CPU bottlenecking, written by an NVIDIA staffer: https://forums.geforce.com/default/topic/532913/sli/geforce-sli-technology-an-introductory-guide/post/3749687/#3749687

Your goal is, and always has been, to make the GPU the bottleneck, because the GPUs render what you see on your screen, not the CPU. That's why you never see a difference between CPUs that have all been pushed to max load, and why you should never see a 350% difference like Logan showed in Far Cry 3. In some games, like WoW, you'll never have enough CPU performance to reach that point (unless you pair it with a cheap GT 9200). Call it BS as you usually do, but you're only BS'ing yourself by being in denial.


Pff, I'm not trying to achieve anything, lol.

It's just a funny side note, because Crysis is a very demanding game.

Of course it's basically a GPU comparison, but it's still kinda funny.

 

I suppose you really need to see a doctor or something.

Because it seems to really hurt you, as an Intel fanboy.

I don't really care about the whole thing, to be honest ;)

 

[facepalm image]


Of course it's basically a GPU comparison, but it's still kinda funny.

Nice backpedalling here. Who would start comparing GPUs in a CPU thread? Who brings up the Logan vs Pistol video? You were clearly aiming to compare the 4930K with the 9590.

 

 

Because it seems to really hurt you, as an Intel fanboy.

People who call others fanboys are usually die-hard AMD fanboys themselves. Don't get me started on the biased information you've sprayed around, or on you calling bullshit on every benchmark that shows Intel winning by a huge margin, whether from anandtech/vrzone/pclab/hwinfo/hardwarepal/pcgameshardware/sweclockers/hcw or a bunch of videos. As long as something shows Intel doing better, it's automatically classified as BS, or you get a /facepalm picture.

 

 

I suppose you really need to see a doctor or something.

If you had stuck to the facts much earlier, you wouldn't have said this.


 

Parallelism will always be crap in games; it's about how well a game is multithreaded, not just how much. Games will never magically saturate 8 threads to make AMD somehow equal to Intel's i5s/i7s.

 

Bollocks.

 

As for the rest? I think you've missed the point. I never said that the cores offer more performance; I simply said they were being used and showing loads. How much this core use translates into performance? I don't know yet. I will be testing that next.

 

But it seems I upset your blue god.

 

Edit: I just wanted to point out a couple of things here.

 

I didn't just post this article here; I posted it on other forums too. It's drawn plenty of reaction from the Intel fanboys, and I've been accused of all sorts. So, to clear a few things up.

 

This is not all down to Windows 8, and these are not dummy loads being put on the cores by the OS. Please look at the Crysis 2 results. That's a game that uses four cores at best, and mainly leans on two cores. As such, the results were exactly as I expected.

 

Please also note, I chose games that like cores. I'm no fanboy; I have AMD rigs and I have Intel rigs. I could run more games that don't load the cores, but there's no point. One is enough to show the difference between a game that supports threading and uses it and one that doesn't. I deliberately chose 99% of the games I ran because I knew how they thread.

 

Whether or not this threading makes a difference? Well, it's quite clear that in some cases clock speed doesn't mean very much. Take Metro Last Light, for example: I just read a review where they dropped the clock speed of the 8350 down to 2GHz (a 44% cut) and the game only ran 14% slower. The game clearly prefers to be well threaded rather than rely on massive clock speeds.
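As a rough sanity check on that result, assume frame time splits into a clock-sensitive CPU share and a share that doesn't care about CPU clock at all. That's a big simplification, and the 44% figure is taken from the review as quoted:

```python
# Back-of-the-envelope check on the Metro Last Light numbers above, under
# the simplifying assumption that frame time = clock-sensitive CPU part +
# a part that doesn't scale with CPU clock.
clock_scale = 1.0 - 0.44  # clock cut by 44%, per the review cited above
fps_drop = 0.14           # game ran 14% slower

# new_frame_time / old_frame_time, with fraction f of frame time on the CPU:
#   (1 - f) + f / clock_scale = 1 / (1 - fps_drop)
ratio = 1.0 / (1.0 - fps_drop)                # ~1.16
f = (ratio - 1.0) / (1.0 / clock_scale - 1.0)
print(f"clock-sensitive share of frame time: {f:.0%}")  # ~21%
```

By this crude model, only about a fifth of the frame time was tied to CPU clock in that test, which fits the game being mostly GPU-bound or well threaded.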

 

I still have testing to do, and will do it, but right now I am short of time. I am going to disable the cores two at a time to see what impact, if any, it has on the overall FPS of a game. It could make 0% difference; at no point have I claimed that the extra cores and threading make games run any faster. The sole point was to get people off this ridiculous ideology that games only run on four cores. If a game that supports threading can clearly use 16 threads (and one that doesn't ignores a large chunk of threads) then that's my point proven. I know of a good few other games that support highly threaded CPUs and plenty that don't!

 

At no point was this thread, or any of the information in it, supposed to be an attack or 'diss' on one company or another. At the end of the day the results are set in stone; there's nothing I can do to change them. However, please pay close attention here: I chose the Xeon. I chose it because I would rather have a lower-clocked CPU using less power and running half as hot than one screaming its balls off. And since the AMD was unable to beat the Intel in two of the very latest games (Tomb Raider and Metro Last Light), I hedged my bets on the Intel, because it allows my rig to make about 80% less noise than before. That's all!

 

From a lot of the testing I have done, the two setups seem to end up in the same place: fewer cores + a high clock speed generally equals more cores at a lower clock speed. Please note, the entire PC gaming industry as we know it is aiming for the latter: more cores, less clock speed. They are aiming to take the CPU out of the equation as much as possible and leave the graphics cards to do their rightful job.

 

If you still don't want to listen to me? Fine. Then here you go, here it is straight from the coder's mouth.

 

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

 

Paying close attention to...

 

"I'd go for the FX-8350, for two reasons. Firstly, it's the same hardware vendor as PS4 and there are always some compatibility issues that devs will have to work around (particularly inSIMD coding), potentially leading to an inferior implementation on other systems - not very likely a big problem in practice though," he says.

"Secondly, not every game engine is job-queue based, even though the Avalanche Engine is, some games are designed around an assumption of available hardware threads. The FX-8350 will clearly be much more powerful [than PS4] in raw processing power considering the superior clock speed, but in terms of architecture it can be a benefit to have the same number of cores so that an identical frame layout can be guaranteed."

 

Now, that quote is taken directly from Avalanche Studios' Chief Technical Officer, Linus Blomberg.

 

So, I will take the word of the devs, given that, you know, they tend to know what they are talking about better than anyone else.
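For anyone wondering what "job-queue based" actually means in code terms, here's a rough Python illustration. The jobs are placeholders I made up, not real engine work, and because of Python's GIL this shows the structure rather than real CPU scaling: the engine cuts a frame into small jobs, and a worker pool sized to the machine's hardware threads drains the queue, so it adapts to whatever core count it finds instead of assuming a fixed thread layout.

```python
# Rough sketch of a job-queue engine design: work is cut into small jobs
# and a pool of workers (one per hardware thread) drains the queue, so the
# same code scales to any core count. Jobs here are stand-ins, not real
# engine work.
import concurrent.futures
import os

def simulate_job(job_id: int) -> int:
    # Stand-in for one slice of frame work (culling, animation, AI, ...).
    return sum(i * i for i in range(10_000))

def run_frame(num_jobs: int = 64) -> None:
    workers = os.cpu_count() or 4  # size the pool to available hardware threads
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(simulate_job, range(num_jobs)))  # drain the job queue

run_frame()
```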

Area 51 2014. Intel 5820k@ 4.4ghz. MSI X99.16gb Quad channel ram. AMD Fury X.Asus RAIDR.OCZ ARC 480gb SSD. Velociraptor 600gb. 2tb WD.


You, sir, have made the post of the year, at this point at least! :)

  • My System: MSI Z87-GD65 - CPU: I7 4770K OC 4.0 GHZ I7 4770 (3.9 GHz), Antec 620 watercooler - GTX 760 2GB/OC (MSI TwinFrozr IV) - 16 GB Kingston 2133 - 750w Energon - Zalman Z11+ HF1 - Storage: 2x120 GB EVO 840, 3 TB Seagate - Display: Asus 27" VE276Q | Planned upgrades: Gamdias Hermes ultimate, HyperX Cloud, MSI AMD R9 290x, 2x120 HyperX (Linux boot drives)

  • Pedro19reddit: "A good console exclusive is like a Ferrari that can only be driven on a swamp." SlickPc: "I couldn't even under-clock my 5-year-old GFX card to be as slow as a 'next-gen' console" #PcMasterRaceProblems


@ALXAndy Very nicely executed review! It gives you a fairly good idea of how things stand. You should ignore @Faa btw; he's nothing more than a raging Intel fanboy who'll just come into every post related to AMD and fill it with bogus benchmarks. Arguing with him is a waste of time. Adding him to my ignore list was the best thing I ever did.

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


@ALXAndy Very nicely executed review! It gives you a fairly good idea of how things stand. You should ignore @Faa btw; he's nothing more than a raging Intel fanboy who'll just come into every post related to AMD and fill it with bogus benchmarks. Arguing with him is a waste of time. Adding him to my ignore list was the best thing I ever did.

But bro, don't you know? If you game at 720p or less you will bottleneck. Everyone buys 720p monitors these days, am I right? Lmao.
You can't be serious. Hyperthreading is a marketing joke?

 

 


As for the rest? I think you've missed the point. I never said that the cores offer more performance; I simply said they were being used and showing loads.

Haven't you read my post? Threads switch rapidly between cores; that's been a thing since the '90s, and Task Manager is showing us exactly that behaviour. You're not interested in how many cores showed some load over time; you're interested in how many cores are being used, and how heavily, at the same moment, not averaged over an hour.

4 threads in use -> only 4 cores can be active at any one instant -> yet Windows shows load on every core.
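Here's a toy model of that scheduler behaviour: only 4 threads ever run at once, but because they hop between the 8 cores, every core averages about 50% load over time, which is exactly the misleading picture a per-core usage graph gives you:

```python
# Toy model of thread migration: 4 busy threads get moved between 8 cores
# each tick, so over time every core shows load in a Task Manager-style
# average even though only 4 cores are ever active at once.
import random

CORES, THREADS, TICKS = 8, 4, 1000
busy_ticks = [0] * CORES
for _ in range(TICKS):
    for core in random.sample(range(CORES), THREADS):  # 4 cores active this tick
        busy_ticks[core] += 1

for core, busy in enumerate(busy_ticks):
    print(f"core {core}: {busy / TICKS:.0%} average load")  # ~50% on every core
```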

 

 

 

I'm no fanboy, 

Then don't act like a typical AMD fanboy who's not willing to accept criticism.

 

 

One is enough to show the difference between a game that supports threading and uses it and one that doesn't. 

 

So there are games that don't support threading? I'll read that as single-threading, since you didn't use the word "multi". How exactly can an app or game not have a thread?

 

 

I deliberately chose 99% of the games I ran because I knew how they thread.

 

Doesn't seem like you do.

 

If a game that supports threading can clearly use 16 threads (and one that doesn't ignores a large chunk of threads) then that's my point proven. I know of a good few other games that support highly threaded CPUs and plenty that don't!

Multi means two or more. Multithreading is about having a thread for each task; parallel coding is a totally different story, one that is a massive key factor for performance and much harder to pull off, especially in games. A ten-year-old game like WoW has 40-50 threads, meaning it can touch 40-50 cores. Of course, the engine itself runs on 2 threads, and the other threads (mana bar, combat log, energy bar, chat, etc.) have practically no impact on your performance, meaning it needs nothing more than a dual core. Sure, BF4 has more heavyweight threads, but I'm not interested in seeing the engine on 5 threads and the physics on 1 thread taking an 8350 to 75% load at most; you'd obviously rather have the engine on 6 threads and the physics offloaded to the GPU, where the max load would still be 75% but with much better performance. It's about how well a game is multithreaded and the level of parallelism. Threads spend too much time waiting on each other to finish; they're not running completely independently of one another.

The BF4 example isn't exact; it's somewhat unknown how it's actually written.
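A quick sketch of why thread count isn't parallelism: the program below has 8 threads, but one shared lock makes them run strictly one after another, so wall-clock time is no better than a single thread doing everything:

```python
# Sketch: thread *count* is not parallelism. 8 threads exist here, but a
# shared lock serializes them completely, so total time is the same as one
# thread doing all the work in sequence.
import threading
import time

lock = threading.Lock()

def helper_task():
    with lock:           # e.g. helper threads all touching shared game state
        time.sleep(0.1)  # stand-in for a short burst of work

start = time.perf_counter()
threads = [threading.Thread(target=helper_task) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"8 threads, fully serialized: {time.perf_counter() - start:.2f}s")  # ~0.8s
```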

 

 

 

So, I will take the word of the devs, given that, you know, they tend to know what they are talking about better than anyone else.

Seems like your salesman didn't bother to tell you that 2 cores are reserved for the OS and the remaining ones are for the game. Straight from Sony: "The CPU performance analysis tool is pictured here, strongly suggesting that six of the eight AMD CPU cores are available to developers."

[image: Sony's CPU performance analysis tool, suggesting six of eight cores available]

http://www.eurogamer.net/articles/digitalfoundry-ps3-system-software-memory

Look at any BF4/WD benchmark: the 8350 only does a tiny bit better than the 6300, mainly because of how the architecture works (you'd be better off giving two threads to two modules instead of to one module; single-threaded performance is higher that way). There's no reason for any dev to make a game "8-core optimised" like you're claiming here if only 6 cores are available to them.
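If you want to test the module point yourself on Linux, here's a minimal sketch. It assumes the usual FX layout where logical cores 0/1, 2/3, 4/5, 6/7 pair up into modules that share a front end and FPU:

```python
# Linux-only sketch of the module point above: on an FX chip, logical cores
# (0,1), (2,3), (4,5), (6,7) pair into modules that share resources. Pinning
# two busy threads to different modules (cores 0 and 2) avoids that sharing;
# pinning them to one module (cores 0 and 1) forces it.
import os

def pin_to_cores(cores):
    os.sched_setaffinity(0, cores)  # restrict this process to the given cores

# Separate modules: better per-thread throughput on FX.
pin_to_cores({0, 2})
# Same module: the two threads contend for shared module resources.
# pin_to_cores({0, 1})
```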

 

 

From a lot of the testing I have done, the two setups seem to end up in the same place: fewer cores + a high clock speed generally equals more cores at a lower clock speed. Please note, the entire PC gaming industry as we know it is aiming for the latter: more cores, less clock speed. They are aiming to take the CPU out of the equation as much as possible and leave the graphics cards to do their rightful job.

Fewer cores + a lower clock speed + a shitload more IPC beats more cores + a higher clock speed, a.k.a. the 8350. Single-core performance is always going to be the factor that makes a CPU better for gaming, regardless of the console gimmicks. Only if every game reached the level of Cinebench's multithreading/parallelism would the 8350 have been better than the i5.
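Put crudely, per-core speed is roughly IPC times clock, which is why a lower-clocked i5 can still win per core. The IPC figures below are rough illustrative guesses I've assumed for the example, not measured numbers:

```python
# Crude per-core throughput comparison: per-core speed ~ IPC * clock.
# The IPC values are rough illustrative assumptions, not measurements.
def per_core_score(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

fx8350 = per_core_score(ipc=1.0, clock_ghz=4.0)      # FX-8350 as the baseline
haswell_i5 = per_core_score(ipc=1.7, clock_ghz=3.4)  # assumed ~70% higher IPC

print(f"FX-8350 per-core:    {fx8350:.2f}")      # 4.00
print(f"Haswell i5 per-core: {haswell_i5:.2f}")  # 5.78, despite the lower clock
```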

 

 

at no point have I claimed that the extra cores and threading make games run any faster.

Oh really?

[screenshot of an earlier post]

 

 

 

How much this core use translates into performance? I don't know yet. I will be testing that next.

I could help you with that. You really should aim for a purely CPU-bound scenario, like I did with my BF3 video. In Crysis 3, for example, you were GPU-limited and the CPU loads were quite low; if you disable 4-6 cores, the chance that you'd still be GPU-bound is quite high. You preferably want four GTX 780s in SLI, get CPU-bound with all cores enabled, and then start disabling cores. That way you can post results like 1 core 10 fps, 2 cores 20 fps, 3 cores 30 fps, up through 8 cores 80 fps (not that it's going to scale like that, but whatever), and a hypothetical 9th core might add no performance while you're still CPU-bound. That would make more sense than posting 1 core 10 fps, 2 cores 20 fps, 3 cores 30 fps, [now GPU-bound] 4 cores 40 fps, 5 cores 40 fps, 6 cores 40 fps, 250 cores 40 fps.

Just avoid doing it with the 8350, imo.
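The pitfall in numbers: measured fps is roughly the minimum of what the CPU side can feed and what the GPU can draw, so a low GPU cap makes extra cores look useless even when the CPU side keeps scaling. Illustrative figures only:

```python
# Model of the core-scaling pitfall described above: measured fps is roughly
# min(what the CPU can feed, what the GPU can draw). A low GPU cap makes
# extra cores appear to do nothing even while the CPU side keeps scaling.
def measured_fps(cores: int, fps_per_core: float, gpu_cap_fps: float) -> float:
    return min(cores * fps_per_core, gpu_cap_fps)

for cores in range(1, 9):
    weak_gpu = measured_fps(cores, fps_per_core=10, gpu_cap_fps=40)    # plateaus at 40
    strong_gpu = measured_fps(cores, fps_per_core=10, gpu_cap_fps=300) # keeps scaling
    print(f"{cores} cores: single GPU {weak_gpu:.0f} fps, 4-way SLI {strong_gpu:.0f} fps")
```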


I'm still impressed that the AMD did so well, considering that it's consumer-grade tech rather than server/industrial-grade tech. The fact that it can stay anywhere near the Xeon shows that AMD got some things right (though I would really like SOME more single-core performance in future chips) and offers the best price/performance. However, it also shows that Intel are convinced they have enough of a monopoly that they can lock things down and not innovate as much. To me this test shows they kinda shot themselves in the foot a little, since the AMD did comparatively well and the Xeon was limited by an unchangeable clock. I like AMD for the affordability and multi-core performance, but no matter how well it did or ever will do, it shouldn't be able to even touch a Xeon. They're different classes of chip. Intel really need to push forward.

 

Actually, if Intel can increase their core count without sacrificing per-core performance, then they will end up with the single best-performing chips in almost every scenario.


None of the pictures work; can you use Imgur?

AMD Ryzen 5 3600 | Corsair H100i Pro | ASRock X570 Phantom Gaming 4 | Corsair Vengeance 32GB 2x16gb @ 3200mhz  | Vega 64 @ Stock | Fractal Design Define R4 | Corsair RM750

 

ThinkPad T480 | Intel Core i7 8650u | Nvidia MX150 | 32GB DDR4 @ 2400mhz | Samsung 840 Pro 1tb | 1080p touchscreen


Awesome test bro, loved it.

THE LITTLE DEVIL: i5 4690K on Noctua NH-D15

Asus Z97 Sabertooth Mark 2

WD Black 1TB & WD Green 2TB HDD, Samsung 840 Evo 240GB & HyperX 240GB SSD

RAM: Corsair Vengeance 16GB and HyperX Fury 8GB

PSU: Seasonic SII 620 watt power supply

Peripherals: Logitech G403 mouse, Redgear Cherry MX Brown switches, Logitech Gamepad F310, Cosmic Byte XXL RGB mousepad

Audio: Sennheiser HD 558 and Antlion ModMic

Phone: Realme 3 Pro

