AMD FX 8370 vs Intel I7 5960x [GTX 970 SLI 4K benchmarks]

Put something more powerful into the system and you'll see how quickly the cheaper CPUs fall behind.

Just look at Jay's system: three BIOS-modded Titan Xs, if I'm not mistaken. Cards like that will leave your CPU behind, so you have to overclock the CPU.

He overclocks all his systems regardless.

 

There is a difference between needing to OC, and just OCing because you CAN.

 

I didn't need to OC my 4790K, and I didn't really need to OC my FX 8320 (other than scoring better in Cinebench and gaining ~5 FPS in most games)... I did it because I COULD.

 

However, some products, like the G3258, you HAVE to OC, because they're rather meh out of the box.


He overclocks all his systems regardless.

 

There is a difference between needing to OC, and just OCing because you CAN.

 

I didn't need to OC my 4790K, and I didn't really need to OC my FX 8320 (other than scoring better in Cinebench and gaining ~5 FPS in most games)... I did it because I COULD.

 

However, some products, like the G3258, you HAVE to OC, because they're rather meh out of the box.

Jay does have to OC his 5960X to feed the three Titans. He explains it when he talks about why there's no point in adding a fourth card.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti DirectCU II SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM, and a WD Black hybrid SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

Jay does have to OC his 5960X to feed the three Titans. He explains it when he talks about why there's no point in adding a fourth card.

Hmm, wasn't that just because SLI scaling is atrocious in general, rather than the CPU not keeping up? I mean, only a handful of games have 3- or 4-card profiles optimized for either Nvidia or AMD... so going beyond two cards in CF or SLI is usually a bad idea.


I've had just as many crashes during WvW on my 8320 as on my current i7 4790K (both are/were OC'd)... GW2 is just atrocious in terms of its API usage, and it lacks a LOT of optimization: on the server side, on the client-to-server side, and in the game itself.

I mean, at what point does it make sense that I, living in the EU, must be routed through login servers in Houston, Texas, before being patched back to Köln/Hamburg, Germany? And the best part is, when somebody DDoSed the US servers (like when some nabs hit the Wildstar servers, housed in the same data center as the NA GW2 servers), ALL EU players got wrecked, because the login servers force us to stay synced with the NA servers. (This is also why I didn't notice much difference between the NA and EU servers in terms of ping lag.)

So yeah, GW2 crashes are NOT CPU-bound, for the most part.

The reason I mentioned GW2 is that I've done some of my own tests with it and noticed a difference. With my 2600K at 3.8GHz, during Jormag, my fps would drop to 20, the sound would cut out, and then I would crash. When I overclocked it to 4.4GHz, my fps only dropped to 30 at most and it never crashed. I am aware that there's a limit where a faster CPU won't change anything due to the game.


Uhm wasnt that just because of the SLI scaling being atrocious in general, rather then the CPU not feeding. I mean, only a fraction of a handful of games has 3 or 4 card profile that is optimized for either Nvidia or AMD... So Just going over 2 cards in CF or SLI is usually a bad idea.

I agree it's bad, but Jay showed GPU utilization with two cards, and with three cards at different clocks. When he OC'd his CPU, utilization on the GPUs went up.


The attacking and bullying in this thread is sickening. Some of you are trying way too hard to discredit the author of that article, and others are just bashing out of fanboyism. It makes me lose faith in the forums.

 

So I can't criticise an author who, by his own admission, made a biased review and tried to swing the narrative he set out to get across?

Are you saying that by pointing out the flaws and calling him out on them, I'm being morally objectionable? Ever heard of peer pressure, m8?

 

You do realise the AMD FX vs. Intel threads are just going to be littered with this bogus benchmark, right? And that this could cause people to purchase unbalanced systems, because "clearly the CPU doesn't matter". I find it more interesting that you consider the author's behaviour fine but criticise my critique. That's a weird set of moral standards.


I agree it's bad, but Jay showed GPU utilization with two cards, and with three cards at different clocks. When he OC'd his CPU, utilization on the GPUs went up.

Just rewatched the vid. I think he was referring to the difference in synthetics rather than real games, and synthetics are far better optimized to reveal the true potential of any hardware. So... yes, it probably is a bottleneck of sorts, but SLI scaling should be far more of a bottleneck, seeing as he said, and I quote:

 

"One 980 ref vs one Titan X, there is a huge difference between the two, but in a three-way configuration it's in fact very minimal gains."


Just rewatched the vid. I think he was referring to the difference in synthetics rather than real games, and synthetics are far better optimized to reveal the true potential of any hardware. So... yes, it probably is a bottleneck of sorts, but SLI scaling should be far more of a bottleneck, seeing as he said, and I quote:

 

"One 980 ref vs one Titan X, there is a huge difference between the two, but in a three-way configuration it's in fact very minimal gains."

I agree the SLI scaling has the biggest impact on performance.
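The diminishing returns being described can be sketched with a toy model: GPU throughput grows sub-linearly with card count while the CPU caps the total. All numbers below (frame rates, the 0.5 per-extra-card factor, the 120 fps CPU cap) are made up for illustration, not measured:

```python
def sli_fps(single_card_fps, n_cards, per_extra_card=0.5, cpu_cap=120.0):
    """Toy model: each extra card adds only a fraction of a full card's
    throughput, and the CPU caps the total regardless of GPU power."""
    gpu_fps = single_card_fps * (1 + per_extra_card * (n_cards - 1))
    return min(gpu_fps, cpu_cap)

# One card: the stronger GPU is clearly ahead.
print(sli_fps(60, 1), sli_fps(80, 1))    # 60.0 vs 80.0
# Three cards: both configurations hit the CPU cap, so the gap vanishes.
print(sli_fps(60, 3), sli_fps(80, 3))    # 120.0 vs 120.0
```

This reproduces the pattern in the quote: a big gap between cards one-on-one, minimal gains once a shared cap dominates.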


So I can't criticise an author who, by his own admission, made a biased review and tried to swing the narrative he set out to get across?

Are you saying that by pointing out the flaws and calling him out on them, I'm being morally objectionable? Ever heard of peer pressure, m8?

 

This SJW society, man...

So you are saying fanboyism isn't real because a weaker product cannot compete with a stronger one when there is something else holding them both back?

 

It's like trying to fill a 1000L tank through an inlet that only allows a 5L/minute flow rate... It won't matter how big the pump is; you'd still be limited by that inlet.
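The analogy is easy to make concrete. A minimal sketch, with the tank and inlet numbers taken from the post and the pump sizes invented for illustration:

```python
def fill_time_minutes(tank_litres, pump_rate_lpm, inlet_rate_lpm):
    """The effective flow is the minimum of the pump and the inlet;
    oversizing the pump past the inlet buys nothing."""
    effective = min(pump_rate_lpm, inlet_rate_lpm)
    return tank_litres / effective

# A 1000 L tank through a 5 L/min inlet takes 200 minutes,
# whether the pump can push 50 L/min or 5000 L/min.
print(fill_time_minutes(1000, 50, 5))    # 200.0
print(fill_time_minutes(1000, 5000, 5))  # 200.0
```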


So you are saying fanboyism isn't real because a weaker product cannot compete with a stronger one when there is something else holding them both back?

 

It's like trying to fill a 1000L tank through an inlet that only allows a 5L/minute flow rate... It won't matter how big the pump is; you'd still be limited by that inlet.

 

Read my earlier posts, and those of patrickjp93, to see my problems with the benchmark presented. I'm fully aware of the limitations at 4K. It still doesn't explain a lot of factors.

 

You just have to squeeze the word "fanboy" into every sentence, don't you? And that analogy is pretty bad.


Read my earlier posts, and those of patrickjp93, to see my problems with the benchmark presented. I'm fully aware of the limitations at 4K. It still doesn't explain a lot of factors.

 

You just have to squeeze the word "fanboy" into every sentence, don't you? And that analogy is pretty bad.

That analogy fits this article. Yes, it isn't great, but it makes the point for CPU vs. CPU (not system vs. system).

 

Honestly though, RAM timings: I'm not sure they would create this much disparity, although I know the FX IMC isn't great (my own FX's IMC was awful beyond belief). Getting 2133 stable on the FX is a small feat in itself and must have taken some tweaking for sure. Perhaps, JUST PERHAPS, they modded the timings to make it stable?

 

The DDR4 RAM is, from what I can see, plain DDR4 with OK timings for that speed. The DDR3 sticks do have slightly better stock CAS latency (normal would be around 11; bad would be 12 or 13)... but it's nothing groundbreaking.

 

 

Oh well, theoretically there may be a disparity; practically, I'm not sure it would matter with the GPUs at hand... Perhaps if they had used AMD GPUs, with their better resolution scaling and worse CPU overhead, the FX would struggle more. But we'll never know, I guess.
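As an aside, raw CAS numbers aren't comparable across DDR3 and DDR4 until you convert them to absolute time. A quick back-of-the-envelope (the speed/CL pairs below are illustrative, not necessarily the kits from the article):

```python
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    """First-word latency in nanoseconds: CAS cycles divided by the
    memory clock, which is half the transfer rate for DDR."""
    clock_mhz = transfer_rate_mts / 2
    return cas_cycles / clock_mhz * 1000

# DDR3-2133 CL11 vs DDR4-2666 CL15: the 'worse' CAS number is
# nearly a wash once converted to nanoseconds.
print(round(cas_latency_ns(2133, 11), 2))  # ~10.31 ns
print(round(cas_latency_ns(2666, 15), 2))  # ~11.25 ns
```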


Oh, the Intel fanboys are at it again! "Everyone must be a gamer, so bottleneck here, bottleneck there, everything is a DX11 bottleneck... pure Intel, man!"

 

 

You have to understand that there are people who actually utilize AMD 8-cores in other areas; not everything is going to be massively bottlenecked, and an AMD 8-core is usually cheaper than an Intel 8-core. If you're doing 3D modeling and need those 8 cores, physical cores are going to be better than virtual cores. If I'm on a budget and I need to do 3D modeling, why oh why would I spend 1000 bucks instead of 2-300? Now, calm down. Not everyone is a gamer.

You can always buy an i7 5820K for $389 :) It's 6 real cores and 12 threads, and it will blow any of AMD's 8-core CPUs out of the water! It has much higher IPC than any AMD CPU; one Intel core with HT is better than one AMD module (2 cores).

 

Actually, AMD's CPUs are not eight truly independent cores: they consist of 4 modules, each with 2 cores sharing resources. This means 1 module != 2 cores; it's more like ~70% of 2-core performance, so an AMD 8-core CPU is even slower than if it were a true independent 8-core design.
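That rough 70%-per-module figure can be turned into a napkin estimate of effective core count (the 0.7 factor is the claim above, not a measured value):

```python
def effective_cores(modules, per_module_scaling=0.7):
    """Treat each 2-core module as delivering per_module_scaling
    of two fully independent cores (the ~70% figure quoted above)."""
    return modules * 2 * per_module_scaling

# An FX "8 core" (4 modules) would then behave more like:
print(effective_cores(4))  # 5.6 effective cores, not 8
```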

 

There is no logic in these tests. Look at this:

[Image: FX 8370 vs i7 5960X gaming benchmark, The Witcher 3, GTX 970]

 

The FX 8370 is 4/4.3 GHz; the i7 5960X is 3/3.5 GHz. But AMD isn't even a true 8-core, while Intel is a true 8-core with 16 threads, and even at 4.4 GHz it loses to the FX 8370 running at its 4.3 GHz max turbo? LOL

 

This is shit!

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


Threads are useless for what I use. I want 8, not 6. :x Also, I already said that, so you're repeating me.

You want 8 and that's it? What about the fact that Intel's 6-core is much faster than the 8-core found in AMD's CPUs?


It was 2013. :D I don't think there was a $300-something i7 with 6 cores at the time.

You had to start from this :D

 

But these tests are just false!


I'd like to see an independent third party comparison.

LOL, there's nothing to be done with these two CPUs then. The FX 8370 should be compared with the i7 6700K.

 

I don't exactly care if they are true or not. I'm not loyal to any hardware company, but you do know there are tasks where the 8-core AMD > the 6-core i7. That i7 does have more advantages, but not for what I mainly focus on. I need actual physical 8 cores :D for the 3D modelling I do, where physical cores > threads. However, my AMD 8-core does really badly in single-threaded tasks and a lot of other tasks where that 6-core will destroy it.

No! The i7 5820K is much faster in 3D modelling too, with its 6 cores / 12 threads!

 

Look at the FX 8350 vs the i5 3570K: lol, only 4 cores / 4 threads and AMD is only slightly faster. What if that were the FX 8370 vs the i7 5820K with 6 cores / 12 threads? LOL, Intel's 6 cores are much faster than AMD's 8 cores!

 


I'll tell all of you that these tests are entirely plausible. It's not about single-threaded performance; it's about the fact that games under DirectX 11 don't take advantage of multi-core processors, *part* of the equation we are clearly failing to address. This is partly why the dual-core Pentium G3258 and the beefy quad-core 4790K both perform so effectively with the current generation of games.

 

We're over-obsessing about single-threaded tasks; that's what drew me toward this article. If tasks/instructions can't be parallelized, then hardware naturally sits idle. If hardware is sitting idle, then we have yet to create software that takes advantage of it.

 

Look back to when 64-bit CPUs first hit the home PC market. How many programs were available to take advantage of them? How many years did we wait until the technology was fully employed?

 

We have this obsession with getting the best at all times and ignoring the price we pay for it. I bought an XPS 15 (L502x) with an i7 2820QM in 2011. Was the extra cache worth the extra cash? These days I'm thinking no, and it was a major family decision to pay extra for it. I'm really not sold on how well I did or didn't future-proof my office machine that could casually play games (the GeForce GT 540 quickly shamed me). If this article is true, how many people (mate_mate91, valdyrgramr) might be feeling some buyer's remorse?

 

Conclusion: we don't all live in the upper third of our communities while playing PC games; price-to-performance matters. I would've stuck with consoles if I hadn't considered it carefully this time. Just like counting the gigabytes on your video card, the megahertz on your GPU, DDR3 vs. GDDR5, or the nm of a manufacturing process, it isn't always that simple. That AMD came out with this design in 2011 doesn't mean it's worthless or noncompetitive today. Let's reconvene this conversation when the software catches up to DX12. You'll see in the video that some of AMD's cores are doing nothing.

 


This is totally not BS.

At 4K you are simply GPU limited.

The CPU doesn't make any significant difference in 4K gaming anymore, with most GPUs we have right now.

 

I wonder why they chose GTX 970 SLI specifically for 4K, and not something higher-end that would actually make more sense at that resolution. I think it would be fair to run the same benchmarks with two or three Titan Xs: a setup which is far more likely to be bottlenecked by that 8370 at 4K!

CPU: AMD Ryzen 9 - 3900x @ 4.4GHz with a Custom Loop | MBO: ASUS Crosshair VI Extreme | RAM: 4x4GB Apacer 2666MHz overclocked to 3933MHz with OCZ Reaper HPC Heatsinks | GPU: PowerColor Red Devil 6900XT | SSDs: Intel 660P 512GB SSD and Intel 660P 1TB SSD | HDD: 2x WD Black 6TB and Seagate Backup Plus 8TB External Drive | PSU: Corsair RM1000i | Case: Cooler Master C700P Black Edition | Build Log: here


Fanboys, fanboys everywhere.

 

Anyway, the test certainly is interesting. While I agree with others that there's a lot to be said about the methodology used here, and I don't think there's actually a large group of people who buy a 5960X just to game (and those who do honestly kind of deserve to overpay, because they clearly can't handle money well), it's still interesting.

My Build:


CPU: i7 4770k GPU: GTX 780 Direct CUII Motherboard: Asus Maximus VI Hero SSD: 840 EVO 250GB HDD: 2xSeagate 2 TB PSU: EVGA Supernova G2 650W


I wonder why they chose GTX 970 SLI specifically for 4K, and not something higher-end that would actually make more sense at that resolution. I think it would be fair to run the same benchmarks with two or three Titan Xs: a setup which is far more likely to be bottlenecked by that 8370 at 4K!

 

That's simple. I personally had a 970 already, and when I wanted to do the report I realized that I needed something more powerful for 4K tests. Unfortunately, we're a smaller site and getting someone to provide us with a Titan X or 980 Ti is nigh impossible. So, I paid for a second 970 out of pocket as all of the performance results I've seen put 2x 970s in line with a single 980 Ti or Titan X. It was simply the most cost effective solution at the time. 


So I can't criticise an author who, by his own admission, made a biased review and tried to swing the narrative he set out to get across?

Are you saying that by pointing out the flaws and calling him out on them, I'm being morally objectionable? Ever heard of peer pressure, m8?

 

You do realise the AMD FX vs. Intel threads are just going to be littered with this bogus benchmark, right? And that this could cause people to purchase unbalanced systems, because "clearly the CPU doesn't matter". I find it more interesting that you consider the author's behaviour fine but criticise my critique. That's a weird set of moral standards.

 

Did I call anyone out by name? Nope. Did I quote anyone? Nope. But I do think it's funny that you think I'm talking about you; that says a lot about you. I made a comment about the amount of effort being put into trying to discredit the author (most of which comes across as attacks) and about the amount of bullying being done by fanboys just because they're being fanboys. The fact that you got so defensive about my post says a lot about what you posted in this thread. And no, I have no problem with people questioning a review or test; we all need to question everything in reviews and tests. It's the way it's being done that makes me sad. Most people just go about it like, "Nope, impossible. You are incorrect. You must have done the test wrong."

 

And again, I'm not calling anyone out. I'm not saying you're bashing/bullying/fanboying or anything of the like. I just made a general statement that a lot of the posts in this thread make me sad about this forum. Too many fanboys attacking.

PC Audio Setup = Beyerdynamic DT 770 pro 80 ohm and Sennheiser pc37x (also for xbox) hooked up to Schiit Fulla 3


Some of you guys are becoming so toxic. Instead of saying "oh, I think this test is flawed; you're using GPU-limited games in a CPU comparison", you guys are pretty much saying "this is a BS test, AMD paid you off, you are a fanboy, you've tainted the tests"... Ridiculous. Use constructive criticism, ffs.

The guy who made the article is on the forum, actively replying to you. All he's saying here is that for the majority of games (most of which are GPU-bound), you don't need an expensive Intel CPU. The problem mainly comes from console players and/or heavy Intel fanboys who completely dismiss all AMD CPUs as trash for gaming, despite the fact that that really isn't the case. Yes, he could've compared against a 6700K or even a 6600K, considering the age of Piledriver. He also could've added an i3 or even a lower-end i5 to further prove his point, but those are things he can add in the future.

Everyone knows the 5960X is better than the 8370; this doesn't change that.

THE BEAST Motherboard: MSI B350 Tomahawk   CPU: AMD Ryzen 7 1700   GPU: Sapphire R9 290 Tri-X OC  RAM: 16GB G.Skill FlareX DDR4   

 

PSU: Corsair CX650M     Case: Corsair 200R    SSD: Kingston 240GB SSD Plus   HDD: 1TB WD Green Drive and Seagate Barracuda 2TB Media Drive

 

 

 

 


The attacking and bullying in this thread is sickening. Some of you are trying way too hard to discredit the author of that article, and others are just bashing out of fanboyism. It makes me lose faith in the forums.

 

It's not fanboyism, it's scrutiny, and the article doesn't hold up to it. It's also not fanboyism to compare potentially the most powerful consumer CPU available at the moment (depending on workload) with something that performs competitively with Sandy Bridge. When these two are apparently being heralded as equals, it is very much worth questioning what testing methodology led to this result and whether it actually holds any water.


Some of you guys are becoming so toxic. Instead of saying "oh, I think this test is flawed; you're using GPU-limited games in a CPU comparison", you guys are pretty much saying "this is a BS test, AMD paid you off, you are a fanboy, you've tainted the tests"... Ridiculous. Use constructive criticism, ffs.

The guy who made the article is on the forum, actively replying to you. All he's saying here is that for the majority of games (most of which are GPU-bound), you don't need an expensive Intel CPU. The problem mainly comes from console players and/or heavy Intel fanboys who completely dismiss all AMD CPUs as trash for gaming, despite the fact that that really isn't the case. Yes, he could've compared against a 6700K or even a 6600K, considering the age of Piledriver. He also could've added an i3 or even a lower-end i5 to further prove his point, but those are things he can add in the future.

Everyone knows the 5960X is better than the 8370; this doesn't change that.

 


i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


 

The guy who made the article is on the forum, actively replying to you. All he's saying here is that for the majority of games (most of which are GPU-bound), you don't need an expensive Intel CPU.

 

His testing methodology doesn't even prove that much. All he needed to do to show it was to run the game at 800x600 and demonstrate that these CPUs could push out at least 60 fps, completely isolated from GPU bottlenecks.

 

If he had done this and come to the conclusion that the 8320 averaged 80 fps while the 5960X averaged 160, this would have been evidence that only people using high-refresh-rate panels need care, and that for everyone else the 8320 would have been fine. (However, if he had found that the 8320 dropped below 60 fps at all, then he would have proven the opposite: that AMD truly isn't worth looking at until they release a new product.)

 

He didn't do this. He manufactured a situation in which both CPUs were heavily restrained by the GPUs. All this proves is that if you are using 970s in SLI at 4K, they will limit you before either of these CPUs will, which is a very specific message to take away, and not altogether informative.
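That isolation argument boils down to a min() model: observed fps is capped by whichever of the CPU or GPU is slower, and dropping the resolution raises the GPU cap until the CPU difference becomes visible. The caps below are invented for illustration:

```python
def observed_fps(cpu_fps_cap, gpu_fps_cap):
    """Whichever side is slower sets the frame rate."""
    return min(cpu_fps_cap, gpu_fps_cap)

# At 4K the GPU cap dominates: both CPUs look identical.
print(observed_fps(80, 45), observed_fps(160, 45))    # 45 vs 45
# At 800x600 the GPU cap is huge, so the CPU difference shows.
print(observed_fps(80, 500), observed_fps(160, 500))  # 80 vs 160
```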

