
FX-8350 vs i5-3570K (GTX 670) --> Tek Syndicate

Not this guy again. The 8350 beats the 3570K in Skyrim at stock? Yeah, right. It's so obvious his EVGA Stinger mobo is throttling the 3570K. How does a 670 get only 25 FPS with a 3570K? I'm no Intel fanboy; I totally agree that the AMD benchies are correct, but the Intel benchmarks are just beyond questionable.

He really doesn't know how to benchmark.

I have an i3 + a brand-new 7870, and I get 25-28 FPS at stock clocks, max settings, in Far Cry 3.

Their Skyrim has 100+ mods on it. If you watch more of their videos, they go into this. You still have no idea what you're talking about.

100+ mods. Do I need to say this again? 100+ mods.


Sometimes for budget builds AMD is the way to go, but the i5 is much better if you want the best performance in games. Hopefully AMD gets a bit more competitive next gen, because right now they're really lagging behind. I tend to find AMD GPUs better at the mid-range and Nvidia better at the high end. That's mostly personal preference, though. Drivers especially.
@Zuba Twizta, they have another video with the AMD GPU; look it up, I can't remember the name.

I think it's been covered a lot; it just seems like he got a bad i5 and a good 8350. I don't have a problem going to 4.5 GHz, and I'm using a fkn Hyper 212 Evo.
@Vanwazltoff, if he can get the i5 to 4.5 GHz, the mobo must not be that bad, and 5 GHz is very easy on the 8350 since it's only a 1 GHz bump. I can get my 8120 to 4.4 GHz, maybe even further if I had a better mobo and cooling.

PS: I don't know shit.


Honestly, if you're comparing 'gaming CPUs' and you want to run the ... 'questionable' tests that he did, and you actually have the balls to bring a price point into it, then there's no way you should legitimately use 1440p resolutions, because the displays capable of that are ANYTHING but budget. For gaming right now, a 720p or 1080p monitor is king, because games just aren't optimized for ultra-high-end displays yet. *Blaming consoles for holding back game development? At least I am.*

With that said, the tests that were run were unrealistic, and ... even the 2nd video produced questionable results.

I can match & (barely) beat the Metro 2033 benchmarks with SLI GTX 550 Tis on a Sabertooth X58 using an i7-960. That alone calls the results of his test, with a GTX 670 or a 7870, into serious question. I love my computer to death and am overprotective to a fault, but I'm not dumb, and I'd expect those GPUs to beat out my setup.

-That is assuming there isn't some crazy performance increase in Windows 8 with SLI on the X58 platform at 16x/16x that somehow generates an insane amount of performance and could be expected to match or beat a 670 or 7870.


The FX 8350 is a great processor, but the FX 8320 is actually better in terms of performance per dollar.

Why would I say that? Well, the FX 8320 is just as good an overclocker as the FX 8350; in fact, on OCN the highest overclock achieved in the FX 8300 owners' club was 5.4 GHz, on an H100 with an FX 8320.

Now, when you really think about it: the FX 8320 costs $170 on Amazon and TigerDirect, $50 cheaper than the 3570K.

AMD motherboards are generally cheaper than equivalent Intel ones, and you get those 8 cores, which will really benefit you if you do any productivity workloads.

So why, oh why, would you buy an Intel CPU? It boggles my mind.

The only drawback to the FX CPUs is the power consumption, but because the difference in price is so big, you'd have to run the 3570K rig for 6 years and 4 months just to make back the $50 price difference.

(Running a 3570K @ 4.5 GHz instead of an FX 8350/8320 @ 5 GHz saves you about 8 dollars per year.)

Of course, that is assuming you go by his ... 'questionable' math, assuming you only play 3 hours a day, and assuming the computer isn't under heavy load for most of the day.
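
For anyone who wants to check that payback arithmetic, here's a minimal Python sketch; the $50 price gap and the $8/year savings are the figures from the post above, and the 3 h/day and 6 h/day scenarios are the ones discussed in this thread:

price_gap = 50.0        # $ price difference between the FX 8320 and the 3570K (from the post)
savings_per_year = 8.0  # $ saved per year by the 3570K, assuming 3 hours of gaming per day

# Payback time is just the price gap divided by the yearly savings.
print(price_gap / savings_per_year)        # 6.25 -> roughly 6 years at 3 h/day

# Doubling usage to 6 h/day doubles the savings and halves the payback time.
print(price_gap / (2 * savings_per_year))  # 3.125 -> roughly 3 years at 6 h/day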

[Quote of the FX 8320 performance-per-dollar post above.]

The math is sound, and if you play for 6 hours a day you'd still have to use your system for 3 years and 2 months to make up the difference in price, just enough time for your next upgrade.

[Quote of the FX 8320 performance-per-dollar post above.]

Yes, but that's assuming you don't use your computer for any heavy loads like video work or editing as well.

[Quote of the FX 8320 performance-per-dollar post above.]

Well, do your own math and figure out what works for you.

If electricity is expensive where you live and you do a lot of heavy workloads, I guess an FX processor won't suit you.


Well, I think it's good to see some people opening their eyes to the AMD camp and seeing it's not all bad... And for some workloads those 8 cores are really good... And to all who are saying that the ITX motherboard they used on the Intel setup is bad: if that motherboard was so bad, why did they get a decent overclock out of it? I think it's a fair video. The AMD camp can't match the i7s on LGA 1155, not to mention 2011, but for me and others on a budget who are looking at i5 performance, the AMD FX camp is also a good way to go (and I did go that way :O :D)

CPU: AMD FX 8120 @ 4.5GHz - CPU cooler: Cooler Master Nepton 240M watercooling - Mobo: Asus M5A97 Pro - GPU: Sapphire Radeon R9 280X Vapor-X @ 1200MHz - Memory: Kingston HyperX 16GB DDR3 - Storage: Kingston SSD & Seagate Barracuda HDD - PSU: Cooler Master V850 (850W) - Case: Cooler Master Cosmos II

-- Build Log old PC (HAF XB): 'the Cube': http://linustechtips.com/main/topic/36288-the-cube-cooler-master-haf-xb/ --

 


[Quote of the post above about 1440p pricing and the SLI 550 Ti / Metro 2033 comparison.]

You have to take into consideration the settings he is using.

He maxed every single setting out, even the AA and AF settings.

8x AA alone eats almost half of your FPS.


[Quote of the post above about opening eyes to the AMD camp.]
I salute you for ignoring all the media buzz and fanboyism going on and making a sound decision.

[Quote of the post above about 1440p pricing and the SLI 550 Ti / Metro 2033 comparison.]

Oh, I understand all right. :P Though I'm still matching performance. *Granted, I have to run in SLI, but still.*

Though I honestly think you can't call it a real-world benchmark, then test on a 1440p monitor and expect a user on AMD to buy one. Those are so far beyond normal costs it's silly... so the real-world tests would be 1080p & 720p, because those are the two common, good, cheap monitor resolutions that would ever be considered for a build that wasn't baller high-end.

I'm going to test Crysis, Crysis Wars, & Crysis Warhead under the same settings within the next few days, but outside that... In real-world gaming, people aren't going to crank their **** to max just because, and those who do... aren't going to give a flying **** about the cost difference in CPUs, as they are likely to build their computer to do more than just gaming and want to show it off in other aspects.


Interesting article when it comes to CrossFire:

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407.html

If I'm going to trust anyone with benchmarks, it's going to be TTL, not Tek Syndicate; I watch them more for interesting news and a few laughs than anything else.

^ Not to mention the price difference... lmao.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


[Quote of the post above about opening eyes to the AMD camp.]
Seconded! I honestly don't understand why some people are upset about this news. Agreed, you don't have to believe everything anybody on the internet says, but the arguments some people use are ridiculous (throttling of the CPU by the motherboard used, a GPU bottleneck, unrealistic resolutions/settings, ...). A little bit of competition in the market is only beneficial for us, the consumers. I wonder if some people react like this because they feel threatened in their choice of CPU.

My god, those guys are dislikeable. I want them to be right, and their smugness still repulses me.

My first question would be why AMD has been silent this whole time. If their CPU is more capable than Intel's staple gaming CPU, the one recommended in 90% of builds, why have they not made some noise? If they've been unjustifiably losing money hand over fist to the 3570K, why didn't they say anything?


Shh... I dare everybody to stop talking about this.


export PS1='\[\033[1;30m\]┌╼ \[\033[1;32m\]\u@\h\[\033[1;30m\] ╾╼ \[\033[0;34m\]\w\[\033[0;36m\]\n\[\033[1;30m\]└╼ \[\033[1;37m\]'


"All your threads are belong to /dev/null"


| 80's Terminal Keyboard Conversion | $5 Graphics Card Silence Mod Tutorial | 485KH/s R9 270X | The Smallest Ethernet Cable | Ass Pennies | My Screenfetch |


If you like making videos or livestreaming or multitasking AT ALL,

the 8350 is your best bet;

if not,

get a 3570K.

If I had one wish, I would ask for a big enough ass for the whole world to kiss

 


I think the larger point is that you shouldn't purchase anything more than a G860/FX-6300 for a gaming rig with a single-GPU setup, since you'd be hitting a GPU bottleneck anyway. The vast majority of games don't even fully utilize 4 cores; often it's more like having certain processes on the 3rd/4th core that barely increase performance. The non-GPU-bottlenecked tests do show both easily hitting 60 fps, which is more than enough anyway.

I will say, though, that I do not agree with the results. I've said in other threads that I've looked through other tests, and not only do they show the i5 winning, but the margins are usually very, very small, even in the low-res tests and such. I'm really starting to question whether the Stinger was acting as a bottleneck.


To clarify, the 3570K is about $10-15 more expensive in Australia than the FX-8350.

Firstly, I'm not an AMD fan or an Intel fan; I'm a fan of whichever chip is better at the price point it's been given. However, there were a few things I found a little concerning in this video.

The numbers seem much lower than they should be -- for example, with Crysis Warhead my 3570K with my 7970 gets almost 60 fps at stock speeds. Already, the "41 fps" they state at stock seems a little fishy. With Arma 2, the 3570K somehow got 25 fps at stock? Benchmarks elsewhere show the 3570K getting over 100 fps at Ultra@1080p, though with no AA and no AF. Is it possible the 8350 handles AA/AF better than the 3570K?

"Far Cry 3 is another game that seems to favor the AMD"

This is not what we should be looking at for benchmarks. When a 3960X gets only 36 frames per second in Far Cry 3, I'm afraid I find it hard to believe that the 8350 is getting 57. In these instances, I think the game itself is specifically optimized for the AMD CPU.

I think the difference you're likely to see in most games is similar to the Metro 2033 results. That is to say, the 3570K will win in every category, but not by much (a couple of frames).

Overall verdict: in the instances where the 8350 actually won, it won by a LOT. To me, these chips are more or less equal -- the 8350 isn't a true eight-core; it's more like a slightly-better-than-hyperthreading quad-core (which is why the 3570K still wins in some rendering/editing scenarios). They are both excellent chips, but it's generally not disputed that a 3570K beats the 8350 in most scenarios.

Like Toby says, I want them to be correct, but something about the way Logan states everything as fact makes me less likely to believe him. I'm opting for the idea that, more or less, the benchmarks the AMD processor won are simply better optimized for it, given how much it won by. Double the frames per second? Please; the two processors are not that different. If the 8350 is getting double the fps of a 3570K in a game, either the game is at least slightly more optimized for that CPU or you have done something wrong. The real-world difference between the two CPUs is win-some-lose-some, but not by a lot.

Let's look at Arma 2, overclock / stock clock:

AMD: 51.92 / 37.8

Intel: 25.56 / 13.12

I refuse to believe that the above benchmark is an indication of CPU performance. It isn't. It's an indication of how the game is optimized. The processors are more or less comparable; a benchmark like that seems illegitimate.

I could be wrong, but that just seems wrong. As an advocate for neither AMD nor Intel, I can happily say that the 8350 wins in some games. I just can't believe it isn't due to CPU optimization in Arma 2 specifically.
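
As a side note, here's a quick back-of-the-envelope check on those Arma 2 numbers as a small Python sketch; the stock and overclocked frequencies in the comments are my assumptions based on the clocks discussed in this thread, not figures from the video:

# Arma 2 FPS figures quoted above (overclocked / stock).
amd_oc, amd_stock = 51.92, 37.8
intel_oc, intel_stock = 25.56, 13.12

print(amd_oc / amd_stock)      # ~1.37x gain from the overclock
print(intel_oc / intel_stock)  # ~1.95x gain from the overclock

# Assumed clocks: FX 8350 at 4.0 -> 5.0 GHz (~1.25x) and 3570K at
# 3.4 -> 4.5 GHz (~1.32x). FPS rarely scales better than clock speed,
# so a ~1.95x FPS gain from a ~1.32x clock bump suggests the stock
# Intel run was limited by something other than the CPU itself, which
# fits the throttling suspicion raised elsewhere in this thread.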


[Quote of eBombzor's post above questioning the Intel numbers.]

^

"Now eBombzor, I'm really happy for you, and imma let you finish, but Logan's Skyrim has one hundred mods at the same time. One hundred mods at the same time!"


[Quote of the post above analyzing the benchmark numbers and the Arma 2 results.]

Though I'd have to agree, I'd like to play a bit of devil's advocate if I may. I propose that their tests would be valid *if carefully done and watched over by people who actually knew what they were doing and how to keep track of facts and factors*. What I honestly think is missing is more games. If you honestly want to test just game performance, then you should be pulling over 100 games to verify & test, with varying ages & system requirements *and that information also recorded*.

What bothers me is that... they did it under false pretenses. 'Real-world gaming performance', which means 720p or 1080p for most people, doesn't include everything cranked up to the max, because a lot of the graphical effects either get in the way or aren't noticeable to the average person between 2x and 6x AA. The real problem isn't whether the CPU is good or bad at games; it's that both perform well as needed, with some settings tweaks to help optimize for your system one way or another, in the Nvidia Control Panel or AMD/ATI's version of it.

There's another problem: people who know what they're doing can make older CPUs wipe the floor with those benchmarks *given new GPUs*, while those who are essentially computer babies who don't know RAM from HDD would run everything at stock speeds. When it comes down to it, Intel is the better CPU for them, because it's made to work well without being OC'd *though OCing doesn't hurt performance if ya know what ya doing*.

Let me know if I'm off base, or if I actually hit part of what bothers you about it as well.


[Quote of the post above analyzing the benchmark numbers and the Arma 2 results.]

Completely agree; not a wide enough range of games. I must say I didn't particularly like the video.

I have a 7970 and, granted, I run 5760x1080, but most people with a 670 don't have 1440p monitors -- on a mainstream level.

Most people don't use heavy AA/AF either; this is a flaw... I would be happy with it if it included more games. At least ten, to show real-world distinctions between the two, if there are any.

On a general level, most people won't be overclocking. Though this might be what we would do, I don't have an H100i and am more than happy with my 4.2 GHz speed. I'm sure they could have pushed that 3570K a little more with the H100... If you're going to overclock them, why increase the clock on both by just 1 GHz? Boost each up as much as you can to show the similarity... and you might as well give them both the same cooler while you're at it.

It just seemed like a very rushed bunch of strange benchmarks... I felt as though they needed to take more time with the systems they put together and whatnot.


@Toby: Logan? Smug? lol, you must be new here...

As for all the butthurt: this was a comparison of how the CPUs react in certain games, so CPU optimization doesn't matter. If you play Far Cry and want the highest possible fps and stability, you go with the results, period.

Logan and Wendell stated multiple times that they couldn't care less what the results are. They stated quite clearly that they all run Intel, so if anything underhanded were afoot, wouldn't they have fibbed the results the other way? We all need to relax. The Tek guys are here to help the community; shitting on them just makes you look like a smug fanboy. Oh, and btw, 1440p isn't some exotic resolution; I use two 1440p monitors with my 3930K and SLI 660 Tis.


[Quote of eBombzor's post above questioning the Intel numbers.]

OK, what do mods have to do with the 8350 beating the 3570K when Skyrim is a quad-threaded game? They just make the game a little harder to run; that's about it. The 8350 should be about 5-10 FPS behind the 3570K.

He says Hardware Canucks is unbiased? http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/57446-fx-8350-cpu-review-amds-vishera-arrives-17.html

His own results contradict HC's. Not to mention HC's benchmark scores are proportionally identical to Tom's and AnandTech's reviews.

Mods only make the game more demanding; they don't magically make it better optimized for more cores or a different architecture! The FPS should decrease proportionally from the unmodded benchmarks, but it doesn't, at least not in Logan's fantasy land.

I don't hate AMD; I would even recommend an 8350 over the 3470 any day. But this is just ridiculous.
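
To illustrate that proportionality argument, here's a tiny hypothetical Python sketch; the FPS numbers and the 40% mod cost are made up purely for illustration, not taken from any benchmark:

# If a heavy mod list costs every CPU the same fraction of its FPS,
# the relative ranking between the CPUs is preserved.
unmodded = {"3570K": 90.0, "FX-8350": 80.0}  # hypothetical unmodded Skyrim FPS
mod_cost = 0.40                              # assume mods eat 40% of FPS on any CPU

modded = {cpu: fps * (1 - mod_cost) for cpu, fps in unmodded.items()}
print(modded)  # {'3570K': 54.0, 'FX-8350': 48.0} -- same ~1.13x ratio as before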


[Quote of the post above analyzing the benchmark numbers and the Arma 2 results.]

I totally agree, and my reasoning is that the mobo is throttling the 3570K. I mean, why would he use the EVGA Stinger?

The benchmarks are incredibly inconsistent, and this is a topic we shouldn't have to argue about: the 3570K beats the 8350 in gaming (outside Civilization V), and it's a known fact. If AMD had known that their CPUs performed a lot better, they would've addressed it like they did with the 8150 back in Oct 2011, and they would've priced the 8350 next to the 3570K.

It's not that we think Intel is always better; it's that the 8350 is getting double the FPS of the 3570K. I would be just as suspicious if the 3570K had double the FPS of the 8350.

