
Best CPUs for the money 2013 by AnandTech

samuel

So, gaming-wise: one GPU, 1440p, max settings... and the winner is... the A8-5600K! With two cards, Core i5s and AMD's eight-cores start to make sense. This is insane!

http://www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-/9

 

Oh, Tom's Hardware... shame on you!


FX-8320 @ 4.2 GHz/1.280 V, 4.5 GHz Turbo @ 1.312 V | Thermalright HR-02 w/ TY-147 140 mm + Arctic Cooling 120 mm | VRM cooled by a 70 mm AMD stock cooler fan, 0-7200 RPM, PWM-controlled via SpeedFan | Gigabyte GA-990XA-UD3 | Gigabyte HD 7970 SOC @ R9 280X | 120 GiBee Kingston HyperX 3K | 2 TB Toshiba DT01ACA200 | 1 TB WD Green | Zalman Z11 + Enermax 140 mm TB Apollish Red + 2x Deepcool 120 mm and stock fans running @ 5 V | single-channel Patriot 8 GB (1333 MHz) + dual-channel 4 GB & 2 GB Kingston NANO Gaming (1600 MHz CL9) = 14 GB @ 1,600 Jigahurtz, 10-10-9-29 CR1 @ 1.28 V | Sirtec High Power 500 W | ASUS Xonar DG | Logitech F510 | Sony MDR-XD200 | Edifier X220 + Edifier 3200 | A4Tech XL-747H 3600 dpi | A4Tech X7-200MP | decent membrane keyboard | Philips 236V3LSB 23" 1080p @ 71 Hz
Sorry for my English....


I am just dumbstruck. Wut? This is... surprising. Good job, AMD.

8320 FTW :) 


This test is bullshit. If you only pick games that don't need much CPU power, you know that at max settings and 1440p you will run into GPU limitations, and the extra CPU power is useless.

 

And then look at the frame rates, 10-30 fps... yeah, really realistic settings, as if I would want to play like that! And where is the most important information, minimum frames?

 

Testing an A8-5800K instead of an FX-4350 or Athlon II X4 750K makes it even worse...

 

Never seen such a useless test...
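The bottleneck argument above can be captured in a toy model: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain. A minimal sketch with invented numbers (the caps below are hypothetical, not benchmark data):

```python
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Approximate delivered frame rate as the slower of the two stages."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: the GPU's ceiling shrinks as resolution rises.
gpu_caps = {"1080p": 90.0, "1440p": 45.0}

for cpu_name, cpu_cap in [("fast CPU", 120.0), ("budget CPU", 70.0)]:
    for res, gpu_cap in gpu_caps.items():
        print(f"{cpu_name} @ {res}: ~{delivered_fps(cpu_cap, gpu_cap):.0f} fps")

# At 1440p both CPUs deliver ~45 fps (GPU-bound, so they look identical);
# at 1080p the gap reappears: 90 vs 70 fps (the budget CPU becomes the limit).
```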


This test is bullshit. If you only pick games that don't need much CPU power, you know that at max settings and 1440p you will run into GPU limitations, and the extra CPU power is useless.

 

And then look at the frame rates, 10-30 fps... yeah, really realistic settings, as if I would want to play like that! And where is the most important information, minimum frames?

 

Testing an A8-5800K instead of an FX-4350 or Athlon II X4 750K makes it even worse...

 

Never seen such a useless test...

So... AnandTech is dumb, you say?

If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single threaded performance.  The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feel the same in the OS as an equivalent Intel CPU.



That's very interesting. Something I might consider for my next build because, TBH, I don't really do anything else with my gaming rig apart from occasional web browsing, office apps, and the like.

 

So for my use case, why spend another $150 on something that will make no real difference when I could put that into a higher-end GPU instead? Cool.

- Silverstone TJ08B-E - Gigabyte Z87M-D3H - i7 4770k @ 4.0GHZ 1.2v - 16gb Kingston HyperX Black 1600 - Gigabyte GTX 770 OC 4GB -


- Silverstone Fortress FT02 - MSI Z77 Mpower - i5 3570k @ 4.0GHZ 1.09v - 8gb Mushkin Blackline 1600 - MSI GTX 670 PE -


- Lenovo T430 (1600x900) - i5 3210m - 8GB DDR3 1333 - nVidia NVS5400M - 256GB mSATA OS - 320GB HDD-


So... AnandTech is dumb, you say?

If your goal is to find the best-value CPU for gaming, you have to test value CPUs! As I said, the FX-4350 and Athlon II X4 750K are basically as fast as an A8-5800K but way cheaper.

On the other hand, if you get 10-30 fps as a result, your benchmarks are useless. Turn down the settings until you hit something useful like 50-60 fps; then you can compare. And even if you want to stick with your low-FPS benchmarks, the really important figure is still minimum frames. What if the A8-5800K drops to 50% of its average fps while other CPUs don't? And they didn't test enough games.

 

I don't think AMD's value CPUs are bad; an i3-3220 or an AMD quad-core was always a good value choice for a gaming CPU.

 

AnandTech isn't dumb, but this test is... it's just not how you would use a gaming PC.
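The minimum-frames point is easy to demonstrate: two runs with identical average FPS can feel completely different. A minimal sketch with invented frame times (not measured data):

```python
def fps_stats(frame_times_ms):
    """Return (average fps, 1%-low fps, minimum fps) for frame times in ms."""
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    n = max(1, len(times) // 100)        # slowest 1% of frames
    low_1pct_fps = 1000.0 * n / sum(times[-n:])
    min_fps = 1000.0 / times[-1]         # slowest single frame
    return avg_fps, low_1pct_fps, min_fps

smooth = [20.0] * 100                    # a steady 50 fps
stuttery = [15.0] * 90 + [65.0] * 10     # same total time, but with heavy spikes

for name, run in [("smooth", smooth), ("stuttery", stuttery)]:
    avg, low, mn = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.1f} fps, min {mn:.1f} fps")
# Both runs average 50 fps, but the stuttery one bottoms out around 15 fps.
```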


If your goal is to find the best-value CPU for gaming, you have to test value CPUs! As I said, the FX-4350 and Athlon II X4 750K are basically as fast as an A8-5800K but way cheaper.

On the other hand, if you get 10-30 fps as a result, your benchmarks are useless. Turn down the settings until you hit something useful like 50-60 fps; then you can compare. And even if you want to stick with your low-FPS benchmarks, the really important figure is still minimum frames. What if the A8-5800K drops to 50% of its average fps while other CPUs don't? And they didn't test enough games.

 

I don't think AMD's value CPUs are bad; an i3-3220 or an AMD quad-core was always a good value choice for a gaming CPU.

 

AnandTech isn't dumb, but this test is... it's just not how you would use a gaming PC.

You are wrong. Both have a small L3 cache or none at all, which is vital for high-end gaming CPUs. The FX-6300/6350 is what you were referring to. Still, a 5800K, 750K, or 5600K in an ITX build with a frigging 780 inside... you get a 4K-ready gaming box! You basically pay only for the video card!



It's about a week old but a really good read. I didn't find it all that surprising because of AMD's aggressive pricing.


Wow, and yet I thought i5s were great for gaming. An i3 does the job just as well. Thanks for the article, and as always don't forget to subsc... wait, what? And thanks to AnandTech for doing this test.


Creepy... the FX-8320 and 6300 are out of stock or in limited stock everywhere in my country...



You are wrong. Both have a small L3 cache or none at all, which is vital for high-end gaming CPUs.

...the A8-5800K has no L3 cache; the Athlon II X4 750K is the exact same CPU as the 5800K without the GPU, but $20 cheaper. Both are unlocked and will overclock the same. The AMD FX-4350 has 8 MB of L3 cache and is faster than the other two per clock... so in the end the 5800K is just the worst of all three...


I keep seeing these reviews saying that AMD CPUs are really good, but they're generally talking about their APUs and comparing the integrated graphics. It's just pointless, stupid, and pathetic. Everyone knows that Intel has no iGPU capability (though that is changing... slowly). This one appears to have some odd results, but I don't get why everyone in this thread is screaming that AMD is doing well. I think there were two mentions of AMD being 'acceptable' compared to Intel's five or six 'best choice' picks. There was the one about the 5800K being the best option, but I don't see any mention of what the budget is or anything. This seems to be a review of everything but with nothing in mind... It just makes no sense!

 

Could someone explain to me what all of these tests are based on, so that I can form a more suitable opinion of the review? Or is this just some guy randomly making up graphs from results he's grabbed across a bunch of different tests and then just picking one and saying it's the best?


Would be uber nice if AMD went ahead and did a low-power-state APU so I could do an mITX HTPC with a PCIe CableCARD reader.

Main Rig: i5 760 @ 4.0GHZ Asus p7p55d-e, 8GB Corsair Vengance @ 1600 Mhz. Samsung BX2231 X 3 (5760x1080)

                EVGA GTX 680 Superclocked +150/+500, 128GB Crucial M4, 1TB WD Black

                Xonar DX, AudioEngine A2, Astro Mixamp, AudioTechnica ath-m50 & ath-ad700


I keep seeing these reviews saying that AMD CPUs are really good, but they're generally talking about their APUs and comparing the integrated graphics. It's just pointless, stupid, and pathetic. Everyone knows that Intel has no iGPU capability (though that is changing... slowly). This one appears to have some odd results, but I don't get why everyone in this thread is screaming that AMD is doing well. I think there were two mentions of AMD being 'acceptable' compared to Intel's five or six 'best choice' picks. There was the one about the 5800K being the best option, but I don't see any mention of what the budget is or anything. This seems to be a review of everything but with nothing in mind... It just makes no sense!

 

Could someone explain to me what all of these tests are based on, so that I can form a more suitable opinion of the review? Or is this just some guy randomly making up graphs from results he's grabbed across a bunch of different tests and then just picking one and saying it's the best?

AnandTech, a trusted and dedicated review website with real people, decided to do a CPU review. They wanted to find the best CPU for gaming with single, dual, triple, and quad graphics-card setups. The review just shows how each CPU performs; it is not showing the graphics capabilities of each CPU, if that's what you are asking. They then explain the 'best' CPU for each type of graphics-card setup. For example, if you have one graphics card, Ian (the writer of the article) suggests you get an AMD A8-5600K, an i3-3225, or an FX-8350. We are all well aware that the A8-5600K doesn't have single-threaded performance as good, though... Personally I would get an i3 and a beastly graphics card; if only they had come out with this article before I purchased my rig... :D


In the intro to this update, I addressed a couple of points regarding testing 1440p over 1080p, as well as reasons for not using FCAT or reporting minimum FPS.  But one of the bigger issues brought up in the first Gaming CPU article comes from the multiplayer gaming perspective, when dealing with a 64-player map in BF3.  This is going to be a CPU intensive situation for sure, dealing with the network interface to update the GPU and processing.  The only issue from our side is repetitive testing.  I focused a lot on the statistics of reporting benchmarking results, and trying to get a consistent MP environment for game testing that can be viewed at objectively is for all intents and purposes practically impossible.  Sure I could play a few rounds in every configuration, but FPS numbers would be all over the place based on how the rounds went.  I would not be happy on publishing such data and then basing recommendations from it.

 

This is a good point, but how is testing on a NON-networked game any better?  There are OS system processes, updaters, and any number of things that might affect your end result.  Thus, none of his tests are REPEATABLE.  I applaud Linus for at least showing us how we might EXACTLY reproduce his test in his videos, but there simply are no website benchmark reviews I trust that show an exact and repeatable methodology.  It's the scientific method, not the scientific play-through.

 

Then there's the issue that each video game is designed differently, and quite often with a certain processor and instruction set in mind.  Your mileage might vary game to game.  It's no wonder this is all so confusing for the consumer.
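One way to make the repeatability complaint concrete: run the same benchmark several times and report the spread, not a single number. A minimal sketch; run_benchmark() is a hypothetical stand-in that just simulates run-to-run noise:

```python
import random
import statistics

def run_benchmark() -> float:
    """Hypothetical stand-in for one benchmark pass; simulates a noisy FPS result."""
    return random.gauss(60.0, 4.0)   # pretend ~60 fps with visible run-to-run noise

runs = [run_benchmark() for _ in range(10)]
print(f"FPS over {len(runs)} runs: "
      f"{statistics.mean(runs):.1f} +/- {statistics.stdev(runs):.1f}")
# If the spread between runs is larger than the gap between two CPUs,
# a single-run comparison of those CPUs tells you nothing.
```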

"Pardon my French but this is just about the most ignorant blanket statement I've ever read. And though this is the internet, I'm not even exaggerating."

 

 


...the A8-5800K has no L3 cache; the Athlon II X4 750K is the exact same CPU as the 5800K without the GPU, but $20 cheaper. Both are unlocked and will overclock the same. The AMD FX-4350 has 8 MB of L3 cache and is faster than the other two per clock... so in the end the 5800K is just the worst of all three...

I did not know the 4350 doubled the L3 cache of the 4300. So it's not just an increased frequency after all... (Phenom II 955-980 style.)



AnandTech, a trusted and dedicated review website with real people, decided to do a CPU review. They wanted to find the best CPU for gaming with single, dual, triple, and quad graphics-card setups. The review just shows how each CPU performs; it is not showing the graphics capabilities of each CPU, if that's what you are asking. They then explain the 'best' CPU for each type of graphics-card setup. For example, if you have one graphics card, Ian (the writer of the article) suggests you get an AMD A8-5600K, an i3-3225, or an FX-8350. We are all well aware that the A8-5600K doesn't have single-threaded performance as good, though... Personally I would get an i3 and a beastly graphics card; if only they had come out with this article before I purchased my rig... :D

My point proven. Is this in regard to any specific budget, or just some random choices from the graphs? I'm not an idiot, and I understand that the CPU doesn't provide graphics power. But all I can see from this review is that they haven't got anything set in stone. They've just tried every processor and picked a couple for each selection. As you said, the 5800K doesn't have the best single-threaded performance, yet he's stated that it's the best choice for a single GPU. What's he on about!? It's clearly NOT! The best choice would be either a 3970X or a 3770/4770K, obviously, but not many people can afford those. So the review should state on what basis he is choosing each option. In a realistic environment, I'd expect people to go for an i5-3570/4670K or an FX-8350, which are good options.


My point proven. Is this in regard to any specific budget, or just some random choices from the graphs? I'm not an idiot, and I understand that the CPU doesn't provide graphics power. But all I can see from this review is that they haven't got anything set in stone. They've just tried every processor and picked a couple for each selection. As you said, the 5800K doesn't have the best single-threaded performance, yet he's stated that it's the best choice for a single GPU. What's he on about!? It's clearly NOT! The best choice would be either a 3970X or a 3770/4770K, obviously, but not many people can afford those. So the review should state on what basis he is choosing each option. In a realistic environment, I'd expect people to go for an i5-3570/4670K or an FX-8350, which are good options.

It seems like he's going for a "best bang for your buck" verdict on which is best. An expensive CPU like the 3770 will perform better in nearly all situations, but the question is: is it worth it?

I do find it strange that Anand picked 1440p as the resolution, as it is incredibly taxing on the graphics card and thus very likely to make the GPU the bottleneck, which would seem to be the opposite of what you want from a CPU benchmark.


Glad I went AMD.

Life is pain. Anyone who says any different is either selling something or the government.

 

----CPU: FX-6300 @ 4.2ghz----COOLER: Hyper 212 EVO----MOBO: MSI 970A-G46----PSU: OCZ 600watt----CASE: Black Corsair C70----GPU: Sapphire 7870 dual fan ghz edtion----2 random HDD'S----A couple fans here and there. Mouse: Gigabyte M6900-------Keyboard: Logitech G105-----Mousepad: Steel series something something.


To be fair, this test is rather stupid. At 1440p the GPU becomes the bottleneck because at that resolution games are GPU-limited. If they decreased the resolution to, say, 1080p or less, then the 5600K would begin to bottleneck the HD 7970 and provide fewer fps than, say, an i5 or an FX-6350, etc.


Ummm... did Anand just say:

 

 The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580

 

Wat? Really? If this isn't a typo, I'm gonna get me one of those!

 

 

EDIT: I didn't realize they were running it WITH a 7970 and a 580; I thought they were running it AGAINST them, considering it's an APU.

Edited by carolkarine

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


Ummm... did Anand just say:

 

 The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580

 

Wat? Really? If this isn't a typo, I'm gonna get me one of those!

 

 

EDIT: I didn't realize they were running it WITH a 7970 and a 580; I thought they were running it AGAINST them, considering it's an APU.

Anand was just testing the CPU capabilities of the APUs. The integrated graphics on AMD APUs (while powerful for integrated graphics) have a hard time beating most dedicated graphics cards, even ones a generation or two behind.


I will totally buy the FX-6300, no doubt about it.



I usually like AnandTech, but as some people have already pointed out, this test is pretty stupid. It's painfully obvious that they are heavily GPU-bottlenecked, so it doesn't really matter which CPU they use (the A8-3850 is pretty much within the margin of error of the FX-8350, which is far more powerful).

Who is going to play games at max settings, on a 2560x1440 monitor, with an A6-3650 (which performed better than the A8-3850 in some tests because of the margin of error)? If they had used reasonable settings, like maybe 1920x1080 or something else to minimize the GPU bottleneck (or hell, just reasonable settings), then the results would have been quite a lot different.

 

TheJian made a very good comment on the article and I'll quote it here just in case someone can't find it:

This is incorrect. It is only competitive when you TAP out the GPU by forcing cards into situations they can't handle. If you drop the res to 1080p, suddenly the CPU is VERY important and the results part like the Red Sea.

This is another attempt at covering for AMD and trying to help them sell products (you can judge whether it's intentional or not on your own). When no single card can handle the resolution being forced on it (1440p), you end up with ALL CPUs looking like they're fine. This is just a case of every CPU saying "hurry up, Mr. Video Card, I'm waiting" (or we're all waiting). Lower the res to where the cards can handle it and CPUs start to show their colors. If this article had been written with 1080p as the focus (as even their own survey shows 96% of us use it OR lower, and adding in 1920x1200 you end up with 98.75%!!), you would see how badly AMD is doing vs. Intel, since the video cards would NOT be brick-walled screaming under the load.

http://www.tomshardware.com/reviews/neverwinter-pe...
An example of what happens when you run the video card at 1080p, where CPUs can show their colors.
"At this point, it's pretty clear that Neverwinter needs a pretty quick processor if you want the performance of a reasonably-fast graphics card to shine through. At 1920x1080, it doesn't matter if you have a Radeon HD 7790, GeForce GTX 650 Ti, Radeon HD 7970, or GeForce GTX 680 if you're only using a mid-range Core i5 processor. All of those cards are limited by our CPU, even though it offers four cores and a pretty quick clock rate."

It's not just Civ5. I could point out how inaccurate the suggestions in this 1440p article are all day. Just start looking up CPU articles on other websites and check the 1080p data. Most CPU articles test with a top card (7970 or 680, etc.) so you get to see the TRUTH. The CPU is important in almost EVERY game, unless you push the resolution up so high that they all score the same because your video card can't handle the job (thus making ANY CPU spend all day waiting on the video card).

I challenge AnandTech to rerun the same suite, same chips, at 1080p and prove me wrong. I DARE YOU.

http://www.hardocp.com/article/2012/10/22/amd_fx83...
More evidence of what happens when the GPU is NOT tapped out. Look at how Intel is KILLING AMD at HardOCP. Even if you say "but eventually I'll up my res and spend $600 on a 1440p monitor", you have to understand that as you get better GPUs that can handle that res, you'll hate the fact you chose AMD for a CPU, as it will AGAIN become the limiter.
"Lost Planet is still used here at HardOCP because it is one of the few gaming engines that will reach fully into our 8C/8T processors. Here we see Vishera pull off its biggest victory yet when compared to Zambezi, but still lagging behind 4 less cores from Intel."

"Again we see a new twist on the engine above, and it too will reach into our 8C/8T. While not as pronounced as Lost Planet, Lost Planet 2 engine shows off the Vishera processors advancements, yet it still trails Intel's technology by a wide margin."

"The STALKER engine shows almost as big an increase as we saw above, yet with Intel still dealing a crippling gaming blow to AMD's newest architecture."
Yeah, a 65% faster Intel is a LOT, right? Understand that if you go AMD now, once you buy a card (20nm Maxwell, etc.? 14nm eventually in three years?) you will CRY over your CPU limiting you even at 1440p. Note the video card HardOCP used for testing was ONLY a GTX 470. That's old junk; they could now run a 7970 GHz or 780 GTX, up the res to 1080p, and show the same results. AMD would get a shellacking.

http://techreport.com/review/24879/intel-core-i7-4...
Here, TechReport did it at 1080p: 20% lower for the A10-5800K than the 4770K in Crysis 3. It gets worse with Far Cry 3, etc. In Far Cry 3 the i7-4770K scored 96 fps at 1080p, yet AMD's A10-5800K scored a measly 68. OUCH. So roughly 30% slower in this game. HOLY COW, man, check out Tomb Raider... Intel 126 fps! AMD A10-5800K 68 fps! Does AnandTech still say this is a good CPU to go with? At the resolutions the other 98.75% of us run at, YOU ARE WRONG. That's almost 2x faster in Tomb Raider at 1080p! Metro: Last Light: Intel 93 fps vs. AMD A10-5800K 51 fps, again almost TWO TIMES faster!

From Ian's conclusion page here:
"If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feel the same in the OS as an equivalent Intel CPU."

He's not even talking about the A10-5800K, which got SMASHED at TechReport as shown in the link. Note they only used a Radeon 7950; a 7970 GHz or GTX 780 would be even less taxed and show even larger CPU separations. I hope people are getting the point here. AnandTech is MISLEADING you at best by testing at a resolution higher than 98.75% of us are using and tapping out the single GPU. I could post a dozen other CPU reviews showing the same results. Don't walk, RUN away from AMD if you are a gamer today (or tomorrow). Haswell boards are supposed to take a Broadwell chip too, even more ammo to run from AMD.

Ian is recommending a CPU that is slower than the one I showed getting KILLED here. Games might not even be playable, as the A10-5800K was hitting 50 fps AVG on some things. What would you hit with a lower CPU's average, and worse, what would the minimums be? Unplayable? Get a better CPU. You've been warned.

 

He also made a second comment:

The killing happened at 1080p also, which is what TechReport showed. Since 98.75% of us run 1920x1200 or below, I'm thinking that is pretty important data.

The second you put in more than one card, the CPUs separate even at 1440p. Meaning, next year's SINGLE card, or the one after, will AGAIN separate the CPUs, as that single card will end up waiting on the CPU once the bottleneck moves back to the CPU. Putting that aside, HardOCP showed even the mighty Titan at $1000 had stuff turned off at 1080p. So you are incorrect. Is it serious overkill if HardOCP is turning stuff off for a smooth game experience? The 7970/GTX 680 had to turn off even more stuff in the 780 GTX review (the Titan and 780 GTX mostly had the same stuff on, but the 7970 GHz and 680 GTX they compared against turned off quite a bit to remain above 30 fps).

I'm a serious player, and I can't run 1920x1200 with my Radeon 5850, which was $300 when I bought it. I'm hoping Maxwell will get me 30 fps with EVERYTHING on in a few games at 1440p (I'm planning on buying a 27" or 30" at some point), and the ones that don't I'll play on my Dell 24" as I do now. But the current cards (without spending a grand, and even that doesn't work) in single-card format still have trouble with 1080p, as HardOCP etc. have shown. I want my next card to at least play EVERY game at 1920x1200 on my Dell, and hopefully a good portion on the next monitor purchase. With the 5850 I run a lot of games on my 22" at 1680x1050 to enable everything. I don't like turning stuff down or off, as that isn't how the dev intended me to play their game, right?

Apparently you think all 7970 and 580 owners are running 1440p and up? Ridiculous. The Steam survey says you are woefully incorrect. 98.75% of us are running 1920x1200 or below, and a TON of us have a 7970, 680, 580, etc. (not me yet) and enjoy the fact that we NEVER turn stuff down (well, apparently you still do in some games... see the point?). Only DUAL-card owners are running above that, as the Steam survey shows; go there and check out the breakdown. You can see the population running above 1920x1200 (even as small as that 1% is... LOL) has TWO cards. So you are categorically incorrect, or do Steam's users all change their resolutions down just to fake a survey?... ROFL. Okay. Whatever. You expect me to believe they finish the survey and then jack it back up for under-30-fps gameplay? Okay...

Even here, at 1440p for instance, Metro only ran at 34 fps (and Last Light is more taxing than 2033). How low do you think the minimums are when you're only doing 34 fps AVERAGE? UNPLAYABLE. I can pull AnandTech quotes that say you'd really like 60 fps to NEVER dip below a 30 fps minimum. In that they are actually correct, and other sites agree...
http://www.guru3d.com/articles_pages/palit_geforce...
"Frames per second Gameplay
<30 FPS very limited gameplay
30-40 FPS average yet very playable
40-60 FPS good gameplay
>60 FPS best possible gameplay

So if a graphics card barely manages less than 30 FPS, then the game is not very playable, we want to avoid that at all cost.
With 30 FPS up-to roughly 40 FPS you'll be very able to play the game with perhaps a tiny stutter at certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll have the best possible rendering quality versus resolution, hey you want both of them to be as high as possible.
When a graphics card is doing 60 FPS on average or higher then you can rest assured that the game will likely play extremely smoothly at every point in the game, turn on every possible in-game IQ setting."

So as the single 7970 (assuming the GHz edition here in this 1440p article) can barely hit 34 fps, by Guru3D's definition it's going to STUTTER. Right? You can check max/avg/min everywhere and you'll see there is a HUGE difference between min and avg. Thus the 60 fps point is assumed good to ensure staying above a 30 fps minimum with no stutter (I'd argue higher depending on the game, multiplayer, etc., as you can tank when tons of crap is going on). Guru3D puts that in EVERY GPU article.

The single 580 in this article can't even hit 24 fps, and that is AN AVERAGE. So totally unplayable, thus making the whole point moot, right? You're going to drop to 1080p just to hit 30 fps, and you say this and a 7970 are overkill for 1080p? Even this FLAWED article proves you WRONG.

Sleeping Dogs, right here in this review, on a SINGLE 7970: UNDER 30 fps AVERAGE. What planet are you playing on? If you are hitting 28.2 fps avg, your gameplay SUCKS!

http://www.tomshardware.com/reviews/geforce-gtx-77...
Bioshock Infinite: 31 fps on a GTX 580... Umm, the minimums are going to stutter at 1440p, right? Even the 680 only gets 37 fps... You'll need to turn both down for anything fluid maxed out. The same res for Crysis 3 shows even the Titan only hitting 32 fps, and with DETAILS TURNED DOWN. So the minimums will stutter, right? MSAA is low; you have two more levels above this, which would put the minimums into single digits a lot of the time. Even this low on MSAA the 580 never gets above 22 fps avg... LOL. Want to rethink your comments yet? The 580's avg was 18 FPS! 1440p is NOT for a SINGLE 580... LOL. Only 25 fps for the 7970... LOL. NOT PLAYABLE on your 7970 GHz either. Clearly this game is for 1080p, huh? Look how much time in the graph the 7970 GHz spends BELOW 20 fps at 1440p. Serious gamers play at 1080p unless they have two cards. FAR CRY 3, same story: the 7970 GHz gets 29 fps... ROFL. The 580 scores 21 fps... You go right ahead and try to play these games at 1440p. Welcome to the stutterfest, my friend.
"GeForce GTX 770 and Radeon HD 7970 GHz Edition nearly track together, dipping into the mid-20 FPS range."
Yeah, Far Cry will be good at 20 fps.

Hitman: Absolution has to disable MSAA totally... LOL. Even then the 580 only hits 40 fps avg.

Note the Tomb Raider comment at 1440p:
"The GeForce GTX 770 bests Nvidia’s GeForce GTX 680, but neither card is really fluid enough to call the Ultimate Quality preset smooth."
So 36 fps and 39 fps avg for those two is NOT SMOOTH. The 770 dropped to 20 fps for a while.

A Titan isn't even serious overkill for 1080p. It's just good enough, and for HardOCP a game or two had to be turned down even on it at 1080p! The data doesn't lie. Single cards are for 1080p. How many games do I have to show you dipping into the 20s before you get it? Batman: AC barely hits the 30s avg on a 7970 GHz with 8x MSAA, and you have to turn PhysX off (not NV PhysX, PhysX period). Check Tom's charts for GPUs.

In HardOCP's review of the 770 GTX, 1080p was barely playable on a 680 GTX with everything on. Upping to 2560x1600 caused nearly every card to need tessellation turned down and PhysX off in Metro: Last Light. A 31 fps minimum on the 770 with SSAA OFF and PhysX OFF!
http://hardocp.com/article/2013/05/30/msi_geforce_...
You must like turning stuff off. I don't think you're a serious gamer until you turn everything on and expect it to run there. NO SACRIFICING quality! Are we done yet? If this article really tells you to pair expensive GPUs ($400-1000) with a cheapo $115 AMD CPU, then they are clearly misleading you. It looks like that is exactly what they got you to believe. Never mind your double-GPU comment paired with the same crap CPU, adding to the ridiculous claims here already.
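For what it's worth, the Guru3D playability bands quoted in that comment reduce to a simple lookup; a minimal sketch using the thresholds from the quote, fed with the averages the comment cites:

```python
def playability(fps: float) -> str:
    """Map an average FPS figure to the Guru3D playability bands quoted above."""
    if fps < 30:
        return "very limited gameplay"
    if fps < 40:
        return "average yet very playable"
    if fps <= 60:
        return "good gameplay"
    return "best possible gameplay"

# Averages cited in the comment (580 in Crysis 3, 580 in this article,
# Sleeping Dogs on a single 7970, Metro at 1440p) land mostly in the bottom band:
for fps in (18, 24, 28.2, 34):
    print(f"{fps} fps -> {playability(fps)}")
```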

 

In summary, the article on AnandTech isn't about "the best CPU for the money". It's "what is the cheapest CPU you can get away with, if you're one of the <1% who has a 2560x1440 monitor, as well as one of the highest-end GPUs on the market, and only care about gaming and nothing else?"

This is by no means a "best CPU for the money" article.

 

I think the problem here is that the article is written for an extremely niche group of people, and people then think that it applies to all systems.

