I switched from AMD: A Testimonial.

So I'm writing this as a testimonial. Anyone familiar with @Faceman and his lengthy spoiler thread, or @itachipirate and his abridged thread, will know that FX CPUs are not a cost-effective choice for modern gaming. Further proof isn't strictly needed; the argument is sound and complete.

Or so one would think; but people blindly coming to the defense of AMD's FX CPUs for gaming, even after seeing dozens of pages of proof to the contrary, are still prevalent. So I thought I'd add yet another drop to the massive ocean of support for those thinking of making the switch to Intel, building a new system, or looking to bring new life to an old one.

For reference, I'll limit the comparison to the games I played most often, and for the longest time, on both the old and new systems, and that I've owned for the duration of both systems' lives.

I had no interest in benchmarks when I owned my AMD CPU, so synthetics won't be used either; the only one I ran prior to the switch was Unigine Valley, and as that is an almost entirely GPU-bound benchmark, it means nothing in this case and will not be referenced.

My main rig (at the time I called it "The Battleship" and will henceforth be referenced as such) was built in January of 2014. My updated rig (currently called "Cinders" which can be found here) was updated in February of 2015. The games I played were as follows:

Left 4 Dead 2 (FPS) (heavily modded)

Warframe (TPS) (which has gone through several dozen updates, so I'm not 100% sure how their own optimizations have changed things)

League of Legends (MOBA)

Tera (MMO)

I mentioned in a few of my other posts how I had also seen performance differences in Far Cry 4; however, I had only owned that game for a few weeks prior to my switch, and I don't believe I have enough comparative data to reference it. For that reason, it will be excluded.

The systems compared are as follows:

The Battleship-

Gigabyte GA-990FXA-UD3 motherboard

AMD FX6350 CPU (OC'd to 4.6GHz)

Sapphire Tri-X R9 280X GPU

G.Skill Ripjaws X 8GB (2x4GB) DDR3 1600

Noctua NH-U12S CPU cooler

Samsung 840EVO 120GB SSD + Seagate Barracuda 1TB HDD

EVGA 850G Power Supply

Cinders-

Gigabyte GA-Z97X-SOC

Intel Core i5 4690K (OC'd to 4.4GHz)

Same GPU

Same RAM (comparisons will be made using information gathered PRIOR to upgrading to 16GB)

Same cooler

Same Storage

Same Power Supply

For the last 3 weeks of running the FX 6350, it was run on an open test bench. Prior to that, it was housed in a Cooler Master HAF XM Mid Tower.

For the first 2 weeks of running the i5 4690K, it was run on the same test bench. Afterwards, it was housed in an NZXT H440 Mid Tower.

These will be the circumstances through which the comparisons will be made.

Left 4 Dead 2 (heavily modded, max settings, FPS capped at 120)

Map: Heavy Rain (My personal favorite)

The Battleship:

Max FPS - 110

Min FPS - 47

Average - 91

Cinders:

Max FPS - 120

Min FPS - 71

Average - 102

Conclusion: A visually noticeable increase in performance; the hitching and stuttering I used to see is gone. The 6350 never reached the FPS cap. I could maybe chalk that up to inconsistency or error, but the low minimum FPS was rather annoying. I'll move on.
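As an aside for anyone reproducing numbers like these: the Max/Min/Average figures throughout this post can be derived from a per-frame time log, the kind of data frame-capture tools record. A minimal sketch, assuming a hypothetical list of frame times in milliseconds:

```python
# Hypothetical per-frame times in milliseconds (a real capture would have
# thousands of samples); each frame's instantaneous FPS is 1000 / frame_time.
frame_times_ms = [8.3, 9.1, 10.0, 21.3, 9.5, 8.7, 14.2, 9.9]

fps_per_frame = [1000.0 / ft for ft in frame_times_ms]

max_fps = max(fps_per_frame)
min_fps = min(fps_per_frame)

# Average FPS is total frames divided by total time, NOT the mean of the
# per-frame FPS values (which would over-weight the fast frames).
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"Max {max_fps:.0f} / Min {min_fps:.0f} / Avg {avg_fps:.0f}")
```

The single 21.3 ms hiccup in the sample is what drags the minimum down, which is exactly the kind of frame dip described above.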

League of Legends (max settings, FPS capped at 120)

5v5 Multiplayer, Summoner's Rift.

The Battleship:

Max FPS - 120

Min FPS - 98

Average FPS - 118

Cinders:

Max FPS - 120

Min FPS - 118

Average FPS - 119

Conclusion: There's no denying there were frame dips with the FX 6350, but this time they weren't noticeable. Based on the average FPS, if this is the only game you're going to play, maybe the 6350 isn't a bad choice? I digress...

Tera (max settings, no FPS cap, onscreen player count set to max)

Dungeon Raid: Cultist's Refuge (Solo & Multiplayer [with 3 other players])

The Battleship:

Multiplayer

Max FPS - 68

Min FPS - 23

Average FPS - 49

Solo

Max FPS - 79

Min FPS - 27

Average FPS - 50

Cinders:

Multiplayer

Max FPS - 107

Min FPS - 62

Average FPS - 80

Solo

Max FPS - 131

Min FPS - 67

Average FPS - 86

Conclusion: Tera (like most MMOs, from what I've seen of other benchmarks) really likes strong cores. During the heat of battle, especially with multiple players throwing god-knows-how-many spells around, the minimum frames took a huge hit on The Battleship; Cinders, however, was smooth as silk, never once dropping below 60 FPS. This has been a huge issue in the past, as split-second reactions can make or break a raid (especially when playing the Priest class and your job is to keep everyone alive). Stuttering this severe is not acceptable. You may argue that turning down a couple of settings and sacrificing a few visuals could fix it. Why would I want to do that when I could have the best of both?

Warframe (max settings, 60 FPS cap)

Survival Mission, Apollodorus, Single Player using Nova Frame (highest particle density abilities)

The Battleship:

Max FPS - 60

Min FPS - 9

Average FPS - 34

Cinders:

Max FPS - 60

Min FPS - 51

Average FPS - 59

Conclusion: Things can get extremely crowded on screen in this game, especially on maps with narrow corridors. Enemies tend to group up and swarm, and the effect a single high explosive can have on FPS in densely populated scenarios was catastrophic for The Battleship (which is now aptly named "The Tugboat"). Minimum FPS was horrific and made the game unplayable at certain points, forcing a drop in multiple settings and putting it closer to "mid" than "high".

Overall Conclusion and thoughts:

I often see comparisons being made using AAA titles, and that makes perfect sense to do; the majority of people will play these games and want to know what they're getting. For my particular tests, however, I tested the games I play most often, or that I didn't see tested at all by others. For those that play these games, I think I have given a representation of what you will be getting when going with Intel over AMD for gaming.

The price argument can be brought up if you want. My upgrade was made using the money from my tax return. As a working individual with very few bills to pay beyond living expenses and gas, making what I consider an adequate wage, spending the extra $100-$200 makes plenty of sense, especially when paired with a performance increase well worth the money. But I didn't have to spend as much as I did to get the performance increase I wanted; I simply chose a more expensive motherboard than I really needed. Nobody has to spend as much as I did to get similar performance increases in games. In fact, you could spend far less.

I'll be honest: I like my games to look pretty. I like having maxed settings. I like my games to look beautiful, even if the action gets so heated that I don't notice the details put into that particular blade of grass. The developers put so much effort into the designs that the visuals are begging to be turned up! But if you have to sacrifice playability for a bit of eye candy, there's no point. And that's exactly what you do when you buy an FX processor for gaming.

I'll be the first to say that I am not an expert at benchmarking and there may be some variables that I did not account for. But I tried to remove as many as possible and test the hardware at hand. These are the results.

Cinders: - i7 4790K (4.5GHz) - Gigabyte Z97X-SOC - 16GB Klevv DDR3 1600MHz - EVGA GTX 980Ti ACX2.0+ (1548MHz Boost) - EVGA Supernova 850GS - NZXT H440 Orange/Black (Modified) -
Unnamed System: i5 4690K (4.2GHz) - MSI Z97I-AC - 8GB G.Skill DDR3 2400MHz - EVGA GTX 950 SSC - Raidmax Thunder V2 535W - Phanteks Enthoo Evolv ITX


doesn't the fx 6350 bottleneck the r9 280x?

 

and the i5 is much better than the fx 6350; that cpu is even better than the fx 8350

 

edit: the i5 is about even with the fx 8350 from a gaming standpoint but has MUCH better single-core performance

 

On 11/19/2014 at 2:14 PM, Syntaxvgm said:
You would think Ubisoft would support the Bulldozer based architectures more given their digging themed names like bulldozer, Piledriver, Steamroller and Excavator.

Thank you! I don't play these types of games so it's nice to know what experience this CPU can provide for different people in different games. Keep in mind the i5 does cost twice as much but for most MMOs it'd be worth it to cut back a little bit on the graphics card for a better CPU

doesn't the fx 6350 bottleneck the r9 280x?

No.

No, no no no. no

Nude Fist 1: i5-4590-ASRock h97 Anniversary-16gb Samsung 1333mhz-MSI GTX 970-Corsair 300r-Seagate HDD(s)-EVGA SuperNOVA 750b2

Name comes from an anagrammed sticker for "TUF Inside" (a sticker that came with my original ASUS motherboard)


doesn't the fx 6350 bottleneck the r9 280x?

 

and the i5 is much better than the fx 6350; that cpu is even better than the fx 8350

 

Yes. It does. (In most games, there will be discrepancies) 

5800X3D - RTX 4070 - 2K @ 165Hz

 


doesn't the fx 6350 bottleneck the r9 280x?

 

 

 

This :/

 

 

Yes. It does. (In most games, there will be discrepancies) 

 I will refer you to the very last line of my post.


No.

No, no no no. no

It does though, especially in AAA titles like the ones tested.

I will refer you to the very last line of my post.

Bottlenecking is a variable that reflects the CPU itself, and it doesn't change the results that much. (Technically, the numbers will show how far behind the FX6 is.)

So, you don't want to eliminate it.


Thank you! I don't play these types of games so it's nice to know what experience this CPU can provide for different people in different games. Keep in mind the i5 does cost twice as much but for most MMOs it'd be worth it to cut back a little bit on the graphics card for a better CPU

I did want to include a price/performance point in this, but the only other CPU I have on hand is a Pentium Anniversary. That said, slapping my 280X onto the board with the Pentium actually yielded higher minimum FPS in Warframe than the 6350 (only dropping to 17 as opposed to 9). The maximum and average were still held by the 6350, however. And there is yet another price discrepancy if you try to compare those.


Yes it does....

 

Not completely but you literally can't just straight up say no.

 

It does though, especially in AAA titles like the ones tested. 

MMMMMMMMMMMMMMM-no. I only see small, rare bottlenecks with my 6300 and GTX 970. If a game is limited to very few cores, maybe? I can't really call it a CPU bottleneck in something like Rome 2, which is a 2013 game that uses 1 CPU core xD I call that a software bottleneck. In a game utilizing at least 4 processing cores, you won't see an FX-6300 bottleneck an R9 280

 

I could also say an i7-4790k bottlenecks an HD 3850; not completely, but you literally can't just straight up say no. What if there's a game where you do protein folding while watching a video? Your CPU will be at max while your graphics card could do more


I don't see how it loses credibility there. They still perform fine for modern gaming, regardless of price. Intel just does it better.

 

Well, it means they're not "fine". I guess you assume people already own the system, but I was mainly talking about new builds.


@FLUFFYJELLO Thanks for telling us about your experience.  This thread will inevitably turn into a flame war, but you laid it out as objectively as you could, and I expect that the less vocal folks very much appreciate that as well.  

 

There seem to be folks arguing that this comparison isn't fair, and sure, it's not comparing similar generations or years, but there is a very important comparison being made: processor frequency.  Both processors are running at similar frequencies, yet there are significant performance differences.  This is a great illustration that not all GHz are equal, and that you can get a significant performance gain from upgrading to a more recent cpu.  

Unfortunately I don't think everyone will accept your argument that Intel will give superior performance to AMD cpus, but maybe it will persuade or help a few folks, and that's all you can hope for.   :)

Isopropyl alcohol is all you need for cleaning CPU's and motherboard components.  No, you don't need [insert cleaning solution here].  -Source: PhD Student, Chemistry


Why overclockers should understand Load-Line Calibration.


ASUS Rampage IV Black Edition || i7 3930k @ 4.5 GHz || 32 GB Corsair Vengeance CL8 || ASUS GTX 780 DCuII || ASUS Xonar Essence STX || XFX PRO 1000W


MMMMMMMMMMMMMMM-no. I only see small, rare bottlenecks with my 6300 and GTX 970. If a game is limited to very few cores, maybe? I can't really call it a CPU bottleneck in something like Rome 2, which is a 2013 game that uses 1 CPU core xD I call that a software bottleneck. In a game utilizing at least 4 processing cores, you won't see an FX-6300 bottleneck an R9 280

 

I could also say an i7-4790k bottlenecks an HD 3850; not completely, but you literally can't just straight up say no. What if there's a game where you do protein folding while watching a video? Your CPU will be at max while your graphics card could do more

 

The 6300 bottlenecks the 970 by 5-15%. The only game where it won't is Battlefield, for obvious reasons.

Lower-than-average usage is not a sign of bottlenecking, but stuttering and lower FPS than other GPUs of the same type is.

The FX 6300 is a poor CPU for gaming, and that is just because it is running on 2011 architecture. (With modules, not cores!)

Also, the FX 6300 will bottleneck anything from the R9 270 up in modern games.


I had a similar experience when I moved on from my FX 8320. At the time it was paired with an HD 7970 (which is essentially an R9 280X, in case someone is new to the GPU world). I swapped to a used i7 3770K, kept running that same GPU for quite some time, and oh boy, the difference!

AMD CPUs aren't, as many call them, "not for gaming"; they can run games at totally playable framerates, and the bottleneck is not as major as many of you are saying here (the FX 6300 will slow down the R9 280X in some games, but most titles aren't going to be an issue, especially those that are primarily eye candy). However, with CPUs like the i3 4130 (and its brothers the 4150, 4160, etc.) sitting around the $100 mark and able to run on a basic $40-60 LGA 1150 motherboard, there isn't much of an excuse to pick the red team. Haswell i3s perform way better in single core, they will run 4-core-locked games (as it's the threads that matter in this regard), use half the power, stay cool easily, and provide a modern platform with more features.

Pretty much anyone with a gaming build in mind should spend no less than $100 on their CPU (new hardware), and the i3s get the medal for the price. It's even better if you can stretch to the $150 bracket where the i5s begin; even the i5 4440 is a mighty gaming CPU, and again a $40-60 mobo will suffice. The main reason for this is minimum FPS! Frame dipping is the worst visual experience in a game (besides a blue screen, lol); going to or below 30 FPS kills the immersion and the fun most of the time.

My advice to anyone thinking of going FX 6300 (or 6350) + R9 280X is to opt for an i3 4130 (or similar) with an R9 280 or GTX 960.


@FLUFFYJELLO Thanks for telling us about your experience.  This thread will inevitably turn into a flame war, but you laid it out as objectively as you could, and I expect that the less vocal folks very much appreciate that as well.  

 

There seem to be folks arguing that this comparison isn't fair, and sure, it's not comparing similar generations or years, but there is a very important comparison being made: processor frequency.  Both processors are running at similar frequencies, yet there are significant performance differences.  This is a great illustration that not all GHz are equal, and that you can get a significant performance gain from upgrading to a more recent cpu.  

Unfortunately I don't think everyone will accept your argument that Intel will give superior performance to AMD cpus, but maybe it will persuade or help a few folks, and that's all you can hope for.   :)

I figure it will inevitably become one, yes; but for the short time it isn't, I hope people can take from it what I did: if you have the money to switch, there isn't a downside.

I also tried to note that the choice of motherboard (at the time a $170 overclocking mobo, which I've seen fluctuate between $145 and $180 since) could be changed to save a few bucks and lower the price discrepancy; which isn't as high as people make it out to be, considering the 990FXA-UD3 still costs $110 on sale and $130 when not. When overclocking an FX CPU there isn't a cheaper option available; you will need a similar motherboard (performance- and usually price-wise) to get a stable overclock as high as mine. And I was lucky.

As for processor frequency, had I pushed a bit more I probably could have done a GHz-to-GHz comparison outright; but I left the 4690K at 4.4. Oh well ^_^

Thanks for your input, @Queek !


Thank you! I don't play these types of games so it's nice to know what experience this CPU can provide for different people in different games. Keep in mind the i5 does cost twice as much but for most MMOs it'd be worth it to cut back a little bit on the graphics card for a better CPU

Or don't cut back on the GPU and still have a better CPU. 

 

All prices from amazon.

 

AMD

Asus M5A97 = $94, FX 6350 = $125, Hyper 212 Evo = $34. Total = $253

 

Intel

Gigabyte H97M-HD3 = $75, i5 4430 = $176. Total = $251

 

There is no need to purchase FX CPUs for gaming.


Very informative thread. A+ for effort, and don't worry, your benchmarking holds up fine. :)

Only issue is this.

 

AMD CPUs are totally fine for modern gaming. Intel's just offer better performance.

They are not, as evidenced by all of the modern games that run better on an i3 than on an FX6/8/9.

 

They cost the same. So why would you want to buy a processor that can only play 4 out of 5 games to a satisfactory level, when you could play 5 out of 5 games very well? There is a big difference between the two, as especially shown in this thread. The difference is minimums. While the FX processors are capable of hitting 60+ FPS, their minimums are not as high. With Intel, you get a much higher minimum, which translates into a more fluid and immersive gameplay experience. That is the goal: fluidity, immersion.
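The emphasis on minimums above is also why many benchmarkers report percentile lows (e.g. the "1% low") instead of the absolute minimum, which a single hiccup can dominate. A minimal sketch, using hypothetical frame-time data:

```python
def one_percent_low_fps(frame_times_ms):
    """Average the slowest 1% of frames (at least one frame) and
    convert that average frame time (ms) to an FPS figure."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    slowest = worst[:n]
    return 1000.0 / (sum(slowest) / len(slowest))

# 199 smooth ~8 ms frames plus one 50 ms hitch: the absolute minimum
# says 20 FPS, while the 1% low blends the hitch with a normal frame.
times = [8.0] * 199 + [50.0]
print(one_percent_low_fps(times))
```

The idea is the same one argued here: a chip that averages well but stutters badly shows up in the lows, not the average.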

 

A friend of mine who owns an FX8 reacts with: "you get used to the drops." That is not an acceptable rationale. Do not defend these processors to someone who is buying new. The price of an overclocked FX8 system is greater than that of a locked i5 system, and no matter how far you overclock the FX8, it will fall behind the locked i5.

 

@FLUFFYJELLO

 

Thank you for taking the time to write this up. I especially like that you tested some lesser-known but still popular games. Adding your thread to the user-testimonial section of the big spoiler.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


Also, the FX 6300 will bottleneck anything from the R9 270 up in modern games.

Haha xD


Haha xD

 

A CPU that was pretty crap when it came out is not better than CPUs 4 years newer. 


A CPU that was pretty crap when it came out is not better than CPUs 4 years newer. 

Keep going. All of my experiences with this CPU are completely blown out of the water by your ramblings and opinions, so please continue. I'm an owner of this CPU playing GTA V on nearly ultra with extremely rare FPS dips. My overall opinion: for $100, meh, I'm satisfied, while still understanding an i3 would be better for the same price. You're just some guy on the internet saying it's crap.

 

I'm not saying that these are CPUs you should consider buying for modern computers. http://linustechtips.com/main/topic/351398-amd-fx-6-month-conclusion/#entry4777673

There are bad things to be said about the FX CPUs, but you just go overboard


A CPU that was pretty crap when it came out is not better than CPUs 4 years newer. 

And yet people are still considering it for new builds; and they just shouldn't.

At the time of purchase (looking at my Newegg purchase history) my 6350 cost ~$130, and the motherboard an additional $127. $260 for that combo. The Pentium I have running in my ITX rig, clocked at 4.8GHz on a $100 Z97 mobo (and I got the chip on sale for $62) actually plays Warframe with higher minimum FPS, just lower max and average. This is a dual core, that with mobo cost $100 less than a 6350 system, and performs only just below it. IMAGINE what could be done with that extra $100...


Keep going. All of my experiences with this CPU are completely blown out of the water by your ramblings and opinions, so please continue

Okay. Well, we all know that single-thread performance is the most important in games. (Again, except Battlefield)

So in Cinebench r11.5 the 6300 gets a score of 1.07

While the i3 4150/4330 (Which is about the same price, but uses much less power) gets 1.57.

The 6300 is about the same as the i3 in gaming, better in some games, while the i3 has higher min FPS. And the 6300 does that while using about double the power, and only keeps up with a slight overclock, an upgraded cooler, and a better motherboard.

On the other hand, a stock 4690 won't bottleneck until you have dual 980s and only costs $230 with a B- or H-series chipset. (No need to overclock.)

Also, the i3 has a clear upgrade path, while the 6300 can only be upgraded to an FX8 CPU. (The FX9 is going to need a $200+ motherboard.)

Sources

http://images.anandtech.com/graphs/graph6396/51135.png

http://images.anandtech.com/graphs/graph7963/63178.png

Civ V:

(There was an image here, but it was too low resolution)

Also, my friend has a 6300, and because this game uses net-server hosting, when he is playing we have 10-15 second loading times. But when it is just Core i5s and an i3 on the net server, the loading times are 1-6 seconds.

And yet people are still considering it for new builds; and they just shouldn't.

At the time of purchase (looking at my Newegg purchase history) my 6350 cost ~$130, and the motherboard an additional $127. $260 for that combo. The Pentium I have running in my ITX rig, clocked at 4.8GHz on a $100 Z97 mobo (and I got the chip on sale for $62) actually plays Warframe with higher minimum FPS, just lower max and average. This is a dual core, that with mobo cost $100 less than a 6350 system, and performs only just below it. IMAGINE what could be done with that extra $100...

Yes, get an i3. The Pentium will die out soon, with some games not allowing CPUs that only have 2 threads.


You're acting like these CPUs are incapable of gaming at high settings, when that's not true. I know the benchmarks: the FX series is inconsistent in FPS in many games, but they're fine for everything.

Like I said, though, Intel will perform better, but buying an FX chip won't prevent you from gaming on current AAA titles. If that were the case, then they wouldn't be fine, but it's not. For the normal PC gamer, it's enough.

But they are incapable, though... just more so for certain use cases.

I'm chasing a minimum of 75 FPS that I NEED for the Oculus Rift's 75Hz refresh, using dual GPUs.

Maintaining 120 FPS minimums, even on low details, can be hard to achieve on my 120Hz panels in some titles with my old FX, yet I have zero issues like that with my i5 CPU/mobo.

I'd still be using AMD if they actually had a CPU that feeds my needs. But they do not have anything to offer for my specific needs.

My needs are not uncommon either, and will become more common each day.

As said, why buy a CPU capable of playing 4/5 games great when, for no extra cash, you could be playing 5/5 just fine? Why wouldn't you?

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Okay. Well, we all know that single-thread performance is the most important in games. (Again, except Battlefield)

So in Cinebench r11.5 the 6300 gets a score of 1.07

While the i3 4150/4330 (Which is about the same price, but uses much less power) gets 1.57. 

 

The 6300 is about the same as the i3 in gaming, better in some games, while the i3 has higher min FPS. And the 6300 does that while using about double the power, and only keeps up with a slight overclock, an upgraded cooler, and a better motherboard.

On the other hand, a stock 4690 won't bottleneck until you have dual 980s and only costs $230 with a B- or H-series chipset. (No need to overclock.)

Also, the i3 has a clear upgrade path, while the 6300 can only be upgraded to an FX8 CPU. (The FX9 is going to need a $200+ motherboard.)

 

Sources

 

http://images.anandtech.com/graphs/graph6396/51135.png

 

http://images.anandtech.com/graphs/graph7963/63178.png

 

Civ V:

 

(There was an image here, but it was too low resolution)

 

Also, my friend has a 6300, and because this game uses net-server hosting, when he is playing we have 10-15 second loading times. But when it is just Core i5s and an i3 on the net server, the loading times are 1-6 seconds.

 

My god, you've convinced me of what I already know. You sir deserve a medal. Are you just arguing to argue? I don't know what you think you're accomplishing here.

http://linustechtips.com/main/topic/351398-amd-fx-6-month-conclusion/#entry4777673

You're bashing FX CPUs like they're the worst-performing things in the whole world. And my in-game server hosting performs about the same as my friend's i7-4790K with few people, so I think you may have had another problem there besides the CPU. (Minecraft, GMod, Unturned, AoM)

