
Why Is the FX-8350 So Underrated for 4K Gaming?

So I've recently been talking in a lot of forums and Discords about potentially upgrading my current rig slightly; it currently has an FX-8350 CPU and a GTX 1080 Ti graphics card. My original plan was to upgrade to a 2700X, but right now I don't have much of a reason for it, as I am already getting really good performance at 4K.

But one thing I've constantly noticed is that, for some reason, many people greatly underestimate the power of the 8350. Even on a thread about overclocking it, many people just said "don't bother, get a Ryzen", while in many other places, people were either super confused by my config or outright refused to believe me when I said I was getting great performance. I've seen people say that "gaming at 4K would be impossible" while providing random YouTube videos as "proof", and when I pointed out that it can't be impossible since I've been doing it since October, they simply refused to believe me, and that's just one example. Someone else even started talking about some lawsuit I'd never heard of, involving AMD lying about the performance of the 8350, and that was somehow supposed to be "proof" of it being bad?

So why is underestimating the 8350 such a common theme, and why do some people go so far as to deny my performance just to keep on believing it's "really bad" or something?


The AMD FX line is kind of underrated, but then again, it has some of the worst single-core performance you can get on a CPU, and a lot of heat issues.

I'd upgrade to Ryzen though; the FX line is old and doesn't have any really good upgrade paths left, so if you plan to upgrade at all, Ryzen is the better move.

Ryzen 7 3700X / 16GB RAM / Optane SSD / GTX 1650 / Solus Linux


 

Here is the information about the lawsuit over the FX series of chips and the misleading marketing practices surrounding it.

 

To answer your original question: the FX-8350 is a six-year-old chip now, has poor IPC and poor single-threaded performance, and introduces bottlenecks as a result. I would be curious to see what your CPU usage looks like while gaming at 4K in a game such as GTA V, The Witcher 3, PUBG, etc.

Community Standards | Fan Control Software

Please make sure to Quote me or @ me to see your reply!

Just because I am a Moderator does not mean I am always right. Please fact check me and verify my answer. 

 

"Black Out"

Ryzen 9 5900x | Full Custom Water Loop | Asus Crosshair VIII Hero (Wi-Fi) | RTX 3090 Founders | Ballistix 32gb 16-18-18-36 3600mhz 

1tb Samsung 970 Evo | 2x 2tb Crucial MX500 SSD | Fractal Design Meshify S2 | Corsair HX1200 PSU

 

Dedicated Streaming Rig

 Ryzen 7 3700x | Asus B450-F Strix | 16gb Gskill Flare X 3200mhz | Corsair RM550x PSU | Asus Strix GTX1070 | 250gb 860 Evo m.2

Phanteks P300A |  Elgato HD60 Pro | Avermedia Live Gamer Duo | Avermedia 4k GC573 Capture Card

 


Just now, Skiiwee29 said:

 

Here is the information about the lawsuit over the FX series of chips and the misleading marketing practices surrounding it.

 

To answer your original question: the FX-8350 is a six-year-old chip now, has poor IPC and poor single-threaded performance, and introduces bottlenecks as a result. I would be curious to see what your CPU usage looks like while gaming at 4K in a game such as GTA V, The Witcher 3, PUBG, etc.

I don't have any of those games you listed, but with R6 Siege, it sits at around 45%.


Just now, avrona said:

I don't have any of those games you listed, but with R6 Siege, it sits at around 45%.

R6S is a heavily CPU-bound game. Could you post proof of your setup and the FPS you get in game, along with screenshots of your CPU usage?


Because it's a hot mess that wasn't cost- or performance-competitive. Used, they can make a decent rig, but I would go 4th-gen i5 or i7 over anything FX because of IPC.

 

At 4K, the CPU matters a lot less than the GPU though.

I spent $2500 on building my PC and all i do with it is play no games atm & watch anime at 1080p(finally) watch YT and write essays...  nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)

Spoiler

"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


Last I recall, you said your performance was bouncing between 40 and 60 fps, which you deemed fine. However, most people don't want to put up with frame drops like that and would prefer better CPU performance.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad


On 1/29/2019 at 3:52 PM, Skiiwee29 said:

R6S is a heavy CPU bound game. Could you post proof of your setup and the FPS you get in game along with screenshots of your CPU usage?

[screenshot: in-game benchmark results]

 

On 1/29/2019 at 3:59 PM, fasauceome said:

Last I recall, you said your performance was bouncing between 40 and 60 fps, which you deemed fine. However, most people don't want to put up with frame drops like that, and would prefer some better CPU performance.

And since then I've checked again, and the performance is even better than I thought: it mainly stays at around 50-60, and that's only in really demanding games, while everything else holds a solid 60 FPS all the time.


4K puts stress on the GPU, making it unable to crank out as many frames as at lower resolutions, which in turn puts less pressure on the CPU to keep sending the GPU more things to do. However, it's not efficient, and Ivy Bridge could do the job just as well, if not better.

 

It's also worth pointing out that more recent games use multithreading more effectively than games did back then.
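The bottleneck argument above can be sketched as a toy model, assuming (simplistically) that each frame the CPU prepares work while the GPU renders, so frame time is roughly the slower of the two. All millisecond figures are invented for illustration, not measurements from any rig in this thread.

```python
# Toy model, not a benchmark: frame time ~ max(CPU time, GPU time).
# Raising resolution inflates only the GPU cost, which is why 4K can
# hide a slow CPU. All numbers below are made up for illustration.

def fps(cpu_ms, gpu_ms):
    """Approximate frames per second when the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 16.0  # a slow CPU caps the rig near 62 fps regardless of GPU
for res, gpu_ms in [("1080p", 6.0), ("1440p", 10.0), ("4K", 22.0)]:
    print(res, round(fps(cpu_ms, gpu_ms), 1))
```

Under these made-up numbers, 1080p and 1440p are CPU-capped at 62.5 fps while 4K is GPU-bound at roughly 45.5 fps, so the CPU's weakness stops showing in the average framerate.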


14 hours ago, Mira Yurizaki said:

4K puts stress on the GPU, making it unable to crank out as many frames as at lower resolutions, which in turn puts less pressure on the CPU to keep sending the GPU more things to do. However, it's not efficient, and Ivy Bridge could do the job just as well, if not better.

 

It's also worth pointing out that more recent games use multithreading more effectively than games did back then.

I know the GPU is more important at higher resolutions, and many people also know it, so I don't know why so many still somehow don't want to believe that I get great performance. 


1 minute ago, JoostinOnline said:

It probably doesn't matter much at 4k. You'd be at a severe bottleneck if you were playing at 1080p, or even 1440p.

But that's only if those lower resolutions, like 1080p or 1440p, also came with a higher refresh rate, right?


1 minute ago, avrona said:

But that's only if those lower resolutions, like 1080p or 1440p, also came with a higher refresh rate, right?

Well pairing it with a 1080 Ti, that's what you'd be more likely to have. But for the most part, yes. That being said, games are no longer designed with the FX series in mind (many seemed to mostly ignore it from the start), so don't be surprised if it prevents you from getting 60fps in newer AAA titles.

 

If you're going to upgrade, a 2700x is overkill. A 2600 (or a 9600k) would be plenty.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


14 minutes ago, JoostinOnline said:

Well pairing it with a 1080 Ti, that's what you'd be more likely to have. But for the most part, yes. That being said, games are no longer designed with the FX series in mind (many seemed to mostly ignore it from the start), so don't be surprised if it prevents you from getting 60fps in newer AAA titles.

 

If you're going to upgrade, a 2700x is overkill. A 2600 (or a 9600k) would be plenty.

Well, I am getting 60 FPS in AAA games right now, but we'll see; maybe when I get Cyberpunk or something, whenever that comes out, the framerate will suffer a lot. I only want to upgrade once it's really necessary.


On 1/29/2019 at 4:00 PM, avrona said:

 

The 6.4 fps minimum in the pool table scene is proof of the CPU's poor performance. Try monitoring CPU usage with software like HWiNFO; the cores the game uses will probably be at 100% during that scene.

I personally wouldn't consider a 60 fps average with a 6 fps minimum a good experience.
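The average-versus-minimum point above can be shown with a little arithmetic: from a list of per-frame render times, the mean fps can sit near 60 while the single worst frame dips to single digits. The frame times below are invented to mirror the "60 average, 6.4 minimum" figures being discussed, not taken from any actual benchmark log.

```python
# Sketch of why an average hides hitches: 99 smooth frames plus one
# long stutter still averages ~60 fps, yet the worst frame is ~6 fps.
# Frame times are invented for illustration.

def average_fps(frame_times_ms):
    """Mean framerate over the run: frames divided by total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    """Instantaneous framerate of the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

# 99 smooth ~15 ms frames plus one 156 ms stutter:
times = [15.0] * 99 + [156.0]
print(round(average_fps(times), 1), round(min_fps(times), 1))  # 60.9 6.4
```

This is why reviewers look at 1% lows rather than averages alone: a hitch too brief to register in the average is still visible as a stutter.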

Tag me @mbox or quote me if you want me to read

CPU Ryzen 5 1600 @3.9GHz  MOBO Asus Prime X370 Pro  GPU XFX RX 580 GTS XXX 8GB

COOLER Noctua NH-U12S  RAM 2x4GB HyperX Predator 3000 MHz  SSD Samsung 850 evo 250GB  CASE NZXT s340

MONITOR 1 AOC G2590PX MONITOR 2 LG 29WN600


Hi, I'll share my FX-8150 at 1080p: it bottlenecks the RX 480, though it really depends on the game.

 

 


20 minutes ago, mbox said:

The 6.4 fps minimum in the pool table scene is proof of the CPU's poor performance. Try monitoring CPU usage with software like HWiNFO; the cores the game uses will probably be at 100% during that scene.

I personally wouldn't consider a 60 fps average with a 6 fps minimum a good experience.

Well, it's not really proof, especially since I didn't even notice the framerate drop that low during the benchmark, meaning it must've lasted just a second or less. And the highest the CPU usage reached during the benchmark was 84%, but it mainly stayed at around 45-55%.


11 hours ago, avrona said:

And the highest the CPU usage reached during the benchmark was 84%, but it mainly stayed at around 45-55%.

That's really bad performance, especially considering you've got a 1080 Ti: you're being bottlenecked to hell. Also, using Task Manager to estimate your CPU usage is extremely inaccurate.

 

You don't have to upgrade if you don't want to, but trying to convince us that you're getting good performance is kind of silly.
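The overall-usage point can be made concrete with some quick arithmetic: with a hypothetical eight-core reading where a game pegs three cores, the overall average lands near 45% even though the busy cores are saturated. The per-core percentages and the 90% threshold below are my own invented illustration, not readings from anyone's machine in this thread.

```python
# Sketch of why an overall CPU-usage figure can mislead: a game that
# saturates a few cores is CPU-bound even when the average looks modest.
# The readings and the 90% threshold are arbitrary illustrative choices.

def usage_summary(per_core, saturated=90.0):
    """Return (overall average %, whether any single core is saturated)."""
    average = sum(per_core) / len(per_core)
    return average, any(c >= saturated for c in per_core)

# Eight cores, game threads pegging three of them:
avg, core_bound = usage_summary([100, 98, 95, 20, 10, 8, 15, 14])
print(round(avg), core_bound)  # 45 True: "45% usage" yet still CPU-bound
```

In other words, a screenshot showing "around 45%" overall is compatible with a hard per-core bottleneck, which is why per-core monitoring tells you more than Task Manager's single number.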


On 1/29/2019 at 4:00 PM, avrona said:

 

Those terrible minimum fps numbers are why people say FX is dead. As a former 8350 owner, I can say this: my Core 2 Quad Q6600 at 3 GHz had the same Cinebench R15 single-core score as the 8350 at 4 GHz. It has worse IPC than 10+ year old chips. Any game like CS:GO that only uses 2-4 cores would run WAY better on a quad-core Ryzen 3 than on poor old FX. It might be ALRIGHT, but it's on its way out. I'd get rid of FX before 2020 for sure.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


Most people don't play at 2160p, so the 8350 is typically treated as hot garbage. Is it right to be treated as hot garbage? Yeah, I mean, unless you're playing on a budget and got the thing for free, there's no reason to own one, especially when paired with something as comparatively high-end as a GTX 1050.

 

I mean your argument in favor of the 8350 is quite legitimately "My GPU is so weak at this resolution that it doesn't matter," not that the 8350 is good. I'm sure you still get worse 1% framerates than an R5 1400 would get.

if you have to insist you think for yourself, i'm not going to believe you.


7 hours ago, JoostinOnline said:

That's really bad performance to have, especially considering you've got a 1080 Ti. You're being bottlenecked to hell.  Also, using Task Manager to estimate your CPU usage is extremely inaccurate.

 

You don't have to upgrade if you don't want to, but trying to convince us that you're getting good performance is kind of silly.

Well again, I am getting good performance, and I do have the benchmark results to prove it.

6 hours ago, Vegetable said:

Those terrible minimum fps numbers are why people say FX is dead. As a former 8350 owner, I can say this: my Core 2 Quad Q6600 at 3 GHz had the same Cinebench R15 single-core score as the 8350 at 4 GHz. It has worse IPC than 10+ year old chips. Any game like CS:GO that only uses 2-4 cores would run WAY better on a quad-core Ryzen 3 than on poor old FX. It might be ALRIGHT, but it's on its way out. I'd get rid of FX before 2020 for sure.

I didn't even notice that the framerate dropped that low, meaning it must've lasted just a second or less. There is literally no way around the fact that the benchmarks all point towards an FX-8350 and GTX 1080 Ti combo working out really well.


On 1/29/2019 at 10:50 PM, Skiiwee29 said:

Here is the information about the lawsuit over the FX series of chips and the misleading marketing practices surrounding it.

It's still bullshit for some Commiefornians who want to get some money...

 

On 1/29/2019 at 10:50 PM, Skiiwee29 said:

To answer your original question: the FX-8350 is a six-year-old chip now,

Yes, and others are still running their 10-year-old Intel stuff and nobody cares...

 

On 1/29/2019 at 10:50 PM, Skiiwee29 said:

has poor IPC and poor single-threaded performance

Not with modern software; when it's optimized well for Bulldozer, it runs very well.

 

On 1/29/2019 at 10:50 PM, Skiiwee29 said:

and introduces bottlenecks as a result.

No, that was due to the crappy, slow K10-derived northbridge with the L3 cache inside it...

With the northbridge at 3 GHz, or ideally at the CPU clock rate, Bulldozer would have been much better, as raising the NB clock is in some instances worth as much as raising the CPU clock, sometimes even more.

 

And that's why you'll see Bulldozer around for years and years; even your bashing won't change the fact that it will be used until it's dead.

"Hell is full of good meanings, but Heaven is full of good works"


3 hours ago, avrona said:

Well again, I am getting good performance, and I do have the benchmark results to prove it. 

No you aren't. You just have incredibly low standards. Those benchmark results were terrible.


28 minutes ago, JoostinOnline said:

No you aren't. You just have incredibly low standards. Those benchmark results were terrible.

Once again, it's a 60 Hz monitor, so for that the performance is good; it's not just my standards.

