
Why Is the FX-8350 So Underrated for 4K Gaming?

3 minutes ago, avrona said:

Sounds more like just personal preference. I don't feel that things are too small; I have plenty of space for work such as animating, video editing, recording, streaming, etc., and I have an additional giant display if needed.

 

Listen, I have no problem with your personal preference, but there comes a point where 4K is just too small without scaling to read anything comfortably. You're going to honestly tell me that you don't use scaling for anything outside of games? Sometimes I even scale my 32" 4K and I have perfect vision.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


On 1/31/2019 at 3:19 PM, i_build_nanosuits said:

When I was gaming on my 24'' 1080p 60Hz panel and I had never known any better, I thought it was pretty good...and then...144Hz came along and animations felt real-life smooth, or something hard to describe, and it felt so much better...and now when I game at 60Hz it just feels awful...you just haven't known better ;) which is fine...ignorance is bliss

I can say the same to you regarding resolution. I used to play on a 1080p display, which I thought looked great, but now I get to enjoy the breathtaking and crisp 4K visuals, while also running at a way better framerate than before, as I upgraded my monitor and graphics card at the same time.

 

On 1/31/2019 at 3:20 PM, ChewToy! said:

Listen, I have no problem with your personal preference, but there comes a point where 4K is just too small without scaling to read anything comfortably. You're going to honestly tell me that you don't use scaling for anything outside of games? Sometimes I even scale my 32" 4K and I have perfect vision.

Yes I do use scaling, but I don't see how using it is a bad thing. 


 

 

I think it is still a relevant CPU. Yes, an RX 480 is the limit, but for a budget 1080p build it's a good CPU.


1 minute ago, avrona said:

Yes I do use scaling, but I don't see how using it is a bad thing. 

 

1 minute ago, avrona said:

I can say the same to you regarding resolution. I used to play on a 1080p display, which I thought looked great, but now I get to enjoy the breathtaking and crisp 4K visuals, while also running at a way better framerate than before, as I upgraded my monitor and graphics card at the same time.

read both of these, think about it...think again...and then a bit more...and then consider 2560x1440 at 27''...and then think again...does it ring a bell?

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


1 minute ago, i_build_nanosuits said:

 

read both of these, think about it...think again...and then a bit more...and then consider 2560x1440 at 27''...and then think again...does it ring a bell?

Again, you just seem to be against my whole rig setup at this point for no reason.


Just now, avrona said:

Again, you just seem to be against my whole rig setup at this point for no reason.

It's okay man, I won't argue any further...if you like it you like it, it's fine by me...like I said, whatever floats your boat ;)

Have a good evening.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


3 minutes ago, avrona said:

Again, you just seem to be against my whole rig setup at this point for no reason.

Don't worry about it, your setup is just fine if you're happy with it. People are just upset that you claimed the FX series was underrated. And you're still not getting the FPS at 4K that you should, and that's when the CPU matters the least. So technically, it's not underrated at all.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


1 hour ago, avrona said:

Unless you combine it with a 1080 Ti, that is. Even with the games you mentioned, you only used a 780, meaning the performance will be much better despite my CPU still being slower, as mine isn't overclocked at all. Yet it can keep a steady 50 FPS in games like Siege, and in every other game I have it's a solid 60 FPS all the time.

Again, I don't really need to be able to produce 80-90 FPS if my monitor is only a 60Hz one.

That's just the part you're not understanding. This is how a CPU bottleneck works: once that CPU maxes out utilization-wise (100% usage), that's your max fps. It doesn't matter one bit what your GPU is, because it is being limited to however many frames your CPU can handle. So, with that being said, his 780 would get the same fps as your 1080 Ti would, because it's still being bottlenecked. You might be able to run ultra settings and get the same fps as his 780 would on medium, but it will never be higher than his. So you're getting less-than-GTX-780 performance out of a 1080 Ti.
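
To put that in concrete terms, here is a minimal sketch (Python, with made-up numbers) of the min() relationship described above; it is purely illustrative and not a benchmark of anyone's system:

```python
# Illustrative model of a CPU bottleneck: the framerate you actually get is capped
# by whichever limit is lowest, no matter how fast the other parts are.
# All numbers below are invented for illustration, not measurements.

def delivered_fps(cpu_fps_cap, gpu_fps_cap, refresh_hz=None):
    """FPS you actually see: the minimum of the CPU limit, the GPU limit,
    and (if v-synced or capped) the monitor's refresh rate."""
    fps = min(cpu_fps_cap, gpu_fps_cap)
    if refresh_hz is not None:
        fps = min(fps, refresh_hz)
    return fps

# Hypothetical: a CPU that can only prepare ~45 frames/s in a demanding title,
# paired with either a 780-class GPU or a much faster 1080 Ti-class GPU.
print(delivered_fps(cpu_fps_cap=45, gpu_fps_cap=70))    # older GPU  -> 45 fps
print(delivered_fps(cpu_fps_cap=45, gpu_fps_cap=200))   # faster GPU -> still 45 fps
```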

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


20 minutes ago, ChewToy! said:

Don't worry about it, your setup is just fine if you're happy with it. People are just upset that you claimed the FX series was underrated. And you're still not getting the FPS at 4K that you should, and that's when the CPU matters the least. So technically, it's not underrated at all.

Well because it clearly is underrated if so many people don't even believe me that I have good performance. And how am I not getting the FPS I should? I'm getting as many as my monitor allows me, so 60.


34 minutes ago, avrona said:

I can say the same to you regarding resolution. I used to play on a 1080p display, which I thought looked great, but now I get to enjoy the breathtaking and crisp 4K visuals, while also running at a way better framerate than before, as I upgraded my monitor and graphics card at the same time.

And that's simply false. CPU load doesn't change with resolution; it changes based on how many fps you have. Drawing 60fps at 1080p takes the same CPU horsepower as 60fps at 4K. So you definitely AREN'T getting any more fps than you were before by going from 1080p to 4K, because your fps is still COMPLETELY dependent on your poor old FX. All that the above quote from you proves is that you actually don't know what you're talking about. Unless you went from, say, a 750 Ti, which probably doesn't get bottlenecked by the FX much if at all, your fps is identical because the CPU is the limiting factor and always will be.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


8 minutes ago, Vegetable said:

That's just the part you're not understanding. This is how a CPU bottleneck works: once that CPU maxes out utilization-wise (100% usage), that's your max fps. It doesn't matter one bit what your GPU is, because it is being limited to however many frames your CPU can handle. So, with that being said, his 780 would get the same fps as your 1080 Ti would, because it's still being bottlenecked. You might be able to run ultra settings and get the same fps as his 780 would on medium, but it will never be higher than his. So you're getting less-than-GTX-780 performance out of a 1080 Ti.

But again, it doesn't matter if I'm being bottlenecked or not as my monitor would simply be unable to process more frames.

2 minutes ago, Vegetable said:

And that's simply false. CPU load doesn't change with resolution; it changes based on how many fps you have. Drawing 60fps at 1080p takes the same CPU horsepower as 60fps at 4K. So you definitely AREN'T getting any more fps than you were before by going from 1080p to 4K, because your fps is still COMPLETELY dependent on your poor old FX. All that the above quote from you proves is that you actually don't know what you're talking about. Unless you went from, say, a 750 Ti, which probably doesn't get bottlenecked by the FX much if at all, your fps is identical because the CPU is the limiting factor and always will be.

Well I am getting more FPS though. Previously I was getting around 30-40 at max settings in a game like Siege. 


3 minutes ago, avrona said:

But again, it doesn't matter if I'm being bottlenecked or not as my monitor would simply be unable to process more frames.

Well I am getting more FPS though. Previously I was getting around 30-40 at max settings in a game like Siege. 

Seeing as your minimum fps is less than 30 in ALL of the scenes, AND the average is below 60 for ALL of the scenes of the SINGLE benchmark you have provided, then yes, your monitor CAN process more frames and your 1080 Ti REALLY can process a lot more frames than that. I personally turn settings down until my minimum fps is ABOVE 60, because below that just feels like crap to me. You average less than that, so you have a terrible experience if you ask me. Like I've said before, I had an FX-8350. Overclocked it to 5GHz with 2133MHz DDR3 on my Kraken X61. It still BOTTLENECKED SLI GTX 570s. Those are slower than GTX 950s and they were held back by my FX. If I slapped a 1080 Ti on that system today, it would get the EXACT same fps as my SLI 570s did, because they weren't the limiting factor in my fps.

CPU: INTEL Core i7 4790k @ 4.7Ghz - Cooling: NZXT Kraken X61 - Mobo: Gigabyte Z97X SLI - RAM: 16GB G.Skill Ares 2400mhz - GPU: AMD Sapphire Nitro R9 Fury 4G - Case: Phanteks P350X - PSU: EVGA 750GQ - Storage: WD Black 1TB - Fans: 2x Noctua NF-P14s (Push) / 2x Corsair AF140 (Pull) / 3x Corsair AF120 (Exhaust) - Keyboard: Corsair K70 Cherry MX Red - Mouse: Razer Deathadder Chroma

Bit of an AMD fan I suppose. I don't bias my replies to anything however, I just prefer AMD and their products. Buy whatever the H*CK you want. 

---QUOTE ME OR I WILL LIKELY NOT REPLY---

 

 


4 minutes ago, avrona said:

But again, it doesn't matter if I'm being bottlenecked or not as my monitor would simply be unable to process more frames.

Well I am getting more FPS though. Previously I was getting around 30-40 at max settings in a game like Siege. 

In games like CSGO and Rainbow Six, FPS still matters. If you get 120fps on a 60Hz monitor, you still have an advantage since it'll show the second frame instead of the first. Say, instead of frames 1-60 it'll show 2, 4, 6, 8 ... 118, 120 over those 60 refreshes. You won't see all 120 frames, but with more FPS you get a faster, more up-to-date view of what's going on around you.
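
A rough back-of-the-envelope way to see that benefit is below; it's a simplified model that ignores tearing and assumes perfectly even frame pacing, with illustrative numbers only:

```python
# Simplified illustration: if the game renders at `fps`, the most recently finished
# frame a 60 Hz panel can grab on each refresh is at most one render interval old.
# Real behaviour (tearing, uneven frame pacing) is messier; numbers are illustrative.

REFRESH_HZ = 60

for fps in (60, 120, 240):
    worst_case_age_ms = 1000.0 / fps   # oldest a just-completed frame can be
    print(f"{fps:>3} fps on a {REFRESH_HZ} Hz panel -> "
          f"displayed frame is at most ~{worst_case_age_ms:.1f} ms old")
# 60 fps -> ~16.7 ms, 120 fps -> ~8.3 ms, 240 fps -> ~4.2 ms
```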

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


That game sucks tbh, but about performance: I played Witcher 3 at max settings with an 8350 and also got 60 fps.

 

The forum has always had a huge Intel fanboy-ish portion of users, just like any forum, so that's why AMD is hated. I wouldn't worry about it.

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


3 minutes ago, Vegetable said:

Seeing as your minimum fps is less than 30 in ALL of the scenes, AND the average is below 60 for ALL of the scenes of the SINGLE benchmark you have provided, then yes, your monitor CAN process more frames and your 1080 Ti REALLY can process a lot more frames than that. I personally turn settings down until my minimum fps is ABOVE 60, because below that just feels like crap to me. You average less than that, so you have a terrible experience if you ask me. Like I've said before, I had an FX-8350. Overclocked it to 5GHz with 2133MHz DDR3 on my Kraken X61. It still BOTTLENECKED SLI GTX 570s. Those are slower than GTX 950s and they were held back by my FX. If I slapped a 1080 Ti on that system today, it would get the EXACT same fps as my SLI 570s did, because they weren't the limiting factor in my fps.

By "it can't process more frames" I've been talking about every single other game I play, in which it is constantly maxed at 60 FPS. And before, at max settings, framerates were lower than what they are now, meaning that a upgrade of my graphics card must've helped.


15 minutes ago, avrona said:

By "it can't process more frames" I've been talking about every single other game I play, in which it is constantly maxed at 60 FPS. And before, at max settings, framerates were lower than what they are now, meaning that a upgrade of my graphics card must've helped.

Like we said, if you're happy, then everything is all good and dandy. You could get better performance by upgrading your CPU/motherboard/RAM. You don't need to if you're happy with what you have, though, so it's really a non-issue. I already said it once: people are only upset because you're trying to tell them that the FX is an underrated chip.

 

18 minutes ago, aezakmi said:

That game sucks tbh, but about performance: I played Witcher 3 at max settings with an 8350 and also got 60 fps.

60fps isn't the issue.

 

18 minutes ago, aezakmi said:

The forum has always had a huge Intel fanboy-ish portion of users, just like any forum, so that's why AMD is hated. I wouldn't worry about it.

Most people are now using Ryzen systems. I hardly see Intel even recommended anymore unless the OP requests it.

 

I won't lie, though, I would never have bought AMD before Ryzen, although I was never really into hardware until recently. Now I'll never buy Intel unless they do something revolutionary, even if the pricing and performance are the SAME. AMD is just doing better in the CPU (price-to-performance) department at the moment.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


3 minutes ago, ChewToy! said:

Like we said, if you're happy, then everything is all good and dandy. You could get better performance by upgrading your CPU/motherboard/RAM. You don't need to if you're happy with what you have, though, so it's really a non-issue. I already said it once: people are only upset because you're trying to tell them that the FX is an underrated chip.

 

60fps isn't the issue.

 

Most people are now using Ryzen systems. I hardly see Intel even recommended anymore unless the OP requests it.

 

I won't lie, though, I would never have bought AMD before Ryzen, although I was never really into hardware until recently. Now I'll never buy Intel unless they do something revolutionary, even if the pricing and performance are the SAME. AMD is just doing better in the CPU (price-to-performance) department at the moment.

Well I wouldn't get better performance as, again, my monitor can only manage 60 FPS. And it clearly is underrated if so many people don't even believe me when I say I have great performance with it.


3 minutes ago, ChewToy! said:

Like we said, if you're happy, then everything is all good and dandy. You could get better performance by upgrading your CPU/motherboard/RAM. You don't need to if you're happy with what you have, though, so it's really a non-issue. I already said it once: people are only upset because you're trying to tell them that the FX is an underrated chip.

 

60fps isn't the issue.

 

Most people are now using Ryzen systems. I hardly see Intel even recommended anymore unless the OP requests it.

 

I won't lie, though, I would never have bought AMD before Ryzen, although I was never really into hardware until recently. Now I'll never buy Intel unless they do something revolutionary, even if the pricing and performance are the SAME. AMD is just doing better in the CPU (price-to-performance) department at the moment.

I had an 8350 paired with a 290X (yeah, real toaster) and managed to squeeze every bit of performance out of it with overclocking. It was pretty good for its time tbh. At first every game ran perfectly, but as games got more complex it started to show its flaws at 1440p: stuttering, low framerates, random crashes or freezes, etc.

 

A 3770K, which was the "rival" CPU at the time, was literally twice the price, not to mention the price of a decent board, so it was impossible for me to get.

 

Yeah, and Ryzen is 10 times better, can confirm since I have one: more cores, low temps, better (yet not as good as 8th-gen Intel) single-core perf. The only downside is that it can't overclock as well as the new Intel stuff or the old FX.

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


On 1/31/2019 at 4:52 PM, aezakmi said:

I had an 8350

I never really gamed on my PC until recently and this is my first great gaming PC. Before this it was an i7 laptop with a 1060 at 1080p, which actually worked very well. I really do love what AMD is doing right now and hope they keep it up. I don't even want to consider Intel unless their pricing becomes more than reasonable, and even then, AMD would have to be on a shit streak or something. I like supporting AMD right now as they're giving Intel a run for its money, and I'm hoping to see this in the GPU space soon. These companies have been giving us SHIT product updates for too long without any real innovation.

 

Just like @avrona, if he's happy with his performance, then good for him. He's just proud of what he has and I don't blame him for it. But he's off a little bit by saying his processor is underrated.

 

/rant.

 

On 1/31/2019 at 4:52 PM, aezakmi said:

I had a

Just because your monitor only has a 60Hz refresh rate doesn't mean higher frame rates aren't beneficial. I mentioned this above somewhere.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


11 hours ago, ChewToy! said:

I never really gamed on my PC until recently and this is my first great gaming PC. Before this it was an i7 laptop with a 1060 at 1080p, which actually worked very well. I really do love what AMD is doing right now and hope they keep it up. I don't even want to consider Intel unless their pricing becomes more than reasonable, and even then, AMD would have to be on a shit streak or something. I like supporting AMD right now as they're giving Intel a run for its money, and I'm hoping to see this in the GPU space soon. These companies have been giving us SHIT product updates for too long without any real innovation.

 

Just like @avrona, if he's happy with his performance, then good for him. He's just proud of what he has and I don't blame him for it. But he's off a little bit by saying his processor is underrated.

 

/rant.

Again, it clearly is underrated though if so many people don't even believe my performance.


12 hours ago, avrona said:

And it clearly is underrated if so many people don't even believe me when I say I have great performance with it.

This is what is wrong with this post, not your setup. You saying that you have "great performance" isn't proof. The only scientific evidence in this post is the result of the benchmark, and since it gives a 6.5 minimum, that's proof that something is wrong. If you want to give us proof of your "great performance", give us a real-time graph of CPU usage per core (not from Task Manager), record all the FPS data from the test, and give us the average but also the 1% low (I guess the 0.1% won't be necessary in this case).

Also, on a 60Hz monitor, even if your minimum is above 60 it is beneficial to get more fps, since the game data (such as player position and bullet trajectory) is updated at the same frequency.
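
For anyone wanting to report those numbers, here is a minimal sketch of how the average FPS and 1% low could be computed from a recorded list of frame times. The hard-coded sample data is purely hypothetical; in practice you would load frame times from whatever your capture or overlay tool exports:

```python
# Compute average FPS and the "1% low" from a list of per-frame times in milliseconds.
# The sample data below is invented; replace it with frame times exported by your tool.

def summarize(frame_times_ms):
    fps_per_frame = [1000.0 / t for t in frame_times_ms]        # instantaneous FPS per frame
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    worst_1pct = sorted(fps_per_frame)[: max(1, len(fps_per_frame) // 100)]
    low_1pct = sum(worst_1pct) / len(worst_1pct)                 # average of the slowest 1% of frames
    return avg_fps, low_1pct

# Hypothetical capture: mostly ~16.7 ms frames with a few big stutters mixed in.
sample = [16.7] * 500 + [40.0, 55.0, 150.0, 16.7, 33.3] * 2
avg, low = summarize(sample)
print(f"average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```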

Tag me @mbox or quote me if you want me to read

CPU Ryzen 5 1600 @3.9GHz  MOBO Asus Prime X370 Pro  GPU XFX RX 580 GTS XXX 8GB

COOLER Noctua NH-U12S  RAM 2x4GB HyperX Predator 3000 MHz  SSD Samsung 850 evo 250GB  CASE NZXT s340

MONITOR 1 AOC G2590PX MONITOR 2 LG 29WN600


1 hour ago, avrona said:

Again, it clearly is underrated though if so many people don't even believe my performance.

We believe the performance you're getting. We just don't agree that it's great. It's way below what something modern would get you.

 

It's OKAY to be on a budget CPU. Nobody is telling you that you NEED to upgrade.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


On 2/1/2019 at 4:56 AM, mbox said:

This is what is wrong with this post, not your setup. You saying that you have "great performance" isn't proof. The only scientific evidence in this post is the result of the benchmark, and since it gives a 6.5 minimum, that's proof that something is wrong. If you want to give us proof of your "great performance", give us a real-time graph of CPU usage per core (not from Task Manager), record all the FPS data from the test, and give us the average but also the 1% low (I guess the 0.1% won't be necessary in this case).

Also, on a 60Hz monitor, even if your minimum is above 60 it is beneficial to get more fps, since the game data (such as player position and bullet trajectory) is updated at the same frequency.

Well here's the thing though, recording the results will just make them worse.

 

On 2/1/2019 at 5:58 AM, JoostinOnline said:

We believe the performance you're getting. We just don't agree that it's great. It's way below what something modern would get you.

 

It's OKAY to be on a budget CPU. Nobody is telling you that you NEED to upgrade.

I mean, I never intended it to be a budget CPU, and back in 2015 when I got it, it was bundled in a gaming PC after all.


8 hours ago, avrona said:

I mean, I never intended it to be a budget CPU, and back in 2015 when I got it, it was bundled in a gaming PC after all.

And it wasn't a budget option 6-7 years ago when it was released. But it's like having a GTX 680 in your rig: it was high-end in 2012, but its only competition now is with entry-level GPUs, while being much more power hungry.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


3 hours ago, JoostinOnline said:

And it wasn't a budget option 6-7 years ago when it was released. But it's like having a GTX 680 in your rig: it was high-end in 2012, but its only competition now is with entry-level GPUs, while being much more power hungry.

But you could always have some killer CPU in your system to help make up the difference. That's what I did and it's working out really well so far.

