
An issue with people bashing the FX CPUs!

I/O

More important than the max framerate: an i5 will usually yield higher minimums as well, which is very noticeable.

It can be, depending on your GPU solution. Personally, though, I try to cap my FPS in a reasonable fashion to avoid any crazy framerate jumps. Usually I cap at 62 (or, for some games like CSGO, leave it at a 144 cap), but if it's a game where I can't hold a stable 60 I'll cap it at 32, and that tends to play a lot better than floating around in the 40s and 50s.

I have a powerful enough GPU solution, though, that I don't really have many games that struggle to hit 60, let alone 30. So min framerate rarely affects me, since it's always above 30 and I cap my FPS sensibly to prevent crazy drops.
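The capping approach described above can be sketched in code. This is a hypothetical helper (the function name and the divisor rule are my own assumptions, not anything from the post): pick the largest cap that divides the refresh rate evenly and that the machine can actually sustain.

```python
# Hypothetical sketch of the capping strategy above: choose the largest
# divisor of the monitor's refresh rate that the machine can sustain,
# so frame delivery stays even instead of floating in the 40s and 50s.

def pick_fps_cap(refresh_hz: int, sustainable_fps: float) -> int:
    # All divisors of the refresh rate, largest first (60 -> 60, 30, 20, ...).
    caps = sorted({refresh_hz // d for d in range(1, refresh_hz + 1)
                   if refresh_hz % d == 0}, reverse=True)
    for cap in caps:
        if sustainable_fps >= cap:
            return cap
    return caps[-1]  # nothing sustainable: fall back to the smallest divisor

print(pick_fps_cap(60, 75))    # holds above 60 -> cap at 60
print(pick_fps_cap(60, 45))    # can't hold 60 -> cap at 30
print(pick_fps_cap(144, 100))  # 144Hz panel, ~100fps game -> cap at 72
```

The post actually caps slightly off the divisor (62 and 32); the divisor values are just the idea in its simplest form.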

Hey guys, since I happen to own an FX-8350 and a GTX 780, I'm in a great spot to weigh in on the situation.

The FX-8350, and basically every FX chip, sucks for high-end gaming. Overclocking does not help; the minuscule amount of performance you gain from a 5GHz clock rate is not worth draining the life of your CPU. If you argue that it doesn't ever bottleneck, or that only "some" games bottleneck, you're dumb. You shouldn't buy a PC component for one game that you're going to play a couple of times; I think that's seriously stupid. That Crysis 3 benchmark? I played Crysis 3 for a week because I got it for free. I haven't played it in almost a year now. That argument is stupid. If you want benchmarks, here you go:

[Image: FX-8350 bottleneck benchmark (8350bottleneck2.png)]

Did you see that massive bottleneck? Because I see it often.

I have the pleasure of seeing a bottleneck in every CPU-intensive game, and in games just not optimized to run on eight cores. I know because I never get to see my 780 at 100%. Yes, I can see some really awesome detail and frame rates, but my GPU isn't at 100%, so I could be seeing lots more, and can't.

However, the FX-8320 is a pretty good budget option. If you only want an R9 270X, you could get away with owning an 8320. If you want anything better, you'd better go Intel, because you'll see bottlenecks.

Here is what I don't understand. Past 60 FPS, what does it matter if you're bottlenecked at 90 frames? Unless you're running a 120Hz monitor it literally has no effect on your experience. If you're gaming at higher resolutions, you'll see 99% usage just the same on an FX processor as on an Intel one outside of some very limited examples.

Why not run your GPU quieter and cooler with a capped 60 FPS? If you're on a 1080p, 60Hz monitor there's literally no difference in the fidelity of the image.

Like, look at those benchmarks. Is Civ V really a game where framerate even matters to the experience, even past 30?

Like, yes, if you want to always get maximum GPU usage, FX is not appropriate. But really, as long as you can fulfill your desired performance bracket (resolution, monitor refresh rate, graphics settings), then who the fuck cares what benchmarks say? If you're playing a game and the GPU isn't even being stressed enough on max settings to see 100% usage, THAT IS A GOOD THING. It means you have more wiggle room to crank up settings like SSAA or higher resolutions, or for streaming.

I just don't get it, and I own a fucking 144Hz monitor. I cap my framerate at the nearest sensible place (30, 60, 120; I rarely even have to turn v-sync on) for the game in question, and I have a smooth experience with as few dips below my target framerate as possible. It is a MUCH, MUCH, MUCH better experience than uncapped framerates or any of that nonsense.

Unless you have a 120Hz+ monitor, bottlenecking really isn't a concern. Admittedly, more games run past 100 FPS on Intel than on AMD, so if you're really looking for a complete 120Hz gaming experience you'll want Intel. But let's be honest: people don't buy 120Hz monitors so they can play Civ V...

4K // R5 3600 // RTX2080Ti


Here is what I don't understand.  Past 60 FPS, what does it matter if you're bottlenecked at 90 frames?

 

Having more GPU power available is good; more than 60fps has its uses. Think of framedrops in heavy particle scenes (smoke and such): I'd rather drop from 70-90fps to 50fps than from 60fps down to, say, 35-45fps.

I don't mean to knock your post; I agree. I just wanted to add my thoughts, i.e. framedrops.
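The framedrop point above is easier to see in frame times. A quick sketch (the numbers are the post's examples):

```python
# Frame time in milliseconds for a given framerate: the higher the starting
# framerate, the less severe the per-frame hit when a heavy scene causes a drop.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Dropping from ~80fps to 50fps: 12.5ms -> 20.0ms per frame (+7.5ms).
print(frame_time_ms(80), frame_time_ms(50))
# Dropping from 60fps to 40fps: ~16.7ms -> 25.0ms per frame (+8.3ms).
print(frame_time_ms(60), frame_time_ms(40))
```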

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Here is what I don't understand. Past 60 FPS, what does it matter if you're bottlenecked at 90 frames? Unless you're running a 120Hz monitor it literally has no effect on your experience. If you're gaming at higher resolutions, you'll see 99% usage just the same on an FX processor as on an Intel one outside of some very limited examples.

If you're already limited to 60 rather than 90, what happens when a new, more demanding game comes out? You'd then be at 40 rather than 70.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


Here is what I don't understand. Past 60 FPS, what does it matter if you're bottlenecked at 90 frames? Unless you're running a 120Hz monitor it literally has no effect on your experience. If you're gaming at higher resolutions, you'll see 99% usage just the same on an FX processor as on an Intel one outside of some very limited examples.

 

Why not run your GPU quieter and cooler with a capped 60 FPS? If you're on a 1080p, 60Hz monitor there's literally no difference in the fidelity of the image.

 

Like, look at those benchmarks. Is Civ V really a game where framerate even matters to the experience, even past 30?

 

Like, yes, if you want to always get maximum GPU usage, FX is not appropriate. But really, as long as you can fulfill your desired performance bracket (resolution, monitor refresh rate, graphics settings), then who the fuck cares what benchmarks say? If you're playing a game and the GPU isn't even being stressed enough on max settings to see 100% usage, THAT IS A GOOD THING. It means you have more wiggle room to crank up settings like SSAA or higher resolutions, or for streaming.

 

I just don't get it, and I own a fucking 144Hz monitor. I cap my framerate at the nearest sensible place (30, 60, 120; I rarely even have to turn v-sync on) for the game in question, and I have a smooth experience with as few dips below my target framerate as possible. It is a MUCH, MUCH, MUCH better experience than uncapped framerates or any of that nonsense.

 

Unless you have a 120Hz+ monitor, bottlenecking really isn't a concern. Admittedly, more games run past 100 FPS on Intel than on AMD, so if you're really looking for a complete 120Hz gaming experience you'll want Intel. But let's be honest: people don't buy 120Hz monitors so they can play Civ V...

You know what? That's exactly why I have a GTX 780, upgraded from a GTX 770, upgraded from an HD 7870. I only want so much, not the absolute best. Who, on this forum, full of enthusiasts crazier than I am, wants or needs anything better? Nobody here, of course!

 

You're pitching the wrong argument to the wrong person. Maybe you are right in one respect: you don't get it. I was under the impression that this was an enthusiast forum, but it seems people will settle for less because anything better makes no sense anyway. Let's all go to the Xbox One, am I right?????

if you have to insist you think for yourself, i'm not going to believe you.


The benchie shows the game at 720p, not at 1080p.

720p is considered HD, but that's even fewer pixels than 1280x1024.
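The pixel math behind that claim, for anyone who wants to check it:

```python
# 1280x720 really is fewer pixels than 1280x1024, despite being "HD".
resolutions = {
    "1280x720 (720p)": 1280 * 720,
    "1280x1024 (5:4)": 1280 * 1024,
    "1920x1080 (1080p)": 1920 * 1080,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")
# 921,600 < 1,310,720 < 2,073,600
```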

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


Having more GPU power available is good; more than 60fps has its uses. Think of framedrops in heavy particle scenes (smoke and such): I'd rather drop from 70-90fps to 50fps than from 60fps down to, say, 35-45fps.

I don't mean to knock your post; I agree. I just wanted to add my thoughts, i.e. framedrops.

Sure, but if you can run the game at 90FPS you already have that wiggle room.

 

 

 

If you're already limited to 60 rather than 90, what happens when a new, more demanding game comes out? You'd then be at 40 rather than 70.

 

If the GPU stuff becomes more demanding, you transition from a CPU bottleneck to a GPU one. The CPU bottleneck only becomes relevant in *most* games because the GPU is kicking so much ass that you can reach it. (Outside of a few games like Arma 3 or World of Tanks, which are heavily CPU-bottlenecked on FX processors and where Intel will net you double the framerates.) I'm talking about framerate caps, not buying hardware to only reach 60 FPS. Very different things.

 

 

You know what? That's exactly why I have a GTX 780, upgraded from a GTX 770, upgraded from an HD 7870. I only want so much, not the absolute best. Who, on this forum, full of enthusiasts crazier than I am, wants or needs anything better? Nobody here, of course!

 

You're pitching the wrong argument to the wrong person. Maybe you are right in one respect: you don't get it. I was under the impression that this was an enthusiast forum, but it seems people will settle for less because anything better makes no sense anyway. Let's all go to the Xbox One, am I right?????

I understand wanting the higher-end hardware, wanting the best graphics with the highest resolutions, wanting to run at great framerates. I understand crazy Eyefinity rigs for surround gaming, 4K resolutions, and tools like DSR and VSR.

 

But you literally cannot see the difference between 90 FPS and 110 FPS on a 60Hz monitor; your monitor cannot even display it. What matters more is still a stable framerate with a very small delta between min and max framerate. Running at a higher max framerate doesn't benefit you unless you can do it with a much higher min framerate as well. I've never seen a real-world example of this happening in any significant fashion, since even if min and max frames increase, they still often have a huge delta between them.
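The min/max delta idea can be made concrete with a small sketch. `framerate_stats` and the sample frame times are hypothetical; any per-frame time log would feed it the same way:

```python
# Rough sketch: summarize a run of per-frame times (in ms) into min/avg/max
# fps and the min-max delta. A small delta means stable pacing; a high
# max with a big delta feels worse than a lower, steadier cap.

def framerate_stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return {
        "min_fps": min(fps),
        "max_fps": max(fps),
        "avg_fps": len(fps) / (sum(frame_times_ms) / 1000.0),
        "delta": max(fps) - min(fps),
    }

uncapped = framerate_stats([9, 11, 25, 10, 30, 12])       # spiky run
capped = framerate_stats([17, 16.8, 17, 16.9, 17, 16.7])  # capped near 60
print(uncapped["delta"], capped["delta"])  # the capped delta is far smaller
```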

 

The benchie shows the game at 720p, not at 1080p.

720p is considered HD, but that's even fewer pixels than 1280x1024.

That's because it's a CPU test. In the actual game, the difference between the two is much closer at resolutions people actually play at, because the GPU is more important. The low resolution is used to show the biggest possible difference between the CPUs, but in the real world these differences are often much smaller. (Usually 5-10 frames at most, with the most egregious examples being a 30-40 frame difference in Skyrim or SC2.)
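A toy model of why the low-resolution test exaggerates the gap (all numbers are made up for illustration, not taken from any benchmark):

```python
# Delivered framerate is roughly the lower of the CPU-limited and
# GPU-limited rates; the GPU limit falls as resolution (pixel count) rises.

def delivered_fps(cpu_fps: float, gpu_fps_720p: float, pixels: int) -> float:
    gpu_fps = gpu_fps_720p * (1280 * 720) / pixels  # crude inverse-pixel scaling
    return min(cpu_fps, gpu_fps)

# At 720p the GPU is loafing, so the full CPU gap shows (70 vs 110):
print(delivered_fps(70, 200, 1280 * 720), delivered_fps(110, 200, 1280 * 720))
# At 1080p the GPU limit (~89fps) kicks in and the gap shrinks (70 vs ~89):
print(delivered_fps(70, 200, 1920 * 1080), delivered_fps(110, 200, 1920 * 1080))
```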

4K // R5 3600 // RTX2080Ti


If the GPU stuff becomes more demanding, you transition from a CPU bottleneck to a GPU one. The CPU bottleneck only becomes relevant in *most* games because the GPU is kicking so much ass that you can reach it. (Outside of a few games like Arma 3 or World of Tanks, which are heavily CPU-bottlenecked on FX processors and where Intel will net you double the framerates.) I'm talking about framerate caps, not buying hardware to only reach 60 FPS. Very different things.

I'm not talking about buying lesser hardware either. Games aren't JUST getting more demanding on the GPU; they're also getting more demanding on the CPU. So if an FX chip + 780 only gets 60fps now while an i5 + 780 gets 90, then in a more demanding game even 60 wouldn't be achievable.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


I'm not talking about buying lesser hardware either. Games aren't JUST getting more demanding on the GPU; they're also getting more demanding on the CPU. So if an FX chip + 780 only gets 60fps now while an i5 + 780 gets 90, then in a more demanding game even 60 wouldn't be achievable.

Maybe 5 years down the line that might be a concern, but there are a lot of CPU optimizations developers could do, like distributing workloads across more cores, that should largely limit that. I'd actually expect the bottlenecks to get smaller over the next couple of years. We're not even at the point where games use all four cores in a quad-core, let alone more than that. Just compare the performance difference between FX and Intel in recent games (2013-2014) versus 2010-2011 games, and a lot of bottleneck issues start disappearing or becoming very minor.

 

Personally, I usually upgrade GPU every 2 years and CPU every 3-4 to try and stay ahead of the curve.  

4K // R5 3600 // RTX2080Ti


Maybe 5 years down the line that might be a concern, but there are a lot of CPU optimizations developers could do, like distributing workloads across more cores, that should largely limit that. I'd actually expect the bottlenecks to get smaller over the next couple of years. We're not even at the point where games use all four cores in a quad-core, let alone more than that.

 

Personally, I usually upgrade GPU every 2 years and CPU every 3-4 to try and stay ahead of the curve.  

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


 

Maybe 5 years down the line that might be a concern, but there are a lot of CPU optimizations developers could do, like distributing workloads across more cores, that should largely limit that. I'd actually expect the bottlenecks to get smaller over the next couple of years. We're not even at the point where games use all four cores in a quad-core, let alone more than that.

 

Personally, I usually upgrade GPU every 2 years and CPU every 3-4 to try and stay ahead of the curve.  

 

CryEngine 3 and Frostbite already do. Dunia does. UE4 does. Unity does (both the licensed engine and the engine behind the new Assassin's Creed). With the game engines supporting it, developers will follow suit.

4K // R5 3600 // RTX2080Ti


I have an 8350 at 4.65GHz and it idles at 20C on average, so the "runs hotter" argument doesn't really matter. Now, I will admit that it does take a lot more power to run, but I love my OC'd 8-core.

Yeah, that's not really possible. I saw you have an air cooler. My i7 3820 runs at about 30 Celsius idle, and that's liquid-cooled and set at stock speeds.


Yeah, that's not really possible. I saw you have an air cooler. My i7 3820 runs at about 30 Celsius idle, and that's liquid-cooled and set at stock speeds.

Depends on ambient temps; sometimes my 3570k's 3rd core gets to 17-18C at idle while the rest of the cores are ~20C (ambient temp of 15-16C).

But idle temps don't matter. 

RIP in pepperonis m8s


Depends on ambient temps; sometimes my 3570k's 3rd core gets to 17-18C at idle while the rest of the cores are ~20C (ambient temp of 15-16C).

But idle temps don't matter. 

I've had some issues with my CPU running hotter than it should. I think it's due to the cheap design of my motherboard.


X99 FOR LIFE!

inb4somethingbetterthanx99

Current PC build: [CPU: Intel i7 8700k] [GPU: GTX 1070 Asus ROG Strix] [Ram: Corsair LPX 32GB 3000MHz] [Mobo: Asus Prime Z370-A] [SSD: Samsung 970 EVO 500GB primary + Samsung 860 Evo 1TB secondary] [PSU: EVGA SuperNova G2 750w 80plus] [Monitors: Dual Dell Ultrasharp U2718Qs, 4k IPS] [Case: Fractal Design R5]


you wish it was 8 cores

It does, that's the ambient temp of my room and my case has good fans. So of course that's what I idle at.

 

I live in Indiana, so it's absolutely freezing out right now, and I pay the heating bill (I'm not going to crank that shit in the winter), so my room is always at about 20C.

CPU: R5 5800X3D Motherboard - MSI X570 Gaming Plus RAM - 32GB Corsair DDR4 GPU - XFX 7900 XTX 4GB Case - NZXT H5 Flow (White) Storage - 2X 4TB Samsung 990 Pro PSU - Corsair RM100E Cooling - Corsair H100i Elite Capellix Keyboard Corsair K70 (Brown Switches)  Mouse - Corsair Nightsword RGB


Yeah, that's not really possible. I saw you have an air cooler. My i7 3820 runs at about 30 Celsius idle, and that's liquid-cooled and set at stock speeds.

It does, that's the ambient temp of my room and my case has good fans. So of course that's what I idle at.

 

I live in Indiana, so it's absolutely freezing out right now, and I pay the heating bill (I'm not going to crank that shit in the winter), so my room is always at about 20C.

CPU: R5 5800X3D Motherboard - MSI X570 Gaming Plus RAM - 32GB Corsair DDR4 GPU - XFX 7900 XTX 4GB Case - NZXT H5 Flow (White) Storage - 2X 4TB Samsung 990 Pro PSU - Corsair RM100E Cooling - Corsair H100i Elite Capellix Keyboard Corsair K70 (Brown Switches)  Mouse - Corsair Nightsword RGB


I have an 8350 at 4.65GHz and it idles at 20C on average, so the "runs hotter" argument doesn't really matter. Now, I will admit that it does take a lot more power to run, but I love my OC'd 8-core.

 

...idles at 20C on average...what the hell is your room temperature???

Owner of a top of the line 13" MacBook Pro with Retina Display (Dual Boot OS X El Capitan & Win 10):
Core i7-4558U @ 3.2GHz II Intel Iris @ 1200MHz II 1TB Apple/Samsung SSD II 16 GB RAM @ 1600MHz


...idles at 20C on average...what the hell is your room temperature???

like 67 Fahrenheit 


It does, that's the ambient temp of my room and my case has good fans. So of course that's what I idle at.

 

I live in Indiana, so it's absolutely freezing out right now, and I pay the heating bill (I'm not going to crank that shit in the winter), so my room is always at about 20C.

My motherboard is cheap, so that may be part of the issue.


...idles at 20C on average...what the hell is your room temperature???

About 67 Fahrenheit.

CPU: R5 5800X3D Motherboard - MSI X570 Gaming Plus RAM - 32GB Corsair DDR4 GPU - XFX 7900 XTX 4GB Case - NZXT H5 Flow (White) Storage - 2X 4TB Samsung 990 Pro PSU - Corsair RM100E Cooling - Corsair H100i Elite Capellix Keyboard Corsair K70 (Brown Switches)  Mouse - Corsair Nightsword RGB


About 67 Fahrenheit.

AMD is known for inaccurate CPU temperature readings.

But you should be fine.

 

google: amd fx idle temps weird


like 67 Fahrenheit 

 

That is 19.4C... I call BS... :D Maybe it's in the 20s, but definitely not 20C exactly...
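For reference, the conversion being disputed:

```python
# 67 Fahrenheit in Celsius: (F - 32) * 5/9, which lands just under 20C.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(f_to_c(67), 1))  # 19.4
```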

Owner of a top of the line 13" MacBook Pro with Retina Display (Dual Boot OS X El Capitan & Win 10):
Core i7-4558U @ 3.2GHz II Intel Iris @ 1200MHz II 1TB Apple/Samsung SSD II 16 GB RAM @ 1600MHz


This topic is now closed to further replies.

