
fx 8350 bottleneck

abdoo

@Ialyrn

 

No, I don't believe you, because there is a throng of people on this very forum who have problems running games in a wide variety of genres.  You don't provide any evidence; it's all hearsay with you.  You are absolutely bottlenecking your GTX 780 with an FX4, and the one time you tried to show it wasn't bottlenecking was in Skyrim, running around in Whiterun with absolutely nothing going on.  No GPU-Z showing your GPU loads, no fps counter (that I remember), and no graphics mods, which most everyone uses on their Skyrim.  Not all mods impact CPU performance, but enough do that it makes the FX processors struggle.

 

Saying that you have no problems is not getting you anywhere.  Provide evidence to back up your statements, because I have a lot to back up mine, including personal experience.  I've owned both AMD and Intel; I don't hate either company.  But right now, it makes no sense at all to purchase an FX processor for a gaming rig when a 4th Gen i3 is beating FX processors in the majority of games.

We're back to that again, are you? I showed you what you wanted to see, which was me running around a town in Skyrim. Something you said my FX4300 wouldn't be able to do at all. Skyrim doesn't drop below 60fps with everything in the game maxed out at defaults. So sorry if there weren't graphics mods everywhere, but I am not a graphics whore. Gaming since the 1980s will do that to a person. I care more about the gameplay itself, and the graphics are only a small part of that.



When did I say you wouldn't be able to run around a town in Skyrim?  Let's not forget this was Whiterun with 1 person on the screen at once.  No fps counter (that I remember, maybe there was?).  Also, I don't believe for one instant that you don't experience fps drops.  You're just lying now.  There are countless FX users who complain of fps drops; in fact, one guy said "you just learn to live with it."

 

Why is that an acceptable alternative to paying the same, and getting better gameplay?

 

Stop it.  Monitor your GPU loads.  You are bottlenecking that high-end GPU.
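The "monitor your GPU loads" advice can be made concrete. This is only a toy heuristic, not anything from the thread: if the GPU sits well below ~100% utilization while the frame rate is uncapped, the CPU is usually the limiting part. The function and threshold below are illustrative assumptions; the utilization samples would come from a tool like GPU-Z or MSI Afterburner.

```python
# Toy heuristic: flag a likely CPU bottleneck from GPU utilization samples
# (fractions 0.0-1.0) logged during uncapped gameplay. The 0.90 threshold
# is an arbitrary illustrative choice, not a hard rule.
def likely_cpu_bottleneck(gpu_util_samples, threshold=0.90):
    """True if average GPU load stays well below full utilization,
    suggesting the CPU (not the GPU) is holding the frame rate back."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return avg < threshold
```

So a log averaging ~65% GPU load on an uncapped game would trip the flag, while ~98% would not.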

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


@Faceman Honestly, I'm too old to deal with your shitposting. I have far better things to do with my time than to sit here and argue with an arrogant person who doesn't have a clue.


@Gix7Fifty

Again, those videos show nothing. In the first video, he is using an Intel i7-3770K and then disabling cores and threads to mimic the performance of lower-end hardware.  This does work to an extent, but not when you are trying to tell someone that an FX8 doesn't bottleneck, because, as we all know, Intel cores are much more powerful than FX "cores".  Also, when he drops it down to i3 levels, the 3rd Gen i3s are nowhere near as powerful as the 4th Gen i3s, which are beating out FX8s in the majority of games.

 

 

Look through all of these sources... the i3 is handing it to the FX8s and FX9s in so many games!

Benchmarks:

http://www.hardcorew...-4340-review/2/

http://www.hardwarep...8-games-tested/

http://www.tomshardw...cpu,3929-7.html

http://www.anandtech...w-vishera-95w/3

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/14

https://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fgamegpu.ru%2Ftest-video-cards%2Figry-2014-goda-protiv-protsessorov-test-gpu.html

http://pclab.pl/art57842.html

 

 

"To put it nicely, the FX-8370E is a true middle-of-the-road CPU. Using it only makes sense as long as the graphics card you choose comes from a similar performance segment.

Depending on the game in question, AMD’s new processor has the potential to keep you happy around the AMD Radeon R9 270X/285 or Nvidia GeForce GTX 760 or 660 Ti level.

A higher- or even high-end graphics card doesn’t make sense, as pairing it with AMD's FX-8370E simply limits the card's potential."

 

"This is a huge result – it wasn’t until we used a Haswell core CPU that the R9 280X  was able to deliver consistent frame times and a 60 FPS frame rate in Assassin’s Creed IV. All three AMD CPUs we used – even the FX 8350 – and the Ivy Bridge Core i3 would deliver a sub 60 FPS frame rate, with frame spikes throughout the benchmark run.

In this case, the Core i3 4340 allows the R9 280X GPU to run at maximum potential, just like the Core i5 (and Core i7 would)."

 

"Pop over to the gaming scatter, though, and the picture changes dramatically. There, the FX-8350 is the highest-performance AMD desktop processor to date for gaming, finally toppling the venerable Phenom II X4 980. Yet the FX-8350's gaming performance almost exactly matches that of the Core i3-3225, a $134 Ivy Bridge-based processor. Meanwhile, the Core i5-3470 delivers markedly superior gaming performance for less money than the FX-8350. The FX-8350 isn't exactly bad for video games—its performance was generally acceptable in our tests. But it is relatively weak compared to the competition.

This strange divergence between the two performance pictures isn't just confined to gaming, of course. The FX-8350 is also relatively pokey in image processing applications, in SunSpider, and in the less widely multithreaded portions of our video encoding tests. Many of these scenarios rely on one or several threads, and the FX-8350 suffers compared to recent Intel chips in such cases. Still, the contrast between the FX-8350 and the Sandy/Ivy Bridge chips isn't nearly as acute as it was with the older FX processors. Piledriver's IPC gains and that 4GHz base clock have taken the edge off of our objections.

The other major consideration here is power consumption, and really, the FX-8350 isn't even the same class of product as the Ivy Bridge Core i5 processors on this front. There's a 48W gap between the TDP ratings of the Core i5 parts and the FX-8350, but in our tests, the actual difference at the wall socket between two similarly configured systems under load was over 100W. That gap is large enough to force the potential buyer to think deeply about the class of power supply, case, and CPU cooler he needs for his build. One could definitely get away with less expensive components for a Core i5 system."

 

"The FX-8370E stretches its legs a little in terms of minimum frame rates, particularly in SLI, however it is handily beaten by the i3-4330."

 

"Average frametimes did not do AMD’s processors any justice either. As we already said the game was fluid with i7 and i5’s, and somewhat playable with the i3 processor line. When we switched to FX CPUs not only did we have worse framerate but the gameplay was simply put, laggy."

 

The second video is all conjecture, with zero benchmarks or graphs, and not from a credible reviewer.  Jay is far from a credible source.  While the videos he does on watercooling and building PCs with his daughter are good and cute, he has a lot of videos that are downright wrong on a very simple technical level.  In this video, he is talking over BF4.  This is one of the few games that actually runs well on an FX8 when Mantle is enabled.  When Mantle isn't enabled?  You have to overclock an FX8 to 5.0GHz to match an Intel i5-4440 @ 3.1GHz in BF4 multiplayer paired with an R9 290X.

 

This is what a benchmark looks like, and here are many others just like it showing the Haswell i3s outperforming FX8s in the majority of games.

 

[Benchmark charts omitted: BF4 CPU scaling on Radeon, Batman, Civilization, several GameGPU CPU tests, BioShock and DiRT 3 on an R9 295X2, Arma 3, CS:GO, Crysis 3, Far Cry 3, Far Cry 4, StarCraft, GTA IV, Rome 2, The Witcher 2, Assassin's Creed, and FSX, all at 1920x1080.]

Even Call of Duty: Advanced Warfare, supposedly a very good multi-threaded game, runs better on an i3 than an FX9.

Sorry, I can't believe any of those benchmarks when I have seen videos of these games, and none of them run like that in real gameplay.



Cognitive dissonance much? 

 

So because they don't reflect your expected bias, they're therefore faulty. Without even questioning the integrity of those YouTubers, you flat-out deny results.

Wow dude, just wow. If you had some technically explained reason to discard the results, I'd have no issue with that. But this form of denial is what is ultimately perpetuating this discussion.

 

Screw Logan and his shitty videos. They're poorly conducted; stop using those bogus tests as evidence that all the other benchmarks, where the Intel wins, are fake.



Gameplay > benchmarks. That's how I know these were forged and not real. Benchmarks can be made to look any way you want; it's not that hard, it's pretty damn simple to do. I know from gameplay that the benchmarks are in fact not true.



 

Yes, because you have every way of verifying the YouTube videos aren't manipulated in any way.

Let's just skip over the fact that the benchmarks actually reflect the computational power of both chips and the way DirectX works, and that every benchmark can be rationally explained.

 

Yeah, let's just stick with subjective "lol this 8350 runs the game just fine" videos. You aren't biased at all.



It's pretty simple, dude: there are hundreds of videos of the 8350 CPU on YouTube from people who bought it. Not all of them are fake (some, yes), but going over hundreds of them, they all run better than the benchmarks say.



Yes, it's called post-purchase rationalization. That does not prove anything. You can find an equal number of videos of people running 4690Ks telling you the exact same thing.

You can't conclude anything based on those videos, not without the proper context or caveats in place.



Wtf, so you can't conclude anything from a game running on the hardware you're looking at, with just the MSI Afterburner fps counter on? OK, makes sense. I'll only trust the benchmarks, which always seem to end up wrong in real life.



Yeah, whatever dude. If you want to blindfold yourself to everything beyond uncontrolled, non-verifiable YouTube videos, subjective opinions and AMD echo chambers, be my guest.

Just don't discredit benchmarks as fake or unrepresentative. Sure, they usually look for the harshest moments in the game, so a 55 fps average won't necessarily mean the game won't run at 60fps sometimes.

But that does not mean the relative performance difference in those tests isn't representative of the "real world". Or rather, your "real world" isn't really that; it's a "skewed world".


@OP, are you a professional, competitive gamer? Are you playing at 4K? 1080p 144Hz? A surround display setup? Alien things like that? No???

Then you can go with a new graphics card, be it GTX970 or an R9 290X.

 

Most people tend to make a drama out of this "bottlenecking" issue... It's not like you're gonna be playing at 20FPS, or that the frames will drop every second. It's not gonna happen...

For 1080p, it will be more than enough. Would you really start yelling like a little girl if your FPS dropped from 120 to like 90-80? Of course not, because on a 60Hz monitor you're not gonna notice the difference.

DX12 is coming and next-gen games have multithreaded optimizations, so the CPU will be less of an issue. It will still be slower than the i7s or i5s in games (duh, it's almost 3 years old), but it won't make your gaming experience unpleasant.

 

I really facepalm when someone asks about pairing a high-end GPU with an AMD FX CPU and people recommend switching the platform. I know people who paired GTX 970s with FX 6300s, and although they admit they're not using the full potential of the card, the boost was still significant.

 

The point is: if you have the CPU, then leave it there and use it. If you don't, then AMD is not worth taking into consideration. And this comes from an AMD guy.

MARS_PROJECT V2 --- RYZEN RIG

Spoiler

 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 



If I could go back, I would have gotten an AMD CPU, not Intel, on my build; there's just no reason to have Intel atm when there is so little difference.



Yup. Early nomination for dumbest post of 2015 right here.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."



Buddy, you're exaggerating...

 

 

 


Benchmarks are most often, if not always, meant to recreate the most demanding scenarios for the game or app used, so I think some of them are irrelevant, because if that scenario occurs twice, let's say for a minute or two in six hours of gameplay... it's even more irrelevant. I'm speaking generally, not analyzing the benches posted in this topic.


 



 

How does the scene being demanding negate any of the relative results? I don't understand...

 

 


 



I might just go sell my i7 today, just to post my own comparisons, and get an 8350.


AMD themselves only recommend a 280/280x to be paired with an 8350. Just saying...

Recovering Apple addict

 

ASUS Zephyrus G14 2022

Spoiler

CPU: AMD Ryzen 9 6900HS GPU: AMD r680M / RX 6700S RAM: 16GB DDR5 

 



Do it!

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."



I generally advocate the same thing if the situation calls for it, but do you have a source on this?

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."



I'm sure they won't be biased, as soon as you realise what a mistake you made.

 

Please do it, your rationalization would make for a very funny post to watch.



 

It was a slide shown when the 8370 was released, IIRC.

That same slide also recommended a 295X2 with a 9590, though.



Really?.. I'll give you $100 if you do that... (lol, no I won't)

 


Because benchmarks should be done using different scenes from the game.

To expand and detail a bit.

I, for instance played Crysis 3 with the rig in the signature, at 1440x900. Settings were:

Textures:very high

System spec: high

Antialiasing: MSAA x4

 

The average framerate was by my telling around 45-50(which for me is playable).

During the "Welcome to the Jungle" mission, I encountered some spikes after I picked the Bolt Sniper and in the Sewers before reaching the first wave of CELL units.I got some spikes in the mission where I was shooting from the VTOL(dunno how was it called). Other than that, the game was really smooth.

 

If I were to benchmark the game and was asked to be biased, I would have picked these moments specifically and made a graph from them.

But as I said, if there are 3-4 short moments (1-2 minutes) when this happens... what about the remaining 5-6 hours of gameplay?
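The average-vs-worst-moments argument is easy to make concrete. Here is a small illustrative sketch (my own, not from the thread) that computes both the average fps of a run and its "1% low" (the fps implied by the worst 1% of frametimes), which is how a few short spikes can coexist with a healthy average:

```python
# Compare a run's average fps against its "1% low": the fps implied by
# the worst (longest) 1% of frametimes. A single hitch barely moves the
# average but drags the 1% low down hard.
def fps_stats(frametimes_ms):
    worst_first = sorted(frametimes_ms, reverse=True)
    avg_fps = 1000.0 * len(worst_first) / sum(worst_first)
    n = max(1, len(worst_first) // 100)          # worst 1% of frames
    one_pct_low = 1000.0 / (sum(worst_first[:n]) / n)
    return avg_fps, one_pct_low

# 99 smooth frames (~60fps each) plus one 100 ms hitch:
avg, low = fps_stats([16.7] * 99 + [100.0])      # avg ~57 fps, 1% low = 10 fps
```

So a run that "feels" like 60fps for almost the whole session can still show an ugly minimum, which is exactly what a worst-case benchmark scene is trying to capture.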

 

That's my point: this is how some benchmarks are made, and this is the main reason why most of you start burning when you see the words FX and gaming. Did you personally use an AMD chip for an extended period of time and declare, "Yeah man, this CPU is a piece of crap, you can't play anything with it"?

 

There's also this guy in the previous pages (sorry, I forgot the name) with an 8350 and R9 290... but he has stock cooling... sorry if I seem harsh and stupid, but man, have you lived under a rock? I bet his FX is throttling like hell, and of course the games play like sh*t...

 

I finished Watch Dogs, with its crappy optimization, on a Core 2 Quad at 2.33GHz and an HD 5670, and I didn't complain... that's what I had at the time, that's what I used. The same applies to the OP.

Why should he replace something that he has and that's working (not the best results, but still an enjoyable experience)?


 


Also, this is the image with the CPU+GPU pairings.

[image: "1 - Pricing.png" — AMD's recommended CPU+GPU pairing slide]

 

 


I don't see any 9590 + R9 295x2...


 


[quoting the Crysis 3 story above]

 

This has nothing to do with absolute performance (44-45fps in game X or Y), and everything to do with relative performance (45 vs 75 fps in a set scenario). I don't think you quite understand my point. If the scene is heavy, like the one you described, both the Intel and the AMD have a hard job, meaning the results will be equally representative (that's a very important rule if you're going to bring statistics into this). If the i5 gets 75fps and the FX 45 (arbitrary numbers!), that means the Intel is (75/45 − 1) × 100 ≈ 66.7% faster. Sure, the game might be playable on both CPUs, but it does not negate the i5 clearly being better at the same task on the same GPU. Given the pricing, you get a much smoother experience on the i5...

 

If your GPU can attain 75fps at given settings but your CPU only manages to squeeze out 45, that's bottlenecking. That's why you shouldn't really couple the 8350 with anything higher than you're currently using. However, even if your CPU manages 60 and you use vsync anyway, there is no point in getting a slower CPU in the same price range. And sadly, 99.99% of the time the FX is the slower CPU.
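The bottleneck argument above reduces to a very simple model. This is only an illustrative sketch (the function names are mine, not the poster's): with an uncapped frame rate, delivered fps is roughly capped by the slower of the two parts, and the gap between the two caps is the GPU potential being wasted.

```python
# Minimal model of a CPU/GPU bottleneck: the delivered frame rate is
# roughly capped by whichever component is slower for this workload.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

def relative_advantage(faster_fps, slower_fps):
    # e.g. 75 vs 45 fps -> 75/45 - 1 = ~0.667, i.e. ~66.7% faster
    return faster_fps / slower_fps - 1

def gpu_headroom_wasted(cpu_fps, gpu_fps):
    # fraction of the GPU's potential left unused when CPU-limited
    return max(0.0, (gpu_fps - cpu_fps) / gpu_fps)
```

With the thread's numbers: a 45fps CPU cap on a 75fps-capable GPU delivers 45fps and wastes 40% of the card, which is why pairing a slower CPU with a faster GPU buys nothing.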

