
fx 8350 bottleneck

abdoo

It is how it works.  I just demonstrated it in this thread.  The only game that showed no difference so far is Arma 3.  Everything else saw framerate increases.  Throwing more GPU at it works. (And yes, that includes minimum framerates.  WoW went from 35 FPS to 51 in the most CPU limited area I've played the game in.)  

 

I've clearly demonstrated that a GTX 980 is not bottlenecked by an FX8core.  It doesn't perform *as well* as it would on Intel, but excluding arma 3 the minimum framerate, average framerate and maximum framerate has increased in every single game I've tested.  

No you haven't.  That is not how it works.  You have demonstrated time and time again that you have no understanding of how it works.  Adding GPU in a CPU bound scenario doesn't increase performance.  You spent that money on NOTHING!  I don't believe half of the results you got because you have been proven to be lying in the past, and continue to do so now.  At least you are finally admitting that the i3 is better than your FX8, so I guess that is progress.

 

To ignore you go.  The sheer amount of idiocy in your posts drives me mad. Stop lying to people.  Just stop.  People are asking for help and you are trying to drag them down with you. 

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


One other thing that will help AMD is the fact that their hardware is in the next-gen consoles. That means most games over the next 5 years won't have more processing threads than AMD's CPUs can handle. So AMD won't really be that far off, and there is already little difference.


No you haven't.  That is not how it works.  You have demonstrated time and time again that you have no understanding of how it works.  Adding GPU in a CPU bound scenario doesn't increase performance.  You spent that money on NOTHING!

 

lolwhut

 

So you are literally saying

if I upgrade my 7870 GHz to a 290X for Skyrim, which is a CPU-bound game, I'm not going to see any performance increase, because I use an FX8350?

 

You are joking, right?

 

Because a good friend of mine, who has an FX8350 + 290X, gets literally twice as much performance in Skyrim as I get with my FX8350 and 7870 GHz lol. :D


Alright man, if you say so... look, I'll not argue with you, I feel it's not worth it. I have had a different experience than you, obviously, and that's alright; as long as you are happy with your FX CPU it's all good... you should have gone with a GTX 970 BTW :P The extra $250 you've spent for the 8% performance increase on the 980 should have been invested toward a proper CPU IMHO.

peace out!

You need to realize the average user just wants their games to run at 60fps; that's why no one cares about or notices the difference between AMD and Intel. Intel is running at higher fps we will never see.

 

The only game that I would say actually works poorly with AMD is Skyrim, and that's because of their coding and ability to make a game, not the CPU's fault.


No you haven't.  That is not how it works.  You have demonstrated time and time again that you have no understanding of how it works.  Adding GPU in a CPU bound scenario doesn't increase performance.  You spent that money on NOTHING!  I don't believe half of the results you got because you have been proven to be lying in the past, and continue to do so now.  At least you are finally admitting that the i3 is better than your FX8, so I guess that is progress.

 

To ignore you go.  The sheer amount of idiocy in your posts drives me mad. Stop lying to people.  Just stop.  People are asking for help and you are trying to drag them down with you. 

Placing people on ignore is not the way to go @Faceman; if you can't read, you can't reply, and you can no longer help others doing that... BTW, I think I've served my time, don't you think?

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


lolwhut

 

So you are literally saying

if I upgrade my 7870 GHz to a 290X for Skyrim, which is a CPU-bound game, I'm not going to see any performance increase, because I use an FX8350?

 

You are joking, right?

 

Because a good friend of mine, who has an FX8350 + 290X, gets literally twice as much performance in Skyrim as I get with my FX8350 and 7870 GHz lol. :D

It depends on how much of a CPU restriction there is.



You need to realize the average user just wants their games to run at 60fps; that's why no one cares about or notices the difference between AMD and Intel. Intel is running at higher fps we will never see.

 

The only game that I would say actually works poorly with AMD is Skyrim, and that's because of their coding and ability to make a game, not the CPU's fault.

I do realise... but what I realise as well is that an FX CPU, even highly overclocked, FAILS to deliver even 60fps in MANY games, even when paired with a high-end GPU, and I've experienced this first hand.



It depends on how much of a CPU restriction there is.

 

Yes, of course, but going with a higher-end GPU will always increase performance,

no matter which CPU you pair with it.

 

Drops in minimums will of course always be there; that isn't going to change,

because if the GPU load drops, that's caused by the CPU.

 

Like I said a few posts back: if a GPU gets significantly more fps on one CPU than on the other,

then there is simply a CPU bottleneck.

 

Whether it's really noticeable or not, that's a different story,

but in some scenarios I would definitely say yes.
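The rule of thumb in this post (a GPU upgrade helps until the CPU becomes the slower stage) can be sketched with a toy frame-time model. All the millisecond figures below are hypothetical, purely for illustration:

```python
# Toy model: with the CPU and GPU working on frames in parallel, the frame
# rate is set by whichever stage takes longer per frame.
def fps(cpu_ms, gpu_ms):
    """Ideal frame rate (frames/second) given per-frame CPU and GPU time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-limited scene: a faster GPU doubles the frame rate.
print(fps(cpu_ms=8.0, gpu_ms=20.0))   # 50.0
print(fps(cpu_ms=8.0, gpu_ms=10.0))   # 100.0

# CPU-limited scene: the same GPU upgrade changes nothing.
print(fps(cpu_ms=25.0, gpu_ms=20.0))  # 40.0
print(fps(cpu_ms=25.0, gpu_ms=10.0))  # 40.0
```

Whether the bottleneck is "really noticeable" then comes down to where those hypothetical cpu_ms and gpu_ms numbers sit relative to your target frame rate.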


Yes, of course, but going with a higher-end GPU will always increase performance,

no matter which CPU you pair with it.

 

Drops in minimums will of course always be there; that isn't going to change,

because if the GPU load drops, that's caused by the CPU.

 

Like I said a few posts back: if a GPU gets significantly more fps on one CPU than on the other,

then there is simply a CPU bottleneck.

 

Whether it's really noticeable or not, that's a different story,

but in some scenarios I would definitely say yes.

It will indeed improve the average framerates and especially the max framerates, but the minimums (CPU bottlenecks) will remain just as bad and noticeable.



It will indeed improve the average framerates and especially the max framerates, but the minimums (CPU bottlenecks) will remain just as bad and noticeable.

 

exactly. :)

 

In some games this will really be noticeable.

You have experienced it yourself with the FX8320 to 4770K upgrade.


No you haven't.  That is not how it works.  You have demonstrated time and time again that you have no understanding of how it works.  Adding GPU in a CPU bound scenario doesn't increase performance.  You spent that money on NOTHING!  I don't believe half of the results you got because you have been proven to be lying in the past, and continue to do so now.  At least you are finally admitting that the i3 is better than your FX8, so I guess that is progress.

 

To ignore you go.  The sheer amount of idiocracy in your posts drives me mad. Stop lying to people.  Just stop.  People are asking for help and you are trying to drag them down with you. 

 

The only CPU bound scenario is Arma 3.

http://i.imgur.com/4KpFagj.jpg

http://i.imgur.com/HFNBiAK.jpg

 

My framerate in both these areas literally doubled.  (these were the areas I took the biggest performance hit, dropping down to 35-45 FPS in each of them.)  I had old screenshots, but they unfortunately got wiped when I changed out my motherboard and reinstalled windows.  If I get really bored one day, I'll reinstall the cards to take some new screenshots.  

4K // R5 3600 // RTX2080Ti


exactly. :)

 

In some games this will really be noticeable.

You have experienced it yourself with the FX8320 to 4770K upgrade.

Indeed... I must admit I'm picky, and I sometimes get down to 50 or 55fps in some games; right there I notice it, I look at the fps meter, and I'm always right... but I think the average Joe notices frame drops around 45fps... heck, I notice drops to 58fps quite often.



The only CPU bound scenario is Arma 3.

http://i.imgur.com/4KpFagj.jpg

http://i.imgur.com/HFNBiAK.jpg

 

My framerate in both these areas literally doubled.  I had old screenshots, but they unfortunately got wiped when I changed out my motherboard and reinstalled windows.  If I get really bored one day, I'll reinstall the cards to take some new screenshots.  

41% GPU load? That's a hell of a bottleneck, my friend...



41% GPU load? That's a hell of a bottleneck, my friend...

(I have a 120 FPS framerate cap with RTSS)

 

I didn't say I wasn't limited.

 

I said that when I changed video cards, my framerate doubled.  I had a much lower framerate on the 270X CF setup in both of those areas.

 

A true bottleneck looks like Arma 3... where performance doesn't change one iota.  The 980 is still pushing significantly more frames, even though it isn't being used to the fullest.

 

EDIT:

 

Here's a pretty straightforward comparison.  Dawn of War 2's benchmark is very CPU bound due to the high number of units and effects on screen. Intel CPUs would run away with the min framerate here.

 

270X CF

http://i.imgur.com/tEt8Rsw.jpg

 

GTX 980

http://i.imgur.com/X6InqNk.jpg

 

Keep in mind, the relative performance difference between these two setups is fairly small.  270X CF is on par with a 970.



 

 

A true bottleneck looks like Arma 3... where performance doesn't change one iota.

Not everything is black and white... this is a hard CPU limitation... you can have bottlenecks of varying degrees, from slight to insane...



I suppose the % in those screens are the fan speeds?

Because double the fps on only 41% load looks a bit strange to me lol


Not everything is black and white... this is a hard CPU limitation... you can have bottlenecks of varying degrees, from slight to insane...

 

Yeah, but then you may as well argue about how 4770Ks are bottlenecked relative to a 5960X.

 

My point is to prove that a high-end GPU delivers high quality framerates on an FX 8-core, and that a more powerful GPU will increase the framerate, including the minimum.  (In the vast majority of titles. Arma 3 is a unique case.)

 

Obviously, if you just want to believe I'm inventing numbers, go ahead.  I have no reason to do so; I'm just tired of people giving bad advice and telling people that they're throwing money away pairing a high-end GPU with an FX.  There's still a tangible benefit.

 

 

I suppose the % in those screens are the fan speeds?

 

Because double the fps on only 41% load looks a bit strange to me lol

Keep in mind that the 270X's were equally limited in utilization.  

 

Really, what I think is happening here is that Crossfire wasn't really taken advantage of at all in WoW. Two cards at 51% = pretty much a single 270X.  

 

 



 

Keep in mind that the 270X's were equally limited in utilization.  

 

Really, what I think is happening here is that Crossfire wasn't really taken advantage of at all in WoW. Two cards at 51% = pretty much a single 270X.  

 

 

 

 

That definitely could be.

How CF or SLI works out depends highly on the game.

Not every game works well with dual-GPU setups.

 

But the bottom line is that you still basically get CPU bottlenecked.

In some scenarios you will not really notice this, because you get significantly better fps overall.

There are also games in which you definitely feel the bottleneck, like you said yourself; Arma 3 is a good example.

 

Anyway,

 

in the end, the only thing that matters is

that you are happy with the performance you get out of the system.


I was surprised by what I saw, as during gameplay I did not notice at all just how wildly the FPS swung, from as low as 32 right up to 104. (I've been using Nvidia Quadro NVS 110M graphics, which was previously the best I'd ever used, and it struggles with games from as far back as 2001, so maybe that's ruined my ability to tell the difference in games.)

The results of playing Crysis 3 on the high preset with a Core 2 and an overclocked GTX 970 were as follows. If I read things right, GPU usage whilst playing Crysis 3 was from 10% to 90%, with the peak usage being in the very first scenes, and the PCIe slot was only 14% utilized (if I am reading it (BUS) correctly):
post-155575-0-22770600-1420357741_thumb.

 

post-155575-0-05517900-1420357762_thumb.

 

So judging by this (going to do this again with an X5450, the equivalent of the best Core 2 Quad Extreme), an FX 8350, with its lower per-thread performance, stands a high chance of bottlenecking a high-end GPU, though with the extra cores it shouldn't be as bad (I would test it out myself as well if I still had access to an FX 8350 rig).

Edit: This was done on the high preset, with the game played from the start until Prophet got his bow. I would have done BF4 as well, but the last update broke the game so I couldn't even get it started (trying to open it made my CPU usage go to 50%, then the BF4 process stopped and closed down).

 

GTX 970 with an FX 8350:

http://www.3dmark.com/fs/2852276

 

My OC rig (it really gets let down by the CPU in the physics tests):

http://www.3dmark.com/3dm/5359552

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


That definitely could be.

How CF or SLI works out depends highly on the game.

Not every game works well with dual-GPU setups.

 

But the bottom line is that you still basically get CPU bottlenecked.

In some scenarios you will not really notice this, because you get significantly better fps overall.

There are also games in which you definitely feel the bottleneck, like you said yourself; Arma 3 is a good example.

 

Anyway,

 

in the end, the only thing that matters is

that you are happy with the performance you get out of the system.

I'm happy because for $15 (the cost of the CPU; the rest was given to me because the original HDD was shot, and the 970 and the WD Blue are from my officially dead main rig, which doesn't even power on any more, and I've checked everything) I have a computer that does very well in games (from my perspective) and which has cost me very little. I know for a fact that with the $100 I've spent on the Xeon and liquid cooling, I've got a computer that beats the shit out of pretty much any pre-built system for less (I saw an Acer desktop with some sort of low-end Nvidia GPU with 1x HDMI and DVI port, an AMD A6 APU and 8GB of RAM for $899).



Yes, of course, but going with a higher-end GPU will always increase performance,

no matter which CPU you pair with it.

 

Not always... if the CPU is consistently delaying frames, more GPU power won't do anything.

I had a Core 2 Quad Q6600 and there was no difference between GTX 460, GTX 560 and GTX 760 with it in Battlefield 3 & 4 multiplayer and World of Warcraft unless I created a very intentional (and unplayable) GPU bottleneck.

Intel i5-4690K @ 3.8GHz || Gigabyte Z97X-SLI || 8GB G.Skill Ripjaws X 1600MHz || Asus GTX 760 2GB @ 1150 / 6400 || 128GB A-Data SX900 + 1TB Toshiba 7200RPM || Corsair RM650 || Fractal 3500W


 

Yeah, but then you may as well argue about how 4770Ks are bottlenecked relative to a 5960X.

 

Yes, but beyond the point of usefulness. In the majority of games a 4770K/4790K isn't going to be a bottleneck until hundreds of frames per second. In many games, from what I've seen, FX CPUs can drag fps as low as the 30s-40s while Intel's i5s/i7s don't drop below 100 (BF4, for instance) :S I don't know the scope of this problem, as I mainly just play Battlefield 3, Battlefield 4, World of Warcraft and Civilization V. Of course any CPU can be a bottleneck at some point... but I think the point many are trying to make is that FX CPUs can fail to provide what many would consider an ideal experience.

 

 

 

My point is to prove that a high end GPU delivers high quality framerates in an FX8core and that a more powerful GPU will increase the framerate, including the minimum.  (In the vast majority of titles. Arma 3 is a unique case.)

 

I would be very interested in seeing frame latency and comparative benchmarks in games... it seems to me that if a game is performing worse on one CPU than another, the CPU (or at least one of its cores) is maxed out and can't keep up with all the calls the GPU is making, so the frame gets delayed until the CPU can provide the required data... I fail to see how adding more GPU power would improve performance at all.

That said, I've seen a few benchmarks myself indicating this, where a 4670K outperformed an FX 8350 significantly, yet a better GPU still performed better than a lower-end one with the FX. I'd be interested to know why this is and to see the frame latency, since I can only imagine it would be somewhat stuttery if the GPU is just completing requests between spikes of CPU load.

 

Perchance, could anyone shed some further light on the relationship between CPU and framerate?



Yes, but beyond the point of usefulness. In the majority of games a 4770K/4790K isn't going to be a bottleneck until hundreds of frames per second. In many games, from what I've seen, FX CPUs can drag fps as low as the 30s-40s while Intel's i5s/i7s don't drop below 100 (BF4, for instance) :S I don't know the scope of this problem, as I mainly just play Battlefield 3, Battlefield 4, World of Warcraft and Civilization V. Of course any CPU can be a bottleneck at some point... but I think the point many are trying to make is that FX CPUs can fail to provide what many would consider an ideal experience.

 

I would be very interested in seeing frame latency and comparative benchmarks in games... it seems to me that if a game is performing worse on one CPU than another, the CPU (or at least one of its cores) is maxed out and can't keep up with all the calls the GPU is making, so the frame gets delayed until the CPU can provide the required data... I fail to see how adding more GPU power would improve performance at all.

That said, I've seen a few benchmarks myself indicating this, where a 4670K outperformed an FX 8350 significantly, yet a better GPU still performed better than a lower-end one with the FX. I'd be interested to know why this is and to see the frame latency, since I can only imagine it would be somewhat stuttery if the GPU is just completing requests between spikes of CPU load.

 

Perchance, could anyone shed some further light on the relationship between CPU and framerate?

If a CPU is having to do a lot, such as calculating physics (not PhysX) and controlling the AI in a game, it starts to become stressed as its load is extremely high (Crysis 3 pushes my Core 2 Duo to 100%, and so does 3DMark Fire Strike). As a result, the GPU has to actually wait for the CPU to send it the data required to display what is being run correctly. In solely GPU-bound cases, however, I had my GTX 970 pushed to the point where the coil whine was as bad as when it was in my i5 rig being pushed to its limits, and as a result I could have used the P4 630 that came with my OC rig and gotten the same results.
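That GPU-waits-for-CPU effect can be sketched with a small toy simulation. All the numbers here (frame times, a 5% chance of a CPU spike from physics/AI bursts) are invented for illustration; the point is that a faster GPU lifts the average frame rate while the spike-bound minimum stays put:

```python
import random

def simulate(cpu_ms, gpu_ms, frames=1000, seed=1):
    """Toy frame loop: each frame takes max(cpu, gpu) ms, because the GPU
    waits for the CPU's data.  5% of frames triple the CPU time (a spike)."""
    rng = random.Random(seed)
    times = []
    for _ in range(frames):
        cpu = cpu_ms * (3.0 if rng.random() < 0.05 else 1.0)
        times.append(max(cpu, gpu_ms))
    avg_fps = 1000.0 * frames / sum(times)
    min_fps = 1000.0 / max(times)
    return avg_fps, min_fps

# Same CPU, faster GPU: the average improves, the worst case does not.
avg_slow_gpu, min_slow_gpu = simulate(cpu_ms=10.0, gpu_ms=16.0)
avg_fast_gpu, min_fast_gpu = simulate(cpu_ms=10.0, gpu_ms=8.0)
```

With the same seed, both runs hit identical CPU spikes, so `min_fast_gpu` equals `min_slow_gpu` even though `avg_fast_gpu` is much higher, which matches the "averages improve, minimums stay bad" observation earlier in the thread.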



Yes, but beyond the point of usefulness. In the majority of games a 4770K/4790K isn't going to be a bottleneck until hundreds of frames per second. In many games, from what I've seen, FX CPUs can drag fps as low as the 30s-40s while Intel's i5s/i7s don't drop below 100 

 

I've seen a lot of benchmarks that show average framerates within 10% of each other and AMD's minimum being 30-40 frames lower.  To me, this means that the minimum framerate is being determined by a monkey who is just taking the bottom framerate recorded, which doesn't really represent anything of note statistically.  A spike down to 30 isn't great, but it's not going to ruin the game either. I usually try to take a representative sample of the bottom frames, like 1/4 or 1/3 depending on the benchmark time, and average it.  The better way would be to actually apply more complicated statistics, but that's my quick and dirty way of doing it.
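That "quick and dirty" averaging of the bottom slice of frames can be written down directly; it is essentially what later became known as reporting "1% lows". The sample numbers below are invented, just to show how one spike distorts a raw minimum:

```python
def low_average(fps_samples, fraction=0.25):
    """Average of the slowest `fraction` of samples: a more representative
    'minimum' than the single worst recorded frame."""
    lows = sorted(fps_samples)
    n = max(1, int(len(lows) * fraction))
    return sum(lows[:n]) / n

# Eleven steady frames plus one spike down to 30.
samples = [60, 61, 59, 62, 60, 58, 30, 60, 61, 59, 60, 62]
print(min(samples))          # 30 -- the raw 'monkey' minimum
print(low_average(samples))  # 49.0 -- bottom-quartile average
```

The single spike drags the raw minimum to 30, while the bottom-quartile average sits at 49, much closer to what the run actually felt like.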

 

My personal experience has been that in most games, particularly AAA, the GPU is still the deciding factor.  And it's not like I haven't tried a wide assortment of games, either...

http://steamcommunity.com/id/Dullahan/games/?tab=all

 

 

I just haven't seen it.  A few games have caused issues, but honestly I can count them on one hand.  Even something that bottlenecks as hard as Dawn of War 2 is still very playable in a normal game. (Only in very extreme late-game scenarios do I experience the drops, when lots of effects etc. are going off.  That is what the built-in benchmark simulates, but it's not really common in normal gameplay.)  

 

I can't speak for Watch Dogs, Ryse or Dead Rising 3 on PC, but pretty much any other game you can think of, I have run on this system extremely well.  



I've seen a lot of benchmarks that show average framerates within 10% of each other and AMD's minimum being 30-40 frames lower.  To me, this means that the minimum framerate is being determined by a monkey who is just taking the bottom framerate recorded, which doesn't really represent anything of note statistically.  A spike down to 30 isn't great, but it's not going to ruin the game either. I usually try to take a representative sample of the bottom frames, like 1/4 or 1/3 depending on the benchmark time, and average it.  The better way would be to actually apply more complicated statistics, but that's my quick and dirty way of doing it.

 

My personal experience has been that in most games, particularly AAA, the GPU is still the deciding factor.  And it's not like I haven't tried a wide assortment of games, either...

http://steamcommunity.com/id/Dullahan/games/?tab=all

 

 

I just haven't seen it.  A few games have caused issues, but honestly I can count them on one hand.  Even something that bottlenecks as hard as Dawn of War 2 is still very playable in a normal game. (Only in very extreme late-game scenarios do I experience the drops, when lots of effects etc. are going off.  That is what the built-in benchmark simulates, but it's not really common in normal gameplay.)  

 

I can't speak for Watch Dogs, Ryse or Dead Rising 3 on PC, but pretty much any other game you can think of, I have run on this system extremely well.  

I've got to admit, I get sick of these AMD 8350 vs Intel i5/i7 debates because they end up pointless. I am personally using an 8350 and a 970 because, for me, the price and performance were right. And they still are! I don't take sides or fanboy over one company, as I will be upgrading to an Intel i7, but for completely different reasons. In terms of gaming, an 8350 or an i5 is absolutely fine.

I understand that Intel chips, with their hyperthreading and much more efficient power consumption, heat output etc., are generally better for gaming, but the FX chips aren't as bad as some benchmarks seem to show. For example, I get a constant 60FPS in MGSV with rare drops to 50-55ish, nothing like the benchmark showing an average of 45. Same with BF4: I get 65-80fps average on ultra, while the benchmark was showing around 50-55, I think? That isn't right.

And then when people show how an Intel chip with the same GPU runs at 100fps in a game and the AMD at 80ish, does it really matter? The average person/gamer probably has a 60Hz monitor, so those extra frames aren't a big deal as long as the game is a steady 60, because in my opinion, if you have the money to buy a 144Hz monitor, then you probably have enough to get an i7 and not have to worry about bottlenecks. Stuff like this seems to be more about fanboyism and people trying to prove their setup, than it is about the gaming experience and how well the games perform.

 

So that's my 2 cents. A lot of the time the performance difference isn't enough for the price difference (and don't forget that prices differ from place to place as well). As long as you get a steady, playable frame rate, who cares? Intel, AMD, they're both good; both have advantages and disadvantages. Get whatever. For the OP: yes, an 8350 will bottleneck a GPU in some games; in some I notice I'm only using around 60% of the GPU but still getting stable, playable fps, and in other games I take advantage of 100% of the GPU and get stable, playable frame rates. But don't forget some games, like GTA IV and WATCH_DOGS, had terrible optimisation; they run badly for people with very high-end rigs as well, from what they've said.

