
FX 6350 Bottleneck GTX 970?

Exactly like the title asks.

 

Will it bottleneck?

If yes can I overclock to a certain point to minimize it?

 

(P.S. I know Intel is better, but all things considered that's another $200-300 upgrade to get an Intel CPU + motherboard... so lay off the Intel elitism...)


It will be fine until you play StarCraft 2, LOL. Intel compiler for the LOSE.


it'll probably bottleneck.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card - Total Biscuit
 


It will bottleneck a fair bit. Unless you're planning on upgrading the CPU in the future, it would be cheaper to buy a GPU that is more evenly matched with that CPU's performance. Check this out (benchmark uses a 780 Ti, but the 970 is almost as fast): http://www.pcgameshardware.de/screenshots/original/2014/09/CPU_Performance_Index_Gaming_PC_GAMES_HARDWARE_09-2014-pcgh.png

Core i7 4820K  |  NH-D14 | Rampage IV Extreme | Asus R9 280X DC2T | 8GB G.Skill TridentX | 120GB Samsung 840 | NZXT H440  |  Be quiet! Dark Power Pro 10 650W


  • 2 weeks later...

Well, I'll OC it some, and see if that helps.

 

I'm not looking to invest 300$ to switch to Intel right now. Perhaps in 5 years.


Well, I'll OC it some, and see if that helps.

 

I'm not looking to invest 300$ to switch to Intel right now. Perhaps in 5 years.

It will a little, but that's because games don't use more than 4 cores, so when DX12 comes out you won't bottleneck.

Please follow your topics guys, it's very important! CoC F.A.Q  Please use the corresponding PC part picker link for your country: USA, UK, Canada, Australia, Spain, Italy, New Zealand and Germany

also if you find anyone with this handle in games its most likely me so say hi

 


It will probably bottleneck a fair amount. You'd be better off getting an 8320 if you don't want to drop a ton switching to the blue team.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


It will probably bottleneck a fair amount. You'd be better off getting an 8320 if you don't want to drop a ton switching to the blue team.

I might, we'll see. 

 

I'm hoping DirectX 12 and AMD's multi-threaded consoles will take advantage of my 4 cores that are just sitting there... ='[


The amount of stupidity in this thread hurts my brain.

 

No, your 6350 will NOT bottleneck a GTX 970 in any situation that you are likely to come across. Feel free to overclock it but you are unlikely to max out any of its cores in-game.

[AMD Athlon 64 Mobile 4000+ Socket 754 | Gigabyte GA-K8NS Pro nForce3 | OCZ 2GB DDR PC3200 | Sapphire HD 3850 512MB AGP | 850 Evo | Seasonic 430W | Win XP/10]

 

 

 


Depends on the resolution you are going to game at.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


The amount of stupidity in this thread hurts my brain.

Why not help rectify it instead of just pointing it out?

AMD FX-6300 @ 4.5ghz (1.332v) | Cooler Master Hyper 212 EVO | Asus M5A97 R2.0 | Kingston HyperX 16GB @ 1600mhz | MSI Radeon R9 290 Twin Frozr


OCZ ModXStream Pro 600w PSU | 256GB Samsung 850 PRO SSD | 1TB Seagate Barracuda 7200.14 | Zalman Z11 Plus


Why not help rectify it instead of just pointing it out?

I will... OP, if you game at 1080p and you play mostly modern games (console ports, FPS, racing, sports etc.), no, it will not bottleneck the card most of the time. If you play a lot of MMOs and RTS games, OR if you play at 720p or lower resolution, THEN the card will be bottlenecked. But that's normal; even my overclocked 4770K bottlenecks my GTX 780 in MMOs and RTS games at 1080p. Definitive answer: GO FOR IT. A better GPU always gives a better gaming experience and will deliver better average FPS, and it will also let you play with higher anti-aliasing, texture quality and other GPU-based effects...



The amount of stupidity in this thread hurts my brain.

 

No, your 6350 will NOT bottleneck a GTX 970 in any situation that you are likely to come across. Feel free to overclock it but you are unlikely to max out any of its cores in-game.

Honestly, I thought it said 4350 for some stupid ass reason.

 

@459pm, disregard what I said. What @i_build_nanosuits said above me is more like what I'm thinking now that I know it's a 6350.



It will probably bottleneck a fair amount. You'd be better off getting an 8320 if you don't want to drop a ton switching to the blue team.

I wouldn't even recommend pairing an 8320 with the 970, the bottleneck is there and it's real lol.

The amount of stupidity in this thread hurts my brain.

No, your 6350 will NOT bottleneck a GTX 970 in any situation that you are likely to come across. Feel free to overclock it but you are unlikely to max out any of its cores in-game.

I really feel like I shouldn't need to argue; it should be common sense at this point that the FX series induces major bottlenecks on high-end graphics cards. But you're proving that it still needs to be argued, so I'll provide a single benchmark on a relatively recent game with a significant chunk of graphical horsepower behind it.

[benchmark image: 56764.png]

The bottleneck is there, it's very real and highly noticeable. To insist that it's not is foolish.

if you have to insist you think for yourself, i'm not going to believe you.


i will...OP if you game at 1080P and you play mostly modern games (console ports, FPS, racing, sports etc.) no it will not bottleneck the card most of the time... -snip-

Pretty much this. Any CPU can "bottleneck" any GPU; it isn't a simple yes/no answer. It all depends on the games you play. If you just play MMOs/RTS and you already have a decent GPU, there is no point in upgrading, but if you play a variety of games then yes, go for it, and upgrade your CPU later if the performance doesn't suit your needs.


I wouldn't even recommend pairing an 8320 with the 970, the bottleneck is there and it's real lol.

I really feel like I shouldn't need to argue; it should be common sense at this point that the FX series induces major bottlenecks on high-end graphics cards. But you're proving that it still needs to be argued, so I'll provide a single benchmark on a relatively recent game with a significant chunk of graphical horsepower behind it.

The bottleneck is there, it's very real and highly noticeable. To insist that it's not is foolish.

You happened to find the "perfect" benchmark that has little to do with CPU performance and everything to do with Crossfire scaling and PCIe bandwidth. The benchmark shown below highlights what performance is like without Crossfire issues. The FX 6350 performs practically the same as the FX 8350 due to a similar clock speed and cache-per-core ratio.

[benchmark image: 54520.png]

Civ 5 is known for not fully utilizing quadcore chips, let alone hexacore or octacore chips.

 

If we pull a more intensive game from the bag, we can clearly see that the FX 6350 holds its own

[benchmark image: CPU_01.png]

Overclocking does return visible performance improvements

[benchmark image: CPU_03.png]

 

ARMA 3 is another title that has historically been favored by Intel CPUs

[benchmark image: CPU_03.png]

Once we get to overclocking the 8350, though, we see performance that is on par with the i7-3960X and i7-4770K.

[benchmark image: CPU_02.png]

 

Let's take Watch Dogs for example, where more threads=better

[benchmark image: CPU-FR.png]

The 8350 ends up ahead of the Ivy Bridge i5 while the i7-3960X takes the cake. The 3.5GHz FX 6300 sits around 36 fps, so we could infer that the 6350 would end up around the 45-50 fps mark with a good overclock.

 

In a second test conducted by TechSpot, the 8350 and 6350 are only a few fps behind Intel's i5s and i7s, which could easily be corrected with an overclock.

[benchmark image: CPU_01.png]

 

 

In conclusion, both the FX 6350 and FX 8350/8370 are great options that deliver 95% of an i7's performance for a fraction of the price. As long as you aren't messing around with Crossfire setups in poorly optimized games or PCIe lane configurations that restrict bandwidth, you will have just as smooth a gaming experience as you would with an i7.



-snip-

If that's the game, alright; just about 100 more benchmarks exist showing the opposite.

[benchmark image: c1ZWhQ9.jpg]

[benchmark image: FX-9590-62.jpg]

[benchmark image: FX-9590-64.jpg]
(notice the i3 performing better in Skyrim)

[benchmark image: FX-9590-66.jpg]

[benchmark image: batman.png]

[benchmark image: civilization.png]
(Notice the GPU bottleneck with the resolution change, which is almost irrelevant but just to show that the benchmarks aren't alike)

[benchmark image: hitman.png]
(A similar GPU bottleneck, what a shock)

[benchmark image: sleepingdogs.png]

[benchmark image: 4rU8fJF.png]

[benchmark image: 06UVKqD.png]

[benchmark image: 500x1000px-LL-1fa7a24a_3350949334973.png]

[benchmark image: bf4_1920n.png]

[benchmark image: sc2_1920n.png]



-snip-

Nearly every benchmark you provided was unrealistic or irrelevant. I'm concerned that the BF3 benchmark is either botched or biased. I was just running BF3 the other day with an average framerate in the 150 range, and that's with an FX-6350 and GTX 465s.

[benchmark image: OWwrabz.png]



You happened to find the "perfect" benchmark that has little to do with CPU performance, but rather crossfire scaling and PCIe bandwidth. The benchmark shown below highlights what performance is like without Crossfire issues. 

Not really; it has nothing to do with CF scaling or PCIe bandwidth (especially not this one). http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/8 is where it's pulled from.

- Looking at the first graph with a single 7970, we already see AMD behind by about 10-15%, and the gap to Intel clearly says AMD was slightly bottlenecking the first card. Was Intel bottlenecking? Can't be said; it's unknown. Once you're already bottlenecking a single card, forget adding a 2nd or you run into a complete bottleneck.

- Looking at the 2nd graph, it's clear that Intel wasn't bottlenecking in the 1st graph. The gap extends to 80% because, like I said, you shouldn't add a 2nd card if you already bottleneck a single card. Was Intel bottlenecking 2-way CF? Unknown.

- Looking at the 3rd graph, Intel clearly bottlenecks the 3rd card. Loads could have been 99%/99% in 2-way CF, and since there's no gain it would most likely be 33/33/33%. It happens: you get 99% on a single card and then 50/50% in SLI, bottlenecking unexpectedly, which is freaking annoying.

AMD loses some performance with 2 cards, and Intel does as well with 3 cards. I've noticed it too: between 80% load on a single card and 40/40% in 2-way SLI, the performance is slightly slower. You just need a lot of CPU horsepower for multiple GPUs in SLI.

 

The FX 6350 practically performs the same as the FX 8350 due to a similar clock speed and a cache/core ratio.

And because the game doesn't need the extra cores.

 

Civ 5 is known for not fully utilizing quadcore chips, let alone hexacore or octacore chips.

Which is true of many games. The only octa-core titles are Metro 2033/Crysis 3.

 

Overclocking does return visible performance improvements

Lower IPC means lower performance per clock. With my 3930K at stock I get 52 FPS at a certain spot; disabling 2 cores & HT and bumping the clock up to 4.9GHz bumps it to 73 fps at 99% GPU load (single card).

 

ARMA 3 is another title that has historically been favored by Intel CPUs

That's what you can say about practically all CPU-bound games, since they're mostly lightly threaded.

 

Although once we get to overclocking the 8350 we see performance that is on par with the i7-3960X and i7-4770K.

You won't. If you haven't realized it yet (what's the name of that source again?), they always show every CPU with the GPU hitting its limit, which tells us nothing when we know our GPU loads are nowhere near 50% in the game they tested. 1080p offline tests with every setting cranked up aren't worth much when in the real world the CPU gets stressed a shitload more. You need to lower the settings/resolution to simulate the CPU bottleneck that you will always have in Arma 2/3, to give a realistic picture of which CPU performs better: http://www.hardwarepal.com/wp-content/uploads/2014/01/ARMA-3-CPU-Benchmark.jpg

Let's say the game has around 3-4 threads: one will be the main thread, the rest will be trash threads. In the real world the main thread overloads heavily and the trash threads wait on it; have 50 trash threads if you like, with all of them waiting on the main thread a 50-core CPU means nothing. Multithreading in games will never be perfect, meaning perfect distribution, high-level parallelism, etc; you'll never see games magically saturating 8 threads to make AMD somehow the best out there. How games are multithreaded at the moment will never get better. That console stuff is BS: 2 cores are reserved for the OS, which Sony confirmed; http://www.eurogamer.net/articles/digitalfoundry-ps3-system-software-memory (look at the picture)

In fact, Crysis 3/Metro 2033, which are Xbox 360 ports, are far better multithreaded than BF4 and that laughable game WD. Metro 2033/Crysis 3 was loading my 3930K to around 70%, which is sick; on a different map it bottlenecks at 15% load, which is why you see i5s outperforming in one benchmark and 8350s outperforming i5s in another.

 

Let's take Watch Dogs for example, where more threads=better

That isn't true. Clocking my 3930K all the way down to 1200MHz and running everything at low settings/resolution, the max load I've ever seen was around 40% (8% per thread, so 5 cores). Basically it averages at 32%; very, very rarely does it ever hit 40%, and it even bottlenecks at 20% load a lot. I made a video earlier: adding 2 physical cores to 4 cores was just a silly 16% gain. You don't get linear scaling by adding more cores.
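The non-linear scaling described above is basically Amdahl's law: the serial part of the frame work caps what extra cores can buy you. A quick sketch, with an entirely hypothetical parallel fraction (the 40% figure is illustrative, not measured from any game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup is capped by the serial part of the work."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical: if only ~40% of a game's frame work runs in parallel,
# going from 4 cores to 6 barely moves the needle.
gain = amdahl_speedup(0.40, 6) / amdahl_speedup(0.40, 4) - 1.0
print(f"4 -> 6 cores: +{gain * 100:.0f}%")  # prints "4 -> 6 cores: +5%"
```

Which lines up with the small gains described above: once the main thread dominates, core count stops mattering.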

[benchmark image: wd_1920n.png]

Check how much performance a 9590 adds: basically almost nothing. Compare that with the difference between the 4790/4790K, which run at significantly lower clocks. Why? Because more IPC also means more performance per clock. The higher the clock speed, the faster the cycles and the more instructions get done, so the more you'll benefit from a higher clock.
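The IPC-versus-clock point above reduces to a rough per-thread model: useful work per second is approximately IPC times clock. A minimal sketch; the IPC numbers here are made-up relative values for illustration, not measured figures for these CPUs:

```python
def single_thread_perf(ipc, clock_ghz):
    # Rough model: instructions retired per second on one thread ~ IPC * clock.
    return ipc * clock_ghz

# Hypothetical relative IPC values, NOT measurements:
fx_9590 = single_thread_perf(ipc=1.0, clock_ghz=4.7)    # high clock, low IPC
i7_4790k = single_thread_perf(ipc=1.6, clock_ghz=4.0)   # lower clock, higher IPC
print(i7_4790k > fx_9590)  # higher IPC wins despite the clock deficit
```

Under this model a big clock bump on a low-IPC chip buys less than a modest clock on a high-IPC chip, which is the pattern the benchmark discussion describes.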

 

The 8350 ends up ahead of the Ivy Bridge i5 while the i7-3960X takes the cake. The 3.5GHz FX 6300 sits around 36 fps, so we could infer that the 6350 would end up around the 45-50 fps mark with a good overclock.

 

4 threads decide 84% of your performance in my test, and you can't improve a single thread's performance with more cores. Think about whether you can get 4 AMD cores equal to 4 Intel cores, and add that 16% if you'd like. Let's analyse WD's process: http://i.imgur.com/zbTAvFs.jpg

7 threads, so you'll never leverage 8 cores. Having 7 threads doesn't mean you'll be taking advantage of 7 cores at all; I could demonstrate it but cba atm (did this in the last quote lol). Taking advantage of extra cores means in this case that you get more performance, or what else are you going to do with your cores anyway?

I'll do the same for BF4, which many people were selling as an "8-thread-optimized game, 8350 yolo best CPU": http://i.imgur.com/GKFYo70.png

A thread is just a stream of instructions, and you can obviously only execute a thread on one core. The program I'm using shows how many threads there are inside the game's process and gives a cycles delta along with how much of the CPU each is using; it's not total cycles, let's say it's cycles per second. In BF4 we only see 6 threads, not the 8 needed to take advantage of 8 cores. A thread can use at most [100/<cores>] of your CPU, because remember a thread can only use 1 core; in my case that's 8.33% of my CPU. A 5960X will never outperform a 5930K. Why do you get a 5% gain going from a 6300 to an 8350? CMT and clock/cache.
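The [100/<cores>] figure above is just one core's share of a whole-machine CPU% readout. A small sketch of the arithmetic (the function name is mine; the 3930K's 12 logical cores give the 8.33% figure quoted above):

```python
def per_thread_cap_percent(logical_cores):
    # One thread runs on at most one core at a time, so in a whole-machine
    # CPU% readout a single thread tops out at one core's share.
    return 100.0 / logical_cores

print(per_thread_cap_percent(12))  # 3930K (6C/12T): ~8.33% per thread

# A game with 6 busy threads on an 8-thread CPU can therefore never push
# the total readout past 6 * (100/8) = 75%, however "optimized" it is.
max_total = 6 * per_thread_cap_percent(8)
print(max_total)  # 75.0
```

This is why seeing N threads in a process listing doesn't imply N cores' worth of load: the cap is per thread, and most of those threads sit idle waiting on the main one.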

 

Nearly every benchmark you provided was unrealistic or irrelevant. I'm concerned that the BF3 benchmark is either botched or biased. I was just running BF3 the other day with an average framerate in the 150 range, and that's with an FX-6350 and GTX 465s.

 

BF3 doesn't take advantage of more than 4 cores, and Intel's IPC scales up to 100%, so it's not biased at all. A friend of mine had an 8350 with a 660 Ti; I was running a 2600K with a 670 back then, both at stock. It wasn't even a challenge keeping my 670 at 99% (it doesn't even dip below that), but on the other hand that guy's GPU loads were completely awful. Any benchmark that shows a difference between 4 & 6 cores in BF3 is useless; I'll quickly prove that wrong by putting the 3930K at stock and then running it with 2 cores & HT disabled at 5GHz: youtube.com/watch?v=pDdqWoj3kF4 & BF3 thread analysis: http://i.imgur.com/1JozPaf.png

See? We have the threads to use more than 4 cores, but we never take advantage of more than 4 cores. Having the threads doesn't translate to more performance from extra cores.

Also, I'm getting 250 fps as well at ultra, but when? You can't manage 120 fps all the time at ultra in BF3; no CPU is up to that job yet. Besides, Strike at Karkand, for example, even on an empty server, is far more CPU intensive than something plain boring like Kharg Island. When you're getting baseraped there it's awful: a shitload of tanks, idiots spamming explosives, etc. I actually love being baseraped as a sniper, headshot headshot headshot, 50/1 all headshots, but it's impossible to maintain 120 FPS. Can't be arsed with ultra anymore, so I just run it at low settings instead; sounds funny, but the visuals are even better. Can't be bothered anymore with 5GHz OCs and shit IPC.

The 5960X? Well, we'll have some idiots in the world who will buy it for the "futureproof 8-core yolo optimized" BS; give them a year or three and their almighty Star Wars CPU will get rekt by a $100 CPU, just like the 990X atm is getting owned by a G3258 that's overclockable to 4.5GHz on a $40 H81M-HDS with the stock cooler, all for $100. You'll always prefer a quad-core over an 8-core with the same multithreaded performance; it's much easier pulling the max performance out of a quad-core than an octa-core.

For AMD the sweet spot is the 6300, anything above is BS; for Intel it's the 4670K, and anything above is BS; for single-threaded games the Pentium G3258, but that one will be discontinued with Broadwell and we won't see an unlocked Pentium again, so again the i5 [Broadwell number].

To answer the OP's question properly: will [put any CPU here] bottleneck a 970? Yes and no. Monitor your GPU loads. If they're around 50% and the game is single-threaded, you'll get a massive boost upgrading to Intel. If they're averaging 80% in all of your games, the load actually scales roughly linearly with your FPS, so you'll get about 20% more FPS with more CPU performance. Don't upgrade your CPU if your GPU is sitting at 90%; that's pointless.
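The monitoring advice above boils down to a simple decision rule. A hedged sketch: the thresholds are the ones given in the post, but the function name and the exact wording of the advice strings are mine:

```python
def upgrade_advice(avg_gpu_load_percent):
    """Rule of thumb from the post: low GPU load means the CPU is the limit."""
    if avg_gpu_load_percent < 50:
        return "CPU-bound: a faster CPU should give a massive boost"
    elif avg_gpu_load_percent < 90:
        return "partly CPU-bound: FPS scales roughly with the remaining GPU headroom"
    else:
        return "GPU-bound: a CPU upgrade is pointless here"

print(upgrade_advice(45))
print(upgrade_advice(95))
```

In practice you'd feed this the average GPU utilization from a monitoring overlay while playing the games you actually care about, since the answer differs per game.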

