
FX-8320 w/ GTX970 2-way SLI. Will there be bottlenecks?

vern021

-later posts-

You are fucking completely wrong!

I have an FX 8320 with an HD 7770 at 1280x1024 and all I see is fucking fuck bottlenecks!

Although I should probably drink less, but that's for a different thread

 

On other notes:

 

People tend to forget that AMD is dirt cheap. FX 8320 = £105:

http://www.overclockers.co.uk/showproduct.php?prodid=CP-337-AM&groupid=701&catid=1967

FX 9590 = £170, and it was on sale for £140:

http://www.overclockers.co.uk/showproduct.php?prodid=CP-360-AM&groupid=701&catid=1967

 

Motherboard costs are virtually the same either way, as I personally get a decent mobo no matter my CPU choice.

 

I just love how people try to prove to each other whose penises processor is better by basically shouting at each other. Take Andy here: I've been reading his threads, and he compared his Xeon to his FX head to head with valid points, on the newest OS, and the results were similar.

 

Power consumption, I would say, does not matter. Why? Because those processors only hit that TDP at 100% usage. (And here we come back to the argument of what love and TDP really are, but please, let's not.) Whenever you are doing nothing, which is most of the time, they draw very little anyway, and the difference costs a few £ a year.
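For anyone who wants to sanity-check that "a few £ a year" figure, here is a quick back-of-the-envelope sketch. The 15 W idle-draw difference, the 8 hours/day of use, and the £0.14/kWh tariff are all assumptions for illustration, not measured numbers:

```python
def annual_cost_gbp(extra_watts, hours_per_day, price_per_kwh=0.14):
    """Cost of a constant extra power draw over one year, in pounds.

    price_per_kwh defaults to a rough 2014-era UK tariff (assumed).
    """
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Suppose the FX idles ~15 W higher than a comparable Intel chip,
# and the PC is on 8 hours a day:
print(f"~£{annual_cost_gbp(15, 8):.2f} per year")  # ~£6.13
```

So even a pessimistic idle-draw gap works out to single-digit pounds per year, which is the poster's point.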

 

I just love how fanboys call each other fanboys while doing nothing but posting benchmarks from a single source that prove their point.


You are fucking completely wrong! I have an FX 8320 with an HD 7770 at 1280x1024 and all I see is fucking fuck bottlenecks! [...]

 

Bottlenecks? lmao.

 

You clearly don't understand how games work. Some games are coded to go easy on the CPU. In fact, most recent games are not CPU-dependent, and if AMD get their way (Mantle and HSA) the CPU will become irrelevant. See also: DX12.

 

Just because you are only using a certain percentage of the CPU or GPU(s) does not mean you have a bottleneck. It can mean the drivers are not working correctly, the game is poorly coded, or many, many other things.

 

I was in the top ten of Fire Strike on OCUK running my 8320 and 670s in SLI (all nicely overclocked), and my GPU scores were no lower than any i7 rig's. So the rest comes down to software.

 

It's amazing how rubbish goes around the internet. One guy loads up a CPU monitor, states he is only using 60% of his CPU in a certain game, and the sheep all come along and tell him he has a bottleneck.

 

The only bottleneck on a GPU is the bandwidth available to it via the PCIe lanes. And you'd be very, very hard-pushed to saturate that bandwidth completely with any GPU subsystem available these days; PCIe 3.0 makes it even harder.

 

FurMark, as an example, will stress the GPU, and only the GPU, to 100%. No ifs, no ands, no buts. Prime95 will stress *any* CPU to 100%. Again, no ifs, etc. So if you run both together you will be running your CPU and GPU at 100% at the same time. If a game cannot or does not do that, it's down to the game.

 

As I said, about three times now, you need to understand how this stuff works at a low level. PC games are slop, cross-coded from consoles.

Area 51 2014. Intel 5820K @ 4.4GHz. MSI X99. 16GB quad-channel RAM. AMD Fury X. Asus RAIDR. OCZ ARC 480GB SSD. VelociRaptor 600GB. 2TB WD.


Bottlenecks? lmao.

 

You clearly don't understand how games work. 

(my first two lines were meant as a joke about bottlenecks and alcoholism; nothing to be taken seriously :D)


Guys... guys. Don't rant too much; let's keep this discussion as civilized as possible (no negative messages towards other people, unless you're talking about the post). As I posted earlier, it might be better to test it first to see whether it really bottlenecks (once I've bought the 970 and can see the results).

My Current PC Codename: Scrapper

Spoiler

Intel i5-3570 | Some LGA 1155 MOBO | Some Generic DDR3 8GB 1600MHz | PowerColor RX 560 2GB | Recycled HP Case | Crucial MX100 128GB | 1TB WD Blue 7200RPM | Some Generic 500W PSU | Intel Stock Cooler


I think I'll take your advice. I forgot that GK110 is more complicated than GM204; perhaps we won't see massive fps drops with it. I would appreciate it if there were a benchmark test of this. This topic is pretty hot right now (already on the front page). Maybe @LinusTech @Slick @nicklmg might want to take a look at it?

Why not check your CPU load with MSI Afterburner's overlay? If you're looking for 60 fps and you're pushing out stupid numbers like 200 fps at 90% load, then you should be OK.

Try loading up the CPU with antivirus scans etc. till it hits 100% and see what FPS you're running at. Don't use benchmarks: you're trying to bring the actual demand on the CPU to 110%, and there's no way to check that, so you have to guesstimate.


All I can say is that I have a brand new, stock-clocked FX-8350 and it does bottleneck my R9 290 in PlanetSide 2. But I still rarely get below 60 FPS, so...

 

Oh, and I have a pretty underpowered cooler for it (Scythe Big Shuriken), so I believe it never even hits "turbo" mode, or whatever the hell AMD calls it.

 

...

 

However, as always, I'll strongly advise against SLI (or CrossFire), as compatibility is nowhere near 100%.

Plural of PC is PCs, not PC's. Plural of CPU is CPUs, not CPU's. Plural of LED is LEDs, not LED's.

 

You can build computers really well, not real good.


Even with a single 970, an 8320 is still going to bottleneck in some heavily CPU-intensive games. I can show you a video of a GTX 780 being bottlenecked by an 8350 in BF4 if you want. If you upgrade the CPU to Intel, however, it will be okay.

We should all probably just ignore this guy

All I can say is that I have a brand new, stock-clocked FX-8350 and it does bottleneck my R9 290 in PlanetSide 2. [...]

 

How do you know it is bottlenecking? Is it because your FPS is lower than someone using an i7?

 

It's not bottlenecking, dude. It's simply leaning on the CPU, and your CPU is not as good as an i7. To truly bottleneck a GPU you would need to saturate the PCIe lanes and the bandwidth available to them.

 

TBH, even the term "bottlenecking" is flawed and bullshit.

 

Simple equation: a GTX 780, for example, with an i7 4790K will put out higher FPS than, say, a GTX 780 and an 8350 clocked to a comparable speed. However, that's not bottlenecking; it's just because the i7 (note: i7, *not* i5) is a faster CPU than the FX 8 in absolutely everything.

Area 51 2014. Intel 5820K @ 4.4GHz. MSI X99. 16GB quad-channel RAM. AMD Fury X. Asus RAIDR. OCZ ARC 480GB SSD. VelociRaptor 600GB. 2TB WD.


We should all probably just ignore this guy

 

With that profile picture? No kidding  :D

 

How do you know it is bottlenecking? Is it because your FPS is lower than someone using an i7?

 

"A bottleneck is a phenomenon where the performance or capacity of an entire system is limited by a single or limited number of components or resources." 

 

"Bottleneck - something that holds up progress, esp of a manufacturing process"

 

(from a Culture dictionary) "The point at which an industry or economic system has to slow its growth because one or more of its components cannot keep up with demand."

 

..

 

 

Please don't try to change the meaning of words.

(PS: I know because PlanetSide 2 has a feature that tells you which component is the bottleneck, if any.)

Plural of PC is PCs, not PC's. Plural of CPU is CPUs, not CPU's. Plural of LED is LEDs, not LED's.

 

You can build computers really well, not real good.


It depends on the game. In MMOs it most definitely bottlenecks, since they usually only use one core, and, well, we all know AMD's single-core performance isn't the best. But usually you won't get a bottleneck that seriously limits your gaming experience.

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG


So I would like to see anyone here explain to me why the i5 is ever faster than the FX 8, given that it mixes it with the earlier i7s.

Otherwise it's cherry-picking, which is cheating.

You are lying about AMD's performance; that's called fanboyism. There's no such thing as cherry-picking benchmarks when AMD loses everywhere.

 

I said Windows 7 because the likelihood is that you were using Windows 7 (or it was used) when those benchmarks were run. As for how many cores it supports? See my analogy.

 

A flawed analogy is no analogy. FX CPUs are properly supported; Windows 7 just had a scheduler issue. It had to fill a module completely before it could use the next one, which is fixed with the hotfixes: http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/2 (Bulldozer challenge)

 

As for "destroy"? Less than 10% between a CPU that costs around £100 and the one you paid £280+ for. I don't see that as destroying in any shape or form, so you should watch your words in future. If I go harder on the FSB and lower on the multi, and combine that with a higher NB clock, I can get to 810. That's 60 points less than your i7.

I can disable two cores and play with the BCLK (max is 167 MHz); your point gets nowhere. Haswell can play with the ring clock too.

 

So, as I said earlier in the thread, I reiterate: no i5, not even the very latest Haswell refresh, is powerful enough to take on an AMD FX 8. Not when the FX is supported properly.

You keep moaning about applications not supporting the FX properly, but what about applications that support the AVX2 extensions Haswell offers? There's NO way an FX will outperform an i5 when AVX2 is supported:

[image: multithreaded multimedia benchmark]

There you go: a 4770K that's twice as fast as the 8350 in multithreaded work.

 

As for your i3 benchmarks? Put the wheels back on the car. As I already mentioned, those games are either old or shite.

 

Then it would be a BMW: only two wheels spin. Another flawed analogy.

 

 

but in anything else the FX 8 will win hands down. 

No.

This game (Metro: Last Light) loads my 3930K above 60%, and it uses 8 cores.

[benchmark image]

[image: Metro Last Light CPU benchmark]

 


 

That is why AMD are not wasting money on pointless IPC and new CPUs; they are concentrating on getting the ones they already have properly supported.

 

Then you're an idiot, tbh; CPU dominance is all about IPC. It's the reason AMD was better than Intel before Conroe, back when Intel was chasing clock speed and getting outperformed even at 50% higher clocks. Now AMD is making the same mistake with clock speed and core count. Making stronger cores with lower power consumption = win. What do we gain? An 8-core sitting at 250% of a 5 GHz 8350.

[benchmark image]

Wake up. I've explained to you how games are threaded and why gaming is all about IPC (up to a moderate core count), and that will NEVER change. Go google how multithreading/parallel coding works before assuming something stupid, like that you can have 8 main threads all doing the same work at the same time while running independently of each other. The information is readily available; use some logic.
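The point about main threads is essentially Amdahl's law: if part of each frame's work is inherently serial, piling on cores stops helping. A tiny sketch (the 40% serial fraction is an invented illustration, not a measurement of any real engine):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work parallelises."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Hypothetical engine: 60% of frame time parallelises, 40% is a serial
# main thread.  Doubling from 4 to 8 cores barely helps:
for cores in (2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.6, cores), 2), "x")
```

With a 40% serial share, even infinitely many cores cap out at 2.5x, which is why per-core speed still dominates in games.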

 

 

Bottlenecks? lmao.

An 8320 will bottleneck a GTX 970 SLI setup, period.

 

You clearly don't understand how games work. Some games are coded to go easy on the CPU. In fact, most recent games are not CPU-dependent, and if AMD get their way (Mantle and HSA) the CPU will become irrelevant. See also: DX12.

ROFL. Seriously? Lol. You tell him to learn how games work while claiming that some games run only off the CPU. Hahahah, dude, do you even know what CPUs actually do? They prepare frames: telling the GPU what the frames should look like, doing physics, basic things like calculating bullet drop, etc. In the end the GPU gets that information from the CPU and starts rendering. Downclock a GPU to 0 MHz and you won't get a frame in a month.

You're not going to let a CPU render the frames you see on your display; CPUs aren't fast enough for that.

 

Just because you are only using a certain percentage of the CPU or GPU(s) does not mean you have a bottleneck. It can mean the drivers are not working correctly, the game is poorly coded, or many, many other things.

Wrong: anything below 99% GPU load is a CPU bottleneck. Let's use a proper analogy: your CPU is your engine, your GPU is your transmission, and how fast your tires spin is your speed, i.e. FPS.

Whenever the engine/CPU is too slow, it holds back the transmission, resulting in a lower speed. Same thing here: the CPU feeds the GPU with information, and if the GPU doesn't get its information quickly enough, it never hits its limit. The GPU should be the bottleneck of your system, sitting at 99%. Official post by NVIDIA about CPU/GPU bottlenecks: https://forums.geforce.com/default/topic/532913/sli/geforce-sli-technology-an-introductory-guide/

"That last part means we have some responsibility in the scaling equation: making sure the GPUs are the bottleneck in our system."

"Each individual frame is first prepared by the CPU and then handed off to a GPU to be rendered as illustrated."

[image: NVIDIA SLI frame-pipeline illustration]
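That "99% rule" can be written down as a toy check. Nothing here is an NVIDIA API; it is just the rule of thumb applied to utilisation samples you might log from a monitoring overlay, with an arbitrary 97% threshold to allow for sampling noise:

```python
def likely_limiter(gpu_util_samples, gpu_bound_threshold=97.0):
    """Classify a run as GPU-bound or CPU/software-limited.

    gpu_util_samples: GPU utilisation percentages logged during gameplay.
    The threshold is an illustrative cutoff, not a vendor number.
    """
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    if avg >= gpu_bound_threshold:
        return "GPU-bound"
    return "CPU-bound (or otherwise software-limited)"

print(likely_limiter([99, 98, 99, 99]))   # near full load -> GPU-bound
print(likely_limiter([70, 65, 72, 68]))   # GPU starved of work -> CPU-bound
```

In practice you would feed this a log from something like Afterburner rather than hard-coded lists.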

 

How do you know it is bottlenecking? Is it because your FPS is lower than someone using an i7?

Simple equation: a GTX 780, for example, with an i7 4790K will put out higher FPS than, say, a GTX 780 and an 8350 clocked to a comparable speed. However, that's not bottlenecking; it's just because the i7 (note: i7, *not* i5) is a faster CPU than the FX 8 in absolutely everything.

An 8350 with a 780 at 99% load will perform EQUAL to a 5960X with the same 780 at 99%; the difference will be within the margin of error.

[benchmark image]

If the 8350 lags 30% behind, it means the 8350 is bottlenecking.

[benchmark image]

[benchmark image]

 

It's not bottlenecking, dude. It's simply leaning on the CPU, and your CPU is not as good as an i7. To truly bottleneck a GPU you would need to saturate the PCIe lanes and the bandwidth available to them.

 

Again, a GPU performs at its best at FULL load, which is 99%; that point is what we (and NVIDIA, in its official post) call the bottleneck. PCIe bandwidth is a totally different story: mining barely uses VRAM or bus bandwidth, and you'll still have the GPU running at 99% on PCIe 1.1 x4.

 

TBH, even the term "bottlenecking" is flawed and bullshit.

Sorry, but you are the one spreading bullshit and flawed, nonsensical analogies here.

 

I admire your dedication @Faa (bookmarking those posts, nice collection); still, years later, people do not seem to understand what role a CPU plays in games, or that 50% CPU load on all cores with 99% GPU load during a game means it's "multithreaded", and that more IPC or headroom will thus not generate any extra draw calls.

 

Honestly, I think rigour in any sector is no longer what it used to be. No reviewers give any detailed information or elaboration in CPU gaming benchmarks anymore, or test in scenarios that matter; they just throw something at it and draw half-assed conclusions. So if anything, the confusion is perpetuated by modern reviewers who test CPUs at 1080p/1440p maxed settings on a single graphics card. Even if your CPU has 50% headroom, the extra draw calls generated in that situation are nearly zero. Number e and all that...

 

I've run out of steam explaining it to people, personally. Intel is beating AMD, meaning the underpowered, power-hungry behemoths will soon disappear. AMD FX CPUs are poor products and should not be in anyone's gaming PC.

 

To answer the OP's question: yes, it will. It'll bottleneck a single card as well in most console ports.


What's the real answer guys? Can you give some supporting reasons from a reliable source?

 

The problem is that there ISN'T an answer. It just depends.

 

It'll bottleneck you from getting above 60 fps in Star Citizen, because that's all your CPU can do. It won't bottleneck you in FC3, because there the CPU can prepare more frames than your GPUs can render. Does that make more sense?

 

Will it bottleneck you from getting 60 frames in games? Probably not. Keep in mind that Low Textures and Ultra Textures use almost the same amount of CPU.
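The "it just depends" answer has a simple model behind it: the frame rate you see is whichever stage of the pipeline is slower, the CPU preparing frames or the GPUs rendering them. A sketch with made-up per-game numbers (not measurements):

```python
def observed_fps(cpu_prep_fps, gpu_render_fps):
    """A pipeline runs at the pace of its slowest stage."""
    return min(cpu_prep_fps, gpu_render_fps)

# Invented figures for one CPU paired with a 970 SLI setup:
games = {
    "CPU-heavy MMO":     (55, 140),   # CPU can only prepare 55 fps
    "GPU-heavy shooter": (120, 90),   # GPUs cap out first
}
for name, (cpu_fps, gpu_fps) in games.items():
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{name}: {observed_fps(cpu_fps, gpu_fps)} fps ({limiter}-limited)")
```

Adding a second GPU only raises the second number, which is why SLI exposes CPU limits that a single card hides.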


 

Spoiler

My ITX:

240 Air ; Z87I-Deluxe ; 4770K ; H100i ; G1 GTX 980TI ; Vengeance Pro 2400MHz (2x8GB) ; 3x 840 EVO (250GB) ; 2x WD Red Pro (4TB) ; RM650 ; 3x Dell U2414H ; G710+ ; G700s ; O2 + ODAC + Q701 ; Yamaha HTR-3066 + 5.1 Pioneer.

 

Things I Need To Get Off My Shelf:

250D ; 380T ; 800D ; C70 ; i7 920 ; i5 4670K ; Maximus Hero VI ; G.Skill 2133MHz (4x4GB) ; Crucial 2133MHz (2x4GB) ; Patriot 1600MHz (4x4GB) ; HX750 ; CX650M ; 2x WD Red (3TB) ; 5x 840 EVO (250GB) ; H60H100iH100i ; H100i ; VS247H-P ; K70 Reds ; K70 Blues ; K70 RGB Browns ; HD650.



Ohhh man... ohhh maaan.

 

Brutal. 

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


@ALXAndy @Faa Guys!!! Calm the fuck down. Seriously, can you guys give your opinions without the negative messages? Not that I hate what you're doing, but when I look at your quotes or multiquotes it looks like you guys are about to throw hands. :mellow: :( :unsure: No offense, Andy, but Faa's statements are pretty convincing.

My Current PC Codename: Scrapper

Spoiler

Intel i5-3570 | Some LGA 1155 MOBO | Some Generic DDR3 8GB 1600MHz | PowerColor RX 560 2GB | Recycled HP Case | Crucial MX100 128GB | 1TB WD Blue 7200RPM | Some Generic 500W PSU | Intel Stock Cooler


@ALXAndy @Faa Guys!!! Calm the fuck down. Seriously, can you guys give your opinions without the negative messages? Not that I hate what you're doing, but when I look at your quotes or multiquotes it looks like you guys are about to throw hands. :mellow: :( :unsure: No offense, Andy, but Faa's statements are pretty convincing.

True points can be convincing. In many scenarios an 8320 will bottleneck a GTX 970 SLI.

In a perfect world where everything used 50,000 cores we wouldn't have that problem, but we do, because it's a programming limitation.


[benchmark image]

 

 

The 8350 actually does worse with CrossFire, lol...

 

I genuinely regret my decision of picking up the 8350 more and more every post you make about it, haha!

[benchmark image]

 

if you have to insist you think for yourself, i'm not going to believe you.


@ALXAndy @Faa Guys!!! Calm the fuck down. Seriously, can you guys give your opinions without the negative messages? Not that I hate what you're doing, but when I look at your quotes or multiquotes it looks like you guys are about to throw hands. :mellow: :( :unsure: No offense, Andy, but Faa's statements are pretty convincing.

 

Ahm, no they can't, that's obvious lol :D

 

The problem with fans, like I said earlier, is that they misuse the term "bottlenecking" too much in their own favour.

They all yell and scream that the FX-8320 will bottleneck a single GTX 970 like hell (sure, there will always be a bottleneck at a certain point).

But honestly, it's all based on speculation,

because nobody has actually tested it yet.

Also, every system is different,

and of course games differ too, so it depends on what you play.

 

That the unlocked Haswell i5s/i7s are better for gaming, sure, there is nothing to argue about, because of their better single-threaded performance, on which some games, like MMOs, still rely a lot.

But people seem to forget that the GM204 is NOT a high-end chip.

They just perform better because of internal improvements: some sort of magical compression thingy, fast stock speeds, and MFAA (a "super sampler"). But all this magic happens inside the card.

The GM204 chip itself has a lot fewer CUDA cores, so it will basically be lighter for any CPU to work with.

In the end there will always be a CPU bottleneck at a certain point, no matter which CPU you throw at it.

People just need to stop misusing the term "bottlenecking" based on speculation.

Bottlenecks only become part of the discussion when you really start to notice them.


@ALXAndy @Faa Guys!!! Calm the fuck down. Seriously, can you guys give your opinions without the negative messages? Not that I hate what you're doing, but when I look at your quotes or multiquotes it looks like you guys are about to throw hands. :mellow: :( :unsure: No offense, Andy, but Faa's statements are pretty convincing.

 

It's been said a million times. Honestly, this is like people denying climate change; the ignorance surrounding this subject is really annoying. And it's sometimes hard to distinguish the ignorance from the trolls in these topics. Some people just create posts/topics to get people pissed off: make an "8350 or 4670K?" topic and you're guaranteed a 20-page thread, like this one. And don't forget that this has been "debated" for years before judging someone for getting a little cranky.

 

@Sintezza

 

What speculation? Did you not read Faa's post? What does it matter whether it's a GK110 or a GM204? That doesn't suddenly change the ballgame. I agree that the term "bottlenecking" is being misused, but that's not the main issue causing so much cognitive dissonance. It's people thinking an 8320 will generate roughly the same draw calls on modern graphics cards as an Intel i5/i7, within a certain margin of error, and using benchmarks made by absolute tools (like Tek Syndicate) to prove some bogus argument.

 

There is no point to brand loyalty; companies couldn't give two shits about you or me. We just have to make sure we're making the proper recommendations. And right now, except for some obscure mini-ITX APU build, Intel is the best option for just about any setup or budget: render machines are better off with Xeons or socket 2011, gaming rigs with an i5/i7, budget builds with a Celeron-K/i3.


You do not need an i7 at all to run 2-way GTX 970 SLI.

Yeah, an i5 is fine for most games.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


@ALXAndy @Faa Guys!!! Calm the fuck down. Seriously, can you guys give your opinions without the negative messages? Not that I hate what you're doing, but when I look at your quotes or multiquotes it looks like you guys are about to throw hands. :mellow: :( :unsure: No offense, Andy, but Faa's statements are pretty convincing.

 

Opinions? Well, I don't like working with opinions when advising people, especially when it's all about performance. You are about to spend $700; do yourself a favour and let me help you.

 

 

Currently playing Battlefield 3, AC: Black Flag, Far Cry 3, Dota 2 (sometimes), LoL (sometimes), HoN, StarCraft II, Skyrim, CoD MW/MW2.

Now I have a better view. 

- BF3: You won't take enough advantage of two 970s at 1080p to justify that purchase, and surely not with an 8320. An 8350 will bottleneck a single 970 in BF3; a decently overclocked 4670K will easily keep one at 99%, but in SLI it won't max them out completely, because no CPU at the moment has that much single-threaded performance. Maintaining 120 fps in BF3 has been a pretty hopeless adventure for me; I've been at 5 GHz and was still bottlenecking an SLI setup. It depends on many things as well: how many people are close to you, what they are doing (spamming explosives?), which map you're playing, player count, etc. If you scope out you'll get the minimum frames, and a few milliseconds later you'll get more fps, and so on.

BF3 only takes advantage of 4 cores, that's all -> youtube.com/watch?v=pDdqWoj3kF4

4 Intel cores are far more powerful than 4 AMD cores, like twice as fast, as illustrated here:

[benchmark image]

- AC: Black Flag: from what I've heard it's somewhat CPU-bound. Found a video (youtube.com/watch?v=xsCaSBRVMNg) of a 4930K at 4.5 GHz bottlenecking in a game that only needs 2-3 cores; I wouldn't be surprised if you saw no difference between one and two cards. Again, you need a better CPU here.

- Skyrim: Um, upgrade to a 4670K, and a single 970 should be plenty to provide 100-120 fps there.

- CoD MW2: Wouldn't be surprised if it's CPU-bound, but I guess it just runs at 500 fps on an Intel setup and perhaps 300 fps on AMD. No point in an Intel CPU here unless the game is somehow much more CPU-intensive than I expect.

- HoN, LoL: Never played them, but they are CPU-bound; the games just aren't intensive.

- Far Cry 3: Well, there wouldn't be much difference between AMD and Intel with a single card, but in this video, again, a 3770K is bottlenecking two 780s: youtube.com/watch?v=1XjIumF9EZs

I also heard from a guy here that his 8320 was bottlenecking his 780 in FC3. So you need a new CPU here as well if you want to take full advantage of the cards.

If you can raise your budget, grab yourself a 4670K as well. Keep in mind that you're never guaranteed Intel won't bottleneck: Intel CPUs bottleneck too, just a lot less than AMD. There are plenty of scenarios where both bottleneck, and scenarios where AMD bottlenecks a huge amount while Intel has the GPU sitting at max load. Or AMD sits 20% behind because the scene wasn't CPU-bound enough to show the full difference, but adding a second card makes the gap bigger, like in the Civ V benchmarks I showed above. If you're GPU-bound on AMD/Intel and want to simulate a CPU-bound scenario, add more GPUs (so they stop hitting 99%) or just take the resolution down to 720p.

 

The 8350 actually does worse with the crossfire, lol...

 

I genuinely regret my decision of picking up the 8350 more and more every post you make about it, haha!

But you won't regret switching to Intel :P You're at least one of the few 8350 owners I have respect for, being honest about its performance rather than trying to turn the world upside down. Honestly, people like you deserve better than misinformative, meaningless reviews or fanboyed advice; I hate seeing people fall victim to it. Regretting a purchase isn't the end of the world. If I listed the stupid upgrades I've done and how much money I've wasted, it would be worse than buying a 9590/Crosshair V Formula/custom loop instead of a 4670K/$70 Z97 board/Evo 212. I'll get the 5960X whenever I can get my hands on the Asus X99-E WS, but you'll see me laughing HW-E completely out of the room when people want it for a gaming rig, and you won't ever see me coming up with sales talk like "games in the future will use all 16 threads" and "the CPU is future-proof".

 

 

I admire your dedication @Faa (bookmarking those posts, nice collection), still years later people do not seem to understand what role a CPU plays in games. [...]

Well, many people I've seen here just straight-away deny any benchmark that shows Intel doing better. They post their misinformation in every new thread after they've been proven wrong, and a few weeks later they upgrade to Intel >.> It's mostly the hardcore AMD fanboys that are like this, and the quality of current reviews has hardly ever made them realize FX CPUs have awful single-threaded performance. Reviewers hardly understand the CPU/GPU logic and have no idea that a game can become heavily CPU-bound in multiplayer mode. WoW benchmarks are a good example: I've seen one from AnandTech showing no difference between an 8350 and a 2500K, both at 225 fps, which just made no sense. Benchmarking a flight test with the resolution and settings cranked up on a cheap GPU... great. Borderlands 2 benchmarks: I've seen AMD and Intel neck and neck every time, when I'm struggling to max out one of my 780s. It goes down to ridiculously low loads, like 50%, as far as I can remember. Thanks for telling me the game is GPU-bound >.>

CPU reviews have been mostly flawed. Test all games at 720p or with multiple GPUs and you'll see how awful FX price/performance is, not to forget its power consumption, which doesn't even allow proper overclocks on cheap boards/coolers.


 

@Sintezza

 

What speculation? Did you not read Faa's post? What does it matter whether it's a GK110 or a GM204? That doesn't suddenly change the ballgame.

 

This really does matter a whole lot.

The GM204 is a midrange chip; it only performs better than the 780 because of the compression technology, the super-sampler, and somewhat bumped-up clock speeds.

But most of the magic happens inside the card.

The GM204 itself will be lighter for a CPU to work with.

The GM204 does not perform better because it's more powerful; it's just the compression and sampler magic.

I think that could make some sense.


This really does matter a whole lot. The GM204 is a midrange chip; it only performs better than the 780 because of the compression technology, the super-sampler, and somewhat bumped-up clock speeds. [...] But the chip itself is not more powerful than the GK110.

It doesn't matter, when an 8350 bottlenecks even a single GTX 660 Ti...

| CPU: i7 3770k | MOTHERBOARD: MSI Z77A-G45 Gaming | GPU: GTX 770 | RAM: 16GB G.Skill Trident X | PSU: XFX PRO 1050w | STORAGE: SSD 120GB PQI +  6TB HDD | COOLER: Thermaltake: Water 2.0 | CASE: Cooler Master: HAF 912 Plus |


It doesn't matter, when an 8350 bottlenecks even a single GTX 660 Ti...

 

Now you are talking big BS, dude, lol :).

 

But yeah, unlocked Haswell i5s/i7s are better, sure.


Now you are talking big BS, dude, lol :).

But yeah, unlocked Haswell i5s/i7s are better, sure.

I already posted proof; it's on the 4th page :)

Not even unlocked: even a locked i3/i5 is better than the 8350.

| CPU: i7 3770k | MOTHERBOARD: MSI Z77A-G45 Gaming | GPU: GTX 770 | RAM: 16GB G.Skill Trident X | PSU: XFX PRO 1050w | STORAGE: SSD 120GB PQI +  6TB HDD | COOLER: Thermaltake: Water 2.0 | CASE: Cooler Master: HAF 912 Plus |


This topic is now closed to further replies.