
Is it time to stop discouraging use of the FX-8350?

Look, there is no denying that AMD's drivers have a high CPU overhead.

Driver overhead doesn't explain GameWorks being a mess.



Driver overhead doesn't explain GameWorks being a mess.

Lower driver overhead = more possible draw calls, so AMD's high driver overhead is limiting their cards. Though to be fair, they haven't released a driver for this game yet, and they still haven't gotten their drivers to the point where their graphics cards actually reach their potential.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Lower driver overhead = more possible draw calls, so AMD's high driver overhead is limiting their cards. Though to be fair, they haven't released a driver for this game yet, and they still haven't gotten their drivers to the point where their graphics cards actually reach their potential.

Crimson is due in two weeks; a 15.12 beta will probably come out in a few days for Fallout 4.



Look, there is no denying that AMD's drivers have a high CPU overhead.

There is also no denying that AMD doesn't have an optimized driver for this game. Not even a beta one.

You are comparing bare-metal performance against fully optimized drivers, in a game made for the competition in the first place.

As for draw calls, AMD's drivers put out around 900k draw calls per second while Nvidia hits around 1.3 million according to some tests, and just 750k according to others. This is under DX11.

In DX12, Nvidia hits around the same number of draw calls as AMD does.

Under Mantle, AMD hits 12.4 million draw calls, and under DX12 they hit 13 million plus.

http://www.pcworld.com/article/2900814/tested-directx-12s-potential-performance-leap-is-insane.html

So it is more an API limitation than a pure driver issue.
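For scale, here is a quick back-of-the-envelope sketch in Python that turns those per-second figures into a per-frame budget. The rates are the synthetic API-overhead numbers quoted above, and the 60 FPS target is my own assumption, so treat the output as rough illustration rather than measurement:

# Synthetic API-overhead results (draw calls per second) quoted above.
rates = {
    "AMD DX11": 900_000,
    "Nvidia DX11": 1_300_000,
    "AMD Mantle": 12_400_000,
    "AMD DX12": 13_000_000,
}

fps = 60
for api, calls_per_sec in rates.items():
    per_frame = calls_per_sec / fps    # draw calls available per ~16.7 ms frame
    cost_us = 1e6 / calls_per_sec      # implied CPU cost per call, in microseconds
    print(f"{api}: ~{per_frame:,.0f} calls/frame, ~{cost_us:.2f} us per call")

At 900k calls per second the driver burns roughly 1.1 microseconds of CPU time per call and you get about 15,000 calls per 60 FPS frame; under Mantle or DX12 the implied per-call cost drops by more than an order of magnitude.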

Just for fun, I challenge you: if you still have your 970, roll back to an Nvidia driver from around a year ago (when Nvidia did their last general performance bump for DX11, increasing their maximum draw calls further), then try to run Fallout 4 without any real Fallout driver.

Let's see if it runs THAT much better without proper drivers.

Oh, and it is STILL a Bethesda game with failworks. Whatever bugs exist in AMD's basic drivers come on top of the broken mess Bethesda served you in the first place.


Just for fun, I challenge you: if you still have your 970, roll back to an Nvidia driver from around a year ago, then try to run Fallout 4 without any real Fallout driver. Let's see if it runs THAT much better without proper drivers.

I'm using my GTX 650 Ti (glad I bought the 2GB version; I'm frequently going over 1GB of VRAM usage at 1080p in WoT). I'm in the process of putting my GTX 970 up for sale, and I can't even use my 4790K or i5 4440 until my PS/2-to-USB adapter arrives at the end of this month.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Let's keep this on the CPU side of things and not drift from the original topic.


Their GPU scores also make no sense, since their Fallout 4 results show the usual spread you'd expect between Nvidia and AMD, when that's definitely not the case here.

AMD does really poorly yet again due to their driver CPU overhead, and GameWorks titles issue too many draw calls for those drivers, causing massive CPU bottlenecking. Check this out:

https://www.youtube.com/watch?v=imcj_BxGqD4

That's not the problem with Fallout 4. I've got 30% CPU utilization, sometimes around 50%, with 100% GPU utilization.

 



With a 4.2GHz Haswell i7, yes.

Even then I get anywhere from 45 to 144 FPS in larger areas.

AMD needs to put out a driver for it.

 



GameWorks is fine; AMD's drivers can't handle the draw calls and are just shit at tessellation.

How is that Nvidia's fault? -_-

Actually, too many draw calls aren't good either, as they have an adverse effect: the CPU either spams workloads that are too small (lots of GPU idle time in between) or spends too much time issuing draw calls altogether.
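The usual mitigation for that is batching: submitting objects that share state together, so the CPU issues one call for many objects instead of one call each. A toy sketch in Python with a made-up object list (no real engine API involved):

from collections import defaultdict

# Hypothetical scene objects; a real engine tracks far more state per draw.
objects = [
    {"mesh": "rock_a", "material": "stone"},
    {"mesh": "rock_b", "material": "stone"},
    {"mesh": "tree_a", "material": "bark"},
    {"mesh": "rock_c", "material": "stone"},
]

# Group by material so one draw call covers every object sharing it.
batches = defaultdict(list)
for obj in objects:
    batches[obj["material"]].append(obj["mesh"])

for material, meshes in batches.items():
    # Four naive draw calls collapse into two batched ones.
    print(f"draw({material}, instances={len(meshes)})")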

 

The R9 380, which is based on the R9 285, has much better tessellation than the R9 390 and other GCN 1.0 or GCN 1.1 based cards, notably so. This is why, percentage-wise, an R9 285/R9 380 takes less of an FPS hit than, say, an R9 290 does when you spam tessellation. Note that the drop is percentage-wise; the R9 285 won't apply magic and run faster than a 290.

 

 

As for AMD's FX series: yes, their low IPC will hinder them, but their high clock speeds should counter it enough to get the number of draw calls needed to AT LEAST hit 60 FPS. However, I think what @ and I have noted before applies: disabling FX "cores". I think FX is sabotaging itself with the shared cache system. Disabling three cores (turning the FX 6300 into a three-core CPU) might actually improve single-core performance, since the integer units and fetch hardware won't idle so much between cores calling for more work, and will instead jam one core with more work.
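On the IPC-versus-clock point, single-thread speed scales roughly with IPC times frequency. Here is a quick illustrative calculation in Python; the IPC ratios are assumptions made up for the sake of argument (Piledriver normalized to 1.0, Haswell guessed at 1.7x), not measurements:

# Crude first-order model: single-thread speed ~ IPC x clock.
cpus = {
    "FX-8350 stock (4.0 GHz)": (1.0, 4.0),  # assumed IPC baseline
    "FX-8350 OC (5.0 GHz)": (1.0, 5.0),
    "Haswell i5 (3.3 GHz)": (1.7, 3.3),     # assumed IPC ratio
}

base_ipc, base_clock = cpus["FX-8350 stock (4.0 GHz)"]
base = base_ipc * base_clock
for name, (ipc, ghz) in cpus.items():
    print(f"{name}: {ipc * ghz / base:.2f}x relative single-thread speed")

Under that assumed ratio, even a 5 GHz FX (1.25x) still trails a stock Haswell i5 (1.40x) on one thread, which lines up with the comparisons later in this thread.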


I'm using my GTX 650 Ti... I can't even use my 4790K or i5 4440 until my PS/2-to-USB adapter arrives at the end of this month.

Well, that will do too.

Take your 650 Ti and roll its driver back to pre-Fallout drivers (say, sometime in August; there shouldn't have been any optimizations for Fallout or the GameWorks systems used in Fallout back then).

Then tell me: how much of a dive did your FPS take?


It won't matter which CPU you use. If the game is broken (it's a Bethesda title; nothing works as it should, at least not at launch), the game is simply broken.

 

If the game is draw-call/CPU bound, having a faster CPU certainly helps.


If the game is draw-call/CPU bound, having a faster CPU certainly helps.

Up to a point, unless the game isn't written properly and just spams draw calls. That is a thing, and it reduces FPS for everyone.

And that might be the case here, because although the game is pretty darn multi-threaded, the gaps between the Intel CPUs (i3, i5 and i7) are quite noticeable, something that hasn't been THAT obvious in other games that are well multi-threaded.

Time will tell, as the game itself gets patched and bugs get fixed with mods.


 

As for AMD's FX series... disabling three cores (turning the FX 6300 into a three-core CPU) might actually improve single-core performance.

That's pretty much what I said. There is a reason no other CPU shares resources within a core (a module, in CMT's case).

 

Take your 650 Ti and roll its driver back to pre-Fallout drivers... Then tell me: how much of a dive did your FPS take?

I've got the original driver disc, and it's from 2013. I don't have Fallout 4 (I played a bit of New Vegas; it was kind of meh), and the only game I have that was made after 2013 is Titanfall. I could test the FPS difference with that (I'm using my Xeon X5450 at the moment, and I still have to nail down why Windows Update keeps turning itself back on, consuming over 4GB of RAM, and pegging the CPU at 100%, before I can play any games).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


That's pretty much what I said. There is a reason no other CPU shares resources within a core (a module, in CMT's case).

 

I've got the original driver disc... I still have to nail down why Windows Update keeps turning itself back on, consuming over 4GB of RAM, and pegging the CPU at 100%.

Type msconfig into the Run box in the Start menu, go to the Services tab, and untick the Windows Update service so it doesn't even launch.
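(If you'd rather script it from an elevated command prompt, sc config wuauserv start= disabled followed by net stop wuauserv should do the same thing; wuauserv is the Windows Update service name. I'm assuming that service is the culprit here, so verify the name in services.msc first.)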


Overclock the i5 to 4GHz and it'll smash the FX 8350. Single-threaded, a 5.3GHz FX 8320 or 8350 can't beat my 3.1GHz i5 4440 at stock. And my i5 consumes less than 40W of power (my i7 is close to its TDP in power consumption), as opposed to the 220W+ needed to get an FX 8350 to 5GHz and beyond.

OP's point is that new games aren't single-threaded anymore, so your argument is kind of...


OP's point is that new games aren't single-threaded anymore, so your argument is kind of...

Lol, just because a CPU has more threads doesn't mean one with fewer can't match or come close at a lower clock speed. More threads doesn't automatically mean a CPU will be fine in modern games. It takes the FX 9590 to beat an i5 in games at the moment.

 

 

Type msconfig into the Run box in the Start menu, go to the Services tab, and untick the Windows Update service so it doesn't even launch.

Done. I'll install Titanfall now and run the comparison. (I only reinstalled Windows on this machine last night after my 80GB HDD started corrupting files; it's a 60GB boot SSD plus a 250GB storage drive now.)

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Did you even read the OP? The 8350 is starting to top i5s in many modern games: 91 > 87.

Lol, just because a CPU has more threads doesn't mean one with fewer can't match or come close at a lower clock speed. More threads doesn't automatically mean a CPU will be fine in modern games. It takes the FX 9590 to beat an i5 in games at the moment.

[benchmark chart from gamegpu.ru]


Did you even read the OP? The 8350 is starting to top i5s in many modern games: 91 > 87.

[benchmark chart from gamegpu.ru]

And the OP realised that that website is unreliable due to some of the results. Look back and you'll find the post.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


And the OP realised that that website is unreliable due to some of the results. Look back and you'll find the post.

Mhm.

However, individual benchmarks from around the web do confirm that FX is doing better in modern titles now than it was before.

While that's not 100% true for Fallout 4, it IS improving, which is good for all those poor souls who haven't made the jump or cannot afford a locked i5.

But in light of this: if the FX 6300 can consistently overtake the i3s for less money, it will be a good purchase (and if disabling two or three cores gives it even more performance, it may be an excellent buy).


If the FX 6300 can consistently overtake the i3s for less money, it will be a good purchase (and if disabling two or three cores gives it even more performance, it may be an excellent buy).

With the FX 6300, disabling three of the ALUs effectively gives you a Phenom II X3 with a lot more cache, and those do struggle a bit. The FX 8320/8350 with half of the ALUs disabled would end up being a lot better.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


With the FX 6300, disabling three of the ALUs effectively gives you a Phenom II X3 with a lot more cache, and those do struggle a bit. The FX 8320/8350 with half of the ALUs disabled would end up being a lot better.

But again, those cost more. The 8320 in particular costs the same as an i3 6100, and the 6100 also gets faster RAM as standard. Thus the roughly 80% higher single-core performance plus the faster RAM will effectively negate even an overclocked four-"core" 8320 (although, using the ASRock 970M Pro3, you can get fast RAM for FX without breaking the bank).

The FX8 is for home servers and cheap-o rendering.

The FX6 is cheap enough to rival the i3s. If i3s drop in price, the FX6 has to follow; if not, it's useless on price versus performance.

EDIT: An FX6 with half the ALUs disabled would be faster, single-core wise, than a Phenom II X3. Phenom II has a 10-15% single-core IPC advantage, but the FX, with only three ALUs active, could easily be overclocked to 4.5GHz or higher without voltage or heat issues.
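A quick sanity check on that claim, treating the quoted 10-15% figure as given: a Phenom II X3 at 3.2GHz works out to roughly 3.2 x 1.15 ≈ 3.7 in IPC-times-clock terms, while the trimmed FX at 4.5GHz gives 4.5 x 1.0 = 4.5, so the clock headroom would indeed outweigh the IPC deficit.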


And the OP realised that that website is unreliable due to some of the results. Look back and you'll find the post.

What the OP feels about a website has nothing to do with how CPUs objectively perform. Unless you have information relevant to that graph, you're talking out of your butt.


This topic is now closed to further replies.