First time poster looking for some advice on PC upgrade

It's only really useful for SFF builds. For a bit more, a 240mm AIO is superior, and some air coolers perform comparably without any of an AIO's drawbacks.

 

...what do you mean, less capable CPU? An AMD CPU is a budget compromise; find me a single game where an 8350 beats a 4670K or 4690K.

Mantle has nothing to do with the CPU.

And yeah, I know there are better watercoolers out there, but I don't need a better one. This one is working more than fine.


Because AMD.

Piss off Intel fanboy :P

Correct me if I'm wrong (and I quite possibly could be), but PCIe 2.0 at x8 speeds is very bad for SLI, right? He would need to find an AMD board that can run two PCIe slots at x16 for SLI; otherwise he'll drop to x8 speeds, which on PCIe 2.0 is bad. PCIe 3.0's x8 speeds aren't as bad, but you can only get those with Intel boards.

 

Did I get that right? I was told that before, but truthfully I don't know how true it is. It's also pretty difficult to just shove my 780Ms into an old PCIe 2.0 SLI board with MXM 3.0b compatibility to check xD

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


Uhm.

That's an 8350 beating a 3570K and a 3770K and even a 3820K in some games, and definitely in the applications I'm looking at (i.e. streaming).

Those have double the threads your 4670/4690 has. And it costs about half as much as those (a third, in some cases).

pls. 

 

...that's not a valid comparison anymore

 

And it doesn't work like that

And I don't even have an Intel chip

 

I can tell you right now, and so can 90% of the forum, that a 4690K is the better choice for numerous reasons in your situation, but at the end of the day it's your money and your computer.



Uhm.

That's an 8350 beating a 3570K and a 3770K and even a 3820K in some games, and definitely in the applications I'm looking at (i.e. streaming).

Those have double the threads your 4670/4690 has. And it costs about half as much as those (a third, in some cases).

pls. 

 

By the way, the FX-8350 needs to run at 5GHz to perform on par with my i7-4800MQ @ 3.5GHz when using OBS to stream. That video is very old and was using XSplit as far as I know. If you grab a 4GHz Haswell Refresh i7-4790K, you'll need SERIOUS overclocking on an 8350 to match it in terms of OBS usage. Tested with:

Laptop in my sig

Friend's desktop with AMD FX-8350 @ 4.4GHz, 4.8GHz and 5GHz clocks, 12GB DDR3 1600MHz RAM and R9 290 heavily OC'd.

 

I listed the settings and we both did stream previews and watched our CPU usage.

 

So if you were to get a 3770K, 4770K or 4790K and use OBS to stream right now, even a bit of overclocking on them renders the AMD the decidedly worse choice.

 

 

But that's ignoring the boost I'll get with Mantle, which is way more significant than I expected, btw :o

Another interjection here. Mantle's boost sees diminishing returns as your CPU and GPU get stronger. With a weaker CPU in a CPU-devouring game like BF4, you'll see a nice boost. With something like an overclocked FX-8350 + an R9 290X? 10fps or so is probably what you'll get. Don't overrate Mantle; it's just a way to get around CPU overhead. That's not a problem if you're not being bottlenecked by the CPU in a game, and in BF4 everyone is. There's a reason nVidia, with DX11 and driver updates, was able to compete with Mantle in Thief (2014) on performance: the game wasn't that CPU-heavy.



Correct me if I'm wrong (and I quite possibly could be), but PCIe 2.0 at x8 speeds is very bad for SLI, right? He would need to find an AMD board that can run two PCIe slots at x16 for SLI; otherwise he'll drop to x8 speeds, which on PCIe 2.0 is bad. PCIe 3.0's x8 speeds aren't as bad, but you can only get those with Intel boards.

 

Did I get that right? I was told that before, but truthfully I don't know how true it is. It's also pretty difficult to just shove my 780Ms into an old PCIe 2.0 SLI board with MXM 3.0b compatibility to check xD

You'll probably get a ~2% performance penalty per slot dropping from 2x16 to 2x8.

PCIe 3.0 x8 has about the same bandwidth as PCIe 2.0 x16, so you'll see no more than a margin-of-error difference.

 

Any 990FX board will be 16/16



You'll probably get a ~2% performance penalty per slot dropping from 2x16 to 2x8.

PCIe 3.0 x8 has about the same bandwidth as PCIe 2.0 x16, so you'll see no more than a margin-of-error difference.

 

Any 990FX board will be 16/16

Thanks! Wanted to be sure before I went ahead and claimed that 2.0 x8 dual-GPU was only 80% as fast or something.



Thanks! Wanted to be sure before I went ahead and claimed that 2.0 x8 dual-GPU was only 80% as fast or something.

 

If it wasn't, why would the 295X2 and Titan Z exist? They run 2 GPUs on 16 PCIe lanes. :)

 

It's not an issue; PCIe is a yawning cavern of bandwidth.


If it wasn't, why would the 295X2 and Titan Z exist? They run 2 GPUs on 16 PCIe lanes. :)

 

It's not an issue; PCIe is a yawning cavern of bandwidth.

The Titan Z and R9 295X2 exist to take people's money, as far as I'm concerned. The Titan Z is $3000 while being two downclocked Titan Blacks; you could buy two Titan Blacks and use the extra $1000 on a 6-core Intel CPU AND a motherboard to support it, and still spend about the same money.

 

Same for the R9 295X2: it's $1500, but it's only two R9 290X cards, which come to a maximum of about $1100 if bought separately.

 

The only dual-GPU card I've seen that was worth it is the GTX 690, because at the time, 680s in quad SLI weren't supported (and I don't think they are now, even) and two 690s were the only way to get 4-way SLI at Kepler's launch. And even then, a 690 cost roughly the same as two 680s, so you weren't losing much by buying them, unlike now.

 

I'm curious as to the theoretical bandwidth limit of PCIe 3.0 x16, though xD



680s in quad SLI weren't supported (and I don't think they are now, even)

 

I'm curious as to the theoretical bandwidth limit of PCIe 3.0 x16, though xD

 

680s work in quad SLI.

 

15.75 GB/s in each direction is the bandwidth of PCIe 3.0 x16. It has doubled with each iteration, and PCIe 4.0 is set to double it again.
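Those numbers are easy to sanity-check from each generation's link rate and line encoding (the rates and encodings below are from the public PCIe specs; the script itself is just my back-of-envelope check):

```python
# Back-of-envelope PCIe per-direction bandwidth check.
# Gen 1/2 use 8b/10b encoding; gen 3/4 use the leaner 128b/130b.

def pcie_gbps(gen: int, lanes: int) -> float:
    """Usable per-direction bandwidth in GB/s for a PCIe link."""
    specs = {
        1: (2.5, 8 / 10),
        2: (5.0, 8 / 10),
        3: (8.0, 128 / 130),
        4: (16.0, 128 / 130),  # the doubling continues
    }
    gt_per_s, efficiency = specs[gen]
    # GT/s per lane -> usable Gbit/s -> GB/s, times lane count
    return gt_per_s * efficiency / 8 * lanes

print(f"3.0 x16: {pcie_gbps(3, 16):.2f} GB/s")  # 15.75, the figure above
print(f"2.0 x16: {pcie_gbps(2, 16):.2f} GB/s")  # 8.00
print(f"3.0 x8:  {pcie_gbps(3, 8):.2f} GB/s")   # 7.88, ~= 2.0 x16
```

The 3.0-vs-2.0 jump looks like slightly less than a doubling only because 128b/130b encoding wastes far less of the raw link than 8b/10b did.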


680s work in quad SLI.

 

15.75 GB/s in each direction is the bandwidth of PCIe 3.0 x16. It has doubled with each iteration, and PCIe 4.0 is set to double it again.

I know you can put them in quad SLI, but I don't think they're well supported in quad SLI. It was like those guys who had to use old nVidia drivers to hack Titans into 4-way SLI a while ago. I wish I still had the site, but basically they're not designed to go past 3-way, I think. I know one person with 4-way 680s who was getting lower framerates than my two 780Ms; I told him to disable one and run 3-way, and he said it more than doubled his framerate.



I know you can put them in quad SLI, but I don't think they're well supported in quad SLI. It was like those guys who had to use old nVidia drivers to hack Titans into 4-way SLI a while ago. I wish I still had the site, but basically they're not designed to go past 3-way, I think. I know one person with 4-way 680s who was getting lower framerates than my two 780Ms; I told him to disable one and run 3-way, and he said it more than doubled his framerate.

 

Quad SLI has been a waste of time since it was invented; it's for benchmarking and e-peen and nothing else.


Quad SLI has been a waste of time since it was invented; it's for benchmarking and e-peen and nothing else.

Man I'll tell you straight, I want a quad SLI machine. I'ma sacrifice 1 card for PhysX and another card for 64xQ CSAA on most games... then I'ma game on 2 cards =D =D =D =D



Can you get 6GB DIMMs? I'm not sure you can, and mixing RAM is a really really bad idea.

Nope, you can't. I'm using two 4GB and two 2GB sticks. It's not ideal, but they're all the same latency and frequency, so any issue caused by mixing 2GB and 4GB sticks together is negligible.

How do Reavers clean their spears?

|Specs in profile|

The Wheel of Time turns, and Ages come and pass, leaving memories that become legend. Legend fades to myth, and even myth is long forgotten when the Age that gave it birth comes again.


What's wrong with the H80? And honestly, I don't want to go for an i5. Call me stupid or a fanboy, I don't care, but I don't see the point in spending more money on a less capable CPU and losing the potential boost that Mantle offers.

Mantle is probably going to be integrated into OpenGL, and all it does is reduce overhead on the CPU.

 

 

Uhm.

That's an 8350 beating a 3570K and a 3770K and even a 3820K in some games, and definitely in the applications I'm looking at (i.e. streaming).

Those have double the threads your 4670/4690 has. And it costs about half as much as those (a third, in some cases).

pls. 

 

You can upgrade from Haswell to a Broadwell i7 later on. With AMD you can't.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card - Total Biscuit
 


Mantle is probably going to be integrated into OpenGL, and all it does is reduce overhead on the CPU.

 

 

You can upgrade from Haswell to a Broadwell i7 later on. With AMD you can't.

 

 

By the way, the FX-8350 needs to run at 5GHz to perform on par with my i7-4800MQ @ 3.5GHz when using OBS to stream. That video is very old and was using XSplit as far as I know. If you grab a 4GHz Haswell Refresh i7-4790K, you'll need SERIOUS overclocking on an 8350 to match it in terms of OBS usage. Tested with:

Laptop in my sig

Friend's desktop with AMD FX-8350 @ 4.4GHz, 4.8GHz and 5GHz clocks, 12GB DDR3 1600MHz RAM and R9 290 heavily OC'd.

 

I listed the settings and we both did stream previews and watched our CPU usage.

 

So if you were to get a 3770K, 4770K or 4790K and use OBS to stream right now, even a bit of overclocking on them renders the AMD the decidedly worse choice.

 

 
 

Another interjection here. Mantle's boost sees diminishing returns as your CPU and GPU get stronger. With a weaker CPU in a CPU-devouring game like BF4, you'll see a nice boost. With something like an overclocked FX-8350 + an R9 290X? 10fps or so is probably what you'll get. Don't overrate Mantle; it's just a way to get around CPU overhead. That's not a problem if you're not being bottlenecked by the CPU in a game, and in BF4 everyone is. There's a reason nVidia, with DX11 and driver updates, was able to compete with Mantle in Thief (2014) on performance: the game wasn't that CPU-heavy.

OK. First of all, the boost from Mantle is more substantial than you might think. In another TekSyndicate video they tested Pistol's rig against Logan's, and she has an FX-9590 with a Sapphire Tri-X R9 290X and got really big boosts out of Mantle. Maybe those were skewed results, or specific to her machine for some other reason, but I think it might be substantial enough.

On a different note, I have been considering just getting a nice i7, as mentioned above. If I do go Intel, I'd rather go all the way and get a really good CPU, but I'm unsure yet. It's going to increase my budget a bit, although I could cut costs on the mobo that way.

Thanks for the advice so far, guys. I'll keep you updated when I decide and of course show some pics when I put the thing together ;D


OK. First of all, the boost from Mantle is more substantial than you might think. In another TekSyndicate video they tested Pistol's rig against Logan's, and she has an FX-9590 with a Sapphire Tri-X R9 290X and got really big boosts out of Mantle. Maybe those were skewed results, or specific to her machine for some other reason, but I think it might be substantial enough.

I remember that video, and I still want to reiterate the point: Mantle's benefit depends both on the hardware you have AND the nature of the game.

 

Remember: nVidia shipped driver updates for Thief (2014) that let their cards surpass AMD users running Mantle, while still using DirectX 11. BF4 is a very different story, though, and each game will be different; it depends on driver optimization for DX as well. BF4's Mantle boost is almost certainly going to outstrip the DX11 optimizations on the nVidia side, while Thief proved nVidia can optimize their DX11 path enough to counter or surpass the Mantle boost. But as I said, Thief isn't a CPU hog like BF4, so that's understandable. Mantle only makes a massive difference if the game is CPU-heavy, AND the stronger your hardware (especially the CPU), the less it benefits. Taking BF4, an APU-type machine can get upwards of a 15fps boost... but its framerate is low to start with. A higher-end machine with a much stronger CPU will probably get a smaller boost, because it isn't as limited.
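A crude way to see why the boost shrinks on stronger hardware: in a pipelined renderer, whichever of the CPU and GPU stages is slower sets the frame rate, so cutting CPU overhead only helps while the CPU is the slower stage. A toy model (the millisecond figures and the 40% overhead cut are made up for illustration, not taken from any benchmark):

```python
# Toy frame-rate model: the slower pipeline stage limits fps.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

OVERHEAD_CUT = 0.6  # pretend a low-overhead API trims CPU frame time by 40%

# Weak CPU in a draw-call-heavy game (the BF4-on-an-APU case): big win.
weak_before = fps(cpu_ms=20.0, gpu_ms=12.0)                 # 50 fps, CPU-bound
weak_after  = fps(cpu_ms=20.0 * OVERHEAD_CUT, gpu_ms=12.0)  # ~83 fps

# Strong CPU: the GPU was already the limit, so almost nothing changes.
strong_before = fps(cpu_ms=10.0, gpu_ms=12.0)                 # ~83 fps
strong_after  = fps(cpu_ms=10.0 * OVERHEAD_CUT, gpu_ms=12.0)  # still ~83 fps

print(weak_before, weak_after, strong_before, strong_after)
```

Same API, same percentage cut in CPU work, but the machine that was already GPU-bound sees essentially no gain, which matches what the benchmarks in this thread suggest.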



  • 2 weeks later...

Alright, so after some waiting (mostly on X99) and some debating, I ended up choosing the following parts for my upgrade:

And yes, I did go over my initial budget.

Intel i7-4790K

Corsair 760T

Maximus VII Ranger

Corsair AX760
4 Corsair AF120 fans
sleeved cables (yes, I like my shit to look good :P )

Oh, and the two 780s of course.

Will post pics when I finish the build (which will be Saturday).

Thanks for all the advice, guys. Great forum :)

