FX-8350 or i5-4670K

Look guys, this is what I was thinking of buying if I go with Intel - http://de.pcpartpicker.com/user/Frankenstein14RO/saved/41Ak

Can someone swap the CPU for an AMD one and suggest another mobo? (I don't want to go over €800; I live in Germany.)

 

Switching to the AMD platform won't save you enough money to step up to a 780 and keep your budget. I say stick with the i5 build like you have here. Also, I noticed there's no SSD or HDD. I'm assuming you already have one and didn't forget it?

 

I have an i5 4670K and I'd say get an FX-8350; it's a much better CPU for your money.

I regret paying the premium for the 4670K.

 

There are a bunch of people who say the exact opposite, and I can tell you're full of sh*t because of the "much better" part. In some cases the 8350 is a little better, in most cases the i5 is a little better; there is no huge difference either way.

HP something | 5600X | Corsair  16GB | Zotac ArcticStorm GTX 1080 Ti | Samsung 840 Pro 256GB | OCZ Agility 3 480GB | ADATA SP550 960 GB

Corsair AX860i | CaseLabs SM8 | EK Supremacy | UT60 420 | ST30 360 | ST30 240

Gentle Typhoon's and Noctua's and Noiseblocker eLoop's

 

BF4, AC4, War Thunder, World of Tanks, PlanetSide 2 ...

 

Planetside 2, World of Tanks, BF4: i5 in a landslide. In BF4, AMD can come close with Mantle but is still behind. The i5 will also stream BF4 faster. AMD is very competitive in most SINGLE PLAYER games. The fewer cores used and the less optimization, the worse the AMD does. The i5 is much faster for gaming in many games now, and will still be slightly faster when perfect optimization comes. DirectX 12 and OpenGL are getting optimization, but that is a while away. OpenGL will be demoing it soon and DirectX will probably be showing slides at GDC.

Also avoid RussianGPU benchmarks. Tek Syndicate? SINGLE PLAYER games that are GPU-bound. Worst "benchmark" I have ever seen. Both sites are a joke. An i5 is not 20 fps slower than an i7. That is just laughable. Ask anyone on this forum who has turned off multithreading on their i7.

In a year or two the AMD could be a very good buy. It simply is not now if you love multiplayer. Indie games like Arma, Rust, etc.? They will perform horribly compared to an i5. MMOs? Go Intel or go home.

The higher-end the video card? Go Intel or go home. The 8320/50 and its OC rebrand, the 9xxx, are simply like a 2008 Nehalem chip on an overclock. They are not bad by any means for things like rendering. They are, however, average for gaming. They WILL be good in newer games when the APIs go low level. So will an i3...

If you play mostly single player games? AMD is fine. Multiplayer is where it can get its head kicked in.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.

Budget - FX-8320 (not 8350)

Performance - i5-4670k

The 8350 is terrible, even at the "budget" price.

if you have to insist you think for yourself, i'm not going to believe you.

So the title says it all ... (only gaming)

For where you live, Intel seems to be the way to go at that price point; AMD boards are overpriced over there.

AAAAAAAAAGGGGGGGHHHHH!!!!

Lol, why is everyone even bothering with videos from that "Tech Yes" guy? :D

AM3+ is a dead platform at this point, so it depends on how long you want this rig to last you. Either CPU is a great choice for what you're doing. My personal recommendation would be an 8320 and a beefier GPU though. Afterwards, just set the 8320 to the 8350's stock settings and voltage. You'll be good to go :)

CPU Overclocking Database <------- Over 275 submissions, and over 40,000 views!                         

GPU Overclocking Database                                                    

So I watched that vid... I know his 780 Lightning is overclocked, but how the hell did he manage to pull 600 watts out of his PSU??? Even 5xx with the Intel... The most I've ever seen out of my 780 is in the low 400s, and that's with a 4770 at 4.6. Granted, I don't have Crysis 3, and he did say he only saw this in that game...

Anybody else who has a wall watt meter or Corsair Link for their PSU that can check this? If this is true, everyone can shut up with their "anything over 500 watts is overkill for a single graphics card".

Who says he did manage it? It's unlikely to pull 600W with an i5 & 780; I'm hardly there with a 3930K and a single 780.

Also, you guys should stop saying the 8320/8350 is better for its money; it completely is not.

OC'd AMD CPUs can pull quite a bit of power. Crysis 3 was the only game that pulled that much. Hell, my 4770K at 1.22V can pull like 180W and up in a synthetic, and that is just like 0.1V higher than stock. OC'd AMDs are often running at like 1.4-1.5V.

That was only one game, though. In most, the CPU is nowhere close to full load and neither is the GPU, not at the same time anyway. Add to that, an i7 isn't going to be at full load in an 8-thread game. It is going to steamroll it.

A 600W unit is pretty much fine for an i5/i7 and that card with a decent OC. Probably not for an AMD; you would want like 650W. 600W is definitely fine for stock, and "overkill". Overclocking just increases power dramatically. It all depends on the voltages you end up at. You can get unlucky on Haswell and need 1.3V. You can need more voltage on the GPU as well.

He also could have had 4 drives in there, lots of fans, etc. He is basically saying AMD uses more power. It does. It isn't that big a deal at all in the US for power usage. In other countries it is, and he told you that it isn't that big a deal. You do want to go a little bigger on the PSU with an OC'd AMD, though.

Add to all this? Low-level OpenGL/DirectX are going to dramatically decrease the power needed from the AMD/Intel. Crysis 3 has tons of CPU overhead, like all games that aren't on AMD Mantle. It's really only a problem in that one game atm. I don't have Star Citizen; it might also push a CPU to the max. It is going to be AMD Mantle, and I am sure that guy will go low level on his game with OpenGL/DirectX as soon as possible. CryEngine is now OpenGL.
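The "overclocking increases power dramatically" point follows from the usual CMOS approximation that dynamic power scales with frequency times voltage squared. A minimal sketch, with purely illustrative baseline numbers (not measurements of any real chip):

```python
# Rough sketch of why overclock voltage dominates CPU power draw: CMOS
# dynamic power scales roughly as P ~ f * V^2. The baseline numbers below
# are illustrative assumptions, not measurements of any real chip.

def scaled_power(base_watts, base_ghz, base_volts, new_ghz, new_volts):
    """Estimate power at a new frequency/voltage from a known baseline."""
    return base_watts * (new_ghz / base_ghz) * (new_volts / base_volts) ** 2

# Hypothetical FX-8350-class chip: 125 W at 4.0 GHz and 1.30 V stock.
stock = scaled_power(125, 4.0, 1.30, 4.0, 1.30)  # 125 W by construction
oc = scaled_power(125, 4.0, 1.30, 4.8, 1.45)     # 4.8 GHz at 1.45 V

print(round(stock), round(oc))  # 125 187 -- a ~50% jump from a modest OC
```

That roughly 50% jump from a 20% clock bump at higher voltage is why an overclocked 8350 system can draw far more at the wall than the stock TDP suggests.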

if you watch the vid, he did manage it (watch at 21:30)

 

and since you quoted me (not sure if you're actually talking to me), I never said that the 8320/8350 was better in any case, not even for the money.

That part wasn't for you.

He also said he pulled 570 watts with the i5...

The main reason I bring this up is because "PSU is overkill" is the most common comment in the build log and build planning forums, and I see it a lot even with 600-700W PSUs... IMHO if you're pulling 570 watts, a 700W PSU isn't overkill, that's just some safety headroom. I'm curious what would happen if you had a 500W PSU and your card wanted more power: would the system crash or would the GPU just throttle?
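The headroom argument can be made concrete. A minimal sketch, assuming a rule-of-thumb target of keeping peak draw at or below ~80% of the rating (the 80% figure and the ratings list are my assumptions, not an official sizing rule):

```python
# A minimal sketch of the "headroom" argument: pick the smallest common PSU
# rating that keeps a measured peak draw at or below a target fraction of
# capacity. The 80% target and the ratings list are assumptions of mine,
# not an official sizing rule.

def recommended_psu(peak_watts, target_load=0.80):
    """Smallest common rating keeping peak draw <= target_load of capacity."""
    common_ratings = [450, 500, 550, 600, 650, 700, 750, 850, 1000, 1200]
    needed = peak_watts / target_load
    for rating in common_ratings:
        if rating >= needed:
            return rating
    return common_ratings[-1]  # past 1200 W you're shopping, not sizing

print(recommended_psu(570))  # 570 / 0.8 = 712.5 -> 750 W unit
print(recommended_psu(420))  # 420 / 0.8 = 525.0 -> 550 W unit
```

By this reasoning a 570 W peak reading really does put a 700 W unit in "headroom" territory rather than "overkill".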

The i5 was OC'd and at full load. He may have needed a ton of voltage to get that OC; Haswells are wacky. The AMD is also OC'd and at full load, because AMD single-core performance blows.

You have an i7. This is what i7s look like in an 8-thread game. They simply don't have to work as hard. Like I said though, it is one game. Crysis 3 is the only game that pulls that, and on an i7? Our CPU usage is gonna be much lower. The i5 and 8350 won't be pulling anywhere near as much wattage in future games, when all APIs are low level. Our i7? Welp... we paid more for something that will be sitting mostly idle in games. These next-gen consoles are 1.7GHz mobile AMD CPUs that are much slower clock for clock than an AMD desktop chip. Our i7 is laughing at the "next gen", and that is before optimization. So of course you don't see this on the OC'd 4770K at 4.6GHz. :)

 

Honestly, I've seen a lot of his videos, and a lot turned out to be just bullshit.

A GTX 780 Lightning will probably draw around 300W / 325W max.

The Lightning has 2x 8-pin power connectors; this means 300W of connector power + 75W from the PCIe slot, for a maximum total of 375W, but the card will never max that out. Otherwise it wouldn't be safe to use.
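That 375W ceiling comes straight from the PCIe power-delivery limits: 75W from the slot, 75W per 6-pin connector, 150W per 8-pin. A quick sketch of the arithmetic:

```python
# Sanity check of a card's power ceiling from its connectors, using the
# PCIe spec limits: 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin.
# The 2x 8-pin configuration is the Lightning's, as stated above.

PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Spec-limited ceiling for a card with the given aux connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power(["8-pin", "8-pin"]))  # 75 + 150 + 150 = 375 W
```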

http://linustechtips.com/main/topic/126999-780-sli-help-me-choose/?p=1689577

You're entitled to your opinion just like anyone else, and what I have said I firmly believe out of thorough personal experience, not out of preconceived notions or misconceptions.

If you disagree, that's fine, but you have no right to throw insults.

Remember, I said the 8350 is much better for your money: you get performance superior to a 3770K in any multi-core-aware workload, and this includes streaming, video and audio conversion, as well as modern games like BF4 and Crysis 3, all for $120 less. I'd call that better for your money any day of the week.

This thread reeks of fanboys trying to push their "agenda" as if it's some sort of crusade. Appalling.

OK, I'm open for you to change my mind. Show me how the 8350/8320 is a "much better" CPU than the i5.

Also, did you watch the vid in post 29? In everything he tests they're either on par or the i5 is a little better, with a couple of instances being much better.

A system with a single GPU drawing 795W out of the wall socket? Never, dude :)

Even a Titan would not draw that.

I don't know, woman; ask the guy that posted it. Personally, I've never seen mine (specs in sig) go over 450 according to Corsair Link.

I would say buy an 8320 and OC it, then use the saved money and get a better GPU. 

I don't know man, ask the guy that posted it.  personally I've never seen mine (specs in sig) go over 450 according to corsair link

 

I'm a girl haha.

But anyway, I guess my post looked a bit rude, but it was not meant to be.

800W for a single card is ridiculous, and would never happen, unless his custom loop cooling uses a lot of power.

You say your system draws around 450W, and that seems very accurate to me :)

I always say that for a single-GPU setup, a 650W unit is more than enough.

He claims 1300W of power for 2 cards. He is funny ;)
 

sorry, quoted post fixed

 

The guy in the vid also claims 570 with a single card; that's a long way off 800, but still, it's a lot...

For the record, I've never been over 450 (that I know of); usually it's in the 400-420 range.

 

 

edit:

don't worry, I didn't get offended

The i5 is ahead on anything gaming. It is ahead in BF4, it is ahead in Planetside 2, and both are 8-thread games. The entire reason AMD made Mantle was because they can't keep up on R&D and because they were behind. CPU overhead IS a problem, but not really for the i5 or i7. This is why RELIABLE tech sites like AnandTech told you REPEATEDLY to get the i5 for GAMING, and told you that Mantle really didn't help those two CPUs much in BF4. The only big gains shown by DICE or AMD on Intel were in CrossFire, and that is because CrossFire was horribly optimized.

 

What did it help? The i3, the FX-63xx, their new A-series whatever the hell it is called, and the 83xx in all its name changes, on overclocks.

 

The only "benchmarks" showing an AMD ahead are by the laughable Tek Syndicate, in SINGLE PLAYER games where the GPU is the bottleneck. Those were GPU benchmarks sold to naive people as CPU benchmarks.

 

His benchmarks were like doing CPU benchmarks on a game like Tomb Raider. They are stupid.

 

http://www.techspot.com/review/645-tomb-raider-performance/page5.html

 

An i7 at 2.5GHz or an AMD at 2.5GHz is almost exactly the same as at 4.5GHz. Many single-player games might as well be a GPU benchmark. There are exceptions, but you have to find and benchmark a VERY chaotic scene in an FPS to test the CPU (Linus Tech Tips just did this with their Mantle test; they found the scene with the most crap going on), and if you do? The AMD will only come close to matching the i5 if the game is 8-thread, the game uses AMD Mantle, and the GPU is a Mantle GPU.

 

Games where an 8350 matches or is close to an i5: GPU-bound single-player games coded for 8 threads. Period. The 8350 isn't even ahead on streaming. First off, it can't even play a game like Guild Wars 2 WELL while clocked at 5GHz. There are only a handful of 8-thread games, and the i5 is not slower; it is slightly ahead or even, depending on optimization/Mantle, and Mantle is in what, 2 games? When the 8350 has to use more of the CPU to catch up? Then the streaming goes out the window. It is a stupid argument. The i5 and the 8350 might as well be equal for streaming because they both need a TON of the CPU to play the game. Whichever CPU runs the 8-thread game faster is going to win in streaming as well.

 

This shouldn't even be a discussion. The only reason it IS a discussion is BS videos by Tek Syndicate and laughable benchmarks from RussianGPU claiming an i7 was 20 fps ahead of an i5 in BF4. I have an i7. I can tell you it is much closer to 1-2 fps, not 20, as can anyone who owns an i7. We turn off HT and guess what? We have an i5. The CPU is taxed harder, but it isn't faster.

 

The AMD is a good rendering chip on a budget. Anything past the 8320 is a waste. The 8320 can be a decent CPU in some games and not others. It gets worse the bigger the GPU you use. When everything goes low level? The 8320 will be good. So will the i3. I would rather have the i3 for gaming. Why? Because it will play the old games better as well.

 

When AMD makes a GOOD gaming CPU, we will jump on board. It has nothing to do with being a fanboy. Many of us have owned AMD CPUs in the past, when they were much more competitive. We have also owned AMD GPUs, and I see no problem with them at all, except litecoin miners making them a bad value.

 

Our "fanboyism" for Intel has to do with the AMD CPUs being equal to an overclocked i7-920 from 2008 in games. If you want a 2008 chip in gaming performance? By all means buy one. AMD has sucked for gaming ever since Sandy Bridge came out. They can't catch Sandy, let alone Haswell.

 

Multicore (and if you think you are running an 8350 at 5GHz 24/7, good luck with that), from users here: the 8350 is slower clock for clock than a Sandy Bridge i7 in rendering.

https://docs.google.com/spreadsheet/ccc?key=0AlC81MjwelBgdEZNV3l6aHl1eUNwSUR4Rml0MXMzN1E&usp=sharing#gid=0

 

Single-core: the 8350 can't match a stock Sandy Bridge i3 even when OC'd to 4.9GHz. This is why it gets killed in CPU-BOUND games.

 

https://docs.google.com/spreadsheet/ccc?key=0AlC81MjwelBgdEZNV3l6aHl1eUNwSUR4Rml0MXMzN1E&usp=sharing#gid=1

 

This isn't even debatable, and when you have to use a RussianGPU AMD astroturfing site or single-player game "benchmarks" to try and convince people of lies? It makes you look silly. Right now and for the foreseeable future, Intel crushes AMD in MOST games. The Intel is never going to be slower. Integer cores suck for gaming. They can be optimized, but so can the Intels, and so can Nvidia's GPU extensions/drivers to work with low-level APIs to reduce CPU overhead.

 

http://www.tomshardware.com/news/directx-direct3d-opengl-mantle,26167.html

 

AMD SHOULD be really good for NEW games SOON. That said? Is this going to make many current and older games that struggle on AMD run better? No idea.

The i7 4770K is anywhere from 5-10% faster than the 3770K.

Rendering performance: (chart omitted)

Decompression: (7-Zip decompression chart omitted)

Video transcoding: (charts omitted: one unlabeled, HandBrake, TotalCode Studio)

http://www.tomshardw...ew,3521-17.html

(3ds Max chart omitted)

http://www.tomshardw...ew,3521-14.html

(x264 and POV-Ray charts omitted)

 

The 8350 is equal to or even slightly faster than the i7 3770K and costs $100 less; the 4770K is roughly 5-8% faster than the 8350 but costs 68% more.
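The value claim here is just a performance-per-dollar ratio. A minimal sketch with placeholder numbers (the relative score and the prices are illustrative assumptions standing in for real benchmark data and street prices):

```python
# A quick perf-per-dollar sketch of the value argument. The relative score
# (FX-8350 normalized to 1.00) and the prices are illustrative placeholders,
# not benchmark results or current street prices.

chips = {
    # name: (relative multithreaded score, price in USD)
    "FX-8350":  (1.00, 200),
    "i7-4770K": (1.07, 336),  # ~7% faster, ~68% more expensive
}

for name, (score, price) in chips.items():
    print(f"{name}: {score / price * 1000:.2f} relative perf per $1000")
```

Under these assumptions the 8350 delivers noticeably more throughput per dollar even when the 4770K is outright faster, which is the whole "better for your money" argument in one number.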

In the last few "next gen" games the 8350 is faster, even though it's only running at 65% load.

 

The 8350 is a native octa-core, similar to the one in the PS4 & Xbox One.
So as more games come out, the better the 8350 will perform, because developers will go crazy optimizing their code for eight cores, since all platforms now have 8-core CPUs: PC, PS4 & Xbox One.
If you have an Intel processor you will lose out on most of these optimizations.
There are two reasons. The first is that Intel CPUs typically perform better because applications, including games, are compiled with Intel's compilers, which are extremely biased and literally feed poisonous code to "AuthenticAMD" CPUs while only feeding proper code to "GenuineIntel" processors.

http://linustechtips...d-cpus-surface/
Developers will be forced to give up these dirty compilers to get the performance that they want.

The second reason is that AMD processors in the consoles and on the desktop have AMD-specific instruction sets, like AMD-V and XOP, which Intel has not implemented in any of their processors out of pride.
The 8350 is also 30-40% faster than the i5 in productivity, editing, compression & decompression, encoding and streaming.
