Game Developers Choose AMD Over Intel For Gaming.

TechFan@ic

In a poll conducted by Eurogamer, a number of game developers were asked to choose between the similarly priced eight-core AMD FX-8350 and the quad-core Intel Core i5-3570K for gaming purposes, and every developer chose the AMD eight-core chip.

We approached a number of developers on and off the record, asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.


Linus Blomberg of Avalanche Studios:

"I'd go for the FX-8350, for two reasons. Firstly, it's the same hardware vendor as PS4 and there are always some compatibility issues that devs will have to work around (particularly in SIMD coding), potentially leading to an inferior implementation on other systems - not very likely a big problem in practice though," he says.


"Secondly, not every game engine is job-queue based, even though the Avalanche Engine is, some games are designed around an assumption of available hardware threads. The FX-8350 will clearly be much more powerful [than PS4] in raw processing power considering the superior clock speed, but in terms of architecture it can be a benefit to have the same number of cores so that an identical frame layout can be guaranteed."


From another game developer, who wished to remain anonymous:

"This (Sony) approach of more cores, lower clock, but out-of-order execution will alter the game engine design to be more parallel. If games want to get the most from the chips then they have to go 'wide'... they cannot rely on a powerful single-threaded CPU to run the game as first-gen PS3 and Xbox 360 games did. So, I would probably go for the AMD as well, as this might better match a console port of a game... based on what we know so far."


http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen
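
To make the "job-queue" and "going wide" idea a bit more concrete, here is a minimal sketch of the pattern (my own illustration in Python, not code from the article or from Avalanche): per-entity work is treated as independent jobs and spread across however many cores the machine exposes, rather than running on one big main thread.

# Minimal job-queue sketch: split per-frame work into independent jobs
# and spread them across all available cores (illustration only).
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_entity(entity_id):
    # Stand-in for per-entity work such as AI, physics or animation.
    return sum(i * i for i in range(10_000)) + entity_id

if __name__ == "__main__":
    jobs = range(1_000)  # one job per entity for this frame
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate_entity, jobs))
    print(f"processed {len(results)} jobs across {os.cpu_count()} cores")

An engine structured this way scales with core count, which is the property the developers above say favours the eight-core parts once games actually go wide.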


It's to be expected anyway, due to the fact that the next-gen consoles are solely running on AMD hardware.

\[T]/ Praise the Sun!
Super Budget Gaming Build: Intel Pentium G1610, Gigabyte GA-H61M-DS2 rev. 3, Kingston Value RAM 4GB CL9 1333MHz, Fractal Design Core 1000, Corsair VS 450, WD 1TB, Powercolor Radeon HD 7750 1GB/GDDR5, (Optional: Asus DRW-24B1ST).
 (Total: $340 USD)


Yeah, my next build may use AMD if it means sick frames. We shall see, though; I'm an ITX guy at the moment, and I doubt AMD is going to catch up with Intel's TDP any time soon.


Intel had better start offering more cores in their mainstream CPUs soon...
Why? The 3770K is still unbeaten (except maybe for video rendering). AMD needs to increase their performance per clock.


Well, that's a pointless way of thinking about it. It's like saying, "Why? The FX 8350 destroys the Pentium G620!" Well, yeah, but it's $130 more expensive! The 3770K is also $130 more expensive than the FX 8350, and it's barely faster.

The FX 8350 vs. i5 3570K is a more reasonable comparison, since they're closer in price (the 3570K is still more expensive, and Intel motherboards cost more too). Even so, the FX 8350 has far more raw performance than the i5; it simply goes unused because of poorly threaded software.

The performance-per-clock argument is over and done with; the game developers polled all agreed that gaming on more cores is clearly the way forward.
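
As a rough way to frame the cores-versus-IPC trade-off, here is a back-of-the-envelope sketch using Amdahl's-law-style scaling; the 30% per-core speed advantage and the parallel fractions are assumed numbers for illustration, not benchmarks of either chip.

# Cores vs. per-core speed under Amdahl's-law-style scaling (assumed numbers only).

def relative_throughput(cores, per_core_speed, parallel_fraction):
    # Serial portion runs on one core; parallel portion scales with core count.
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / cores)

for parallel_fraction in (0.5, 0.8, 0.95):
    quad = relative_throughput(4, 1.3, parallel_fraction)  # assumed ~30% per-core advantage
    octo = relative_throughput(8, 1.0, parallel_fraction)
    print(f"parallel fraction {parallel_fraction:.2f}: "
          f"4 fast cores = {quad:.2f}, 8 slower cores = {octo:.2f}")

On these made-up numbers the faster quad core wins while games stay mostly single-threaded, and the eight-core only pulls ahead once the parallel fraction gets high, which is exactly the bet both sides of this thread are making.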



Exactly. Everyone rags on AMD, but getting 31 fps with an 8350 versus 32 fps with an i7-3770K means absolutely nothing to me for $130. With that $130 saved I can get a nicer case, better fans, a new peripheral, etc. Not to mention it's 8 cores, and I haven't seen a bottleneck yet when I stream, which obviously tells me it has its merits. And the fact that all the upcoming platforms except Nintendo's are using AMD means more fps, better optimization, and better support, again.


It'll probably be at least another 18 months before I go for a new build, but if AMD's the way to go when it's time then AMD's the way to go.

Everyone needs to stop being fanboys of a particular company and start being fanboys of their individual wallet.


Well, this was expected. AMD is finally having its moment; Bulldozer was a gamble, but it might just pay off.

As for the rest: come on, guys, we all know the 8350 beats the i5 when a task is heavily threaded. But don't be fooled, Intel is ready for this. They could have given us mainstream i5s with 6 cores and no HT, and it would have been a bomb. Don't worry, we'll have that two years from now, or at the very least the i5 will become 4 cores with HT and the i7s will go to 6 cores with HT.

The G620 analogy is bad, dude, because you don't take into account that the closer you get to an architecture's limit, the more expensive it gets. And you should see Logan's test of the i7 3770 vs. the 8350 in Crysis 3; that game will use at least 6 cores if you give them to it. You can take a 6-core AMD, put it against the 8-core, overclock them to the same speed, and get the same results until you start using more than 6 cores. And that's not to mention multi-GPU scaling, where AMD doesn't do nearly as well; the GPUs run into a bottleneck once you pair two 7950s.

Don't get me wrong: if I went to the store right now I'd buy an 8350 and an ROG mobo. It would cost considerably less, so I could spend the savings on an SSD and a sound card and still come in only a bit over the price of the i5 plus an ROG mobo.

Don't be fooled, though: the i7 is still king of the hill, and AMD has a lot of work to do to match it.

System

CPU: i7 4770k | Motherboard: Asus Maximus VI Hero | RAM: HyperX KHX318C9SRK4/32 - 32GB DDR3-1866 CL9 | GPU: Gainward GeForce GTX 670 Phantom | Case: Cooler Master HAF XB | Storage: 1 TB WD Blue | PSU: Cooler Master V-650s | Display(s): Dell U2312HM, LG194WT, LG E1941

Cooling: Noctua NH-D15 | Keyboard: Logitech G710+ | Mouse: Logitech G502 Proteus Spectrum | Sound: Focusrite 2i4 - USB DAC | OS: Windows 7 (still holding on XD)

 
 


Given that AMD is adding a second decode unit in Steamroller and upgrading the L1 cache, Intel will have to do a lot more than their 8% every tick and tock.

It would be interesting to see stacked cache, like what Nvidia is planning for VRAM; that would shrink the footprint.


I don't know about you guys, but this story basically reinforced that I should never buy a console over a PC; it looks like the PC future-proofs over a console hands down. PC future-proofing is really in the end user's mind: if you own the top CPU today, how do you hold back when the next top-of-the-line CPU comes out? Do you keep your supposedly future-proofed computer, or do you upgrade to the latest because it's cooler, X amount better and, well, more future-proof?

I roll with sigs off so I have no idea what you're advertising.

 

This is NOT the signature you are looking for.


Your post seems contradictory?

But know that most users either upgrade everything every 3-4 years, or upgrade certain parts every year or so.


The concept is flawed, as there's no such thing as future-proof. The only time I have seen AMD surpass Intel's 3570K is when the 8350 was overclocked higher than the Intel, and even then it wasn't by a large amount. So no, I'm not convinced that 8 cores running faster than 4 just to keep pace is the right way to go. I much prefer low power and high performance, and I get the extra I paid up front back on the electricity bill, because in the UK we pay a lot more per kWh than in the US: over 30 cents compared to about 10 cents, so the difference mounts up to nearly $30 a quarter in running costs. As you can see, I would have my money back in about 14 months, and everything after that makes the Intel better value for money for me.

But seriously, future-proofing is a myth...
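
For what it's worth, here's a quick sanity check on the running-cost arithmetic above as a minimal sketch; the 100 W extra draw and the hours per day are assumptions picked for illustration, not measured figures for either system.

# Quarterly running-cost difference at UK vs. US electricity prices
# (the 100 W extra draw and the hours per day are assumptions, not measurements).

def quarterly_extra_cost(watt_delta, hours_per_day, price_per_kwh, days=91):
    extra_kwh = watt_delta / 1000 * hours_per_day * days
    return extra_kwh * price_per_kwh

for hours in (4, 8, 12):
    uk = quarterly_extra_cost(100, hours, 0.30)
    us = quarterly_extra_cost(100, hours, 0.10)
    print(f"{hours} h/day: UK ~${uk:.2f} per quarter, US ~${us:.2f} per quarter")

On these assumptions the ~$30-a-quarter gap only shows up with very heavy daily use, so the 14-month payback depends a lot on the wattage difference and the hours you plug in.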

Rig: Intel i7 920 D0 @3.5 | gtx970 | 12 gig Balistix 1333 @ 8-8-8-24 | 2x1TB Spinpoint F3 Raid 0 | 1TB Spinpoint F3 |samsung 840 evo|  Thermaltake 850w Tp |

Xfi extreme gamer | Antec 902

Peripherals: Samsung syncmaster 2494hs @1920/1080 - 60Hz | Q-pad MK85 | g502 | Razor Destructor | Logitech G930 | Logitech 3D pro xtreme | 360 pad | Nitro Wheel


I don't know about you guys, but this story basically reinforced that I should never buy a console over a PC...
Console < PC < New PC

I roll with sigs off so I have no idea what you're advertising.

 

This is NOT the signature you are looking for.


I agree.

I do own a 3930k but I agree.

AMD delivers the best performance for the price. More game developers will start developing for multiple CPU threads in the future, and some games are already optimised for them: for example Crysis 3 (enhanced CryEngine 3) and Battlefield 3 (Frostbite 2).

CPU: Intel i7 4790k Motherboard: ASUS Maximus VII Formula RAM: Corsair Vengeance Pro 32GB 2400MHz: GPU: 2x EVGA GTX 780 Ti's with ACX cooling PSU: Corsair AX1200 Watt Gold SSD: SanDisk Extreme 120GB SSD (Operating System) SSD: Mushkin Chronos 240GB (Games) Sound card: Creative Sound Blaster ZxR HDD: Seagate 3TB External OS: Microsoft Windows 8.1 Mouse: Logitech G502 Gaming mouse Keyboard: Corsair Vengeance K60 MX Red switches Monitor: ASUS VG248QE 144Hz



No, not really; future-proofing applies very well here, for a couple of reasons.

1 - The FX 8350 has 8 cores, and only 4 of them are sufficiently loaded in most current games, which basically means you're only using about half of the CPU's potential. Next-gen games that scale to 8 cores can unlock that unused headroom (see the sketch after this list for a way to watch per-core load yourself).

And that's assuming all current games even use 4 cores, which isn't true: most MMORPGs like WoW and TERA only use 2 cores, and games on Bethesda's engine (Skyrim, Fallout 3) also only use 2.

We also have games that support 6 cores, but not very well, like Battlefield 3, and games that support up to 8 cores, also not very well, like Crysis 3.

2 - The 3570K sits on a dead platform (socket 1155), and no future CPUs will be compatible with it. The FX 8350 uses the AM3+ socket, which AMD has committed to supporting for at least two more generations (Steamroller and Excavator), giving you much greater upgradability.
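
If you want to see the per-core loading point for yourself, here is a minimal sketch, assuming a Python install with the psutil package; run it while a game is going and watch how many cores are actually busy.

# Minimal per-core load monitor (assumes the psutil package is installed).
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # one percentage per logical core
    busy = sum(1 for load in per_core if load > 50)
    print(f"{per_core} -> {busy} core(s) above 50% load")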


Hahaha, oh, here we go... 1155 is a dead platform, but the Intel Core i series isn't. Sockets have been going the way of the dodo for years now, and that's just the way of things: two and a half years and on to the next. It's how you keep performance moving.

AMD produces the equivalent of a family-value SUV and occasionally upgrades the engine,

while Intel builds a little sports car... then asks, "Do you want to go faster? Here's a new sports car."

I'm quite happy to cruise the highway with my top down while you're stuck in your SUV with the kids shouting "ARE WE THERE YET!"

As for AMD sticking with AM3+ for the next two years, that's hokum. They said they'd stick with their sockets for the next two years; AM3+ will go away in 2014 the same way AM2+ went away in 2011. The fact is AMD has had just as many socket changes since 2007 as Intel.

Since socket 939, AMD has had:

AM2, AM2+, AM3, AM3+

while Intel, since socket 775, has had:

1156 and 1155, or 1366 and 2011 on the high end...

Looks to me like they've both gone through four socket changes, but Intel's were split into two lines, so if my math is correct you actually get slightly longer support per socket from Intel... oops!

So, as is often the case, you seem a little blind to reality. AMD keeps backward compatibility because they have to; Intel prefers to deliver performance.

Intel is releasing socket 1150 this year, and AMD will release socket AM4 when they release Excavator; from what I can find it will only be Steamroller-compatible, and only if it launches with DDR4 support from the off - if it doesn't, it won't be. So I have to say you're getting a little ahead of yourself, and massaging the figures to make your point doesn't really help your argument. Sorry!

So, like I said, future-proofing is a myth. Yes, you get backwards compatibility with AMD, but at the cost of performance.

By the way, sorry for the spelling (dyslexia's a cruel bitch).

Rig: Intel i7 920 D0 @3.5 | gtx970 | 12 gig Balistix 1333 @ 8-8-8-24 | 2x1TB Spinpoint F3 Raid 0 | 1TB Spinpoint F3 |samsung 840 evo|  Thermaltake 850w Tp |

Xfi extreme gamer | Antec 902

Peripherals: Samsung syncmaster 2494hs @1920/1080 - 60Hz | Q-pad MK85 | g502 | Razor Destructor | Logitech G930 | Logitech 3D pro xtreme | 360 pad | Nitro Wheel



Do you actually believe that removing 5 pins from a socket makes any difference whatsoever in performance?


It's not that easy, mate... if it were, we'd all be running on 2-pin sockets.

Rig: Intel i7 920 D0 @3.5 | gtx970 | 12 gig Balistix 1333 @ 8-8-8-24 | 2x1TB Spinpoint F3 Raid 0 | 1TB Spinpoint F3 |samsung 840 evo|  Thermaltake 850w Tp |

Xfi extreme gamer | Antec 902

Peripherals: Samsung syncmaster 2494hs @1920/1080 - 60Hz | Q-pad MK85 | g502 | Razor Destructor | Logitech G930 | Logitech 3D pro xtreme | 360 pad | Nitro Wheel



You're confusing the chipset and the socket; they're entirely different things.

We can have a new chipset with a boatload of new features on the exact same socket...



Hmm, lol, no mate, I'm not. I haven't mentioned anything about chipsets. Believe it or not, I do know my apples from my pairs, seeing as I've been a techie since I first used an Atari 400 in 1979...

Not only that, I'm ranked in the top 100 forum members on Tom's Hardware. I wouldn't be ranked so highly if I didn't know what I was on about...

Anyway, it's just an opinion. I'm not asking you to agree with it.

Rig: Intel i7 920 D0 @3.5 | gtx970 | 12 gig Balistix 1333 @ 8-8-8-24 | 2x1TB Spinpoint F3 Raid 0 | 1TB Spinpoint F3 |samsung 840 evo|  Thermaltake 850w Tp |

Xfi extreme gamer | Antec 902

Peripherals: Samsung syncmaster 2494hs @1920/1080 - 60Hz | Q-pad MK85 | g502 | Razor Destructor | Logitech G930 | Logitech 3D pro xtreme | 360 pad | Nitro Wheel


  • 2 months later...

"Believe it or not, I do know my apples from my pairs..."

pears*

There haven't really been any major technological advances to give a legitimate reason to change sockets. Haswell-E getting DDR4 is an example of a real technological change; a die shrink or slight architectural changes don't justify Intel's constant socket changes. They only do it because they can, and because they make money from the motherboards they sell.

 

There has been a considerable change from Athlon/Phenom to Bulldozer/Piledriver, yet AMD continues to offer backwards compatibility. Know why? They don't try to squeeze every penny out of their customers. I know AMD is a business, and by no stretch of the imagination are they not trying to make money from you, but they seem content with the money they're making, as opposed to Intel's "more, more, more money" approach. I honestly don't see the purpose of spending the extra money on an Intel processor, especially since the platform costs considerably more for the features you get. Compare the 990FX-UD3 to the Z77-UD3 or Z87-UD3 and look at the price difference: they're essentially the same board, yet the Intel ones cost more. Ask yourself why.

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball



1156->1155: Integrated Graphics added to the CPU

1155->1150: Voltage Regulation added to the CPU

 

You can't really say Intel's socket changes aren't warranted.  By that logic, AMD should have made its APUs compatible with AM3+ instead of creating the FM sockets.

 

Anyway, back on topic, I'm glad "MOAR COREZ" might actually mean something in the near future.  AMD's looking better and better every day.

 

If the devs go all in on parallel processing, hopefully Intel steps up its game and gets more cores on a mainstream socket.  Hyperthreading can only go so far.

Intel Core i7-7700K | EVGA GeForce GTX 1080 FTW | ASUS ROG Strix Z270G Gaming | 32GB G-Skill TridentZ RGB DDR4-3200 | Corsair AX860i

Cooler Master MasterCase Pro 3 Samsung 950 Pro 256GB | Samsung 850 Evo 1TB | EKWB Custom Loop | Noctua NF-F12(x4)/NF-A14 LTT Special Edition

Dell S2716DGR | Corsair K95 RGB Platinum (Cherry MX Brown) | Logitech G502 Proteus Spectrum | FiiO E17 DAC/Amp | Beyerdynamic DT990 Pro


I was watching the after-party archive today, and a caller asked @LinusTech about i5s compared to 8-core AMD CPUs in future games. Well, I researched the question and, guess what, a wild LinusTechTips forum thread appears. So here's your answer, from the game developers themselves.

I have to say I disagree with @Slick on this: he said he'd still go for fewer cores with higher IPC over more cores with better parallelism (i5 vs. an AMD 8-core), and the game developers clearly side with me on this.

I have a few arguments to present.

 

#1 Since the 8 cores in the new consoles are individually quite weak, developers will be forced to optimize for all 8 cores to get any decent performance out of the CPU. They can't do lazy coding and lean on IPC like they used to; they'll have to maximize usage of every core because they can't afford not to.

#2 The Jaguar cores in the PS4 and Xbox One aren't modular like the Bulldozer/Piledriver architecture, so Linus made a mistake when he said they were.

They are traditional cores: each one contains all the components of a conventional core (integer core, decoder and floating-point unit), and nothing is shared between cores except the cache, which is shared on Intel CPUs as well.

So what does this mean? Does it mean quad-core Intels will simply fail to play games? Absolutely not. It does mean, though, that the 8-core AMD CPUs stand to gain a big performance boost, because right now all of those cores in the 8350 are horribly under-utilized.

And that boost should let them regain ground against the quad-core Intels, perhaps even surpass the hyperthreaded quad cores (the i7s) in game performance; we've already started seeing that in a few games, like the Far Cry 3 benchmark.

All of this is before we even get to the financial argument. An FX 8320 (essentially a lower-clocked 8350) currently costs $144.99 on Amazon.com, which is $80 less than the 4670K, making the Intel chip about 55% more expensive, and that's before we look at motherboard prices, where you'll save even more.

You'd save so much money going with the AMD system that the efficiency argument becomes completely moot, because the time you'd need to run the Intel system to make that money back in power savings is absolutely impractical (we're talking upwards of 15 years if you game four hours a day, every day, at $0.15/kWh).
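
Here's the rough payback arithmetic behind that claim as a minimal sketch; the 40 W load-power difference is an assumption picked for illustration, not a measured figure for either chip.

# Rough payback estimate for the efficiency argument (illustrative assumptions only).
price_gap_usd = 80.0      # 4670K over the FX 8320, per the prices above
watt_delta = 40.0         # assumed extra draw of the AMD system under load (not measured)
hours_per_day = 4.0
price_per_kwh = 0.15

extra_cost_per_year = watt_delta / 1000 * hours_per_day * 365 * price_per_kwh
print(f"Extra electricity per year: ${extra_cost_per_year:.2f}")
print(f"Years to recoup the ${price_gap_usd:.0f} price gap: {price_gap_usd / extra_cost_per_year:.1f}")

With these numbers the payback lands near a decade; assume a smaller wattage gap and it stretches toward the 15-year figure above.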

The other argument is graphics performance. You'll end up a much happier gamer by spending that $80 on the graphics card, because unlike CPUs, where an i3 and an i7 might produce the same performance in a game (Skyrim or BioShock Infinite, for example), a better graphics card will always give you higher frame rates. That $80 takes you from a 7870 to a 7950, or from a 650 Ti Boost to a 760, which yields a far bigger performance gain than spending it to go from a 4670K to a 4770K, or from an FX 8320 to a 4670K.

 

 


This topic is now closed to further replies.
