Cool video from JayzTwoCents on CPUs

pffff

 

The bottom line is that for all-round gaming (and I mean all kinds of games), Intel is the better choice.

 

End of story. :)

 

Why waste so much BS on this?

 

@Faa, the FX4300 will not perform the same as an FX8350 in all games, because the FX4300 only has 2 modules enabled instead of the 4 on the FX8350.

So in CPU-bound games that use up to 4 cores, the FX8350 will be better than the FX4300.

Basically, if you disable 4 cores on an FX8350 and then overclock the CPU, you improve the single-threaded performance a bit.

But the Intel cores will still be faster, of course.


Kill it With Fire!!!   :o

 

They say video games make people violent... No!

A video comparing Intel and AMD does! :lol:

CPU: Intel Core i5-4670K @4.5Ghz | Motherboard: ASRock Z87 Extreme 4 | Cooler : Cryorig R1 Ultimate | Ram: 2x4GB G.Skill Ripjaws | GPU : Gigabyte GeForce GTX 970 G1 Gaming 2-Way SLi | Storage: 120GB Samsung 840 Evo / 4x2TB Caviar Black | PSU: EVGA Supernova 1000W G2 | Windows 7 64-bit  | Case: Enthoo Pro | オール・イズ・バニッティー (all is vanity)


If you work hard, you can make your own Luck.


 


No comment on the video, but these are my views on the future:

 

Can an AMD CPU handle 4K? I doubt it can. It also starts to bottleneck higher-end graphics cards now; unless Jim Keller and the team have pulled off something crazy, AMD FX will probably die hard in the CPU market next year.

 

Looking at the direction Maxwell is heading, AMD will respond with quite a powerful 4K-capable graphics card, with the lower-end cards handling 1080p gaming. Next year it will be fairly common to see an FX8350/FX6300 paired with a GTX x40/x50 or an R7 graphics card.

 

Before you tell me about Pentiums and i3s: from a business standpoint, they are difficult to sell to people who are new to PC gaming. All they know about is the i5 and i7, or maybe AMD with a bit of persuasion.

 

Unless 2K and 4K become the big norm next year, AMD CPUs will begin to lose their place in budget 1080p gaming.


A 10% difference in 4-threaded games is negligible as an argument.


It will be more than 10%, because you basically have more cores to use.

With the FX4300's 2 modules / 4 cores sharing cache and FPU, you can only use 2 cores to their full potential at any time.

With the FX8350 you have 4 modules / 8 cores, so basically 4 cores that can be used to their full potential at any time.

That kinda makes sense ;)
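To make the module argument concrete, here is a toy Python model of it. Everything here is illustrative: the 0.8 penalty for two threads sharing a module is an assumed number, not a measurement of any real FX chip.

```python
# Toy model of Bulldozer module sharing (illustrative only).
# Each module has two integer cores sharing a front end and FPU, so a lone
# thread on a module runs at full speed, while two threads on one module
# each run slower. The 0.8 sharing penalty is an assumption.

def throughput(n_threads: int, n_modules: int, shared_penalty: float = 0.8) -> float:
    """Aggregate throughput when threads are spread across modules round-robin."""
    total = 0.0
    for m in range(n_modules):
        on_module = sum(1 for t in range(n_threads) if t % n_modules == m)
        if on_module == 1:
            total += 1.0                 # module to itself: full speed
        elif on_module >= 2:
            total += 2 * shared_penalty  # two threads sharing the module
    return total

# 4 threads: FX4300 (2 modules) vs FX8350 (4 modules)
print(throughput(4, 2))  # 3.2 -- both modules doubled up
print(throughput(4, 4))  # 4.0 -- one thread per module, ~25% more throughput
```

Under this made-up penalty the 8350's extra modules are worth roughly 25% at 4 threads, which fits the "more than 10%" claim without being anywhere near a doubling.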

 

But yeah, it's a bit off-topic, basically.

 

The bottom line of this topic, and of Jay's video, is that the 4690K is basically better for gaming.

Like @JayzTwoCents also stated.

So in my opinion, there is nothing wrong with his video.

 

In the end, all that matters is which games you mainly play.

And also how you play them: single- or multiplayer.

 

Just my 2 eurocentz :P


Yup, Jay has lost a lot of respect from me. His "advice" seems to change constantly, he doesn't remember what he said in videos that are a year old, he doesn't acknowledge that Intel > AMD for the vast majority of games, and he completely disregards MMOs.

He says in the video to go Intel and that Intel is smashing AMD; how is that not knowing?


You need to rewatch the video. He does say that Intel is better than AMD at almost everything, but he doesn't go into detail. He also backtracks on what he said in older videos. Read what Faa posted earlier in this thread.

 

AMD is not a good option if you play games. At any price point, what Intel has to offer is better, and it has an upgrade path. AMD will flat out not play MMOs well. Do you own an FX processor? Go download ArcheAge; it's F2P, it's an MMO. There is a lot going on in this game, with a lot of people on the screen. My guild even requires you to have an Intel processor, because those with AMD cannot keep up. It's more than MMOs, too: high-end GPUs, multi-GPU setups, etc. He doesn't do a good enough job of going into detail. He is just giving his opinion; nowhere in the video are benchmarks, nowhere in the video are actual prices. Add the cost of the expensive motherboards AMD needs to power those chips, plus cooling, and AMD doesn't really have anything to offer on mATX because of the VRM problems. If you want to play all games without issue, you get Intel. Countless people on this forum started with AMD, thought they were fine, then realized they were bottlenecked, switched to Intel, and are now cursing the people who recommended AMD and misleading websites like CPU Passmark and CPU Boss.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


A 10% difference in 4-threaded games is negligible as an argument.

Damn, I went back & read that post, & I wonder what those guys are saying now, especially with the 8350 holding back a 780 in BF4 while an i7 & i5 don't. People refuse to accept facts.


As he said, it depends on what you do with your PC.

I'll check out ArcheAge. He actually says if you want to game, go Intel unless you're an AMD fan, but not to be scared to use AMD if it is a budget build.

Yes, Intel is better and gives an upgrade path, but why not buy the best anyway rather than waste money?

mATX is a waste of time, yep, and the money side of things varies from country to country; in the UK you can still get an 8350 with a decent 970 board for £60 less (that's a big difference for me), but in America prices do seem stupidly high.

As for extra cooling, that's nonsense I see going around the forum: the stock cooler is fine, and whatever aftermarket cooler works with Intel will work with the 8350.


Coming from someone who has spent the last year mastering and applying parallel and approximation algorithms: you are out of your league. Lighting is easy if you plan your deployment accordingly.

Well, how would you go about doing it then? You say only three calculations are needed, and I'm inclined to believe you, but surely you mean per pixel, per light source, and not including shadows, right? (Don't hate on me, I'm just a med student.)

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.


The problem is, you guys only consider gaming.

And in that case, Intel is the way to go. There are no doubts about that.

 

There is a "but", though.

If you want to stream or do some video editing, then for $200 the 8350 is the best option.

 

And that's pretty much what Jay said.

 

I have tested this myself, and the FX 8350 completely destroyed the i5 4690K in Sony Vegas.

While streaming, the i5 dropped some frames while the 8350 didn't.

 

So if you are a streamer/YouTuber and you are on a tight budget, the 8350 is the best CPU you can get.

 

The problem with this video is that there's pretty much no point in making it, because AMD hasn't released any high-end CPUs lately.

So you are comparing "new" chips with 2-year-old chips.


Couldn't agree with you more, but it's something he does every year; it's more of a tradition than anything.

Well how'd you go about doing it then? You say only three calculations are needed and I'm inclined to believe you, but surely you mean per pixel, per light source and not including shadows, right? (Don't hate on me, I'm just a med student)

Read again: 3 calculations for the X, Y, Z coordinates, 2 to get the distances between all objects and light sources (plus 1 for the angle). This can all be done in one clock cycle through OpenCL in a 1920x1080 scenario on a GTX 570 or higher.

The next step is texturing in OpenGL/DirectX. I didn't touch this part. That takes about 5 cycles due to all the 3D bending and occlusion.

Now the lighting. I used a parallel ray-trace approximation which uses some simple Taylor series sums for brightness (embarrassingly parallel; you can approximate up to 80 terms in 2 cycles and then sum them all), which LukaP should frankly be familiar with given her background in physics. Refraction and dilution are complicated closed-form integrations, but the approximations are just a few exponential functions with positive and negative powers sitting on top of a·N!, where a is a dilution constant based on distance.

Hue adjustment for lighting is a much more complicated process which requires intimate knowledge of how the textures were implemented and how reflective/matte they are, and I didn't touch that part of the code.

Shadows are a joke and just require an outline of any figures and some basic algebra: 2 cycles for coordinates, 1 to soften the edges (2 steps).

Of course you sacrifice a little accuracy (and I didn't bother with bounced light beyond what was already implemented for water), but it's not noticeable. Sometimes the leaves on trees would turn black, but there's always an edge case you have to tweak.
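For what it's worth, the Taylor-series brightness idea can be sketched in a few lines of Python. This is a hedged illustration only: the attenuation function exp(-a·d), the 80-term cutoff, the dilution constant a = 0.5, and the `brightness` name are assumptions pulled from the description above, not anyone's actual engine code.

```python
import math

# Sketch of the Taylor-series brightness approximation described above.
# Attenuation exp(-a*d) is approximated by the first `terms` entries of the
# series for exp(x); each term is independent of the others, which is what
# makes the sum embarrassingly parallel on a GPU.

def brightness(intensity: float, distance: float, a: float = 0.5, terms: int = 80) -> float:
    x = -a * distance
    approx_exp = sum(x**k / math.factorial(k) for k in range(terms))
    return intensity * approx_exp

# With 80 terms this is indistinguishable from math.exp at game-scale distances:
print(brightness(100.0, 4.0))  # ~100 * exp(-2), about 13.53
```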

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I'd be lying if I said I understood any of this. I assume you're doing ray tracing, because you mentioned OpenCL, which I don't think you need for the regular rasterization used in games. I'm only really beginning to learn the basics of OpenGL, though...



As far as I know, DX10 shadows hit performance really hard in all games, friend :rolleyes:


I'd be lying if I said I understood any of this. I assume you're doing ray tracing, because you mentioned OpenCL, which I don't think you need for the regular rasterization used in games. I'm only really beginning to learn the basics of OpenGL, though...

OpenCL is good for math. OpenGL is good for abstraction of implementation for common graphics processes.



As far as I know, DX10 shadows hit performance really hard in all games, friend :rolleyes:

I never said Microsoft was full of great programmers. If you have the coordinates of all physical surfaces and the coordinates of the light sources and their intensity, getting the silhouette is a trivial calculation. Softening it is the intensive part they go way overboard on, but even that can be done with approximations similar to anti-aliasing, just localized.
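The "silhouette is basic algebra" claim can be illustrated with a minimal hard-shadow projection: intersect the ray from the light through a vertex with the ground plane. A toy sketch, where the flat y = 0 ground plane and all names are my own assumptions for illustration:

```python
# Minimal version of the "shadows are basic algebra" claim: the hard-shadow
# point of a vertex P under a point light L is where the ray from L through P
# meets the ground plane y = 0.

def shadow_point(light, vertex):
    lx, ly, lz = light
    px, py, pz = vertex
    # Parametric ray L + t*(P - L); solve for the y component hitting 0.
    t = ly / (ly - py)  # valid while the light is above the vertex
    return (lx + t * (px - lx), 0.0, lz + t * (pz - lz))

# Light at (0, 10, 0), vertex at (2, 5, 0): the shadow lands at (4, 0, 0).
print(shadow_point((0.0, 10.0, 0.0), (2.0, 5.0, 0.0)))
```

Projecting every outline vertex this way gives the silhouette; the expensive part, as the post says, is softening those hard edges.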



Real-time lighting is CPU-bound? Three calculations get the X, Y, Z coordinates of all objects and light sources, 2 calculations get the angle and distances, and 5 cover occlusion. Lighting is easily GPU-offloadable; whether you do rasterization or ray tracing, it can all be offloaded to the GPU.

Again, you are making up your own situations to argue against mine?

In a controlled environment, you can do that.

Also, that sounds nothing like advanced lighting.

And no, three draw calls max to light the whole scene.

So, let me get this right: you are saying that the CPU will only need to issue 3 draw calls to implement advanced lighting?

So, let me get this right: you are saying that the CPU will only need to issue 3 draw calls to implement advanced lighting?

3 draw calls, 4 OpenCL kernels.

Just because you don't know how to fit a lot into a kernel doesn't mean I don't. Now, it's a close-enough approximation of real lighting that only a snob will notice, but that is what it boils down to. If you need more than 3, you're either overdoing it in the case of a game, or you're really trying to simulate real life, which only top-end current hardware can do. I don't do secondary illumination beyond water, for instance; its effect is negligible and not worth it if you want the broad hardware compatibility claimed by BF4. You might even be able to shave off a draw call and fold the coloring and light adjustments into one pass before that, but I'd have to toy with it. Besides, look at the clock counts: take many more cycles and you can't possibly maintain 60 fps anymore.



It was a video about VRAM, but Jay has no understanding of what VRAM is. He was mistaking the frequency for the video RAM amount. A totally idiotic move from him; it made me lose respect. And now this video... saying AMD works just fine for all scenarios. It just doesn't. It bottlenecks high-end GPUs, and it is impossible to play MMOs and a lot of other single-threaded games well on AMD.

I lost respect temporarily when he compared DDR3 memory at a lower frequency to DDR4 RAM at a higher frequency. He said it wasn't a perfect test, and I was sitting there like "no, it's an unbalanced test, which makes the results useless...."


Well, this guy kinda makes sense in some ways, but in the end we should stay on topic.

Intel is good and so is AMD; they both play games. Pretty much every game can be played on either platform.

Will you gain 9999999 fps by going Intel? No.

Will you lose 9999999 fps by going AMD? No.

This is just about gaming for now. AMD is a choice you can make, and so is Intel.

For gaming, is Intel better than AMD? Yes; plenty of evidence is written out there.

For gaming, is AMD better than Intel? No; same facts, etc.

If you're running work-related programs that benefit from more cores, then an 8350 is a great choice if you're on a budget.

Will it be better than an Intel i5? Yes, but not an i7.

Price/performance is good on the AMD side, especially if you use a lot of productivity programs.

Intel is like the default for a lot of people, but AMD processors are something to consider too.

Live your life like a dream.

 

@Joshua Ondangan Thank you. It really annoys me how people on here assume only industry workers can be knowledgeable. Those are the same workers holding us back, after all.



It will be more than 10%, because you basically have more cores to use.

With the FX4300's 2 modules / 4 cores sharing cache and FPU, you can only use 2 cores to their full potential at any time.

With the FX8350 you have 4 modules / 8 cores, so basically 4 cores that can be used to their full potential at any time.

 

2 vs 4 cores should theoretically be a 100% boost; have we seen anything like that yet? I'm aware that if you have two threads, assigning each of them to a different module gives better performance than assigning both to the same module. Aside from single-threaded performance dropping slightly, the 4-threaded performance difference between the 4300 & 8350 was never noticeable. The 8350 is just better in like 5-8 games; that's all.
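The "theoretical 100% boost" intuition is easy to sanity-check with Amdahl's law: doubling usable cores only doubles performance if the workload is fully parallel. A quick sketch, with a made-up 60% parallel fraction standing in for a typical game (not a measurement of any real title):

```python
# Amdahl's law sanity check for the "2 vs 4 cores = 100% boost" intuition.
# The 0.6 parallel fraction is an assumed stand-in for a typical game.

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup on n cores when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

s2 = amdahl_speedup(0.6, 2)  # ~1.43x over one core
s4 = amdahl_speedup(0.6, 4)  # ~1.82x over one core
print(f"going from 2 to 4 usable cores gains {s4 / s2 - 1:.0%}")  # 27%, not 100%
```

With the serial portion dominating, extra modules buy far less than their nominal count suggests, which matches the small 4300-vs-8350 gaps seen in practice.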

 

The bottom line of this topic, and of Jay's video, is that the 4690K is basically better for gaming.

Like @JayzTwoCents also stated.

So in my opinion, there is nothing wrong with his video.

He didn't mention anything about the 4690K; he basically said that at $200 the 8350 is the best choice. It can't be the best choice if half of its cores aren't paying for themselves in your games.

 

 

In the end, all that matters is which games you mainly play.

And also how you play them: single- or multiplayer.

 

Just my 2 eurocentz  :P

If both chips are priced the same, the only valid argument is which CPU performs better; anything else is irrelevant. Why would I pay $200 for the 8350 if I can get an i5 for the same price that performs much better?

 

 

As he said, it depends on what you do with your PC.

I'll check out ArcheAge. He actually says if you want to game, go Intel unless you're an AMD fan, but not to be scared to use AMD if it is a budget build.

Yes, Intel is better and gives an upgrade path, but why not buy the best anyway rather than waste money?

mATX is a waste of time, yep, and the money side of things varies from country to country; in the UK you can still get an 8350 with a decent 970 board for £60 less (that's a big difference for me), but in America prices do seem stupidly high.

As for extra cooling, that's nonsense I see going around the forum: the stock cooler is fine, and whatever aftermarket cooler works with Intel will work with the 8350.

Uhm no, he spoke about rendering performance for like a minute towards the end of the video. Before that he was all talking about gaming performance; he only recommended AMD because they're "cheaper" and said you should put the money into the GPU because that's all that matters. Let's quit the budget bullshit: the dude himself paid like $800 for an 8350 once you include his board and his custom loop, while still getting owned by a $180 i5 on a $40 board at a quarter of the price. Why would you take seriously a guy who paid the price of a 3930K, an X79 board, and a 212 EVO for an 8350? In every new video he says something completely different; he's just an idiot, that's all. Claiming the 8 cores justify the $200 price tag for gaming is plain silly; everything above the 4300 on AMD's side is a money waste for gaming, just like anything above the i5 on Intel's side. Desktops these days are mostly used for gaming; mainstream use is almost dead, and the few people who use them for productivity are mostly running Intel anyway. So why is AMD tricking people with the 8350, pricing it $5 below the i5 to make it sound like a good alternative? That's called money milking; most people don't even know how to choose a CPU. Brilliant move from AMD.

Also, £60 cheaper? Oo

 

 
CPU: Intel Core i5-4430 3.0GHz Quad-Core Processor  (£132.99 @ Ebuyer) 
Motherboard: MSI H81M-P33 Micro ATX LGA1150 Motherboard  (£30.97 @ Scan.co.uk) 
Total: £163.96
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2014-10-28 17:47 GMT+0000

 
CPU: AMD FX-8320 3.5GHz 8-Core Processor  (£104.16 @ Aria PC) 
Motherboard: Gigabyte GA-970A-DS3P ATX AM3+ Motherboard  (£48.42 @ Aria PC) 
Total: £152.58
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2014-10-28 17:47 GMT+0000

Go find any CPU-bound benchmark and put this into price/performance, and you'd see Intel offering at least 50% better price/performance. If you want to OC, go ahead: swap that board for something decent and get yourself a good cooler. A while ago, for $20 more than the 8350, you could get a 4430 & H81 board; that should give you some idea of why AMD's almighty price/performance is an illusion. Give it a few months and you'd be rocking an i5 like many other people trapped in 8350 fairy tales, selling your 8350/board at a huge loss.

Also, some people care about silence, which is something AMD's stock cooler will never be. What's its minimum rpm, like 4K rpm?
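For anyone who wants to redo the price/performance arithmetic themselves, here it is spelled out with the UK platform prices quoted above. The frame rates are placeholders chosen purely to show the math; they are not benchmark results.

```python
# Price/performance arithmetic with the UK platform prices quoted above.
# The fps numbers are PLACEHOLDERS to demonstrate the calculation.

intel_cost = 132.99 + 30.97   # i5-4430 + MSI H81M-P33 (prices from the list above)
amd_cost = 104.16 + 48.42     # FX-8320 + Gigabyte GA-970A-DS3P

intel_fps = 60.0              # hypothetical CPU-bound average
amd_fps = 42.0                # hypothetical CPU-bound average

intel_ppp = intel_fps / intel_cost  # frames per pound
amd_ppp = amd_fps / amd_cost

print(f"Intel: {intel_ppp:.3f} fps/£  AMD: {amd_ppp:.3f} fps/£")
print(f"Intel advantage: {intel_ppp / amd_ppp - 1:.0%}")
```

Swap in real benchmark averages for the two placeholder fps values and the comparison resolves itself either way.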


Right, and the reality is: if you game, you buy Intel. The real waste of money is buying AMD, getting worse performance, and then making the switch to Intel. People never switch from Intel to AMD; I wonder why. You should be scared of AMD if you game. Some games are OK, but in a lot of games you get significantly worse performance across a wide variety of titles, and you also throw any CPU and GPU upgrade path out the window. That is a waste of money.

Good luck overclocking your 8350 on the stock cooler. Not going to happen. At a minimum, you need a Hyper 212 EVO, which is another $25/£25. You also need a high-end motherboard to overclock on AMD, which is not an issue with Intel. Overclocking is a must if you own AMD: to approach stock i5 performance you have to be overclocked to 4.6GHz+, and even then you still aren't getting performance as good as the locked i5 in all games.

 

PCPartPicker part list: http://pcpartpicker.com/p/VCGVFT

Price breakdown by merchant: http://pcpartpicker.com/p/VCGVFT/by_merchant/

CPU: Intel Core i5-4440 3.1GHz Quad-Core Processor  ($184.98 @ SuperBiiz)

Motherboard: ASRock H81 Pro BTC ATX LGA1150 Motherboard  ($41.99 @ Newegg)

Total: $226.97

Prices include shipping, taxes, and discounts when available

Generated by PCPartPicker 2014-10-28 15:00 EDT-0400

 

PCPartPicker part list: http://uk.pcpartpicker.com/p/VCGVFT

Price breakdown by merchant: http://uk.pcpartpicker.com/p/VCGVFT/by_merchant/

CPU: Intel Core i5-4440 3.1GHz Quad-Core Processor  (£129.99 @ Ebuyer)

Motherboard: ASRock H81 Pro BTC ATX LGA1150 Motherboard  (£31.99 @ Amazon UK)

Total: £161.98

Prices include shipping, taxes, and discounts when available

Generated by PCPartPicker 2014-10-28 19:05 GMT+0000

 

Vs.

 

PCPartPicker part list: http://pcpartpicker.com/p/hFdD3C

Price breakdown by merchant: http://pcpartpicker.com/p/hFdD3C/by_merchant/

CPU: AMD FX-8320 3.5GHz 8-Core Processor  ($139.97 @ OutletPC)

CPU Cooler: Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler  ($28.98 @ OutletPC)

Motherboard: Asus M5A99X EVO R2.0 ATX AM3+ Motherboard  ($102.99 @ Newegg)

Total: $271.94

Prices include shipping, taxes, and discounts when available

Generated by PCPartPicker 2014-10-28 15:01 EDT-0400

 

PCPartPicker part list: http://uk.pcpartpicker.com/p/zTLJxr

Price breakdown by merchant: http://uk.pcpartpicker.com/p/zTLJxr/by_merchant/

CPU: AMD FX-8320 3.5GHz 8-Core Processor  (£104.16 @ Aria PC)

CPU Cooler: Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler  (£23.86 @ CCL Computers)

Motherboard: ASRock 990FX Extreme3 ATX AM3+/AM3 Motherboard  (£76.68 @ CCL Computers)

Total: £204.70

Prices include shipping, taxes, and discounts when available

Generated by PCPartPicker 2014-10-28 19:06 GMT+0000

 

 

That is not cheaper. You can even drop the 212 EVO and it's still more expensive to go AMD. You could even drop down to the Gigabyte 970, which is hit and miss, and it's still more expensive. Let's not forget that the Intel is going to provide better performance in all games, no matter how far the FX is overclocked.

 

http://www.hardcoreware.net/intel-core-i3-4340-review/2/

http://www.hardwarepal.com/best-cpu-gaming-9-processors-8-games-tested/4/

http://www.tomshardware.com/reviews/amd-fx-8370e-cpu,3929-7.html

http://www.anandtech.com/show/8427/amd-fx-8370e-cpu-review-vishera-95w/3

 

"To put it nicely, the FX-8370E is a true middle-of-the-road CPU. Using it only makes sense as long as the graphics card you choose comes from a similar performance segment.

Depending on the game in question, AMD’s new processor has the potential to keep you happy around the AMD Radeon R9 270X/285 or Nvidia GeForce GTX 760 or 660 Ti level.

A higher- or even high-end graphics card doesn’t make sense, as pairing it with AMD's FX-8370E simply limits the card's potential."

-Tom's

 

"The FX-8370E stretches its legs a little in terms of minimum frame rates, particularly in SLI, however it is handily beaten by the i3-4330."

-Anandtech



Now, it's a close-enough approximation of real lighting that only a snob will notice, but that is what it boils down to.

Are you saying only a snob can tell the difference between real lighting and dummy lighting (I cannot remember the real term)?

That is the direction things are progressing at this point.

 

If you need more than 3, you're either overdoing it in the case of a game, or you're really trying to simulate real life, which only top-end current hardware can do.

More and more games are trying to become more realistic (in the sense of motion, lighting, AI, and so forth).

You can put in a dummy replacement, but people will notice it immediately.

 

I don't do secondary illumination beyond water, for instance; its effect is negligible and not worth it if you want the broad hardware compatibility claimed by BF4. You might even be able to shave off a draw call and fold the coloring and light adjustments into one pass before that, but I'd have to toy with it. Besides, look at the clock counts: take many more cycles and you can't possibly maintain 60 fps anymore.

Again, as I said: this is a decision game developers have to make.

Note:

Sony and Microsoft are having issues with CPUs that aren't strong enough:

Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited to the remaining 50% of compute for the AI, etc.

http://www.pcper.com/news/General-Tech/Sony-PS4-and-Microsoft-Xbox-One-Already-Hitting-Performance-Wall
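That claim reduces to simple frame-budget arithmetic. A quick sketch: the 50% lighting share comes from the summary above, while the 30 fps target is my own assumption for illustration.

```python
# Frame-budget arithmetic behind the console claim above. The 50% lighting
# share is from the quoted article summary; the 30 fps target is assumed.

target_fps = 30
frame_budget_ms = 1000.0 / target_fps        # total CPU time per frame
lighting_ms = 0.5 * frame_budget_ms          # spent unpacking pre-baked lighting
remaining_ms = frame_budget_ms - lighting_ms # left for AI, physics, game logic

print(f"{frame_budget_ms:.1f} ms per frame, only {remaining_ms:.1f} ms left for gameplay code")
```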
