
FX-8350 or i5-4670K

GPU-bound scenarios show anything but the difference between those CPUs, so I don't see the point of comparing CPUs that way - or are you going to say that benchmarking a 290X vs. a 780 Ti with a Pentium 3 is realistic? Those tests tell you nothing about CPU-bound games, so they're irrelevant. I could do such reviews myself: run Furmark on every CPU and post 50 fps for each of them.

 

We aren't talking about a Pentium 3, this is about an FX-8350 and a 4670K. They're both relatively new chips. That analogy isn't really relevant.

 

 

Google some Crysis 3 benchmarks and you'll see all CPUs performing exactly the same: a completely GPU-bound scenario. They don't cover the CPU bottleneck that kicks in once you use 2 or 3 high-end graphics cards, so it's far from realistic -> youtube.com/watch?v=_hcuYiqib9I With 2 GPUs, both easily sit at 99%; with a third, you can easily see the 3770K @ 4.8GHz bottlenecking in Crysis 3, as his GPU loads sit at 70/70/70%.

 

We are discussing the best chip for a GTX 770 or equivalent GPU.

 

 

There are plenty of games that can't even max out a GTX 650 because of a massive CPU bottleneck. If you went with AMD for such games and put that $100 toward a better GPU, it would be wasted. World of Warcraft is a perfect example: an i5 + GTX 650 will outperform an 8320 @ 5GHz with a 660 Ti or similar by as much as 100%. So no, cheaping out on the CPU to buy a better GPU isn't always the best choice.

 

This is with a GTX 680, which is roughly equivalent to the 770 of our prospective buyer.

 

world-of-warcraft-1920.png

 

 

It really depends on what GPU they use: with a single 770 there would be zero difference, but with three 780 Tis it could turn out completely different, and sadly no reviewer covers that. The GPU the OP would use is somewhere in the range of a 770, so there's practically zero difference between an i5 and an 8350; if he wants more frames he should get a better GPU, or more of them. For BF4 the difference is minimal, but once we move to CPU-bound games the difference is clear.

And power consumption is a non-argument in my opinion. If I considered power consumption a valid factor, it would be because of heat and noise, not the electricity bill that hardly anybody cares about.

I've never heard any technology reviewer state that three 780 Ti's couldn't be bottlenecked by a CPU. The second part of this is essentially what I'm saying. The difference with CPU-bound games is clear, you're right, but taking Planetside 2 as an example, while the CPU might bottleneck it to an extent, I don't really think it would be more than 15% under a 4670K (at which point a better graphics card is still worth it).

 

As for power consumption, some people do take that into consideration (as well as heat and noise). In fact, pretty much everyone that buys an 80+ Gold power supply cares about power consumption to some extent. So I wouldn't say 'nobody' cares about it.


 

We aren't talking about a Pentium 3, this is about an FX-8350 and a 4670K. They're both relatively new chips. That analogy isn't really relevant.

We aren't talking about GPU performance variance either. So we should be CPU-bound when we're reviewing new GPUs? That's some logic >.< Why on earth should people compare products when the limiting factor is somewhere else?

Nobody is interested in Furmark results showing 40 fps for each of them.
 

 

We are discussing the best chip for a GTX 770 or equivalent GPU.

This is with a GTX 680, which is roughly equivalent to the 770 of our prospective buyer.

Those benchmarks mean nothing; they're pretty much GPU-limited. An i7 sitting there with 10% more frames from a 100MHz higher clock, and a 3570 only 0.1 fps ahead of an i5 clocked 300MHz lower, is typical Tom's Hardware nonsense. A 4670K overclocked to 5GHz will never hold more than 40 fps in 25-man raids or 40v40 PvP; to give you an idea, an 8350 @ 5GHz would sit around 15 fps. I'll refer you to mmo-champion.com in case you don't believe me.

 

 

I've never heard any technology reviewer state that three 780 Ti's couldn't be bottlenecked by a CPU. The second part of this is essentially what I'm saying. The difference with CPU-bound games is clear, you're right, but taking Planetside 2 as an example, while the CPU might bottleneck it to an extent, I don't really think it would be more than 15% under a 4670K (at which point a better graphics card is still worth it).

15%? Let's refer to that tech video, which showed a difference of 100%: a 780/i5 sitting at 160 fps and an 8350 completely bottlenecking the 780, hardly pushing it past 50% load, meaning it could barely push a 660 to 99%. Looking at his results, 160 fps is a lot; he was probably sitting somewhere forever alone. I never get more than 60 fps in the middle of combat, so an 8350 would sit around 30 fps. Basically an i5 with a 660 outperforms an 8350 with a 770/780.

Plenty more games show a 100% difference: http://vr-zone.com/articles/amd-fx-8350-vs-intel-core-i7-3770k-4-8ghz-multi-gpu-gaming-performance/17494.html?ak_action=reject_mobile

In those GPU-bound benchmarks you simply can't see the CPU overload of multiplayer scenarios, which makes them completely pointless, let alone something to draw conclusions from. Vishera's IPC is barely different from Nehalem's, so saying a Nehalem is better for its price gets you nowhere if it's only $20 cheaper. And if you're only playing singleplayer games that are mainly GPU-bound, there's no point in getting an i5/8350 at all if a 4300/i3/APU can perform the same.
 

 

As for power consumption, some people do take that into consideration (as well as heat and noise). In fact, pretty much everyone that buys an 80+ Gold power supply cares about power consumption to some extent. So I wouldn't say 'nobody' cares about it.

You can't get a high-quality PSU (which you should always be buying) without an 80 Plus certification these days, and the only reason to buy a Platinum PSU over a plain 80 Plus one is to have it run quieter and cooler.


Keep in mind that almost all games right now don't use more than 2 CPU cores. Since Intel has 4 really strong cores, it's the better choice for gaming over AMD's 8 "okay" cores. However, if you're planning on doing more than gaming (Photoshop, editing/rendering videos), I'd strongly recommend the AMD, because programs like that can really put those 8 cores to use. Hope this helps :)


Those benchmarks mean nothing; they're pretty much GPU-limited. An i7 sitting there with 10% more frames from a 100MHz higher clock, and a 3570 only 0.1 fps ahead of an i5 clocked 300MHz lower, is typical Tom's Hardware nonsense.

 

I think you're missing my point.

An 8350 paired with a GTX 770 is going to be GPU limited.

A 4670K paired with a GTX 770 is going to be GPU limited.

 

Because this gentleman wants to pair one of the above chips with a GTX 770, he is going to be GPU-limited in the games he plays. The benchmarks don't mean "nothing"; they apply exactly to this person's situation.

 

Again, I'm NOT talking about whether an FX-8350 will heavily bottleneck 3 GTX 780s, I'm talking about whether it will bottleneck a single GTX 770 to a large extent. To which the answer is no.


I think you're missing my point.

I'm not missing anything, honestly.

It seems like you can't follow this discussion anymore. You quoted me with a WoW benchmark, and I replied that that benchmark means nothing. You're never GPU-limited in WoW, far from it.

I didn't even talk about SLI in my previous post.

 

 

I'm talking about whether it will bottleneck a single GTX 770 to a large extent. To which the answer is no.

 

That doesn't really make sense. In CPU-limited games, the frames you get from a 770 on an 8350 are easily achievable with a cheaper GPU. In many games even an i5 will bottleneck a 770, and the performance will again match a lower-end card. To make it even simpler, take WoW again: a GTX 650 at 99% load in 25-man raids would give you around 200 fps and a 770 around 400 fps, but you'll never see frames go above 40 fps in that game, so the 650 would sit around 20% load and the 770 around 10%. When the CPU is the bottleneck, anything beyond a certain card isn't going to make a difference. You're not going to upgrade your 650 to a 780 if your current CPU can't even push the 650 to 50% load.

Whether an i5/8350 is going to be a bottleneck isn't a yes-or-no question. You will always have bottlenecks, even with the latest CPUs.
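The load argument above can be sketched in a couple of lines. This is a rough model with illustrative numbers, not a benchmark: it assumes frame rate scales linearly with GPU load, which is a simplification.

```python
def effective_gpu_load(cpu_cap_fps, gpu_max_fps):
    """Rough GPU utilization when the CPU caps the frame rate.

    Assumes fps scales linearly with GPU load, a simplification;
    real utilization depends on the engine.
    """
    return min(1.0, cpu_cap_fps / gpu_max_fps)

# Illustrative numbers: a CPU cap of ~40 fps in 25-man raids, a GTX 650
# capable of ~200 fps and a GTX 770 capable of ~400 fps if unbounded.
print(effective_gpu_load(40, 200))  # 0.2 -> the 650 idles at ~20% load
print(effective_gpu_load(40, 400))  # 0.1 -> the 770 idles at ~10% load
```

Under this model, swapping in any faster card only lowers its utilization; the cap stays at 40 fps either way.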


I'm not missing anything, honestly.

It seems like you can't follow this discussion anymore. You quoted me with a WoW benchmark, and I replied that that benchmark means nothing. You're never GPU-limited in WoW, far from it.

I didn't even talk about SLI in my previous post.

 

 

 

That doesn't really make sense. In CPU-limited games, the frames you get from a 770 on an 8350 are easily achievable with a cheaper GPU. In many games even an i5 will bottleneck a 770, and the performance will again match a lower-end card. To make it even simpler, take WoW again: a GTX 650 at 99% load in 25-man raids would give you around 200 fps and a 770 around 400 fps, but you'll never see frames go above 40 fps in that game, so the 650 would sit around 20% load and the 770 around 10%. When the CPU is the bottleneck, anything beyond a certain card isn't going to make a difference. You're not going to upgrade your 650 to a 780 if your current CPU can't even push the 650 to 50% load.

Whether an i5/8350 is going to be a bottleneck isn't a yes-or-no question. You will always have bottlenecks, even with the latest CPUs.

 

You are correct, but the games he wants to play aren't more CPU than GPU limited (except for example Planetside 2). What I'm trying to say is that a superior graphics card (by ~$100) will be better for the majority of current-gen and future games, as well as the majority of games he wants to play, than a higher end CPU.

 

Like I said, I have a 3960X, and much more often I find that my GTX 680 struggles and limits my 3960X. The GPU is the bottleneck to such an extent that in the majority of games, my frame-rate advantage over an FX-8350 paired with a GTX 680 isn't that big of a deal (obviously I get higher frame rates, but the reality is that it isn't by much, and it's certainly not worth the difference in cost - or even a $100 difference).

 

This is what I have experienced firsthand. I don't play World of Warcraft, and I know that is more CPU than GPU intensive, but you will find the majority of games (especially current generation games) are more GPU intensive and for this reason I would opt for a better graphics card over a better CPU.


Why do people keep making these threads :c

 

They pop up all the time and cause lots of arguments and flame wars :/

 

But to actually answer your question: if you're just gaming, the 4670K will trash the FX.

 

But make sure you pair it with a good GPU - I see lots of people buying i5 machines with 650s when i3 machines with 760s would trash those as well :D

The i5 will not trash the 8350. It will pull ahead a little bit in most games, but it WILL NOT trash it by any means.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


I would go with an 8320 and then overclock it (if you're on a budget) or the i5 if you don't really have a budget and you want to spend more all around on the CPU and mobo.



Sorry to hijack - would an FX-8320 be good for video editing, or an i7-2600K?

amd fx 6300  @4.4ghz @1.4/ga-970a-ud3/HD78702gb /antec 620w psu


Sorry to hijack - would an FX-8320 be good for video editing, or an i7-2600K?

Between those two, I would pick the 8320, and overclock it.



Hey guys... I think I'll go with the 8320, but which CPU cooler should I get? (air cooler)


Hey guys... I think I'll go with the 8320, but which CPU cooler should I get? (air cooler)

If you want something cheap, get the Cooler Master Hyper 212 EVO. If you want something more expensive and better, get the be quiet! Dark Rock Pro 2 (I think that's the name of it). I'm not sure if the Dark Rock supports AM3+ ATM, so give me a minute to look it up.

 

EDIT: Yep, it does.



Just go for the i5. It's a CPU that's good at everything: old games, new games, rendering, streaming; it does it all. Some benchmarks will show other CPUs ahead at specific tasks, but the i5 is such a good all-rounder that I don't see the point in going with anything else unless you have a specific requirement.

In most games it will be your GPU that's ultimately the bottleneck, though.

 

 

The i5 will not trash the 8350. It will pull ahead a little bit in most games, but it WILL NOT trash it by any means.

 

In pretty much all games this is true, with the exception of World of Warcraft, a game that I play, where an i5 utterly destroys AMD CPUs because that game engine is so damn old. But that's a very specific case.


Just go for the i5. It's a CPU that's good at everything: old games, new games, rendering, streaming; it does it all. Some benchmarks will show other CPUs ahead at specific tasks, but the i5 is such a good all-rounder that I don't see the point in going with anything else unless you have a specific requirement.

In most games it will be your GPU that's ultimately the bottleneck, though.

 

 

 

In pretty much all games this is true, with the exception of World of Warcraft, a game that I play, where an i5 utterly destroys AMD CPUs because that game engine is so damn old. But that's a very specific case.

^^^ I agree with this.



You are correct, but the games he wants to play aren't more CPU than GPU limited (except for example Planetside 2). What I'm trying to say is that a superior graphics card (by ~$100) will be better for the majority of current-gen and future games, as well as the majority of games he wants to play, than a higher end CPU.

 

Like I said, I have a 3960X, and much more often I find that my GTX 680 struggles and limits my 3960X. The GPU is the bottleneck to such an extent that in the majority of games, my frame-rate advantage over an FX-8350 paired with a GTX 680 isn't that big of a deal (obviously I get higher frame rates, but the reality is that it isn't by much, and it's certainly not worth the difference in cost - or even a $100 difference).

 

This is what I have experienced firsthand. I don't play World of Warcraft, and I know that is more CPU than GPU intensive, but you will find the majority of games (especially current generation games) are more GPU intensive and for this reason I would opt for a better graphics card over a better CPU.

And if he ever comes across a CPU-limited game, I can't see the point of cheaping out. You don't need an 8350/8320 or a K version; a plain i5 with a $40 board will outperform an 8350 @ 5GHz any day. He only listed a few games that he plays, so generalizing, an i5 will definitely be the better choice. An i5 + board should cost around $220, so there's really no arguing about which chip is the best for its price.

In the $150-300 range, Intel is simply ahead.

He plays World of Tanks, which is a CPU-limited game as well (http://gamegpu.ru/images/stories/Test_GPU/MMO/World%20of%20Tanks%200.8.2.0/foto/test/World%20of%20Tanks%20processors.png). War Thunder is IPC-bound too, but even any AMD from 5 years ago will do 120 fps. An i5 would be a poor choice for AC4 (http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_4_Black_Flag-test-ac4_proz.jpg) unless you're using 780 Tis, so a 4300/i3 would be "a smarter choice"; and in BF4 an i3/4300 isn't going to bottleneck a GPU like a 770. Once the second-generation 20nm cards arrive, with a $200 midrange card performing like two 780 Tis, an i5 will again be much faster in BF4 than an 8350, and that will apply to many games.

 

 

The i5 will not trash the 8350. It will pull ahead a little bit in most games, but it WILL NOT trash it by any means.

A GTX 780 Ti won't trash an 8200GT at all when paired with a Pentium 1, and the 8200GT is better for the money. You're not comparing CPUs at all when the GPU is the bottleneck:

56770.png

So basically a $1000 CPU like a 4960X doesn't trash a 5600K APU?

Lots of benchmarks you see on the net are GPU-bound. Even when reviewers make an attempt, like this example, you clearly see the AMD chips bottlenecking, whereas Intel pushed both 780 Tis to 99%, meaning Intel has tons more headroom that could make a difference of up to 100%. If they added two more 780 Tis and quad SLI scaled fairly linearly, say 200% over 2-way, bingo: Intel is again twice as fast.

 

 

I'd go with the 8350 if on a budget

Which is massively overpriced for its performance at the moment. An i5's price/performance ratio can be up to twice as good. The 8350/8320 isn't a budget chip; its price-to-core-count ratio is the best, but that's all :P AMD's only competitive CPUs right now are the APUs, or the 6300 against the i3.
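A "twice the price/performance" claim is easy to sanity-check with a quick calculation. The fps and price figures below are hypothetical placeholders, not benchmark results:

```python
def price_performance(fps, price_usd):
    """Frames per second per dollar; higher is better."""
    return fps / price_usd

# Hypothetical CPU-bound scenario: an i5 at 80 fps for $220
# vs an FX-8350 at 45 fps for $180.
i5 = price_performance(80, 220)
fx = price_performance(45, 180)
print(round(i5 / fx, 2))  # 1.45 -> the i5 leads, though not by 2x here
```

Whether the ratio actually reaches 2x depends entirely on how CPU-bound the game is.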


 

And if he ever comes across a CPU-limited game, I can't see the point of cheaping out. You don't need an 8350/8320 or a K version; a plain i5 with a $40 board will outperform an 8350 @ 5GHz any day. He only listed a few games that he plays, so generalizing, an i5 will definitely be the better choice. An i5 + board should cost around $220, so there's really no arguing about which chip is the best for its price.

 

I agree with you. Like I said in a previous post:

 

If I had to recommend an i5, I would say a 4440 - but then you're losing out on the potential to overclock your CPU (which is something I personally value highly). Whether or not the OP cares about overclocking is another story.

 

Comparing only those two processors, though, I still think the 8350 would be a better choice than the 4670K. That said, I also think a 4440 would be a better choice (if you don't want to overclock much). An i5-4670K + board is going to run him €320+, which is why I don't think it's a particularly value-oriented option. I do believe the i5-4440 is a good option to save 45 euros, as mentioned above.


Boys, I think you're all going a bit too far off-topic.

Who cares about 3-way GTX 780 Ti SLI anyway?...

Like I said, most games perform really well on both CPUs, and I'm not going to make any statement about which is better. The fact is, the differences between the two CPUs in gaming are so small that it doesn't matter much which way you go.

It only matters if you play older games with badly optimized engines that can't utilize many threads; ARMA 3 and DayZ are some titles with this issue. But in most games the difference is just minimal.

Sometimes I see biased Battlefield 4 benchmarks of an FX-8350 vs. an i5-4670K in which the i5 is around 30 fps ahead with the same GPU... that's just nonsense.


Go to this link:

 

http://cpuboss.com/cpus/Intel-Core-i5-4670K-vs-AMD-FX-8350

 

Their details are quite accurate.

WOW! AMD CPUs are sooo power hungry compared to Intel ones.

ON A 7 MONTH BREAK FROM THESE LTT FORUMS. WILL BE BACK ON NOVEMBER 5th.


Advisor in the 'Displays' Sub-forum | Sony Vegas Pro Enthusiast & Advisor


  Tech Tips Christian Fellowship Founder & Coordinator 


WOW! AMD CPUs are sooo power hungry compared to Intel ones.

 

Really? My FX-8350 idles at around 42.5W.

Thermal Design Power (TDP) doesn't really tell you how much power your CPU actually draws.

TDP means your CPU cooler should be able to dissipate that amount of heat (thermal power, in watts) from the CPU.

I read the same funny claim all over the internet from people who want to buy an FX-9590 (220W TDP) but have a 140W-TDP 990FX motherboard, and who say those aren't compatible. That's the biggest nonsense I've ever read: those 220W-TDP CPUs can be used on 140W-TDP 990FX boards without any problem. The only thing you need to be aware of is that your cooling solution must be capable of dissipating that 220W of heat from the CPU; otherwise everything will overheat.
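That rule of thumb (size the cooler against the CPU's TDP, not its idle draw) can be sketched in a couple of lines. The 10% safety margin below is my own assumption, not a spec:

```python
def cooler_is_adequate(cpu_tdp_watts, cooler_rating_watts, margin=1.1):
    """TDP is the heat the cooler must shed under load, not the CPU's
    idle draw; require the cooler rating to cover TDP with some margin.

    The 10% default margin is an assumption, not a manufacturer spec.
    """
    return cooler_rating_watts >= cpu_tdp_watts * margin

print(cooler_is_adequate(220, 250))  # True: enough headroom for an FX-9590
print(cooler_is_adequate(220, 140))  # False: a 140 W cooler would overheat
```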

 

Grtz Sintezza :)


Really? My FX-8350 idles at around 42.5W.

Thermal Design Power (TDP) doesn't really tell you how much power your CPU actually draws.

TDP means your CPU cooler should be able to dissipate that amount of heat (thermal power, in watts) from the CPU.

I read the same funny claim all over the internet from people who want to buy an FX-9590 (220W TDP) but have a 140W-TDP 990FX motherboard, and who say those aren't compatible. That's the biggest nonsense I've ever read: those 220W-TDP CPUs can be used on 140W-TDP 990FX boards without any problem. The only thing you need to be aware of is that your cooling solution must be capable of dissipating that 220W of heat from the CPU; otherwise everything will overheat.

Grtz Sintezza :)

I was just going off things I've read, seen and also that CPU Boss website. Not trying to offend.



I was just going off things I've read, seen and also that CPU Boss website. Not trying to offend.

 

Well, don't worry, I didn't take it as an offense, haha.

Maybe I should have written it a bit differently, but English is not my native language; sorry for that :)

It's always good to mention these things. AMD FX CPUs do indeed use much more power than Haswell CPUs; you are completely right on that point :D

But a lot of people look at the TDP numbers, and that's something different.


People need to stop saying the FX-8350 is better for the price. It's not; it's just cheaper. The i5 is simply the better CPU, there's no arguing that. Its lower power consumption will end up saving you money in the long run, and its single-core and multi-core performance is just flat-out better.

Don't get me wrong: if you only have $180 to spend on a CPU, the 8350 is going to perform well, but I'd say the i5 is very much worth the $30 premium. Intel also has a more advanced architecture and chipsets that will stay relevant longer.
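Whether the power savings actually cover a $30 premium depends on usage and electricity prices. A quick estimate, where the ~60 W load-power gap, 4 hours of gaming a day, and $0.12/kWh are all assumptions rather than measurements:

```python
def yearly_cost_usd(extra_watts, hours_per_day, usd_per_kwh):
    """Estimated yearly electricity cost of an extra power draw."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# Assumed: the FX-8350 draws ~60 W more under load than an i5-4670K,
# gaming 4 hours/day at $0.12 per kWh.
print(round(yearly_cost_usd(60, 4, 0.12), 2))  # 10.51 per year
```

Under those assumptions the gap roughly pays for the premium within three years; heavier use or pricier power shifts it further in Intel's favor.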

I'm that guy with the GPD Win.


 

And if he ever comes across a CPU-limited game, I can't see the point of cheaping out. You don't need an 8350/8320 or a K version; a plain i5 with a $40 board will outperform an 8350 @ 5GHz any day. He only listed a few games that he plays, so generalizing, an i5 will definitely be the better choice. An i5 + board should cost around $220, so there's really no arguing about which chip is the best for its price.

In the $150-300 range, Intel is simply ahead.

He plays World of Tanks, which is a CPU-limited game as well (http://gamegpu.ru/images/stories/Test_GPU/MMO/World%20of%20Tanks%200.8.2.0/foto/test/World%20of%20Tanks%20processors.png). War Thunder is IPC-bound too, but even any AMD from 5 years ago will do 120 fps. An i5 would be a poor choice for AC4 (http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_4_Black_Flag-test-ac4_proz.jpg) unless you're using 780 Tis, so a 4300/i3 would be "a smarter choice"; and in BF4 an i3/4300 isn't going to bottleneck a GPU like a 770. Once the second-generation 20nm cards arrive, with a $200 midrange card performing like two 780 Tis, an i5 will again be much faster in BF4 than an 8350, and that will apply to many games.

A GTX 780 Ti won't trash an 8200GT at all when paired with a Pentium 1, and the 8200GT is better for the money. You're not comparing CPUs at all when the GPU is the bottleneck:

So basically a $1000 CPU like a 4960X doesn't trash a 5600K APU?

Lots of benchmarks you see on the net are GPU-bound. Even when reviewers make an attempt, like this example, you clearly see the AMD chips bottlenecking, whereas Intel pushed both 780 Tis to 99%, meaning Intel has tons more headroom that could make a difference of up to 100%. If they added two more 780 Tis and quad SLI scaled fairly linearly, say 200% over 2-way, bingo: Intel is again twice as fast.

Which is massively overpriced for its performance at the moment. An i5's price/performance ratio can be up to twice as good. The 8350/8320 isn't a budget chip; its price-to-core-count ratio is the best, but that's all :P AMD's only competitive CPUs right now are the APUs, or the 6300 against the i3.

 

... This didn't seem to have anything to do with my post by any means, unless I missed something.


