
Why does EVERY AMD thread become a bluegreen vs red war?

Sonefiler
Solved by XTankSlayerX,

Cause humans are assholes.

Saying things like that is going to start a flame war. Too much generalization.

 

It's not whether it's true or not, it's the arguments used. And it's harder to defend an inferior product with good arguments, so they tend to take a more subjective/biased approach. I mean, it's like trying to say Streets by 50 sound better than the HD 650s.

You're going to have to lie...


You say "except for pricing", and that's very important; if you look at pure price/performance ratio, they do really well!

 

Though I'm a bluegreen fanboy, I don't hate AMD; I just don't use it myself.

 

Well, the argument being made is that AMD underperforms, regardless of the $50 premium Intel has over them. Everyone knows AMD has excellent price-to-performance compared to Intel, but it stops at a certain point in the performance department.

 

The issue is there are people who own AMD chips thinking their 6-core and 8-core CPUs are somehow better than Intel's 4-core CPUs, when they're not. I'd like them to be, but they're not, lol.


Because arguments, constructive criticism and facts get drowned out by cognitive dissonance, expectation/confirmation bias and flat-out denial of said facts.

 

Especially by people arguing in favor of AMD. You can give me flak all you want; I've never gotten to the point of tearing my hair out arguing with someone favoring Intel or Nvidia.

Cherry-picking benchmarks, misrepresenting the arguments made against them, not understanding APIs and the CPU/GPU relationship, and constantly repeating debunked arguments. It's truly obnoxious.

 

For example, Mantle does not favor AMD. It actually favors Intel, because those CPUs have way more headroom in any given situation. But I'm pretty sure I can find people on this forum who would deny something so easily demonstrated.

And take into account that AMD hasn't released a truly original or interesting piece of hardware in the last few years, and you get low expectations for new ones.

 

Honestly, I don't know what is wrong with some people. Why do you have a vested interest or expectation bias towards something proven to be inferior? It's not a part of you; they don't owe you anything.

Just stop trying to convince yourself and others towards a certain confirmation bias and ask different questions.

 

Is it smart to argue for 10 pages in a thread to prove that an FX-8350 is "sort of" just as good as an i5-4460? Wouldn't the question "Which product would be the smarter buy?" be better? I mean, any sane/wise person would just get the product that uses less power and does the same on half the cores, on a platform from 2014 rather than 2011. But no, bring out the 1440p benchmarks showing a 5% delta. It's like some sort of digital amnesia.

 

I'm not partial to either brand, but some of your information is incorrect. 

 

Mantle does not "favor" anything. It's a new API with the purpose of reducing CPU overhead in gaming. So you generally see less of an improvement in cases where the CPU already has plenty of headroom, and more of an improvement where the CPU has little headroom. So in situations where you have lower-end hardware in general, or a lower-end CPU paired with a higher-end GPU, Mantle should provide a more noticeable increase in performance. It's situation-dependent, not this brand of CPU vs. that brand of CPU. Also, for the moment anyway, Mantle only runs on AMD GPUs, so any gains are limited to AMD cards.
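To make the "it depends on where the bottleneck is" point concrete, here's a toy model with made-up per-draw-call and GPU costs (a rough sketch, not measurements from any real game or card): the frame rate is set by whichever of the CPU or GPU takes longer per frame, so cutting the CPU's submission cost only shows up on screen once the CPU is the slower side.

def fps(cpu_ms, gpu_ms):
    # Frame rate is set by whichever stage is slower (deliberately crude model).
    return 1000.0 / max(cpu_ms, gpu_ms)

draw_calls = 4000          # made-up scene complexity
gpu_ms = 12.0              # made-up GPU cost per frame at some resolution

for api, us_per_call in [("high-overhead API", 4.0), ("low-overhead API", 1.0)]:
    cpu_ms = draw_calls * us_per_call / 1000.0   # CPU cost of submitting the frame
    print(f"{api}: CPU {cpu_ms:.0f} ms, GPU {gpu_ms:.0f} ms -> {fps(cpu_ms, gpu_ms):.0f} fps")

# With these numbers the slow API is CPU-bound (16 ms -> ~63 fps) while the fast API
# hands the limit back to the GPU (12 ms -> ~83 fps). Pair the same CPU with a weaker
# GPU (say 25 ms per frame) and both APIs land at 40 fps - no visible gain at all.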

 

You make several very generalized statements, such as: "Why do you have a vested interest or expectation bias towards something proven to be inferior?" I'm assuming you're claiming AMD is inferior? Here's the problem with such statements: you haven't specified which two products you're comparing or for what usage. Gaming, rendering, streaming? If you don't think all that matters, then I'm sorry, but you're part of the problem when you make such blanket statements.

 

Again, with your comparison of the FX-8350 and i5-4460, you don't provide any context. You just basically outright claim the i5 is "better" and the "smarter buy" simply because it uses less power. I have news for you: there are some use cases where the FX-8350 is the superior part, and no, the i5 doesn't do everything the 8350 can do with half the cores. The 8350 is an excellent multitasking CPU when it comes to running demanding games while streaming, among other things. If someone is looking for a CPU with that type of usage in mind, but doesn't have the money for an i5 ($50 more), then for them, the 8350 may very well be the "smarter buy".

 

The reason you tend to see more people arguing in defense of AMD is that there tend to be more "fanboy"-like people on the Intel side who automatically think Intel is always better in every situation, when that isn't the case. They also often do not properly compare on a per-case basis.

 

There is no "one" brand that is best for every single application, and that is a fact. Two main things need to be considered when choosing parts for a PC build: budget and usage. That should be what dictates which parts are compared and from which brand.

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


 

I'm not partial to either brand, but some of your information is incorrect. 

 

Mantle does not "favor" anything. It's a new API with the purpose of reducing CPU overhead in gaming. So you generally see less of an improvement in cases where the CPU already has plenty of headroom, and more of an improvement where the CPU has little headroom. So in situations where you have lower-end hardware in general, or a lower-end CPU paired with a higher-end GPU, Mantle should provide a more noticeable increase in performance. It's situation-dependent, not this brand of CPU vs. that brand of CPU. Also, for the moment anyway, Mantle only runs on AMD GPUs, so any gains are limited to AMD cards.

 

You make several very generalized statements, such as: "Why do you have a vested interest or expectation bias towards something proven to be inferior?" I'm assuming you're claiming AMD is inferior? Here's the problem with such statements: you haven't specified which two products you're comparing or for what usage. Gaming, rendering, streaming? If you don't think all that matters, then I'm sorry, but you're part of the problem when you make such blanket statements.

 

Again, with your comparison of the FX-8350 and i5-4460, you don't provide any context. You just basically outright claim the i5 is "better" and the "smarter buy" simply because it uses less power. I have news for you: there are some use cases where the FX-8350 is the superior part, and no, the i5 doesn't do everything the 8350 can do with half the cores. The 8350 is an excellent multitasking CPU when it comes to running demanding games while streaming, among other things. If someone is looking for a CPU with that type of usage in mind, but doesn't have the money for an i5 ($50 more), then for them, the 8350 may very well be the "smarter buy".

 

The reason you tend to see more people arguing in defense of AMD is that there tend to be more "fanboy"-like people on the Intel side who automatically think Intel is always better in every situation, when that isn't the case. They also often do not properly compare on a per-case basis.

 

There is no "one" brand that is best for every single application, and that is a fact. Two main things need to be considered when choosing parts for a PC build: budget and usage. That should be what dictates which parts are compared and from which brand.

 

Mantle does favor a specific target. You're reading the results incorrectly. You think that, because Mantle closes the gap between high-end and low-end CPUs right now, it is therefore made to cater to lower-end hardware. In fact it only shifts the situation from CPU-bound back to GPU-bound, meaning the GPU becomes the limiting factor (draw calls) again. It does not mean that the actual scaling between the APIs favors the lower end once you take the GPU out of the equation.

 

http://www.sweclockers.com/image/diagram/5911?k=089f0347d8304f8135436193fe713201

http://www.sweclockers.com/image/diagram/5913?k=7bfa4ddd7cae3a703c638eac87b23c9e

 

So while it may not favor the "Intel" brand, it does favor IPC over cores (the 4770K is not significantly higher than the 4670K). That puts it in Intel's favor right now, since they have the highest IPC.

 

And I specifically did not mention AMD when I talked about something being inferior; I'm not falling for your loaded question. I meant that, in context, when a product can be proven to be inferior, dissonance is still created in some people, who start to defend that very product and thereby lose all common sense and the overarching picture.

 

However, this:

 

 

 

The 8350 is an excellent multitasking CPU when it comes to running demanding games while streaming, among other things. If someone is looking for a CPU with that type of usage in mind, but doesn't have the money for an i5 ($50 more), then for them, the 8350 may very well be the "smarter buy".

 

Is exactly what I meant when I said:

 

 

 and constantly repeating debunked arguments.

 

AMD is not cheaper, and it does not unconditionally provide better multitasking or streaming performance. In fact, the more multitasking you ask of it, the more the strain on its shitty resource sharing shows (you get low minimum framerates). Besides, streaming is better done on either the iGPU or your GPU's hardware encoder. Seriously, who cares, when it gets re-encoded by Steam/YouTube or Twitch anyway. And if it's for professional use, just use a capture card in the first place.
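(For what it's worth, here's a minimal sketch of what I mean by pushing the encode onto the GPU instead of the CPU cores - assuming a Windows box, an ffmpeg build with NVENC support, and placeholder bitrate/ingest values; swap h264_nvenc for h264_qsv if you'd rather use the Intel iGPU's QuickSync encoder. Video only, no audio input, and not a tuned config.)

import subprocess

# Capture the desktop and let the GPU's hardware encoder (NVENC) do the H.264 work,
# leaving the CPU cores free for the game. Stream key and bitrate are placeholders.
cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "60", "-i", "desktop",          # Windows screen capture
    "-c:v", "h264_nvenc", "-b:v", "3500k", "-pix_fmt", "yuv420p",  # encode on the GPU
    "-f", "flv", "rtmp://live.twitch.tv/app/YOUR_STREAM_KEY",      # placeholder ingest URL/key
]
subprocess.run(cmd, check=True)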

 

 

 

The reason you tend to see more people arguing in defense of AMD is that there tend to be more "fanboy"-like people on the Intel side who automatically think Intel is always better in every situation, when that isn't the case. They also often do not properly compare on a per-case basis.

 

Only that last bit I agree with. Things should be judged in context. But the rest isn't true; the AMD defense has much more to do with the new-age narcissistic SJW mentality than with actual facts. It has more to do with the recycling of false arguments and ancient knowledge that is no longer representative ("A or B is more expensive/cheaper", when that clearly isn't the case anymore).

 

And I wasn't arguing that there is such a thing as one true brand. I do think AMD gets overrated too often; it requires much more scrutiny than it's receiving. But it's also sometimes underrated when it deserves more. For example, the 860K isn't half bad.


AMD = price-to-performance king (long run)! IMO I would rather sell 20 $5,000 units than 5 $20,000 units. *suggesting more people buy AMD vs. the competition*

CPU: Intel i7 4790k @ 4.6GHz 1.255v | GPU: Gigabyte G1 Gaming GTX 980 Ti | Display: Acer XB270HU bprz | RAM: 16GB (4x4GB) G.Skill Ripjaws X 1866MHz | CPU Cooler: H80i | Motherboard: MSI Z97 Gaming 5 | SSD: Mushkin 120GB + SanDisk 480GB | HDD: WD Blue 1TB | Case: Enthoo Pro | PSU: Seasonic M12II EVO 850w | OS: Windows 10 64-Bit | Mouse: Logitech RGB G502 | Keyboard: Thermaltake Poseidon Z (Brown Switches) |


Humans are assholes. They're slow, loud, and have low energy efficiency. That's why I'm an android fanboy. So much faster. #androidmasterrace

 



AMD sucks (a lot of power). Intel sucks (a lot of money out of your wallet). Nvidia sucks (a lot of money out of your wallet).

You can see all of my reviews posted in the Member Reviews Section! 


Humans are assholes. They're slow, loud, and have low energy efficiency. That's why I'm an android fanboy. So much faster. #androidmasterrace

 

 

 

Really? I think you'll find humans are among the most efficient things at converting one form of energy into another. A human can convert the energy in one can of Coke into enough kinetic energy to move an 80 kg mass about 7 km. That's why people have so much trouble losing weight: you'd have to run for 35 min at 14 km/h to burn off a can of Coke.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


I personally find fanboy wars to be a viewing pleasure. Goldfish in a bowl will swim in circles, and fanboys in an argument will argue in circles indefinitely. No one is actually trying to convince anyone of anything; rather, it's like watching a couple of idiots trying to climb onto each other's backs in an attempt to stay off the ground.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


I think it's because every person who buys a product wants to think they got the superior product. Most people don't seem to be level-headed enough to realize that (especially with graphics cards) both Nvidia and AMD trade blows, each with their own strengths and weaknesses.

 

Yes, the problem is that some people don't think before buying the product but after, as you said. I know so many people who went and bought the FX-9350 just because the box says 8 cores at 4.7GHz and it's priced similarly to an i5-4690. They get it home, and after a couple of days they realize that it doesn't perform as they expected, without considering that they only paid a fraction of what a comparable Intel processor would cost. Then someone says they should have bought Intel, and they simply can't admit that they were wrong or that their research was poor, and they just start slamming the numbers. The same also goes for Intel and their on-board graphics, which are just miserably slow compared to AMD APUs.

 

PS: I like both Intel and AMD, but it is just plain obvious that AMD is not competitive in the enthusiast CPU market segment. APUs are a completely different story.

CPU: AMD Ryzen 9 - 3900x @ 4.4GHz with a Custom Loop | MBO: ASUS Crosshair VI Extreme | RAM: 4x4GB Apacer 2666MHz overclocked to 3933MHz with OCZ Reaper HPC Heatsinks | GPU: PowerColor Red Devil 6900XT | SSDs: Intel 660P 512GB SSD and Intel 660P 1TB SSD | HDD: 2x WD Black 6TB and Seagate Backup Plus 8TB External Drive | PSU: Corsair RM1000i | Case: Cooler Master C700P Black Edition | Build Log: here


Mantle does favor a specific target. You're reading the results incorrectly. You think that, because Mantle closes the gap between high-end and low-end CPUs right now, it is therefore made to cater to lower-end hardware. In fact it only shifts the situation from CPU-bound back to GPU-bound, meaning the GPU becomes the limiting factor (draw calls) again. It does not mean that the actual scaling between the APIs favors the lower end once you take the GPU out of the equation.

 

http://www.sweclockers.com/image/diagram/5911?k=089f0347d8304f8135436193fe713201

http://www.sweclockers.com/image/diagram/5913?k=7bfa4ddd7cae3a703c638eac87b23c9e

 

So while it may not favor the "Intel" brand, it does favor IPC over cores (the 4770K is not significantly higher than the 4670K). That puts it in Intel's favor right now, since they have the highest IPC.

 

And I specifically did not mention AMD when I talked about something being inferior; I'm not falling for your loaded question. I meant that, in context, when a product can be proven to be inferior, dissonance is still created in some people, who start to defend that very product and thereby lose all common sense and the overarching picture.

 

However, this:

 

 

Is exactly what I meant when I said:

 

AMD is not cheaper, and it does not unconditionally provide better multitasking or streaming performance. In fact, the more multitasking you ask of it, the more the strain on its shitty resource sharing shows (you get low minimum framerates). Besides, streaming is better done on either the iGPU or your GPU's hardware encoder. Seriously, who cares, when it gets re-encoded by Steam/YouTube or Twitch anyway. And if it's for professional use, just use a capture card in the first place.

 

 

Only that last bit I agree with. Things should be judged in context. But the rest isn't true; the AMD defense has much more to do with the new-age narcissistic SJW mentality than with actual facts. It has more to do with the recycling of false arguments and ancient knowledge that is no longer representative ("A or B is more expensive/cheaper", when that clearly isn't the case anymore).

 

And I wasn't arguing that there is such a thing as one true brand. I do think AMD gets overrated too often; it requires much more scrutiny than it's receiving. But it's also sometimes underrated when it deserves more. For example, the 860K isn't half bad.

 

The GPU should be the limiting factor. Your GPU should be at 100%, and your CPU shouldn't be anywhere near 100%, or it's getting in the way and slowing down the GPU, thus reducing performance. Optimizing doesn't necessarily mean off-loading/shifting the work; it means getting the same work done in fewer steps. The improvements are more noticeable in systems with a lower-end CPU because the CPU becomes less of a bottleneck to the GPU. If you want to use your terminology, you could say Mantle "favors" systems with lower-end CPUs (of either brand) paired with AMD GPUs, the only cards it runs on. The effects of Mantle on a particular system will depend on the specific CPU/GPU combination being used.

 

As for your statements about a product being inferior, if I misunderstood what you meant, then I apologize. It wasn't meant to be a loaded question. I'm so used to hearing people say "AMD is inferior to Intel" that that is what I automatically assumed you were referring to.

 

As for AMD not being cheaper and not providing better performance in multitasking and streaming, you should check out some comparison videos by Logan on TekSyndicate. Just keep in mind: he uses real-world scenarios for testing, not just synthetic benchmarks. Anyways, my point here wasn't to argue about multitasking and streaming; my point was that there are situations where the 8350 is a better choice, whether you agree or not.

 

I think your comment about AMD's defense being more a result of SJW mentality is over the top, but not completely wrong. When I see someone boasting that an APU is the bee's knees and smokes i7s, I'll be one of the first to correct them and set the record straight. I do agree that AMD parts have become more expensive than they used to be, but certain parts at certain price points still offer exceptional value, depending on what the end user is looking for. And I'm not saying AMD always offers the cheapest solution; for a basic-use PC, it's pretty hard to beat a $60 Pentium on a $50 H81 motherboard. Again: judge in context and on a per-case basis. ;)



you should check out some comparison videos by Logan on TekSyndicate.

 

I agree with most of what you said; however, this is another one of those "debunked" arguments. The number of mistakes he makes in his benchmarking tests makes his assessments skewed/moot.

If you call 22 fps vs. 26 fps results "real-world", you and I have very different ideas of what a "real-world" scenario is. At that point the saturation is so high that overhead/draw-call generation is no longer the determining factor; it shifts to areas like the chipset, resource sharing and other weird shit. If you compare an 8350, a 3770K and a 3820K, you do not compare them on an HD 7850 running 1440p. I'm sorry, I have no faith in Logan's technical knowledge in this area. The same goes for JayzTwoCents or Austin, or any of those other YouTubers. They started doing it as a hobby and clearly have no deeper knowledge of the tech behind it. They basically throw spaghetti at the wall and see what sticks.

 

And until Linus redeems himself for his $1000 FX-6300 + GTX 970 build, I don't believe a word he says either.


I agree with most of what you said; however, this is another one of those "debunked" arguments. The number of mistakes he makes in his benchmarking tests makes his assessments skewed/moot.

If you call 22 fps vs. 26 fps results "real-world", you and I have very different ideas of what a "real-world" scenario is. At that point the saturation is so high that overhead/draw-call generation is no longer the determining factor; it shifts to areas like the chipset, resource sharing and other weird shit. If you compare an 8350, a 3770K and a 3820K, you do not compare them on an HD 7850 running 1440p. I'm sorry, I have no faith in Logan's technical knowledge in this area. The same goes for JayzTwoCents or Austin, or any of those other YouTubers. They started doing it as a hobby and clearly have no deeper knowledge of the tech behind it. They basically throw spaghetti at the wall and see what sticks.

 

And until Linus redeems himself for his $1000 FX-6300 + GTX 970 build, I don't believe a word he says either.

 

It should always be on the viewer to be critical of the information being presented and not just take it at face value. I too have caught many of these YouTubers making mistakes or not conducting tests in the most ideal manner; in such cases, I take the results with those issues in mind. I'm not sure which videos you were specifically referring to with the 8350 vs. 3770K vs. 3820K on a 7850 - I'll have to go back and watch them before I can comment on that. But my point in bringing up some of Logan's videos is that a lot of reviewers set up tests that look ONLY at synthetic benchmarks, which do not reflect real-world scenarios and thus give a false sense of how parts actually perform under normal, day-to-day circumstances.

 

For example: when testing CPU performance in gaming, many reviewers will drop the resolution to 720p and use a high-end GPU to reduce the load on the GPU and remove any GPU bottlenecking from the system, effectively showing differences in CPU strength. The problem with that is, in the real world, almost nobody runs games at 720p - especially with a high-end GPU (290s/780s, etc.). When Logan compares CPU performance in games, he runs them at real-world resolutions (1080p and up), which shows the true, real-world performance one would actually see with the given combination of hardware. Who cares if a 4670K gets 180 fps and the 8350 only gets 160 fps running game X at 720p with a 780 Ti? If they both run the same game at well over 60 fps at 1080p+ resolutions, then they're both worthy gaming CPUs, and that is the whole point.

 

Linus probably chose those parts because the FX-6300 was cheap enough to leave room in the budget for a GTX 970. The 6300 is strong enough to handle the 970 without bottlenecking it in most games. An entry-level i5 is still quite a bit more expensive than the 6300 and thus wouldn't leave room in the budget for such a strong GPU, so there is some sound reasoning behind that choice. The 6300 can also be OC'd quite a bit to make up some ground. It is, however, not the choice you or I would make, but that doesn't make it a "wrong" choice. It's simply another option.



For example: when testing CPU performance in gaming, many reviewers will drop the resolution to 720p and use a high-end GPU to reduce the load on the GPU and remove any GPU bottlenecking from the system, effectively showing differences in CPU strength. The problem with that is, in the real world, almost nobody runs games at 720p - especially with a high-end GPU (290s/780s, etc.). When Logan compares CPU performance in games, he runs them at real-world resolutions (1080p and up), which shows the true, real-world performance one would actually see with the given combination of hardware. Who cares if a 4670K gets 180 fps and the 8350 only gets 160 fps running game X at 720p with a 780 Ti? If they both run the same game at well over 60 fps at 1080p+ resolutions, then they're both worthy gaming CPUs, and that is the whole point.

 

Egh... this is one of those arguments that is so rooted in the community that it's almost impossible to get rid of. You have to reflect on what you're saying here: you're arguing that it's wrong for a CPU test to focus on the CPU aspect. Don't you find that a little silly in retrospect? I understand that the absolute performance at 720p does not reflect the absolute performance differences with a cheaper GPU at 1080p (law of diminishing returns), but that does not negate the relative performance difference they have. You might get a diminished return with a cheaper GPU, but your CPU is objectively the better choice for the respective price range, and it gives you more headroom for future upgrades should it prove more capable in a CPU-bottlenecked situation. If you limit the performance by another constant, like the GPU, you're actively skewing/limiting your results and getting only partial information. Ultimately, this is something that I feel only people who work in research/technology will understand properly. Some of you just haven't done your own tests and realised the pitfalls of some of these methods, or the fallacies you commit when trusting those methods/results.

 

It's like testing the top speed of a Fiat Punto vs. a Nissan GT-R (R35) on a curvy track (YACA); nobody would think that an objective or smart test, but somehow in the computer community it is, because "roads IRL have many turns as well". It does not negate the fact that the GT-R would be much faster if it had room (faster GPU, SLI, CPU-heavy titles).

 

Look, the tests might have their value, but sadly they're used to generalise too often. Based on those reviews, people conclude that an i5-4670K and an 8350 are equals, when they sure as hell aren't. And the FPS delta you see in his tests is not an indicator of the overhead delta the CPUs have. As for Logan himself, he runs the settings way too high and hasn't the foggiest idea what's going on. He literally throws spaghetti at the wall.
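To put toy numbers on that last point (made-up frame times, not benchmark data): say CPU A needs 5 ms of work per frame and CPU B needs 8 ms. With a fast GPU at 720p the gap is obvious; cap both with a slower GPU at 1440p and the delta all but disappears, even though the CPU-side difference hasn't changed.

# Toy numbers only - the point is how a GPU limit compresses the CPU delta.
cpus = {"CPU A": 5.0, "CPU B": 8.0}                            # ms of CPU work per frame
gpus = {"720p, fast GPU": 4.0, "1440p, mid-range GPU": 20.0}   # ms of GPU work per frame

for gpu_name, gpu_ms in gpus.items():
    print(gpu_name)
    for cpu_name, cpu_ms in cpus.items():
        fps = 1000.0 / max(cpu_ms, gpu_ms)                     # slower stage sets the frame rate
        print(f"  {cpu_name}: {fps:.0f} fps")

# 720p, fast GPU: 200 fps vs 125 fps - clearly not equals.
# 1440p, mid-range GPU: 50 fps vs 50 fps - the GPU cap hides the exact same CPU difference.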

 

Here is the video I meant:

https://www.youtube.com/watch?v=eu8Sekdb-IE

 

Also, an FX-6300 is not a good pairing with a 970. It does bottleneck it, severely.


Because of this exact mentality:

PS4 vs. Xbox (not including PC, because we all know who wins that fight :P )


Because of this exact mentality:

What I said is factually true; based purely on performance, we all know who has more of it.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


Cause humans are assholes.

Winner winner chicken dinner! (If you recognize who says that quote, you get a cookie.)


What I said is factually true; based purely on performance, we all know who has more of it.

Maybe it is, but that "we know it's true" attitude is also why people turn everything into fights of X vs. Y; everyone thinks theirs is better.


Egh... this is one of those arguments that is so rooted in the community that it's almost impossible to get rid of. You have to reflect on what you're saying here: you're arguing that it's wrong for a CPU test to focus on the CPU aspect. Don't you find that a little silly in retrospect? I understand that the absolute performance at 720p does not reflect the absolute performance differences with a cheaper GPU at 1080p (law of diminishing returns), but that does not negate the relative performance difference they have. You might get a diminished return with a cheaper GPU, but your CPU is objectively the better choice for the respective price range, and it gives you more headroom for future upgrades should it prove more capable in a CPU-bottlenecked situation. If you limit the performance by another constant, like the GPU, you're actively skewing/limiting your results and getting only partial information. Ultimately, this is something that I feel only people who work in research/technology will understand properly. Some of you just haven't done your own tests and realised the pitfalls of some of these methods, or the fallacies you commit when trusting those methods/results.

 

It's like testing the top speed of a Fiat Punto vs. a Nissan GT-R (R35) on a curvy track (YACA); nobody would think that an objective or smart test, but somehow in the computer community it is, because "roads IRL have many turns as well". It does not negate the fact that the GT-R would be much faster if it had room (faster GPU, SLI, CPU-heavy titles).

 

I don't think you fully understood what I was trying to explain. No, I'm not arguing that it's wrong for a CPU test to focus on the CPU aspect. It's a totally valid method of testing that shows the relative headroom, or total performance delta, between CPUs. I also never suggested such tests should be done with a less powerful GPU. I agree that using a cheaper GPU introduces limitations that don't necessarily reflect the limitations of the CPU, messing with your results. I agree with that completely.

 

I also understand what you're saying - that "with restrictions" the performance appears to be equal, yet left unrestrained, the performance difference becomes much more evident (your car analogy). My point is: not everyone is going to use or need that headroom. Not everyone is going to use the most powerful GPUs on the market or run SLI/CrossFire setups. Those who will and do often have the deeper pockets to do so, and will buy the CPU with the necessary additional headroom.

 

Look, the tests might have their value, but sadly they're used to generalise too often. Based on those reviews, people conclude that an i5-4670K and an 8350 are equals, when they sure as hell aren't. And the FPS delta you see in his tests is not an indicator of the overhead delta the CPUs have. As for Logan himself, he runs the settings way too high and hasn't the foggiest idea what's going on. He literally throws spaghetti at the wall.

 

Here is the video I meant:

https://www.youtube.com/watch?v=eu8Sekdb-IE

 

Also, an FX-6300 is not a good pairing with a 970. It does bottleneck it, severely.

 

 

They are considered equal because, in the majority of cases/setups, they are and perform as such, as observed by the end user. Yes, I get that the 4670K has more potential and could run triple SLI, etc. better than the 8350 could, but if the end user never plans to do that, what difference does it really make? If going with an 8350 allowed the user to get a more powerful GPU within their given budget, there are two things to note: 1) they now have better gaming performance out of the gate and thus don't need to upgrade their GPU for quite some time, and 2) they have a very capable and overclockable CPU that also doesn't need to be upgraded anytime soon. That's all I'm saying. I understand what you're saying, and I agree with you for the most part.

 

Thanks for the link to the video. I'll watch it and comment on it later.

 

As for the FX-6300 and GTX 970, I honestly haven't even looked up any benchmarks. Based on what knowledge I have of the FX-6300 and the 970, I'd assume it would run okay in most games and suffer more in games that are heavily CPU- and/or GPU-bound. It would probably tank pretty hard in Assetto Corsa with a full field of AI cars on track; that game even pushes my i5 to its limits at times. My prior comments on this were only meant to explain the reasoning behind why he may have chosen those parts. That's all. I agree it's not an ideal combination.



Maybe it is, but that "we know it's true" attitude is also why people turn everything into fights of X vs. Y; everyone thinks theirs is better.

PC vs. console is a very different argument, though. It's simplicity and lower performance for the money vs. complexity (sort of) and higher performance for the money.



I could've sworn this thread was dead already... But at least the flaming is gentle, so I'll allow it.

"If it has tits or tires, at some point you will have problems with it." -@vinyldash303

this is probably the only place i'll hang out anymore: http://linustechtips.com/main/topic/274320-the-long-awaited-car-thread/

 

Current Rig: Intel Core 2 Quad Q6600, Abit IN9-32MAX nForce 680i board, Galaxy GT610 1GB DDR3 gpu, Cooler Master Mystique 632S Full ATX case, 1 2TB Seagate Barracuda SATA and 1x200gb Maxtor SATA drives, 1 LG SATA DVD drive, Windows 10. All currently runs like shit :D 


Because for some reason people on the Internet feel that they must pick sides for whatever reason.


Because for some reason people on the Internet feel that they must pick sides for whatever reason.

This is true for most aspects of life. 

