
Is Intel screwing us over

Finally someone that gets the bigger picture.

 

Also, I facepalm when I see someone bitching about a lack of improvements, and then you look at their sig and they're still using a six-year-old i7 920. Nothing makes a company want to spend R&D dollars like watching people sit on their wallets.... :rolleyes:

 

And @OP, you're being an idiot. The cost of entry to develop a CPU is impossibly high nowadays. Why do you think everyone licenses an ARM core design and calls it a day? Because it's hella fucking expensive to design a core on your own.

Not to mention the legal work to actually license things isn't cheap either.


Intel doesn't care about AMD; Intel just does its thing in the CPU market. Since AMD is not competing, Intel saw the opportunity, locked down all their chips and platforms, and stopped innovating, selling the same chips for more and taking advantage of AMD's weak products. I couldn't care less about power consumption on desktop. What does an unlocked i5/i7 K CPU have to do with power efficiency? Nothing. There are low-power models for that, and special chips for mobile and low power, so there's no reason to hold back CPU performance in favor of power consumption.

 

The iGPU needs to disappear. Sadly, Intel saw another opportunity here: if they can shove an expensive iGPU into everyone's chip purchase, why wouldn't they? With the profit from that they fund more iGPU R&D, and the trend continues at the expense of buyers who don't even need that piece of silicon. Intel grows its GPU business on its users' misfortune.

 

With the iGPU gone there would be two options: either cheaper chips (a third off the price or more), or more transistors put into the cores, i.e. 6-core i5/i7 parts on the desktop platform at the same price as the current 4-core chips.

 

Instead, all we get is 5% better synthetic benchmark scores every year. Look at Skylake: they reached 14nm and there's basically zero improvement. Why would someone who needs an i7 for work, heavy processing, or gaming, and who already owns a Sandy Bridge/Ivy Bridge/Haswell chip, be satisfied with a lousy 10% upgrade for a small fortune?

Intel doesn't care about AMD, you're correct about that. But it doesn't matter: they legally cannot kill off AMD. As for power draw, imagine if Intel didn't care about it. They wouldn't be able to integrate their products into notebooks because the power draw would be too high, and the chips used in laptops and desktops are technically the same. In addition, we would be seeing 300-watt CPUs because "who cares about power consumption?" The integrated graphics actually has its place: some of us don't have spare hardware lying around, and it's a fantastic way of keeping the system usable temporarily while, say, our video card is out for RMA. Say what you will about the IGP, but it's been helpful so far.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


On topic: 

 

Correct me if I am wrong. The symbiotic relationship between AMD and Intel is necessary to keep x86 and x64 active. If AMD were to die today, Intel would not be able to produce "x86 CPUs", and Intel would effectively be dissolved or broken apart by the government.

 

The stepping stone for AMD's x64 license is the x86 license, is that right?

 

So this is such a waste.  What a terrible freakin' system.

 

Just release all the fucking licenses and let whoever is capable of development do so.  Everyone that makes it to manufacturing is going to make a CPU with top of the line architecture.  Development costs for further increasing CPU processing yields would be shared by interested parties.  When newer and better architecture is developed, BAM everyone has access. 

 

 

Off topic:

 

As far as the Intel vs AMD budget gaming discussion goes... yes, you get what you pay for. With a very low budget, you get more raw performance from Intel with an i5 4440 + H81/B85 than with an FX 6xxx plus motherboard and CPU cooler. This is not theory; it's trial and error, proven in real-world setups. It is not really a question of what to do for a budget gaming PC, at least not with the pricing in the USA.


Intel doesn't care about AMD, you're correct about that. But it doesn't matter: they legally cannot kill off AMD. As for power draw, imagine if Intel didn't care about it. They wouldn't be able to integrate their products into notebooks because the power draw would be too high, and the chips used in laptops and desktops are technically the same. In addition, we would be seeing 300-watt CPUs because "who cares about power consumption?" The integrated graphics actually has its place: some of us don't have spare hardware lying around, and it's a fantastic way of keeping the system usable temporarily while, say, our video card is out for RMA. Say what you will about the IGP, but it's been helpful so far.

Look, I'm strictly talking about desktop. Intel has, and can keep making, special low-power models for mobile; I don't care where the trend is or what sells. Tablet and notebook sales should not dictate what high-end CPUs I'm able to buy for my desktop.

This mentality goes a bit like this analogy: you want to buy a quality fast car or race car, but you can't, because the majority of people drive a casual Toyota Prius since that serves all their needs. WTF?

 

300W CPUs aren't even possible without massive cooling, but there is no reason not to have a 6-core i5/i7 with a ~100-120 watt TDP if they remove the iGPU. The 8-core i7 5960X is 140W, so the 300-watt figure is an exaggeration.

I know the argument about the iGPU serving as a backup if the main dGPU fails, but do you even realise how silly that is? Everyone is forced to pay for a piece of silicon that serves as a backup. If you need a backup, get a cheap previous-gen dGPU from eBay or something. I don't need one; I care more about the "pure" CPU and having extra cores and cache for the same price instead of a stupid iGPU backup.

The exceptions are low-end models, Pentiums or i3s, or mobile, where you can't add a GPU card; otherwise the iGPU does not make any sense. Only a tech-illiterate person would buy an i5/i7 just to use its graphics, and chances are they won't need that much CPU power to begin with; they should be buying a low-end Pentium/i3 or an AMD APU. High end SHOULD stay pure CPU, with more cores and balanced power usage.

 

I would gladly take a 6-core i5 using 30-40 watts more, unlocked if possible, at under $250 instead of an iGPU, and I'm sure you and everyone else would too.

No one said anything about killing AMD. Should Intel stop improving CPUs just to keep a lazy AMD alive? Are you sure that even makes sense?


No one said anything about killing AMD. Should Intel stop improving CPUs just to keep a lazy AMD alive? Are you sure that even makes sense?

If Intel improves too much, AMD will lose customers. People will have absolutely no reason to buy their CPUs at all. You can call AMD whatever you want, but Intel legally cannot kill them off, and if they pushed too hard they very well could drive AMD completely bankrupt. What's going on right now with AMD is not ideal, and it is pretty much causing the CPU market to stagnate.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Finally someone that gets the bigger picture.

Also, I facepalm when I see someone bitching about a lack of improvements, and then you look at their sig and they're still using a six-year-old i7 920. Nothing makes a company want to spend R&D dollars like watching people sit on their wallets.... :rolleyes:

And @OP, you're being an idiot. The cost of entry to develop a CPU is impossibly high nowadays. Why do you think everyone licenses an ARM core design and calls it a day? Because it's hella fucking expensive to design a core on your own.

Yeah, thanks for noticing the post. I swear 90%+ of the people who posted after me just straight-up ignored it because EMRG DESKTOP SPECIAL...

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


The reason ASUS and the like aren't making CPUs, and can't, is that they just don't have the capital, experience, or knowledge to do so. Intel is so far ahead of everyone else: TSMC, GloFo, Samsung, Qualcomm. AMD and NVIDIA don't even manufacture their own chips; they design them and use TSMC's fabs to make them. I think AMD also uses GloFo for their CPUs after selling off their fab business. This is all from my very limited understanding lol.


If Intel improves too much, AMD will lose customers. People will have absolutely no reason to buy their CPUs at all. You can call AMD whatever you want, but Intel legally cannot kill them off, and if they pushed too hard they very well could drive AMD completely bankrupt. What's going on right now with AMD is not ideal, and it is pretty much causing the CPU market to stagnate.

It's already starting to happen. Intel's iGPU with the non-bottlenecking eDRAM is crushing AMD now.


If they did, it would bring prices down and quality up at the same time, and Intel wouldn't have such a hold on the market. It's just like GPUs: there are plenty of them out there and I don't feel overwhelmed.

You say it like they can just snap their fingers and make chips.

 

Designing a CPU takes years and billions of dollars in R&D. Board partners for GPUs (MSI, Asus, Gigabyte) don't make anything other than the occasional custom PCB and custom cooler. The GPU market is the same way.

 

If any company is screwing consumers over, it's AMD. If they had stayed competitive, Intel CPU prices would be much lower and performance would be much higher. If Intel released an i5 at a lower cost, AMD would go out of business and Intel would have to sell off a portion of its company. They do what they do because they have to.

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD

Spoiler

sex hahaha


The reason ASUS and the like aren't making CPUs, and can't, is that they just don't have the capital, experience, or knowledge to do so. Intel is so far ahead of everyone else: TSMC, GloFo, Samsung, Qualcomm. AMD and NVIDIA don't even manufacture their own chips; they design them and use TSMC's fabs to make them. I think AMD also uses GloFo for their CPUs after selling off their fab business. This is all from my very limited understanding lol.

Intel makes their own chips.



Intel makes their own chips.

I know. I'm just saying there's no other company that makes any chip that is even comparable to Intel. Everyone else is just so far behind.


I know. I'm just saying there's no other company that makes any chip that is even comparable to Intel. Everyone else is just so far behind.

Ohh I read Nvidia as Intel my bad.



I'm delusional? You're the one making up your own facts in the face of clear evidence to the contrary.

 

Here is a comparison showing a performance benefit even when going from dual channel to quad channel:

 

[benchmark screenshot: dual channel vs. quad channel gaming performance]

 

And here's single vs. dual channel in non-gaming tasks, which matter just as much as gaming:

 

[three benchmark screenshots: single vs. dual channel results in non-gaming tasks]

None of that has anything to do with what I asked, and yes, in non-gaming tasks there will be a difference. In gaming, the difference between a single 8GB stick and 2x 4GB sticks is negligible. It is not detrimental to performance in any meaningful way.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


Your math for kWh is wrong; I can tell that number is WAY off.... With all the taxes and fees, electricity costs 0.864 NOK per kWh..... You can already see that your figure is EPICALLY wrong just from that. My numbers are from the national bureau of statistics (SSB), and they are from Q1 2015, so as fresh as you can get.

 

The difference between an 88W i5 4690K (the worst-case i5) and a 125W FX 8320 (anything short of the FX 9xxx) is 37W.

 

Strictly speaking, 37W is already a rate (37 joules per second), but this is where things get icky. The formula is not 37 * 3600 * rate; it is closer to W * hours used per day * days in a year / 1000 * rate per kWh. Using that difference, at 24 hours per day, every day, the figure is 37 * 8760 (24 * 365) / 1000 * 0.864.

 

Which equates to about 280 NOK, pretty much HALF of what you suggested. Now, if we correct this to a more reasonable running time of 8 hours per day (to simulate a work environment), the number becomes 37 * 2920 (8 * 365) / 1000 * 0.864, which is 93.34 NOK....
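
A quick sanity check of that formula in Python (a minimal sketch; the 37W gap, the 0.864 NOK/kWh rate, and the hour counts are the numbers from this post, and the function name is just for illustration):

def yearly_cost_nok(watt_difference, hours_per_day, nok_per_kwh=0.864):
    # kWh per year = W * hours per day * 365 days / 1000, then multiply by the rate per kWh
    kwh_per_year = watt_difference * hours_per_day * 365 / 1000
    return kwh_per_year * nok_per_kwh

print(round(yearly_cost_nok(37, 24), 2))  # ~280 NOK per year at 24/7 use
print(round(yearly_cost_nok(37, 8), 2))   # ~93 NOK per year at 8 hours a day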

 

Now, I pulled these formulas off Google, so they may be wrong, and seeing as I deal with power companies on a daily basis, I can tell you they come out a bit low. That is because the power companies also measure secondary and tertiary induced and capacitive loads within your home to squeeze even more money out of you (perhaps not in the States yet, but in Norway the metering is getting more and more advanced). So realistically let's call it 300 NOK for all-year use and 105 NOK for 8 hours a day....

 

The purchase price difference between an 88W i5 4690K and an FX 8320 in Norway is 780 NOK, which is about 2.5 years' worth of that electricity if you run them 24 hours a day, and over 7 YEARS' worth if you use them 8 hours a day. Below 8 hours, well... if you are still running an i5 4690K or an FX 8320 7+ years from now, then I strongly question the framerates of your games....

 

Cost difference between a 65W i5 5675C (Broadwell) and an AMD FX8 is:

454 NOK for 24/7 year long operation

151.3 NOK for 8 hours a day operation

 

The purchase price difference between the FX8 and the i5 5675C is 1496 NOK.... Again, this discussion of power usage is ridiculous and pointless, since it takes several years of electricity to make up the initial cost difference.

 

Conclusion: the power difference is a ridiculous argument, processor vs. processor.

However, TOTAL system load can be another thing entirely depending on your setup: multiple HDDs vs. an SSD, pumps, graphics cards, etc. But now we are entering a whole new ballgame of arguments; it is no longer AMD vs. Intel, or whether Intel IS screwing us over. It is "power efficient vs. budget", and that is an entirely different thread, for an entirely different discussion.

 

Feel free to correct me if I am wrong.

That is not how you calculate the difference.  See my spoiler for how it is done.

 

To be fair, I also quoted you wrong at 550 Kr; it is really about 100 Kr. I think I did my calculation wrong when I first did it for Nena.

 

I did use Oslo's price per kWh. I did the calculation for Nena, who lives in Oslo, so the price might be different in your specific region, but for her region we used a price of 86.4 øre, which is the same price you used, so it should come out about the same. Converted to USD, that is about $0.11, which is very close to the US national average of $0.12.

 

I also want to throw in these power consumption graphs.

 

The top graph is power draw during Far Cry 3. This is a good example because Far Cry 3 hits both the CPU and GPU adequately. Some games will draw more power, some less, so it is a good middle-of-the-road example.

[graph: full-system power draw during Far Cry 3]

 

The graph below is from an x264 encoding benchmark with all processors at stock speeds. This loads the CPU to 100%, and you can see that when both an i5 and an FX8 are maxed out, there is a 100W+ difference.

[graph: peak power draw during x264 encoding benchmark]

 

Power consumption is another aspect of the FX CPU that needs to be talked about. It draws so much more power than the Intel equivalent that, over just 2-3 years of use, the FX will end up costing you even more money. Of course, energy is less expensive in some places than in others, but you cannot deny that there is a 100W+ difference between an FX8 and an i5 under load. This power disparity only grows the further you overclock the FX.

 

I will use the average price of residential electricity in the U.S., which is $0.1294 per kWh according to the EIA as of September 2014. For this example, we will assume a flat $0.12 per kWh to give a conservative estimate. We will also assume that the overclocked FX draws 100W more than the stock i5, again a conservative estimate. Lastly, let's assume that the average gamer plays for two hours per day, with an additional two hours of regular (non-gaming) use, so let's just call it 3 hours a day to make it easy.

 

Power Consumption = 100W

Hours of Use Per Day = 3

Energy Consumed Per Day = .3 KWh

Price Per Kilowatt-Hour = $0.12

 

Energy Cost Per Day = $0.036

Energy Cost Per Month = $1.08

Energy Cost Per Year = $13.14

 

With our quick and dirty calculation, we see that the difference between the FX and the i5 adds up to over $10 per year, and that is a conservative estimate. With most of us wanting to keep our components as long as possible before upgrading, owning components for 2-3 years, and sometimes even longer, is not out of the question, and that energy cost per year really starts to add up. You also have to consider that you will likely need a more expensive PSU to keep up with this power draw, especially if you want to overclock.

 

 

If you would like to calculate this for yourself, you will need to find out what the cost of energy is where you are located, and these two formulas:

Energy consumption calculation

The energy E in kilowatt-hours (kWh) per day is equal to the power P in watts (W) times number of usage hours per day t divided by 1000 watts per kilowatt:

E(kWh/day) = P(W) × t(h/day) / 1000(W/kW)

Energy cost calculation

The energy cost per day in dollars is equal to the energy consumption E in kWh per day times the energy cost of 1 kWh in cents/kWh divided by 100 cents per dollar:

Cost($/day) = E(kWh/day) × Cost(cent/kWh) / 100(cent/$)
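
As a minimal sketch (in Python, with variable names of my own choosing), here are those same two formulas plugged with the numbers from the example above: a 100W difference, 3 hours a day, $0.12 per kWh.

power_w = 100          # extra power draw of the overclocked FX vs. the stock i5, from the example above
hours_per_day = 3      # assumed daily hours of use
cents_per_kwh = 12     # assumed flat rate of $0.12 per kWh

kwh_per_day = power_w * hours_per_day / 1000       # E(kWh/day) = P(W) x t(h/day) / 1000
cost_per_day = kwh_per_day * cents_per_kwh / 100   # Cost($/day) = E(kWh/day) x Cost(cent/kWh) / 100

print(round(cost_per_day, 3))        # $0.036 per day
print(round(cost_per_day * 30, 2))   # about $1.08 per month
print(round(cost_per_day * 365, 2))  # about $13.14 per year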

 

Temperatures:

I hear the argument that AMD runs cooler than Intel, and it is a really silly misconception. I can understand why someone would think so, but the temperature readings from AMD processors are inaccurate. They don't measure the cores, they measure the socket; the cores tend to be a fair amount hotter than the socket, and the reading is an algorithm, not a direct measurement like with Intel. The sensor also has to heat up before it becomes accurate, which is why you see so many people reporting FX processors running below ambient temperature at idle. That's impossible; it's the sensor and the algorithm acting up. It is against the laws of physics for an FX processor to put out less heat than an Intel one when it draws much more power: at stock, the FX8 draws 125W compared to the 84/88W of an i5. The FX processor heats up the room much more as well; I know that in my friend's house, the one who owns the FX, his room is sweltering after just an hour of gaming.

 

"Concerning your question regarding the temperatures with your processor. The maximum temperature threshold is 62 Celsius which set for the internal die (core) temperature of the chip. The core temperatures have an equational offset to determine temperature which equalizes at about 45 Celsius thus giving you more accurate readings at peak temperatures. The hindrance in this is the sub ambient idle temperature readings you speak of.

 

 The silicon and adhesives used in manufacturing these processors has a peak temperature rating of 97+ Celsius before any form of degradation will take place. The processor also has a thermal shut off safe guard in place that shuts the processor down at 90 Celsius.

The Cpu temperature is read form a sensor embedded within the socket of your motherboard causing about a 7-10 Celsius variance form the actual Cpu temperature, which may be what you are reading about on the net.

 I hope I was able to answer your questions, If you have any more inquiries don't hesitate to contact us.

 You can use an application called AMD overdrive, that will allow you to monitor your temperatures accurately.

 As long as your core temperature has not exceeded the high side of the 60 degree mark for extended periods of time you should be ok. 62 degrees holds a generous safety net to begin with.

 Thank You

 Alex Cromwell

 Senior Technology Director

 Advanced Micro Devices

 Fort Collins, Colorado

 2950 East Harmony Road

 Suite 300

 Fort Collins, CO"

 

The other thing to consider with FX-based systems is that the voltage regulation modules (VRMs) on the motherboard get extremely hot as well, and for the most part they only have a heatsink on them, with no direct airflow. These VRMs are often in the 70C range with little to carry that heat away. This is another reason why FX-based systems make the room much hotter.

 

-Source

 

Throwing all of those numbers into the source's formulas, at a 100W difference and 3 hours of use per day, it ends up being ~$12, which equals about 97 Kr. Apologies for the 550 Kr figure; I need to find that thread I did with Nena a while ago because I want to look over my math from that day. But as you can see, this is not nothing, and it is also a fairly conservative estimate. Here in the U.S., the price difference between an Intel and an AMD system is often less than $20, and people keep their systems for 2+ years, which is why energy consumption must be taken into consideration, especially if you are a heavy user.

 

Here is another good performance-to-power-consumption benchmark/review. It really shows how much power these FX processors start to consume once overclocked. Also, please note that they are using the FX 8320E, which is supposed to be the more energy-conscious of the FX processors.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


None of that has anything to do with what I asked, and yes, in non-gaming tasks there will be a difference. In gaming, the difference between a single 8GB stick and 2x 4GB sticks is negligible. It is not detrimental to performance in any meaningful way.

 

Actually, I just showed memory bandwidth making more than a 15% difference in a game, and the difference would be even bigger between single channel and dual channel, because memory performance has diminishing returns. More than 15% is not negligible.


Actually, I just showed memory bandwidth making more than a 15% difference in a game, and the difference would be even bigger between single channel and dual channel, because memory performance has diminishing returns. More than 15% is not negligible.

You showed dual channel vs. quad channel; I asked for single vs. dual. The difference is nowhere near 15% in your single-game demonstration. The difference between a single 8GB stick and 2x 4GB sticks is nothing meaningful and will not detract from gameplay in the slightest.

For argument's sake, let's say there was a big difference. YOU CAN STILL GO DUAL CHANNEL ON THAT MOTHERBOARD.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


Actually, I just showed memory bandwidth making more than a 15% difference in a game, and the difference would be even bigger between single channel and dual channel, because memory performance has diminishing returns. More than 15% is not negligible.

 

You showed only one game, which is an extremely small sample size. The difference you see might just as well be a fluke as be indicative of games as a whole. Based on one game, you cannot, and should not, draw any conclusions.


You showed dual channel vs. quad channel; I asked for single vs. dual. The difference is nowhere near 15% in your single-game demonstration. The difference between a single 8GB stick and 2x 4GB sticks is nothing meaningful and will not detract from gameplay in the slightest.

For argument's sake, let's say there was a big difference. YOU CAN STILL GO DUAL CHANNEL ON THAT MOTHERBOARD.

 

So what? The difference between single channel and dual channel is the same type of difference as between dual channel and quad channel, and it is an incontrovertible fact that memory bandwidth has diminishing returns; therefore, a performance difference between dual and quad channel proves there is a performance difference between single and dual channel.

 

You showed only one game, which is an extremely small sample size. The difference you see might just as well be a fluke as be indicative of games as a whole. Based on one game, you cannot, and should not, draw any conclusions.

 

It's not a fluke, since it shows up repeatedly. Of course not all games are that sensitive to memory performance, but the point is that it disproves the myth that memory performance is basically irrelevant in all games. It only takes one counterexample to disprove a sweeping statement like that.


So what? The difference between single channel and dual channel is the same type of difference as between dual channel and quad channel, and it is an incontrovertible fact that memory bandwidth has diminishing returns; therefore, a performance difference between dual and quad channel proves there is a performance difference between single and dual channel.

 

 

It's not a fluke, since it shows up repeatedly. Of course not all games are that sensitive to memory performance, but the point is that it disproves the myth that memory performance is basically irrelevant in all games. It only takes one counterexample to disprove a sweeping statement like that.

It is not the same at all, and again, it's a single game showing a difference of 4%, and again, that motherboard does support dual channel, which is the very basis of your terrible and unfounded argument. There is absolutely nothing wrong with the B85 motherboard. And oh look, some test results:

 

[images: Shogun 2 benchmark results and Cinebench results, single vs. dual channel]

 

-Source

 

Damnit, I was right.  I hate it when this happens.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


It is not the same at all, and again, it's a single game showing a difference of 4%

 

It shows a difference of over 15% based on memory performance alone. It does not matter that it's only a single game, it's enough to disprove your claim.


It shows a difference of over 15% based on memory performance alone. It does not matter that it's only a single game, it's enough to disprove your claim.

You've lost it.

 

144fps vs. 150fps. That is not a 15% difference; it is 4%. Not only that, but it is dual channel vs. quad channel, not the same as single vs. dual. You are mental.
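
For reference, a quick sketch of the arithmetic behind the two readings (the 144 and 150 fps figures are the ones from this post; the 130 figure is the reading given further down):

print(round((150 - 144) / 144 * 100, 1))  # ~4.2% if the bars are 144 vs. 150 fps
print(round((150 - 130) / 130 * 100, 1))  # ~15.4% if the bars are 130 vs. 150 fps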

 

I just showed you a game and a graphical benchmark that show a less than 1% difference. You have one game that shows a negligible difference using the wrong configuration, and I have a game plus a graphical benchmark that show an even more negligible difference, so this must mean... the difference is negligible.

 

Oh, and let's not forget: THE MOTHERBOARD F^&*%ING SUPPORTS DUAL CHANNEL, which is the basis of your terrible argument. There is nothing wrong with a B85 motherboard.

 

What are you on about?  You have absolutely no argument whatsoever.  You are grasping at straws.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


Cyrix, VIA, IBM and Apple all made their own CPUs for a number of years. The Cyrix 686 even worked in Super Socket 7 alongside the AMD K6 and the Pentium. Back then, though, configuration was done more with hardware DIP switches than in software.

Apple shipped their own chipsets and non-x86 (PowerPC) processors for years, and only around 2005-2006 did they start buying from Intel to save costs.

The New Machine: Intel 11700K / Strix Z590-A WIFI II / Patriot Viper Steel 4400MHz 2x8GB / Gigabyte RTX 3080 Gaming OC w/ Bykski WB / x4 1TB SSDs (x2 M.2, x2 2.5) / Corsair 5000D Airflow White / EVGA G6 1000W / Custom Loop CPU & GPU

 

The Rainbow X58: i7 975 Extreme Edition @4.2GHz, Asus Sabertooth X58, 6x2GB Mushkin Redline DDR3-1600 @2000MHz, SP 256GB Gen3 M.2 w/ Sabrent M.2 to PCI-E, Inno3D GTX 580 x2 SLI w/ Heatkiller waterblocks, Custom loop in NZXT Phantom White, Corsair XR7 360 rad hanging off the rear end, 360 slim rad up top. RGB everywhere.


I'll bring this up as a point of reference: there was an architecture Intel made called IA-64 (Itanium). It was a clean-sheet 64-bit design with none of the x86 legacy. It died due to compatibility and development headaches, and excessive price. Basically, new architectures are no longer viable in the home PC.

Everything you need to know about AMD cpus in one simple post.  Christian Member 

Wii u, ps3(2 usb fat),ps4

Iphone 6 64gb and surface RT

Hp DL380 G5 with one E5345 and bunch of hot swappable hdds in raid 5 from when i got it. intend to run xen server on it

Apple Power Macintosh G5 2.0 DP (PCI-X) with notebook hdd i had lying around 4GB of ram

TOSHIBA Satellite P850 with Core i7-3610QM,8gb of ram,default 750hdd has dual screens via a external display as main and laptop display as second running windows 10

MacBookPro11,3:I7-4870HQ, 512gb ssd,16gb of memory


It's 130 vs. 150.

No it's not.

You cannot compare RAM kits at different speeds, as that changes the variable we are trying to measure. We aren't measuring the impact of RAM frequency on performance; we are measuring the impact of single vs. dual channel.

You clearly don't have an understanding of the situation, and it's clear to me that I've been arguing with a troll.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

