
M1 Macs Reviewed

48 minutes ago, Dracarris said:

Well this, which others here are still defending:

And the very unclear comparison/reference points (best selling PC laptop, more than 98% of blabla).

I consider that part of the graph thing.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


2 minutes ago, Dracarris said:

Still, you could re-create this graph with numbers from independent reviewers, use the usual suspects as benchmark titles, and get quite close to what they showed.

But that's the point!

 

Whenever a company shows graphs and charts, they are most likely cherry-picked results, but they (usually) aren't lies.

 

The same could be said for the graphs and claims Apple have shown. For instance, the 5x faster graphics performance was actually 5x faster than the last generation of the MacBook Air.

 

You should always take any keynote with a grain of salt, but at the same time since it's all you have until review units come out, you can't do anything but believe that's the case. Most companies don't intentionally lie in their graphs to make their products seem better. That would give them very bad press for no reason.


6 minutes ago, NotTheFirstDaniel said:

The same could be said for the graphs and claims Apple have shown. For instance, the 5x faster graphics performance was actually 5x faster than the last generation of the MacBook Air.

Good luck recreating that graph without even knowing what machine they were comparing against, where exactly zero is, how performance was measured, and so on and so on.


Some of you guys still don’t get the purpose of that graph... giving a general feel for how the M1 generation (not a particular M1 Mac, not a particular TDP, that’s the whole point! you can’t use a direct comparison if what you’re comparing is a continuum in the first place) can be both cooler/power-sipping and more powerful than the x86 CPUs we’re used to (here of course we need to trust Apple about how they picked the PC curve, but where’s the bluff if the machines were already in the wild and soon to be shipped to regular users??)... the performance per watt is the whole story of this transition... no sense in that context to pinpoint a particular thermal scenario and pick a particular benchmark out of hundreds possible... that’s the perfect graph to say what they wanted to convey, and you’re just refusing to see that it’s a perfectly legitimate thing to convey when you’re launching a new architecture (and not just the M1 SoC)...

 

And the guy not understanding the difference from a corporate point of view between poo-poo-ing on Intel indirectly and poo-poo-ing on Intel by name...c’mon..


8 minutes ago, NotTheFirstDaniel said:

Whenever a company shows graphs and charts, they are most likely cherry-picked results, but they (usually) aren't lies. ... Most companies don't intentionally lie in their graphs to make their products seem better. That would give them very bad press for no reason.

The whole thing about marketing: there used to be laws against out-and-out lying in advertisements. There was a very famous commercial for Cheer detergent where a man appeared to wash clothes with ice water. It was not a lie, but it was out-and-out fakery: the water was not cold, it was hot. The commercial had no words. This was actually critical, because almost anything said at all would be a lie. It rested on standard misassumptions a viewer might make, like ice melting instantly in hot water. It doesn’t. It takes much less time than in cold water, but the time is not zero.
There is lying and there is misleading. The commercial was not an out-and-out lie, but it was extremely misleading. To the point that the difference was immaterial.



4 hours ago, Dracarris said:

And how is that relevant at all here? All those presentations are mostly made for the US market, where clearly no such limitation is in place.

No, I doubt what he is saying is true; it'll be like everywhere else. So long as it's factually correct, a company can. Companies typically choose not to because it's safer, not because they cannot.


2 hours ago, saltycaramel said:

And the guy not understanding the difference from a corporate point of view between poo-poo-ing on Intel indirectly and poo-poo-ing on Intel by name...c’mon..

You still fail to explain how fact-based, objective comparisons constitute any form of poo-poo-ing at all.

 

So AMD poo-poo-ed on Intel in their Ryzen 5000 presentation, since they did direct, named comparisons? How dare they broadcast such a presentation then.

17 minutes ago, leadeater said:

No, I doubt what he is saying is true; it'll be like everywhere else. So long as it's factually correct, a company can. Companies typically choose not to because it's safer, not because they cannot.

And yet AMD chose to do it... twice. This month.


1 hour ago, Dracarris said:

And yet AMD chose to do it... twice. This month.

I know, I was the one that pointed it out 😉

 

Just yea, not every company likes to do it, usually because they want to compare in "not so accurate" ways lol.


1 minute ago, Dracarris said:

You still fail to explain how fact-based, objective comparisons constitute any form of poo-poo-ing at all.

Apple sells a complete line-up of INTEL Macs except for 3 models.

Today.

On the shelves of every store. 

Macs Apple reps will still need to sell for months on the floor of those stores. 

And I can assume they’re still in a good business relationship with their current CPU supplier, Intel.

You won’t see a graph calling an Intel CPU by name in an unfavorable way.

I don’t know how to better explain that.

I don’t understand how industry-savvy Linus can miss this.


To add context, last quarter was the strongest quarter in 35 years for Mac sales (probably COVID and WFH have something to do with that).

100% Intel Macs.

And the transition to ARM had been publicly known since June.


1 hour ago, saltycaramel said:

Apple sells a complete line-up of INTEL Macs except for 3 models. ... You won’t see a graph calling an Intel CPU by name in an unfavorable way.

Labeling the graph is separate from disclosing what they are comparing to, which I think they did at some point. Also, companies are not children: Intel knows full well they are the reference comparison and is fully able to handle that. They aren't going to stop working with Apple because Apple showed they have something faster, not least because doing so would be a breach of contract anyway.

 

Intel simply cannot and would not just stop working with and supplying Apple because Apple named them or one of their products on a graph or slide in a presentation. The only time that would ever happen is if what was presented was false and inaccurate information that could damage Intel's reputation and have a financial impact.

 

And all the consumers who are apparently confused by accurate graphing and can't understand CPU names wouldn't have a clue either way, according to this line of argument, and would happily buy a Mac whether it had an Intel CPU or one made by Apple.

 

Apple is the company that has publicly said they are transitioning to their own CPUs/SoCs; that is far more reason for Intel to stop working with them than anything else. That didn't happen, did it? But no, a marketing slide would cause that 🤦‍♂️


10 hours ago, leadeater said:

This argument falls apart by the sheer fact that other companies in the consumer electronics space have been able to and have labeled graphs, ....

9 hours ago, leadeater said:

AMD during their Ryzen 5000 and RX 6800 presentation directly compared to, named their competitors and the specific products, during the presentation. Their graphs were labelled and simple to read.

 

 

See, here's the thing... Apple isn't AMD. They're not Intel, they're not NVIDIA... they're APPLE. This isn't to give them "special status", but they're not in the same market or mindspace.

YOU (the more general you, leadeater) see them all as part of the same thing, so you want detailed tech info. You're SUPER techy, so that's what you want to hear. But Apple isn't selling to the YOUs, especially with these big rollout videos. They're more selling to the MEs... actually not even to the MEs, but to the people who come to the MEs for details, like I go to the YOUs for more detail (we'll call them THEMs).

I watched the video with the appropriate oohs and ahs at the dazzling tech on display, with enough knowledge to understand what they were talking about and enough wits to recognize the BS marketing speak, as well as the graphs, which were functional for the THEMs, laughable to the MEs, and insulting to the YOUs.

See, the THEMs just want to know "Is it better? Is it good? Will it work?" They don't care about the hows. The MEs, we're curious about the how, but we don't quite understand the details, so we look to the YOUs, who are all about the hows.

 

9 hours ago, fred82 said:

no need for immature and hateful comments towards Linus

You are welcome to accuse me of BS pop-psychology analysis, but it's neither immature nor hateful.

 

9 hours ago, RedRound2 said:

Linus shouldn't have assumed things. Being skeptical was fine, but he straight up, as an influencer, judged a book by its cover. And he has too fragile an ego to admit it, instead just spinning it in ways and giving lame justifications like "I never said the iPad was bad"

As I've said, MKBHD and Snazzy Labs were skeptical, because of course the graphs were BS marketing-based things... but they weren't negative, especially in the titles; they just called out the BS. (Snazzy did a thing using measurements to try to decode the graphs... not sure how that panned out against reality.)

 

And honestly, all the "justifications" he's been using... it honestly reminds me of (and I know I'm blowing this WAY up with this, and I'm not making any implications) someone explaining how something they said isn't actually racist.

 

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


9 minutes ago, Video Beagle said:

 

See, here's the thing... Apple isn't AMD. They're not Intel, they're not NVIDIA... they're APPLE. ... See, the THEMs just want to know "Is it better? Is it good? Will it work?" They don't care about the hows. The MEs, we're curious about the how, but we don't quite understand the details, so we look to the YOUs, who are all about the hows. ...

 

Sure, but the word you’re looking for is “rube”, actually a famous turn-of-the-century baseball player. You’re looking to the YOUs to find out if the company is telling the truth when they say “it’s AWESOME!”, because they’ll say that whether it is or not, and when the YOUs look at that stuff they see a bunch of unlabeled crap. And they say so.



1 hour ago, Video Beagle said:

YOU (the more general you, leadeater) see them all as part of the same thing, so you want detailed tech info.

But the thing is, labeling the graphs isn't actually that detailed; the entire point of labeling a graph is to make it possible to read and understand at all. And that's the point: Apple is making a statement about a piece of information (well, two), and using a graph like they did doesn't actually enhance or portray that any better than just having it as bullet points on a slide. There's no proper way to compare it to anything else on the graph; it's just Apple saying 2x faster and 25% less power. OK, great, that's fine. But performance measured how? And the power usage of the other thing is what?

 

And that's not even getting to the issue of the plotted curve for the "Latest PC laptop chip" being actually inaccurate; that's not how those chips scale. But that's not really something I can reasonably expect to be done better; rather, just don't do it.

 

That's the point of a graph: I should be able to pick a point on either axis, find the intersecting point, and get a value I can assess. You can do neither for either axis, and even if you could, the value would be incorrect, because that's not how power and performance work on either Intel or AMD.

 

The main real issue, the one that is persisting right now, is the valid criticism of what is a bad graph and the way people are treating those who make it: going out of their way to call them names and ridicule them, excusing what actually is an improper usage of a graph. Apple could have chosen to portray the desired information any way they liked; they went with this, and whether you personally agree or care shouldn't mean people are not allowed to criticize.


36 minutes ago, Bombastinator said:

Sure, but the word you’re looking for is “rube”

 

No. A rube is someone you're putting something over on. These aren't naive people being tricked. THEY DON'T CARE about the tech details.

 

I was watching an Escapist podcast earlier. Yahtzee, who's one of the premier game reviewers online, and Jack were talking about the new Xbox vs PS5, going over the big marketing points listed on some ads... lots of "how this tech is new and great" stuff. And Yahtzee didn't know what the tech meant, or care, because he's not in the tech news bubble. (Jack knew most of what it meant, because he is in the bubble.) All the minutiae we talk about on this forum... 99% of computer buyers don't care about. A lot of folks on here live for that stuff, and I find learning about it endlessly fascinating... but most people just don't care.

27 minutes ago, leadeater said:

The main real issue, the one that is persisting right now, is the valid criticism of what is a bad graph and the way people are treating those who make it.

Dude, I said it was a dumb graph that made me laugh. MKBHD and Snazzy both called it out as a bad graph. The difference being, Linus made that the center of his argument. "It's a bad marketing graph so everything is crap," he seemed to say, rather than just taking it as a bad marketing-based graph and evaluating the information given. And look, it was all "company makes claims with whatever proof they want to use to back it up"... It's all BS and means nothing 'til people actually evaluate it anyway, whether it's Apple's marketing graphs or AMD's fine-print dotted matrix. They're both meaningless company-provided info. But because there's an illusion of specificity, some think AMD's are more valid. Linus's "don't fanboy a company" has him presenting the AMD info as "here's what they say" while his personal bias had him spin Apple's as negative, and he's now spent weeks trying to justify it.

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


43 minutes ago, Video Beagle said:

 

No. A rube is someone you're putting something over on. These aren't naive people being tricked. THEY DON'T CARE about the tech details. ...

Rube was a pitcher in the major leagues a very long time ago. He had some sort of serious learning disability, but of course no one knows what. My personal guess is really severe ADHD amongst other things, but the DSM wasn’t even a thing back then. He was a good pitcher but very easy to distract. People in the stands would do things like jingle keys or wave bright shiny objects to attract his attention. They would also yell “Hey, Rube!”, which is where the phrase came from. He was eventually killed during a flood: he saw two children trapped in the water, immediately jumped in to save them, and died.
 





5 hours ago, Video Beagle said:

Dude, I said it was a dumb graph that made me laugh. MKBHD and Snazzy both called it out as a bad graph. The difference being, Linus made that the center of his argument.

 

Perfect.

We’re all, on both sides, giving FAR too much attention to a “general feel” kind of graph for a single reason: one particular YouTube entertainer decided to make it a TENT POLE of his early, clickbaity (dumpster fire) apple-skeptic-pandering take. (He’s free to do that; people are free to criticize it without getting a “chill the F out”; some may say that he should have chilled out about that graph likewise.)

 

Dear Apple, for the future, please

- keep using vague “general feel” graphs when you see fit with regard to the context and the message to convey

- keep using actual numbers when the context calls for it (like on a mile-long product page with all the footnotes explaining the test conditions in detail)

Yours truly,

not a “fanboy”

Just someone able to discern the context and purpose of a graph (and laughing at the idea of making this into a battle for transparency).


5 hours ago, Bombastinator said:

Rube was a pitcher in the major leagues a very long time ago. ... He was eventually killed during a flood: he saw two children trapped in the water, immediately jumped in to save them, and died.

Man, this post made me feel sad :(


1 hour ago, Spindel said:

Man, this post made me feel sad :(

That was more or less a regurgitation from my memory of Ken Burns' history-of-baseball documentary. The whole early baseball period had that feel for me in that documentary. The story of Shoeless Joe was a lot like that too. The bush leagues were even worse, as was the story of the concession-stand owner who committed suicide on home plate in protest against the first owner of Major League Baseball.




7 hours ago, Bombastinator said:

They would also yell “Hey, Rube!”, which is where the phrase came from.

The expression is older than Rube Waddell (the baseball player), coming from carny speak, which is the context I was familiar with.

 

https://en.wikipedia.org/wiki/Hey,_Rube!

 

(Yeah, Shoeless Joe's story is very sad. :( )



28 minutes ago, Video Beagle said:

The expression is older than Rube Waddell (the baseball player), coming from carny speak, which is the context I was familiar with. ...

Interesting.  I had heard only the baseball one.  

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 minute ago, Bombastinator said:

Interesting.  I had heard only the baseball one.  

Yeah... I knew the carny one, so I was looking for how it went from one to the other and found that.



The reason Geekbench is bad is that the current Geekbench used for these scores compares your CPU to one of the same architecture as a reference score of 1000 and adds or subtracts points. So the M1 is X number of lemons (some Snapdragon speed) and this AMD is X number of limes (some Intel i5). You are presented with relative numbers that are unrelated to each other in any way you can quantify. You would need to run the exact same job on each platform. RISC does 1 op per clock cycle, x86 is about 2.3 on average; they use different co-processors and registers and even instruction sets. Speed is near impossible to define as a single number. What Apple have done is open themselves up to a whole world of hurt in the future: when everyone else gets to TSMC 5nm and brings out high-end chips, these power/speed ratios will be crushed, as all scores always are.


7 minutes ago, Generalcrow said:

The reason Geekbench is bad is that the current Geekbench used for these scores compares your CPU to one of the same architecture as a reference score of 1000 and adds or subtracts points. So the M1 is X number of lemons (some Snapdragon speed) and this AMD is X number of limes (some Intel i5). You are presented with relative numbers that are unrelated to each other in any way you can quantify.

I am going to need a source on that claim because it does not sound true, and it is the first time I have heard of it.

 

8 minutes ago, Generalcrow said:

RISC does 1 op per clock cycle, x86 is about 2.3 on average; they use different co-processors and registers and even instruction sets. Speed is near impossible to define as a single number.

The tests Geekbench does are real-world performance tests based on real-world programs.

The fact that RISC and CISC use different instructions to accomplish the same tasks is kind of irrelevant. Geekbench tests how fast chips complete the same tasks and builds a score based on that; how they get there is kind of irrelevant.

 

Here is an analogy: Geekbench is the Nürburgring and the Geekbench score is a lap time. We have two cars: Car X (diesel) and Car Y (petrol).

Now we have two lap times.

Car X got around the track in 7:58 and Car Y got around the ring in 8:03.

 

Would you really go "well, you can't compare a diesel car vs a petrol car because they work differently. Also, car X got 2 more cylinders while car Y got a more aerodynamic chassis"?

Do the minute differences really matter when all we care about is how quickly the two cars were able to race around the track? Things like how many operations each architecture does per clock are irrelevant. I also wonder where you got those numbers from, because they are far from true.

1 operation per clock? Firestorm has 4 simple ALUs, 2 complex ALUs, and one unit dedicated to division. For FP, each core can do 4 FADD and 4 FMUL per clock. That is twice as much as even Zen 3.

 

"RISC does 1 op per clock cycle, x86 is about 2.3 average" is so hilariously and horrendously wrong I am at a loss for words. Where did you get that ridiculous idea from? The 80's?


1 minute ago, LAwLz said:

I am going to need a source on that claim because it does not sound true, and it is the first time I have heard of it.

 

http://support.primatelabs.com/kb/geekbench/interpreting-geekbench-5-scores — oddly, my data is not up to date on Geekbench 5, but how you scale an i5 to an ARM processor is a very hard question. As for your analogy: it is like trying to compare a truck to a bicycle as a way of moving building materials. You can do it, but is it the best use of the gear?

