Leaked MacBook Air GB5 benchmark shows score higher than 16-inch MacBook Pro; SC higher than 5950X

Solved by Spindel:

*DISCLAIMER* All pictures below are stolen from the Affinity forum.

 

Since apparently Geekbench is bad, let's look at the Affinity benchmark.

 

This is a i9-10900 with a RTX 2070 Super

[Affinity benchmark screenshot]

 

 

 

This is a 3900X with a GTX 1080

[Affinity benchmark screenshot]

 

 

This is the M1

[Affinity benchmark screenshot]

 

 

2 minutes ago, leadeater said:

No they don't. How many have you talked to? We have an entire creative arts department and teach hundreds of students video production and 3D modeling every year; they love iMacs and the iMac Pro. I know a few photographers; they love their iMac and MacBook Pro. In fact, I know zero people who currently have the current-generation Mac Pro. We used to have a single lab of Mac Pros, the trash can, but that was replaced with iMac Pros.

 

When?

[chart: Apple's share of the computer market over time]

I see the graph starts at 16%. There was a time when the home market share was over 50%. Even that graph proves my point, though. That time in the '90s when it was still better than it is now? That was when there was a guy outside the computer lab I worked in trying to sell Apple stock for a dollar a share in a bid to save the company. When was the Macintosh introduced on that graph?

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

1 minute ago, Bombastinator said:

I see the graph starts at 16%.  There was a time when the home market was over 50%.  Even that graph proves my point though.

No it was not.

 

[chart: historical home-computer market share by platform]

 

Get any further back in history and you'll be making an impossible argument for a company that didn't exist yet or was not selling product yet.

8 minutes ago, leadeater said:

No it was not.

 

[chart: historical home-computer market share by platform]

 

Get any further back in history and you'll be making an impossible argument for a company that didn't exist yet or was not selling product yet.

Oooh! NeXT! I remember those. That's what Steve came out with when he left Apple. They were black-and-white only, and they couldn't get a decent removable storage system for them. Our lab had only one, and basically no one used it. Yet it had a 50% share according to that graph. (I mistook the TRS-80 for the Apple II.) The Apple II was long obsolete before the NeXT was even an idea, but I see that green stripe goes almost into the '90s anyway.

Edited by Bombastinator


12 minutes ago, Bombastinator said:

Even that graph proves my point though

No it doesn't. You're making an argument on the basis of the dawn of home computing? How is that even relevant to a discussion of anything close to today, or even the last decade?

 

The points you're making about expandability cut equally against the PC/Windows side: most PCs do not have a dGPU. Most PCs never get upgraded, and most PCs today are laptops. Expandability is relegated to me, you, and everyone else who needs or still cares about it, i.e. professionals or enthusiasts, but professionals are not a single group with the same requirements. I am a professional IT worker; my work PC does not have a dGPU and it will never get upgraded in any way, it'll be lifecycle-replaced out. My colleagues in the software development team are no different: their computers will never be upgraded, and many have laptops.

 

Yes, Apple did have a higher market share for a decent period of time, but the market has changed, people's needs have changed, and people's desires have changed. Quite literally everything has changed.

3 hours ago, leadeater said:

That can only ever happen if demand for high-performance graphics and compute disappears, which it won't. High-end GPUs have 10x and greater performance than the M1, and this disparity isn't going to close at all, not in low-power SoC implementations. Even in high-power SoC implementations it's still going to be a 5x and greater performance disparity. GPU performance and development is not static, so the M1X, M2, or whatever future product is going to be judged relative to future GPUs and future software demands.

But if we look at applications like gaming, you might use tricks like TBDR to reduce the need for raw compute power with the same result.
 

When doing things that demand raw compute power in a professional setting you don't rely on a dGPU anyway; you send your model to a cluster. For example, when I worked with Autodesk Simulation CFD we didn't use a normal PC for the simulations, we sent them to a server, and as far as I know most companies that use similar software do it like that. Same thing with studios rendering 3D movies, etc.
 

2 hours ago, Bombastinator said:

That iMacs don't have upgradable GPUs is true. They're basically laptops pretending to be desktops. I think attempting to market iMacs to pros is a prime reason why the Mac computer (as opposed to phone) market has done so poorly. Pros look at those things and view them as shit. Apple as a whole has done well because their phone thing worked out, but their computer business used to have a higher percentage market share than their phone business does now. These days it's still relatively very small. "We've gone from 2% to 6%! Everyone throw confetti!" is, imho, stupid. It used to be LOTS higher.

Pros are a niche market regardless of platform, so that's really not it. And Apple only really dove into the pro angle when the 5K iMac arrived in 2014... and even then, I'd say it was more explicit with the iMac Pro in 2017. The bread and butter of the computer market has been laptops for a long time — if Apple wants to gain computer share like it has in recent years, it's MacBooks that matter.

 

Also, to answer another question: the Mac first arrived in 1984, but I believe the chart also includes the Lisa from 1983.

 

2 hours ago, Commodus said:

Pros are a niche market regardless of platform, so that's really not it. And Apple only really dove into the pro angle when the 5K iMac arrived in 2014... and even then, I'd say it was more explicit with the iMac Pro in 2017. The bread and butter of the computer market has been laptops for a long time — if Apple wants to gain computer share like it has in recent years, it's MacBooks that matter.

 

Also, to answer another question: the Mac first arrived in 1984, but I believe the chart also includes the Lisa from 1983.

 

This whole "only got into that market after 2014" thing is crap. The Apple III was intended as a pro-only device. There are educational, casual, and pro markets. The entire enterprise market, for example, which is often considered to be the one that matters for CPU companies, is almost entirely pro.

2 hours ago, Commodus said:

Pros are a niche market regardless of platform, so that's really not it. And Apple only really dove into the pro angle when the 5K iMac arrived in 2014... and even then, I'd say it was more explicit with the iMac Pro in 2017.

I'd like to respectfully disagree.

 

The "Mac Pro" line has been offered since 2006, and back then they had a product called "Xserve" that was a true server, not this rack-mount Mac Pro nonsense, and OS X Server. If anything they were more "pro" back then than they are today.

 

Even before the Mac Pro, the PowerMac G5 was a beast when it was first launched. It was one of the fastest computers you could get and definitely the fastest Mac you could get in 2003. It also introduced 64-bit computing to the general public. Apple has always catered to "enthusiasts" or "pros". It's just that since the iPhone makes up so much of their profits, they focus on it first so it seems like they don't produce any professional products.

4 minutes ago, Bombastinator said:

This whole "only got into that market after 2014" thing is crap. The Apple III was intended as a pro-only device. There are educational, casual, and pro markets. The entire enterprise market, for example, which is often considered to be the one that matters for CPU companies, is almost entirely pro.

I was referring to the iMac line specifically.

 

And when we talk about pro, there are very different definitions. I'm referring to pros as in creatives, engineers, people who actually need a powerful system to get things done. When you groused about Apple trying to aim the iMac at pros, you were clearly referring to them.

 

The "entire enterprise market" is most definitely not pro under that definition. You don't need to issue $2,000 desktops (let alone a $6,000 iMac Pro) to the large number of rank-and-file employees who won't be running more than Microsoft Office or a database client. And Apple isn't really trying to court that majority of everyday office workers with the iMac. Maybe the Mac mini and MacBooks, but even then I don't think Apple expects to compete with some mass-deployed $400 Windows box.

42 minutes ago, NotTheFirstDaniel said:

I'd like to respectfully disagree.

 

The "Mac Pro" line has been offered since 2006, and back then they had a product called "Xserve" that was a true server, not this rack-mount Mac Pro nonsense, and OS X Server. If anything they were more "pro" back then than they are today.

 

Even before the Mac Pro, the PowerMac G5 was a beast when it was first launched. It was one of the fastest computers you could get and definitely the fastest Mac you could get in 2003. It also introduced 64-bit computing to the general public. Apple has always catered to "enthusiasts" or "pros". It's just that since the iPhone makes up so much of their profits, they focus on it first so it seems like they don't produce any professional products.

I was thinking more about the iMac line than pro machines as a whole. Apple has had pro machines for most of its life, of course, but it didn't try to pitch the iMac as a pro machine until relatively recently.

5 hours ago, Spindel said:

You send your model to a cluster.

Which use dGPUs, so like I said, the need for dGPUs isn't going away or really going to significantly change. Time on these clusters is also precious, so you need a similar system, just in a lower configuration, to develop on and refine code, etc.

 

On some of our clusters the job queue is over a year; if you want to run something now and are not important, you'll be waiting a long time. That is why most researchers here also have rather high-end workstations.

 

5 hours ago, Spindel said:

But if we look at applications like gaming you might use tricks like TBDR to reduce the need for raw compute power with the same result. 

That's still never going to remove mid-range and higher dGPUs. Game developers are still going to push the limits, GPU companies are still going to push the limits of technology, and that's not going to change. So it doesn't matter what you do: an SoC with what will still be a good GPU in it is not going to replace a 150W+ dGPU. We've had SoCs with good GPU performance for a long time; nobody reduced demands to stay within their capabilities, and it's not going to happen in the future either. Lower-end dGPUs might disappear, and rightly so: they are terrible value and have no right to exist even now.

 

We never have and never will back off on software demand just to stay within the power limits of SoCs.

Between not being able to dual-boot Linux or Windows, the constant phone-home/spying thing they have going on, and circumventing VPNs... I'm going to have to pass. I really wanted to like it.

49 minutes ago, Commodus said:

I was referring to the iMac line specifically.

 

And when we talk about pro, there are very different definitions. I'm referring to pros as in creatives, engineers, people who actually need a powerful system to get things done. When you groused about Apple trying to aim the iMac at pros, you were clearly referring to them.

 

The "entire enterprise market" is most definitely not pro under that definition. You don't need to issue $2,000 desktops (let alone a $6,000 iMac Pro) to the large number of rank-and-file employees who won't be running more than Microsoft Office or a database client. And Apple isn't really trying to court that majority of everyday office workers with the iMac. Maybe the Mac mini and MacBooks, but even then I don't think Apple expects to compete with some mass-deployed $400 Windows box.

Sure, there's a big market in mass deployment, and the M1 will cover that. Does every device they sell need to have coprocessor capacity? No. The problem is there has to be one somewhere. Currently there is NO coprocessor capacity at all. It can't be done; pros are locked out. The whole Thunderbolt eGPU thing showed it may not need to be much. I'd like to see, say, an M2 and an M2 Pro, the only functional difference between the two being that the Pro would have some spare PCIe lanes one could get at. Some capacity to attach equipment Apple didn't plan for.

Edited by Bombastinator


54 minutes ago, Spindel said:

GB under rosetta on M1:

 

SC 1313

MC 5888

 

Damn impressive 

Interesting. 

 

So:

Native single core - 1687

Rosetta single core - 1313

Performance penalty: 22%

 

Native multi core - 7433

Rosetta multi core - 5888

Performance penalty: 21%

 

So probably around 20% in your typical program. Sometimes more, sometimes less. 

So as long as the M1 is around 20% faster than the current Intel chips we should see similar or better performance even in non-native apps on the Mac. 

So for example Photoshop will likely perform better on the M1 compared to the Intel Mac despite not officially supporting the M1 at launch. Then when we get the official ARM version of Photoshop the M1 lead will just get even bigger. 
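The penalty arithmetic above can be sketched in a few lines of Python (the scores are the ones quoted in this thread; the ~20% figure is just the rounded average of the two):

```python
# Rosetta 2 translation penalty, computed from the Geekbench 5
# scores quoted above: native vs. translated, both on the M1.
scores = {
    "single-core": (1687, 1313),  # (native, under Rosetta)
    "multi-core": (7433, 5888),
}

for test, (native, rosetta) in scores.items():
    penalty = (1 - rosetta / native) * 100
    print(f"{test}: {penalty:.0f}% penalty")
# single-core: 22% penalty
# multi-core: 21% penalty
```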

 

I am in the first stage of grief.

 

Might skip the other three and go straight for the last one when the numbers are confirmed.

PC specs:

Ryzen 9 3900X overclocked to 4.3-4.4 GHz

Corsair H100i platinum

32 GB Trident Z RGB 3200 MHz 14-14-14-34

RTX 2060

MSI MPG X570 Gaming Edge wifi

NZXT H510

Samsung 860 EVO 500GB

2 TB WD hard drive

Corsair RM 750 Watt

ASUS ROG PG248Q 

Razer Ornata Chroma

Razer Firefly 

Razer Deathadder 2013

Logitech G935 Wireless

3 hours ago, Commodus said:

I was thinking more about the iMac line than pro machines as a whole. Apple has had pro machines for most of its life, of course, but it didn't try to pitch the iMac as a pro machine until relatively recently.

Well, they only started positioning the iMac as a "pro" product when they realized they had walled themselves into a thermal corner with the 2013 Mac Pro design. That's why there was only one iMac Pro, and it came between the 2013 Mac Pro and the 2019 Mac Pro.

30 minutes ago, Sakuriru said:

Assuming, of course, that Rosetta performs without flaws. The jury is still out on that.

Many people with DTKs have tried Rosetta and I haven't heard anything bad about it, unless you can bring up an issue where Rosetta 2 derped.

47 minutes ago, LAwLz said:

So:

Native single core - 1687

Rosetta single core - 1313

Performance penalty: 22%

 

Native multi core - 7433

Rosetta multi core - 5888

Performance penalty: 21%

If we take this at face value, SC is still faster than the i9 found in the 16-inch MacBook Pro, and MC is only marginally lower. Still pretty impressive for being fully translated. I think Apple made the right decision when they approached translation at install time rather than at runtime like in the Intel switch.

I can't wait to see if this is a RISC vs CISC result or if this actually matters. 

On 11/14/2020 at 8:33 AM, NotTheFirstDaniel said:

It's actually not that bad.

 

It's not equal to the 5300M

No, that's pretty bad.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 auros pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 

 

On 11/15/2020 at 5:33 AM, NotTheFirstDaniel said:

But this thing just crushes the Xe graphics found in Tiger Lake. Imagine the performance when they scale it up to a 45W part.

Sorry, never mind: Intel Iris Xe Max is the dGPU variant, ignore that. Stupid name.

3 minutes ago, leadeater said:

No it actually doesn't. I'm not sure how GFXBench is aggregating their scores or how they're arranging them, but if you actually go into the detailed results, Intel Iris Xe GPUs are much faster than the M1.

 

https://gfxbench.com/device.jsp?benchmark=gfx50&os=Windows&api=gl&cpu-arch=x86&hwtype=iGPU&hwname=Intel(R) Iris(R) Xe MAX Graphics&did=88993303&D=Intel(R) Iris(R) Xe MAX Graphics

 

https://gfxbench.com/device.jsp?benchmark=gfx50&D=Apple+M1&testgroup=overall

 

The overall score and rank on GFXBench are highly dubious when you look at the detailed results; the M1 should be much further down the list.

I think the Metal tests use Vsync. That would explain why the onscreen tests are capped at 60 FPS while the offscreen ones gain significant ground over the Xe MAX.

 

Yes, looking at the 5300M benchmarks, for some reason the Metal test caps at 60FPS.

 

And I was comparing it to the regular Xe. I had no idea there was even an Xe Max...

1 minute ago, NotTheFirstDaniel said:

I think the Metal tests use Vsync. That would explain why the onscreen tests are capped at 60 FPS while the offscreen ones gain significant ground over the Xe MAX.

 

Yes, looking at the 5300M benchmarks, for some reason the Metal test caps at 60FPS.

 

And I was comparing it to the regular Xe. I had no idea there was even an Xe Max...

Don't worry, check my edit. Stupid Intel names for very different things, lol.
