
Apple M1 Ultra - 2nd highest multicore score, lost to 64-core AMD Threadripper.

TheReal1980
57 minutes ago, saltycaramel said:

All the latest clues and sifting through patents put together.

Hell no, that video is absolutely clickbait and is absolutely not what Apple is doing.

 

We know this because the people reverse-engineering the chip (primarily Hector Martin of Asahi Linux) warned, when the M1 Max first came out, that it had extra IRQ controller slots for a second chip and that a dual-chip M1 Max was incoming, which happened. However, the M1 Max does not contain the hardware necessary for a 4-chip design: multiple parts of the chip had capacity for two, but not four, and Hector Martin has dismissed Bloomberg's report of a 4-chip M1 Max as standard Bloomberg garbage misreporting.

 

The next Apple chip will be a completely different design with its own capabilities. Hector Martin has reported that you can actually see this at the hardware level pretty easily: the M1 (original) is literally just a rebranded A14X, complete with Apple's 1st-generation Apple Interrupt Controller (AIC), which goes all the way back to the original iPhone. The M1 Pro, M1 Max, and M1 Ultra are all based on one design (with a brand-new 2nd-generation AIC): the M1 Pro is almost exactly an M1 Max with 16 of the GPU cores cut off, and the M1 Ultra is 2 M1 Maxes linked together. The super-pro models (Apple X1?) will most likely all be based on one design again, but it won't be M1 Max-based.

 

This means Apple really only has three CPU designs architecturally:

 

- The A-series (A14, and actually the original M1, which is just a rebranded A14X)

- The M1 Max series (M1 Pro [cut down Max], M1 Max, M1 Ultra [doubled Max])

- Series Three For "Extreme Workloads" which we haven't seen yet


3 hours ago, Dracarris said:

latency basically also always benefits when you make distances shorter or electrical properties (resistance, inductance) of the link better. As for software support it looks a lot like there doesn't need to be a lot as at least a large part is managed in HW; we already know e.g., that the GPUs present themselves as a single device to the OS.
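The quoted latency point can be roughed out numerically. This is a minimal sketch; the ~0.5c propagation speed and the distances below are my own illustrative assumptions, not figures from the post:

```c
/* Rough flight-time floor for a chip-to-chip link.
 * Assumes signals propagate at ~0.5c in a copper trace,
 * i.e. 1.5e8 m/s, which is 0.15 mm per picosecond. */
double flight_time_ps(double distance_mm) {
    const double v_mm_per_ps = 0.15;
    return distance_mm / v_mm_per_ps;
}
```

A hypothetical 2.5 mm die-to-die bridge gives a floor of roughly 17 ps per hop, versus roughly 333 ps for a 50 mm board trace: over an order of magnitude, before any protocol overhead, which is why shrinking the link helps latency.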

Hmm, if the final words of the last presentation are anything to go by, this was the last iteration of M1. They could possibly do a dual-socket M1 Ultra, but I am not sure the I/O of the current die supports that. I'd heavily root for M2. After all, the base M1 was announced quite some time ago; M2 has surely been in the making for quite a while, and we'll see a public announcement this fall at the latest.

I agree. M2 makes a lot more sense than M1++. The Blizzard+Avalanche cores of the A15 will be more than a year old by the time we see the M2; in fact, I wouldn't be surprised if they use the A16 cores. In addition, the I/O capabilities of the M1 platform are lacking, especially at the very high/professional end of the market. Also, the M2 would probably dump the cool tricks that Apple implemented to get Rosetta working. The M2 would probably start a series of incremental upgrades for the M-series platform (after all, TSMC has been having some trouble at N3: https://english.etnews.com/20210901200003 and the M-series chips use A-series cores).
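For context on those "cool tricks": the best known is a hardware TSO (total store ordering) mode, so translated x86 code keeps x86's stronger memory-ordering guarantees for free. Without it, a translator has to pin the ordering down with explicit acquire/release operations, roughly like this sketch (names are illustrative, not Rosetta's actual code):

```c
#include <stdatomic.h>

static _Atomic int flag = 0;
static int data = 0;

/* On x86, a plain store is already release-ordered, so translated code
 * gets this guarantee for free under hardware TSO. On weakly ordered
 * ARM without TSO, the translator must emit the barrier explicitly. */
void producer(void) {
    data = 42;
    atomic_store_explicit(&flag, 1, memory_order_release);
}

int consumer(void) {
    /* The acquire load pairs with the release store: if we see flag == 1,
     * we are guaranteed to also see data == 42. */
    return atomic_load_explicit(&flag, memory_order_acquire) ? data : -1;
}
```

Emitting such barriers around every translated memory access is costly, which is presumably why Apple put TSO in hardware; dropping it would make Rosetta-translated code slower, not impossible.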


7 hours ago, hishnash said:

GB is not hand-coded in assembly; it will be mostly C/C++, so really this is a judgment of the quality of the C/C++ compilers

Don't doubt the compiler. At this point, the differences between C/C++ compilers and their various versions on different architectures/platforms are negligible (duh, your new platform is dead if it has a crappy C/C++ compiler).

So unless the GB devs are using some out-of-whack/different standards for their programs, or something like icc/i++, no, the compiler is not the issue.

 

It might have inline asm, though, so don't discount assembly completely (scratch the above statement; inlining is not perfect everywhere).
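To illustrate the inline-asm caveat: a benchmark written "mostly in C/C++" can still carry small architecture-specific asm islands that the compiler never touches. A hypothetical sketch (not Geekbench's actual code):

```c
#include <stdint.h>

/* Rotate left by n (1..31). The AArch64 path bypasses the compiler's
 * instruction selection entirely; the fallback is the portable idiom,
 * which modern compilers recognize and lower to a single rotate anyway. */
static inline uint32_t rotl32(uint32_t x, unsigned n) {
#if defined(__aarch64__)
    uint32_t r;
    /* AArch64 has no rotate-left instruction, so rotate right by 32-n. */
    __asm__("ror %w0, %w1, %w2" : "=r"(r) : "r"(x), "r"(32u - n));
    return r;
#else
    return (x << n) | (x >> (32u - n));
#endif
}
```

Whether such islands exist in GB is exactly the open question; where they do, those code paths measure the hand-written asm rather than the compiler.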


1 hour ago, gjsman said:

Hell no, that video is absolutely clickbait and is absolutely not what Apple is doing.

 

We know this because the people reverse-engineering the chip (primarily Hector Martin of Asahi Linux) warned, when the M1 Max first came out, that it had extra IRQ controller slots for a second chip and that a dual-chip M1 Max was incoming, which happened. However, the M1 Max does not contain the hardware necessary for a 4-chip design: multiple parts of the chip had capacity for two, but not four, and Hector Martin has dismissed Bloomberg's report of a 4-chip M1 Max as standard Bloomberg garbage misreporting.

 

The next Apple chip will be a completely different design with its own capabilities. Hector Martin has reported that you can actually see this at the hardware level pretty easily: the M1 (original) is literally just a rebranded A14X, complete with Apple's 1st-generation Apple Interrupt Controller (AIC), which goes all the way back to the original iPhone. The M1 Pro, M1 Max, and M1 Ultra are all based on one design (with a brand-new 2nd-generation AIC): the M1 Pro is almost exactly an M1 Max with 16 of the GPU cores cut off, and the M1 Ultra is 2 M1 Maxes linked together. The super-pro models (Apple X1?) will most likely all be based on one design again, but it won't be M1 Max-based.

 

This means Apple really only has three CPU designs architecturally:

 

- The A-series (A14, and actually the original M1, which is just a rebranded A14X)

- The M1 Max series (M1 Pro [cut down Max], M1 Max, M1 Ultra [doubled Max])

- Series Three For "Extreme Workloads" which we haven't seen yet

 

While it’s on the clickbaity side (it’s always risky to get too excited about patents that may depict products of the distant future, if ever), I would save a couple of things from that video:

 

- If Apple knew it would miss its own 2-year deadline (June 2020 to June 2022) for the transition, they wouldn’t keep reminding us about the “transition”; and if Apple knew the MacPro was still far away, they wouldn’t (in a rare move for Apple) discourage potential MacStudio buyers with that MacPro comment. So it’s safe to say we’re going to see the new MacPro on June 6, 2022 at around 10:20 AM Cupertino time (after 1:20h of software presentation; I’ve checked WWDC 2019)

 

- He does address the IRQ thing, and that’s one of the reasons he believes the new MacPro will not be based on a derivative of the M1 Max.

 

That’s where his take and yours diverge. He believes the MacPro will be based on the M2 Ultra Duo (A15 generation); you believe it will be based on the X1 Ultra (A14 generation). Both theories have some merit. Yours is certainly more in line with how things have always worked in CPU fabbing: you start a generation with the smaller chips (like Intel U-chips) and finish 2-3 years later with the bigger ones (like Xeons). How could Apple release the M2 in May and already ship the M2 Ultra Duo in June?
 

On the other hand

- actually, Apple/TSMC have been pumping out A15 and M2 chips for a while now; mass production of the A15 started in May 2021, so we’re not that early in the A15 generation

- MacPros are always just a paper launch at WWDC, a preview, with actual availability at the end of the year and widespread availability into the new year, so it’s not unthinkable to have an M2 Max/Ultra/Ultra_Duo ready for a low-volume desktop in late 2022/early 2023

- we have zero rumors about the “X1”, not a codename, nothing, unless the “Lifuka” chip is not a bespoke GPU but the X1 extreme chip

- as I said, the financial reasoning behind developing a dedicated chip for the MacPro is an open question, unless it’s the foundation for something bigger and more profitable, like an industry of virtual experiences for the VR headset

 

 

X1 or M2 Ultra Duo? In 86 days we’ll probably know. 


In his latest newsletter, Mark Gurman proposes (among other things and general ad-libbing) a marketing loophole I had already thought of: just call the MacPro 4-chiplet configuration a “Dual M1 Ultra”, so “technically” it would still be true that the “M1 Ultra” is the last member of the M1 family; you can just get two of them in the MacPro. If they go the “dual socket” route, I suppose the IRQ limitation would also be moot.


1 hour ago, saltycaramel said:

In his latest newsletter, Mark Gurman proposes (among other things and general ad-libbing) a marketing loophole I had already thought of: just call the MacPro 4-chiplet configuration a “Dual M1 Ultra”, so “technically” it would still be true that the “M1 Ultra” is the last member of the M1 family; you can just get two of them in the MacPro. If they go the “dual socket” route, I suppose the IRQ limitation would also be moot.

Sure not impossible, but somehow I seriously doubt we will get a multi socket system.


Just stumbled on this tweet (from a leaker that’s not necessarily super reliable)

 

 

I see people in the replies not taking into account that by the time the MacPro is out, 64GB LPDDR5X Samsung modules will be available, so even forgoing half of the modules (because of that arrangement) could still mean 512GB max RAM for the MacPro... plus whatever they come up with for the user-serviceable pool.
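The arithmetic behind that 512GB figure can be sketched out. The die and site counts below are my own assumptions for illustration; the tweet gives no exact arrangement:

```c
/* Max RAM if only a fraction of the LPDDR5X package sites are populated.
 * populated_divisor = 2 means half the sites carry a module. */
unsigned max_ram_gb(unsigned dies, unsigned sites_per_die,
                    unsigned populated_divisor, unsigned module_gb) {
    unsigned total_sites = dies * sites_per_die;
    return (total_sites / populated_divisor) * module_gb;
}
```

With a hypothetical 4 dies, 4 sites per die, half the sites populated, and 64GB modules, max_ram_gb(4, 4, 2, 64) gives 512.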


On 3/9/2022 at 8:33 PM, Imbadatnames said:

It’ll be more than that. The M1 Max can draw 110W at full load with a normal boost of around 60 in applications. Should be roughly double plus the iPSU and other components that draw power. Expect 250-300

He was talking about only the CPU part.


On 3/11/2022 at 7:39 PM, LAwLz said:

Source on this claim? I can't find any evidence that this is true. 

I suspect that it's one of those myths that has been repeated enough times to the point where it's "common knowledge" that doesn't get checked for validity. 

 

Ignore what I said; it seems to be true only for their ML and Compute benchmarks, not for GB5 (if this answer is still to be believed in 2022)



I'm calling it... Apple Silicon is headed for a sticklebrick design!



On 3/11/2022 at 12:07 PM, atxcyclist said:

Yes, really. Apple having a closed ecosystem and non-upgradable hardware will not fly in many industries. You as well as many of the other Apple fanboys here don’t understand the needs/wants of x86 power users.

 

I don’t have to watch my back, lol. You Apple fanboys are in the idle threats game now? You going to break my kneecaps if I don’t change?
 

My industry doesn’t use and isn’t supported by Apple hardware or software. You fanboys need to put the Kool-Aid down and realize we don’t all live in your little ecosystem and don’t want to.
 

I am the IT at my office, we’re not switching to Apple ever because it doesn’t work for us. I know this better than you do, I’ve been in this business for two decades.

Amusingly, my wife's company recently fired 2/3 of their IT department because they wouldn't get onboard with moving from Windows to MacOS. IT has as much power as the company wants them to have.

 

She's now happily on a 16" MacBook Pro M1 Max, after a decade of hating her Lenovos.


45 minutes ago, Obioban said:

Amusingly, my wife's company recently fired 2/3 of their IT department because they wouldn't get onboard with moving from Windows to MacOS

That sounds rather extreme. I'm glad I live somewhere with employment rights.


1 hour ago, Obioban said:

Amusingly, my wife's company recently fired 2/3 of their IT department because they wouldn't get onboard with moving from Windows to MacOS. IT has as much power as the company wants them to have.

In 2016 (though this was before the Apple keyboard fiasco and before M1), IBM found that, despite their upfront cost, Macs were between $273 and $543 per device cheaper in the long run. They also found that only 5% of Mac users called the help desk, versus 40% of PC users. Mac users also closed 16% more high-value deals than PC users, and 22% more Mac users exceeded expectations in performance reviews. It also took only 7 engineers to support 200,000 Macs, whereas it took 20 engineers to manage 200,000 PCs.

 

https://www.computerworld.com/article/3131906/ibm-says-macs-are-even-cheaper-to-run-than-it-thought.html

 

Of course, the IT department most likely has a longstanding bias against macOS, but regardless:

 

34 minutes ago, Paul Thexton said:

That sounds rather extreme. I'm glad I live somewhere with employment rights.

IANAL, but refusing a legitimate company order that isn't blocked by nondiscrimination or other protection rules is absolutely grounds for being fired, and you will never win a lawsuit over it. Otherwise, what's preventing, say, Amazon employees from having accounting raise their salaries 25% against the corporate board's approval? A business would collapse under such anarchic rule.


On 3/13/2022 at 1:19 PM, saltycaramel said:

Just stumbled on this tweet (from a leaker that’s not necessarily super reliable)

 

 

I see people in the replies not taking into account that by the time the MacPro is out, 64GB LPDDR5X Samsung modules will be available, so even forgoing half of the modules (because of that arrangement) could still mean 512GB max RAM for the MacPro... plus whatever they come up with for the user-serviceable pool.

Apple building bridges for everything lol. 

CPU Legos



1 hour ago, Obioban said:

Amusingly, my wife's company recently fired 2/3 of their IT department because they wouldn't get onboard with moving from Windows to MacOS. IT has as much power as the company wants them to have.

 

She's now happily on a 16" MacBook Pro M1 Max, after a decade of hating her Lenovos.

I imagine your wife can't share exactly why those IT workers objected to the Mac switch, but that strikes me as oddly petty. The only valid reason I see for mounting an objection would be a serious technical issue (a missing app or an overly clunky workaround, for example). After that, it might be down to a childish Anything But Apple bias, an unwillingness to change or even a fear that the Macs might put them out of a job (through improved stability or just a lack of training). In those last cases... well, your job as an IT manager is to make the technology do what the company leadership wants, not what keeps you in your comfort zone.


25 minutes ago, gjsman said:

IANAL, but refusing to do a legitimate company order that is not blocked by nondiscrimination or other protection rules, is absolutely grounds for being fired and you will never win a lawsuit.

Yeah that’s fair, I guess it depends on how exactly the IT folk in question were behaving re: the company decision to use macs


13 minutes ago, Commodus said:

I imagine your wife can't share exactly why those IT workers objected to the Mac switch, but that strikes me as oddly petty. The only valid reason I see for mounting an objection would be a serious technical issue (a missing app or an overly clunky workaround, for example). After that, it might be down to a childish Anything But Apple bias, an unwillingness to change or even a fear that the Macs might put them out of a job (through improved stability or just a lack of training). In those last cases... well, your job as an IT manager is to make the technology do what the company leadership wants, not what keeps you in your comfort zone.

I’ve worked with people who actively refuse to engage with anything Linux based and in a few cases went so far as to migrate in-use systems to something Windows based instead. I don’t understand people willingly turning down the opportunity to learn/upskill.


22 minutes ago, Paul Thexton said:

Yeah that’s fair, I guess it depends on how exactly the IT folk in question were behaving re: the company decision to use macs

If I had to guess, it went down like this:

 

Management: We are changing to Macs for our workforce.

IT: No we don't want to!

Management: You have to.

IT: Nope we don't!

Management: Switch or you are fired.

IT: Then fire us!

 

I seriously doubt the conversation was:

Management: We are changing to Macs.

IT: We don't want to.

Management: Everyone who said no is immediately fired.


11 minutes ago, Paul Thexton said:

I’ve worked with people who actively refuse to engage with anything Linux based and in a few cases went so far as to migrate in-use systems to something Windows based instead. I don’t understand people willingly turning down the opportunity to learn/upskill.

I know Linux can be a challenge, but egads... if you're in IT, Linux shouldn't be a completely foreign concept. As a Mac user I still know a number of common Unix commands and am familiar with concepts like user and app permissions.

 

Besides, it's not like the company would force them to install Ubuntu or Debian on their personal laptop.


1 hour ago, LAwLz said:

If I had to guess, it went down like this:

 

Management: We are changing to Macs for our workforce.

IT: No we don't want to!

Management: You have to.

IT: Nope we don't!

Management: Switch or you are fired.

IT: Then fire us!

 

I seriously doubt the conversation was:

Management: We are changing to Macs.

IT: We don't want to.

Management: Everyone who said no is immediately fired.

Well, the problem here, as so often, is that this does not mean management's desire to move to macOS was the right call at all. And I truly object to this notion of not listening to IT about the area they are experts in. My team has to fight this kind of idiocy all the time, about many different things, because someone somewhere has been pitched something by marketing, has no clue at all, but tries to push for it anyway.

 

Everything is a two-way street: IT, finance, facilities management, etc. If the reasoning for the change was nothing more than "trendy and new, and to hell with everything else", then I'd happily leave the company first.

 

Sometimes the company is crap and is not worth working for.


On 3/12/2022 at 1:04 PM, WolframaticAlpha said:

I agree. M2 makes a lot more sense than M1++. The Blizzard+Avalanche cores of the A15 will be more than a year old by the time we see the M2; in fact, I wouldn't be surprised if they use the A16 cores. In addition, the I/O capabilities of the M1 platform are lacking, especially at the very high/professional end of the market. Also, the M2 would probably dump the cool tricks that Apple implemented to get Rosetta working. The M2 would probably start a series of incremental upgrades for the M-series platform (after all, TSMC has been having some trouble at N3: https://english.etnews.com/20210901200003 and the M-series chips use A-series cores).

There's no way Apple ditches the Rosetta bits in M2. Way too soon.


Since the discussion has shifted a bit in here, I will chip in with my experiences.
 

During my years working, there have been two major hurdles to getting a Mac at work. One is a lack of support for needed software (hard to do anything about), and the second is IT vehemently opposing Mac usage just because (even when no software need limits the choice).

 


20 minutes ago, Obioban said:

There's no way Apple ditches the Rosetta bits in M2. Way too soon.

Well I sincerely hope they won't. That would be a bummer.


On 3/11/2022 at 12:59 PM, atxcyclist said:

 

There’s nothing ironic about what I’ve stated,

r/woooosh

He's saying that when you said "I'm not a part of your little ecosystem, we only have x86", you essentially said "I'm not a part of your ecosystem, BECAUSE I have my own tiny ecosystem."

Your company is definitely not big enough to be compared to the entirety of the market, and therefore you can go and shove your company experience up your butt. There is a reason Apple still exists. The reason they have such a large market share (compared to anything that isn't Windows; and even then, Windows is losing significant market share to Apple and Chromebooks) is that there are people out there who find macOS better for their workloads. You personally may not need that for your company, but other companies do use these things, even companies you might not think of.

My father has a non-profit advocate law firm, and they tried using Google Drive, Dropbox, and other software I had literally never heard of before to share files among all of his employees when they went digital. Most of his employees had Windows devices; my father and one other person had Macs. They had a horrible time getting Dropbox to work properly just between Windows computers, let alone Windows to Mac and vice versa. But the Macs had zero issues with Dropbox. So, what do you think they did? That's right: the entire firm transitioned over to Macs, with the only software issues being the result of a bug in accounting software.

