
Apple will allow Linux VMs to run Intel apps with Rosetta in macOS Ventura

Spindel
35 minutes ago, mr moose said:

It might just be specific to the version he uses or it might be something more specific to command line interface.  @hishnash might know better.

The main issue with doing Ruby dev on Windows is that your production env is very unlikely to be NT based, so things like file access, networking etc. are very different from production. This can be a big perf hit if you're doing a lot of file IO or network IO (e.g. a web socket server on Windows needs to be written very much with the Windows network stack in mind, and the low-level server parts of languages like Ruby don't bother with that, as who would run Windows as a web socket server).

Using a POSIX-style OS (Linux, macOS, BSD etc.) means the optimised path (by the core devs) is the path you're using on your machine.

While you can do Ruby, Python etc. dev on Windows, you really should only do that if your production env is Windows; otherwise you're just making extra pain for yourself... and if you're doing server dev and your production env is Windows, then you really need to have a long talk with your CTO to find out what backwards tech stack constraints have led to this nightmare of hell you are forced to live in. (From a dev who has had to deal with this and does not want to ever need to deal with it again.)


1 hour ago, mr moose said:

Every laptop I have owned to date (bar one) has allowed me to change out RAM, HDD, M.2. All have been able to be easily opened and cleaned by myself without special tools or risk of breaking anything. Given the tendency of hardware companies to glue everything together, this is becoming harder and harder to do, and so is something I am very wary of.

That's great for you, and I'd love to see those devices continue to exist as an option going forward, since options are always good. However, when going for ease of repairability and upgradeability you have to sacrifice performance and battery life due to physics: you can't have LPDDR5 DIMMs, since that memory needs to be soldered to reach such high speeds and low voltages. The same applies to other components.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


6 hours ago, mr moose said:

I never said it was,

You literally said that. I quoted it, bolded it, underlined it and italicized it. I'll make the text bigger for you if it still wasn't obvious

Quote

Please show me the misinformation in the simple postulation that a hardware company preventing an end user from running their own software is a bad concept.

 

6 hours ago, mr moose said:

this whole thread the only thing I have claimed is that when a title says "X will allow", the premise follows that X can also disallow,  and that should never be the case for hardware makers.  I really don't know how you or any of the others managed to take that one simple concept and screw it into the mess of licensing and apple hate that you did.

So basically you are admitting that you jumped to a conclusion. It's basically equivalent to stating a completely unrelated opinion in a thread. Neither the title nor the opinions posted on this forum change the actual news in any way, yet you let what was said in the title lead you down a wrong path of "opinions" and "arguments", and you are still for some reason defending your wildly incorrect conclusion. And mind you, it was pointed out at the very beginning that the title was misleading, probably to serve the iHateApple bandwagon that exists on the forum. Just admit it and move on. Stop digging deeper and deeper.


4 hours ago, RedRound2 said:

You literally said that. I quoted it, bolded it, underlined it and italicized it. I'll make the text bigger for you if it still wasn't obvious

You claimed that I said I owned Rosetta, and to prove it you posted a quote from me that says nothing of the sort, yet you claim it literally does. I don't know if you even know what the word literal means, but it certainly doesn't mean that saying "software I own" equals "I own Rosetta".

4 hours ago, RedRound2 said:

So basically you are admitting that you jumped to a conclusion. It's basically equivalent to stating a completely unrelated opinion in a thread. Neither the title nor the opinions posted on this forum change the actual news in any way, yet you let what was said in the title lead you down a wrong path of "opinions" and "arguments", and you are still for some reason defending your wildly incorrect conclusion. And mind you, it was pointed out at the very beginning that the title was misleading, probably to serve the iHateApple bandwagon that exists on the forum. Just admit it and move on. Stop digging deeper and deeper.

 

So, back to the "you hate Apple" arguments. I never projected any hate towards Apple and made very clear attempts to explain that, but you are just ignoring it to carry on whatever delusion you are under. Mate, you can't even quote where I said the things you claim.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Just for those who want clarity on my postulations I posted this a while back:

 

 

On 6/10/2022 at 9:45 PM, mr moose said:

This has nothing to do with being only about apple or being about rosetta. It is literally just a meandering thought that was spawned by the title. If running intel apps on apple M1 hardware is not a problem then the title is erroneous, but if there is a problem then it is reflective of current corporate tech attitudes towards consumers, and heaven help any poor soul who doesn't see a problem with that.

 

 

 

 

To read that again, this has nothing to do with rosetta or being only about apple, it is literally just a meandering thought spawned by the title that suggested hardware companies could control what software you run.    Even though @RedRound2  wants us to believe something entirely different.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


8 hours ago, mr moose said:

Just for those who want clarity on my postulations I posted this a while back:

 

 

 

 

To read that again, this has nothing to do with rosetta or being only about apple, it is literally just a meandering thought spawned by the title that suggested hardware companies could control what software you run.    Even though @RedRound2  wants us to believe something entirely different.

At this point, your comments just seem off topic.

 

The Title is misleading. You know this now. Therefore you understand that the comments you were making do not apply to this topic.

 

If you wish to discuss the wider ramifications of the unrelated issue of companies allowing or disallowing software on different hardware, I'd suggest creating a new thread topic to pursue that discussion, rather than keeping it off-topic in here.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


11 hours ago, mr moose said:

So, back to the "you hate Apple" arguments. I never projected any hate towards Apple and made very clear attempts to explain that, but you are just ignoring it to carry on whatever delusion you are under. Mate, you can't even quote where I said the things you claim.

I think this is a problem with tech discussion in general: when someone brings up a concern with anything, you're suddenly "anti company X" for not just blindly praising everything the company does.

22 hours ago, igormp said:

That's great for you, and I'd love to see those devices continue to exist as an option going forward, since options are always good. However, when going for ease of repairability and upgradeability you have to sacrifice performance and battery life due to physics: you can't have LPDDR5 DIMMs, since that memory needs to be soldered to reach such high speeds and low voltages. The same applies to other components.

Unless there is a comparison of two identically specced laptops showing the difference in power draw, I have to doubt the significance of the power consumption of a system that has DIMM slots and an M.2 SSD, and in addition a battery that isn't sealed into the system. Although I would rather take the tradeoff of slightly less battery life for a device I can keep much longer, and that I can easily clean and upgrade if I want to.


2 hours ago, Blademaster91 said:

Unless there is a comparison of two identically specced laptops showing the difference in power draw, I have to doubt the significance of the power consumption of a system that has DIMM slots and an M.2 SSD, and in addition a battery that isn't sealed into the system. Although I would rather take the tradeoff of slightly less battery life for a device I can keep much longer, and that I can easily clean and upgrade if I want to.

You just need to look at the difference between LPDDR4 and DDR4 SO-DIMMs; the former has lower voltage and therefore lower current needs while being able to achieve higher speeds. The same applies to LPDDR5 vs DDR5.

3~5W of difference for a device that aims to race to idle is significant.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


54 minutes ago, igormp said:

You just need to look at the difference between LPDDR4 and DDR4 SO-DIMMs; the former has lower voltage and therefore lower current needs while being able to achieve higher speeds. The same applies to LPDDR5 vs DDR5.

3~5W of difference for a device that aims to race to idle is significant.

LPDDR4 comes as SO-DIMMs and also soldered on the main board. LPDDR4x does not come in SO-DIMM form and is either on the system board or on package; however, it's also half the bus width of DDR4 (as is LPDDR4, btw), so it's only faster than DDR4-3200 in the highest MT/s variants (by ~40%). If you have faster than 3200 then the difference is a bit smaller.

 

LPDDR4 rather than LPDDR4x just isn't that widely used or popular; it's lower bandwidth for not much power savings. It has the drawbacks of LPDDR4x without the significantly increased MT/s and 0.6V I/O voltage.

 

Also, DDR4, LPDDR4 and LPDDR4x all use a 1.8V DRAM array voltage. DDR4 uses 1.3V core and I/O, LPDDR4 uses 1.1V core and I/O, LPDDR4x uses 1.1V core and 0.6V I/O.

 

LPDDR4x is 20% more power efficient than LPDDR4. Not sure of the difference between DDR4 and LPDDR4, but I'd suspect less than 20%.
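As a rough illustration of why those voltage numbers matter, dynamic switching power scales roughly with the square of the supply voltage; this is only a sketch of the voltage term and ignores frequency, termination, refresh and the DRAM array itself:

```swift
// Rough C * V^2 * f scaling sketch: not a real DRAM power model, just the voltage term.
let ddr4Volts = 1.3      // DDR4 core and I/O voltage
let lpddr4Core = 1.1     // LPDDR4 / LPDDR4x core voltage
let lpddr4xIO = 0.6      // LPDDR4x I/O voltage

// At equal capacitance and frequency, switching power ratio is (V_low / V_high)^2.
let coreRatio = (lpddr4Core / ddr4Volts) * (lpddr4Core / ddr4Volts)   // ≈ 0.72
let ioRatio = (lpddr4xIO / ddr4Volts) * (lpddr4xIO / ddr4Volts)       // ≈ 0.21

print("LPDDR4 core switching power relative to DDR4:", coreRatio)
print("LPDDR4x I/O switching power relative to DDR4:", ioRatio)
```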

 

Also note most LPDDR5 in use today is 1.05V I/O. I would assume Apple is using 0.5V I/O since theirs is on package, but I am just assuming that. This is in respect to laptop devices, i.e. the Mx family. Maybe most LPDDR5 in phones and tablets is 0.5V I/O, but I don't really know because they don't interest me.

 

https://www.micron.com/products/dram/lpdram/lpddr5

 

Quote

A 64GB laptop configured with two 32GB DDR4 modules consumes less than 4.6 watts (W) in active mode and less than 1.4W when idle. 

https://semiconductor.samsung.com/newsroom/news/the-industrys-first-32gb-ddr4-sodimm/#:~:text=A 64GB laptop configured with,than 1.4W when idle.

 

So long story short, I doubt it's as high as 3W-5W difference between SO-DIMM, soldered or on package because regular old DDR4 is only 4.6W for two DIMMs. LPDDR4x being 5W less than SO-DIMM means it's negative power usage 🙃

 

It'll be more like 1.5W, maybe on the higher side. So for devices with 50Wh and larger batteries I can't realistically see how anyone would notice a significant difference in run time, idle or not.
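To put rough numbers on that, here's a back-of-the-envelope sketch; the system draw figures and the 0.5W idle-case delta are assumed values purely for illustration, not measurements:

```swift
// Hypothetical runtime impact of extra memory power on a 50Wh battery.
// All draw figures below are assumed for illustration only.
let batteryWh = 50.0

func runtimeHours(systemW: Double, memoryDeltaW: Double) -> Double {
    batteryWh / (systemW + memoryDeltaW)
}

// Light/idle-ish use: memory is mostly idle, so the soldered-vs-SO-DIMM delta is small.
let idleBase  = runtimeHours(systemW: 5.0, memoryDeltaW: 0.0)   // 10.0 h
let idleExtra = runtimeHours(systemW: 5.0, memoryDeltaW: 0.5)   // ≈ 9.1 h

// Active use with the ~1.5W worst-case delta estimated above.
let loadBase  = runtimeHours(systemW: 15.0, memoryDeltaW: 0.0)  // ≈ 3.3 h
let loadExtra = runtimeHours(systemW: 15.0, memoryDeltaW: 1.5)  // ≈ 3.0 h

print(idleBase, idleExtra, loadBase, loadExtra)
```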


51 minutes ago, leadeater said:

So long story short, I doubt it's as high as 3W-5W difference between SO-DIMM, soldered or on package because regular old DDR4 is only 4.6W for two DIMMs. LPDDR4x being 5W less than SO-DIMM means it's negative power usage 🙃

 

It depends on whether you are aiming to match just capacity or also bandwidth. The M2 has 100GB/s of bandwidth; to do this with DDR5 you're going to need 3 full size DIMMs (not SO-DIMMs). While per GB, DDR5 DIMMs are not going to draw much more power, if you need to 3x them to match the bandwidth they will draw a lot more power, and good luck fitting a battery in the laptop while you're at it.


14 hours ago, dalekphalm said:

At this point, your comments just seem off topic.

 

The Title is misleading. You know this now. Therefore you understand that the comments you were making, do not apply to this topic.

 

If you wish to discuss the wider ramifications of the unrelated issue of companies allowing or disallowing software on different hardware, I'd suggest creating a new thread topic to pursue that discussion, rather than keeping it off-topic in here.

 

My comments only ever applied to the title, and I explained several times that that's all they were. The reason I posted in this thread was because it was this thread's title that spawned the thoughts. My comments could simply have died on the first page if people had actually bothered to accept my explanation rather than insist I was saying a whole lot of stuff I explicitly said I was not saying.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


9 hours ago, hishnash said:

It depends on whether you are aiming to match just capacity or also bandwidth. The M2 has 100GB/s of bandwidth; to do this with DDR5 you're going to need 3 full size DIMMs (not SO-DIMMs). While per GB, DDR5 DIMMs are not going to draw much more power, if you need to 3x them to match the bandwidth they will draw a lot more power, and good luck fitting a battery in the laptop while you're at it.

In the DDR5 generation the bandwidth is the same between DIMM and SO-DIMM; the data channels, bus widths etc. are the same, which was not the case for the DDR4 generation. SO-DIMMs are currently limited to lower MT/s data rates, but a DDR5-4800 SO-DIMM is 38GB/s per channel pair (i.e. per module), so 3 SO-DIMMs would also do it; however, in reality it'll likely take 4 to actually achieve 100GB/s (or a bit more).
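The arithmetic behind those module counts, as a quick sketch (the 80% achievable-efficiency figure is an assumption for illustration, not a measured number):

```swift
import Foundation

// DDR5-4800: one DIMM/SO-DIMM is a 64-bit channel pair, i.e. 8 bytes per transfer.
let perModuleGBps = 4_800_000_000.0 * 8.0 / 1e9                     // 38.4 GB/s peak

let targetGBps = 100.0                                              // M2-class bandwidth

let nominalCount = Int(ceil(targetGBps / perModuleGBps))            // 3 on paper
// Assume ~80% achievable efficiency (an illustrative figure, not a measurement).
let realisticCount = Int(ceil(targetGBps / (perModuleGBps * 0.8)))  // 4 in practice

print(perModuleGBps, nominalCount, realisticCount)
```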

 

But this comes with the assumption that you need that bandwidth in the first place, which in general you don't.

 

LPDDR5 also has a lot more power saving features, with the biggest gains at idle.

 

Quote

Thanks to the use of Dynamic Voltage Scaling (DVS), LPDDR5 can support two voltage modes: 1.05V (C) and 0.5V (I/O), while operating at higher frequencies and 0.9V (C) and 0.3V (I/O) while operating at lower frequencies.

 

LPDDR5 features include a new scalable clocking architecture for command/address (C/A) clock (CK) to allow easier SoC timing closure, and most of the features of DDR5 such as decision feedback equalizer (DFE), Write X feature to save power, and link ECC to enhance memory channel RAS.

 

LPDDR5 is up to 45% more power efficient than LPDDR4x; note the "up to", though.

 

Apple's main power saving with their LPDDR5 packages is likely from using denser modules with a data width of 64 bits per package. For SO-DIMMs, to save cost you'd opt for less dense modules at 32 bits per package and have more packages in total on the SO-DIMM. So for the power saving you're still comparing a 128-bit solution vs a 128-bit solution, just with different numbers of DRAM packages. A denser, wider data bus DRAM package will use more power but be more efficient per capacity and bandwidth.

 

So again, not really; the power difference between what Apple is doing and SO-DIMMs still just isn't that big. We're still talking low watts, probably below 5W.

 

The bigger thing to remember is that each small thing you do to gain more power efficiency adds up; it's not just a single thing that makes a large difference to the total end product.

 

Also, 4 SO-DIMM laptops have existed before; it's actually not that much of a problem space-wise, you just put SO-DIMM slots on both sides of the main board. It's just really complicated and expensive to do that.


9 hours ago, leadeater said:

But this comes with the assumption that you need that bandwidth in the first place, which in general you don't.

 

If you are running a powerful iGPU + NPU and video encoders you need the bandwidth. Sure, you can avoid this by soldering higher bandwidth GDDR to the motherboard for the GPU (this would destroy power draw, having 2 pools of memory when you could have 1), but then that is soldered and not expandable. If you want expandable memory then you want expandable memory no? The same reason you might want to expand your system memory also applies to VRAM; in fact it might apply even more if you're on a laptop with a very limited pool of VRAM.


3 hours ago, hishnash said:

If you are running a powerful iGPU + NPU and video encoders you need the bandwidth.

Well yes, of course, but that doesn't mean the device you are comparing to has that. You can't just go throwing a bunch of memory and memory bandwidth into a device comparison just because Apple has something; you only compare using the hardware each device actually needs and that is suitable for its architecture.

 

3 hours ago, hishnash said:

Sure, you can avoid this by soldering higher bandwidth GDDR to the motherboard for the GPU (this would destroy power draw, having 2 pools of memory when you could have 1), but then that is soldered and not expandable.

You don't need GDDR; two SO-DIMMs or soldered DDR5 will give you near enough the same bandwidth Apple has. You're mistaking the M1/M2 for the Pro/Max/Ultra variants, which have wider memory buses than Intel or AMD deploy in their consumer silicon designs.

 

As mentioned, what is quite "slow" DDR5-4800 is 38GB/s per DIMM (channel pair), and you're going to have a real hard time proving that 100GB/s over 76GB/s actually matters in reality. Simply going with DDR5-6400 gives you 102GB/s of bandwidth.
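A quick sanity check of those dual-channel figures:

```swift
// Two DIMMs/SO-DIMMs give a 128-bit (16-byte) wide bus in total.
let ddr5_4800_dual = 4_800_000_000.0 * 16.0 / 1e9   // 76.8 GB/s
let ddr5_6400_dual = 6_400_000_000.0 * 16.0 / 1e9   // 102.4 GB/s
print(ddr5_4800_dual, ddr5_6400_dual)
```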

 

3 hours ago, hishnash said:

If you want expandable memory then you want expandable memory no?

DDR5 gives you all of this; wait for M2 Pro/Max/Ultra before raising bandwidth that isn't practically achievable without dense, multi-layered/stacked memory packages on package. Right now the M2 isn't giving anything more than you can get any other way, and it's not doing it at significantly lower power either. DDR5 is simply very power efficient this time around, so a differential of a few watts at memory load, in devices with such large batteries and much more power demanding SoCs, CPUs and GPUs, just doesn't really make much difference by itself.


I wanna step in with a dumb guy question on this.. What will this do for linux? Will it let them run older linux programs? Will it let them run mac programs? Windows?  Don't linux programs usually get compiled on the machine they're gonna run on?

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


11 minutes ago, Video Beagle said:

I wanna step in with a dumb guy question on this.. What will this do for linux? Will it let them run older linux programs? Will it let them run mac programs? Windows?  Don't linux programs usually get compiled on the machine they're gonna run on?

It lets Linux applications compiled for x86 run in ARM64 Linux VMs (and possibly bare metal) on M1/M2 devices.

Open source Linux applications will typically be provided in such a way that it is easy to build them for most CPU architectures, but closed source applications that run on Linux are still very common in the engineering space or even the video production space. These tools being closed source, you're going to depend upon the author issuing a new ARM64 build, and since the M1 Pro/Max/Ultra are basically the first ARM64 Linux machines that you would consider using as a workstation, they have not yet bothered doing this.

There are, and have been for many many years, Linux users on ARM64, but until recently very few have been using the platform as a workstation; it has either been IoT-style devices (like Raspberry Pis or smaller) or servers, and neither of those platforms is of much interest to people building workstation software.

As to Windows, no, this does not affect Windows. It would likely be a massive amount more work for Apple, and would require modification to Windows itself, for Apple to be able to provide Rosetta for Windows. macOS and Linux are so much closer in design under the hood that supporting Linux is much, much simpler than Windows. Also, currently it is not even 'legal' to run ARM Windows in an M1 VM (MS licensing terms restrict the usage), so I can't see MS being able to collaborate on this without breaching contracts they may or may not have signed with some hardware vendors for exclusivity on ARM.
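For anyone curious how the plumbing works: on the host, Virtualization.framework exposes Rosetta to a Linux guest as a directory share; inside the guest you mount that share and register the rosetta binary as the binfmt_misc interpreter for x86-64 ELF executables. A minimal host-side sketch (it assumes you already have an otherwise complete VZVirtualMachineConfiguration for the Linux VM, and the exact guest-side registration command varies by distro; error handling trimmed):

```swift
import Virtualization

enum RosettaSetupError: Error { case notInstalled }

// Attach Apple's Rosetta directory share to an existing Linux VM configuration (macOS 13+).
func attachRosetta(to configuration: VZVirtualMachineConfiguration) throws {
    // Rosetta for Linux guests must be supported and installed on the host.
    guard VZLinuxRosettaDirectoryShare.availability == .installed else {
        throw RosettaSetupError.notInstalled
    }

    let rosettaShare = try VZLinuxRosettaDirectoryShare()
    let fsDevice = VZVirtioFileSystemDeviceConfiguration(tag: "ROSETTA")
    fsDevice.share = rosettaShare
    configuration.directorySharingDevices.append(fsDevice)
}

// Inside the guest (shell, not Swift) you would then mount the share, e.g.
//   mount -t virtiofs ROSETTA /media/rosetta
// and register /media/rosetta/rosetta as the binfmt_misc interpreter for
// x86-64 ELF binaries so the kernel hands them to Rosetta transparently.
```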


15 minutes ago, hishnash said:

As to Windows, no, this does not affect Windows

I was more wondering if this was something that would let Linux users run Windows software....

 

Thanks for the explanations... all of this kind of stuff exists in that twilight space of understanding for me... I kind of get it, but once I start to grab it, it poofs into smoke.

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


2 hours ago, Video Beagle said:

I was more wondering if this was something that would let Linux users run Windows software....

 

You mean x86 Wine/Proton compatibility layers for Windows with Rosetta 2? Yes, that should work just the same as it does on macOS (even supporting 32-bit Windows games, even though macOS does not ship with any 32-bit kernel APIs any more).


5 hours ago, hishnash said:

You mean x86 Wine/Proton compatibility layers for Windows with Rosetta 2? Yes, that should work just the same as it does on macOS (even supporting 32-bit Windows games, even though macOS does not ship with any 32-bit kernel APIs any more).

Thank you for assuming I know more than I do on this topic. It is because of that generous nature that I enjoy your posts on the forum 🙂

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


10 hours ago, leadeater said:

Well yes, of course, but that doesn't mean the device you are comparing to has that. You can't just go throwing a bunch of memory and memory bandwidth into a device comparison just because Apple has something; you only compare using the hardware each device actually needs and that is suitable for its architecture.

Absolutely. How do AMD APUs such as the one used in the SteamDeck compare in this regard though? I would have thought that's a more analogous product to do comparisons with than a generic x86_64 laptop with a discrete GPU (that has its own VRAM). Happy to be corrected if I'm wide of the mark with that assumption though 😀


51 minutes ago, Paul Thexton said:

How do AMD APUs such as the one used in the SteamDeck compare in this regard though

Currently there are no other APUs on the market that come close to Apple's product lines in bandwidth. Some of this is of course because with Apple's SoCs it is more than just the GPU: consider video encoders/decoders streaming data in, then writing out the decoded values, which are then read in by the GPU, processed, written out, then read back in for compositing by the window manager, then written back out, then read back in by the display controller... I don't think there are any APU products on the market today from other vendors that have the same level of bandwidth needs as the higher end M1/M2, let alone the M1 Pro/Max.

Gaming on M1/M2 is in fact a comparatively low bandwidth task compared to other GPU bound tasks. The TBDR approach does result in a rather large reduction in bandwidth usage (assuming devs have considered this in their pipeline).


11 hours ago, Paul Thexton said:

Absolutely. How do AMD APUs such as the one used in the SteamDeck compare in this regard though? I would have thought that's a more analogous product to do comparisons with than a generic x86_64 laptop with a discrete GPU (that has its own VRAM). Happy to be corrected if I'm wide of the mark with that assumption though 😀

[Image: Steam Deck specifications]

 

I would have to look up what the exact bandwidth of LPDDR5-5500 actually is, but it's above 38GB/s and below 51GB/s per 64-bit channel pair, so with the Steam Deck's 128-bit bus it would be around 85GB/s-90GB/s of memory bandwidth.
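For reference, the arithmetic, assuming the quad 32-bit channel (128-bit) LPDDR5-5500 configuration Valve lists for the Steam Deck:

```swift
// Steam Deck: LPDDR5-5500 on a 128-bit bus (quad 32-bit channels) = 16 bytes per transfer.
let steamDeckGBps = 5_500_000_000.0 * 16.0 / 1e9
print(steamDeckGBps)   // 88.0 GB/s, in the 85-90 GB/s range estimated above
```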

 

Apple M1 GPU is 2.6 TFLOPs btw, just for a raw comparison that isn't so strictly useful other than to say it's faster. M2 GPU is 3.6 TFLOPs.

 

10 hours ago, hishnash said:

Currently there are no other APUs on the market that come close to Apple's product lines in bandwidth. Some of this is of course because with Apple's SoCs it is more than just the GPU: consider video encoders/decoders streaming data in, then writing out the decoded values, which are then read in by the GPU, processed, written out, then read back in for compositing by the window manager, then written back out, then read back in by the display controller... I don't think there are any APU products on the market today from other vendors that have the same level of bandwidth needs as the higher end M1/M2, let alone the M1 Pro/Max.

Gaming on M1/M2 is in fact a comparatively low bandwidth task compared to other GPU bound tasks. The TBDR approach does result in a rather large reduction in bandwidth usage (assuming devs have considered this in their pipeline).

 

AMD is coming out with one next year, but it's in the server market, not consumer. It's called the MI300 but is actually an APU with HBM, chiplet cache dies, CPU chiplet and GPU chiplet using advanced 3D stacking. Probably wildly expensive haha. But some parts of that work could filter down to consumer products eventually, though I doubt before 2024, so 2025+.

 

[Image: AMD CDNA 3 / MI300 slide]

https://www.anandtech.com/show/17445/amd-combining-cdna-3-and-zen-4-for-mi300-data-center-apu-in-2023

 

Very good video at timestamp talking about it


7 hours ago, leadeater said:

CPU chiplet and GPU chiplet using advanced 3D stacking.

I wonder if Apple will do 3D stacking for the upcoming Mac Pro because the M1 Ultra is just atrociously big imo. 
 

 


There is more that meets the eye
I see the soul that is inside

 

 


8 minutes ago, captain_to_fire said:

I wonder if Apple will do 3D stacking for the upcoming Mac Pro because the M1 Ultra is just atrociously big imo. 

I don't see a reason why not. It kind of doesn't fit with what they have done so far with the M1 Ultra, which doesn't need 3D stacking; however, whatever they are doing with the Mac Pro could be a completely different design altogether, while still using functional elements like the CPU cores, just packaged differently.

 

From just a cost and supply perspective it makes sense to me to break things down and go chiplet; the flip side is that Apple doesn't do or need 100 different SKUs like Intel and AMD do, lol.

 

So does Apple even need to do 3D stacking? Chiplets address problems and needs I suspect Apple just doesn't have; monolithic is still superior in many ways, and the M1 Ultra is more monolithic-ish than it is chiplet.


10 hours ago, leadeater said:

AMD is coming out with one next year, but it's in the server market, not consumer. It's called the MI300 but is actually an APU with HBM, chiplet cache dies, CPU chiplet and GPU chiplet using advanced 3D stacking. Probably wildly expensive haha. But some parts of that work could filter down to consumer products eventually, though I doubt before 2024, so 2025+.

 

Interesting. I can see why they would start with the server space, as it will be easier to get full stack software support for the users that want this hardware. Getting Windows (and game devs) to adopt even the things MS did for Xbox, so that it can really make use of a unified memory target, is going to be hard. Even though in theory existing APUs from AMD should support this, no one to date has bothered doing it; you always need to do memory copy operations.

 

 

2 hours ago, leadeater said:

So does Apple even need to do 3D stacking? Chiplets address problems and needs I suspect Apple just doesn't have; monolithic is still superior in many ways, and the M1 Ultra is more monolithic-ish than it is chiplet.

Yeah, I agree. I think Apple will continue down the M1 Ultra path with a high performance interposer strip so that the chiplets can operate more or less as one unit. This is so much easier from a software support standpoint, not just for the GPU but also for all the other auxiliary co-processor units (I remember reading there are at least 12 separate small ARM co-processor cores on the M1 for handling everything from the SSD and ANE to the GPU and many more); going to a NUMA architecture would require the firmware on all of these units to be updated to understand NUMA, and that's a LOT of work, much better to just spend a little more. With the Mac Studio out I fully expect the entry price of the Mac Pro to be quite a bit higher than before and the top end configuration to be north of $80.

