
The $10,000 Mac Pro Killer

TannerMcCoolman

We’ve done it again: we built the PC Pro 2, electric boogaloo, this time with a much lower budget. Would you rather have the PC Pro 2 or the Mac Pro?


Apple seems committed to the Mac Pro, given that it returned to the old tower form after the trash can.

I don't see them quitting after the latest investment, and I can see the need for the product to still exist and have AppleCare, with the need for big GPUs for AI and gaming continuing to grow.

A return of AMD and NVIDIA GPUs to macOS is needed, though.


It's so annoying that Apple thinks people don't need PCI slots.
Have you ever tried to run an SDI input card to an Apple device through a USB adapter? It goes terribly, and if it's a MacBook, you also get the fun of all your routing changing when you close the lid.

I still have the Intel version and it's great. I wouldn't mind using an Apple Silicon version, but I don't want to spend three grand for PCI slots.

If they made a desktop-size Mac Studio with PCI slots and charged an extra grand for it, I would buy it, and I know multiple industries that would use it. But they don't, so those industries currently use Mac Studios with a stupid number of USB-to-HDMI adapters, and things eventually go wrong.

Also, my Nokia 950L was great; Windows phones were great.


Why did Linus make a point of saying "silicone"? Isn't it silicon that's used to make chips? I am so confused.


4 minutes ago, Insta Chocolate said:

Why did Linus make a point of saying "silicone"? Isn't it silicon that's used to make chips? I am so confused.

It is a joke about silicone breast implants.


48 minutes ago, Ahoy Hoy said:

It's so annoying that Apple thinks people don't need PCI slots.
Have you ever tried to run an SDI input card to an Apple device through a USB adapter? It goes terribly, and if it's a MacBook, you also get the fun of all your routing changing when you close the lid.

I still have the Intel version and it's great. I wouldn't mind using an Apple Silicon version, but I don't want to spend three grand for PCI slots.

If they made a desktop-size Mac Studio with PCI slots and charged an extra grand for it, I would buy it, and I know multiple industries that would use it. But they don't, so those industries currently use Mac Studios with a stupid number of USB-to-HDMI adapters, and things eventually go wrong.

Also, my Nokia 950L was great; Windows phones were great.

It's based on Apple's customer data. Most people don't open or upgrade their computers, even professionals. I don't remember the stats, but the vast majority never changed anything and treat their Mac as an appliance to eventually be replaced. That's why Apple invested in Thunderbolt.

It's not that they think nobody needs PCI slots (internally there are people who use them); they just know there aren't many customers who will pay for the development and support of such a product.


14 minutes ago, ToboRobot said:

It is a joke about silicone breast implants.

Okay thank you, that's subtle...


I think the Mac Studio is the Mac Pro that Apple actually wants to make, and they're only building the tower Mac Pro as a reluctant "ugh, fine" appeasement to the MacPro5,1 tower purists.

I sold my soul for ProSupport.


Three parts of this video may become obsolete in roughly 100 days (at WWDC):

1) the Mac Pro being just a Mac Studio in a larger case

2) the Mac Pro being "owned" by the PC (if money is no question)

3) Linus thinking this form factor will be "canned" by Apple

 

Take these with an oversized grain of salt (they're just rumors), but from the rumor breadcrumbs I've gathered around the web (mostly X):

- there will be an M3 Ultra (possibly more than one), and it will be completely different from the M1/M2 Ultra. It will be its own thing (it won't be 2x M3 Max; the M3 Max lacks the I/O area for the UltraFusion packaging, just look at die shots), and it will be 100% P-cores (no E-cores)

- there may be a $100k Mac Pro (you read that right, one hundred thousand US dollars)

 

If a $100k Mac Pro configuration exists, then there must be a way to add more M3 Ultra SoCs and an ungodly amount of unified memory.

 

TSMC can make active interposer carriers up to 6x the size of the EUV machine's reticle... imagine six M3 Ultra SoCs on a package, each with 256GB of unified memory (an M3 Max supports 128GB, so it's plausible an M3 Ultra would support 256GB), finally matching (256GB x 6 = 1536GB) the 1536GB of RAM of the 2019 Intel Mac Pro. But this time around it wouldn't be just RAM, it would be unified memory, so potentially 1.5TB of VRAM. Let that sink in.


35 minutes ago, Needfuldoer said:

I think the Mac Studio is the Mac Pro that Apple actually wants to make, and they're only building the tower Mac Pro as a reluctant "ugh, fine" appeasement to the MacPro5,1 tower purists.

I think that is true, but I feel that if they made the fabled xMac with two internal expansion slots (GPU + whatever), they could satisfy most of the pro market and gamers.

*wishful thinking*


29 minutes ago, ToboRobot said:

I think that is true, but I feel that if they made the fabled xMac with two internal expansion slots (GPU + whatever), they could satisfy most of the pro market and gamers.

*wishful thinking*

So, a new G4 Cube? That thing had a socketed processor, SDRAM DIMMs, and an AGP slot.

I sold my soul for ProSupport.


35 minutes ago, Needfuldoer said:

So, a new G4 Cube? That thing had a socketed processor, SDRAM DIMMs, and an AGP slot.

Yeah. Just imagine the Apple tax for RGB. lol

Although the time to do that was when they had x86 Intel CPUs and real GPUs.


I totally agree with most of the conclusions, and for its target audience power consumption is probably the last thing on their mind, but! I would've also liked to see a power consumption measurement. If you pull the cost of power into the equation, the performance picture changes a lot.

Did you guys forget, or is there another reason not to measure power draw? It seems weird, because Linus does make a point of showing the huge and expensive power supply.


I wish you had tested some HPC applications, such as:

* CFD: OpenFOAM or FluidX3D

* FEM: FEniCS

* AI: Stable Diffusion

* VFX: Houdini or UE5

 

Honestly, anything scientific, or something that pushes these systems harder. Even PyHPC, something more domain-specific, instead of broad abstract tests or benchmark suites.

There are quite a few cross-platform applications out there (Metal & CUDA) that could show the best available on each platform, especially given the time span since the last coverage, which dedicated just one small snippet or segment to "other workloads". There's been quite a push on Apple Silicon to bring more performance parity and take more leverage of the hardware, especially given that AMD's ROCm isn't taking off that fast, and at a lot of conferences I see Lenovo ThinkPads and Mac laptops ;).

Consult a university professor, a company, or even students in STEM.

 

I understand the focus on video production and rendering, especially as coverage of the above domains requires expert knowledge, but the high-bandwidth memory shows in CPU & GPU compute workloads, even if there's a limited amount available.

Moreover, it would've also been interesting to see a detailed explanation of why you wanted an update with the latest M3, through the lens of its advanced GPU features. They released quite an interesting explainer on that, and Apple has made strides to be ahead of or on par with NVIDIA's Ada architecture.

Nonetheless, the overall apples-to-apples comparison felt quite unsubstantiated in my opinion. Show some more of that NVIDIA GPU, or lots of RAM and cores with the Threadripper. Cinebench and running one (or two) games is just... come on. There are enough rant-only videos out there.

 

Although, given the ad-hoc way the video was set up, I understand that extensive testing isn't within scope and the framing is to hammer on the negative aspects or limitations of the Mac Pro, for which I think there are valid points. However, I just see a lot of overlap in discussion and content with the first coverage and comparison.

 

You need some framework of open-source HPC/scientific tests if you want to really test these many-core systems properly; there's a huge gap in content between scientific workloads done on laptops or PCs and those done on racks of servers. I think that should be within the scope of coverage in the future, especially since we're nearing certain limits of process nodes. I understand that's not needed for your content, as the video will do well regardless, but just a lil rant of mine ;).

 

Also, FYI for future reference, the Metal HUD can be enabled through the command line; it's a Google away. Moreover, BG3, RE Village, RE4, Stray, GRID Legends, Lies of P, and No Man's Sky could all have been tested. Some of them are on Steam... maybe even test retro games with Mac source ports... unless this video was months and months old. Like, come on.
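For anyone curious, the mechanism being alluded to is an environment variable on macOS 13 and later; a sketch (the game path is illustrative, not a real app):

```shell
# Enable Apple's Metal performance HUD globally for apps
# launched after this point (macOS 13+; new processes only).
launchctl setenv MTL_HUD_ENABLED 1

# Or enable it for a single title by launching its binary
# directly with the variable set (path is illustrative):
MTL_HUD_ENABLED=1 "/Applications/SomeGame.app/Contents/MacOS/SomeGame"

# Turn the global setting back off.
launchctl unsetenv MTL_HUD_ENABLED
```

The HUD overlays frame rate, frame-time, and GPU memory figures on any Metal-rendered window, which would have covered the games listed above.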

 

PS: Just in general, I'd refer you to Geekerwan and their tests ;).

 


9 hours ago, ToboRobot said:

Apple seems committed to the Mac Pro, given that it returned to the old tower form after the trash can.

I don't see them quitting after the latest investment, and I can see the need for the product to still exist and have AppleCare, with the need for big GPUs for AI and gaming continuing to grow.

A return of AMD and NVIDIA GPUs to macOS is needed, though.

Yes, Apple will keep the Mac Pro.

But you're not going to see AMD or NVIDIA GPU support, for the simple reason that Apple does not want feature divergence in the Metal API space, as that leads devs to develop for the lowest common denominator of features. NVIDIA and AMD GPUs come from a very different lineage than Apple's (which is based on PowerVR IP), so there is a LOAD of important GPU features that will never be supported on AMD's or NVIDIA's GPUs. Adding those back to the Mac would in effect mean pro applications would never do any work to support the features that are very important for good performance on Apple's other, high-volume product lines.

But I would not be surprised if Apple ends up shipping some Metal compute cards (let's not call them GPUs, as I don't expect them to support display outputs). I think Apple might well create these from M* Ultra chips that have too many defective CPU cores to be of use as the SoC but have fully functional GPUs. These would be monsters for ML work due to the massive addressable VRAM, but of no interest for gaming due to the cost.

 

4 hours ago, R3ndevous said:

Also, FYI for future reference, the Metal HUD can be enabled through the command line

Yeah, it seems none of the YT reviewers know Macs at all. You can also use the profiler that ships with Xcode (even from the command line) to log frame times (and much, much more) so you can create graphs. Using Tomb Raider, which is x86 (and thus also very poorly optimized for the GPU), is not at all a good GPU test for someone doing professional work.
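The command-line route mentioned here presumably looks something like the following sketch using `xctrace`, the CLI front end to Xcode's Instruments (the app path and time limit are illustrative):

```shell
# List the recording templates installed with Xcode.
xctrace list templates

# Record a Metal trace of a launched app for 30 seconds; the
# resulting .trace bundle opens in Instruments, where frame
# times and GPU counters can be inspected and exported.
xctrace record --template 'Metal System Trace' \
    --time-limit 30s \
    --launch -- "/Applications/SomeGame.app/Contents/MacOS/SomeGame"
```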

 


 

5 hours ago, R3ndevous said:

Moreover, it would've also been interesting to see a detailed explanation of why you wanted an update with the latest M3, through the lens of its advanced GPU features. They released quite an interesting explainer on that, and Apple has made strides to be ahead of or on par with NVIDIA's Ada architecture.

Yeah, but from the throwaway comments about the Dynamic Caching feature being related to VRAM (it is not), I would not expect them to get this correct.

LTT are not known for low-level system architecture understanding; this is something Gamers Nexus might be better suited for, or even better, High Yield or TechTechPotato. LTT tend to have way too many errors when they attempt to go into details like this.

What Apple has done within the M3 GPU is very, very impressive and well ahead of the PC vendors in some ways. It would be very interesting to get more details and understand the die area cost of this as well.


I was rewatching on YouTube and noticed that version doesn't have the remark about the motherboard not having M.2 RAID at around 6:14. The Floatplane video still has it. Was this an error that was corrected for the YouTube upload, or did you upload different versions to YouTube and Floatplane?


I really enjoyed the little build sequences in this video, with the close-ups etc. Reminds me of some woodworking channels.


I think some of the testing methodology for Blender may have been incorrect. I believe they may have inadvertently run the Classroom render on the CPU on the Mac Pro.

 

Here are the (admittedly blurry) render options from LTT's Mac Pro (in a spoiler since it's a large picture):

[screenshot: Blender render settings from LTT's Mac Pro]

 

Comparing that to Blender on my Mac, that is most likely the following:

[screenshot: the corresponding render settings in Blender on my Mac]

 

For Blender to use the GPU, the Device setting needs to be set to "GPU Compute".

 

In addition, to take full advantage of Apple's Metal GPU framework, you need to enable Metal under "Cycles Render Devices" in Blender's Preferences.

[screenshot: Blender Preferences with Metal selected under Cycles Render Devices]

 

With these settings, I was able to render Classroom in 2:44.47 on my MacBook Pro (14-inch, 2021, M1 Pro, 10-core CPU, 16-core GPU) while multitasking. The CPU-based render took much longer.
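For a repeatable version of this check, the same render can be driven headless from the command line, which sidesteps the screenshot ambiguity entirely; a sketch (the .blend path is illustrative, and arguments after the lone `--` are passed to the Cycles add-on):

```shell
# Render the Classroom scene headless with Cycles on the GPU via
# Metal (Apple Silicon builds of Blender 3.1+).
blender --background classroom.blend \
    --engine CYCLES \
    --render-frame 1 \
    -- --cycles-device METAL

# For comparison, drop the trailing Cycles argument to render on
# whatever device the file was saved with (likely the CPU here).
```

Timing both invocations would show directly whether the reported result was a CPU or GPU render.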

 

Based on my results, I would think the M2 Ultra's 60 GPU cores would finish much faster than my M1 Pro's 16. My guess is that the M2 Ultra's 24 CPU cores made up the difference.


On 2/17/2024 at 3:01 PM, GarlicDeliverySystem said:

Now I want my Lumia 820 back, loved that phone.

My Samsung Focus S, may it rest in peace. 😭 Unironically the best UI of any phone I've owned.

ask me about my homelab

on a personal quest convincing the general public to return to the glory that is 12" laptops.

cheap and easy cable management is my fetish.


It's refreshing to see Linus looking normal again; the world has finally gone back to normal after COVID lol


On 2/17/2024 at 8:01 PM, GarlicDeliverySystem said:

Now I want my Lumia 820 back, loved that phone. Also had the best third party YT app, man do I miss myTube.

I still miss my 920. The most annoying thing (aside from Android being inferior to WP as an OS) was how many years it took Android/Apple to catch up to WP on the hardware side. Even today they expect you to pay around $1k just to get wireless charging, high-end cameras, and always-on screens, when that was standard on a decent Nokia WP >.>


31 minutes ago, Ubersonic said:

I still miss my 920. The most annoying thing (aside from Android being inferior to WP as an OS) was how many years it took Android/Apple to catch up to WP on the hardware side. Even today they expect you to pay around $1k just to get wireless charging, high-end cameras, and always-on screens, when that was standard on a decent Nokia WP >.>

It was funny; in some ways it was even true on the OS side. Of course not for everything, WP had some issues, especially early on.

But having live tiles that responded to notifications, the ability to put shortcuts directly to specific settings on the home screen, apps like the calendar that were their own widget and customizable to a much larger extent than most Android options at the time...

Then again, a lot of things were also pretty weird, like no simple notification center for a long time, or just plainly the lack of app support.

Still sad about breaking my 820 and having to move to Android at that point.

