Mac Pro and XDR Display orders available now + unboxing

williamcll
1 hour ago, SenKa said:

USB4 is USB 3.1 Gen 2 + TB3 integrated into one standard, meant to avoid the confusion around the ubiquitous USB-C port as to whether it carries USB or TB3. From my understanding, anyway.

Doesn't TB3 already encompass everything USB 3.1 Gen 2 can do? One of the companies mentioned to Linus (he said so in one of the WAN Shows) that USB4 was going to be a name-only standard where the vendor is free to choose what to implement and what not to, while TB3 branding will be used for fully featured USB4.


2 hours ago, RedRound2 said:

Doesn't TB3 already encompass everything USB 3.1 Gen 2 can do? One of the companies mentioned to Linus (he said so in one of the WAN Shows) that USB4 was going to be a name-only standard where the vendor is free to choose what to implement and what not to, while TB3 branding will be used for fully featured USB4.

No. Thunderbolt 3 only used the USB-C connector; the protocol itself is not compatible with USB 3.x or 2.0. USB4 is based on the Thunderbolt 3 specification and the two are compatible, in addition to USB4 being backwards compatible with USB 3.x and 2.0.


16 hours ago, sazrocks said:

Haven't seen anyone mention this yet, so I'll put it here:

Sure, the desktop case looks like a cheese grater, but darn, that rackmount case though

[Image: the rack-mounted Mac Pro]

IMO that looks really cool.

AAAh. I was expecting a more custom (different?) board and a slimmer rackmount. Totally forgot fat ones exist (I'm obviously not in that space/industry :P ).


On 12/15/2019 at 4:55 PM, mr moose said:

 

So people defending this situation are OK with user-space drivers for professional workstations (which come with their own overhead and aren't as good as kernel-space drivers), while in the same breath arguing it's the best product because the professionals who use it care about every last drop of performance?

 

It's clear you do not understand what user-space drivers mean when it comes to the Darwin kernel.

User-space drivers enable better performance since they do not run under the kernel's restrictions:

  1. They can use as many threads as they like (a kernel-space driver runs within the kernel run loop and thus can't have any of its own threading controls); a user-space driver can use hundreds of threads without limitation.
  2. They can use much more memory (kernel-space drivers must sit within the memory restrictions of the kernel's protected ring 0 address space); a user-space driver can use as much memory as any other user-space application.
  3. They can make use of other system-level optimizations (like AVX instruction sets). Kernel-space code is always resident and needs to stay as small as possible, so it should not include switching between different optional instruction sets; that increases the kernel's size and limits how much of it fits within the CPU's L2/L3 cache (and that has a LARGE impact on the usability of the entire system).
  4. Bandwidth to PCIe devices is higher with user-space drivers, since you don't need to jump into the kernel and back out again, so you avoid the mitigations added to protect against speculative-execution CPU attacks (a rough sketch of this kernel-bypass pattern follows below).
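
To make that last point concrete, here is a minimal sketch of the kernel-bypass pattern as it looks on Linux with the UIO framework (the device path, region size, and register offsets are illustrative, not from any real driver):

```c
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    /* A UIO-bound device exposes its first MMIO region (BAR) through
       /dev/uioN; the path and the 4 KiB size here are illustrative. */
    int fd = open("/dev/uio0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    /* From here on, talking to the hardware is an ordinary memory
       access in an ordinary process: no syscall, no kernel round trip,
       as many threads and as much memory as the process wants. */
    uint32_t status = regs[0];   /* read a device register             */
    regs[1] = 0x1;               /* write one (offsets are made up)    */
    printf("status = 0x%08x\n", status);

    /* A blocking read on the fd is how UIO delivers interrupts:
       the 4-byte value returned is the interrupt count. */
    uint32_t irq_count;
    (void)read(fd, &irq_count, sizeof(irq_count));

    munmap((void *)regs, 4096);
    close(fd);
    return 0;
}
```

Once the BAR is mapped, touching the hardware is just a load/store inside a normal process, which is exactly why the threading, memory, and instruction-set restrictions above stop applying.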


The only performance issue that user-space drivers might have is a very small latency impact (less than 50ns). That is not at all critical for compute-based workloads, where you are issuing a command that might run for minutes.

 

Latency could be seen as an issue for display drivers (and that might be one of the reasons display drivers are not supported in user space).


21 minutes ago, hishnash said:

It's clear you do not understand what user-space drivers mean when it comes to the Darwin kernel.

User-space drivers enable better performance since they do not run under the kernel's restrictions:

  1. They can use as many threads as they like (a kernel-space driver runs within the kernel run loop and thus can't have any of its own threading controls); a user-space driver can use hundreds of threads without limitation.
  2. They can use much more memory (kernel-space drivers must sit within the memory restrictions of the kernel's protected ring 0 address space); a user-space driver can use as much memory as any other user-space application.
  3. They can make use of other system-level optimizations (like AVX instruction sets). Kernel-space code is always resident and needs to stay as small as possible, so it should not include switching between different optional instruction sets; that increases the kernel's size and limits how much of it fits within the CPU's L2/L3 cache (and that has a LARGE impact on the usability of the entire system).
  4. Bandwidth to PCIe devices is higher with user-space drivers, since you don't need to jump into the kernel and back out again, so you avoid the mitigations added to protect against speculative-execution CPU attacks.


The only performance issue that user-space drivers might have is a very small latency impact (less than 50ns). That is not at all critical for compute-based workloads, where you are issuing a command that might run for minutes.

 

Latency could be seen as an issue for display drivers (and that might be one of the reasons display drivers are not supported in user space).

 

That's it? That's your response to all of this: call me ignorant, then ignore the actual problem of why people who want CUDA acceleration in the modelling/render space need Nvidia's kernel-space drivers.

 

 

https://helpx.adobe.com/au/premiere-pro/kb/gpu-and-gpu-driver-requirements-for-premiere-pro.html

 

You cannot install the required Nvidia driver on the latest Mac because it is a kernel-space driver and Apple will not permit it.

 

There is a reason Nvidia don't just ship a user-space driver, and there is a reason you need their drivers for Adobe if you want to use CUDA acceleration.

 

Quote

if you prefer CUDA graphics acceleration, you must have CUDA 9.2 drivers from NVIDIA installed on your system before upgrading to Premiere Pro versions 13.0 and later.

 

 

and:

Quote

 

Note:

macOS 10.14 (Mojave) does not currently support CUDA.

 

 

So no, you can't just install user-space drivers and make it work; in fact, you would be lucky to get an Nvidia GPU working inside a Mac even if you run Windows through Boot Camp.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


CUDA is not a graphics API; it is 100% a compute API. As a compute API, a user-space driver is all you need. People want CUDA support, that's all.

Display-driver support is very different from CUDA support, and while Nvidia normally ship both in a single installer, they are separate feature sets (on Linux you will find you can install just the CUDA driver without the display driver, since you're normally running on a server).

Yes, Apple would need to approve a display driver if Nvidia wanted it to be used as a drop-in replacement for the existing systems without developers needing to re-compile their applications.
 

If, however, Nvidia said they were only going to support display-driver features for developers that explicitly target them (i.e. include an Nvidia SDK in addition to the system one):

  • then they could include features such as Vulkan support (even on macOS)
  • they could do that as a user-space driver, no problem (since it's not rendering the system UI, just the `3d` part of a given application)
  • users would, however, still need an AMD card to render the main system UI/mouse etc.
  • user input might see a `very small` added latency (though less than the latency a kernel driver adds on other systems; BSD-based kernels already have much lower driver latency)

There is nothing about a user-space driver that prevents it from writing its output to a buffer on the AMD display GPU (in fact, this is exactly how other accelerator cards like the Afterburner work).
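
To illustrate the compute-only point, here is a minimal sketch of a CUDA driver-API flow in plain C; the `vecadd.ptx` module and kernel name are invented for the example, and error checking is omitted for brevity:

```c
#include <stdio.h>
#include <cuda.h>   /* CUDA driver API: pure C, no display stack involved */

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    CUmodule mod;
    CUfunction fn;
    CUdeviceptr buf;
    size_t n = 1 << 20;

    /* Everything below talks to the compute driver only; the GPU never
       has to drive a display for any of this to work. */
    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    /* Load a precompiled PTX module (file and kernel name are made up). */
    cuModuleLoad(&mod, "vecadd.ptx");
    cuModuleGetFunction(&fn, mod, "vecadd");

    cuMemAlloc(&buf, n * sizeof(float));

    void *args[] = { &buf };
    /* Launch: 4096 blocks x 256 threads each, default stream. */
    cuLaunchKernel(fn, 4096, 1, 1, 256, 1, 1, 0, NULL, args, NULL);
    cuCtxSynchronize();   /* block until the compute work finishes */
    printf("done\n");

    cuMemFree(buf);
    cuCtxDestroy(ctx);
    return 0;
}
```

Nothing in that flow ever scans out to a display; the driver only has to move data and launch compute work, which is the part a user-space driver can cover.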

 

1 hour ago, mr moose said:

in fact, you would be lucky to get an Nvidia GPU working inside a Mac even if you run Windows through Boot Camp.

Through Boot Camp (for Linux or Windows), Nvidia GPUs work just fine 100% of the time.

 

1 hour ago, mr moose said:

you can't just install user-space drivers and make it work

Of course, Nvidia do need to write them (the API is different); if they do, then yes, you can install them without Apple being in the loop. Apple literally cannot write Nvidia's driver for them (and even if they did reverse-engineer Nvidia's cards, Nvidia would be forced to SUE them, otherwise they would not be able to protect their patents; that is how software patents work, you must defend them or they are void).


I somehow think you don't understand what user-space drivers means: it has NOTHING to do with the USER of the system and everything to do with how they run, and the fact that they no longer run within the run loop of the SINGLE kernel process. This is a good page for understanding the basics: https://spdk.io/doc/userspace.html. User-space drivers have been a thing on Linux for a long time now, and more and more the Linux kernel is pushing (like the Darwin kernel) to require new drivers to use this approach. Many of the PCIe NVMe drivers for Linux run this way.
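
For anyone curious, here is roughly what that SPDK page boils down to: a sketch, modelled on SPDK's hello_world example, of claiming an NVMe device entirely from user space (exact signatures may differ between SPDK versions; check the current headers):

```c
#include "spdk/stdinc.h"
#include "spdk/env.h"
#include "spdk/nvme.h"

/* Called for each NVMe controller found; return true to attach to it. */
static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
         struct spdk_nvme_ctrlr_opts *opts)
{
    printf("probing %s\n", trid->traddr);
    return true;
}

/* Called once the controller is attached; from here on, this process
   owns the device and drives it directly over mmap'd PCIe BARs. */
static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
          struct spdk_nvme_ctrlr *ctrlr,
          const struct spdk_nvme_ctrlr_opts *opts)
{
    printf("attached to %s\n", trid->traddr);
}

int main(void)
{
    struct spdk_env_opts opts;

    spdk_env_opts_init(&opts);           /* hugepages, PCI access, etc. */
    opts.name = "userspace_nvme_demo";
    if (spdk_env_init(&opts) < 0)
        return 1;

    /* Enumerate NVMe devices that have been unbound from the kernel
       driver and claim them entirely from user space. */
    if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0)
        return 1;
    return 0;
}
```

After attach_cb fires, queue-pair I/O is submitted by this process writing doorbell registers over memory-mapped BARs, with no per-I/O syscall; that is the kernel-bypass design the page describes.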


1 minute ago, hishnash said:

CUDA is not a graphics API; it is 100% a compute API. As a compute API, a user-space driver is all you need. People want CUDA support, that's all.

Display-driver support is very different from CUDA support, and while Nvidia normally ship both in a single installer, they are separate feature sets (on Linux you will find you can install just the CUDA driver without the display driver, since you're normally running on a server).

Yes, Apple would need to approve a display driver if Nvidia wanted it to be used as a drop-in replacement for the existing systems without developers needing to re-compile their applications.
 

If, however, Nvidia said they were only going to support display-driver features for developers that explicitly target them (i.e. include an Nvidia SDK in addition to the system one):

  • then they could include features such as Vulkan support (even on macOS)
  • they could do that as a user-space driver, no problem (since it's not rendering the system UI, just the `3d` part of a given application)
  • users would, however, still need an AMD card to render the main system UI/mouse etc.
  • user input might see a `very small` added latency (though less than the latency a kernel driver adds on other systems; BSD-based kernels already have much lower driver latency)

There is nothing about a user-space driver that prevents it from writing its output to a buffer on the AMD display GPU (in fact, this is exactly how other accelerator cards like the Afterburner work).

 

Through Boot Camp (for Linux or Windows), Nvidia GPUs work just fine 100% of the time.

 

Of course, Nvidia do need to write them (the API is different); if they do, then yes, you can install them without Apple being in the loop. Apple literally cannot write Nvidia's driver for them (and even if they did reverse-engineer Nvidia's cards, Nvidia would be forced to SUE them, otherwise they would not be able to protect their patents; that is how software patents work, you must defend them or they are void).


I somehow think you don't understand what user-space drivers means: it has NOTHING to do with the USER of the system and everything to do with how they run, and the fact that they no longer run within the run loop of the SINGLE kernel process. This is a good page for understanding the basics: https://spdk.io/doc/userspace.html. User-space drivers have been a thing on Linux for a long time now, and more and more the Linux kernel is pushing (like the Darwin kernel) to require new drivers to use this approach. Many of the PCIe NVMe drivers for Linux run this way.

You haven't read any of the links I posted, have you?

 

 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


19 minutes ago, mr moose said:

You haven't read any of the links I posted, have you?

I think you still misunderstand what user-space means:

* Nvidia will need to write new drivers to use it, so the current status of CUDA support (as in the links you have posted) is null and void, since it assumes the current CUDA drivers Nvidia has written.

 
In the end, this will be up to Nvidia and whether they feel the development effort is worth it. It's not really Apple's job to write Nvidia's drivers, just as Microsoft does not write them, nor does the Linux Foundation (there they come from developers employed by, or volunteering for, the vendor).

None of your links have shown any evidence as to why CUDA drivers (not display drivers) could not be written as user-space drivers. If you find such info it would be very interesting.


3 minutes ago, hishnash said:

I think you still misunderstand what user-space means:

* Nvidia will need to write new drivers to use it, so the current status of CUDA support (as in the links you have posted) is null and void, since it assumes the current CUDA drivers Nvidia has written.

 
In the end, this will be up to Nvidia and whether they feel the development effort is worth it. It's not really Apple's job to write Nvidia's drivers, just as Microsoft does not write them, nor does the Linux Foundation (there they come from developers employed by, or volunteering for, the vendor).

None of your links have shown any evidence as to why CUDA drivers (not display drivers) could not be written as user-space drivers. If you find such info it would be very interesting.

I posted some very black-and-white problems with using CUDA on a Mac and linked supporting documents from both Apple and Adobe, and you are still trying to argue that it isn't the case. You have to have the appropriate CUDA drivers from Nvidia before the Adobe suite will run CUDA acceleration on a Mac, and you can't install those drivers because Apple will not let them be installed. Apple doesn't even list Nvidia cards as being supported inside a Mac, only as an eGPU through Boot Camp.

 

 

Why buy a Mac only to use Windows? The whole thing is a failure of stupid proportions, and your solution is to run user-space drivers that aren't good enough for a workstation that has only said GPU for its acceleration.

 

So instead of telling me I don't understand, tell me how you can get CUDA acceleration in Adobe when Apple refuses to let the CUDA driver be installed?

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


18 minutes ago, mr moose said:

I posted some very black-and-white problems with using CUDA on a Mac and linked supporting documents from both Apple and Adobe, and you are still trying to argue that it isn't the case. You have to have the appropriate CUDA drivers from Nvidia before the Adobe suite will run CUDA acceleration on a Mac, and you can't install those drivers because Apple will not let them be installed. Apple doesn't even list Nvidia cards as being supported inside a Mac, only as an eGPU through Boot Camp.

 

 

Why buy a Mac only to use Windows? The whole thing is a failure of stupid proportions, and your solution is to run user-space drivers that aren't good enough for a workstation that has only said GPU for its acceleration.

 

So instead of telling me I don't understand, tell me how you can get CUDA acceleration in Adobe when Apple refuses to let the CUDA driver be installed?

 

 

All of the things you have posted reference drivers created BEFORE 10.15, so implicitly kernel-space-only drivers. Apple reviews those drivers and MUST accept them before you can install them, yes.

I am not saying you can get CUDA in Adobe; I'm saying that if Nvidia want to put in the work, you could get it in Adobe. But Apple is not to blame for Nvidia not writing the drivers; that's not Apple's job.

 

What I am trying to say is that it is now possible for Nvidia (if they want to) to write a user-space driver and provide it to users without Apple needing to review it; this is how user-space drivers work. These drivers WOULD NOT be the current kernel-space drivers that Nvidia has already written; they would need to adapt them to use the user-space API, now that that API exists in 10.15. Even if Apple were best mates with Nvidia, they would not approve a kernel-space driver for CUDA, since there is a better solution out there using the user-space API.

 


58 minutes ago, hishnash said:

I am not saying you can get CUDA in Adobe; I'm saying that if Nvidia want to put in the work, you could get it in Adobe. But Apple is not to blame for Nvidia not writing the drivers; that's not Apple's job.

User-space still isn't a get-out-of-jail-free card for needing Apple's blessing to work. Yes, you could do it, but you would also have to go into the security settings and allow the install of the untrusted application. Apple's macOS security is getting more and more obnoxious as time goes on; I have to fight it every damn OS update, minor ones not just major ones, and that's just for the Commvault Endpoint Backup Agent. I can say with a very high level of confidence that it's actually not as easy as creating a user-space driver and providing it via Nvidia's website.

 

Each time I have to get a new build of the Endpoint Agent created, it has to go through Apple's notarization; unless Nvidia has approved developer status and a developer ID, that's an automatic deny. https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution.

 

However, there is a bigger issue for professionals and their applications: the certification process, which is the entire point of paying more for professional cards and using the professional drivers. End-to-end certification of feature correctness, stability, and proper spec compliance cannot actually be done without Apple being involved. Even if you were to ignore that and just trust that Nvidia has you covered, nothing prevents Apple from making a change later that breaks your workflow or a feature of it; worst case, it goes unnoticed and affects the end product, like colour calibration.

 

For smaller professionals this likely isn't much of a problem, but in a larger business it's likely not a risk you're willing to take, and I'm not even sure Nvidia would be either, hence why it's not a thing now.


46 minutes ago, hishnash said:

All of the things you have posted reference drivers created BEFORE 10.15, so implicitly kernel-space-only drivers. Apple reviews those drivers and MUST accept them before you can install them, yes.

I am not saying you can get CUDA in Adobe; I'm saying that if Nvidia want to put in the work, you could get it in Adobe. But Apple is not to blame for Nvidia not writing the drivers; that's not Apple's job.

 

What I am trying to say is that it is now possible for Nvidia (if they want to) to write a user-space driver and provide it to users without Apple needing to review it; this is how user-space drivers work. These drivers WOULD NOT be the current kernel-space drivers that Nvidia has already written; they would need to adapt them to use the user-space API, now that that API exists in 10.15. Even if Apple were best mates with Nvidia, they would not approve a kernel-space driver for CUDA, since there is a better solution out there using the user-space API.

 

 

But you are missing the point completely. Up until now, all the arguments laid forth for the Mac being worth the price have been that "it just works" and that professionals 'will pay however much for the system to do the job without question or exception'. Even in our small discussion, several exceptions and issues have been raised that make a complete mockery of that argument.

 

Having to wait for Apple to maybe certify Nvidia drivers, or waiting for Nvidia to release user-space drivers (which many still argue are not ideal for a GPU, regardless of whether that's CUDA compute or basic GPU functioning), is not a condition/argument that works in favor of the machine being sold at a premium price point.

 

As for Nvidia, they no longer support Apple and have said as much. To be honest, I don't care who anyone blames for this; it is still a fact that has to be considered when comparing the Mac Pro to any other Windows-based workstation.

https://www.overclock3d.net/news/gpu_displays/nvidia_plans_to_drop_cuda_support_for_macos/1

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


14 hours ago, leadeater said:

User-space still isn't a get-out-of-jail-free card for needing Apple's blessing to work. Yes, you could do it, but you would also have to go into the security settings and allow the install of the untrusted application. Apple's macOS security is getting more and more obnoxious as time goes on; I have to fight it every damn OS update, minor ones not just major ones, and that's just for the Commvault Endpoint Backup Agent. I can say with a very high level of confidence that it's actually not as easy as creating a user-space driver and providing it via Nvidia's website.

 

Each time I have to get a new build of the Endpoint Agent created, it has to go through Apple's notarization; unless Nvidia has approved developer status and a developer ID, that's an automatic deny. https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution.

 

However, there is a bigger issue for professionals and their applications: the certification process, which is the entire point of paying more for professional cards and using the professional drivers. End-to-end certification of feature correctness, stability, and proper spec compliance cannot actually be done without Apple being involved. Even if you were to ignore that and just trust that Nvidia has you covered, nothing prevents Apple from making a change later that breaks your workflow or a feature of it; worst case, it goes unnoticed and affects the end product, like colour calibration.

 

For smaller professionals this likely isn't much of a problem, but in a larger business it's likely not a risk you're willing to take, and I'm not even sure Nvidia would be either, hence why it's not a thing now.

Yep, you do still need to be notarized; that, however, is an automated process.

 

I did not think the CUDA compute drivers are certified (since that's Nvidia-only tech, who else could certify it?). The display drivers are certified (for pro cards), but that's about the OpenCL and OpenGL/Vulkan implementation.


3 hours ago, hishnash said:

I did not think the CUDA compute drivers are certified (since that's Nvidia-only tech, who else could certify it?). The display drivers are certified (for pro cards), but that's about the OpenCL and OpenGL/Vulkan implementation.

Yes, and those are used in programs by companies like Autodesk or Adobe, who work with Nvidia and AMD, plus Microsoft etc., to make those certified drivers actually mean something. You can see the difference in real-world application tests as well as in SPECviewperf tests, and professional drivers still implement a lot of legacy support that isn't in GeForce/Radeon drivers.

 

Any CUDA acceleration in professional applications is certified; the CUDA development stack for pure compute applications is not, and that is a little different in this context.

 

3 hours ago, hishnash said:

Yep, you do still need to be notarized; that, however, is an automated process.

If you have a developer ID. Does Nvidia have one? Would Apple allow it? And if Nvidia did have one, would Apple use that developer signature, included in all their software, to block any user-space drivers from them? If Apple doesn't want something, then Apple gets its way; that's just how it works in their ecosystem. If for whatever reason they can't get their way now, the next OS update will make it that way. This is why Apple annoys me so much.


On 12/12/2019 at 1:33 PM, hishnash said:

So one would assume your `average` consumer is not buying this machine, right? Someone who is spending $5k or more (even for the base model) will take the time needed to research things.

Not necessarily. Someone who's uneducated about the technical side of a product might go on Google, do the research, and still end up making the wrong purchase; this happens all the time, the wrong-tool-for-the-job analogy. Then you have to think of the people who are willing to spend $35k on a Mac; two things can happen: either the people who are going to spend that much just don't care what video card is in it, or they just don't want to spend that much on something and then break it, or worse, void the warranty.

Your analogy does work in some cases, but when you're talking about a $35k PC, I'm willing to bet no one is going to bother hacking Nvidia cards into their expensive Mac and have it break or not work. If someone is specifically buying a $35k Mac just so they can switch to Nvidia, then I guess that's on them.

 


AMD 5000 Series Ryzen 7 5800X| MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce GTX 3080Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


Honestly at this point regarding NVIDIA support on macOS, whoever actually wants to use NVIDIA hardware has already switched their tool chains and processes over.

 

Trying to make the environment adapt to you when you want to do work is a big time waster. Either find the right tool for the job in the first place, switch to something that is the right tool for the job, or adapt your process to the tool you ended up getting and deal with it.


5 hours ago, Mira Yurizaki said:

Honestly at this point regarding NVIDIA support on macOS, whoever actually wants to use NVIDIA hardware has already switched their tool chains and processes over.

 

Trying to make the environment adapt to you when you want to do work is a big time waster. Either find the right tool for the job in the first place, switch to something that is the right tool for the job, or adapt your process to the tool you ended up getting and deal with it.

As it stands, it's not like the Mac Pro is anything less than blazing fast for the jobs it's meant for (primarily audio/video editing).  It's just that it isn't the best at absolutely everything.


1 hour ago, Commodus said:

As it stands, it's not like the Mac Pro is anything less than blazing fast for the jobs it's meant for (primarily audio/video editing).  It's just that it isn't the best at absolutely everything.

This is the problem with this thread: no one is saying otherwise. I don't recall seeing a single post in this entire thread where someone said it was slow for its price. No one said it has to be the best at everything; they just said you can't argue as if it is the best at everything in response to people finding better alternatives. This entire thread has been propelled by people trying to defend its honor like it is a living thing that has emotions. It's a tool; it works for some and not for others, and until that cold hard fact can be accepted, many of the claims in this thread are only going to keep being made no matter how ridiculous they are.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


2 hours ago, Commodus said:

As it stands, it's not like the Mac Pro is anything less than blazing fast for the jobs it's meant for (primarily audio/video editing).  It's just that it isn't the best at absolutely everything.

I think this impression is given because literally every review on the internet (at least on YT) is YTers talking about how great it is for video. But this machine would be amazing for my job, which requires a lot of virtualization. The RAM, high core count, and VMware Fusion (which I like a lot more than Workstation...) make this a killer VM host/box.

 

Are there better machines for that? Maybe, but I'm quite partial to using macOS these days (before someone has a heart attack... breathe). I quite like macOS because it's basically "fancy Linux." Do I think it's worth the premium? That depends. If you are an average user, absolutely not; you can probably get away with an iMac. Watching YTers like MKBHD use this in lieu of a 5K iMac because it saves them a few hours just feels crazy. They don't produce enough videos (IMO) to warrant needing the extra processing, but when you are grossing as much as they are and are as big a tech enthusiast as they are, I can understand why you'd buy it.

 

For a video production company (or a number of other fields), having dedicated machines with high-quality support is really great. The difference between what this costs and what a comparable machine from another provider (like Lenovo) costs is basically a tax write-off.

 

If I could get my boss to buy me one of the higher-end ones, I would, but even suggesting it would probably send him into shock.


12 hours ago, mr moose said:

This is the problem with this thread: no one is saying otherwise. I don't recall seeing a single post in this entire thread where someone said it was slow for its price. No one said it has to be the best at everything; they just said you can't argue as if it is the best at everything in response to people finding better alternatives. This entire thread has been propelled by people trying to defend its honor like it is a living thing that has emotions. It's a tool; it works for some and not for others, and until that cold hard fact can be accepted, many of the claims in this thread are only going to keep being made no matter how ridiculous they are.

People on this forum have been ranting that they can get a Threadripper that would supposedly be faster in virtually every way for less money.  In my case, defending this thing isn't about protecting its feelings.  It's acknowledging that no, a Threadripper isn't going to be unambiguously better, and that there are very practical reasons Apple has for choosing Xeons besides theoretical performance (like re-optimization and the cost of switching).


Some good news is that, apparently, iFixit is surprised by its repairability.

Quote


The new Mac Pro is a Fixmas miracle: beautiful, amazingly well put together, and a masterclass in repairability.

 

Quote

We love that a good portion of the modules can be swapped without tools; we love the use of (mostly) standard screws and connectors; we love the step numbers and diagrams for certain repairs right on the device; and most of all, we love the free public repair manuals and videos.

Though, they do have a caveat:

Quote

Despite the many things to love, however, Apple still keeps the keys to certain repairs, like the proprietary SSD. And some of Apple’s repair manuals include (or entirely comprise) a disclaimer insisting that you contact an Apple Authorized Service Provider, when in reality the repair could easily be done at your desk.

I admittedly don't know much about Apple, but I'm guessing this means that in a couple of years, you could get dead/second-hand units and merge them together as simply as a normal PC (if not more so).


CPU: Intel i7 6850K

GPU: nVidia GTX 1080Ti (ZoTaC AMP! Extreme)

Motherboard: Gigabyte X99-UltraGaming

RAM: 16GB (2x 8GB) 3000Mhz EVGA SuperSC DDR4

Case: RaidMax Delta I

PSU: ThermalTake DPS-G 750W 80+ Gold

Monitor: Samsung 32" UJ590 UHD

Keyboard: Corsair K70

Mouse: Corsair Scimitar

Audio: Logitech Z200 (desktop); Roland RH-300 (headphones)

 


1 hour ago, The1Dickens said:

I admittedly don't know much about Apple, but I'm guessing this means that in a couple years, you could get dead/second hand units and merge them together as simply as a normal PC (if not more so).

They're saying that even though component repair/replacement is stupid easy, Apple will still insist that you spend money on a certified technician to do it.

 

Imagine, say, Digital Storm or CyberPowerPC saying you need a qualified technician to replace your video card or add a hard drive.


7 hours ago, Commodus said:

People on this forum have been ranting that they can get a Threadripper that would supposedly be faster in virtually every way for less money.  In my case, defending this thing isn't about protecting its feelings.  It's acknowledging that no, a Threadripper isn't going to be unambiguously better, and that there are very practical reasons Apple has for choosing Xeons besides theoretical performance (like re-optimization and the cost of switching).

For those people, a Threadripper might easily be better; don't confuse personal preference/requirements with blanket claims about everyone.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


On 12/8/2019 at 5:04 AM, RejZoR said:

Probably some binding contract with Intel after the switch from PowerPC to x86. They used Radeon graphics because Intel had nothing remotely as powerful, but for the CPU, since they collaborated with Intel to port Mac OS X to x86, they probably still need to use Intel because of that, even if AMD has better offerings at the moment. That's my guess.

They did not work with Intel to bring Mac OS X to Intel processors. They had one engineer do it over roughly 18 months:

 

https://www.theverge.com/2012/6/11/3077651/apple-intel-mac-os-x-retrospective

 

That being said, Mac OS X runs on AMD hardware with what are presumably minor patches:

 

https://amd-osx.com


2 hours ago, mr moose said:

For those people, a Threadripper might easily be better; don't confuse personal preference/requirements with blanket claims about everyone.

Yes, but they argued that Threadripper is objectively better for everyone, not just for them.

