
Mac Pro and XDR display orders available now + unboxing

williamcll
2 minutes ago, Zando Bob said:

Never said that. In no way, at all, did I say that. I said their switch, when they did it, turned out for the better, because Nvidia GPUs were having issues,

Yes you did,   you literally said this:

 

30 minutes ago, Zando Bob said:

 I said it's for the best because Nvidia GPUs in specific MacBook Pro models are known for toasting themselves and killing the laptop,

Old problem with old products. 

2 hours ago, Zando Bob said:

 Probably for the best, Nvidia GPUs in old MacBooks have a bad habit of toasting themselves and bricking the laptop. 

You are literally saying it is for the best that Nvidia isn't used in MacBooks because of what they did many years ago. How is it for the best not to use Nvidia? The only reason you have given is that their GPUs in old MacBooks had issues. All hardware has had issues at some point. You can't single out one company and claim it only applies to them; that is bias speaking.

 

 


59 minutes ago, mr moose said:

 

Better not buy any hardware then; they all have had problems at some point in their lifetime. 

 

 

The "it just works" may apply to Apple's software, but it really hasn't applied to their hardware for quite a while. All of them have problems like the butterfly keyboards, SSDs randomly failing, display ribbon cables, T2 chips failing and bricking the whole system, and the T2 chip causing audio dropouts. Studios working with a USB 2.0 audio setup are going to wonder why their Mac doesn't "just work".

AMD GPUs failing in MacBooks were common enough for there to be a hardware mod forcing the laptop to use only the Intel iGPU, so it wasn't just Nvidia GPUs failing in Apple products.


3 hours ago, Zando Bob said:

To my knowledge, Apple wanted full control over drivers and Nvidia didn't like that. Probably for the best, Nvidia GPUs in old MacBooks have a bad habit of toasting themselves and bricking the laptop. 

I think you’ve got that backwards. Nvidia only ever releases its drivers as binaries; it’s a big issue with Linux.


Just now, mr moose said:

Yes you did,   you literally said this:

 

Old problem with old products. 

You are literally saying it is for the best that Nvidia isn't used in MacBooks because of what they did many years ago. How is it for the best not to use Nvidia? The only reason you have given is that their GPUs in old MacBooks had issues. All hardware has had issues at some point. 

It was a passing remark.

3 minutes ago, mr moose said:

You can't single out one company and claim it only applies to them; that is bias speaking.

I have no forum-friendly words for you on this matter. Refer to my sig:

[screenshot of forum signature]

I use hardware from both sides. I enjoy hardware from both sides. From a logical standpoint, I consider Nvidia better for the average consumer because RTX, NVENC, and CUDA on the RTX/GTX 16-series are just better QoL things for most people, whereas AMD has RIS, which is useful for 4K gaming and that's about it. Oh, and they have Radeon Boost now, which is also only useful for 4K gaming and seriously degrades the experience at 1440p or 1080p. Not that that's a massive issue; 1440p and 1080p are usually easy to push with an appropriate card anyway. 

Personally, I like both. My 1080 Ti was great fun, the 780s I have now offer almost the tweakability of my Radeon cards, and my 1660 Ti fuckin slaps for the power consumption. The Radeon VII lives in my big rig; I have an emotional attachment to that lad. Bought it 3 hours after release, had it overnighted, and was probably one of the first few non-reviewers to actually have a Radeon VII in hand. Had a Vega FE before that; it was used, but I still loved the heck out of it. I've had a great time with all the cards I've used, and loved ones from both team green and team red equally, but for different reasons. 

CPUs are the only place I have a personal beef, and specifically only with Ryzen and 1000/2000-series Threadripper: poor single-core performance coupled with low OC headroom, which lets Intel chips from 2013-2014 beat them while costing less (than what I paid for my 2700X/CH7 setup) and having full HEDT features. From a logical standpoint, both platforms have advantages, though Intel really only holds the lead for something like very high refresh rate gaming where every last fps matters, or a very small and extremely specific set of possible HEDT users; other than that, Ryzen and TR 3000 take the cake on every single front. 

So no, I am not saying Nvidia GPUs shouldn't be used in Macs because they're bad, lmao; I love Nvidia cards. And AMD cards. I just made a passing remark that Apple's decision to move to AMD over drivers turned out well, because a lot of the Nvidia GPUs at the time Apple made that choice (not right now, at the time, based on previous generations, specifically the 15" and 17" MacBook Pro models with Nvidia 8000- and 600-series GPUs) had a bad habit of cooking themselves. Something that may not all have been apparent at the time (the 8000-series issue was well known; the newer ones may not have had issues yet), but in hindsight the switch turned out not to be a terrible idea. A little hidden possible benefit I found amusing enough to mention, not a valid reason for Apple to never run Nvidia GPUs in their Macs again. 

 


16 minutes ago, Blademaster91 said:

The "it just works" may apply to Apple's software, but it really hasn't applied to their hardware for quite a while. All of them have problems like the butterfly keyboards, SSDs randomly failing, display ribbon cables, T2 chips failing and bricking the whole system, and the T2 chip causing audio dropouts. Studios working with a USB 2.0 audio setup are going to wonder why their Mac doesn't "just work".

AMD GPUs failing in MacBooks were common enough for there to be a hardware mod forcing the laptop to use only the Intel iGPU, so it wasn't just Nvidia GPUs failing in Apple products.

Yep, “when they’re not screwing up their hardware”. They’ve gotten rid of the butterfly keys, I understand. I have no info about the other stuff.


Just now, Bombastinator said:

Yep, “when they’re not screwing up their hardware”. They’ve gotten rid of the butterfly keys, I understand. I have no info about the other stuff.

IIRC they've tweaked the hinge design and sensors too, as well as worked on the thermals for once. And then I think they added rivets, because this is still Apple. 


Just now, Zando Bob said:

IIRC they've tweaked the hinge design and sensors too, as well as worked on the thermals for once. And then I think they added rivets, because this is still Apple. 

The rivets are a bit crazy imho. A rivet is not much cheaper than a screw now that there are machines to install screws automatically. There has to be some kind of cost saving associated with it, unless it was a deliberate middle finger waggled at the buyers. It is that anyway, of course. 


2 hours ago, scuff gang said:

hmmmm sounds like an angry apple fan trying to justify an insane price tag

Or, someone who knows that workstations aren't the same as gaming PCs, and has a basic understanding of how multi-core scaling works.  This still wouldn't be cheap even if it were a Threadripper or Epyc rig, and I'll bet you'd still find a way to moan about it.


1 hour ago, Bombastinator said:

Yep, “when they’re not screwing up their hardware”. They’ve gotten rid of the butterfly keys, I understand. I have no info about the other stuff.

Yeah, at least the 2019 MacBooks went back to the scissor switches, but they should've gone back after the first failures with the 2016 MacBooks. I have no idea how common the display cable or T2 failures are, but it was enough to make the news.

2 hours ago, scuff gang said:

hmmmm sounds like an angry apple fan trying to justify an insane price tag

Well, even if Apple went with Threadripper or Epyc, they'd still find a way to attach an Apple-tax price tag to it, and people would find a way to defend it, because any criticism turns into generalizations of "you're not in the market for one anyway". Except people paid to manage company hardware will be doing their research and will know they can get more for less with an AMD Threadripper or Epyc workstation.


1 minute ago, Blademaster91 said:

Yeah, at least the 2019 MacBooks went back to the scissor switches, but they should've gone back after the first failures with the 2016 MacBooks. I have no idea how common the display cable or T2 failures are, but it was enough to make the news.

Well, even if Apple went with Threadripper or Epyc, they'd still find a way to attach an Apple-tax price tag to it, and people would find a way to defend it, because any criticism turns into generalizations of "you're not in the market for one anyway". Except people paid to manage company hardware will be doing their research and will know they can get more for less with an AMD Threadripper or Epyc workstation.

I agree about the butterfly switch thing.  It took Apple too long to get that one through their skulls and they paid for it.

 

As for the “Apple tax”, you mean the OS X development cost. Yes, it will always be there. OS X costs more to develop than Windows, and they sell fewer copies, so each copy has to carry a larger share of it, and they roll it into the machine cost. Partially because, if they want to stay in front, they will have to keep doing front-line development, and Windows will cherry-pick useful stuff from them for free. Linux does it too. Everyone copies Apple, and there’s a reason for it.
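To see the shape of that amortization argument, here is a minimal sketch in Python. Every number in it is hypothetical (Apple's actual OS development budget and unit volumes aren't public), so it only illustrates the fixed-cost-over-fewer-units point, not real figures.

```python
# Toy illustration of the amortization point: the same fixed development
# budget spread over fewer units means each unit carries a bigger share.
# All numbers below are made up purely for illustration.

def per_unit_share(dev_cost: float, units_sold: int) -> float:
    """Fixed development cost carried by each unit sold."""
    return dev_cost / units_sold

dev_cost = 1_000_000_000  # hypothetical yearly OS development budget

# hypothetical volumes: a Mac-sized install base vs a Windows-sized one
print(f"${per_unit_share(dev_cost, 20_000_000):,.2f} per unit at 20M units/year")
print(f"${per_unit_share(dev_cost, 200_000_000):,.2f} per unit at 200M units/year")
```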


1 hour ago, Zando Bob said:

It was a passing remark.

I have no forum-friendly words for you on this matter. Refer to my sig:

[screenshot of forum signature]

I use hardware from both sides. I enjoy hardware from both sides. From a logical standpoint, I consider Nvidia better for the average consumer because RTX, NVENC, and CUDA on the RTX/GTX 16-series are just better QoL things for most people, whereas AMD has RIS, which is useful for 4K gaming and that's about it. Oh, and they have Radeon Boost now, which is also only useful for 4K gaming and seriously degrades the experience at 1440p or 1080p. Not that that's a massive issue; 1440p and 1080p are usually easy to push with an appropriate card anyway. 

Personally, I like both. My 1080 Ti was great fun, the 780s I have now offer almost the tweakability of my Radeon cards, and my 1660 Ti fuckin slaps for the power consumption. The Radeon VII lives in my big rig; I have an emotional attachment to that lad. Bought it 3 hours after release, had it overnighted, and was probably one of the first few non-reviewers to actually have a Radeon VII in hand. Had a Vega FE before that; it was used, but I still loved the heck out of it. I've had a great time with all the cards I've used, and loved ones from both team green and team red equally, but for different reasons. 

CPUs are the only place I have a personal beef, and specifically only with Ryzen and 1000/2000-series Threadripper: poor single-core performance coupled with low OC headroom, which lets Intel chips from 2013-2014 beat them while costing less (than what I paid for my 2700X/CH7 setup) and having full HEDT features. From a logical standpoint, both platforms have advantages, though Intel really only holds the lead for something like very high refresh rate gaming where every last fps matters, or a very small and extremely specific set of possible HEDT users; other than that, Ryzen and TR 3000 take the cake on every single front. 

So no, I am not saying Nvidia GPUs shouldn't be used in Macs because they're bad, lmao; I love Nvidia cards. And AMD cards. I just made a passing remark that Apple's decision to move to AMD over drivers turned out well, because a lot of the Nvidia GPUs at the time Apple made that choice (not right now, at the time, based on previous generations, specifically the 15" and 17" MacBook Pro models with Nvidia 8000- and 600-series GPUs) had a bad habit of cooking themselves. Something that may not all have been apparent at the time (the 8000-series issue was well known; the newer ones may not have had issues yet), but in hindsight the switch turned out not to be a terrible idea. A little hidden possible benefit I found amusing enough to mention, not a valid reason for Apple to never run Nvidia GPUs in their Macs again. 

 

What hardware you have makes no difference to the validity of your opinions. The rest is trying to backpedal from a claim that makes no sense. You literally said it's a good thing they don't use Nvidia GPUs and tried to use an old problem to justify it.

 

The only reason I take issue with this is that I am tired of people recommending hardware based on old problems. I am sick of the hot-and-slow rhetoric about AMD, the old gimpworks crap about Nvidia, the tired old "Intel are intentionally forcing me to upgrade" rubbish. It all stems from the same inane comments, like the one you made about it being best that Apple don't use Nvidia because of a 7-year-old issue.

 

 

 

 


26 minutes ago, mr moose said:

What hardware you have makes no difference to the validity of your opinions. The rest is trying to backpedal from a claim that makes no sense. You literally said it's a good thing they don't use Nvidia GPUs and tried to use an old problem to justify it. [...] It all stems from the same inane comments, like the one you made about it being best that Apple don't use Nvidia because of a 7-year-old issue.

I never made that claim; I have told you multiple times it was an observation that mildly amused me, so I included it. If you have to correct me on what I said, uh, IDK what you think you're doing. 
 

34 minutes ago, mr moose said:

The only reason I take issue with this is that I am tired of people recommending hardware based on old problems. I am sick of the hot-and-slow rhetoric about AMD, the old gimpworks crap about Nvidia, the tired old "Intel are intentionally forcing me to upgrade" rubbish. It all stems from the same inane comments, like the one you made about it being best that Apple don't use Nvidia because of a 7-year-old issue.

I don't do that. I'm not telling people not to buy something because 7 years ago it was bad. I'll tell people not to buy FX because it's bad, but I won't tell them not to get Ryzen because FX was bad. My life is basically hardware at this point; I've been hands-on with one version or another of most mainstream offerings on most currently relevant platforms. Those I'm not hands-on with, I watch reviews for; I see comparisons, weigh those together, and make a reasoned decision. 


1 hour ago, Zando Bob said:

I never made that claim,

 

Except you did:

7 hours ago, Zando Bob said:

To my knowledge, Apple wanted full control over drivers and Nvidia didn't like that. Probably for the best, Nvidia GPUs in old MacBooks have a bad habit of toasting themselves and bricking the laptop. 

and

5 hours ago, Zando Bob said:

I said it's for the best because Nvidia GPUs in specific MacBook Pro models are known for toasting themselves and killing the laptop; that's a well-known and documented issue with one set of laptops that should be avoided for that reason.

 

I don't know how else to interpret that. The conversation was that Apple don't support Nvidia, and your reasoning for that being "for the best" (your own words) was that they had an issue years ago. You said it; I have not twisted your words.

 

Saying it's best that Nvidia isn't used because of an issue years ago is exactly what you said.

 

 


5 hours ago, Blademaster91 said:

Yeah, at least the 2019 MacBooks went back to the scissor switches, but they should've gone back after the first failures with the 2016 MacBooks. I have no idea how common the display cable or T2 failures are, but it was enough to make the news.

Well, even if Apple went with Threadripper or Epyc, they'd still find a way to attach an Apple-tax price tag to it, and people would find a way to defend it, because any criticism turns into generalizations of "you're not in the market for one anyway". Except people paid to manage company hardware will be doing their research and will know they can get more for less with an AMD Threadripper or Epyc workstation.

Maybe you can get more CPU, but can you use the Apple accelerator card (which likely would blow either of those AMD CPUs out of the water for video editing)?


Are we all on the same page that these products get finalized and frozen in some regards a number of years before launch? Relevant both to the scissor keyboard talk and the AMD talk...

 

Also, switching to AMD is probably regarded at Apple as a switch of almost the same magnitude as switching to (or from) a non-x86 architecture... not something to be taken lightly...


1 hour ago, Blade of Grass said:

Maybe you can get more CPU, but can you use the Apple accelerator card (which likely would blow either of those AMD CPUs out of the water for video editing)?

Just because Apple has an accelerator doesn't mean that they shouldn't provide the best CPU horsepower they can for the money. Playing the other side here: if the accelerator card is already better than Threadripper (which is in turn better than the current Intel CPU inside), then they might as well stick to the iMac Pro CPU series instead (the 8- to 18-core Intel Xeon models) and reduce the prices (for either themselves or the consumer).


Do you all realize this was created by listening to the actual prospective users (pros of different fields) every step along the way, after the 2013 debacle?

 

Whoever’s actually buying this wanted Intel Xeons and that FPGA card (which may be just the tip of the iceberg, with more purpose-built FPGAs to come)...


One crazy stat I just heard on a podcast: Apple went public about a new Mac Pro being in the making almost 1,000 days ago.

And that’s when they went public.

Make of that what you want in the switching-to-AMD debate..


 

On 12/12/2019 at 11:59 PM, Thaldor said:

And why is there a USB-A port in the case?

This makes a lot of sense. In the pro software space there are lots of software products (CAD and audio tools) that use USB-dongle-based licensing. These licenses are bound to the dongle, so that little USB dongle is effectively worth up to $5K USD. Putting this on the outside of a case not only uses up an external port but also leaves it vulnerable to someone walking through your office space picking it up and putting it in their pocket. (Sneaking out an entire Mac Pro, or opening the case, is a little harder to hide than just unplugging a small USB dongle and pocketing it.) The company I work for makes software that is licensed using these dongles, and it is not uncommon for our customers to report them as `lost`.

On 12/12/2019 at 11:59 PM, Thaldor said:

half of it is on a removable card.

It should all be on removable cards. IO specs change quite often (we are about to see USB4 come out); should we expect you to junk your entire case (and motherboard) every time a new connector spec is released? I think they should have made the top IO like this as well.


17 hours ago, thechinchinsong said:

Just because Apple has an accelerator doesn't mean that they shouldn't provide the best CPU horsepower they can for the money. Playing the other side here: if the accelerator card is already better than Threadripper (which is in turn better than the current Intel CPU inside), then they might as well stick to the iMac Pro CPU series instead (the 8- to 18-core Intel Xeon models) and reduce the prices (for either themselves or the consumer).

The Afterburner card is better than the CPU, but at a very limited set of tasks (currently only decoding 8K ProRes RAW video), and even users who lean on this card heavily will also do other things (like colour grading, etc.) that do not make use of it. An FPGA can be reprogrammed, yes, but it will always serve a focused use case, even if you are swapping that use case in and out. CPUs are the very opposite of this; they are general-purpose units.


22 hours ago, saltycaramel said:

One crazy stat I just heard on a podcast: Apple went public about a new Mac Pro being in the making almost 1,000 days ago.

And that’s when they went public.

Make of that what you want in the switching-to-AMD debate..

I imagine the specs were far from set in stone at that point... but I also suspect Apple was thinking of Xeons from the outset.  

 

As it stands, there's no doubt about why Apple went public so early -- it wanted to reassure pros that a new Mac Pro was coming so they wouldn't ditch the platform.  It feels like Apple was acutely aware of a number of complaints about its lineup and has spent the past couple of years addressing them.  Now, if it could have a genuinely new iMac design and a 14-inch MacBook Pro...


16 hours ago, hishnash said:

It should all be on removable cards. IO specs change quite often (we are about to see USB4 come out); should we expect you to junk your entire case (and motherboard) every time a new connector spec is released? I think they should have made the top IO like this as well.

The speed at which connectors change is actually quite slow. You are far more likely to have an outdated machine than an outdated connector, and even then the newest and coolest connector out there will take time to get support; if it's crucial for you at the time and your machine isn't outdated yet, you can just buy some PCIe card to give you that connector. This is especially true now that we have USB-C, which is extremely widely used and which even Apple supports in their own twisted way. Compare that to the situation today (holy [censored]! Someone is still using FireWire in audio equipment, what the [censored]? That thing should be dead, burned, buried and forgotten ages ago) and I think having a USB-C/Lightning-C/whatever-fits-the-standard port will be good enough for a long time.

 

Speaking of long times and hardware getting old before its connectors do: I loved the point Linus made during the WAN Show. Everyone is so fixated on the 28-core monster the Mac Pro is when fully loaded that they forget what it is when it has only 8 cores, 32GB of RAM, a whopping 256GB SSD and a Radeon Pro 580X, no Afterburner card, no wheels, and only a Magic Mouse, at $5,999. Like, yeah, those are some expensive parts, and parts that aren't publicly available, but $6k for them needs a hell of an explanation, and the cheese grater isn't enough.

Not to even begin with the upgrades; the CPUs alone get very overpriced fast (Intel ARK recommended customer prices first, then Apple's upgrade prices):

  1. Intel Xeon W-3223 $749 (base)
  2. Intel Xeon W-3235 $1,398 (+$1,000)
  3. Intel Xeon W-3245 $1,999 (+$2,000)
  4. Intel Xeon W-3265 $3,349 (+$6,000)
  5. Intel Xeon W-3275 $4,449 (+$7,000)

That's almost double the price for the 24-core and 28-core Xeon Ws. Just saying, if you really want a Mac Pro and know what you're doing, it might get a lot cheaper to buy the base model and buy some other components elsewhere. You're voiding the warranty and all, but holy balls, if you need that Xeon W-3275 it becomes cheaper to get it elsewhere (just remember the 8-core W-3223 Mac Pro comes with 2666MHz DDR4 ECC RAM and the others come with 2933MHz DDR4 ECC RAM, so you might want to take that into consideration when tinkering around).
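A quick back-of-the-envelope sketch in Python of that buy-the-base-model idea, using only the list prices quoted above; it ignores resale of the stock W-3223 and the 2666MHz vs 2933MHz RAM difference, so treat it as an illustration rather than a real quote.

```python
# Compare Apple's CPU upgrade pricing with buying the base Mac Pro
# and sourcing the Xeon yourself at Intel's ARK recommended price.
# Figures are the ones listed above; W-3223 resale value is ignored.

BASE_MAC_PRO = 5_999  # 8-core W-3223 configuration

cpu_options = {
    # model: (Intel ARK recommended price, Apple upgrade price)
    "W-3235": (1_398, 1_000),
    "W-3245": (1_999, 2_000),
    "W-3265": (3_349, 6_000),
    "W-3275": (4_449, 7_000),
}

for model, (ark_price, apple_upgrade) in cpu_options.items():
    apple_total = BASE_MAC_PRO + apple_upgrade
    diy_total = BASE_MAC_PRO + ark_price
    print(f"{model}: Apple-configured ${apple_total:,} vs "
          f"base + retail chip ${diy_total:,} "
          f"(difference ${apple_total - diy_total:,})")
```

On those numbers, the 24- and 28-core configurations come out roughly $2,500-2,700 cheaper when the chip is sourced separately, which is where the complaint about the top upgrades comes from.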
 


 

I can understand the price at the high end. There you have a lot of "proprietary" stuff you can ask high prices for, and no one should mind; the Afterburner card and the Radeon Pro Vega IIs are going to be expensive. I can also understand a grand or two for the base parts that every Mac Pro shares (PSU, motherboard, that I/O card, fans and the little parts), hell, $3k with the case included, but we are still pretty far away from that $6k price for the Mac Pro Lite, and it doesn't have anything special that could cost a lot. And before someone says it: I can somehow understand the prices on MacBooks and iMacs, because their form factors come with extra costs, screens and other stuff that might actually get expensive.

 

But that's the price to enter the walled garden through a desktop PC, probably.

 

 


2 hours ago, Thaldor said:

 

Not to even begin with the upgrades; the CPUs alone get very overpriced fast (Intel ARK recommended customer prices first, then Apple's upgrade prices):

  1. Intel Xeon W-3223 $749 (base)
  2. Intel Xeon W-3235 $1,398 (+$1,000)
  3. Intel Xeon W-3245 $1,999 (+$2,000)
  4. Intel Xeon W-3265 $3,349 (+$6,000)
  5. Intel Xeon W-3275 $4,449 (+$7,000)

 

 

For clarity's sake, the W-3265 and W-3275 that Apple is using in the Mac Pro are the "M" versions. The versions you quoted there aren't capable of addressing more than 1TB of RAM. The "M" versions from Intel are another $3K per chip. So, very close to Apple's upgrade price.
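If you fold that premium into the earlier numbers, the gap mostly closes. Here's a minimal sanity check in Python; the roughly $3,000 "M" premium is the figure from the post above, not an official Intel price.

```python
# Sanity check of the "M" SKU point using only figures from this thread:
# compare what Intel charges to step up from the base W-3223 to the
# 1.5TB-capable "M" chip against Apple's upgrade price for that tier.

W_3223_PRICE = 749   # base CPU, ARK recommended price
M_PREMIUM = 3_000    # approximate extra cost of the "M" variant (figure from the post)

chips = {
    # model: (non-M ARK price, Apple upgrade price over the base config)
    "W-3265M": (3_349, 6_000),
    "W-3275M": (4_449, 7_000),
}

for model, (non_m_price, apple_upgrade) in chips.items():
    retail_step_up = (non_m_price + M_PREMIUM) - W_3223_PRICE
    print(f"{model}: retail step-up about ${retail_step_up:,} "
          f"vs Apple's ${apple_upgrade:,} upgrade")
```

On those figures the retail step-up lands within a few hundred dollars of Apple's upgrade price.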

 


40 minutes ago, jasonvp said:

 

For clarity's sake, the W-3265 and W-3275 that Apple is using in the Mac Pro are the "M" versions. The versions you quoted there aren't capable of addressing more than 1TB of RAM. The "M" versions from Intel are another $3K per chip. So, very close to Apple's upgrade price.

 

Hmm. Would an M version be needed then if you didn’t want to use more than 1TB(!) of RAM? That’s a whole lotta RAM.

 

The thing might only work with M chips.


53 minutes ago, Bombastinator said:

Hmm. Would an M version be needed then if you didn’t want to use more than 1TB(!) of RAM? That’s a whole lotta RAM.

 

The thing might only work with M chips.

The non-M chips should be fine. In fact, the rest of the CPUs are non-Ms, as they're all limited to 1TB. You'll see that if you try to price one out, because the only way you can option up to 1.5TB of RAM is with the 24- or 28-core Xeon.

 

