
Mac Pro and XDR Display orders available now + unboxing

williamcll
6 minutes ago, Maticks said:

or for 50K USD, you could build the same system and watercool it with pipes dipped in gold.

 

try 10K

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x 12-disk raid Z3 striped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


26 minutes ago, DrMacintosh said:

Why would any major system builder partner with them after a few demos and product launches?

We are already seeing Ryzen chips in top-end servers and workstations from system builders.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


31 minutes ago, DrMacintosh said:

With their GPU division. The Radeon Division and Ryzen/TR/EPYC Divisions are all VERY different. Just because one is doing well doesn’t mean business partners will trust and commit to another. 
 

Corporate life doesn’t work like the PC hardware enthusiast world. Corporate life requires the overall picture to be taken into account. Going with TR2 or EPYC for the Mac Pro after 4/5ths of the development cycle was already complete would have been ignoring the overall picture, not to mention irresponsible and a disservice to the pros who need a new, modern pro workstation that runs macOS and have been waiting on this Mac Pro. 
 

Delaying the Mac Pro for another year or two just to get TR2/EPYC would have hurt Apple more than it would have benefited them. 

Considering casual programmers were able to get X570 working almost perfectly with macOS, I really don't see the move from one to the other taking that long. It's all x86, and most of it talks over PCIe; a few months of work, sure, but not much more.

And being Apple, they would surely be able to get a custom EPYC SKU with higher single-core boost clocks, and would have ended up with a much faster product, especially since they want higher bandwidth between the CPU and the GPUs.


15 hours ago, DrMacintosh said:

This is not how the MPX slot works. The MPX slot is a full x16 PCIe slot with an additional slot for power and TB3. It is an addition to PCIe, not a replacement, and removing the MPX modules does not prevent regular cards from being inserted. 

 

Threadripper was not as established when Apple was developing the Mac Pro. If AMD wanted Apple to use TR or EPYC, they needed to come to market sooner. The gears were already in motion for the Mac Pro by the time TR was clearly better than Xeon for Apple's uses. 

 

Going with Threadripper isn't as simple as taking an off-the-shelf part and throwing it in; everything about the Mac Pro would have had to be reengineered. 

 

Don't forget that Apple most likely gets info on AMD's products much sooner than the public releases, specifically because of this, and has probably had Rome samples working for a long, long time, likely close to a year.


Scrolled all the way to the end of the thread after the first mildly uninteresting apples-to-oranges price comparison.


I have other kinds of curiosities 

1) It’s remarkably silent. Why don’t we see many more builders (both OEM and home PC builders) going that route (lots of holes both front and back, 3 big fans, passive CPU and passive GPUs)? I would guess it’s pricey and requires a number of variables to be kept under strict control, but liquid cooling is delicate for other reasons...

2) Would it even be feasible, in terms of airflow, to add dust filters to this kind of system? I’d guess the impact is non-trivial...

3) Could it be “self-cleaning” somehow?

4) The MPX module GPUs are passive and optimized for this, but for aftermarket GPUs, which style of cooling (blower, 3 fans, 2 bigger fans, short, long, etc.) would be best suited to this kind of case airflow?

5) What would be the most silent RX 5700 XT for this? Maybe the PowerColor Red Dragon? (Note that height clearance is not an issue; MPX modules sport a pretty hefty “forehead” themselves.)

6) I hope some vendor, or Apple itself, comes up with a Thunderbolt/DP re-routing daughterboard that lets a regular aftermarket consumer GPU drive the XDR 6K 60Hz display over TB3... (I’m assuming the ones already available from Gigabyte/Asus, even the Titan Ridge ones, cannot do that currently...)


2 hours ago, hishnash said:

The MPX cards have advantages over other GPUs: thermals/noise and all that extra Thunderbolt I/O. But I don't think it is intentionally hidden that these PCIe slots are PCIe slots.

Now that you mention it, the fact that there are two normal SATA ports and a proprietary power plug for them is kind of hidden. There is no mention of them on the Mac Pro site, not even in the technical specifications. You can see them clearly in many pictures, but Apple doesn't say they are there or what they are for. Apple doesn't hide that the slots are just normal PCIe slots, but only in the technical specifications will you find a note that MPX isn't the only way to deliver more than 75W to your PCIe hardware. And why is there a USB-A port inside the case? To stick your external storage inside so you can fill the empty space left by not having two full-height MPX cards?

 

I just love the Apple ad-talk: "Mac Pro has extremely high‑performance I/O, and lots of it. It begins with four Thunderbolt 3 ports, two USB-A ports, and two 10Gb Ethernet ports." Like, yeah, that's a lot of I/O if you're used to having only two USB-C ports, or a charger and a couple of USB-A ports; hell, you might mistake that fourth Thunderbolt port for a charging port. The RJ45 ports are sweet, but otherwise that's a fairly minimalistic set of I/O, and half of it sits on a removable card.


48 minutes ago, saltycaramel said:

-snip-

3) Could it be “self-cleaning” somehow?

-snip-

Self-cleaning? Just stick a Dyson on the front, but don't forget to empty the container every month.


4 hours ago, Thaldor said:

-snip- And why is there a USB-A port inside the case? To stick your external storage inside so you can fill the empty space left by not having two full-height MPX cards? -snip-

Can't speak for certain on the rest, but the internal USB-A port is mainly for hardware keys that some creative apps require (arguably it'd also work for security keys).  You wouldn't need to tie up external ports just to use your media editing tools.


I wish that “additional x8 PCIe + additional power + 4x DisplayPort re-routing to other TB3 ports in the system” slot on the front part of the MPX module would become an industry standard found on every mobo...

 

Now that Thunderbolt is a fact of life, that’s the most elegant way to tie everything together and have many full-blown (including 4K-5K-6K-8K video out) thunderbolt ports on desktop PCs...those tb3 daughterboards on some PCs can’t cut it...

 

That’s what great I/O is about... not a bunch of USB-A and PS/2 ports...

 

Now, with the base GPU (580X) the new Mac Pro only has 4 TB3 ports in total... they become 8 or 12 by upgrading to 1 or 2 full-length MPX modules... that’s a ludicrous number of full-blown TB3 ports, with the PCIe lanes to back them up...

 

What’s interesting is the upcoming Navi GPU (W5700X) they’ll add to the available options... it will become the new “cheaper” option to get a full-length MPX module with 4 added TB3 ports... also, for some reason, out of the 4 cards (base Polaris, middle Navi, high Vega, higher dual-Vega), only this particular one is said to support “DSC” (a DisplayPort compression feature to drive single-tile 8K+ displays)... wonder why Apple felt the need to mention that and what displays (not necessarily from Apple) they know to be in the pipeline...
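
For a rough sense of why DSC matters for a single-tile 8K display, here's a back-of-the-envelope bandwidth check (a minimal sketch; the resolution, refresh rate, bit depth and the decision to ignore blanking overhead are my own assumptions, not anything Apple has published):

```c
#include <stdio.h>

int main(void) {
    /* Assumed mode: single-tile 8K (7680x4320) at 60 Hz, 10-bit RGB = 30 bits per pixel. */
    double h = 7680, v = 4320, hz = 60, bpp = 30;
    /* Active-pixel data rate only; blanking overhead adds a bit more on top. */
    double needed_gbps = h * v * hz * bpp / 1e9;

    /* DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s with 8b/10b coding ~= 25.92 Gbit/s of payload. */
    double hbr3_gbps = 4 * 8.1 * 0.8;

    printf("Uncompressed 8K60 10-bit: ~%.1f Gbit/s\n", needed_gbps);   /* ~59.7 */
    printf("DP 1.4 HBR3 payload:      ~%.1f Gbit/s\n", hbr3_gbps);     /* ~25.9 */
    printf("Compression needed:       ~%.1fx -> hence DSC\n", needed_gbps / hbr3_gbps);
    return 0;
}
```

So even before blanking, an uncompressed single-tile 8K60 stream needs a bit over twice what an HBR3 link can carry, which is why DSC support is the thing to look for.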

 

Also, I can’t help but marvel at how “beastly” this kind of I/O system will become when, in the next model, these 3 technologies are in place: PCIe 4.0, TB4 (?) and DP 2.0.


Another thing to poke at with "why Apple didn't go with AMD"

 

Let's compare Apple to, say, Dell, who doesn't have to deal with the development of an OS or even drivers. They may do a few things to add their own branding, but for the most part, once the item is shipped, Dell is likely hoping that a company's IT department keeps the software side running while Dell just handles the hardware itself. This is why Dell can supply hardware in differing configurations. They're not the ones who have to make sure the OS and its drivers work with the hardware beyond passing a smoke test. If there is a problem, they complain to whoever is in charge of said OS or drivers and let them take care of it.

 

Apple on the other hand, does not have this luxury. They have to make the system software that sits on top of their hardware, with the possibility of developing and maintaining some drivers. The way I see it, Apple would rather prefer to have as few configurations of hardware as possible. It's a logistical and maintenance concern, especially when they're in charge of the entire product stack.


9 hours ago, RedRound2 said:

This is what the Mac Pro is meant for, and this use case and others like it are enough to justify the price, and to explain why people prefer and love this product more than anything else on the market (the ecosystem, and how AirDrop just makes it that easy, is honestly reason enough for me).

 

 

Saw this and I wanted to commend Jonathan on a different take -- it's not the only valid take, but it's a useful one.

 

I love it when the creatives realize that Jonathan's review unit (28 cores, 384GB of RAM) is absolute overkill for the studio and doesn't hit 50 percent CPU usage even when they go out of their way to push it.  That you could probably go with 'just' a 12- or 16-core model, with less RAM, and still accomplish just about everything you'd want from a professional music workstation.

 

Video producers would be another matter, but I imagine you still don't need to go all-out on the CPU.  Arguably the GPU is more important if you need real-time visualization of your edits and effects.


30 minutes ago, Mira Yurizaki said:

Another thing to poke at with "why Apple didn't go with AMD"

 

-snip-

 

Apple on the other hand, does not have this luxury. They have to make the system software that sits on top of their hardware, with the possibility of developing and maintaining some drivers. The way I see it, Apple would rather prefer to have as few configurations of hardware as possible. It's a logistical and maintenance concern, especially when they're in charge of the entire product stack.

In pretty much any other environment the device manufacturer handles the drivers; macOS is the exception, where Apple has this odd compulsion to do everything themselves unless it's an external device. Microsoft doesn't really make drivers (apart from universal ones) for devices to work with Windows, and on the *nix side the kernel and distribution developers write even fewer drivers for hardware. Put simply, Apple seems to draw the line where a driver needs really deep integration with the OS: they don't allow anyone but themselves to make that driver, which is probably one reason they moved to AMD GPUs (AMD being the more open GPU manufacturer, while Nvidia keeps everything closed, which quite possibly means Nvidia didn't give Apple the source code for its drivers, for its own reasons). So if drivers really were the issue, then all the blame goes to Apple for being as much of a walled garden as they are, and so fixated on not letting anybody see under the hood that nobody else can do their job and provide customers what they want.


Just now, Thaldor said:

-snip-

...Which works in their favor. On average, Macs just work and usually have fewer issues than Windows PCs with differing hardware (this is true in my experience here at work, and IBM even did a whole study that found Macs caused less trouble than Windows PCs). Apple gets away with their walled-garden tactics because it's usually a very good user experience, and the people buying Macs can do everything they want to with the Mac without much bother. 

If you can't do what you want with a Mac in the first place, why would you ever buy one? 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


11 minutes ago, Thaldor said:

-snip-

While you have some points, I do find there to be a tremendous irony in chastising Apple for a walled garden when it's using the more open GPU option instead of the more proprietary NVIDIA. I'm not an open-source die-hard by any means, and I'd love it if Macs came with RTX and Quadro cards, but I find it hard to fault Apple for choosing something for its openness (if just so that it can have complete control over drivers).


5 minutes ago, Commodus said:

-snip-

It's also fun knowing the macOS kernel is open source.

 

And then everything else on top of it that Apple created isn't.


On 12/9/2019 at 10:27 AM, AngryBeaver said:

I didn't even look at this from a corporate standpoint, where they are arguably an even bigger concern from a security POV.

What security concern?

On 12/9/2019 at 11:38 AM, Bombastinator said:

Linux? It is “free”.  The way people get paid writing code for Linux is they keep making a fork or a different version of everything so they can be in charge and get credit for “innovating” something so they can go work somewhere else.  The result is Linux is a mess of competing distros and competing file systems and networking systems and all sorts of garbage, and a lot of them have more little hidden speed bumps designed to make other systems not work than Microsoft ever put in their stuff.

Yeah, the Linux kernel is a bit of a mess, and over the past 5 years its performance has been on a downward spiral.  

On 12/11/2019 at 1:23 PM, Thaldor said:

 

-snip-

Not sure I get your firmware comment. If Apple has specific requirements of the hardware that it doesn't generally support, then having a requirement on the firmware makes sense. Heck, if there are specific tie-ins with the way things are done that change in other versions of the firmware, it still makes sense. Version-matching software for compatibility reasons is something the tech industry does across the board; the thing is, most computer manufacturers don't tie in closely with their hardware (which gives worse performance), so they likely don't have the same requirements. 

On 12/11/2019 at 1:56 PM, Mira Yurizaki said:

I also don't even think Apple considers the CPU that important. Why render things on the slower CPU when you can offload them onto the much faster GPU and/or accelerator card? If that's how they intend people who do work on a Mac Pro to use the thing, then it makes more sense to minimize the cost of the CPU and its platform as much as possible. And since they're already cozy with Intel, even if Intel's CPUs are a bit more expensive, the development cost to switch over to and support AMD would probably outpace the savings.

bingo. Custom hardware > CPU in this case 

22 hours ago, GDRRiley said:

I agree it would add more pain but they did just start supporting a new socket/platform so theoretically switching to AMD shouldn't cause them any more issues than trying to iron out a new intel one.

 

That's not how it works. Architecture upgrades are incremental and build upon one another; effectively, Apple gets their entire history of Intel tuning to carry forward with them to the newest update, whereas they would be starting from scratch with AMD.

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


23 minutes ago, Zando Bob said:

-snip-

I'm not taking a side on whether it's better or worse; it's just an oddball approach with its own ups and downs.

1 minute ago, Commodus said:

-snip-

As I have stated earlier in the thread, Apple and Nvidia quite probably have a stormy past that, for good or bad, isn't very public. I would guess Nvidia wasn't really happy when Quadro GPUs came to Macs and Apple decided to make sure it got a cut of every GPU sold for a Mac, so when Apple decided to take control of driver development, Nvidia probably wasn't that willing to hand its driver source to the company that had earlier scalped it (I'm pretty sure Nvidia got a bigger cut from a Quadro sold directly by PNY/EVGA than from a PNY/EVGA Quadro sold by Apple).


Apple and openness is a fun conversation. Apple does use a lot of open source to build their OSs, but at least the last time I looked into the subject, Apple published only the parts that their upstream projects (like FreeBSD) force them to publish, and what they publish is about as far from working as something can be, with holes in the code so big that it was basically useless.


9 minutes ago, Blade of Grass said:

... whereas they would be starting from scratch with AMD.

To add to this: just because a processor is x86-64 compatible doesn't mean it behaves identically: https://en.wikipedia.org/wiki/X86-64#Differences_between_AMD64_and_Intel_64
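
As a concrete illustration of why "it's all x86" isn't the whole story: low-level code has to probe CPUID at runtime instead of assuming a feature set from the architecture name. A minimal sketch using GCC/Clang's <cpuid.h>; SSE4a is just one example of a feature one vendor exposes and the other doesn't:

```c
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang wrapper around the CPUID instruction */

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* Leaf 0: the vendor string is returned in EBX, EDX, ECX (in that order). */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    printf("Vendor: %s\n", vendor);   /* "GenuineIntel" or "AuthenticAMD" */

    /* Extended leaf 0x80000001: ECX bit 6 = SSE4a, which only AMD parts report.
       The same runtime check applies to any feature the two vendors don't share. */
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        printf("SSE4a: %s\n", (ecx & (1u << 6)) ? "yes" : "no");
    return 0;
}
```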


2 minutes ago, Thaldor said:

As I have stated earlier in the thread Apple and Nvidia quite probably has a stormy past that for good or bad isn't that public.

To my knowledge, Apple wanted full control over drivers and Nvidia didn't like that. Probably for the best, Nvidia GPUs in old macbooks have a bad habit of toasting themselves and bricking the laptop. 


1 hour ago, Zando Bob said:

To my knowledge, Apple wanted full control over drivers and Nvidia didn't like that. Probably for the best, Nvidia GPUs in old macbooks have a bad habit of toasting themselves and bricking the laptop. 

That GPU issue was actually NVIDIA's problem -- there was a problem with NVIDIA GPUs from that era failing no matter what platform you were on.


2 hours ago, Zando Bob said:

To my knowledge, Apple wanted full control over drivers and Nvidia didn't like that. Probably for the best, Nvidia GPUs in old macbooks have a bad habit of toasting themselves and bricking the laptop. 

 

Better not buy any hardware then; they have all had problems at some point in their lifetime. 

 

 


1 minute ago, mr moose said:

Better not buy any hardware then; they have all had problems at some point in their lifetime. 

...? I mean, 8000-series GPUs in 17" MBPs, anyone? Keyboards in 2016-2019 MBPs? Those are very, very common issues; the number of failures in most other products is usually vastly lower. Failures are normal in literally all hardware; that's not a reason not to buy it, and I never stated that. I said it's for the best because Nvidia GPUs in specific MacBook Pro models are known for toasting themselves and killing the laptop; that's a well-known and documented issue with one set of laptops that should be avoided for that reason. And to my knowledge there haven't been as many cases of AMD GPUs doing that, so it's more of a "hey, this had a benefit somewhere else, irrelevant to the original reason this choice was made" observation. 

In no way did I say not to buy hardware because of normal hardware failures, lmao; I only remarked on the failure rate being abnormally high for certain specific bits (in this case, MacBook Pros with Nvidia dGPUs; more specifically, C2D and early Core i series 15" and 17" MacBook Pros with Nvidia 8000-series and, IIRC, 600-series dGPUs). 


10 minutes ago, Zando Bob said:

-snip-

Do you honestly think that because some Nvidia chips had issues 5 years ago they would still have those same issues today? They are not the same chips, they don't use the same process, and 5 years of problem solving have gone into Nvidia's products since then. Trying to argue that Nvidia products today won't work or should be avoided because of something that happened 5 years ago is irrational. 

 

EDIT: Quite frankly, arguing "it's for the best they avoid Nvidia" is as silly as people arguing to avoid AMD because their parts run hot and loud while underperforming. Applying old problems to new products is wrong.

 


On 12/8/2019 at 11:00 PM, Commodus said:

So you ignored everything I said and decided to post a comparison of highly dissimilar systems -- you even deliberately avoided the Xeon and hoped that I wouldn't notice.  Er, try again.  Yeah, the Digital Storm rig would be faster for gamers, and for pros who don't care about Xeon advantages (mainly cache), ECC memory, pro-oriented GPUs or the fastest possible storage, but we're not talking about a consumer desktop, are we?

 

I also noticed that Digital Storm is still using Xeons from 2017, so that might be why you were afraid of doing a real comparison... because you can't.

 

And Threadripper is great if you value sheer core count above all else, but again, you're not actually comparing like-for-like parts, you're just choosing something a gamer would want and then throwing in a workstation GPU almost as an afterthought.  I'd add that there are all kinds of unknowns you're ignoring, such as SSD speeds, noise levels (important in a media workstation), I/O and of course real-world performance in a given workflow.

 

Also, you do know that CPU and GPU performance don't scale linearly, right?  That having 32 cores doesn't mean a chip is four times faster than one with eight?  And that there will be tasks where fewer high-speed cores may be faster?  This isn't even about disputing whether or not the Threadripper would be faster, it's just that you're operating from a position where you're either ill-informed or dishonest.
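
To put rough numbers on the scaling point, here's a minimal Amdahl's-law sketch (the 90% parallel fraction is an arbitrary assumption for illustration, not a measurement of any real workload):

```c
#include <stdio.h>

/* Amdahl's law: speedup = 1 / ((1 - p) + p / n),
   where p is the parallel fraction of the work and n is the core count. */
static double amdahl(double p, double n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    double p = 0.90;                      /* assumed: 90% of the work parallelizes */
    int cores[] = {8, 12, 16, 28, 32};
    for (int i = 0; i < 5; i++)
        printf("%2d cores -> %.2fx speedup\n", cores[i], amdahl(p, cores[i]));
    /* With p = 0.9, 32 cores give ~7.8x over one core, not 32x,
       and only ~1.7x over an 8-core part. */
    return 0;
}
```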

Hmmmm, sounds like an angry Apple fan trying to justify an insane price tag.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 auros pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 

 


12 minutes ago, mr moose said:

Trying to argue that Nvidia products today won't work or should be avoided because of something that happened 5 years ago is irrational. 

 

Never said that. In no way, at all, did I say that. I said their switch, when they did it, turned out for the better, because Nvidia GPUs were having issues, well documented ones, with many Macs. The main reason for switching was because Nvidia refused to let Apple manage the drivers themselves. 

It's like the classic "someone does something different from the routine and avoids an accident". Someone pointing out "hey, that decision turned out pretty well for this other reason too", isn't saying that the behavior that allowed them to avoid the accident in the first place should now become the norm. It was an observation, not an argument for Apple to never use Nvidia GPUs ever again. Again, the reason for that (as far as I know) is Apple's insistence on doing the drivers themselves, and Nvidia's refusal to go along with that. 

You're trying to disprove an argument that was never made, based on what is obviously a misreading of my original statement.

