Mac Pro and XDR Display orders available now + unboxing

williamcll
On 12/13/2019 at 2:59 AM, mr moose said:

except you did:

and

 

I don't know how else to interpret that. The conversation was that Apple doesn't support Nvidia, and your reasoning for that being "for the best" (your own words) was that they had an issue years ago. You said it; I did not and have not twisted your words.

Saying it's best that Nvidia isn't used because of an issue years ago is exactly what you said.

 

 

Yes. The switch did two things. One, it allowed Apple to use AMD, whose GPUs were not cooking/dying (as in, specifically at that point in time, Nvidia WAS shipping subpar products). Two, it allowed them to change their driver model. Thus, even though the Nvidia problem was *only* that GPU range, it benefited Apple in other ways too. That's not "opinion" or bias or fanboyism; it's covering the history (and making a comment such as "I think it's cool/drab/good/bad", but those comments are not what you are protesting; you're protesting that they mentioned the history at all).

Do you deny that Apple wanted to stop sourcing GPUs from Nvidia at the time? Or that they wished to change the contract, or the supply quality/quantity/pricing, if they did keep buying the GPUs?

 

I'd not expect this new Mac Pro to have any troubles. It's rather generic stuff, just custom-made boards etc. At this point Apple is good at this, and it's not hitting the space constraints that kill the iPhones and laptops. I notice Apple iPads just don't seem to get the same faults the phones do; as they are bigger, the tolerances are better. The iPads don't get the problems the laptops do, as I guess they are not pushing the specs (battery/screen/space in the chassis/cooling) like they do on the laptops.

So hopefully this generic, back-to-basics but balls-to-the-wall case of a PC... um, Apple Mac, will work well. Any faults could easily be swapped out for new/fixed parts (by Apple though, darned T2 chip!).

 


3 hours ago, Thaldor said:

I think having a USB-C/Lightning-C/whatever-fits-the-standard port will be good enough for a long time.

 

So even within the world of USB-C (shaped connectors) there have already been four different versions: two USB-C specs (that need different chipsets to drive them) and two Thunderbolt versions (due to different Thunderbolt chipsets that support different versions of tunneled DisplayPort).

With USB4 coming out NEXT YEAR, do you really think it is acceptable to ship a machine at this point where all the current ports are `out of date` within 6 months? Yes, USB4 will look just the same as USB 3.2 (and 3.1 Gen 2 and 3.1 Gen 1) and Thunderbolt (Gen 1, Gen 2...), but it will also have its own set of quirks. There will be devices in two years' time that require USB4 (not the physical connector, but the chipset protocol). It's a real shame that the chipset that drives the ports on the top of the case is not replaceable!

 

3 hours ago, Thaldor said:

Intel Xeon W-3275 $4,449 (+$7,000) 

I would suggest taking another look at the parts you listed above: the 28-core that Apple is using is clearly the W-3275M https://ark.intel.com/content/www/us/en/ark/products/193754/intel-xeon-w-3275m-processor-38-5m-cache-2-50-ghz.html

The plain W-3275 only supports up to 1TB of RAM, and the Mac Pro can be configured with 1.5TB!

The W-3275M is priced at $7,453.00! So yes, it looks like Apple is taking a markup of roughly $350-400 (the +$7,000 upgrade plus the ~$800 base CPU you give up comes to ~$7,800, against Intel's $7,453 list price).

The reason the base model costs what it does is that it includes the same motherboard etc. as the fully specced model, so that you can upgrade later. Go take a look at the Xeon W motherboards on the market and find one with the same I/O (PCIe) expandability as this one, then look at the price (hint: you are looking at a server board, and it costs at least $3,000).

If you are buying the base model Mac Pro and are not going to open the box to upgrade (with parts from Apple or other sources, e.g. a MIDI card), then you really should consider buying an iMac Pro or a Mac Mini.

 


31 minutes ago, TechyBen said:

Yes. The switch did two things. One, it allowed Apple to use AMD, whose GPUs were not cooking/dying (as in, specifically at that point in time, Nvidia WAS shipping subpar products). Two, it allowed them to change their driver model. Thus, even though the Nvidia problem was *only* that GPU range, it benefited Apple in other ways too. That's not "opinion" or bias or fanboyism; it's covering the history (and making a comment such as "I think it's cool/drab/good/bad", but those comments are not what you are protesting; you're protesting that they mentioned the history at all).

Do you deny that Apple wanted to stop sourcing GPUs from Nvidia at the time? Or that they wished to change the contract, or the supply quality/quantity/pricing, if they did keep buying the GPUs?

 

You missed the context of the issue. There was no mention of "at that time", and the other user's post (the one I was discussing this with) was insinuating that it is good Apple doesn't use Nvidia now. I was in no way arguing that nothing happened or that Apple should have kept going with Nvidia products back then; I am only arguing that it is not a logical reason to claim it is for the best if Nvidia isn't used today. Those problems don't exist anymore and plenty of people would benefit from Nvidia; hell, some even were benefiting until Apple pulled the pin and left them high and dry.

 

31 minutes ago, TechyBen said:

I'd not expect this new Mac Pro to have any troubles. It's rather generic stuff, just custom-made boards etc. At this point Apple is good at this, and it's not hitting the space constraints that kill the iPhones and laptops. I notice Apple iPads just don't seem to get the same faults the phones do; as they are bigger, the tolerances are better. The iPads don't get the problems the laptops do, as I guess they are not pushing the specs (battery/screen/space in the chassis/cooling) like they do on the laptops.

So hopefully this generic, back-to-basics but balls-to-the-wall case of a PC... um, Apple Mac, will work well. Any faults could easily be swapped out for new/fixed parts (by Apple though, darned T2 chip!).

 

 

There are definitely no arguments anywhere that the product isn't going to work or isn't quality, at least none that I've seen or heard. The only thing people have said is that there are alternatives out there that do the same job, and some are cheaper. It seems, though, that some people are hell-bent on assuming you can only do that if you build it yourself, which is not true: Titan Computers will customize a workstation using almost identical specs for a not-insignificant saving. And that comes with after-sales service and support too. HP will sell you a workstation based on a comprehensive list of the software you wish to use at the level of performance you want. I don't know if Apple does this, but if they don't, then it totally wrecks the argument people are making to justify their belief that Apple is better on the grounds that you don't have to research what parts you need.

 

https://www8.hp.com/us/en/workstations/desktops/index.html

 


1 hour ago, mr moose said:

Those problems don't exist anymore and plenty of people would benefit from Nvidia; hell, some even were benefiting until Apple pulled the pin and left them high and dry.

So given macOS depends on Metal 2.0 to display anything on screen, there would be a problem for users: Nvidia has been unable to produce stable Metal drivers (even for the cards where Apple is supporting them with driver updates).

When it comes to compute-based drivers, Nvidia could release a CUDA/OpenCL (only) driver for macOS without any blessing from Apple, using the user-space driver additions in 10.15. However, if they want to do a display driver, they will be required to support the latest Metal API and will be required not to crash when an eGPU is disconnected randomly from the system. This last point is something they have not been able to demonstrate on Windows yet, so I would not put much hope in their (much smaller) macOS driver team being able to manage this magically on the Mac. Apple will never approve a kernel-space driver that creates a kernel panic when a user unplugs an eGPU (on Windows, much more of the display driver stack runs in user space, so it's not as critical). Maybe Apple will release user-space driver support for display drivers; when that happens, Nvidia will be able to ship as buggy a driver as they like and Apple will (literally) have no way to stop them.


1 hour ago, mr moose said:

And that comes with after-sales service and support too. HP will sell you a workstation based on a comprehensive list of the software you wish to use at the level of performance you want. I don't know if Apple does this, but if they don't, then it totally wrecks the argument people are making to justify their belief that Apple is better on the grounds that you don't have to research what parts you need.

 

If you buy a Mac as a small or large company, you don't buy from Apple's consumer website; you buy through your business rep. You get both significant savings and the above-listed support and advice (in addition to extended warranty etc.).

 


Marques Brownlee's latest podcast episode had a good explanation of why the Mac Pro is so valuable to him even as expensive as it'll be for the 28-core config.

 

People look at the time saved on a render and think that he won't save much time, but they forget that there's a lot more processing involved than the render that gets posted to YouTube. There may be multiple renders per video, such as versions with revised sponsor segments. There's all the CPU-intensive work leading up to the render, like adding effects. And then there's the simple matter of faster crunching being better for business. It not only lets him upload sooner, it gives his entire team a break: they've had to stay late at work just to verify that a video was rendered and uploaded.

 

So, that $28K Mac Pro?  Whether or not there's a lower-priced Windows equivalent almost doesn't matter, because Marques and crew will see noticeable performance (and quality of life) improvements that could pay for themselves.


55 minutes ago, hishnash said:

So given macOS depends on Metal 2.0 to display anything on screen, there would be a problem for users: Nvidia has been unable to produce stable Metal drivers (even for the cards where Apple is supporting them with driver updates).

When it comes to compute-based drivers, Nvidia could release a CUDA/OpenCL (only) driver for macOS without any blessing from Apple, using the user-space driver additions in 10.15. However, if they want to do a display driver, they will be required to support the latest Metal API and will be required not to crash when an eGPU is disconnected randomly from the system. This last point is something they have not been able to demonstrate on Windows yet, so I would not put much hope in their (much smaller) macOS driver team being able to manage this magically on the Mac. Apple will never approve a kernel-space driver that creates a kernel panic when a user unplugs an eGPU (on Windows, much more of the display driver stack runs in user space, so it's not as critical). Maybe Apple will release user-space driver support for display drivers; when that happens, Nvidia will be able to ship as buggy a driver as they like and Apple will (literally) have no way to stop them.

That specific issue is getting old. Apple is stopping Nvidia from releasing drivers for the Mac, so everyone who had a Mac and an Nvidia GPU (there were a few out there) suddenly got dumped in the shit.

 

I couldn't be arsed looking through an old thread to find the pertinent information, so I'll just link to my part of it.

 

This current issue is about Apple not permitting Nvidia to write drivers. Period. It even states on their webpage that if you want to use an Nvidia GPU, you have to use it in an external enclosure through Boot Camp. That effectively makes the Mac a rubbish-bin option if your work is heavily CUDA-based. Why pay for macOS only to use Windows?

 

 


23 minutes ago, Commodus said:

 

So, that $28K Mac Pro?  Whether or not there's a lower-priced Windows equivalent almost doesn't matter, because Marques and crew will see noticeable performance (and quality of life) improvements that could pay for themselves.

So once again, how does render time save them money on a Mac if the Windows equivalent is not slower? If there is no difference in performance between two machines and one is cheaper, then the price is the only factor left to consider, and it does matter.


1 hour ago, hishnash said:

If you buy a Mac as a small or large company, you don't buy from Apple's consumer website; you buy through your business rep. You get both significant savings and the above-listed support and advice (in addition to extended warranty etc.).

 

 

You are still not getting anything more than if you buy other brands; that's the point. People are trying to make it sound like after-sales service, support, professional manufacturing and testing/validation are exclusive to Apple. They're not. No one can use any of that as an argument for why Apple is better, because it applies to the competition as well.

 

And further to that, I linked to the HP website that lets you refine your workstation choices by your software workflow, which is a service I have yet to see Apple offer, which means a small business that orders its own workstations can get a somewhat customized service from HP without having to research components.


4 minutes ago, mr moose said:

So once again, how does render time save them money on a Mac if the Windows equivalent is not slower? If there is no difference in performance between two machines and one is cheaper, then the price is the only factor left to consider, and it does matter.

You can't possibly be this dense.  Marques is clearly a Final Cut Pro X user and has been for some time. He is 100% the kind of person this rig is aimed at: professional video editors/producers who are tied into Apple's operating system and professional applications.


5 minutes ago, jasonvp said:

You can't possibly be this dense. 

How about you learn to read and comprehend the entire discussion instead of insulting people like a moron.

Quote

Marques is clearly a Final Cut Pro X user and has been for some time. He is 100% the kind of person this rig is aimed at: professional video editors/producers who are tied into Apple's operating system and professional applications.

The claim was that it was better because of performance, not because of software specifics. I fully appreciate that people will be tied to Apple software and don't have a choice; that is not what this discussion is about, hence why I qualified my position, which you clearly don't understand.


18 minutes ago, mr moose said:

Why pay for macOS only to use Windows?

exactly

 

You made some solid, valid points as usual in previous posts here. I've never been an Apple fan myself, yet it's interesting that they are dropping the graphics supplier. Mac fans are interesting lololol


1 minute ago, amdorintel said:

exactly

 

You made some solid, valid points as usual in previous posts here. I've never been an Apple fan myself, yet it's interesting that they are dropping the graphics supplier. Mac fans are interesting lololol

And I am not even against Apple. I just don't like seeing people not only fail to grasp the wider picture with regard to proprietary walled hardware and the obvious anti-consumer practices that stem from it, but also try to defend it using some of the most inane arguments (e.g. insinuating that if you want something competitive in price you have to build it yourself, or blocking Nvidia drivers from the OS and then trying to blame Nvidia for it).


5 minutes ago, mr moose said:

proprietary walled hardware and the obvious anti-consumer practices that stem from it

Not the case with the Mac Pro.

5 minutes ago, mr moose said:

blocking Nvidia drivers from the OS and then trying to blame Nvidia for it

False. Nvidia drivers are not blocked from macOS; you can install them just fine. The issue is that they have not been updated. That is both Nvidia's and Apple's fault, though primarily Nvidia's.


1 hour ago, mr moose said:

which is a service I have yet to see Apple offer, which means a small business that orders its own workstations can get a somewhat customized service from HP without having to research components.

This is literally what Apple business reps do. They also offer you discounts on the consumer list price. (And Apple's website even prompts you, if you are a business, to consider ordering that way rather than from the consumer website.)


1 hour ago, mr moose said:

So once again, how does render time save them money on a Mac if the Windows equivalent is not slower? If there is no difference in performance between two machines and one is cheaper, then the price is the only factor left to consider, and it does matter.

So when it comes to compute, macOS and Linux will typically outperform Windows in high-core-count, high-memory situations; the reason for this is lower-level system architecture differences. Go take a look at https://www.phoronix.com/scan.php?page=home; they commonly run comparisons showing this behavior. Also, when it comes to macOS, Apple provides a large set of optimised compute libraries that help developers make use of system features (AVX etc.) in such a way that the developer does not need to explicitly target these instruction sets. This means many more applications on macOS will make use of AVX-512 (if your CPU has it); on Windows, in comparison, developers (such as myself) need to explicitly think, code and test for these cases, so in the end only a small fraction of the application makes maximum use of the hardware.
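As a concrete sketch of that difference (illustrative only; the file name and data here are made up), this is what "not naming the instruction set" looks like with Apple's Accelerate framework, one of those optimised compute libraries. vDSP dispatches internally to the best SIMD path the CPU offers at runtime, whereas the hand-tuned Windows route described above would mean writing and testing `<immintrin.h>` intrinsics per instruction set yourself.

```cpp
// Minimal sketch: letting a vendor library choose the SIMD path.
// Build on macOS with: clang++ vadd_demo.cpp -framework Accelerate
#include <cstdio>
#include <Accelerate/Accelerate.h>

int main() {
    const int n = 8;
    float a[n] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[n] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[n];

    // Element-wise add. vDSP picks the widest vector unit available
    // (SSE/AVX/AVX-512 on Intel, NEON on Apple silicon); the caller
    // never writes intrinsics or tests per-ISA code paths.
    vDSP_vadd(a, 1, b, 1, c, 1, n);

    for (int i = 0; i < n; ++i) std::printf("%g ", c[i]);
    std::printf("\n");
    return 0;
}
```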


1 hour ago, mr moose said:

That effectively makes the Mac a rubbish-bin option if your work is heavily CUDA-based

So Apple has explicitly permitted anyone (including Nvidia) to produce CUDA-style drivers without Apple being able to block them; that is how the user-space drivers work. It is on Nvidia to produce such drivers; Apple can't reverse-engineer Nvidia cards to write them (and even if they could, Nvidia would sue).

What Apple is blocking is kernel-space drivers that are not stable. Maybe you don't understand what happens when a kernel-space driver has a bug, but in summary it means user data gets compromised or corrupted, and in some cases system hardware is even damaged. It is, yes, a downside of macOS (compared to some Linux systems) that display drivers are kernel-space rather than user-space.

If a driver runs in `user-space`, then a crash, a security bug or an endless loop does not compromise the entire OS: you can restart such a driver, and it does not get FULL system memory access.

A kernel-space driver runs within the kernel run-loop. That means if it takes too long to do something, the ENTIRE system locks up; if it has a security bug, that exposes THE ENTIRE system's MEMORY, of ALL applications; if it crashes, the system KERNEL panics; and if it gets stuck in a loop, you CANNOT restart it without KILLING the KERNEL (killing the machine).

Writing kernel-space drivers for PCIe devices is hard; writing them for PCIe devices that might at any point be pulled away from underneath you (eGPU) is very, very hard. A single bug there will mean Apple rejects the driver. Seeing how Nvidia handles eGPUs being unplugged on Windows (it sometimes crashes the system), I would not be surprised if the drivers they submitted for review did the same on macOS.
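To make the user-space/kernel-space difference concrete, here is a toy, self-contained sketch (my own illustration, not an actual macOS driver API): the "driver" is just a child process, so a supervisor can watch it die and relaunch it, which is exactly the property that makes user-space drivers restartable. A comparable fault in kernel space would panic the whole machine. `fake_driver` and the three-attempt restart loop are invented for the demo.

```cpp
// Toy sketch: a crash in a user-space "driver" is contained and
// restartable, because the driver is just another process.
#include <cstdio>
#include <cstdlib>
#include <sys/wait.h>
#include <unistd.h>

static void fake_driver() {
    std::printf("driver: running in its own process (pid %d)\n", getpid());
    std::abort();  // simulate a driver bug: kills this process only
}

int main() {
    for (int attempt = 1; attempt <= 3; ++attempt) {
        pid_t pid = fork();
        if (pid == 0) {        // child acts as the "user-space driver"
            fake_driver();
            _exit(0);          // unreachable; the fake driver crashes
        }
        int status = 0;
        waitpid(pid, &status, 0);
        if (WIFSIGNALED(status)) {
            // The supervisor observes the crash and simply restarts the
            // driver; the rest of the system never goes down. A bug in
            // a kernel-space driver offers no such second chance.
            std::printf("supervisor: driver crashed (signal %d), "
                        "restarting (attempt %d)\n",
                        WTERMSIG(status), attempt);
        }
    }
    return 0;
}
```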


18 minutes ago, hishnash said:

This is literally what Apple business reps do. They also offer you discounts on the consumer list price. (And Apple's website even prompts you, if you are a business, to consider ordering that way rather than from the consumer website.)

So there is no difference then? Great, can people stop arguing that there is? I highly doubt it.

 

 

13 minutes ago, hishnash said:

So when it comes to compute, macOS and Linux will typically outperform Windows in high-core-count, high-memory situations; the reason for this is lower-level system architecture differences. Go take a look at https://www.phoronix.com/scan.php?page=home; they commonly run comparisons showing this behavior. Also, when it comes to macOS, Apple provides a large set of optimised compute libraries that help developers make use of system features (AVX etc.) in such a way that the developer does not need to explicitly target these instruction sets. This means many more applications on macOS will make use of AVX-512 (if your CPU has it); on Windows, in comparison, developers (such as myself) need to explicitly think, code and test for these cases, so in the end only a small fraction of the application makes maximum use of the hardware.

And?

5 minutes ago, hishnash said:

So Apple has explicitly permitted anyone (including Nvidia) to produce CUDA-style drivers without Apple being able to block them; that is how the user-space drivers work. It is on Nvidia to produce such drivers; Apple can't reverse-engineer Nvidia cards to write them (and even if they could, Nvidia would sue).

What Apple is blocking is kernel-space drivers that are not stable. Maybe you don't understand what happens when a kernel-space driver has a bug, but in summary it means user data gets compromised or corrupted, and in some cases system hardware is even damaged. It is, yes, a downside of macOS (compared to some Linux systems) that display drivers are kernel-space rather than user-space.

If a driver runs in `user-space`, then a crash, a security bug or an endless loop does not compromise the entire OS: you can restart such a driver, and it does not get FULL system memory access.

A kernel-space driver runs within the kernel run-loop. That means if it takes too long to do something, the ENTIRE system locks up; if it has a security bug, that exposes THE ENTIRE system's MEMORY, of ALL applications; if it crashes, the system KERNEL panics; and if it gets stuck in a loop, you CANNOT restart it without KILLING the KERNEL (killing the machine).

Writing kernel-space drivers for PCIe devices is hard; writing them for PCIe devices that might at any point be pulled away from underneath you (eGPU) is very, very hard. A single bug there will mean Apple rejects the driver. Seeing how Nvidia handles eGPUs being unplugged on Windows (it sometimes crashes the system), I would not be surprised if the drivers they submitted for review did the same on macOS.

I suggest you go back through the other threads. It has been demonstrated over and over:

 

https://appleinsider.com/articles/19/01/18/apples-management-doesnt-want-nvidia-support-in-macos-and-thats-a-bad-sign-for-the-mac-pro

 

This article also explains it, and it was only written at the start of this year.

 

 

Also, here's the Apple website that clearly doesn't say anything at all about Nvidia:

 

https://support.apple.com/en-au/HT208544

 

And here is the Mac website that says:

 

Quote

Mac Pro supports the same GPUs that are supported by external graphics processors (eGPUs). If you use Boot Camp and want to install a NVIDIA card to use in Windows on your Mac, don't install the card in slot 2. Learn about using AMD graphics cards with Microsoft Windows on Mac Pro (2019).

It literally says the Mac Pro supports the same eGPU support list (linked earlier), which does not list Nvidia, and the only mention of Nvidia is using it through Boot Camp and Windows on an external GPU (you can't put it in the case).

 

 

 

 

They just do not want you to use Nvidia at all.

 

Remember, the argument here is for people who just want it to work, not to have to go through an external GPU and 3rd-party drivers via Boot Camp, let alone anything that isn't kernel-level. I mean, seriously, at this point the whole argument of "professionals just want it to work" is an absolute joke if they want Nvidia.

 

 

 


1 hour ago, SenKa said:

Not the case with the Mac Pro.

False. Nvidia drivers are not blocked from macOS; you can install them just fine. The issue is that they have not been updated. That is both Nvidia's and Apple's fault, though primarily Nvidia's.

See above post.

 

and

https://www.gizmodo.com.au/2019/11/apple-and-nvidia-are-over/

 

 

 


5 minutes ago, mr moose said:

 

https://appleinsider.com/articles/19/01/18/apples-management-doesnt-want-nvidia-support-in-macos-and-thats-a-bad-sign-for-the-mac-pro

 

This article also explains it, and it was only written at the start of this year.

So user-space drivers have only been possible in macOS since 10.15 (the release that came out this October), so an article from the start of the year will only be considering kernel-space drivers, and as I explained quite clearly, kernel-space drivers require Apple to approve them.


Haven't seen anyone mention this yet, so I'll put it here:

Sure, the desktop case looks like a cheese grater, but darn, that rackmount case though

Spoiler

[Image: Mac Pro rack-mount case]

IMO that looks really cool.


Just now, hishnash said:

So user-space drivers have only been possible in macOS since 10.15 (the release that came out this October), so an article from the start of the year will only be considering kernel-space drivers, and as I explained quite clearly, kernel-space drivers require Apple to approve them.

 

So people in defense of this situation are OK with user-space drivers for professional workstations (which come with their own overhead and aren't as good as kernel-space), while in the same breath arguing it's the best product because the professionals who use it care about every last drop of performance?

 

 

If you can't see the inconsistency in that line of thinking, then I don't know what to tell you.

 

 

 


5 hours ago, hishnash said:

So even within the world of USB-C (shaped connectors) there have already been four different versions: two USB-C specs (that need different chipsets to drive them) and two Thunderbolt versions (due to different Thunderbolt chipsets that support different versions of tunneled DisplayPort).

With USB4 coming out NEXT YEAR, do you really think it is acceptable to ship a machine at this point where all the current ports are `out of date` within 6 months? Yes, USB4 will look just the same as USB 3.2 (and 3.1 Gen 2 and 3.1 Gen 1) and Thunderbolt (Gen 1, Gen 2...), but it will also have its own set of quirks. There will be devices in two years' time that require USB4 (not the physical connector, but the chipset protocol). It's a real shame that the chipset that drives the ports on the top of the case is not replaceable!

You do know that USB4 is basically TB3, right? In fact, the TB3 branding will still be used for full-featured USB4, while the USB4 spec will allow vendors to omit certain features, just like current USB protocols can't be fully trusted to do everything.

Hence all computers that have a TB3 port will automatically support USB4 via a future software update (which would just be a branding rename).


5 minutes ago, RedRound2 said:

You do know that USB4 is basically TB3, right? In fact, the TB3 branding will still be used for full-featured USB4, while the USB4 spec will allow vendors to omit certain features, just like current USB protocols can't be fully trusted to do everything.

Hence all computers that have a TB3 port will automatically support USB4 via a future software update (which would just be a branding rename).

USB4 is USB 3.1 Gen 2 + TB3 integrated into one standard, to avoid the confusion of the ubiquitous USB-C port as to whether it is USB or TB3. From my understanding, anyways.


11 minutes ago, RedRound2 said:

You do know that USB4 is basically TB3, right? In fact, the TB3 branding will still be used for full-featured USB4, while the USB4 spec will allow vendors to omit certain features, just like current USB protocols can't be fully trusted to do everything.

Hence all computers that have a TB3 port will automatically support USB4 via a future software update (which would just be a branding rename).

Yes, USB4 is `basically` TB3, but even within TB3 there are differences (with respect to tunneling DisplayPort) depending on the chipset.

Given how many different USB specs we have seen in the last few years that use the USB-C connector, I would not be surprised if, within 2 years of USB4 shipping, we get USB4.1 Gen 1, then USB4.0 Gen 2 (somehow...), then USB 3.3 Gen 4...

