
Apple doesn't want to support Nvidia in macOS

DrMacintosh
3 minutes ago, firelighter487 said:

Personally I'm against this. Not everyone has the money to just go buy another GPU when their Nvidia card worked on the previous OS.

 

I don't support Nvidia. I think they're an awful company, but I do support people who are on a "budget" in the sense that they cannot spend another $500 on a new GPU after already having bought a Mac, some kind of external GPU enclosure, and an expensive Nvidia GPU.

 

IMO Nvidia should be able to release drivers for its older cards so that people who already own them can still use them. If Apple wanted to block anything, it should only have blocked Nvidia from releasing drivers for new cards that come out. Not supporting older cards can be a big financial issue for some people...

It's a storm in a teacup. This happens every time Apple does a macOS update; it always takes months, but the drivers always get released eventually.

 

Tbh it's not really news; it's a collection of guesses.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600MHz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144Hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DS218+ with 2 x 4TB Toshiba NAS-ready HDDs (RAID 0)


Nvidia made the same products for other laptop makers at the same time they were "faulty within MacBooks", without issue. So whose fault is it really?

 

(BTW, that was a time when Nvidia was still making top-of-the-market desktop motherboard chipsets as well. My guess is that there was some shared fault that neither company was willing to admit to.)

 

But **shrugs**  

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC // Mobo: ASUS X99 Sabertooth // RAM: 32GB Crucial Ballistix Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


3 hours ago, DrMacintosh said:

-snip-

CUDA? Apple would rather people use OpenGL and programs on macOS don't use CUDA acceleration.

Compute performance? HELLO? Radeon VII?

-snip-

And so Apple machines stay "movie-making machines" that you really don't want to use as render machines (in almost any render task, CUDA is faster than OpenGL). Not even the Radeon VII will save that, assuming there will even be enough of them. But then again, looking at the current "Pro" Macs, I don't believe the Mac Pro will be anything else.

 

But leaving that for other topics, this is still a laughable situation. For real, I can't imagine this as anything other than Apple going "you can't develop drivers for our platform because we say so" and huffing and puffing like a child in a sandbox. Who would it hurt if Nvidia were to develop drivers for macOS? It's not like it costs Apple anything when someone else does the development (I do think Apple gets better prices from AMD by blocking Nvidia, but that doesn't reflect in consumer prices at all, so fuck that).

Normally hardware developers choose which platforms they support and build drivers for them, but Apple is the odd one out: hardware developers must ask permission to build support for Apple's platforms. I can't wait for when/if VR really takes off, to see which VR developer Apple deems "good enough", and then the hundred and one add-ons and other technologies from smaller and smaller companies, until someone notices that macOS really isn't set up to be fit for VR development, like that would be some kind of news.


2 hours ago, DrMacintosh said:

what do their cards offer that AMD cards do not

Better performance, power consumption, and heat output (yes, that's an old complaint about them, but those cards can run hot even when cooled correctly (and no, 15 lines of vents on the top and bottom of your all-in-one computer isn't going to cut it)).

2 hours ago, DrMacintosh said:

CUDA? Apple would rather people use OpenGL and programs on macOS don't use CUDA acceleration.

In a lot of 3D applications it actually improves performance, since the API doesn't have to make assumptions about what the card is: CUDA is designed specifically for Nvidia cards, so the optimization is extremely good (sound familiar, macOS users?). OpenGL has to work across many cards from many vendors, which causes some performance penalties, and some programs run slower with OpenGL than with CUDA (3D and/or hardware-accelerated applications are excellent examples).
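To make that vendor-lock tradeoff concrete, here's a minimal, generic CUDA sketch (a textbook SAXPY, not taken from any of the apps mentioned; everything here is illustrative). It compiles with nvcc straight to Nvidia hardware, which is exactly why it can be tuned so tightly to one vendor's cards, and exactly why it runs on nothing else, whereas OpenGL/OpenCL code has to stay generic across vendors.

#include <cstdio>
#include <cuda_runtime.h>

// Minimal CUDA kernel: each thread handles one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the example short; real apps often manage copies explicitly.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The launch configuration, memory behaviour, and instruction selection can all be tuned against one known hardware family; a cross-vendor API has to leave some of that on the table.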

2 hours ago, DrMacintosh said:

Compute performance? HELLO? Radeon VII?

1. It isn't out yet.

2. We don't know the full story of how well it performs, nor whether what AMD is saying is true; for all we know it's a bit exaggerated to look more powerful (though it probably isn't).

3. Even if it isn't exaggerated (again, probably not), that's probably a controlled test and doesn't reflect a real-world scenario at all. How does it perform in Final Cut Pro X (a popular video editor on Macs, am I right?)? How well does it do in 3D applications? Etc.

3 hours ago, DrMacintosh said:

Ray Tracing? You mean the latest scam from Nvidia?

While I agree with you that ray tracing isn't where Nvidia said it was, the tech for real-time ray tracing is in EXTREMELY EARLY stages and needs more time to develop and mature. The tech itself is fine; Nvidia probably just wanted a way to get a crap-load of hype for their brand-new cards. Back when they were designing the card architecture itself (3 years ago), they probably thought real-time ray tracing would be possible on a consumer desktop by the time the card was released, so they implemented it. Obviously they ran into problems, and were so late into development that they just had to deal with it instead of starting from scratch.

 

That's perfectly fine. If someone told you a tech was going to be exactly like this, and when it arrived two years later it lived up to what that dude told you, then that's OK...

but that's not real life; things don't go to plan.

 

At the time, Nvidia predicted the tech would be ready for prime time, and rightfully so; ray tracing in computers has been around since the 70s, and you'd think roughly 50 years of development and improvement would help, right? Yeah, actually, but what Nvidia did was the equivalent of taking a steak off the BBQ too early and not letting it cook: it looks cooked, but on the inside it's as raw as when it was cut at the butcher's. Ray tracing still has to cook (pun intended) for some more time; maybe in 2 years it will be ready, but it's not now. (The core math really is that old; see the sketch below.)
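To back up the "around since the 70s" point: the heart of a ray tracer is decades-old textbook math. Below is a minimal, hypothetical CUDA sketch of the classic ray-sphere intersection test (the camera setup and names are invented for illustration). What RTX adds isn't new math; it's fixed-function hardware that does these tests, plus BVH traversal, fast enough for real time.

#include <cstdio>
#include <cuda_runtime.h>

// Classic ray-sphere intersection: solve |o + t*d|^2 = r^2 for t.
// Sphere centered at the origin; ray direction assumed normalized.
__device__ bool hitSphere(float ox, float oy, float oz,
                          float dx, float dy, float dz, float r) {
    float b = 2.0f * (ox * dx + oy * dy + oz * dz);
    float c = ox * ox + oy * oy + oz * oz - r * r;
    float disc = b * b - 4.0f * c;
    return disc >= 0.0f && (-b - sqrtf(disc)) > 0.0f;  // nearest hit in front of camera
}

// One thread per pixel: cast a ray through a simple pinhole camera.
__global__ void render(unsigned char *img, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    float dx = 2.0f * x / w - 1.0f, dy = 2.0f * y / h - 1.0f, dz = 1.0f;
    float len = sqrtf(dx * dx + dy * dy + dz * dz);
    // Camera at (0,0,-3) looking at a unit sphere at the origin.
    bool hit = hitSphere(0.0f, 0.0f, -3.0f, dx / len, dy / len, dz / len, 1.0f);
    img[y * w + x] = hit ? 255 : 0;
}

int main() {
    const int w = 64, h = 64;
    unsigned char *img;
    cudaMallocManaged(&img, w * h);
    dim3 block(8, 8), grid((w + 7) / 8, (h + 7) / 8);
    render<<<grid, block>>>(img, w, h);
    cudaDeviceSynchronize();
    // Crude ASCII dump: the sphere shows up as a disc of '#'.
    for (int y = 0; y < h; y += 4) {
        for (int x = 0; x < w; x += 2) putchar(img[y * w + x] ? '#' : '.');
        putchar('\n');
    }
    cudaFree(img);
    return 0;
}

Compile with nvcc and it prints a crude ASCII disc; a real renderer does millions of such tests per frame against whole triangle meshes, which is where the hardware-acceleration argument lives.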

 

But what isn't right is selling an incomplete product that is unable to meet demands. RTX has had trouble from day one, from Windows Update bullshittery to cards failing due to manufacturing defects (some reports say they went out like an Xbox 360 with the... red ring of death... *shudder*).

Not really a scam; more just bad timing and incompetence from multiple parties...

Google defines a scam as

Quote

a dishonest scheme; a fraud.

This doesn't describe their position; they are trying to make the best of a bad situation, like anyone would if something similar happened to them...

 

While I don't like Nvidia for what they did, nor do I want to be a fanboy who only says good things about them and nothing bad, they did the best they could in a worst-case scenario...

They could have done better, but that's in the past; you learn from your mistakes and improve, otherwise you wouldn't be here. I'm sure you have made many mistakes that you've learned from in your life; Nvidia has to improve as well, otherwise we would all be using GeForce 256 GPUs in our systems right now... This is Nvidia's teenage "Oh-shit-I-got-a-parking-ticket-that-I-can't-afford-and-my-parents-will-kill-me-if-they-find-out-and-they-will-since-I'm-really-fucked-right-now" scenario: they fucked up badly and have to deal with the consequences and learn from their mistakes.

 

It happens to everyone, even to Apple...

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


"Nvidia's RTX GPUs are a scam"  kek. sure mate. 

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 x 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


Who cares? If I got snubbed by a company, I wouldn't want to do business with them going forward either.

 

What's so hard to understand about this?


5 hours ago, DrMacintosh said:

provides an inferior product that costs more

????

I spent $2,500 building my PC, and all I do with it is... no games atm; I watch anime at 1080p (finally), watch YT, and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


33 minutes ago, Bananasplit_00 said:

????

Yeah lol, it's kind of ironic to complain about price when a 15" MacBook Pro starts at over $2,000. It may be cheaper for Apple to use AMD, but AMD probably isn't getting much for it, and that better price doesn't seem to be reflected in consumer prices.


8 hours ago, Bouzoo said:

Don't miss Qualcomm vs Apple or Qualcomm vs the rest of the world. 

Don't forget that Qualcomm just got fined for abusing their dominant market position. A $1.2 billion fine, to be more exact. So I think they won't take on the rest of the world right now.


8 minutes ago, Kroon said:

Don't forget that Qualcomm just got fined for abusing their dominant market position. A $1.2 billion fine, to be more exact. So I think they won't take on the rest of the world right now.

Well, it's already on.

The ability to google properly is a skill of its own. 


Although I do not give a flying fuck what Apple and Nvidia do, I am very happy that Nvidia loses Apple customers and also people who have Hackintoshes. I have used Linux for 8 years now, and Nvidia has been the worst company the Linux community has ever dealt with. These are not my words; in fact, these are Linux creator Linus Torvalds' words! Their drivers suck on Linux and cause many problems; Nvidia won't make good drivers themselves, and they also won't open their drivers so that the community can create good ones. So: "Nvidia, fuck you!"

 

P.S. Those last words are not mine either; I quoted them from Linus Torvalds :D

 

 

[Image: nvidia.jpg]

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


This is just Apple being Apple. I'm all for picking and choosing which components you want to put in your product, but this just takes the cake.

Anyway, I'd imagine someone's going to find a way to get around this.


10 hours ago, Rohith_Kumar_Sp said:

"Nvidia's RTX GPUs are a scam"  kek. sure mate. 

 

Yeah, when someone starts making stuff up like this, they can go screw themselves with a rusty fire poker up the backside, sideways. That kind of outright lying makes their entire position suspect.


1 minute ago, CarlBar said:

 

-snip-

I'm pretty sure people make over-exaggerated content just for the views, just like how sites publish both "Why TLJ was awesome" and, an hour later, "Why TLJ is really bad"; people click one or the other.

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 x 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


3 hours ago, mate_mate91 said:

-snip-

The creator of a free and open-source OS shouldn't be telling a company to go f*ck themselves. And of course Nvidia's driver support on Linux is terrible when they're told to go away instead of the community working with Nvidia to make good drivers.


Apple should just buy AMD already..... it isn't like they don't have enough cash


If I recall correctly, wasn't the actual reason for those failing Nvidia GPUs in MacBook Pros a mis-specced capacitor?
I'm not sure whose jurisdiction that would technically fall under in terms of design, since it could have been Nvidia underestimating its own chip's requirements, or Apple. (Granted, I don't think either has ever admitted to that being the fault...)

I don't like defending Nvidia, but regardless, this is kind of a shitty situation for consumers, where nobody wins.

Let's face it: as of right now, Nvidia does have the higher-end GPUs compared to AMD. If you need a video card for your workstation with 24GB of VRAM for some reason, a Quadro is your only modern option; AMD just does not have an equivalent workstation card atm (the FirePro W9100 is the only thing with > 16GB, and that is a 5-year-old card). If a thing I am rendering takes more time because I am forced to use a certain graphics card (and remember, time = money in the professional space), I would get angry at Apple, or whoever it was, that was blocking me from using hardware better suited to my task.

Also, I don't think it's a good idea for Apple to just flat-out block someone from developing something for macOS over a dispute.
Like if they blocked... Western Digital external hard drives from being formatted because of, let's say, hard drive failure rates back in the day... That'd be really bad for us consumers, and I wouldn't care if Seagate made similar drives.


12 minutes ago, Sypran said:

If you need a video card for your workstation with 24GB of VRAM for some reason, a Quadro is your only option; AMD just does not have an equivalent workstation card.

Get an SSG: 16GB of HBM2 + a 2TB NVMe SSD on package beats 24GB of GDDR6 for a lot of use cases. There are hardly any where you need the whole dataset in memory all the time for computations, and the SSD is fast and close enough to act as a sort of L2 cache (a rough sketch of that streaming pattern follows below).

 

Or cash out for an MI60, get an Intel processor for display output, and enjoy that sweet, sweet 32GB of HBM2.
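To illustrate the "SSD as a sort of L2 cache" point above, here's a rough, hypothetical CUDA sketch of the out-of-core pattern being described: streaming a dataset far larger than VRAM through the GPU one chunk at a time ("dataset.bin" and the chunk size are made-up placeholders). On an ordinary card the chunks come across PCIe from host storage; the SSG's on-package NVMe shortens exactly that trip.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Sum-reduce one chunk on the GPU (grid-stride loop, atomic accumulate).
__global__ void sumChunk(const float *data, size_t n, float *out) {
    float local = 0.0f;
    for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += (size_t)gridDim.x * blockDim.x)
        local += data[i];
    atomicAdd(out, local);
}

int main() {
    const size_t chunkElems = 1 << 24;  // ~64 MB of floats per chunk; arbitrary
    std::vector<float> host(chunkElems);
    float *dev, *devSum;
    cudaMalloc(&dev, chunkElems * sizeof(float));
    cudaMalloc(&devSum, sizeof(float));
    cudaMemset(devSum, 0, sizeof(float));

    // "dataset.bin" stands in for a dataset far larger than VRAM.
    FILE *f = fopen("dataset.bin", "rb");
    if (!f) return 1;
    size_t got;
    while ((got = fread(host.data(), sizeof(float), chunkElems, f)) > 0) {
        // Stream one chunk from storage into VRAM, process it, repeat.
        cudaMemcpy(dev, host.data(), got * sizeof(float), cudaMemcpyHostToDevice);
        sumChunk<<<256, 256>>>(dev, got, devSum);
    }
    fclose(f);

    float total;
    cudaMemcpy(&total, devSum, sizeof(float), cudaMemcpyDeviceToHost);
    printf("sum = %f\n", total);
    cudaFree(dev);
    cudaFree(devSum);
    return 0;
}

The point being made is that many workloads scan or reduce their data like this, so "fits in VRAM" matters less than the raw capacity numbers suggest.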


8 minutes ago, Sypran said:

I forgot about that thing's existence! ... but don't those things only work in Linux?

I guess it doesn't, although there's no reason it couldn't work in a VM and serve as a "cloud" solution; it has all the standard APIs, it just needs ROCm to boot. But yeah, it's nowhere near as plug-and-play.


16 hours ago, DrMacintosh said:

Every investor of any stock ever just laughed in your face.

Good to know my sensibilities are right where I want them.

 

If an investor wants you to do something, chances are the exact opposite is what you should do, IF you are actually trying to run a business because you have a passion for the thing that business does.

 

You know: customer service, good products, good value; things that companies should actually care about, rather than chasing quarterly profits to a psychotic degree that is unsustainable and will eventually lead to total fucking economic collapse.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


Knowing Apple, it wouldn't surprise me if their poor design choices were the reason for the Nvidia chips failing.

Be it inadequate cooling, or simply not having adequate spacing between high-current lines and data lines, essentially killing the chip/GPU (seen plenty of those on Louis' channel).

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


5 hours ago, mate_mate91 said:

-snip-

 

 

 

Lol. Are you being serious right now? Between AMD and Nvidia, Nvidia historically put a ludicrously larger effort into Linux support. Yes, their driver is closed, but for most of the Linux development period, AMD/ATI's official driver was so unbelievably bad that you were legitimately better off using the random hacked efforts of a tiny subset of the already small Linux development community in many respects.

 

The tides have turned recently (the last 3 years or so), but the magnitude of effort they put into their own support has been quite literally incomparable. Talk about revisionist history.

 

You can be pissed off at them for not being 'open enough', but the rest is stupid (and yes, Linus Torvalds is a melodramatic crazy person; a more impressive list would be who he hasn't raged out at over the years).

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC // Mobo: ASUS X99 Sabertooth // RAM: 32GB Crucial Ballistix Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


This topic is now closed to further replies.

