Why Doesn't Apple Use Nvidia GPUs?

Crunchy Dragon

Apple got on Nvidia's bad side, just like AMD, Intel, Microsoft (before), Sony, XFX & more... O3O



Oh man, I completely forgot about the epidemic Apple had with iMacs and MacBooks suffering mass GPU failures back in the early 2010s.



Apple has had terrible experiences with Nvidia in the past, notably with the G92 and the 2011 MBP.



Some other probable theories as to why Apple won't go with NVIDIA:

  • NVIDIA likes to tie its ecosystem to its hardware. Apple would like to play the game by its own rules, and since AMD is usually more open than NVIDIA, there's that.
  • NVIDIA doesn't make consumer-grade hardware with compute chops. Or rather, they push a strong gaming use case for their consumer GPUs, and Apple couldn't care less about gaming. AMD still builds its GPUs as a jack of all trades regardless of market segment.
  • NVIDIA may just be a stubborn donkey when it comes to custom orders.

1 hour ago, LOOK OVER HERE said:

Such an honor for Mr. Sebastian to reply to your thread!

happens all the time...

Quote

(Linus's last name is Sebastian).

we know...

 

11 hours ago, M.Yurizaki said:
  • NVIDIA doesn't make consumer-grade hardware with compute chops.

Have they ever said why? Or is it just so they can have a point of difference between their consumer cards and the Quadro series?



20 minutes ago, Sierra Fox said:

Have they ever said why? Or is it just so they can have a point of difference between their consumer cards and the Quadro series?

I should've been clearer. What I meant by that is that the GPUs don't have a lot of FP64 performance compared to AMD's offerings. And I found it's also not limited to consumer GPUs; it's any non-Gx100 GPU. As for why, most of it was to save on die space, and it was a pragmatic move: how many consumer applications need FP64?

 

Though on the flip side, AMD's GPUs may be preferable for GPGPU usage anyway due to that whole ACE thing. But I don't know, I'm not a GPGPU programmer.
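The FP64 point above can be illustrated even without a GPU: double-precision math moves twice the data per element and, on throughput-limited hardware, often runs well below single-precision speed. A minimal NumPy sketch (a CPU stand-in, not a GPU benchmark; timings and the ratio are machine-dependent):

```python
import time
import numpy as np

def gemm_time(dtype, n=512, reps=5):
    """Average the wall time of an n-by-n matrix multiply at the given precision."""
    rng = np.random.default_rng(0)
    a = rng.standard_normal((n, n)).astype(dtype)
    b = rng.standard_normal((n, n)).astype(dtype)
    a @ b  # warm-up so lazy init doesn't pollute the timing
    start = time.perf_counter()
    for _ in range(reps):
        a @ b
    return (time.perf_counter() - start) / reps

t32 = gemm_time(np.float32)
t64 = gemm_time(np.float64)
print(f"FP32: {t32*1e3:.2f} ms  FP64: {t64*1e3:.2f} ms  ratio: {t64/t32:.1f}x")
```

On GPUs the gap is far more dramatic: consumer NVIDIA parts have commonly shipped with FP64 at around 1/32 of FP32 throughput, while the big Gx100 compute dies run closer to 1/2.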


5 hours ago, LOOK OVER HERE said:

Such an honor for Mr. Sebastian to reply to your thread! (Linus's last name is Sebastian). Anyways, I sincerely hope that Apple justifies their price point with either Geforce 2000 series or Mobile Vega GPUs in their new Macbook Pros.

Lol. I didn't make this thread. But nah, unless NVIDIA wants to play by Apple's rules, there won't be any new Macs with NVIDIA graphics cards.



On 3/23/2018 at 11:57 PM, LinusTech said:

As far as I know it's to do with the RMA issues years ago. Until Nvidia is willing to pay back what Apple thinks they are owed, it's unlikely to ever change imo. 

 

^conjecture^

Yes, this is on track. There were multiple issues with their mobile lineup, starting with the 8400GS in 2008, until Apple told them to get rekt.


AMD has better GPUs

Well, in Apple's case, AMD helps with a custom driver.

And AMD does have some pretty decent workstation stuff. Since Apple products mostly work with OpenCL and OpenGL, and Apple's Metal API was influenced by Mantle, Vulkan, and OpenGL, it's not a huge surprise.



Maybe they don't want Nvidia to fully control the market. Otherwise there's no reason I could find: Apple has always made compromises to make its devices slimmer and more powerful, so there's no reason for it not to go with Nvidia GPUs.


14 hours ago, JohnVHSTapes said:

Nvidia is anti-consumer and Apple is anti-consumer. If you put the two together, then no one will buy their products.

Given that Apple is arguably the most profitable and cash-flush company on the planet, I'm pretty sure it does a good job of being pro-consumer.  Can't vouch for NVIDIA as much, but it's selling every GPU it can make at this point.


On 3/23/2018 at 2:38 PM, mr moose said:

I still ponder the idea of apple buying AMD.

I can't see this happening anytime soon. Apple's sales come from iPhones. I can see them buying ARM.


The last MacBook models which used NVIDIA graphics cards were the 2012 Retina models (please correct me if I'm wrong). After that, even though the NVIDIA cards gave more performance than the AMD ones, AMD's used less battery, so Apple went with them in the MacBooks. However, my guess is that after they went with AMD, they couldn't partner with NVIDIA even if they wanted to. So to this day, AMD cards can still be found in computers like the iMac Pro.


11 hours ago, Commodus said:

Given that Apple is arguably the most profitable and cash-flush company on the planet, I'm pretty sure it does a good job of being pro-consumer.

that's not what pro-consumer means



3 hours ago, agello24 said:

I can't see this happening anytime soon. Apple's sales come from iPhones. I can see them buying ARM.

Unless I have overlooked something, Apple's revenue from Mac sales alone could buy AMD outright twice.



1 hour ago, Sierra Fox said:

that's not what pro-consumer means

 

If it's selling a lot of gear, that suggests there are a lot of consumers who like what it's doing, that's all!


If Apple does take over AMD, imagine the progress AMD could make in x86! If Apple can beat everyone at ARM, their processor engineers alongside AMD's x86 expertise and proper funding could definitely spit out an x86 chip superior to Intel's.

 

Although consumers are likely to pay a stupid Apple tax....



15 minutes ago, Commodus said:

 

If it's selling a lot of gear, that suggests there are a lot of consumers who like what it's doing, that's all!

That's still not what pro-consumer means. It's got nothing to do with cash flow.

 

Pro-consumer is about giving the customer choice in what they can buy, from where, and what they can do with the product. It's about protecting consumer rights and avoiding monopolies.

 

Anti-consumer, on the other hand, is a company essentially forcing people to buy only from them, using generally illegal practices to push competitors out of the market. This usually comes in the form of locked-down distribution, so you can only get products from selected retailers, and if left unchecked it turns into a monopoly. Companies that do this generally don't care about their customers or customer choice; they just want to extract the most money possible.

 

Intel has been accused of anti-consumer practices for years, yet they are still more successful financially than AMD. So, going by cash flow, are they actually pro-consumer?



On 3/26/2018 at 6:57 PM, mr moose said:

Unless I have overlooked something, Apples revenue from mac sales alone could buy AMD outright twice.

You may be right. Their R&D doesn't change anything drastic enough to make products look different; it's the same thing year after year with slight changes. I'm still shocked the greedy company cancelled the idea of using sapphire glass instead of Gorilla Glass, which also explains the extra revenue from repairs on cracked screens.


My humble opinion is that Apple didn't want to use Nvidia cards because Nvidia is highly reticent about supporting compute technologies other than CUDA. If Apple had gone full CUDA, it would have been at the mercy of Nvidia; instead it chose to rely on AMD (and Intel) because AMD was moving forward with its OpenCL implementation. Nvidia only enabled OpenCL 2.0 last year, three years after the standard was ratified.
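The lock-in argument above is really an architecture decision: if you code against a vendor-neutral compute interface (OpenCL then, Metal now) instead of CUDA directly, the GPU vendor underneath stays swappable. A toy Python sketch of that idea (all class and function names here are hypothetical illustrations, not real Apple or Khronos APIs):

```python
from abc import ABC, abstractmethod

class ComputeBackend(ABC):
    """Vendor-neutral interface: callers never touch CUDA or OpenCL directly."""
    @abstractmethod
    def vector_add(self, a, b):
        ...

class OpenCLBackend(ComputeBackend):
    def vector_add(self, a, b):
        # Real code would enqueue an OpenCL kernel; pure Python stands in here.
        return [x + y for x, y in zip(a, b)]

class CUDABackend(ComputeBackend):
    def vector_add(self, a, b):
        # A CUDA path could be dropped in without touching any caller code.
        return [x + y for x, y in zip(a, b)]

def pick_backend(available: set) -> ComputeBackend:
    # Prefer the open standard; fall back to the proprietary one.
    return OpenCLBackend() if "opencl" in available else CUDABackend()

backend = pick_backend({"opencl", "cuda"})
print(backend.vector_add([1, 2], [3, 4]))  # [4, 6]
```

The design point: all application code depends only on `ComputeBackend`, so dropping a vendor (as Apple did with Nvidia) means replacing one backend class, not rewriting every kernel call site.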

