
Apple doesn't want to support Nvidia in macOS

DrMacintosh

Well... I guess I'll just stick with 10.13.6 for the time being, until the two companies start properly cooperating with each other so they don't leave behind those of us who still use Nvidia cards on our Macs. This is an especially big blow to the Hackintosh community, myself included, since I personally use a 1050 Ti. I'm not planning on getting an AMD card for now, at least not until I actually need an upgrade.

Desktops

 

- The specifications of my almighty machine:

MB: MSI Z370-A Pro || CPU: Intel Core i3 8350K 4.00 GHz || RAM: 20GB DDR4  || GPU: Nvidia GeForce GTX1070 || Storage: 1TB HDD & 250GB HDD  & 128GB x2 SSD || OS: Windows 10 Pro & Ubuntu 21.04


16 hours ago, DrMacintosh said:

That’s how any successful company operates. Investors own and control everything. Nothing happens without investor approval.

Maybe after it's already large, but to grow you first need to really listen to your customers.

 

 

These are two companies I really don't like, not liking each other: Apple for hardware that's made to overheat and die prematurely at ridiculous prices, Nvidia for being absolute dicks about everything.

Buying Apple and expecting support for anything that isn't theirs is strange; they lock everything down.

In an ideal world they would both be slapped upside the head until they started to behave.


Does Nvidia have a far more dominant market share or is that just me? It feels odd that AMD is the graphics vendor of choice for Mac products, which are themselves very popular, when AMD doesn't stack up to Nvidia from what I remember.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec | PSU misconceptions, protections explained | group reg is bad


1 minute ago, fasauceome said:

Does Nvidia have a far more dominant market share or is that just me?

They do, but that also means they feel they can be more controlling and stubborn. AMD, on the other hand, is perfectly happy to let Apple order custom designs and to accept Apple's terms.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

1 minute ago, DrMacintosh said:

They do, but that also means they feel they can be more controlling and stubborn. AMD, on the other hand, is perfectly happy to let Apple order custom designs and to accept Apple's terms.

Except that, given the market share, it will disadvantage consumers much more than it will Nvidia. People are already fuming at Apple over the current state of Mojave killing the Nvidia performance they need for their job/work. It's a big middle finger to the customers who already own and run Nvidia cards. Regardless of their opinion of or relationship with Nvidia, they should be putting their customers first; I thought that was Apple's point of difference.

🌲🌲🌲

 

 

 

◒ ◒ 


2 minutes ago, Arika S said:

People are already fuming at Apple over the current state of Mojave killing the Nvidia performance they need for their job/work.

I find that hard to believe. eGPUs have only been a thing since High Sierra, and I don't believe any appreciable number of genuine Mac users (i.e. not Hackintoshers) depend on Nvidia GPU technologies in any way.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

20 minutes ago, DrMacintosh said:

eGPUs have only been a thing since High Sierra

So? The fact that eGPUs are now natively supported in macOS means that from now on anyone can only choose AMD, stripping them of consumer choice.

 

 

Quote

What we found was support inside the Spaceship for the idea, but a lack of will to allow Nvidia GPUs. We've spoken with several dozen developers inside Apple, obviously not authorized to speak on behalf of the company, who feel that support for Nvidia's higher-end cards would be welcome, but disallowed quietly at higher levels of the company.

"It's not like we have any real work to do on it, Nvidia has great engineers," said one developer in a sentiment echoed by nearly all of the Apple staff we spoke with. "It's not like Metal 2 can't be moved to Nvidia with great performance. Somebody just doesn't want it there."

One developer went so far as to call it "quiet hostility" between long-time Apple managers and Nvidia.

For sure, somebody at Apple in the upper echelons doesn't want Nvidia support going forward right now. But, even off the record, nobody seemed to have any idea who it is. The impression we got is that it was some kind of passed-down knowledge with the origin of the policy lost to the mists of time, or an unwritten rule like so many in baseball.

Two years ago, pre-eGPU support, this block may have made at least a modicum of sense. Any Macs with PCI-E slots were aging, and the user base was dwindling through attrition alone. But, the drivers are available for High Sierra and are getting updated to this day —and we can testify that they still work great in a 5,1 Mac Pro, including the 1000-series cards.

The Nvidia driver can be shoe-horned onto High Sierra machines who want a Nvidia card in an eGPU. We're not going to delve into it here, but there is a wealth of information over at eGPU.io, if you're so inclined. And, don't upgrade to Mojave if you do so.

This decision makes absolutely no sense with eGPUs now being explicitly supported in macOS. They work fine in Windows, so it's not a technical limitation. Some tasks perform better on AMD, and some on Nvidia, it is a fact of silicon. There is no reason beyond marketing and user-funneling to prohibit use of the cards on a software level.

No, there aren't a ton of eGPU installs. Yes, a good portion of those users are fine with AMD cards. But, it is absolutely overly user-hostile to not allow Nvidia to release the drivers not just for future eGPU use, but for the non-zero percent of those users who are keeping the old Mac Pro alive. And if this is some kind of ancient Apple secret or preserved grudges that are preventing it, that's even worse.

https://appleinsider.com/articles/19/01/18/apples-management-doesnt-want-nvidia-support-in-macos-and-thats-a-bad-sign-for-the-mac-pro

 

So even some of the people at Apple think it's absolutely crazy to block Nvidia. Unless there is an actual technical reason why they cannot support it, Apple is abusing the power that comes with such a locked-down ecosystem, and abusing their customers. No other company would be able to get away with this, so why should Apple? Because they are worth a trillion dollars? If that's the answer, then it's clear they don't care about their customers at all.

 

Imagine buying a prebuilt computer from HP that comes with an Nvidia GPU. Now say you want to upgrade, but you want to switch to AMD, and HP has locked down their system to only accept Nvidia cards; people would lose their goddamn minds. It's only acceptable for Apple because their customers are used to being told what they should do and what they should have, instead of actually having a choice.
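
As a side note on the driver shoe-horning the quoted article mentions, here is a rough sketch (in Swift, since that's handy on a Mac) of a sanity check you could run before touching anything. The "com.nvidia.web" bundle-ID prefix is my assumption about what the web-driver kexts use, so verify it against your own kextstat output; eGPU.io has the actual instructions.

import Foundation

// Run a command-line tool and return whatever it prints to stdout.
func run(_ path: String, _ args: [String] = []) -> String {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: path)
    task.arguments = args
    let pipe = Pipe()
    task.standardOutput = pipe
    do { try task.run() } catch { return "" }
    task.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    return String(data: data, encoding: .utf8) ?? ""
}

// 1. The web drivers never shipped for Mojave; High Sierra (10.13.x) is the
//    last release they support.
let version = run("/usr/bin/sw_vers", ["-productVersion"])
    .trimmingCharacters(in: .whitespacesAndNewlines)
print(version.hasPrefix("10.13")
    ? "Still on High Sierra (\(version)), so a web driver is at least possible."
    : "Running \(version); no Nvidia web driver exists for this release.")

// 2. Check whether a web-driver kext is currently loaded.
//    "com.nvidia.web" is an assumed bundle-ID prefix; confirm against your system.
let nvidiaLoaded = run("/usr/sbin/kextstat").contains("com.nvidia.web")
print(nvidiaLoaded ? "An Nvidia web-driver kext appears to be loaded."
                   : "No Nvidia web-driver kext is loaded.")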

🌲🌲🌲

 

 

 

◒ ◒ 


18 minutes ago, DrMacintosh said:

I find that hard to believe. eGPUs have only been a thing since High Sierra, and I don't believe any appreciable number of genuine Mac users (i.e. not Hackintoshers) depend on Nvidia GPU technologies in any way.

Not sure what percentage of the user base Mac desktops are... but the K5000 and GTX 680 were available in genuine Macs and have official Mojave Metal support (unlike the WX 9100, one of the trashcan GPUs). Also, while those are quite old by now... the 2014 MacBook Pro used a 750M. Not sure if there are any other important contemporaries...
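
If anyone wants to see what their own machine reports, here's a minimal Swift sketch against the public Metal API (nothing in it is vendor-specific; whichever GPUs the OS has working drivers for are what show up, which is rather the point):

import Metal

// MTLCopyAllDevices() lists every GPU macOS exposes through Metal,
// integrated, discrete, or eGPU, regardless of vendor.
let devices = MTLCopyAllDevices()

if devices.isEmpty {
    print("macOS reports no Metal-capable GPU at all.")
}

for device in devices {
    print("GPU: \(device.name)")
    print("  low power (integrated): \(device.isLowPower)")
    print("  headless:               \(device.isHeadless)")
    print("  removable (eGPU):       \(device.isRemovable)")  // available on 10.13+
}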

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Just now, Curufinwe_wins said:

Not sure what percentage of the user base Mac desktops are... but the K5000 and GTX 680 were available in genuine Macs and have official Mojave Metal support (unlike the WX 9100, one of the trashcan GPUs). Also, while those are quite old by now... the 2014 MacBook Pro used a 750M. Not sure if there are any other important contemporaries...

Those Macs continue to get support because Apple isn't just going to leave paying customers in the dust like that. Most Mac owners run on iGPUs, and the number of AMD dGPUs dwarfs the number of Nvidia cards.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

1 minute ago, DrMacintosh said:

Those Macs continue to get support because Apple isn't just going to leave paying customers in the dust like that. Most Mac owners run on iGPUs, and the number of AMD dGPUs dwarfs the number of Nvidia cards.

I don't doubt either of those statements; I was just mentioning it. I would be surprised if the share of Apple machines running Nvidia was less than 5%, but 5% with consistent issues (if they are indeed occurring) makes for a very loud minority.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Gonna be interesting once Apple creates their own GPU

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


32 minutes ago, valdyrgramr said:

 

I wouldn't say it's exactly a scam, but you are paying into early adopter fees followed by bs pricing on the 2080 Ti.  The only reason that card costs as much as it does is due to 0 competition and people willing to pay that much.  I guess that is a scam to some people, but still...

The same goes for any new tech and its early adopters; it doesn't mean it's a scam.

 

 

 

Spoiler

AMD 5000 Series Ryzen 7 5800X| MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce GTX 3080Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


1 minute ago, valdyrgramr said:

I'm not saying it is a scam, but some people might consider it that.

Well, as a YouTuber with a channel dedicated to tech (well, more of an Apple-dedicated channel, a "he who lives in a glass house" situation), he should know better than to call it a scam.

 

Spoiler

AMD 5000 Series Ryzen 7 5800X| MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce GTX 3080Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


29 minutes ago, valdyrgramr said:

So, I listened to his argument slightly.  He's arguing the pricing in the context I mentioned partly, but also arguing that it is currently useless.

To be perfectly frank... the argument is bullshit. The performance gain isn't very large... but even without RT, the 20-series slotted in at the same price-to-performance as the 10-series cards they actually replaced performance-wise. And they are more power efficient at the same performance, and they have new hardware blocks that could give dramatically better results in the future.

 

It's quite reasonable to be disappointed that the performance didn't grow enough... But this isn't Intel offering less for a higher price in many situations. This isn't the stagnant hardware performance (with basically only software-locked optimizations) from the 290 to 390 to 480 to 580. In case anyone is wondering: the 290 came out 5 1/2 years ago. The 390 launched at an official MSRP higher than what the 290 8GB versions were selling for (and had been selling for over the previous 6 months) on the market at the time. Those same 290 8GB cards were flashable to exactly 390 performance. [Yes, I obviously know Polaris isn't Hawaii, but the performance was nearly identical and the US pricing over time was nearly contiguous. You could buy an aftermarket Sapphire 290 8GB in April-August 2015 for 270 dollars.]

 

And yes, those primarily software-driven (and general computing-trend) performance gains brought the 290 up from just below the 970 to the point where the 580 matches or slightly exceeds the 1060 in most games (a 'generous' gain of around 20% in 5 years).

 

 

TL;DR: if someone wants to complain about the 20-series launch, they should go back and check the consistency of their emotional state against the AMD 300 and 400 series.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


On 1/19/2019 at 1:51 AM, DrMacintosh said:

Apple went with AMD GPUs publicly because of performance per watt issues, but the real reason is anybody's guess.

Spoiler

 - Reasons -


 

To be honest, Apple may not be doing what they should in terms of taking the high road, but it's precisely what most normal people would do. If someone cons you, you neither come back for seconds nor help enable them to con others who support you. So I'm inclined to agree with DrMac that Nvidia made its own bed.

However, I've been of the opinion for a while now that people should really start talking about making Apple allow others to build complete hardware solutions (computers, laptops, phones, etc.) that can run their OS. If that were the case, Apple wouldn't be able to get away with holding grudges quite so much.


15 minutes ago, MoonSpot said:

 - Reasons -


 

To be honest, Apple may not be doing what they should in terms of taking the high road, but it's precisely what most normal people would do. If someone cons you, you neither come back for seconds nor help enable them to con others who support you. So I'm inclined to agree with DrMac that Nvidia made its own bed.

However, I've been of the opinion for a while now that people should really start talking about making Apple allow others to build complete hardware solutions (computers, laptops, phones, etc.) that can run their OS. If that were the case, Apple wouldn't be able to get away with holding grudges quite so much.

As I mentioned previously, and as another commented... the specific Nvidia hardware used in the 'faulty MacBooks' was used in other laptops at the time without issue. We already know the issue was a poorly spec'd capacitor; however, it isn't clear whether Nvidia's spec was insufficient or whether Apple decided to ignore the spec because they thought they knew better. Neither company has admitted fault in the incident at any point. Hence the grudge at upper management, I expect.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


20 minutes ago, Curufinwe_wins said:

As I mentioned previously, and as another commented... the specific Nvidia hardware used in the 'faulty MacBooks' was used in other laptops at the time without issue. We already know the issue was a poorly spec'd capacitor; however, it isn't clear whether Nvidia's spec was insufficient or whether Apple decided to ignore the spec because they thought they knew better. Neither company has admitted fault in the incident at any point. Hence the grudge at upper management, I expect.

It doesn't really matter what we, or others, think factually happened. Apple says, and probably feels, that it was Nvidia's fault, and the ball here is in their court, so that's that. You can't go into someone else's house and tell them what to watch and how to behave. If Apple thinks they got screwed, then that is the 'reality' for them, and they act accordingly.

It's not like they've had only one such impasse in their relationship, as DrMac has elaborated on in the OP.

 

If I were Apple I'd be using Radeon too, but I'd be trying to hook into a console-level experience with PlayStation, not the typical PC gaming experience. If I could streamline porting as much as possible (even GPU hardware-wise), that's what I'd do to fill up the money bin more. Give the option to game, but not PC-level gaming.


10 hours ago, Blademaster91 said:

And of course the driver support for Nvidia is terrible in Linux when they're told to go away instead of the community working with Nvidia to make good drivers.

It's the other way around! He told Nvidia that AFTER years of trying to work with them; Nvidia refused to work with them and made shitty drivers. That's why he said it.

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


1 minute ago, valdyrgramr said:

The only cards I have a problem with, in terms of pricing, are the 2080 Ti and the RTX Titan.  Those are just priced that high for BS reasons, but again, people will still pay that much.  I think he was, in a lazy way, arguing that as well.  He might have argued against the rest of Turing too, but still, I didn't bother watching his lazy rant.  They're simply pricing those that high because of zero competition.  AMD didn't ask 1200+ "just because" for any of those cards.  In fact, has AMD ever asked 1200+ for a card?  I know Nvidia usually asks over a grand for the Titans.  But they put the 2080 Ti at 1200 USD minimum because of no competition plus people willing to pay that much.  Jensen is just getting greedy at that point.  AMD usually only asks a ridiculous amount for a GPU when they pretty much put two of them on a single PCB and make a brand-new space heater for your home: 295X2, Pro Duo, 7990, and so on.  My guess is that Nvidia wanted to ask a ridiculous amount for the RTX Titan, so they put the 2080 Ti at the Titan's usual price point to make the Titan's price seem more reasonable.  As for his "scam" argument against ray tracing, I would just call it early adoption.

ATI did not, for pure gaming cards. The closest thing was when AMD launched the Vega Frontier Edition for 1000/1199 (when a similar-performance gaming card from Nvidia was less than 600...), and that card was basically the exact equivalent of a Titan card, minus Nvidia's much better launch drivers. The highest a pure AMD/ATI gaming card went in insane markup was the 290X going for as much as 900 dollars, but that was more a mining craze than AMD's fault. So that one was a pretty big no-no for AMD, and fairly recently in fact.

 

 

 

Back when AMD held the CPU crown, they had MSRPs that high too; they tried to sell a number of different CPUs at or above 1000 dollars MSRP, as Intel had done.

 

I certainly don't recommend the 2080 Ti or above... but if AMD could get off their ass and compete with a design, we might not have this issue. It's hard to call something that is dramatically faster than all other competition a scam, but diminishing returns are certainly a huge issue.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


8 hours ago, Curufinwe_wins said:

 

 

Lol. Are you being serious right now? Between AMD and Nvidia, Nvidia historically put ludicrously more effort into Linux support. Yes, their driver is closed, but for most of the Linux development period AMD/ATI's official driver was so unbelievably bad that in many respects you were legitimately better off using the random hacked efforts of a tiny subset of the already small Linux development community.

 

The tides have turned recently (last 3 years or so), but the magnitude of effort for their own support has been quite literally incomparable. Talk about revisionist history. 

 

You can be pissed off at them for not being 'open enough', but the rest is stupid (and yes, Linus Torvalds is a melodramatic crazy person; a more impressive list would be who he hasn't raged out at over the years).

I am talking about AMD, not ATI. Since AMD bought ATI, their graphics cards and software have improved by orders of magnitude. ATI drivers were bad in Windows too, but AMD put a LOT of effort into it. I remember when I bought the HD 7970: it was less powerful than the GTX 680, on par with the GTX 670, and after 2 years my card had better FPS than my friend's GTX 770, which was basically a GTX 680 about 10% faster. Today AMD drivers are great for Linux and we have open-source ones too. There is no need to have Nvidia anywhere!

 

macOS: checked!

Windows: in DX12/Vulkan AMD is better. AMD has FreeSync, whereas Nvidia's G-Sync costs you about $150 extra and is worse than FreeSync. AMD has so many features in their drivers: OC, fan profiles, and so on.

Linux: if you don't want driver problems, you must not have an Nvidia GPU.

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


14 minutes ago, MoonSpot said:

It doesn't really matter what we, or others, think factually happened. Apple says, and probably feels, that it was Nvidia's fault, and the ball here is in their court, so that's that. You can't go into someone else's house and tell them what to watch and how to behave. If Apple thinks they got screwed, then that is the 'reality' for them, and they act accordingly.

It's not like they've had only one such impasse in their relationship, as DrMac has elaborated on in the OP.

 

If I were Apple I'd be using Radeon too, but I'd be trying to hook into a console-level experience with PlayStation, not the typical PC gaming experience. If I could streamline porting as much as possible (even GPU hardware-wise), that's what I'd do to fill up the money bin more.

To be perfectly honest... I'd be willing to bet the reason Apple moved away has nothing to do with being conned, and everything to do with not being able to dictate terms the way they can with AMD, including telling them when to take the fall, regardless of responsibility.

 

And yes, that is business reality. But it isn't the first time Apple has had that sort of relationship with a supplier.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


2 minutes ago, mate_mate91 said:

I am talking about AMD, not ATI. Since AMD bought ATI, their graphics cards and software have improved by orders of magnitude. ATI drivers were bad in Windows too, but AMD put a LOT of effort into it. I remember when I bought the HD 7970: it was less powerful than the GTX 680, on par with the GTX 670, and after 2 years my card had better FPS than my friend's GTX 770, which was basically a GTX 680 about 10% faster. Today AMD drivers are great for Linux and we have open-source ones too. There is no need to have Nvidia anywhere!

 

macOS: checked!

Windows: in DX12/Vulkan AMD is better. AMD has FreeSync, whereas Nvidia's G-Sync costs you about $150 extra and is worse than FreeSync. AMD has so many features in their drivers: OC, fan profiles, and so on.

Linux: if you don't want driver problems, you must not have an Nvidia GPU.

Sounds like you haven't actually used them recently.

 

FreeSync (1) is 100% a straight-up inferior product, with awful quality control (mainly because there is no quality control; fixing that was one of the primary aims of FreeSync 2).

 

Other than the fact that AMD doesn't make competitive products above the 2060's performance range, certainly there is no need. 

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


On 1/19/2019 at 7:51 AM, DrMacintosh said:

What do you think? Should Apple allow Nvidia to keep publishing drivers? Should Apple do business with a company that does not respect them? 

Who hasn't Nvidia pissed off yet?
They pissed off M$ with the original Xbox and their unwillingness to shrink the GPU and offer good prices.

They probably pissed off Sony with their shitty G70-based RSX...

 

Especially if you compare it to the Xbox 360 Slim, which was basically the first console to use a single chip for CPU and GPU (well, not quite, but almost)...

 

It's only a question of time until they piss off Nintendo and we see a Switch 2 with an AMD Ryzen-based CPU...

"Hell is full of good meanings, but Heaven is full of good works"


4 minutes ago, valdyrgramr said:

Vega 20 is alleged to be the high end, which the Radeon VII is part of, while Navi is going to be like the x90-and-down cards.

No, that's bullshit.

 

There is a different reason for that:
A problem was found with Navi, and the "Radeon 7" is something they pulled out of their ass at the last minute to have something to present.

 


I trust AdoredTV on AMD matters more than other people, and he claimed that Navi is awesome and the price low.

But they found an issue that required a "retape"; originally Navi should have been released in 3 months or so, but the re-tape delayed that by 3 months...

 

Quote

I feel like part of the cost was HBM2.

Many say that AMD loses money on most Vega products because of HBM2...

 

And that the Radeon 7 costs 750 dollars to make...

"Hell is full of good meanings, but Heaven is full of good works"


7 minutes ago, Curufinwe_wins said:

To be perfectly honest... I'd be willing to bet the reason Apple moved away has nothing to do with being conned, and everything to do with not being able to dictate terms the way they can with AMD, including telling them when to take the fall, regardless of responsibility.

 

And yes, that is business reality. But it isn't the first time Apple has had that sort of relationship with a supplier.

Looking at the history, I would say there's a lot beneath the surface. EVGA got at least one very cold shower from Apple around the early 2000s when supplying Quadro cards for the Mac Pro. IIRC the Mac Pro back then had UEFI with locked hardware, and while EVGA was supplying Apple with Quadros that Apple resold as upgrades, Apple included its own driver CD containing a UEFI update that whitelisted that model of Quadro for use in the Mac Pro. The cold shower part: Apple charged twice as much for that Quadro as EVGA did, and the only difference was the driver CD (IIRC Apple: ~1000-1600€, EVGA: ~600-1000€). Of course the GPU itself was no different, so at least one company where I was a trainee back then bought one Quadro from Apple and 11 Quadros from the local computer store, because all you really needed was that one driver CD to update every Mac Pro and make it whitelist EVGA Quadros. And if I remember the rumours from back then correctly, Apple wasn't really happy when Nvidia refused to make slightly different chips for EVGA (which wasn't really happy about the Apple tax either, so it probably wouldn't have mattered whether Nvidia made those chips), chips that would have let EVGA supply Apple with Quadros that could be differentiated from normal EVGA Quadros and so made it impossible to upgrade Mac Pros with GPUs bought elsewhere.

 

So there can be a lot more going on than what shows on the surface. I wouldn't be surprised if it was the same kind of thing here: Nvidia refusing to make something Apple demands (say, Nvidia refusing to drop OpenGL/CUDA/whatever support, which could really piss off Apple, who likes to dictate these things, or Nvidia still refusing to supply Apple with custom chips). I could very well see that, now that there are eGPUs, Macs could be used for gaming, and considering Nvidia's "consumer" line is aimed so much more at gaming, they would keep certain API support and features in their drivers to let people game on Macs more easily. And as long as all eGPUs use the same port and roughly the same connection standards, the only way for Apple to stop more people hacking and cracking Macs to game on them better is to not allow Nvidia drivers on Macs. Meanwhile, AMD is ready to strip something like Vulkan support and legacy OpenGL support from their drivers to make them "Apple certified".


Guest
This topic is now closed to further replies.

