Apple quietly ditches AMD

Jumballi
7 hours ago, RorzNZ said:

They're still going to make Intel-based Macs with AMD GPUs though. I think everyone expects Apple GPUs with Apple CPUs because they are integrated.

 

If you go for a car based on 0-60 times, it's called a Honda, and there's a reason why people don't buy them.

You mean the same Honda that has sold over 5 million units every year since 2016? Yeah, nobody is buying them.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


Nobody's talking about them mentioning "Nvidia GPUs" on Mac?

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


5 hours ago, hishnash said:

So in many ways, moving to Apple Macs with 120Hz displays will make it the largest high-refresh-rate userbase within a week of the first devices shipping.

Yeah, refresh rate isn't GPU performance.

 

And by the way, the original 1998 iMac was 112Hz, and it didn't exactly catapult Apple to the gaming forefront.


2 minutes ago, emosun said:

Yeah, refresh rate isn't GPU performance.

So the iPad will play esports titles (that ship for it), like PUBG, at a solid 120Hz.

And more importantly, the latency on the iPad is very, very low. Apple has been pushing this over raw frame rate, since the main reason for a high frame rate is low-latency input with the Pencil: drawing on screen with the Pencil has less than one frame of lag (even if the next frame is already mid-render when a pencil stroke happens, the stroke will be included in the output buffer; the GPU gets direct access to these inputs without needing to pipe them through the OS).
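
For a sense of the numbers involved, here's a quick back-of-the-envelope sketch (my own arithmetic, not Apple's actual pipeline) of why latching input into a frame that's already in flight at 120Hz keeps lag under one frame:

```python
# Worst-case input-to-display lag: sampling input only at the start of a
# frame can cost up to two refresh intervals (one frame rendered + one
# scanned out), while latching the freshest input into the frame that is
# already mid-render bounds the wait to a single refresh interval.
for hz in (60, 120):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz: frame time {frame_ms:5.2f} ms | "
          f"start-of-frame sampling worst case {2 * frame_ms:5.2f} ms | "
          f"late-latched worst case {frame_ms:5.2f} ms")
```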
 

7 minutes ago, Mateyyy said:

Nobody's talking about them mentioning "Nvidia GPUs" on Mac?

So interestingly, we did see more updates to user-space drivers that might enable better support for Nvidia GPUs (but only for applications that `trust` Nvidia; this is all about loading third-party .dylibs into your application). This would require Nvidia to write such drivers, but due to how they work, Apple couldn't block them even if it tried.
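
For anyone unfamiliar with the mechanism being described: a user-space driver is just a shared library that the application itself maps into its address space. A minimal sketch of that general idea (the library name and entry point below are hypothetical, purely for illustration):

```python
import ctypes

# Hypothetical library and symbol names, for illustration only.
# CDLL uses dlopen() under the hood, so the *application* loads the
# vendor's code into its own process; there is no kernel extension
# for the OS vendor to approve or block.
driver = ctypes.CDLL("libvendor_gpu_driver.dylib")
driver.vendor_init.restype = ctypes.c_int
if driver.vendor_init() != 0:
    raise RuntimeError("user-space driver failed to initialise")
```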

 

17 minutes ago, mr moose said:

what isn't true is the idea that the Apple ecosystem is actually beneficial over Windows for music or sound production.

In raw features it's not better than a dedicated kernel build of Linux, but it is a lot better than Windows in both latency and throughput. The advantage it has over Linux is, of course, software compatibility.

 


35 minutes ago, mr moose said:

Whilst I agree in sentiment, and for the most part in reality, what isn't true is the idea that the Apple ecosystem is actually beneficial over Windows for music or sound production. In fact, many specialists in the field argue it is worse since they dumped FireWire.

My understanding is that back in the day, Apple had the edge in graphics due to OOTB Adobe RGB colour space coverage certification, which was still pretty rare on Windows displays, and the edge in audio because Mac OS's sound stack had significantly lower latency than Windows.

 

Now that colour-space-certified displays are much more commonplace and Windows sound cards have all but caught up on latency, there's really no benefit to using one over the other.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


10 hours ago, emosun said:

all I know is this is just another nail in the coffin in the forever-long battle of not being able to play games on a Mac

Yeah, this is probably true.

 

I did not read the article, but would native Apple graphics support DirectX 12 and the like? Or would they just support Metal?

 

Too bad; I'm pretty happy with the Radeon Pro Vega 20 graphics in my MacBook Pro. It actually runs some games pretty well in Boot Camp.


I wonder how on Earth Apple will scale their GPU chips to compete with something like the Radeon Pro Vega II Duo, which they currently offer in the Mac Pro. That thing is a computational monster. How are they going to match it with what they've done so far for mobile devices, which was an embedded-only solution? It's not like you can take whatever works for you in mobile and just scale it to wherever you need it. Things are not that simple, especially when the entire mobile design process is about efficiency, and what you now want is raw performance, with elevated power consumption and efficiency somewhere around third place in importance.


Apple must be an only child... they refuse to play nice with anything that isn't Apple.


11 hours ago, RorzNZ said:

If you go for a car based on 0-60 times, it's called a Honda, and there's a reason why people don't buy them.

You mean the fourth-best-selling car brand in the world? Yeah, okay.

QUOTE ME IF YOU WANT A REPLY!

 

PC #1

Ryzen 7 3700x@4.4ghz (All core) | MSI X470 Gaming Pro Carbon | Crucial Ballistix 2x16gb (OC 3600mhz)

MSI GTX 1080 8gb | SoundBlaster ZXR | Corsair HX850

Samsung 960 256gb | Samsung 860 1gb | Samsung 850 500gb

HGST 4tb, HGST 2tb | Seagate 2tb | Seagate 2tb

Custom CPU/GPU water loop

 

PC #2

Ryzen 7 1700@3.8ghz (All core) | Aorus AX370 Gaming K5 | Vengeance LED 3200mhz 2x8gb

Sapphire R9 290x 4gb | Asus Xonar DS | Corsair RM650

Samsung 850 128gb | Intel 240gb | Seagate 2tb

Corsair H80iGT AIO

 

Laptop

Core i7 6700HQ | Samsung 2400mhz 2x8gb DDR4

GTX 1060M 3gb | FiiO E10k DAC

Samsung 950 256gb | Sandisk Ultra 2tb SSD


3 hours ago, Mateyyy said:

Nobody's talking about them mentioning "Nvidia GPUs" on Mac?

I don't think there's any way Apple is going to go Nvidia. Apple wants to control the graphics drivers, and Nvidia wants to control the graphics drivers, so it's just going to be a mess on that end.


4 minutes ago, spartaman64 said:

I don't think there's any way Apple is going to go Nvidia. Apple wants to control the graphics drivers, and Nvidia wants to control the graphics drivers, so it's just going to be a mess on that end.

That's what I was thinking too. Considering their long-running feud, it seemed quite odd for them to mention Nvidia alongside AMD and Intel for graphics.



8 hours ago, Orangeator said:

Hold on just one second, actually read the article...

Um, I don't know about you, but to me that sounds like Epic Games knew all the way back in December 2018 that ARM was coming to the Mac... @gabrielcarvfer

They were talking about Fortnite.

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


3 minutes ago, gabrielcarvfer said:

I guess it is for Jurassic hardware with Nvidia chipsets, like my 2010 Mac Mini. :)

Haha yeah, that's also true. Thanks for pointing it out!



Really, it's their custom GPU versus a dedicated one; I'm not sure how they can compete there. Transition period or not, the comparison after that should be quite interesting.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


10 hours ago, VegetableStu said:

Apple has committed to keeping Thunderbolt, so they may go for the approach of "need more horsepower? Use an eGPU with Radeon graphics; it charges your MacBook as well."

Yep, I fully expect this to be the case; there's been a pretty clear trend of Apple just moving things out of their devices and onto unnecessary external dongles.

 

I also don't expect support for AMD cards to go anywhere unless they want to completely kneecap the Mac Pro only a year after release. It's already going to be rough with the CPUs...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


5 hours ago, hishnash said:

 

In raw features it's not better than a dedicated kernel build of Linux, but it is a lot better than Windows in both latency and throughput. The advantage it has over Linux is, of course, software compatibility.

 

Nope. There are absolutely no drawbacks in latency for professional sound in Windows compared with any version of Mac OS. Many, many professional studios run Windows.

 

Apple used to have an advantage with a simply better approach, but they lost that lead when they dropped features and Windows became just as good in the hardware department.

 

5 hours ago, Master Disaster said:

My understanding is that back in the day, Apple had the edge in graphics due to OOTB Adobe RGB colour space coverage certification, which was still pretty rare on Windows displays, and the edge in audio because Mac OS's sound stack had significantly lower latency than Windows.

 

Now that colour-space-certified displays are much more commonplace and Windows sound cards have all but caught up on latency, there's really no benefit to using one over the other.

For audio work the graphics hardware doesn't have any effect, and as far as latency goes, that hasn't been an issue for quite some time.


Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


The audio problem in Windows doesn't have much to do with sound cards; it's mostly the OS-level audio stack, which is quite inefficient and has a lot of high-latency issues. And I think it got even worse when Microsoft decided to sack the hardware-accelerated audio of old and replace it with a software stack when Vista was released. Now everything goes through the Windows audio stack, and I wonder how much control audio hardware even has over audio these days...
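
To put rough numbers on that: the floor on audio latency is set by buffer size over sample rate, and every extra re-buffering stage in a software stack adds at least one more buffer of delay on top. A quick illustrative calculation:

```python
# Latency floor from buffering alone, before any OS mixer overhead.
SAMPLE_RATE = 48_000  # Hz

for frames in (64, 128, 256, 512):
    one_way_ms = frames / SAMPLE_RATE * 1000
    # A software mixing stage that re-buffers the stream adds at least
    # one more buffer of delay on top of these figures.
    print(f"{frames:>3}-frame buffer @ 48 kHz: {one_way_ms:5.2f} ms one-way, "
          f"~{2 * one_way_ms:5.2f} ms round trip")
```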


14 minutes ago, VegetableStu said:

my only concern is they'll start making eGPUs or external compute units first-party as well, at which point they'd truly ditch AMD

Maybe, but the point here seems to be that they prefer an SoC, rather than that they can deliver a truly high-performance GPU... I doubt the market would justify the production of their own discrete GPUs exclusively for Mac prosumers.



1 hour ago, mr moose said:

Nope. There are absolutely no drawbacks in latency for professional sound in Windows compared with any version of Mac OS. Many, many professional studios run Windows.

 

Apple used to have an advantage with a simply better approach, but they lost that lead when they dropped features and Windows became just as good in the hardware department.

 

For audio work the graphics hardware doesn't have any effect, and as far as latency goes, that hasn't been an issue for quite some time.

Exactly my point: it hasn't been the case for a long time, but people still believe it to be true.



The question not enough people in this thread are asking (and for obvious reasons can't answer) is: how well will Apple's GPUs perform compared to AMD's offerings?


1 hour ago, Sauron said:

I doubt the market would justify the production of their own discrete GPUs exclusively for Mac prosumers.

AMD makes custom GPUs exclusively for consoles, which average around 10 million sales a year per console brand.
Macs sell a total of around 18 million units per year. With a conservative estimate of 11% of those carrying discrete graphics, that's still about 2 million Radeon chips in Apple computers.
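
The arithmetic behind that estimate (the 11% discrete-GPU share is the poster's own assumption):

```python
macs_per_year = 18_000_000   # total Mac sales per year, figure from the post
discrete_gpu_share = 0.11    # assumed share of Macs with discrete graphics
print(f"{macs_per_year * discrete_gpu_share:,.0f} Radeon chips per year")
# -> 1,980,000, i.e. roughly 2 million
```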

CPU: Intel core i7-8086K Case: CORSAIR Crystal 570X RGB CPU Cooler: Corsair Hydro Series H150i PRO RGB Storage: Samsung 980 Pro - 2TB NVMe SSD PSU: EVGA 1000 GQ, 80+ GOLD 1000W, Semi Modular GPU: MSI Radeon RX 580 GAMING X 8G RAM: Corsair Dominator Platinum 64GB (4 x 16GB) DDR4 3200mhz Motherboard: Asus ROG STRIX Z370-E Gaming


1 hour ago, Jumballi said:

AMD makes custom GPUs exclusively for consoles, which average around 10 million sales a year per console brand.
Macs sell a total of around 18 million units per year. With a conservative estimate of 11% of those carrying discrete graphics, that's still about 2 million Radeon chips in Apple computers.

It's one thing to make a slightly modified version of something you already developed for a partner like Sony, which knows for certain that every device will contain that exact chip for the lifetime of the console.

 

It's quite another to develop something from scratch for, at best, a total of 2 million units per year, and still need it to be up to date and trade blows with the likes of Nvidia, which sells hundreds of times as many units every year. There's a reason not even Intel dared to try until recently.

 

If Apple goes down that road and expects to remain profitable while doing it, they'll have to enter the PC prosumer market with it as well.



20 hours ago, Jumballi said:

 

 

My thoughts

While this indeed sounds like they could move away from AMD for the time being, AMD announced its Radeon GPUs branching out into the mobile market by partnering with Samsung last year. There are four possible reasons why they haven't announced that they'll continue using AMD GPUs:
1. Apple gains leverage over AMD; they can now negotiate on the GPU alone and leave the CPU out of it.
2. AMD doesn't have enough time to release a new mobile GPU in time for the first Apple Silicon Mac launch.
3. AMD might be under a contract that prevents them from working with Apple at this time.
4. AMD's mobile GPUs aren't mature enough for Apple to put them in a Mac today.

 

Sources

https://www.techradar.com/news/could-apple-also-ditch-amd-to-make-its-own-graphics-cards
https://www.pcgamer.com/apple-ditch-amd-gpus-and-intel-cpus/
https://www.gizmodo.co.uk/2020/07/apples-homegrown-chips-could-be-the-end-for-amd-graphics-in-macs/

This sounds like the author doesn't quite understand what tile-based rendering is.

 

Tile-based rendering is a mechanism GPUs have been using for a while, but PowerVR pioneered the tech. It means that tiling is the "native" renderer of the Apple GPU, not that AMD support is being discontinued. Immediate mode is "frame-based", which is something you can do with a high-end GPU. Tile-based rendering is something you do with low-end parts to fake higher frame rates, since immediate mode is harder to pull off while staying a low-energy part.

 

Nvidia and AMD have had tile-based rendering since 2015 and 2017, respectively.

 

Quote:

Using tiled regions and buffering the rasterizer data on-die reduces the memory bandwidth for rendering, improving performance and power-efficiency. Consistent with this hypothesis, our testing shows that Nvidia GPUs change the tile size to ensure that the pixel output from rasterization fits within a fixed size on-chip buffer or cache.

Basically, to render at 4K and above, the GPU vendors all decided to go back to tile-based rendering as a way around the memory bandwidth wall they've hit.
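
To make the distinction concrete, here's a toy sketch of the binning step (illustrative only, not any vendor's actual implementation): triangles are sorted into screen tiles up front, each tile is then shaded in a small on-chip-sized buffer, and the external framebuffer is written once per tile instead of being hammered per-triangle.

```python
TILE = 32  # tile edge in pixels; real hardware varies this, per the quote above

def bounding_box(tri):
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    return min(xs), min(ys), max(xs), max(ys)

def bin_triangles(tris):
    """Map each tile coordinate (tx, ty) to the triangles whose
    bounding boxes touch that tile."""
    bins = {}
    for tri in tris:
        x0, y0, x1, y1 = bounding_box(tri)
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

triangles = [
    ((10, 10), (60, 12), (30, 50)),      # bounding box straddles four tiles
    ((200, 90), (230, 100), (210, 120)), # lands in a separate cluster of tiles
]
for tile, hits in sorted(bin_triangles(triangles).items()):
    # A real tile-based GPU would rasterize `hits` into an on-chip tile
    # buffer here, then flush the finished tile to memory in one burst;
    # that single write per tile is the bandwidth saving.
    print(f"tile {tile}: {len(hits)} triangle(s)")
```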


13 hours ago, hishnash said:

And more importantly, the latency on the iPad is very, very low

Not as low as the CRT on an original '98 iMac.

