Is apple just going to kill x86 now?

LazyLand
29 minutes ago, Whos Sayin said:

Looking at the current info on the M1, it seems like it's only a matter of time before Intel and AMD go all in on ARM, at least for mobile chips.

And if it works that well, I don't think it's a bad thing. We might see a split between mobile and desktop software, where demanding games and software only work on desktops and the few x86 laptops left, and I don't think that's any worse than what we have now. Things don't get worse just because something better comes out.

AMD did drop development of the high-performance ARM server CPU it was working on (the K12). Those resources were badly needed to get both the Ryzen CPUs and the Polaris GPUs developed and released on time.

 

 


6 hours ago, whm1974 said:

Just how common are ARM-based servers?

Some of the big tech companies are building them internally for their data centres (e.g. Facebook, Google, AWS), and lots of big companies pay to use them in VMs (e.g. Snapchat, Netflix, Datadog).
If you look at the TOP500 supercomputer list, there are a number of ARM servers there too.

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


Someone clarify me a little:

What exactly blocks Apple (or any ARM processor manufacturer) from "overclocking" their CPUs and GPUs on desktops with proper cooling until they reach the 65-105 W we are used to? They might have way more cores, or clocks reaching 6 GHz! No?


1 minute ago, RafaMarioFan said:

Someone clarify me a little:

What exactly blocks Apple (or any ARM processor manufacturer) from "overclocking" their CPUs and GPUs on desktops with proper cooling until they reach the 65-105 W we are used to? They might have way more cores, or clocks reaching 6 GHz! No?

Generally, chip makers don't overclock; they have performance and power targets they try to hit. Chips can, however, boost their own clock speeds when they have adequate cooling, like the way many Intel and AMD CPUs raise the clocks of a few cores when extra performance is needed, but that behaviour is designed into the chip. Manual overclocking is mostly left to the end user, and some chip makers block even that: with Intel, only a few SKUs can be overclocked, since only those have an unlocked multiplier.

 

Apple is unlikely to allow overclocking, as they like to have control over their machines. Also keep in mind that people who overclock tend to buy better cooling equipment for their setups, such as large tower heatsinks or some kind of water-cooling loop. Standard cooling generally doesn't hold up to large overclocks.
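
For what it's worth, you can watch that designed-in boost behaviour from userspace. A minimal sketch, assuming a Linux box with the kernel's cpufreq sysfs interface (values are reported in kHz, and exact paths can vary by driver):

```python
# Minimal sketch: observe a CPU boosting itself on Linux.
# Assumes the kernel's cpufreq sysfs interface is present (values in kHz).
import time

BASE = "/sys/devices/system/cpu/cpu0/cpufreq"

def read_khz(name: str) -> int:
    with open(f"{BASE}/{name}") as f:
        return int(f.read().strip())

# Highest frequency the chip is allowed to boost to.
print(f"rated max: {read_khz('cpuinfo_max_freq') / 1e6:.2f} GHz")

# Sample the current frequency; run something heavy in another
# terminal and you'll see the chip raise its own clocks.
for _ in range(5):
    print(f"current:   {read_khz('scaling_cur_freq') / 1e6:.2f} GHz")
    time.sleep(1)
```

None of that requires an unlocked multiplier; it's the chip managing itself within the limits the maker designed in.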

I just want to sit back and watch the world burn. 


24 minutes ago, RafaMarioFan said:

Someone clarify me a little:

What exactly blocks Apple (or any ARM processor manufacturer) from "overclocking" their CPUs and GPUs on desktops with proper cooling until they reach the 65-105 W we are used to? They might have way more cores, or clocks reaching 6 GHz! No?

Re: “someone clarify me a little”

which I somehow doubt means “warm me in a saucepan till I start to turn transparent”

 

Re: overclocking ARM:

No. There is a hard physics limit for SOI (silicon-on-insulator) at room temperature that sits somewhere above 5 GHz but below 6 GHz. It's a bit like the hard physics limit that phone modems ran into in the '80s: there is a point where you just can't make them go any faster. There are other issues too, and they tend to get worked around one by one. The first PC I owned ran at 4.0 MHz. Not GHz, MHz: a thousand times slower. It wasn't SOI; it was aluminum on bare silicon, no insulator. You may notice there is occasional talk of abandoning silicon and moving to germanium. That would be an end run around SOI that hasn't been needed yet; chip designers have been talking about the possibilities of germanium for 30 years, but it's a lot rarer and harder to work with. The knowledge of the issue is not new, though.
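
There's also the power wall the 65-105 W question runs into. Dynamic power scales roughly as P ≈ C·V²·f, and higher clocks usually demand higher voltage too, so power grows much faster than frequency. A rough back-of-envelope sketch; every number in it is an illustrative assumption, not a measured M1 figure:

```python
# Back-of-envelope dynamic power scaling: P ≈ C * V^2 * f.
# All figures below are hypothetical, for illustration only.

def scaled_power(p0: float, f0: float, v0: float, f1: float, v1: float) -> float:
    """Scale a baseline power p0 from clock f0 / voltage v0 to f1 / v1."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

p0, f0, v0 = 15.0, 3.2, 0.9   # assumed baseline: 15 W package at 3.2 GHz, 0.9 V

# Chasing 6 GHz typically needs a large voltage bump as well.
print(f"{scaled_power(p0, f0, v0, f1=6.0, v1=1.4):.0f} W")  # ~68 W from scaling alone
```

And that's the optimistic case: leakage grows with voltage and temperature too, which is part of why desktop chips already burn 65-105 W at clocks nowhere near 6 GHz.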


Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4 minutes ago, whm1974 said:

Now I'm wondering if Apple will release a Mac Pro with an ARM CPU and a dGPU.

I wonder this as well. I get the impression that, the way the M1 is built, that could be difficult. They might need a new design.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


13 minutes ago, Bombastinator said:

I wonder this as well. I get the impression that, the way the M1 is built, that could be difficult. They might need a new design.

To me, the M1 was built more as a mobile platform, so there's always a small chance they might build a more "desktop"-oriented chip in the future. That could be why they said the transition would take two years: they might have something in the design phase that wasn't ready to be released.

I just want to sit back and watch the world burn. 


25 minutes ago, Bombastinator said:

I wonder this as well. I get the impression that, the way the M1 is built, that could be difficult. They might need a new design.

Come up with a new design centered around workstations.


6 minutes ago, whm1974 said:

Come up with a new design centered around workstations.

This is also my hope. After all, gaming is what workstations do in their spare time.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Clearly we are not even speaking the same language here anymore, with most of you settling, consciously or otherwise, for "OK" and "fine" rather than for the best, for excellence. I can't reply to every single person hating on progress, on the future, and on change. I am out of this thread.

I will emphasise, though, that I fully understand the people in this thread are not a monolithic block.

Finally, I will leave you with this; may it be an inspiration to you, to us all. Don't be statically happy, because then you won't evolve. Evolve, or be left behind eventually: a matter of when, not if.

 

 


1 minute ago, We Didnt_t start_the_fire said:

Clearly we are not even speaking the same language here anymore, with most of you settling, consciously or otherwise, for "OK" and "fine" rather than for the best, for excellence. I can't reply to every single person hating on progress, on the future, and on change. I am out of this thread.

I will emphasise, though, that I fully understand the people in this thread are not a monolithic block.

Finally, I will leave you with this; may it be an inspiration to you, to us all. Don't be statically happy, because then you won't evolve. Evolve, or be left behind eventually: a matter of when, not if.

 

 

That sounds overly dramatic, and I haven't even gone through the whole thread. Also, I don't know what you're doing with the colour stuff, but it's kinda annoying. If you want to distinguish sections, the enter key is your friend (you're already using it, though).

Either @piratemonkey or quote me when responding to me. I won't see otherwise

Put a reaction on my post if I helped

My privacy guide | Why my name is piratemonkey PSU Tier List Motherboard VRM Tier List

What I say is from experience and the internet, and may not be 100% correct


2 hours ago, We Didnt_t start_the_fire said:

Clearly we are not even speaking the same language here anymore, with most of you settling, consciously or otherwise, for "OK" and "fine" rather than for the best, for excellence. I can't reply to every single person hating on progress, on the future, and on change. I am out of this thread.

I will emphasise, though, that I fully understand the people in this thread are not a monolithic block.

Finally, I will leave you with this; may it be an inspiration to you, to us all. Don't be statically happy, because then you won't evolve. Evolve, or be left behind eventually: a matter of when, not if.

 

 

There are two different questions, I think, and people have been replying to them interchangeably: 1) is Apple done with x86? And 2) is personal computing done with x86?

 

As for the first, Apple says yes. I worry, from what has been seen so far, that if they don't produce a more power-user-oriented chip, Apple may also be done with power users.
As for the second, I think it's way too early to say, but Intel's years-long struggle with process shrinks didn't help x86 much, and while x86 is holding steady in some areas and even growing a bit in others, it has by no means won the day.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, Bombastinator said:

I worry, from what has been seen so far, that if they don't produce a more power-user-oriented chip, Apple may also be done with power users.

It feels a bit early to have that concern... the M1 (1, as in the first in a series) has been out for only a week. Being curious about what comes next is natural, but being worried that nothing is coming next, when we know there's more to come, seems a bit pessimistic... I mean, I get it, 2020 and all, but still...

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


3 minutes ago, Video Beagle said:

It feels a bit early to have that concern...the M1...1, as in the first in a series.... has been out for only a week. Being curious about what comes next is natural, but being worried that nothing is coming next when we know there's more to come...seems a bit pessimistic....I mean, i get it.. 2020 and all, but still...

Apple has been known to dump entire markets. There was a time when Apple was the premier gaming machine, and a time when Apple was the premier high-end graphics machine. That they might dump power users entirely seems very possible to me. The other issue is the "why" of these new laptops' speed: if it comes primarily from tight system integration, it becomes harder to allow the things that power users need.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


7 minutes ago, Bombastinator said:

There was a time when Apple was the premier gaming machine, and a time when Apple was the premier high-end graphics machine.

...I've been using Apple products for over 40 years (my dad bought an Apple II+ in October '80)... Unless you're talking about back then, with the early Ultimas and such, WHEN was Apple the "premier gaming machine"? MAYBE if Microsoft hadn't bought Bungie while they were developing HALO and ONI for the Mac, killing ONI and making HALO the Xbox launch title, it would have gained traction, but c'mon.

 

As for high-end graphics... it's not? I don't recall Apple leaving that market, though that market has, to some extent, left Apple.

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


1 minute ago, Video Beagle said:

...I've been using Apple products for over 40 years (my dad bought an Apple II+ in October '80)... Unless you're talking about back then, with the early Ultimas and such, WHEN was Apple the "premier gaming machine"? MAYBE if Microsoft hadn't bought Bungie while they were developing HALO and ONI for the Mac, killing ONI and making HALO the Xbox launch title, it would have gained traction, but c'mon.

As for high-end graphics... it's not? I don't recall Apple leaving that market, though that market has, to some extent, left Apple.

I was. I was thinking specifically of the Apple II+.
As to the high-end graphics thing, that may have to do with my view of the decisions Apple made for the most recent Mac Pro and the iMacs intended for graphic design. I consider what appear to be deliberate limitations in those devices the primary cause of the graphics market leaving Apple to the degree it has.


Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

