
Apple M1 = the rest of us are living in the stone age!?

14 minutes ago, mahyar said:

well apple silicon still contains a lot of stuff from ARM and that's why they have a licence from arm

and if nvidia wants to void apple's licence they will be in trouble (which is a thing that can happen)

Even when the deal closes, Nvidia will not be able to void Apple's license at will; no company would ever enter into an IP contract that permitted that. Apple's perpetual license agreement is likely ironclad.

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


41 minutes ago, gal-m said:

Can Apple Silicon wipe everything else out?

It really depends on third-party app support. If devs keep writing apps only for x86 Macs and users just emulate them on ARM Macs, users may still get up to 80 percent of the performance, but in situations where every last percent matters, for real pros such as film-makers and other Mac users (I dunno, I am more of a PC guy, but I don't have any personal bias either), they might not switch to M1 Macs. So far the app compatibility for a fresh launch is good; we will have to see where this stands in a year.

 

45 minutes ago, gal-m said:

I've recently been thinking about building a new high end gaming system, but now it just seems like Apple might speed past anything in the years to come (in reference to single thread performance at least)

But why does that matter to you if you don't wanna game on Macs? It shouldn't. For the foreseeable future the ARM Macs can't run Windows, and even if they could via Parallels or something, you'd be paying thousands of dollars to run the Windows-on-ARM version, not the full-fat 64-bit one. And these Macs, as far as I know, don't support external GPUs either. So the single-threaded performance increase shouldn't be a concern: you can overclock desktop processors, and Intel (hopefully team Red too) is working on bringing sub-ambient cooling to the mainstream market (go check out Linus' "I got the Golden Sample" vid).

 

51 minutes ago, gal-m said:

it's just making x86 CPUs feel a bit old :(.

x86 is quite old: it has stuck with us since 1978, with extensions pasted onto it with flex tape and glue, like x86-64 (originally x86 wasn't 64-bit!). ARM is relatively new to the mainstream market, but ARM is just using the RISC-V instruction set, which, you guessed it, IBM has been using for a pretty long time. In fact ARM stands for Advanced RISC Machines, so ARM is old too; it is just that x86 is a bit older.

 

55 minutes ago, gal-m said:

I am NOT saying I want to play games on a Mac

That shoulda really calmed the mob of Anti-Mac Gamers wandering the Internet to find your location. :D 

57 minutes ago, gal-m said:

I am just simply stating that the potential of Apple's new chips to wipe every other Intel or AMD off the face of the planet might move manufacturers like Intel or AMD to start looking at developing an ARM based chip as well and the potential that would have in a Windows machine.

In regards to this, AMD (dunno what it will be called) and Intel (Intel Evo) are in the process of making x86-based processors that follow the big.LITTLE pattern of cores in an ARM chip, mainly for mobile PCs like tablets and laptops. So far not a lot of devices featuring these chips have come out; as far as I know, Samsung made some sort of device a while back. And if we are to go by @RILEYISMYNAME's review of that device, correct me if I am wrong, the experience is like, just imagine Windows on ARM except it actually works!
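For anyone unfamiliar with the big.LITTLE idea those hybrid chips borrow, here is a toy sketch: demanding tasks land on fast "big" cores, light tasks on efficient "LITTLE" cores. All names and the 0.5 threshold are invented for illustration; real OS schedulers are far more sophisticated.

```python
# Toy illustration of big.LITTLE-style scheduling: tasks whose load is
# at or above a threshold go to performance ("big") cores, the rest to
# efficiency ("LITTLE") cores. Purely illustrative.
def assign_cores(tasks, threshold=0.5):
    return {
        name: "big (performance)" if load >= threshold else "LITTLE (efficiency)"
        for name, load in tasks.items()
    }

placement = assign_cores({"video_export": 0.9, "mail_sync": 0.1})
print(placement["video_export"])  # big (performance)
print(placement["mail_sync"])     # LITTLE (efficiency)
```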

PS: AMD had tried making ARM-based chips, but that didn't go too well. And I am sure Redmond has something big planned for Windows-on-ARM, but they aren't going to let that out yet; they will let the news out only when they themselves have a Surface device working with those things. Oh, and yeah, Windows 10X is something, right?


58 minutes ago, aDoomGuy said:

Yeah, x86 is old. It must be 15-20 years since I last had an x86 chip. Now I run x64, and it's good. :)

 

Apart from that, the more companies that can make good CPUs and whatnot, the better. Competition benefits us, the users; monopoly benefits the corporation.

x86 is still the basis for all Intel and AMD processors; they just use the x86-64 variant now.



4 minutes ago, Justaphysicsnerd said:

x86 is quite old: it has stuck with us since 1978, with extensions pasted onto it with flex tape and glue, like x86-64 (originally x86 wasn't 64-bit!). ARM is relatively new to the mainstream market, but ARM is just using the RISC-V instruction set, which, you guessed it, IBM has been using for a pretty long time. In fact ARM stands for Advanced RISC Machines, so ARM is old too; it is just that x86 is a bit older.

ARM processors use ARM's instruction sets (ARMv8 is the 64-bit ARM ISA), not RISC-V.



1 hour ago, Ankh Tech said:

High end will never be achieved by arm.

Dear *insert name here*
I would sincerely like to inform you of some old piece of news, that might seem like a bolt from the blue. So, if you are on a gaming chair then you are alright, but if you are not get one from our sponsors noble chair or you could just sit on a bed or chair near you. Please take a deep breath in deep breath out. Now think of all the mistakes you must have made in your life, that forced those words out of your mortal mouth. I or anybody is definitely not saying that humans don't make mistakes, we are humans that's wut we do ! Our robot overlords also might make a mistake or two in the future when they are born, but that is besides the point. Alright, where were we ? Yaa, about that old piece o news eh ? Brace for impact
.
.
.

The world's fastest SUPERcomputer runs on

.
.
.

AN ARM CPU!

NOTE: PLEASE DO NOT REPLY TO THIS MESSAGE AS IT IS A STANDARD MESSAGE THAT I KEEP IN MY CLIPBOARD AT ALL TIMES, READY TO BE PASTED TO INFORM THE PCMR FANBOIS THAT OTHER INSTRUCTION SETS DO EXIST! WE ARE NOT RESPONSIBLE FOR ANY FEELINGS OR SENTIMENTS HURT DUE TO THIS MESSAGE. THIS MESSAGE IS ISSUED IN THE PUBLIC INTEREST.

SPONSORED BY DBRAND: grab yourself a special LINUS EDITION phone case or skin (not recommended) or an ANTHONY EDITION SKIN OR CASE OR STICKER (DEFINITELY RECOMMENDED) at betterlttstore.com. ALSO SPONSORED BY NOBLE GAMING CHAIRS and MACK WELDON LTT EDITION (EXCLUSIVE TO LTTSTORE.COM) and #Lienus gang


23 minutes ago, HelpfulTechWizard said:

Also, geekbench is known to be (somewhat) better on macs.

You are aware that there are several benchmark results of the M1 out right now, that aren't Geekbench, right?

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


To all the people claiming that "high end" will never be achieved by ARM: you all know that the current #1 supercomputer (top500.org) runs on ARM? With a laughable 7.6 million cores.
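That machine is Fugaku, and the 7.6 million figure follows from its node count. A quick sanity check (the node and per-node core figures below are as I recall them from the TOP500 listing, so treat them as assumptions):

```python
# Fugaku: 158,976 A64FX nodes, 48 compute cores per node
# (figures assumed from the top500.org listing at the time).
nodes = 158_976
cores_per_node = 48
total_cores = nodes * cores_per_node
print(total_cores)                                # 7630848
print(f"{total_cores / 1e6:.1f} million cores")   # 7.6 million cores
```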

Single-core performance seems to be no issue either, looking at the M1 chip, while Ryzen 5000 CPUs are brought out to defend the x86 crown. People are comparing a first-generation ARM mobile CPU with desktop CPUs. Windows alone has decades of optimization for the x86 instruction set baked in. AMD likewise knows the x86-64 instruction set quite well (they pretty much designed it).

 

Yes, up to now, no ARM CPU has been designed specifically for the desktop/laptop market. This has nothing to do with "possible performance" but with the chicken-and-egg problem. Apart from "x86 will stay because it is NOW the standard", there are, to my knowledge, no technical benefits in this instruction set.

CISC had its benefits when memory was expensive, to shrink program code. Memory is cheap now; RISC takes over.

I would have LOVED to see it be RISC-V, for open-source reasons. But ARM is also quite nice.


45 minutes ago, HelpfulTechWizard said:

These threads need to stop.

 

Apples clames aren't true.

They are hiding information on the graphs.

It’s true, Apple hid information in the graphs; it’s even better IRL.
 

Want to know how I know that?

 

I own one :)


The M1 is an appetizer. What is gonna happen when the world sees the performance of

 

1) M1T - around Easter 2021 [iMac 24”]  (desktop 12 or 16 core)

 

2) M1X - summer 2021 [MBP 16”, MBP 14”, dark gray MacMini]  (mobile 12 core)

 

3) M2T - late 2021 [bigger iMac, Mac Pro] (massive desktop CPU on the new 5nm+ node)

 

is uncharted territory.

 

One thing we could consider: if the absolute speed of the CPU/GPU were the only important factor, everybody would own an iPhone SE over any Android by now. iPhones aren't "wiping out" Android market share anytime soon.

 

Maybe just like with phones, there will be a “master race” (iphone/apple silicon macs) and a “meh, good enough, plus I get more options and customizations”-race (android phones/x86 PCs).

 

Then again, phones are more limited by nature, and maybe on big-screen computers people will be less happy to be in the "meh" camp.


1 hour ago, Laborant said:

Yes, up to now, no ARM CPU has been designed specifically for the desktop/laptop market. This has nothing to do with "possible performance" but with the chicken-and-egg problem. Apart from "x86 will stay because it is NOW the standard", there are, to my knowledge, no technical benefits in this instruction set.

THIS is what I was thinking about when I wrote this post. From what I can gather, performance actually seems to be bottlenecked by x86 quite badly.

Back in the old days you bought a new computer and after six months to a year it was ancient; tech moved really quickly, and I believe it slowed down tremendously somewhere around 2012 onward...

But this, I believe, has the potential to shift the industry forward by quite a lot. Because if every Mac is going to be MUCH faster (and more efficient) for most tasks (except gaming), then I'm not so sure chip manufacturers are going to keep sticking with the x86 architecture...

 

Anyway, thanks for sharing your thoughts @Laborant


2 hours ago, Justaphysicsnerd said:

That shoulda really calmed the mob of Anti-Mac Gamers wandering the Internet to find your location. :D

Hahahaha, thanks for the reply and detailed explanation @Justaphysicsnerd! Really appreciate it.

I guess a lot of people misunderstood what I tried to say, but you caught my doubts with no issue. 

 

I own both a custom Windows gaming machine from 2011 (old: 3930K, 16GB, GTX 690, etc.) and a MacBook Pro. I am currently thinking of building a completely new system, due to my love of aviation (I don't have a chance to become a pilot IRL at the moment), and as we know, Microsoft Flight Sim 2020 is no joke.

I have known about Apple's transition to ARM for quite a long time, but honestly never thought much of it. However, when I saw actual figures come out from actual people, the news troubled me deeply, to be honest, because nobody wants to throw a bunch of money at components that might get absolutely smacked around in the (extremely) near future, if that future were to be marked by, let's say, a 2x performance boost (and an exponential increase shortly after). I have to mention I absolutely DO, DO, and DO welcome competition to the fullest.

But just like you mentioned, we should be good for some time to come.

 

Cheers!


In the iPhone vs Android scenario, we could also see

- quality apps coming to ARM Macs faster 

- the same app being generally better and snappier on ARM Macs

 

Just like it’s currently the case with iPhones and, even more, with iPads. 

 

In the x86 camp (and maybe the ARM Windows camp too) the push for web apps (both in-browser and on desktop) could be even stronger than today, but that would be a win-win for the ARM Macs, because they crush web/JS benchmarks as well, and this time without the limitations of an iPad browser, with full mouse hovering and all (although lately even the iPad has moved to a desktop-ish browser and mouse support).


3 hours ago, Ankh Tech said:

People buy apple to flex.

While one could agree with this, some people like Apple products just because of the way they look. Nowadays there are a lot of good-looking laptops around, but a little while back I was extremely impressed with the first unibody MacBook. And people who can afford a Mac and like the look of it will just buy it. Good for them; Macs are very nice machines (not talking about price/performance).


1 hour ago, gal-m said:

THIS is what I was thinking about when I wrote this post. From what I can gather, performance actually seems to be bottlenecked by x86 quite badly.

Back in the old days you bought a new computer and after six months to a year it was ancient; tech moved really quickly, and I believe it slowed down tremendously somewhere around 2012 onward...

But this, I believe, has the potential to shift the industry forward by quite a lot. Because if every Mac is going to be MUCH faster (and more efficient) for most tasks (except gaming), then I'm not so sure chip manufacturers are going to keep sticking with the x86 architecture...

 

Anyway, thanks for sharing your thoughts @Laborant

The best thing about the release of the M1 is that, for the first time in a long while, all the tech forums are really lively; the M1 has stirred the pot.
 

It’s also comical that some tech forum goers are in total denial about it. 
 

I would say this is the biggest CPU happening since Intel ditched the crappy Pentium 4 design and moved back to the Pentium III design in their Core CPUs.
 

It is totally OK to not like Apple, but some need to realize it is a big deal that a system that draws 25 W under full load (for the entire system) is constantly compared to high-end desktop CPUs because it performs so well.
 

Haters are going to hate and to them I’m just an Apple fanboy.
 

But for the open-minded: if you know anyone that bought an M1 Mac, ask if you can borrow it for a couple of hours just to get a general feeling for the system (and don't borrow it to run benchmarks; do actual computer stuff on it).


I'm sat here waiting for Mr Anthony's formal review. So far we've had a few brief snippets in a few videos and Anthony's unboxing of the MacBook Pro (M1). Everything is pointing to this thing being incredible. The MacBook Air is silent and runs everything; the Pro can just sustain higher clocks for longer. But that's almost irrelevant in our current media-format world.

 

Then you've got all the other reviews slipping out. This is a 10 W CPU trading blows with 100+ watt chips.

 

Whatever limitations impacted ARM in the past are long gone. You're now left with a fast, efficient chip. Apple then wrapped it up in a translation layer that runs x86 programs about 80% as well.
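That "80% as well" figure can be framed as a simple effective-throughput estimate. A toy model (the 0.8 factor is just the number quoted above, not a measured constant):

```python
def effective_score(native_score, translation_efficiency=0.8):
    """Estimate the score of an x86 binary running under binary
    translation: native score scaled by an efficiency factor."""
    return native_score * translation_efficiency

# A workload scoring 1000 points natively would land around 800
# under translation at the quoted 80% efficiency.
print(effective_score(1000))  # 800.0
```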

 

x86 .... it's soon to be done for ...


The hardware might be good, but the engineering, software, and corporate will behind it all is as bad as, or probably worse than, Facebook's commitment to privacy.

 

The last Apple device I used was an iPhone 4S. I will not purchase an Apple product ever again unless they stop the anti-repair bullshit and actually give customers control over their own hardware and software. >_> Neither of these will ever happen, so I will never own an Apple product again.

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


34 minutes ago, Sarra said:

The hardware might be good, but the engineering, software, and corporate will behind it all is as bad as, or probably worse than, Facebook's commitment to privacy.

 

The last Apple device I used was an iPhone 4S. I will not purchase an Apple product ever again unless they stop the anti-repair bullshit and actually give customers control over their own hardware and software. >_> Neither of these will ever happen, so I will never own an Apple product again.

 

Can’t hear you over the sound of your laptop fans at full speed 🤷🏻‍♂️

 

 

 

I kid, I kid.


2 hours ago, saltycaramel said:

 

Can’t hear you over the sound of your laptop fans at full speed 🤷🏻‍♂️

 

 

 

I kid, I kid.

No, you can't; my laptop is turned off, in a closet, and cost me less than $200 brand new. When it is on, the cooling fans don't really make any noise. On top of that, an M1 would be incapable of doing the actual job I bought my laptop for, since the software I require is PC-only. And even if it were available on Mac, I would not trust the Mac to be reliable enough for this use case, since it could literally brick the ECU in my vehicle, which quite simply is worth more than even the most expensive Apple product ever made.



On 11/26/2020 at 2:08 PM, Amias said:

I'm sat here waiting for Mr Anthony's formal review.

Where in the world is this review? Feels like it's been ages.

3700X, GTX1060, 64GB HyperX 3000, B450 DS3H, 500GB SN750, TT Core G3, PreSonus Audiobox USB96


On 11/26/2020 at 7:14 AM, gal-m said:

And as a consumer I don't want to be buying a new computer if we're about to witness a huge leap in performance... 

 

It's going to happen. In the next bunch of months we'll see the next level of the M1 (M1X, M1T, whatever they'll be called) and they'll have amazing performance.

 

That's life in the computer world.

 

So don't worry about it. Build the gaming machine you want and have fun, and don't worry about the numbers the new macs are doing.

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27";  2007 MBP 2.2 GHZ; Power Mac G5 Dual 2GHZ; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


The issue is not the M1 chip but the company behind it.

Right now you are allowed to use your nice Mac machine for school; soon they will be barred.

 

Right now Macs can use Boot Camp to get Windows, which is and will for the foreseeable future be 99% of the market share.

No matter how good Macs get, it will be near impossible to have companies switch their entire process over,

so you need a "normal" x86 laptop for school for anything related to computers.

 

Will parents buy both a Windows laptop for school and an iMac for their children? And when the Mac lovers are dragged out of their ecosystem, will they stay?

Those are major questions to ask. If they want to succeed, crazy enough, they need to work with Microsoft and Linux to get Windows/Linux onto Macs.

 

Do you see that happening?

If not, the Macs will stay a niche forever, and similarly for the M1 chip and its siblings in the future.

Just because you make a great chip, the entire world will not drop everything and put their lives in the hands of one company without restraints.

 


M1 Macs are great... but can it run Crysis?

 

What a computer does for you is solve problems; it makes your work easier. If it doesn't run the software you need, it doesn't matter how powerful it is.

 

x86 runs everything, ARM does not.


2 hours ago, Crazywizard said:

Windows, which is and will for the foreseeable future be 99% of the market share.

[attached image]

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database



People make too much drama around it. The Apple M1 is a bit of a revolution, but only within Apple's ecosystem, because they made sure everything works together to perfection. Outside of that, it's just another ARM-based chip. Nothing special. Anyone could achieve that if they had the control over the ecosystem that Apple has. Which is why Microsoft is struggling with Windows on ARM: they don't have the same control Apple has.

 

It's also funny how people say "uh oh, x86 is old" and "ARM is new". Um, have you all woken up from cryo sleep? ARM has been around since 1983; I'd say nearly four decades counts as old too. Not to mention x86 in its beginnings had very little to do with what it is now, and the same applies to ARM.

People have this fixated idea that if it's x86, it's the same as it was 30 years ago. Not only did it gain 64-bit register support, its acceleration extensions have been added and removed through the decades, and the whole compute logic of CPUs has dramatically changed since its beginnings. People don't seem to distinguish x86 as the logical part from the CPU structure as the physical part. They are not the same thing. It's the physical part that has evolved so dramatically since the early days, and it's not just raw transistor count or clock speed. It's also multi-tier caching, branch prediction, multiple cores, and how AMD approached the problem with CCX complexes. It's ultimately all still x86, but it's unlike anything anyone even imagined when x86 was created.

Just look at the leaps we've made in just a few years of Ryzen CPUs. Did anyone imagine back in 2005 that in 2020 we'd have 32 cores and 64 threads in CPUs approaching clocks past 4 GHz on ALL cores, which, while very expensive, are perfectly accessible to regular mortals (Threadripper 3970X, for example)? x86 is not outdated or problematic, and neither are CPUs based on it something to be dismissed as pointless or useless. As AMD shows, there's a lot of winning to be had by thinking outside the box without breaking compatibility.
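One small practical upshot of the x86-64/ARM64 split the thread keeps circling: portable software can simply ask the OS which architecture it is running on, e.g. via Python's standard library (the output depends on the machine it runs on, so none is promised here):

```python
import platform

# platform.machine() returns the hardware architecture string:
# commonly "x86_64" or "AMD64" on x86-64 systems, and "arm64" or
# "aarch64" on ARM64 systems such as Apple silicon Macs.
arch = platform.machine()
is_arm64 = arch.lower() in {"arm64", "aarch64"}
print(arch, "->", "ARM64" if is_arm64 else "probably x86-64")
```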


This topic is now closed to further replies.