
For mac users and those who were looking forward to the M1 chip for the changes it would bring to the industry: Are you disappointed in the M1?

BlakSpectre

I was looking forward to the M1 chip, expecting it to massively accelerate consumer adoption of RISC CPUs. While I credit the Raspberry Pi with starting that shift, once Apple does something, it becomes mainstream. (RIP Pebble, I loved you.)

I think the boost will still happen but I was expecting M1 chips to do better.

 

But I am disappointed in the chip.

 

Don't get me wrong, the performance is good, and it is essentially a beta product like the first MacBook Air. But that was expected given the performance of the iPad and the "accelerators" it has for heavy tasks like video playback, editing, etc. They are taking advantage of the ARM-based design and TSMC's 5 nm process to deliver decent performance from a 10-watt CPU. But it seems like the software team did all the work while the hardware team sat on their asses.

Most users will not notice the problems I have (or care about them), but I expected better from the company based on its previous designs: they expected you to spend more cash on more things, and parts were hard to repair, but every problem could be solved as long as you threw enough money at it.

 

Software:

The software is almost well optimized, but I have seen too many jitters during normal tasks when heavy tasks are going on in the background. I was expecting the behaviour I got from my 2012 MBP: no matter how hard you kicked the CPU, tasks like alt-tabbing, switching app focus, resizing windows, etc. never skipped a beat (well, before I updated it), and I rarely if ever saw the beach ball. Beach balls became more common with my newer Macs, but I was expecting them to go away thanks to those tasks being handled by the low-power cores... that does not seem to have happened (based on the videos I have seen).

Beyond that, the emulation is better than I was expecting. And the core OS works, even with the icons, which, subjectively speaking, are a crime against my eyes.

 

Speculation 1:

Some of the issues, I think, may be because the RAM is shared between the GPU and CPU cores, but that is pure speculation on my part, and I decided not to buy the machine as it is.

 

 

Hardware:

A single Thunderbolt controller. The HDMI port on the mini only goes up to 4K 60 Hz. No multi-display support.

Not enough I/O bandwidth (in my understanding, correct me if I am wrong), so they are limited to a smaller number of ports and can't do even 2.5-gigabit networking, let alone 10-gigabit.
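
To put rough numbers on the bandwidth concern: the figures below are published interface maximums, and the M1's actual internal fabric limits are not public, so this is purely an illustrative sketch, not a statement about how Apple budgets the controller.

```python
# Rough bandwidth budget for a single Thunderbolt 3 / USB4 controller.
# Interface figures are published maximums; the M1's internal limits
# are not public, so treat this as a back-of-the-envelope sketch.

TB3_GBPS = 40.0        # Thunderbolt 3 / USB4 link rate, Gbit/s
TEN_GBE_GBPS = 10.0    # 10 Gigabit Ethernet line rate, Gbit/s

# Uncompressed 4K60 8-bit RGB pixel data (ignoring blanking overhead)
display_gbps = 3840 * 2160 * 60 * 24 / 1e9

remaining = TB3_GBPS - display_gbps - TEN_GBE_GBPS
print(f"4K60 stream: {display_gbps:.2f} Gbit/s")
print(f"Left over after display + 10GbE: {remaining:.2f} Gbit/s")
```

Even in this optimistic accounting, one display plus a hypothetical 10GbE adapter would claim over half of a single controller's link, which lines up with the point that one controller limits how much I/O the machine can drive at once.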

I want to talk about GPU support, the apparent need for driver updates before DACs work, and flaky high-bandwidth external connections, but as it stands I do not know enough about them (technical and detailed explanations of the problems) to comment confidently.

 

Shared memory for the GPU and CPU (which is basically going to be a positive in the long run, I think).
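
A rough illustration of why shared (unified) memory can be a long-run win: it eliminates the staging copy between CPU and GPU memory. The 68.25 GB/s figure below is the commonly cited M1 memory bandwidth and is used here only for scale, not as a measured number.

```python
# Approximate cost of the CPU->GPU staging copy that unified memory removes.
# 68.25 GB/s is the commonly cited M1 bandwidth (LPDDR4X-4266, 128-bit bus);
# treat it as an illustrative figure, not a measurement.

MEM_BW = 68.25e9                 # bytes per second
frame_bytes = 3840 * 2160 * 4    # one 4K RGBA framebuffer

# A copy both reads and writes the buffer, so it costs roughly 2x its size.
copy_ms = 2 * frame_bytes / MEM_BW * 1e3
print(f"One 4K frame staging copy: ~{copy_ms:.2f} ms")
# With unified memory the GPU reads the buffer in place (zero copies),
# though the CPU and GPU then contend for the same bandwidth.
```

At 60 fps that copy alone would eat a noticeable slice of the 16.7 ms frame budget, which is why zero-copy sharing tends to pay off despite the bandwidth contention.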

 

Overall, the machines are good but lack the "polish" I attribute to Apple. Maybe all the engineers are busy working on the M2 or M1X or whatever comes next. Or maybe I am overreacting or missing something.


1 minute ago, Monkey Dust said:

It's first gen. First gen is always a bumpy ride.

True that.

 

It does look very impressive. But there are reports of... odd behaviour, such as the OP above, other weird stuttering or lagging, etc. Perhaps these are just software bugs that will be fixed in the coming weeks via updates. Perhaps they are hardware limitations that won't be resolved until the next revision.

 

Only time will tell.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


27 minutes ago, Monkey Dust said:

It's first gen. First gen is always a bumpy ride.

I expected first gen to be bumpy, but the whole I/O limitation is what is disappointing me; it seems to be lagging behind the rest of the SoC. I guess I am also mildly concerned that they are going to leave that as the status quo for future generations.

With that said, they also shipped the first MacBook Air with a 1.8-inch HDD and a mini DVI (iirc) port 🤷‍♂️.


1 hour ago, BlakSpectre said:

I expected first gen to be bumpy, but the whole I/O limitation is what is disappointing me; it seems to be lagging behind the rest of the SoC. I guess I am also mildly concerned that they are going to leave that as the status quo for future generations.

With that said, they also shipped the first MacBook Air with a 1.8-inch HDD and a mini DVI (iirc) port 🤷‍♂️.

Micro-DVI, actually - it was only ever used on the first-gen MacBook Air, as far as I can tell. Despite carrying the DVI protocol, the port itself is proprietary.

 

There is a Mini-DVI port too, but that was used on older Macs, like the PowerBook G4 and a few early Intel models.


 


I think something to keep in mind is that some of the issues you might be seeing have more to do with the new OS release (Big Sur) than with the M1. Hopefully Apple will have a major point release for Big Sur in the next month or two.

 

I think what they have announced so far has been a very good first step. However, I am curious what they will offer for the iMac, Mac Pro, and larger MacBook Pros. Hopefully, we will see more memory options and ports, among other things. And of course, increased performance.

 

That being said, Apple is still Apple. Even if the M1 turns out to be the greatest thing ever, it will only be inside Apple hardware and Apple seems intent on having a relatively limited variety of computers.

 

-kp


13 minutes ago, kpluck said:

That being said, Apple is still Apple. Even if the M1 turns out to be the greatest thing ever, it will only be inside Apple hardware and Apple seems intent on having a relatively limited variety of computers.

Yeah, Apple is never going to share its CPUs, but a huge part of the performance-per-watt uplift Apple is seeing comes from the chip being ARM. The reason people have been afraid of moving to RISC architectures is the amount of software work it takes and the fragmentation it introduces. But this will likely force the industry to start looking into switching architectures... or so I am hoping. Microsoft has tried it for years and falls on its face every time because it can't optimize Windows for it. There is already plenty of money going into companies developing ARM-based solutions for industrial applications.

 

Or maybe me hoping this happens for notebooks is messing with my brain... who knows lol.

 

(I agree with everything you said, just adding that this impacts the industry broadly.)

 


I'm disappointed at it being made by Apple and therefore never making it outside of their ecosystem. 

 

I've wanted powerful ARM processors on laptops and tablets for a while running Windows.

🌲🌲🌲

 

 

 

◒ ◒ 


Seems like it's less of the hardware's fault and more of the software's fault. Not exactly atypical of Apple... or any company, as a matter of fact. The fact that the M1 can come out fucking swinging by playing 8K video better than almost any x86 CPU that one could buy tells you a story and a half alone. Just seems that Apple needs to iron out some bugs within Big Sur and to also learn a bit more about how to fine-tune their shit for their ARM CPUs for macOS.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


2 minutes ago, handymanshandle said:

Seems like it's less of the hardware's fault and more of the software's fault. Not exactly atypical of Apple... or any company, as a matter of fact. The fact that the M1 can come out fucking swinging by playing 8K video better than almost any x86 CPU that one could buy tells you a story and a half alone. Just seems that Apple needs to iron out some bugs within Big Sur and to also learn a bit more about how to fine-tune their shit for their ARM CPUs for macOS.

 

No, based on my understanding, a chip being able to play 8K video does not mean anything. I don't disagree that the chip itself is impressive, but using 8K video processing capability as a benchmark for this architecture means nothing when comparing it to x86. It is not a trivial achievement, but it is an unfair comparison.

You can design pipelines on a chip specifically for hardware decoding, and that is what Apple is doing. It is similar to how software RAID and hardware RAID are two very different things, or to Intel Quick Sync. This is hardware decoding of video and playback, not how general-purpose processors do it.
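
To see why a fixed-function decoder is in a different class from general-purpose cores, consider the raw output an 8K decoder has to produce every second. The pixel format below (10-bit 4:2:0, the typical HEVC/AV1 delivery format) is an assumption for illustration, not Apple's published spec.

```python
# Decoded (uncompressed) data rate an 8K60 video decoder must emit.
# Assumes 4:2:0 chroma subsampling at 10 bits per sample, the common
# HEVC/AV1 delivery format -- an assumption, not Apple's spec.

width, height, fps = 7680, 4320, 60
bits_per_pixel = 1.5 * 10        # 4:2:0 -> 1.5 samples per pixel, 10-bit
raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Decoded 8K60 output: {raw_gbps:.1f} Gbit/s")
# A dedicated decode block reconstructs this stream in fixed-function
# silicon at a few watts; producing it on general-purpose cores is what
# makes CPU-only 8K playback so punishing.
```

That ~30 Gbit/s of reconstructed pixels is why "plays 8K smoothly" says more about the presence of a decode block than about general-purpose CPU performance.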

 

With that said, my disappointment is based on the interfaces, and I agree with you that the chip performs impressively well. The GPU specifically is substantially better than I expected.


5 minutes ago, BlakSpectre said:

 

No, based on my understanding, a chip being able to play 8K video does not mean anything. I don't disagree that the chip itself is impressive, but using 8K video processing capability as a benchmark for this architecture means nothing when comparing it to x86. It is not a trivial achievement, but it is an unfair comparison.

You can design pipelines on a chip specifically for hardware decoding, and that is what Apple is doing. It is similar to how software RAID and hardware RAID are two very different things, or to Intel Quick Sync. This is hardware decoding of video and playback, not how general-purpose processors do it.

 

With that said, my disappointment is based on the interfaces, and I agree with you that the chip performs impressively well. The GPU specifically is substantially better than I expected.

Fair enough about the 8K point. We're a month into Apple transitioning into a new CPU architecture, though, and much like damn near anything that's new, things tend to improve rapidly in terms of performance, stability and compatibility. You need beefy hardware (or a beefy hardware decoder) to even pull off 8K video decoding in the first place, though, and that's definitely something to note. 

I dunno. I guess I'm less pessimistic about things that just come out that are a revolution for tech. I was rather optimistic about RTX when that first made its appearance onto the PC space. Everyone was thinking that Battlefield V at 1080p with ray-tracing barely cranking out 60fps was unimpressive, but I also understood that these were wee days for hardware-accelerated ray-tracing, something that, even with low amounts of rays and a lot of compromise, was only really possible on server farms in real-time. Look at where ray-tracing in real-time on the PC space is now: in a much better spot than it was two years ago, that's for sure. 

I'll happily eat crow if Apple cocks up their ARM CPUs for their actual desktop and laptop solutions later on down the road, but I think it'll take a little bit of time for these to really start shining.



31 minutes ago, Arika S said:

I'm disappointed at it being made by Apple and therefore never making it outside of their ecosystem. 

 

I've wanted powerful ARM processors on laptops and tablets for a while running Windows.

What about Linux?


11 minutes ago, handymanshandle said:

Fair enough about the 8K point. We're a month into Apple transitioning into a new CPU architecture, though, and much like damn near anything that's new, things tend to improve rapidly in terms of performance, stability and compatibility. You need beefy hardware (or a beefy hardware decoder) to even pull off 8K video decoding in the first place, though, and that's definitely something to note. 

I dunno. I guess I'm less pessimistic about things that just come out that are a revolution for tech. I was rather optimistic about RTX when that first made its appearance onto the PC space. Everyone was thinking that Battlefield V at 1080p with ray-tracing barely cranking out 60fps was unimpressive, but I also understood that these were wee days for hardware-accelerated ray-tracing, something that, even with low amounts of rays and a lot of compromise, was only really possible on server farms in real-time. Look at where ray-tracing in real-time on the PC space is now: in a much better spot than it was two years ago, that's for sure. 

I'll happily eat crow if Apple cocks up their ARM CPUs for their actual desktop and laptop solutions later on down the road, but I think it'll take a little bit of time for these to really start shining.

For clarification, I don't disagree with you that the decoder is impressive. I just meant it is not a fair comparison. One way or the other, the ARM transition will work out for Apple.


I'm not sure what the benefit of 8K video playback, or even the resolution, is, given just how expensive such displays are. Is there any content produced in 8K?


8 minutes ago, Nowak said:

What about Linux?

You mean the three people who use Linux desktops? JK. I would have loved to see Linux performance on the Surface Pro X, but it is a custom CPU. For now we only have the Raspberry Pi on the consumer side of things.


1 minute ago, BlakSpectre said:

You mean the three people who use Linux desktops? JK. I would have loved to see Linux performance on the Surface Pro X, but it is a custom CPU. For now we only have the Raspberry Pi on the consumer side of things.

The Pinebook Pro exists.


Fun fact: Jim Keller worked on the architectures of both Apple's SoCs and AMD's Ryzen.

It's interesting to see Keller's creations battle each other.

Intel hired Keller too, but there were a lot of problems there, and Keller quit before the end of his contract.

 

What I think about the M1:

Performance is really good, no doubt about it, but the chip is held back by the number of PCIe lanes on the SoC, and that bottlenecks GPU performance.

Also consider that the 5 nm process is a significant advantage that allows Apple to achieve higher transistor density than AMD, Intel, and Nvidia.

I wonder how the performance will be for AMD at 5nm.
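
The density gap can be put in rough numbers using commonly reported die figures. These are approximations, and density depends heavily on the logic/SRAM mix, so treat the comparison as indicative only.

```python
# Approximate transistor densities from commonly reported figures.
# Densities vary with the logic/SRAM/analog mix, so this is indicative only.

m1_transistors, m1_mm2 = 16e9, 119          # Apple M1, TSMC N5 (~119 mm^2)
zen3_transistors, zen3_mm2 = 4.15e9, 80.7   # Zen 3 compute die (CCD), TSMC N7

m1_density = m1_transistors / m1_mm2 / 1e6        # million transistors / mm^2
zen3_density = zen3_transistors / zen3_mm2 / 1e6
print(f"M1:        {m1_density:.0f} MTr/mm^2")
print(f"Zen 3 CCD: {zen3_density:.0f} MTr/mm^2")
```

By these figures, N5 gives Apple well over twice the transistor density of the N7 Zen 3 die, which is a big part of why the node advantage matters for the comparison.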

 

And for now, software support is lacking, so users will have problems for a while.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Just now, whm1974 said:

I'm not sure what the benefit of 8K video playback, or even the resolution, is, given just how expensive such displays are. Is there any content produced in 8K?

What is the disadvantage of being future-proof? Or of designing pipelines for the video editors using the machine?

 

Thinking along those lines, my disappointment is completely baseless... what is the advantage of multi-monitor support when 90% of users will never use it, or of two Thunderbolt controllers, or of better I/O bandwidth? Most people won't need more than a gigabit connection or more than one USB port.


2 minutes ago, Nowak said:

The Pinebook Pro exists.

True. I forgot about it, even though reviews of the device have been excellent (for the price, and ignoring QA issues).


6 minutes ago, Nowak said:

The Pinebook Pro exists.

Yeah, but calling that a consumer device is a bit of a stretch, imo.

Quote me to see my reply!

SPECS:

CPU: Ryzen 7 3700X Motherboard: MSI B450-A Pro Max RAM: 32GB I forget GPU: MSI Vega 56 Storage: 256GB NVMe boot, 512GB Samsung 850 Pro, 1TB WD Blue SSD, 1TB WD Blue HDD PSU: Inwin P85 850w Case: Fractal Design Define C Cooling: Stock for CPU, be quiet! case fans, Morpheus Vega w/ be quiet! Pure Wings 2 for GPU Monitor: 3x Thinkvision P24Q on a Steelcase Eyesite triple monitor stand Mouse: Logitech MX Master 3 Keyboard: Focus FK-9000 (heavily modded) Mousepad: Aliexpress cat special Headphones:  Sennheiser HD598SE and Sony Linkbuds

 

🏳️‍🌈


Just now, BlakSpectre said:

What is the disadvantage of being future-proof?

Even the best you can do now may not be enough for the future.

Just now, BlakSpectre said:

Or of designing pipelines for the video editors using the machine?

On Nvidia GPUs it comes with no compromises; it works great and has wide support from developers.

Even my GTX 1660 has the full Nvidia encoder/decoder (Turing).

3 minutes ago, BlakSpectre said:

Thinking along those lines, my disappointment is completely baseless... what is the advantage of multi-monitor support when 90% of users will never use it, or of two Thunderbolt controllers, or of better I/O bandwidth? Most people won't need more than a gigabit connection or more than one USB port.

More options and functionality are always better; computers are multi-role machines and are not designed for just one task.

 


8 minutes ago, BlakSpectre said:

What is the disadvantage of being future-proof? Or of designing pipelines for the video editors using the machine?

 

Thinking along those lines, my disappointment is completely baseless... what is the advantage of multi-monitor support when 90% of users will never use it, or of two Thunderbolt controllers, or of better I/O bandwidth? Most people won't need more than a gigabit connection or more than one USB port.

I will strongly disagree with you on not needing more than one USB port.


9 minutes ago, whm1974 said:

I will strongly disagree with you on not needing more than one USB port.

You are also on the LTT forum; you are a small percentage of the population. Most people don't need more than one USB port.

By your logic, since most people don't need 8K playback (limited to the ultra-rich and video editors) and adding it is of no use, adding more than one USB port for a small percentage of the population is also of no use.

I strongly disagree with you regarding 8K playback being unnecessary.


10 minutes ago, kelvinhall05 said:

Yeah, but calling that a consumer device is a bit of a stretch, imo.

I know, I'm just saying that it exists.


Just now, Nowak said:

I know, I'm just saying that it exists.

Fair, to me it just seemed like you were trying to bring it up as a consumer device designed to run Linux.

 

It is pretty cool though.


