
M1 Macs Reviewed

randomhkkid
59 minutes ago, LAwLz said:

Are you sure the cooling in the Mac Pro is bad?

The cooling in the MacBook Pro was bad, but I haven't heard complaints about the Mac Pro having a poor thermal solution.

I think the bigger issue with that benchmark is that it doesn't specify which Mac Pro they were testing against.

The Mac Pro goes all the way from an 8-core CPU to a 28-core CPU. The test was probably against the 8-core model at 3.5 GHz.

 

You are correct; I was specifically talking about the laptops. I also agree with your point about the desktop chips.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


57 minutes ago, Bombastinator said:

I think there have been some systems where it isn't bad and some where it is. The 16" was good in at least a few models, and the Mac Pro was good in, I think, all models; some systems weren't, though. Such an issue could only apply to the models that actually had it, as not all did. My memory is that the first 16" (2018? 2019?) was good, but following models had increasing problems.

 

56 minutes ago, randomhkkid said:

The MBP 16 actually has pretty good cooling as long as the CPU and GPU aren't loaded up at the same time. It's still not adequate for a combined CPU and GPU load (I have a whole Reddit thread on a VRM mod I did), but in terms of CPU benchmarks it's pretty comparable to top-of-the-line Windows laptops. Saying the M1-vs-Intel comparison is gimped isn't a valid knock on Apple.

 

Edit: To back up my claims, you can see here that I posted the MBP 16 can maintain 3.3 GHz on all 16 threads for an unlimited amount of time. Then there's Notebookcheck showing the Blade Pro 17" maintaining around 3.3-3.6 GHz in a stress test using a newer 10th-gen 8-core.
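(For reference, sustained-clock tests like these just pin every logical core with busy work and watch frequency over time. A crude sketch of that kind of all-core load, assuming a POSIX system with pthreads; the clocks themselves are read with a separate tool such as powermetrics or HWiNFO:

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* Spin forever so every hardware thread stays fully loaded. */
static void *spin(void *arg) {
    (void)arg;
    volatile unsigned long x = 0;
    for (;;) x++;    /* busy work, never sleeps */
    return NULL;
}

int main(void) {
    long n = sysconf(_SC_NPROCESSORS_ONLN);  /* logical cores, e.g. 16 */
    printf("Spinning %ld threads; Ctrl-C to stop\n", n);
    for (long i = 1; i < n; i++) {           /* main thread is the last one */
        pthread_t t;
        pthread_create(&t, NULL, spin, NULL);
    }
    spin(NULL);  /* never returns */
    return 0;
}

Build with cc -O2 -pthread. Real stress tests like Prime95 or looped Cinebench use heavier math that loads the chip harder, but the principle is the same.)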

 

38 minutes ago, Belgarathian said:

Or, alternatively, Intel has been consistently letting Apple down on the development front, so that Apple's hardware doesn't have sufficient cooling capacity. Hence the M1 we have today.

 

First, I did say some, not all.

 

Second, doesn't the MacBook Air completely lack contact on the Intel chip? That one device is beyond bizarre and amounts to complete sandbagging.

 

Third, Intel provides the chips; it is purely the responsibility of the OEM to build a chassis that can handle the parts that are ordered. We wouldn't and don't give this kind of excuse to Razer or Dell or anyone else. When it's bad, it's bad. And with the notable exception of the 16-inch MBP, Apple's designs have been shockingly bad for quite a while. The fact that the 16" MBP runs maybe within 10% of another OEM's clocks, and we treat that as an impressively good showing by Apple's standards, shows exactly how long they have been allowed to get away with the problem (and more than a few devices have chronic issues with dying due to thermals). When Apple's cooling design was so bad that the i9 variant was slower than the i7, that was purely their fault.

 

Fourth, it's impressive for the M1 to be close, even if I think the lead is mainly down to Apple being bad at cooling more than Intel being bad at progress. I just want to caveat that we are comparing against some of the worst-cooled laptops ever sold, and definitely the worst from a premium OEM this century.


I must admit, I am very impressed with the end-result performance.

It was above what I expected.

 

We might get this in the PC space five years from now, at Qualcomm's rate of minor incremental improvements each generation, unless MediaTek magically pulls an AMD out of nowhere. But considering that MediaTek's goal is to make SoCs that are basically free, I don't have my hopes up.

 

As for Microsoft's "custom" SoC... well, assuming they decide to go all in and actually invest in making a better SoC with Qualcomm, building up their own expertise will take years. I mean, Apple isn't new at this.

 

While I am a big supporter and see the future of the PC space in ARM, in all form factors, even desktops, the generational performance improvements (on Qualcomm's side) were not big enough to actually make this a viable solution: one that delivers an experience that doesn't noticeably affect the user no matter what they throw at the system, including tasks outside what the device's target audience would run.

 

 


15 minutes ago, vetali said:

Blizzard has native day-one support for ARM Macs with World of Warcraft. Hopefully somebody runs it and posts results, because that game is really CPU-intensive in certain areas. I saw one review running it, but it was likely through Rosetta because the patch only came out this morning.

 

https://us.forums.blizzard.com/en/wow/t/mac-support-update-november-16/722775
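(Side note for anyone benchmarking: macOS exposes a sysctl, sysctl.proc_translated, that reports whether the current process is running under Rosetta 2, so a test can confirm whether it's measuring native or translated code. A minimal C sketch; the sysctl simply doesn't exist on Intel Macs, which the error case covers:

#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if the process is translated by Rosetta 2, 0 if native,
   -1 if the sysctl is unavailable (e.g. Intel Macs). */
static int running_under_rosetta(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) != 0)
        return -1;
    return translated;
}

int main(void) {
    switch (running_under_rosetta()) {
        case 1:  puts("Running under Rosetta 2 (translated x86_64)"); break;
        case 0:  puts("Running natively");                            break;
        default: puts("Translation status unavailable");              break;
    }
    return 0;
}

Compile it once as arm64 and once as x86_64, and the x86_64 binary should report translation on an M1 machine.)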

It's amazing how quickly developers are supporting this.

It's day one of release and a ton of programs already have native versions. Meanwhile, on the Windows side, people like GoodBytes make a thread basically every single time someone ports a program to Windows on ARM. It's like "holy crap you guys, this random text editor was ported to Windows on ARM! This is very newsworthy!".


3 minutes ago, LAwLz said:

It's amazing how quickly developers are supporting this.

It's day one of release and a ton of programs already have native versions. Meanwhile, on the Windows side, people like GoodBytes make a thread basically every single time someone ports a program to Windows on ARM. It's like "holy crap you guys, this random text editor was ported to Windows on ARM! This is very newsworthy!".

I am hoping this helps bring real productivity, and not shitty games, to mobile devices in the next few years, but the lure of microtransactions will probably be too strong.

 

Two factors, of course: money per user, and the fact that Apple has made the transition as easy as possible if you had even thought about supporting Macs before now.


4 hours ago, iEimis said:

Veryyyyyyy impressive

 
 
 
 
 

I am not impressed by the GPU performance compared to Intel.

 

In my book, Intel has always treated their GPU as a freebie (despite raising the price of their CPUs when they incorporated it), really aimed at tasks like displaying the desktop environment and watching video, and that is about it.

 

Now it looks like Intel is actually putting effort into their Xe GPUs, so we will see. But Intel's effort there really came about because AMD became a competitor again... So anyway, we will see.

 

But what I do want to say is that I want to see how it compares to AMD APUs in terms of gaming performance.


8 minutes ago, Curufinwe_wins said:

 

 

 

First, I did say some, not all.

 

Second, doesn't the MacBook Air completely lack contact on the Intel chip? That one device is beyond bizarre and amounts to complete sandbagging.

Third, Intel provides the chips; it is purely the responsibility of the OEM to build a chassis that can handle the parts that are ordered. We wouldn't and don't give this kind of excuse to Razer or Dell or anyone else. When it's bad, it's bad. And with the notable exception of the 16-inch MBP, Apple's designs have been shockingly bad for quite a while. The fact that the 16" MBP runs maybe within 10% of another OEM's clocks, and we treat that as an impressively good showing by Apple's standards, shows exactly how long they have been allowed to get away with the problem (and more than a few devices have chronic issues with dying due to thermals). When Apple's cooling design was so bad that the i9 variant was slower than the i7, that was purely their fault.

Fourth, it's impressive for the M1 to be close, even if I think the lead is mainly down to Apple being bad at cooling more than Intel being bad at progress. I just want to caveat that we are comparing against some of the worst-cooled laptops ever sold, and definitely the worst from a premium OEM this century.

While I'm not sure of the specifics, Intel definitely works with some OEMs to provide support for the implementation of their chips.

 

Intel would certainly have been contributing support to Apple, as Apple was pushing Intel to further develop their efficiency and GPU performance for the Mac.


25 minutes ago, LAwLz said:

It's amazing how quickly developers are supporting this.

It's day one of release and a ton of programs already have native versions. Meanwhile, on the Windows side, people like GoodBytes make a thread basically every single time someone ports a program to Windows on ARM. It's like "holy crap you guys, this random text editor was ported to Windows on ARM! This is very newsworthy!".

 
 

Yup. That's just the reality.

For ages, Apple has been able to convince companies and their devs to implement features for new hardware or adopt new software technologies.

 

When it's Microsoft, companies and devs go (put in a silly way for comedic effect):

"Hmmm, how about no... we want a full business case that justifies the expense, even if the expense is an absolutely minor one. We also want people at the front of our doors, money in hand, begging for that feature for a minimum of 6 months without bathroom breaks, and THEN, and only then, we will gladly put it on our "Ideas to consider" list... it will be item 500, and we are doing item #2 today. So maybe, just maybe, in 20 or 25 years, if all goes well, with no delays anywhere, and if we as a company still exist, we will get on it!"

 

I never could really understand this. Is it because competition between applications is more active on Macs than on Windows?

Or is it because Apple users see value in these new features, and that is not the case with the Windows user base?

Heck, even Android faces a similar problem: even Spotify on iOS is better than on Android, despite the massive Android user base.


12 minutes ago, GoodBytes said:

I am not impressed by the GPU performance. Intel has always treated their GPU as a freebie (despite raising the price of their CPUs when they incorporated it), really aimed at tasks like displaying the desktop environment and watching video, and that is about it. For how many years do I recall Intel saying "but this time it really supports full OpenGL/DirectX", only for you to be faced with games that crash on an unsupported OpenGL/DirectX call (or visuals that simply aren't drawn)... I mean, that's an old story; I don't think it has been the case for many years now. But still, it demonstrates Intel's past level of commitment.

 

Now it looks like Intel is actually putting effort into their Xe GPUs, so we will see. But Intel's effort there really came about because AMD became a competitor again... So anyway, we will see.

 

I would like to see that, but what I do want to say is that I want to see it compete against AMD APUs in terms of gaming performance.

It would still be better than the U- and H-series mobile APUs, though not by as much. The real kicker in that source is actually that the Intel chip starts around 50 fps, and the laptop's cooler is such a piece of shit that it falls to a sustained mid-teens from thermal throttling.

 

On a rough estimate it looks similar to the 4900HS, which is rather impressive given the difference in power draw, though the HS is obviously not optimized the same way. And given the PS5 and XBSX efficiencies, I do expect that AMD could easily match or surpass the M1's GPU performance and efficiency at the moment. Nvidia definitely could as well; the only problem is that they don't yet have a good SoC to pair it with, and those watts of interconnect add up.

 

Oh, it's also not yet apparent whether the Apple Silicon GPUs are feature-complete relative to traditional desktop ones from AMD/Nvidia. Apple isn't telling, and Mac gaming is limited enough that it will take some time to find out how they handle tessellation, for example. Or similar.


12 minutes ago, Curufinwe_wins said:

It would still be better than the U- and H-series mobile APUs, though not by as much. The real kicker in that source is actually that the Intel chip starts around 50 fps, and the laptop's cooler is such a piece of shit that it falls to a sustained mid-teens from thermal throttling.

 
 

I see. Yes, because the cooling solution is not something buyers look at (it doesn't even cross their minds), it is the area that gets cheaped out on to hit a certain price point. The average buyer in the PC space doesn't consider cool-and-quiet operation (same goes for the touchpad, speakers, webcam, SSD/HDD speed, and, back in the old days, chipsets, when they mattered a lot more than today and OEMs would pick some cheap non-Intel/AMD chipset to save every penny they could). Buyers just look for the highest specs per dollar on a sticker that lists maybe 3 spec details, as if nothing else mattered.

 

Quote

On a rough estimate it looks similar to the 4900HS, which is rather impressive given the difference in power draw, though the HS is obviously not optimized the same way. And given the PS5 and XBSX efficiencies, I do expect that AMD could easily match or surpass the M1's GPU performance and efficiency at the moment. Nvidia definitely could as well; the only problem is that they don't yet have a good SoC to pair it with, and those watts of interconnect add up.

 
 
 
 
 
 

If Nvidia made an SoC for the PC space, that would be very interesting.


13 minutes ago, GoodBytes said:

I never could really understand this. Is it because competition between applications is more active on Macs than on Windows?

Or is it because Apple users see value in these new features, and that is not the case with the Windows user base?

Heck, even Android faces a similar problem: even Spotify on iOS is better than on Android, despite the massive Android user base.

One of the reasons for this, speaking as a developer of tools on the platform, is the APIs. Apple does a very, very good job of creating new APIs, but they also do a very good job of pushing devs to adopt those new APIs and system features. This pushes all developers forward, and it means apps that are not updated for a few years end up forgotten by the community, so if you make apps for Apple's platforms and want to make money from it, you need to stay on top of things. This, however, also means you can make money: since free older apps that are not well maintained end up dying out, your competition on the platform is other paid apps, not free tools. Because of this, there are very few indie app devs on Windows; most that are there target industry/enterprise, where the user is not the one paying the bill, so the app's UX is not the most important factor in keeping or converting a customer.


(I'll go through this thread later when I have time, but wanted to register a question while people are up.)

 

I watched MKBHD's overview/review of the M1 MBP. He's pretty positive about it, but he shows a sample Final Cut render running on the M1, his Intel MBP, and his Mac Pro. The Intel took about 10 minutes to do the render and the M1 took 12 (the Mac Pro, 7)... and nothing was made of this, in the video or in the YouTube comments. Did I misunderstand? It seemed to me that the M1 is slower than the Intel. (I didn't take note of the details of the different models, so maybe the point is that it's pretty good performance for something new, but the same task with a slower result seems like it should be an "uh oh" moment?)

 

[attached screenshot: Final Cut render times on the M1 MBP, Intel MBP, and Mac Pro]

 

EDIT: Blade of Grass gave me the context to understand what I was missing below.

 

🖥️ Motherboard: MSI A320M PRO-VH PLUS  ** Processor: AMD Ryzen 2600 3.4 GHz ** Video Card: Nvidia GeForce 1070 TI 8GB Zotac 1070ti 🖥️
🖥️ Memory: 32GB DDR4 2400  ** Power Supply: 650 Watts Power Supply Thermaltake +80 Bronze Thermaltake PSU 🖥️

🍎 2012 iMac i7 27"; 2007 MBP 2.2 GHz; Power Mac G5 Dual 2 GHz; B&W G3; Quadra 650; Mac SE 🍎

🍎 iPad Air2; iPhone SE 2020; iPhone 5s; AppleTV 4k 🍎


3 hours ago, Bombastinator said:

I have noticed some sandbagging via undercooling from Apple on their Intel chips too. As such, I am more interested in comparisons against Hackintoshes or Win10 machines. It is important to keep size considerations in mind, though.

I believe someone in this thread said the M1 beat a Hackintosh based on a Ryzen 9 5950X. Somewhere on page 2, I think.

I like cute animal pics.

Mac Studio | Ryzen 7 5800X3D + RTX 3090


9 hours ago, JoseGuya said:

I'm still very conflicted. Do I switch my aging 15 inch late 2013 MBP to the M1 MBP, or to the 2020 16 inch Intel? Do I wait for the 16 inch with Apple Silicon? WHAT TO DO

Wait for the 16-inch with Apple Silicon. By then some of the bugs will have been sorted out as well, and the 2013 MBP is still quite capable.


40 minutes ago, Video Beagle said:

(I'll go through this thread later when I have time, but wanted to register a question while people are up.)

 

I watched MKBHD's overview/review of the M1 MBP. He's pretty positive about it, but he shows a sample Final Cut render running on the M1, his Intel MBP, and his Mac Pro. The Intel took about 10 minutes to do the render and the M1 took 12 (the Mac Pro, 7)... and nothing was made of this, in the video or in the YouTube comments. Did I misunderstand? It seemed to me that the M1 is slower than the Intel. (I didn't take note of the details of the different models, so maybe the point is that it's pretty good performance for something new, but the same task with a slower result seems like it should be an "uh oh" moment?)

 

[attached screenshot: Final Cut render times on the M1 MBP, Intel MBP, and Mac Pro]

The top-of-the-line Intel system with a dedicated GPU is about 20% faster than the base model with the M1 and integrated graphics (and for the price difference between the laptops, you could probably buy another M1 laptop).
I can't imagine anyone is really that surprised by that? Why would that be an "uh oh" moment?
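(For the curious, the ~20% figure falls straight out of the render times quoted above, taking the rough readings of 10 minutes for the Intel MBP and 12 minutes for the M1:

\[
\frac{\text{speed}_{\text{Intel}}}{\text{speed}_{\text{M1}}} = \frac{t_{\text{M1}}}{t_{\text{Intel}}} = \frac{12\ \text{min}}{10\ \text{min}} = 1.2
\]

i.e., the Intel machine completes the same render about 20% faster.)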

15" MBP TB

AMD 5800X | Gigabyte Aorus Master | EVGA 2060 KO Ultra | Define 7 || Blade Server: Intel 3570k | GD65 | Corsair C70 | 13TB


9 minutes ago, Blade of Grass said:

The top-of-the-line Intel system with a dedicated GPU is about 20% faster than the base model with the M1 and integrated graphics (and for the price difference between the laptops, you could probably buy another M1 laptop).
I can't imagine anyone is really that surprised by that? Why would that be an "uh oh" moment?

"The top of the line Intel system with a dedicated GPU" THAT was the context I was missing! Thanks.


7 hours ago, BuckGup said:

So when are we going to see a Mac Pro with discrete Apple graphics and an equivalent Xeon-class M1 chip?

Craig said in the interview today that all the Macs will eventually be updated to Apple Silicon.

 

Quote

“It seems like some of these people were people who don’t buy that part of our product line right now are eager for us to develop silicon to address the part of the product line that they’re most passionate about,” Federighi told me. “You know that their day will come. But for now, the systems we’re building are, in every way I can consider, superior to the ones they’ve replaced.”

Source: https://9to5mac.com/2020/11/17/apple-executive-interview-m1-apple-silicon/

 

4 hours ago, GoodBytes said:

I am not impressed by the GPU performance compared to Intel.

In my book, Intel has always treated their GPU as a freebie (despite raising the price of their CPUs when they incorporated it), really aimed at tasks like displaying the desktop environment and watching video, and that is about it.

Now it looks like Intel is actually putting effort into their Xe GPUs, so we will see. But Intel's effort there really came about because AMD became a competitor again... So anyway, we will see.

I would like to see that, but what I do want to say is that I want to see it compete against AMD APUs in terms of gaming performance.

It trounces all iGPUs as far as I know, with power consumption in the 10W to 20W range. It comes really close to the GTX 1050 and RX 580, both of which are dedicated, power-hungry GPUs. I don't think we've ever had such a powerful iGPU before.

 

https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/3

4 hours ago, GoodBytes said:

I never could really understand this. Is it because competition between applications is more active on Macs than on Windows?

Or is it because Apple users see value in these new features, and that is not the case with the Windows user base?

Heck, even Android faces a similar problem: even Spotify on iOS is better than on Android, despite the massive Android user base.

That's quite simple: Apple moves on quite fast. If developers don't take advantage of their new APIs or hardware, they will usually get chucked off the platform in no time. They dropped support for 32-bit apps quite recently on the Mac, and before that on iOS.

 

And Apple also makes developer tools that make all these transitions easy. That's why I laughed at everyone who claimed that no major apps would be updated for ARM. Doing that would be suicide in the Apple ecosystem, unless they really want Apple to leave them in the dust. On the Windows side there's no such worry, and up till now ARM Windows devices have just been Microsoft's attempt at something that wasn't going to make any difference in the industry as a whole.


23 minutes ago, RedRound2 said:

That's quite simple: Apple moves on quite fast. If developers don't take advantage of their new APIs or hardware, they will usually get chucked off the platform in no time. They dropped support for 32-bit apps quite recently on the Mac, and before that on iOS.

Actually, if Microsoft did this, no one would implement anything new.

 

23 minutes ago, RedRound2 said:

And Apple also makes developer tools that make all these transitions easy. That's why I laughed at everyone who claimed that no major apps would be updated for ARM. Doing that would be suicide in the Apple ecosystem, unless they really want Apple to leave them in the dust. On the Windows side there's no such worry, and up till now ARM Windows devices have just been Microsoft's attempt at something that wasn't going to make any difference in the industry as a whole.

True, but so has Microsoft in recent years. If you want to make an ARM version of your project, Microsoft has made it as easy as picking "ARM64" from a dropdown on Visual Studio's main toolbar and hitting the Build/Compile button. The compiler takes care of generating optimized ARM code for the developer.

 

The problem is that companies go, "Well, now there will be a new release/version of our software that we will need to support. It might have its own specific bugs. Where is the user base?" There is no excitement, no "Yeah, let's try it, have fun, push things forward!"; it is all business analysis. And so you always end up with the chicken-and-egg problem.
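(To illustrate how small the porting effort can be for plain native code: the same C source builds for either target with only the Platform setting changed, and the compiler's predefined macros tell you what you got. A minimal sketch; _M_ARM64 and _M_X64 are MSVC's macros, with the GCC/Clang equivalents included for completeness:

#include <stdio.h>

int main(void) {
    /* MSVC defines _M_ARM64 when targeting ARM64 and _M_X64 for x64;
       GCC/Clang define __aarch64__ and __x86_64__ instead. */
#if defined(_M_ARM64) || defined(__aarch64__)
    puts("Native ARM64 build");
#elif defined(_M_X64) || defined(__x86_64__)
    puts("x64 build (would run emulated on Windows on ARM)");
#else
    puts("Some other architecture");
#endif
    return 0;
}

Code with inline x86 assembly or SSE/AVX intrinsics is where the real porting work hides; everything else mostly is that dropdown.)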

 


3 hours ago, n0stalghia said:

I believe someone in this thread said the M1 beat a Hackintosh based on a Ryzen 9 5950X. Somewhere on page 2, I think.

Yah.  There was a video where the reviewer mentioned it.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


18 minutes ago, GoodBytes said:

Actually, if Microsoft did this, no one would implement anything new.

True, but so has Microsoft in recent years. If you want to make an ARM version of your project, Microsoft has made it as easy as picking "ARM64" from a dropdown on Visual Studio's main toolbar and hitting the Build/Compile button. The compiler takes care of generating optimized ARM code for the developer.

The problem is that companies go, "Well, now there will be a new release/version of our software that we will need to support. It might have its own specific bugs. Where is the user base?" There is no excitement, no "Yeah, let's try it, have fun, push things forward!"; it is all business analysis. And so you always end up with the chicken-and-egg problem.

 

Yeah Microsoft and software companies really don't have a reason to make an ARM version, not when everything runs on x86 just fine.


This video shows more real-world performance of the fanless MacBook Air, and it's nothing short of impressive. The way it handles 4K, 4K HDR, and Canon R5 10-bit content with ease is honestly mind-blowing.

 

Suddenly, what used to take a minimum $3,000 computer can be had for $999.

 

 

Also, many reviewers are saying that the new devices are very responsive and almost have an iPad-like feel when launching apps. That's not something that can be shown on paper, but it's a huge improvement in experience for the end user.

 

And these devices seem to be flying off the shelves, as shipping dates have slipped to weeks out since the embargo lifted.


9 hours ago, Curufinwe_wins said:

I mean yes, but also the A14 only managed a 5% IPC improvement over the A13, and the A13 was less of an improvement over the A12 than the A12 was over the A11. I don't want to pretend I'm expecting Apple to flounder, but their CPU growth has been slowing down recently. GPUs and other dedicated silicon? That is flying ahead. But it's worth noting that Apple also seems to be hitting some diminishing returns.

 

This may not actually be the worst relative position Apple has had compared to incumbent x86, but that all depends on how much iterative performance each side can deliver from here on out. And we've been waiting quite a while for Intel to pick back up. Obviously, AMD has been punching out YoY gains far above anyone else at the moment.

 

 

 

 

 

Also, also! This literally validates Linus's position, IMO. The M1 is only 10-15% faster than the A14 despite using almost 4x the power (comparing peak to peak, iPhone vs. Mac mini). I do hope Apple can broaden their performance curves a bit more moving forward.

 

It is an iPad. Now it turns out iPads are pretty darn powerful, though.

How in the hell is this just an iPad?


32 minutes ago, GoodBytes said:

Actually, if Microsoft did this, no one would implement anything new.

And everyone knows they can't. They're already struggling with unsupported OSes like XP and 7 out in the wild. So everyone knows that Microsoft can't really decide to do anything that fast.

32 minutes ago, GoodBytes said:

True, but so has Microsoft in recent years. If you want to make an ARM version of your project, Microsoft has made it as easy as picking "ARM64" from a dropdown on Visual Studio's main toolbar and hitting the Build/Compile button. The compiler takes care of generating optimized ARM code for the developer.

The problem is that companies go, "Well, now there will be a new release/version of our software that we will need to support. It might have its own specific bugs. Where is the user base?" There is no excitement, no "Yeah, let's try it, have fun, push things forward!"; it is all business analysis. And so you always end up with the chicken-and-egg problem.

 

Another factor is the lack of competitive hardware. If Qualcomm is smart, they should be upping their PC ARM processor R&D and developing something remotely as compelling as the M1 for Windows right now.

 

Once we have a minimum level of performance, Microsoft can start slowly pushing ARM with exclusive features or something.

 

Also, can you provide a link to the tool you mentioned above? I've never heard of it before.


19 minutes ago, RedRound2 said:

If Qualcomm is smart, they should be upping their PC ARM processor R&D and developing something remotely as compelling as the M1 for Windows right now.

Agreed. Instead of holding competitions to see who can point at Apple, Intel, and AMD the hardest, people should be looking to Qualcomm, MediaTek, or hell, even Samsung to make new strides in ARM SoCs for Windows and Linux.

🌲🌲🌲

 

 

 

◒ ◒ 

