
Apple M1 Ultra - 2nd highest multicore score, lost to 64-core AMD Threadripper.

TheReal1980
8 minutes ago, leadeater said:

 

 

Doesn't really need a deep investigation, has been done already heh.

Exactly. Yet Apple continues with the "we're equal at lower power draw" statement, when everyone who isn't tone-deaf knows that for most people GPU performance means measuring in games or Blender.


1 hour ago, leadeater said:

You really should try listening to someone in the industry. I've done a lot of Mac and Windows support in the education sector, tertiary (university/college) and general schools, and one thing holds true: almost none of the teaching in engineering is done on Macs or macOS. People had Macs running Windows, but all the software was Windows.

Engineering is a lost cause, but here I was debating architects, who are much more creative/visual-design driven, a typical stronghold of Macs.

 

As I said, every time I see a documentary about an architecture firm, there is a fleet of Macs in the background of the shoot.

1 hour ago, leadeater said:

Some of the software is horribly written too, like VectorWorks. Properly mass-deploying VectorWorks silently to the three 30-seat engineering computer labs required a bunch of pissing around with registry-change capturing, because the silent install method didn't (at the time) work correctly and do everything, so we had to drop in a reg import after install to make it work properly and actually do things like talk to the license server.

Registry mods? So the Windows version was poorly written? There is no registry on macOS.


On 3/9/2022 at 10:26 PM, gjsman said:

Take the M1 Max score, increase it a bit because the M1 Max at the time was tested inside a laptop chassis without desktop-level cooling, and then double it. 3090 territory?

I don't get why this SOTR benchmark keeps getting used. It's running on a native API on Windows and through a translation layer on AS. That's a very bad comparison.

On 3/9/2022 at 10:35 PM, AluminiumTech said:

If you take Apple's word for it, the M1 Ultra's graphics are 80% faster than a W6900X, which is a 6900XT, and thus the 3090.

No idea where they would've claimed that. I remember them claiming equal perf.

And until we have toe-to-toe native-vs-native graphics API comparisons of the same benchmark/game, we won't know.


3 hours ago, Paul Thexton said:

Completely expected. I just don't get why Apple continually shoots themselves in the foot when talking about their GPUs' relative performance against PC hardware. It's really, really weird.

I don't think it is that simple. Many reviewers use Shadow of the Tomb Raider or Geekbench 5, both of which are poorly optimized for AS. Here are the results from one of Matthew Moniz's videos comparing the 14-inch M1 Max to a desktop with a 12900K and a 3080.
[attached image: benchmark results, 14-inch M1 Max vs. 12900K + RTX 3080 desktop]

 

As you can see, WoW is the only native AS game in this list, and it runs pretty well. Remember, this is the 14-inch M1 Max. Another example would be The Pathless (Apple Arcade), which was running under Rosetta and was recently updated to a native ARM build with better Metal support. The performance gains are huge: at 1080p, the FPS went from 45-60 to a locked 120 after the update on the M1 Max.


2 hours ago, Dracarris said:

Registry mods? So the Windows version was poorly written? There is no registry on macOS..

And? Don't assume it's well written for macOS either lol. There's a big-ass difference between installing software on a Mac and installing software on 90 Macs while having the software point at the required license server. Don't kid yourself, I've had to do the same BS of dropping plist files etc. on macOS as well, with Jamf.

 

Bad software exists everywhere 😉

 

2 hours ago, Dracarris said:

Engineering is a lost cause, but here I was debating Architects which are much more creative/visual design driven, a typical stronghold of Macs.

 

As I said, every time I see a documentary about an architect firm there is a fleet of Macs in the background of the shoot.

So? TV is literally staged for show. The architect who designed the house we built didn't do it on a Mac; I don't think the entire company had a Mac at all.

 

Please refer to the last line of my post: your assessment isn't equivalent to that of the commenter, whose lifelong career this is, by the sound of it. Given that, along with my own experience supporting such software, I'll side with "the Mac simply isn't there for wide usage." You can use it, but that's not the same as being a viable replacement for anyone and everyone, or even widely for that matter.


Yep, I'm aware of the update to The Pathless; it does look like good gains were made there. I've not played it yet myself, and from gameplay footage it doesn't look like a game that will maintain my interest, but I'll probably check it out eventually.
 

As for the likes of SOTR, I know the macOS port was done by Feral Interactive, but I don't know whether they ported it directly to Metal or whether they're relying on MoltenVK. If it's the latter, then yeah, you have two layers of abstraction, once you include Rosetta 2, that will get in the way of performance.
 

That's not the fault of the reviewers though; they can only show how the software they're testing performs. As far as I know, World of Warcraft isn't easy to run as a reliably consistent benchmark?

 

 


17 minutes ago, leadeater said:

And? Don't assume it's well written for Mac OS either lol.

I have no clue how your earlier comment was relevant to the discussion we were having. It might just as well be written poorly for macOS, or not. We don't know.

 

14 minutes ago, Paul Thexton said:

That's not the fault of the reviewers though; they can only show how the software they're testing performs. As far as I know, World of Warcraft isn't easy to run as a reliably consistent benchmark?

 

It's not the fault of the reviewers, sure. But from the current game benchmarks we have, it's hard to estimate the true GPU performance, since almost all of them go through one or more translation layers. A toe-to-toe comparison on native APIs on both Windows and M1 of something like SOTR would be super interesting.


16 minutes ago, Dracarris said:

A toe-to-toe comparison on native APIs on both Windows and M1 of something like SOTR would be super interesting.

No argument from me on that. Whether we'll get one remains to be seen; I reckon most studios would rather knock a port out quickly using MoltenVK (presuming they already have a Vulkan-based renderer) than do all the Metal API work themselves.


1 hour ago, Dracarris said:

I have no clue how your earlier comment was relevant to the discussion we were having. It might just as well be written poorly for macOS, or not. We don't know.

Well, as I said in my original post, the existence of software on macOS does not mean it's the same, equally as good, or has the same feature support as the same software on Windows. My post, like the other person's, is illustrating our actual experiences with the software, and I also happen to have experience with AutoCAD products on macOS and iOS too.

 

You just seem to be extremely unreceptive to input from people with experience in such software 🤷‍♂️

 

You singled out that part of my post literally only to have a jab at Windows, and in doing so made an irrelevant post of the kind you're now trying to claim mine is.

 

Again, how about you learn to accept comments from those in the know, rather than giving fly-by-night comments from above without the understanding that comes from using the software or actually being in that line of work?

 

It's all well and good to point out that some of the software exists on macOS, but you are being WAY too persistent that macOS would be just fine, without really even knowing whether it would be.

 

Also, I originally used "engineering" as that is most typically the department name, or more fully the School of Engineering. All the courses like architecture come under that. While the School of Design might also do some architecture, it's a totally different thing, one being arts-based and having to be translated across to an actual architectural design. Pretty pictures do not make buildings.


There's a really simple explanation to this and it's not architecture.

 

5nm

 

Look at how Apple never uses AMD to compare.

Specs: Motherboard: Asus X470-PLUS TUF Gaming (yes, I know it's poor, but I wasn't informed) | RAM: Corsair Vengeance LPX DDR4-3200 CL16-18-18-36 2x8GB | CPU: Ryzen 9 5900X | Case: Antec P8 | PSU: Corsair RM850x | Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM fans | Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 | GPU: EVGA RTX 2080 Ti Black Edition


4 hours ago, Obioban said:

Sadly I agree. I'm stuck in windows for work, because my job is primarily in Solidworks.

 

... if they ever released a (competent) macOS version, I'd be lobbying IT hard.

May well change, maybe soon. Not having to buy overpriced Nvidia workstation cards for the extra VRAM or Pro drivers is quite a big drawcard, along with getting way more usable VRAM than most would otherwise get for the equivalent cost. It's really dumb that in the past you had to buy low-end yet expensive Quadro cards for SolidWorks, for no other reason than the required driver features being deprecated in the gaming drivers.


13 minutes ago, leadeater said:

Also, I originally used "engineering" as that is most typically the department name, or more fully the School of Engineering. All the courses like architecture come under that. While the School of Design might also do some architecture, it's a totally different thing, one being arts-based and having to be translated across to an actual architectural design. Pretty pictures do not make buildings.

It's one thing if the organization is done that way at your school. In Europe, e.g., universities and colleges are often not organized as "School of X"; there are simply departments, and architecture is its own separate one, not associated with engineering in any way. Students there (the ones who design actual buildings) receive a Bachelor of Arts, in contrast to the BSc handed out for all engineering-related studies. And if you speak to architects and mechanical/electrical/computer/civil/whatever engineers and compare them, I don't think you'd come to the conclusion that they should all be referred to as engineers.


13 minutes ago, Dracarris said:

It's one thing if the organization is done that way at your school. In Europe, e.g., universities and colleges are often not organized as "School of X"; there are simply departments, and architecture is its own separate one, not associated with engineering in any way. Students there (the ones who design actual buildings) receive a Bachelor of Arts, in contrast to the BSc handed out for all engineering-related studies. And if you speak to architects and mechanical/electrical/computer/civil/whatever engineers and compare them, I don't think you'd come to the conclusion that they should all be referred to as engineers.

Eh, it's all different here too; one place has it under one department and another has it under another. Whether you call it a department or a School of X is literally the same thing; some places just like to name things their own way. The point still remains: there are different kinds of architecture, one more structural and one more arts-based. It really depends on what a person is actually interested in doing.


1 hour ago, williamcll said:

There's a really simple explanation to this and it's not architecture.

 

5nm

 

Look at how apple never uses AMD to compare.

tbh that could also be because they want to bring over Intel MacBook owners, for whom comparisons to AMD wouldn't really work as well

✨FNIGE✨


1 hour ago, FnigePython said:

tbh that could also be because they want to bring over Intel MacBook owners, for whom comparisons to AMD wouldn't really work as well

MacBooks don't use the i9-12900K and 3090 though, or any recent Intel CPU, and the Intel CPUs that did go into MacBooks are quite a bit different from the more recent ones used in comparisons.


5 minutes ago, KaitouX said:

MacBooks don't use the i9-12900K and 3090 though, or any recent Intel CPU, and the Intel CPUs that did go into MacBooks are quite a bit different from the more recent ones used in comparisons.

Yeah, true. Though the 12900K + 3090 iirc is more powerful than AMD's offerings, which means it would be a better comparison than a 5900X + 3090 or 6900XT. (The only reason I'm not saying 5950X is that people probably don't see it as a consumer chip, and you'd want to compare consumer chips.)

✨FNIGE✨


20 hours ago, LAwLz said:

And the "cool trick" Apple uses for Rosetta 2 is just standard Arm instructions. The idea that Apple are doing some specialized hardware to get higher performance translation seems to be a false assumption people made early on. 

That's not quite true; Apple's cores do have some advantages for Rosetta beyond the ARM64 spec. There are two major bits that Apple added:

1) Memory ordering modes: the ordering guarantees provided by non-atomic memory operations on x86 and ARM64 are different. M1 cores can be switched into a mode that behaves like x86 (total store ordering), which means Apple does not need to use costly atomic memory operations to still fulfil the application's expectations. Other ARM platforms running x86 code need to replace almost all memory operations with atomic ones, or run the risk of random errors. (Atomic memory operations are much, much slower.)

2) Supporting both 4 KB and 16 KB page sizes: M1 CPUs can switch between 4 KB pages (needed for x86 apps) and 16 KB pages, and running with 16 KB pages does improve performance for native apps. If Apple had supported only the 16 KB page size, there would be a big perf hit, as almost all load/store operations would need extra instructions added around them to ensure the data was read correctly. And if Apple had built their CPU hard-coded to a 4 KB page size, the performance advantage of 16 KB pages would not be there for native apps. Note that the Windows-on-ARM devices use CPUs with a 4 KB page size only!


8 hours ago, FnigePython said:

Yeah, true. Though the 12900K + 3090 iirc is more powerful than AMD's offerings, which means it would be a better comparison than a 5900X + 3090 or 6900XT. (The only reason I'm not saying 5950X is that people probably don't see it as a consumer chip, and you'd want to compare consumer chips.)

This whole "avoid AMD comparisons" idea is utter bollocks. People who think it's true should maybe have a look at the title of this thread.

11 hours ago, williamcll said:

5nm

No, just no. The tech node of course makes a big contribution, but the biggest factor is a properly designed and architected ARM RISC desktop-class SoC.

8 hours ago, hishnash said:

That's not quite true; Apple's cores do have some advantages for Rosetta beyond the ARM64 spec. There are two major bits that Apple added:

Yes, and those are not tricks but simply good design, the very thing we essentially want companies to do. And even if we call them tricks, or clever tricks, so what? It's an architectural aspect that enables performant execution of x86 code, period.


20 hours ago, leadeater said:

Like whatever, buy what anyone wants, but having a bad experience on a device half to a third of the cost, and then proclaiming how much better a different and vastly more expensive device is, is rather silly and just bad form.

The point I was making was one of relative cost. Between buying a $500 laptop every 2 years and buying a $1000 one every 6, the $1000 one ends up being cheaper.


27 minutes ago, DANK_AS_gay said:

The point I was making was one of relative cost. Between buying a $500 laptop every 2 years and buying a $1000 one every 6, the $1000 one ends up being cheaper.

Then do the same; a good Windows laptop will last just as long. I don't think you understood my point: stop buying or comparing low-end cheap laptops to more expensive options as if it's really proof of anything.

 

Not every $1000 laptop is as good as the next, of course, but there are plenty of good ones. I've never had a problem with an HP EliteBook lasting 6 years, nor an IBM/Lenovo ThinkPad; up-spend to equivalency and the differences evaporate real fast.

 

Edit:

Also, your example is one I have seen more than once, and it's quite often self-inflicted. A company won't fund more expensive Windows laptops, gets frustrated, authorizes buying much more expensive MacBook Pros, blames IT for buying bad devices, and totally ignores the budget constraint they imposed themselves, which is why they got cheap laptops the first go-round.

 

All you can do at that point is sigh and move on. It's literally not worth the argument; they have good devices they're happy with, and actually funded them.


1 minute ago, leadeater said:

up-spend to equivalency and differences evaporate real fast.

Fair enough. The experience I have at my school (different demographic with different interests) is that the $1000 MacBooks with their metal chassis do not break anywhere near as fast as the $1000 HPs that most people had freshman year (about 40% of the people in my grade had HP Envys) and no longer have, because they all broke. The main problem with Macs for companies would definitely be the lack of corporate support. For basic use in college or high school, a MacBook (or iPad, depending on whether you prefer taking notes on paper or digitally) is often the best option. Plus, there is the added security of Apple's tight integration with their hardware, meaning viruses are harder to get (I haven't had viruses on either Mac or Windows). So it is understandable to be frustrated at my appreciation of Macs, especially given your understanding and experience with previous Windows machines. However:

 

11 minutes ago, leadeater said:

Stop buying or comparing low end cheap laptops to more expensive options as if it's really proof of anything.

is kind of dumb. There is a perfectly valid reason to compare more expensive machines, and it is this: buying a more expensive machine now will save you money later.

That was and is the point I'm trying to make. I'm not saying that Dell makes garbage even at $1000 (actually, yes I am; Dell is shit, Lenovo and HP are fine). I am using the example of a $500 Dell breaking quickly to compare the cost of a new cheap laptop every 2 years vs. a new expensive laptop every 6. There are perfectly fine laptops from Lenovo and HP at that price, but saying I cannot compare the ACTUAL cost of something is actually really dumb. That's literally one of the main reasons people (incorrectly) say electric cars are better: because they are cheaper to run in the long run.


12 hours ago, hishnash said:

1) Memory ordering modes: the ordering guarantees provided by non-atomic memory operations on x86 and ARM64 are different. M1 cores can be switched into a mode that behaves like x86 (total store ordering), which means Apple does not need to use costly atomic memory operations to still fulfil the application's expectations. Other ARM platforms running x86 code need to replace almost all memory operations with atomic ones, or run the risk of random errors. (Atomic memory operations are much, much slower.)

Not sure what exactly you are referring to, but Arm has native instructions for easily bridging the gap between the looser memory model of regular Arm and the stricter memory model of x86.

 

Arm, not Apple, created several instructions specifically for this purpose. Anyone making Arm cores can implement them if they want. It's not Apple-specific.


1 hour ago, DANK_AS_gay said:

Fair enough. The experience I have at my school (different demographic with different interests) is that the $1000 MacBooks with their metal chassis do not break anywhere near as fast as the $1000 HPs

The metal chassis isn't the only factor determining how rugged a laptop is. Apple has had their share of issues with parts failing, and some really dumb things too, like display cables, or keyboards riveted to the inside of the chassis so you can't just replace the keyboard.

1 hour ago, DANK_AS_gay said:

is kind of dumb. There is a perfectly valid reason to compare more expensive machines, and it is this: buying a more expensive machine now will save you money later.

That was and is the point I'm trying to make. I'm not saying that Dell makes garbage even at $1000 (actually, yes I am; Dell is shit, Lenovo and HP are fine). I am using the example of a $500 Dell breaking quickly to compare the cost of a new cheap laptop every 2 years vs. a new expensive laptop every 6. There are perfectly fine laptops from Lenovo and HP at that price, but saying I cannot compare the ACTUAL cost of something is actually really dumb. That's literally one of the main reasons people (incorrectly) say electric cars are better: because they are cheaper to run in the long run.

Comparing a $500 Dell from a big-box store to a $1500 MacBook is a really flawed argument. Lenovo and HP make junk laptops as well, but their business-grade laptops are what you'd want to go for when buying a school laptop, and those are a fair comparison to a similarly priced MacBook. The $500 laptops are built to be cheap throwaway machines, although being able to upgrade one with more RAM and storage is still a nice point in their favor.


1 hour ago, DANK_AS_gay said:

That was and is the point I'm trying to make. I'm not saying that Dell makes garbage even at $1000 (actually, yes I am; Dell is shit, Lenovo and HP are fine),

I think with the most recent XPS lineup they've finally turned the corner; build-quality- and price-wise they're basically equivalent to MBPs or the X1 Carbon/whatever the larger ones are called.

 

We bought a quite expensive XPS 13 (around $2k) in 2017, and the keyboard was outright garbage, the space bar DOA; according to the internet, a widespread problem with no fix. The trackpad was outright unusable and started buckling after a year or so. The included USB-C port works with external dongles in only one plug orientation, JFC. And the 5 cm cable of the dock we bought as an add-on started getting intermittent contact problems after a few months of use.

