Apple April 20 event - New iMac and iPad Pro with M1 chips, 'AirTags' and a new iPhone color

3 hours ago, Distinctly Average said:

I wouldn’t be surprised if all three new devices actually use the same system board. Makes sense from a production point of view. If that is the case, then the iPad will probably in time run all Mac software.

If they do, it will be discovered, and some really interesting hackintoshes will emerge. “Just buy an iPad and run this software suite on it. Boom. macOS.”


17 minutes ago, Bombastinator said:

If they do, it will be discovered, and some really interesting hackintoshes will emerge. “Just buy an iPad and run this software suite on it. Boom. macOS.”

While the system board may be the same one, there will probably be different firmware, storage chips, power connectors, etc. It is easy to design a PCB with a number of options and then just let the pick-and-place machine populate the desired configuration. So, for instance, the iPad will need a charge controller and battery management, but the iMac would not, just a power connector instead. There may be a need for different connectors for the different screens, and probably more power for the larger ones. So while the system board could be the same one, it may be configured differently for each use. That is a guess, but one based on my experience. It will be interesting to see the teardowns that will no doubt arrive very soon, and also whether macOS on an iPad is possible.


every time i see an iPad i'm always wondering: why hasn't apple come out with something as good as microsoft's RDC, a tool i've come to appreciate more and more over the past year. VNC viewers don't even come close to it, and apple's screen sharing is just as bad as they are.

Was also expecting a refresh of the standard iPad or even the mini. Probably means the mini is closer to getting shafted. As for the iPad, i don't know if they're playing the long game like they do with the SE, keeping it as a placeholder to recycle older iPad designs no longer in use whilst retrofitting the latest chip.

Another thing i would have loved to see is the iPod. inb4 "just get an iPhone": not everyone wants one.

Macs: meh, i'm entirely done with those. Too much salt thrown over the past 20 years for me to ever care about them again.

Apple TV: never found a use for it, or for any other product in this category.

airTags: maybe i'm just old, but: wtf? why?

iPhone purple: needed to be a darker shade of purple; too bright to call it the PIMP phone.

 


3 minutes ago, suicidalfranco said:

Was also expecting a refresh of the standard iPad or even the mini. Probably means the mini is closer to getting shafted. As for the iPad, i don't know if they're playing the long game like they do with the SE, keeping it as a placeholder to recycle older iPad designs no longer in use whilst retrofitting the latest chip.

I'd imagine Apple will just keep equipping them with the same chips that go in the iPhone, with the higher-end M chips reserved for the 'pro' models.

5 minutes ago, suicidalfranco said:

Another thing i would have loved to see is the iPod. inb4 "just get an iPhone": not everyone wants one.

I'd imagine they don't sell well, and after all, at least the more recent iPods are just older iPhone designs with refreshed internals and no SIM card support. The iPhone SE has probably taken the spot that the iPod had before.


5 minutes ago, AndreiArgeanu said:

the more recent iPods are just older iPhone designs with refreshed internals and no SIM card support.

Not even close; they're way thinner for one, and they never shared the same backplate as the iPhone.

Them not selling well is expected; the iPhone pretty much cannibalized its market. But it's still the no-commitment way into iOS, so a refresh would have been welcome.

8 minutes ago, AndreiArgeanu said:

Apple will just keep equipping them with the same chips that go in the iPhone

Just to be clear, I'm talking about the iPad, not the Air.


9 minutes ago, suicidalfranco said:

Just to be clear, I'm talking about the iPad, not the Air.

Oh, I guess they don't do that anymore for the normal iPad and the mini; they used to, at least.


11 hours ago, leadeater said:

The overwhelming majority can do everything they need with 16GB of RAM. You can photo and video edit on the M1 with 16GB of RAM. The iMac, in any configuration, has never been a replacement or substitute for a workstation, not even in Apple's product lines. There is a thing called scope creep; that's what you're doing. Everything this M1 iMac can't do was never done on an iMac in the first place.

 

Don't buy a two-door two-seater car and complain you can't get the 4 people in it you needed to; that's your fault, not the car's.

 

Their marketing used the 8GB model, which is the one I was explicitly pointing out as being under-spec'd for EVERYTHING; compared to the iMacs with dGPUs, those GPUs had an additional 4GB of their own. 16GB should have been a standard configuration in all devices since 2016. The fact that we're still seeing 8GB in systems that otherwise might have good specs, and can't be upgraded, is just wasteful.

 

10 hours ago, Spindel said:

Don't bother arguing; he/she argued with me the other day about the specs needed to run AutoCAD and said it was impossible on integrated graphics. This while I was sitting and doing actual professional (as in what I make my money from) AutoCAD work on integrated graphics on an M1, on a drawing I have also worked on with a Windows laptop with integrated graphics (when not in my home office).

Don't bring personal attacks here, bud. You're not doing AutoCAD on an iGPU any more than you're going to do Photoshop on an iGPU. You're wasting your own productive time running AutoCAD at 4fps. I have a hard time believing you or anyone else is employed in a professional environment that deploys Chromebook-tier computers, unless the IT department is incompetent or run by accountants. I actually work at a Fortune 500 engineering company, and it's usually the employee or their manager that misunderstands the specs the employee needs.

 

So I had to spell it out:

17" Laptop with a Quadro (A/T/P 3000), 64GB = Engineers running AutoCAD/Civ3D/ArcGIS/etc

15" Laptop with a Quadro (A/T/P 1000), 32GB = Office staff who review CAD, Acrobat, Photoshop, Premiere Pro and the ones working on massive spreadsheets. These are 30% below the recommended specs for CAD.

12/13/14" Laptop with iGPU, 16GB = This the the minimum required to run All Adobe software, Office Software, and is 85% below the recommended requirements for AutoCAD and 30% below the minimum requirements for AutoCAD.

 

 

If I even suggest that a lesser laptop can handle a program, they (people who haven't been informed) will take that as a cue that they can. Just because the software says "4GB minimum" does not mean you can load every project. There are people I know who use Photoshop professionally who need 64GB+ of RAM because they are working on projects with 200 layers. There are people I know who actually use AutoCAD professionally who need 64GB of RAM just to open their projects, never mind edit them. There are engineers that mistakenly picked 14" laptops with iGPUs and have wasted clients' time by being unable to load their billion-dollar project. Time is money.

 

If you are nickel-and-diming over the cost of the computer when that cost is a drop in the bucket, you are not the right person for the job. Plenty of people here on this forum and elsewhere make this mistake, especially when they use the argument of "I can build a cheaper PC for the cost of a Mac". Nobody cares about that in a professional environment. Either the computer meets the requirements of the PROJECT/department, or you don't even consider it.

 

The iMacs here? Probably sufficient for schools, where Apple has traditionally been the computer of choice. They're certainly underwhelming for typical office use, especially the 8GB models. I'd argue the iPad Pros are a better option than the iMacs, because you at least get something portable for the same lifetime.

 

And never mind scope creep. Apple has a missing middle in their lineup, and has had that missing middle ever since the G5. All the M1 switch has done is make that missing middle wider.

 

It's 2021: if you are still selling hardware with 8GB, you're targeting the Chromebook market, not the home professional and not the enterprise business.

 


6 hours ago, Bombastinator said:

Duct tape.

Ahh yes, the best pcmr fix


14 hours ago, LAwLz said:

That is one beast of an iPad.

Fantastic hardware but it feels like it will be severely held back by software. Maybe the goal is that as more Macs become ARM based, Apple will be able to make Mac software run on the iPad as well. 

As soon as iPadOS gets updates that let it match up with macOS, these iPads are gonna fly off the shelves (they're already gonna fly off the shelves, but even more so). But that software might come next year, or next decade, or never, imo, since Apple might not actually want an iPad to do what a MacBook can.


4 hours ago, Kisai said:

The fact that we're still seeing 8GB in systems that otherwise might have good specs, and can't be upgraded, is just wasteful.

True, I do agree with that, but for Apple's M1 SoC it is a little harder, as we're not using DIMMs but memory chips directly on the same package, so achieving high capacities is actually a lot harder. So long as the 8GB works well and does what is needed then there isn't a problem, and for the most part that is the case, but I'd still much rather see 16GB as the standard.

 

Apple is just limited by what LPDDR4X chips exist, so there isn't anything they can do about it. The highest-capacity chips are expensive, so having a cheaper lower-end option is a good thing, not a bad thing; it's up to the buyer to know what they require and purchase that.


5 hours ago, Kisai said:

and is 85% below the recommended requirements for AutoCAD and 30% below the minimum requirements for AutoCAD.

I don't know what you have been reading, but whatever it is, it's not anything official from Autodesk; below are the actual official requirements for AutoCAD 2022.

 

[Screenshot: Autodesk's official system requirements table for AutoCAD 2022]

https://knowledge.autodesk.com/support/autocad/troubleshooting/caas/sfdcarticles/sfdcarticles/System-requirements-for-AutoCAD-2022-including-Specialized-Toolsets.html

 

Aka the requirements to run AutoCAD are "be a computer"

 

Your requirements are the requirements of your workloads and your projects, not the requirements of AutoCAD nor the requirements of everyone else. AutoCAD projects come in all different sizes and complexities; not everyone is designing a Boeing 747, and not everyone is designing a simple PC case either.


8 hours ago, suicidalfranco said:

every time i see an iPad i'm always wondering: why hasn't apple come out with something as good as microsoft's RDC.

 

I assume you're abbreviating RDP?  a) It's had gaping security issues for years.  b) It doesn't support GPU acceleration, so it'll max out your CPU at higher resolutions.  c) It doesn't work outside local networks very well without port forwards and shit, and exposes gaping holes.  d) Its compression is very primitive and can't handle video streaming.  e) There is only third-party support for RDP via apps, so the quality is iffy.

 

Teamviewer is probably the best remote software but their licensing model blows and they only want enterprise customers nowadays.  I offered to buy a license and they didn't want to sell me one that would support multiple personal machines. 

Splashtop is second best IMO.  It has some quirks that Teamviewer doesn't have but it's very fast and reliable.

 

[I think RDP does support some form of GPU acceleration / virtual GPU but only with Windows Server remote hosts]


1 minute ago, AnonymousGuy said:

I assume you're abbreviating RDP?  a) It's had gaping security issues for years.  b) It doesn't support GPU acceleration, so it'll max out your CPU at higher resolutions.  c) It doesn't work outside local networks very well without port forwards and shit, and exposes gaping holes.  d) Its compression is very primitive and can't handle video streaming.

 

Teamviewer is probably the best remote software but their licensing model blows and they only want enterprise customers nowadays.  I offered to buy a license and they didn't want to sell me one that would support multiple personal machines. 

Splashtop is second best IMO.  It has some quirks that Teamviewer doesn't have but it's very fast and reliable.

RDP has gotten a lot better over the past few years. Lots of the compression issues don't happen anymore. Hell, I've literally played FH4 through RDP over the internet. 


I'll probably be buying the new iPad just to see what's up with it.  Apple always has a great return policy with no questions asked.  Maybe try using it as a secondary monitor with my Windows laptop.

 

I like the AirTags so I'll get a 4-pack just to fuck around with them.  Never bought a Tile, but with Apple's ecosystem AirTags will probably just work better all around.  Might be fun seeing if they work with checked baggage to locate a bag inside the terminal.  It's kinda creepy though that the "Find My" network seems to have no opt-out, so you're always going to be part of the relay system.


6 minutes ago, AnonymousGuy said:

I'll probably be buying the new iPad just to see what's up with it.  Apple always has a great return policy with no questions asked.  Maybe try using it as a secondary monitor with my Windows laptop.

 

I like the AirTags so I'll get a 4-pack just to fuck around with them.  Never bought a Tile, but with Apple's ecosystem AirTags will probably just work better all around.  Might be fun seeing if they work with checked baggage to locate a bag inside the terminal.  It's kinda creepy though that the "Find My" network seems to have no opt-out, so you're always going to be part of the relay system.

Just watched a video breakdown of the announcement.  From that, my understanding is that the absolute top-spec iPad has 2TB of storage, 16GB of memory, and a mini-LED display several times brighter than the brightest OLEDs (1600 nits peak), and it’s 3 grand.
 

As for the AirTags, they’re several times as thick as the thinner Tiles, and their cooler features require modern equipment to work at peak efficiency. The precision finding seems pretty eye-opening: not just direction, but distance, with 1-foot accuracy.  Just follow the little arrow.


10 minutes ago, Bombastinator said:

Just watched a video breakdown of the announcement.  From that, my understanding is that the absolute top-spec iPad has 2TB of storage, 16GB of memory, and a mini-LED display several times brighter than the brightest OLEDs (1600 nits peak), and it’s 3 grand.
 

As for the AirTags, they’re several times as thick as the thinner Tiles, and their cooler features require modern equipment to work at peak efficiency. The precision finding seems pretty eye-opening: not just direction, but distance, with 1-foot accuracy.  Just follow the little arrow.

I'll probably go with a more modest 512GB or whatever.  3 grand on a tablet would be a bit cray cray 🙂

 

I love the networking capability of AirTags that Tile would never be able to securely duplicate on their own.  I can't find the U1 chip's range with some quick googling, but the suggestion is that it can work over 200m, and Bluetooth can work over 400m.  In a public space there's almost guaranteed to be full coverage by multiple iPhones with a radius that large, so that could lead to some pretty impressive directional range... in theory.  It would be pretty sick to have pinpoint accuracy when you're miles away from an AirTag.


35 minutes ago, AnonymousGuy said:

It doesn't work outside local networks very well without port forwards and shit, and exposes gaping holes

RD Gateway is the proper way to do this: it uses a proper HTTPS/SSL tunnel and policy-based access rules to control which backend servers users are allowed to connect to. Port forwarding is not the proper way to do it, though in very small environments you in a way have to. However, you can reverse proxy behind a VIP instead (the next best option below a proper RD Gateway) so you aren't exposing the server to the internet at all.

 

35 minutes ago, AnonymousGuy said:

It's had gaping security issues for years

So does literally everything else where that's all you support, or where you haven't set up any restrictions beyond username/password. Port forward SSH to the internet without any hardening and anyone can spend all the time in the world brute-forcing their way in, either via username/password or via the security bugs that have existed there too and don't always actually get patched.

 

Not everyone knows about, or goes through the effort of setting up, SSH key-only login and/or lockouts, port knocking, etc.

 

35 minutes ago, AnonymousGuy said:

It doesn't support GPU acceleration, so it'll max out your CPU at higher resolutions

It does actually support GPU acceleration, and has for years. It's off by default, but it is supported.

 

Quote

Browse to: Local Computer Policy\Computer Configuration\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Remote Session Environment
Then enable "Use the hardware default graphics adapter for all Remote Desktop Services sessions"
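
If you'd rather script that than click through gpedit, here's a minimal Python sketch; it assumes this policy maps to the registry value bEnumerateHWBeforeSW (as it is commonly documented to), that it's run as Administrator on the session host, and that a gpupdate or sign-out happens before sessions pick it up:

import winreg

# Policy-backed key for Remote Desktop Services settings
KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 1 = "Use the hardware default graphics adapter for all RDS sessions"
    winreg.SetValueEx(key, "bEnumerateHWBeforeSW", 0, winreg.REG_DWORD, 1)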

 

There is even RemoteFX for even more advanced GPU acceleration and offloading.

 

Either way, GPU-accelerated RDS has been supported since Windows Server 2008 R2 (Windows 7).

 

35 minutes ago, AnonymousGuy said:

Its compression is very primitive and can't handle video streaming

You can adjust how that works; it's part of the more advanced RemoteFX feature set. It might also be part of regular RDS, not sure; I only looked at it when investigating RemoteFX (which btw you can use to play full 3D games, like actual proper ones).


16 minutes ago, leadeater said:

RD Gateway is the proper way to do this: it uses a proper HTTPS/SSL tunnel and policy-based access rules to control which backend servers users are allowed to connect to. Port forwarding is not the proper way to do it, though in very small environments you in a way have to. However, you can reverse proxy behind a VIP instead (the next best option below a proper RD Gateway) so you aren't exposing the server to the internet at all.

 

So does literally everything else where that's all you support, or where you haven't set up any restrictions beyond username/password. Port forward SSH to the internet without any hardening and anyone can spend all the time in the world brute-forcing their way in, either via username/password or via the security bugs that have existed there too and don't always actually get patched.

 

Not everyone knows about, or goes through the effort of setting up, SSH key-only login and/or lockouts, port knocking, etc.

 

It does actually support GPU acceleration, and has for years. It's off by default, but it is supported.

 

 

There is even RemoteFX for even more advanced GPU acceleration and offloading.

 

Either way, GPU-accelerated RDS has been supported since Windows Server 2008 R2 (Windows 7).

 

You can adjust how that works; it's part of the more advanced RemoteFX feature set. It might also be part of regular RDS, not sure; I only looked at it when investigating RemoteFX (which btw you can use to play full 3D games, like actual proper ones).

As alluded to at the bottom of that reply, I thought a bunch of these features were supported in Windows Server, but I'll be fucked if I could get them working with my home systems not running Windows Server.  And then you end up in MSDN documentation hell trying to figure out which group policies and registry keys actually work with your version of Windows, which ones are legacy and no longer functioning, etc.  And on the client side you're still using the same client from Windows 2000. Your product is fucked if some random third-party application does a better job making a client than you did. And I don't think RDP even supports multiple monitors in separate windows, which was huge for my use case where I'm remoted in with 3 monitors.

 

Meanwhile, with Splashtop or Teamviewer it's 5 minutes of setup and you get where you need to go.  From where I sit the only advantage of RDP is being able to hang multiple users off a single physical machine (which I use on a daily basis)... but again, this only works with Server versions of Windows unless you hack a few files on the consumer versions.


2 hours ago, leadeater said:

I don't know what you have been reading, but whatever it is, it's not anything official from Autodesk; below are the actual official requirements for AutoCAD 2022.

 

[Screenshot: Autodesk's official system requirements table for AutoCAD 2022]

https://knowledge.autodesk.com/support/autocad/troubleshooting/caas/sfdcarticles/sfdcarticles/System-requirements-for-AutoCAD-2022-including-Specialized-Toolsets.html

 

Aka the requirements to run AutoCAD are "be a computer"

 

Your requirements are the requirements of your workloads and your projects, not the requirements of AutoCAD nor the requirements of everyone else. AutoCAD projects come in all different sizes and complexities; not everyone is designing a Boeing 747, and not everyone is designing a simple PC case either.

 

The memory bandwidth alone precludes using an iGPU. I don't know about Xe, as I have not benchmarked it, but all the Intel 630s have a memory bandwidth between 14 and 20GB/s. The Quadro 1000/1200s are 80GB/s, not 106GB/s.  The GTX 1030 meets the minimum (48GB/s) but isn't even halfway to the recommended. That recommended spec is essentially a 4GB GTX 1050 Ti, which is the Quadro 1000-tier equivalent.

 

I'll save you the effort of scouring Nvidia's website and point you here:

https://www.studio1productions.com/Articles/NVidia-GPU-Chart.htm

 

iGPUs (which would include the M1) are kneecapped by the memory they use.

https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested

Quote

One aspect we’ve never really had the opportunity to test is exactly how good Apple’s cores are in terms of memory bandwidth. Inside of the M1, the results are ground-breaking: A single Firestorm achieves memory reads up to around 58GB/s, with memory writes coming in at 33-36GB/s. Most importantly, memory copies land in at 60 to 62GB/s depending if you’re using scalar or vector instructions. The fact that a single Firestorm core can almost saturate the memory controllers is astounding and something we’ve never seen in a design before.

 

Because one core is able to make use of almost the whole memory bandwidth, having multiple cores access things at the same time don’t actually increase the system bandwidth, but actually due to congestion lower the effective achieved aggregate bandwidth. Nevertheless, this 59GB/s peak bandwidth of one core is essentially also the speed at which memory copies happen, no matter the amount of active cores in the system, again, a great feat for Apple.

 

So the M1 is roughly half of a GTX 1050 Ti. It's kneecapped to 68GB/s max because it's LPDDR4X, not GDDR5 or GDDR6X.
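
For anyone who wants to sanity-check those numbers, theoretical peak bandwidth is just the effective transfer rate times the bus width. A quick sketch (the transfer rates and bus widths are approximate public spec-sheet figures, not measurements):

def peak_bandwidth_gbs(transfer_mts, bus_bits):
    # Theoretical peak: MT/s x bus width in bytes, expressed in GB/s
    return transfer_mts * (bus_bits / 8) / 1000

configs = {
    "M1 (LPDDR4X-4266, 128-bit)":          (4266, 128),  # ~68 GB/s
    "GTX 1050 Ti (GDDR5 7 Gbps, 128-bit)": (7008, 128),  # ~112 GB/s
    "Radeon Pro 560X (GDDR5, 128-bit)":    (7000, 128),  # ~112 GB/s
}
for name, (mts, bits) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(mts, bits):.0f} GB/s")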

 

That right there is why the M1 Mac does not replace a Mac with a dGPU. I really do not expect anyone to run AutoCAD on the M1 iMac, but suffice it to say the dGPUs it replaced (it is compared against the Radeon Pro 560X) had 112GB/sec GDDR5, which is, y'know, twice as capable, and meets the recommended spec for AutoCAD.

 

https://macperformanceguide.com/blog/2021/20210420_1530-Apple-M1-iMac.html

 

Quote

When it comes to real-world performance that I and many other photographers need, it’s dead on arrival because of the 16GB memory limit. Photoshop as I use it hits 32GB memory usage (for Photoshop alone) with even my lightest usage, and ramps up to 100GB and beyond at times. However, some photographers doing light-duty work may find 16GB adequate so long as attention is paid to not too many applications and/or images all at once.

 

Just having 32GB would have expanded the envelope firmly into low-end professional use. It looks like Apple is still working on the M1 chip that can support more memory than laptops had 7 years ago (eons in the computer industry).

 

I'm not alone in pointing out that 8GB is insufficient, and that 16GB keeps it squarely out of the home professional/prosumer market. Unless Apple comes out with a Mac Pro, or keeps using Intel Xeons for the Mac Pro, it's going to send everyone who needs an actual computer for work to Dell.

 

And quite frankly, unless the "M1X" is using 64GB of GDDR6X or HBM, there is no way an ARM SoC can even be counted in the same class as an Intel/AMD CPU + dGPU. Sure, the M1 makes all Intel iGPUs look sad because it outperforms them on DDR4, and that is where the M1's strength lies.

 

But it will be embarrassing for Apple if they take this approach for the Mac Pro. Something tells me they aren't going to stick four SoCs back to back to make up the difference with the 560X (4 x 16GB for 64GB of RAM, 32 CPU cores, 32 GPU cores), because the Pro Vega II GPUs in the Mac Pros are 1TB/sec. That's 32 SoCs to get there.

 

But again, who knows; maybe the Mac Pro M1X is just going to be a blade server with M1X blades.


33 minutes ago, Kisai said:

And quite frankly, unless the "M1X" is using 64GB of GDDR6X or HBM, there is no way an ARM SoC can even be counted in the same class as an Intel/AMD CPU + dGPU. Sure, the M1 makes all Intel iGPUs look sad because it outperforms them on DDR4, and that is where the M1's strength lies.

 

I expect we will see HBM2e stacks on the pro SoC (which will also be a tile-based multi-chip solution; Apple have a load of patents in this space, as does TSMC).

 

34 minutes ago, Kisai said:

I'm not alone in pointing out that 8GB is insufficient, and that 16GB keeps it squarely out of the home professional/prosumer market.

This iMac is just a replacement for the 21" iMac (which for a while now has only supported 16GB).  Apple are still selling the 27" iMac, so the message is clear that this is very much not a `professional/prosumer` device. I expect we will either get a May event, or they will wait until WWDC for the next round, which will be the next SoC with different IO (memory and PCIe bandwidth).

 

37 minutes ago, Kisai said:

It's kneecapped to 68GB/s max because it's LPDDR4X, not GDDR5 or GDDR6X

Depending on your use case, it is worth noting that the TBDR pipeline used in the M1 massively reduces memory bandwidth needs: rather than rendering out each object to a buffer and then pulling all those buffers back into the GPU to blend them, the screen is split into tiles and each tile is rendered on the GPU. The final output buffer stays on the GPU core (in a small, fast on-die memory); each object is rendered, then depth tested and blended into the final output. This massively reduces memory bandwidth needs for rendering 3D scenes, and it also enables a lot more culling, since the accumulated depth buffer is used before rendering the next object to cull fragments that will fail the depth test before the fragment function is called. For pure compute tasks this is not so relevant (sure, having that on-core memory helps, but you need to code for it very well and that is hard to do in many cases). But in general I agree we will see HBM2e memory on the next higher-end iteration.
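
To make the bandwidth saving concrete, here is a rough back-of-the-envelope simulation in Python. It is illustrative only, not real GPU behaviour: it counts external-memory pixel accesses for a worst case of 10 full-screen overlapping objects, and treats TBDR's on-die tile work as free (which is the whole point of the design):

WIDTH, HEIGHT, TILE = 1920, 1080, 32
N_OBJECTS = 10  # worst case: every object covers the whole screen

# Immediate mode: each object's fragments read depth and write colour in DRAM
immediate_traffic = N_OBJECTS * WIDTH * HEIGHT * 2

# TBDR: depth test + blend happen in on-die tile memory; each finished tile
# is written out to DRAM exactly once (ceiling division covers edge tiles)
tiles_x = -(-WIDTH // TILE)
tiles_y = -(-HEIGHT // TILE)
tbdr_traffic = tiles_x * tiles_y * TILE * TILE

print(f"immediate-mode DRAM pixel accesses: {immediate_traffic:,}")
print(f"TBDR DRAM pixel accesses:           {tbdr_traffic:,}")
print(f"reduction: ~{immediate_traffic / tbdr_traffic:.0f}x")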


1 hour ago, AnonymousGuy said:

Alluded to at the bottom of that reply, I thought a bunch of these features are supported in Windows Server but I'll be fucked if I could get it working with my home systems not running Windows Server.

I'm pretty sure that first method using local GPO is supported on Windows 10, not just Windows Server, but that particular one is new and Windows 10 is where it was introduced. RemoteFX was the only option predating Windows 10, and yeah, the session host has to be Server.


4 minutes ago, hishnash said:

I expect we will see HBM2e stacks on the pro SoC (which will also be a tile-based multi-chip solution; Apple have a load of patents in this space, as does TSMC).

 

This iMac is just a replacement for the 21" iMac (which for a while now has only supported 16GB).  Apple are still selling the 27" iMac, so the message is clear that this is very much not a `professional/prosumer` device. I expect we will either get a May event, or they will wait until WWDC for the next round, which will be the next SoC with different IO (memory and PCIe bandwidth).

 

Depending on your use case, it is worth noting that the TBDR pipeline used in the M1 massively reduces memory bandwidth needs: rather than rendering out each object to a buffer and then pulling all those buffers back into the GPU to blend them, the screen is split into tiles and each tile is rendered on the GPU. The final output buffer stays on the GPU core (in a small, fast on-die memory); each object is rendered, then depth tested and blended into the final output. This massively reduces memory bandwidth needs for rendering 3D scenes, and it also enables a lot more culling, since the accumulated depth buffer is used before rendering the next object to cull fragments that will fail the depth test before the fragment function is called. For pure compute tasks this is not so relevant (sure, having that on-core memory helps, but you need to code for it very well and that is hard to do in many cases). But in general I agree we will see HBM2e memory on the next higher-end iteration.

If all they do is run a hotter integrated graphics chip (can’t even call it an iGPU or an APU; it’s its own thing) and don’t use discrete graphics, I won’t be buying one. I’ve been hoping they would run some PCIe 3 or 4 and clip on an AMD desktop card.


6 minutes ago, hishnash said:

Depending on your use case, it is worth noting that the TBDR pipeline used in the M1 massively reduces memory bandwidth needs: rather than rendering out each object to a buffer and then pulling all those buffers back into the GPU to blend them, the screen is split into tiles and each tile is rendered on the GPU. The final output buffer stays on the GPU core (in a small, fast on-die memory); each object is rendered, then depth tested and blended into the final output.

Yep, M1 GPU vs dGPU is rather chalk and cheese: completely different memory models and memory latencies (which are important for effective bandwidth, not just raw spec-sheet numbers). The M1 GPU is already handily outperforming many mid-range GPUs in real applications, so spec-sheet battling is a bit pointless, since it very often doesn't match the real world.

 

We'll just have to wait until there is an M1-native AutoCAD version.


3 minutes ago, Bombastinator said:

If all they do is run a hotter integrated graphics chip (can’t even call it an iGPU or an APU; it’s its own thing) and don’t use discrete graphics, I won’t be buying one. I’ve been hoping they would run some PCIe 3 or 4 and clip on an AMD desktop card.

In the Mac Pro they will for sure have optional add-in compute cards (the SoC package will still contain 1 or more GPU dies of its own), but not from AMD, since AMD's GPU arch does not support the full Metal feature set and Apple will want a uniform story for developers. The last thing you want, when you're trying to get developers on board to optimise for your chips, is for the top-end one to be completely different.

 

Not sure why you would want an AMD GPU (apart from being able to buy a regular consumer card and not being forced to buy a pro-labeled card that of course costs an ARM and a leg). Maybe AMD will put the effort in to port their drivers to ARM and you will be able to use an AMD GPU (but do not expect future applications to optimise for that reduced Metal feature set; some might even just plain refuse to use it due to not having the features they expected). (Note: for compute tasks NV could also write user-space CUDA drivers if they wanted to; these do not need any blessing from Apple, so it does not matter how bad the relationship is, but I don't expect NV to see much of a market given Apple will be outperforming them.)

With Apple's perf/W in the GPU space they could do RTX 3090-level performance (in fp32) within the Mac Pro SoC (if the SoC is as large as the Xeon package that the Mac Pro uses), since that would still only draw about 150W (that socket is built to cool up to 500W at the moment!).
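
As a rough sanity check of that claim (my own extrapolation, using approximate public figures: the M1's 8-core GPU is around 2.6 TFLOPS fp32 at very roughly 10W, and an RTX 3090 is around 35.6 TFLOPS at about 350W):

# Pure linear perf/W extrapolation; real scaling would be worse than linear
m1_tflops, m1_gpu_watts = 2.6, 10.0        # approx M1 8-core GPU figures
rtx3090_tflops, rtx3090_watts = 35.6, 350.0
budget_watts = 150.0                       # the power budget suggested above

scaled = m1_tflops * (budget_watts / m1_gpu_watts)
print(f"M1 perf/W scaled to {budget_watts:.0f} W: ~{scaled:.0f} TFLOPS fp32")
print(f"RTX 3090: ~{rtx3090_tflops} TFLOPS fp32 at ~{rtx3090_watts:.0f} W")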


12 minutes ago, hishnash said:

In the Mac Pro they will for sure have optional add-in compute cards (the SoC package will still contain 1 or more GPU dies of its own), but not from AMD, since AMD's GPU arch does not support the full Metal feature set and Apple will want a uniform story for developers. The last thing you want, when you're trying to get developers on board to optimise for your chips, is for the top-end one to be completely different.

 

Not sure why you would want an AMD GPU (apart from being able to buy a regular consumer card and not being forced to buy a pro-labeled card that of course costs an ARM and a leg). Maybe AMD will put the effort in to port their drivers to ARM and you will be able to use an AMD GPU (but do not expect future applications to optimise for that reduced Metal feature set; some might even just plain refuse to use it due to not having the features they expected). (Note: for compute tasks NV could also write user-space CUDA drivers if they wanted to; these do not need any blessing from Apple, so it does not matter how bad the relationship is, but I don't expect NV to see much of a market given Apple will be outperforming them.)

With Apple's perf/W in the GPU space they could do RTX 3090-level performance (in fp32) within the Mac Pro SoC (if the SoC is as large as the Xeon package that the Mac Pro uses), since that would still only draw about 150W (that socket is built to cool up to 500W at the moment!).

Well, it’s not going to be an Nvidia or Intel GPU; AMD is all that is left. I have this sort of wishful-thinking concept involving using a new Parallels to run arbitrary graphically intensive PC software (i.e. games), but it is something I am having less and less hope for 😕  Apple hasn’t made a desktop machine I would consider purchasable by me for a very long time now. Why would they start now? I keep hoping; I keep being disappointed.


