Kuo on Apple’s VR+AR HMD: late 2022 release, as powerful as an M1 Mac, untethered from iPhone/Mac

saltycaramel

Sounds like you’re asking Apple to stifle innovation by adhering to some standard that

- wasn’t even around when Apple planned all of this (like USB-C was not around when Apple planned and introduced Lightning)

- isn’t even that established at this point

 

That’s not how the most innovative consumer electronics company operates. Then again, we know nothing yet about their plans so maybe they will support OpenXR as well.

4 hours ago, sap said:

And that's where extensions come in. If there's a missing feature in OpenXR or Vulkan preventing a hardware vendor from making full use of their hardware, they make an extension. This is what NVIDIA, AMD, Qualcomm, Intel, ARM, Google, Huawei, Broadcom, VeriSilicon, Samsung, Facebook/Oculus, Microsoft, Varjo, Valve, Dell, Sony, HP, HTC and others are currently doing, and I don't see the need for Apple to suddenly do their own thing.

Like making C++ the shading language for VK? Metal gets a lot out of being C++14 based and provides much better compute options than VK because of this. Once you put in that many extensions (and require developers to use them) you might as well build your own API that more closely matches the hardware and thus has fewer abstractions (both AMD and Nvidia do not consider VK a good option for compute, and any low-latency display solution needs to do some, if not most, of its compute on the GPU side).
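To illustrate what the C++14 basis buys you, here is a minimal sketch of a Metal Shading Language compute kernel using a templated helper; the kernel and buffer names are purely illustrative, not from any real project:

#include <metal_stdlib>
using namespace metal;

// C++14 features such as function templates are usable directly in MSL shader code.
template <typename T>
inline T lerp_clamped(T a, T b, float t) {
    return mix(a, b, T(clamp(t, 0.0f, 1.0f)));
}

// Hypothetical reprojection-style kernel blending two colour buffers on the GPU.
kernel void reproject(device const float4* colorIn  [[buffer(0)]],
                      device float4*       colorOut [[buffer(1)]],
                      constant float&      blend    [[buffer(2)]],
                      uint id [[thread_position_in_grid]])
{
    colorOut[id] = lerp_clamped(colorIn[id], colorOut[id], blend);
}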

Apple did not suddenly do their own thing; Metal was publicly released a year before VK was proposed (and many years before the first stable VK release)...

2 hours ago, sap said:

Oculus, Valve, Microsoft and others now all fully support OpenXR and you can run OpenXR applications on their headsets. There's no excuse for Apple to be special and break compatibility today, especially considering they haven't even released any headsets yet and they can still make big changes.

Why? Apple has pre-existing APIs, and none of the OpenXR APIs even support the languages Apple provides any other system APIs for. If Apple pushes people to build apps with OpenXR APIs (using existing code bases), they are also pushing these apps to not integrate with the OS (so no iCloud support, no Keychain support, no system UI frameworks support, no sleep/wake or background tasks... etc.). Sure, devs can go ahead and build their own painful bridges that will be full of bugs.

Yes, Apple could implement OpenXR APIs in Obj-C/Swift, but that would not help any existing apps come to the platform. Much better to double down on their existing AR/VR APIs (which developers have been learning). Both of the major game engines, Unity and Unreal, have their own abstractions that already use Apple's AR/VR APIs. Very, very few AR/VR games use their own custom game engine; almost all of them use Unity or Unreal and thus already have support for Apple's APIs.

18 minutes ago, hishnash said:

Yes, Apple could implement OpenXR APIs in Obj-C/Swift, but that would not help any existing apps come to the platform. Much better to double down on their existing AR/VR APIs (which developers have been learning).

This really doesn't matter. If Apple's development toolkits and documentation for AR/VR and whatever hardware comes are very good, meaning devs can easily and/or quickly deliver high-quality software, then what does it really matter? Yeah, they aren't using an open standard that everyone "should" use, but that's always easier said than done. CUDA vs. OpenCL is a great example.

2 hours ago, saltycaramel said:

Sounds like you’re asking Apple to stifle innovation by adhering to some standard that

- wasn’t even around when Apple planned all of this (like USB-C was not around when Apple planned and introduced Lightning)

- isn’t even that established at this point

 

That’s not how the most innovative consumer electronics company operates. Then again, we know nothing yet about their plans so maybe they will support OpenXR as well.

How does it stifle innovation? How isn't it established? It would be more useful if we were talking about actual individual technical limitations, but so far none have been named, and neither have any VR headsets been named that don't follow the OpenXR standard to show it's not established.

 

OpenXR is BUILT for innovation. They have an easy extension approval process and if you look at their struct layout, basically every struct has a type enum and a next pointer for easy extensibility.
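For reference, here is roughly what that layout looks like in practice; a minimal C++ sketch where XrSessionCreateInfo and XR_TYPE_SESSION_CREATE_INFO are real OpenXR names, while XrRenderScaleHintHYPO is an invented stand-in for what a vendor extension struct could look like:

#include <openxr/openxr.h>

// Invented example of a vendor extension struct; real extension structs follow the same shape.
struct XrRenderScaleHintHYPO {
    XrStructureType type;   // every struct starts with a type enum...
    const void*     next;   // ...and a next pointer, so chains can keep growing
    float           renderScale;
};

void create_session_sketch(XrInstance instance, XrSystemId systemId) {
    XrRenderScaleHintHYPO hint{};
    hint.type = static_cast<XrStructureType>(1000999000);  // placeholder enum value
    hint.renderScale = 1.5f;

    XrSessionCreateInfo info{};
    info.type     = XR_TYPE_SESSION_CREATE_INFO;
    info.next     = &hint;   // chained extension data; runtimes skip structs they don't recognize
    info.systemId = systemId;
    // ...fill in the graphics binding, then call xrCreateSession(instance, &info, &session)
}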

 

 

1 hour ago, hishnash said:

Metal gets a lot out of being C++14 based [...] Once you put in that many extensions you might as well build your own API

No, you might not as well build your own API because this is the literal definition of fragmentation.

 

The whole reason for the existence of OpenXR and Vulkan is to have a universal API that all applications can understand, yet at the same time allow developers to squeeze as much as possible out of particular hardware through the use of specific extensions when they're available.

 

This means that I can take my existing engine, add a check for Apple-specific stuff in the available extensions list, and have special code paths to do things much faster/better on that hardware. On other hardware, I can keep the other codepath as a fallback. Or I can also choose not to change anything and have my app work fine on the Apple headset with no changes. You might say this is bad and will lead to badly performing games/apps, but Apple already has app store guidelines and they can mandate the use of particular extensions there.
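As a rough sketch of that check (xrEnumerateInstanceExtensionProperties and XrExtensionProperties are the real OpenXR API; "XR_APPL_fast_path" is a made-up name, since no Apple extension actually exists):

#include <openxr/openxr.h>
#include <cstring>
#include <vector>

// Returns true if the runtime advertises the named instance extension.
bool runtime_supports(const char* wanted) {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);

    std::vector<XrExtensionProperties> props(count, {XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());

    for (const auto& p : props)
        if (std::strcmp(p.extensionName, wanted) == 0)
            return true;
    return false;
}

// Engine start-up: take the special code path only when the (hypothetical) extension is present.
// bool useAppleFastPath = runtime_supports("XR_APPL_fast_path");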

 

If you build your own API instead, this means that I suddenly have to now pointlessly write an entire new back-end for that particular API and maintain support for two separate APIs. Why????

 

C++14 support may be good, but not everyone wants to use C++14. Apple could build their own C++14 library that wraps around OpenXR, automatically handles the Apple extensions and gives developers an easy-to-digest C++14 API, and people can choose whether or not they want to use the easy Apple C++14 library or the hardcore OpenXR C API.
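As a toy sketch of what such a wrapper might look like (the applexr namespace and Instance class are entirely hypothetical; only xrCreateInstance/xrDestroyInstance are real OpenXR calls):

#include <openxr/openxr.h>
#include <stdexcept>

namespace applexr {  // hypothetical wrapper namespace
class Instance {
public:
    explicit Instance(const XrInstanceCreateInfo& info) {
        if (XR_FAILED(xrCreateInstance(&info, &handle_)))
            throw std::runtime_error("xrCreateInstance failed");
        // a real wrapper could enable and manage vendor extensions automatically here
    }
    ~Instance() { if (handle_ != XR_NULL_HANDLE) xrDestroyInstance(handle_); }
    Instance(const Instance&) = delete;
    Instance& operator=(const Instance&) = delete;
    XrInstance get() const { return handle_; }
private:
    XrInstance handle_ = XR_NULL_HANDLE;
};
}  // namespace applexr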

 

1 hour ago, hishnash said:

they are also pushing these apps to not integrate with the OS (so no iCloud support, no Keychain support, no system UI frameworks support, no sleep/wake or background tasks... etc.)

These things have no place in an XR API; they should be provided by the OS through its regular application APIs. If there's some XR-specific behavior needed then, again, they could make an extension for that.

13 minutes ago, sap said:

How does it stifle innovation? How isn't it established? It would be more useful if we were talking about actual individual technical limitations, but so far none have been named, and neither have any VR headsets been named that don't follow the OpenXR standard to show it's not established.

Fact is, VR headsets aren’t even a thing right now in the general population. It’s the whole category that’s not well established.

You could be advocating for Symbian or Blackberry right now.

We don’t know. 

What we know is Apple has been carefully putting the software pieces in place for a decade now (well before OpenXR), for what they think will work for them. 

11 hours ago, Brooksie359 said:

Yeah I will have to give it to Apple that although they are super restrictive in how things work they really do provide a great experience when using multiple devices from their ecosystem. It's great marketing as it sorta forces you to use more of their products if you want that seamless experience but at the same time they do deliver on that experience so it's hard to get too mad about it.

Because Android whatever makers are so good at doing the same so openly? Oh wait, it's neither open nor good most of the time. It's a miracle if two browsers sync their bookmarks properly on two different Androids...

2 hours ago, sap said:

How does it stifle innovation? How isn't it established? It would be more useful if we were talking about actual individual technical limitations, but so far none have been named, and neither have any VR headsets been named that don't follow the OpenXR standard to show it's not established.

 

OpenXR is BUILT for innovation. They have an easy extension approval process and if you look at their struct layout, basically every struct has a type enum and a next pointer for easy extensibility.

I wouldn't say that OpenXR necessarily stifles innovation, but remember that Vulkan and Metal exist precisely because many developers and OEMs aren't thrilled with OpenGL. Extendability might not matter if the underlying platform has issues.

 

As salty implied, it's not clear if Meta is really going to dominate the VR market and foster OpenXR, or if we're repeating history and Meta is just Nokia before Apple swoops in with the iPhone. That is, it's only the leader because it's the most popular of a so-so bunch. It depends on Apple's hardware and whether it can provide a meaningfully better UI and app selection.

7 hours ago, RejZoR said:

Because Android whatever makers are so good at doing the same so openly? Oh wait, it's neither open nor good most of the time. It's a miracle if two browsers sync their bookmarks properly on two different Androids...

I'll give you an example right now where "Apple is the only choice"

 

The VTuber market: only the iPhone is usable as a face tracker. That uses Apple's ARKit and its 52 blendshapes. If you use an Android phone, you get basically less functionality than a webcam, because there is no standard API or hardware on Android devices. On Windows/Macs the webcam support is at best limited and at worst non-existent. It's a total crapshoot whether any specific webcam works, and once you use it for face tracking, you can't use that camera for full-body tracking either.

 

In the VTuber space, 2D models basically require an iPhone, or you're going to be stuck with the model's mouth being open or not moving most of the time, because webcam support is flaky and incredibly dependent on your hardware.

 

In the 3D space, you have VTubers with $20,000 bodysuit + HMD setups, mid-tier setups with an iPhone and Vive trackers, and low-end setups with just a webcam and maybe a Leap Motion controller.

 

Apple can leverage their existing ARKit functionality with their own HMD, and there is already a market using ARKit. There is no market for Android ARCore beyond selfie-apps.

 

The free version of this week’s “Power On” newsletter is out, and Mark Gurman touches on the VR/AR headset timeline, among other things.

 

Quote

Hey everyone, it’s Mark. Apple’s next major product categories—a virtual and augmented reality headset and a self-driving car—will likely mark a change in how the company rolls out new devices.

 

This is the free version of Power On. If you like it, consider subscribing to Bloomberg.com — you'll receive this newsletter several hours earlier and get exclusive access to a Q&A section with me.

Quote

In Apple Inc.’s ideal scenario, it surprises the world with a product announcement and then releases it in stores just days later. With the company’s next round of major product categories, that likely won’t be the case.

Most iPhones over the last decade were introduced in either June or September before hitting shelves just a week later. We’ve seen similar timelines for many iPads, AirPods and Macs also released under Tim Cook, and this was a strategy also used heavily under Steve Jobs.

When Apple releases the first version of a major, new product, however, consumers typically need to wait. Here are three recent examples:

  • Original iPhone: Introduced Jan. 9, 2007; released June 29, 2007 (171 days)
  • Original iPad: Introduced Jan. 27, 2010; released April 3, 2010 (66 days)
  • Original Apple Watch: Introduced Sept. 9, 2014; released April 24, 2015 (227 days)

Of course, Apple had good reason for such lengthy release delays.

Jobs had said he announced the iPhone so early to preempt the U.S. Federal Communications Commission from leaking it during regulatory approvals. The reality is that the device’s hardware and software were simply not ready yet for release, and the company needed to field test the smartphone on cellular networks.

For the iPad, Apple needed the extra two months to finish up the device’s operating system, gather e-books for the launch of iBooks and push developers to optimize apps for the tablet’s larger screen.

Around the time of the Apple Watch debut, Cook—three years into his tenure as CEO—was under pressure from investors and customers to deliver a new product category. It would have been hard to hold them off another 200 days. Plus, it made for a nice combination with the larger iPhone 6 line and Apple Pay.

Fast forward to the 2020s, and Apple has at least three major new product categories in the pipeline, all of which are likely to cap a two-decade run for Cook: a mixed virtual and augmented reality headset, augmented reality glasses and a self-driving car.

These new products will challenge Apple’s typical launch schedule—and they will likely even stretch the delays we’ve seen with Apple’s previous new categories.

I expect the gap between the introduction of Apple’s first headset—scheduled for as early as next year—and its release to be sizable and perhaps rival that of the original Apple Watch.

Apple’s first headset will have a complex, expensive-to-build design, complete with interchangeable lenses. The company will likely need to work with governments globally on possible prescription lenses and partner with a bevy of manufacturers on complex technologies that neither side has shipped before.

That will take time, and of course, Apple will want to have such a breakthrough new category in public view before exposing it to leak risks when it gets into the hands of more Apple employees and partners who will need to contribute to it before release.

More important will be the months of necessary publicity to get people interested in a new (and pricey) product and to rally enough support among software developers to make it worthwhile. I could see Apple announcing the headset at its 2022 Worldwide Developers Conference and focusing that event on AR and VR app development. Then it could ship the product late next year or in 2023.

While the introduction of a headset will be complex, it will be nothing compared to a car. While Apple is targeting 2025 for a launch, actually getting the vehicle on city streets could take even longer.

Apple, obsessively secretive, likely would not want to take the necessary steps of publicly testing its final car design on city streets before actually introducing it. That testing process will take years. Apple will also need to work with regulators globally and manufacturers, as well as repair centers and fleet management companies.

Electric car makers will probably serve as a better analogue than Apple’s own traditions.

Tesla Inc. showed off the Model S in 2009 before ultimately shipping it in 2012. The Model 3 delay was much shorter, having been announced in 2016 before the first cars rolled off the assembly line in mid-2017. The Model Y was announced in March 2019 and shipped almost exactly one year later.

Looking beyond Elon Musk’s company, the first Lucid Group Inc. prototype was shown at the end of 2016, and the first models only started shipping this month. Rivian Automotive Inc.’s pickup truck was introduced in 2018, and the first deliveries started in September. In other words, don’t expect to ride in an Apple car for a long time.

 

9 hours ago, sap said:

These things have no place in an XR API; they should be provided by the OS through its regular application APIs. If there's some XR-specific behavior needed then, again, they could make an extension for that.

But your XR API, you think, should be C/C++ only? That by definition means a developer who wants to use these needs to build a bridge to Obj-C/Swift so that they can integrate into the OS. That's fine for a large game engine (like Unity or Unreal, but they can also use any AR API they are provided). Apple is not going to provide system APIs in C/C++ (thank goodness).

 

 

9 hours ago, sap said:

C++14 support may be good, but not everyone wants to use C++14. Apple could build their own C++14 library that wraps around OpenXR, automatically handles the Apple extensions and gives developers an easy-to-digest C++14 API, and people can choose whether or not they want to use the easy Apple C++14 library or the hardcore OpenXR C API.

If Apple builds their own OpenXR APIs for a different language, why not build their own APIs that better match the hardware? There is no point implementing OpenXR in name only; if they implement it, you want it implemented so that existing code bases can be used without any changes, right? I'm talking about GPU shaders being written in C++ (OpenXR is a CPU-only API spec, not something that can just be supported in a GPU shader, and thus rather limited in its latency response).

 

 

9 hours ago, sap said:

This means that I can take my existing engine, add a check for Apple-specific stuff in the available extensions list, and have special code paths to do things much faster/better on that hardware. On other hardware, I can keep the other codepath as a fallback. Or I can also choose not to change anything and have my app work fine on the Apple headset with no changes. You might say this is bad and will lead to badly performing games/apps, but Apple already has app store guidelines and they can mandate the use of particular extensions there.

 

No, you will need to write a LOT of bridging code to reach into the system APIs (all in Obj-C/Swift)! Building good (bug-free) bridging APIs from a manually memory-managed language (like C/C++) to an ARC language (like Swift/Obj-C) requires a very high level of skill to not end up with memory bugs.

 

11 hours ago, RejZoR said:

Because Android whatever makers are so good at doing the same so openly? Oh wait, it's neither open nor good most of the time. It's a miracle if two browsers sync their bookmarks properly on two different Androids...

Generally speaking, Android is open and you can use a lot of third-party things, while it's less so with Apple. Granted, again there is the trade-off, and you can clearly see this with how well Apple devices work with each other compared to, say, Android and other devices. I'm not saying either way is bad, because both have their positive and negative aspects; depending on who you are and what you want, you might prefer one over the other.
