
Apple enters the VR game with $3k headset featuring 8K per eye

12 minutes ago, wamred said:

What in the world? This does seem overpriced tho so it sounds like apple.

Overpriced for a gaming toy. It's about what really expensive teleconference speakerphones went for when they first came out.



here's me thinking 4K per eye would be a good enough. 

I tried the first-gen Vive around when it was released and it was /ok/, it just needed a bit more res (the low resolution wasn't too bad in games designed for VR, but it was super noticeable and annoying in games like Project Cars).

8K per eye will be impressive if they can pull it off. There won't be big titles that can run at that resolution for a while though, if I had to guess.

Apple will probably do something proprietary to drive it. I'm thinking something like a Mac Pro with two GPUs, one for each eye.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


A high resolution VR headset with spatial audio would be impressive, and expensive. However, VR has never really taken off and I doubt it ever will. The reason? VR is kind of like the Xbox Kinect, only more expensive. And just like the Kinect, it's really more of a novelty than anything else. VR takes a lot of effort, and most people probably just want to sit down and relax without the hassle of a contraption on their face.


On 2/5/2021 at 1:00 PM, rcmaehl said:

"a crown like knob on the side of the headset"

The digital crown is a staple input method of the Apple Watch and is now also on the AirPods Max


14 minutes ago, descendency said:

How is someone who owns a mac going to play games at 2x8K?

Hooking up to an existing Mac is not likely how this thing would be run since a solid 90% of the Mac lineup doesn't even have a dGPU. A new Mac or some other solution would be powering this. 


1 hour ago, seee the state im in nooow said:

This is mostly for my own understanding of how difficult what's being suggested would be, and is mostly guesswork:

 

Two 8K (square?) streams aren't trivial; I highly suspect Apple will be expecting devs to write simpler shaders to run at that resolution at 120 fps, or at least 90 fps.

 

They also have to figure out how to compress (8K x 8K x 2 eyes x 8-bit colour x 3 channels) x 90 fps, which is about 276.48 gigabits of raw pixel data per second (worked through in the sketch after this quote), over a capable connection without running into frequent compression artifacts, and DisplayPort UHBR 20 only carries about 77.37 gigabits/s of payload (and would frankly require an optical connection at this rate).

 

All of which should be fine for the personnel-training industry, but I'm not quite sure something like Half-Life: Alyx could run at that resolution at comfortable framerates.

 

One way they could work around all of this would be to house the GPU in the headset itself and send texture, polygon and shader data over the cable once, or whenever needed (plus a stream of transform data), but I don't have an idea of the data rate and I'm not sure that's how it works. After that it's just a matter of finding display driver hardware that can push that many pixels without missing a frame. The downside is that it wouldn't use a standard video connection, but I highly doubt a proprietary solution would hold Apple back.
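
To sanity-check the quoted bandwidth figure, here's a quick back-of-the-envelope in Python. It assumes "8K x 8K" means a square 8000 x 8000 panel per eye, which is what the round 276.48 number implies:

```python
# Raw (uncompressed) pixel bandwidth for the rumoured headset.
# Assumption: "8K x 8K" = a square 8000 x 8000 panel per eye.
pixels_per_eye = 8000 * 8000
eyes = 2
bits_per_pixel = 3 * 8        # 3 colour channels x 8-bit
fps = 90

raw_gbps = pixels_per_eye * eyes * bits_per_pixel * fps / 1e9
print(f"raw stream: {raw_gbps:.2f} Gbit/s")            # 276.48 Gbit/s

# DisplayPort UHBR 20 carries ~77.37 Gbit/s of payload, so an
# uncompressed feed would need roughly 3.6:1 compression (or foveation).
print(f"compression needed: {raw_gbps / 77.37:.1f}:1")
```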

The description seems somewhat unlikely to me. I suspect large sections of it are not as they seem. Two 8K displays imply massive graphics power, to the point that I strongly suspect shenanigans. I'm gonna wait till there's actually a product to look at.



Pretty much, if this is a real thing, it's the same kind of stuff as Varjo. Totally not for consumers, not even for prosumers, not even for developers, but more or less just to show what can be done, plus for the few companies that can pour a ton of money into something with very little actual usability. Even the pricing is in the same ballpark (the Varjo VR-3 is a bit over 3000€, the Varjo XR-3 around 5500€, and both need yearly subscriptions of ~800€/year for the VR-3 and ~1500€/year for the XR-3), which is a pretty clear sign that the product really isn't meant for anybody (except some very, very rare cases where it could be somewhat justified); it's more of a "we have this product" move while they fish for investors.

 

Also, just a footnote: we don't know what size this thing is. 4K per eye is already something you can just go and buy as an actual product, but the Pimax 8KX is a damn huge thing, the screens aren't small (it also needs either an RTX 2060 or better for the ASIC-upscaled image, or an RTX 2080 or better for the native image, and those are hard minimums). What you really want to know is the PPD (pixels per degree), which is basically how many pixels end up in each degree of your vision after the optics (for example: the Pimax 8KX is around 20 PPD at full FOV, the Pimax 5K+ ~12-22 PPD depending on FOV, the Rift and Vive are around 10 PPD, the Quest 2 ~24 PPD, and the Varjo VR-3 is 30 PPD for peripheral vision and 70 PPD for the focus point (a small "floating" Full HD screen on top of the peripheral screen, moved according to eye tracking)). Think of it as PPI but for VR headsets; it basically tells you how sharp your image really is.
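
If you want to eyeball PPD yourself, it's roughly the horizontal panel resolution divided by the horizontal FOV it gets spread across. A minimal sketch; real optics are non-uniform, so treat this as a ballpark only:

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average PPD: panel pixels divided by the field of view they cover.
    Lens distortion makes real density vary across the view, so this is rough."""
    return horizontal_pixels / horizontal_fov_deg

# First-gen Vive / Rift class: ~1080 px spread over ~110 degrees
print(pixels_per_degree(1080, 110))   # ~10 PPD

# Hypothetical square "8K" (8000 px) panel over a 100-degree FOV
print(pixels_per_degree(8000, 100))   # 80 PPD
```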


On 2/6/2021 at 10:07 AM, TetraSky said:

Holy sweet Jesus of a resolution.
 

Though they do mention it will use hardware trickery with eye trackers to reduce the resolution where you're not looking. Wonder how that will look.

 

Apple already has a form of variable rate shading support in its GPUs (Apple calls this rasterization rate), and they will likely use it in conjunction with the eye tracking, so that the total number of pixels rendered at any given time per eye is roughly that of a 1080p image, but those pixels are not uniformly distributed: the majority are very small at the location the eye is looking, with massive pixels at the edge of your field of vision.

This is not new tech; some racing games use it to render the background (which is later blurred out anyway) at a much lower fill rate than the centre of the screen. Applying it to VR is not even new, Nvidia has talked about it: https://developer.nvidia.com/vrworks/graphics/variablerateshading. The main issue in the past has been the latency between the eye tracking and the rendering pipeline. Apple is a master of low-latency input, with the iPad Pro having sub-9 ms input-to-screen-update latency, so if anyone can solve that problem they can.
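
A back-of-the-envelope illustration of why foveation cuts the work so dramatically. This is a hypothetical two-zone model (full density in a small foveal window, much coarser everywhere else), not Apple's or Nvidia's actual rate map:

```python
def foveated_shaded_pixels(panel_px: int, fov_deg: float,
                           fovea_deg: float, coarse_factor: int) -> float:
    """Shaded-pixel estimate for one square eye panel: full density inside a
    square foveal window, 1/coarse_factor density per axis outside it.
    Purely illustrative two-zone model."""
    fovea_px = panel_px * fovea_deg / fov_deg      # side length of the full-res window
    full = fovea_px ** 2
    coarse = (panel_px ** 2 - full) / coarse_factor ** 2
    return full + coarse

# Hypothetical 8000x8000 panel over 100 degrees, 10-degree foveal window,
# periphery shaded at 1/8 rate per axis:
shaded = foveated_shaded_pixels(8000, 100, 10, 8)
print(f"{shaded / 1e6:.1f} M shaded vs {8000 * 8000 / 1e6:.0f} M native pixels")
# -> ~1.6 M vs 64 M, i.e. in the ballpark of a single 1080p frame (~2 M pixels)
```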


On 2/7/2021 at 7:09 PM, descendency said:

How is someone who owns a mac going to play games at 2x8K?

By buying a new Apple Silicon Mac when they ship. Given the perf/W of Apple's GPUs, if you are willing to pay for the (massive number of) transistors, Apple can make you a very powerful GPU. A 128-core GPU running at the same frequency as the GPU cores in the M1 would provide over 40 TFLOPS of fp32 (for context, that is more than an RTX 3090), and that 128-core GPU would draw less than 150 W! Apple's perf/W advantage and extreme focus over the last 10 years is going to pay off very well.


But that aside, for this headset the trick is the eye tracking. The two 8K displays do not mean you need to render two full 8K images; if you know exactly where the user is looking, then you maybe only need to render the equivalent of two 1080p images using a variable fill rate (i.e. the pixels where the user is looking are super small, but those on the edge of your vision are massive). The thing required to do this is very, very low input latency, and Apple has shown they can do very low input latency on the iPad Pro with the Pencil.
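
Rough arithmetic behind the 128-core figure, assuming throughput and power scale linearly from the M1's commonly cited ~2.6 TFLOPS fp32 / ~7 W for 8 GPU cores (a big assumption; memory bandwidth and packaging don't scale for free):

```python
M1_CORES = 8
M1_FP32_TFLOPS = 2.6   # commonly cited figure for the M1 GPU
M1_GPU_WATTS = 7       # approximate GPU-only power draw

def linear_scale(cores: int) -> tuple[float, float]:
    """Naive linear scaling of TFLOPS and watts with core count."""
    factor = cores / M1_CORES
    return M1_FP32_TFLOPS * factor, M1_GPU_WATTS * factor

tflops, watts = linear_scale(128)
print(f"128 cores: ~{tflops:.1f} TFLOPS fp32 at ~{watts:.0f} W")
# ~41.6 TFLOPS / ~112 W -- above an RTX 3090's ~36 TFLOPS fp32, and under 150 W
```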


On 2/7/2021 at 5:22 PM, DrMacintosh said:

The digital crown is a staple input method of the Apple Watch and is now also on the AirPods Max

Name me a better duo than Apple and their crowns.

 

Though honestly, they don't do too badly in my experience.



14 minutes ago, hishnash said:

By buying a new Apple Silicon Mac when they ship. Given the perf/W of Apple's GPUs, if you are willing to pay for the (massive number of) transistors, Apple can make you a very powerful GPU. A 128-core GPU running at the same frequency as the GPU cores in the M1 would provide over 40 TFLOPS of fp32 (for context, that is more than an RTX 3090), and that 128-core GPU would draw less than 150 W! Apple's perf/W advantage and extreme focus over the last 10 years is going to pay off very well.


But that aside, for this headset the trick is the eye tracking. The two 8K displays do not mean you need to render two full 8K images; if you know exactly where the user is looking, then you maybe only need to render the equivalent of two 1080p images using a variable fill rate (i.e. the pixels where the user is looking are super small, but those on the edge of your vision are massive). The thing required to do this is very, very low input latency, and Apple has shown they can do very low input latency on the iPad Pro with the Pencil.

I'd call an Apple home-brew on-chip GPU "meh". Its performance has been compared to AMD and Intel APUs and iGPUs, and it apparently holds its own there, but at the end of the day it's not a discrete GPU. For it to be even comparable with a low-end discrete GPU it would have to utterly crush the Xe, and it doesn't.



15 minutes ago, Bombastinator said:

I'd call an Apple home-brew on-chip GPU "meh". Its performance has been compared to AMD and Intel APUs and iGPUs, and it apparently holds its own there, but at the end of the day it's not a discrete GPU. For it to be even comparable with a low-end discrete GPU it would have to utterly crush the Xe, and it doesn't.

In perf/W it is well above what AMD, Nvidia, and Intel produce. And when it comes to scaling up a GPU (as we have seen with this last year's GPUs), the limit in the end is power and cooling. Yes, the M1 is not a very powerful GPU, but for a GPU that draws just 7 W it is a very, very powerful one. And there is nothing stopping Apple from scaling that GPU out by building dedicated dies with many, many GPU cores.

 

I expect the top-end configuration of the 16" MBP will have 2 GPU dies, each with 32 cores; this would give Apple a GPU roughly on par with an RTX 3070 (desktop card) while only drawing ~56 W plus memory power. Given that these would be on their own dies, many people might call them discrete GPUs, but as a software dev I will reserve the name "discrete GPU" for GPUs attached over slow (high-latency) PCIe-style connections, so in my eyes these will not be discrete GPUs even though they will outperform any laptop GPU on the market.

 

Of course, given the die area required for this (it would be over 240 mm²), it will cost a lot, but Apple has never been worried about the price of the top-end configuration of any of their products.
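
The same naive linear scaling applied to the 2 x 32-core configuration mentioned above (again just a sketch, not a benchmark):

```python
# Per-core figures derived from the M1: ~2.6 TFLOPS fp32 and ~7 W across 8 cores.
per_core_tflops = 2.6 / 8
per_core_watts = 7 / 8

cores = 2 * 32   # two hypothetical 32-core dies
print(f"~{cores * per_core_tflops:.1f} TFLOPS fp32")  # ~20.8, close to an RTX 3070's ~20 TFLOPS
print(f"~{cores * per_core_watts:.0f} W")             # ~56 W, the figure quoted above
```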


1 hour ago, hishnash said:

In perf/W it is well above what AMD, Nvidia, and Intel produce. And when it comes to scaling up a GPU (as we have seen with this last year's GPUs), the limit in the end is power and cooling. Yes, the M1 is not a very powerful GPU, but for a GPU that draws just 7 W it is a very, very powerful one. And there is nothing stopping Apple from scaling that GPU out by building dedicated dies with many, many GPU cores.

 

I expect the top-end configuration of the 16" MBP will have 2 GPU dies, each with 32 cores; this would give Apple a GPU roughly on par with an RTX 3070 (desktop card) while only drawing ~56 W plus memory power. Given that these would be on their own dies, many people might call them discrete GPUs, but as a software dev I will reserve the name "discrete GPU" for GPUs attached over slow (high-latency) PCIe-style connections, so in my eyes these will not be discrete GPUs even though they will outperform any laptop GPU on the market.

 

Of course, given the die area required for this (it would be over 240 mm²), it will cost a lot, but Apple has never been worried about the price of the top-end configuration of any of their products.

Still doesn't crush. Even if it is faster, it can't be faster than what? A 1030? Discrete vs. integrated GPU is still an order-of-magnitude difference. A really fast integrated GPU is still slower than a slow discrete card.



1 hour ago, Bombastinator said:

Still doesn't crush. Even if it is faster, it can't be faster than what? A 1030? Discrete vs. integrated GPU is still an order-of-magnitude difference. A really fast integrated GPU is still slower than a slow discrete card.

The M1's 8-core GPU, at just 7 W, is roughly on par with a desktop 1050 Ti. The fact that it is connected to the rest of the system by a much faster interconnect than any PCIe GPU helps it, not hinders it.


1 hour ago, hishnash said:

The M1's 8-core GPU, at just 7 W, is roughly on par with a desktop 1050 Ti. The fact that it is connected to the rest of the system by a much faster interconnect than any PCIe GPU helps it, not hinders it.

Except it's also not a whole lot faster than the APUs/iGPUs of other manufacturers, and those aren't anywhere near that fast, which implies there is marketing afoot. What the M1 isn't is slower than other laptops, which is what everyone was afraid it was going to be. It doesn't have to be some magical amazeballs thing; not sucking is a big enough win. The constant repetition of a single synthetic benchmark makes your claim seem less reliable to me rather than more. There was testing done: running native code it's significantly quicker at some things, but not everything. It's not half again faster though, which is the ratio I have seen cited as being easily noticeable. What's knocking people's socks off is the Rosetta implementation; that IS that much faster than previous systems of its type. Taken together it seems to be a really good machine. I might buy one. I haven't bought new Mac hardware since 2012. I used to sysop Macs; I've owned like 8 of them, desktops and laptops. The only desktop Mac I ever bought was an 840AV, and it was ballin' for its day. The M1s are expensive enough that one would need to be my main rig, though, and I like to play video games. What matters to me is what kind of FPS these things can generate in PC games, and the answer is: not enough for me. I looked hard.



Is it just me, or does $3,000 for two 8K screens sound like a decent deal?

 

