Trick or M3-treat? - Apple’s pre-Halloween “Scary Fast” virtual event

saltycaramel
1 minute ago, leadeater said:

 

Also, the other factor that has kept Apple away from OLED is that the only premier source is Samsung, particularly with multi-stack. LG is coming out soon. Lots of companies, like Apple, don't like being in single-source situations, as that can lead to multiple issues, so it's best to just avoid it.

Fair enough, wish it were like plasma where you could just run a moving image for a bit and it'd be fine. I've noticed phones don't have those issues, but don't the Apple Watch and iPhone 15 already have dual-layer? They reach like 2300 nits peak brightness, and it seems like other brands can magically double their peak brightness as well.


11 minutes ago, DANK_AS_gay said:

but don't the Apple Watch and iPhone 15 already have dual-layer?

Nope, same actual screen as the previous iPhone 14. So unless that was dual-layer too, it wouldn't be. I don't follow phones closely enough to know; from what I'm reading, Apple's first use of it would be the 2024 iPad Pro.

 

Apple will give the new screen tech a new brand name; "Super Retina XDR" is the current one. Wonder what it'll be called?


20 minutes ago, leadeater said:

Like I mentioned, OLEDs in 2023 aren't the same as in 2016. People acted the same way about NAND wear, for example. People still do, even now.

I think what OEMs do to mitigate the issue of retention and burn-in is shift the pixels. But tbh I'm not yet confident an OLED laptop won't have display problems in a few years, especially with the macOS dock or Windows taskbar always being there, compared to a phone with full-screen apps where the content changes every time.

There is more that meets the eye
I see the soul that is inside

 

 


41 minutes ago, saltycaramel said:

Houston, we have a benchmark:

https://browser.geekbench.com/v6/cpu/3343681

 

 

GB6.2 single-core scores

 

Apple M3 inside a 14.2" 15.5mm-thick laptop: 3076 

 

Snapdragon X Elite inside a 14.5" 15mm-thick laptop (23W TDP): 2780 

(Windows)

 

Snapdragon X Elite inside a 16.5" 16.8mm-thick laptop (80W TDP): 2971

(Windows)

 

Snapdragon X Elite inside a 16.5" 16.8mm-thick laptop (80W TDP): 3236

(Linux, fans at 100% full blast all of the time, why not also liquid nitrogen OCing at that point?)

 

Apple M3 shipping next week.

Snapdragon X Elite laptops shipping in "mid 2024", so like in 8 months.

 

 

Taking all of this into consideration, do you think it's fair for the Qualcomm guys to be so adamant in interviews and presentations that they are currently the "single core kings"? Are they really?

 

I did scratch my head when Qualcomm put so much of its energy into claiming an advantage over M2. You don't brag about taking the lead when you probably won't hold that lead by the time your product ships. Even the more pessimistic rumors still had Apple shipping M3 early next year.


3 minutes ago, captain_to_fire said:

But tbh I'm not yet confident an OLED laptop won't have display problems in a few years, especially with the macOS dock or Windows taskbar always being there, compared to a phone with full-screen apps

 

Trust the process. There's a reason Apple wouldn't touch single-stack OLEDs for large displays with a ten-foot pole, and there's a reason they'll suddenly move their entire iPad/MacBook line-up to dual-stack OLEDs in the span of 2-3 years. We'll know more about it (in the form of shiny marketing speak) come next spring.


1 minute ago, saltycaramel said:

We'll know more about it (in the form of shiny marketing speak) come next spring.

They're probably still gonna call it "Super Retina XDR" even if it's an OLED Mac/iPad, because that's what Apple's been calling the OLED on the Pro iPhones.

There is more that meets the eye
I see the soul that is inside

 

 


8 hours ago, leadeater said:

OLEDs have always been lower peak brightness, but the truth is you'll never be running your screen at a brightness setting that hits that 1600-nit peak, because it does actually hurt.

Here I come with a late reply.

 

I was in the whole "you don't really need the super high brightness screen" camp until I saw the Dolby Pulsar display. 4000 nits peak. And it does make a pretty big difference. If you ever get the opportunity to see that display in a grading studio, do it. It's pretty amazing the difference it makes when used correctly, and it almost looks sharper, as the eye sees contrast as sharpness.

 

As far as "too bright, it's painful" goes, I don't really see that argument. There are probably things brighter than 1600 nits in most rooms, like the lights, and things outside are much brighter. I think a sidewalk in the sun is often over 10,000 nits. So brightness-wise it's still pretty far off from the brightness levels in most rooms.


I could tell stories of epic HDR movie/series/video watching on the couch or in bed, and “wow moments” with my 1600-nit/1000-nit 16” MBP over the last 2 years...

 

Plus, there are professionals out there who work with HDR video (nowadays that could even be YouTubers).

 

“Can you point me to an alternative?”

“Well you could just get an HDR600 laptop...”

🤦🏻‍♂️


2 hours ago, captain_to_fire said:

I think what OEMs do to mitigate the issue of retention and burn-in is shift the pixels. But tbh I'm not yet confident an OLED laptop won't have display problems in a few years, especially with the macOS dock or Windows taskbar always being there, compared to a phone with full-screen apps where the content changes every time.

I decided to gamble and daily-drive an OLED TV as a monitor; let's see how long it lasts lol

19 minutes ago, saltycaramel said:

I could tell stories of epic HDR movie/series/video watching on the couch or in bed, and “wow moments” with my 1600-nit/1000-nit 16” MBP over the last 2 years...

 

Plus, there are professionals out there who work with HDR video (nowadays that could even be YouTubers).

 

“Can you point me to an alternative?”

“Well you could just get an HDR600 laptop...”

🤦🏻‍♂️

I may be the weird one, but I honestly couldn't care less about that HDR stuff.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


23 hours ago, saltycaramel said:

Out of curiosity, can somebody point me to a $1599 14.2” Windows laptop with a miniLED 1600-nit/1000-nit VRR 24-120Hz display and a flexless, sturdy, unibody aluminum build?

 

So I can better appreciate how Apple is ripping off the M3 14” MBP buyers with those pesky 8GB of RAM in the base $1599 config...

So much this. As per usual, people here ignore everything besides the naked spec numbers on the packaging/label. It's even worse below the LTT video (top comment).

They also have no clue what the actual user experience on an ARM MacBook with 8GB of RAM is like outside of high-performance use cases.

 

„Pro“ in no way necessarily means „requiring high peak performance“, as stupid as the Pro label generally is.


3 hours ago, Electronics Wizardy said:

As far as "too bright, it's painful" goes, I don't really see that argument. There are probably things brighter than 1600 nits in most rooms, like the lights, and things outside are much brighter. I think a sidewalk in the sun is often over 10,000 nits. So brightness-wise it's still pretty far off from the brightness levels in most rooms.

Emissive light from a screen is very different from sunlight, and the way it's measured is very different too; that's not the light energy going into your eyes. Sunlight can be 127,000 lumens per square meter, and that's direct, not reflected. Some of the most powerful flashlights are 120,000 lumens, and you absolutely cannot point them at people's faces.

 

A typical living room can be 3000 lumens or 875 nits.

 

Quote

Nits measure the intensity of that light. In other words, nits tell you how bright the light appears to be, while lumens tell you how much light is actually being emitted.

Anyway, knowing the difference between luminance and illuminance is quite important. While you can convert lumens to nits, understanding that they aren't really the same thing matters, so a conversion like the one above isn't even really a correct thing to do.
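
To put numbers on that, here's a minimal sketch (my own illustration, not from any standard) of the only case where the conversion even half-works: a perfectly diffuse (Lambertian) reflecting surface. Real rooms and emissive screens don't satisfy that assumption, which is exactly the point.

import math

# Hypothetical illustration: illuminance (lux = lumens per square meter, light
# falling ON a surface) vs luminance (nits = cd/m^2, light leaving it toward
# your eye). For an ideal diffuse surface:
#   luminance = illuminance * reflectance / pi
def diffuse_surface_nits(lux: float, reflectance: float) -> float:
    return lux * reflectance / math.pi

# Concrete (reflectance ~0.3) under ~100,000 lux of direct sun:
print(diffuse_surface_nits(100_000, 0.3))  # ~9,549 nits, the "sidewalk" ballpark above

An emissive display skips the reflection step entirely, which is why comparing its nits against room lux is apples to oranges.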

 

3 hours ago, Electronics Wizardy said:

I was in the whole "you don't really need the super high brightness screen" camp until I saw the Dolby Pulsar display. 4000 nits peak. And it does make a pretty big difference. If you ever get the opportunity to see that display in a grading studio, do it. It's pretty amazing the difference it makes when used correctly, and it almost looks sharper, as the eye sees contrast as sharpness.

LCDs and their derivatives do actually need to be brighter than OLED to give a good HDR experience. They need to be brighter to get a greater difference between dark and bright, whereas an OLED can do much more effective blacks, and more precisely (per pixel), so you don't actually need the same maximum brightness for an equivalent image.

 

I'll cover that screen after this. HDR content is reference-mastered at 1000 nits, and this is actual calibrated and measured nits, not a spec-sheet figure.

Quote

HDR (Dolby Vision) Reference Monitor (Minimum Requirements):

Color Gamut: P3
White Point: D65
EOTF: SMPTE 2084 (PQ)
Black Level: 0.005 Nits
Peak Luminance: 1000 Nits
Measured Contrast Ratio: 200,000:1

https://professionalsupport.dolby.com/s/article/What-display-should-I-use-for-creating-Dolby-Vision?language=en_US

https://partnerhelp.netflixstudios.com/hc/en-us/articles/360000591787-Color-Critical-Display-Calibration-Guidelines
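
Side note for the curious: the "EOTF: SMPTE 2084 (PQ)" line in that spec is the transfer function that maps the encoded signal value to absolute luminance. A minimal sketch of the published ST 2084 EOTF (the constants come from the standard; the function name is mine):

# SMPTE ST 2084 (PQ) EOTF: PQ-encoded signal V in [0, 1] -> luminance in nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(v: float) -> float:
    p = v ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))   # 10000.0 nits, the format's absolute ceiling
print(pq_eotf(0.75))  # ~983 nits, so the 1000-nit mastering target sits near signal 0.75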

 

Back to that Dolby Pulsar screen: it's not actually 4000 nits. It has 3 different colours of LEDs, RGB, for more accurate colours. Nor does Dolby rate the screen at 4000 nits; I guess it's technically possible if all 3 LEDs were on at maximum brightness, but that's not what it actually does. Their actual specification is 600 cd/m2, or 600 nits, which is why the product is no longer sold and was withdrawn, as it does not meet their HDR reference requirements.

https://www.blutek.it/download/pdf/ProMonitor_OverviewSpecsheet.pdf

 

The key fundamental to remember about HDR is actually its name: High Dynamic Range. It's not at all about peak brightness; it's about the range of light, the difference between dark and light, and of each colour. That's why HDR requires Wide Colour Gamut and has contrast-ratio requirements: if you can't achieve the black level, you have to compensate with brightness (1000 / 0.005 = 200,000).
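
That last bit of arithmetic is worth playing with, because it shows the requirement works backwards from black level (a hypothetical helper, just restating the numbers above):

# Dolby's reference contrast ratio fixes the relationship between black level
# and the peak luminance needed to achieve it: peak = black_level * ratio.
def peak_needed(black_nits: float, ratio: float = 200_000) -> float:
    return black_nits * ratio

print(peak_needed(0.005))  # 1000.0 nits, the reference target quoted above
print(peak_needed(0.05))   # 10000.0 nits: 10x worse blacks would need 10x the peak

Which is also why an OLED, with a near-zero black level, doesn't need LCD-level peak brightness to deliver an equivalent range.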


3 hours ago, saltycaramel said:

”Well you could just get an HDR600 laptop..”.

🤦🏻‍♂️

Not knowing the difference between HDR600 and HDR600 True Black is the bigger facepalm lol. Your MacBook Pro is multiple HDR qualification tiers below the laptop I pointed you to 🤷‍♂️

 

Which also isn't actually a true indicator of how good a screen is, either. Everyone loves to misuse spec sheets and ratings, and it's beyond me to stop them.

 

3 hours ago, saltycaramel said:

Plus, professionals out there that work with HDR videos. (nowadays it could even be YouTubers)

I suggest you read the above; you might actually learn something useful beyond looking at the spec sheet of a screen 😉

 

You have a fatal habit of going "it's made by Apple so it's the best" even in the face of it not being true. Whenever you're shown that it's not, rather than saying "that's some good information", you just deny it. I suggest you don't do that, now or ever again. It's actually not helpful to yourself.

 

Just remember it doesn't actually matter; Apple's screen is great. Many screens are in fact great. It doesn't matter and never did. I showed you the laptop most equivalent to what you wanted to compare against at the price point, and all you wanted to do was argue about OLED without having any idea at all. Not the best move when I had already said the laptop overall isn't even as good.


6 hours ago, captain_to_fire said:

I think what OEMs do to mitigate the issue of retention and burn-in is shift the pixels. But tbh I'm not yet confident an OLED laptop won't have display problems in a few years, especially with the macOS dock or Windows taskbar always being there, compared to a phone with full-screen apps where the content changes every time.

I'm honestly not sure the problem will ever be completely gone. We'll for sure get better at preventing and correcting it, but I'm not going to bank on it totally going away.

 

The thing about image retention and the TFT layer is that dual/multi-stack OLED still only has one TFT layer in the structure of the display panel, so it won't address this. If you want to be scared away even more, watch this.

 

 

Apple's next "controversy" will be OLED screen burn, just predicting that now.


1 hour ago, leadeater said:

Back to that Dolby Pulsar screen: it's not actually 4000 nits. It has 3 different colours of LEDs, RGB, for more accurate colours. Nor does Dolby rate the screen at 4000 nits; I guess it's technically possible if all 3 LEDs were on at maximum brightness, but that's not what it actually does. Their actual specification is 600 cd/m2, or 600 nits, which is why the product is no longer sold and was withdrawn, as it does not meet their HDR reference requirements.

https://www.blutek.it/download/pdf/ProMonitor_OverviewSpecsheet.pdf

 

I'm pretty sure it was a 4000-nit monitor. I think the one you linked was an older model that was also in the mastering studio. I think they only made a few dozen of the 4000-nit monitors, and it was much brighter than the one in the PDF you linked. I can't seem to find any specs for it online, other than some articles saying Dolby was gonna make a 4000-nit monitor.

 


7 minutes ago, Electronics Wizardy said:

I'm pretty sure it was a 4000-nit monitor. I think the one you linked was an older model that was also in the mastering studio. I think they only made a few dozen of the 4000-nit monitors, and it was much brighter than the one in the PDF you linked. I can't seem to find any specs for it online, other than some articles saying Dolby was gonna make a 4000-nit monitor.

I've not seen any other than that one, and multiple places list it as 4000 nits as well. The standard is 1000/2000/4000, with 1000 being full-screen brightness, 2000 being L32 (I think one-quarter screen area?), and 4000 I'm not actually sure about, but it'll be an even smaller screen area. The new one will be 1000 full-screen brightness, up from 600. Odd that there is literally no mention of a newer one anywhere.


1 minute ago, leadeater said:

I've not seen any other than that one, and multiple places list it as 4000 nits as well. The standard is 1000/2000/4000, with 1000 being full-screen brightness, 2000 being L32 (I think one-quarter screen area?), and 4000 I'm not actually sure about, but it'll be an even smaller screen area. The new one will be 1000 full-screen brightness, up from 600. Odd that there is literally no mention of a newer one anywhere.

It seems like a "contact us" type of product. Dolby said they don't sell them, just rent them out to studios to use.

 

It's a very impressive display though. I think it was about 42 inches and 1080p like the old model. Near-perfect blacks, and extremely bright. Not sure how it worked, but maybe dual-layer LCD, as I didn't see any visible blooming, and it was able to show the full screen very bright with no noticeable dimming of other elements.


1 minute ago, Electronics Wizardy said:

It seems like a "contact us" type of product. Dolby said they don't sell them, just rent them out to studios to use.

 

It's a very impressive display though. I think it was about 42 inches and 1080p like the old model. Near-perfect blacks, and extremely bright. Not sure how it worked, but maybe dual-layer LCD, as I didn't see any visible blooming, and it was able to show the full screen very bright with no noticeable dimming of other elements.

Still 1080p? I mean, I guess that is theatre resolution. And yeah, it should be bright; reference SDR full-screen brightness is 100 nits, so if you're in HDR mode and the screen is displaying mostly bright white, it's going to be damn bright. The thing about brightness, though, other than needing it for the contrast ratio, is that it's most useful for overcoming ambient light: you need more brightness the brighter the ambient is. A cinema theatre screen is only around 48 nits for SDR and 100 for HDR.


Just now, leadeater said:

Still 1080p? I mean, I guess that is theatre resolution. And yeah, it should be bright; reference SDR full-screen brightness is 100 nits, so if you're in HDR mode and the screen is displaying mostly bright white, it's going to be damn bright. The thing about brightness, though, other than needing it for the contrast ratio, is that it's most useful for overcoming ambient light: you need more brightness the brighter the ambient is. A cinema theatre screen is only around 48 nits for SDR and 100 for HDR.

I was surprised how big of a difference 4000 nits made in a dark mastering room. Normally I'd think a dark room means you don't need a bright screen, but there is a significant difference in having the ability to peak at much brighter levels.

 

Movie theaters are probably gonna have to go to LED walls or something like that for "true" HDR. I don't see a way of getting projectors much brighter at that screen size. It should also help with black levels compared to a reflective screen. I could see LED walls being the next big cinema selling feature, as cinemas seem to keep trying to find technology to get people into the theater.


29 minutes ago, Electronics Wizardy said:

I was surprised how big of a difference 4000 nits made in a dark mastering room. Normally I'd think a dark room means you don't need a bright screen, but there is a significant difference in having the ability to peak at much brighter levels.

I'd still be very careful in assuming you were seeing 4000-nit output; it also won't be for full-screen brightness. What I dislike about "peak brightness" is that there doesn't seem to be a good standard everyone sticks to. Also, what a display can do and what it is doing aren't the same thing either.

 

Based on this professional mastering monitor, I would assume Dolby's is 4000 nits at L32.

[image: Flanders Scientific XM312U peak luminance specs]

https://www.flandersscientific.com/XM312U/tech-specs.php

 

This seems to be quite rare; Sony's, for example, is only 1000, with no mention of L32 or other window sizes. Anyway, I really do find these overly bright images too much. Being blasted with 1000 nits of actual full-screen brightness is not enjoyable, but fortunately movies don't actually do that.

 

29 minutes ago, Electronics Wizardy said:

Movie theaters are probably gonna have to go to LED walls or something like that for "true" HDR

That is true HDR; 100 nits is Dolby Vision cinema.


5 hours ago, igormp said:

I may be the weird one, but I honestly couldn't care less about that HDR stuff.

I wish Windows handled it better. 

My monitor supports HDR1000, but I never use it because it makes everything SDR (so like 99% of everything I see) look like ass. 


4 minutes ago, LAwLz said:

I wish Windows handled it better. 

My monitor supports HDR1000, but I never use it because it makes everything SDR (so like 99% of everything I see) look like ass. 

I don't even use Windows 🤷‍♂️

But even on streaming apps I don't notice such things.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


1 hour ago, LAwLz said:

I wish Windows handled it better. 

My monitor supports HDR1000, but I never use it because it makes everything SDR (so like 99% of everything I see) look like ass. 

I had an issue where it would look fine on screen, but if I used the Snipping Tool (I think?) to grab an image and then pasted it into Discord, it would look completely fucked lol


1 hour ago, igormp said:

I don't even use Windows 🤷‍♂️

But even on streaming apps I don't notice such things.

I think you have not seen good HDR on a TV that can display it properly. It is not a subtle improvement.


1 hour ago, Obioban said:

I think you have not seen good HDR on a TV that can display it properly. It is not a subtle improvement.

I do have both a 14" MBP and an LG C2; afaik those do have good panels.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


8 minutes ago, igormp said:

I do have both a 14" MBP and an LG C2; afaik those do have good panels.

A good panel is half the equation.

