
4K HDR Blu Ray - Is it worth it?

nicklmg

Hi! What are good settings for that on an LG TV? I'm using an LG 55UH7700. Thanks.


6 hours ago, zMeul said:

I call bull

why? there are a couple of examples in Linus' video where the BluRay image is quite noticeably blurrier - to me it indicates that either the master is not the same or the people who processed the BluRay version are total fucktards

the Life of Pi side by side is the most telling

Isn't the one labelled Blu-Ray 1080p? The HDR one is 4k.

6 hours ago, Ciccioo said:

I don't really understand why I can see the difference between non-HDR and HDR content... on my non-HDR monitor. Can somebody explain that to me?

Because the camera they recorded the screen with has enough dynamic range to see the difference, but that dynamic range is then compressed so it can be shown on displays that don't have as much range. Also, in certain shots there are differences beyond just the dynamic range.

 

I wonder if perhaps converting the footage to greyscale (or converting it on the TV if there's such an option) would better show the difference.
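(Rough Python sketch of that idea - the nit values below are made up purely for illustration, not taken from the video:)

```python
# Minimal sketch: why a wide-dynamic-range capture can still *show* an
# HDR vs SDR difference after being squeezed into 8-bit video.
import numpy as np

# Hypothetical scene luminances in nits, from deep shadow to a bright highlight.
scene = np.array([0.05, 1.0, 10.0, 100.0, 400.0, 1000.0])

# An SDR display clips everything above ~100 nits before the camera ever sees it.
sdr_panel = np.clip(scene, 0, 100)

# An HDR display reproduces (most of) the highlight range.
hdr_panel = np.clip(scene, 0, 1000)

def tone_map_to_8bit(nits):
    """Reinhard-style compression of the camera's capture into 0-255."""
    normalized = nits / 100.0              # treat 100 nits as "paper white"
    mapped = normalized / (1.0 + normalized)
    return np.round(mapped * 255).astype(int)

print("SDR panel -> video:", tone_map_to_8bit(sdr_panel))
print("HDR panel -> video:", tone_map_to_8bit(hdr_panel))
# The highlight steps the SDR panel crushed to one value stay distinguishable
# in the HDR capture, even though both end up as 8-bit video.
```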


2 minutes ago, The Ran said:

Isn't the one labelled Blu-Ray 1080p? The HDR one is 4k.

if I recall, he said the master was a 2k format!?

 

Quote

I wonder if perhaps converting the footage to greyscale (or converting it on the TV if there's such an option) would better show the difference.

no it won't - the number of gray tones is identical, the color spectrum does not change


Just now, zMeul said:

if I recall, he said the master was a 2k format!?

Same thing pretty much, that's still a great difference in resolution.

 

 

Anyone know how I can delete my duplicate posts? I was on the first page of this thread whilst posting and there was no indication that my post was going through.


15 minutes ago, The Ran said:

Same thing pretty much, that's still a great difference in resolution.

you don't understand

the master is the same 2k in both cases

 

there are two UHD versions of this BluRay:

[cover images of the two UHD BluRay versions]


11 minutes ago, zMeul said:

no it won't - the number of gray tones is identical, the color spectrum does not change

I imagine it would still show the difference in the dynamic range whilst eliminating (or at least limiting) the differences in the colourspace.

1 minute ago, zMeul said:

you don't understand

the master is the same 2k in both cases

 

there are two UHD versions of this BluRay, well .. there are more, but these are the ones we're talking about:

[cover images of the two UHD BluRay versions]

So you meant 4k, not 2k? Are you sure that copy on the left isn't HDR? If both the copies they were playing were 4k then the difference would either be down to the camera (missed focus on one recording perhaps), crappy processing as you said (perhaps a marketing tactic to make the HDR copy seem even better), or if they're different formats (someone mentioned BDXL earlier) then maybe the HDR version has a higher bitrate or some shit.


5 minutes ago, The Ran said:

I imagine it would still show the difference in the dynamic range whilst eliminating (or at least limiting) the differences in the colourspace.

So you meant 4k, not 2k? Are you sure that copy on the left isn't HDR? If both the copies they were playing were 4k then the difference would either be down to the camera (missed focus on one recording perhaps), crappy processing as you said (perhaps a marketing tactic to make the HDR copy seem even better), or if they're different formats (someone mentioned BDXL earlier) then maybe the HDR version has a higher bitrate or some shit.

that's not how color space works - if you are in the same color space you have the same amount of gray tones no matter if the image is B&W or color

 

no, the master, the original source was 2k and was upscaled to UHD in both cases

no, the one on the left is plain UHD - HDR BluRays are very specifically marked HDR and/or UltraHD Premium (HDR10) or Dolby Vision (12bit HDR)


4 minutes ago, zMeul said:

that's not how color space works - if you are in the same color space you have the same amount of gray tones no matter if the image is B&W or color

 

no, the master, the original source was 2k and was upscaled to UHD in both cases

no, the one on the left is plain UHD - HDR BluRays are very specifically marked HDR and/or UltraHD Premium (HDR10) or Dolby Vision (12bit HDR)

But they're different colourspaces, one is Rec709 or whatever and the other is Rec2020. Personally I'm more interested in the dynamic range difference, light to dark, as that's something more noticeable to me than more colours. As long as the dynamic range of that TV is less than that of the recording camera then the difference can be captured.


17 minutes ago, The Ran said:

But they're different colourspaces, one is Rec709 or whatever and the other is Rec2020. Personally I'm more interested in the dynamic range difference, light to dark, as that's something more noticeable to me than more colours. As long as the dynamic range of that TV is less than that of the recording camera then the difference can be captured.

but you are displaying it on a standard PC monitor that has 8 bits per channel, i.e. 256 tones per channel - you are not going to get more gray tones

isn't that what we're talking about, or is it something else?

ps: Rec709 uses the same primaries as sRGB
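(Quick arithmetic on the tones-per-channel point - a tiny Python check, nothing display-specific assumed:)

```python
# The number of distinct gray levels is set by bit depth alone,
# regardless of whether the image is B&W or color.
for bits in (8, 10, 12):
    print(f"{bits}-bit per channel -> {2 ** bits} tones per channel")
# 8-bit  ->  256
# 10-bit -> 1024
# 12-bit -> 4096
```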


1 minute ago, zMeul said:

but you are displaying it on a standard PC monitor that has 8 bits per channel, i.e. 256 tones per channel - you are not going to get more gray tones

isn't that what we're talking about, or is it something else?

It would only be a representation of the difference between HDR and non-HDR's dynamic range. Of course you aren't going to see the actual entire dynamic range but it could give an idea of how large or small the difference is.

 

Another way to think of it is to imagine we wanted to show the difference between UHD and HD on a 1920x1080 display. We'd have to show the UHD as 1920x1080 and then show HD as 960x540.
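(A rough sketch of that analogy in Python, using the standard resolutions - nothing from the actual video:)

```python
# To *represent* the UHD-vs-HD gap on a 1920x1080 display, scale both by the
# same factor so UHD fills the screen and HD lands at a quarter of it.
uhd = (3840, 2160)
hd = (1920, 1080)
display = (1920, 1080)

scale = display[0] / uhd[0]   # shrink factor that fits UHD onto the display

uhd_shown = (int(uhd[0] * scale), int(uhd[1] * scale))
hd_shown = (int(hd[0] * scale), int(hd[1] * scale))

print("UHD shown as:", uhd_shown)   # (1920, 1080)
print("HD shown as: ", hd_shown)    # (960, 540)
```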


14 minutes ago, The Ran said:

It would only be a representation of the difference between HDR and non-HDR's dynamic range. Of course you aren't going to see the actual entire dynamic range but it could give an idea of how large or small the difference is.

 

Another way to think of it is to imagine we wanted to show the difference between UHD and HD on a 1920x1080 display. We'd have to show the UHD as 1920x1080 and then show HD as 960x540.

as I said, if the color space does not change you cannot have more graytones - period

even if you record the screen with a camera capable of Adobe RGB you are still going to have the same number of graytones

 

but you need the same god damn master without any post-process bullshit or color grading differences

for example, The Revenant uses a different color grade in a couple of scenes

 

someone here pointed out that even Sicario has different BluRay versions - with regard to Linus pointing out the color banding


3 minutes ago, zMeul said:

as I said, if the color space does not change you cannot have more graytones - period

even if you record the screen with a camera capable of Adobe RGB you are still going to have the same number of graytones

 

but you need the same god damn master without any post-process bullshit or color grading differences

for example, The Revenant uses a different color grade in a couple of scenes

But the colourspace for the two sources is different. The camera should be able to see that difference and compress it down to something that is visible on a non-HDR display.

 

Let's simplify things and say non-HDR is 8bit greyscale, HDR is 10bit greyscale, the camera can record 10bit, and we want to see it on an 8bit display. The camera would show us HDR as 8bit and the non-HDR as something less than that.
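(A toy version of that simplification in Python - the code values are made up just to show the scaling:)

```python
# Illustrative only: pretend the HDR grade uses the full 10-bit range while the
# SDR grade only ever reaches what an 8-bit range can express.
import numpy as np

hdr_codes = np.array([0, 256, 512, 768, 1023])   # spans the whole 10-bit container
sdr_codes = np.array([0, 64, 128, 192, 255])     # sits in the lower slice of it

def camera_to_8bit(codes_10bit):
    """Compress the camera's 10-bit capture into the 8-bit display range."""
    return np.round(codes_10bit / 1023 * 255).astype(int)

print("HDR as seen on the 8-bit display:", camera_to_8bit(hdr_codes))  # spans 0..255
print("SDR as seen on the 8-bit display:", camera_to_8bit(sdr_codes))  # spans only 0..~64
```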


2 minutes ago, The Ran said:

the camera would show us HDR as 8bit and the non-HDR as something less than that.

actually it won't - both will be 8bit xD


Just now, zMeul said:

actually it won't - both will be 8bit xD

Why would they? The camera can see the difference. Yes if you viewed the raw 10bit footage on an 8bit display then they'd be identical, but that's not what I'm talking about; I'm talking about compressing the dynamic range so that the 10bit scales to 8bit.


12 minutes ago, The Ran said:

Yes if you viewed the raw 10bit footage on an 8bit display

if you view the 10bit content on an 8bit panel, the experience will be absolutely horrible - worse than the actual 8bit BluRay

you will have a lot of color shift because the display will try to closely approximate the colors instead of matching them

it will be similar to displaying Adobe RGB content on an sRGB monitor


On 27/12/2016 at 6:19 PM, nicklmg said:

Buy 4K HDR Blu Rays on Amazon: http://geni.us/05cCs

 

Is the latest "upgrade" from the entertainment industry truly a step in the right direction, or is it another half-baked semi-upgrade solely designed to outdate your current home theater setup?

 

 

HDR is the new king :) (does Scorpio have HDR?)


Learning that Atmos support is coming is starting to make me consider a One S. I also need a wireless controller for my PC gaming (I hate keyboard-and-mouse play in general and it's even worse on the couch), so it might be time to grab one.


On 2016-12-28 at 0:40 PM, The Ran said:

Same thing pretty much, that's still a great difference in resolution.

 

 

Anyone know how I can delete my duplicate posts? I was on the first page of this thread whilst posting and there was no indication that my post was going through.

Just so you know, 2K is nearly identical to 1080p. DCI Cinema 2K = 2048 x 1080, whereas 1080p = 1920 x 1080. Realistically, both are considered "2K" resolutions.
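(Quick pixel math on that:)

```python
# DCI 2K and 1080p differ by only ~7% in total pixel count.
dci_2k = 2048 * 1080
full_hd = 1920 * 1080
print(f"DCI 2K: {dci_2k:,} px")                            # 2,211,840
print(f"1080p:  {full_hd:,} px")                           # 2,073,600
print(f"2K has {dci_2k / full_hd - 1:.1%} more pixels")    # 6.7%
```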


On 12/28/2016 at 2:07 AM, The Ran said:

So, how would Rec2020 compare to a decent 10bit IPS monitor (what I have, P2415Q)? Would it ever be possible in the future for such a monitor to display these HDR films as they were intended to be?

I made a post about HDR that I recommend you read. It also explains the difference between color space and color depth.

Hopefully it should explain what HDR is and why it wouldn't work on a regular monitor.

 

 

On 12/28/2016 at 11:38 AM, zMeul said:

I call bull

why? there are a couple of examples in Linus' video where the BluRay image is quite noticeably blurrier - to me it indicates that either the master is not the same or the people who processed the BluRay version are total fucktards

the Life of Pi side by side is the most telling:

<picture>

 

a PC gaming analogy would be that they used blur to mask off lower-res textures

I can think of two reasons.

1) It's not the exact same frame.

2) The picture from the video was taken with a camera, so if the focus on the 1080p version shot was slightly better, then that version will end up looking sharper. On top of that, the video encoding might have been harsher on the 4K photo for whatever reason.

 

You really shouldn't use the comparison shots in the video to judge SDR vs HDR. You should never, ever judge a display's image quality by looking at a picture of the display, on your own computer monitor. It's just fluff to make the video more visually interesting, but don't use it as fact.

It's like if you tried to judge the taste of a meal, by looking at a picture of it... After your friend had already chewed it and spat it out.

 

 

That also explains why you are seeing a difference between the two pictures even though your monitor is limited to standard dynamic range. It's because the final image you're seeing has been processed to death by your own monitor, the YouTube encoding, the Premiere encoding, the color correction they applied to the video, the camera's sensor which doesn't capture everything perfectly, and possibly even different camera settings. Not to mention that mapping from HDR to SDR is not always flawless. All those steps add their own small variances which build up into a big difference.

 

Like Ciccioo showed in his post, the video looks way different on his computer than it does in Linus' video. That's because his screenshot has not been butchered by all the steps I mentioned earlier.


23 hours ago, dalekphalm said:

Just so you know, 2K is near identical to 1080p. DCI Cinema 2K = 2048 x 1080, whereas 1080p = 1920 x 1080p. Realistically, both are considered "2K" resolutions.

 1080p

UltraWide 1080p and 2k are the same because one measures horizontal lines, and one measures vertical lines.

 

Probably a combination of "4K" sounding more impressive than "2K", and 1080p being from an era when interlacing was more prevalent and counting scanlines was a more reasonable measure of resolution.


3 minutes ago, xnamkcor said:

 1080p

UltraWide 1080p and 2k are the same because one measures horizontal lines, and one measures vertical lines.

 

Probably a combination of "4K" sounding more impressive than "2K", and 1080p being from an era when interlacing was more prevalent and counting scanlines was a more reasonable measure of resolution.

Err what? When you refer to a resolution as "xK", such as 2K, it always refers to the horizontal resolution.

 

Ultrawide 1080p is an odd one, but it's not 2K. You could call it 2.5K at best (same as 1440p). I know that some people incorrectly call 1440p "2K" as well, but those people are wrong and are spreading incorrect terminology.
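(A small sketch of that convention - the rounding rule here is my own approximation of common usage, not any official spec:)

```python
# "K" labels come from the horizontal pixel count, here rounded to the
# nearest half-thousand.
def k_label(width: int) -> str:
    return f"{round(width / 1000 * 2) / 2:g}K"

for name, width in [("DCI 2K", 2048), ("1080p", 1920), ("Ultrawide 1080p", 2560),
                    ("1440p", 2560), ("UHD", 3840), ("DCI 4K", 4096)]:
    print(f"{name:16} {width} px wide -> {k_label(width)}")
# 2048 -> 2K, 1920 -> 2K, 2560 -> 2.5K, 3840 -> 4K, 4096 -> 4K
```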


1 hour ago, LAwLz said:

You really shouldn't use the comparison shots in the video to judge SDR vs HDR. You should never, ever judge a display's image quality by looking at a picture of the display, on your own computer monitor. It's just fluff to make the video more visually interesting, but don't use it as fact.

It's like if you tried to judge the taste of a meal, by looking at a picture of it... After your friend had already chewed it and spat it out.

 

That also explains why you are seeing a difference between the two pictures even though your monitor is limited to standard dynamic range. It's because the final image you're seeing has been processed to death by your own monitor, the YouTube encoding, the Premiere encoding, the color correction they applied to the video, the camera's sensor which doesn't capture everything perfectly, and possibly even different camera settings. Not to mention that mapping from HDR to SDR is not always flawless. All those steps add their own small variances which build up into a big difference.

what you didn't understand is that I'm not complaining about the HDR images, but about the standard ones - they are blurry

my standard display has zero issues with reproducing 8bit content

 

the comparison is skewed from the start if the same master wasn't used to produce the standard and HDR BluRays - that's what I'm debating

and it's very clear it wasn't

 

they are still images, where YT's compression algorithm does the least damage

if this were a single case, yes, I would say you're right... but it's in every example


20 minutes ago, xnamkcor said:

 1080p

UltraWide 1080p and 2k are the same because one measures horizontal lines, and one measures vertical lines.

 

Probably a combination of "4K" sounding more impressive than "2K", and 1080p being from an era when interlacing was more prevalent and counting scanlines was a more reasonable measure of resolution.

+ @dalekphalm

 

1080p always stands for 1920 x 1080 16:9 format

 

1440p, same 16:9 format

 

UHD or 2160p - same 16:9 format

 

---

 

2K and 4K - and we're talking about 2048 × 1080 and 4096 × 2160 - are 1.9:1 aspect ratio
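(Checking those aspect ratios - straight arithmetic:)

```python
# Aspect ratios of the formats listed above.
formats = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "UHD / 2160p": (3840, 2160),
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name:12} {w} x {h} -> {w / h:.2f}:1")
# The consumer formats come out at 1.78:1 (16:9); the DCI ones at ~1.90:1.
```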


40 minutes ago, dalekphalm said:

Err what? When you refer to a resolution as "xK", such as 2K, it always refers to the horizontal resolution.

 

Ultrawide 1080p is an odd one, but it's not 2K. You could call it 2.5K at best (Same as 1440p). Though I know that some people incorrectly call 1440p "2K" as well, but those people are wrong and are spreading incorrect terminology.

 

There is no such thing as 2K, so nothing anybody refers to as 2K is correct terminology.

