
This TINY TV Costs $20,000

Plouffe

 

The XMP550 QD OLED mastering monitor from Flanders Scientific is here and it's incredible, but while twenty grand is less than the competition, it's still a LOT of money. Do studios really need them?

 

Check out the Flanders Scientific XMP550 QD OLED Mastering Monitor: https://lmg.gg/RUMmX
Buy a Samsung S95B 55" OLED TV: https://geni.us/tXvM
Buy an LG G2 55" OLED TV: https://geni.us/9jARIVB

Purchases made through some store links may provide some compensation to Linus Media Group.

 

 


Firefox is not playing nice with this video...

The FPS is stuttering.

I know some HDR video doesn't play nice with browsers.

MSI X399 SLI Plus | AMD Threadripper 2990WX, all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128 GB 3000 MHz | Corsair RM1200i | 150 TB | Asus TUF Gaming mid tower | 10 Gb NIC


I'm curious about how LTT tests monitors/TVs.
Before running any tests, are any changes made to the OSD settings?
Do they demonstrate the examples on PCs with color calibration performed first, or is it just raw, as received from the manufacturer?
I know software like DisplayCAL can be used with hardware like Datacolor's Spyder to ensure colors are accurate, so I was curious whether this kind of software is run first.

 


2 hours ago, Lascif said:

I'm curious about how LTT tests monitors/TVs.
Before running any tests, are any changes made to the OSD settings?
Do they demonstrate the examples on PCs with color calibration performed first, or is it just raw, as received from the manufacturer?
I know software like DisplayCAL can be used with hardware like Datacolor's Spyder to ensure colors are accurate, so I was curious whether this kind of software is run first.

 

Hey Lascif, it's Brandon, the display tester for LMG Labs.

The testing conditions for any display we test really depend on what information we're trying to extract from it. We'll almost always do "out of the box" testing, because that's the configuration most people are likely to use. We'll also test the picture modes geared towards color accuracy, to see how those stack up. Rarely will I do "post calibration" measurements where I've changed specific settings in the OSD, like RGB gain or color temperature. This is because settings that work for my unit won't necessarily work for other units, even if they're the exact same model, due to panel variance.

The software we use for testing is CalMAN Ultimate, which is a pretty powerful and streamlined testing suite. If possible, we use a Murideo Seven 8K as our signal generator, or CalMAN Client 3 if the display doesn't have any inputs (like with a laptop). For taking measurements, we use a CR-100 colorimeter profiled to a CR-300-RH 50mm spectroradiometer on a per-display basis. Generally, I do distance measurements and aim for a spot size that fits within a 1% window. To ensure we are centred on and perpendicular to the display, we use a cool laser system that we designed in-house that I'd like to show off at some point. I also make sure to warm up the displays for at least 30 minutes before taking any measurements.
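To put rough numbers on that "1% window" constraint, here's a quick back-of-napkin sketch. The ~2° field of view is a placeholder assumption, not the CR-100's actual measurement aperture, and the distances are made up for illustration:

```python
import math

def window_side_mm(diagonal_in: float, window_pct: float = 1.0, aspect=(16, 9)) -> float:
    """Side length of a square test window covering window_pct% of the screen."""
    w, h = aspect
    diag = math.hypot(w, h)
    width_mm = diagonal_in * 25.4 * w / diag
    height_mm = diagonal_in * 25.4 * h / diag
    return math.sqrt(width_mm * height_mm * window_pct / 100.0)

def spot_diameter_mm(distance_mm: float, fov_deg: float = 2.0) -> float:
    """Diameter of the measured spot at a given distance, for a conical field of view."""
    return 2.0 * distance_mm * math.tan(math.radians(fov_deg / 2.0))

side = window_side_mm(55)             # ~91 mm for a 1% window on a 55" panel
for d_mm in (500, 1000, 2000, 4000):  # candidate measurement distances
    spot = spot_diameter_mm(d_mm)
    verdict = "fits" if spot < side else "too big"
    print(f"{d_mm} mm away: spot ~{spot:.0f} mm ({verdict} in a {side:.0f} mm window)")
```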

For the two TVs in this video, we took measurements of them in their default "Filmmaker" mode with an HDR10 signal. This is how we used them in the video too, because it's likely how most consumers will use them. Well, at least for those who care enough about accuracy to switch to Filmmaker mode.

We specifically measured the CalMAN HDR ColorMatch patches, as that uses colors that can be found in real HDR content. The HDR10 metadata used for the measurements were the CalMAN defaults:

  • Max/Min MDL = 1000/0.005 nits
  • BT.2020 primaries
  • MaxFALL = 400 nits
  • MaxCLL = 1000 nits
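For anyone curious how those defaults translate into the HDR10 static metadata an encoder actually carries, here's a minimal sketch that builds the SMPTE ST 2086 mastering-display string in the format x265 accepts. The BT.2020 chromaticity coordinates and unit scaling (0.00002 for chromaticity, 0.0001 cd/m² for luminance) come from the specs; this is just an illustration, not CalMAN's internal representation:

```python
# CIE 1931 xy chromaticity coordinates for BT.2020 primaries and D65 white.
BT2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797),
          "B": (0.131, 0.046), "WP": (0.3127, 0.3290)}

def st2086_string(p, max_mdl_nits: float, min_mdl_nits: float) -> str:
    c = lambda v: round(v / 0.00002)  # chromaticity in units of 0.00002
    l = lambda v: round(v / 0.0001)   # luminance in units of 0.0001 cd/m^2
    return (f"G({c(p['G'][0])},{c(p['G'][1])})B({c(p['B'][0])},{c(p['B'][1])})"
            f"R({c(p['R'][0])},{c(p['R'][1])})WP({c(p['WP'][0])},{c(p['WP'][1])})"
            f"L({l(max_mdl_nits)},{l(min_mdl_nits)})")

# The CalMAN defaults listed above:
master_display = st2086_string(BT2020, max_mdl_nits=1000, min_mdl_nits=0.005)
print(master_display)      # G(8500,39850)B(6550,2300)R(35400,14600)WP(15635,16450)L(10000000,50)
print("max-cll=1000,400")  # MaxCLL,MaxFALL in nits, as x265 expects
```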

Hope this helps clear things up! Sorry if I get a little technical here; I'm just always happy to talk shop, haha.
 


IMO the LG's color and brightness win maybe 60% of the time. On that $20k TV, the brightness looks about 30% too high. I'm using a 43" LG UM7300 from 2019 as my computer monitor, and it's damn good after adjusting the brightness and colors.


Didn't know this sequel got the 2018 Linus.

 

But thanks for talking about the issues with other TVs; it's so bad with some media that it's hard to watch anything, even when they claim it's "accurate".

It doesn't help that streaming services can look worse on TVs than they do on phones. It sucks so bad and looks horrible.


21 hours ago, dogwitch said:

Firefox is not playing nice with this video...

The FPS is stuttering.

I know some HDR video doesn't play nice with browsers.

You gotta watch it on the $20k TV.


By reading this, you're entering a contract that says you have to visit my profile.

 

 


Why are you bullying people by calling 55 inches tiny?
Did something change? I always felt 55" was average, with 43" or 32" being considered "tiny".
Usually the 55-65" class is the most common TV size in people's homes, because you can cut four of them from one sheet of mother glass.
Maybe it's different for OLED? But I think they use the same manufacturing techniques.


15 hours ago, Brandon D said:

Hey Lascif, it's Brandon, the display tester for LMG Labs.

The testing conditions for any display we test really depend on what information we're trying to extract from it. We'll almost always do "out of the box" testing, because that's the configuration most people are likely to use. We'll also test the picture modes geared towards color accuracy, to see how those stack up. Rarely will I do "post calibration" measurements where I've changed specific settings in the OSD, like RGB gain or color temperature. This is because settings that work for my unit won't necessarily work for other units, even if they're the exact same model, due to panel variance.

The software we use for testing is CalMAN Ultimate, which is a pretty powerful and streamlined testing suite. If possible, we use a Murideo Seven 8K as our signal generator, or CalMAN Client 3 if the display doesn't have any inputs (like with a laptop). For taking measurements, we use a CR-100 colorimeter profiled to a CR-300-RH 50mm spectroradiometer on a per-display basis. Generally, I do distance measurements and aim for a spot size that fits within a 1% window. To ensure we are centred on and perpendicular to the display, we use a cool laser system that we designed in-house that I'd like to show off at some point. I also make sure to warm up the displays for at least 30 minutes before taking any measurements.

For the two TVs in this video, we took measurements of them in their default "Filmmaker" mode with an HDR10 signal. This is how we used them in the video too, because it's likely how most consumers will use them. Well, at least for those who care enough about accuracy to switch to Filmmaker mode.

We specifically measured the CalMAN HDR ColorMatch patches, as that uses colors that can be found in real HDR content. The HDR10 metadata used for the measurements were the CalMAN defaults:

  • Max/Min MDL = 1000/0.005 nits
  • BT.2020 primaries
  • MaxFALL = 400 nits
  • MaxCLL = 1000 nits

Hope this helps clear things up! Sorry if I get a little technical here; I'm just always happy to talk shop, haha.
 


Hi Brandon,

Just wanted to say thanks for letting me know the details of the testing! I've always been curious about how the testing was conducted, and you've definitely cleared that up. Out-of-the-box testing certainly makes the most sense, especially for 99% of consumers.

Additionally, I understand that the colorimeter results will satisfy most professionals, as they give people a good idea of the available color space. If all monitors were post-calibrated, do you think that would close the gap (or make them visually similar) in terms of color, from a layman's perspective?

I calibrate my monitors and my friends' monitors using a Datacolor SpyderX and DisplayCAL (I know it's outdated, but as far as I know, it's still the best free display calibration software out there), and I always wonder how watching shows and movies would compare on an uncalibrated monitor of the same model.
I may have to test this out by setting one of my monitors to factory defaults and the other calibrated, to see just how much of a difference there is 🤔
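(If I do run that test, one way to put a number on the difference is average ΔE2000 against reference patch values. A minimal sketch, assuming the colour-science Python package and that you can export measured vs. reference CIE XYZ values, e.g. from a DisplayCAL measurement report — the patch data below is made up purely for illustration:)

```python
import numpy as np
import colour  # pip install colour-science

D65 = colour.CCS_ILLUMINANTS["CIE 1931 2 Degree Standard Observer"]["D65"]

# Hypothetical reference and measured XYZ triplets (0-1 scale), one row per patch.
reference_XYZ = np.array([[0.4124, 0.2126, 0.0193],   # pure red in sRGB/Rec.709
                          [0.3576, 0.7152, 0.1192],   # pure green
                          [0.1805, 0.0722, 0.9505]])  # pure blue
measured_XYZ  = np.array([[0.4230, 0.2190, 0.0201],
                          [0.3490, 0.7010, 0.1250],
                          [0.1770, 0.0705, 0.9620]])

ref_Lab = colour.XYZ_to_Lab(reference_XYZ, D65)
mea_Lab = colour.XYZ_to_Lab(measured_XYZ, D65)

dE = colour.delta_E(ref_Lab, mea_Lab, method="CIE 2000")
print(f"per-patch dE2000: {np.round(dE, 2)}, average: {dE.mean():.2f}")
# Rule of thumb: an average dE2000 below ~2-3 is generally considered visually accurate.
```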

Also, regarding panel variance, is that something that has been tested at LTT, or is it presumed that monitors and TVs are close in terms of color space percentage and accuracy? Or is that something for LTT Labs to verify in the future?


40 minutes ago, OhYou_ said:

Why are you bullying people by calling 55 inches tiny?
Did something change? I always felt 55" was average, with 43" or 32" being considered "tiny".
Usually the 55-65" class is the most common TV size in people's homes, because you can cut four of them from one sheet of mother glass.
Maybe it's different for OLED? But I think they use the same manufacturing techniques.

my gf tells me that 55 is just right

 

Seriously though, as size has become less of an issue, with dollars per inch going down (up to a point), consumers are able to afford bigger screens. From what little I know, the optimal screen size for a typical living room viewing distance is usually bigger than you think.
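As a quick worked example, here's the arithmetic under the common ~40° horizontal viewing-angle guideline (roughly the THX recommendation; treat the exact angle as an assumption) for a 16:9 screen:

```python
import math

def recommended_diagonal_in(distance_m: float, view_angle_deg: float = 40.0) -> float:
    """Screen diagonal (inches) that fills the given horizontal viewing angle."""
    width_m = 2.0 * distance_m * math.tan(math.radians(view_angle_deg / 2.0))
    diagonal_m = width_m * math.hypot(16, 9) / 16  # 16:9 width -> diagonal
    return diagonal_m / 0.0254

for d in (1.5, 2.0, 2.5, 3.0):
    print(f"{d} m viewing distance -> ~{recommended_diagonal_in(d):.0f}\" diagonal")
# A typical 2.5 m (~8 ft) couch distance comes out to roughly an 82" diagonal.
```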


1 hour ago, Lascif said:


Hi Brandon,

Just wanted to say thanks for letting me know the details of the testing! I've always been curious about how the testing was conducted, and you've definitely cleared that up. Out-of-the-box testing certainly makes the most sense, especially for 99% of consumers.

Additionally, I understand that the colorimeter results will satisfy most professionals, as they give people a good idea of the available color space. If all monitors were post-calibrated, do you think that would close the gap (or make them visually similar) in terms of color, from a layman's perspective?

I calibrate my monitors and my friends' monitors using a Datacolor SpyderX and DisplayCAL (I know it's outdated, but as far as I know, it's still the best free display calibration software out there), and I always wonder how watching shows and movies would compare on an uncalibrated monitor of the same model.
I may have to test this out by setting one of my monitors to factory defaults and the other calibrated, to see just how much of a difference there is 🤔

Also, regarding panel variance, is that something that has been tested at LTT, or is it presumed that monitors and TVs are close in terms of color space percentage and accuracy? Or is that something for LTT Labs to verify in the future?

Hey Lascif,

Always glad to help! 🙂

As far as calibrating a display to close the gap, it depends on a few factors. Firstly, many consumer products don't give you the same fine control over color that a mastering display does. For example, I've calibrated my S95B at home using expensive equipment, and even then I couldn't get it to be as good as the Flanders Scientific model because the color controls on the S95B are just too limited. In a PC environment, you can work around some of these limitations through software-based calibration, but even that has plenty of limitations of its own. For the best calibration results, you'd need a display that supports internal hardware calibration via 3D LUTs, which is exactly what the Flanders Scientific XMP550 does. This is actually where the LG OLED TVs stand out: they support internal hardware calibration via 3D LUTs, if you have the right hardware and software to do it.
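To make the 3D LUT idea a bit more concrete, here's a minimal trilinear-interpolation sketch with a toy identity LUT. It's only meant to illustrate the concept, not how any particular display or calibration package implements it:

```python
import numpy as np

def apply_3d_lut(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """rgb: (N, 3) values in [0, 1]; lut: (M, M, M, 3) lattice of output colors."""
    m = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (m - 1)  # continuous lattice coordinates
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, m - 1)
    f = pos - lo                             # fractional distance to the next node
    out = np.zeros_like(rgb, dtype=float)
    for corner in range(8):                  # blend the 8 surrounding lattice nodes
        pick = [(corner >> ch) & 1 for ch in range(3)]
        idx = [hi[:, ch] if pick[ch] else lo[:, ch] for ch in range(3)]
        w = np.ones(rgb.shape[0])
        for ch in range(3):
            w *= f[:, ch] if pick[ch] else (1.0 - f[:, ch])
        out += w[:, None] * lut[idx[0], idx[1], idx[2]]
    return out

# Toy 17-node identity LUT; a real calibration LUT would hold corrected colors.
g = np.linspace(0.0, 1.0, 17)
identity_lut = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
print(apply_3d_lut(np.array([[0.25, 0.50, 0.75]]), identity_lut))  # ~[[0.25 0.5 0.75]]
```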

All that said, usually any calibration is better than none, so you can definitely close the gap. The degree to which the gap is closed is where all the variables come into play. Just changing the color temperature, gamma, and saturation of colors makes a huge difference to the accuracy, and most displays give you access to these controls.

DisplayCAL is pretty great software! It was the first software I used back in 2017 when I bought my first colorimeter, an X-Rite ColorMunki. It's a shame it's no longer updated, but it still does a great job at calibration if you know what you're doing. Having a spectroradiometer on hand will always improve calibrations, but that hardware is really expensive. Luckily, DisplayCAL has that database of corrections that other users have generated for different display technologies, so I always recommend using one of those alongside your SpyderX.

As far as panel variance goes, it's not something I've empirically tested, but it's something I have observed in my years of testing and it's a fairly accepted truth in the industry. Displays that are the same make and model will be pretty similar in terms of color space and accuracy, but not perfectly alike, due to manufacturing tolerances.


2 weeks later...
On 2/22/2024 at 6:51 PM, Brandon D said:

Hey Lascif, it's Brandon, the display tester for LMG Labs.

The testing conditions for any display we test really depend on what information we're trying to extract from it. We'll almost always do "out of the box" testing, because that's the configuration most people are likely to use. We'll also test the picture modes geared towards color accuracy, to see how those stack up. Rarely will I do "post calibration" measurements where I've changed specific settings in the OSD, like RGB gain or color temperature. This is because settings that work for my unit won't necessarily work for other units, even if they're the exact same model, due to panel variance.

The software we use for testing is CalMAN Ultimate, which is a pretty powerful and streamlined testing suite. If possible, we use a Murideo Seven 8K as our signal generator, or CalMAN Client 3 if the display doesn't have any inputs (like with a laptop). For taking measurements, we use a CR-100 colorimeter profiled to a CR-300-RH 50mm spectroradiometer on a per-display basis. Generally, I do distance measurements and aim for a spot size that fits within a 1% window. To ensure we are centred on and perpendicular to the display, we use a cool laser system that we designed in-house that I'd like to show off at some point. I also make sure to warm up the displays for at least 30 minutes before taking any measurements.

For the two TVs in this video, we took measurements of them in their default "Filmmaker" mode with an HDR10 signal. This is how we used them in the video too, because it's likely how most consumers will use them. Well, at least for those who care enough about accuracy to switch to Filmmaker mode.

We specifically measured the CalMAN HDR ColorMatch patches, as that uses colors that can be found in real HDR content. The HDR10 metadata used for the measurements were the CalMAN defaults:

  • Max/Min MDL = 1000/0.005 nits
  • BT.2020 primaries
  • MaxFALL = 400 nits
  • MaxCLL = 1000 nits

Hope this helps clear things up! Sorry if I get a little technical here; I'm just always happy to talk shop, haha.
 

I noticed you were using Plex as the player for the movies. I'm assuming you have a server with lossless rips of "standard" movies across a few different genres, though that's not really my question.

 

Have you found that Plex is or isn't good at representing the source video accurately? If it does alter the original look (assuming a perfect TV), do you think the Plex software or the hardware/software of the endpoint is more responsible?

I suppose even if it does change things, this video was more about comparisons, so it wouldn't matter too much.

Are there any particular settings you use for media playback in Plex, or do you run it more or less stock?

 

I use my own Plex server, and I'm curious to see how someone who knows what they're doing uses theirs.


On 3/2/2024 at 12:17 PM, Space Potato said:

I noticed you were using Plex as the player for the movies. I'm assuming you have a server with lossless rips of "standard" movies across a few different genres, though that's not really my question.

 

Have you found that Plex is or isn't good at representing the source video accurately? If it does alter the original look (assuming a perfect TV), do you think the Plex software or the hardware/software of the endpoint is more responsible?

I suppose even if it does change things, this video was more about comparisons, so it wouldn't matter too much.

Are there any particular settings you use for media playback in Plex, or do you run it more or less stock?

 

I use my own Plex server, and I'm curious to see how someone who knows what they're doing uses theirs.

Hey @Space Potato

Funnily enough, I'm a complete Plex noob. I had never used it before that video shoot, mostly because I'm also not much of a movie guy, haha. Although lately I have been building a small library of 4K HDR Blu-rays while physical media still exists.

For the video shoot, we used Jake's Plex server. I noticed the movies on his server had some visual artifacts compared to my 4K Blu-rays, but I'm not sure why that is. It's actually something Jake and I have been meaning to look into; we just haven't found the time yet. I'll try to remember to bring it up next week.

Cheers,


On 3/8/2024 at 8:35 PM, Brandon D said:

Hey @Space Potato

Funnily enough, I'm a complete Plex noob. I had never used it before that video shoot, mostly because I'm also not much of a movie guy, haha. Although lately I have been building a small library of 4K HDR Blu-rays while physical media still exists.

For the video shoot, we used Jake's Plex server. I noticed the movies on his server had some visual artifacts compared to my 4K Blu-rays, but I'm not sure why that is. It's actually something Jake and I have been meaning to look into; we just haven't found the time yet. I'll try to remember to bring it up next week.

Cheers,

If you do get the opportunity to look into it, and there's enough there to turn it into a forum post or even a video, you can count me as an interested party.

As far as using Plex to stream video, I'm a noob myself.

I got into Plex mostly for bit-perfect audio streaming of my CDs, with digitizing my 4K Blu-rays being mostly an afterthought. However, now that I've seen how much better my self-streamed 4K content looks than the online streaming services on my low-to-mid-range 4K TV, I'm curious whether a "bit-perfect" video stream is possible and what that might look like. (Maybe it's the placebo effect; I haven't really looked into it.)
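For what it's worth, one rough way to test the "bit-perfect" question when a client is Direct Playing (no transcode) is to hash the video stream packets with ffmpeg, once for the source file and once for a capture of what the client actually received. A sketch assuming ffmpeg is on your PATH; the file paths are hypothetical:

```python
import subprocess

def video_stream_md5(path: str) -> str:
    """MD5 of the first video stream's packets, with the container stripped away."""
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path,
         "-map", "0:v:0", "-c", "copy",   # copy packets, no re-encode
         "-f", "md5", "-"],               # hash what would be written out
        capture_output=True, text=True, check=True)
    return result.stdout.strip()          # e.g. "MD5=9a0364b9e99bb480dd25e1f0284c8555"

source = video_stream_md5("/media/movies/example-source.mkv")
played = video_stream_md5("/tmp/example-as-served.mkv")
print("bit-identical video stream" if source == played else "streams differ (transcoded?)")
```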

I'm also kind of curious how content is mastered differently across the different services. I suspect it's not all the same, given random little things like Disney+ being generally quieter than Netflix, Prime Video, or YouTube Movies.

 

Anyway, appreciate the great content!

