
Misinformation in recent ShortCircuit video regarding the Odyssey G9

https://www.youtube.com/watch?v=burNkdZG0Bs

 

There's a disappointing amount of misinformation (and some potential misinformation, in the form of claims I have yet to see verified anywhere else) regarding this display that was unfortunately stated as fact in this video.

 

I have not seen a single piece of documentation from Samsung (or any other reputable source) actually indicating that this display has DisplayHDR 1000 certification, nor Full Array Local Dimming. The only source of this "DisplayHDR 1000" claim, to date, is secondhand sources assuming that Samsung's own "HDR1000" spec carries any weight and deciding for themselves that it must mean the same thing as DisplayHDR 1000 (it doesn't). Furthermore, the only references to local dimming I could find for this display are reports that it has the same local dimming zones as Samsung's previous 49" display, the CRG90, which only had 8 or so zones around the edges, nowhere near the amount you'd need for FALD. I'm especially disappointed that the video even goes so far as to mention that the proper way to test local dimming is with a cursor on a black background, but then completely fails to actually do so in the video to prove that it does have FALD.

 

There's some information about this on this article (https://www.flatpanelshd.com/news.php?subaction=showfull&id=1578550696) as well as this comment on that article (http://disq.us/p/26l364j)

 

Overall, it just feels like you jumped the gun on this one without doing much research to actually verify the wild claims that everyone else has been echoing with next to no citations from a reputable source.

TL;DR
As far as I'm aware, Samsung has never once claimed this display is DisplayHDR 1000 certified, nor that it has FALD. I have seen no reputable verification of either being true; if anyone knows otherwise, please provide a source.


So a typical LTT video.

 

Take LTT more as an entertainer; if you are looking for real reviews, you can quickly see the misinformation and mistakes stack up.

 

That is why I only watch LTT for entertainment. If you know what you are getting yourself into, there's nothing wrong with it.


I noticed this as well. It's local dimming, but it's not FALD, it's edge-lit. At least from what I could find on it.

 

I giggled when he said 'no compromises'. I was like "yeah, OK, I'll wait till you list the specs; if they tell the truth I'll be shocked."

Specs are listed, then I laughed. 1 ms response time... on a VA panel... ballsh*t :P

 

 

Still, I'm not blaming LTT on this, tbh. It's a sponsored vid, and while it would be nice for them to call out Samsung on this, I understand why they won't. At least right now.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |


Also worth noting: you need 10-bit to be DisplayHDR 1000 certified.

2 hours ago, moatmote said:

The only source of this "DisplayHDR 1000" claim, to date, is from secondhand sources assuming that Samsung's own "HDR1000" spec carries any weight and deciding for themselves that it must mean the same thing as DisplayHDR 1000 (it doesn't).

Both the predecessor CRG9 and the Odyssey G9 are DisplayHDR 1000 certified. Takes like two seconds to look up:

https://displayhdr.org/certified-products/

Quote

DisplayHDR 1000

Samsung C49RG90

Samsung LC49G95TSSNXZA
Family Models: LC49G95TSSAXXA, LC49G95TSSCXXF, LC49G95TSSCXXK, LC49G95TSSCXZW, LC49G95TSSEXXD, LC49G95TSSEXXM, LC49G95TSSEXXP, LC49G95TSSEXXS, LC49G95TSSEXXT, LC49G95TSSEXXV, LC49G95TSSEXXY, LC49G95TSSIXCI, LC49G95TSSKXKR, LC49G95TSSLXZB, LC49G95TSSLXZD, LC49G95TSSLXZL, LC49G95TSSLXZP, LC49G95TSSLXZS, LC49G95TSSLXZX, LC49G95TSSMXCH, LC49G95TSSMXUE, LC49G95TSSMXUF, LC49G95TSSMXZN, LC49G95TSSUXEN, LC49G95TSSWXXL, LC49G93TSSUXEN

 

 

2 hours ago, moatmote said:

Furthermore, the only references to local dimming I could find on this display are reports that it has the same local dimming zones as Samsung's previous 49" display, the CRG90, which only had 8 or so zones around the edges, nowhere near the amount you'd need for FALD.

Just like the predecessor the Odyssey G9 has 10 dimming zones.

 

One glaring mistake I found in the video is Linus claiming the Odyssey G9 only supports 240 Hz when using 8-bit. We've already had multiple owners confirm that it can run the full 5120 x 1440 @ 240 Hz HDR 10-bit & 4:4:4 Chroma and even Samsung themselves have confirmed it.

 

 

 

18 minutes ago, GamingWiidesire said:

One glaring mistake I found in the video is Linus claiming the Odyssey G9 only supports 240 Hz when using 8-bit. We've already had multiple owners confirm that it can run the full 5120 x 1440 @ 240 Hz HDR 10-bit & 4:4:4 Chroma and even Samsung themselves have confirmed it.

 

But now I'm confused. The post you linked clearly says: "with YCbCr444 it says dynamic output range: limited". So the information in that post indicates that HDR does not work with 4:4:4 10-bit @ 240 Hz.


Read the owner's comment again carefully. Both RGB and YCbCr444 can be selected @ 240 Hz 10-bit. If you select YCbCr444 -> limited output dynamic range and RGB -> full output dynamic range. Also I've provided an official source from Samsung themselves. Hope that helps.

 

 

 


Yes, I read what the German Samsung team wrote, but I had no idea that DSC was allowed under the DisplayHDR certification spec. DSC hurts your image quality, sometimes quite badly.

 

Quite frankly, if DSC is allowed in DisplayHDR certification, it's a quite useless certification. However, I would think that 120 Hz would be quite enough to consume HDR content.

16 minutes ago, Glenwing said:

Source?

 

There are a few reviewers out there that have complained about it, but mostly it's from my own experience.

 

However, you can look at VESA's own documentation, where they call it "visually lossless". To comply with the standard, they verify the "visually lossless" claim by subjective testing at 8-bit, but DSC supports up to 12-bit. The thing is, if you maximise colour depth, refresh rate and resolution, you can get as much as 66% data compression, and that is definitely hurting the image quality.


Here is another 5120 x 1440 @ 240 Hz HDR 10-bit & 4:4:4 Chroma confirmation:

[attached screenshot of Samsung's confirmation]

@LinusTech

7 hours ago, Glenwing said:

You have experience using DSC?

Kroon is probably confusing Chroma Subsampling with DSC. It doesn't matter to the human eye whether DSC is just visually lossless or mathematically lossless. Every test of DSC I have seen so far says there is no perceptible difference, so the marketing term "visually lossless" appears to be true.

 

Test 1 TFTCentral:

Quote

We were pleased that there was no visual loss to our eyes and in our range of tests which was excellent. You can certainly see chroma sub-sampling when you use that old method especially in desktop applications, but that was not necessary now that DSC was being used. We saw no additional lag either when using this and no noticeable side-effects. This seemed to work very nicely to allow you to squeeze more out of the bandwidth of DisplayPort 1.4.

Test 2 Guru3D:

Quote

In its core essence, DSC would never be better than the quality of uncompressed display streams, similar to a JPG image or H.264 video stream it never will be as good as a RAW stream, but it is a solution, perhaps even tradeoffs that will make compression worth the challenge. And admittedly, we have not been able to tell a difference, [...]

 

 

 

 

9 hours ago, Glenwing said:

You have experience using DSC?

Yes, via work. Unfortunately, I can't say more about that at the moment.

 

2 hours ago, GamingWiidesire said:

Kroon is probably confusing Chroma Subsampling with DSC

Que? No I'm not! 

 

DSC is really good engineering and a truly good way to increase the effective bandwidth. But even VESA themselves acknowledge that it's not perfect. Up to a compression of 2:1 it's REALLY good and you will have a hard time noticing any difference even with monitors side by side; this is also something you can read about in the DSC documentation. The DSC specification goes up to a 3:1 compression ratio, but at that point you will actually notice a difference, again something VESA acknowledges.

 

This hasn't been a problem before due to panel limitations. Now we see higher and higher resolutions with crazy high refresh rates, and 10- or even 12-bit panels, with HDR on top. When you have 7.4 million pixels at 10 bpc and HDR at 240 fps, you will actually go over a 3:1 compression ratio. The good news is that it will probably take a few years before we have computers that can actually push the monitor to its max.


I was curious about the bit depth and throughput in this video as well. I'd love to see a longer video from LTT on this monitor, as really DP 2.0 is where it's at, so you don't need to use DSC. I'm surprised that native res, 240 Hz and HDR still come through, with DSC apparently being lossless. I honestly thought the throughput wasn't available at that point.

 

As to the DisplayHDR compliance, not only was the previous CRG9 DisplayHDR 1000 certified, but the G9 apparently also carries HDR10+ certification as well: https://www.samsung.com/au/monitors/odyssey-g9-c49g95t/LC49G95TSSEXXY/

 

 

Keep in mind this was a sponsored video, so the talking points aren't all LTT's, and the information they gave wasn't wrong, but they don't go into as much detail in these videos. It is definitely worth a follow-up for some clarification, though, and Linus did say at the end there was going to be an LTT video on it, so hopefully it's not far away.

 

53 minutes ago, Kroon said:

The DSC specification goes up to a 3:1 compression ratio, but at that point you will actually notice a difference, again something VESA acknowledges.

Source?

55 minutes ago, Kroon said:

When you have 7.4 million pixels at 10 bpc and HDR at 240 fps, you will actually go over a 3:1 compression ratio.

DSC 2.5x compression is enough to run 5120 x 1440 @ 240 Hz HDR 10-bit & 4:4:4 Chroma (24.25 Gbit/s out of 25.92 Gbit/s).

 

 

 

1 hour ago, GamingWiidesire said:

DSC 2.5x compression is enough to run 5120 x 1440 @ 240 Hz HDR 10-bit & 4:4:4 Chroma (24.25 Gbit/s out of 25.92 Gbit/s).

I promised myself not to reply any more in this thread since it's WAY off topic, but I have to:

 

Calculating bandwidth is pretty easy:

(H + Hblank)*(V + Vblank) * C * F

 

Where H is the number of horizontal pixels, V the number of vertical pixels, C the colour depth, and F your refresh rate in Hz (or FPS, if that is lower).

 

So in this case:

H = 5120

V = 1440

C = 30 (10 bits per R, G and B channel)

F = 240

I can't find information about Hblank and Vblank, so this calculation will not be exact; the true value will be higher.

5120 * 1440 * 30 * 240 = 53,084,160,000 bit/s, or 53.08 Gbit/s

 

4:4:4 chroma means no chroma subsampling.

 

If you want to do the calculations yourself, you can check the LTT Data Rate Calculator (the result there is 60.62 Gbit/s):

 

 

Edit:

Note that this bandwidth is without HDR!
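The formula above can be sketched in a few lines. This is only a sketch: the blanking intervals are left as parameters, since the real CVT timings for this panel aren't known here, and with zero blanking it reproduces the 53.08 Gbit/s active-pixel figure rather than the calculator's 60.62 Gbit/s.

```python
# Uncompressed video data rate, per the formula quoted above:
#   (H + Hblank) * (V + Vblank) * C * F
# Blanking defaults to zero (active pixels only); real timings add more.

def data_rate_gbps(h, v, c_bits, hz, hblank=0, vblank=0):
    """Uncompressed video data rate in Gbit/s."""
    return (h + hblank) * (v + vblank) * c_bits * hz / 1e9

# 5120 x 1440, 10-bit RGB (30 bits/pixel), 240 Hz, no blanking:
print(data_rate_gbps(5120, 1440, 30, 240))  # 53.08416
```

Adding any plausible blanking pushes the total higher, towards the ~60.62 Gbit/s the LTT calculator reports.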

8 minutes ago, Kroon said:

If you want to do the calculations your self you can check LTT Data Rate calulator (Results there is 60.62):

60.62 / 2.5 = 24.25. Literally what I just said.

Why are you writing a wall of text trying to disprove what I just said, even though you got the exact same result?

As I said earlier, 5120 x 1440 @ 240 Hz HDR 10-bit & RGB/YCbCr444 Chroma with DSC 2.5x takes 24.25 Gbit/s out of the available 25.92 Gbit/s bandwidth.
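For what it's worth, the arithmetic can be checked in a few lines. This is a sanity check only: the 60.62 Gbit/s calculator figure and the 25.92 Gbit/s DP 1.4 HBR3 payload bandwidth are both taken from this thread as given, not recomputed.

```python
# Sanity check on the DSC figures discussed above.
# 60.62 Gbit/s: LTT calculator result for 5120 x 1440 @ 240 Hz 10-bit
#               RGB (with blanking), as quoted in this thread.
# 25.92 Gbit/s: DisplayPort 1.4 HBR3 payload bandwidth
#               (32.4 Gbit/s raw, minus 8b/10b encoding overhead).

uncompressed_gbps = 60.62
hbr3_payload_gbps = 25.92

with_dsc_2_5x = uncompressed_gbps / 2.5            # DSC at 2.5:1
required_ratio = uncompressed_gbps / hbr3_payload_gbps

print(round(with_dsc_2_5x, 2))    # 24.25 -> fits inside 25.92
print(round(required_ratio, 2))   # 2.34  -> under DSC's 3:1 limit
```

So roughly 2.34:1 compression is all the link actually needs, which is why a 2.5:1 DSC setting fits the signal within HBR3.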

 

2 hours ago, Kroon said:

 When you have 7.4 million pixels at 10bpp and HDR with 240fps you will actually go over 1:3 compression ratio.

No.

 

 

 

11 minutes ago, GamingWiidesire said:

Why are you writing a wall of text trying to disprove what I just said even though you got the exact same result.

 

Sorry, for some reason I got it into my mind that you meant 24.25 uncompressed. But you are still missing the HDR channels in your calculation.

 

With that said, I will be leaving the thread since it's so far off topic now.


Why didn’t Samsung just use DP 2.0?

 

or HDMI 2.1 ?

 

No compression needed when the new GPUs arrive in a couple of months?


Really, DSC or not, it's still LCD, FALD or not; it's not true HDR, though.

 

Can only wait for flat 4K 240Hz HDR MicroLED to get the actual ultimate display. 

Ryzen 7 3800X | X570 Aorus Elite | G.Skill 16GB 3200MHz C16 | Radeon RX 5700 XT | Samsung 850 PRO 256GB | Mouse: Zowie S1 | OS: Windows 10


SMH when people buy these 10-zone FALD HDR displays over the PG27UQ / Predator X27 with a 384-zone FALD and hardware G-SYNC module that never tears frames. The G-SYNC Compatible displays including the LG OLEDs randomly start tearing frames for a few seconds every minute even within the VRR range.

1 minute ago, Monstieur said:

SMH when people buy these 10-zone FALD HDR displays over the PG27UQ / Predator X27 with a 384-zone FALD and hardware G-SYNC module that never tears frames. The G-SYNC Compatible displays including the LG OLEDs randomly start tearing frames for a few seconds every minute even within the VRR range.

Never seen this on my CX.

CPU: Ryzen 7 3700x || GPU: RTX 3080 Founders Edition || Memory: 48GB Corsair 3000mhz DDR4 (22GB PrimoCache) || Motherboard: MSI B450 Tomahawk || SSD1: 500 GB Samsung 850 EVO M.2 (OS drive) || SSD2: 500 GB Samsung 860 EVO SATA (Cache Drive via PrimoCache) || Spinning Disks: 3 x 4TB Western Digital Blue HDD (RAID 0) || Monitor: LG CX 55" OLED TV || Sound: Schiit Stack (Modi 2/Magni 3) - Sennheiser HD 598, HiFiMan HE 400i || Keyboard: Logitech G810 || Mouse: Logitech G502 || PSU: EVGA 750-watt Gold
 

2 minutes ago, MadPistol said:

Never seen this on my CX.

Seeing this on my CX right now in the G-SYNC Pendulum demo test pattern. It happens extensively at 4K 60 Hz, but rarely at 1440p 120 Hz.

21 minutes ago, Doobeedoo said:

Can only wait for flat 4K 240Hz HDR MicroLED to get the actual ultimate display. 


MicroLED can easily push beyond 240Hz. 
 

27”

1440p/2160p

XDR 12-bit

HDR10+ / Dolby Vision 10-bit

480Hz 

 

Gimmie!

11 minutes ago, Monstieur said:

SMH when people buy these 10-zone FALD HDR displays over the PG27UQ / Predator X27 with a 384-zone FALD and hardware G-SYNC module that never tears frames. The G-SYNC Compatible displays including the LG OLEDs randomly start tearing frames for a few seconds every minute even within the VRR range.


PG27UQX is the real GOAT with 576 Zones!

 

But OLED / MicroLED is king. Forget about zones, it's dimming on a per-pixel basis!

