Get Dolby Vision instead of HDR10 on Windows 10?

FinnishArmy
On 12/18/2022 at 1:36 AM, Sawtaytoes said:

I don't know what you mean about switching to the Dolby Vision option. Displays don't choose to be in SDR, HDR, or Dolby Vision; the content provider (Windows, a Blu-ray player, a Roku, etc.) sends a signal telling the display which mode to be in.

 

And Windows has no way to tell the display to be in Dolby Vision mode unless you're watching or playing Dolby Vision content.

If you want to play games in Dolby Vision, you need the game or Windows to tell the display to switch to Dolby Vision mode. There's no setting to do this. Enabling Dolby Vision IQ on my Surface Pro 8 simply means I've enabled the capability of playing back Dolby Vision content.

 

Windows tells apps "this display is Dolby Vision capable", so they'll know to enable Dolby Vision mode when HDR is enabled.

 

We know that both the Surface Pro 8's built-in screen and the LG C1 OLED support the exact same Dolby Vision IQ feature, but if I try to load Dolby Vision content on my LG C1 OLED from my Surface Pro 8, it does not output Dolby Vision.

 

---

 

The Surface Pro 8 uses Dolby Vision IQ. I have pictures of the Dolby Access app's settings in the thread.

 

Dolby Vision IQ also uses a sensor built into the TV to measure the ambient light in the room, and adjusts the Dolby Vision dynamic tone mapping accordingly so that details are preserved.

The LG C1 OLED uses Dolby Vision IQ:

[screenshot: Dolby's LG C1 OLED product page listing Dolby Vision IQ]

Source: https://www.dolby.com/experience/televisions/lg-c1-oled-tv/

 

Source: https://www.avforums.com/articles/what-is-dolby-vision-iq.17050/

 

So it looks like Dolby Vision IQ is true Dolby Vision; it's just a version that adjusts the tone mapping based on the ambient lighting (which is what you want). This is a superior technology to standard Dolby Vision, but it's the exact same processing of metadata.

superior, no, but more versatile, yes. the pq eotf is designed for a dark room.


On 1/15/2023 at 1:36 PM, perrinpages said:

lol no, the dolby vision mapping engine is perfectly capable of tone mapping to an sdr wide gamut display. it works well enough on apple devices and licensed dolby vision pcs. 

Imagine tone mapping down to crap SDR instead of real HDR lol. What an SDR display you've got.

Dolby really makes its point when suddenly SDR can outperform HDR lol.


On 1/15/2023 at 1:39 PM, perrinpages said:

superior, no, but more versatile, yes. the pq eotf is designed for a dark room.

This guy doesn't even understand what the PQ EOTF is. It's a transfer function that can output whatever brightness you want.

You would know the brightness goes from 0 to 10,000 nits if only you knew how to calculate it. It has nothing to do with ambient light.

It's amazing how people run their mouths about the PQ EOTF only to expose that they've never seen HDR.


On 1/18/2020 at 8:44 AM, Chen G said:

You're doing something wrong there, that's not supposed to happen.

The only real meaningful difference between HDR10 and Dolby Vision is... too technical to talk about. Basically you can present dark scenes with less banding, more effective bits to work with.

 

It's unlikely Windows will support Dolby Vision because it'll mess up the desktop. A much simpler solution is to just run the screen in 12-bit.

It's not about the bit depth; it's the use of dynamic metadata vs. static metadata that makes Dolby Vision differ from HDR10. It can make a significant difference because Dolby Vision constantly re-tone-maps the content based on whether the scene is light or dark.

 

Dolby Vision vs HDR10 4K HDR | G2 S95B TV Difference in 2022 - YouTube
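
To make the static-vs-dynamic distinction concrete, here's a minimal Python sketch. None of it is Dolby's actual mapping math: the soft-knee curve, the 800-nit display peak, and the 4,000-nit MaxCLL are made-up example numbers.

DISPLAY_PEAK = 800.0  # assumed peak luminance of the target display, in nits

def tone_map(nits, source_peak, display_peak=DISPLAY_PEAK):
    # Rational soft knee: maps 0 -> 0 and source_peak -> display_peak,
    # leaving low luminances nearly untouched.
    k = source_peak * display_peak / (source_peak - display_peak)
    return nits / (1.0 + nits / k)

def hdr10_frame(frame, maxcll=4000.0):
    # HDR10 (static metadata): one curve for the whole movie, anchored
    # to the content-wide MaxCLL, so dark scenes get squeezed too.
    return [tone_map(v, maxcll) for v in frame]

def dv_frame(frame, scene_peak):
    # Dolby Vision (dynamic metadata): the curve is re-anchored to each
    # scene's own peak; a scene that already fits the display passes through.
    if scene_peak <= DISPLAY_PEAK:
        return list(frame)
    return [tone_map(v, scene_peak) for v in frame]

night_scene = [0.5, 10.0, 120.0, 300.0]        # a dark scene peaking at 300 nits
print(hdr10_frame(night_scene))                # dimmed by the movie-wide curve
print(dv_frame(night_scene, scene_peak=300.0)) # passed through unchanged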


  • 3 weeks later...

This is my first time experiencing Dolby Vision on a Windows PC, and I'm blown away. I have an Asus laptop with a 2.8K OLED HDR600 display. I installed the Dolby Vision extension and HEVC codec from the Windows Store and played 4K Dolby Vision movies via the default Movies & TV app, and it looks absolutely gorgeous. I had no idea movies could look this good on a laptop screen. I had always thought that these Dolby Vision, DTS, Dolby Audio, etc. certifications on a laptop were a gimmick, but it turns out that with the proper hardware to back it up, it's absolutely fantastic.

 

I tried the same bunch of movies on another HDR400-certified monitor, but they don't look that great there. I think Dolby Vision needs at least an HDR600-certified display.

 

Both the Dolby Vision extension and the HEVC codec are a must for playing back 4K Dolby Vision movies. Microsoft should bundle these mere 5-6 MB apps with Windows 10/11 out of the box instead of letting users figure it out on their own. Dolby Vision works with Netflix too, but you need to use Edge for that; it doesn't work in Chrome.


On 1/16/2020 at 5:59 AM, FinnishArmy said:

Hi there!

 

I have a Vizio 70" 4K TV which supports both HDR10 and Dolby Vision.

After testing both HDR10 (through Windows 10) and Dolby Vision (through the TV's app), I find that Dolby Vision on Netflix looks 10 times better:

  • Colours are not washed out on Dolby Vision like they are in HDR10
  • Dolby Vision is also much brighter than HDR10.

 

I have my PC hooked up to a Dolby Atmos (not to be confused with Vision) receiver, which I use to watch Netflix and other movies through my computer. I don't have an easy way to hook up my TV to the receiver unless I get optical (which only supports 7.1 and not bitstream). Anyway...

Is there any way to get Dolby Vision working on Windows 10 through the Netflix app? I pay for the top tier, so UHD works for me.

I've tried downloading the Dolby Vision app, which only gives me a slider that supposedly turns Dolby Vision on. And that kind of works: when it's switched on, the Netflix app shows "Dolby Vision + Dolby Atmos", but my TV is only running in HDR10, and when I try to load any movie/TV show it just crashes the app until I either turn off HDR10 (which also turns off Dolby Vision) or turn off Dolby Vision through the Dolby Vision app.

 

Anyone have any success with Dolby Vision through their PC?

Yes, I can play back Dolby Vision on my Windows PC. You need to install the Dolby Vision extension and the HEVC Video Extension from the Windows Store to play it via the default "Movies & TV" app (also called the Films & TV app). It doesn't work with any other video player, only the default Windows app. As for Netflix, you need to use it in Edge; it doesn't work with Chrome or the Netflix app from the Windows Store.


19 hours ago, Happy_Hopper said:

Yes, I can play back Dolby Vision on my Windows PC. You need to install the Dolby Vision extension and the HEVC Video Extension from the Windows Store to play it via the default "Movies & TV" app (also called the Films & TV app). It doesn't work with any other video player, only the default Windows app. As for Netflix, you need to use it in Edge; it doesn't work with Chrome or the Netflix app from the Windows Store.

Take a picture of it switching your TV into Dolby Vision mode when playing from the Movies & TV app.

I've also gotten Dolby Vision content to play there and in the Windows Media Player app (from the Windows Store), but it doesn't switch into Dolby Vision mode; it stays in HDR10 mode while properly playing back the Dolby Vision content.


On 1/18/2023 at 12:56 PM, MonitorFlicker said:

Imagine tone mapping down to crap SDR instead of real HDR lol. What an SDR display you've got.

Dolby really makes its point when suddenly SDR can outperform HDR lol.

wide-gamut sdr with 10bpc is just plain better than rec709...what're you smoking? grading dv properly includes doing a trim pass for lower dynamic range displays lol. you're just spewing nonsense. 


On 1/18/2023 at 1:09 PM, MonitorFlicker said:

This guy doesn't even understand what the PQ EOTF is. It's a transfer function that can output whatever brightness you want.

You would know the brightness goes from 0 to 10,000 nits if only you knew how to calculate it. It has nothing to do with ambient light.

It's amazing how people run their mouths about the PQ EOTF only to expose that they've never seen HDR.

pq eotf is absolute and designed to be viewed in a dark room. sdr brightness is relative and can be viewed in many environments...please do some actual research.

 

https://www.lightillusion.com/what_is_hdr.html


On 2/11/2023 at 2:37 PM, perrinpages said:

wide-gamut sdr with 10bpc is just plain better than rec709...what're you smoking? grading dv properly includes doing a trim pass for lower dynamic range displays lol. you're just spewing nonsense. 

Funny: if you don't have 1,000 nits, you will find it much harder to show the 1,024 shades of 10-bit. No doubt you keep seeing 8-bit SDR instead.

Good luck seeing DV's low dynamic range on incompetent displays that just look worse than SDR lol.


On 2/11/2023 at 2:47 PM, perrinpages said:

pq eotf is absolute and designed to be viewed in a dark room. sdr brightness is relative and can be viewed in many environments...please do some actual research.

 

https://www.lightillusion.com/what_is_hdr.html

Any HDR is designed to be viewed in a dark room. It can go to whatever brightness you want. I like how you bust out ambient light as if there is HDR mastered in a bright room lol.


On 2/12/2023 at 2:21 PM, MonitorFlicker said:

Any HDR is designed to be viewed in a dark room. It can go to whatever brightness you want. I like how you bust out ambient light as if there is HDR mastered in a bright room lol.

you keep contradicting yourself...go home. first you say pq eotf is suitable for bright room (it's not), then you say hdr is for dark room only. 😂


On 2/12/2023 at 2:20 PM, MonitorFlicker said:

Funny: if you don't have 1,000 nits, you will find it much harder to show the 1,024 shades of 10-bit. No doubt you keep seeing 8-bit SDR instead.

Good luck seeing DV's low dynamic range on incompetent displays that just look worse than SDR lol.

BREAKING NEWS...random pleb doesn't know that you can tone map hdr eotf to sdr gamma and still get 1024 gradations on an sdr display 🤯 or that dolby vision can adjust the pq curve willy nilly to match a display's dynamic range capabilities...I hope you know tons of content is available in 10bit sdr and is objectively better than 8 bit sdr...
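
The principle is easy to demonstrate in Python. This is only a sketch: the ST 2084 constants are the published ones, but the hard clip at an assumed 100-nit SDR peak stands in for a real trim pass, which would roll highlights off rather than clip them.

# ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_code_to_nits(code, bits=10):
    # PQ EOTF: 10-bit code value -> absolute luminance in nits
    e = code / (2 ** bits - 1)
    p = e ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)

def nits_to_sdr_code(nits, sdr_peak=100.0, gamma=2.4, bits=10):
    # Clip to the assumed SDR peak, then encode with a display gamma
    # and quantize back into a 10-bit container: still 1024 gradations.
    lin = min(nits / sdr_peak, 1.0)
    return round(lin ** (1 / gamma) * (2 ** bits - 1))

for code in (0, 128, 256, 384, 512):
    nits = pq_code_to_nits(code)
    print(code, "->", round(nits, 3), "nits ->", nits_to_sdr_code(nits))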


1 hour ago, perrinpages said:

you keep contradicting yourself...go home. first you say pq eotf is suitable for bright room (it's not), then you say hdr is for dark room only. 

Funny, this guy knows nothing.

Who said the PQ EOTF is suitable for a bright room? Was it you? You are the one claiming the PQ EOTF is made for a dark room.

I already said this guy doesn't even understand what the PQ EOTF is. It's a transfer function that can output whatever brightness you want. It has nothing to do with ambient light.

 


1 hour ago, perrinpages said:

BREAKING NEWS...random pleb doesn't know that you can tone map hdr eotf to sdr gamma and still get 1024 gradations on an sdr display 🤯 or that dolby vision can adjust the pq curve willy nilly to match a display's dynamic range capabilities...I hope you know tons of content is available in 10bit sdr and is objectively better than 8 bit sdr...

This is even funnier. I like how you talk like you've never seen 10-bit. Color is lit brightness; there is no color without brightness. This is why true HDR needs at least 1,000 nits to show 10-bit.


6 hours ago, MonitorFlicker said:

This is even funnier. I like how you talk like you've never seen 10-bit. Color is lit brightness; there is no color without brightness. This is why true HDR needs at least 1,000 nits to show 10-bit.

you clearly aren't understanding anything I'm saying…why are you saying random words to distract from the actual topic.


6 hours ago, MonitorFlicker said:

Funny, this guy knows nothing.

Who said the PQ EOTF is suitable for a bright room? Was it you? You are the one claiming the PQ EOTF is made for a dark room.

I already said this guy doesn't even understand what the PQ EOTF is. It's a transfer function that can output whatever brightness you want. It has nothing to do with ambient light.

 

except it does. hdr and pq were designed together for dark room. read the standards yourself.

 

https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2100-2-201807-I!!PDF-E.pdf

 

https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-8-2020-PDF-E.pdf

 

i know what an eotf is…you're just taking it completely out of context for no reason. way to derail the discussion 


15 hours ago, perrinpages said:

you clearly aren't understanding anything I'm saying…why are you saying random words to distract from the actual topic.

It's you who doesn't understand. You cannot see 10-bit color without HDR1000.


19 hours ago, perrinpages said:

except it does. hdr and pq were designed together for dark room. read the standards yourself.

 

https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2100-2-201807-I!!PDF-E.pdf

 

https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-8-2020-PDF-E.pdf

 

i know what an eotf is…you're just taking it completely out of context for no reason. way to derail the discussion 

LMAO.

 

You just post links you've never read yourself, throwing out technical documents to try to look smart lol.

 

There is a clear statement to "Avoid light falling on the screen" regarding viewing environment. I hope you understand what that means.

 

I like how you run your mouth like you've never made HDR once. No HDR is made for a bright room. A bright room destroys contrast. Jesus. In a bright room you will never see what the creators intended.

 

The PQ EOTF can output whatever brightness you want. You'd better learn how the PQ EOTF is calculated. It's designed so that you have finer control from 0 to 4,000 nits than from 4,000 to 10,000 nits. It's optimized to suit current post production, where display capability tops out around 4,000 nits lol.
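
That allocation is easy to check against the ST 2084 inverse EOTF itself. A minimal Python sketch, with the constants straight from the spec:

# ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal (0..1)
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def nits_to_pq(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

for nits in (0.1, 100, 1000, 4000, 10000):
    print(f"{nits:>7} nits -> PQ signal {nits_to_pq(nits):.3f}")

# 100 nits lands near 0.51 and 4,000 nits near 0.90, so roughly 90% of
# the code range is spent below 4,000 nits: fine steps where displays
# actually operate, coarse steps in the 4,000-10,000 nit tail.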


  • 4 months later...

Sorry to revive a potentially dead topic for my first post (long-time listener, first-time caller), but I came by to mention five things:

  1. Rudeness - Some of you folks are pointlessly mean to each other, especially when someone doesn't know a thing. Mate, just be generous with your knowledge, and if you need to feel superior, get it from the fact that they didn't know and now they do. If someone's arguing? Just say your piece and walk away; life isn't worth the conflict, even if you do get those sweet, sweet endorphins from it.
  2. Licensing - I'm sure it's been mentioned ad nauseam in the pages I skipped, but DV requires licensing on every element in the stack. However, I think I've read here (it's why I followed the Startpage result) that folks are using stuff like a Lenovo Legion 5 laptop. Theoretically, if it plays back DV to its own screen, it should do it to an LG TV.
  3. The DV Windows Store app is a tone-mapping app - I could be very wrong (which I'm fine with), but I swear that when I investigated it around launch time, this app was literally doing what many folks are chatting about above: applying tone mapping to the image to fit it to whatever output the display is capable of. That said, I suppose at some level all translation of these intents is technically tone mapping of some kind.
  4. Intel Arc Pro - The new Arc Pro cards are advertised with full DV decoding. That means licensing; they wouldn't advertise it on their site without it.
  5. KODI / CoreELEC - Kodi started to support DV recently, and CoreELEC has *just* started to integrate it.

 

Why Post Those Things?

 

Weeeeeeeellllllllll, I actually came here hoping to find that, years later, folks had gotten an external DV-capable display to work with Windows, because: Gaming.

 

To echo the thoughts (and valid rants) of others before me, HDMI (I suppose more specifically HDCP?) has been the worst thing to happen to display technology in the past two decades. Binding together audio and video is great for simplification, but it suuuuuuucks when you want more from your setup. Atmos and Vision gaming on Windows? Good luck. That's despite it being a USP of Xbox since the One. It should be noted that Microsoft's handling of audio and video in Windows is terrible in general. It has not improved since ... well, for a long-ass time.

 

What To Do, Though?

 

All is not lost. 

 

I hesitated to mention the last two things in my list on one of the foremost internet forums (and LMG, if you don't jump the fark on reddit's woes, then what iz you, cuz ... ) because I don't wish to see a feeding frenzy on either the Arc Pro cards or the two supported SBC/mini PCs. However, the combination of those last two things has huge potential.

 

Mostly to finally stop the only devices able to play back your legally ripped UHD Blu-ray discs in MKV with the correct colour space being hyper-expensive bespoke buys: things like Audio, Dune, and Zappiti, because the 2019 Nvidia Shield apparently still struggles with colour.

 

That expense bottleneck is broken now, because the (DV-licensed) S922X-J (or K - plus, is it 928? I forget) in the two supported CoreELEC devices has been proven to play back content in its native state. However, those were Android builds. Folks are now able to play their files back in the "just enough Linux" for Kodi builds that are out there. Those existing, along with x86 Kodi being a thing, means you don't have to extrapolate far to see ...

 

Arc Pro Kodi builds could be a thing.

 

Why Is Arc Pro Having DV Licencing / Decoding A Big Deal?

 

Gaming.

 

Not on those cards, no. Heh. But if Intel can prove that it can capably include the DV licensing and/or the hardware required to do it at zero on-card compute cost, without massively denting their margins? Well, I can see that being the next "ZOMG, LOOK AT THIS MARKETING WORD ON OUR NEXT GPU" from Nvidia and AMD. The silly fanfolks will be having "yeah, but how is your degradation!?!?" flame wars for a good split second. 😏

 

Anyway, perhaps it's all less effective with gaming, because HDR can be effective at the software level (FAR CRY - the original, yes - had an HDR mode which was insane on my old CRT), just like positional audio. Because, lest we forget, most 3D (and non-3D) games have positional audio, which just pipes around whatever matrix of speakers you have set up in pure PCM. But giving devs the tools to simplify their sound design and delivery, and enabling simpler cross-platform support to consoles, surely can't be discounted, even as a reason to lean into something like Atmos. The same has got to be said for HDR, no?

 

So, yeah, obviously the ideal solution (I get it, Samsung) is an open codec for advanced brightness and colour handling like HDR10+, with full 12-bit colour for when those displays are finally around. But right now, literally nothing touches how well colours grade with Dolby Vision, especially in streaming content. Which means if GPUs start to follow the trend set by the Arc Pro cards? Well, that could be nice, right?

 

I hope this is the direction GPUs take.

 

(also, please bring back separate HDMI soundcards, ASUS!)


  • 2 months later...

I just came across this post recently, and I'm not sure if anyone is still following it. I'd like to share my recent experience with enabling Dolby Vision on my computer. To put it briefly: I successfully enabled Dolby Vision mode while playing the game "Battlefield 1." I remember my settings at the time were as follows: I set my Nvidia graphics card to YCbCr 4:4:4 and 10-bit color depth, and in the game's HDR mode I enabled Dolby Vision.

To successfully put my TV in Dolby Vision mode, the game resolutions I used were 1920x1080 at no more than 100Hz and 3840x2160 at no more than 60Hz. However, a crucial note of caution before you attempt this: even if you successfully and flawlessly enter Dolby Vision mode, there's a very high chance that your computer screen will go black when you exit the game. It's not a crash; it's just a black screen. Even if you reboot your computer, as soon as the graphics card driver kicks in, your screen will go black instantly. Unless you boot into Safe Mode and completely remove the graphics card driver using DDU, or you have a second graphics card port, your computer will remain stuck with a black screen. 

So, proceed with caution if you decide to try this out!


With the current drivers, Dolby Vision support has been removed from all games that used to have it, AFAIK. It no longer works at all.


I completely gave up on DV on Windows.

 

After going through the Calman AutoCal workflow first and then the Windows HDR Calibration app, I'm completely satisfied with the HDR10 experience that my C2 offers. At some point you just have to enjoy the content instead of always tweaking and wanting something better.

 

Since I already bought $170 software and a $200 colorimeter to do a full hardware calibration, I probably went deeper into this rabbit hole than most consumers ever will.

 

In my experience, the difference between HDR10 and Dolby Vision is way past diminishing returns. Maybe this all changes when we all have 10,000-nit displays, but with today's consumer tech it doesn't really matter.



For those who want to play back Dolby Vision (DV-layer-only) video by tone mapping it to HDR10, you may try JRiver MC 31. This player at least won't give you the purple-and-green tint, and no Dolby Vision display is required; I tested it on my Samsung Odyssey OLED G9 and it runs fine with correct colour.

 

[screenshots: JRiver MC 31 playing a Dolby Vision file with correct colours on the Odyssey OLED G9]



  • 5 months later...

I think I cracked the code – for Windows to recognize a display as supporting Dolby Vision, all that's required is that the Dolby Vision VSVDB block in the EDID must indicate support for the RGB Low Latency variant of DV. As far as I can tell, that variant basically just sends a normal HDR (BT.2020 + PQ) signal with tone mapping already applied, as opposed to the regular Low Latency variant where a 12 bit 4:2:2 signal is sent.

 

So, I manually edited the EDID of my LG C1 to set that bit (and fixed up the checksum), and after importing the modified EDID in CRU (Custom Resolution Utility), everything seems to work after enabling Windows HDR. Windows recognizes the display as Dolby Vision capable:

 

[screenshot: Windows reporting the display as Dolby Vision capable]

Windows also now reports the primary coordinates and peak luminance of the display as set in the Dolby Vision block:

[screenshot: Windows reporting the primaries and peak luminance from the Dolby Vision block]

Netflix shows the Dolby Vision logo:

[screenshot: Netflix showing the Dolby Vision logo]

And the Dolby Vision logo shown inside the Netflix player, plus the debug info, confirm that I'm indeed being served a Dolby Vision stream. However, the TV does not actually switch into a Dolby picture mode (probably just because the TV doesn't support that variant). It still seems to receive normal HDR metadata though, as it switches into the regular HDR mode, and the resulting image looks perfectly normal. I think the only thing that really matters here is that the TV is not doing any tone mapping on top of the Dolby processing happening on the PC side, which can be achieved in a few ways on an LG TV. I usually use the Game Optimiser picture mode, which performs no tone mapping when set to HGIG. Alternatively, I think setting the maxCLL metadata to a value lower than the display's peak brightness in a "normal" picture mode by using the hidden HDMI Signalling Override menu should have a very similar effect.

 

If anyone wants to try to replicate this, I'm not sure what the easiest way to do that would be. CRU doesn't support the Dolby Vision block beyond the fact that it shows that it exists, and it doesn't let you import specific blocks. But if anyone comfortable with manual EDID editing wants to try this, here's my edited Dolby Vision VSVDB block:

EB 01 46 D0 00 48 03 77 82 5E 6D 95

The only change compared to the stock EDID is that I changed the 0x76 byte to 0x77, i.e. I set the bit representing support for the required variant to 1.
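
For anyone who'd rather script that edit, here's a rough Python sketch. The file names are placeholders for your own EDID dump (e.g. exported from CRU), and it assumes the standard EDID/CTA rule that each 128-byte block must sum to 0 mod 256 via its final checksum byte.

# Set the RGB low-latency bit in the Dolby VSVDB and fix the checksum
# of the 128-byte extension block containing it.
STOCK_VSVDB_PREFIX = bytes.fromhex("EB0146D000480376")  # stock block, bit still unset

with open("edid.bin", "rb") as f:            # placeholder input file
    edid = bytearray(f.read())

offset = edid.find(STOCK_VSVDB_PREFIX)
assert offset >= 0, "stock Dolby VSVDB pattern not found in this EDID"

edid[offset + 7] |= 0x01                     # 0x76 -> 0x77

block = (offset // 128) * 128                # EDID blocks are 128-byte aligned
edid[block + 127] = (-sum(edid[block:block + 127])) % 256

with open("edid_patched.bin", "wb") as f:    # placeholder output file
    f.write(edid)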

