
10/12-bit HW decoding (VP9/HEVC) finally available for GeForce - GeForce 378.66

zMeul

source: http://us.download.nvidia.com/Windows/378.66/378.66-win10-win8-win7-desktop-release-notes.pdf

 

NVIDIA finally released a driver that enables HEVC/VP9 10-bit HW decoding for GeForce cards. The release notes don't specifically say which cards are covered, but I'm guessing this is only available for the GTX 10xx series; no idea about the status of the GTX 950/960.

Quote
  • Video SDK 8.0
    • High-bit-depth (10/12-bit) decoding (VP9/HEVC)

 

after getting my GTX 1070 I found it really odd that the only way to HW decode 10-bit was through DXVA2 and not through NVIDIA's native HEVC path (CUVID)
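For anyone who wants to check what their own card and driver actually report, here's a rough sketch using the Video Codec SDK 8.0 cuvidGetDecoderCaps() call (my own example, assuming the CUDA toolkit and nvcuvid headers are installed; link against cuda and nvcuvid):

#include <cstdio>
#include <cstring>
#include <cuda.h>
#include <nvcuvid.h>

int main() {
    // a CUDA context must be current before querying NVDEC capabilities
    cuInit(0);
    CUdevice dev;
    CUcontext ctx;
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    CUVIDDECODECAPS caps;
    memset(&caps, 0, sizeof(caps));
    caps.eCodecType      = cudaVideoCodec_HEVC;        // or cudaVideoCodec_VP9
    caps.eChromaFormat   = cudaVideoChromaFormat_420;
    caps.nBitDepthMinus8 = 2;                          // 10-bit (4 would be 12-bit)

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS && caps.bIsSupported)
        printf("10-bit HEVC decode supported, up to %ux%u\n", caps.nMaxWidth, caps.nMaxHeight);
    else
        printf("10-bit HEVC decode not supported on this GPU/driver\n");

    cuCtxDestroy(ctx);
    return 0;
}

That would settle the GTX 950/960 question without guessing, since the call reports support per codec and per bit depth.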

 

---

 

the complete list of GPU decode capabilities: https://developer.nvidia.com/video-encode-decode-gpu-support-matrix#Decoder


1 minute ago, M.Yurizaki said:

Now we just need HDR monitors to go with it.

already have an HDR TV :D

 

the only problem I have is finding a good HDMI 2.0 cable that actually delivers the 18Gbps stated on the box, without having to disable sound so the image won't spazz out: https://www.youtube.com/watch?v=acwIDLbuEMI (I added blur to the video because I captured some personal info without realising it sooner)
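For what it's worth, the 18Gbps figure isn't just marketing; here's my rough back-of-the-envelope math (my own numbers, not from the release notes) for why UHD 60Hz with 4:2:2 10/12-bit output sits right at the HDMI 2.0 limit:

#include <cstdio>

int main() {
    // UHD@60 including blanking is roughly a 594MHz pixel clock (4400 x 2250 x 60).
    // On HDMI, YCbCr 4:2:2 is carried as 24 bits per pixel regardless of 8/10/12-bit depth,
    // and TMDS 8b/10b coding adds a 10/8 overhead on top of the payload.
    const double pixels_per_sec = 4400.0 * 2250.0 * 60.0;
    const double bits_per_pixel = 24.0;
    const double payload_gbps   = pixels_per_sec * bits_per_pixel / 1e9;   // ~14.3
    const double line_rate_gbps = payload_gbps * 10.0 / 8.0;               // ~17.8
    printf("payload %.1f Gbps, line rate %.1f Gbps vs the 18 Gbps HDMI 2.0 limit\n",
           payload_gbps, line_rate_gbps);
    return 0;
}

A marginal cable running that close to its limit is exactly where you'd expect flaky behaviour like that.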


21 minutes ago, M.Yurizaki said:

If only NVIDIA would let us consumers output 30bpp over HDMI to it. :(

 

Though I only tried it once, saw there were no color modes higher than 24bpp available, and gave up

what do you mean exactly!?


1 minute ago, zMeul said:

what do you mean exactly!?

The maximum bit depth I was able to select in the NVIDIA control panel for my TV was 24bpp. After digging around for like five minutes, I found that NVIDIA doesn't allow consumer cards to go higher than 24bpp. I'm also under the impression that unless you output higher than 24bpp, you cannot view HDR content regardless.

 

I could give it another whack.


@M.Yurizaki

Quote

The maximum bit depth I was able to select in the NVIDIA control panel for my TV was 24bpp. After digging around for like five minutes, I found that NVIDIA doesn't allow consumer cards to go higher than 24bpp. I'm also under the impression that unless you output higher than 24bpp, you cannot view HDR content regardless.

you mean desktop color depth?

 

on my TV it goes to 32-bit with 4:2:2 chroma subsampling and 10/12-bit output

 

but the cable I tested with was bad, not capable of handling the 18Gbps needed


4 minutes ago, zMeul said:

@M.Yurizaki

you mean desktop color depth?

Yes.  But yeah, I'll try tinkering with this when I get home.

 

The other hurdle is that you need a video player that actually plays HDR content. I've only found one, the UWP movie app. I proved this by adjusting the brightness slider and... it was like adjusting the exposure of a camera.

[attached screenshot: eReGkFd.jpg]

 

The other video players I've tried do some kind of tone mapping and it ruins the color.


6 minutes ago, M.Yurizaki said:

The other video players I've tried do some kind of tone mapping and it ruins the color.

I tried some HDR-enabled games (Shadow Warrior 2 & Hitman) and I found something that indicates the tone mapping is Windows' fault, since Windows does not support HDR natively

the workaround I found was to change the resolution from within the game to 1080p or 1440p and back to UHD - only then would the colors be displayed correctly; another way was to have the desktop set to 1080p or 1440p and the game set to UHD

 

last I heard, the upcoming "Creators Update" should add native HDR support to W10

 

PS: I'm not wasting my time with Store UWP apps; I don't even have an MS account logged in with the Store


12 minutes ago, zMeul said:

I tried some HDR-enabled games (Shadow Warrior 2 & Hitman) and I found something that indicates the tone mapping is Windows' fault, since Windows does not support HDR natively

the workaround I found was to change the resolution from within the game to 1080p or 1440p and back to UHD - only then would the colors be displayed correctly; another way was to have the desktop set to 1080p or 1440p and the game set to UHD

 

last I heard, the upcoming "Creators Update" should add native HDR support to W10

 

PS: I'm not wasting my time with Store UWP apps; I don't even have an MS account logged in with the Store

Well, the thing is Windows supports 30bpp, which is the minimum needed for HDR10: http://nvidia.custhelp.com/app/answers/detail/a_id/3049/~/how-to-enable-30-bit-color-on-windows-platforms

 

Most of the spec depends on the display and recording equipment itself rather than the software, other than needing the correct codec.


5 minutes ago, M.Yurizaki said:

Well, the thing is Windows supports 30bpp, which is the minimum needed for HDR10: http://nvidia.custhelp.com/app/answers/detail/a_id/3049/~/how-to-enable-30-bit-color-on-windows-platforms

 

Most of the spec depends on the display and recording equipment itself rather than the software, other than needing the correct codec.

the problem is that Windows Desktop Composition somehow interferes with the game and messes up the color palette

and BTW: exclusive fullscreen is required because of this


1 minute ago, zMeul said:

the problem is that Windows Desktop Composition somehow interferes with the game and messes up the color palette

and BTW: exclusive fullscreen is required because of this

Well, that would probably explain why the desktop can't have 30bpp or display HDR videos correctly.

 

Though I will backpedal a bit on my last post: the software does need adjustments, just minimal ones. I mean for games, enabling HDR would basically be "downsample to 30bpp instead of 24bpp".
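To make that concrete, here's a minimal, hypothetical D3D11 sketch of what asking for a 30bpp back buffer in exclusive fullscreen looks like (the helper name and the fixed 3840x2160@60 mode are placeholders of mine, and proper HDR10 signalling needs more than this):

#include <d3d11.h>

// Hypothetical helper: create a device plus an exclusive-fullscreen swap chain
// with a 10-bit-per-channel back buffer instead of the usual 8-bit one.
HRESULT CreateTenBitSwapChain(HWND hwnd, IDXGISwapChain **swapChain,
                              ID3D11Device **device, ID3D11DeviceContext **context) {
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount                         = 2;
    desc.BufferDesc.Width                    = 3840;
    desc.BufferDesc.Height                   = 2160;
    desc.BufferDesc.Format                   = DXGI_FORMAT_R10G10B10A2_UNORM; // 30bpp instead of 24bpp
    desc.BufferDesc.RefreshRate.Numerator    = 60;
    desc.BufferDesc.RefreshRate.Denominator  = 1;
    desc.BufferUsage                         = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow                        = hwnd;
    desc.SampleDesc.Count                    = 1;
    desc.Windowed                            = FALSE;                         // exclusive fullscreen
    desc.SwapEffect                          = DXGI_SWAP_EFFECT_DISCARD;

    return D3D11CreateDeviceAndSwapChain(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                         nullptr, 0, D3D11_SDK_VERSION, &desc,
                                         swapChain, device, nullptr, context);
}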


I found that K-Lite is the only player so far that can HW decode 10-bit (HEVC10) videos through NVIDIA CUVID:

 

[attached screenshot: 4VRRXaQ.png]

 

the LAV decoder must be a modded build, because the standard (GitHub) release does not detect HEVC10
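For anyone curious how a player ends up on the CUVID path, here's a rough libavcodec-style sketch of my own (the "hevc_cuvid" decoder name is FFmpeg's; whether a particular K-Lite/LAV build actually exposes it depends entirely on how it was compiled):

extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstdio>

int main() {
    avcodec_register_all();  // needed on libavcodec versions of that era

    // look for FFmpeg's NVIDIA CUVID decoder rather than the plain software "hevc" one
    AVCodec *dec = avcodec_find_decoder_by_name("hevc_cuvid");
    if (!dec) {
        printf("hevc_cuvid is not compiled into this libavcodec build\n");
        return 1;
    }

    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    // a real player feeds demuxed HEVC Main10 packets here; if the GPU/driver
    // can't decode 10-bit, opening or decoding fails and the player should fall
    // back to DXVA2 or software decoding
    int err = avcodec_open2(ctx, dec, nullptr);
    printf("avcodec_open2: %s\n", err == 0 ? "ok" : "failed");
    avcodec_free_context(&ctx);
    return 0;
}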

