
DD 5.1 to LG TV from Nvidia Card help

Get-In-Da-Robot

Hey guys, I'm having trouble enabling Dolby Digital 5.1 on my LG TV from my graphics card. My setup is a GTX 1080 connected via HDMI to an LG C9, and the C9 connected to a Sonos 5.1 system via optical. Every other device I have connected is able to send a DD signal through except my PC. I've attempted changing drivers to no avail.


On 4/16/2019 at 2:37 PM, valkko said:

Optical is a standard with bandwidth limitations. 5.1 is only possible as a compressed signal, and it is up to the receiver whether it can decode that signal or not. Based on your description, the reason your audio receiver is only playing stereo is:

 

- Not all smart TVs output 5.1 compressed audio via optical; many just send a stereo signal.

- Mainly because when you hook your Nvidia card up to your TV, the EDID from the TV tells the card to send only an L/R (stereo) signal. EDID acts like an ID exchanged between devices over HDMI and establishes how the sound is delivered: 2.0, 2.1, 5.1, or whatever. Your TV's EDID differs from a receiver's. Nvidia cards used to have 5.1 capabilities over HDMI this way: some people reported success using Custom Resolution Utility to patch the driver with a made-up EDID, fooling the card into identifying the TV as a receiver, but this was crippled in the last six months of driver iterations. You can check what audio formats your TV's EDID actually advertises; see the sketch below.
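If you want to verify that for yourself, here is a minimal Python sketch (not from this thread) that decodes the Short Audio Descriptors in an EDID dump and lists which audio formats the display advertises. The sysfs path at the bottom is an assumption and varies per machine; on Windows you could point it at a .bin dump exported from Custom Resolution Utility instead.

#!/usr/bin/env python3
"""Sketch: list the audio formats a display advertises in its EDID."""

# CTA-861 Audio Format Codes (SAD byte 1, bits 6-3)
AUDIO_FORMATS = {
    1: "LPCM", 2: "AC-3 (Dolby Digital)", 6: "AAC", 7: "DTS",
    10: "E-AC-3 (DD+)", 11: "DTS-HD", 12: "Dolby TrueHD/MAT",
}

def audio_descriptors(edid: bytes):
    """Yield (format_name, max_channels) from every CTA-861 extension block."""
    n_ext = edid[126]                          # extension count, base block byte 126
    for i in range(1, n_ext + 1):
        blk = edid[128 * i : 128 * (i + 1)]
        if len(blk) < 128 or blk[0] != 0x02:   # only CTA-861 extension blocks
            continue
        dtd_start = blk[2]                     # data blocks occupy bytes 4..dtd_start-1
        pos = 4
        while pos < dtd_start:
            tag, length = blk[pos] >> 5, blk[pos] & 0x1F
            if tag == 1:                       # Audio Data Block: 3-byte SADs
                for s in range(pos + 1, pos + 1 + length, 3):
                    fmt = (blk[s] >> 3) & 0x0F
                    chans = (blk[s] & 0x07) + 1
                    yield AUDIO_FORMATS.get(fmt, f"format {fmt}"), chans
            pos += 1 + length

if __name__ == "__main__":
    # Hypothetical connector path; pick the one your TV is actually on.
    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        for name, chans in audio_descriptors(f.read()):
            print(f"{name}: up to {chans} channels")

If the output only shows LPCM with 2 channels and no AC-3 line, that matches the stereo-only behavior described above.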

 

EDID is a requirement of the HDCP protocol (cracked a long time ago), which was conceived as a way of fighting piracy.

 

Instead, you can:

Using optical - Use Pro Logic mode (virtual surround or fake 5.1; the name changes from brand to brand) on your receiver.

Using HDMI - Buy an EDID box to clone the signal from your card: one cable to the TV, the other to the receiver. This is hit or miss and there is no guarantee that the box will output 5.1 audio; it will only work if your receiver has HDMI in.

 

I have an almost identical setup to yours and ended up using the Pro Logic decode (fake 5.1) mode of my receiver. It is not perfect, but it works.

 

 

Nvidia GPUs have supported 5.1 audio for ages... I use my 1050 Ti to output to a 17-channel AVR (17 channels of processing, 15 actually powered). HDMI 2.0 can carry 32 channels of, I believe, up to 1536 kHz audio. And @Get-In-Da-Robot: check that the TV has the correct audio settings enabled. I used to have this issue with a Yamaha receiver and a PlayStation; the TV didn't have the correct settings.
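To put rough numbers on the bandwidth point from valkko's post above, here is a back-of-the-envelope sketch (assuming 48 kHz sources; the figures are mine, not from this thread) showing why optical can only carry 5.1 as a compressed bitstream while HDMI has room to spare.

#!/usr/bin/env python3
# Why TOSLINK needs a compressed codec for 5.1 while HDMI does not.
MBIT = 1_000_000

# Uncompressed 5.1 LPCM: 6 channels x 48 kHz x 16-bit samples
lpcm_51 = 6 * 48_000 * 16 / MBIT        # ~4.6 Mbit/s of raw audio

# S/PDIF (what TOSLINK carries) is framed for 2 channels; each sample
# rides in a 32-bit subframe, so the whole link at 48 kHz runs at:
spdif = 2 * 48_000 * 32 / MBIT          # ~3.1 Mbit/s, framing included

# AC-3 (Dolby Digital) tops out at 640 kbit/s, so a compressed 5.1
# bitstream fits where raw 5.1 LPCM physically cannot.
ac3 = 0.64

# HDMI 2.0, by contrast, budgets for 32 audio channels at up to a
# 1536 kHz aggregate sample rate: orders of magnitude more headroom.
print(f"5.1 LPCM needs ~{lpcm_51:.1f} Mbit/s; TOSLINK carries ~{spdif:.1f} Mbit/s")
print(f"AC-3 5.1 needs only {ac3:.2f} Mbit/s -> compression is mandatory on optical")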

LTT's Resident Porsche fanboy and nutjob Audiophile.

 

Main speaker setup is now:

 

Mini DSP SHD Studio -> 2x Mola Mola Tambaqui DACs (fed by AES/EBU; one feeds the left sub and main, the other feeds the right side) -> 2x Neumann KH420 + 2x Neumann KH870

 

(Having a totally separate DAC for each channel is game-changing for sound quality)

