
Irritated By Ignorance, Forcing 5.1 Through HDMI

Daharen

So let me start by outlining what I know. For starters, I know that the application on the computer side plays a large part in determining the audio output. Many applications can bypass the usual conversion and send the audio over HDMI as a bitstream, so full DTS surround goes to the TV, which passes it through to the sound system to be decoded and played there (most notably, VLC can do this). I'm also aware that most applications rely on the EDID data reported by the sink (the TV) to decide what kind of audio to send, so if your TV has built-in stereo speakers, they read that and only send stereo over HDMI.
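For anyone who wants to see what "reading the EDID" actually involves: the TV advertises its audio capabilities as short audio descriptors inside the CEA-861 extension block of its EDID, and that is what Windows and most applications consult before deciding to downmix to stereo. Below is a rough Python sketch that lists those descriptors from a raw EDID dump; treat it as illustrative only. The file name edid.bin is a placeholder for however you obtain the dump (the registry on Windows, /sys/class/drm on Linux), and the format-code table is the standard CEA-861 assignment.

# Sketch: list the audio formats a sink advertises in its EDID.
# Assumes a raw binary EDID dump saved as "edid.bin" (placeholder name).

AUDIO_FORMATS = {
    1: "LPCM", 2: "AC-3 (Dolby Digital)", 6: "AAC LC", 7: "DTS",
    10: "E-AC-3 (Dolby Digital Plus)", 11: "DTS-HD", 12: "Dolby TrueHD (MAT)",
}

def short_audio_descriptors(edid: bytes):
    """Yield (format name, max channels) from the CEA-861 extension block."""
    for start in range(128, len(edid), 128):      # extensions follow the 128-byte base block
        block = edid[start:start + 128]
        if len(block) < 128 or block[0] != 0x02:  # 0x02 = CEA-861 extension tag
            continue
        dtd_offset = block[2]                     # data block collection ends here
        i = 4
        while i < dtd_offset:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 1:                          # Audio Data Block
                for j in range(i + 1, i + 1 + length, 3):
                    code = (block[j] >> 3) & 0x0F
                    channels = (block[j] & 0x07) + 1
                    yield AUDIO_FORMATS.get(code, f"format code {code}"), channels
            i += 1 + length

if __name__ == "__main__":
    with open("edid.bin", "rb") as f:
        data = f.read()
    for name, channels in short_audio_descriptors(data):
        print(f"{name}: up to {channels} channels")

A TV that only reports its internal speakers will typically advertise little more than 2-channel LPCM here, which is exactly why the OS refuses to offer 5.1 output.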

I have repeatedly heard people state that HDMI only supports stereo. These people are idiots and need to stop responding to these posts: countless devices send fully encoded audio over HDMI with ease, and if HDMI only supported stereo, everyone with an advanced sound system would be forced to play musical chairs, plugging devices directly into their sound system whenever they wanted to switch sources. Anyone with two brain cells and any competence in setting up a home theater system knows this is ridiculous, so I am extremely tired of the ignorance from the neckbeards in the PC community who jump online to tell people that HDMI doesn't support what they're asking for. It categorically does; this is a software-side issue with Windows, PC applications, and the software for motherboards and GPUs, NOT an issue with the HDMI standard, or frankly even with TVs.

Most notably, far inferior equipment has been able to pass bitstream audio through TVs for quite some time, from run-of-the-mill Blu-ray players to your humble Xbox One X and PS4 Pro. These devices can even figure out that it's okay to send information back to applications based on your selected audio preferences, thus allowing even YouTube to play in full Dolby surround sound if the application is playing from an Xbox One X. Microsoft develops the software on the Xbox One X and obviously also the Windows software, yet their arguably more advanced OS is inferior in this regard. Further, AMD makes the GPUs for both consoles, and yet for some reason the GPU never interferes with audio passthrough on the consoles, while this is a perpetual issue for PC users.

I'm running a Gigabyte AORUS Master motherboard. The onboard audio hardware is more than sufficient for what I'm trying to achieve. Alas, I can do what I want if I disable my RTX 2080 Ti, use the CPU's integrated graphics, connect HDMI directly to the motherboard, and then use Realtek's software to force the signal, but then I can't use my GPU to play games, much less in surround sound, without resorting to obsolete, unsupported software. Naturally, plenty of people ask whether it's possible to render on the GPU while connected to the motherboard's HDMI port, but in all those threads they aren't asking so they can use the motherboard's onboard audio DACs and other features, so rather than answering the question, the wise people on the forums come in to ask why they want to do it, then tell them that what they want is stupid and unnecessary and that they should just connect to the GPU, dismissing the possibility that there might ever be a reason to do what they're asking, even if it isn't the reason those posters had.

All this said, and with quite a bit of angst and frustration, I want to do what the supposedly inferior software and hardware of countless other devices does so easily: force my GPU to disregard the TV's EDID and allow encoded surround to be sent, so the TV can pass it through to my speaker system. The TV CAN handle the passthrough; it does it with all sorts of other devices, and it even does it with my computer when an application specifically forces the audio. I just need my other applications to recognize that full surround is a viable output option, and the only way to do that is to override the EDID.
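For what it's worth, Windows does document an EDID-override mechanism for monitor devices, and the location commonly used by EDID tools is the monitor's Device Parameters\EDID_OVERRIDE registry key, next to where Windows caches the EDID it read over the wire. Whether a given GPU driver honours an overridden audio block for HDMI audio is a separate question, so treat the sketch below as an illustration of the mechanism, not a guaranteed fix, and verify the location against Microsoft's monitor EDID-override documentation first. The monitor instance path is a placeholder you would look up in Device Manager, and edid_override.bin is assumed to already contain the edited EDID with the audio block changed and block checksums recomputed.

# Sketch: write a per-monitor EDID override into the registry (run as Administrator).
# The monitor instance path below is a PLACEHOLDER -- find your real one in
# Device Manager -> Monitors -> Details -> Device Instance Path.
# "edid_override.bin" is assumed to hold the edited EDID; Windows expects it
# split into 128-byte blocks stored as binary values named "0", "1", ...
import winreg

INSTANCE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY\GSM0000\1&2a3b4c5d&0&UID256"  # placeholder
KEY_PATH = INSTANCE + r"\Device Parameters\EDID_OVERRIDE"

with open("edid_override.bin", "rb") as f:
    edid = f.read()

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    for n, start in enumerate(range(0, len(edid), 128)):
        winreg.SetValueEx(key, str(n), 0, winreg.REG_BINARY, edid[start:start + 128])

print("Override written; re-plug the display or reboot for it to take effect.")

If the driver reads the override, the formats offered in Sound settings after a reboot should track whatever the edited audio block claims.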

Please understand: many people solve this problem with a sound card, but that's not an option. My PCIe slots are taken up by the GPU, a 905P SSD, and an HTC Vive wireless adapter, so there is no room for a sound card, and frankly the onboard chips and DAC are more than enough for what I want, which makes it ridiculous that I should have to buy essentially the same hardware in a different form factor merely to get the same result. In addition, since no other media device I own requires me to unplug the TV from the soundbar and plug the device in directly, I don't see why the most expensive and sophisticated piece of hardware I own should be the exception. This is NOT a hardware problem, it is a software problem, so if you're offering hardware solutions, your input is not needed; you likely fall into one of the derogatory categories I listed above. Everyone else, I welcome your input, and I apologize for my candor and crudeness.

CPU | 8700K @ 5.1 GHz, AVX offset 0, 1.37 V stable, Motherboard | Z390 Gigabyte AORUS Master V1.0, BIOS F9, RAM | G.Skill Ripjaws V 16x2 @ 2666 MHz 12-16-16-30, latency 38.5 ns, GPU | EVGA 2080 Ti FTW3 Ultra Hydro Copper @ 2160 MHz core & 7800 MHz memory, Case | Phanteks Enthoo Primo, Storage | Intel 905P 1 TB PCIe NVMe SSD, PSU | EVGA SuperNOVA Titanium 1600 W, UPS | CyberPower Sinewave 2000 VA / 1540 W, Display(s) | LG 4K 55" OLED & CUK 1440p 27" @ 144 Hz, Cooling | custom water loop, 1x 480x60 mm, 1x 360x60 mm, 2x 240x60 mm, 1x 120x30 mm rads, 12x Noctua A25x12 fans, Keyboard | Logitech G915 Wireless (Linear), Mouse | Logitech G Pro Wireless Gaming, Sound | Sonos soundbar, subwoofer, 2x Play:3, Operating System | Windows 10 Professional.



2080ti DisplayPort -> display
2080ti HDMI -> receiver -> speakers + same display (don't worry, this makes more sense in a minute)

Set the display to its DisplayPort input.
In the Nvidia Control Panel, set video to output over DisplayPort and sound over HDMI (the cabling above makes this possible by creating a "ghost" monitor).

In display settings you will see a "ghost" monitor representing your HDMI connection.

Move this "ghost" monitor so that its corner is touching the corner of the active display. This will make it impossible to move the pointer to the "ghost" display.

I literally had your same issue a few months back and this solution allowed me to have 5.1 surround sound from a PC using an existing home theater setup.
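If you would rather script the repositioning step than drag the ghost display around in Settings, here is a rough pywin32 sketch of the same idea. It assumes pywin32 is installed, and the device names \\.\DISPLAY1 (primary) and \\.\DISPLAY2 (the HDMI "ghost" output) are placeholders you would confirm first, for example with EnumDisplayDevices.

# Sketch: park the HDMI "ghost" display so only its top-left corner touches
# the primary display's bottom-right corner, so the pointer cannot cross over.
# Both device names below are PLACEHOLDERS -- verify which \\.\DISPLAYn is which.
import win32api
import win32con

PRIMARY = r"\\.\DISPLAY1"   # placeholder: the DisplayPort monitor
GHOST   = r"\\.\DISPLAY2"   # placeholder: the HDMI "ghost" output

# The primary display always sits at (0, 0), so its bottom-right corner is
# simply (width, height) of its current mode.
primary = win32api.EnumDisplaySettings(PRIMARY, win32con.ENUM_CURRENT_SETTINGS)
corner = (primary.PelsWidth, primary.PelsHeight)

# Move the ghost display's origin onto that corner point and persist it.
dm = win32api.EnumDisplaySettings(GHOST, win32con.ENUM_CURRENT_SETTINGS)
dm.Position_x, dm.Position_y = corner
dm.Fields = dm.Fields | win32con.DM_POSITION
win32api.ChangeDisplaySettingsEx(GHOST, dm, win32con.CDS_UPDATEREGISTRY)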

 

EDIT:

You can see the final result here:
 

 


On 10/24/2019 at 6:04 PM, Daharen said:

I have repeatedly heard people state that HDMI only supports stereo. These people are idiots and need to stop responding to these posts.

I have never heard anyone say that. Ever.

 

On 10/24/2019 at 6:04 PM, Daharen said:

These devices can even figure out that it's okay to send information back to applications based on your selected audio preferences, thus allowing even YouTube to play in full Dolby surround sound if the application is playing from an Xbox One X.

Interesting. From everything I've read, YouTube doesn't support surround sound at all on anything... Do you have any links to YT videos with surround sound? I can check them on my buddy's system, just out of curiosity. If that were the case, YT could theoretically run surround sound on anything that isn't HTML-based, so any/every console app, TV apps, etc. That's how Netflix does it, and to a lesser extent Hulu.

 

As to the rest of it, I was going to say that monitors don't have ARC, but you're using a TV, so I'll just argue that MS is stupid and won't provide a simple way to pass audio through. My buddy has a 5.1 system hooked up to some consoles and wants to get a PC, and I started telling him how much of a pain it is using HDMI audio on a PC. We actually started talking about making a splitter that would ignore or rewrite the EDID so you could pass video with stereo (or whatever) to one output and surround sound to the other. He's an actual, rather high-level EE at Texas Instruments, but I'm not getting my hopes up.

 

Every time I want to play my PS4 I have to physically unplug my PC and plug in my PS4. I'll have to look into @nosirrahx's solution to see if it works, but considering I have a monitor and not a TV, it might just freak out.

Edit: his setup is pretty close to mine and doesn't actually solve anything for me. The AVR just shows up as an extra monitor that doesn't actually exist. In my case it doesn't solve the problem because I can't have PS4 -> AVR -> monitor at the same time as PC -> AVR -> monitor and PC -> monitor. Completing that loop while the AVR is plugged in means two connections from the PC to the monitor, which freaks the monitor out, so I just have to physically swap cables. As far as I know there's no real hardware or software solution.

 

As a further note, because the AVR is a second monitor that doesn't exist, applications will sometimes open on it, and good luck getting them back to your primary display. Considering your PC won't pass surround audio through the TV, that's the likely route you'll have to take.
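One small mitigation for the "window opened on the invisible display" problem: Windows will move the focused window to the next display with Win+Shift+Left/Right arrow, and if that ever fails, something like the pywin32 sketch below (pywin32 assumed installed) drags whatever has focus back onto the primary display.

# Sketch: pull the currently focused window back onto the primary display,
# for when an app has opened on the invisible "ghost" monitor.
import win32gui

hwnd = win32gui.GetForegroundWindow()
left, top, right, bottom = win32gui.GetWindowRect(hwnd)
width, height = right - left, bottom - top

# The primary display's origin is always (0, 0) in the Windows virtual desktop,
# so repositioning the window there makes it visible again.
win32gui.MoveWindow(hwnd, 0, 0, width, height, True)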

#Muricaparrotgang


Don't be so sure. This setup had not one but two consoles added to it.

Instead of swapping cables, the receiver switches inputs, and since the monitor is also connected over HDMI, switching the monitor to its HDMI input lets the consoles use both the sound system and the display, with no wire swapping involved.

