Steam In-home Streaming and cross-branded GPUs.

Lildirt

While I don't expect many people to have experience with this, I'm going out on a limb about a purchase. This question might well go unanswered, but I figured I'd take a gamble here first (since this forum likely has a high population of gamers).

 

I use Steam's in-home streaming to play my games. As of right now, my system has two Radeon HD 7950 cards and an i5-4670k (overclocked). Steam in-home streaming offers hardware encoding, where you can have the gamestream encoded by your GPUs (exciting, right?). There's an option that allows you to decide how the stream is handled. I've attached a screenshot, since I'm awful at explaining things. Forgive the weird look of the fonts; this was taken on the Linux client (but the settings and options are the same).

options-page.png

 

I'm sure many of you (who know what this is) are shouting at me for not encoding with my AMD GPU, but that brings me to my question: there's a fundamental problem with encoding video on the AMD side. For some reason, under Windows (with the 7950s CrossFired on the Crimson driver), the stream sometimes comes out very badly. It's actually perfectly okay a lot of the time, but at other times it completely botches the bottom half of the screen. Here's a screenshot of what I mean. Please note that I have no clue whether this same issue happens with nVidia GPUs, which is why I am posting.

 

bad-game.png

 

Notice the artifacting on the sand; that's not the proper texture. It gets worse, but I could only reproduce it in the middle of a dungeon (which is a bad time to take a screenshot, IMO; I also didn't want to grab one because the stream gets very weird while it's happening).

 

Here's what it looks like normally. Crystal clear. Makes for nice gameplay.

 

normal-game.png

 

(random side note: the game in the image is Final Fantasy XIV, an MMORPG. Yes, I am a cat. Yes, I am a female.)

 

See what I mean? It looks awful (or can, anyway). While this doesn't make the game unplayable, it's annoying. It breaks immersion, so on and so forth. I don't need to explain this.

 

So, down to the question. As you noticed, there's an option to encode using the nVidia GPU (which is probably better, because CUDA). Does Steam support this kind of setup (using the AMD GPUs for the game and an nVidia GPU to encode the gamestream)? Could I buy an nVidia GPU, shove it in the machine, and have it dedicated to streaming? I haven't mixed brands of GPUs before, so how would they interact with each other (both drivers running at the same time)? What kind of card could I throw in that would handle 1080p60 on-demand video encoding (note that the idea is to remove the iGPU from the workload entirely, so as not to tie up the CPU any more than we have to)? I ask all of this because a few other, more CPU-bound titles are simply killing my poor Haswell 4670k (and I doubt the iGPU is actually worth a damn for what I want here). Note that every title I've mentioned runs perfectly at 1080p60 in a normal, non-streaming setting; I have no issues with them whatsoever.
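For a rough sense of what "1080p60 on-demand encoding" actually asks of a card, here's some back-of-the-envelope math. The 20 Mbps figure is an assumed LAN-streaming bitrate for illustration, not anything Steam documents:

```python
# Back-of-the-envelope encode workload for 1080p60.
# All figures here are rough assumptions, not measurements.

WIDTH, HEIGHT, FPS = 1920, 1080, 60

pixels_per_frame = WIDTH * HEIGHT           # 2,073,600 pixels per frame
pixels_per_second = pixels_per_frame * FPS  # ~124 million px/s the encoder must process

# Assumed H.264 bitrate for a 1080p60 game stream over LAN (~20 Mbps):
target_mbps = 20
bytes_per_second = target_mbps * 1_000_000 / 8

print(f"{pixels_per_second:,} pixels/s to encode")
print(f"~{bytes_per_second / 1_000_000:.1f} MB/s of encoded output at {target_mbps} Mbps")
```

The point of the math: a dedicated hardware encoder block handles this pixel rate trivially, which is why even a modest card's fixed-function encoder can keep up where a CPU struggles.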

 

Needless to say, I haven't mixed GPU brands before. This is a weird solution I figured might be possible if I'm willing to invest in it, which I am. I don't know how I could tell the game not to touch the nVidia GPU but tell Steam to use it for the video encoding (beyond the settings page).

 

If anyone is able to provide any information about this, I'd be grateful. I'm not submitting my next order of GPUs for a little while, but I'm eager to try and get this set straight (also, I totally didn't fry my only free nVidia card a few days ago, I swear).

 

Cheers.

 

 


11 minutes ago, Lildirt said:

As you noticed, there was an option to encode using the nVidia GPU (which is probably better, because CUDA)

I believe the encoding takes place on a special chip on the graphics card (known as an ASIC, or application-specific integrated circuit). Maybe someone else can confirm this. As far as I know, NVIDIA's and AMD's hardware encoding are pretty much equal in every way.

 

19 minutes ago, Lildirt said:

Could I buy an nVidia GPU, shove it in the machine, and have that dedicated to streaming?

In theory... yes. Does Steam support that? I doubt it. It's possible that OBS Studio does, though that doesn't help you here...

 

20 minutes ago, Lildirt said:

I haven't mixed brands of GPUs before, so how would they interact with each other (both drivers running at the same time)?

It's perfectly fine to mix NVIDIA and AMD GPUs. They don't run in SLI or CrossFire together, of course; they work independently. It could let you drive a ton of screens if you ever needed to.

 

So, in conclusion, getting an Nvidia GPU probably won't help. My advice is

  • Upgrade to a GTX 1080 when it comes out
  • Upgrade to an i7 and stick with software encoding
  • Use the iGPU encoding. By the way, you never actually say in your post if that works OK... does it? Is there any performance hit compared to AMD hardware encoding?

On a side note, I could totally be wrong here, but I think the iGPU is a separate block inside the CPU package, so using it shouldn't affect the performance of the CPU cores themselves.

i7 4790k | MSI Z97S SLI Krait Edition | G.Skill Ripjaws X 16 GB | Samsung 850 EVO 500 GB | 2x Seagate Barracuda 2TB | MSI GTX 970 Twin Frozr V | Fractal Design R4 | EVGA 650W

A gaming PC for your budget: $800 - $1000 - $1500 - $1800 - $2600 - $9001

Remember to quote people if you want them to see your reply!


 

Apologies for the weird quoting. I didn't figure out how to quote a specific post multiple times until about halfway through my post.

 

45 minutes ago, HPWebcamAble said:

I believe the encoding takes place on a special chip on the graphics card (known as an ASIC, or application-specific integrated circuit). Maybe someone else can confirm this. As far as I know, NVIDIA's and AMD's hardware encoding are pretty much equal in every way.

 

Is it? I'm not familiar with how either vendor's hardware-accelerated video encoding works. I was always under the impression that nVidia GPUs would perform slightly better for on-demand video encoding tasks. Never knew that about GPUs. I had always assumed the iGPU consumes a bit more CPU resources when in use (because it seems that way; my i5-4670k never pins at 100% when running GTA V alone, but it does when gamestreaming).

 

45 minutes ago, HPWebcamAble said:
  • Use the iGPU encoding. By the way, you never actually say in your post if that works OK... does it? Is there any performance hit compared to AMD hardware encoding?

It's playable, by all means. However, I can't hold 60fps all the time. The gamestream will sometimes fall to 30fps (which Steam resorts to if it can't sustain 60fps). Something like GTA V can drop down to 20fps, which is seriously not playable for a shooter. If I switch to AMD GPU encoding, it absolutely demolishes the graphical integrity of the gamestream (weird malformed black-and-white patterns on the roads, for some reason). I just want better performance. The AMD encoding seems a bit better when running normally, but as shown above, the stream does deteriorate.
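That 60-to-30fps fallback makes sense in frame-time terms: if rendering plus capture plus encode can't fit inside the 60fps budget, halving the rate doubles the budget. A quick sketch (the per-frame costs below are made-up illustrative numbers, not measurements):

```python
# Frame-time budgets: why a stream falls back from 60 to 30 fps.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

budget_60 = frame_budget_ms(60)   # ~16.7 ms per frame
budget_30 = frame_budget_ms(30)   # ~33.3 ms per frame

# Hypothetical per-frame costs (illustrative only):
render_ms, encode_ms = 12.0, 8.0
total = render_ms + encode_ms     # 20 ms: misses the 60fps budget, fits 30fps

verdict = "60fps OK" if total <= budget_60 else "drops to 30fps"
print(f"total frame cost {total:.1f} ms -> {verdict}")
```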

 

Quote

In theory... yes. Does Steam support that? I doubt it. It's possibly that OBS Studio does though. Not that it helps you though...

 

Well, Steam DOES support using a specific GPU. But I was really asking whether it's possible to use the nVidia GPU for the gamestream only (which is possible in Steam's settings; it can leave the iGPU alone, and likewise the AMD GPU) while keeping the CrossFired AMD GPUs for the game (how can I force the games onto the AMD GPUs specifically?). I did find this video (from LTT) that touched on something like this, but it left me a little confused about how exactly to do it.

 

Maybe I missed something, I'm not sure. The video doesn't actually explain how I could choose which manufacturer to use for a specific game session ("simple; choose who you want to go with and run your game" -Linus).

 

Quote

Upgrade to an i7 and stick with software encoding

 

Would an i7 on an LGA1150 socket REALLY do anything (plus, high-end i7 LGA1150 chips are still not cheap, unless I somehow manage second-hand)? I don't want to basically upgrade to a Skylake system to pull this off, since I'd be buying a new system entirely at that point (the 7950s would be the bottleneck then). This also introduces a likely bottleneck if I want to upgrade to 4K streaming at some point. No clue how well that'd work out, but it's something I'd need to consider.

 

edit: I don't think so. An i5-4670k versus an i7-4790k would yield fairly similar results (both the i5 and i7 would share the same iGPU .. or relatively the same, anyway, yes?).

45 minutes ago, HPWebcamAble said:

Upgrade to a GTX 1080 when it comes out

Hah, figured out how to quote properly! Again, that's more than I wanted to throw into this. I only meant like $100 for a mid-end GPU that could pull off the encoding job. That was it. Not a card that will likely be something like $1K.

 

Quote

It's perfectly fine to mix NVIDIA and AMD gpus. They don't run in SLI or Crossfire together, of course. They work independently. It could allow you to use a ton of screens if you needed to for any reason.

Whoops. I left that in there. Thanks for answering it, though; I meant to remove that bit (because duh, of course you can do that). The Linux host that runs the Windows guest I use for games is already running off some weird GT 520 I had on a rack (if I don't give it its own card, the Linux host freaks out for some reason), so I have no clue what I was thinking when I typed that.


10 hours ago, Lildirt said:

If I switch to AMD GPU encoding, it absolute demolishes the graphical integrity of the gamestream (weird malformed black and white patterns on the roads for some reason). I just want better performance. The AMD encoding seems to be a bit better when running normally, but as shown above, the stream does deteriorate.

This is a major issue. I don't know if Steam is causing it... I'd bet that if you record with OBS Studio and force hardware encoding, the resulting video file would be fine. Perhaps you could try updating your drivers? Or even rolling back a version or two.

 

10 hours ago, Lildirt said:

Well, Steam DOES support using a specific GPU. But I was moreso asking if it was possible to utilize the nVidia GPU for the gamestream only (which is possible in Steam's settings; it can leave the iGPU alone, likewise for the AMD GPU) and only have the CrossFired AMD GPUs for the game (how can I force the AMD GPUs on the games specifically)

The fact that Steam shows multiple options no matter which GPUs are installed makes it seem like you could use an Nvidia GPU to do the encoding while rendering the game with the AMD GPUs. I believe you can switch which GPU does the rendering in a game's graphics options; not every game offers it, but some let you choose which graphics adapter to use. Here are the video options from the game Space Engineers; the first option is what I'm referring to.
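That "choose which graphics adapter" setting boils down to simple selection logic over the adapter list the OS reports. Here's a pure-Python sketch; real games enumerate adapters through DXGI/Direct3D, so the hard-coded list below is just a hypothetical stand-in for that enumeration:

```python
# Hypothetical adapter list, standing in for what DXGI enumeration would return
# on the system described in this thread:
adapters = [
    "AMD Radeon HD 7950",      # render GPU #1 (CrossFire)
    "AMD Radeon HD 7950",      # render GPU #2
    "NVIDIA GeForce GT 520",   # would-be dedicated encode GPU
    "Intel HD Graphics 4600",  # iGPU
]

def pick(adapters, vendor):
    """Return the index of the first adapter whose name mentions `vendor`."""
    for i, name in enumerate(adapters):
        if vendor.lower() in name.lower():
            return i
    return None

render_idx = pick(adapters, "AMD")     # game renders here
encode_idx = pick(adapters, "NVIDIA")  # stream encodes here

print(f"render on adapter {render_idx}, encode on adapter {encode_idx}")
```

The key point: the game and the streaming encoder each get handed an adapter index independently, which is why the two jobs can land on different cards at all.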

 

10 hours ago, Lildirt said:

Would an i7 on an LGA1150 socket REALLY do anything?

I don't think so. An i5-4670k versus an i7-4790k would yield fairly similar results (both the i5 and i7 would share the same iGPU .. or relatively the same, anyway, yes?).

Yes, the 4670k and the 4790k share the same iGPU. But the extra threads on the 4790k could make software encoding viable instead of hardware encoding; that is, the CPU cores would do all the work, not the graphics cards or iGPU.

 

Speaking of software encoding, how does it work for you? Should be terrible, in theory, but you never know...

 

10 hours ago, Lildirt said:

I don't want to basically upgrade to a Skylake system to pull this off, since I'd be buying a new system entirely at that point (the 7950s would be the bottleneck then). This also introduces a likely bottleneck if I want to upgrade to 4K streaming at some point.

Your 7950s are pretty capable. They do fine even at 4k (maybe 40 FPS average). Crossfired, of course. A single 7950 gets like 20 FPS.
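On the 4K question, the scaling math is straightforward (assuming encode and render cost scale roughly linearly with pixel rate, which is a simplification):

```python
# How much heavier 4K is than 1080p, for both rendering and encoding.
res_1080p = 1920 * 1080   # 2,073,600 px per frame
res_4k = 3840 * 2160      # 8,294,400 px per frame

scale = res_4k / res_1080p            # 4x the pixels per frame
px_per_sec_4k60 = res_4k * 60         # pixel rate the encoder must sustain at 4K60

print(f"4K has {scale:.0f}x the pixels of 1080p")
```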

Additionally, you want your GPU to be your bottleneck. It should always be pinned at 100%, which is fine if you have good cooling.

Upgrading to Skylake would be awesome, but it would cost a good amount. Maybe 500 USD for CPU + mobo + DDR4 RAM.

 

10 hours ago, Lildirt said:

[ Regarding the GTX 1080 ]

 

that's more than I wanted to throw into this. I only meant like $100 for a mid-end GPU that could pull off the encoding job. That was it. Not a card that will likely be something like $1K.

 

This is what I'd do. A 1080 is a great investment. Dual 7950s are OK, but they're ageing, and are limited to 3 GB of VRAM, if I recall correctly.

The GTX 1080 will go for 600 USD at launch. Aftermarket coolers should be on the market by the middle of June.

The 1070 is 380 USD, but I doubt it will be capable of 4k on its own (A GTX 1080 might be)

 

Quote

Maybe I missed out somewhere, I'm not sure. It doesn't actually explain how I could choose which manufacturer to go with for a specific game session ("simple; choose who you want to go with and run your game" -Linus).

He does explain :P

Around 4:15


@HPWebcamAble You've been a big help. I'm basically a giant idiot, haha. Thanks for holding my hand. It has helped me think about my setup a lot. I'm going to admit I actually was pulling an all-nighter when I posted this (I was really tired). I just woke back up from a 6-hour sleep session and basically all of this made sense to me when I did. Sorry about wasting your time here, but you've definitely helped tired-me out. ;)

5 hours ago, HPWebcamAble said:

This is a major issue. I don't know if steam is causing this... I'd bet that if you record with OBS Studio and force hardware encoding, the resulting video file would be fine. Perhaps you could try updating your drivers? Or even rolling back a version or two.

Yeah. I've found basically nothing on this. I'm probably going to ask around on the Steam forum or IRC channel. I need some information. I think this is what I was originally trying to answer when I posted this thread, actually. Just the problem with the AMD cards, not if I could circumvent using them.

5 hours ago, HPWebcamAble said:

The fact the steam has multiple options, no matter what GPUs are installed, makes it seem like you could use an Nvidia GPU to do the encoding while rendering the game with the AMD GPUs. I believe you can switch which GPUs do the rendering in the game graphics options; it's not always on every game, but some let you choose which graphics adapter to use. Here's the video options from the game Space Engineers; the first option is what I'm referring to

Of course I can do it this way! I can pick the bloody graphics adapter! What am I doing! I claim to know what the hell I'm doing with a computer, good lord.

5 hours ago, HPWebcamAble said:

Yes the 4670k and the 4790k share the same iGPU. But the extra threads on the 4790k make software encoding viable, instead of hardware encoding. That is, the CPU cores would do all the work, not the graphics cards or iGPU.

 

Speaking of software encoding, how does it work for you? Should be terrible, in theory, but you never know...

I tried software encoding before. No, not a chance; even the iGPU buried it. It was absolutely terrible. I didn't realize you were hinting at using software encoding (again, my reply came two hours before I went to bed off an all-nighter), but I'm going to avoid that approach at all costs (last resort only).

 

Fun fact: I apparently got brave while I was that tired. I managed to get a 4.9GHz stable overclock on the i5-4670k. I don't know how I thought that was going to help me. ;) I apparently have the option to overclock the iGPU as well.. maybe that's worth looking into too? I'll try that and, if it fails me, I'll just pick up an nVidia GPU in my next hardware shipment.

 

5 hours ago, HPWebcamAble said:

Your 7950s are pretty capable. They do fine even at 4k (maybe 40 FPS average). Crossfired, of course. A single 7950 gets like 20 FPS.

Additionally, you want your GPU to be your bottleneck. It should always be pinned at 100%, which is fine if you have good cooling.

Upgrading to skylake would be awesome, but it would cost a good amount. Maybe 500 USD for CPU + MOBO + DDR4 RAM

The 7950s are VERY capable. They're within a couple percent of a GTX 980 when CrossFired (well, sort of; my point is that they're not a huge issue right now). They are TOTALLY not the problem here. I actually checked while using hardware encoding on the most demanding title I own (GTA V on maximum settings, sparing a couple of options for VRAM concerns since these are 3GB cards), and found that a single HD 7950 wasn't even going over 50% load (according to OpenHardwareMonitor) when handling the game AND the gamestream. I think it peaked at .. 65%(?), and only for a few seconds. Not long enough for me to actually see it happen (OHW shows the maximum value of each stat).

5 hours ago, HPWebcamAble said:

This is what I'd do. A 1080 is a great investment. Dual 7950s are OK, but they're ageing, and are limited to 3 GB of VRAM, if I recall correctly.

The GTX 1080 will go for 600 USD at launch. Aftermarket coolers should be on the market by the middle of June.

The 1070 is 380 USD, but I doubt it will be capable of 4k on its own (A GTX 1080 might be)

I'm intentionally holding off on this. I'm planning to upgrade this puny system soon (within a year or two), but I'm unsure which route I want to take. Before I do anything, I'm waiting for AMD to release Zen and maybe some new GPUs. Yes, you were right about the 3GB of VRAM; it's a prominent issue when running GTA V at absolute maximum settings. Some of the things AMD promises seem out there, but they're probably worth waiting for. Plus, Skylake was overhyped. ;) I don't actually play demanding titles like GTA V often; they just don't interest me, so I'll probably skimp by with Old Faithful here a little longer.

5 hours ago, HPWebcamAble said:

He does explain :P

Around 4:15

Or, if I were smart, I would have pieced together how to do this in my peanut-sized brain without having to dig up an ancient video, bah!

 

Thanks again. Honestly I don't know what I was thinking when I made this thread, but you definitely helped me out. :) Also, I never knew exactly how the iGPU worked, so that will definitely help me out. ^^


2 hours ago, Lildirt said:

Yeah. I've found basically nothing on this. I'm probably going to ask around on the Steam forum or IRC channel. I need some information. I think this is what I was originally trying to answer when I posted this thread, actually. Just the problem with the AMD cards, not if I could circumvent using them.

I did do a quick search, but I didn't find anything. And by quick, I mean I looked at the first result from a Google search. It's possible you'll uncover something with more investigation.

 

2 hours ago, Lildirt said:

I actually checked, when using hardware encoding on the most demanding title I owned (GTA V on maximum settings, sparing a couple for VRAM concerns since 3GB cards), and found out that a single HD 7950 wasn't even going over 50% load (according to OpenHardwareMonitor) when handling the game AND the gamestream). I think it peaked at .. 65%(?)

So it sounds like VRAM is the bottleneck :P

You could crossfire 970's and get more performance than a 980 ti, but you'd be limited to 4 GB of VRAM... oh Nvidia.

And then they went and put 12 GB on the Titan X. Maybe the 970 or 980 could have used some of it xD

 

2 hours ago, Lildirt said:

Thanks again. Honestly I don't know what I was thinking when I made this thread, but you definitely helped me out. :)

I hope you figure the whole thing out eventually!

 

I thought of another possible solution: don't use game streaming at all. Either move your PC to the other room, or move yourself to the PC! Food for thought, perhaps ;)

 


 

29 minutes ago, HPWebcamAble said:

So it sounds like VRAM is the bottleneck :P

You could crossfire 970's and get more performance than a 980 ti, but you'd be limited to 4 GB of VRAM... oh Nvidia.

And then they went and put 12 GB on the Titan X. Maybe the 970 or 980 could have used some of it xD

Surprisingly, VRAM is not the bottleneck. I'm closely monitoring resource usage, and it is definitely the CPU over everything else. Even when using the nVidia card for encoding the video, I still see CPU usage pinned at 100%. Another annoying thing: the nVidia card (a GT 520 that I found!) is so old that I can't get much information on it while it's running (from OHW, anyway). I might have to find some cheap single-slot GTX-series card (since that's all the space I have free, and I have nowhere to mount it with a riser .. besides tape ;)) just so it can pretend to be relevant. I can't get this CPU past 4.7GHz without crashing and burning ("A clock interrupt was not received on a secondary processor" BSOD), and if I pushed the voltage any higher, I'd be competing with the sun for heat output. So this CPU is basically going as far as it can possibly go at this point.

 

Also, CrossFire 970's? ;) It seems like I really am doomed with this, but I'll keep at it. Maybe I'll hit something.

 

29 minutes ago, HPWebcamAble said:

I hope you figure the whole thing out eventually!

 

I thought of another possible solution: don't use game streaming at all. Either move your PC to the other room, or move yourself to the PC! Food for thought, perhaps ;)

 

Yeah, I had thought about avoiding gamestreaming entirely. However, I'm really not sure how to do it. I want to be able to use the same mouse, keyboard, and everything else that I have on my main system. No keyboard I have around here (basically a pile of rubber-dome keyboards and a couple of Romer-G keyboards) comes close to the beauty that is my IBM Model M (all the way from the early 90s ;)). I love the convenience of being able to alt-tab on my main machine and do something else entirely without actually having to use a Windows platform. As I mentioned before, I play my games on a Linux client; I use Linux for everything except games.

 

The system is literally within eyeshot; it's maybe six feet away from me. I just don't like the idea of having to pay for two bloody setups just to play games and still be able to do everything else I do with a computer. Also, I have an UpDesk for my main rig (so I can stand or sit whenever I want, duh), which doesn't have enough room for a second setup. I'd also understand the suggestion of something like a KVM, or just switching monitor inputs. I considered this too, but it isn't viable. Again, I like being able to just alt-tab back to whatever I was doing. It would also require me to merge the audio I/O of both machines (so I can hear my TS3 music bot and the game at the same time, or participate in a call and the game at the same time), which I'm not sure I can do (my headphones are connected to a USB amp). Even if I sat down and figured ALL of this out, I would likely be limited by my Model M (and I cannot believe I'm saying this). The fact of the matter is that my Model M requires a PS/2 keyboard connection, which often freaks the hell out if it's hot-swapped (or requires a restart .. sometimes?). I could probably find a KVM that supports DVI-D for my monitor, a PS/2 keyboard, and a couple of USB devices, but it still isn't quite like running everything on one normal system.

 

I don't always play my games actively. I'm a big MMO player, so I will often sit around and just grind levels while working on a programming project or something. Having to flip a switch every twenty seconds or so just to be able to .. I dunno .. craft anything .. is kind of stupid to me.

 

One other option I considered was something like Synergy (here's a link of basically all of my options for accomplishing this specific goal; if it's open-source and I can build it myself, it's an option; yes, I care about FOSS ;)), which I recall talking to Wendel (from TekSyndicate) about during one of his livestreams. I think, based on my knowledge of the application, I could turn one of my monitors into the monitor for the Windows machine and create a virtual monitor on the Linux machine to access the Windows machine (mouse inputs over LAN, yikes). I've got a couple more options, but Steam's in-home streaming REALLY makes it easy for me to do. I'm lazy, basically. ;)

 

 .. but using the Synergy-like approach presents more issues. Then I also realized that I'd need to somehow make it so I could play games on my main monitor instead of one of the weird ones off to the right or left that are just kind of there (and not meant to be focused on forever; random note, I have three monitors).

Edited by Lildirt
BRAIN BLAST .. which got shot down.
