
Twitch, OBS and NVIDIA to Release Multi-Encode Livestreaming

Dellenn

Summary

Twitch, OBS, and NVIDIA are providing an answer to the problem of only some channels getting transcoding. Using modern GTX and RTX GPUs, streamers will be able to broadcast up to three resolutions simultaneously at up to 1080p. Beta testers will also experiment with higher bitrates, up to five concurrent streams, and AV1 encoding.

 

Quotes

Quote

Twitch — the interactive livestreaming platform — provides server-side transcoding for top-performing channels, meaning it will create different versions of the same stream for different bandwidth levels, improving the viewing experience. But the audience of many channels are left with a single stream option.
 

Twitch, OBS and NVIDIA have collaborated on a new feature to address this — Twitch Enhanced Broadcasting, releasing in beta later this month. Using the high-quality dedicated encoder (NVENC) in modern GeForce RTX and GTX GPUs, streamers will be able to broadcast up to three resolutions simultaneously at up to 1080p.

 

My thoughts

FINALLY! Twitch is getting on the bandwagon of higher bitrate streaming and newer codecs. I think this will be great for content creators, since their viewers will always get the right stream/resolution. My worry is what the upload bandwidth is going to look like if a user is streaming three (or more) resolutions at the same time. Data caps be damned?
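As a rough back-of-the-envelope (the renditions and bitrates below are just guesses for illustration, not anything Twitch or NVIDIA have announced):

```python
# Rough upload estimate for a hypothetical three-rendition ladder.
# All bitrates here are assumptions, not official Twitch figures.
renditions_kbps = {
    "1080p60": 6000,  # current non-partner cap on Twitch
    "720p60": 3000,
    "480p30": 1500,
}
audio_kbps = 160  # assuming one audio track per rendition

total_kbps = sum(v + audio_kbps for v in renditions_kbps.values())
print(f"Total upload: ~{total_kbps / 1000:.1f} Mbps")           # ~11.0 Mbps
print(f"Data per hour: ~{total_kbps * 3600 / 8 / 1e6:.1f} GB")  # ~4.9 GB
```

That is a lot to ask of a typical home upload, and it adds up fast against a data cap.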

 

Sources

 https://blogs.nvidia.com/blog/twitch-multiencode-av1-livestreaming/


I'm sure AV1 was on Twitch's radar for a long time, and despite that they still decided to shut down operations in South Korea. Must be pretty wild over there for telecoms.


This made me wonder if letting people run three streams at once would actually be more economical than simply scaling out their existing transcoding capability, but then I remembered that something like 99.9% of all views on Twitch are generated by less than 1% of their creators. So I question how much of an actual effect this will have in the grand scheme of things.

 

Still cool though, and I am sure some people will benefit from it. 

 

 

5 hours ago, Dellenn said:

Twitch is getting on the bandwagon of higher bitrate streaming and newer codecs.

Doesn't sound to me like they are raising the maximum bit rate, and I wouldn't really call AV1 adoption a bandwagon. Twitch has been one of the big partners from the beginning and they are essentially the second company to roll this out. 

I don't think they are jumping on any bandwagon right now. If anything, they are the ones starting the bandwagon. 


Since YouTube is probably Twitch's biggest competitor at the moment, and YT already uses AV1, it was only a matter of time until Twitch also integrated AV1 into their service. This implementation should ensure that every viewer can get the best possible experience depending on what their hardware supports.



Well, this is great. I get punished for buying a GPU (7800 XT) when I wanted to (September), when Nvidia's GPU options were just crap, they weren't doing anything to improve the encoding situation with regards to AV1 on Twitch or multiple encode and transcode streams, and the 4070 was more than £100 more expensive and came with no game. Now they improve everything, correct their greedy lineup, and finally realise that the economy they were introducing a 12GB $800 4070 Ti and a $600 4070 into wouldn't give them very good sales.

 

Greedy bastards.



4 hours ago, LAwLz said:

This made me wonder if letting people run three streams at once would actually be more economical than simply scaling out their existing transcoding capability, but then I remembered that something like 99.9% of all views on Twitch are generated by less than 1% of their creators. So I question how much of an actual effect this will have in the grand scheme of things.

 

Still cool though, and I am sure some people will benefit from it. 

 

 

Doesn't sound to me like they are raising the maximum bit rate, and I wouldn't really call AV1 adoption a bandwagon. Twitch has been one of the big partners from the beginning and they are essentially the second company to roll this out. 

I don't think they are jumping on any bandwagon right now. If anything, they are the ones starting the bandwagon. 

The problem with Twitch (and indeed someone in their "suggestion" forums was doing this) is that some people don't think about the consequences of the settings that are rolled out by default. Many streamers just "log in with Twitch" and don't even touch the encoding settings.

 

Now, what I want, though I don't know if this is how it will work, is to stream AV1 at 1080p but record to disk at 4K. Existing tests with the H.264/H.265 encoders say this is impossible: the H.264 encoder at 1080p consumes 30% of the encoder capacity, H.265 consumes more, and trying to record at 4K uses 80% of the encode capacity. So trying to do both always results in "encoder overloaded" and the stream being dropped.
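A quick sanity check of that budget, using the rough utilisation figures above (observed estimates, not anything NVIDIA publishes):

```python
# Toy encoder-capacity budget using the rough utilisation figures above.
# The percentages are observed estimates, not published NVENC limits.
jobs = {
    "1080p H.264 stream": 0.30,  # ~30% of the encode block
    "4K recording": 0.80,        # ~80% of the encode block
}

total = sum(jobs.values())
print(f"Total encoder load: {total:.0%}")  # 110%
if total > 1.0:
    print("Over budget: expect 'encoder overloaded' and dropped output.")
```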

 

This problem has to be laid at Nvidia's feet first, because their hardware encoder isn't built for live streaming, it's built for disk transcoding. There isn't a way to send one 4K stream to it and get two separate resolutions (4K and 1080p) or two separate bitrates (e.g. 12 Mbit and 6 Mbit) out of a single encode. At least, not how things have been done to date.

 

And you know why that is?

 

https://docs.nvidia.com/video-technologies/video-codec-sdk/12.1/nvenc-application-note/index.html

Quote

[Screenshot of a feature table from the linked NVENC application note]

That feature was introduced only now.

 

Hence (from the linked article in the OP):

Quote

To simplify set up, Enhanced Broadcasting will automatically configure all OBS encoder settings, including resolution, bit rate and encoding parameters. A server-side algorithm will return the best possible configuration for OBS Studio based on the streamer’s setup, taking the headaches out of tuning settings for the best viewer experiences.

That should hopefully solve a few problems I've seen where Twitch streamers stream 1080p and 720p at something like 2 Mbit, and the video looks like total garbage the second the player moves.

 

That said, I'd probably wait for the 50 series before jumping into AV1 encoding. Perhaps Nvidia will expand the encoder capacity, or maybe they won't, but the GPUs people have now (e.g. 20 and 30 series) won't be able to do the AV1 encoding, and the encoder is only good for 2x 1080p or a single 4K stream on the 30 series.


5 hours ago, Albal_156 said:

Well, this is great. I get punished for buying a GPU (7800 XT) when I wanted to (September), when Nvidia's GPU options were just crap, they weren't doing anything to improve the encoding situation with regards to AV1 on Twitch or multiple encode and transcode streams, and the 4070 was more than £100 more expensive and came with no game. Now they improve everything, correct their greedy lineup, and finally realise that the economy they were introducing a 12GB $800 4070 Ti and a $600 4070 into wouldn't give them very good sales.

 

Greedy bastards.

1) You aren't being "punished" just because someone else gets an improvement and you don't. Stop thinking of yourself as a victim and unfairly treated just because the other brand releases something new. It's a terrible mentality to have which just breeds annoying fanboys. 

 

2) Not sure why you're saying Nvidia wasn't doing anything to improve their encoder. If anything, Nvidia is the company who has been improving things the most. It's AMD that has consistently been quite far behind when it comes to encoding video. So I'm not sure why you went with AMD if this was important to you. But hopefully this will come to AMD too. Probably will require some more manual tuning though. 

 

3) So you think they were "greedy" when they didn't have these new cards and features, but now that they do you still call them "greedy bastards"? 

What do you want them to do exactly to make you happy? 


10 hours ago, LAwLz said:

This made me wonder if letting people run three streams at once would actually be more economical than simply scaling out their existing transcoding capability, but then I remembered that something like 99.9% of all views on Twitch are generated by less than 1% of their creators. So I question how much of an actual effect this will have in the grand scheme of things.

I recall before getting to affiliate I basically didn't have transcoding ever. I had a friend-of-a-friend who couldn't watch my streams because the Western bitrates were too much for their sweet potato internet. Recently I've settled on 4000k (Twitch limit 6000k) at 1080p30 since I mostly stream slower content. With multi-streams, I could do perhaps 1080p60 at 6000k and 720p30/60? at, I dunno, 2000k? I don't know what the sweet spot bandwidths are for each resolution.

 

6 hours ago, Kisai said:

but the GPUs people have now (e.g. 20 and 30 series) won't be able to do the AV1 encoding, and the encoder is only good for 2x 1080p or a single 4K stream on the 30 series.

In the initial implementation of sending multiple lower streams it should be ok then?



2 hours ago, porina said:

I recall before getting to affiliate I basically didn't have transcoding ever. I had a friend-of-a-friend who couldn't watch my streams because the Western bitrates were too much for their sweet potato internet. Recently I've settled on 4000k (Twitch limit 6000k) at 1080p30 since I mostly stream slower content. With multi-streams, I could do perhaps 1080p60 at 6000k and 720p30/60? at, I dunno, 2000k? I don't know what the sweet spot bandwidths are for each resolution.

You should always stream at 1080p60 6 Mbit for:

- ALL FPS games (because of the camera panning), and games that are primarily first-person (e.g. Minecraft)

- All TPS shooters (Fortnite) and survival-horror type games (Dead by Daylight, IDV, Lethal Company, Phasmophobia, etc.)

 

Failure to do so leads to a blurry, unwatchable stream. You can knock this down to 30 fps, or to 720p60 in most cases, but you have to be careful about streaming FPS games at a low framerate because it can induce motion sickness in the viewer too. Hence, if your content is exclusively FPS, it's better to stream at 720p60 6 Mbit than 1080p60.

 

Now, what are the other options? If I take any random stream I'm watching right now and pull up the transcode list:

Streamer: 1080p60 - 6 Mbit

Transcoded: 720p60 - 3 Mbit, 480p30 - 1 Mbit, 360p30 - 600 kbps, 160p30 - 180 kbps

 

So altogether that's about 12 Mbit. Keep in mind the audio stream remains the same. So if an Nvidia GPU has three encode units, that is 1080p60 to Twitch, 720p to stream, and 1080p to disk, assuming you have the capacity for it.
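Adding that ladder up (video only, numbers straight from the transcode list above):

```python
# Sum of the example transcode ladder above (video tracks only).
ladder_kbps = {
    "1080p60 source": 6000,
    "720p60": 3000,
    "480p30": 1000,
    "360p30": 600,
    "160p30": 180,
}
total = sum(ladder_kbps.values())
print(f"Video total: {total} kbps (~{total / 1000:.1f} Mbit)")  # 10780 kbps, ~10.8 Mbit
# Add an audio track per rendition (say ~160 kbps each) and you land in the
# 11-12 Mbit range, which is where the "about 12 Mbit" figure comes from.
```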

 

I feel that 480p/360p/160p encodes are pointless. Yes, maybe they benefit someone on a low-bandwidth device on a low-bandwidth network (e.g. India, China, Russia, rural Australia/Canada/USA), but in most situations people on those networks and systems aren't going to enjoy a low-bitrate encode in the first place.

 

I usually stick a stream in 480p when I'm watching it while doing video work on the main monitor, to avoid Twitch dropping the stream if it decides to push an ad. For a site that is "funded by ads", it sure seems to have a hard time showing them to people who aren't running ad blockers.

 

2 hours ago, porina said:

In the initial implementation of sending multiple lower streams it should be ok then?

I think on a 30- or 20-series device, it should be possible to send a 1080p60 6 Mbit H.265 encode and also send a 720p H.265 at 3 Mbit while still recording that 6 Mbit stream to disk. On a 40-series device it might be possible to send AV1. I don't think we're going to be able to send 4K to Twitch unless they up the bitrate to at least 16 Mbit and have it able to scale down to 8 Mbit. But here's the thing, and I'm sure there are a lot of people who would agree: most "streaming" content doesn't gain much at 4K. Only pre-recorded material and certain animated material benefits from 4K. The only reason you'd want to stream at 4K in the first place is if you have an actual studio production setup that works only in 4K.
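For what it's worth, here's a minimal sketch of that first scenario with plain ffmpeg driven from Python, assuming an NVENC-enabled ffmpeg build; "gameplay.mp4", the output names, and the exact rate-control flags are placeholders, and a real streaming setup (OBS, an RTMP ingest) would obviously look different:

```python
import subprocess

# Sketch only: one decoded source, two HEVC NVENC outputs (1080p60 @ 6 Mbit and
# 720p60 @ 3 Mbit), mirroring the "two renditions from one source" idea above.
# In practice one copy would go to the ingest and/or be recorded to disk.
cmd = [
    "ffmpeg", "-y", "-i", "gameplay.mp4",
    "-filter_complex", "[0:v]split=2[full][half];[half]scale=-2:720[v720]",
    # 1080p60 HEVC at 6 Mbit
    "-map", "[full]", "-c:v", "hevc_nvenc", "-b:v", "6M", "-maxrate", "6M",
    "-bufsize", "12M", "-r", "60", "-map", "0:a", "-c:a", "aac", "-b:a", "160k",
    "out_1080p60.mkv",
    # 720p60 HEVC at 3 Mbit
    "-map", "[v720]", "-c:v", "hevc_nvenc", "-b:v", "3M", "-maxrate", "3M",
    "-bufsize", "6M", "-r", "60", "-map", "0:a", "-c:a", "aac", "-b:a", "160k",
    "out_720p60.mkv",
]
subprocess.run(cmd, check=True)
```

Whether a 20/30-series encoder keeps up with both of those in real time is exactly the open question.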

 

Let me show you something

https://www.blackmagicdesign.com/ca/products/decklink/techspecs/W-DLK-25

This is a device that you'd have in a studio environment 

https://www.blackmagicdesign.com/ca/products/atemtelevisionstudio

Portable broadcast studio basically

 

This stuff basically has to work natively in 4K to do anything useful. But take note of the "USB Webcam" feature.

 

This is a smaller device that only deals with HDMI 1080p:

https://www.blackmagicdesign.com/ca/products/atemmini/techspecs/W-APS-14

[Gaming setup diagram from the ATEM Mini page]

 

This device can basically do what a GPU encoder does without being a GPU, and it costs less. Useful in a situation where you are capturing from game consoles and cameras but don't want a PC involved.

Quote

ATEM Mini Pro also includes direct recording to USB flash disks in H.264 and direct streaming via Ethernet to YouTube Live and more.

 

But until AV1 is in these devices, they are more useful for capturing the raw input for later editing than for streaming. Which brings us back to needing the AV1 encoder in a desktop GPU, as no external hardware currently does it. Then again, if you can do all this on a $500 non-PC device, why would you want to spend more money on a dGPU that only gets used for its encoder?

 

Anyway, my point here is that, potentially, other kinds of hardware will come out that won't necessitate an Nvidia GPU. So streamers might acquire one of these devices the way they presently acquire XLR mixers, for no other reason than being able to control the audio with hardware and not leave it up to the PC.

 

I think at this point we need to see Twitch and YouTube support this kind of functionality: having the encode side send multiple video streams and push different encodes down different paths.

 

Side note: YouTube's transcoding pipeline is actually pretty terrible, adding tens of seconds to the stream delay, whereas on Twitch the video is always 2-4 seconds behind.


47 minutes ago, Kisai said:

So if an Nvidia GPU has three encode units, that is 1080p60 to Twitch, 720p to stream, and 1080p to disk, assuming you have the capacity for it.

Most consumer-tier NV GPUs have one encode unit. Recently the 4070 and above got two units, although there are other exceptions further back in the lineup. The concurrent encode limit for GeForce devices was increased to 5 per system last year regardless of the number of units, so you can still send 3 streams while saving a different one locally if you want.

 

What I haven't found yet is a good description of how much you can push through one encoder in real time.
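One crude way to get a feel for it, assuming an NVENC-enabled ffmpeg build and some local test clip (the file name and settings below are placeholders): encode as fast as possible and read back the speed multiplier ffmpeg reports.

```python
import re
import subprocess

# Crude throughput probe: encode a local 1080p60 test clip flat out and read
# ffmpeg's reported "speed=" multiplier from its progress output on stderr.
cmd = [
    "ffmpeg", "-hide_banner", "-i", "test_1080p60.mp4",
    "-c:v", "h264_nvenc", "-b:v", "6M", "-f", "null", "-",
]
result = subprocess.run(cmd, capture_output=True, text=True)

speeds = re.findall(r"speed=\s*([\d.]+)x", result.stderr)
if speeds:
    speed = float(speeds[-1])
    print(f"Encode speed: {speed:.1f}x real time, so roughly "
          f"{int(speed)} concurrent streams at these settings")
```

It only tells you about one codec at one resolution and bitrate, but it at least puts a number on a specific setup.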

 

47 minutes ago, Kisai said:

But here's the thing, and I'm sure there are a lot of people who would agree: most "streaming" content doesn't gain much at 4K.

Agree on this point. A bigger limit is how many would consume streams at 4K if offered. Probably a very small number. Maybe art streams could benefit most? For typical game content good quality 1080p is enough, maybe 1440p as a bonus, but it is more bitrate that would help most. Alternatively, a better codec at the same bitrate.



5 hours ago, LAwLz said:

1) You aren't being "punished" just because someone else gets an improvement and you don't. Stop thinking of yourself as a victim and unfairly treated just because the other brand releases something new. It's a terrible mentality to have which just breeds annoying fanboys. 

 

2) Not sure why you're saying Nvidia wasn't doing anything to improve their encoder. If anything, Nvidia is the company who has been improving things the most. It's AMD that has consistently been quite far behind when it comes to encoding video. So I'm not sure why you went with AMD if this was important to you. But hopefully this will come to AMD too. Probably will require some more manual tuning though. 

 

3) So you think they were "greedy" when they didn't have these new cards and features, but now that they do you still call them "greedy bastards"? 

What do you want them to do exactly to make you happy? 

Agree with 1 & 3

 

2) AMD, NVIDIA, and Intel have all been massively improving their encoding capabilities.

 

People shit on AMD because their H.264 encoder sucks, and it's not viable to use anything else for streaming on Twitch due to licensing reasons.

AMD however runs circles around NVIDIA and Intel with their HEVC encoder. There is a reason why everyone used AMD cards for cloud streaming. 

The quality of encode is on par with others but the speed difference is massive. You can also do way more encodes at the same time with AMD. 

 

And for AV1

 

What I find most funny though is that NVIDIA was "the best x264 for streaming" while Intel could actually do even better quality with Quick Sync on their iGPUs. 


1 hour ago, porina said:

 

Agree on this point. A bigger limit is how many would consume streams at 4K if offered. Probably a very small number. Maybe art streams could benefit most? For typical game content good quality 1080p is enough, maybe 1440p as a bonus, but it is more bitrate that would help most. Alternatively, a better codec at the same bitrate.

I think most people will move to 120 fps streaming before they do 4K.

 

https://m.twitch.tv/videos/637388605

 

Remember this 1440p 120 fps AV1 demo?


16 minutes ago, WereCat said:

I think most people will move to 120 fps streaming before they do 4K.

Given a free option I'd agree, since hitting 1080p120 is much easier than 4K60 in terms of GPU cost.

 

While it can help as a player, I'm not sure it adds as much value for a viewer. So in that area, if you offered me 1080p120 or 1440p60, I'd take the resolution.

 

16 minutes ago, WereCat said:

Remember this 1440p 120 fps AV1 demo?

Nope. Interestingly, Chrome never finishes loading when I try to view it (maybe a conflict with one of many plugins). Edge displays a message from the site saying it is unsupported. Firefox did play it.



Nvidia is killing the GPU market right now, especially with the new price drops. I think this really goes back to the Hardware Unboxed video where some users chimed in, basically pointing out that if you want all the user features for both gaming and work-related tasks, Nvidia is the only way to go, and it appears that way.

 

 

As much as I like Nvidia, I was hoping AMD would be able to use Nvidia's greed to get ahead, but it appears that's not the case, as their new cards didn't perform well enough to make a dent.


33 minutes ago, WereCat said:

AMD however runs circles around NVIDIA and Intel with their HEVC encoder. There is a reason why everyone used AMD cards for cloud streaming. 

Do you have a source on this? Last time I checked, AMD was still behind in terms of quality, even with HEVC and AV1. I've seen tests where RDNA3's hardware encoder ends up 10-15 VMAF points behind Nvidia and Intel using HEVC and AV1. 

 

I haven't been able to find much reliable info about which GPUs are used for "cloud streaming". I'd like to read about it if you've got some links. But please note that if we're talking about real-time encoding, then cloud providers are usually more focused on speed than quality. With something like Twitch you should prioritize quality, since you've got discrete hardware to handle it for you.


19 minutes ago, LAwLz said:

Do you have a source on this? Last time I checked, AMD was still behind in terms of quality, even with HEVC and AV1. I've seen tests where RDNA3's hardware encoder ends up 10-15 VMAF points behind Nvidia and Intel using HEVC and AV1. 

 

I haven't been able to find much reliable info about which GPUs are used for "cloud streaming". I'd like to read about it if you've got some links. But please note that if we're talking about real-time encoding, then cloud providers are usually more focused on speed than quality. With something like Twitch you should prioritize quality, since you've got discrete hardware to handle it for you.

The source is mostly EposVox. Which exact video it was I don't know, but it was during the 4000-series launch. I think he did some comparisons even in the 4090 review.

I remember the RX 6600 beating the 4090 in speed with HEVC, but I'm not sure if he tested VMAF in that video or if it was in a different one released around the same time.


31 minutes ago, Fasterthannothing said:

As much as I like Nvidia, I was hoping AMD would be able to use Nvidia's greed to get ahead, but it appears that's not the case, as their new cards didn't perform well enough to make a dent.

AMD are optimising for profit, not growth. Does that count as greed? Growth is difficult to obtain. Slashing prices only gets you so far, and only Intel is taking that route for now, as they really need growth to set up for the future. I guess AMD doesn't feel like they need or want to do that.

 

NVIDIA GeForce RTX 4060: 1.07%
NVIDIA GeForce RTX 4060 Ti: 1.1%
NVIDIA GeForce RTX 4070: 1.58%
NVIDIA GeForce RTX 4070 Ti: 1.16%
NVIDIA GeForce RTX 4080: 0.73%
NVIDIA GeForce RTX 4090: 0.9%
AMD Radeon RX 7900 XTX: 0.32%

Since I grabbed the numbers a moment ago: the above % share for current-gen desktop GPUs is taken from the Steam Hardware Survey, December 2023. The 7900 XTX is the only AMD offering making the reporting threshold of 0.15%; every NV model smashes that. I know SHS isn't perfect, but it is the best we have for a view of what's used in PC gaming.



3 hours ago, WereCat said:

The source is mostly EposVox. Which exact video it was I don't know, but it was during the 4000-series launch. I think he did some comparisons even in the 4090 review.

I remember the RX 6600 beating the 4090 in speed with HEVC, but I'm not sure if he tested VMAF in that video or if it was in a different one released around the same time.

I think I watched the same video or a similar one. AMD does very well at higher bitrates; at Twitch's restricted non-partner 6 Mbps, Nvidia was better quality-wise.

 

I'd be happy with both personally, and to be honest, if people were happy with Pascal at the time then they should be happy with AMD now too, since it's as good as that or possibly better (it depends, blah blah, etc.).

 

The only issue I have is stability and support; otherwise I'd not guide people towards Nvidia if they really, like actually, cared about streaming (offline encoding less so, because it's not really a problem). It's almost, almosttt..., to the point where it doesn't matter, but not yet. Minor quality differences simply aren't as important as "this will ALWAYS work".


4 hours ago, WereCat said:

What I find most funny though is that NVIDIA was "the best x264 for streaming" while Intel could actually do even better quality with Quick Sync on their iGPUs. 

Not only that, but Intel iGPUs support every bitrate, bit depth, colour profile, codec option, etc. If you want every box ticked in a comparison table, the only one achieving that is an Intel iGPU.


42 minutes ago, leadeater said:

Not only that, but Intel iGPUs support every bitrate, bit depth, colour profile, codec option, etc. If you want every box ticked in a comparison table, the only one achieving that is an Intel iGPU.

The problem is the iGPU resorts to CPU code paths when it exceeds its capacity. Trust me, I tried the iGPU on Haswell and on Rocket Lake CPUs. You can get away with the iGPU for things like Zoom/Teams/etc., but as a Twitch/YouTube encoder it's kinda risky, since if something min-maxes the CPU, the encode will choke.

 

[Charts from the linked Tom's Hardware article: 1080p encoding performance and VMAF quality]

 

Note the CPU encode rate is better than the iGPU for HEVC. The AMD parts are also way behind in performance and VMAF scores.

 

[Charts from the same article: AV1 encoding performance and quality]

 

Interesting to see that the AV1 performance is about the same for the ARC, RX 7900 XTX and RTX 4090, at least compared to the CPU encode.

 

https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested

 

If we assume these are all under ideal circumstances, then it should be possible to have 7 concurrent 1080p60 streams (or maybe 3 if they are 120 fps or being saved to disk).
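That kind of estimate is just benchmark throughput divided by the target frame rate; the ~420 fps figure below is an assumed reading for illustration, not a number from the article:

```python
# Concurrent real-time streams ~= benchmark throughput / target frame rate.
# benchmark_fps is an assumed 1080p benchmark reading, purely for illustration.
benchmark_fps = 420

for target_fps in (60, 120):
    concurrent = benchmark_fps // target_fps
    print(f"{target_fps} fps streams: ~{concurrent} concurrent")
# 60 fps -> ~7 concurrent; 120 fps (or a stream plus a disk copy) -> ~3, as above.
```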

 

The 4K tells a different story though.

[Chart from the same article: 4K H.264 encoding performance]

The iGPU is unusable at 4K, and arguably 4K H.264 encoding is unusable on all but the RX 7900 XTX and RTX 4090. (Remember: 60 fps to stream, 60 fps to disk.)

 

There are additional graphs that basically show H.265 and AV1 performance scaling much better at 4K (e.g. the graph above for 4K H.264 looks the same as the 4K H.265 one, with the iGPU and CPU diving further).

 

That said, I tried to use the Intel iGPU encoder as the disk encoder for streaming, and it was basically unusable unless the ingress was 1080p.

 

You also have to consider how many times a stream goes through the encoder if you use things like NDI (which invokes H.264 hardware encoding and decoding) or have hardware acceleration turned on in the web browser/webview (since OBS will invoke dozens of them for browser overlays). So it is entirely possible to sink the GPU's video unit before it gets to encode anything.

 

At any rate, just from experience, the Intel iGPU encoder is a PITA to get working to begin with, and is too affected by CPU loads to be viable for streaming. It would not surprise me if the AMD iGPUs and Apple M-series have the same "use the CPU when GPU encode resources aren't available" behaviour.

 


1 hour ago, Kisai said:

The iGPU is unusable at 4K, and arguably 4K H.264 encoding is unusable on all but the RX 7900 XTX and RTX 4090. (Remember: 60 fps to stream, 60 fps to disk.)

Random thoughts:

1, how many streamers save a local copy? I just download the VOD after a stream as my only copy for archive. I don't need any raw footage locally.

2, even if playing at 4K, I'm assuming no one is streaming 4K output. It would seem logical for the resize to happen before encoding, so you would save locally at 4K and stream at 1080p, not encode 4K twice.

3, 4070 and above have the same two encoder units. Does that double the capacity vs 1 unit models (most other NV GPUs) if you have two concurrent streams?

4, presumably the current AMD range may also share similar configurations across multiple models, but I've never looked since I don't have one.



@Kisai

I just really quickly glanced through that RTX 4090 review.

 

The testing is different from what you linked, since it's not clamped by bitrate but by presets, so the results are different because of that.

Would have to compare the final file size/quality to get the complete picture rather than just this graph; I'll have to look through his videos to find the specific HEVC tests I had in mind.

 

[Screenshot of an encoder comparison graph from EposVox's RTX 4090 review]

 

To add more. 

I know NVIDIA optimised their encoder on GTX/RTX cards for gameplay, so HUD elements, text, etc. tend to suffer less compression. IDK how it affects VMAF scores and such.


7 hours ago, WereCat said:

The source is mostly EposVox. Which exact video it was I don't know, but it was during the 4000-series launch. I think he did some comparisons even in the 4090 review.

I remember the RX 6600 beating the 4090 in speed with HEVC, but I'm not sure if he tested VMAF in that video or if it was in a different one released around the same time.

I am trying to find it, but so far I've only been able to find the video "AMD just upgraded your stream quality for FREE!", and the conclusion he reached in that video, 10 months ago, was that AMD was still behind in terms of quality.

 

In the "The Video Encoding Card of my DREAMS - EXCLUSIVE SAMPLES!" EposVox also says things that contradict your claim that cloud providers use AMD hardware for encoding. As he says in the video, Google uses their own hardware and Twitch uses x264 on their Xeon-based servers.

 

The same video also includes some samples that AMD encoded for EposVox for use in comparisons. It might be those that you're thinking of. I have some issues with these comparisons.

1) It was AMD that encoded the video, so we don't know how they did it. I am very skeptical of these and wouldn't really trust them any more than other first-party benchmarks. It's better than nothing, but it's still not as reliable as a proper independent test.

 

2) These clips were encoded using their MA35D card, which I don't think we should assume produces the same result as the encoder in their gaming cards.

It's entirely possible that the MA35D produces better-looking video than what you get out of, for example, an RX 7900 XT.

Edit: Posted too soon. EposVox confirms later in the video that these results do not represent what you'd get from RDNA3 cards. The AV1 output from AMD's RX 7000 series was buggy at the time (it might have been fixed now), so he couldn't test it properly.

 

 

38 minutes ago, WereCat said:

To add more. 

I know NVIDIA optimised their encoder on GTX/RTX cards for gameplay, so HUD elements, text, etc. tend to suffer less compression. IDK how it affects VMAF scores and such.

Do you have a source for this? I don't really see how that would be possible so it'd be interesting to read about it if it's true. 


23 minutes ago, LAwLz said:

 

 

In the "The Video Encoding Card of my DREAMS - EXCLUSIVE SAMPLES!" EposVox also says things that contradict your claim that cloud providers use AMD hardware for encoding. As he says in the video, Google uses their own hardware and Twitch uses x264 on their Xeon-based servers.

 

 

Lol, go to the video from the screenshot I sent, go back 10-15 seconds, and listen to what he says.

 

24 minutes ago, LAwLz said:

 

Do you have a source for this? I don't really see how that would be possible so it'd be interesting to read about it if it's true. 

I remember it being marketed like that by NVIDIA back when they introduced NVENC for game streaming. 

Will have to dig through some very old stuff to back this up.

