Twitch, OBS and NVIDIA to Release Multi-Encode Livestreaming

Dellenn
1 hour ago, LAwLz said:

In the "The Video Encoding Card of my DREAMS - EXCLUSIVE SAMPLES!" EposVox also says things that contradict your claim that cloud providers use AMD hardware for encoding. As he says in the video, Google uses their own hardware and Twitch uses x264 on their Xeon-based servers.

Cloud providers != Twitch 🤷‍♂️

 

Google, sure, but GCP and YouTube aren't really the same thing either; cloud would imply GCP.

 

Anyway, there are two factors for why AMD gets used a lot for those use cases, VDI and shared-GPU game streaming. Licensing: AMD does not charge any license fees at all, in any way, to use their GPUs, while Nvidia does if you want to use virtual GPU features, which you always make use of for VDI and would for lower-end games in a game streaming service. Very, very high-end workstations would have a dedicated GPU in VDI, however you still allocate it through virtual GPU partitioning and licenses.

 

Nvidia's license costs for that are very high, so if you are competitive-market focused and don't need to offer specific Nvidia features, you would opt for AMD. TCO is king most of the time.

 

The second factor is performance, and that includes not limiting encode streams beyond what the hardware can actually do. Raw-performance wise AMD may not always be the best, depending on how you are encoding, but if you want to be a cheap ass and use consumer GPUs then AMD trumped GeForce until recently.


@LAwLz

Man, time flies. I swear they introduced this around the 1000 series, but I was wrong. It was actually with Turing that they added Multi-Frame Reference and Rate Distortion Optimization.

 

[attached image from an NVIDIA marketing video]

 

And this is from an NVIDIA marketing video, but I originally saw it elsewhere, with Apex Legends I believe, for comparison... I don't know where though.

 

 

1 hour ago, LAwLz said:

Do you have a source for this? I don't really see how that would be possible so it'd be interesting to read about it if it's true. 

 


4 hours ago, porina said:

Random thoughts:

1, how many streamers save a local copy? I just download the VOD after a stream as my only copy for archive. I don't need any raw footage locally.

People doing clips, and editors working for the streamer, work from the local copy, not the stream copy. Or at least they should, because of the aforementioned problem of Twitch being limited to 6 Mbps and the resulting blur-soup in FPS games.

4 hours ago, porina said:

2, even if playing at 4k, assuming no one is streaming 4k output. It would seem logical for the resize to happen before encoding, so the ingest would be saved locally at 4k and streamed at 1080p, not 4k twice.

Theoretically, you would use a 4K canvas and tell the stream encode to rescale but not the disk encode. In practice this seems to send video down a slow code path in OBS, and you either have to stream and record to disk at 4K, or at 1080p; you can't mix and match.
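Outside of OBS, the "record at 4K, stream a downscaled 1080p copy" split can be reproduced in a single encoder pass. A minimal sketch only: the gameplay_4k.mkv file, the NVENC encoder choice and the bitrates are assumptions for illustration, not settings from this thread.

```python
# Rough equivalent of "record at 4K, stream a downscaled 1080p copy" in one pass.
import subprocess

cmd = [
    "ffmpeg", "-y", "-i", "gameplay_4k.mkv",
    # Output 1: the local recording, kept at native 4K with a generous bitrate.
    "-map", "0:v", "-map", "0:a?",
    "-c:v", "h264_nvenc", "-b:v", "40M", "-c:a", "aac", "-b:a", "192k",
    "recording_4k.mp4",
    # Output 2: the "stream" copy, downscaled to 1080p and capped at 6 Mbps.
    "-map", "0:v", "-map", "0:a?", "-vf", "scale=-2:1080",
    "-c:v", "h264_nvenc", "-b:v", "6M", "-maxrate", "6M", "-bufsize", "12M",
    "-c:a", "aac", "-b:a", "160k",
    "stream_1080p.mp4",
]
subprocess.run(cmd, check=True)
```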

 

4 hours ago, porina said:

3, 4070 and above have the same two encoder units. Does that double the capacity vs 1 unit models (most other NV GPUs) if you have two concurrent streams?

I'm guessing it doubles the peak number of blocks it can process, so that could be something like 2x 4K60, 8x 1080p60, 4x 1080p120, 2x 1080p240 and similar, like what happens with HDMI bandwidth. It's feature dependent though, since P7 tanks all Nvidia GPUs at any resolution and is impossible to stream with, but P1-P4 work fine for streaming most of the time. So my assumption is P7 = slowest preset with all optimizations enabled, and P1 = fastest with no optimizations.
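One crude way to probe whether a second NVENC unit actually doubles throughput is to run several hardware encodes in parallel and watch how total wall time scales. This is only a sketch under assumptions: it needs an NVENC-capable ffmpeg build, a hypothetical local test clip, and driver session limits on consumer cards may cap the count before the silicon does.

```python
# Launch N concurrent NVENC encodes of the same clip and time them.
import subprocess
import time

CLIP = "test_1080p60.mkv"   # hypothetical local test clip

def run_parallel(n: int, preset: str = "p4") -> float:
    """Start n concurrent NVENC encodes and return total wall time in seconds."""
    start = time.time()
    procs = [
        subprocess.Popen(
            ["ffmpeg", "-y", "-i", CLIP,
             "-c:v", "h264_nvenc", "-preset", preset, "-b:v", "6M",
             "-an", "-f", "null", "-"],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        for _ in range(n)
    ]
    for p in procs:
        p.wait()
    return time.time() - start

for n in (1, 2, 4, 8):
    print(f"{n} concurrent encode(s): {run_parallel(n):.1f} s")
```

If a second encoder unit really doubles capacity, total time should stay roughly flat going from 1 to 2 concurrent encodes before it starts scaling with N.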

 

The difference between P5 and P7 might not be noticeable to anyone except an editor working on the disk recording. Streaming at P4 isn't unreasonable, but it might not be practical, so go back to point 1: if all you have is the garbage-quality recording, then you have nothing to make a good clip from.

 

4 hours ago, porina said:

4, presumably the current AMD range may also share similar configurations across multiple models, but I've never looked since I don't have one.

My guess, again, is that similar to Nvidia they probably have a set amount of bandwidth they can deal with, so one unit might account for a single 4Kp60 stream at its peak.

 


9 hours ago, WereCat said:

Lol, go to the video from the screenshot I've sent and go back 10-15s and listen to what he says. 

That video is talking about speed, not quality.

What I am talking about is the quality of the output. What EposVox talks about in his video is speed. For game streaming, once you hit 30 or 60 FPS in encoding everything else is wasted, and quality is the only thing that matters. It's in the quality department that AMD has been lacking. What AMD produces at their "quality" preset is, from what I've seen in the tests I've looked into, quite a lot worse than what Nvidia and Intel produce at their "quality" presets.

 

  

9 hours ago, leadeater said:

Cloud providers != Twitch 🤷‍♂️

 

Google sure but GCP and YouTube aren't really the same either, cloud would imply GCP.

WereCat didn't say "cloud providers" though. They said "Everyone used AMD cards for cloud streaming".

Since they used EposVox as a source, although they didn't know which video, I looked through some of his videos and he referenced YouTube and Twitch, neither of which use AMD. 

 

I guess we can't be sure what WereCat meant without them explaining exactly what they meant and providing a source. So we'll have to wait for that first.

 

 

 

14 hours ago, leadeater said:

I'd be happy with both personally, and to be honest if people were happy with Pascal at the time then they should be happy with AMD now also since it's as good as that or possibly better (it depends blah blah etc).

I guess that's true, but products aren't launched in a vacuum. Just because something was considered good ~8 years ago doesn't mean launching a product today that can't outperform it (in this particular task) is okay and should be ignored. Things are meant to improve, and the products AMD launch today will be compared to the products Nvidia launch today.

AMD is playing catchup when it comes to video encoding, and that was my point. This entire conversation started because someone said they bought an AMD card because they thought Nvidia "weren't doing anything to improve the encoding situation", and now they were mad that Nvidia improved things further, and calling them greedy for making things better (???). That post made no sense to me so I replied trying to correct their wrong impressions.

 

 

  

8 hours ago, WereCat said:

Man, time flies. I swear they introduced this around the 1000 series, but I was wrong. It was actually with Turing that they added Multi-Frame Reference and Rate Distortion Optimization.

 

-image-

 

And this is from an NVIDIA marketing video, but I originally saw it elsewhere, with Apex Legends I believe, for comparison... I don't know where though.

That's just a general encoder improvement though. It's not "their encoder is optimized for gameplay".

Things like rate distortion optimizations and multi-frame reference are neither special optimizations for gameplay recording, nor are they exclusive to Nvidia. 

It makes things like HUD and text in games look better simply because it makes everything look better.

 

I think you are jumping to a lot of conclusions that may be incorrect. You hear one thing, like "Nvidia has improved their encoder", you make an assumption that it only applies to game footage, and then you post "Nvidia has optimized their encoder for gameplay footage".


2 hours ago, LAwLz said:

That video is talking about speed, not quality.

What I am talking about is the quality of the output. What EposVox talks about in his video is speed. For game streaming, once you hit 30 or 60 FPS in encoding everything else is wasted, and quality is the only thing that matters. It's in the quality department that AMD has been lacking. What AMD produces at their "quality" preset is, from what I've seen in the tests I've looked into, quite a lot worse than what Nvidia and Intel produce at their "quality" presets.

Yes, I even commented that it's a different kind of test, not controlled by bitrate, so quality/file size may vary, and that I'm remembering a different video where HEVC specifically was tested at clamped bitrates instead, which I haven't been able to find yet.

Also, I never said the AMD HEVC quality is better; I mainly commented on the speed being much better. I did say it's on par, which according to the sources you posted is not true either at those low bitrates. Considering these are also fairly new tests, they're most likely very accurate.

They are significantly closer to others with HEVC than with x264 though.

There is also a difference between looking at VMAF scores and looking at actual encodes and comparing them. For me, the x264 on AMD is completely unusable and unwatchable even at higher bitrates, but I don't have an issue with the x265. I went through a GTX 1060, GTX 1080 Ti, RTX 3060 Ti and now have an RX 6800 XT, and I have experience with encoding on all of them; I don't see the 6800 XT as a downgrade for HEVC.

Now take it as you want; I have not done any scientific testing. The only actually robust tests I've done were with the GTX 1080 Ti, where I did like 20 test encodes for YT at different settings to see what gives me the best YT quality, only to determine that the thing that matters most is the resolution, as YT will re-encode my videos anyway; they'll just use a better codec at higher res.

 

EDIT:

 

9:17 and onwards (they spent the first 9min shitting on the x264 AMD encoder)

This is from roughly the RX 5000 series launch.

Here the speed difference is not apparent at all; usually the RTX 2080 is slightly faster on average, but the RX 5700 can do way more encodes at the same time.

From the screenshot and video I posted before, the RX 6600 was even faster than the RTX 4090, so I don't know what changed between the 5000 and 6000 series, or whether the testing methodology made that huge of a difference.

 

They also comment that this will (maybe) be used for Google Stadia, and that they were still using older Vega at the time.

 

So it seems the AMD card is better for encoding in the enterprise, where you absolutely benefit from having multiple streams, but for a home user the speed difference is not really there.

I found one video where they also showed the 4K encode being faster than the NVIDIA RTX 2080, but at 1080p the RTX 2080 was more than twice as fast with HEVC (they claimed it was possibly a bug in Handbrake at the time).

 

I admit that most of my info about this stuff is from EposVox, so if he is wrong then I'm wrong as well. I tend to trust him with this stuff, especially when Wendell is involved. I've spent years watching various channels and there are not many people who test this stuff, and I tend to remember various tidbits of info, but it's just so hard to keep track of everything. I think I've wasted quite a bit of time just trying to find these and having to rewatch so many videos, so I'm just not gonna bother anymore. I'll still link to that NVIDIA "being optimized for game streaming" claim once I stumble upon it.

As I said previously though, it's still not easy to just look at some arbitrary score and say X is better than Y. While it probably is true, it's still a matter of whether the visual difference is actually perceivable. And looking at comparisons through YT compression is also just not enough; it's best to download the files and look at them directly... the problem is EposVox provides them through Nebula, which you'd have to pay for, I presume.

I did watch some downloaded files for comparison but I don't know if they were from him. Since I watch his channel the most for this stuff I don't know who else it may have been that I got them from, but I sure as hell did not pay for anything, nor did I ever sign up for Nebula.

2 hours ago, LAwLz said:

 

 

I guess we can't be sure what WereCat meant without them explaining exactly what they meant and providing a source. So we'll have to wait for that first.

 

https://www.amd.com/en/press-releases/2019-03-19-amd-radeon-gpus-and-developer-tools-tapped-for-google-stadia-game

 

I'm talking about cloud game streaming. Obviously NVIDIA with their GeForce Now or whatever it's called is using NVIDIA, but I know Google Stadia used a ton of AMD cards back then (not really relevant now), and Microsoft is probably using a lot of AMD as well, though I haven't found whether that's the case for actually using them to encode.

 

And the video I referenced is the one from the screenshot I posted, as I said.

14:40

 

2 hours ago, LAwLz said:

 

 

I think you are jumping to a lot of conclusions that may be incorrect. You hear one thing, like "Nvidia has improved their encoder", you make an assumption that it only applies to game footage, and then you post "Nvidia has optimized their encoder for gameplay footage".

That's not the case. I'm not really trying to make things up or jump to conclusions. I remember watching tests on this and it was a big deal, since SW encoding was very expensive back then and NVIDIA provided better clarity at lower bitrates for this stuff. It was not necessarily "just better encoder" because they lacked quality in other areas as well and it was quite apparent when you looked at the direct comparison. If I find it I'll link it; for now, feel free to ignore this claim.

As I said, I remember it being compared in Apex Legends, which was a terrible game to stream at the time because of its very fast-paced nature, which created a ton of compression artifacts during the encode.


1 hour ago, LAwLz said:

WereCat didn't say "cloud providers" though. They said "Everyone used AMD cards for cloud streaming".

Eh, true, but whenever I hear cloud that denotes actual cloud providers, the ones that are talked about, which isn't Twitch. I've never heard anyone refer to Twitch as cloud in such a capacity; Twitch is just Twitch, or game streaming, or just streaming 🤷‍♂️

 

Much in the same way I don't assume all streaming is game streaming unless game streaming is said. So if I see cloud, that means OVH, GCP, Azure, AWS, VPS, etc.

 

1 hour ago, LAwLz said:

I guess that's true, but products aren't launched in a vacuum. Just because something was considered good ~8 years ago doesn't mean launching a product today that can't outperform it (in this particular task) is okay and should be ignored. Things are meant to improve, and the products AMD launch today will be compared to the products Nvidia launch today.

Yes but that's only IF you care about VMAF scores. If you want to play games and stream on Twitch the metric is "does it look good enough" and that's a very basic yes or no.

 

It's like saying a 2009 Honda Civic isn't a good vehicle because a 2023 Toyota Corolla has more horsepower or whatever; at some point it doesn't actually matter unless you actually want to do VMAF score comparisons, or 0-60 mph, or 60-0 mph, or lap times. Some things aren't strictly and directly relevant to the user beyond being suitable.


2 hours ago, WereCat said:

Yes, I even commented that it's a different kind of test, not controlled by bitrate, so quality/file size may vary, and that I'm remembering a different video where HEVC specifically was tested at clamped bitrates instead, which I haven't been able to find yet.

Just to be clear and so that we aren't talking past each other. The problem isn't that it's not set to a specific bit rate. The problem with the test you linked is that it doesn't measure image quality. The same bit rate and "same" preset will output different quality images, and that's where AMD has historically been behind.

A 10 megabyte, 10 second video from an AMD card looks worse than a 10 megabyte, 10 second video from Nvidia. Or at least it used to, and all the tests I can find indicate that this is the case. That goes for all formats I have seen, AV1, HEVC, AVC and so on.

 

 

2 hours ago, WereCat said:

Also, I never said the AMD HEVC quality is better; I mainly commented on the speed being much better. I did say it's on par, which according to the sources you posted is not true either at those low bitrates. Considering these are also fairly new tests, they're most likely very accurate.

And I have never said or argued that you said AMD's HEVC output was better. Don't try and pull some strawman argument.

I don't think it's fair to call for example 6Mbps "low bitrate" either. I wouldn't say it's "high" because that's relative, but 6Mbps is quite a lot. If we compare it to movie encodes, most movies I have are far below 6Mbps in bitrate. My rip of the first Hobbit movie is 7.2GB in size, and that's 5.5Mbps with two audio tracks. I don't want people to start calling 6Mbps "low bitrate" because it's quite a lot. It's plenty for a great-looking video feed, if the encoder is good and decent video formats are used.
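For anyone who wants to sanity-check those numbers, the file-size/bitrate arithmetic is straightforward. A back-of-the-envelope sketch: the ~169-minute runtime is my assumption (the theatrical cut), not something stated above.

```python
# Average bitrate = total bits / duration.
size_gb = 7.2
runtime_s = 169 * 60                            # assumed runtime in seconds
total_mbps = size_gb * 8 * 1000 / runtime_s     # GB -> megabits, then per second
print(f"whole-file average: {total_mbps:.1f} Mbps")   # ~5.7 Mbps including audio

# For comparison, a 6 Mbps Twitch stream accumulates:
stream_gb_per_hour = 6 / 8 * 3600 / 1000
print(f"6 Mbps stream: {stream_gb_per_hour:.1f} GB per hour")   # ~2.7 GB/h
```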

 

The statements I did not agree with in your post were:

1) AMD's image quality is on par with others. So far, this seems to be false.

 

2) That "everyone used AMD cards for cloud streaming". So far I have not seen any evidence that this is true. You raised one example, which is a service that no longer exists, and then you assumed that it was true for Microsoft without any evidence. It might be true, but I don't think you should go around saying it is true when we don't know.

 

3) And also that it is AMD's video encoder engine that causes number 2 to be true. So far, I haven't seen any evidence of that either. The AMD page you linked talks a lot about things like Vulkan and "game developers", and never even mentions video encoding, which to me indicates that the page is talking about running the games, which is a separate thing from encoding the video feed. The page never mentions the words "video" or "encoding", or any video format names. Again, I think you are jumping to conclusions.

 

 

 

 

2 hours ago, WereCat said:

They are significantly closer to others with HEVC than with x264 though.

By x264 I assume you mean h.264.

x264 is a program that runs on the CPU to create h.264 (aka AVC) files.

x265 is a program that runs on the CPU to create h.265 (aka HEVC) files.

Just want to clear that up because all of those various things are relevant to the discussion and I don't want any confusion to arise from potentially incorrect use of terminology.

 

I think I can agree with you that AMD is closer to Intel and Nvidia when it comes to HEVC encoding than H.264 encoding. 

But they still need to improve quite a bit in order to catch up.

 

2 hours ago, WereCat said:

It was not necessarily "just better encoder" because they lacked quality in other areas as well and it was quite apparent when you looked at the direct comparison.

I am not sure what you mean by this. Are you saying that Nvidia didn't have a "better encoder" (compared to what?) because they lacked quality in other areas (which areas?)?

 

 

 

 

2 hours ago, leadeater said:

Eh, true, but whenever I hear cloud that denotes actual cloud providers, the ones that are talked about, which isn't Twitch. I've never heard anyone refer to Twitch as cloud in such a capacity; Twitch is just Twitch, or game streaming, or just streaming 🤷‍♂️

Well, they said "cloud streaming". I think you put a lot of emphasis on the "cloud" word and not enough on the "streaming" word. As it turns out, they were talking about services like Stadia so I guess we were both wrong. I just thought it made sense to assume they were talking about Twitch and Youtube (I put my focus on the word streaming), since those were the names that got brought up in this thread earlier, and it was the two EposVox mentioned in his video. 

 

 

 

2 hours ago, leadeater said:

Much in the same way I don't assume all streaming is game streaming unless game streaming is said. So if I see cloud, that means OVH, GCP, Azure, AWS, VPS, etc.

But you would say OVH is a "streaming provider"? Because I certainly wouldn't. The words that were mentioned were "cloud streaming", and you can't ignore the word "streaming".

Anyway, that question mark has been solved now. 

 

 

2 hours ago, leadeater said:

Yes but that's only IF you care about VMAF scores. If you want to play games and stream on Twitch the metric is "does it look good enough" and that's a very basic yes or no.

Well, VMAF scores are important if you want the video to look good, and if you want to compare which cards are the best.

This is like saying "we shouldn't look at FPS benchmarks because what matters is if the game runs well enough, and that's just a yes or no answer".

I don't like that you are trying to gloss over objective measurements here, and I really don't understand why you are doing that. 

 

Also, why settle for "good enough" when you can get "actually good" from the competition?

 

 

2 hours ago, leadeater said:

It's like saying a 2009 Honda Civic isn't a good vehicle because a 2023 Toyota Corolla has more horsepower or whatever; at some point it doesn't actually matter unless you actually want to do VMAF score comparisons, or 0-60 mph, or 60-0 mph, or lap times. Some things aren't strictly and directly relevant to the user beyond being suitable.

No, it's like saying the 30 FPS vs 60 FPS in a game doesn't matter because "both are playable, so it doesn't matter who gets which FPS".

The video output from AMD GPUs looks significantly worse than what Nvidia and Intel GPUs put out. At least last time I checked and according to the various tests I have seen.

AMD's output is usable, so it's not the end of the world, and it's not like you can't livestream if you have an AMD GPU, but it will look worse than other brands. So if the image quality of your live streams is important to you, that is something to consider before buying an AMD GPU. 

 

Not everyone cares about image quality and that's fine. For some people "good enough" is all they want, and they might prioritize other things. That's totally fine too.

All I am doing here is responding to illogical and/or false arguments and conclusions. Things like people saying they chose AMD over Nvidia because Nvidia isn't doing enough with their encoder. That makes no sense since they are doing more than AMD is.

Things like people saying AMD's image quality is on par with Nvidia and Intel, because it just isn't.


52 minutes ago, LAwLz said:

Just to be clear and so that we aren't talking past each other. The problem isn't that it's not set to a specific bit rate. The problem with the test you linked is that it doesn't measure image quality. The same bit rate and "same" preset will output different quality images, and that's where AMD has historically been behind.

A 10 megabyte, 10 second video from an AMD card looks worse than a 10 megabyte, 10 second video from Nvidia. Or at least it used to, and all the tests I can find indicate that this is the case. That goes for all formats I have seen, AV1, HEVC, AVC and so on.

 

 

 

Yes, that's what I've said twice:

That for that test it's an issue, since you don't know both the file size (i.e. which bitrate it was encoded at) and the quality.

52 minutes ago, LAwLz said:

 

And I have never said or argued that you said AMD's HEVC output was better. Don't try and pull some strawman argument.

 

You didn't. I confused you with Kisai, sorry. 

Edit:

Lol... I'm getting lost in this. I actually didn't confuse you, and I even quoted you; that's what I replied to. 

 

It's in the quality department that AMD has been lacking. What AMD produces at their "quality" preset is, from what I've seen in the tests I've looked into, quite a lot worse than what Nvidia and Intel produce at their "quality" presets

52 minutes ago, LAwLz said:

 

I don't think it's fair to call for example 6Mbps "low bitrate" either. I wouldn't say it's "high" because that's relative, but 6Mbps is quite a lot. If we compare it to movie encodes, most movies I have are far below 6Mbps in bitrate. My rip of the first Hobbit movie is 7.2GB in size, and that's 5.5Mbps with two audio tracks. I don't want people to start calling 6Mbps "low bitrate" because it's quite a lot. It's plenty for a great-looking video feed, if the encoder is good and decent video formats are used.

 

The statements I did not agree with in your post were:

1) AMD's image quality is on par with others. So far, this seems to be false.

 

2) That "everyone used AMD cards for cloud streaming". So far I have not seen any evidence that this is true. You raised one example, which is a service that no longer exists, and then you assumed that it was true for Microsoft without any evidence. It might be true, but I don't think you should go around saying it is true when we don't know.

 

3) And also that it is AMD's video encoder engine that causes number 2 to be true. So far, I haven't seen any evidence of that either. The AMD page you linked talks a lot about things like Vulkan and "game developers", and never even mentions video encoding, which to me indicates that the page is talking about running the games, which is a separate thing from encoding the video feed. The page never mentions the words "video" or "encoding", or any video format names. Again, I think you are jumping to conclusions.

 

 

 

 

By x264 I assume you mean h.264.

x264 is a program that runs on the CPU to create h.264 (aka AVC) files.

x265 is a program that runs on the CPU to create h.265 (aka HEVC) files.

Just want to clear that up because all of those various things are relevant to the discussion and I don't want any confusion to arise from potentially incorrect use of terminology.

 

I think I can agree with you that AMD is closer to Intel and Nvidia when it comes to HEVC encoding than H.264 encoding. 

But they still need to improve quite a bit in order to catch up.

For movies 6 Mbps is plenty. I'm talking about game streaming at 1080p60 and higher. Especially for fast-paced games it was never enough for H.264, and even with HEVC I'd say it's the minimum for 1080p60. 
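To put rough numbers on that (a back-of-the-envelope sketch, not figures from the thread): at the same bitrate, a 60 fps game feed has fewer bits to spend per pixel per frame than a 24 fps movie, and fast motion also leaves less inter-frame redundancy for the encoder to exploit.

```python
# Bits available per pixel per frame at a given bitrate, resolution and frame rate.
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    return bitrate_bps / (width * height * fps)

print(f"1080p24 movie at 5.5 Mbps: {bits_per_pixel(5.5e6, 1920, 1080, 24):.3f} bpp")  # ~0.110
print(f"1080p60 game at 6 Mbps:    {bits_per_pixel(6.0e6, 1920, 1080, 60):.3f} bpp")  # ~0.048
```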

 

Yes, by x264 and x265 I mean AVC and HEVC. I usually differentiate by saying CPU or SW encoding and GPU / HW encoding. 

It's perhaps not correct to call it that... yes. The same goes for people calling a graphics card a GPU, etc... 

 

I also linked videos with time stamps where they talk about this being used for encoding.

 

I just found that AMD link so I posted it. You're right, they don't specifically mention that they are also encoding with these cards, but for latency's sake it makes no sense not to do so. 

Even for SteamLink and other in-home streaming services it's done on the GPU for latency and performance reasons. 

 

52 minutes ago, LAwLz said:

 

I am not sure what you mean by this. Are you saying that Nvidia didn't have a "better encoder" (compared to what?) because they lacked quality in other areas (which areas?)?

 

Vs CPU encoding. Specifically, it was that in some darker areas the CPU Medium preset at the same bitrate preserved more detail than NVENC. I'll link the video in about 10h when I get home if you want. 

 

All I'm saying is that each encoder has different tradeoffs; it's not 100% better in all aspects over the other, and you have to pixel peep to get an idea of whether the quality difference is even perceivable.

 

52 minutes ago, LAwLz said:

 

Well, they said "cloud streaming". I think you put a lot of emphasis on the "cloud" word and not enough on the "streaming" word. As it turns out, they were talking about services like Stadia so I guess we were both wrong. I just thought it made sense to assume they were talking about Twitch and Youtube (I put my focus on the word streaming), since those were the names that got brought up in this thread earlier, and it was the two EposVox mentioned in his video. 

 

I've never seen anyone mean Twitch or YT by "cloud streaming". It's always remote gameplay over cloud. 

 

52 minutes ago, LAwLz said:

 

No, it's like saying the 30 FPS vs 60 FPS in a game doesn't matter because "both are playable, so it doesn't matter who gets which FPS".

I'd say it's more like comparing FSR vs DLSS and deciding that one has better quality than the other because you get more FPS with it. 


1 hour ago, LAwLz said:

Well, they said "cloud streaming". I think you put a lot of emphasis on the "cloud" word and not enough on the "streaming" word. As it turns out, they were talking about services like Stadia so I guess we were both wrong

No? I mentioned game streaming and VDI as part of "cloud". It's really not a big deal, but yes, the most critical and informative part of the statement is the prefix modifier "cloud" telling us the type of context, just like with "game streaming" the most important part is "game", because it's telling us the context. That's why I'm saying if I ever see cloud, then the last thing I am thinking of is Twitch, regardless of whether streaming is said. If people mean Twitch then they pretty much always just say Twitch 🤷‍♂️

 

I knew exactly what was being talked about since I bet we both watched the same LevelOne video talking about AMD usage in cloud.

 

Still I maintain AMD is the cheap ass option for those who can't justify Nvidia in their environment. I've always liked that AMD went down the SR-IOV route for shared GPU before Nvidia did, as it's better than a software/driver layer, but up until recently the AMD datacenter GPUs have been rather crap, although game streaming services typically use dedicated GeForce and AMD GPUs rather than datacenter cards, again for cost reasons.

 

1 hour ago, LAwLz said:

No, it's like saying the 30 FPS vs 60 FPS in a game doesn't matter because "both are playable, so it doesn't matter who gets which FPS".

No it isn't because there is a definitive and more perceptible difference between those two, while it's near impossible to notice the difference unless you are freeze framing and saving to disk and doing spot analytics.

 

You're trying to equate performance to quality, and that is exactly what I was pointing to as not being the deciding factor. It doesn't matter if you have 200 hp or 2000 in a 30 mph zone, you can only go 30 mph. One vehicle might sound more "fun" or get to 30 faster, but in reality, day to day, it makes no difference at all. Although the 2000 hp vehicle is more likely to be found in a ditch upside down lol

1 hour ago, LAwLz said:

The video output from AMD GPUs looks significantly worse than what Nvidia and Intel GPUs put out.

No, they do not.

 

Are you talking about like 5 years ago? Then yes; 2 years ago, no.

 

I have two 3060 Tis and an RX 6800 XT, and straight up, what you are saying and how you're framing it is just wrong. People do not watch game streams in a way where they'd actually notice. I'm not even sure the edge cases like Apex even apply right now.

 

You can watch this: both the VMAF scores and, more importantly, the comparison footage are not significantly different between the 3, and nobody watching is going to give a damn which of the 3 is actually being used. I didn't watch long enough, and don't really care enough, but I think the one case where it might matter is the 3.5 Mbps limited test, though Intel is clearly better down that low.

 

 

If you are stopping and freeze framing and side-by-siding to be able to tell, then you've lost the plot and left the real world. People don't watch their favorite streamers like that. The only thing that matters is "the image quality is good enough"; it's a yes or a no. Anything else is academic and fine for us to discuss, but let's not lose sight of what matters in actual usage.

 

1 hour ago, LAwLz said:

Well, VMAF scores are important if you want the video to look good, and if you want to compare which cards are the best.

False, because differences in encoders can make elements more or less pleasing to different viewers, which is not represented in a VMAF score.

 

A lower VMAF score can be deemed the better image; obviously not way lower, but two scores within a small margin of difference necessitate visual comparison, not just a VMAF score.

 

1 hour ago, LAwLz said:

But you would say OVH is a "streaming provider"? Because I certainly wouldn't. The words that were mentioned were "cloud streaming", and you can't ignore the word "streaming".

I didn't ignore the word streaming; people deploy these services within OVH service instances, and GCP, AWS, Azure, etc. Maybe you ignored "cloud" too much and just assumed too 😉

 

I maintain the most important word in there was cloud because it got me to the correct context.

 

1 hour ago, LAwLz said:

Also, why settle for "good enough" when you can get "actually good" from the competition?

Because NVENC alone doesn't get you the best product, obviously. Why settle for worse actual gameplay or a higher cost just so you can get NVENC, when nobody will be able to tell?

 

Why not get the actually good option? Which could be anything. NVENC alone only matters if it's the only thing you'll be doing. But this is outside the point of the discussion.


52 minutes ago, WereCat said:

Yes, that's what I've said twice:

That for that test it's an issue, since you don't know both the file size (i.e. which bitrate it was encoded at) and the quality.

I just wanted to make it clear that what you linked is unrelated to what I have been talking about, which is image quality. I think we can leave that benchmark now because it's not relevant.

 

1 hour ago, WereCat said:

It's perhaps not correct to call it that... yes. The same goes for people calling a graphics card a GPU, etc... 

Well, it's more like calling the case the "CPU". If someone is talking about upgrading their computer it could be important to make distinctions between the case and the CPU, and using them interchangeably can create confusion and misunderstandings, and I'd rather not have more confusion in this discussion.

 

 

 

55 minutes ago, WereCat said:

I just found that AMD link so I posted it. You're right, they don't specifically mention that they are also encoding with these cards, but for latency's sake it makes no sense not to do so. 

Even for SteamLink and other in-home streaming services it's done on the GPU for latency and performance reasons. 

I think that's a pretty big assumption by you. I am not so sure doing the encoding on a different chip would add any meaningful amount of latency. Also, I think SteamLink using the GPU for encoding has far more to do with performance and consistency than latency. I kind of doubt doing the encoding on a separate chip would add any meaningful latency even for home use. It would require far more resources from the computer though, so it doesn't make much sense to do it in software when hardware encoders are available. But the situation is different for Stadia because we don't know which GPUs they got, and we don't know what hardware encoders they got.

Personally, I wouldn't be surprised if Google had said "we use AMD GPUs for rendering the games, and then use our own hardware encoders for encoding the video". Especially not since this was done back when AMD's H.264 encoder was quite bad.

 

 

1 hour ago, WereCat said:

Vs CPU encoding. Specifically, it was that in some darker areas the CPU Medium preset at the same bitrate preserved more detail than NVENC. I'll link the video in about 10h when I get home if you want. 

 

All I'm saying is that each encoder has different tradeoffs; it's not 100% better in all aspects over the other, and you have to pixel peep to get an idea of whether the quality difference is even perceivable.

CPU encoding has always been better than hardware encoding, or at the very least it has had the potential to be better. NVEnc and QuickSync are not capable of outperforming the best CPU encoders. I don't think anyone is disputing that. But when was this conversation about CPU vs NVEnc? I feel like you're trying to shift focus away right now.

In the case of AMD vs Nvidia and quality, it is pretty accurate to say Nvidia is flat out better. And we don't have to pixel peep to get an idea. If we go back to the VMAF results posted on the previous page it is pretty easy to see that yes, there is a noticeable difference. 

For H.264 at 6Mbps we're talking a VMAF score of about 70 for AMD's cards, and around 80-85 for Nvidia and Intel. That is a massive difference.

For AV1 we're talking about ~82 to ~88 which is way smaller, but still pretty significant. You should be able to see that big of a difference. 
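For reference, this is roughly how such VMAF numbers are produced: encode the same source at the same bitrate with each hardware encoder, then score every encode against the original. A minimal sketch under assumptions: it needs an ffmpeg build with libvmaf plus NVENC and AMF support, and the encoder names, clip name and 6 Mbps target are illustrative rather than the settings used in the tests being discussed.

```python
# Encode with two hardware encoders at the same bitrate, then score with libvmaf.
import subprocess

SOURCE = "source_1080p60.mkv"   # hypothetical high-quality capture used as the reference

def encode(codec: str, out: str, bitrate: str = "6M") -> None:
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec,
                    "-b:v", bitrate, "-an", out], check=True)

def vmaf(distorted: str) -> None:
    # libvmaf compares the first input (distorted) against the second (reference)
    # and prints the pooled VMAF score in ffmpeg's log output.
    subprocess.run(["ffmpeg", "-i", distorted, "-i", SOURCE,
                    "-lavfi", "libvmaf", "-f", "null", "-"], check=True)

for codec, out in [("h264_nvenc", "nvenc.mp4"), ("h264_amf", "amf.mp4")]:
    encode(codec, out)
    vmaf(out)
```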

 

 

1 hour ago, WereCat said:

I'd say it's more like comparing FSR vs DLSS and deciding that one has better quality than the other because you get more FPS with it. 

No, it's not like that at all. I have no idea how you arrived at that comparison.

 


1 hour ago, leadeater said:

Still I maintain AMD is the cheap ass option for those who can't justify Nvidia in their environment.

I am not sure what that has to do with anything, because nobody in here has said anything contradicting that.

 

1 hour ago, leadeater said:

No it isn't because there is a definitive and more perceptible difference between those two, while it's near impossible to notice the difference unless you are freeze framing and saving to disk and doing spot analytics.

Judging by the various VMAF tests that have been posted here, it should be fairly easy to tell the difference between the two.

 

 

 

1 hour ago, leadeater said:

You're trying to equate performance to quality, and that is exactly what I was pointing to as not being the deciding factor. It doesn't matter if you have 200 hp or 2000 in a 30 mph zone, you can only go 30 mph. One vehicle might sound more "fun" or get to 30 faster, but in reality, day to day, it makes no difference at all. Although the 2000 hp vehicle is more likely to be found in a ditch upside down lol

What are you on about? Your analogy doesn't make any sense. I am not the one who started talking about FPS. That was WereCat. I totally agree that having 200 or 2000 hp in a 30mph zone is irrelevant. That's why I said it didn't matter that AMD's GPUs could push more FPS in the context of streaming to Twitch. Because once you get 30 or 60 FPS everything else is wasted (for encoding).

What matters at that point is the image quality. The higher the image quality the better. That will always be the case. That is why your "you can only go 30 in a 30 zone" doesn't make any sense applied to my posts because there is no limit on how good of an image you can send to Twitch as long as it doesn't go above the maximum bitrate.

In your analogy about the cars in the 30 zone, both cars are going 30 (the encoding FPS), but one of the cars uses less fuel, makes less noise, and is more comfortable to ride in (the image quality). It does a better job driving at 30 than the other car. Just because both cars can drive at 30 mph doesn't mean they are equal, because everything else is just unimportant details. Some things might not be important to you, but they might be important to other people and are worth bringing up.

 

Nvidia GPUs will produce a better image when streaming to Twitch. That is a fact. Your stream will look better without using any more resources. That is a good thing. How much it matters to people will vary, but I think it is the sign of a sore loser to just go "well that doesn't matter because I can't see the difference". That's what console people used to say when their games were stuck on 30 FPS and the PC crowd was raving about 60 FPS.

 

If you don't care about higher image quality then that's fine. But don't try and be reductive and boil it down to some yes or no thing just because you don't want to admit that Nvidia has the benefit in this area. A lot of people claim that they can't notice the difference between 40 and 60 FPS either, but it's not like we say "Let's not measure FPS to figure out which card performs the best, let's just say a game is playable or not". 

 

It's also worth noting that once Twitch gets support for AV1, this gap between Nvidia/Intel and AMD will shrink, since AMD's AV1 encoder is a lot better than their H.264 encoder. It seems like it isn't as good as the competition, but it's a hell of a lot closer, to the point where it probably won't matter to most people. 

VMAF ~70 to VMAF ~85 is a large jump. A very large one. If you don't notice that then I don't know what to say. 

VMAF ~82 to ~88 is a much smaller jump.

 

 

By the way, since I suspect a lot of people in this thread don't know much about VMAF and the numbers can seem a bit abstract, this is a good paper to read. It tries to figure out which VMAF scores to target for over-the-top video streaming services. The important scores to look for are the lowest VMAF score that on average 50% or more people would accept for a free or paid service, and the relationship between them.

 

What they found was that a paid streaming service needs 10-15 points higher score to be accepted by customers, compared to a free service.

They also found that a video needs a VMAF of around 70 to be generally acceptable from a free service.

In other words, as it stands today with H.264 at 6Mbps, AMD's cards can just barely produce a video that is acceptable quality for around 50% of people, if it is being shown on a free service.

Nvidia's cards output on the other hand, are more in line with what people would expect from a paid service. 

 

With the move to AV1, AMD will move from being just above the "it's okay quality, for a free service" line, to just below the "this is the quality I expect from a paid service".

That's great news for AMD users. 

 

 

 

 

 

 

1 hour ago, leadeater said:

You can watch this: both the VMAF scores and, more importantly, the comparison footage are not significantly different between the 3, and nobody watching is going to give a damn which of the 3 is actually being used.

What comparison footage? The footage in the video?

Please don't tell me you don't understand why that is a flawed test... If you don't see why it is a flawed test then I am not sure if this conversation is even worth having, because you clearly don't know anything about video quality in that case.

 

The VMAF score is the only thing that matters in that video.

Those results look very promising, but the gap in EposVox's video is far smaller than I have seen in other places, which makes me question the scores. For example, what Tom's Hardware did in this test (the one linked on the previous page).

 

Sadly it's hard to find good measurements since everything before mid-2022 is invalid.

 

 

 

 

 

1 hour ago, leadeater said:

The only thing that matters is "the image quality is good enough"; it's a yes or a no. Anything else is academic and fine for us to discuss, but let's not lose sight of what matters in actual usage.

I guess we should just say measuring FPS is "academic" as well then... If a graphics card gets "good enough FPS" then that's all that matters. No need to look any more deeply into it.

Is the 4090 and 4060 equal in terms of performance? Yes, because both get good enough FPS in some games...

 

 

 

1 hour ago, leadeater said:

Because NVENC alone doesn't get you the best product, obviously. Why settle for worse actual gameplay or a higher cost just so you can get NVENC, when nobody will be able to tell?

Are you implying that Nvidia graphics cards result in worse gameplay and/or higher cost than AMD graphics cards?

 

1 hour ago, leadeater said:

Why not get the actually good option? Which could be anything. NVENC alone only matters if it's the only thing you'll be doing. But this is outside the point of the discussion.

Are you implying that buying an AMD graphics card is the "good option" and buying an Nvidia graphics card is the "bad option"?


On 1/9/2024 at 3:28 PM, porina said:

Random thoughts:

1, how many streamers save a local copy? I just download the VOD after a stream as my only copy for archive. I don't need any raw footage locally.

 

I play at 4K, but resize to 1080p at 6,000 kbps for the stream. Then I record at 4K for a separate YouTube upload at a much higher bitrate. 

