
"Shadowplay" vs QuickSync vs x264 for streaming - Quality Comparison

LAwLz

I started talking about this in the Ryzen thread, but I think it deserves its own thread, and I don't know exactly where to post it (it doesn't quite fit the CPU, GPU, or software section).

 

At the Ryzen event, AMD showed off their CPU against Intel's 6800K for streaming. Since then, people have started saying that Ryzen will be better than Intel for streaming, and I strongly disagree with that simplistic viewpoint. The reason I disagree is the major advancement and widespread adoption of GPU encoders in recent years. Now, AMD and others have made the argument that at low bitrates (Twitch only allows 3500 Kbps), CPU encoding is better. So I decided to put that to the test.

 

 

 

Testing methodology:


 

I recorded a short clip of Overwatch with Fraps, because I wanted the gameplay footage to be exactly the same every time I encoded it.

 

I then played the losslessly encoded footage in MPC-HC and recorded the video playback with OBS (latest version as of 25 February 2017). I could have just passed the source file into each encoder and skipped the playback step, but I was worried that the encoders would not be the same versions, or use the same code paths, as when actually streaming with OBS. So in order to make the comparison as close to actual streaming as possible, while still using the exact same gameplay footage for each encoder, I did it this way.

 

Why Overwatch? Because of Tracer's blink ability. The entire screen changing rapidly, such as when you blink with Tracer, is one of the worst scenarios for video encoding, so this is about as close to a "worst case scenario" as you can get gaming-wise. Also, I really like Overwatch right now.
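
If anyone wants to replicate a comparison like this offline, skipping the OBS playback step, here is a minimal sketch of the idea using ffmpeg from Python. To be clear, this is an approximation rather than exactly what I did: it assumes an ffmpeg build with libx264, NVENC (h264_nvenc) and QuickSync (h264_qsv) enabled, and the source filename and output names are placeholders.

```python
# Sketch: feed the same lossless clip to each encoder at a Twitch-like
# bitrate so the outputs can be compared frame by frame.
# Assumes an ffmpeg build with libx264, h264_nvenc and h264_qsv enabled;
# "overwatch_lossless.avi" is a placeholder for the source recording.
import subprocess

SOURCE = "overwatch_lossless.avi"
BITRATE = "3500k"  # Twitch's cap for non-partnered channels

ENCODERS = {
    "x264_veryfast": ["-c:v", "libx264", "-preset", "veryfast"],
    "x264_faster":   ["-c:v", "libx264", "-preset", "faster"],
    "nvenc":         ["-c:v", "h264_nvenc"],  # needs an Nvidia GPU
    "quicksync":     ["-c:v", "h264_qsv"],    # needs an Intel iGPU
}

for name, args in ENCODERS.items():
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, *args,
         "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", BITRATE,
         "-an", f"{name}.mp4"],
        check=True,
    )
```

The results won't be byte-identical to what OBS produces (different capture path), which is exactly why I did the playback-capture step instead, but it's good enough for a quick look.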

 

 

 

A few quick words about the encoders and settings I used.

I used three different encoders, and two of them require specific hardware. I was not able to test AMD's encoder since I do not have a recent AMD GPU.

 

x264:


 

This is the default encoder in OBS. It runs on the CPU and is therefore very demanding. It is also what AMD used at their event. I tested two different x264 presets. x264 has 10 different presets spanning a wide range of performance-to-quality trade-offs: the faster a preset is, the lower the quality (even at the same bitrate).

 

OBS's default is called "veryfast", which is the third fastest (and thus has the third worst quality). This was the first encode I tested.

 

I also tested the preset called "faster", which is one step above "veryfast" in terms of quality. It is A LOT more demanding on the CPU though, and my 4.4 GHz 2500K barely managed to record the video playback at 30 FPS with this setting.

My guess is that even with a 1700X, this will be the highest quality preset you can use for streaming. If my computer could barely keep up encoding a 30 FPS video with it, I doubt you could go higher even with something like the 1800X.

The next step up is called "fast", and it is a lot more demanding than "faster".
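
For reference, the sketch below lists all ten x264 presets from fastest to slowest, and gives a rough way to measure how steep the cost curve is on your own CPU: time an offline encode of the same clip with each preset, and if the effective encode FPS drops below your stream's frame rate, that preset cannot keep up in real time. The clip name and frame count are placeholders for your own footage.

```python
# Rough benchmark: time every x264 preset on the same clip and report
# effective encode FPS. Anything below your stream's frame rate means
# that preset cannot keep up in real time on this machine.
# "clip.avi" and FRAME_COUNT are placeholders for your own footage.
import subprocess
import time

PRESETS = ["ultrafast", "superfast", "veryfast", "faster", "fast",
           "medium", "slow", "slower", "veryslow", "placebo"]
CLIP = "clip.avi"
FRAME_COUNT = 1800  # e.g. 60 seconds of 30 FPS footage

for preset in PRESETS:  # fair warning: "placebo" takes forever
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", CLIP, "-c:v", "libx264",
         "-preset", preset, "-b:v", "3500k", "-an", "-f", "null", "-"],
        check=True, capture_output=True,
    )
    fps = FRAME_COUNT / (time.perf_counter() - start)
    print(f"{preset:>10}: ~{fps:.0f} FPS")
```

Keep in mind this is an offline encode with no game running, so real streaming headroom will be worse.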

 

 

NVENC (aka Shadowplay):


 

Now, Shadowplay is Nvidia's name for the built-in video recording in their drivers. It uses a dedicated hardware block on their GPUs (called NVENC) which has only one job: encoding video. This allows Nvidia GPUs to encode video with essentially no performance hit. OBS lets you use this encoder, and that's what I did.

I have a GTX 1060, so hardware-wise this is the latest version of their encoder.

I had it set to the high quality preset, with High profile.

 

 

QuickSync:


QuickSync is Intel's name for their hardware video encoder. Like NVENC, it is dedicated silicon whose only job is encoding video. It was introduced with Sandy Bridge, and in order to use it you need an Intel CPU with a built-in GPU, as well as a chipset which supports video out, such as my Z68 motherboard.

I used the best quality QuickSync preset, with High profile.

Something to keep in mind is that I am on Sandy Bridge, and Intel has continued to improve QuickSync in newer generations. It is therefore likely that with a newer Intel CPU, such as Skylake, the footage would look slightly better.
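
As an aside, if you want to check which of these encoders a given ffmpeg build exposes (handy for eyeballing them outside of OBS), something like the sketch below works. Keep in mind it only shows compile-time support; an encoder can be listed and still fail at runtime without the matching GPU and driver.

```python
# List the H.264 encoders an ffmpeg build knows about. This only shows
# compile-time support; an encoder can be listed here and still fail at
# runtime if the matching GPU or driver is missing.
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if any(n in line for n in ("libx264", "h264_nvenc", "h264_qsv", "h264_amf")):
        print(line.strip())
```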

 

 

So, here are the results of the test:

Here is the drive folder with all the different clips.

 

I also have links to the videos uploaded to YouTube, but I highly recommend you download the original files instead. YouTube adds extra compression which may skew the results.

YouTube links:


 

 

I also made some comparisons with still frames from the NVENC and QuickSync videos which you can find here:

Comparison 1

Comparison 2

Comparison 3

Comparison 4

Comparison 5

Mouse outside = NVENC

Mouse over the picture = QuickSync

You can find all the screenshots, including the same frames as in ScreenshotCompare but from x264 and the original video file, in this Drive folder.

 

Thoughts?
Personally I think NVENC is the best encoding method out of the ones I have available.

QuickSync on a newer CPU might be equal to or better than NVENC, but since I only have Sandy Bridge I can't test that. x264 veryfast and NVENC seem to be pretty close quality-wise, but it looks to me like NVENC has a slight edge. In some images NVENC looks better, and in some x264 veryfast looks better, but x264's CPU usage makes it impractical for me.

 

Whether the 1700X can do 30 FPS encoding with the "faster" preset while playing a game at the same time remains to be seen, but if it can, it should have a slight edge in quality over NVENC. It does not appear to be a large lead, but a lead nonetheless.

But assuming it can handle it, the next question becomes "is it worth it?". You are trading an awful lot of CPU power for a fairly minor increase in quality. For a YouTube video, archival purposes or whatever, I will gladly trade a ton of CPU power for even the smallest quality increase. For streaming footage to Twitch, which will then be thrown away? I am not so convinced it is worth it.

 

 

Thoughts? Anything else people want me to test?


During my own recordings, I find that QuickSync on Haswell edges out NVENC (GTX 960) in terms of quality at a similar bitrate. Though if OBS ever enables the use of my 960's H.265 encoder, that may change (though I haven't updated OBS in a while, so it might actually support it already).


So you recorded a lossless stream once, and compared the results for quality (right?).  Great test, I'm glad someone bothered to :P 

 

But I think the next step is to upload each to Twitch and see how they look after they've gone through that second encode. According to Linus on the WAN show, AMD's claim for using on-CPU encoding is that any minor differences in quality are greatly exaggerated when you re-encode again at a low bitrate.


1 hour ago, Ryan_Vickers said:

So you recorded a lossless stream once, and compared the results for quality (right?).  Great test, I'm glad someone bothered to :P 

Thanks!

Yes, that's what I did: record one lossless clip, then "stream" that clip using NVENC, QuickSync and x264, and compare the end results.

 

1 hour ago, Ryan_Vickers said:

But I think the next step is to upload each to Twitch and see how they look after they've gone through that second encode. According to Linus on the WAN show, AMD's claim for using on-CPU encoding is that any minor differences in quality are greatly exaggerated when you re-encode again at a low bitrate.

I don't think AMD claimed that the extra transcode from Twitch would lower the quality less for the x264 encode. What I think they claimed is that for low-bitrate stuff like Twitch, CPU-encoded videos will have the edge in terms of quality.

Twitch doesn't care what encoder you use. It will just apply its own transcode indiscriminately. It is pretty safe to say that the better the image you upload to Twitch, the better the image they transcode to will look. So what is important, and what my test measured, is the quality of the footage that would get sent to Twitch's servers.

I did however upload the footage to YouTube. The results should be more or less the same as if you uploaded to Twitch.
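
If anyone wants to put numbers on these comparisons instead of eyeballing stills, ffmpeg can score each encode against the lossless source with SSIM. A minimal sketch, assuming the encodes are frame-aligned with the reference (the filenames are placeholders matching the earlier sketch):

```python
# Score each encode against the lossless reference with SSIM
# (1.0 = identical, higher is better). The clips must be frame-aligned
# or the numbers are meaningless. Filenames are placeholders.
import subprocess

REFERENCE = "overwatch_lossless.avi"
ENCODES = ["x264_veryfast.mp4", "x264_faster.mp4",
           "nvenc.mp4", "quicksync.mp4"]

for encoded in ENCODES:
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", REFERENCE,
         "-lavfi", "ssim", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # ffmpeg prints the SSIM summary on stderr, e.g. "SSIM ... All:0.97"
    for line in result.stderr.splitlines():
        if "SSIM" in line:
            print(encoded, "->", line.strip())
```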

 

CPU encoders can be far superior to GPU encoders at the same bitrate, but the problem is that in order for the CPU encode to look better, you need a huge amount of processing power. That's why OBS uses the third lowest quality preset as default for CPU encoding: anything higher than that and the majority of CPUs will catch fire, and you end up with a recording that runs at 10 FPS. Even with Ryzen's massive amount of raw CPU performance, I doubt it will manage more than the "faster" preset, which is the highest quality x264 setting I tried. But like I said in my post, even if it can manage that, it's still hard to justify. You get slightly higher video quality, but at the cost of running your CPU at ~100% load instead of, like, 30%. And there is also the risk of dropping frames if something starts putting a bit more load on your CPU.

 

For real time encoding, things like QuickSync, NVENC and whatever AMD calls theirs (ReLive?) are just much more reliable and convenient than CPU encoding.

And I mean... if you get a Ryzen CPU, then you will most likely have a fairly new Nvidia or AMD GPU as well, and it will have a built-in encoder. Why not use it?

 

 

On 2/25/2017 at 8:37 PM, Zodiark1593 said:

During my own recordings, I find that QuickSync on Haswell edges out NVENC (GTX 960) in terms of quality at a similar bitrate. Though if OBS ever enables the use of my 960's H.265 encoder, that may change (though I haven't updated OBS in a while, so it might actually support it already).

No HEVC or VP8/VP9 support yet. I don't think OBS supports any codec other than H.264, regardless of which encoder you use.

Hopefully it is something they will add soon. It would be great if Twitch would start supporting it too. The quality would become way better (I think, haven't actually tried it).


NVENC is the clear winner. I would also like to see if anyone notices a slight audio lag in the x264 videos.

 

The video quality is slightly better with NVENC, and the NVENC video also feels like there's no discrepancy between it and the audio.


Just now, Kloaked said:

NVENC is the clear winner. I would also like to see if anyone notices a slight audio lag in the x264 videos.

 

The video quality is slightly better with NVENC, and the NVENC video also feels like there's no discrepancy between it and the audio.

That might have been a glitch. There should be no difference in audio.


1 minute ago, LAwLz said:

That might have been a glitch. There should be no difference in audio.

Glitch with the encoding?


12 minutes ago, LAwLz said:

I don't think AMD claimed that the extra transcode from Twitch would lower the quality less for the x264 encode. What I think they claimed is that for low-bitrate stuff like Twitch, CPU-encoded videos will have the edge in terms of quality.

Twitch doesn't care what encoder you use. It will just apply its own transcode indiscriminately.

Yeah I know

12 minutes ago, LAwLz said:

It is pretty safe to say that the better the image you upload to Twitch, the better the image they transcode to will look.

Exactly.  But I know certain things can adversely affect it, as this video shows:


 

In theory, if one of the encoding methods created a lot of sharp artifacts or something like that, it could possibly reduce the resulting quality substantially due to this effect.

12 minutes ago, LAwLz said:

So what is important, and what my test measured, is the quality of the footage that would get sent to Twitch's servers.

I did however upload the footage to YouTube. The results should be more or less the same as if you uploaded to Twitch.

That's true, I suppose. Perhaps I will have to try taking screenshots from those clips to compare with your stills. I could very clearly see in the raw stills that the Nvidia technique was better, but will that hold up on YouTube? I guess I'll have to see for myself :P

12 minutes ago, LAwLz said:

CPU encoders can be far superior to GPU encoders at the same bitrate, but the problem is that in order for the CPU encode to look better, you need a huge amount of processing power. That's why OBS uses the third lowest quality preset as default for CPU encoding: anything higher than that and the majority of CPUs will catch fire, and you end up with a recording that runs at 10 FPS. Even with Ryzen's massive amount of raw CPU performance, I doubt it will manage more than the "faster" preset, which is the highest quality x264 setting I tried. But like I said in my post, even if it can manage that, it's still hard to justify. You get slightly higher video quality, but at the cost of running your CPU at ~100% load instead of, like, 30%. And there is also the risk of dropping frames if something starts putting a bit more load on your CPU.

 

For real time encoding, things like QuickSync, NVENC and whatever AMD calls theirs (ReLive?) are just much more reliable and convenient than CPU encoding.

And I mean... if you get a Ryzen CPU, then you will most likely have a fairly new Nvidia or AMD GPU as well, and it will have a built-in encoder. Why not use it?

I've noticed when using OBS + the hardware encoder on my Fury that the recording "often" (enough that it's clearly noticeable at times, but not enough to really ruin anything) has dropped frames, even when the scene I recorded was perfectly smooth while playing. I was never able to verify it 100% scientifically, but I think Nvidia's Shadowplay had less of a problem with this, though still not none.


I was wondering about this, because I find AMD's (and everyone else's) argument for more cores for streaming to be a moot point if GPU encoding is there. Even if you're super duper paranoid about using Shadowplay and it "stealing" GPU processing power, chances are you have an Intel machine with an iGPU that's sitting unused. Unless of course you opted for an X79/X99 platform.

 

I guess it depends on how anal you are about the quality of your video. Or which kool-aid you drank from.


46 minutes ago, M.Yurizaki said:

I was wondering about this, because I find AMD's (and everyone else's) argument for more cores for streaming to be a moot point if GPU encoding is there. Even if you're super duper paranoid about using Shadowplay and it "stealing" GPU processing power, chances are you have an Intel machine with an iGPU that's sitting unused. Unless of course you opted for an X79/X99 platform.

 

I guess it depends on how anal you are about the quality of your video. Or which kool-aid you drank from.

Given the dedicated hardware encoder, I don't believe that NVENC requires any additional processing power vs QuickSync. If anything, there might even be (slightly) less overhead, as raw frame data needn't be transferred across the PCI-E connection to the CPU for processing.

 

1 hour ago, LAwLz said:

No HEVC or VP8/VP9 support yet. I don't think OBS supports any codec other than H.264, regardless of which encoder you use.

Hopefully it is something they will add soon. It would be great if Twitch would start supporting it too. The quality would become way better (I think, haven't actually tried it).

For Twitch to support it would mean they need additional processing power to transcode the video to H.264 (on the fly, no less) for devices without the capability to decode HEVC.


5 minutes ago, Glenwing said:

Might be worth noting that all the AM4 motherboards I've seen have display outputs on the back, so it's likely the lower Ryzen processors will have integrated graphics, and a QuickSync-type solution might become available with those.

Especially given the fact that AMD has been all about heterogeneous computing these past few years.


IMO bitrates are high enough these days that buying a CPU with extra cores just for streaming or recording is kind of stupid. Quick Sync needs a somewhat higher bitrate in my experience, but NVENC does great for both streaming and recording as long as the bitrate is above about 1500 Kbps. Twitch allows up to 3500 for non-partnered channels, so as long as your internet can handle that, there's no reason not to use it.

 

This is why I worry about Ryzen, because of how AMD is marketing it. The R7 chips are not really just for streamers, and certainly not for gamers. They are for whoever is already buying the high-end Intel stuff (6800K to 6950X), which is honestly not that huge a market for how big of a launch this is. They had better push out R5 soon, since that is really where they are going to beat out Intel in the mainstream.


Something I might test is how they compare at 720p.

If you ask me, 3500 Kbps is not enough for 1080p streaming. At least not for action games like Overwatch.


10 hours ago, LAwLz said:

Something I might test is how they compare at 720p.

If you ask me, 3500 Kbps is not enough for 1080p streaming. At least not for action games like Overwatch.

I know Sweden has special internet, but for the rest of the world, sustaining an upload at anything higher than that isn't trivial :D


1 minute ago, Ryan_Vickers said:

I know Sweden has special internet, but for the rest of the world, sustaining an upload at anything higher than that isn't trivial :D

Haha, yeah. I was not really suggesting increasing the bitrate; I was thinking of lowering the resolution. 720p might look better than 1080p at 3500 Kbps.
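
Testing that should be straightforward: downscale the source to 720p before encoding at the same 3500 Kbps, then compare against the 1080p encode. A rough sketch, again with a placeholder source filename:

```python
# Encode the same source at 1080p and at 720p, both capped at 3500 kbps,
# to check whether the lower resolution looks better at this bitrate.
# "overwatch_lossless.avi" is a placeholder source clip.
import subprocess

SOURCE = "overwatch_lossless.avi"

for height in (1080, 720):
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-vf", f"scale=-2:{height}",  # -2 keeps the width even
         "-c:v", "libx264", "-preset", "veryfast",
         "-b:v", "3500k", "-an", f"{height}p.mp4"],
        check=True,
    )
```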


4 minutes ago, LAwLz said:

Haha, yeah. I was not really suggesting increasing the bitrate; I was thinking of lowering the resolution. 720p might look better than 1080p at 3500 Kbps.

That would actually be an interesting test... take a bunch of different feeds, all at the same bitrate and all ultimately scaled down to the same resolution, but with the sources varying from less than the destination resolution to much greater than it.


12 hours ago, Spork829 said:

This is why I worry about Ryzen, because of how AMD is marketing it. The R7 chips are not really just for streamers, and certainly not for gamers. They are for whoever is already buying the high-end Intel stuff (6800K to 6950X), which is honestly not that huge a market for how big of a launch this is. They had better push out R5 soon, since that is really where they are going to beat out Intel in the mainstream.

The other part of it is that I think AMD is once more playing the "more cores is better for you" game, like they did with Bulldozer. But the fact is that the program an overwhelming majority of computer users use on a daily basis, the internet browser (or at least, any sort of cloud-enabled app), is a terribly serialized program with no way of effectively parallelizing it.


7 minutes ago, M.Yurizaki said:

The other part of it is that I think AMD is once more playing the "more cores is better for you" game, like they did with Bulldozer. But the fact is that the program an overwhelming majority of computer users use on a daily basis, the internet browser (or at least, any sort of cloud-enabled app), is a terribly serialized program with no way of effectively parallelizing it.

Unlike the FX line, however, they're now pairing that high core count with high IPC, as opposed to trying to use high core count to compensate for a complete lack of IPC. There's also the massive power consumption improvements to consider.


12 minutes ago, Ryan_Vickers said:

Unlike the FX line, however, they're now pairing that high core count with high IPC, as opposed to trying to use high core count to compensate for a complete lack of IPC. There's also the massive power consumption improvements to consider.

Still, it's a disservice to the layman to once more claim that one aspect of a processor means absolutely everything about performance, and to cherry-pick benchmarks to make themselves look better (but I blame marketing for that).

 

But I guess you have to leverage something.


27 minutes ago, M.Yurizaki said:

Still, it's a disservice to the layman to once more claim that one aspect of a processor means absolutely everything about performance, and to cherry-pick benchmarks to make themselves look better (but I blame marketing for that).

 

But I guess you have to leverage something.

What matters to me is single-threaded performance, core count, power efficiency, and cost. The FX lineup was seriously lacking in two of those, and AMD seems to have addressed that in a big way with Ryzen.


Software H.264 produces better quality than NVENC from Nvidia (their hardware H.264 encoder), or even QuickSync on Intel CPUs...

This has been proven multiple times, and the others really suck... (I'm a streamer myself, so I have tried all the different options.)

Nothing beats software H.264 quality... period.


So when people say that Ryzen will be better for streaming, they are correct, as it has more cores... and encoding LOVES more cores/threads... because you would want to use software encoding, but it's very taxing on the CPU...


1 minute ago, kladzen said:

Software H.264 produces better quality than NVENC from Nvidia (their hardware H.264 encoder), or even QuickSync on Intel CPUs...

This has been proven multiple times, and the others really suck... (I'm a streamer myself, so I have tried all the different options.)

Nothing beats software H.264 quality... period.

I recommend you read the thread before posting.

