What is the GPU doing when you watch a YouTube video?

Luka95

I see in Task Manager that my GPU is working hard when I just watch a YouTube video; I thought that wasn't a demanding task. I suppose it's doing some encoding or whatever, but I'm wondering what exactly the GPU is doing when we watch YouTube videos. I noticed that my GTX 1070 Ti sat at around 30% usage when I watched a 4K YouTube video, and if the video is in 1080p the GPU works less than that. I currently have a GTX 1060 in my system and it works hard when I watch a 4K YouTube video, around 50%. So I'm curious what job the GPU has when we watch videos on the internet?

Well, it might take a while to explain exactly why; there's a lot going on when you're watching a video. I'll leave that to someone more qualified than me, haha.

 

But maybe this will help you understand it better:

 

6 minutes ago, Luka95 said:

what exactly is the GPU doing when we watch YouTube videos?

Mostly it's disapproving of the videos you are watching. 🤣

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e-mail will alert me.

15 minutes ago, Luka95 said:

So I'm curious what job the GPU has when we watch videos on the internet?

IIRC, you can see the details of GPU usage in Task Manager, though I'm not sure it's reporting it correctly.

 

Anyway, your browser is using your GPU's hardware decoder to play those videos without using your CPU. If you disable hardware acceleration, you'll see some load on your CPU, since it's then doing software decoding.

 

It's not actually using the GPU's 3D cores, but rather a separate block responsible solely for decoding media (NVDEC, in NVIDIA's case). The decoder on the 1060 and the 1070 Ti is the same, so the load when playing a 4K video should be the same on both.
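
If you want to see the difference yourself, one quick way is to decode the same file twice with ffmpeg, once in software and once on the GPU, and compare the CPU time it reports. A minimal sketch in Python (assumptions: ffmpeg is on your PATH with CUDA/NVDEC support, and `sample.mp4` is a placeholder for any local H.264/VP9 file):

```python
import subprocess

VIDEO = "sample.mp4"  # placeholder: any local H.264/VP9 file

def bench(extra_args):
    # -benchmark reports CPU time used; "-f null -" decodes and discards frames.
    subprocess.run(["ffmpeg", "-hide_banner", "-benchmark", *extra_args,
                    "-i", VIDEO, "-f", "null", "-"], check=True)

bench([])                    # software decode: the CPU does all the work
bench(["-hwaccel", "cuda"])  # NVDEC decode: reported CPU time should drop sharply
```

The gap between the two runs is essentially what the browser's hardware-acceleration toggle is doing for you.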

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga

2 minutes ago, igormp said:

It's not actually using the GPU's 3D cores, but rather a separate block responsible solely for decoding media (NVDEC, in NVIDIA's case). The decoder on the 1060 and the 1070 Ti is the same, so the load when playing a 4K video should be the same on both.

OK, maybe I'm misremembering how it was when I watched with the 1070 Ti. Thanks for the explanation!! 🙂

8 hours ago, Luka95 said:

What exactly is the GPU doing when we watch YouTube videos?

It depends on your browser.

 

If you have a dGPU, browsers will use the underlying hardware decoder IF it's available, and fall back to software otherwise.

 

There's a catch there, though. You know those video ads every website runs now? Those use video decoding resources too. In fact, only the first video really makes use of the video decoder, even if you have a second hardware decoder available: once the decoder hits the maximum it can handle, any additional decode requests are kicked to software decoders.
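
The selection logic browsers end up with amounts to something like this toy sketch (every name here is invented for illustration; the real thing goes through platform APIs like DXVA, NVDEC, or VA-API):

```python
class DecoderBusy(Exception):
    pass

class HwDecoder:
    """Stand-in for a hardware decode session."""
    slots = 1  # pretend the engine has one free session left

    def __init__(self):
        if HwDecoder.slots <= 0:
            raise DecoderBusy()
        HwDecoder.slots -= 1

class SwDecoder:
    """CPU fallback decoder."""

def open_decoder():
    # Try a hardware session first; quietly fall back to software.
    try:
        return HwDecoder()
    except DecoderBusy:
        return SwDecoder()

print([type(open_decoder()).__name__ for _ in range(3)])
# -> ['HwDecoder', 'SwDecoder', 'SwDecoder']
```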

 

[screenshots: Task Manager GPU engine graphs]

 

You likely won't even notice this unless you are watching the video engine data.
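
If you'd rather log that engine data than eyeball Task Manager, something like this works on NVIDIA cards (assumes nvidia-smi is on your PATH; the query fields are documented under `nvidia-smi --help-query-gpu`):

```python
import subprocess
import time

# Print NVDEC vs 3D utilization once per second.
while True:
    row = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.decoder,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(row)  # e.g. "10 %, 75 %"
    time.sleep(1)
```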

 

So right now I'm watching a single 720p stream full screen at 4K (though the web browser is actually rendering at 1920x1080); it uses 10% of the video decode engine, BUT 75% of the 3D engine for some reason.

 

Now if I exit full screen, the 3D GPU goes down to 3% and the video decode goes down to 4%. What changed?

 

[screenshot: GPU engine usage before and after full screen]

 

The only difference between the beginning and the end of the charts here is Firefox being full screen with Twitch. Nothing has been closed.

 

Now, if I repeat that with YouTube on Chrome: 1080p video, VP9. 3% video decode, 2% 3D while windowed, and if I switch to full screen:

[screenshot: GPU engine usage with the YouTube video full screen]

37% 3D, 14% video decode.

 

So by all accounts I should, in theory, be able to run eight 1080p videos on the video decode engine; however, the upscaling being done by either Chrome or Firefox will only allow one or two full-screen videos (e.g. two monitors) before the video subsystem is overwhelmed.

 

So, what actually happens when you do overload it?

Well... I couldn't overload it.

 

[screenshot: GPU engine usage with six windowed YouTube videos]

 

 

Six windowed YouTube videos topped out at 40% on the video decoder and 8% on the 3D engine. The CPU, however, was at 40%, and they were all dropping frames.

[screenshot: CPU usage during the six-video test]

After I closed all but one of the videos, the CPU dropped from 40% to 13%. Running just one video puts negligible load on the CPU and GPU; it's as soon as additional videos run that the load increases (on the CPU primarily).

 

So the takeaway is that if you have hardware decoding, you can run as many videos as needed on it (e.g. H.264, VP9) until the decode engine is maxed out. One caveat: I believe it topped out at 40% because these were all windowed, and because they were windowed, YouTube likely switched them down to 720p streams automatically, since the players weren't being displayed anywhere near 4K.

 

To reframe this another way, the video decoders aren't "one video, period"; the browser is only passing the video bitstream to the decoder, so other things like audio and demuxing the actual file container are almost guaranteed to be handled in software.
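
To make that split concrete, here's a small sketch using PyAV, a Python wrapper around FFmpeg's libraries (`sample.mp4` is a placeholder): demuxing the container and decoding the bitstream are separate steps, and only the inner decode step is something a hardware engine can take over.

```python
import av  # PyAV: pip install av

container = av.open("sample.mp4")      # opening/demuxing the container: CPU
video = container.streams.video[0]

frames = 0
for packet in container.demux(video):  # packets: the raw compressed bitstream
    for frame in packet.decode():      # decoding: the only offloadable step
        frames += 1
print(f"decoded {frames} frames of {video.codec_context.name}")
```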

-> Moved to Graphics Cards

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv

6 hours ago, whm1974 said:

So watching 4K YT videos is more work than 1080p?

I guess

2 hours ago, Kisai said:

So the takeaway is that if you have hardware decoding, you can run as many videos as needed on it (e.g. H.264, VP9) until the decode engine is maxed out.

Thank you!

7 hours ago, whm1974 said:

So watching 4K YT videos is more work than 1080p?

Yeah, it's a lot more pixels to decode per frame, after all.
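
The back-of-envelope arithmetic (same frame rate assumed for both):

```python
# Pixels per second the decoder must produce at 30 fps.
fps = 30
for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpx/s")
# 1080p: 62 Mpx/s, 4K: 249 Mpx/s -- exactly 4x the pixels per frame
```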

 

3 hours ago, Kisai said:

Six windowed YouTube videos topped out at 40% on the video decoder and 8% on the 3D engine. The CPU, however, was at 40%, and they were all dropping frames.

Your results are weird, but I can't replicate those since Chrome has no hardware decoding on Linux.

 

However, I can max out the decoder on my 2060 Super with 3x 400Mbps 4K 4:2:0 streams without any frame drops or extra CPU load. With 4 streams, frames start to drop.

[screenshot: decoder utilization across multiple 4K streams]
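
If you want to reproduce that kind of load without juggling browser tabs, a sketch along these lines does it (assumptions: an ffmpeg build with CUDA/NVDEC support, and `4k_sample.mp4` as a placeholder for a high-bitrate 4K file):

```python
import subprocess

N = 4                    # simultaneous decodes to attempt
VIDEO = "4k_sample.mp4"  # placeholder: high-bitrate 4K file

# Spawn N NVDEC decodes in parallel and watch decoder utilization climb.
procs = [subprocess.Popen(["ffmpeg", "-hide_banner", "-hwaccel", "cuda",
                           "-i", VIDEO, "-f", "null", "-"])
         for _ in range(N)]
for p in procs:
    p.wait()
```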

 

3 hours ago, Kisai said:

If you have a dGPU, browsers will use the underlying hardware decoder IF it's available, and fall back to software otherwise.

That's also valid for integrated GPUs: Intel's iGPUs have Quick Sync, AMD's have the same decode block as their discrete GPUs, and mobile SoCs have a shit ton of decoders.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga

4 hours ago, igormp said:

Your results are weird, but I can't replicate those since Chrome has no hardware decoding on Linux.

That is news to me. Of course, I don't use Chrome anyway.

4 hours ago, igormp said:

 

That's also valid for integrated GPUs: Intel's iGPUs have Quick Sync, AMD's have the same decode block as their discrete GPUs, and mobile SoCs have a shit ton of decoders.

The iGPU wasn't invoked because none of the videos were on a monitor attached to the iGPU. Granted, in Windows 10 you can "steer" a program to run on the iGPU even if you're using the dGPU for output. This is what I was getting at: if you have decoding hardware, Chrome and Firefox will pass the stream to it, but they're not smart about it; they just hand it to the decoder attached to the monitor the video is on.

If you use something like a DisplayLink dock, there's no hardware acceleration on that software GPU, so ask yourself: which decoder gets used? I haven't personally tested this, but now I want to. If it behaves anything like AutoCAD, the video decode engine won't get used unless you launch the video player/web browser on the laptop screen, with the laptop screen set as the primary monitor.

 

 

10 minutes ago, whm1974 said:

That is news to me. Of course, I don't use Chrome anyway.

Firefox only recently added support for it on Linux. You can build Chromium with VA-API support for hardware decoding, but I prefer to use regular Chrome anyway.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga

2 minutes ago, igormp said:

Firefox only recently added support for it on Linux. You can build Chromium with VA-API support for hardware decoding, but I prefer to use regular Chrome anyway.

I didn't know this about FF either. And gee whiz, I've never had any trouble watching streamed 1080p videos on FF, however.

12 minutes ago, whm1974 said:

I didn't know this about FF either. And gee whiz, I've never had any trouble watching streamed 1080p videos on FF, however.

Probably because your CPU is good enough to handle 1080p with software decoding anyway. Even my old FX-6300 or the i5-4210U in my laptop can do so.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga
