I was wondering how 4K online content works.

 

So DisplayPort 1.4 can do about 32 Gbps, enough for 4K at 120 Hz.

HDMI 2.0 can do 18 Gbps, enough for 4K at 60 Hz.
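If you do the raw math (assuming 8 bits per color channel and ignoring blanking/protocol overhead, so real cables need a bit of headroom):

```python
# Rough uncompressed bandwidth for 4K video: 8-bit RGB = 24 bits/pixel,
# ignoring blanking and protocol overhead.
def raw_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

print(raw_gbps(3840, 2160, 24, 60))   # ~11.9 Gbps, fits in HDMI 2.0's 18 Gbps
print(raw_gbps(3840, 2160, 24, 120))  # ~23.9 Gbps, needs DisplayPort 1.4
```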

 

How can Amazon require just a 15 Mbps internet connection for 4K content,

Netflix a 25 Mbps connection, and Google Stadia 35 Mbps?

 

Is the actual data compressed that much? And how much quality is lost?

Am I missing something here?


10 hours ago, sebdhaese said:

Is the actual data compressed that much? And how much quality is lost? Am I missing something here?

They do compress things. Streaming is definitely shittier than watching from a Blu-ray. I've heard a 4K Plex stream can exceed 100 Mbps (that's on an internal network). So yes, Netflix and Amazon compress things quite a bit and the quality is not 100% there, but for most people it's good enough.

I just want to sit back and watch the world burn. 


A lot of 4K content is 23.976/24 fps or 29.97/30 fps. Some content is 60 fps, but most filmmakers still prefer 24 or 30 fps because a lot of people tend to equate 60 fps with soap operas and cheap TV shows.

 

In addition to this, most movies are not stored or transmitted to you as RGB (red, green, and blue values for each pixel).

 

In order to compress content better, each pixel's color information is converted into a brightness value and two chroma (color-difference) values. That's what YCbCr means: Y is the luma (brightness), and Cb and Cr are the blue-difference and red-difference chroma components. Green doesn't need its own channel because it can be recovered from Y, Cb, and Cr when decoding.
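A minimal sketch of that conversion, assuming BT.709 coefficients and RGB values normalized to 0-1 (other standards like BT.601 or BT.2020 use slightly different weights):

```python
# Convert one RGB pixel (values in 0..1) to Y'CbCr using BT.709 weights.
def rgb_to_ycbcr(r, g, b):
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma: weighted brightness
    cb = (b - y) / 1.8556                      # blue-difference chroma
    cr = (r - y) / 1.5748                      # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr(1.0, 0.5, 0.25))  # a warm orange -> (y, cb, cr)
```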

 

This is done because human eyes are more sensitive to changes in brightness than to changes in color, so it's important to keep the brightness of every pixel, while the color information can be reduced with tricks the viewer won't notice.

Then they go further with chroma subsampling (the 4:2:0 you see in video specs): they take 2×2 groups of 4 pixels, keep the brightness of each pixel, but store only the average of the chroma values for the whole group.

So, instead of sending 4 pixels × 3 bytes (red, green, blue) = 12 bytes, you now only have to compress 6 bytes (4 pixels × 1 byte of luma + 2 bytes for the average Cb and Cr).
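As a toy illustration with made-up pixel values, this is what 4:2:0 subsampling of one 2×2 block boils down to:

```python
# One 2x2 block of pixels as (Y, Cb, Cr) tuples - values are made up.
block = [(0.80, 0.10, -0.05), (0.78, 0.12, -0.04),
         (0.75, 0.09, -0.06), (0.77, 0.11, -0.05)]

luma   = [y for y, cb, cr in block]                   # keep all 4 luma samples
avg_cb = sum(cb for _, cb, _ in block) / len(block)   # 1 shared Cb for the block
avg_cr = sum(cr for _, _, cr in block) / len(block)   # 1 shared Cr for the block

print(luma, avg_cb, avg_cr)  # 6 stored values instead of 12
```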

 

This way, right from the start, each frame of a 4K video becomes 3840×2160 pixels / 4 pixels per group × 6 bytes per group = 12,441,600 bytes per frame, or ~12 MB.

For a 24 fps movie, that's ~285 MB of data per second in this YCbCr 4:2:0 form, or the equivalent of about 2.4 Gbps of bandwidth sent to the monitor, around an eighth of the 18 Gbps the HDMI cable supports.

For a 60 fps movie, you'd need around 712 MB per second, or in terms of bandwidth around 6 Gbps.
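Here's that arithmetic in a few lines of Python, assuming 8 bits per sample and 4:2:0 (1.5 bytes per pixel on average):

```python
# Raw YCbCr 4:2:0 data rate for 3840x2160 at various frame rates.
bytes_per_frame = 3840 * 2160 * 1.5          # 12,441,600 bytes, ~12 MB
for fps in (24, 60):
    bytes_per_sec = bytes_per_frame * fps
    print(f"{fps} fps: {bytes_per_sec / 2**20:.0f} MiB/s "
          f"= {bytes_per_sec * 8 / 1e9:.1f} Gbps")
# 24 fps: 285 MiB/s = 2.4 Gbps; 60 fps: 712 MiB/s = 6.0 Gbps
```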

 

For very high quality, you'd want to compress this into around 60-100 Mbps (roughly 8-12 MB/s), a compression ratio of about 24:1 to 36:1 (285 MB/s down to 8-12 MB/s).

 

For decent quality, you'd need to compress this 24 fps content, ~285 MB/s raw, down to about 30-50 Mbps (4-6 MB/s), a compression ratio of around 50:1 to 70:1. That's not hard to achieve, because consecutive frames are usually quite similar. At 15 Mbps your ratio is nearly 150:1, which is somewhat bad.
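The same ratios in code (small differences from the numbers above are just rounding):

```python
# Compression ratio: raw 24 fps YCbCr 4:2:0 stream vs. streaming bitrates.
raw_bps = 3840 * 2160 * 1.5 * 8 * 24          # ~2.39 Gbps uncompressed
for mbps in (100, 60, 50, 30, 15):
    print(f"{mbps} Mbps -> {raw_bps / (mbps * 1e6):.0f}:1")
# 100 Mbps -> 24:1 ... 30 Mbps -> 80:1, 15 Mbps -> 159:1
```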

 

So 15 Mbps is a bit low for 4K content, but doable for shows with a lot of scenes where the camera doesn't pan much (mostly static backgrounds with characters moving in front of them) and there isn't a lot of fast movement.

 

