Two resolutions at the same bitrate?

I've heard that a 1080p Blu-ray has bitrates around 15 Mbps (or much higher), while streamed 4K has the same bitrate (15 Mbps). Doesn't this mean that streamed 4K will look worse than a 1080p Blu-ray? What's the point of the additional pixels (which should provide additional detail) if they aren't even used?

H265 looks really good.

21 minutes ago, Enderman said:

 

How did this answer any of my questions?

 

21 minutes ago, BubblyCharizard said:

It's got much more to do with the codecs.

 

Compression is the biggest issue.

OK, let me rephrase my question: what happens when you compress two videos of different resolutions with the same codec at the same bitrate? Which one will look better?

 

21 minutes ago, Some Random Member said:

H265 looks really good.

Wat?

It's due to the different types of compression and encoding used for the videos...

37 minutes ago, MyName13 said:

I've heard that a 1080p Blu-ray has bitrates around 15 Mbps (or much higher), while streamed 4K has the same bitrate (15 Mbps). Doesn't this mean that streamed 4K will look worse than a 1080p Blu-ray? What's the point of the additional pixels (which should provide additional detail) if they aren't even used?

This is a hard question to answer.

 

A 1080p Blu-ray has a bitrate that, for the most part, is higher than you can realistically perceive, so you can convert it to a streamable format like H.264 without visible loss.

A 4K video stream, meanwhile, will have a bitrate a little lower than you would ideally want for a clean video signal (but it will still look fine on a TV).

 

Both of these sources will look good displayed on a 4K TV. The 4K one will be sharper because the resolution is higher, but the 1080p one will have much better colours and a cleaner picture (no visible compression artefacts).

 

I hope that answers your question.

10 minutes ago, TrigrH said:

This is a hard question to answer.

 

A 1080p Blu-ray has a bitrate that, for the most part, is higher than you can realistically perceive, so you can convert it to a streamable format like H.264 without visible loss.

A 4K video stream, meanwhile, will have a bitrate a little lower than you would ideally want for a clean video signal (but it will still look fine on a TV).

 

Both of these sources will look good displayed on a 4K TV. The 4K one will be sharper because the resolution is higher, but the 1080p one will have much better colours and a cleaner picture (no visible compression artefacts).

 

I hope that answers your question.

Even though Blu-ray might have an unnecessarily high bitrate, doesn't it still have a much higher bitrate than streams, and thus better quality (15+ Mbps on Blu-ray vs 5-6 Mbps on Netflix)?

My question is: how can 4K at the same or lower bitrate than a 1080p Blu-ray look better (assuming the 1080p Blu-ray hasn't crossed the point of diminishing returns, which it probably won't at ~15 Mbps)? It has more pixels, so it should be sharper, but it also has the same bitrate, so how can it look better?
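To put rough numbers on that intuition, here's a quick back-of-the-envelope sketch in Python. It only divides bitrate by pixel throughput; real encoders spend bits very unevenly, so treat it as an intuition aid, not a quality predictor.

```python
# Average bits available per pixel per frame at a fixed bitrate.
# Real encoders exploit redundancy, so this is only a rough intuition aid.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    return (bitrate_mbps * 1_000_000) / (width * height * fps)

for label, (w, h) in {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}.items():
    print(f"{label}: {bits_per_pixel(15, w, h, 24):.3f} bits/pixel at 15 Mbps, 24 fps")

# 1080p: ~0.301 bits/pixel; 4K UHD: ~0.075 bits/pixel.
# At the same bitrate, 4K gets a quarter of the budget per pixel,
# which is exactly the trade-off being asked about.
```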

43 minutes ago, MyName13 said:

Even though Blu-ray might have an unnecessarily high bitrate, doesn't it still have a much higher bitrate than streams, and thus better quality (15+ Mbps on Blu-ray vs 5-6 Mbps on Netflix)?

???? Blu-ray has a much higher bitrate than a 1080p stream?

Quote

My question is how can 4K at the same or lower bitrate than a 1080p blu ray look better (assuming the 1080p blu ray hasn't crossed the point of diminishing returns which it probably won't at ~15 Mbps)?It has more pixels so it should be sharper, but it also has the same bitrate so how can it look better?

Blu-ray is already past the point of diminishing returns at 15-25 Mbps, plus it's encoded in a less efficient but more secure manner (I think) than a web stream.

A typical Blu-ray movie is 25 GB? I can re-encode it to a streamable format down to ~8 GB with no perceptible quality loss.

 

The reason the 4K file looks better is because it's more efficiently encoded.
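If you want to test this yourself, a sketch like the following encodes one clip at two resolutions with the same average bitrate so you can A/B them. It assumes ffmpeg with libx264 is on your PATH; "source.mkv" is a hypothetical 4K master.

```python
# Sketch: same source, two resolutions, identical average bitrate.
# Assumes ffmpeg with libx264 is installed; "source.mkv" is hypothetical.
import subprocess

SOURCE = "source.mkv"

for out, scale in [("out_1080p.mp4", "1920:1080"), ("out_2160p.mp4", "3840:2160")]:
    subprocess.run([
        "ffmpeg", "-y",
        "-i", SOURCE,
        "-vf", f"scale={scale}",
        "-c:v", "libx264",
        "-b:v", "15M",   # same target bitrate for both encodes
        "-an",           # drop audio so it doesn't skew the comparison
        out,
    ], check=True)
```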

26 minutes ago, TrigrH said:

???? Blu-ray has a much higher bitrate than a 1080p stream?

 

Blu-ray is already past the point of diminishing returns at 15-25 Mbps, plus it's encoded in a less efficient but more secure manner (I think) than a web stream.

A typical Blu-ray movie is 25 GB? I can re-encode it to a streamable format down to ~8 GB with no perceptible quality loss.

 

The reason the 4K file looks better is because it's more efficiently encoded.

1) Yes, isn't that a well-known fact?

2) So a 1080p Blu-ray wastes bandwidth on unnoticeable details, while 4K loses the unnoticeable details, gets an increase in resolution, and then looks better?

3) If everything uses the same codec (either H.264 or H.265), how can they have different encoding efficiencies? A codec has a set efficiency, as far as I know. Are you thinking of the "aggressiveness" of the compression (which determines the bitrate)?

 

What is the point of diminishing returns for 1080p and 4K in terms of bitrate?

15 minutes ago, MyName13 said:

1) Yes, isn't that a well-known fact?

2) So a 1080p Blu-ray wastes bandwidth on unnoticeable details, while 4K loses the unnoticeable details, gets an increase in resolution, and then looks better?

3) If everything uses the same codec (either H.264 or H.265), how can they have different encoding efficiencies? A codec has a set efficiency, as far as I know. Are you thinking of the "aggressiveness" of the compression (which determines the bitrate)?

 

What is the point of diminishing returns for 1080p and 4K in terms of bitrate?

Blu-ray doesn't use the same codec; it uses some kind of lossless codec, while typical web stuff uses H.264.

But yes, for example, YouTube 4K looks a lot worse than a 1080p Blu-ray.

But the point of diminishing returns on a Blu-ray is to make it look as good as possible, because that is what Blu-ray stands for.

5 hours ago, MyName13 said:

How did this answer any of my questions?

It literally just depends on the codec used to encode the video...

Clearly you need to do a bit more research on how encoding works, which is why I linked you that video.

3 hours ago, Enderman said:

It literally just depends on the codec used to encode the video...

Clearly you need to do a bit more research on how encoding works, which is why I linked you that video.

Don't compression standards have different compression levels (which determine the bitrate)? In that case, if two videos with different resolutions were encoded with the same compression standard and have the same bitrate, which one would look better? I already know everything mentioned in that video.
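For what it's worth, the usual "compression level" knobs in an x264/x265 encode are a constant quality factor (CRF) or a target bitrate, with the preset controlling how hard the encoder works at either. Here's a hedged sketch of both modes (real libx264 options; the file names are hypothetical):

```python
# Sketch: constant-quality (CRF) vs fixed-bitrate encoding with libx264.
# Assumes ffmpeg is installed; "input.mkv" is a hypothetical file.
import subprocess

# Constant quality: bitrate floats to hit a quality level. Lower CRF =
# higher quality; 23 is the libx264 default, 18 is near-transparent for
# many sources.
subprocess.run(["ffmpeg", "-y", "-i", "input.mkv",
                "-c:v", "libx264", "-crf", "18", "-preset", "slow",
                "crf18.mp4"], check=True)

# Fixed average bitrate: quality floats with scene complexity.
subprocess.run(["ffmpeg", "-y", "-i", "input.mkv",
                "-c:v", "libx264", "-b:v", "6M",
                "cbr6m.mp4"], check=True)
```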

8 hours ago, Some Random Member said:

Blu-ray doesn't use the same codec; it uses some kind of lossless codec, while typical web stuff uses H.264.

Oh my god, no. Blu-ray does NOT use anything lossless, and it doesn't use anything even CLOSE to lossless.


The Blu-ray standard allows for three possible compression codecs: MPEG-2, VC-1, and H.264. MPEG-2 has not been used since the early years, and you still see the occasional VC-1 disc out there, but virtually every disc made today, and the MAJORITY ever made, uses H.264, since it's the most efficient. It most certainly does not use "some kind of lossless codec".


1 hour ago, AshleyAshes said:

Oh my god, no. Blu-ray does NOT use anything lossless, and it doesn't use anything even CLOSE to lossless.


The Blu-ray standard allows for three possible compression codecs: MPEG-2, VC-1, and H.264. MPEG-2 has not been used since the early years, and you still see the occasional VC-1 disc out there, but virtually every disc made today, and the MAJORITY ever made, uses H.264, since it's the most efficient. It most certainly does not use "some kind of lossless codec".

The more you know

7 hours ago, MyName13 said:

if two videos with different resolutions were encoded with the same compression standard and have the same bitrate, which one would look better

It would still depend on the codec, because every encoder works in a different way...

It would also depend on the two resolutions and what bitrate.

If you try to compress 4 pixels into 1, or 5.7 pixels into 1, you will get very different results, even when using the same encoder.
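To put numbers on those ratios, here's a quick sketch computing how many source pixels map to one output pixel for some common resolution pairs (pixel counts only; actual scalers do much more):

```python
# Pixel-count ratios between common resolutions: how many source pixels
# collapse into one output pixel when downscaling.

RES = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
}

def ratio(src: str, dst: str) -> float:
    sw, sh = RES[src]
    dw, dh = RES[dst]
    return (sw * sh) / (dw * dh)

print(f"4K UHD -> 1080p: {ratio('4K UHD', '1080p'):.2f}:1")  # 4.00:1, a clean integer
print(f"4K UHD -> 720p:  {ratio('4K UHD', '720p'):.2f}:1")   # 9.00:1
print(f"1440p  -> 1080p: {ratio('1440p', '1080p'):.2f}:1")   # 1.78:1, non-integer
```

Non-integer ratios are messier because each output pixel blends fractional source pixels, which is part of why the same encoder behaves differently at different scaling factors.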

 

Again, you need to research how encoding works: lossy vs lossless, different codecs, individual algorithms, scaling, etc.

There is no "X is always better than Y".

Maybe a degree in computer science or computer engineering will give you the answer you're looking for.

Here's the critical problem when you try to argue about "resolution" in a movie: a lot of a movie is just blurry. Like, let's look at this random shot from Pete's Dragon. (Fun fact: the overtime I did on this movie bought me a new queen-size bed in cash! :3)

 

Movies are shot with lenses, and only a certain depth is actually in focus; things fore and aft of the focal point go out of focus. The background outside the truck is well out of focus. The truck cab is out of focus. The passenger is out of focus. The steering wheel is out of focus. While Karl Urban's face and torso are mostly in focus, even parts of his arms and hands are out of focus. This does not change if you go from 1080p to 4K, or 8K, or anything higher. This is blurry BY DESIGN: a lens with a specific focal length and aperture width was chosen for this aesthetic effect. And most movies are like that.

 

[Screenshot: Pete's Dragon (2016), 1080p — truck scene with shallow depth of field]

 

How about the CG dragon itself? Ah, right: motion blur. Even the computer render simulates the shutter speed of a camera, so motion is blurred. The render even simulates depth of field, so parts of the dragon are out of focus.

 

[Screenshot: Pete's Dragon (2016), 1080p — CG dragon with motion blur and simulated depth of field]

 

Movies are not video games, where everything is in focus and crisp and sharp.  They are soft and organic and you will see HUGELY diminishing returns as resolution goes up on the viewing end.

 

But there is one thing that really punishes compression: grain. Analog film has grain, and so does a digital camera's sensor. In VFX we literally have different film stocks and cameras profiled so we can replicate their grain when adding elements or making changes, ensuring the grain stays consistent throughout a shot. We're going to use The Walking Dead as an example. Why? It's grainy. The Walking Dead, a series which I have NEVER worked on, is shot on 16mm analog film because the production team wants a certain effect. This results in a LOT of noise.

 

Well, this looks like crap, don't it? I don't mean the grain; grain itself has CHARACTER. But video compression optimizes for smooth detail, and we can already see where the grain is causing a lot of compression artifacts: the grain changes with every frame, it's noisy, and it's a LOT more information to fit within the data stream. The artifacts are the cost of that grain. A higher bitrate will go a LOT farther here than more resolution. And look, Glen in the background here isn't even in focus anyway; he's slightly out of focus even in this wider shot.

 

[Screenshot: The Walking Dead 1x02 "Guts" — heavy 16mm film grain with visible compression artifacts]
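As a toy illustration of why noise is so expensive, here's a sketch that losslessly compresses a synthetic smooth frame and a grainy one. zlib is not a video codec, so this is only an intuition pump: grain simply leaves little redundancy for any compressor to exploit.

```python
# Toy demo: noise resists compression. zlib is lossless and not a video
# codec, but the entropy argument carries over to intra-frame coding.
import zlib
import numpy as np

rng = np.random.default_rng(0)
h, w = 1080, 1920

# A smooth horizontal gradient, like an out-of-focus background.
smooth = np.tile(np.linspace(0, 255, w, dtype=np.uint8), (h, 1))
# The same gradient with heavy "grain" added per pixel.
grainy = np.clip(smooth.astype(np.int16) + rng.integers(-24, 25, (h, w)),
                 0, 255).astype(np.uint8)

for label, frame in [("smooth", smooth), ("grainy", grainy)]:
    size = len(zlib.compress(frame.tobytes(), 6))
    print(f"{label}: {size / 1e3:,.0f} kB compressed")
# The grainy frame typically compresses to orders of magnitude more data
# than the smooth one, even though they look broadly similar from afar.
```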

 

There is a reason why the vast majority of movies today, projected on GIANT theater screens, are only 2K, and you aren't crying "I CAN SEE THE PIXELS": you get hugely diminishing returns in viewer experience as resolution goes up beyond 1080p.

1 hour ago, AshleyAshes said:

Snippity

I learned about the bandwidth requirements of film grain firsthand when encoding Serial Experiments Lain. Most episodes fell between 200-250 MB at a set CRF, but one episode that uses a lot of film grain exceeded 800 MB with the same setting.

 

Film grain is expensive.


3 minutes ago, Zodiark1593 said:

I learned about the bandwidth requirements of film grain firsthand when encoding Serial Experiments Lain. Most episodes fell between 200-250 MB at a set CRF, but one episode that uses a lot of film grain exceeded 800 MB with the same setting.

 

Film grain is expensive.

Yup, and I rather like grain, especially on old film stock. Modern digital cameras have only a fairly fine noise profile, but with older anime or movies transferred from film to Blu-ray, that grain is part of the texture and character of the image. <3 But yeah, it is NOT friendly with lower bitrates.

17 hours ago, AshleyAshes said:

Yup, and I rather like grain, especially on old film stock. Modern digital cameras have only a fairly fine noise profile, but with older anime or movies transferred from film to Blu-ray, that grain is part of the texture and character of the image. <3 But yeah, it is NOT friendly with lower bitrates.

Having lots of things change between frames (e.g., film grain) brutalizes the effectiveness of inter-frame compression, even in newer codecs such as HEVC. There's not really much of a way around it that I can see.
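The inter-frame point can be shown with the same kind of toy sketch: compress the frame-to-frame difference of a static scene, with and without fresh per-frame grain. Again zlib stands in for a codec's residual coding, so it's purely illustrative.

```python
# Toy demo: per-frame grain destroys temporal redundancy. The difference
# between consecutive frames is roughly what inter prediction must encode.
import zlib
import numpy as np

rng = np.random.default_rng(1)
h, w = 1080, 1920
scene = rng.integers(0, 256, (h, w), dtype=np.uint8)  # any static image

def residual_size(noise_amp: int) -> int:
    f1 = scene.astype(np.int16) + rng.integers(-noise_amp, noise_amp + 1, (h, w))
    f2 = scene.astype(np.int16) + rng.integers(-noise_amp, noise_amp + 1, (h, w))
    diff = (f2 - f1).astype(np.int16)  # the temporal residual
    return len(zlib.compress(diff.tobytes()))

print(f"static scene, no grain:  {residual_size(0) / 1e3:,.0f} kB of residual")
print(f"static scene, grain +-20: {residual_size(20) / 1e3:,.0f} kB of residual")
# With no grain the residual is all zeros and compresses to almost nothing;
# with fresh grain each frame it balloons, despite an identical scene.
```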


6 hours ago, Zodiark1593 said:

Having lots of things change between frames (e.g., film grain) brutalizes the effectiveness of inter-frame compression, even in newer codecs such as HEVC. There's not really much of a way around it that I can see.

Yeah, compression is really about optimizing for things that are the same, be they flat areas or things that only change a bit between frames (a gross oversimplification of lossy compression, but still), so noise, highly detailed textures, and rapid changes are going to benefit the least.

 

This is a major reason I only remux my anime discs: I want them to look as they do on the disc itself, rather than incurring additional loss through recompression.

 

Personally, though, for VIEWING I see hugely diminishing returns on anything beyond 1080p. I've watched The Force Awakens at 4K on my 65" TV and honestly, side by side with 1080p, I could not tell the difference from the couch. I'm sure if I got my nose to the screen I could see a difference, but that's not how I actually watch my TV.

 

And again, gaming: since everything is so sharp and in focus, not to mention the aliasing that goes on, yeah, 4K makes GAMES look better.

7 minutes ago, AshleyAshes said:

Yeah, compression is really about optimizing for things that are the same, be they flat areas or things that only change a bit between frames (a gross oversimplification of lossy compression, but still), so noise, highly detailed textures, and rapid changes are going to benefit the least.

 

This is a major reason I only remux my anime discs: I want them to look as they do on the disc itself, rather than incurring additional loss through recompression.

 

Personally, though, for VIEWING I see hugely diminishing returns on anything beyond 1080p. I've watched The Force Awakens at 4K on my 65" TV and honestly, side by side with 1080p, I could not tell the difference from the couch. I'm sure if I got my nose to the screen I could see a difference, but that's not how I actually watch my TV.

 

And again, gaming: since everything is so sharp and in focus, not to mention the aliasing that goes on, yeah, 4K makes GAMES look better.

It would be a little impractical for me to throw 5 GB episodes onto my smartphone or flash drive.

 

Though HEVC encoding with Main10 seems to be very good for anime encodes, with a lot of fine detail preserved. The only bad part is that I only have two devices in the house that can actually decode it (my desktop and my phone).
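For reference, a 10-bit HEVC ("Main10") encode of that sort might look like the sketch below. It assumes ffmpeg built with libx265; "episode.mkv" is a hypothetical input.

```python
# Sketch: 10-bit HEVC ("Main10") encode. Assumes ffmpeg with libx265;
# "episode.mkv" is a hypothetical input file.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-i", "episode.mkv",
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",  # 10-bit pixel format -> Main10 profile
    "-crf", "20",
    "-preset", "medium",
    "-c:a", "copy",             # keep the original audio untouched
    "episode_x265.mkv",
], check=True)
```

10-bit encoding tends to reduce banding in flat anime gradients, which is a common reason it's favoured for this material.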

  • 3 weeks later...
On 4/20/2018 at 3:31 AM, MyName13 said:

I've heard that a 1080p Blu-ray has bitrates around 15 Mbps (or much higher), while streamed 4K has the same bitrate (15 Mbps). Doesn't this mean that streamed 4K will look worse than a 1080p Blu-ray? What's the point of the additional pixels (which should provide additional detail) if they aren't even used?

The short answer is yes: streamed 4K often DOES look worse than 1080p Blu-ray.

 

And people ask why some of us still buy Blu-Ray discs...


On 4/22/2018 at 1:07 AM, Zodiark1593 said:

It would be a little impractical for me to throw 5 GB episodes onto my smartphone or flash drive.

 

Though HEVC encoding with Main10 seems to be very good for anime encodes, with a lot of fine detail preserved. The only bad part is that I only have two devices in the house that can actually decode it (my desktop and my phone).

Assuming you're watching said content at home, you could try Plex (assuming your smartphone can Direct Play the content, anyway). And when you're out, you can still stream it from your Plex server, which will transcode on the fly to a lower bitrate.


21 minutes ago, dalekphalm said:

Assuming you're watching said content at home, you could try Plex (assuming your smartphone can Direct Play the content, anyway). And when you're out, you can still stream it from your Plex server, which will transcode on the fly to a lower bitrate.

That might be a tall-ish order on 6 GB of mobile data (when my phone is actually home to do the portable hotspot thing) and 2007-era networking equipment. :P

 
