Which of these video encodes is of higher quality?

presss

Hi guys, 

 

I'm choosing between two encodes of the same movie: one is SDR x264 at a higher bitrate with 8-bit colour, and the other is HDR x265 (HEVC) at a lower bitrate with 10-bit colour. I know x265 is far more efficient, but given these differences, which would have the superior picture quality? (My TV accepts an HDR signal, but it isn't a true high-brightness HDR monster.)

 

Best,

 

 

Screenshot_1.jpg

Screenshot_2.jpg

i9 12900K | 3080Ti FE | 32GB T-Force 8-pack binned edition @3600 CL14

 


6 minutes ago, PreSlav said:

I'm choosing between two encodes of the same movie: one is SDR x264 at a higher bitrate with 8-bit colour, and the other is HDR x265 (HEVC) at a lower bitrate with 10-bit colour.

Given the limited amount of information you've provided, I'd go with the HEVC one.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


But why 10-bit at all?

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


1 minute ago, Jurrunio said:

But why 10-bit at all?

You mean why, since my TV is not 10-bit?


10 minutes ago, PreSlav said:

You mean why, since my TV is not 10-bit?

yup


On 12/9/2019 at 3:45 AM, Jurrunio said:

But why 10-bit at all?

From what I gather, using 10-bit allows for (slightly) smaller file sizes at otherwise similar settings, though I haven't tested this much myself.
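Purely as a toy sketch (mine, not from this thread; the gradient and numbers are illustrative): one reason 10-bit can encode more efficiently is that a subtle gradient quantises to far more distinct levels, so the encoder sees many small steps instead of coarse banding it would otherwise have to hide with dither noise.

```python
# Toy sketch: how many distinct levels a subtle gradient survives
# quantisation with, at 8-bit vs 10-bit. More levels = smaller steps
# = less visible banding for an encoder to paper over with dither.

def quantize(value, bits):
    levels = (1 << bits) - 1        # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# A dark gradient covering only 2% of full range across 1000 pixels:
samples = [i / 1000 * 0.02 for i in range(1000)]

levels_8 = len({quantize(s, 8) for s in samples})
levels_10 = len({quantize(s, 10) for s in samples})
print(levels_8, levels_10)  # 6 vs 21 distinct steps
```

Six big steps across a dark sky is exactly the kind of banding encoders struggle with; 21 smaller steps is much closer to smooth.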

My eyes see the past…

My camera lens sees the present…


10 hours ago, Zodiark1593 said:

From what I gather, using 10-bit allows for (slightly) smaller file sizes at otherwise similar settings, though I haven't tested this much myself.

That's not very logical on the face of it: 10-bit colour vs 8-bit colour means each colour value uses 10 bits instead of 8, precisely because there are far more possible values (256 choices per 8-bit channel vs 1024 per 10-bit channel).

 

So intuitively, all things being equal, 10-bit video should be larger than an otherwise identical 8-bit video.

 

Now, there could be some software things going on that negate that, but I'm not versed enough to say.
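For what it's worth, that intuition does check out on the raw data. A quick back-of-the-envelope sketch (mine, assuming 4:2:0 chroma subsampling, which is standard for consumer encodes):

```python
# Back-of-the-envelope raw frame size at 1080p with 4:2:0 chroma
# subsampling (one Y sample per pixel, Cb and Cr at quarter
# resolution). This is raw, pre-compression data.

def raw_frame_bits(width, height, bits_per_sample):
    luma = width * height                       # Y: one sample per pixel
    chroma = (width // 2) * (height // 2) * 2   # Cb + Cr, quarter res each
    return (luma + chroma) * bits_per_sample

bits_8 = raw_frame_bits(1920, 1080, 8)
bits_10 = raw_frame_bits(1920, 1080, 10)
print(bits_8 // 8)       # 3110400 bytes per raw 8-bit frame
print(bits_10 / bits_8)  # 1.25 -> raw 10-bit is 25% bigger
```

Of course, the encoder's whole job is to strip that redundancy back out, which is why the raw 25% doesn't necessarily translate into a 25% bigger file.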

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


2 minutes ago, dalekphalm said:

That's not very logical on the face of it: 10-bit colour vs 8-bit colour means each colour value uses 10 bits instead of 8, precisely because there are far more possible values (256 choices per 8-bit channel vs 1024 per 10-bit channel).

 

So intuitively, all things being equal, 10-bit video should be larger than an otherwise identical 8-bit video.

 

Now, there could be some software things going on that negate that, but I'm not versed enough to say.

Color dithering requires a significant amount of data. I suppose if the intended color can be determined and encoded directly at the higher color depth, the bit rate can be reduced.
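That trade-off is easy to demonstrate with a toy sketch (mine; zlib stands in for a real video encoder's entropy coding, so the numbers are only directional): dither noise destroys exactly the long runs a compressor feeds on.

```python
import random
import zlib

# Toy demo of the dither trade-off: a smooth 8-bit ramp compresses
# extremely well (long runs of equal bytes), but adding +/-0.5 LSB of
# random dither to hide the banding makes it far harder to compress.
# zlib is only a stand-in for a video encoder's entropy coding.

random.seed(0)
ramp = [i / 4096 for i in range(4096)]  # smooth 0..1 gradient

plain = bytes(round(v * 255) for v in ramp)
dithered = bytes(
    min(255, max(0, round(v * 255 + random.uniform(-0.5, 0.5))))
    for v in ramp
)

print(len(zlib.compress(plain)), len(zlib.compress(dithered)))
# the plain ramp compresses to a fraction of the dithered version's size
```

So spending two extra bits per sample up front can genuinely be cheaper than spending bitrate on noise, which is the usual argument for 10-bit encodes even on 8-bit displays.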


5 minutes ago, Zodiark1593 said:

Color dithering requires a significant amount of data. I suppose if the intended color can be determined and encoded directly at the higher color depth, the bit rate can be reduced.

Could be - but enough to counteract the naturally larger size of each frame?

 

I'd want to see testing that compares the two. I'm sure there's a sweet spot, but which one wins: 10-bit with less dithering, or 8-bit with more dithering?

 

Would be an interesting head-to-head.


6 hours ago, dalekphalm said:

Could be - but enough to counteract the naturally larger size of each frame?

 

I'd want to see testing that compares the two. I'm sure there's a sweet spot, but which one wins: 10-bit with less dithering, or 8-bit with more dithering?

 

Would be an interesting head-to-head.

Perhaps Taran should get on it? I don't have the resources to make uploads myself (cellular internet only).

