Ultra HD Alliance Announces New Specification - Ultra HD Premium

LAwLz

 
It's about damn time that monitors and TVs caught up with my Chinese cartoons in terms of technology. Today the UHD Alliance (UHDA) announced a new standard called "Ultra HD Premium".
What is Ultra HD Premium, and how does it differ from the UHD TV you just bought? Well, here are the requirements.
 
 
Requirements:

  • Image Resolution: 3840x2160
  • Color Bit Depth: 10-bit signal
  • Color Palette (Wide Color Gamut)
    • Signal Input: BT.2020 color representation
    • Display Reproduction: More than 90% of P3 colors
       
  • High Dynamic Range
    • SMPTE ST2084 EOTF
    • A combination of peak brightness and black level either:
      • More than 1000 nits peak brightness and less than 0.05 nits black level
        OR
      • More than 540 nits peak brightness and less than 0.0005 nits black level
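
For context, SMPTE ST 2084 is the "PQ" (perceptual quantizer) transfer function, which maps a normalized signal value onto absolute luminance from 0 up to 10,000 nits. A minimal sketch of the decoding direction (the EOTF), using the constants from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> luminance in nits.
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ code value to absolute luminance (cd/m^2)."""
    e = signal ** (1 / M2)
    numerator = max(e - C1, 0.0)
    denominator = C2 - C3 * e
    return 10000.0 * (numerator / denominator) ** (1 / M1)

print(pq_eotf(0.0))  # 0.0 (reference black)
print(pq_eotf(1.0))  # 10000.0 (PQ peak luminance)
```

Unlike traditional gamma curves, PQ encodes absolute luminance, which is why HDR specs can talk about nits at all.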

 

What does this mean? It means that a UHD Premium display will need to support a much wider color space, it must support 10-bit color (which reduces banding), it must have UHD resolution, and it must have good contrast.
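
As a quick sanity check on the numbers above, static contrast ratio is just peak brightness divided by black level, so the two HDR tiers come out very differently (a small Python sketch; the tier labels are my own):

```python
# Static contrast ratio implied by each UHD Premium HDR tier.
# Contrast ratio = peak luminance / black level, both in nits (cd/m^2).

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# Tier aimed at bright LCDs: high peak, modest blacks.
lcd_tier = contrast_ratio(1000, 0.05)
# Tier aimed at OLED-style panels: lower peak, near-perfect blacks.
oled_tier = contrast_ratio(540, 0.0005)

print(f"1000 nits / 0.05 nits  -> {lcd_tier:,.0f}:1")   # 20,000:1
print(f"540 nits / 0.0005 nits -> {oled_tier:,.0f}:1")  # 1,080,000:1
```

In other words, the spec lets deep-black panels qualify with roughly half the peak brightness, because contrast is what the eye actually notices.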

 

 

Sources:

Ars Technica
Business Wire

 

Does this sound familiar? It should if you have been following AMD news recently, because they talked about supporting these very features last month. You can read more about what AMD had to say about their GPUs and BT.2020 over at AnandTech. It is well worth the read.

 

Are you excited that we will finally replace an ~80-year-old monitor standard with something more modern, designed for the types of panels used today?

Windows does not currently support HDR, but Microsoft is working on it. Both hardware and software have quite a long way to go before we see widespread adoption, but this is an important first step toward a much better-looking future.


I want games to recreate the sun. 1.6E9 nits please, I need a reason to wear my sunglasses indoors. Which do you think would look better: the one with the higher peak brightness or the one with the lower black level?

CPU: Intel 3570 GPUs: Nvidia GTX 660Ti Case: Fractal design Define R4  Storage: 1TB WD Caviar Black & 240GB Hyper X 3k SSD Sound: Custom One Pros Keyboard: Ducky Shine 4 Mouse: Logitech G500

 


Wow, display technology is evolving faster and faster.

First G-Sync/FreeSync, then higher refresh rates (144 Hz+), then high refresh rates with IPS.

Ultrawide and 4K. And now this!

 

Where is it going to end?

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


Wow, display technology is evolving faster and faster.

First G-Sync/FreeSync, then higher refresh rates (144 Hz+), then high refresh rates with IPS.

Ultrawide and 4K. And now this!

 

Where is it going to end?

 

At the point where you can't tell you're looking at a monitor :)

Desktop:     Core i7-9700K @ 5.1GHz all-core = ASRock Z390 Taichi Ultimate = 16GB HyperX Predator DDR4 @ 3600MHz = Asus ROG Strix 3060ti (non LHR) = Samsung 970 EVO 500GB M.2 SSD = ASUS PG279Q

 

Notebook:  Clevo P651RG-G = Core i7 6820HK = 16GB HyperX Impact DDR4 2133MHz = GTX 980M = 1080p IPS G-Sync = Samsung SM951 256GB M.2 SSD + Samsung 850 Pro 256GB SSD


I want games to recreate the sun; I need a reason to wear my sunglasses indoors.

Epic Troll is epic

Gamma v2.2 | i7 6700k @ 4.6ghz| Dark Rock TF | ASRock Z170 OC Formula | G-SKILL TridentZ Royal 2x16Gb 3200mhz | MSI GTX 1070 Ti Titanium | Sandisk 120Gb SSD | WD Black 1Tb HDD | Corsair RMx 850w | Corsair Spec Alpha | MSI Optix G27C2/2x19" monitors/34" Insignia tv

Secondary rig status: Blendin Blandin | Xeon E5 2670 E3 ES | Noctua L12s | ASRock X99 OC Formula | 48Gb Ram Smoothie | EVGA 980ti Superclocked+ | ADATA SU800 | SFFTime P-Atx | 

 

 


This year, Samsung's SUHD TVs with quantum dot displays will be a big seller, at least for the big-wallet fellas, I guess... It's all about the word "Premium". Marketing :blink:

Groomlake Authority


At the point where you can't tell you're looking at a monitor :)

I'd prefer just going outside then :D

And I think such a thing is probably a task for a VR headset, not a monitor. :P



Now, if only Nvidia or AMD would unlock 10-bit color (per channel) on their consumer graphics cards, so that it could actually be used by those who have a 10-bit monitor (which is very few people, since most have a hard time justifying even a true 8-bit panel and opt for 6-bit panels instead).


Wow, display technology is evolving faster and faster.

First G-Sync/FreeSync, then higher refresh rates (144 Hz+), then high refresh rates with IPS.

Ultrawide and 4K. And now this!

 

Where is it going to end?

When this stuff gets affordable.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Wow, display technology is evolving faster and faster.

First G-Sync/FreeSync, then higher refresh rates (144 Hz+), then high refresh rates with IPS.

Ultrawide and 4K. And now this!

 

Where is it going to end?

This is nothing new; my 2009 Dell IPS monitor already does all of that (I mean what this new Ultra HD Premium requires). While I can't measure the contrast ratio, it can display very deep darks and very bright colors, and blacks are very black.


Wow, display technology is evolving faster and faster.

First G-Sync/FreeSync, then higher refresh rates (144 Hz+), then high refresh rates with IPS.

Ultrawide and 4K. And now this!

 

Where is it going to end?

When 1440p or 4K becomes a standard like 1080p

Scarlet KnightIntel Core i3 6100 || Antec A40 Pro CPU Cooler || MSI Z170A Gaming M5 || Kingston HyperX 16GB DDR4-2133MHz || Samsung 850 Evo 120GB || Seagate Barracuda 1TB || Gigabyte G1 Gaming R9 390X 8GB || Seasonic M12II 620W || In Win 503 || Corsair Strafe || Steelseries Kinzu V3 MSI Edition || Dell UltraSharp U2414H || Xiaomi Alumunium Mouse Pad (S)

 

#Gadget: 

Phone: BlackBerry Classic Q20, Samsung Galaxy Note 4 S-LTE SM-N916S

Console: PlayStation 4 500GB CUH-1206A

Tablet: iPad Air 2 16GB Wi-fi Only

Laptop: MSI GE62 (i7 4720HQ || 8GB DDR3 || NVIDIA GTX960M || Samsung 650 EVO 120GB + 1TB HDD)

In-ear Monitor: Xiaomi Piston 3.0


When 1440p or 4K becomes a standard like 1080p

Nah. When we have true 14-bit-per-channel panels, OLED (with OLED's issues solved), 8K on a 24-inch screen, non-glossy, while maximizing sharpness, and a wider color spectrum than AdobeRGB (laser projectors already do this particular point), then it will stop.


This reminds me: has there been an update to Nvidia's drivers to support 10-bit color on their non-Quadro cards? My 4K monitor with 10-bit color support will be here soon.

Internets Machine: Intel 4690k w/ Be Quiet! Pure Rock 4.7Ghz. MSI Krait z97. 16GB Crucial Ballistix Sport Ram. MSI GTX 970 SLI 1520mhz. 500GB Samsung EVO 840  & 3TB WD Blue Drive. Rosewill 1000w Modular PSU. Corsair Air 540

My Beats Yo: Desktop:SMSL SA-160 Amp, KEF Q100 w/ Dayton 100w Sub Theater: Micca MB42X-C x3, MB42X x2, COVO-S x2 w/Dayton 120w Sub Headphones:  HIFiMan HE-400i, PSB M4U2, Philips Fidelio X2, Modded Fostex T50RP, ATH-M50, NVX XPT100, Phillips SHP9500, Pioneer SE-A1000, Hyper X Cloud 1&2, CHC Silverado, Superlux 668B


This reminds me: has there been an update to Nvidia's drivers to support 10-bit color on their non-Quadro cards? My 4K monitor with 10-bit color support will be here soon.

Nope. Same for AMD.

Very interesting.

Thanks for the article and summary.

COMMUNITY STANDARDS   |   TECH NEWS POSTING GUIDELINES   |   FORUM STAFF

LTT Folding Users Tips, Tricks and FAQ   |   F@H & BOINC Badge Request   |   F@H Contribution    My Rig   |   Project Steamroller

I am a Moderator, but I am fallible. Discuss or debate with me as you will but please do not argue with me as that will get us nowhere.

 

 

Character is like a Tree and Reputation like its Shadow. The Shadow is what we think of it; The Tree is the Real thing.  ~ Abraham Lincoln

Reputation is a Lifetime to create but seconds to destroy.

You have enemies? Good. That means you've stood up for something, sometime in your life.  ~ Winston Churchill

Docendo discimus - "to teach is to learn"

 


Nope. Same for AMD

I thought AMD consumer cards did support 10-bit, maybe only in Linux?



That's good news, but consumer cards don't really support 10-bit color. =/

 

And monitors with that capability are expensive and not very widespread. So basically the monitors that fall under this standard right now are going to be few and far between.

Like watching Anime? Consider joining the unofficial LTT Anime Club Heaven Society~ ^.^

 

 


That's good news, but consumer cards don't really support 10-bit color. =/

And monitors with that capability are expensive and not very widespread. So basically the monitors that fall under this standard right now are going to be few and far between.

They're getting close ;)

http://www.tweaktown.com/news/49335/aoc-u2879vf-monitor-goes-4k-freesync/index.html

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


So 10-bit panels are finally going to become the norm? That will be cool.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


Now, if only Nvidia or AMD would unlock 10-bit color (per channel) on their consumer graphics cards, so that it could actually be used by those who have a 10-bit monitor (which is very few people, since most have a hard time justifying even a true 8-bit panel and opt for 6-bit panels instead).

Isn't that exactly what AMD is doing with Polaris?

http://www.hardwarezone.com.sg/m/feature-heres-how-amd-will-make-games-movies-and-photos-look-better-ever-2016

The article mentions that the 300 series already supports this. It would be awesome if someone at LTT could do a quick test to verify.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Can someone really tell the difference between 8-bit and 10-bit? I've never really noticed.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


Now, if only Nvidia or AMD would unlock 10-bit color (per channel) on their consumer graphics cards, so that it could actually be used by those who have a 10-bit monitor (which is very few people, since most have a hard time justifying even a true 8-bit panel and opt for 6-bit panels instead).

It'll probably be a Quadro/FirePro thing for a while. Unless some mainstream or gaming monitors start coming out with this standard.


Can someone really tell the difference between 8-bit and 10-bit? I've never really noticed.

The first Samsung 4K panel had 8+2 bits with dithering; the Asus one had 8 bits. I have both, and I can assure you that you can tell the difference. It's not huge, but it's noticeable.


Now, if only Nvidia or AMD would unlock 10-bit color (per channel) on their consumer graphics cards, so that it could actually be used by those who have a 10-bit monitor (which is very few people, since most have a hard time justifying even a true 8-bit panel and opt for 6-bit panels instead).

AMD will do it. They talked about it a month ago. I don't think they said which cards will get it, but their 300 series and their new generation will get it.

 

 

 

This is nothing new; my 2009 Dell IPS monitor already does all of that (I mean what this new Ultra HD Premium requires). While I can't measure the contrast ratio, it can display very deep darks and very bright colors, and blacks are very black.

I don't think your monitor supports all of it.

It might support local dimming and 10-bit (or maybe even 12-bit) color, but probably not the Rec.2020 color space.

 

 

 

Can someone really tell the difference between 8-bit and 10-bit? I've never really noticed.

It highly depends on what you are looking at. It is not very obvious unless you look at very fine gradients.

You will be able to notice the much wider color space and the better contrast, though.
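
The point about fine gradients can be shown with a toy quantizer: neighboring values on a smooth ramp collapse onto the same 8-bit level (producing a visible band) while staying distinct at 10 bits. Illustrative Python, not tied to any real display pipeline:

```python
# Quantize a normalized value (0..1) to an n-bit level and back.
def quantize(value: float, bits: int) -> float:
    levels = 2 ** bits - 1
    return round(value * levels) / levels

# Two nearby points on a smooth gradient:
a, b = 0.500, 0.501

print(quantize(a, 8) == quantize(b, 8))    # True: merged into one 8-bit band
print(quantize(a, 10) == quantize(b, 10))  # False: still distinct at 10 bits

# Per channel that's 2**8 = 256 vs 2**10 = 1024 shades,
# i.e. four times finer steps across the same brightness range.
```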


Hate to burst your bubble, but 10-bit pirate releases of anime are not native 10-bit; they are just conversions done for the sake of smaller file sizes.

Same with FLAC audio. Crap in = crap out, even if you pretty up the toilet it's in.

You can't convert a 96kbps mono MP3 to FLAC and expect it to magically sound good. It's just a different container for the same data.
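
The "same data, different container" point holds for bit depth too: widening an 8-bit sample to a 10-bit container (sketched here with a common bit-replication trick; the function name is mine) can only ever land on 256 of the 1,024 possible levels, so no new detail appears.

```python
# Repack an 8-bit sample (0..255) into a 10-bit container (0..1023).
# Bit replication: shift left by 2 and repeat the top 2 bits, so that
# 0 maps to 0 and 255 maps to 1023.
def upconvert_8_to_10(sample: int) -> int:
    return (sample << 2) | (sample >> 6)

reachable = {upconvert_8_to_10(s) for s in range(256)}
print(len(reachable))          # 256: only 256 of 1024 levels are ever used
print(upconvert_8_to_10(255))  # 1023
```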

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

