Why are there no 1080p HDR TVs?

Hikaru12

4K is a huge scam unless you're sitting very close to your TV, can afford a massive screen, or have a decent 4K PC monitor. So why the big push for 4K when 1080p still looks amazing? Filming in HDR at 1080p would solve a lot of the issues we have now (using 4:2:0 chroma subsampling on 4K content because it takes up so much space, bandwidth caps, streaming delivery speed issues, etc.). Is this simply a money grab, or is there more to it that I'm missing?
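For what it's worth, the storage/bandwidth point is easy to sanity-check with a rough uncompressed-data-rate calculation. The 10-bit samples and 60 fps here are just example assumptions, not figures from any standard:

```python
# Back-of-the-envelope uncompressed video data rates.
# 4:4:4 carries 3 samples per pixel; 4:2:0 averages 1.5
# (full-resolution luma, chroma at quarter resolution).

def raw_mbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Uncompressed data rate in megabits per second."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e6

uhd_444 = raw_mbps(3840, 2160, 60, 10, 3.0)
uhd_420 = raw_mbps(3840, 2160, 60, 10, 1.5)
fhd_444 = raw_mbps(1920, 1080, 60, 10, 3.0)

print(f"4K 4:4:4:    {uhd_444:,.0f} Mbps")
print(f"4K 4:2:0:    {uhd_420:,.0f} Mbps")
print(f"1080p 4:4:4: {fhd_444:,.0f} Mbps")
```

Uncompressed 1080p 4:4:4 comes out at about half the rate of 4K 4:2:0, which is the trade-off being described.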

4K is fairly useless in a lot of applications, and content still isn't anywhere near plentiful enough to be worth the investment. It's at the same point "HDTV" was in the past, when no channels existed and no content was out there in reasonable quantity. It'll get there eventually, but I really have to wonder to what benefit. As you said, 1080p is more than adequate in the majority of use cases.

Don't know if I'd call 4K a "scam" even considering your points. HDR would be nice, but it's an expensive feature, so for now it only gets put on expensive TVs, and that means 4K or better. It looks like you'll be perpetually out of luck here.

-------

Current Rig

-------

1 minute ago, Misanthrope said:

Don't know if I'd call 4K a "scam" even considering your points. HDR would be nice, but it's an expensive feature, so for now it only gets put on expensive TVs, and that means 4K or better. It looks like you'll be perpetually out of luck here.

I'd much rather they improve the color space, which hasn't changed since color was introduced to TV in the '60s, than push 4K - it's a much more noticeable improvement. We're finally starting to get into the P3 color space used by cinemas, but only on 2K+ sets.

1 minute ago, Hikaru12 said:

I'd much rather they improve the color space, which hasn't changed since color was introduced to TV in the '60s, than push 4K - it's a much more noticeable improvement. We're finally starting to get into the P3 color space used by cinemas, but only on 2K+ sets.

I guess that's one of those things where people consider higher resolution inherently better than a better color space. Kind of how we've pretty much decided 3D is better than 2D for gaming, when in fact 2D games can be far more visually appealing and use resources far more effectively.

 

But it's just one of those things where a trend takes off.

I guess it's the same reason you don't see G-Sync supported on a GT 710, or a GTX 1080 2 GB VRAM edition... typically all the high-end stuff comes together as a package as you move up.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.

TV manufacturers (for the most part) see 4K as the new standard. This has resulted in an abundance of 4K televisions, so manufacturers are hesitant to release 1080p HDR TVs that would inevitably compete with them. In summary, a lot of consumers barely understand what 4K is and know even less about HDR. This has resulted in manufacturers simply adapting to the majority consumer base rather than educating people on what these terms mean.

11 minutes ago, Hikaru12 said:

4K is a huge scam unless you're sitting very close to your TV, can afford a massive screen, or have a decent 4K PC monitor. So why the big push for 4K when 1080p still looks amazing? Filming in HDR at 1080p would solve a lot of the issues we have now (using 4:2:0 chroma subsampling on 4K content because it takes up so much space, bandwidth caps, streaming delivery speed issues, etc.). Is this simply a money grab, or is there more to it that I'm missing?

 

It's all about marketing for the most part.
It's easier to sell something that says "4K" than something that says "1080p HDR". Non-technical people won't know what HDR is, and they'll think that because it's 4K, it's automatically much better than a 1080p TV.

Specs: CPU - Intel i7 8700K @ 5GHz | GPU - Gigabyte GTX 970 G1 Gaming | Motherboard - ASUS Strix Z370-G WIFI AC | RAM - XPG Gammix DDR4-3000MHz 32GB (2x16GB) | Main Drive - Samsung 850 Evo 500GB M.2 | Other Drives - 7TB/3 Drives | CPU Cooler - Corsair H100i Pro | Case - Fractal Design Define C Mini TG | Power Supply - EVGA G3 850W

Just now, TheKDub said:

 

It's all about marketing for the most part.
It's easier to sell something that says "4K" than something that says "1080p HDR". Non-technical people won't know what HDR is, and they'll think that because it's 4K, it's automatically much better than a 1080p TV.

Exactly!

6 minutes ago, TheKDub said:

 

It's all about marketing for the most part.
It's easier to sell something that says "4K" than something that says "1080p HDR". Non-technical people won't know what HDR is, and they'll think that because it's 4K, it's automatically much better than a 1080p TV.

It's a shame the techy people don't end up being the consumers who go out and buy this stuff. 

 

8 minutes ago, Just-Chill said:

In summary, a lot of consumers barely understand what 4K is and know even less about HDR. This has resulted in manufacturers simply adapting to the majority consumer base rather than educating people on what these terms mean.

 

Before you know it, 8K will be the new standard before 4K even has much content out. It's almost all buzzwords at this point, just to get people to buy more shit that doesn't add anything remarkable to their viewing experience. Most people don't calibrate their TVs either, so I guess it makes sense.

Just now, Hikaru12 said:

It's a shame the techy people don't end up being the consumers who go out and buy this stuff. 

 

It's a smaller market. Companies are going to target whichever group of people will buy the most of their products, since that's how they make the most money, so they're much more likely to go after the very large group of non-technical people than the smaller group of technical people.

6 minutes ago, TheKDub said:

 

It's a smaller market. Companies are going to target whichever group of people will buy the most of their products, since that's how they make the most money, so they're much more likely to go after the very large group of non-technical people than the smaller group of technical people.

That's what I'm harping about. I wish people would take the time to get informed rather than going "ooh, shiny," because that gives the market no incentive to improve its products - they know no one is really watching or cares if they release the same thing year after year. Some of the most basic things we could be doing to improve picture quality are taking a back seat to buzzwords, just to rake more people over the coals, while the more technical people who want more options in the market are stuck up a creek without a paddle. If somewhat affordable OLED monitors don't come out in the future, I will have lost my faith in the industry understanding its potential markets instead of just being a bunch of greedy bastards.

25 minutes ago, Ryan_Vickers said:

GTX 1080 2 GB VRAM edition...

Best product of the year 2017

Cor Caeruleus Reborn v6

Spoiler

CPU: Intel - Core i7-8700K

CPU Cooler: be quiet! - PURE ROCK 
Thermal Compound: Arctic Silver - 5 High-Density Polysynthetic Silver 3.5g Thermal Paste 
Motherboard: ASRock Z370 Extreme4
Memory: G.Skill TridentZ RGB 2x8GB 3200/14
Storage: Samsung - 850 EVO-Series 500GB 2.5" Solid State Drive 
Storage: Samsung - 960 EVO 500GB M.2-2280 Solid State Drive
Storage: Western Digital - Blue 2TB 3.5" 5400RPM Internal Hard Drive
Storage: Western Digital - BLACK SERIES 3TB 3.5" 7200RPM Internal Hard Drive
Video Card: EVGA - 970 SSC ACX (1080 is in RMA)
Case: Fractal Design - Define R5 w/Window (Black) ATX Mid Tower Case
Power Supply: EVGA - SuperNOVA P2 750W with CableMod blue/black Pro Series
Optical Drive: LG - WH16NS40 Blu-Ray/DVD/CD Writer 
Operating System: Microsoft - Windows 10 Pro OEM 64-bit and Linux Mint Serena
Keyboard: Logitech - G910 Orion Spectrum RGB Wired Gaming Keyboard
Mouse: Logitech - G502 Wired Optical Mouse
Headphones: Logitech - G430 7.1 Channel  Headset
Speakers: Logitech - Z506 155W 5.1ch Speakers

 

This topic is so controversial it's hard to say. 

I myself find content recorded in 4K looks way better, but at the same time, the industry as a whole just isn't there yet. 

Still, I prefer manufacturers pushing for new heights rather than staying put. That's what tech is: pushing the limits, not standing still. Cough cough (AMD needs to learn this logic).

Just because 4K hasn't become fully ubiquitous doesn't mean it's a scam. Would you say 1080P was a scam nearly a decade ago? I paid over $1500 for my Sony KDL-42V4100.

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580

4K signals are widely available in Japan and Korea, for example. I would definitely not call it a scam. Also, newer HEVC encoders are able to squeeze a 4K signal into acceptable bitrates, and the algorithms are getting more and more efficient.

 

Speaking of Japan, the last two Olympics were recorded and broadcast in 8K by NHK (the national TV network), and since there are still no commercially available 8K TVs (don't confuse TVs with monitors), they organized public events where people could go and watch the Olympics in 8K. For Tokyo 2020, they will broadcast the Olympics in 8K worldwide.

At the moment, HDR guidelines specify that any panel carrying HDR capabilities must be 4K-ready. 

 

Neither 4K nor HDR is a total money grab. The problem is tying everything together and viewing it properly: the TV has to be able to process proper HDR10 metadata, the source has to be mastered properly for HDR, drivers and software have to support HDR viewing, etc. 

 

Also, 4K content allows a much wider color space, making it easier to master HDR. 

14 hours ago, BlueChinchillaEatingDorito said:

Just because 4K hasn't become fully ubiquitous doesn't mean it's a scam. Would you say 1080P was a scam nearly a decade ago? I paid over $1500 for my Sony KDL-42V4100.

It's a scam because they're not doing a good job of educating people that they need really big TVs to make use of 4K. With 1080p, that wasn't so much the case. They're just saying 4K is better, and then people buy smaller TVs, sit at a normal seating distance of about 9 feet, and think they're getting a deal.
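The seating-distance point can be sketched with the usual 1-arcminute visual-acuity rule of thumb. The 55" diagonal and 16:9 shape below are just example assumptions:

```python
import math

# Rule of thumb: 20/20 vision resolves roughly 1 arcminute, so a pixel
# stops being individually distinguishable once it subtends less than
# that. Beyond this distance, extra resolution buys you little.

ARCMIN = math.radians(1 / 60)

def max_benefit_distance_ft(diagonal_in, horizontal_px):
    """Farthest distance (feet) at which a pixel on a 16:9 screen
    still subtends a full arcminute."""
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_pitch = width_in / horizontal_px
    return pixel_pitch / math.tan(ARCMIN) / 12

for px, label in [(1920, "1080p"), (3840, "4K")]:
    d = max_benefit_distance_ft(55, px)
    print(f'55" {label}: full benefit within ~{d:.1f} ft')
```

For a 55" set this lands around 3.6 ft for 4K versus about 7.2 ft for 1080p, which is why 4K is hard to appreciate from a 9-foot couch.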

14 hours ago, Niksa said:

 

4K signals are widely available in Japan and Korea, for example. I would definitely not call it a scam. Also, newer HEVC encoders are able to squeeze a 4K signal into acceptable bitrates, and the algorithms are getting more and more efficient.

 

It's not so much the bitrate that's the issue; it's that master content is recorded in 4:4:4, and that's not available to the consumer market because it's too much to deliver over the wire. If we had gone with 1080p and HDR, we wouldn't have this issue. You're right, though; the main issue here in America is greedy monopoly ISPs, plus the vastness of the States and the variety of landscapes that make delivering fast internet very difficult. 

 

Quote

Also, 4K content allows a much wider color space, making it easier to master HDR. 

Resolution has nothing to do with color space. I don't follow what you're saying.

47 minutes ago, Hikaru12 said:

It's a scam because they're not doing a good job educating people that they need really big TV's to make use of 4K. 1080P, that wasn't so much the case. They're just saying 4K is better and then people buy smaller TV's and sit at normal seating distance at 9 feet and think they're getting a deal.

That's still not a valid argument to say it's a scam. That's the consumer just being ignorant by not doing their research before they buy.

1 hour ago, Hikaru12 said:

 

It's not so much the bitrate that's the issue; it's that master content is recorded in 4:4:4, and that's not available to the consumer market because it's too much to deliver over the wire. If we had gone with 1080p and HDR, we wouldn't have this issue. You're right, though; the main issue here in America is greedy monopoly ISPs, plus the vastness of the States and the variety of landscapes that make delivering fast internet very difficult. 

 

Resolution has nothing to do with color space. I don't follow what you're saying.

There isn't really a big benefit to full chroma resolution.

 

The color space used in combination with UHD is much bigger.

https://en.wikipedia.org/wiki/Rec._2020

8 minutes ago, .spider. said:

There isn't really a big benefit to full chroma resolution.

 

The color space used in combination with UHD is much bigger.

https://en.wikipedia.org/wiki/Rec._2020

Fair enough, but no one has a set that even touches the Rec. 2020 color space requirements; at best, we aim for high-90s-percent P3 coverage. We would probably need 12-bit sets before we start seeing Rec. 2020. I think the main issue is that displays have not caught up with media capture technology. 
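The relative sizes of the gamuts being discussed can be roughly compared by taking triangle areas of the standard primaries in CIE 1931 xy space. This is a crude comparison (xy areas aren't perceptually uniform and exaggerate the greens), but it shows the ordering:

```python
# Chromaticity (x, y) coordinates of the R, G, B primaries of each
# standard, as published in the respective specifications.
PRIMARIES = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(tri):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = gamut_area(PRIMARIES["Rec. 709"])
for name, tri in PRIMARIES.items():
    area = gamut_area(tri)
    print(f"{name:9s}: area {area:.4f} ({area / base:.2f}x Rec. 709)")
```

By this crude measure, the P3 triangle is only around 70-some percent of the area of Rec. 2020, which matches the "high 90s P3, nowhere near Rec. 2020" situation described above.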

31 minutes ago, Hikaru12 said:

Fair enough, but no one has a set that even touches the Rec. 2020 color space requirements; at best, we aim for high-90s-percent P3 coverage. We would probably need 12-bit sets before we start seeing Rec. 2020. I think the main issue is that displays have not caught up with media capture technology. 

Do they need to though?  Capturing will always be ahead of display, be it sound or visual, and for good reason.  When you capture, you often then want to edit, and having the extra data gives you that option.

27 minutes ago, Ryan_Vickers said:

Do they need to though?  Capturing will always be ahead of display, be it sound or visual, and for good reason.  When you capture, you often then want to edit, and having the extra data gives you that option.

Yes, having extra data to play around with gives you creative choice in how best to represent your image or sound, but you also run into the issue of not being able to make the right picks because your display isn't up to snuff to show what you captured. You have all these extra colors and can now make some nice design choices, but your display only shows you a tiny fraction of them, so you're making wrong assumptions about the colors to begin with. This has been somewhat mitigated by 10-bit monitors, but there's still a ways to go. No one gets to see your great capture work because it's either too cost-prohibitive or the technology's not there yet.
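On the 10-bit point: the reason bit depth matters so much for wider gamuts is just arithmetic. Spreading a bigger color volume over the same number of code values makes each step coarser and banding more visible. A trivial sketch of the step counts:

```python
# Code values per channel at common bit depths. Stretching a wider
# gamut or a brighter HDR range over the same number of steps makes
# each step larger, which is why 8-bit wide-gamut content bands easily.

def levels(bits: int) -> int:
    """Number of code values per channel at a given bit depth."""
    return 2 ** bits

for bits in (8, 10, 12):
    print(f"{bits}-bit: {levels(bits):>4} steps per channel, "
          f"{levels(bits) ** 3:,} addressable colors")
```

Going from 8-bit to 10-bit quadruples the steps per channel (256 to 1024), which is roughly what it takes to keep gradients smooth across a P3 or larger gamut.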

HDR is a "high end" feature, so it only gets shuffled into "high end" specs like 4K. To those manufacturers, putting HDR on 1080p is like adding a turbo to a four-cylinder vehicle. I guess.

 

Also, few TVs are completely compliant with either HDR10 or Dolby Vision, as they don't reach the 1,000 cd/m^2 requirement. (EDIT: this may not be a real requirement... but yeah, there's no real point in having image contrast if your backlight isn't bright.)
