Will we ever see monitor specs cross over to TVs?

Logun

Will we ever see some of the new monitor technologies cross over and be implemented in TVs? Specifically, I'm thinking about G-Sync and FreeSync being implemented in 4K TVs.

 

As well as pushing higher refresh rates to larger screens.

 

I would love nothing more than to have a 50" or 55" 4K TV with a max refresh rate of 120Hz and integrated G-Sync or FreeSync. (I believe DisplayPort 1.3 has a high enough bit rate to allow for that.)
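Rough back-of-the-envelope math seems to support that. Here's a quick sketch (the DP 1.3 figure assumes its HBR3 mode, 4 lanes at 8.1 Gbit/s with 8b/10b encoding; the blanking overhead is just an estimate):

```python
# Back-of-the-envelope check: can DisplayPort 1.3 carry 4K at 120Hz?
# Approximations only; real signals use CVT-R2 reduced-blanking timings.

h_active, v_active = 3840, 2160      # 4K UHD
refresh_hz = 120
bits_per_pixel = 24                  # 8-bit RGB, no HDR

blanking_overhead = 1.05             # ~5% extra for blanking (rough estimate)
required_gbps = h_active * v_active * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# DP 1.3 (HBR3): 4 lanes x 8.1 Gbit/s raw; 8b/10b encoding leaves ~25.92 Gbit/s usable
dp13_usable_gbps = 4 * 8.1 * (8 / 10)

print(f"Required : {required_gbps:.1f} Gbit/s")
print(f"DP 1.3   : {dp13_usable_gbps:.2f} Gbit/s usable")
print("Fits" if required_gbps <= dp13_usable_gbps else "Doesn't fit")
```

So 3840x2160 at 120Hz with 8-bit colour should just squeak under DP 1.3's usable bandwidth.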

 

I want the world!

 

I did some testing a number of months ago with the Seiki 39" TV, with the 50" firmware flashed on it to enable true 120Hz refresh at 1080p. 

1) 4K truly is eye-poppingly gorgeous. Instantly noticeable.

2) 120Hz was nice, but not as instantly noticeable as the resolution improvement (and this is coming from someone who has been at 1600p for the last few years).

3) Obviously GPU horsepower is slowly catching up, which makes the sync technologies just about a "must-have" if you're planning on playing anything from the past 5 years or so.

 

Is the market for such a panel just too small? Should I begin to look into an Indiegogo project akin to what these guys did? https://www.indiegogo.com/projects/ncase-m1-mini-itx-pc-case

 

An Indiegogo project for this kind of thing would probably never fly.

TVs aren't meant for gaming; all movies are filmed at 24 FPS (some exceptions), so there is no need for 144Hz TVs.

ITX Monster: CPU: I5 4690K GPU: MSI 970 4G Mobo: Asus Formula VI Impact RAM: Kingston 8 GB 1600MHz PSU: Corsair RM 650 SSD: Crucial MX100 512 GB HDD: laptop drive 1TB Keyboard: logitech G710+ Mouse: Steelseries Rival Monitor: LG IPS 23" Case: Corsair 250D Cooling: H100i

Mobile: Phone: Broken HTC One (M7) Totally Broken OnePlus ONE Samsung S6 32GB  :wub:  Tablet: Google Nexus 7 2013 edition
 


True, but even now they are testing out filming at different FPS - unless The Hobbit was the first(?) and last of its kind.


One example would be the Hobbit movie series. It's shot at 48 FPS. But I can't see companies making special TVs for these few rare occasions.

 eGPU Setup: Macbook Pro 13" 16GB DDR3 RAM, 512GB SSD, i5 3210M, GTX 980 eGPU

New PC: i7-4790k, Corsair H100iGTX, ASrock Fatal1ty Z97 Killer, 24GB Ram, 850 EVO 256GB SSD, 1TB HDD, GTX 1080 Fractal Design R4, EVGA Supernova G2 650W

 

 


No, because monitor specs on a TV are not necessary, and there are penalties to forcing those specs on a TV with our current technology.

 

Currently, "Game" mode along with normal TV specs is the optimal configuration for a modern TV.


I'm waiting for a 4K TV with a DisplayPort input at a reasonable price.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.


12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


My TV screen is already pretty great; the only thing is that it's limited by HDMI.

Intel 4790k | Asus Z97 Maximus VII Impact | Corsair Vengeance Pro Series 16 GB 1866Mhz | Asus Strix GTX 980 | CoolerMaster G550 |Samsung Evo 250GB | Synology DS215j (NAS) | Logitech G502 |

 


One example would be the Hobbit movie series. It's shot at 48 FPS. But I can't see companies making special TVs for these few rare occasions.

All movies are shot at 60 FPS; they are rendered at 24 FPS, and The Hobbit was rendered at 48 FPS.


@Logun There'd be no reason.

In a game your FPS may fluctuate dramatically depending on the in-game scenario, and because you're moving, inputting actions, and then waiting for a result, any type of lag is easily noticed.

So V-Sync fixes the tearing, but it adds input lag.

G-Sync and FreeSync would prevent input lag, or at least minimize it; prevent screen tearing caused by inconsistency between the refresh rate of the monitor and the rate at which your graphics card renders the scene; and, by preventing that inconsistency, also make lower FPS more playable.
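A toy sketch of the difference (purely illustrative numbers, not how any real driver actually schedules frames):

```python
import random

# Toy model: when does a finished frame reach the screen under V-Sync
# (fixed 60Hz ticks) vs adaptive sync (the panel waits for the frame)?
# Render times are invented for illustration; real pipelines are more complex.

random.seed(1)
TICK_MS = 1000 / 60                       # ~16.67 ms between fixed 60Hz refreshes

t = 0.0                                   # time the current frame finishes rendering
vsync_wait_total = 0.0
FRAMES = 20
for _ in range(FRAMES):
    t += random.uniform(10, 25)           # fluctuating GPU render time
    # V-Sync: the finished frame sits in the buffer until the next fixed tick
    next_tick = (t // TICK_MS + 1) * TICK_MS
    vsync_wait_total += next_tick - t
    # Adaptive sync: the panel refreshes when the frame is ready (within its
    # supported range), so the extra wait is roughly zero

print(f"Avg extra wait per frame with V-Sync          : {vsync_wait_total / FRAMES:.1f} ms")
print("Avg extra wait per frame with G-Sync/FreeSync : ~0 ms (refresh follows the frame)")
```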

 

But when you're watching a show, what would make you notice any latency added by a V-Sync-type function?

You aren't controlling anything in the show and aren't interacting with it in any way; you're just observing it.

 

What would make a show feel less fluid when it runs at a constant FPS and has natural motion blur that compensates for the low FPS?

 

So there's no problem with FPS dipping or going above the refresh rate, because you're just playing back a stream (versus a game, where you render the scene and your FPS changes with the scene's complexity), and there's no problem with input latency.

There's really no reason for them to implement it other than for people who use their TV as a display for their PC. Very few companies would spend the money on the research and development needed to bring such a product to market, because the market simply isn't there for a TV that costs much more than other TVs with similar features, and whose extra features only work in tandem with a PC that has a graphics card compatible with these technologies.

 

I mean imagine trying to sell a TV like that to a regular person.

"This TV has awesome features X,Y, and Z, Content for X has not yet been made available to consumers, and you can only benefit from Y and Z by using this with a PC that has a Kepler graphics card in it (or GCN 1.1 for FreeSync)."

Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552


Even if movies are shot at different framerates, any display can adjust its refresh rate. G-Sync/FreeSync are different because they allow a dynamically changing framerate, something that movies don't need; they always play back at a constant framerate, no matter what that rate happens to be.

I doubt TV makers will be interested in the extra expense of variable refresh technologies, since even though people do play games on TVs, the majority of content is movies and shows. For those, a continuously changing refresh rate isn't used, and if the content is at a different framerate then the display can simply switch to that frequency in the normal way and stay there. I'm not saying that TVs actually do that now, but the point is, content at different framerates won't be a reason to implement variable refresh.
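A quick illustration of that point (illustrative numbers only; the cadence is just refresh rate divided by content framerate):

```python
# Constant-framerate video vs. a fixed refresh rate: the display just needs a
# refresh rate that is an integer multiple of the content's framerate, set once.

def refreshes_per_frame(content_fps: float, refresh_hz: float) -> float:
    """How many panel refreshes each video frame is shown for."""
    return refresh_hz / content_fps

for fps in (24, 48, 60):
    for hz in (60, 120):
        r = refreshes_per_frame(fps, hz)
        note = "even cadence" if r.is_integer() else "uneven cadence (pulldown judder)"
        print(f"{fps:>2} fps on {hz:>3} Hz -> {r:.2f} refreshes/frame, {note}")
```

Either way the rate is fixed once per piece of content, so frame-by-frame variable refresh buys nothing here.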


TVs aren't meant for gaming; all movies are filmed at 24 FPS (some exceptions), so there is no need for 144Hz TVs.

Newer movies are (slowly) adopting higher framerates, new movies are filmed in 4K, and the 3D ones could benefit (a little bit) from higher-Hz TVs. Keep in mind these are all slow changes and some don't affect much.

 

 

 


Maybe they will come out with "Gaming TVs" that are not only larger but also have a higher refresh rate.

 eGPU Setup: Macbook Pro 13" 16GB DDR3 RAM, 512GB SSD, i5 3210M, GTX 980 eGPU

New PC: i7-4790k, Corsair H100iGTX, ASrock Fatal1ty Z97 Killer, 24GB Ram, 850 EVO 256GB SSD, 1TB HDD, GTX 1080 Fractal Design R4, EVGA Supernova G2 650W

 

 


I am very hopeful because IMO 4K and 24" just don't mesh well - at least not for those of us who don't have natural 20/20 vision.

 

Even at 39" I found text in games like Civ5 incredibly hard to read without leaning in to 1-2' away.

 

I am encouraged by seeing 4K monitors at 28"-32", so I'm hoping the manufacturers will see people with 2-3 monitor setups and test-market a ~50" screen to that demographic...


Why would they do that? Then the console peasants might get wise as to what they've been duped into buying.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs
