
HDR TVs: Samsung Q70R vs Sony X950G. Also, nits: Is 700 nits enough?

Jdbye

On the search for an HDR TV, I have narrowed it down to two candidates.

Either the 65" Samsung QLED Q70R (QE65Q70RAT) or the 65" Sony Bravia X950G (KD-65XG9505)

 

The Samsung is advertised as Quantum HDR 1000, but according to the tests in the comparison linked below (under SDR and HDR peak brightness), it can barely do 700 nits.

https://www.rtings.com/tv/tools/compare/samsung-q70r-vs-sony-x950g/782/764?usage=7318&threshold=0.1

The Sony, meanwhile, excels, measuring in excess of 1100 nits.

 

First of all, I am wondering why it's advertised as 1000 nits when the tests say otherwise. Is that just down to manufacturers measuring peak brightness differently (does it still count as HDR 1000?), or is this just blatant false advertising?

 

Also, how significant is the difference between 700 and 1000 nits? I know Linus always says you should get 1000 nits minimum, but there are some other key differences between the two that are making the choice difficult, and these two are really the best choices I've found (as you'll see further down).

 

While they both look great on paper for the most part, here are some key differences that are the most important to me:

  • The above mentioned difference in measured peak brightness.
     
  • The Sony does not currently have VRR/FreeSync support, while the Samsung does. The Sony's resolution support also doesn't appear as good: 1440p@60 has to be forced, and 1440p@120 appears not to be supported at all according to the comparison. That's strange, as HDMI 2.0 should have enough bandwidth for it, and the lower-end X900F series does support it (though it still has to be forced) while for all intents and purposes appearing to use the exact same panel, with all the improvements being in the hardware driving it. I expect this should be easily fixable through an update since the X900F supports it already, but I don't know.
    I figure there is a decent chance Sony will update many of their 2018/2019 TVs with VRR/FreeSync support as the next gen consoles coming out this year are all supposed to have support for it and they are personally invested in that market with the PS5, but that seems like a risk to take.
    I don't currently have a rig capable of doing 1440p@120hz and it will be a while before I get to that point, but I also don't want to have to replace this TV in a few years.
     
  • The Sony runs Android TV, which is preferable due to the bigger app selection, but costs a bit more. If I end up with the Samsung, I'm going to get a Shield TV for it anyway and use that for smart TV functionality (meaning the Samsung actually ends up the more expensive one once I add the cost of that), as there are some apps I want that I just can't get on Tizen. Since I can fix that problem by getting a Shield TV, it's not a huge deal, but it would be nice to have it integrated into the TV. Plus, I have a feeling that using the TV remote to control the Shield TV over HDMI-CEC won't work as well as using the Shield TV remote directly and will leave some functions inaccessible (that has been my experience with HDMI-CEC in the past).
     
  • The Samsung has one neat feature that nothing else seems to have: it can interpolate the framerate of games that are locked at 30 FPS (typically console games) without causing the soap opera effect and without adding noticeable input lag. It can of course also interpolate higher framerates and other types of content, like all TVs can these days, but this specific feature is called "Game Motion Plus", is tailored specifically towards games, and supposedly actually works. In the comparison, only 4K@60 was measured for input lag with interpolation, which was probably not using "Game Motion Plus", but 15 ms input lag with interpolation off versus 21 ms with it on is such a small difference that I believe the people saying it doesn't add input lag. How well the effect actually works is subjective, though, so I might not like it even if others did. But as a Nintendo fan and Switch owner, it certainly seems like a feature that could benefit me greatly.
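On the HDMI bandwidth point in the second bullet: a quick back-of-the-envelope sanity check (assuming 8-bit RGB and roughly CVT-R2 reduced blanking; the raster figures are my own estimate, not from a spec table) agrees that 1440p@120 should fit within HDMI 2.0's ~14.4 Gbit/s of usable data rate:

```python
# Rough sanity check: does 2560x1440 @ 120 Hz fit in HDMI 2.0's bandwidth?
# Assumptions (mine, not from the thread): 8-bit RGB, and a total raster of
# roughly 2720 x 1481 (active 2560x1440 plus CVT-R2-style reduced blanking).
H_TOTAL, V_TOTAL = 2720, 1481
REFRESH_HZ = 120
BITS_PER_PIXEL = 24  # 8 bits per channel, RGB

required_gbps = H_TOTAL * V_TOTAL * REFRESH_HZ * BITS_PER_PIXEL / 1e9
hdmi20_data_gbps = 14.4  # 18 Gbit/s TMDS minus 8b/10b encoding overhead

print(f"required ~{required_gbps:.1f} Gbit/s vs HDMI 2.0 ~{hdmi20_data_gbps} Gbit/s")
print("fits" if required_gbps <= hdmi20_data_gbps else "does not fit")
```

That lands around 11–12 Gbit/s, comfortably under the limit, so the X950G's missing 1440p@120 mode does look like a firmware/EDID decision rather than a bandwidth limit.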

 

There are many minor differences other than that but nothing that would be a deal breaker, at least nothing that came to mind while writing this post.

The general consensus in reviews and posts about them seems to be that the Sony is superior for movies, and the Samsung is superior for games, but both things are important to me.

 

In a perfect world I could get both HDR 1000 and VRR/FreeSync, but it looks like Samsung and LG are the only manufacturers that currently offer VRR/FreeSync. And while LG has 55" and 65" non-OLED models in the SM90 series with HDR, they both use an IPS panel and seem to suffer from all the things IPS panels usually suffer from, which is not great for content consumption. I don't want to jump to OLED due to the higher cost and the risk of permanent burn-in.

 

For those with experience with HDR at around 700 nits: how good is the HDR effect?

Will it still blow my mind compared to the bog-standard 1080p TN-panel TV I have now? Or should HDR 1000 be a bigger priority than VRR/FreeSync? (Consider that I will probably be doing more video watching than gaming on this, but most of it won't be in HDR; that my GTX 970 is probably right in the zone where you benefit from variable refresh rate the most; and that it's probably not getting upgraded for another couple of years.)
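Since HDR10 content is encoded with the SMPTE ST 2084 "PQ" curve, one way to get a rough feel for the perceptual size of the 700-vs-1000-nit gap is to see where those luminances land on it. A minimal sketch (the constants come from the ST 2084 spec; the comparison itself is just my back-of-the-envelope reasoning):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: maps absolute luminance in nits to the
# 0..1 signal value that HDR10 content is encoded in. Constants are from
# the ST 2084 specification.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = nits / 10000.0  # PQ is defined relative to a 10,000-nit peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for nits in (100, 700, 1000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```

700 and 1000 nits land at roughly 0.71 and 0.75 on the curve, i.e. only a few percent of the encoded range apart, which is much smaller than the raw nit figures suggest, though 1000-nit highlights are still visibly punchier.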

 

Also, if anyone has experience with any of these models (in any size) feel free to chime in.


So I have the PG27UQ, and I will tell you right now that HDR with it is amazing, in part due to its high peak brightness of 1000 nits. That being said, 1000 nits is fucking bright as hell, and when I look at the sun in-game it feels like I am actually looking into the sun. I would wager that 700 nits is probably good enough to make it better than the majority of HDR-capable monitors on the market, just not the same level as having 1000 nits peak brightness.


I don't have a Q70R; I have a 7100, but:

If you're coming from 1080p, then even 4K SDR looks awesome. Plus, a plus as I see it is that it supports 1440p (mine doesn't). With Samsungs, when you hook a computer up, you don't need to set game mode; it detects the computer and automatically configures itself so you get the lowest latency.

Samsung does have its "own" way of doing HDR: soap opera effect. So you WILL be adjusting settings to minimize it, and, like me, it'll take a bit to get used to.


23 minutes ago, circeseye said:

I don't have a Q70R; I have a 7100, but:

If you're coming from 1080p, then even 4K SDR looks awesome. Plus, a plus as I see it is that it supports 1440p (mine doesn't). With Samsungs, when you hook a computer up, you don't need to set game mode; it detects the computer and automatically configures itself so you get the lowest latency.

Samsung does have its "own" way of doing HDR: soap opera effect. So you WILL be adjusting settings to minimize it, and, like me, it'll take a bit to get used to.

I've seen 4K SDR, and while it was cool how crisp it looked up close (so close I could touch it with my hand), that's not a realistic scenario. At the distance I sit from the TV at home, I'm not sure I could tell a difference at all. It might matter more for large monitors, where you sit a lot closer, than for a normal-sized (relatively speaking) TV at a normal distance. People have always said HDR is more important than 4K, and that seems to be true. But I have yet to try it properly; I've only experienced "HDR" on a TV with, as far as I can tell, no dimmable zones and only the standard 300-400 nit peak brightness (not sure about the exact numbers). It still looked reasonably bright, I suppose, but how washed out everything looked kind of ruined it. It did make me want to get a 4K HDR TV of my own to experience it properly, though. But I'm a bit worried about making the wrong choice and having to return it, which is a pain since I'll be ordering online.


That is pretty useful. If I had to turn game mode on manually, I would likely just leave it on most of the time, and maybe turn it off when watching a movie if it has a negative impact there. When watching TV shows I don't care as much about getting the best quality possible, and game mode on other TVs I've used has made no visible difference to image quality, at least once the image settings have been readjusted (they tend to reset when you switch to game mode). It's not a huge deal if it has to be toggled manually, though.

 

I think I'm leaning more towards the Samsung currently, but it'll be a huge bummer if Sony decides to update their lineup with VRR/FreeSync later this year, as that would make it the superior option in my opinion. I feel like I care less what a game looks like when I'm hooked, because I'm too focused on other things; but with a movie, how it looks and sounds is key to getting that cinema-level immersion and experience.


Well, I know the Q70R's HDR is way better than my 7100's. I'm not complaining; I'm pretty satisfied with how my HDR looks. But I know the Q70R's HDR is brighter and even more colorful.


21 hours ago, Brooksie359 said:

So I have the PG27UQ, and I will tell you right now that HDR with it is amazing, in part due to its high peak brightness of 1000 nits. That being said, 1000 nits is fucking bright as hell, and when I look at the sun in-game it feels like I am actually looking into the sun. I would wager that 700 nits is probably good enough to make it better than the majority of HDR-capable monitors on the market, just not the same level as having 1000 nits peak brightness.

Man, 1000 nits sounds great. I want that realism in my movies. Not sure it's worth giving up FreeSync for though... That makes the choice harder.

Just had my birthday so I was thinking to buy a new TV as a birthday gift to myself, so I will be ordering one pretty soon.


1 hour ago, Jdbye said:

Man, 1000 nits sounds great. I want that realism in my movies. Not sure it's worth giving up FreeSync for though... That makes the choice harder.

Just had my birthday so I was thinking to buy a new TV as a birthday gift to myself, so I will be ordering one pretty soon.

Honestly, I have no experience with 700-nit displays, but I would imagine that is still bright af and probably still very good HDR. This is speaking from using a monitor emitting 1000 nits, so it might be different when the thing is a couple feet away vs. at TV distance.


I ended up with the Samsung.

 

I went to a nearby electronics store that had both in stock to compare them, and the Sony clearly looked much brighter, while the Samsung looked gray by comparison; but that might be down to default settings differing between the two. The guy there told me they were both at default settings.

 

In one review of the Sony, the reviewer complained about backlight glow around light objects on dark backgrounds, and in the video example it looked pretty awful, but I had decided I had to judge it for myself, as things never look the same on a computer screen as in real life. As soon as a white logo appeared on a black background in a video I tested in the store, half the screen lit up with backlight glow; it was just as bad as the review said. The Samsung seems to overcompensate a bit by capping the brightness instead (at least that is my guess, and why it caps at 700 nits), but at least it doesn't seem to suffer from this issue. That was the deciding factor; it was just too distracting.

 

I rewatched Ready Player One, and it wasn't as bright as I would like in brightly lit scenes, not as good as I expected to be honest, but turning contrast enhancement up from Off to Low fixed that, and it seems plenty bright in a dark room now. I'll have to play around with it more, but I'm happy with it so far after changing that setting.

 

I haven't had much luck getting the TV to see my Synology NAS's DLNA server (although DLNA is supposed to work), so I have to copy media to a USB HDD and plug that in. Definitely not ideal, and the Tizen app store is so sparse there isn't even an alternate media player, other than a Plex client that requires running a bloated server on a PC. So a Shield TV is definitely a must, at least for me.


On 1/8/2020 at 9:48 PM, Jdbye said:

I ended up with the Samsung.

 

I went to a nearby electronics store that had both in stock to compare them, and the Sony clearly looked much brighter, while the Samsung looked gray by comparison; but that might be down to default settings differing between the two. The guy there told me they were both at default settings.

 

In one review of the Sony, the reviewer complained about backlight glow around light objects on dark backgrounds, and in the video example it looked pretty awful, but I had decided I had to judge it for myself, as things never look the same on a computer screen as in real life. As soon as a white logo appeared on a black background in a video I tested in the store, half the screen lit up with backlight glow; it was just as bad as the review said. The Samsung seems to overcompensate a bit by capping the brightness instead (at least that is my guess, and why it caps at 700 nits), but at least it doesn't seem to suffer from this issue. That was the deciding factor; it was just too distracting.

 

I rewatched Ready Player One, and it wasn't as bright as I would like in brightly lit scenes, not as good as I expected to be honest, but turning contrast enhancement up from Off to Low fixed that, and it seems plenty bright in a dark room now. I'll have to play around with it more, but I'm happy with it so far after changing that setting.

 

I haven't had much luck getting the TV to see my Synology NAS's DLNA server (although DLNA is supposed to work), so I have to copy media to a USB HDD and plug that in. Definitely not ideal, and the Tizen app store is so sparse there isn't even an alternate media player, other than a Plex client that requires running a bloated server on a PC. So a Shield TV is definitely a must, at least for me.

For the Sony TV, what you saw with the bleed was almost certainly due to the local dimming zones - there are ways to mitigate that, but the only way to eliminate it entirely is more, smaller dimming zones.

 

In this review:

https://www.tvfindr.com/sony-x950g/

They determined that there were at least 60 zones (I'd suspect it's actually 64, which would make it an 8 x 8 array). That's... pretty shit, to be honest, for a flagship TV.

 

The Q90R (the big daddy of the Q70R that you bought) has 480 zones - Samsung lists it as "Direct Full Array 16X" - though I can't find any definition of what exactly "16X" is - 480 zones divided by 16 = 30. So I guess the "base metric" is 30 zones for Samsung?

 

Anyway, the Samsung Q70R that you bought, has a "Direct Full Array 4X" - if we're to assume the base metric is 30, yours should have around 120 dimming zones - still double compared to the Sony.
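If the "base metric of 30" guess above holds, the "NX" labels map to zone counts linearly. A tiny sketch of that assumption (the Q80R's "8X" tier is my extrapolation, not something confirmed in this thread):

```python
# Hypothetical decoding of Samsung's "Direct Full Array NX" labels,
# assuming a base of 30 zones per "X" (inferred above from the Q90R:
# 480 zones / 16 = 30 -- not an official Samsung figure).
BASE_ZONES_PER_X = 30

def dimming_zones(multiplier: int) -> int:
    return BASE_ZONES_PER_X * multiplier

# The Q80R's "8X" tier is a guess; only 4X and 16X are cited in this thread.
for label, mult in [("Q70R (4X)", 4), ("Q80R (8X)", 8), ("Q90R (16X)", 16)]:
    print(f"{label}: ~{dimming_zones(mult)} zones")
```

Under that assumption each step up the ladder doubles the zone count, which matches the 120 / 480 figures quoted for the Q70R and Q90R.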



On 1/11/2020 at 11:16 PM, dalekphalm said:

For the Sony TV, what you saw with the bleed was almost certainly due to the local dimming zones - there are ways to mitigate that, but the only way to eliminate it entirely is more, smaller dimming zones.

 

In this review:

https://www.tvfindr.com/sony-x950g/

They determined that there were at least 60 zones (I'd suspect it's actually 64, which would make it an 8 x 8 array). That's... pretty shit, to be honest, for a flagship TV.

 

The Q90R (the big daddy of the Q70R that you bought) has 480 zones - Samsung lists it as "Direct Full Array 16X" - though I can't find any definition of what exactly "16X" is - 480 zones divided by 16 = 30. So I guess the "base metric" is 30 zones for Samsung?

 

Anyway, the Samsung Q70R that you bought, has a "Direct Full Array 4X" - if we're to assume the base metric is 30, yours should have around 120 dimming zones - still double compared to the Sony.

It is indeed from local dimming; I had heard about it in a review beforehand. It's always going to be a thing with any screen that has local dimming; the Sony just seemed to have it configured in a way that made it much more apparent than on the Samsung.

After buying, I realized I could have gotten the Q80R (closest in price to the Sony), which has more dimming zones than either of the other two, as well as peak brightness more equivalent to the Sony's, but it suffered a bit from black crush.

 

I find myself needing to turn contrast enhancement up to Low to get a good HDR experience from the Q70R (content tends to be mastered for 1000 nits, so things end up looking dimmed if I don't), which brings it in line with how bright I would expect sunlight to be in bright scenes. I lose some contrast on the high end doing that, which isn't all that bad, as the screen is rarely almost fully white, so I usually don't notice any contrast loss. I think that's preferable to black crush, as very dark scenes are much more common than very bright ones. Although in theory the Q80R has better contrast, since I wouldn't have to enable contrast enhancement to make the brightness look correct, I feel like the black crush would get annoying, so maybe it's a good thing I didn't get it instead.

 

I don't think the numbers in "Direct Full Array" mean a whole lot, but I believe each step up (4X, 8X, 16X) doubles the number of dimming zones. From what I have heard, the Q70R and X950G both have around the same number of dimming zones; the Samsung just seems to handle local dimming better. I can see some backlight glow when there's white text on a black background, but not to an excessive degree like the Sony. And more importantly, there's never any noticeable backlight glow on the black bars at the top and bottom in movies, even when watching with subtitles, which was apparently an issue with the Sony. The Samsung seems to go a bit lighter on the brightness in those cases, making the text look a bit darker, which is preferable to me.

