
4K for Home Theater and WHY YOU MAY NOT NEED IT: Long Read

I was wondering: there is a significant jump in pixel density between 1080p and 4K (twice as much, right?), so perhaps at distances where, according to your graph, you can't perceive full 4K, you might still perceive something like 1440p and get a better experience than 1080p. I'm not going to buy a 4K TV until they cost the same as current 1080p TVs and there is a lot of content available for them, but I feel I should be able to notice at least some difference between a 46" 1080p TV and a 60" 4K one from 3 metres. Part of that may depend on the fact that I have exceptionally good eyesight (no pun intended, I really do), but I use my 4K monitor without scaling and can sometimes distinguish individual pixels (depending on the image being displayed).

The point of the graph is that, once you're within the 1080p region, any resolution above it (even 1440, or 1081 if it existed lol) will not yield a visible difference to the eye.

Monitors again fall into a weird region, because at 12" you may be fine while at 11" pixels may start to show, so I'd say go with as high a resolution on monitors as you can.

If the graph says 60" 4K may be worth it, then go for it. I'm watching football at the moment and don't feel like referencing the chart lol



The point of the graph is that, once you're within the 1080p region, any resolution above it (even 1440, or 1081 if it existed lol) will not yield a visible difference to the eye.

...

What about people who are far-sighted, such as myself? Luckily I don't need glasses (and hope I won't for a long time), but is your graph based on a person with 20/20 vision?



I was wondering: there is a significant jump in pixel density between 1080p and 4K (twice as much, right?), so perhaps at distances where, according to your graph, you can't perceive full 4K, you might still perceive something like 1440p and get a better experience than 1080p. I'm not going to buy a 4K TV until they cost the same as current 1080p TVs and there is a lot of content available for them, but I feel I should be able to notice at least some difference between a 46" 1080p TV and a 60" 4K one from 3 metres. Part of that may depend on the fact that I have exceptionally good eyesight (no pun intended, I really do), but I use my 4K monitor without scaling and can sometimes distinguish individual pixels (depending on the image being displayed).

Don't listen to the OP; he has no idea what he's talking about. "4K" (I like to call it QFHD, or just 3840x2160, since 4K proper is a different resolution altogether than what we're talking about) has four times the pixels and will yield much better image quality if done correctly. There are obviously other factors in image quality as well, but in your comparison, if the "4K" video had four times the bitrate of the video given to the 1080p display (same codec used), you would definitely notice a huge difference.
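For reference, a quick check of the raw pixel counts being discussed (my own numbers, just multiplying the resolutions out):

# Raw pixel counts for the resolutions in this thread (simple multiplication).
resolutions = {
    "1080p (Full HD)":   (1920, 1080),
    "1440p (QHD)":       (2560, 1440),
    "2160p (UHD/QFHD)":  (3840, 2160),
}

fhd = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / fhd:.1f}x Full HD)")
# 2160p has 4.0x the pixels of 1080p (2x in each dimension), not 2x.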


Don't listen to the OP; he has no idea what he's talking about. "4K" (I like to call it QFHD, or just 3840x2160, since 4K proper is a different resolution altogether than what we're talking about) has four times the pixels and will yield much better image quality if done correctly. There are obviously other factors in image quality as well, but in your comparison, if the "4K" video had four times the bitrate of the video given to the 1080p display (same codec used), you would definitely notice a huge difference.

 

I know that, but if you're far enough away you stop being able to notice any difference. What I was questioning is the accuracy of the graph, and whether you would see no difference at all from those distances or whether the difference would still be there, just not as big as it could have been.



blah blah deleted post



What about people who are far-sighted, such as myself? Luckily I don't need glasses (and hope I won't for a long time), but is your graph based on a person with 20/20 vision?

 

Well, that I have no idea about hahaha. If by far-sighted you mean that things seem closer to you than to others (i.e. superhero zoom vision lol), then that may throw things off for you?

But I have 20/20 and have never needed glasses, so that is 100% a guess and I'm not sure.

But I would assume these graphs have been developed using 20/20 vision, or at least corrected 20/20, as the reference.



I know that, but if you're far enough away you stop being able to notice any difference. What I was questioning is the accuracy of the graph, and whether you would see no difference at all from those distances or whether the difference would still be there, just not as big as it could have been.

 

The graph isn't "black and white," so to speak. If we pick a reference size of 65", you wouldn't see no difference at all at 8.25' (in the 1080p region) and then a huge difference at 8.3' (in the Ultra HD region).

As you get closer to the lines, differences can start to become evident between the resolutions.
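For anyone wondering where a boundary number in that range comes from, here's a rough back-of-the-envelope sketch (my own approximation, assuming a 16:9 panel and 20/20 vision, not the chart's actual data):

import math

# Distance at which a 20/20 eye (~1 arcminute of resolution) can no longer
# separate the pixels of a 1080p panel, for a given diagonal size in inches.
def pixel_blend_distance_ft(diagonal_in, vertical_pixels=1080, aspect=(16, 9)):
    aw, ah = aspect
    height_in = diagonal_in * ah / math.hypot(aw, ah)   # panel height in inches
    pixel_pitch_in = height_in / vertical_pixels        # size of one pixel
    one_arcmin_rad = math.radians(1 / 60)
    # small-angle approximation: distance where one pixel subtends 1 arcminute
    return pixel_pitch_in / one_arcmin_rad / 12

print(f"{pixel_blend_distance_ft(65):.2f} ft")  # ~8.45 ft for a 65-inch 1080p panel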

 

Hopefully that answers your question, but it was worded oddly so I may have missed your point?



The graph isn't "black and white," so to speak. If we pick a reference size of 65", you wouldn't see no difference at all at 8.25' (in the 1080p region) and then a huge difference at 8.3' (in the Ultra HD region).

As you get closer to the lines, differences can start to become evident between the resolutions.

 

Hopefully that answers your question, but it was worded oddly so I may have missed your point?

 

Well, it is pretty hard to explain in a forum post ^^ Yes, that was exactly what I meant.



Don't listen to the OP; he has no idea what he's talking about. "4K" (I like to call it QFHD, or just 3840x2160, since 4K proper is a different resolution altogether than what we're talking about) has four times the pixels and will yield much better image quality if done correctly. There are obviously other factors in image quality as well, but in your comparison, if the "4K" video had four times the bitrate of the video given to the 1080p display (same codec used), you would definitely notice a huge difference.

The officially adopted "codename" for "4K" is UHD (Ultra HD), which is 3840x2160. QFHD might also apply, but I don't think it's the accepted terminology anymore. Otherwise, I agree :)



Don't listen to the OP; he has no idea what he's talking about. "4K" (I like to call it QFHD, or just 3840x2160, since 4K proper is a different resolution altogether than what we're talking about) has four times the pixels and will yield much better image quality if done correctly. There are obviously other factors in image quality as well, but in your comparison, if the "4K" video had four times the bitrate of the video given to the 1080p display (same codec used), you would definitely notice a huge difference.

 

If the bitrate increase only ensures that each frame is fully 4K, and both sources display at the same frame rate, then you are wrong and everything in the OP is correct. Argue all you want, but you are just spreading misinformation and big-box sales swill, and arguing against the biological nature of the eye as well as the many, many experts who have explored this topic.

As an example, if you view 1080p content on a 1080p TV in conditions where you are in the 1080p region (say, 65" at 10'), and the same (higher bitrate) 4K content is displayed on a 4K TV right next to it, you will not, scratch that, physically CANNOT see the extra data, and thus no difference between the images. Argue all you want; it's just biology and how the eyes work. You can't tell how many legs are on a centipede from across the room, since your eyes cannot perceive that detail from a certain distance away. The detail (i.e. the number of legs) does indeed physically exist (much like the extra pixels), you just cannot see it...

On the other hand, if you are in the Ultra HD region (UHD, or QFHD as you like to call it), say 65" at 6', and you are viewing the same two TVs as in the paragraph above, you WILL see the difference between the two pictures, since your eyes are close enough to perceive the extra pixels and data presented.

Hope this clears things up for you. If you bought a 4K TV and just want to believe it was a good investment, then by all means enjoy it! But do not lead others into making the same mistake of buying for 4K instead of focusing on PQ.



If the bitrate increase only ensures that each frame is fully 4K, and both sources display at the same frame rate, then you are wrong and everything in the OP is correct. Argue all you want, but you are just spreading misinformation and big-box sales swill, and arguing against the biological nature of the eye as well as the many, many experts who have explored this topic.

As an example, if you view 1080p content on a 1080p TV in conditions where you are in the 1080p region (say, 65" at 10'), and the same (higher bitrate) 4K content is displayed on a 4K TV right next to it, you will not, scratch that, physically CANNOT see the extra data, and thus no difference between the images. Argue all you want; it's just biology and how the eyes work. You can't tell how many legs are on a centipede from across the room, since your eyes cannot perceive that detail from a certain distance away. The detail (i.e. the number of legs) does indeed physically exist (much like the extra pixels), you just cannot see it...

On the other hand, if you are in the Ultra HD region (UHD, or QFHD as you like to call it), say 65" at 6', and you are viewing the same two TVs as in the paragraph above, you WILL see the difference between the two pictures, since your eyes are close enough to perceive the extra pixels and data presented.

Hope this clears things up for you. If you bought a 4K TV and just want to believe it was a good investment, then by all means enjoy it! But do not lead others into making the same mistake of buying for 4K instead of focusing on PQ.

I do agree that there is a point where you cannot tell the difference; however, I do not consider that graph to represent it correctly. I believe your first example, with the 65" at 10', is incorrect.

I did not mislead people into buying into "4K"; I said that, if done right, it can be better.


If the bitrate increase only ensures that each frame is fully 4K, and both sources display at the same frame rate, then you are wrong and everything in the OP is correct. Argue all you want, but you are just spreading misinformation and big-box sales swill, and arguing against the biological nature of the eye as well as the many, many experts who have explored this topic.

As an example, if you view 1080p content on a 1080p TV in conditions where you are in the 1080p region (say, 65" at 10'), and the same (higher bitrate) 4K content is displayed on a 4K TV right next to it, you will not, scratch that, physically CANNOT see the extra data, and thus no difference between the images. Argue all you want; it's just biology and how the eyes work. You can't tell how many legs are on a centipede from across the room, since your eyes cannot perceive that detail from a certain distance away. The detail (i.e. the number of legs) does indeed physically exist (much like the extra pixels), you just cannot see it...

On the other hand, if you are in the Ultra HD region (UHD, or QFHD as you like to call it), say 65" at 6', and you are viewing the same two TVs as in the paragraph above, you WILL see the difference between the two pictures, since your eyes are close enough to perceive the extra pixels and data presented.

Hope this clears things up for you. If you bought a 4K TV and just want to believe it was a good investment, then by all means enjoy it! But do not lead others into making the same mistake of buying for 4K instead of focusing on PQ.

You did not understand his statement. Someone may not notice the individual pixels from a distance, but noise and compression artifacts may also be less visible with a higher resolution and a higher bitrate.

Back to the original topic of visual acuity:

At the THX-recommended 40.04° horizontal viewing angle, a 20/20 acuity limit of about 60 pixels per degree (one arcminute per pixel) means roughly 2,403 distinct pixels may be resolved across the width... a little above the 1920 from a 1080p screen. If you study hyperacuity, though, you will learn that details as fine as 5 arcseconds can be discerned, depending on the pattern displayed. This means that until a width of roughly 28,829 pixels is reached at such a viewing angle, additional pixels can make an impact on the image (however small; dependent on the image being displayed). The eye may not resolve each pixel, but the brain can process the information to discern the difference.
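For what it's worth, that arithmetic is easy to reproduce (the 40.04° angle, 60 pixels-per-degree acuity, and 5-arcsecond hyperacuity figures are the assumptions taken from the post):

# Reproducing the figures above: 40.04 deg THX viewing angle, 60 pixels per
# degree for 20/20 acuity, 5 arcseconds for hyperacuity (all assumed values).
view_angle_deg = 40.04

pixels_20_20 = view_angle_deg * 60                     # pixels resolvable at 20/20 acuity
pixels_hyperacuity = view_angle_deg * 3600 / 5         # pixels at 5-arcsec hyperacuity

print(round(pixels_20_20), round(pixels_hyperacuity))  # 2402 28829 (~2,403 and ~28,829)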


You did not understand his statement. Someone may not notice the individual pixels from a distance, but noise and compression artifacts may also be less visible with a higher resolution and a higher bitrate.

Back to the original topic of visual acuity:

At the THX-recommended 40.04° horizontal viewing angle, a 20/20 acuity limit of about 60 pixels per degree (one arcminute per pixel) means roughly 2,403 distinct pixels may be resolved across the width... a little above the 1920 from a 1080p screen. If you study hyperacuity, though, you will learn that details as fine as 5 arcseconds can be discerned, depending on the pattern displayed. This means that until a width of roughly 28,829 pixels is reached at such a viewing angle, additional pixels can make an impact on the image (however small; dependent on the image being displayed). The eye may not resolve each pixel, but the brain can process the information to discern the difference.

 

When I read his statement, I believe he only brought bitrate into the fray to make sure we are referencing a fair, apples-to-apples fight; obviously, if the resolution is 4x greater, the bitrate would also need to be 4x greater.

It wouldn't really be a fair comparison if we were talking about 4K and 1080p both delivering the same bitrate (there's your artifacting...).

 

Getting into higher bitrates, artifacting, and this and that is a whole other topic that has very little to do with the OP, and much more to do with individual set performance and PQ.

 

You do bring up an interesting point about what the brain can see vs. what the eyes can "register", but in my testing experience that was not the case. I used 60" models and found that I could tell no difference until I was within 2-3" of the line of doom. But then again, I was unbiased and didn't have any part of me that subliminally wanted the 4K to look better... If it repped, it repped. But it didn't...

 

I think the bottom line FOR ME is that there was no difference between the two resolutions when I viewed them from MY proper distance. The whole point of this thread is to let consumers determine, from THEIR OWN SEATING DISTANCE AND SCREEN SIZE, what they will actually benefit from, instead of the Best Buy kid with highlights giving them the "4K is the future" or "come stand one foot away from this TV and be amazed by the detail" spiel.

If someone goes out and buys a 1080p TV with some bomb-ass picture quality instead of a 4K Insignia, Magnavox, etc., the thread did its job.

 

By the by, a 40.04-degree viewing angle would mean sitting ~6' away from a 60" TV (71.8" if we are nitpicking), which indeed follows the chart that I posted in the OP. For most consumers, a 40.04-degree viewing angle is unreasonable and uncomfortable for anything but a dedicated home theater. I believe I did address in the OP that there are many, many, many recommendations for viewing angles, seating distances, and such. I stated that I would be using the common consumer rule of thumb of 1.5x the diagonal for all of my statements, which would put you at 7.5' for a 60"; back into the 1080p range.
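If anyone wants to check those two figures themselves, here's a small sketch (assuming a flat 16:9 panel; the function name is just mine):

import math

# Seating distance (in inches) for a given horizontal viewing angle and panel
# diagonal, assuming a flat 16:9 screen.
def distance_for_view_angle_in(diagonal_in, angle_deg, aspect=(16, 9)):
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    return (width_in / 2) / math.tan(math.radians(angle_deg / 2))

print(f"{distance_for_view_angle_in(60, 40.04):.1f} in")  # ~71.8 in (~6 ft) for a 60-inch TV
print(f"{60 * 1.5 / 12:.1f} ft")                          # 7.5 ft via the 1.5x-diagonal rule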

 

Loving all of this "the OP's chart is wrong" stuff. I didn't make it lol; if you do a quick Google search you will find many out there, pretty much all showing the same transition points. They were made by people who are much smarter and much more experienced than you or I, so if you want to say the chart's wrong, show us some proof :)



 

You do bring up an interesting point about what the brain can see vs. what the eyes can "register", but in my testing experience that was not the case. I used 60" models and found that I could tell no difference until I was within 2-3" of the line of doom. But then again, I was unbiased and didn't have any part of me that subliminally wanted the 4K to look better... If it repped, it repped. But it didn't...

 

The reason for this is likely that very few videos are made up of high-contrast lines (the kind used for resolution testing).

 

As for screen size vs. distance, a lot of us here are into home theater and are used to larger viewing angles. The front seats of a typical THX cinema have a lateral FOV of about 53 degrees, whilst the front seats of an IMAX cinema have a lateral FOV of about 120 degrees (Omnimax is even greater, but requires a curved screen).

 

One may not want to watch the news on something this big, but it truly does improve immersion with movies (since movies are shot to be presented on such a screen). A front projector is a cost-effective way to get large screen sizes, if room lighting can be controlled (as in a dedicated HT room).

 

When I read his statement, I believe he only brought bitrate into the fray to make sure we are referencing a fair, apples-to-apples fight; obviously, if the resolution is 4x greater, the bitrate would also need to be 4x greater.

It wouldn't really be a fair comparison if we were talking about 4K and 1080p both delivering the same bitrate (there's your artifacting...).

 

 

This is not true. Though an increase in resolution would require a nearly proportional increase in bitrate to maintain the same quality per pixel, the perceived image quality as a whole can be retained with a much smaller increase in bitrate (granted that the codec is properly optimized). The reason is that an artifact of the same size in pixels is less obvious on a higher-resolution panel of the same physical size.

 

Once deep color becomes mainstream, the higher panel resolution can also be used to display more apparent colors than the panel is otherwise capable of (through dithering).
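As a toy illustration of that dithering idea (my own sketch, not anything from a real panel's processing):

# An 8-bit panel can only show integer levels, but a 2x2 block of native pixels
# at higher resolution averages (to the eye, at distance) to an in-between level.
target_level = 100.25          # a finer "deep color" step an 8-bit panel can't show directly

block = [100, 100, 100, 101]   # three native pixels at 100, one at 101
perceived = sum(block) / len(block)

print(perceived)               # 100.25 -- from far enough away the block blends
                               # into the intermediate shade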


  • 3 weeks later...

So apparently no one read this or used the search function...

I've seen this question/debate in the last six threads I've read here.



So apparently no one read this or used the search function...

I've seen this question/debate in the last six threads I've read here.

To be fair, the only proper way to search the forums is using Google with "site:linustechtips.com", and most users aren't likely to do that.

 

Still, posting in a thread that's almost three weeks old to say that the thread is a rehash doesn't seem particularly useful either ;):P



-snip-

Hi, welcome to the forums! I understand your frustration and that you disagree with @RogueCow. But please do not attack or insult him simply because he has a different opinion than you do.

You disagree? Great! Say so, then explain why he's wrong. Implying that he has Down syndrome? WTF, dude. That's 100% unacceptable behaviour.

FYI, plasma TVs are fucking awesome and, when used in a home theatre environment, are objectively superior to LCD TVs (both CCFL and LED) in many aspects. LED LCD TVs have gotten closer in PQ in recent years, but OLED is going to be the real successor to plasma for the home theatre.



  • 4 weeks later...

Dalek, I completely forgot the search function is all screwy, my bad.



Hi, welcome to the forums! I understand your frustration and that you disagree with @RogueCow. But please do not attack or insult him simply because he has a different opinion than you do.

You disagree? Great! Say so, then explain why he's wrong. Implying that he has Down syndrome? WTF, dude. That's 100% unacceptable behaviour.

FYI, plasma TVs are fucking awesome and, when used in a home theatre environment, are objectively superior to LCD TVs (both CCFL and LED) in many aspects. LED LCD TVs have gotten closer in PQ in recent years, but OLED is going to be the real successor to plasma for the home theatre.

 

Man I didn't even get to see his post :/

Sounds like a good'n


