
Why is the industry trying to push 8K instead of 6K?

Hexxagone

8K TVs and monitors, 8K gaming. We don't have the frames for 8K. Why isn't 6K the next step, like how 1440p was the step from 1080p to 4K?

 

LTT did an 8K gaming video with a TV and Cyberpunk a while back. Maybe they could take that TV, set a 6K resolution, and try out 6K gaming to see what frame rates they get.


1 minute ago, Hexxagone said:

Maybe they could take that TV, set a 6K resolution, and try out 6K gaming to see what frame rates they get.

You can do this relatively easily now with DSR, or at least get close enough. The DSR scaling factors don't always line up perfectly. 
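For anyone wanting to try it, here's a rough sketch of how DSR factors map to effective render resolutions on a 4K native panel. The factor list below is the commonly exposed set and is an assumption (availability depends on driver/GPU); the factors multiply total pixel count, so the per-axis scale is the square root.

```python
import math

# Rough sketch (assumed factor list; availability depends on driver/GPU):
# DSR factors multiply the total pixel count, so per-axis scale = sqrt(factor).
native_w, native_h = 3840, 2160
for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    axis = math.sqrt(factor)
    w, h = round(native_w * axis), round(native_h * axis)
    print(f"DSR {factor:.2f}x -> ~{w}x{h}")
```

The 2.25x factor works out to roughly 5760x3240, which is about as close to a "6K" render as DSR gets; 4.00x is a straight 8K render.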


Because Mother Glass. Same reason there aren't 1440p TVs.
Display manufacturers have unified on a specific size of mother glass, the stuff that displays are cut from. It's a lot easier to cut out a successful 2X2 group of 4K displays and slap the electronics to support 8K than to design a whole separate production system to support the intermediate step. 
 

5950X/3080Ti primary rig  |  1920X/1070Ti Unraid for dockers  |  200TB TrueNAS w/ 1:1 backup


14 minutes ago, Hexxagone said:

8K TVs and monitors, 8K gaming. We don't have the frames for 8K. Why isn't 6K the next step, like how 1440p was the step from 1080p to 4K?

 

LTT did an 8K gaming video with a TV and Cyberpunk a while back. Maybe they could take that TV, set a 6K resolution, and try out 6K gaming to see what frame rates they get.

The difference between 8K and 4K is a straight doubling in each dimension, which means screen resolutions scale cleanly without any artifacting.

 

If you, let's say, tried putting 4K content on a 6K display, you have a 1.5x ratio. That means you essentially end up with pixels where you need to either make up a colour in between or repeat one, and that can create visual artifacts that are slightly off-putting even if you don't consciously notice them.
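To put a number on it, here's a tiny sketch assuming a hypothetical 5760x3240 "6K" panel (exactly 1.5x of 4K UHD per axis; real 6K products such as 6144x3456 differ slightly): only a third of the output columns line up exactly with a source column, and the rest have to be invented.

```python
# Tiny illustration of the 1.5x problem. Panel width is an assumption (5760,
# i.e. 1.5x of 3840); real "6K" panels vary.
src_w, dst_w = 3840, 5760
ratio = dst_w / src_w                      # 1.5

# Which output columns land exactly on a source column?
exact = sum(1 for x in range(dst_w) if (x / ratio).is_integer())
print(f"scale factor: {ratio}")
print(f"{exact} of {dst_w} output columns map 1:1; "
      f"the other {dst_w - exact} must be interpolated or repeated")
```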

 

On top of that, it's a lot easier to market "8K" than "6K", because "6K" doesn't sound as impressive a jump over "4K".

3735928559 - Beware of the dead beef


1440p and 4K came to market at roughly the same time. Sure, 1440p existed before, but it saw very low adoption, trading blows with 2560x1600 and eventually replacing it.
The step above 4K in monitors is actually 5K, i.e. 5120x2880; there's no established "6K" panel dimension and probably never will be.

You can go out and buy a 5120x2880 monitor today; it is a real thing, and it will likely grow into the next mainstream step past 4K.


8K, 4K and 1080p have the nice property that each resolution is 4 times the pixel count of the next one down. So a 4K screen can show a 1080p image without any weird interpolation artifacts by turning every pixel into a 2x2 block. Meanwhile 1440p requires non-integer scaling, and so would 6K.
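As a quick illustration of that 2x2 duplication, here's a minimal sketch (using NumPy for brevity). A real TV does this in dedicated hardware, not like this, but the idea is the same: no new colours are invented.

```python
import numpy as np

# Integer "pixel doubling": every source pixel becomes a 2x2 block of identical
# pixels, which is how a 1080p image fills a 4K panel with no interpolation.
def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

src = np.array([[255,   0],
                [128,  64]], dtype=np.uint8)  # tiny stand-in for a 1080p frame
print(integer_upscale(src, 2))
# [[255 255   0   0]
#  [255 255   0   0]
#  [128 128  64  64]
#  [128 128  64  64]]
```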

 

That said… there's very little practical use to an 8K screen outside of very specific use cases. Manufacturers are basically just hoping people fall for the "new shiny" trick and buy new devices (and new computers to actually drive them…)

Remember to either quote or @mention others, so they are notified of your reply


2 minutes ago, Eigenvektor said:

That said… there's very little practical use to an 8K screen outside of very specific use cases.

This isn't one of them (but it always makes me laugh at how impractical it is)

 

[image: a multi-monitor workstation setup]


5 minutes ago, GuiltySpark_ said:

This isn't one of them (but it always makes me laugh at how impractical it is)

 

[image: a multi-monitor workstation setup]

where is the mouse


8 minutes ago, GuiltySpark_ said:

This isn't one of them (but it always makes me laugh at how impractical it is)

Dang, I wish that was my office

 

~edit: I just noticed the "tower" on the left 😅 Mac cluster

 

3 minutes ago, OhYou_ said:

where is the mouse

Looks like a mouse pad on the left, so likely hidden behind the chair

Remember to either quote or @mention others, so they are notified of your reply


41 minutes ago, Hexxagone said:

8K TVs and monitors, 8K gaming. We don't have the frames for 8K. Why isn't 6K the next step, like how 1440p was the step from 1080p to 4K?

 

LTT did an 8K gaming video with a TV and Cyberpunk a while back. Maybe they could take that TV, set a 6K resolution, and try out 6K gaming to see what frame rates they get.

Because of logical resolution.

 

540p -> 1080p -> 2160p (4K) -> 4320p (8K): because each step integer-scales directly, there is no need for any funny scaler crap in the television or monitor.

720p -> 1440p -> 2880p -> 5760p

 

But here's the thing: 720 * 3 = 2160 as well. So 4K is the only resolution that both 720p and 1080p integer-scale to.

8K extends that to include 480p (480 x 9 = 4320). And at 8K you can pretty much emulate old-school CRT aperture grille and screen curvature on 240p and 480p content... if you really care that much.

 

"5K" is 2880p.

 


4 hours ago, Kisai said:

Because of logical resolution.

 

540p -> 1080p -> 2160p (4K) -> 4320p (8K): because each step integer-scales directly, there is no need for any funny scaler crap in the television or monitor.

720p -> 1440p -> 2880p -> 5760p

 

But here's the thing: 720 * 3 = 2160 as well. So 4K is the only resolution that both 720p and 1080p integer-scale to.

8K extends that to include 480p (480 x 9 = 4320). And at 8K you can pretty much emulate old-school CRT aperture grille and screen curvature on 240p and 480p content... if you really care that much.

 

"5K" is 2880p.

 

So if I understand this correctly:

 

540p -> 1080p -> 4K 2160p -> 8K 4320p

720p -> 1440p -> 5K 2880p -> 10K 5760p

864p -> 1728p -> 6K 3456p -> 12K 6912p

 

6K won't be a thing because 720p and 1080p content doesn't scale cleanly to it?

What would happen if you tried to play that content at a 6K resolution? Would the image be distorted?


6 hours ago, Hexxagone said:

So if I understand this correctly:

 

540p -> 1080p -> 4K 2160p -> 8K 4320p

720p -> 1440p -> 5K 2880p -> 10K 5760p

864p -> 1728p -> 6K 3456p -> 12K 6912p

 

6K won't be a thing because 720p and 1080p content doesn't scale cleanly to it?

What would happen if you tried to play that content at a 6K resolution? Would the image be distorted?

Pretty much any particular resolution exists for two reasons:

1. Integer scaling. Not having a complicated scaler saves the TV/monitor manufacturer money, and it also lets your GPU render at integer-scale resolutions without latency induced by the scaler. Essentially, if you do an integer scale, the ASIC just goes "repeat every pixel 3 times for three rows".

2. Aspect ratio. 16:9 was the compromise between 4:3 and 1.85:1 (theatrical widescreen); the reason all your screens today are widescreen is so that you don't lose 50% of the screen to black bars on panoramic 2.40:1 films. It also means you lose some percentage of the vertical resolution when you watch theatrical films.

 

But if your television or computer screen does tricks (e.g. frame duplication to make 24 fps content look like 120 fps content), that requires additional logic in the screen, which it then applies to all content.

 

This is why smart TVs should be avoided for games. You do not want the television to use any tricks, because they add input latency; you want to be able to turn off everything on the input the PC or console is connected to. But you'll note that televisions often default to scaling non-native resolutions to the native resolution of the screen, often unsatisfyingly, because that scaler is intended for standard-definition content, so it just blurs everything at the cost of 2-3 frames of latency.

 

So if your game isn't running at the native resolution or an integer fraction of it (e.g. 720p on 4K), you get the latency penalty as well as blurry, JPEG-like artifacts from the scaler. Again, these scalers are designed for scaling SD video, so those artifacts usually aren't noticed among MPEG compression artifacts. You might not even notice them if DLSS/FSR is being used, or it might amplify them.

 

At any rate, the reason you don't see 5K or 6K or anything between 4K and 8K gaming is that there are almost no native panels at those resolutions. So if you have an 8K screen but try to run it at a "6K" resolution, you get a poorer visual experience than 4K, since the scaler is invoked.

 

This gets to one additional problem: HDMI bandwidth. Technically any resolution is valid as long as it does not exceed the HDMI spec. To run 4K60 at full color depth you need 18 Gbps HDMI 2.0 cables; to run 8K you need 48 Gbps HDMI 2.1 cables.
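As a back-of-the-envelope check on those numbers, here's a minimal sketch counting active pixels only (8 bits per channel, RGB). Blanking and link-encoding overhead push the real requirements higher, which is part of why 8K60 in practice also leans on compression (DSC) or chroma subsampling.

```python
# Back-of-envelope bandwidth estimate: active pixels only, no blanking/encoding overhead.
def raw_gbps(width, height, fps, bits_per_channel=8, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "6K":    (6144, 3456),
                     "8K":    (7680, 4320)}.items():
    print(f"{name:5s} @ 60 Hz ~ {raw_gbps(w, h, 60):4.1f} Gbit/s raw")
# 1080p ~ 3.0, 4K ~ 11.9, 6K ~ 30.6, 8K ~ 47.8
```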

 

But you also need a native screen, and there's like only one native 6K (6144x3456@60Hz) screen that seems to be available:

https://www.dell.com/en-ie/shop/dell-ultrasharp-32-6k-monitor-u3224kba/apd/210-bhnx/monitors-monitor-accessories#techspecs_section

 

On mobile devices, because they have high PPI, the native resolution often isn't important, since the OS software-scales everything in a "resolution independent" way. Windows only started doing this properly in Windows 10, and it's often only invoked if the target screen is higher than 1080p. It can actually be super infuriating to drag windows between a 4K and a 1080p screen and have the window's scaling not snap correctly.

 

Resolution independence doesn't mean you get more resolution; often it means you are having a 1080p-sized experience on a 4K screen where the text is extra sharp and legible. Your web browser (e.g. Chrome) is still giving you a 1080p-sized layout, and if you want to watch 4K content in the browser, you actually have to scale the page to 50% to see 4K pixels 1:1.

 

The bigger your screen, the less you want the scaler fumbling with your content. Sure, an 8K screen might still be showing windows at 1080p scale, but unless you are playing a game or watching a movie that actually outputs 8K, you don't enjoy much of the advantage of a higher-resolution screen. A lot of what people actually get out of 4K right now is HDR support, and that is still super flaky on Windows.

 


Others have pretty much explained it: scaling, manufacturing, standardization, doubling making more sense, and marketing. 6K would be another in-between resolution that would potentially be less popular than any so far, especially now that pixel density is already high and a smaller-than-doubling step isn't the huge wow difference it was in the early days.

That said, I think it works out fine: 4K is becoming easier to drive, and we have new upscaling techniques as an option for 8K in the future. You can also downscale to a native 1:1 4K image on an 8K panel. And monitors are finally catching up in the PPI department; I consider 4K the minimum, and it's still not as sharp as higher-PPI displays when you work with small details, text, and movement.

Take a 32" 4K monitor: at that size 6K would give great PPI, and 8K would be even better, close to as nice to look at as a modern phone display. And like I said, industry standardization for a number of reasons is cheaper than producing a custom resolution that isn't mainstream, while the initial performance problem can be mitigated with good modern upscaling in games, or by straight-up downscaling 1:1 pixel-wise so it behaves as 4K.

 

I know some say 4K is overkill or a waste at 32" or so, but they're noobs who don't know what they're saying. They basically want a 1080p screen-door experience for some reason, or they're basing it on desktop scaling issues or bad eyesight. Otherwise it makes no sense; you don't want to see pixels even if you lean in a bit closer, let alone at a regular distance.

Apple, at least, understands the importance of PPI on its displays.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


18 hours ago, OddOod said:

Because Mother Glass. Same reason there aren't 1440p TVs.
Display manufacturers have unified on a specific size of mother glass, the stuff that displays are cut from. It's a lot easier to cut out a successful 2X2 group of 4K displays and slap the electronics to support 8K than to design a whole separate production system to support the intermediate step. 
 

I'm sorry, but this has nothing to do with mother glass; the glass size doesn't control the resolution. While I'm not going to say my own speculation is correct, I can safely say that one is wrong. There is no additional cost to making the glass any resolution you want outside of size and yield, and cutting a 2x2 of 4K panels doesn't give you three 4K TVs if one fails, because of the scribe cut width.

Honestly, it's because anything less than a 4x increase in resolution would largely go unnoticed in terms of clarity, and making content for rapidly changing intermediate steps just makes things weird.

 

  

17 hours ago, OhYou_ said:

1440p and 4K came to market at roughly the same time. Sure, 1440p existed before, but it saw very low adoption, trading blows with 2560x1600 and eventually replacing it.

and I'm still upset.


I don't see any big reason we need 8K displays. I can definitely see the reasoning behind 8K for recording video, but why do people need 8K TVs? Unless they're like Linus's TCL TV, I don't see the point; people are not looking at their TVs from an inch away. The only use I can think of is splitting the display, but 4K is... enough for most people?

Don't get me wrong, I think it's cool... just not really practical for most people.

I'm usually as lost as you are


To expand on the thing Kisai said

 

8K is nice in that it allows more of our current resolutions to scale cleanly to it, and integer scaling is so much simpler.

Resolutions that work with 8K (whole-integer scaling): 240p, 480p, 720p, 1080p, 1440p, 4K.

Resolutions that work with 6K (1.5x from 4K): 1080p... that's it. You can easily scale 1080p content, and nothing else cleanly (see the quick check below).
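A quick way to sanity-check which resolutions integer-scale into a given panel height. This is a minimal sketch; the 3240-line figure assumes a "6K" that is exactly 1.5x of 4K, and real 6K panels like 6144x3456 differ.

```python
# Which common vertical resolutions divide evenly into a target panel height?
def integer_sources(target_height, candidates=(240, 480, 720, 1080, 1440, 2160)):
    return [h for h in candidates if target_height % h == 0]

print("8K (4320 lines):", integer_sources(4320))  # [240, 480, 720, 1080, 1440, 2160]
print("6K (3240 lines):", integer_sources(3240))  # [1080]
```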

 

Now let's do an example of why 1.5x gets complicated. Assume we have a 4x1 display with the following values:

Index_4:  0    1    2    3
Pixel:    255  0    128  255

 

On a 2x scale (8x1), the simplest scaling is Index_4 = Index_8 / 2 [lop off the decimal point]:

Index_8:  0    1    2    3    4    5    6    7
Pixel:    255  255  0    0    128  128  255  255

 

As you can see, it's easy: you essentially just divide by two, ignore the decimal (in binary it's super easy when the factor is a power of 2, but still easy for any whole-number multiple) and look the value up from Index_4.

 

On a 1.5x scale (6x1), things get complicated, because if you stretch the image, pixels like 1 and 2 from Index_4 land in between output positions. There are a few different methods:

 

Method 1: snap each source index to the closest scaled position, then fill in the missing values with straight-line interpolation:

Index:         0    1    2    3    4    5
Initial:       255  0    _    128  255  _
Fill in gaps:  255  0    64   128  255  255

 

As you can see in the result, it's no longer the same shape you had before; you have a doubled pixel on the end, and it doesn't maintain the same scale as the original.

 

Method 2: let's try maintaining scale by stretching Index_4 so each source pixel lands at its correct fractional position:

Index:      0    1    5/3  2     3      10/3  4    5
Expanded:   255  _    0    _     _      128   _    255
Fill gaps:  255  102  0    25.6  102.4  128   178  255

 

The above would look right if we could have 1/3 of a pixel, but we can't, so we keep only the whole-number positions:

Index:   0    1    2     3      4    5
Pixels:  255  102  25.6  102.4  178  255

 

This version maintains the relative shape of the image, but notice it's now slightly blurry and the original values 0 and 128 have essentially been tossed out; on the other hand, there's no distortion.

 

That's the general issue: you have tradeoffs when scaling by non-whole numbers, and on top of that it takes more processing power to do that kind of calculation.
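Here's a minimal Python sketch of the two approaches above: plain integer repetition for 2x, and fractional-position linear interpolation for 1.5x (the second method). It's not the exact math any real TV scaler uses, just the same idea in a few lines.

```python
import numpy as np

row = np.array([255, 0, 128, 255], dtype=float)   # the 4x1 example row from above

# 2x scale: every pixel is simply repeated, nothing has to be invented
doubled = np.repeat(row, 2)

# 1.5x scale: stretch the 4 source samples across 6 output slots, then
# linearly interpolate wherever an output pixel falls between two sources
src_pos = np.arange(len(row)) * (5 / 3)   # source positions: 0, 5/3, 10/3, 5
out_pos = np.arange(6)                     # output positions: 0..5
scaled = np.interp(out_pos, src_pos, row)

print("2x  :", doubled)           # 255 255 0 0 128 128 255 255
print("1.5x:", scaled.round(1))   # 255 102 25.6 102.4 178.8 255
```

The 1.5x output matches the second table above: the original 0 and 128 disappear, replaced by in-between values.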

 

3735928559 - Beware of the dead beef

