Why is 2160p 4K

11 hours ago, planetary problem said:

@Spotty, @dizmo, @Lurick thank you, I guess 1440p being called 2K so many times threw me off, but I have seen 1080p called 1.5K somewhere. This misconception is the reason I first started looking into the flat-out mistakes by large YouTubers, though (not specifically LTT, mostly smartphone channels).

I've heard people calling 1080p 1K, but I've never heard of 1.5K myself.

 

And yeah, 4K actually isn't that bad; it's everything below it that's the problem. 1080p is the real 2K, 1440p is 2.5K, and 2160p is 4K (which also includes 4096 x 2160, the real 4K).


Really it all just stems from people noticing that 4K has four times as many pixels as 1080p, saying "ohhh I see the pattern!" (which is totally how patterns work, right? You can definitely identify patterns by looking at a sample of 1) and then saying "well then if that's the case, 1080p would be 1K and 1440p would be 2K since it's 2x as many pixels as 1080p, that makes sense, and if something makes sense, that proves that it's true, so I should definitely not try to check at all, and instead start educating other people on the internet about my unchecked assumption about how the system works, but in an expert tone, without disclosing that it's an assumption I haven't checked at all!". And unfortunately, when it comes to terminology, once you get it up and running you can get this sort of perpetual motion machine going where everyone just says "well I just call it that because everyone else calls it that".

 

The really sad thing is that the 4K and 8K UHD resolutions were both standardized and announced at the same time, so anyone could have just tried a basic confirmation: "well, if my assumption about where the name "4K" comes from is correct, then 8K should be 8x 1080p; let's check if that's true or not" and then immediately seen that the "pattern" doesn't hold and that the whole "4K is 4x 1080p" thing is just a coincidence.
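If anyone feels like doing that check themselves, a few lines of throwaway Python are enough (just my own illustration, using the standard UHD resolutions):

# If "4K" really meant "4x the pixels of 1080p", then "8K" should mean
# 8x the pixels of 1080p. A quick ratio check shows it doesn't.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "4K UHD (3840x2160)": 3840 * 2160,
    "8K UHD (7680x4320)": 7680 * 4320,
}
base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.0f}x the pixels of 1080p")
# Prints 1x, 4x, and 16x -- not 8x for 8K, so the "pattern" breaks immediately.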


The K is just a rough measure of thousands of horizontal pixels, with 16:9 usually assumed too. People would usually just mention the resolution in full or by the vertical count with a 'p'; 4K is more of a marketing push because it's easier to say.
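Rough sketch of what I mean, just rounding the horizontal pixel count to the nearest thousand (my own illustration, not any formal mapping):

# "K" as a loose count of thousands of horizontal pixels.
widths = {
    "1920x1080": 1920,
    "2048x1080 (DCI 2K)": 2048,
    "2560x1440": 2560,
    "3840x2160 (UHD)": 3840,
    "4096x2160 (DCI 4K)": 4096,
    "7680x4320 (8K UHD)": 7680,
}
for name, width in widths.items():
    print(f"{name} -> roughly {round(width / 1000)}K wide")
# 1920 and 2048 both land around 2K, 3840 and 4096 around 4K, 7680 around 8K.
# 2560 actually rounds to 3K, which is why calling 1440p "2K" doesn't fit the scheme.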



10 hours ago, Glenwing said:

Really it all just stems from people noticing that 4K has four times as many pixels as 1080p, saying "ohhh I see the pattern!" (which is totally how patterns work, right? You can definitely identify patterns by looking at a sample of 1) and then saying "well then if that's the case, 1080p would be 1K and 1440p would be 2K since it's 2x as many pixels as 1080p, that makes sense, and if something makes sense, that proves that it's true, so I should definitely not try to check at all, and instead start educating other people on the internet about my unchecked assumption about how the system works, but in an expert tone, without disclosing that it's an assumption I haven't checked at all!". And unfortunately, when it comes to terminology, once you get it up and running you can get this sort of perpetual motion machine going where everyone just says "well I just call it that because everyone else calls it that".

 

The really sad thing is that the 4K and 8K UHD resolutions were both standardized and announced at the same time, so anyone could have just tried a basic confirmation: "well, if my assumption about where the name "4K" comes from is correct, then 8K should be 8x 1080p; let's check if that's true or not" and then immediately seen that the "pattern" doesn't hold and that the whole "4K is 4x 1080p" thing is just a coincidence.

I wonder if anyone's actually believed that 720p is called "2K" because 1280+720 = 2K. By that logic, 1080p = 3K, 1440p = 4K, 2160p = 6K, etc. 

 

Regarding "4K is 4x 1080p thing is just a coincidence", I'd say it's not simply a coincidence. Even the cinema standard holds to the same convention. There seems to be enough advantages to keeping to standardized resolutions which goes all the way down to 720p. That goes all the way to 8K, which is just 4x 4K (2x resolution on each axis) just like how 4K is 4x 1080p, 1440p is 4x 720p. 4K is also 9x 720p, being 3x on each axis.

 

It's definitely more than simply a coincidence and more likely intentional. There are too many advantages for content scaling and manufacturing not to; otherwise, why don't we see manufacturers simply bumping the resolution from 1920x1080 to 2304x1296 so they can market "44% more pixels!" (1.2^2, quick check at the end of this post)?

 

Disregarding Microsoft/Apple specifically
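Quick check of that arithmetic, since the numbers are easy to verify (my own illustration; 2304x1296 is just a hypothetical non-standard bump):

# Hypothetical 1.2x-per-axis bump from 1920x1080.
old_pixels = 1920 * 1080
new_pixels = 2304 * 1296  # 1.2x on each axis
print(new_pixels / old_pixels)  # 1.44, i.e. "44% more pixels" (1.2^2)

# Versus the standard ladder, where each step is a clean per-axis multiple:
print(3840 / 1920, 2160 / 1080)  # 2.0 2.0 -> 4K is 2x per axis over 1080p
print(7680 / 3840, 4320 / 2160)  # 2.0 2.0 -> 8K is 2x per axis over 4K
print(3840 / 1280, 2160 / 720)   # 3.0 3.0 -> 4K is 3x per axis over 720p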




1 hour ago, Agall said:

I wonder if anyone's actually believed that 720p is called "2K" because 1280+720 = 2K. By that logic, 1080p = 3K, 1440p = 4K, 2160p = 6K, etc. 

 

Regarding "4K is 4x 1080p thing is just a coincidence", I'd say it's not simply a coincidence. Even the cinema standard holds to the same convention. There seems to be enough advantages to keeping to standardized resolutions which goes all the way down to 720p. That goes all the way to 8K, which is just 4x 4K (2x resolution on each axis) just like how 4K is 4x 1080p, 1440p is 4x 720p. 4K is also 9x 720p, being 3x on each axis.

 

It's definitely more than simply a coincidence and more likely intentional. There are too many advantages for content scaling and manufacturing not to; otherwise, why don't we see manufacturers simply bumping the resolution from 1920x1080 to 2304x1296 so they can market "44% more pixels!" (1.2^2)?

 

Disregarding Microsoft/Apple specifically


The selection of the resolution isn't the coincidence; I'm just talking about the name having a "4" in it while being 4 times as many pixels as 1080p. That's not where the "4" in "4K" comes from; it's just a coincidence.


35 minutes ago, Glenwing said:

The selection of the resolution isn't the coincidence; I'm just talking about the name having a "4" in it while being 4 times as many pixels as 1080p. That's not where the "4" in "4K" comes from; it's just a coincidence.

I agree, and that falls apart because it's a squared relationship, not simple multiplication. Being x^2 can appear multiplicative at first, right up until the relationship goes from 2^2 (4K vs 1080p) to 4^2 (8K vs 1080p).

 

People actually think 4K = 4x 1080p, therefore 1080p = 1K? I guess that's the whole misconception behind 1440p = 2K in that case, even though you're still rounding up.

 

Did the math; it is a squared relationship, which is easiest to see by comparing 720p to 8K: mathematically it's a 6^2 relationship (6x per axis, 36x the pixels). That's obvious from the nature of the units, though. 4K is 3^2 relative to 720p.



On 8/31/2023 at 8:55 AM, planetary problem said:

720p= K

1080p= 1.5k

1440p= 720x2= 2k

2160p should be 3k, not 4k, then why is it like this? either my understanding of resolutions is wrong or math is not mathing

To fully answer your question, it's because people are misusing terms and applying standards improperly.

 

DCI standards dictate that 2K = 2048x1080 and 4K = 4096x2160, referring to the horizontal resolution. That's been adapted to 1920x1080 and 3840x2160 on monitors, although the only 'K' designation that's widely accepted when discussing monitors is 4K = 3840x2160, and even that isn't an actual standard.

 

The misnomer that 2K = 1440p appears to stem from the community's general lack of understanding of both what 4K means and the maths involved. Someone who believes 4K = 4x the resolution of 1080p might then conclude that, because 1440p has roughly 1.78x the pixels of 1080p, 1440p ~= 2K. That neither represents the relationship accurately (it's really 1.33^2 = 1.78, a per-axis factor squared) nor uses 4K's actual meaning, which refers to the horizontal pixel count.


720p vs 4K being 3^2. 720p vs 8K being 6^2.

1080p vs 4K being 2^2. 1080p vs 8K being 4^2.

1440p vs 4K being 1.5^2. 1440p vs 8K being 3^2.

2160p being 4K. 2160p vs 8K being 2^2.

4320p being 8K.

The base of the square is just the ratio of the vertical or horizontal pixel counts, as shown by 2160/1080 = 2 for 1080p vs 4K.


People see 4x and assume 2*2 rather than 2^2, because they don't realize that it's a quadratic/squared relationship rather than a simple multiplicative one.

 

The function is x^2 + 0x + 0, or just x^2. This also follows from the units, there being two axes of pixels.
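If it helps, the same relationships worked out in a few lines of Python (my own sketch; all 16:9 resolutions):

# Total pixel count scales as the square of the per-axis ratio.
ladder = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
for base in ("720p", "1080p", "1440p", "4K"):
    bw, bh = ladder[base]
    for target in ("4K", "8K"):
        tw, th = ladder[target]
        if tw <= bw:
            continue  # skip non-upgrades (e.g. 4K -> 4K)
        axis = tw / bw  # per-axis ratio; same as th / bh since everything here is 16:9
        total = (tw * th) / (bw * bh)
        print(f"{base} -> {target}: {axis:g}x per axis, {total:g}x the pixels ({axis:g}^2)")
# e.g. 1080p -> 4K: 2x per axis, 4x the pixels (2^2)
#      1080p -> 8K: 4x per axis, 16x the pixels (4^2)
#      720p  -> 8K: 6x per axis, 36x the pixels (6^2)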

 

Overall, people shouldn't use the 'K' terms other than 4K and 8K, which are widely accepted on monitors as 4K = 3840x2160 and 8K = 7680x4320. They likely chose 7680x4320 for 8K because it's a whole-number multiple of several standard resolutions, as shown above.



2 hours ago, Agall said:

DCI standards dictate that 2K = 2048x1080 and 4K = 4096x2160, referring to the horizontal resolution. That's been adapted to 1920x1080 and 3840x2160 on monitors, although the only 'K' designation that's widely accepted when discussing monitors is 4K = 3840x2160, and even that isn't an actual standard.

This itself also hints at some minor misconceptions.

 

4K and 2K are generic terms. They are adjectives, not names. DCI did not invent the terms 2K and 4K, and the use of these terms long predates the DCI standards. As such, the idea that the DCI standard defines what the terms 4K and 2K mean is not correct. 4096 × 2160 is an example of a 4K resolution that has been standardized by DCI. Another standardized 4K resolution is 3840 × 2160, in ITU-R BT.2020.

 

Quote

The ITU-R Recommendation lays out the quality standards for UHDTV in two steps. [...] The first level of UHDTV picture levels has the equivalent of about 8 megapixels (3 840 x 2 160 image system), and the next level comes with the equivalent of about 32 megapixels (7 680 x 4 320 image system).  As a shorthand way of describing them, they are sometimes called the ‘4K’ and ‘8K’ UHDTV systems.

 

http://www.itu.int/net/pressoffice/press_releases/2012/31.aspx#.WAJplugrKCp

Again, 4K and 8K are adjectives. Which UHDTV system? The 4K one. The 8K one.

 

Let's put it this way. I'm defining a new standard right now. My standard defines two video formats: a 16:9 format and a 21:9 format. The 16:9 format has a resolution of 1600 × 900 and the 21:9 format has a resolution of 2100 × 900.

 

Then, there will surely be some people on the internet who read this sentence:

Quote

The 16:9 format has a resolution of 1600 × 900

and then say "Look, 1920 × 1080 isn't really 16:9! See, true 16:9 is defined as 1600 × 900! See, this standard right here establishes the official definition of The 16:9 Video Format!", failing to realize that "16:9" isn't being used as a name here, just a description. In the same way, the DCI standard establishes terms and definitions to use as shorthands within the scope of the document, the way that legal documents work. But it's not intended to establish "2K" and "4K" as exclusive names for the formats it defines. They're just descriptions used within the document.

 

Like I said, the use of these terms is generic and long predates DCI. A 4K scan of 35 mm film will be around 4096 × 3112 or something like that. A 4K cinema crop will be like 4096 × 1728 or whatever. 4096 × 2160 is one particular 4K resolution that some people have standardized around for certain purposes. 3840 × 2160 is another 4K resolution that people have standardized around for some other purposes.


On 9/1/2023 at 1:52 PM, Glenwing said:

4K and 2K are generic terms. They are adjectives, not names. DCI did not invent the terms 2K and 4K, and the use of these terms long predates the DCI standards. As such, the idea that the DCI standard defines what the terms 4K and 2K mean is not correct. 4096 × 2160 is an example of a 4K resolution that has been standardized by DCI. Another standardized 4K resolution is 3840 × 2160, in ITU-R BT.2020.

I'm in no way declaring that, because a standard exists for that term in a niche context, said term applies regardless of context. I don't see how "DCI standards dictate that 2K = 2048x1080, 4K = 4096x2160" would inspire such a conclusion. DCI can declare whatever it wants; that doesn't mean people have to use it outside of cinematography, but it is a standard, and the same goes for ITU-R BT.2020.

 

The difference ultimately appears to be between the terms 4K UHD and 4K DCI.

 

Really, it's all just a way of shortening things, like an acronym or abbreviation: 4K is a lot easier to say or type than 4096x2160 or 3840x2160. Which one is meant obviously depends on context; in this context, "4K DCI" would be a necessary modifier to point to 4096x2160, since otherwise people would reasonably assume 4K = 3840x2160.

 

In a previous career, I had 4 different meanings for the acronym 'RAM', and which one was meant depended entirely on context. In the context of a tech forum, 4K will almost always mean 3840x2160. If this were a cinema forum, it would be a different story (likely riddled with confusion around the term). However, I don't think the discussion is complete without mentioning 4K DCI, like others have.



When I asked about this, I thought that a standard as common as monitor resolution must have been sorted out by now, after over a decade, but apparently not. So the solution for now is to write 1080p, 1440p rather than 2K, 4K?


19 hours ago, planetary problem said:

When I asked about this, I thought that a standard as common as monitor resolution must have been sorted out by now, after over a decade, but apparently not. So the solution for now is to write 1080p, 1440p rather than 2K, 4K?

It's sorted out, some people just don't get it, that's all.

 

If you want to be specific, say 3840×2160. If you want to use a shorthand, call it 4K UHD.

 

The convention of calling resolutions a shorthand like "2160p" refers to any resolution where the second number is 2160. So it could refer to 3840×2160 or 5120×2160 or anything else. If you say "2160p" by itself, people will usually assume you mean 2160p with a 16:9 aspect ratio, which is 3840×2160. If you want to specify a different one, include an aspect ratio, like "2160p 21:9" which would be 5120×2160.

 

The convention of calling resolutions a shorthand like "4K" refers to any resolution where the first number is around 4000. So it could refer to 3840×2160 or 4096×2160 or 3840×1600 or 4096×3072 or any number of other resolutions. Works the same as above, except the number is now rounded to make a shorter shorthand. Don't know why this confuses people so much, but it does. Again, if you just say "4K" with no other context, usually it's assumed that you mean 4K width in a 16:9 aspect ratio, so 3840×2160. If you want to refer to a different one, include an aspect ratio, like "4K 21:9".

 

Or just write out the resolution, 3840×2160, and there is no possibility for confusion.
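To put the two shorthand conventions side by side, here's a rough sketch (my own illustration of the convention, nothing formal):

# "p" shorthand = the vertical pixel count; "K" shorthand = the horizontal
# pixel count loosely rounded to the nearest thousand.
def shorthands(width, height):
    return f"{height}p", f"{round(width / 1000)}K"

for w, h in [(3840, 2160), (4096, 2160), (5120, 2160), (3840, 1600)]:
    p_name, k_name = shorthands(w, h)
    print(f"{w}x{h}: {p_name}, roughly {k_name}")
# 3840x2160: 2160p, roughly 4K
# 4096x2160: 2160p, roughly 4K
# 5120x2160: 2160p, roughly 5K
# 3840x1600: 1600p, roughly 4K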

 

On 9/5/2023 at 7:04 AM, Agall said:

I don't see how "DCI standards dictate that 2K = 2048x1080, 4K = 4096x2160" would inspire such a conclusion.

Ah. Well, the OP's original post was asking how the "K" naming convention works, with no specific context, and your explanation began with how the DCI standard defined 2K and 4K, from which the practice of using 2K and 4K as shorthands for 1920×1080 and 3840×2160 apparently derives.

On 9/1/2023 at 10:01 AM, Agall said:

To fully answer your question, it's because people are misusing terms and applying standards improperly.

 

DCI standards dictate that 2K = 2048x1080 and 4K = 4096x2160, referring to the horizontal resolution. That's been adapted to 1920x1080 and 3840x2160 on monitors, although the only 'K' designation that's widely accepted when discussing monitors is 4K = 3840x2160, and even that isn't an actual standard.

The implication, to me, was that you believed DCI was the original source of these names, and that the DCI definitions of "2K" and "4K" were the "true meaning of 2K/4K" as it were, which is a common misconception. Hopefully that clears it up.

