
"2K" does not mean 2560×1440

Glenwing

Well, this seems to be another rehashing of "even if people call it 2K based on a misunderstanding, and the name goes against the established convention, it doesn't matter; as long as you can convince enough people, it becomes correct". This point is addressed in the original post; I suggest reading it.

 

Question: lots of people, because they think 4K means 4 times 1080p and therefore that 2K means 2 times 1080p (i.e. 2560×1440), also by extension call 1080p "1K". Do you agree that 1K means 1080p?

 

Also, is "Porsh" the correct pronunciation of Porsche if lots of people say it that way?


As for the original question, it's pinned because, as we've just seen, there's still a lot of confusion about the proper use of "*K", whether for 2K or higher numbers like 8K, 16K, etc. (I've seen people misunderstand those quite a bit), as well as where the term comes from and how official it is.  I'm not going to re-explain everything though, since it's all addressed rather nicely in the OP imo.


...Since when does a significant portion of individuals use '2K' to refer to 2560x1440? o.O  I get that one wants to make the argument of 'Well, people call that 2K so that's what it is generally accepted that 2K means' but I've not actually seen more than a small minority of people use it as such... o.O


2 hours ago, AshleyAshes said:

...Since when does a significant portion of individuals use '2K' to refer to 2560x1440? o.O  I get that one wants to make the argument of 'Well, people call that 2K so that's what it is generally accepted that 2K means' but I've not actually seen more than a small minority of people use it as such... o.O

idk if it's a significant portion or not, but there are at least a few


6 hours ago, Ryan_Vickers said:

idk if it's a significant portion or not, but there are at least a few

Oh for sure there's a few, I've seen it.  But I've certainly not seen remotely enough to make any argument that 'This is what the term is broadly accepted to mean to people.'


3 hours ago, AshleyAshes said:

Oh for sure there's a few, I've seen it.  But I've certainly not seen remotely enough to make any argument that 'This is what the term is broadly accepted to mean to people.'

I guess it's broadly accepted among the few who feel that way :P "20% of the time, it works every time"...


  • 2 weeks later...
On 15/11/2016 at 5:23 PM, EminentSun said:

The whole 4k thing is kind of ridiculous. We should just be calling it 2160p, but nooooooo. Some marketing team decided to introduce a ridiculous moniker.

Technically, real 4K is 4096×2160, and the consumer "4K" is actually 3840×2160, which is actually named UHD


21 minutes ago, DanielMDA said:

Technically, real 4K is 4096×2160, and the consumer "4K" is actually 3840×2160, which is actually named UHD

Well, as they've come to be known.  But, as this thread tries to point out, 4K doesn't mean any particular resolution, just a class of them.  For example, you could have a 4K 1440p monitor.


36 minutes ago, DanielMDA said:

Technically, real 4K is 4096×2160, and the consumer "4K" is actually 3840×2160, which is actually named UHD

Not at all; the idea that "4096×2160 is true 4K" was completely made up by consumers and has no basis in reality. In addition, "UHD" is not a name for 3840×2160; it's a broader term. There's a section addressing this in the first post, actually.


  • 3 weeks later...

7680×4320 is not really 8K either. But they will advertise them as being 8K monitors :P

 

I want to run Skyrim LE at 7680×4320!


  • 4 weeks later...

Just call it 1080p, 1440p, 2160p. 


On 11/12/2017 at 7:07 AM, Andrewtst said:

Just call it 1080p, 1440p, 2160p. 

That's what I always do.

 

Or call them FHD, QHD or UHD.

 

I'm tired of seeing tech bloggers call QHD screens "2K" and UHD screens "4K". Not big on these consumerized terms.


3 hours ago, D13H4RD2L1V3 said:

That's what I always do.

 

Or call them FHD, QHD or UHD.

 

I'm tired of seeing tech bloggers call QHD screens "2K" and UHD screens "4K". Not big on these consumerized terms.

especially since 2K is misleading at best and straight-up wrong at worst, given that many people seem to use it to refer to 2560×1440 rather than 1920×1080 or anything else one would think of when seeing "2K"


  • 1 month later...

Urrhh...

 

Digital Cinema Initiatives, aka DCI. This should be a helpful guide and should help people understand more easily. Source: 4K resolution Wikipedia article.

 

[Image: "Digital video resolutions (VCD to 4K)" comparison chart, from the Wikipedia article on 4K resolution]


7 minutes ago, Cuddly Kitty said:

Urrhh...

 

Digital Cinema Initiatives, aka DCI. This should be a helpful guide and should help people understand more easily. Source: 4K resolution Wikipedia article.

 

[Image: "Digital video resolutions (VCD to 4K)" comparison chart, from the Wikipedia article on 4K resolution]

On 11/15/2016 at 3:18 PM, Glenwing said:

In digital cinema where these terms originate from, "4K" is and always has been a generic term that refers to a class of resolutions, not any one specific resolution. This idea that 4096×2160 is the "true 4K definition" used in cinema, you may notice, is only held by consumer-level internet people, not by anyone actually involved in cinema.

 

Yes, 4096×2160 is established as a standard resolution by the DCI specification, and they do refer to it as 4K, but that is not a term that they came up with, it's only a generic term. It's no more of a name for 4096×2160 than if you wrote a new standard saying "we're going to establish a standardized 16:9 resolution, 1600×900" and then you had a bunch of people running around on the internet saying "1600×900 is the true 16:9 resolution, it's defined in this standard, 1920×1080 isn't really 16:9!"

 

4096×2160 is not "the definition" of 4K, it is just one of several standardized 4K resolutions, just as we have several standardized 16:9 resolutions but none of them are "the definition" of 16:9, because 16:9 isn't a resolution, it's a category (in this case, anything with a width-to-height ratio of 16:9 fits in that category). And in both cases, it's not really about what resolutions are established by standards. If you have a resolution with a ratio of 16:9, then it's a 16:9 resolution, it doesn't have to have a standards document to go with it, and the same applies to 4K; any resolution with ≈4,000 horizontal pixels is a 4K resolution, because that's the definition of 4K.


  • 2 weeks later...

Your reasoning is correct.  I'm of the opinion that it doesn't matter what you call it. Remember how stupid it got back when resolution was represented by a string of seemingly random letters?  VGA, then it was SVGA, then things went full retard.


On 1/7/2018 at 1:40 AM, CostcoSamples said:

Your reasoning is correct.  I'm of the opinion that it doesn't matter what you call it. Remember how stupid it got back when resolution was represented by a string of seemingly random letters?  VGA, then it was SVGA, then things went full retard.

While certainly less user friendly than a simple number to anyone who didn't know what they mean, those were all acronyms that stood for things and were officially defined to mean one specific resolution iirc, so in that regard, tbh, it was better than what we have now.


On 1/9/2018 at 1:31 PM, Ryan_Vickers said:

While certainly less user friendly than a simple number to anyone who didn't know what they mean, those were all acronyms that stood for things and were officially defined to mean one specific resolution iirc, so in that regard, tbh, it was better than what we have now.

lol it's a sad state we are in.


  • 2 weeks later...

The title to this post is silly. Anyone who says that is silly. 2K means 2560x1440, sorry guys. It means 2K in the same way that the word "D'oh" ended up appearing in dictionaries with an official meaning after The Simpsons had been airing for 10 years - it is the meaning we've assigned to it as silly humans and that's just the way it is. People bothered by 2K referring to 2560x1440 have OCD or are trying to prove that they know stuff about computers that you don't know - either way they are the silliest of the sillies.


5 minutes ago, beakhole said:

The title to this post is silly. Anyone who says that is silly. 2K means 2560x1440, sorry guys. It means 2K in the same way that the word "D'oh" ended up appearing in dictionaries with an official meaning after The Simpsons had been airing for 10 years - it is the meaning we've assigned to it as silly humans and that's just the way it is. People bothered by 2K referring to 2560x1440 have OCD or are trying to prove that they know stuff about computers that you don't know - either way they are the silliest of the sillies.

 

On 11/15/2016 at 3:18 PM, Glenwing said:

Q: "Ultimately a shorthand "means" whatever everyone agrees it means; if it’s universally accepted that "2K" means 2560×1440, and whenever you say 2K that’s what people interpret it as, then it DOES mean 2560×1440 no matter if it has logical basis or not!"

 

But everyone doesn’t agree that 2K means 2560×1440, that’s just the problem. Although "2K = 2560×1440" is becoming a widespread misconception among consumers, with people pointing to websites like Newegg and companies like ASUS starting to use the term "2K" to refer to 2560×1440, this is far from "universal agreement" on the meaning of 2K. It may be worth noting that Newegg also lists resolutions like 3440×1440 as "2K" which shows just how far out of touch they are; they’re just using "2K" as a drop in replacement term for "1440p" without any thought about it.

 

Within the cinematography industry, where this naming convention originated from in the first place, it is agreed without any ambiguity that "2K" refers to resolutions like 1920×1080 or 2048×1080, while 2560×1440 is definitely classed as a 2.5K resolution. Examples of this can be seen in the above section, "Examples of How the Cinematography Industry Uses These Terms".

 


I don't know why this upsets people. I think people should be waaaaay more upset over the other terms, like "Quad HD" for 2k and "Ultra HD" for 4k. What's next? Quad-Ultra? Super-Mega-Ultra? Supercalifragilisticexpialidocious-HD? 


39 minutes ago, beakhole said:

I don't know why this upsets people. I think people should be waaaaay more upset over the other terms, like "Quad HD" for 2k and "Ultra HD" for 4k. What's next? Quad-Ultra? Super-Mega-Ultra? Supercalifragilisticexpialidocious-HD? 

It's not a matter of being upset; I was just correcting a misconception that I saw, which is what I do around here. When I see one that is widespread, I post a sticky to avoid answering it over and over, as I've done with dozens of other topics here. Obviously this particular one has become somewhat more ingrained than when I originally posted, though.


On 2016-11-16 at 12:18 AM, Glenwing said:

Terms like "2K" and "4K" don’t refer to specific resolutions. They are resolution categories. They are used to classify resolutions based on horizontal pixel count. "2K" refers to resolutions that have around 2,000 (2K) pixels horizontally. Examples include:
  • 1920 × 1080 (16:9)
  • 1920 × 1200 (16:10)
  • 2048 × 1080 (≈19:10)
  • 2048 × 1152 (16:9)
  • 2048 × 1536 (4:3)

All of these are examples of 2K resolutions. 1920×1080 is a 2K resolution. 2048×1080 is another 2K resolution. 2560×1440 is not a 2K resolution, it is a 2.5K resolution.

 

"2.5K" refers to resolutions around 2,500 (2.5K) pixels horizontally. For example:

  • 2304 × 1440 (16:10)
  • 2400 × 1350 (16:9)
  • 2560 × 1080 (64:27 / ≈21:9)
  • 2560 × 1440 (16:9)
  • 2560 × 1600 (16:10)

All of these are examples of 2.5K resolutions.

 

So why do people call 2560×1440 "2K"?

 

Because when "4K" was new to the consumer market, people would ask: "What's 4K?", and usually the response was "it’s four times as many pixels as 1080p". Unfortunately most people misinterpreted this and assumed that the "4" in "4K" actually stood for "how many times 1080p" the resolution was, and since 2560×1440 is popularly known as being "twice as many pixels as 1080p" (it's 1.77 times, but close enough), some people decided to start calling it "2K", and other people heard that and repeated it.

 

While it’s true that 4K UHD (3840×2160) is four times as many pixels as 1920×1080, that isn’t why it’s called "4K". It’s called 4K because it's approximately 4,000 pixels horizontally. The fact that it’s also 4 × 1080p is just a coincidence, and that pattern doesn’t continue with other resolutions.

 

For example, the 5K resolution featured in the Retina 5K iMac, 5120×2880, is equivalent to four 2560×1440 screens. If 1440p is "2K" because it’s twice as many pixels as 1080p, then wouldn’t four of them together be called "8K"? (Well, technically 7K since like I said 1440p is 1.77 times not 2 times 1080p, but that’s beside the point). We don’t call it 7K or 8K. We call it 5K, because it's around 5,000 pixels horizontally. It has nothing to do with "how many times 1080p" the resolution is.

 

In addition, an actual 8K resolution such as 8K UHD (7680×4320) is equivalent to four 4K UHD screens. A single 4K UHD screen is four times as many pixels as 1080p, so four of those together is sixteen times as many pixels as 1080p. But 7680×4320 isn't called "16K", it’s called "8K", because it’s approximately 8,000 pixels horizontally. Again it doesn't have anything to do with "how many times 1080p" the resolution is.

 

So although 2560×1440 is around twice as many pixels as 1080p, it is not called "2K", because that isn’t where these names come from. Since 2560×1440 is approximately 2,500 pixels horizontally, it falls into the 2.5K classification.
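
To put numbers on that, here is a minimal Python sketch (purely illustrative; the resolution list and names are just examples) comparing each resolution's pixel count as a multiple of 1080p against the horizontal pixel count that the "K" name is actually based on:

```
# Minimal sketch: compare "how many times 1080p" against the horizontal
# pixel count, which is what the "K" shorthand is actually based on.
# The resolution list here is just illustrative.
BASE_PIXELS = 1920 * 1080  # pixel count of 1080p

resolutions = {
    "2560×1440 (QHD)":    (2560, 1440),
    "3840×2160 (4K UHD)": (3840, 2160),
    "5120×2880 (5K)":     (5120, 2880),
    "7680×4320 (8K UHD)": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    times_1080p = (w * h) / BASE_PIXELS  # e.g. 1.78, 4.0, 7.11, 16.0
    horizontal_k = w / 1000              # e.g. 2.56, 3.84, 5.12, 7.68
    print(f"{name}: {times_1080p:.2f}x the pixels of 1080p, ~{horizontal_k:.2f}K wide")
```

The output matches the reasoning above: 2560×1440 is only about 1.8 times the pixels of 1080p and ≈2.56K wide, while 7680×4320 is 16 times the pixels of 1080p yet only ≈7.68K wide, which is why it is called 8K rather than 16K.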

 

Examples of How the Cinematography Industry Uses These Terms


RED:


 

[Screenshots: resolution and recording-format charts from the RED Scarlet-W Manual]

 

In the charts above, the naming convention is made pretty clear, though it's not without its inconsistencies. For example, every 6:5 format has a far lower horizontal pixel count than its name suggests since these formats are intended to be used with anamorphic lenses, and the images will have a wider horizontal pixel count once they are de-squeezed. There are other minor oddities like 5568×3160 being classified as 6K while 5632×2948 is classified as 5.5K, but this is somewhat expected since this naming convention does not have any "official" set of rules for determining names, it's all just convention-based. In any case, despite the occasional deviation, the main pattern of the naming convention quite clearly follows the horizontal pixel count, and definitely not "how many times 1080p".

 

Just to sum up some of the more interesting parts of the above charts from the RED Scarlet-W manual:

  • 1920×1080 is listed as "2K 16:9 (HD)".
  • 2560×1080 is listed as "2.5K 2.4:1". Despite being an "ultrawide" version of 1920×1080 (2K 16:9), calling it "2K ultrawide" is improper usage of the term 2K, as it is a 2.5K resolution, not 2K.
  • 2560×1340 is listed as "2.5K Full Frame"; it’s safe to say that if 2560×1440 were included on the list it would be classified as a 2.5K resolution as well. (You might think "1340" is just a typo for "1440", but it's more likely a typo for "1350", which would make it a 256:135 (≈19:10) ratio, consistent with the other full frame resolutions listed.)
  • 3840×2160 and 4096×2160 are both classified as 4K resolutions. 4096×2160 is not "the only" 4K resolution.
  • 5120×2160 (ultrawide version of 3840×2160) is listed as "5K 2.4:1". Calling it "4K ultrawide" is improper usage of the term 4K, as it is a 5K resolution, not 4K.

 

Blackmagic Design:

 

[Screenshot: resolution chart from the Blackmagic Cinema Camera PL]

2400×1350 is classified as a 2.5K resolution here. A slightly higher resolution like 2560×1440 would also be classified as 2.5K, certainly not 2K.
 
Canon:
 
[Screenshot: Canon resolution/format chart]
 
Note here that 2048×1080 and 1920×1080 both fall under the "2K" categories. 2K definitely does not refer to 2560×1440 or similar resolutions. 4096×2160 and 3840×2160 are also both classified as "4K" resolutions. 4096×2160 is not "the only" 4K resolution.

 

"True 4K"


 

While I’m here, I may as well address this one too. Some people will get upset when you call 3840×2160 "4K", and will say:

 

"3840×2160 isn’t ‘4K’, that’s ‘UHD’! True 4K is 4096×2160!"

 

And some go as far as saying 4K TVs are a consumer scam because they're not "real 4K". This is nonsense, really. As explained at the top, "4K" isn’t a resolution. It’s a category. The term is used to refer to any resolution approximately 4,000 (4K) pixels horizontally, for example:

  • 3840 × 1600 (24:10 / ≈21:9)
  • 3840 × 2160 (16:9)
  • 3840 × 2400 (16:10)
  • 4096 × 2160 (≈19:10)
  • 4096 × 2304 (16:9)
  • 4096 × 2560 (16:10)
  • 4096 × 3072 (4:3)

All of these are examples of 4K resolutions. None of them is "the one true" 4K resolution, because there is no such thing. They are all classified as 4K resolutions, and neither 3840×2160 nor 4096×2160 is more "true" than the other.

 

In digital cinema where these terms originate from, "4K" is and always has been a generic term that refers to a class of resolutions, not any one specific resolution. This idea that 4096×2160 is the "true 4K definition" used in cinema, you may notice, is only held by consumer-level internet people, not by anyone actually involved in cinema.

 

Yes, 4096×2160 is established as a standard resolution by the DCI specification, and they do refer to it as 4K, but that is not a term that they came up with, it's only a generic term. It's no more of a name for 4096×2160 than if you wrote a new standard saying "we're going to establish a standardized 16:9 resolution, 1600×900" and then you had a bunch of people running around on the internet saying "1600×900 is the true 16:9 resolution, it's defined in this standard, 1920×1080 isn't really 16:9!"

 

4096×2160 is not "the definition" of 4K, it is just one of several standardized 4K resolutions, just as we have several standardized 16:9 resolutions but none of them are "the definition" of 16:9, because 16:9 isn't a resolution, it's a category (in this case, anything with a width-to-height ratio of 16:9 fits in that category). And in both cases, it's not really about what resolutions are established by standards. If you have a resolution with a ratio of 16:9, then it's a 16:9 resolution, it doesn't have to have a standards document to go with it, and the same applies to 4K; any resolution with ≈4,000 horizontal pixels is a 4K resolution, because that's the definition of 4K.

 

The entire "4096×2160 is true 4K" thing was completely made up by tech news websites when 4K TVs were first starting to appear. Of course, all the major tech websites wanted to write a "4K explained" article, and of course being consumer-level writers, they themselves knew nothing about the established usage of the term "4K" (which had been used for years in cinema at this point).

 

Long story short, all the articles about "4K explained" were written by a bunch of consumers who know nothing about cinema, and are based on a few Google searches for "4K" in which these writers saw that 4096×2160 was mentioned a lot (since it is quite a common resolution), investigated a little and saw that it was a DCI standard, and leapt to the conclusion that "4K" was a unique name that referred exclusively to 4096×2160 in the same way that "Full HD" refers to 1920×1080.

 

Unfortunately their little assumption was completely wrong, and none of them researched enough to understand how the term "4K" was (is) actually used in industry. But boy did it catch on. Mostly, I suspect, because people on the internet like the feeling of knowing things that other people don't know, or feeling that they're doing things (or saying things) "how the pros do it", and believing the 4096×2160 true 4K thing makes them feel as though they have special cinema industry insider knowledge. Sadly, the entire thing was completely made up by consumers. Sorry to say.

 

"UHD" is not a name for 3840×2160

 

Secondly, UHD is not a name for 3840×2160. The whole "4096×2160 is 4K, and 3840×2160 is just called UHD" thing is entirely wrong; both of those resolutions are 4K resolutions, and in fact both of those resolutions are UHD resolutions as well. UHD is a term created by CEA as a marketing standard to refer to displays that meet certain requirements. Here is the relevant part of the definition of UHD:

 

UHD is basically a class of display; note that the definition is at least 3840×2160, and 16:9 or wider. This means that higher resolutions and wider ratios, such as 4096×2160, or even 5120×2880 or higher, or ultrawide resolutions, qualify as "UHD resolution". UHD does not have to be 3840×2160, or even a 4K class display at all.

 

3840×2160 is established as a standard by the ITU, and that standard never defines "UHD" as 3840×2160. This is, again, something made up by the internet because it's simple and easy to say "4096 is 4K, and 3840 is UHD", so it catches on easily.

 

This standard does the exact same thing, establishing "4K" and "8K" as shorthand terms for discussing the formats in the context of the standards document. It does not mean 4K is "the name" for 3840×2160, just as the DCI specification's usage of the term does not mean 4K is "the name" for 4096×2160 either. It is just a general term used in the industry for anything ≈4,000 pixels horizontally, but of course may have specific meanings within certain standards documents, which are made clear in the documents themselves and only apply within that respective document.

 

There is no sense in which 3840×2160 is "just called UHD", or in which 4096×2160 is the "one true" 4K resolution.

 

"K" and "Ultrawide"


 

Every once in a while I see someone asking about "4K ultrawide", and what they mean by that is the most common 4K resolution (4K UHD, 3840×2160) extended horizontally to a ≈21:9 aspect ratio, so something like 5120×2160 (or four times 2560×1080).

 

Unfortunately this is really a misuse of the term "4K". Remember that 4K isn’t a name for a specific resolution like 3840×2160, so "4K ultrawide" doesn’t mean "that resolution, but wider". The "K" term refers to the width in pixels, so "something wider than 4K" would be called 5K. Saying "4K ultrawide" is like asking for an extra-wide 4-meter-wide table or something like that. "You mean...a 5-meter-wide table?" "No, a 4 meter table, but extra wide! Like maybe 5 meters in width!" "...Right..."

 

5120×2160, being ≈5,000 (5K) pixels horizontally, is a 5K resolution, so calling it "4K ultrawide" doesn’t really make sense. Terms like "1080p" and "1080p ultrawide", or "1440p" and "1440p ultrawide" work because the numbers 1080 and 1440 refer to the height rather than width, so when you have a resolution that is the same height but wider, you can still use the same number and it makes sense.

 

But a term like "2.5K" can’t just be used as a drop-in replacement for "1440p", because not all 1,440 pixel-tall screens have ≈2,500 pixels horizontally. Only 1440p screens with 16:9 ratios do. A 1440p screen with a wider ratio like 21:9 will have more horizontal pixels, which will classify it as a 3K or 3.5K resolution, even though the vertical pixel count (1440p) remains the same.

 

The same is true with 4K resolutions. A resolution like 3840×2160 (a 16:9 ratio) is a 4K resolution that could also be referred to as "2160p", but this does not mean "2160p" and "4K" are interchangeable. Extending 3840×2160 to a wider ratio like 21:9 results in a resolution that is still 2,160 pixels tall, but is 5K pixels wide instead of 4K. So even though "2160p ultrawide" still makes sense for that resolution, the "4K ultrawide" name does not.
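
As a rough illustration (a sketch only; the 43:18 and 64:27 ratios are simply the exact ratios of the common 3440×1440 and 5120×2160 ultrawide resolutions), holding the height fixed and widening the aspect ratio pushes the horizontal pixel count, and therefore the "K" class, into the next category:

```
# Sketch: same vertical pixel count, different aspect ratios.
# The horizontal width (and therefore the "K" class) changes.
def width_for(height, ratio_w, ratio_h):
    """Horizontal pixels for a given height and aspect ratio."""
    return round(height * ratio_w / ratio_h)

print(width_for(1440, 16, 9))   # 2560 -> ~2.5K
print(width_for(1440, 43, 18))  # 3440 -> ~3.5K (the common "21:9" 1440p ultrawide)
print(width_for(2160, 16, 9))   # 3840 -> ~4K
print(width_for(2160, 64, 27))  # 5120 -> ~5K (the common "21:9" 2160p ultrawide)
```

The height never changes, but the width moves from ≈2.5K to ≈3.5K and from ≈4K to ≈5K.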

 

"But if I can’t call it 4K ultrawide, what should I call it? If I say "5K", people will think I’m talking about 5120×2880 (16:9), and if I say 5K ultrawide then people will think I mean an ultrawide extension of that resolution, plus 5120×2160 is more like 5K ultrashort anyway..."

 

Usually it’s assumed when you say "4K" or "5K" that you’re talking about the 16:9 resolutions since they are the most common, so when people refer to different aspect ratios they’ll usually include the aspect ratio to avoid confusion. 5120×2160 can be referred to as "5K 21:9" or something like that.

 

Another alternative that has been used is writing out both dimensions with "K" instead of just the horizontal. For example, 3840×2160 (4K UHD) is often called "4K × 2K". An ultrawide version of that resolution, 5120×2160, would be referred to as "5K × 2K", while the 16:9 resolution 5120×2880 is referred to as "5K × 3K", so this convention does keep the two resolutions distinguishable from each other.

 

A third option (and probably the best option for most people) would be to not mix "K" and "ultrawide" together at all. Just use the old "vertical pixel count" convention and call it "2160p ultrawide".

 

Of course, you can always just write out the full resolution if you want to avoid any ambiguity, that option is always available too.

 

"But what about..."


 

Q: "But what about resolutions like 1280×720? Is that 1K? 1.5K? 1.25K? 1.28K? If we round to the nearest 0.5K, then it’s 1.5K, but then what about 1600×900? Is that also 1.5K?"

 

"K" is a casual shorthand, not a full-blown naming system. Typically it isn’t used at all for low resolutions like 1280×720, and there are some mid-range resolutions like 2304×1440 and 2880×1800 where people question how sensible this convention is (should 2560×1440 be called 2.5K while 2880×1800 and 3200×1800 are both 3K, even though 2880 is an equal distance between 2560 and 3200?).

 

The "K" shorthand originated in the cinema industry where discussions about resolution are generally centered around a few monolithic classes of resolutions, so this shorthand was never intended to be "high precision". If used for resolutions in the PC industry, yes there will be some that are ambiguous in what they should be called. If you’re talking about an unusual resolution, then it’s best to write out the full resolution rather than using abbreviations. These shorthands don’t cover every possible resolution and they were never intended to do so.

 

Q: "But if we round to the nearest 0.5K, shouldn’t 7680×4320 be called 7.5K rather than 8K? If it’s not 7.5K because you’re rounding to the nearest whole number, then 2560×1440 shouldn’t be 2.5K either!"

 

2560×1440 rounds up to 3K if using the nearest whole number, so you still don’t have any reason to call it 2K. Good try though.

 

This system of shorthands isn’t strictly defined by any industry body, and so there are no absolute rules. But the general consensus is that at the higher values (above 5K, usually) more "plus or minus" margin is given and we round to the nearest whole number instead of 0.5 value. At lower resolutions, it doesn't take as many pixels for two resolutions to be significantly different, so more granularity is needed to distinguish them, so we round to the nearest 0.5. There is no "definition" of where the absolute cutoff point is though.

 

For example if we rounded to the nearest 1K for all resolutions, then 1600×900 and 2304×1440 would both be considered "2K" resolutions, even though the difference between them is very significant. So instead, we round to the nearest 0.5K, and the names become 1.5K vs 2.5K, which gives a better representation of the difference. On the other hand if you had higher resolutions that were maybe 7680 vs. 8192, the difference isn’t really very significant (percentage-wise it's the same as the difference between 1920 and 2048), so there’s not much point in naming them to different categories.
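
Expressed as code, the informal convention might look something like this (a sketch only; there is no official rule, and the point where the rounding switches from 0.5K steps to whole-K steps is just the rough 5K ballpark mentioned above):

```
def k_class(width):
    """Approximate 'K' class from horizontal pixel count: rounds to the
    nearest 0.5K below roughly 5K and to the nearest whole K above that,
    per the informal convention described above (no official rule exists)."""
    k = width / 1000
    if k < 5:
        return round(k * 2) / 2  # nearest 0.5K
    return float(round(k))       # nearest whole K

print(k_class(1920))  # 2.0 -> 2K
print(k_class(2560))  # 2.5 -> 2.5K
print(k_class(2880))  # 3.0 -> 3K
print(k_class(3840))  # 4.0 -> 4K
print(k_class(5120))  # 5.0 -> 5K
print(k_class(7680))  # 8.0 -> 8K
```

It reproduces the examples used in this post: 2560 lands in 2.5K (not 2K), 2880 and 3200 both land in 3K, and 7680 lands in 8K.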

 

Q: "Why are we suddenly using horizontal pixels anyway, vertical makes much more sense! The "4K" name is just marketing gibberish created by TV companies!"

 

The "K" shorthand was not created by TV companies. It is borrowed from the cinematography industry, where it has been used commonly for years prior to the introduction of 4K TVs to the consumer market. In cinematography, it makes much more sense to use horizontal resolution to classify images, because movies are often cropped vertically (black bars on top and bottom), so naming resolutions by vertical pixel count would mean the resolution classification of the material would change just based on how much black bar is added, even though the detail level of the image hasn’t changed. Since resolution is used to classify detail level, it doesn’t make any sense to have a classification system that changes designations when the detail level remains the same.

 

Instead, horizontal resolution is used to classify images, so that images with the same level of detail will be classified in the same group regardless of what aspect ratio has been chosen for the material.
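
For instance (a small sketch; 2048×858 is the DCI 2K "Scope" framing and 1920×800 is a common ≈2.40:1 crop of a 1080p frame), cropping changes the height but leaves the width, and therefore the width-based classification, untouched:

```
# Sketch: a 2K master cropped to a wider presentation keeps its width,
# so its width-based "K" class stays the same, while a height-based
# name ("1080p" vs "800p") would change.
examples = [
    ("1920×1080 (16:9, full frame)",       1920, 1080),
    ("1920×800 (cropped to about 2.40:1)", 1920, 800),
    ("2048×858 (DCI 2K 'Scope', ~2.39:1)", 2048, 858),
]
for name, width, height in examples:
    k = round(width / 1000 * 2) / 2  # nearest 0.5K
    print(f"{name}: {k}K class, {height} pixels tall")
```

A height-based name would jump from "1080p" to "800p" even though the detail level of the image is unchanged.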

 

However, in gaming, images are not cropped vertically when moving to a wider aspect ratio, but instead they are expanded horizontally because the content is rendered in real time, so it is possible to generate additional new content to fill the extra width rather than just expanding the existing image to fill the screen and cropping the top and bottom off. In this case classifying resolutions based on width alone isn’t all that useful, because the aspect ratio plays a much larger role in how the content appears on the screen.

 

But since the TV industry is more concerned with cinematic material than it is with gaming, they chose to use the "K" shorthand that is used in cinematography.

 

Q: "Ultimately a shorthand "means" whatever everyone agrees it means; if it’s universally accepted that "2K" means 2560×1440, and whenever you say 2K that’s what people interpret it as, then it DOES mean 2560×1440 no matter if it has logical basis or not!"

 

But everyone doesn’t agree that 2K means 2560×1440, that’s just the problem. Although "2K = 2560×1440" is becoming a widespread misconception among consumers, with people pointing to websites like Newegg and companies like ASUS starting to use the term "2K" to refer to 2560×1440, this is far from "universal agreement" on the meaning of 2K. It may be worth noting that Newegg also lists resolutions like 3440×1440 as "2K" which shows just how far out of touch they are; they’re just using "2K" as a drop in replacement term for "1440p" without any thought about it.

 

Within the cinematography industry, where this naming convention originated from in the first place, it is agreed without any ambiguity that "2K" refers to resolutions like 1920×1080 or 2048×1080, while 2560×1440 is definitely classed as a 2.5K resolution. Examples of this can be seen in the above section, "Examples of How the Cinematography Industry Uses These Terms".

 

 

Thank you! So many times I have ended up in this discussion on YouTube, and every time I lose the argument because there are simply too many people out there who don't understand this concept of horizontal pixel count.

 

It has been especially evident since the release of the 3840x1600 monitors, with people (including Linus himself) complaining that "this screen is not 4K".

 

 

People keep forgetting that 2K/FullHD MOVIES are often shot in 21:9, meaning they are not 1080p, but 1920x800 pixels instead. 

