
"2K" does not mean 2560×1440

Glenwing
On 12/21/2016 at 10:09 AM, Leicester77 said:

Well, there is a "True 4K" and a "True 2K".

DCI 2K 2048x1080

DCI 4K 4096x2160

And then there are cropped versions for the different TV aspect ratios.

Furthermore, there is a so-called "generic term" for resolutions with around 2,000 or 4,000 pixels horizontally.

 

Why don't we just call the resolutions by their true names? Because there is one for almost every conceivable combination.

3840x2160 for example, just call it Ultra High Definition - UHD.

1920x1080 Full High Definition - FHD.

 

But still, people should stop making up things that don't exist - like 2.5K or 2.7K... just staaaahp, please.

Did you read the full post? Almost everything you just said was addressed already.

 

Pardon me for saying so, but your post seems to represent a viewpoint where all the information is taken from consumer-level sources. Terms like "2.5K" generally haven't been used in consumer-facing material, so you assume they were "made up" by random internet people. But it is very real terminology, and I quoted numerous industry examples of this term in my original post (read the last section).

 

On the other hand, 4096×2160 being "true 4K" is something that was made up by consumers (tech journalists) when "4K" was being introduced to the consumer market and every journalist was scrambling to write a "4K Explained" article. They got hung up on the fact that multiple things were being called "4K", and didn't quite catch on that it is just a category rather than a specific resolution. They kept trying to fit the square block into the round hole (as it were), and in trying to find the "real one" (there is really no such thing), they ended up picking the 4K resolution that was most commonly used or referred to at the time (4096×2160) and decided that it must be the "true" 4K resolution. And boy, did it catch on... But it has no basis in the industry itself. It is only repeated by people with no inside experience or knowledge who like to pretend that they do, by reading internet articles and feeling like they have some "special insider knowledge".

 

As for resolution names, it is useful to have shorthands that are numerically related to the resolution itself. We already tried the "unique name for every resolution" idea with WXGA, WXGA+, UXGA, WUXGA, WQXGA, and so forth... It was abandoned for good reason: the dictionary of terms grows indefinitely, and since the names have no direct relation to the resolution, it is very confusing unless you've done a lot of memorizing. "1080p" or "4K" are much better, since they are numerically related to the resolution.
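To illustrate the "category by horizontal pixel count" idea, here is a minimal Python sketch (my own illustration, not any official formula; the half-thousand rounding granularity is an assumption):

def k_label(width):
    """Label a resolution by its approximate horizontal pixel count.
    Rounds the width to the nearest half-thousand, matching how terms
    like '2.5K', '3.5K', and '4K' are used informally."""
    k = round(width / 500) / 2  # width in thousands, to the nearest 0.5
    return f"{k:g}K"

for w, h in [(1920, 1080), (2048, 1080), (2560, 1440), (3840, 2160), (4096, 2160)]:
    print(f"{w}x{h} -> {k_label(w)}")
# 1920x1080 -> 2K, 2048x1080 -> 2K, 2560x1440 -> 2.5K,
# 3840x2160 -> 4K, 4096x2160 -> 4K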


Pretty much the only time that resolution particulars actually matter is when it comes to distribution. The majority of playback hardware is designed and certified to support specific resolutions, codecs, codec settings (profile/level), audio formats, frame rates, and so on. While it may operate outside of spec, no one promises anything. Be it cinema DCP projection hardware, UHD Blu-ray, Netflix on your smart TV, or your old DVD player, that hardware is built to expect certain things, and it has to be provided exactly those things.

 

This is why your 2.11:1 widescreen Blu-ray is still 1920x1080, with black bars at the top and bottom. There is no 1920x820 or whatever in the Blu-ray spec where the image is cropped to the exact framing. Some players might even play a BD that's out of spec, but it could be distorted, refuse to play, or crash other players. The same goes for any other playback platform: universal support means following standards. And that's the only place where specific resolution numbers are actually important. But then the argument isn't "True 4K", it's "Why isn't this to spec, what is this garbage you delivered?"
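To put rough numbers on those black bars, here is a quick sketch of the arithmetic in Python (the 2.11:1 ratio is taken from the example above; the rounding is my own):

def letterbox_bar_height(frame_w, frame_h, content_aspect):
    """Height of each black bar when a wider image is letterboxed into a fixed frame."""
    active_h = round(frame_w / content_aspect)  # image height that preserves the aspect ratio
    return (frame_h - active_h) // 2

# A 2.11:1 picture delivered in a standard 1920x1080 Blu-ray frame:
print(letterbox_bar_height(1920, 1080, 2.11))  # ~85 pixels of black at top and bottom

The numbers only work out because the full 1920x1080 frame is what actually gets encoded; the player never needs to know where the picture ends and the bars begin.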


  • 2 weeks later...
On 11/16/2016 at 1:18 AM, Glenwing said:

Terms like “2K” and “4K” don’t refer to specific resolutions

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

 

3840x2160 is not "4K", it's UHD or 2160p because of the 16:9 aspect ratio; and if you go into broadcasting, it's UHD-1

 

---

 

Quote

used to classify resolutions based on horizontal pixel count

3840 x 2160 - what part of 3840 reflects 4K?


46 minutes ago, zMeul said:

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

 

3840x2160 is not "4K", it's UHD or 2160p because of the 16:9 aspect ratio; and if you go into broadcasting, it's UHD-1

 

---

 

3840 x 2160 - what part of 3840 reflects 4K?

On one of the 4K feature films I'm working on, all the native assets are 2880x2160. Would you like the director's email so you can complain to him? :V


1 minute ago, AshleyAshes said:

On one of the feature films I'm working on, all the native assets are 2880x1440. Would you like the director's email so you can complain to him? :V

I'm sorry, what does that have to do with anything?!

 


1 minute ago, zMeul said:

I'm sorry, what does that have to do with anything?!

 

Typo, it's morning here, 2880x2160. :)


2 minutes ago, AshleyAshes said:

Typo, it's morning here, 2880x2160. :)

Same thing, it's not a standard resolution - not 16:9 nor 16:10.


1 minute ago, zMeul said:

Same thing, it's not a standard resolution - not 16:9 nor 16:10.

Yet it's a 4K movie, with all assets, live footage and CG, at 2880x2160.  But don't worry, the movie is 2.11:1

 

This is exactly why I think it's silly to see you all argue over 'True 4K' while I watch all sorts of content get squished into a DCP. :P


4 hours ago, zMeul said:

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

DCI does not have a monopoly on the term "4K". The idea that "4K" is a specific term referring to 4096×2160 is something that was made up by tech journalists. Seems like you bought it.

 

Terms like "4K" have always been used as categories, not resolutions. Certain standards may define certain resolutions to standardize around, but that doesn't make the terms no longer applicable to any other resolution.

 

3840×2160 is not "just called UHD". Even in the ITU press release it's referred to as "4K".

 

Of course, I did say all this in the thread already. You may want to read more than the first few lines.


15 minutes ago, Glenwing said:

DCI does not have a monopoly on the term "4K". The idea that "4K" is a specific term referring to 4096×2160 is something that was made up by tech journalists. Seems like you bought it.

I'm sorry, what!??!

4K is a standard, not a made-up thing like you want me to believe

 

it's a standard, just like UHD

 

4K is a DCI standard; it was introduced in actual recording products in 2003 by DALSA

UHD wasn't even a thing back then; talks of adopting UHD date to about 2012. UHD is a CEA standard.

 

further reading: http://www.ultrahdtv.net/articles/is-4k-resolution-important-to-consumers/

 

please stop the misinformation


4 minutes ago, zMeul said:

I'm sorry, what!??!

4K is a standard, not a made-up thing like you want me to believe

 

it's a standard, just like UHD

 

13 minutes ago, Glenwing said:

Of course, I did say all this in the thread already. You may want to read more than the first few lines.

 


The nerd rage that this hardline stance on '4K' inspires is kinda awe-inspiring.

 

BTW, if anyone wants to know why a 4K feature would be 2880x2160: it's what happens when you use anamorphic lenses. Anamorphic lenses 'squeeze' the image into a 4:3 area, that being the actual exposure area of a piece of 35mm film. This isn't just a linear stretch like you'd see in a desktop imaging program; it has an aesthetic effect on the image as a whole and even a unique effect on lens flares. (You can read up more on Wikipedia.) A good number of films are still shot on anamorphic lenses even in the digital age, in which case the camera ignores the pixels on its sensor that would otherwise just capture black and create needless data, and that is where you get 2880x2160 as the working resolution for an entire film's pipeline, from filming to editing to VFX. ...But you know, I didn't need to explain this, because anyone invoking the Digital Cinema Initiatives in an argument totally knows all about the insides of movie making, right? :)
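For anyone curious how that 2880x2160 working resolution relates to the final frame, here is a rough Python sketch of the desqueeze bookkeeping (the 2x squeeze factor is my own assumption for illustration; the lenses actually used on that production aren't stated):

def desqueezed_aspect(stored_w, stored_h, squeeze):
    """Effective display aspect ratio of anamorphic footage stored at stored_w x stored_h."""
    return (stored_w * squeeze) / stored_h

ratio = desqueezed_aspect(2880, 2160, squeeze=2.0)
print(f"2880x2160 with a 2x squeeze -> {ratio:.2f}:1")  # 2.67:1, which then gets framed/cropped for delivery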

 

PS: Puny human vision is actually pretty insensitive to loss of vertical lines of resolution. :P


Great topic. I always watch out when I see "2K" or "4K" to make sure they mean the resolution I'm looking for!


So what about 3440x1440? :) 3.5K or 3K?

 

I know Asus is also coming out with one that is 3840x1600. Would that classify as 4K?


 


16 minutes ago, brighttail said:

So what about 3440x1440? :) 3.5K or 3K?

 

I know Asus is also coming out with one that is 3840x1600. Would that classify as 4K?

3440×1440 would be 3.5K.

 

3840×1600 would be classified as 4K since it's the same horizontal count as 4K UHD.
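For what it's worth, those two answers line up with the rough k_label() sketch from earlier in the thread (a hypothetical helper, not anything official):

print(k_label(3440))  # 3.5K
print(k_label(3840))  # 4K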


Couldn't all this trouble be solved if, instead of trying to shorten everything, we just said the full 0000x0000 resolution every time? I personally never go by UHD, FHD, 2K, 4K, or any of those; I just say the resolution I have in mind to avoid such headaches.


1 minute ago, Princess Cadence said:

Couldn't all this trouble be solved if, instead of trying to shorten everything, we just said the full 0000x0000 resolution every time? I personally never go by UHD, FHD, 2K, 4K, or any of those; I just say the resolution I have in mind to avoid such headaches.

Well you can always write out the full resolution if you want, but people have a hard time remembering specific numbers. I can't tell you how many times I've seen 1980×1080, 2560×1400, and 3820×2160 written. Abbreviations are easier to deal with.


2 minutes ago, Glenwing said:

Well you can always write out the full resolution if you want, but people have a hard time remembering specific numbers. I can't tell you how many times I've seen 1980×1080, 2560×1400, and 3820×2160 written. Abbreviations are easier to deal with.

I would fully agree with you if these naming standards were less all over the place xD In my view nothing is easier than knowing the exact number straight off, but oh well... I do have one question: are 1920x1200 and 1680x1050 considered Full HD, or is that solely 1920x1080?


1 minute ago, Princess Cadence said:

I would fully agree with you if these naming standards were less all over the place xD In my view nothing is easier than knowing the exact number straight off, but oh well... I do have one question: are 1920x1200 and 1680x1050 considered Full HD, or is that solely 1920x1080?

Full HD is 1920×1080. Technically, Full HD is a video format, not a resolution, but when people use it as a resolution it really means "the resolution used in the Full HD video format", which is 1920×1080.

 

Of course the waters get a bit muddy when you get to actual product advertisement, with things like 1680×1050 monitors being advertised as "Full HD" or "Full HD ready". By this, they usually mean something like "capable of processing Full HD video signals", though of course it downscales them to 1680×1050. It's the same deal with 1360×768 TVs being advertised as "1080p" or "Full HD ready" or something like that.


I am absolutely thrilled that some people actually understand that 2560x1440 is not 2K. It drives me crazy when the reviewers refer to it as 2K.

