Glenwing

"2K" does not mean 2560×1440


Posted · Original Poster
On 12/21/2016 at 10:09 AM, Leicester77 said:

Well there is a "True 4k" and a "True 2k".

DCI 2K 2048x1080

DCI 4K 4096x2160

And then there are cropped versions for the different TV aspect ratios.

Furthermore, there is a so-called "generic term" for resolutions around 2000 and 4000 pixels horizontally.

 

Why don't we just call the resolutions by their true names? Because there is one for almost every conceivable combination.

3840x2160 for example, just call it Ultra High Definition - UHD.

1920x1080 Full High Definition - FHD.

 

But still, people should stop making up things which don't exist - like 2.5k or 2.7k.... just staaaahp please.

Did you read the full post? Almost everything you just said was addressed already.

 

Pardon me for saying, but your post seems to represent a viewpoint where all information is taken from consumer-level sources. Terms like "2.5K" generally haven't been used in consumer marketing, so you assume they were "made up" by random internet people. But it is very real terminology, and I quoted numerous industry examples using this term in my original post (read the last section).

 

On the other hand, 4096×2160 being "true 4K" is something that was made up by consumers (tech journalists) when "4K" was being introduced to the consumer market and every journalist was scrambling to write a "4K Explained" article. They got hung up on the fact that multiple things were being called "4K", and they didn't quite catch on to the fact that it is just a category and doesn't refer to a specific resolution. They kept trying to fit the square block into the round hole (as it were), and in trying to find the "real one" (there is really no such thing), they ended up picking the 4K resolution that was most commonly used or referred to at the time (4096×2160) and decided that it must be the "true" 4K resolution. And boy did it catch on... But it has no basis in the industry itself. It is only repeated by people who have no inside experience or knowledge, but like to pretend that they do by reading internet information and feeling like they have some "special insider knowledge".

 

As for resolution names, it is useful to have shorthands that are related numerically to the resolution itself. We already tried the "unique name for every resolution" idea with WXGA, WXGA+, UXGA, WUXGA, WQXGA, and so forth... It was abandoned for good reason: the dictionary of terms grows indefinitely, and since the names have no direct relation to the resolution, it is very confusing unless you've done a lot of memorizing. "1080p" or "4K" are much better since they are numerically related to the resolution.


Pretty much the only time that resolution particulars actually matter is when it comes to distribution. The majority of playback hardware is designed and certified to support specific resolutions, codecs, codec settings (profile/level), audio formats, frame rates, and stuff like that. While it -may- operate outside of spec, no one promises anything. Be it cinema DCP projection hardware, UHD Blu-ray, or Netflix on your Smart TV or your old DVD player, that hardware is built to expect certain things and it has to be accurately provided those things.

 

This is why your 2.11:1 widescreen Blu-ray is still 1920x1080 with black bars at the top and bottom. There's no 1920x820 or whatever in the Blu-ray spec where the image is cropped to the exact framing, and while SOME players may even play a BD that's out of spec, it could be distorted, refuse to play, or crash other players. Same for any other playback platform; universal support means following standards. And that's the only place where specific resolution numbers are actually important. But then the argument isn't 'True 4K' but rather 'Why isn't this to spec, what is this garbage you delivered?'
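
To put numbers on that, here's a minimal sketch of the letterboxing arithmetic for a 2.11:1 image in a 1920x1080 frame (rounding the active height to an even line count is an assumption on my part, since video dimensions are typically constrained to even numbers):

```python
# Letterboxing a 2.11:1 image inside a 16:9 (1920x1080) Blu-ray frame.
FRAME_W, FRAME_H = 1920, 1080

def letterbox(aspect):
    """Active image height and black-bar height when an image of the
    given aspect ratio is fit to the full 1920-pixel frame width."""
    active_h = round(FRAME_W / aspect / 2) * 2  # keep it even (mod-2)
    bar_h = (FRAME_H - active_h) // 2
    return active_h, bar_h

print(letterbox(2.11))  # (910, 85): ~910 active lines, ~85-line bars
```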

On 11/16/2016 at 1:18 AM, Glenwing said:

Terms like “2K” and “4K” don’t refer to specific resolutions

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

 

3840x2160 is not "4K", it's UHD or 2160p because of the 16:9 aspect ratio; and if you go into broadcasting, it's UHD-1

 

---

 

Quote

used to classify resolutions based on horizontal pixel count

3840 x 2160 - what part of 3840 reflects 4K?

46 minutes ago, zMeul said:

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

 

3840x2160 is not "4K", it's UHD or 2160p because of the 16:9 aspect ratio; and if you go into broadcasting, it's UHD-1

 

---

 

3840 x 2160 - what part of 3840 reflects 4K?

One of the 4K feature films I'm working on, all its native assets are 2880x2160, would you like the director's email so you can complain to him? :V

Link to post
Share on other sites
1 minute ago, AshleyAshes said:

One of the 4K feature films I'm working on, all its native assets are 2880x2160, would you like the director's email so you can complain to him? :V

I'm sorry, what does that have to do with anything?!

 

Link to post
Share on other sites
1 minute ago, zMeul said:

same thing, it's not a standard resolution - not 16:9 nor 16:10

Yet it's a 4K movie, with all assets, live footage and CG, at 2880x2160.  But don't worry, the movie is 2.11:1

 

This is just why I think it's silly to see all you guys argue over 'True 4K' while I see all sorts of content basically get squished into a DCP. :P

Posted · Original Poster
4 hours ago, zMeul said:

yes, they do

 

4K is a DCI standard that refers to a very specific resolution: 4096x2160 - you will never find this resolution on TV sets

DCI does not have a monopoly on the term "4K", the idea that "4K" is some specific term that refers to 4096×2160 is something that was made up by tech journalists. Seems like you bought it.

 

Terms like "4K" have always been used as categories, not resolutions. Certain standards may define certain resolutions to standardize around, but that doesn't make the terms no longer applicable to any other resolution.

 

3840×2160 is not "just called UHD". Even in the ITU press release it's referred to as "4K".

 

Of course, I did say all this in the thread already. You may want to read more than the first few lines.

15 minutes ago, Glenwing said:

DCI does not have a monopoly on the term "4K", the idea that "4K" is some specific term that refers to 4096×2160 is something that was made up by tech journalists. Seems like you bought it.

I'm sorry, what!??!

4K is a standard, not a made up thing like you want to make me believe

 

it's a standard, just like UHD

 

4K is a DCI standard; it was introduced in actual recording products in 2003 by DALSA

UHD wasn't even a thing; the talks of adopting UHD date to about 2012 - UHD is a CEA standard

 

further reading: http://www.ultrahdtv.net/articles/is-4k-resolution-important-to-consumers/

 

please stop the misinformation

Posted · Original Poster
4 minutes ago, zMeul said:

I'm sorry, what!??!

4K is a standard, not a made up thing like you want to make me believe

 

it's a standard, just like UHD

 

13 minutes ago, Glenwing said:

Of course, I did say all this in the thread already. You may want to read more than the first few lines.

 


The Nerdrage that this hardline stance on '4K' inspires is kinda awe-inspiring.

 

BTW, if anyone wants to know why a 4K feature would be 2880x2160, it's what happens when you use anamorphic lenses. Anamorphic lenses 'squeeze' a wide image into a 4:3 area, that being the actual exposure area of a piece of 35mm film. This isn't just a linear stretch like you'd see in a desktop imaging program; it has an aesthetic effect on the image as a whole and even a unique effect on lens flares. (You can read up more on Wikipedia.) A good number of films are still shot on anamorphic lenses even in the digital age. In that case the camera ignores the pixels on its sensor which would otherwise just capture black and create needless data, and this is where you get 2880x2160 as the working resolution for an entire film's pipeline, from filming to editing to VFX. ...But ya know, I didn't need to explain this, because anyone invoking the Digital Cinema Initiatives for an argument totally knows all about the insides of movie making, right? :)
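
For illustration, a rough sketch of the desqueeze arithmetic, assuming a 2x anamorphic squeeze (the classic ratio; the post doesn't say which lenses this particular production used):

```python
# Anamorphic desqueeze: the sensor records a 4:3 frame, and the
# horizontal axis is stretched back out in post / on projection.
def desqueeze(width, height, squeeze=2.0):
    display_w = width * squeeze
    return display_w, height, display_w / height

w, h, ar = desqueeze(2880, 2160)
print(f"{w:.0f}x{h} -> {ar:.2f}:1")  # 5760x2160 -> 2.67:1
# One plausible finishing step (not stated in the thread): crop the
# 2.67:1 desqueezed image at the sides down to the 2.11:1 delivery ratio.
```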

 

PS: Puny human vision is actually pretty insensitive to loss of vertical lines of resolution. :P


So what about 3440x1440? :)  3.5K or 3K?

 

I know Asus is also coming out with one that is 3840x1600. Would this classify as 4K?



Posted · Original Poster
16 minutes ago, brighttail said:

So what about 3440x1440? :)  3.5K or 3K?

 

I know Asus is also coming out with one that is 3840x1600. Would this classify as 4K?

3440×1440 would be 3.5K.

 

3840×1600 would be classified as 4K since it's the same horizontal count as 4K UHD.
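
As a sketch of the convention being applied here (my own illustration of rounding the horizontal pixel count, not an official formula):

```python
# "K" shorthand: the horizontal pixel count, rounded to the nearest
# half-thousand and expressed in thousands.
def k_name(width):
    return f"{round(width / 500) / 2:g}K"

for width in (1920, 2560, 2880, 3440, 3840, 4096):
    print(width, "->", k_name(width))
# 1920 -> 2K, 2560 -> 2.5K, 2880 -> 3K,
# 3440 -> 3.5K, 3840 -> 4K, 4096 -> 4K
```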


Couldn't all this trouble be solved if, instead of trying to shorten everything, we just said the full 0000x0000 resolution every time? I personally never go by UHD, FHD, 2K, 4K, or any of those; I just say the resolution I have in mind to avoid such headaches.


Posted · Original Poster
1 minute ago, Princess Cadence said:

Couldn't all this trouble be solved if, instead of trying to shorten everything, we just said the full 0000x0000 resolution every time? I personally never go by UHD, FHD, 2K, 4K, or any of those; I just say the resolution I have in mind to avoid such headaches.

Well you can always write out the full resolution if you want, but people have a hard time remembering specific numbers. I can't tell you how many times I've seen 1980×1080, 2560×1400, and 3820×2160 written. Abbreviations are easier to deal with.

2 minutes ago, Glenwing said:

Well you can always write out the full resolution if you want, but people have a hard time remembering specific numbers. I can't tell you how many times I've seen 1980×1080, 2560×1400, and 3820×2160 written. Abbreviations are easier to deal with.

I would fully agree with you if these naming standards were less all over the place xD In my personal view nothing is easier than knowing the exact numbers straight off, but oh well... I do have one doubt though: are 1920x1200 and 1680x1050 considered Full HD, or is that solely 1920x1080?


Posted · Original Poster
1 minute ago, Princess Cadence said:

I would fully agree with you if these naming standards were less all over the place xD In my personal view nothing is easier than knowing the exact numbers straight off, but oh well... I do have one doubt though: are 1920x1200 and 1680x1050 considered Full HD, or is that solely 1920x1080?

Full HD is 1920×1080. Technically Full HD is a video format, not a resolution, but when people say it like a resolution it really means "the resolution used in the Full HD video format", which is 1920×1080.

 

Of course the waters get a bit muddy when you get to actual product advertisement, with things like 1680×1050 monitors being advertised as "Full HD" or "Full HD ready". By this, they usually mean something like "capable of processing Full HD video signals", though of course it downscales them to 1680×1050. It's the same deal with 1360×768 TVs being advertised as "1080p" or "Full HD ready" or something like that.
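
A quick sketch of why that downscaling can't be 1:1 (the letterboxed fallback shown at the end is one common scaler behavior, not something every monitor does):

```python
# A 1680x1050 (16:10) panel fed a 1920x1080 (16:9) signal must downscale,
# and the two axes don't shrink by the same factor.
src_w, src_h = 1920, 1080
panel_w, panel_h = 1680, 1050

print(panel_w / src_w)  # 0.875   horizontal scale
print(panel_h / src_h)  # ~0.972  vertical scale
# Stretching to fill the panel distorts the image; preserving 16:9 means
# using the smaller factor: a 1680x945 active image with ~52-line bars.
```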

