Glenwing

"2K" does not mean 2560×1440


I can see I'm not the only one that is annoyed by this.


Want to learn how to make your own custom Windows 10 image?

 

Main Rig: Intel i7-7700K 5GHz | Gigabyte Z170N | EVGA 1080 ACX 3.0 SC | 16GB Trident Z 3200MHz | 256GB 840 EVO | 960GB Corsair Force LE | EVGA P2 650W | Custom Loop

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA


Spread the word - this post is the most complete and informative explanation anyone will ever get about resolution categories, including a detailed explanation that 4K is 4096x2160 and not 3840x2160, x-x! Although every YouTuber will still misinterpret the terms 2K and 4K regardless.


Groomlake Authority

19 minutes ago, EminentSun said:

The whole 4k thing is kind of ridiculous. We should just be calling it 2160p, but nooooooo. Some marketing team decided to introduce a ridiculous moniker.

Most TVs in stores say "4K 2160p" on them, and most places, like YouTube, list 2160p as well as 4K.


 

 

 

Posted · Original Poster (OP)
11 minutes ago, VerticalDiscussions said:

Spread the word - this post is the most complete and informative explanation anyone will ever get about resolution categories, including a detailed explanation that 4K is 4096x2160 and not 3840x2160, x-x! Although every YouTuber will still misinterpret the terms 2K and 4K regardless.

You may want to read the "True 4K" section at the bottom ;)

 

4096×2160 and 3840×2160 are both 4K resolutions.


Very informative and awesome thread. I've been really confused about the whole 4K/2K thing. Now I know.



PC: CPU: i5 4690k,  Cooler: Kraken x61, Motherboard: Asus Z97-K, GPU: MSI GTX 1080 Gaming X, RAM: Kingston HyperX Fury Black 2x4GB DDR3 1866, HDD: Seagate Barracuda 1TB, SSD: Samsung 850 Evo 250GB, Case: NZXT H440 Razer Edition, PSU: Seasonic M12II Evo 620W I Peripherals: Keyboard: Ducky Shine 6 RGB Special Edition, Mouse: Razer DeathAdder Elite, Headset: Kingston HyperX Cloud II, Mousepad: Razer Goliathus Control Extended I Laptop: Lenovo Z400 CPU: i5-3230M @ 2.6GHz, GPU: Nvidia GeForce GT740M, Memory: 4GB DDR3, HDD: 500GB

 


@Glenwing I will admit, I always assumed my 2560x1440 monitor was 2k.

 

Thanks for the learnin


CPU — i7 6700k - 4.4GHz

GPU — EVGA GTX 1080 FTW - Factory OC

Monitor — Acer Predator XB271HU - 2560x1440 165Hz IPS 4ms

CPU Cooler — Noctua NH-D14

Motherboard — MSI z170a Gaming m5

Memory — 32GB G.Skill Ripjaws V series - 2133MHz (2 x 8GB)

Storage — WD Black - 2TB HDD 7200rpm

        — Samsung 850 EVO - 250GB SSD

        — Samsung 960 EVO - 250GB M.2 SSD

Case — Phanteks Eclipse P400 [purchased for $49.99?!]

PSU — EVGA SuperNOVA G3 - 850W 80+ Gold

OS — Windows 10 Home

Wireless Adapter — TP-Link Archer T9E - 802.11a/b/g/n/ac 

Case Fans — 120mm Noctua NF-F12 PWM - exhaust

          — 140mm Noctua NF-A14 PWM - intake

          — 140mm Noctua NF-A14 PWM - intake

Keyboard — Max Keyboard TKL Blackbird - Cherry MX blue switches - Red Backlighting 

Mouse — Razer Naga 2014

Headphones — Sennheiser HD600

Extras — Glorious PC Gaming Race - Mouse Wrist Rest  

       — Glorious PC Gaming Race - XXL Extended Mouse Pad - 36" x 18"

       — LED Strip - Phanteks PH-LEDKT - 1 meter

       — Mechanical ten key - Cherry MX blue switches

 

Main Machine : https://pcpartpicker.com/list/R3LPwV

$200 Linux Toy : https://pcpartpicker.com/list/dfw6NN

 

 


I have been annoyed a lot of times when people use 2K wrong 😑


“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking

Posted · Original Poster (OP)
1 minute ago, Snadzies said:

People can go on wrongly calling 3840 "4K" all they want; I won't stop 'em, as it's too ingrained in their heads already, but I will laugh at them every time they do.

3840×2160 is a 4K resolution. Did you read the "true 4K" section? :P

Just now, Glenwing said:

Did you read the "true 4K" section? :P

Yes, I did; it doesn't make it any less ridiculous.

If you want to go around saying "meh, close enough", that's fine, but it doesn't make it true, since there is no agreed-upon standard.


I remember being criticized for saying 1440p isn't technically 2K; the user only responded with "Newegg uses it in their marketing and LTT has referred to it as 2K."

 

That said however, there's information on here that I was ignorant to up until now, so thanks for the post and clarification, @Glenwing.


i have a 1080 Ti to run PUBG at medium settings


You really love putting things in categories, don't you? You can analyze and simplify colloquialisms all you want, but how dare you put two types of people even close to a comparison?

On 11/15/2016 at 6:18 PM, Glenwing said:

"True 4K"

  Hide contents

"True 4K"

 

While I’m here, I may as well address this one too. Some people will get upset when you call 3840×2160 “4K”, and will say:

 

“3840×2160 isn’t ‘4K’, that’s ‘UHD’! True 4K is 4096×2160!”

 

And some go as far as saying 4K TVs are a consumer scam because they're not "real 4K". This is nonsense, really.

 

Yeah, I agree that 4K can be considered any resolution with around 4,000 horizontal pixels, but only as a colloquialism. How about when 8K starts to come around? You only touch on it. It comes out to 7680x4320; no one is going to call that 7.5K, they're going to call it 8K. Unless you want to avoid that, you have to accept that "True 4K" really does mean 4096x2160. I don't know who the hell says consumer TVs aren't real 4K; those people are just dumb.

 

I am one of those people who say 4096x2160 is still True 4K, because Digital Cinema basically decided that while film was still the standard. It's how the film was made, it's how it was transferred, and it's what DCI-compliant projectors use even today - yes, that includes digital projectors. It originates in film scanning: high-quality 35mm, and especially 70mm, film must be scanned at very high resolutions, and they found that 2048x1080 was better and easier for cropping into the CinemaScope aspect ratio (2.40:1). Double that and you get 4096x2160 - True 4K by cinema standards, where "4K" was first ever used.

 

Here's a list of how film is scanned, which includes resolutions, aspect ratios, and pixel aspect (kind of like pixel pitch):

https://renderman.pixar.com/view/resolution-table

 

Bottom line: True 4K is 4096x2160. Consumer TVs call 3840x2160 "4K" because digital cameras like the RED ONE had an option for it, since it's a perfect 16:9 aspect ratio.

If I had to sum up what I'm trying to say: you're perfectly right in everything you say, but only as a simplification of terms for the general TV consumer. When it comes to the actual industry and cinema standards, anyone in the industry, or any film fanatic, will likely tell you that "True 4K" is 4096x2160. People who think consumer TVs don't have "real 4K" are just dumb and shouldn't be even remotely compared to people in the industry.

 

 


I have two 1680x1050 monitors - now how much "K" for each monitor then? 1.7K? Because each screen has 1,764,000 pixels in total and is almost 1700 pixels wide?

Honestly, I could not care less about my K score, because response time, contrast/color reproduction, viewing angle, and overall picture quality are way more important IMO.

Nobody was talking about any K before they came up with this BS to market their UHD TVs - as if the XXXXp labeling scheme wasn't confusing enough already, because that only referred to certain aspect ratios - like 1600x900 is called 900p, but 1440x900 is NOT.

Along with the race for the highest K rating, pixel density has evolved into some kind of pissing contest as well, while the scaling options of most graphical operating systems are still horrible - tell me about how you work your spreadsheet or plain-text HTML code on your slick 14" laptop with the fancy 3840x2160 screen - I bet your eyes will be sore before lunchtime.

Sidenote: my all-time favorite native resolution is 1920x1200 (16:10 ratio), because both 1920x1080 (16:9) and 1600x1200 (4:3) can be displayed without any interpolation (= no loss of quality).

And at least ONE of these two resolutions is always supported by a game (except maybe for the really, REALLY old stuff from the DOS or early Win95 era).

There will in both cases be black bars (or stretching), but they will be smaller than when you view 4:3 content on a 16:9 screen and vice versa (it also won't look half as bad if one decides to stretch the picture instead of adding black borders).
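The black-bar arithmetic for a 1920x1200 panel can be sanity-checked with a quick sketch (the helper name is just for illustration; it assumes the content is centered at 1:1 pixel mapping, no scaling):

```python
def letterbox_bars(screen_w, screen_h, content_w, content_h):
    """Size of the (left/right, top/bottom) borders when content is centered unscaled."""
    return ((screen_w - content_w) // 2, (screen_h - content_h) // 2)

# 1920x1080 (16:9) on a 1920x1200 (16:10) panel: thin 60 px bars top and bottom
print(letterbox_bars(1920, 1200, 1920, 1080))  # (0, 60)

# 1600x1200 (4:3) on the same panel: 160 px pillarbox bars left and right
print(letterbox_bars(1920, 1200, 1600, 1200))  # (160, 0)
```

Compare that with unscaled 4:3 content on a 16:9 1920x1080 panel, where the 1440x1080 image leaves 240 px bars on each side.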

 

 

Posted · Original Poster (OP)
On 11/21/2016 at 5:16 PM, Andrewf said:

You really love putting things in categories, don't you. You can analyze and simplify colloquialisms all you want, but how dare you put two types of people even close to a comparison? [...] It comes out to be 7680x4320, no one is going to call that 7.5k, they're going to call it 8k. [...]

Sorry if you felt generalized. Honestly, I can't say I've ever met someone who thought 4096×2160 was "True" 4K but that other resolutions were 4K too; most people tend to think those are mutually exclusive. If you call it "True 4K" because it's the main standard in cinema, that's fine; the only viewpoint I'm arguing against is when people say it's the "true" 4K resolution to the exclusion of anything else, i.e. that all other resolutions are "fake" 4K.

In regards to the 7.5K thing, I already addressed that briefly in the "But what about" section. I would say yes, the further you go, the more "off" it gets, especially if you go up to "16K", where doubling 7680 gives 15360, which rounds to 15K, and so forth; but even 4096 has the same problem, just in the other direction. If you go up to, say, "32K", you're at 32,768 pixels, so... is that 33K? Or still 32K? The answer is that these are casual shorthands that were never really designed to be indefinitely extensible. What a resolution gets called is, for the most part, just a matter of convention.
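That rounding behavior is easy to see with a tiny sketch (a hypothetical helper, not any standard's definition):

```python
def k_label(width: int) -> str:
    """Label a resolution by its horizontal pixel count, rounded to the nearest thousand."""
    return f"{round(width / 1000)}K"

print(k_label(3840))   # "4K"  - matches convention
print(k_label(4096))   # "4K"  - matches convention
print(k_label(7680))   # "8K"  - matches convention
print(k_label(15360))  # "15K" - convention says "16K"
print(k_label(32768))  # "33K" - convention says "32K"
```

The naive rounding agrees with common usage up through 8K and then drifts away from it, which is the point: the "K" labels are conventions, not a formula.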

 

28 minutes ago, KenjiUmino said:

I have two 1680x1050 monitors - now how much "K" for each monitor then? 1.7K? Because each screen has 1,764,000 pixels in total and is almost 1700 pixels wide?

Nobody was talking about any K before they came up with this BS to market their UHD TVs - as if the XXXXp labeling scheme wasn't confusing enough already, because that only referred to certain aspect ratios - like 1600x900 is called 900p, but 1440x900 is NOT.

I talked about both of these in the "But what about" section.

 

29 minutes ago, KenjiUmino said:

Sidenote: my all-time favorite native resolution is 1920x1200 (16:10 ratio), because both 1920x1080 (16:9) and 1600x1200 (4:3) can be displayed without any interpolation (= no loss of quality).

Agreed. I wish 1920×1200 and 2560×1600 were more common.

2 hours ago, Glenwing said:

Agreed. I wish 1920×1200 and 2560×1600 were more common.

For real - it seems like no one cares about 16:10 anymore; they just go wider and wider, which maybe makes sense for games and movies, but the industry is totally ignoring the fact that a percentage of people also do other stuff with their computers, and a bit more height is really nice to have then - I'm thinking of, say, a DAW, where more vertical "space" means you can have a bunch of tracks on screen at once and still have room for transport buttons and maybe a mixer tab.

 

Like so:

[16_10.png - DAW screenshot at 16:10]

You can see 8 tracks on screen at the same time, while in 16:9...

[16_9.png - DAW screenshot at 16:9]

...you only get 7 tracks without scrolling or anything else.

 

It is saddening that this ratio might be forgotten pretty soon - I got one of the last ThinkPad models that has a 1280x800 screen, but the TN panel itself is sooo bad - now for every newer model there are TN and IPS versions available, but they are all 16:9.

 

On 11/21/2016 at 8:16 PM, Andrewf said:

You really love putting things in categories, don't you. [...] I am one of those people that say 4096x2160 is still True 4k because Digital Cinema basically decided that while film was still the standard. [...] Bottom line, True 4k is 4096x2160 [...] anyone who is in the industry or anyone who is a film fanatic will likely know that it's 4096x2160. [...]

As someone who actually works in the film industry and has EXR and DPX sequences strangling them in their dreams every night, I'd like to make a few things clear: all of that is just a projection standard. Also, that whole "4K is 4096x2160" thing is nonsense - particularly the 2160 part. That's one weird-as-heck ratio, and the shows we work on are shot at much narrower ratios and get cropped later. The scales of the crops are borderline arbitrary and decided by production. The exact resolutions and aspect ratios of footage vary wildly from project to project.

 

Now, I get it: as a movie fan on the outside looking in, it gives you a sense of possessing "special knowledge" for all the techno-nerd stuff you said up there, but we actually don't bother with any of it. We deal with the exact resolutions and crops, because if someone said "this movie is 4K", that statement would be followed by "but what's the resolution?", because "4K" means nothing as a resolution to us. We work at weird, wonky resolutions that you've typically never encountered in media.

 

Let me now explain what "4K" and "2K" actually mean in the film industry:

 

4K means: "It's expensive and slow and it makes the entire pipeline cry like a statue of the Virgin Mary. :("

 

2K means: "Woo, dodged a bullet there. :3"

 

That's it. They are nothing but approximations of the data sets we'll have to deal with, which we use to judge the stresses on our pipelines. Sorry to kill the romance for you.

 

 


Anyone here who's upset when someone calls 3840x2160 "4K" should also be upset when anyone calls KB, MB, GB, TB, etc. kilobytes, megabytes, gigabytes, and terabytes, when technically, for most OSes, they're kibibytes, mebibytes, gibibytes, and tebibytes, with the appropriate abbreviations KiB, MiB, GiB, and TiB.

Or, you know, you can not have a thumb up your butt about being pedantic about terms and just accept that certain things, no matter how technically incorrect they are, are established lexicon.
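The gap between the two unit systems is easy to quantify (a quick sketch; the helper names are just for illustration):

```python
# Decimal (SI) gigabytes vs. binary (IEC) gibibytes for the same byte count
def to_gb(n_bytes):
    return n_bytes / 1000**3   # GB: 10^9 bytes

def to_gib(n_bytes):
    return n_bytes / 1024**3   # GiB: 2^30 bytes

drive = 500 * 1000**3          # a drive sold as "500 GB"
print(to_gb(drive))            # 500.0
print(round(to_gib(drive), 1)) # 465.7 - what many OSes report, while still labeling it "GB"
```

Same byte count, two names - the exact analogue of 3840x2160 wearing the "4K" label.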

Posted · Original Poster (OP)
18 minutes ago, M.Yurizaki said:

Anyone here who's upset when someone calls 3840x2160 "4K" should be upset if anyone calls KB, MB, GB, TB, etc. as kilobytes, megabytes, gigabytes, and terabytes [...]

Or you know, you can not have a thumb up your butt about being pedantic about terms and just accept that certain things, no matter how technically incorrect they are, are in established lexicon.

AMD: "4GB MEANS 4GB"

 

Internet: "Actually when you guys say 4GB you're talking about 4 GiB. With a space. Units and values should be spaced. L2unit plz"


Ever since "4K" became a thing, nobody calls anything correctly anymore... although even "720p" on TVs wasn't 1280x720; it was usually 1366x768. And 4K was originally created by Sony and was actually 4096-or-something by something, not the 3840x2160 it is now - that was UHD.


Well, there is a "True 4K" and a "True 2K":

DCI 2K - 2048x1080

DCI 4K - 4096x2160

And then there are cropped versions for the different TV aspect ratios.

Furthermore, "2K" and "4K" are also used as generic terms for resolutions around 2,000 and 4,000 pixels horizontally.

Why don't we just call resolutions by their true names? There is one for almost every conceivable combination.

3840x2160, for example - just call it Ultra High Definition (UHD).

1920x1080 - Full High Definition (FHD).

But still, people should stop making up things which don't exist - like 2.5K or 2.7K... just staaaahp, please.
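For what it's worth, the proper names above can be collected into a small lookup (the names are standard; the snippet itself is just an illustration):

```python
# Common resolutions and their standard names
STANDARD_NAMES = {
    (1920, 1080): "FHD (Full High Definition)",
    (2048, 1080): "DCI 2K",
    (2560, 1440): "QHD (Quad High Definition)",
    (3840, 2160): "UHD (Ultra High Definition)",
    (4096, 2160): "DCI 4K",
}

def proper_name(width, height):
    """Return the standard name, or just the raw WxH if there isn't one."""
    return STANDARD_NAMES.get((width, height), f"{width}x{height}")

print(proper_name(3840, 2160))  # UHD (Ultra High Definition)
print(proper_name(2560, 1440))  # QHD (Quad High Definition) - note: not "2K"
```

Falling back to the raw width-by-height string is arguably the least ambiguous option of all.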


People brag here about their stuff, right? Would you believe me if I said I have an i7-6950X and dual Titan X Pascals in a $10,000 PC? No? So why do you believe the others?

