Oh yes, I went through this too. The friend who said that was instantly rekt, and even a week later we were still asking him how much he wanted for a few square meters of internet, and whether 10 bucks was OK :D Good times tho
I was at work, talking to one of my female coworkers about Windows 10. In the middle of the conversation a guy who had just been transferred to our department comes over and starts screaming at us. It turns out he is her ex-boyfriend. I'm guessing he isn't very tech savvy (or very bright) because he thought we were talking about his penis when we kept saying Microsoft. :lol:
I was asking someone in class next to me what version of Linux they were going to install on their laptop for schoolwork as it's a requirement to use it. He turned around and looked me straight in the face and said "Linux doesn't run on laptops." What in the actual fuck. This happened in a college level computer science class.
At CES 2016 the latest must-have feature for TVs and monitors seems to be High Dynamic Range (HDR). But with all the hype comes a lot of confusion. What does HDR actually mean for monitors? Is it like HDR photography?
While most companies haven't gone into detail about what exactly they mean by "HDR", there are some things I believe we will see, and those beliefs are based on current specifications from organizations such as the UHD Alliance with their "UHD Premium" certification and the International Telecommunication Union (ITU) with their BT.2020 recommendation. I also used AMD's announcement of their HDR support as a guide to what to expect from monitors.
So just to be clear, HDR is not a well defined standard. We don't know if Dell's definition of HDR is the same as LG's definition of HDR. We do however know about some specifications which try to define HDR and this post is written assuming monitor manufacturers aim for them.
So what new and exciting things can we expect from HDR displays? We can expect:
Higher bit depth
Wider color space
Higher contrast ratio
Better encoding
Color Bit Depth
Let's start with higher bit depth. Right now 8 bits of color depth is the standard. That means the video signal uses 8 bits for red, 8 bits for green and 8 bits for blue, and then mixes them all together to create colors. It might be hard to wrap your mind around, but increasing the bit depth does not actually create brighter reds or bluer blues. What a higher color bit depth does is increase the granularity. Here is an example of what I mean.
This is what we could do with two bits of color depth:
We can create 4 separate colors.
However, if we increase the bit depth from 2 bits to 3 bits we can create 8 different colors.
As you might have noticed, the extreme red and extreme yellow are exactly the same with both 2 and 3 bits of color. What has changed is the number of steps between red and yellow.
As I said before, the standard right now is 8 bits for each primary color (red, green and blue). With the move to HDR this will be bumped up to 10 bits (or possibly even 12 bits). That will allow us to have much smoother gradients. So when we have something, such as a sky, which transitions smoothly from one color into another (maybe a blue sky transitions to a redder sky the closer to the sun you look), the transition will look smoother the higher the color bit depth is. Here is an exaggerated example of what the difference could be when looking at, for example, a picture of a sky:
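To put some numbers on this, here is a small Python sketch (the function names are my own, nothing official) showing how many levels and colors each bit depth gives, and how a smooth gradient gets snapped to the nearest step the bit depth can represent:

```python
def levels(bits):
    """Number of distinct values one channel can hold at a given bit depth."""
    return 2 ** bits

def total_colors(bits_per_channel):
    """Total mixable colors with three channels (R, G, B)."""
    return levels(bits_per_channel) ** 3

def quantize(value, bits):
    """Snap a 0..1 intensity to the nearest representable step."""
    steps = levels(bits) - 1
    return round(value * steps) / steps

print(levels(8), total_colors(8))    # 256 16777216  (~16.7 million colors)
print(levels(10), total_colors(10))  # 1024 1073741824  (~1.07 billion colors)

# A smooth 0..1 gradient collapses into coarse bands at low bit depths:
gradient = [i / 20 for i in range(21)]
print(sorted(set(quantize(v, 2) for v in gradient)))  # only 4 distinct bands
```

This is exactly the banding effect in the sky example: 10 bits gives four times as many steps per channel as 8 bits, so each band becomes four times narrower.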
Color Space
The next improvement will be the color space. The bit depth defines how many steps we have between two colors, while the color space defines what those two extremes of the spectrum actually are. Let's get back to the red and yellow blocks of colors again.
This might be an example of a color space:
and this might be another one:
The top color space only goes from red to orange. It does not have any yellow in it. No matter how many bits we add to the color depth, we will never get a good yellow color. That's because the color space defines the edge colors. The same is true for the bottom picture which goes from orange to yellow. We will never be able to get a good red color in this color space.
Today the most common color space is called sRGB. The new standard we are moving towards is the one defined in the BT.2020 specification. 4K Blu-ray movies will be mastered in the BT.2020 color space.
Here is a picture showing how much larger the BT.2020 color space is compared to sRGB:
The small yellow triangle shows the colors in sRGB. The large black triangle shows the colors in BT.2020. Since you are probably viewing this on an sRGB display the picture won't actually be accurate, but it makes the concept a lot easier to explain.
Side note: As you can see, the image with the triangles says "Rec.709" and "Rec.2020", not sRGB and BT.2020. That's because Rec.709 is more than just a color space. Rec.709 defines the color space, the refresh rate, the pixel count and many more things. The sRGB standard was developed around the color space defined in Rec.709 but with some changes (such as the average gamma used). These differences, however, are not important to understanding the concept of a color space and what we can expect from HDR.
"Rec." and "BT." are interchangeable: BT.2020 and Rec.2020 refer to the same document. That's because the ITU numbers its television recommendations BT.####, and "Rec." is simply short for "Recommendation".
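As a rough illustration of how much bigger the new gamut is, you can take the published CIE 1931 xy chromaticity coordinates of the Rec.709 and Rec.2020 primaries and compare the areas of the two triangles. This is just plain triangle area in the xy diagram, not a perceptually accurate comparison, but it matches the picture above:

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy chromaticities of the red, green and blue primaries
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(*rec2020) / triangle_area(*rec709)
print(f"Rec.2020 triangle is about {ratio:.2f}x the area of Rec.709")  # ~1.89x
```

So even by this crude measure, the Rec.2020 triangle covers nearly twice the area of Rec.709, which is why the small yellow triangle looks so cramped in the diagram.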
Contrast Ratio
Contrast ratio is the ratio between the brightest and darkest spot a display can show. I don't think BT.2020 defines any specific contrast ratio, but UHD Premium does. UHD Premium has a minimum contrast requirement of either:
Over 1000 nits of peak brightness and less than 0.05 nits of black.
OR
Over 540 nits of peak brightness and less than 0.0005 nits of black.
Those are some seriously high requirements. For comparison, in AnandTech's tests the iPhone 6S got:
582 nits of peak brightness.
0.42 nits of black.
The Galaxy S6 would just barely pass these requirements: its peak brightness was 593 nits (just above the 540-nit minimum) and its blacks measured 0 nits (because the pixels are literally turned off).
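Contrast ratio is just peak brightness divided by black level, so you can plug the numbers above straight in (the figures are the ones quoted in this post; the function is plain arithmetic):

```python
def contrast_ratio(peak_nits, black_nits):
    """Contrast ratio = brightest level divided by darkest level."""
    return peak_nits / black_nits

# UHD Premium minimums imply these contrast ratios:
print(contrast_ratio(1000, 0.05))    # ~20,000:1
print(contrast_ratio(540, 0.0005))   # ~1,080,000:1

# iPhone 6S (AnandTech measurements quoted above):
print(round(contrast_ratio(582, 0.42)))  # roughly 1,386:1
```

Note that a true OLED black of 0 nits would make this a division by zero, i.e. an effectively infinite contrast ratio, which is why OLED passes these requirements so easily.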
It will be near impossible to achieve these kinds of contrast ratios without using, for example, OLED (where pixels can be completely turned off, creating perfect black levels) or incredibly good LCD panels combined with local dimming.
Encoding
The last part is the encoding, or more specifically the "electro-optical transfer function" (EOTF), which defines how the digital signal is converted into visible light. Today we use the Rec.1886 EOTF, which models the gamma response of old CRT monitors. Sadly I don't know enough about this to explain it in depth, but the benefit of moving from Rec.1886 to ST 2084 (the encoding required for the UHD Premium certification) is that ST 2084 was designed for much higher brightness (Rec.1886 only took up to 100 nits into consideration, and now we are talking about displays with 1000 nits or more), and we can also expect more detail in dark areas.
Maybe someone else can explain this part better.
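I can't explain the design rationale either, but the curve itself is public. Here is a sketch of the ST 2084 (PQ) EOTF using the constants published in the spec, mapping a 0..1 signal value to absolute luminance in nits:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> luminance in nits."""
    m1 = 2610 / 16384        # constants as published in ST 2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0.0 nits: signal 0 is absolute black
print(pq_eotf(1.0))  # 10000.0 nits: the ceiling the curve is designed around
```

One nice property you can see by playing with it: half the signal range (0.5) maps to only about 92 nits, so most of the code values are spent on the dark and mid tones, which is where the extra shadow detail comes from.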
Conclusion
So there you have it. HDR for monitors is not the same as the HDR photos you will find if you do an image search for "HDR photo". It is far more than that.
It is very exciting, but don't get too excited yet. The entire chain has to support the same standards for any of this to work. You will need a new monitor, possibly a new graphics card (AMD announced that the 300 series will get partial support and that the next generation will support it completely), as well as support in software: Windows does not support it yet but Microsoft is working on it, drivers will need to support it, and the programs you use might need to as well. YouTube has announced support, which means that once they re-encode the original file it will retain the wider color space, the 10 bits of color depth and so on. Showing a video encoded for standard dynamic range (SDR) on an HDR display won't make much of a difference, so content will be limited too. On top of all that, HDR hardware will most likely carry a hefty price premium in the beginning. Still, it is a very good step towards making this the standard in the future.
Disclaimer: Displays are not one of my strong points so feel free to correct any misinformation that might have slipped through my validations, or come with suggestions on how to explain things even better.
Not correct at all. It's all to do with phonetics. R is pronounced "ARR", which begins with a vowel sound, so "an" should be used. Please don't correct people if what you're saying is shit.
Really, it depends: is it primarily for gaming? And if so, primarily FPS titles?
If both of those are yes, 1080p 144Hz hands down. If FPS titles are the least likely to be played, that changes things IMO and I would suggest getting a cheap 1440p IPS panel.
Those who have not tried 144Hz but have gone 1440p will swear they can't go back to 1080p, but those who have 144Hz swear by 100Hz+. I have used both and say that, based on my experience with both panel types, it depends on the use case. If you are choosing between 60Hz 1440p IPS and 144Hz 1080p TN and you primarily play CSGO, I would argue that the 1440p is a horrible choice. By the same token, though, if you primarily play RTS titles and are heavy into photo/video editing, etc., I would recommend going for the 60Hz 1440p IPS panel.
Obviously my opinion is just another in a sea of opinions on this subject, but as I said, I do have personal ownership experience with several 120Hz and 144Hz 1080p panels as well as several Korean IPS/PLS panels, and my current setup offers the best of both worlds (XB270HU).