Posted December 27, 2014 · Original Poster

Welcome to the Displays section, where everything's made up and the specs don't matter!

Response time and latency

This is probably the biggest and most common misconception I see on here, so it's going first. Most people are under the impression that response time is a measure of the display's latency: that in-game feeling of delay between hitting a button and seeing your command play out onscreen, "controller lag" as some people call it. A perfectly reasonable assumption; the value for response time is even given in milliseconds, after all. But alas, this delay is NOT what response time measures. Response time is something entirely different; it just happens to be confusingly named. That "controller lag", delay, whatever you want to call it, is called input lag or latency, and unfortunately it is never actually listed in the specs of a monitor. You'll need to look to review sites like TFTCentral or Blur Busters, which actually test latency using advanced equipment.

Response time, in case you were wondering, is the time it takes for a pixel to switch from one color to another. Typically 1–10 ms, it isn't nearly enough to have any kind of impact on the perceived latency or "controller lag", especially considering the color transition doesn't even have to be fully completed for us to see and begin reacting to the new image being formed. What it can do is cause blurriness in scenes with fast-moving objects on high-contrast backgrounds. These moving objects might leave a visible "trail" behind them on displays with a very slow response time. This is called "ghosting" (not to be confused with the smearing/echo effect over VGA connections, also called "ghosting"). You can call it motion blur if you like, but that is technically incorrect, as the term motion blur usually refers to the "built-in" blur that is part of the content being viewed (movies, etc.). When you pause and the entire scene is blurry, that is motion blur. That blurry image is exactly what the monitor is being instructed to display, and a faster response time won't make it any less blurry.

It's also worth noting that response time is not a single number for some display types, like LCDs. Response time is actually a whole range of values, and the specific response time of a color transition depends on which two colors are involved. When an LCD has a "5 ms response time", that doesn't mean it takes 5 milliseconds to switch between any two colors. It means it takes 5 milliseconds to switch between the two specific colors that the manufacturer chose for the test. Of course, since they don't publish which colors are used for the tests, or their testing methodology, or their testing equipment, the number is rather meaningless. There is also no standard for which colors to use, so not only do you not know which colors are tested, you don't even have the comfort of knowing that all manufacturers are at least testing the same transitions, whichever they may be. Different manufacturers can all be testing the response times of different color transitions, which means response time numbers are not directly comparable between monitors. An 8 ms monitor can be faster than a 5 ms monitor in reality, and one 5 ms monitor can be slower than another 5 ms monitor.
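Since "response time" is really a whole family of numbers, here's a toy model of what a single transition measurement looks like. This is only a sketch with an idealized exponential pixel response and made-up time constants (real liquid crystal transitions and real test rigs are messier than this); the point is simply that one panel can legitimately produce several different "response times" depending on which transition you measure, while the box only ever quotes one of them.

```python
import math

def rise_time_ms(tau_ms, lo=0.1, hi=0.9):
    """10%-90% transition time for an idealized exponential pixel response
    with time constant tau_ms (a toy model, not a real LC equation)."""
    # progress p(t) = 1 - exp(-t / tau); solve p(t) = lo and p(t) = hi
    return -tau_ms * (math.log(1 - hi) - math.log(1 - lo))

# Made-up time constants for a few gray-to-gray transitions on one panel
transitions = {
    "black -> white": 2.0,
    "20% gray -> 80% gray": 5.5,
    "40% gray -> 60% gray": 9.0,
}

for name, tau in transitions.items():
    print(f"{name:<22} {rise_time_ms(tau):5.1f} ms")
```

A manufacturer can quote whichever of those numbers makes for the best-looking spec, which is exactly how an "8 ms" monitor can end up with less ghosting than a "5 ms" one.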
In the end it hardly matters anyway, since ghosting hasn't been a big problem since the old days of early IPS panels without modern response time compensation technologies, whose response times reached 50 ms and beyond. With any half-decent modern display, even ones with the infamously "slow" IPS-type panels, you'll find that they aren't actually slow at all and that severe ghosting issues are a thing of the past. As for latency, it depends only on the monitor's internal circuitry; it has nothing to do with what type of panel is used (IPS, TN, etc.). You can find fast TN monitors and laggy ones, and likewise you can find fast, responsive IPS panels as well as laggy ones.

Do you need a 5 ms response time or less for gaming?

There are two groups of people that say this. The first are the people who think response time refers to input lag/latency, which is not true, as I explained in the previous section. Don't confuse response time with latency. If you switch from a 5 ms monitor to a new 1 ms monitor and the new one "feels faster", what you're feeling is not the faster response time. You are feeling a decrease in latency, which is completely unrelated to those 5 ms / 1 ms figures on the box, and is probably more along the lines of a 20–50 ms decrease, or even 200–400 ms if you're coming from a TV. Even if you can feel decreased latency on your new monitor, the fact that the response time also happens to be lower than your old monitor's is basically a coincidence and is not what is causing any feeling of decreased latency. Feeling a difference between 1 ms and 5 ms is totally impossible, and if you think you can, have a go at this so you get an idea of how short a millisecond is: http://humanbenchmark.com/tests/reactiontime.

The second group are the people who understand response time isn't latency, but say you'll get bad ghosting and terribly blurry images without 5 ms or less. As I explained in the previous section, the response time numbers printed on boxes are extremely imprecise. This means any statement like "5 ms and under is OK for gaming, anything more and you will start seeing ghosting" is completely untenable. Any blanket statement like that relies on the assumption that a 5 ms response time translates to a defined amount of ghosting, and likewise for any other number, but in reality that just isn't true. You cannot tell how much blur or ghosting a monitor will have from the listed response time number. You can find 2 ms, 5 ms, 8 ms, even 12 ms monitors with minimal ghosting. You can also find 2 ms, 5 ms, 8 ms, or 12 ms monitors with much more pronounced ghosting. Saying 5 ms will have acceptable ghosting for gaming while 8 ms or 12 ms will have too much is simply wrong, even if you put the word "generally" in front.

Response time numbers listed on the box are meaningless, and this is easily proven by real-world ghosting tests with high-speed cameras. For example, there is this collection of ghosting tests, which clearly shows a 12 ms monitor with more or less equal ghosting to a 5 ms monitor from another vendor, both of them with less ghosting than a supposedly faster 4 ms monitor from a third vendor. On top of that, the pair of 2 ms "gaming" monitors at the bottom have completely different levels of ghosting despite having the "same" response time. In fact, one of the 2 ms monitors actually appears to be slightly faster than the 1 ms monitor at the top, while the other 2 ms monitor is even slower than the 12 ms IPS model mentioned above.
The 8 ms monitor seems to have the heaviest ghosting in the bunch. So basically, monitors should never have their response times compared. Your intuition will tell you that 1 ms is less than 2 ms is less than 3 ms, but in reality any number could mean any amount of ghosting, so the numbers are quite literally meaningless; they have no actual meaning. Response time is, in a nutshell, the second most useless spec on a monitor spec sheet (behind only "dynamic contrast ratio").

It's all a moot point anyway, since ghosting hasn't been a big issue for a long time. It's pretty much all a big myth at this point, perpetuated by people parroting "1 ms is good and 8 ms is too slow" because they heard someone else say it, without having any actual experience, or else suffering from placebo or confirmation bias. Of everyone I've asked who has actually used a decent modern IPS panel, I've never met a single one who has had ghosting or blurring issues in gaming. From everything I've seen, the "IPS displays are bad for gaming" community seems to be composed entirely of people who have never used IPS displays.

Dynamic Contrast Ratio (DCR) – 1,000,000:1+ "Contrast Ratios" are Just Made-up Numbers

"Dynamic" contrast ratio is the number one most useless spec on a monitor spec sheet. Contrast ratio, in combination with the maximum brightness spec, should in theory give you an idea of black levels and how the display handles dark scenes. A contrast ratio like "1000:1" means that the brightest level the display can achieve is 1000x the brightness of the darkest level it can achieve. Common sense would suggest that they are talking about brightness levels in the same scene; meaning that in a single frame, the highest contrast you can get is one part of the screen being 1000x brighter than another part of the screen. Bright objects right next to dark objects are what create contrast, after all; it's the definition of contrast. This is called "static contrast ratio".

With a backlit display (one source of light illuminates the entire screen from behind, and it's up to each individual pixel to block the light from passing through if that pixel is supposed to display black), a perfect static contrast ratio cannot be achieved, because even when a pixel tries to display black it can't block the backlight completely, resulting in the slight glow we're all familiar with; even when the TV or monitor is displaying pure black in a dark room, you can still see a glowing rectangle because the backlight isn't completely blocked.

More advanced TVs these days have variable-brightness backlights which dim the backlight in dark scenes. All well and good, but now they advertise "dynamic contrast ratio", which essentially means they compare the brightest level in a bright scene with the darkest level in a dark scene, when the backlight has been dimmed to its minimum, usually off or nearly off. But the backlight must either be bright or dimmed, it can't be both at once, so although comparing these two brightness levels might yield a number like 100,000,000,000:1, ultimately it's a comparison that is purely academic and can never be realized in actual usage. The number is only calculated by measuring two separate scenes which are impossible to display simultaneously.
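To put actual numbers on that, here's the arithmetic with made-up but realistic luminance values (spec sheets never tell you the measurements behind the ratio, so treat these figures as an illustration only):

```python
# Illustrative luminance values in cd/m^2 -- not from any particular monitor
white_full_backlight   = 300.0    # white patch, backlight at 100%
black_full_backlight   = 0.30     # black patch, backlight at 100% (LCD leakage glow)
black_dimmed_backlight = 0.0001   # black screen, backlight dimmed to nearly off

static_cr  = white_full_backlight / black_full_backlight
dynamic_cr = white_full_backlight / black_dimmed_backlight

print(f"Static contrast ratio:  {static_cr:,.0f}:1")    # 1,000:1
print(f"Dynamic contrast ratio: {dynamic_cr:,.0f}:1")   # 3,000,000:1
```

The second number needs the backlight to be at 100% and at nearly zero at the same time, which is exactly the impossibility being described here.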
That bright scene can never be displayed together with that 100-billion-times-darker dark scene, because the bright scene requires full power to the backlight and the dark scene requires minimum backlight, and you can't have both at once. The advertised "dynamic contrast" can never actually be shown.

There is one exception (among backlit displays): displays using full-array backlights, sometimes called "local-dimming LEDs" or something similar. Instead of an LED strip along the edge which illuminates the whole display (called edge-lit), the display has a full array of LEDs spanning the entire back of the panel. This means the backlight can be dimmed for only certain sections of the display at a time instead of all-or-nothing. If the left half of the scene is dark, only the left side of the display can be dimmed. The LED array isn't precise enough for each pixel to get its own LED, so the screen can only be dimmed in patches, but it's better than edge-lit designs. Local dimming is a technology that does genuinely improve contrast, although the "dynamic contrast" spec is still useless, since the advantage of local dimming will also be reflected in the static contrast ratio, because the darkest and brightest levels can now be displayed together in the same scene.

Unfortunately, to add to the confusion, there doesn't even seem to be a standard on terminology. These days manufacturers will often just advertise "contrast ratio" without specifying whether they are talking about static or dynamic contrast. Normal static contrast ratios are around 500:1 (cheap laptop displays), 1000:1 (decent desktop displays), or up to 5000:1 (high-grade VA-type panels). Anything in the hundred-thousands or above is a dynamic contrast ratio. Unless the display has local-dimming backlights, just ignore it.

"You can't get audio through a DVI port!"

Strictly speaking, DVI doesn't have any audio capability. However, modern NVIDIA and AMD graphics cards will transmit digital audio through DVI ports if you attach them to an HDMI device with an adapter such as a DVI to HDMI cable. DVI and HDMI ports on graphics cards share the same controller, so these days cards are designed to detect when a DVI port leads to an HDMI device, and basically treat it like an HDMI output and start sending HDMI signals through it rather than DVI.

Isn't HDMI limited to 60 Hz?

No, HDMI is not limited to 60 Hz. This is a common myth. Sometimes certain displays will not accept signals above 60 Hz over HDMI even when the monitor does run at 120+ Hz with other interfaces like DVI or DisplayPort, but that is just a limitation of those particular displays. HDMI itself allows unlimited refresh frequencies; this has been the case since version 1.0. The HDMI specification, however, does not require devices to be capable of the full bandwidth in order to be compliant with a certain version; for example, when HDMI 2.0 first debuted, there were a number of A/V receivers and other products which were HDMI 2.0-compliant but still limited to HDMI 1.4 bandwidth (10.2 Gbit/s). Likewise, there may be monitors which have HDMI 1.4-compliant ports but are limited to HDMI 1.2 bandwidth (4.95 Gbit/s, or 1080p 60 Hz max), and that is the choice of the monitor manufacturer if they want to implement it that way, usually for cost-saving or component-availability reasons. Notable examples of monitors which don't accept >60 Hz over HDMI are the BenQ XL2411Z and the ASUS VG248QE (which both have HDMI 1.4a-compliant ports).
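As a rough sanity check on the bandwidth side of this, here's a back-of-the-envelope calculation using approximate reduced-blanking timings (real monitors use their own exact timings, so take these as ballpark figures), showing that 1080p at 144 Hz fits within what an HDMI 1.4 link can physically carry:

```python
# Approximate reduced-blanking timing for 1920x1080 @ 144 Hz
h_active, v_active, refresh_hz = 1920, 1080, 144
h_total, v_total = 2000, 1157          # active + blanking intervals (approximate)

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
data_rate_gbps  = pixel_clock_mhz * 24 / 1000     # 24 bits per pixel (8 bpc RGB)

print(f"Pixel clock: {pixel_clock_mhz:.0f} MHz (HDMI 1.4 TMDS limit: 340 MHz)")
print(f"Video data:  {data_rate_gbps:.1f} Gbit/s (HDMI 1.4 video payload: ~8.16 Gbit/s)")
```

That works out to roughly 333 MHz and 8.0 Gbit/s, inside HDMI 1.4's limits. So the link itself has the headroom; whether a particular monitor's HDMI input will accept such a signal is entirely up to the manufacturer.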
Other monitors, like the ViewSonic XG2401, do accept 1080p 144 Hz through HDMI 1.4a.

"Humans can't see more than 60 Hz anyway!"

Bold words coming from someone who's never used a true 120 Hz monitor.

"Why are there TVs at 600+ Hz when monitors don't go beyond ≈144 Hz?"

The first thing that needs to be made clear is that "hertz" is a generic scientific unit, used to specify the frequency of any repeating event. There are many components of a display that have frequencies associated with them. For example, the processor inside the display might run at 500 MHz. The power input might accept AC power between 50 and 60 Hz. The brightness of the backlight is usually controlled through high-speed flickering at some frequency (180–240 Hz on a typical monitor). Just because some number on a display's spec sheet has "Hz" at the end doesn't mean it's the refresh frequency of the display, so if you see something in "Hz" you should at least take a moment to check whether the label on that number is actually "refresh rate" or something else.

The second thing I want to point out is that the highest-bandwidth connection available on most 1080p TVs is HDMI 1.4, which can transmit a maximum of 120 frames per second at 1080p from the device to the TV, and most TVs will only accept up to 60 frames per second anyway. Newer "4K" TVs use HDMI 2.0, which caps out at 60 Hz at 4K. So that alone tells you that "480 Hz refresh rate" TVs can't be using the term "refresh rate" to mean the same thing most people do.

Most LCD TVs are 60 Hz, even though they are marketed as "120 Hz", "240 Hz", "480 Hz" or above. This can mean two different things. Most of the time it means they use backlight strobing, which will be explained below, but in some cases it may also mean that the TV uses frame interpolation. In that case the TV does actually operate at 120 Hz, but it still doesn't take more than 60 Hz input from the source, so you won't be able to set it to 120 Hz in Windows and your computer won't be sending more than 60 fps to the TV. Instead, the TV takes in 60 Hz from the source, and for each pair of frames it calculates what an in-between frame might look like. It then displays at 120 Hz, but only half the frames are received from the source; every second frame is generated by the TV itself. Since the TV isn't actually getting any more information from the computer than it would at 60 Hz, it's still working with the same information, and it doesn't help in gaming like a normal 120 Hz monitor. In fact, the interpolation process adds a huge amount of latency and is really horrible for gaming, and it also completely destroys the look of movies; it is the cause of the much-maligned "soap opera effect". Turning off frame interpolation is always the very first thing you should do when you fire up a new TV; the only reason it exists is so that "120 Hz" can be printed on the box. You're far better off with a normal 60 Hz monitor than a "120 Hz" TV for gaming.

Frame interpolation is only used up to 120 Hz. LCDs can't physically operate much faster than that, so if you see "240 Hz" and "480 Hz" LCD TVs, these must be using backlight strobing. They still only take in 60 Hz maximum input and display at 60 Hz (or 120 Hz if they are also using frame interpolation at the same time).
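Before getting to where those bigger numbers come from, here's a quick timing sketch of why interpolation has to add delay even in the idealized best case. The values are the bare theoretical minimum; real TVs buffer several frames and add processing time on top, but nothing they do can get below this:

```python
# Toy timeline (in ms) for a 60 Hz source feeding a TV that interpolates to 120 Hz
T = 1000 / 60                      # real frames arrive every ~16.7 ms

for n in range(3):
    arrives = n * T
    # The in-between frame interp(n, n+1) can't be computed until frame n+1
    # has arrived, so real frame n must be held back by at least half a frame
    # for the interpolated frame to fit in between at the 120 Hz cadence.
    real_shown   = arrives + T / 2
    interp_shown = arrives + T     # shown the instant frame n+1 is in hand
    print(f"frame {n}: arrives {arrives:5.1f} ms, shown {real_shown:5.1f} ms, "
          f"in-between frame shown {interp_shown:5.1f} ms")
```

So every real frame you see is already at least ~8 ms stale before any actual processing is counted, and in practice the total is much higher than that.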
The "240 Hz" and "480 Hz" figures come from strobing the backlight (yes, literally just flickering the TV’s illumination source on and off) at 240 Hz / 480 Hz, and they call that a “480 Hz TV”, but the display panel itself is still only refreshing at 60–120 Hz. So they advertise the flicker frequency of the backlight, and simply don't list the actual refresh frequency at all, and leave it to you to assume that the number is for the refresh rate because it's measured in "Hz". Most TV makers also come up with some kind of marketing name for the backlight flicker frequency that sounds very similar to refresh rate, like "clear motion rate" or "true motion rate" or even "effective refresh rate", or some other meaningless nonsense like that, they're careful about not labeling it as "refresh rate" to avoid false advertising lawsuits, they just try to strongly imply that it is the refresh rate, and refuse to list the real one. If you see any weird names like that instead of "refresh rate", it should immediately tip you off that that number is not the refresh rate of the TV. One more word on "600 Hz" TVs, because these are totally different; most “600 Hz” TVs you see are plasma TVs, not LCDs. Plasmas don't have backlights, so the whole backlight strobing thing above doesn't apply at all. There also aren't (to my knowledge) any plasmas that use frame interpolation. Plasma TVs have a refresh frequency of 60 Hz. The “600 Hz” number is not the refresh rate, it is the frequency of the sub-field drive (a part of standard plasma display operation). Refresh frequency is not the only thing measured in Hertz, so don’t confuse the “600 Hz” figure with the refresh frequency just because it has “Hz” at the end. Plasmas are often advertised as having excellent motion clarity and salespeople try to tie this to the 600 Hz “refresh rate” but they are simply wrong and don’t understand the specs. Plasmas have excellent motion clarity because they have a very fast pixel response time (not to be confused with latency; read the first two sections of this mythbuster). It has nothing to do with the “refresh rate” or 600 Hz sub-field drive for that matter. "Does 1920×1080 scale perfectly on 3840×2160 (4K UHD) monitors? What exactly happens when you run a non-native resolution on a display?" The notion that 1080p content viewed on a 4K UHD display will always look just as sharp as if it were a native 1080p monitor is actually a misconception, admittedly one that I also held before 4K displays were really widely available and tested. However now we know better, as actual tests have shown this to be a myth. Theoretically it can be true; 1080p is an exact fraction of 4K and so could potentially be displayed natively at fullscreen. Each pixel could be represented perfectly by a block of 4 pixels on the 4K screen, so the input image could be displayed exactly as it's received without any kind of modification required. In reality though, this is usually not how these images are handled. It will depend on the display, and most 4K monitors won't scale 1080p perfectly. Normally, when a display is fed an image at a resolution smaller than its physical/max resolution, it has to perform a process called interpolation to upscale the image to its max resolution, which basically means it has to approximate what the image would look like if it were at that higher resolution. If you had for example a 7×7 image and you wanted to show it on a 10×10 display fullscreen, the image would have to be modified in order to be displayed. 
The pixels in the 7×7 image aren't going to line up evenly with the 10×10 grid on your display. Since each pixel on your 10×10 display can only be a single color, it's impossible to display the image exactly as it originally appears. In order to display anything, you need to calculate a new 10×10 pattern that approximates the original as closely as possible. As you can see, the approximation results in some pretty obvious blurriness, although if you stand far enough back you can still sort of make out a hint of the original pattern. Sort of. The negative effects of interpolation are especially noticeable with text.

These examples use a very simplistic interpolation technique: overlaying the desired resolution on the image and calculating each new pixel using the "average color" that lies within each boundary. This is just to help visualize the concept of interpolation; monitors use more complex approaches that give better, sharper approximations in most situations. Unfortunately, these approaches also affect resolutions which are exact fractions of the physical resolution. If monitors interpolated using the simple averaging method I used above, exact fractions like 1080p on 2160p actually would display natively, since each pixel of the original image would line up exactly with a 2×2 block of pixels at the new resolution, so every block would be a solid color and averaging wouldn't change anything. But most monitors use different interpolation methods which do affect resolutions that are exact fractions, and you can easily prove this to yourself by changing your desktop to such a resolution. If you're using a 1920×1080 monitor, change to 960×540; if you're on 1440p, change it to 1280×720, etc. If it scales perfectly without interpolation, it should look quite blocky, especially with text, but you'll most likely find that it looks rather fuzzy instead.

Long story short: no, most 4K monitors do not scale 1080p with simple 4:1 pixel mapping (no interpolation), despite the fact that it's an exact fraction of 3840×2160. Most monitors will still interpolate the image. It's certainly possible for a monitor to display such resolutions natively, but it would have to be purposely designed not to interpolate those resolutions. They won't be displayed natively simply by virtue of being an exact fraction. If you still don't believe me, you can read professional reviews of actual 4K monitors and verify that 1080p does not scale perfectly, but is in fact interpolated.

PCMonitors.info said:

It is a common misconception that running 1920 x 1080 on a '4K' UHD monitor will automatically provide equivalent sharpness to a native 1920 x 1080 display. That belief is held because the UHD resolution has exactly twice as many pixels vertically and twice as many pixels horizontally as the Full HD resolution. In practice monitor interpolation processes aren't perfect. In the case of the Dell P2415Q, though, the interpolation process is surprisingly good. In fact we'd go as far as to say it's excellent. If you run the monitor at 2560 x 1440 (WQHD) or 1920 x 1080 (Full HD) then you do lose a degree of sharpness compared to running that resolution on a 23.8" model that has a similar screen surface. This loss of sharpness is fairly minor, though, and is in fact one of the lowest losses of sharpness we've seen from an interpolation process on any monitor.
On the desktop, text looks a little soft but not really blurry as you'd usually observe from a normal viewing distance.
https://pcmonitors.info/reviews/dell-p2415q/

PCMonitors.info said:

The image appears noticeably soft, much softer in fact than running 1920 x 1080 natively on your typical 27" Full HD LCD. Text appears to have a soft fringe and games look like they are being viewed through some sort of soft-focus lens. If you're expecting things to look like they would on a native '1080p' display, think again. [...] It is unfortunate to see that interpolation is handled so poorly by the monitor. It's not entirely surprising, as we saw similar performance from the 28" '4K' models. The Dell P2415Q, on the other hand, handled non-native resolutions surprisingly well.
https://pcmonitors.info/reviews/asus-pb279q/

Does DisplayPort Adaptive-Sync/AMD FreeSync require a new monitor, or will existing monitors be upgradeable with just a firmware update? Also, if a monitor has DisplayPort 1.2a input or higher, does that automatically mean it supports DisplayPort Adaptive-Sync?

I will settle this question once and for all, since there seems to be a lot of confusion on the topic. This is a long one, so I'll summarize right here: Adaptive-Sync/FreeSync WILL require a new monitor. Current monitors cannot be upgraded with just a firmware update at home; for the detailed story, see the paragraphs below. DisplayPort 1.2a support also does NOT guarantee Adaptive-Sync capability. If you want an Adaptive-Sync/FreeSync monitor, look for one that specifically lists compatibility with one of those (either one means AMD FreeSync is supported on that monitor).

Confusion arises because AMD's initial demonstration was performed on normal consumer laptops already on the market without any special modification, and it was implied that FreeSync could work on many existing displays. Later at Computex 2014, AMD demonstrated FreeSync operating on a normal monitor already available, which had been upgraded to support variable refresh with nothing but a firmware update. When VESA announced the new DP 1.2a specification, which officially added the Adaptive-Sync protocol that AMD uses for FreeSync, statements were misinterpreted by the community to mean that any DP 1.2a display would be capable of Adaptive-Sync. Understandably, all of this caused a lot of confusion.

The Adaptive-Sync protocol that AMD uses for FreeSync was already part of a standard called Embedded DisplayPort (eDP) for many years prior to the creation of G-SYNC and FreeSync. eDP is a companion standard to normal DisplayPort, designed for internal use in mobile devices (such as laptops) for their integrated displays. eDP includes a number of features not present in the full DP spec, including Adaptive-Sync (although it wasn't called that at the time), which allowed for variable refresh frequencies, originally intended as a power-saving measure. AMD used this protocol for their initial FreeSync demonstration on the laptops at CES 2014. Desktop displays don't use eDP and as such are not natively capable of Adaptive-Sync, so the demonstration wouldn't have worked in the same manner on a desktop display. Desktop displays have a somewhat different design than laptops, with a more complex and independent architecture. An attached computer or device doesn't have the kind of direct control over the display panel that a laptop has over its integrated display. Because a laptop display only needs to handle a single type of input and a single format, it can be much simpler in design.
A desktop monitor typically has multiple inputs of various types, often both analog and digital, in addition to multiple formats of each, and needs some standalone functionality as well, such as independent controls for brightness and things like that. This is also the reason laptops can dim their displays but computers can't dim their monitors; computers don't have that kind of direct control over monitors. To cope with the wide variety of inputs and signals, monitors incorporate a full-blown controller chip called a scaler, which controls the display panel, interprets all input signals from attached devices, and negotiates connections with them. It integrates all the input controllers (HDMI, DP, etc.) and provides front-end features like the OSD, picture-by-picture mode, and image-processing options like color and contrast adjustments to the original signal. Since all signals from the computer have to go through the scaler, a monitor can't be made to refresh at a variable frequency unless the scaler supports such a mode. Without firmware support, a scaler won't know how to operate in a variable-frequency mode, regardless of any drivers or software on the computer's side of the connection.

A number of (but not all) scalers used in monitors already on the market could be capable of Adaptive-Sync from a purely technical standpoint, but don't have a variable refresh mode programmed into them, since there was no standard for supporting such a feature at the time they were designed. The monitor AMD showcased at Computex 2014 was an example of one such monitor, which had been reprogrammed by the manufacturer to include a variable refresh mode without any hardware changes. While it is hypothetically possible for other monitors to support Adaptive-Sync with a similar firmware upgrade, most monitors aren't designed to have field-upgradeable firmware. Even if new firmware were released, you would have no way of getting it onto the monitor; most monitors require very specialized tools only found in the manufacturer's facilities. Manufacturers are not equipped to perform a mass firmware update for all their customers, and they would rather just sell you a new monitor anyway. So hoping for a firmware update to give your current monitor FreeSync capability is not realistic. It is far too impractical to actually happen.

From a design standpoint there are other considerations for a variable refresh monitor to take into account anyway, such as the type of display panel being used. Some will cope with variable refresh rates better than others; low framerates in particular can be problematic depending on the design of the panel. The monitor at Computex 2014 was only capable of varying between 40–60 Hz, even though the Adaptive-Sync standard allows for refresh frequencies down to 9 Hz. The LCD panel could not handle lower than 40 Hz. The monitor manufacturer specifically requested that AMD conceal the brand name and other identifying marks on the monitor, because it was only a proof-of-concept prototype and they did not want to misrepresent the capabilities and limitations of FreeSync or imply that they would be releasing firmware updates to enable Adaptive-Sync on their monitors. Ultimately you would want a monitor designed from the ground up with this capability in mind.

As for DisplayPort 1.2a, it simply defines a standardized method of signaling and communication procedures for a variable refresh frequency.
The ability to refresh at a variable frequency is not a requirement for DP 1.2a compliance; it is an optional feature. The DP 1.2a spec is only there to ensure that IF a monitor designer wants to implement variable refresh capabilities, they will do so in a standardized way. I personally own two DisplayPort 1.2a-compliant monitors and they do not support Adaptive-Sync, so there really is no debate here. DP 1.2a does NOT guarantee Adaptive-Sync capability, period.

It should also be noted that variable refresh requires a layer of software and driver support on the GPU side as well, which is what AMD's FreeSync software is. Having an Adaptive-Sync monitor and a DisplayPort 1.2a or higher output on the graphics card is not enough to enable variable refresh, so even if newer NVIDIA graphics cards have DisplayPort 1.2a or 1.3 outputs and they are connected to an Adaptive-Sync monitor, they still won't be able to use variable refresh via Adaptive-Sync unless supporting software for the GPU is also released.

PLS and AHVA vs. IPS

Samsung's PLS and AU Optronics' AHVA are new variants of IPS. Since the introduction of IPS in the late 1990s, new and improved variants have been introduced year after year, and each one is given a new name: S-IPS, AS-IPS, H-IPS, AH-IPS, E-IPS (not to be confused with eIPS), UH-IPS, P-IPS, and the list goes on. PLS and AHVA are nothing interesting at all, simply the latest IPS variants from Samsung and AUO. They decided to move away from the IPS name this time, no doubt to make it seem like a new technology that is "better than IPS" for marketing reasons. Samsung does of course brag that PLS is "better than IPS", but then again any modern IPS panel like AH-IPS is leagues ahead of the original IPS technology from the 90s, if that's what you mean when comparing it to "IPS". To no one's surprise, Samsung is rather vague about what they mean by "IPS". AUO, for whatever reason, named their variant AHVA, which only serves to confuse people into thinking it's a VA-type panel (Vertical Alignment, a very different technology compared to IPS). Whether that was intentional or not is hard to say, but it seems unlikely the name would make it all the way through the pipeline and be approved without anyone realizing it sounds like a type of VA panel. TFTCentral's word on the matter:

Quote

We want to try and ensure there's no confusion between AHVA and AMVA as well at this juncture. AHVA (Advanced Hyper Viewing Angle) is a relatively new technology developed by AU Optronics, not to be confused with their more long-standing technology AMVA (Advanced Multi-Domain Vertical Alignment). It is AU Optronics' answer to LG.Display's very popular, and long-established IPS (In Plane Switching) technology. Testing of this technology has revealed that it is for all intents and purposes the same as IPS. Performance characteristics, features and specs are all pretty much identical. AUO weren't allowed to simply call their technology IPS due to trademark issues, which is why they adopted their own new name. Samsung are the same with their PLS (Plane to Line Switching) panel tech, which is another IPS-clone. You will see pretty much all monitor manufacturers now simply use the term IPS, since it is so well known in the market, but underneath they may be using an IPS version from LG.Display, AU Optronics or Samsung potentially.
People should not get concerned with the semantics here, which is why we will continually refer to this as an "IPS-type" panel throughout the review.
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

IGZO vs. IPS

IGZO and IPS are not competing technologies. IPS is a type of LCD panel (Liquid Crystal Display), and describes a certain formation or arrangement of the liquid crystals in the LCD. IGZO is a type of semiconductor material, and can be used in place of silicon in the transistors (TFTs) which control the movement of those crystals. IGZO competes with amorphous silicon (a-Si), which is what is currently used in the TFTs of the vast majority of LCD displays, and with low-temperature polycrystalline silicon (LTPS), which is more commonly found in smartphones and other high-density displays, as well as in OLED panels (which are controlled by TFTs as well, just like LCDs). LTPS is the best-performing and most efficient of the three, but is complex and expensive to manufacture. It is mostly used in smartphones, where the extremely high pixel density makes a-Si unsuitable, and in OLED displays, where the electrical demands again make a-Si unsuitable. IGZO technology was introduced to bridge the gap between a-Si and LTPS, offering only slightly worse performance than LTPS and only slightly higher manufacturing cost compared to a-Si. It presents an excellent alternative to LTPS as high-density LCDs and OLED displays become more common, and the lower cost and complexity will help these technologies scale up in production. The TFT layer doesn't determine the image characteristics, though; those still depend on the LCD panel itself, so an LCD using IGZO transistors must still be using a TN, VA, or IPS LCD panel. IGZO transistors are a "background spec" only; they do not have a direct impact on image quality, so specifying that a display uses IGZO transistors is not a replacement for specifying whether it has a TN, VA, or IPS panel.

LCD vs. LED

LCD and LED are not competing technologies. An LCD panel forms an image, but does not generate any light itself. To illuminate the image, a backlight is required, which lights up the display from behind. Older LCD displays used CCFL (fluorescent) backlights, and since there were no other types at the time there was no need to be specific; we just called the whole thing "LCD". Today, CCFL backlights have been largely replaced by the brighter and more power-efficient LED strips used in modern "LED" TVs, but these TVs still create their images using the same LCD technology. Only the illumination source has changed.

AMOLED vs. OLED

Ask on a forum what the difference between OLED and AMOLED is, and you'll get all kinds of interesting answers, such as:

"AMOLED is Samsung technology, OLED is LG technology"
"OLED is better for large screens like TVs, AMOLED is better for phones and tablets"
"On OLED, black pixels are just greyed out while AMOLED displays actually turn pixels off"
"AMOLED is more efficient and lasts longer"

...and the list goes on. Unfortunately, all these answers are pretty far off the mark. The difference between OLED and AMOLED is the same as the difference between a bird and a falcon. "OLED" is a broad term, and "AMOLED" is slightly more specific, nothing more. It makes no sense to compare two products by saying "this one is OLED, while that one is AMOLED". It's like saying "This vehicle here is a sedan, while that one is a car", as if the two can be compared on that basis.
For any kind of actual comparison to be made, you would need to find out what kind of car the second vehicle is. For all you know, it could turn out to be a sedan too, which would make the entire conversation meaningless to begin with. And in fact, that is exactly the case for "AMOLED". OLED by itself is a broad type of lighting technology; you could use it to make household light bulbs if you wanted. Active Matrix OLED (AMOLED) specifically is the type of OLED configuration used for display panels. All computer-grade OLED displays are AMOLED, thus "AMOLED" is a redundant and useless term in the context of computers and similar electronics. If you see a phone or TV or monitor with an "OLED" screen, then you already know it's AMOLED; there aren't any other types of OLED configurations used for these applications.

It's the exact same story as when people used to say "Active-Matrix TFT-LCD". It's not incorrect, but the reason everyone has stopped making the effort to say the whole thing any more is that, as the years went on, people eventually noticed that ALL monitors with LCD panels are Active-Matrix TFT-LCDs; there are no other types of LCDs used in this industry. So there's no need to specify; it's just useless technical-sounding white noise that no one cares about. If you're talking about computer/phone/TV screens, you can just say "LCD" and the rest is implied. The same goes for OLED screens. Just say OLED; it's redundant and pointless to say "AMOLED" every time. The only reason anyone says "AMOLED" when talking about phones and TVs is because that's what Samsung calls it, and people don't know what it means. Saying "AMOLED" sounds like someone who insists on saying "computer mouse" every time they talk about peripherals. Being needlessly specific just sounds silly. Don't say AMOLED. Just call it OLED.

Quantum Dot vs. OLED / IPS / etc.

Quantum Dot technology does not compete with LCD or OLED. Quantum Dots are actually used in conjunction with LCD technology. It is in fact yet another new backlighting method for LCDs, offering better power efficiency compared to the LED backlights used in "LED TVs", which in turn offer better power efficiency than the CCFL backlights they replaced, generally known as the original "LCD TVs". All three of these technologies (CCFL, LED, and Quantum Dot) are used to provide illumination for LCDs. Quantum Dot technology is based on principles of quantum physics (atomic emission of energy as light in only very specific wavelengths, due to the fact that the energy levels of atomic orbitals are discrete steps, or "quantized", as opposed to being continuous), which is where the name comes from.

In a standard LED backlight, the LEDs emit white light, which contains light of (roughly) every color. This light then passes through the LCD panel, which is organized in a grid of subpixels. The aperture of each subpixel is covered by a red, green, or blue filter, which only allows that color of light to pass through that particular subpixel. All the other colors of light contained in the white light emitted by the LED, everything that isn't red, green, or blue, does not pass through the LCD panel at all and is wasted. The typical white LEDs (which are actually blue LEDs coated with a yellow phosphor) also do not provide a particularly wide color gamut, which makes them for the most part inadequate for color-critical work.
Some displays address this using RGB LEDs instead of white only, but that consumes more power than white LEDs (due in part to the low efficiency of green LEDs), and it is very expensive and requires complex driving circuits. However, RGB LEDs do provide a much wider color gamut, with the ability to produce 100% of the AdobeRGB color space for color-critical work. (Note: in this article I use RGB LED as a blanket term which includes hybrid LED-phosphor solutions like GBr-LED, using discrete green and blue LEDs and a red phosphor, since these are very similar in performance to full RGB LED, just more cost-effective.)

Quantum Dot backlights use a combination of nanocrystals and LEDs to emit RGB light. Current implementations use a blue LED and two different types of nanocrystals which absorb some energy from the blue light. When that energy is released again, it is still emitted as light, but at a different wavelength (again, due to principles of quantum physics), and these two particular nanocrystals are specifically engineered to emit at red and green wavelengths, respectively (the blue is provided directly by the blue LED, so in fact quantum dot displays are both LED and LCD displays too). In effect, compared to traditional WLED-backlit displays, the blue light from an LED is "converted" into RGB, rather than simply filtering the RGB wavelengths out of white light and discarding the rest. This means that much more of the total light emitted by the LED is put to use rather than wasted, and you get the wider color gamut as if you were using RGB LEDs.

Overall, the technology is still based on light from LEDs, but adds the "quantum dot light conversion" to improve energy efficiency. It does offer a wide color gamut in comparison to white LED, but compared to LED technology overall, quantum dot offers nothing in terms of picture quality that couldn't already be achieved with RGB LED backlighting. However, since quantum dot is cheaper and easier to scale to small mobile devices, it may see wider adoption than the cost-prohibitive and complex RGB LED solution, which made some appearances but never really caught on. Plus, a cool-sounding name also helps sales. Perhaps quantum dots can be considered a "wide gamut for the masses" technology, but in the big picture it's certainly nothing groundbreaking. Wide-gamut capability is nothing new to LCDs (just historically expensive), and as an LCD-based technology it still faces the same response time and refresh frequency limitations and still offers no improvement in black levels, in contrast with OLED technology, which makes enormous strides in all of these areas in addition to offering wide-gamut color and simpler construction. Quantum Dot technology is essentially just an enhanced solution for achieving the benefits of an RGB LED backlight with better power efficiency, cost efficiency, and scalability.
Posted December 30, 2014

@Glenwing Can you post a topic about viewing distance for monitors and whether people will really notice a difference with 4K? Like this article: http://carltonbale.com/1080p-does-matter/ (most people don't have 20/20 vision). Great article and thanks.
Posted December 30, 2014 · Original Poster

On 12/30/2014 at 11:26 AM, GrimNeo said: @Glenwing Can you post a topic about viewing distance for monitors and whether people will really notice a difference with 4K? Like this article: http://carltonbale.com/1080p-does-matter/ (most people don't have 20/20 vision). Great article and thanks.

Thanks. I discussed this with a friend, actually. We have both a 50" 4K screen and a 50" 1080p screen, and I must disagree with the article here; the difference between the resolutions is quite apparent. The problem is that a simple mathematical approach won't work here, since it's not as simple as "how far until you can't distinguish individual pixels anymore". Even when you are too far away to see the pixels, you can still make out finer details on the 4K screen that are simply not visible on the 1080p one. It also depends on a person's vision, of course. Maybe I should write a topic just to explain the relationship between pixel density and viewing distance, and the diminishing returns you get, just as a general concept.
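For reference, here's how the simple math works out anyway, so you can see where charts like the one in that article come from. It's just the rule of thumb that 20/20 vision resolves about one arcminute, applied per pixel, with all the caveats above about why that isn't the whole story:

```python
import math

ARCMIN = math.radians(1 / 60)            # ~20/20 acuity: about one pixel per arcminute

def max_useful_distance_in(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Farthest distance (inches) at which one pixel still subtends one arcminute,
    i.e. the rough point where extra resolution stops being resolvable."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from the diagonal
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(ARCMIN)

for name, px in [("1080p", 1920), ("4K UHD", 3840)]:
    d = max_useful_distance_in(50, px)
    print(f'50" {name}: individual pixels blend together beyond ~{d / 12:.1f} ft')
```

By that rule you'd have to sit within roughly 3 feet of a 50" screen for 4K to matter at all, which is exactly the kind of conclusion that doesn't match what we actually saw side by side.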
Posted December 30, 2014

Thanks. I discussed this with a friend, actually. We have both a 50" 4K screen and a 50" 1080p screen, and I must disagree with the article here; the difference between the resolutions is quite apparent. The problem is that a simple mathematical approach won't work here, since it's not as simple as "how far until you can't distinguish individual pixels anymore". Even when you are too far away to see the pixels, you can still make out finer details on the 4K screen that are simply not visible on the 1080p one. It also depends on a person's vision, of course. Maybe I should write a topic just to explain the relationship between pixel density and viewing distance, and the diminishing returns you get, just as a general concept.

Yeah, I agree. Maybe like what is good for 1080p at this many inches, and 1440p and 4K and so on. I hear people say 24" for 1080p and 27" for 1440p, but I have not heard it for 4K. I'm not sure about pixel density. If you know, it would be appreciated. Thanks again.
Posted January 2, 2015

I was actually going to create a thread very similar to this that would've been called "The Monitor Buying Guide" and have it pinned like you've done. But you, obviously, beat me to it, damn it! Ya ninja! @Glenwing
Posted January 3, 2015

Could you compare the display technologies such as CRT, LCD, OLED, maybe even DLP and Plasma? Thanks for the information, it really cleared up my misconceptions!
Posted January 3, 2015 (edited)

Something that needs to be mentioned, @Glenwing, is that often the dynamic contrast ratio is listed as the static contrast ratio, or sometimes just as "Contrast Ratio". This often leads to people comparing various monitors saying things like "This monitor has a 10,000,000:1 contrast ratio but this other one only has a 500,000:1 contrast ratio". This is a situation I had with a particular member on here that I had to clear up with him. Could you mention this and set people straight? It can be quite tiring to explain it all, because websites won't bother to list the real static contrast ratio when the dynamic one looks more impressive. FYI: Please excuse the badly written post. I'm writing this @ 1:00am and I'm quite tired.

Edited January 3, 2015 by Geekazoid
Posted January 4, 2015 · Original Poster

Yeah, I agree. Maybe like what is good for 1080p at this many inches, and 1440p and 4K and so on. I hear people say 24" for 1080p and 27" for 1440p, but I have not heard it for 4K. I'm not sure about pixel density. If you know, it would be appreciated. Thanks again.

Well, an "optimum" pixel density doesn't exist as far as I'm concerned. The whole "1080p looks bad at 27 inches" thing, I'm pretty sure, is another one of those things people repeat because they heard other people saying it. It doesn't look that bad to me, and of course it depends on viewing distance. Preferred pixel density is a matter of preference. Some people are fine with 1080p at 27" or higher, depending on their eyesight, and likewise some people can't deal with 4K at 24", but then again it depends on your software as well. If the OS can effectively scale everything so that instead of things being tiny or disproportionate you just get the AA effect on everything, then suddenly ultra-high densities become a lot more comfortable. My opinion on this topic is that there is no rule. It is absolutely preference, and also depends on how the software you use handles high pixel density. I do wish people would stop saying 1080p looks bad at 27"; I have a feeling the majority of these people haven't actually tried it.

Could you compare the display technologies such as CRT, LCD, OLED, maybe even DLP and Plasma? Thanks for the information, it really cleared up my misconceptions!

Not planning to do that right now, as that can turn into a pretty big topic. I may cover it at a later point though; I'm working on a thread talking about display technology and how it works. It's a more science-focused and comprehensive thread than this one, which is more about shopping advice and clearing up misconceptions than explaining the basics of what everything is and how it works.

Something that needs to be mentioned, @Glenwing, is that often the dynamic contrast ratio is listed as the static contrast ratio, or sometimes just as "Contrast Ratio". This often leads to people comparing various monitors saying things like "This monitor has a 10,000,000:1 contrast ratio but this other one only has a 500,000:1 contrast ratio". This is a situation I had with a particular member on here that I had to clear up with him. Could you mention this and set people straight? It can be quite tiring to explain it all, because websites won't bother to list the real static contrast ratio when the dynamic one looks more impressive.

Thanks, I added a paragraph to the contrast ratio section to clarify.
Posted January 7, 2015

Thank youuuu!!! Now I don't have to repeat myself 200 times! Also, VERY well explained. I also learned something, which is very nice as well.
Posted January 11, 2015

Can you add a section detailing the differences between TN and IPS, like a TN vs IPS section, please? This would eliminate me having to explain it a lot. Thanks!
Posted January 11, 2015 · Original Poster

Can you add a section detailing the differences between TN and IPS, like a TN vs IPS section, please? This would eliminate me having to explain it a lot. Thanks!

For more "general knowledge" topics like how an LCD works or a crash course on TN, VA, and IPS, I may make a separate topic in the future.
Posted January 11, 2015

Can you add a section detailing the differences between TN and IPS, like a TN vs IPS section, please? This would eliminate me having to explain it a lot. Thanks!

For more "general knowledge" topics like how an LCD works or a crash course on TN, VA, and IPS, I may make a separate topic in the future.

Or... you could just add it here. Just a basic difference would even be good to cover all bases.
Posted January 14, 2015

Great little guide, hopefully it'll clear up some commonly spread misconceptions and help out those who don't know where to start when monitor shopping!
Posted January 17, 2015 · Original Poster

Added a Quantum Dot section at the bottom.
Posted January 17, 2015

Added a Quantum Dot section at the bottom.

What about a flux capacitor section? JK
Posted January 17, 2015 · Original Poster

What about a flux capacitor section? JK

That could very well be the next one. Who knows what the TV marketing people will come up with next.
Posted January 17, 2015

That could very well be the next one. Who knows what the TV marketing people will come up with next.

Exactly!
Posted January 18, 2015

I love this post SO MUCH! even tho i already knew most of this... but still... THANK YOU! ^2
Posted January 19, 2015 · Original Poster

I love this post SO MUCH! even tho i already knew most of this... but still... THANK YOU! ^2

Thanks
Posted February 10, 2015 · Original Poster

Added a section (number 8) to address the whole "1080p 4K exact scaling" thing.
Posted February 10, 2015

Thanks for this thread, very useful information there. I was asking number 12 myself. I just got my PB278Q very recently, before the announcements of 144 Hz IPS G-Sync, or even G-Sync on IPS at all. Looking back this makes me very sad, because I went with the Asus over a G-Sync panel because I dislike TN panels. I hope I can trade my PB278Q plus some money for a good IPS G-Sync screen once the market has settled.

who cares...
Posted February 10, 2015 · Original Poster

Thanks for this thread, very useful information there. I was asking number 12 myself. I just got my PB278Q very recently, before the announcements of 144 Hz IPS G-Sync, or even G-Sync on IPS at all. Looking back this makes me very sad, because I went with the Asus over a G-Sync panel because I dislike TN panels. I hope I can trade my PB278Q plus some money for a good IPS G-Sync screen once the market has settled.

Thanks, glad you found it helpful
Posted February 10, 2015

Added a section (number 8) to address the whole "1080p 4K exact scaling" thing.

Yes. Funny thing is that last week I did the test with a 4K monitor at work, and it was indeed not sharp. It wasn't as blurry as a non-native resolution, but it looked more like displaying text on a TV. I think the problem is that the panel grid is not thin enough. If the grid were nonexistent in monitor technology, it would appear correctly. What is interesting, however, is that when Apple introduced their Retina display, they set the monitor to 1440x900, basically double the pixels, as software was not high-DPI ready at the time, essentially making those who paid the premium price waste their money. But Apple did manage to have good sharpness (http://www.anandtech.com/show/5996/how-the-retina-display-macbook-pro-handles-scaling). My guess is that the panel grid is even smaller due to the size of the screen being only 15 inches.
Posted February 10, 2015 · Original Poster

Yes. Funny thing is that last week I did the test with a 4K monitor at work, and it was indeed not sharp. It wasn't as blurry as a non-native resolution, but it looked more like displaying text on a TV. I think the problem is that the panel grid is not thin enough. If the grid were nonexistent in monitor technology, it would appear correctly. What is interesting, however, is that when Apple introduced their Retina display, they set the monitor to 1440x900, basically double the pixels, as software was not high-DPI ready at the time, essentially making those who paid the premium price waste their money. But Apple did manage to have good sharpness (http://www.anandtech.com/show/5996/how-the-retina-display-macbook-pro-handles-scaling). My guess is that the panel grid is even smaller due to the size of the screen being only 15 inches.

Pretty sure Apple's software was high-DPI aware on the Retina MBPs. They were scaled to 1440x900 effective desktop space as far as the size of the UI goes, but they did use the extra pixels to smooth everything out.
Posted February 14, 2015

@Glenwing Thanks for the topic. Very informative and easy to understand.