Blog Entries posted by Glenwing

  1. Glenwing

    Version Numbers
    [I'm writing this post mainly in reaction to this TFT Central article, but this is less of a direct response and more of a general information post for the community. I originally wrote this for r/monitors, but I decided it would probably be better to post it here where I have more control over formatting, because this is going to be a looong post.]
    Introduction
    Well. I was already planning on writing an article on this topic, but since TFT Central published their piece, it seems that if I'm going to say anything, now is the time. So this post is going to be a bit of a rush job, but I'll do my best.
    There are many problems at play here. I agree with some parts of the article, particularly that HDMI devices need to start listing explicit support for features they implement, but there is also a vast misunderstanding about how version numbers work and what they mean, and it is a misunderstanding shared by the community at large. I don’t expect this post to be well-received (I can see the “it’s not broken, you’re just using it wrong” memes already), but... it is what it is.
    The Big Misunderstanding
    I think the problem can be nicely represented with these two lines of text, describing the document published in 2017 by the HDMI Forum:
    “The HDMI 2.1 Specification”
    versus
    “The HDMI Specification, Version 2.1”
    The first one is how people think of the standard. The second one is the reality.
    What’s the difference? Well... there seems to be this belief that there are multiple “HDMI Specifications”, which are used to make different classes of devices. “The HDMI 1.4 Specification” is the standard for building humble 1080p devices, whereas if you want to make a 4K 60 Hz TV, you reach into your filing cabinet and pull out “The HDMI 2.0 Specification” instead, which is a different standard for designing a different class of device. And the latest addition to the collection is “The HDMI 2.1 Specification” which is only for designing “HDMI 2.1 devices” like 4K 120 Hz monitors, but not any other kind of HDMI device; if you want to build an “HDMI 2.0 device” or an “HDMI 1.4 device”, you put that 2.1 Specification away, go back to your filing cabinet and pull out the good old 2.0 or the 1.4 standard and refer to that one.
    But that’s not how these documents work.
    Version 2.1 is not an addition to some imagined “HDMI Specifications Master Collection”. The HDMI Specification IS the master collection. It is The HDMI Specification, Version 2.1. It replaces Version 2.0 as the current specification document used for the design of ALL HDMI devices, from lowly 1080p 60 Hz SDR monitors to 4K 120 Hz HDR. The HDMI Specification, Version 2.0 is not a separate standard, it is the same document, just an earlier edition. It was edited to make some new additions to the standard, and thus became Version 2.1 of the document, because that’s how editing documents works. Make some changes or additions, increment the version number, note your changes in the revision history section, and the old version is now deprecated and the new version supersedes it.
    “Compliance” with Version Numbers
    So yes, all HDMI devices are “HDMI 2.1 compliant”, because they all comply with the requirements of The HDMI Specification, whatever the currently in-force edition is, which currently happens to be Version 2.1 (but if Version 2.2 comes out next year, then that will be the document that all future HDMI devices need to comply with). If they didn’t comply with the Specification, they wouldn’t be considered HDMI devices. At all. The HDMI Specification is the document that describes how to build an HDMI device—any HDMI device. Whatever the most current version is, that’s the document you as a designer refer to for any new designs, regardless of what capabilities you are implementing. Everything from the version 2.0 document is still there in the version 2.1 document, there’s just some additional stuff that’s also been added. You never go back to a previous version and use that for your design work.
    It’s just like if you build a house, you follow the most recent edition of the National Electrical Code or whatever governs such projects in your area. If the 2020 edition changes some requirements that weren’t in the 2017 edition, and you don’t follow them, the response should not be “of course it’s still code compliant, I’m not designing a v2020 house that’s all, this is a v2017-spec house I’m building”. No; the 2017 edition is deprecated, it’s been replaced with the 2020 edition which is the current in-force standard, and if you don’t follow it, you aren’t code-compliant.
    On the flip side, let’s suppose the 2020 edition of the NEC just contains some new requirements for 3-phase industrial facilities; residential requirements are unchanged. So, you build your house, it’s compliant with the 2020 NEC. Should someone come along and say “hey, you say it’s 2020-compliant, but it doesn’t have 3-phase power. How can you claim to be compliant with the 2020 edition? You’re only building a 2017-compliant building!” Naturally this is nonsensical; the 2020 edition still contains the requirements for residential, and they’re the same as they were in the 2017 edition.
    So does that mean “version compliance” is meaningless? Well, if you're trying to determine what features or capabilities the device has, yes it’s entirely meaningless for that purpose, because version numbers do not and were never intended to convey that type of information.
    The idea that what we need to do is “fix” and “standardize” the meaning of version numbers so that they can be used to convey feature support is misguided. In our house and electrical code compliance analogy, the person might say “but the 2020 edition added new things about 3-phase systems, and this house doesn’t even have a 3-phase system! There’s no way it can be 2020-compliant!” Of course, this is a misinterpretation; it doesn’t mandate every building must now have 3-phase, it’s just IF you have that, this is how it needs to be set up, or whatever. But it’s optional. And they might reply “but if it’s optional, then the edition year is meaningless! How am I supposed to tell if a building has industrial three-phase power? What good are these meaningless “edition numbers” if I can’t just check if it’s 2020-compliant to determine if it has three-phase power or not?”
    Perhaps the answer is that using the edition of the standard as a way of identifying what type of power the building has is... not the most sensible way of doing things. And I think the answer is not “we need to come up with some weird arbitrary system relating the claimed compliance-year with the type of electrical configuration and power capacity your building is equipped with based on which edition that configuration was first allowed in because then it will be standardized so all the confusion will disappear!”
    Can Devices “Change Versions”? And Why is Everything Optional?
    This analogy of code compliance raises another interesting question: if you had a house built to the standards of the 2017 edition, did your house magically “become” a 2020-compliant house when the standard was published, if none of the requirements changed? Similarly, did “HDMI 1.4a” devices magically “become” 1.4b devices upon the release of Version 1.4b? (For reference, Version 1.4b just fixed some typos and formatting, no technical changes) Well... basically yes. Did “HDMI 2.0” devices “become” HDMI 2.1 devices when the standard was published? If by “becoming a version” you mean “following the requirements of the standard”, then yes, devices that complied with the requirements of the Version 2.0 Specification also comply with the requirements of Version 2.1, because the minimum requirements for an HDMI device haven't changed. All HDMI devices shall be interoperable with any DVI-compliant interface. All HDMI devices shall be capable of a 480p60 or 576p50 CTA-861 format. All HDMI devices shall be capable of supporting RGB pixel format. And so forth.
    Remember that The HDMI Specification, whatever the most current version is, is the standard that governs the design of all HDMI devices. If something was made mandatory that wasn’t before, it would mean that all future HDMI devices, of every class, would require it. If support for 48G FRL was mandatory in Version 2.1 of the HDMI Specification, that would mean from here on out, all new HDMI devices need to be able to transmit/receive at 48G speed. Goodbye basic HDMI output support on future Arduino boards. Goodbye basic HDMI controllers in 1080p 60 Hz monitors that only need 148.5 MHz capability; gotta put in the fancy controllers and make it much more expensive for absolutely no reason. So yes, most of the features in The HDMI Specification are optional, including everything new in Version 2.1. That’s how it’s been with every new feature, of every version of The HDMI Specification.
    And yes that does mean that basically every single HDMI device in the world is always “compliant” with whatever the latest Specification version is. “Version x-compliant” is not a method of indicating support for features. When the next version is published, all current devices will no doubt still be in compliance with Version 2.2 of the HDMI Specification, or whatever it’s called. If that doesn’t sound right, that HDMI devices can suddenly “become” a different version with the wave of a wand and the stamp of a seal, the issue is that “become” just isn’t a good word choice here. “HDMI 2.0” and “HDMI 2.1” aren’t “things” that a device can “be”; an HDMI controller can’t “be” a PDF document. HDMI version numbers are not some kind of physical property. Devices that comply with the requirements in Version 1.4a of the HDMI Specification are also compliant with Version 1.4b. And with 2.0. And 2.0a. And 2.0b. And yes, even Version 2.1, and likely any future version of the HDMI Specification.
    The outrage over “new features all optional” stems from the earlier misconception that the “HDMI 2.1 Specification” is just a subset of HDMI standards, and only applies for designing “HDMI 2.1 devices” and not lower-end devices (where you would supposedly pull out the “HDMI 2.0 Specification” and therefore no longer be subject to the 2.1 requirements). But since that isn’t how it works, the HDMI Specification governs the creation of all HDMI devices, so hopefully it makes more sense why new capabilities are always optional.
    Then What Do We Call It?
    But still... if everything is optional, it’s a problem. Shouldn’t there be some kind of classification system for different devices, to differentiate them based on what features they support? Right now people use version numbers for that purpose, but since that doesn’t really make sense (as the latest version of the document always encompasses the entirety of the ecosystem), what should we use? It’s really dumb that they don’t have a standard way of identifying what features are implemented on a device!
    Well, if you recall, the reality is they have made a standard way of identifying what features a device supports. It’s called “declaring all of the features that the device supports”, which is what manufacturers are supposed to do. Devices should not be saying “HDMI 2.1 input”, they should be saying “HDMI input supporting 48G FRL, HDR, and Game Mode VRR”. But, obviously, that’s a lot more syllables than “HDMI 2.1”. Hence it’s also understandable that people and reviewers gravitate toward “version 2.1” as a sort of “code word” shorthand for “supports stuff that is new in version 2.1”. The problem is that people don’t realize it’s an informal code word, and think that it has some kind of standardized technical meaning. And companies copy the notation that people use. And of course, this is made worse by the total lack of enforcement, and the lack of effort to educate consumers on the part of HDMI LA (i.e. an official webpage explaining this would help a lot).
    But shouldn’t they make official meanings for each version number? I mean, it’s pretty sensible right, if your device supports 600 MHz TMDS maximum, that’s HDMI 2.0-spec right? HDMI 1.4 imposed a limit of 340 MHz, so it’s obviously beyond HDMI 1.4. And it doesn’t support HDMI 2.1 FRL mode, so it’s not HDMI 2.1. So it’s pretty clear cut right? We call that an “HDMI 2.0” device and call it a day.
    Unfortunately, things don’t fall into such neat slots, especially when you start considering multiple features. My Dell UP2516D supports 2560×1440 at 60 Hz over HDMI, which is around 240 MHz TMDS. It also supports 10 bpc RGB color, but only over DisplayPort. So... what “version” of HDMI would you call it? The old Version 1.2 Specification mandated a maximum of 165 MHz and only supported 8 bpc RGB color. Version 1.3 changed the maximum limit to 340 MHz, and also added support for 10 bpc color. So... is the Dell an HDMI 1.2 monitor? Can’t be, 240 MHz is outside the maximum limit. Is it a Version 1.3 device? But that would mean it also supports 10 bpc color too, and it doesn’t. So... what do you call it?
    Of course, I know the response from some people will be “I call that a... NOT ALLOWED! If it supports 10 bpc color over DisplayPort, and it has the bandwidth, it should also support it over HDMI! It’s part of the Specification, after all. We should force them to require support for these things, so they can’t call it an “HDMI 1.4 port” but leave out color support like that for no reason!” Perhaps in that case, yes, but be careful what you wish for. Lumping all the features of a version into a “package” and mandating that you have to implement the whole package or not at all might not always be a good thing.
    Let’s consider another case: a 1440p 144 Hz SDR monitor. Perfectly well served by 600 MHz TMDS “HDMI 2.0” speeds. No need for 48G FRL support. However, the monitor manufacturer wants to implement support for the new Game Mode VRR protocol introduced in Version 2.1, to allow NVIDIA cards to use G-Sync over HDMI. So it’s a monitor with an HDMI input supporting 600 MHz TMDS and Game Mode VRR for variable refresh rate. Is that a problem? Not at all, except for people who are very attached to version number labels; for them, it does become a problem. Is it an HDMI 2.1 device? Some might say “No, it doesn’t support 48G FRL, it’s limited to 600 MHz TMDS HDMI 2.0 speed, that’s not an HDMI 2.1 device. You need to call it an HDMI 2.0 device or else it’s false advertising”. But then, HDMI Game Mode VRR was introduced in Version 2.1 of the Specification, so if you call it an HDMI 2.0 display, people will look at the monitor and say “but it only has HDMI 2.0 input, and you need HDMI 2.1 to support VRR, so how is it possible to support it over HDMI 2.0? I’m confused!” So would that be an HDMI 2.0 input or an HDMI 2.1 input? The answer is... neither! It’s an HDMI input supporting 600 MHz TMDS and Game Mode VRR. Devices don’t have “version numbers”. The only thing that has a version number is a PDF document called The HDMI Specification.
    Not Such a New Problem
    We actually had this exact problem in this very subreddit with DisplayPort 1.4 and HDR, multiple times (here, here, and here for example). There are some displays that are 4K 60 Hz or 1440p 144 Hz for example that support HDR over DisplayPort, but they only need HBR2 speed (even at 10 bpc color depth). So they equipped their monitors with a DisplayPort input supporting HBR2 and HDR. But because everyone demands version numbers (manufacturers will get annoyed reviewers and customers constantly asking them “but what version is it??” if they don’t include one), they needed to put a version number on it. The problem is, is it a DisplayPort 1.4 input? It supports HDR, which was introduced in version 1.4 of the DisplayPort Standard. Or is it a DisplayPort 1.2 input, since it is limited to HBR2 speed?
    Well... Dell labeled theirs as “DisplayPort 1.4 input” while LG called it a “DisplayPort 1.2 input” on one monitor and a “DisplayPort 1.4 input” on one of their other models. The exact same implementation of DisplayPort.  ¯\_(ツ)_/¯
    Many people’s reactions will be “see, that’s exactly why we need standardization on the meaning of version numbers, so it doesn’t lead to confusion like that”, which is sort of the right feeling, but the wrong conclusion. The real answer is “see, that’s exactly why we need to stop using version numbers”. It’s a DisplayPort input supporting HBR2 and HDR. That’s all there is to it. If you say “no, all we need to do is make a rule that they are only allowed to call it DisplayPort 1.4 if it supports everything—HBR3, HDR, DSC—so there’s no confusion.”, well... Does that really eliminate the confusion? Are you sure?
    Let’s say you make this rule, and then you make a display that implements HDR, but it’s a 1080p display and doesn’t need HBR3 speed. Since it doesn’t support HBR3, we’re not allowed to call it a DisplayPort 1.4 input. Hooray, no one will mistakenly believe it supports HBR3 speed and DSC! But then... what do we call it? DisplayPort 1.2 or 1.1, based on the transmission speed? But then that destabilizes the meaning of those version numbers, because the monitor supports HDR, which was introduced in version 1.4. But the monitor is “DP 1.2”. So HDR isn’t a 1.4 feature after all? Or do “DisplayPort 1.2” monitors support HDR after all? Sometimes yes sometimes no? The confusion is still present. Or should that combination just not be allowed? Should they only allow manufacturers to implement the whole “version” as a package, all or nothing? If you don’t implement HBR3, you aren’t allowed to support HDR either? That just forces lower-end devices to be more expensive by requiring high-speed receivers and other capabilities they can’t even use, and for what reason? Just to force “version numbers” to become usable as a descriptor? Or we could just call it a “DisplayPort input supporting HBR2 and HDR” and be done with it.
    Should we really make an official “HDMI 2.1” label, and if you want to support any HDMI 2.1 features like VRR, you have to also implement 48G FRL transmission speed and the whole package? That essentially makes VRR exclusive to 4K 120 Hz displays, never to be implemented on 1440p 144 Hz options and below, unless the manufacturer wants to implement overkill controllers that just make it significantly more expensive and increase development time in an industry where everyone is already complaining about things taking too long to come to market. And all of that, just so that we can “preserve the meaning of our arbitrary choice to use version numbers of different historical editions of a PDF document as an indicator of feature support instead of just naming the actual feature”? There is nothing to be gained by linking implementation of VRR and 48G FRL speed, nor by linking HDR to 600 MHz TMDS, or any other combinations of features that have absolutely nothing whatever to do with each other from a technical standpoint.
    I have seen some reactions to this TFT Central article along the lines of “they’ve really ruined the standard now, just like USB. It was so easy before with HDMI 2.0 and HDMI 1.4”, apparently in the belief that “HDMI 2.0” and “HDMI 1.4” had some standard meaning. They don’t. Anyone remember the constant stream of forum posts in 2013 to 2016 from the armies of people buying the ASUS VG248QE and wondering why the HDMI 1.4 port was limited to 60 Hz? (Which, just for reference, was because ASUS didn’t implement the maximum allowed 340 MHz TMDS clock, but listed “HDMI 1.4” on the product page). Even now, nearly 2022, we still get the occasional post on this subreddit asking why the BenQ XL2411 or Acer GN246HL (which had the same limitation as the ASUS) is only getting 60 Hz from the “HDMI 1.4” input.
    No, this is nothing new. It has always been this way, even back in the HDMI 1.4 days. And the labeling rules 10 years ago were exactly the same as they are now. Do not describe devices with an “HDMI version number”. List the actual capabilities that the device has.
    The real problem is that HDMI LA has not really enforced this policy, I suspect simply because the practice is so widespread that they gave up on trying to enforce it, and now it sounds like they just allow companies to use the version number of the HDMI CTS standard that their device is compliant with, which as I explained, is simply whatever the latest standard is.
    It’s worth noting we have already seen this cycle with DisplayPort. VESA’s marketing guidelines from 2012 stated, in essence, that version numbers were not to be used to describe products at all.
    But due to the continued trend, they caved a bit and later revised this rule in 2013 to try to make somewhat of an “official meaning” for each version number.
    But in even newer editions of the guidelines, since around the release of version 1.4 of the DisplayPort standard, this policy has been removed entirely and VESA has been silent on the “official word” for version number usage ever since, probably still trying to figure out what to do other than telling everyone “you need to completely stop using version numbers” since, obviously, that didn’t work the first time they did it.
    Unfortunately the use of version numbers has become so entrenched that it would be very difficult to get people to stop using them. Most people have an extremely strong negative reaction to being told they should change their notation, pronunciation, spelling, or any other related habit (see GIF debate, pronunciation of Porsche), regardless of any reasoning or evidence presented or if the reasons make practical sense (see 2K/4K, GB vs GiB). Companies also do not want to stop using version numbers, because they want to use whatever notation is popular among consumers, to improve search results and attract customers; if the “common knowledge” is that you need an “HDMI 2.0 device” for 4K 60 Hz, then if one company follows all the rules and lists the exact capabilities of their device with “HDMI input supporting 600 MHz TMDS” they’re probably going to lose customers to the other companies improperly listing “HDMI 2.0 input”, along with a bunch of 1-star reviews from people angry at them for “trying to trick us by hiding the version number of the port” or something like that.
    Indeed, the official word of the official organizations, ultimately, doesn’t seem to carry enough power on its own. I think the only way this “version number problem” will be overcome is if people can be educated about the proper notation enough to stop using it, and to discourage companies from using it. It’s HDMI LA’s job (and VESA’s, in the case of DisplayPort) to legally enforce it, but I doubt they will unless people actually show some interest in it, which will be hard since, as I said, people react negatively to any suggestion that they change their notation about anything.
    Closing Thoughts
    Don’t use version numbers to describe device capabilities. Just describe the device capabilities. Companies need to start doing this first. Ultimately this is the only sensible way of doing things.
    People shouldn’t need to ask “How do I know if it supports HDR over HDMI?” with redditors replying “You need to check if it has HDMI 2.0a or 2.0b, that means it supports HDR”. No; if you want to check whether the HDMI input supports HDR, the procedure should be “Step 1: Check if it says ‘HDMI input with HDR’. Step 2: Done!” That should be the real answer, and would make the landscape absolutely clear. Unfortunately this simply isn’t how things are done right now.
    The TFT Central article does encourage companies to list their device capabilities directly, and I strongly support them there, but unfortunately on the topic of version numbers, it’s the opposite of what we need, just reinforcing the idea that products are “supposed” to use version numbers in a certain way, instead of teaching people why they shouldn’t be used at all.
    There’s no doubt this whole debacle with version numbers is a problem. The vagueness and meaninglessness of version numbers has been a thorn in the industry for the last decade. Again, ASUS VG248QE, BenQ XL2411, Acer GN246HL. Need I say more? HDMI LA makes almost no effort to educate the consumer market on the subject of version numbers, but more importantly their oversight and enforcement of industry practices is very lacking. But my opinion on the solution is a bit different than everyone else’s. While it seems the common response here is “they need to standardize the meaning of version numbers”, I don’t think that’s a good way to go about things, for the numerous reasons explained above. There’s simply no way to make it make sense.
    So what should be done? In addition to actually enforcing the current policies, which mandate that products are not to be advertised or labeled with version numbers and need to list the specific features supported by the device, I think HDMI LA should mandate a very specific standardized format for listing supported features, and require it on all products.
    For example, perhaps on product pages, spec sheets, and online retailer product descriptions, HDMI inputs must be at minimum labeled with their maximum transmission speed, HDR if supported, VRR if supported, and DSC if supported, in exactly this format:
    2× HDMI input (600 MHz TMDS, HDR)
    1× HDMI input (32G FRL, HDR, VRR, DSC)
    And in addition, the manufacturer should be required to fill out a full table (which would be a standardized table distributed to all HDMI Adopters) which is the comprehensive list of all HDMI features, and they are required to fill out the entire table with a check mark on every supported feature (or a number or whatever is applicable). This table should then be required to appear in the product manual of any HDMI device, and on the manufacturer’s product page, and on the product box if space permits (if not, a QR code leading to the product page and a notice saying the HDMI capabilities can be found there), and included as one of the product photos on any online retailer where the manufacturer controls the photos. For example:
    HDMI Capability                  | HDMI INPUT 1           | HDMI INPUT 2–4
    Maximum FRL Speed                | 32G                    | —
    Maximum TMDS Speed               | 600 MHz                | 600 MHz
    Supported Color Formats          | RGB, 8–12 bpc          | RGB, 8–10 bpc
                                     | YCBCR 4:4:4, 8–10 bpc  | YCBCR 4:4:4, 8–10 bpc
                                     | YCBCR 4:2:2, 8–12 bpc  |
                                     | YCBCR 4:2:0, 8–12 bpc  |
    HDCP Version                     | 2.2                    | 2.2
    High Dynamic Range (HDR)         | HDR10+, Dolby Vision   | HDR10
    Game Mode VRR                    | ✓                      |
    Display Stream Compression (DSC) | ✓                      |
    Auto Low Latency Mode (ALLM)     | ✓                      |
    ...(etc.)
    ...or something like that, anyway. But whatever the format is, it should be exact. No custom table entries, no renaming features on the table with company-specific “brand names”, or any of that nonsense. It should become as uniform and expected as “Nutrition Facts” are on food.
    Then, perhaps require a footnote at the bottom of this table with these exact words: “The use of a ‘version number’ to describe the capabilities of an HDMI interface is prohibited by the HDMI licensing agreement. Do not interpret ‘HDMI version numbers’ as an indication of support for any particular feature, on this device or on any other HDMI device.” In every manual, on every product page, on every box. Maybe that can help get the ball rolling with consumer education. But most importantly, these things would need to be enforced. Vigorously.
    That, I think, would go a long way toward reducing confusion, and it’s easier to get companies to follow a strictly-formatted guideline than an open-ended mandate of “just list the features... somewhere”, where you just end up thinking “ehh... they probably won’t notice if we don’t have it at all. Are they really going to go through the website, look on the box, read the manual, just to check if it’s somewhere on any of those?” Whereas “this exact table must be present inside this particular document and on this particular web page, if it’s not we’ll notice” is a bit easier.
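    Incidentally, if such a table were ever made machine-readable, it might look something like this (purely my own hypothetical sketch, not any official HDMI LA format):

```python
# Hypothetical machine-readable capability declaration for the example above.
hdmi_input_1 = {
    "max_frl_speed": "32G",
    "max_tmds_clock_mhz": 600,
    "color_formats": [
        "RGB 8-12 bpc",
        "YCbCr 4:4:4 8-10 bpc",
        "YCbCr 4:2:2 8-12 bpc",
        "YCbCr 4:2:0 8-12 bpc",
    ],
    "hdcp_version": "2.2",
    "hdr": ["HDR10+", "Dolby Vision"],
    "game_mode_vrr": True,
    "dsc": True,
    "allm": True,
    # Deliberately no "hdmi_version" field: versions belong to the
    # Specification document, not to devices.
}
```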
    And with that, I think I’m finally out of words. (Except for a few footnotes, of course!)
    Footnotes
    Earlier I said each new version of the HDMI Specification is an edited version of the previous version, and still contains everything that the previous one did, just with new additions, and thus they are not simultaneously active concurrent standards that you choose between—but rather the latest version represents the entirety of the HDMI Specification, and older versions are just the same document but now with parts missing, and they are now deprecated.
    Well... that's not entirely true. Because HDMI 2.0 was created by a completely different organization (the HDMI Forum, whereas previous versions were made by the original HDMI Founders), it is technically a completely separate standard. The HDMI 1.4b Specification is still considered an active standard and is the current and most recent version of itself. HDMI 2.0 is a separate standard which incorporates material from HDMI 1.4b and therefore refers to HDMI 1.4b in its normative references, the same way other industry standards are incorporated.
    So for example, the HDMI Specification uses IEC 60958 format for linear PCM audio, so rather than repeating all the material about how the format works, the audio section of the HDMI Specification just says “refer to IEC 60958”. For detailed video timings, it just says “refer to CTA-861”. Likewise, HDMI 2.0 (and its later revisions, up to HDMI 2.1) just says “refer to the HDMI Specification Version 1.4b” for all of the basic HDMI operational details, just like HDMI version 1.0 did with the DVI specification that it was based on. So technically, HDMI 2.0 is sort of like a supplement to HDMI 1.4b rather than a replacement, but this circumstance is unique to version 1.4b because of the change in management. HDMI 2.1 is a replacement for, and contains the entire contents of, HDMI 2.0b, which does the same for HDMI 2.0a, and then HDMI 2.0. The only advice I can give is... Try not to think about it too much. It’s just a weird artifact of the change in control of the Specification.
    I am also seeing many articles and videos now being published today, based on the TFT article, with titles such as “Why HDMI 2.1 Means Nothing Anymore” as if the deprecation of the term “HDMI 2.0” is a new policy that has just been enacted today. Just for the record, it’s been this way since the publication of version 2.1 of the Specification 4 years ago. It’s just that no manufacturers have actually gone along with it until now (probably helped in part by USB-IF’s discovery that this style of rebranding is, shall we say... unpopular among consumers. So when it came to HDMI I imagine most manufacturers thought “we know how that story ends, so let’s just leave that particular beehive alone and keep this new notation thing restricted to the internal development channels where it’s actually used”. Sadly it seems Xiaomi didn’t get the memo). Anyway, for historical interest, there’s an application note for HDMI PHY compliance testing from Keysight, all the way back from 2018, which already describes this same state of affairs.
    This isn’t something that just happened today, just to be clear on the timeline of events.
  2. Glenwing
    The ViewSonic XG2401 supports HDMI 1.4. I will now demonstrate it operating at 1920 × 1080 @ 144 Hz over HDMI. These tests are performed with an NVIDIA GeForce GTX 780 Ti, which also only supports up to HDMI 1.4.
    Display Settings Demonstration
    These settings show the XG2401 connected via HDMI on both ends at 1920 × 1080 @ 144 Hz with full RGB color. These settings are available out of the box without requiring any overclocking/custom resolutions. 1080p 144 Hz was in fact selected by default when the monitor was connected over HDMI for the first time; I didn't even need to set it to 144 Hz manually.
    Timing Parameters and EDID
    For 1920 × 1080 @ 144 Hz, ViewSonic has decided to define a set of custom timing parameters, with total frame dimensions (including blanking) of 2026 × 1157, for a pixel rate of 337.0 Mpx/s, just barely within the 340 Mpx/s maximum of HDMI 1.4. Curiously, when connected via DisplayPort, the monitor uses slightly different parameters defined by the standardized CVT-R2 formula, 2000 × 1157 or 333.2 Mpx/s, which would also fall within the 340 Mpx/s limit of HDMI 1.4. However, these timings are not used for the HDMI connections for some reason.
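    The pixel rate is simply the product of the total frame dimensions (including blanking) and the refresh rate; here is a minimal sketch of the arithmetic:

```python
def pixel_rate_mpx(h_total: int, v_total: int, refresh_hz: float) -> float:
    # Total pixels per frame (including blanking) x frames per second
    return h_total * v_total * refresh_hz / 1e6

# CVT-R2 timings used over DisplayPort: 2000 x 1157 totals at 144 Hz
print(pixel_rate_mpx(2000, 1157, 144))  # 333.216 -> the 333.2 Mpx/s above

# Custom timings used over HDMI: 2026 x 1157 totals at 144 Hz
print(pixel_rate_mpx(2026, 1157, 144))  # ~337.5, i.e. the ~337 Mpx/s figure
                                        # above (small rounding differences)
```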

    The EDID reports a maximum pixel clock of 340 Mpx/s, the highest allowed by HDMI 1.4. The 1080p 144 Hz format is defined within the CTA-861 extension block.

    The EDID is the same on both of the XG2401's HDMI ports, and 1080p 144 Hz works on both ports.
    Verification
    Of course, it is possible that the monitor is simply skipping frames, or failing to truly operate at 144 Hz in some other way. Some form of verification would be desirable.
    Verification by Oscilloscope
    This is measured using a Keysight EDUX1002A oscilloscope and a Texas Instruments TSL14S light-to-voltage converter. A pattern of alternating black and white frames was generated by the blurbusters flicker test (https://testufo.com/flicker). Since oscilloscopes are designed for measuring oscillating waveforms, a set of one white frame and one black frame is counted as a single "wave" (indicated by the two vertical orange lines marking the boundary of "one wave"). For this reason, the frequency displayed on the scope is half the actual refresh frequency, and the displayed period is twice the actual refresh period. In this case, 71.79 Hz indicates 71.79 sets of black-white transitions (2 frames) per second, for a total of 143.58 frames per second.
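    For anyone who wants to reproduce the arithmetic, the conversion from the scope reading to the refresh rate is a simple doubling; a quick sketch:

```python
# The scope counts one white frame + one black frame as a single "wave",
# so its reported frequency is half the display's actual refresh rate.

def refresh_rate_from_scope(scope_frequency_hz: float) -> float:
    return scope_frequency_hz * 2  # two display frames per scope period

print(refresh_rate_from_scope(71.79))  # 143.58 frames per second
```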
    Verification by High-Speed Camera
    This is a high-speed video of the blurbusters frame skipping test (https://testufo.com/frameskipping) shot with a Casio Exilim ZR100 at 1,000 FPS. Each frame of video represents 1 ms of real time. The video is played back at 30 FPS, meaning that every 1 second of video shows 30 ms of time. At 144 Hz, the display refreshes at intervals of 6.9444 ms. This means that we should see slightly more than 4 refreshes per second of video, which the video does show. This can also be verified more precisely by examining the video frame by frame and counting 7 frames between each refresh. We can also observe that the display is operating properly, without any frame skipping.
  3. Glenwing
    This post contains an analysis of the v2 G-Sync module's features and behavior. This analysis was performed with a Dell S2417DG, but is not intended to be a review of the monitor itself.
     
    Input / Output
    This G-Sync module supports a single DisplayPort 1.2 input and a single HDMI 1.4 input. G-Sync is only supported over DisplayPort.
     
    This monitor supports up to 165 Hz at 2560×1440 through DisplayPort. The timing parameters used by this monitor can be viewed here: https://linustechtips.com/main/gallery/album/4127-dell-s2417dg-edid-and-timing-parameters/
     
    The DisplayPort EDID on this monitor reports a vertical frequency range of 30–165 Hz and a maximum bandwidth of 19.2 Gbit/s (640 Mpx/s with 8 bpc RGB color), just enough for the maximum format (2560×1440 @ 165 Hz 8 bpc RGB), which operates at a pixel rate of 635 Mpx/s, requiring 19.07 Gbit/s of bandwidth (about 88% of the 21.6 Gbit/s provided by the DisplayPort 1.2 interface).
     
    The HDMI EDID reports a vertical frequency range of 24–60 Hz, and a maximum TMDS clock of 300 MHz (9.0 Gbit/s). This indicates support for around 88% of the 10.2 Gbit/s limit specified by the HDMI 1.4 standard. The maximum format (2560×1440 @ 60 Hz 8 bpc RGB) uses standard CVT-RB timings by default, for a pixel rate of 241.5 Mpx/s and 7.24 Gbit/s bandwidth consumed, about 80% of the monitor's reported maximum.
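    For reference, here is a minimal sketch of the bandwidth arithmetic used in these posts (my own illustration; bandwidth here is counted including the encoding overhead, i.e. 10 wire bits per 8 data bits on both DisplayPort and HDMI TMDS):

```python
def bandwidth_gbit(pixel_rate_mpx: float, bpc: int = 8) -> float:
    # 3 color channels x bpc data bits, carried as 10 wire bits per 8 data bits
    wire_bits_per_pixel = 3 * bpc * 10 / 8  # 30 bits for 8 bpc RGB
    return pixel_rate_mpx * wire_bits_per_pixel / 1000

print(bandwidth_gbit(635))    # ~19.05 Gbit/s: 1440p 165 Hz over DisplayPort
print(bandwidth_gbit(241.5))  # ~7.24 Gbit/s: 1440p 60 Hz CVT-RB over HDMI
print(bandwidth_gbit(300))    # 9.0 Gbit/s: the EDID's 300 MHz TMDS limit
```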
     
    DisplayPort Behavior
    Unfortunately, the G-Sync module carries the same behavioral flaws that other DisplayPort monitors have. When the monitor is powered down, the operating system considers the display disconnected, and will re-shuffle application windows and icons to the remaining screens. However, this particular monitor has a "Power Saving" option which, when disabled, prevents this behavior. When "Power Saving" is off, the monitor can be powered down without disconnecting from the operating system, and applications will not be moved around. I don't know whether other G-Sync monitors have a similar menu option.
     
    This behavior does not occur with DVI or HDMI (in general, but also including the HDMI port on the G-Sync module), since DVI and HDMI supply a small amount of power from the source to read the sink EDID of the connected device even when it is powered down, which allows the operating system to still recognize the display. DisplayPort does not allow power to be transmitted from source to sink, as the DP_PWR pin is only intended for use by attached devices (such as adapters). Presumably, when the "Power Saving" option on this monitor is disabled, the monitor keeps its internal control chip powered up to some extent even when the monitor is off.
     
    HDMI Limitations
    The HDMI port on the v2 G-Sync module has a flat 60 Hz limit regardless of resolution or bandwidth. This is even more restrictive than normal.
     
    There are many other 144 Hz monitors (particularly older ones) which are limited to ≈60 Hz at full resolution over HDMI, such as the ASUS VG248QE or Acer GN246HL. However, in those monitors, it is usually due to a simple bandwidth limit of the hardware, and high refresh rates can still be achieved at lower resolutions like 720p. This is not the case with the G-Sync monitor.
     
    The G-Sync module seems to have a software restriction which enforces a strict 60 Hz limit over HDMI at all resolutions, regardless of bandwidth. The monitor does work at up to 60 Hz at 2560×1440 over HDMI, so the hardware is capable of at least that much bandwidth (7.24 Gbit/s), but attempting higher refresh rates at a lower resolution such as 1080p 120 Hz, 100 Hz, or even 75 Hz results only in a black screen, despite the fact that 1080p 100 Hz and 75 Hz use less bandwidth than 1440p 60 Hz.
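    To put rough numbers on it (the 1440p 60 Hz figure is from the EDID section above; the 1080p figures are my own approximate CVT-RB estimates, for illustration only):

```python
# Approximate CVT-RB pixel rates in Mpx/s; the 1080p values are rough
# estimates of my own, not measured figures.
formats = [
    ("2560 x 1440 @ 60 Hz",  241.5, "works"),
    ("1920 x 1080 @ 120 Hz", 285.0, "black screen"),
    ("1920 x 1080 @ 100 Hz", 235.0, "black screen (needs LESS than 1440p 60 Hz)"),
    ("1920 x 1080 @ 75 Hz",  175.0, "black screen (needs far less)"),
]
for name, mpx, result in formats:
    print(f"{name}: ~{mpx} Mpx/s -> {result}")
```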
     
    This limitation is not due to GPU scaling, as one might suspect. (GPU scaling rescales the image to 1440p before transmitting it across the cable, so the transmitted video would always be 1440p no matter what resolution is selected, and would therefore be subject to the monitor's 60 Hz maximum for 1440p over HDMI.) Although display-side scaling is not supported over DisplayPort for some mysterious reason, it is supported over HDMI, and I made sure it was enabled when I tested >60 Hz formats, but it did not help.
     
    This is an unfortunate and seemingly needless software restriction, and I must confess this is the first time I've seen a flat 60 Hz limit enforced by software rather than by a hardware limitation.
     
    Can AMD graphics cards run a G-Sync monitor at full refresh rate?
    There has been some concern in the past as to whether G-Sync monitors will be limited to 60 Hz when using AMD graphics cards. Unsurprisingly, there are not very many people with the means to test this, as most people with G-Sync monitors don't have AMD graphics cards lying around (or vice versa), and there don't seem to be any reviewers who have seen any reason to test it either (at least to my knowledge).
     
    Fortunately, I have an AMD RX 480 on hand, so I have tested it and found that this monitor (the Dell S2417DG) works perfectly fine up to its maximum overclock of 1440p 165 Hz on AMD cards. G-Sync, of course, is not supported, but there does not appear to be any restriction requiring you to have an NVIDIA graphics card to achieve the full resolution and refresh rate of a G-Sync monitor.
    https://i.imgur.com/EIrj9jN.png
     
    G-Sync Behavior
     
    G-Sync behavior at low frame rates
    G-Sync operates from 0 Hz to the maximum refresh rate of the monitor (in this case, 0–165 Hz). Some people are under the impression that G-Sync has a "minimum range" below which it does not operate, such as 30–165 Hz. This is untrue, and comes from people incorrectly assuming that G-Sync stops operating once the framerate drops below the monitor's physical operating limits. Although monitors do have a minimum refresh frequency, usually around 24–30 Hz, G-Sync does continue to operate below the monitor's physical limit by duplicating frames. This technique is visually indistinguishable from single long frames, so there are no disadvantages caused by this behavior. Using this technique, G-Sync can operate at any framerate below the monitor's maximum refresh frequency, even at extremely low framerates.
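    As an illustration of the frame-duplication idea (this is my own sketch of the logic, not NVIDIA's actual algorithm):

```python
def panel_scan_rate(frame_rate_fps: float, panel_min_hz: float = 30.0) -> float:
    # Repeat each frame enough times that the physical scan rate stays
    # at or above the panel's minimum refresh frequency.
    assert frame_rate_fps > 0
    repeats = 1
    while frame_rate_fps * repeats < panel_min_hz:
        repeats += 1
    return frame_rate_fps * repeats

# At 18.5 FPS with a ~30 Hz panel minimum, each frame is scanned twice:
print(panel_scan_rate(18.5))  # 37.0 Hz physical refresh, 18.5 FPS content
```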
     
    I demonstrate this on the S2417DG here, where you can see G-Sync continuing to operate at around 18.5 FPS:
     
    Does G-Sync work through a DisplayPort daisy-chain?
    No. I tested this monitor daisy-chained from a Dell U2414H. The S2417DG was recognized, and worked at up to 1440p 120 Hz (higher refresh rates are not available since it exceeds the bandwidth limitations of DP 1.2 when combined with the 1080p 60 Hz U2414H). However, it was not recognized as a G-Sync monitor, and the G-Sync (and ULMB) options were missing from the NVIDIA control panel.
     
    Does G-Sync work through a DisplayPort MST hub?
    No. I tested this monitor through an Accell K088B-004B two-port DisplayPort 1.2 MST hub. The S2417DG was recognized, and worked at up to 1440p 165 Hz. However, it was not recognized as a G-Sync monitor, and the G-Sync (and ULMB) options were missing from the NVIDIA control panel.
     
    ULMB Behavior
     
    ULMB Overview
    ULMB (Ultra-Low Motion Blur) is NVIDIA's implementation of backlight strobing built into G-Sync monitors. Backlight strobing is a method of reducing perceived motion blur by eliminating the "sample-and-hold" behavior of LCDs. It makes the screen behave in a manner more similar to CRTs, where the image fades to black shortly after it is drawn. This changes the way that the human eye tracks motion, resulting in less perceived motion blur. Backlight strobing does reduce the maximum brightness of the monitor significantly, since the screen only spends a fraction of the time illuminated, which reduces the total light output of the monitor.
     
    Similar to PWM brightness control, backlight strobing can cause noticeable flickering if the strobing is done at low frequencies. Usually 85 Hz is the recommended minimum for strobing, since that is the level at or above which most people stop noticing flicker; this is also why 85 Hz was a standard refresh frequency in the days of CRTs.
     
    PWM brightness control does not achieve the same effect as backlight strobing because the pulses are not synchronized with the monitor's refresh operations, and PWM brightness control usually operates at a much higher frequency than backlight strobing does.
     
    NVIDIA's backlight strobing implementation, ULMB, is only available at 85 Hz, 100 Hz, and 120 Hz. It cannot be activated at other refresh frequencies. ULMB uses single strobes, so at an 85 Hz refresh rate the backlight strobes at 85 Hz, and so forth.
     
    For technical reasons, ULMB is not compatible with variable refresh technologies like G-Sync. The user must choose between either ULMB or G-Sync, they cannot be used at the same time.
     
    Relationship between ULMB Pulse Width setting and actual pulse width
    Monitors often give settings in unitless quantities. The most universal example of this is the "brightness" setting, which most monitors allow you to adjust between "0" and "100", but with no indication of what these numbers actually represent, other than arbitrary relative values.
     
    Since these settings usually go between 0 and 100, many people use the term "percent" when discussing these settings (i.e. "I set the monitor to 50% brightness"). However, some people will recognize that these numbers do not actually represent a percentage of the maximum setting, otherwise a brightness setting of "0" would leave the monitor completely dark. This being the case, a brightness setting of 50 is not actually half as bright as the 100 setting and so forth; in reality, the setting follows an arbitrary (and in some cases, non-linear) scale which differs from display to display, so it can be informative to measure the actual values of these types of settings.
     
    In this case, the subject of discussion is the ULMB pulse width setting. Naturally, the "100" setting does not equate to a 100% pulse width (which would mean no strobing at all), so I decided to measure the strobe at various settings to determine the actual pulse width, and to see how it reacts when the setting is adjusted. Since ULMB is available at three different refresh frequencies, I performed the tests on all three to see if that affected the behavior too.
     
    The ULMB Pulse Width setting does behave differently at different refresh rates; neither the pulse width nor the duty cycle remains the same. The setting is variable between "10" and "100", in increments of 1. The pulse width responds linearly to the setting, meaning that each decrease of 1 in the setting decreases the pulse width by the same amount every time. When set to 100, the pulse width is twice as long as when set to 50, and ten times as long as when set to 10.
     
    Pulse width is often represented in terms of the duty cycle, which is the pulse width as a percentage of the total period. For example, at 100 Hz a single period would be ¹⁄₁₀₀ seconds or 10 ms. A duty cycle of 20% would mean 20% of that period (2 ms) would be spent with the backlight active, and the remaining 80% (8 ms) would be spent off.
     
    Results:
    At 120 Hz, the pulse width was configurable between 2.22% (185 µs) at pulse width setting "10", and 22.1% (1.84 ms) at pulse width setting "100". At 100 Hz, the pulse width was configurable between 2.44% (244 µs) at "10", and 24.1% (2.41 ms) at "100". At 85 Hz, the pulse width was configurable between 3.03% (356 µs) at "10" and 30.1% (3.55 ms) at "100". Actual measurements at every interval of 10 may be viewed here: https://linustechtips.com/main/gallery/album/4129-dell-s2417dg-ulmb-pulse-width-measurements/
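    To make the relationship concrete, here is a quick sketch relating refresh rate, duty cycle, and pulse width, using the measurements above:

```python
def pulse_width_ms(refresh_hz: float, duty_cycle_percent: float) -> float:
    period_ms = 1000 / refresh_hz  # duration of one refresh period
    return period_ms * duty_cycle_percent / 100

print(pulse_width_ms(120, 22.1))  # ~1.84 ms  (setting "100" at 120 Hz)
print(pulse_width_ms(120, 2.22))  # ~0.185 ms (setting "10" at 120 Hz)
print(pulse_width_ms(85, 30.1))   # ~3.54 ms  (setting "100" at 85 Hz)
```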
     
    Brightness reduction when using ULMB
    Lowering the strobe duty cycle will reduce the total light output of the monitor, which reduces the overall brightness. Brightness is directly proportional to strobe duty cycle; cutting the duty cycle in half will cut the brightness in half. Since the duty cycle scales linearly with the monitor's ULMB Pulse Width setting, the brightness will also scale linearly with it.
     
    Since the monitor uses DC brightness control, it has a "100% duty cycle" when not in ULMB mode. Activating ULMB will reduce the brightness significantly from the monitor's maximum, since the duty cycle will drop to 30% or less. This is not as much of a problem as it might sound, since the monitor has a powerful backlight capable of excessively high brightness (well over 400 cd/m2), presumably for this exact reason. Even 20% of maximum brightness will be enough for most users, and most people will not have the brightness set anywhere near maximum in normal mode. The monitor also keeps separate brightness settings when switching between normal and ULMB mode.
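    As illustrative arithmetic only (assuming a full-on output of roughly 400 cd/m2, per the estimate above):

```python
def ulmb_brightness_cdm2(full_on_cdm2: float, duty_cycle_percent: float) -> float:
    # Light output scales directly with the fraction of time the backlight is on.
    return full_on_cdm2 * duty_cycle_percent / 100

print(ulmb_brightness_cdm2(400, 22.1))  # ~88 cd/m2 at 120 Hz, setting "100"
```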
     
    Can ULMB be used with AMD graphics cards?
    No. The ULMB settings in the monitor's internal menu are greyed out in any situation where the monitor isn't recognized as a G-Sync monitor, including when the monitor is attached to an AMD graphics card. ULMB must be enabled through the NVIDIA control panel, and the monitor will not show up in the NVIDIA control panel unless the monitor is plugged into an NVIDIA graphics card.
     
    Does ULMB work through a DisplayPort daisy-chain?
    No. I tested this monitor daisy-chained from a Dell U2414H. The S2417DG was recognized, and worked at up to 1440p 120 Hz (higher refresh rates are not available since it exceeds the bandwidth limitations of DP 1.2 when combined with the 1080p 60 Hz U2414H). However, it was not recognized as a G-Sync monitor, and the G-Sync (and ULMB) options were missing from the NVIDIA control panel. The ULMB settings were also greyed out in the monitor's internal menu.
     
    Does ULMB work through a DisplayPort MST hub?
    No. I tested this monitor through an Accell K088B-004B two-port DisplayPort 1.2 MST hub. The S2417DG was recognized, and worked at up to 1440p 165 Hz. However, it was not recognized as a G-Sync monitor, and the G-Sync (and ULMB) options were missing from the NVIDIA control panel. The ULMB settings were also greyed out in the monitor's internal menu.
     
  4. Glenwing
    The AOC G2460PF supports HDMI 1.4. I will now demonstrate it operating at 1920 × 1080 @ 120 Hz over HDMI. These tests are performed with an NVIDIA GeForce GTX 780 Ti, which also only supports HDMI 1.4.
     
    Display Settings Demonstration
    These settings show the G2460PF (whose EDID identifies it as the "2460G4"; Windows, however, does not read the name) connected via HDMI at 1920 × 1080 @ 120 Hz with full RGB color. A custom resolution was necessary to expose the 120 Hz option (CVT-RB timing was used, with a resulting pixel clock of 285 Mpx/s). Without custom resolutions, only options up to 60 Hz were available. Higher formats such as 144 Hz were also attempted, but failed. The monitor's HDMI port appears to support a maximum TMDS clock of approximately 300 MHz.
    Timing Parameters and EDID
    The EDID on this monitor reports a maximum of 170 Mpx/s, around the same as the maximum limit of SL-DVI or HDMI 1.2 (165 Mpx/s). However, in practice, the monitor's hardware works up to around 300 Mpx/s. Several custom resolutions were attempted. 1920 × 1080 @ 120 Hz worked with both CVT-RB timing (285 Mpx/s) and CTA-861 timing (297 Mpx/s), but anything above this point resulted in a black screen with a floating "Input Not Support" text. I attempted 1920 × 1080 @ 144 Hz at 317 Mpx/s without success, and even 138 Hz with a pixel rate of 304 Mpx/s (shown below) was rejected.
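    As a small sketch, here are the attempted formats against the ~300 Mpx/s limit this hardware appears to have (the exact cutoff is somewhere between 297 and 304 Mpx/s; 300 is my round approximation):

```python
# Attempted 1080p formats on the G2460PF's HDMI port and their pixel rates.
attempts = [
    ("120 Hz, CVT-RB timing",  285),  # accepted
    ("120 Hz, CTA-861 timing", 297),  # accepted
    ("138 Hz",                 304),  # rejected ("Input Not Support")
    ("144 Hz",                 317),  # rejected
]

LIMIT_MPX = 300  # approximate observed hardware limit
for name, mpx in attempts:
    verdict = "works" if mpx <= LIMIT_MPX else "black screen"
    print(f"1920 x 1080 @ {name}: {mpx} Mpx/s -> {verdict}")
```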

    This monitor makes a good demonstration for two important points:
    1. The maximum limit of an HDMI device can be any arbitrary limit that the manufacturer decides, or that the hardware is capable of. It is not simply "a device can support either HDMI 1.4 speed (340 Mpx/s) or be limited to HDMI 1.2 speed (165 Mpx/s)", or anything like that. The limitations can be anything, and may differ on every individual model.
    2. The limits listed in the EDID are simply values typed in by the manufacturer. The EDID does not have some method of magically detecting the actual hardware capabilities of the display. The EDID limits therefore do not necessarily represent the capabilities of the actual hardware.
    Verification
    Of course, it is possible that the monitor is simply skipping frames, or failing to truly operate at 120 Hz in some other way. Some form of verification would be desirable.
    Verification By Oscilloscope
    This is measured using a Keysight EDUX1002A oscilloscope and a Texas Instruments TSL14S light-to-voltage converter. A pattern of alternating black and white frames was generated by the blurbusters flicker test (https://testufo.com/flicker). Since oscilloscopes are designed for measuring oscillating waveforms, a set of one white frame and one black frame is counted as a single "wave" (indicated by the two vertical orange lines marking the boundary of "one wave"). For this reason, the frequency displayed on the scope is half the actual refresh frequency, and the displayed period is twice the actual refresh period. In this case, 60.00 Hz indicates 60 sets of black-white transitions (2 frames) per second, for a total of 120.00 frames per second. This demonstrates flawless 120 Hz operation.
    Verification By High-Speed Camera
    This is a high-speed video of the blurbusters frame skipping test (https://testufo.com/frameskipping) shot with a Casio Exilim ZR100 at 1,000 FPS. Each frame of video represents 1 ms of real time. The video is played back at 30 FPS, meaning that every 1 second of video shows 30 ms of time. At 120 Hz, the display refreshes at intervals of 8.333 ms. This means that we should see slightly fewer than 4 refreshes per second of video, which the video does show. This can also be verified more precisely by examining the video frame by frame and counting 8–9 frames between each refresh. We can also observe from this video that the display is operating properly, without any frame skipping.
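    Here is the arithmetic behind this verification as a short sketch, using the figures above:

```python
capture_fps = 1000   # each video frame represents 1 ms of real time
playback_fps = 30    # each second of playback shows 30 ms of real time
refresh_hz = 120

refresh_interval_ms = 1000 / refresh_hz
print(refresh_interval_ms)       # 8.333... ms between refreshes

# Video frames between consecutive display refreshes:
print(capture_fps / refresh_hz)  # 8.333..., i.e. 8-9 frames

# Display refreshes visible per second of playback (30 ms of real time):
print(playback_fps * (1000 / capture_fps) / refresh_interval_ms)  # 3.6
```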
    High-Speed Camera Complete Demonstration
    Just for good measure, this video shows the display operating at 1920 × 1080 @ 120 Hz over HDMI with the frame skipping test in a single take at 1,000 FPS.
  5. Glenwing
    "My laptop only has an HDMI port. I want to connect to the DisplayPort input on my monitor. Can I use this inexpensive DP to HDMI adapter I found on Amazon?"
     
    To answer this question, we must apply some reading skills to the product listing: adapters like this are directional. They convert from a DisplayPort source to an HDMI display only, not the other way around, so they cannot connect a laptop's HDMI output to a monitor's DisplayPort input.
    No, you can't.