
HDMI 2.0 is "no longer referenced", everything is now HDMI 2.1 [Edit: HDMI 2.1a]

sounds

Summary

A new display from Xiaomi, the "Xiaomi Fast LCD Monitor 24.5-inch 240Hz version", sets a new low for the industry.

 

Edit: HDMI 2.1a has been published, adding Source-Based Tone Mapping.

 

Quotes

Quote

The display that prompted this investigation was from Chinese manufacturer mi.com, and was their so-called “Xiaomi Fast LCD Monitor 24.5-inch 240Hz version”. This is a 24.5” sized display with a 1080p resolution and 240Hz refresh rate. At the bottom of their product page they list inclusion of 2x HDMI 2.1 ports:

 

However, beneath that at the bottom of the page, hidden in the terms and conditions, they then say the following (translated): “Due to the subdivision of HDMI certification standards, HDMI 2.1 is divided into TMDS (with bandwidth equivalent to the original HDMI 2.0) and FRL protocols. The HDMI 2.1 interface of this product supports the TMDS protocol, the maximum supported resolution is 1920×1080, and the maximum refresh rate is 240Hz.”

...

We contacted HDMI.org who are the “HDMI Licensing Administrator” to ask some questions about this new standard, seek clarification on several questions we had and discuss the Xiaomi display we mentioned above. Here is what we were told:

  1. HDMI 2.0 no longer exists, and devices should not claim compliance to v2.0 as it is not referenced any more
  2. The features of HDMI 2.0 are now a sub-set of 2.1
  3. All the new capabilities and features associated with HDMI 2.1 are optional (this includes FRL, the higher bandwidths, VRR, ALLM and everything else)
  4. If a device claims compliance to 2.1 then they need to also state which features the device supports so there is “no confusion” (hmmmm)

So according to what they have told us, this means that in theory all devices with an HDMI 2.x connection should now be labelled as HDMI 2.1, even though at the end of the day they may only offer the capabilities of the older HDMI 2.0 generation. With HDMI 2.0 certification now discontinued, these are basically 2.0 devices hiding under the replacement banner name of 2.1. They don’t have to offer any new capabilities whatsoever, but they are still supposed to be labelled as 2.1 it seems.

My thoughts

As if USB 3.1 gen 1 and USB 3.2 gen 2, USB type C cables, Thunderbolt 3 vs. Thunderbolt 4 weren't all bad enough.

 

Is HDMI 2.1 now meaningless?
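
To make that TMDS/FRL fine print concrete, here's a minimal sketch (the bandwidth figures are the headline maxima from the HDMI specs; the mapping itself is just for illustration):

```python
# The two signalling protocols an "HDMI 2.1" port may use.
# TMDS tops out where HDMI 2.0 did; FRL is the actual new 2.1 link.
HDMI_SIGNALLING = {
    # protocol: (max link bandwidth in Gbps, rough capability)
    "TMDS": (18.0, "HDMI 2.0-era; enough for 4K60 8-bit RGB"),
    "FRL":  (48.0, "new in 2.1; 4K120, 8K60 (RGB 8K60 needs DSC)"),
}

def describe_port(protocols):
    """Print what a port can actually do, regardless of its '2.1' label."""
    for proto in protocols:
        gbps, note = HDMI_SIGNALLING[proto]
        print(f"{proto}: up to {gbps:.0f} Gbps - {note}")

# The Xiaomi monitor's "HDMI 2.1" ports, per its own fine print:
describe_port(["TMDS"])
```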

 

Sources

https://tftcentral.co.uk/articles/when-hdmi-2-1-isnt-hdmi-2-1

https://www.techspot.com/news/92606-fake-hdmi-21-doesnt-really-bother-hdmi-licensing.html

https://today.in-24.com/technology/722383.html


Not surprising; everybody ignores standards until they actually need them.

Imagine the amount of times the USB connector has been used for something that's NOT USB.

This is why we have standards: to stop this stupidity, since it only confuses others and leads to misinformation about the standard.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


56 minutes ago, sounds said:

As if USB 3.1 gen 1 and USB 3.2 gen 2, USB type C cables, Thunderbolt 3 vs. Thunderbolt 4 weren't all bad enough.

 

Yer... TB3 I think is the worst of these, since you could put the label on it and only needed to support a tiny sub-set of the spec. At least TB4 is more strict: it is basically TB3 but with all the optional things (like power delivery, charging, display support, etc.) required.

HDMI 2.1 has been a little bit like TB3 in that it has already had a load of items on its spec sheet that are optional.


Quite dumb; even worse than USB, really.

2.1 is a big upgrade over 2.0, yet it's also weird that they named 2.0 as a big milestone when it wasn't that big an upgrade over the 1.x versions. 2.1 is a bigger jump over 2.0, so they could actually have named it 3.0. But yeah, like we see: a mess.

 

At least it's good that monitors are mainly DP-focused while HDMI is TV-focused.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


So, how my mind thinks this played out:

 

  • Corporations: "HDMI.org, this is bullshit! We can't sell cheap-o crap anymore because everyone wants 2.1! It's the new "cool" thing. We have old stock to sell and overcharge, goddamnit"
  • HDMI.org: "not my problem, the standard is the standard"
  • Corporations: "Ah come on, surely you can... you know... (insert digital tran$action noises here) do something"
  • HDMI.org: "Oh hey would you look at that, now everything is 2.1! How did that happeeeeennn..."

 

So yes, HDMI 2.1 is now pointless.


19 hours ago, sounds said:

Is HDMI 2.1 now meaningless?

I expect this to be more of a problem for cables (and possibly for some GPUs); monitors will necessarily support the bandwidth required to run them...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


30 minutes ago, Doobeedoo said:

2.1 is a big upgrade over 2.0, yet it's also weird that they named 2.0 as a big milestone when it wasn't that big an upgrade over the 1.x versions.

2.0 was the first to support 4k60. 1.x only supported up to 4k30.
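
For anyone wondering where those cutoffs come from, a back-of-the-envelope check (assuming the standard CTA-861 4K raster of 4400x2250 total pixels and the published per-spec TMDS clock limits):

```python
# Does a video mode fit within an HDMI generation's max TMDS clock?
TMDS_MAX_CLOCK_MHZ = {"HDMI 1.4": 340.0, "HDMI 2.0": 600.0}

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh."""
    return h_total * v_total * refresh_hz / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(4400, 2250, hz)  # 297 MHz at 30 Hz, 594 at 60
    for gen, limit in TMDS_MAX_CLOCK_MHZ.items():
        verdict = "fits" if clk <= limit else "too fast"
        print(f"4K{hz} ({clk:.0f} MHz) on {gen}: {verdict}")
```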

 

At the time it was a very significant upgrade.

 

Comparatively, most people didn't care about 2.1 for a long time. Very few people actually care about the features of HDMI 2.1 in the grand scheme of things - it's only really since the release of the current-gen consoles that it's become an actually useful feature. Heck, it was first seen on a GPU only with the RTX 3000 series, despite the spec being released back in 2017.

 

It wouldn't surprise me if this renaming is step 1 of a plan that will end up with a new HDMI 3.0 spec being released. A reshuffling of the stack if you will. After all it's been 3 years now since HDMI 2.1 was released - right around the time that we should expect something new to play with.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


HDMI 2.1 never meant a lot, to be honest. So far there are only a few devices with HDMI 2.1 that actually support all of its features. Most implementations are either not full bandwidth, lack DSC, or omit other features. My LG C9 is still one of the rare cases of a full-bandwidth HDMI 2.1 port with 48Gbps. Most TVs that use 2.1 come in at 40Gbps; most monitors even lower. The Gigabyte M28U is only 24Gbps because it uses DSC for the higher-bandwidth signals. This only allows 4K 60Hz on the PlayStation 5, because it doesn't support DSC, while PCs and the Xbox can push 4K 120/144Hz through the HDMI connection. But so far manufacturers have been honest enough to actually state what the HDMI ports are capable of when they branded them as 2.1.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


13 minutes ago, tim0901 said:

2.0 was the first to support 4k60. 1.x only supported up to 4k30.

 

At the time it was a very significant upgrade.

 

Comparatively, most people didn't care about 2.1 for a long time. Very few people actually care about the features of HDMI 2.1 in the grand scheme of things - it's only really since the release of the current-gen consoles that it's become an actually useful feature. Heck, it was first seen on a GPU only with the RTX 3000 series, despite the spec being released back in 2017.

 

It wouldn't surprise me if this renaming is step 1 of a plan that will end up with a new HDMI 3.0 spec being released. A reshuffling of the stack if you will. After all it's been 3 years now since HDMI 2.1 was released - right around the time that we should expect something new to play with.

Wasn't DP already there for it?

Either way, HDMI is mostly for TVs, and with current consoles and new TVs going hand in hand, it's definitely something to want full support for when looking.

 

But I expect HDMI to continue being mainly targeted toward TVs while DP will be for monitors.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


1 hour ago, Doobeedoo said:

Wasn't DP already there for it?

Either way, HDMI is mostly for TVs, and with current consoles and new TVs going hand in hand, it's definitely something to want full support for when looking.

 

But I expect HDMI to continue being mainly targeted toward TVs while DP will be for monitors.

It was, but as you said: what TV has DisplayPort? I'm sure they exist, but HDMI is definitely king there.

 

Non-techy consumers in general also just don't really know that DisplayPort exists. Every display cable is an HDMI cable, the same way that every Android is a Samsung.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


53 minutes ago, tim0901 said:

It was, but as you said: what TV has DisplayPort? I'm sure they exist, but HDMI is definitely king there.

 

Non-techy consumers in general also just don't really know that DisplayPort exists. Every display cable is an HDMI cable, the same way that every Android is a Samsung.

Yeah, I was mainly thinking about monitors. Though for sure this is just another mess in the TV space, like HDR is.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Exactly the same shit that USB is pulling. Manufacturers complain about not being able to advertise the latest standard, so the standards org just renames everything as a way to intentionally mislead consumers.

 

It defeats the whole purpose of having a "standard".


15 minutes ago, Glenwing said:

I've written a post here to clarify some things on this topic, for anyone interested.

Mixed up version 2.1 of the spec with "HDMI 2.1" hardware?

 

And is what you want not about the "cable version", but about the supported features as a standard?


11 hours ago, Glenwing said:

I've written a post here to clarify some things on this topic, for anyone interested.

 

https://linustechtips.com/blogs/entry/2046-on-the-topic-of-hdmi-version-numbers/

I've mostly skimmed it (Holy walls of text, Batman), but honestly, it just makes me even more pissed off, at both manufacturers and whichever entity currently controls the HDMI standard.

So either:

  • They are just not bothering to enforce their own rules because f**k the consumer
  • They are in cahoots with the manufacturers to confuse consumers, and just have that line in there to cover their asses and point to the manufacturers

Either way, it feels predatory. The end result is that consumers who are not that tech-savvy are going to buy products expecting the latest and greatest features and end up with something that doesn't live up to those expectations.


Basically any manufacturer can grab any item they used to market as "HDMI 1.4", rebrand it as "HDMI 2.0", and now rebrand it as "HDMI 2.1", and if the consumer ain't happy about it, too bad! They are following the specifications!

Even though they actually aren't, because they shouldn't be advertising the specification version at all - but the HDMI governing bodies don't give a crap about that!

 

Just feels incredibly infuriating...

 

But again, mostly skimmed it so maybe I've missed a key detail here or there.


14 hours ago, Quackers101 said:

Mixed up version 2.1 of the spec with "HDMI 2.1" hardware?

 

And is what you want not about the "cable version", but about the supported features as a standard?

If I understand correctly, the point is more that there is not, and never has been, a "cable version" or "port version". The "HDMI 2.1" version is simply the version of the specification itself at this moment in time. If you were to build an HDMI cable or device, you follow the specification, which tells you what features you must or can support. Those rules and guidelines happen to be at version 2.1 right now. They gave a good analogy with their electrical code example.

2 hours ago, Rauten said:

Basically any manufacturer can grab any item they used to market as "HDMI 1.4", rebrand it as "HDMI 2.0", and now rebrand it as "HDMI 2.1", and if the consumer ain't happy about it, too bad! They are following the specifications!

Even though they actually aren't, because they shouldn't be advertising the specification version at all - but the HDMI governing bodies don't give a crap about that!

This feels like an annoying case of "they're technically correct". I feel like the "HDMI 1.4" or whatever marketing could still make some sense, in that it may be a helpful detail for those in the know, indicating which specification it was built to, just like with e.g. the electrical code. I agree that they should include a list of which functionality of the marketed specification they support, though. Or better yet, have a certification similar to the one they have for cables that indicates a certain set of features.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


On 12/13/2021 at 11:20 AM, sounds said:

Summary

A new display from Xiaomi, the "Xiaomi Fast LCD Monitor 24.5-inch 240Hz version", sets a new low for the industry.

 

Quotes

My thoughts

As if USB 3.1 gen 1 and USB 3.2 gen 2, USB type C cables, Thunderbolt 3 vs. Thunderbolt 4 weren't all bad enough.

 

Is HDMI 2.1 now meaningless?

 

Sources

https://tftcentral.co.uk/articles/when-hdmi-2-1-isnt-hdmi-2-1

https://www.techspot.com/news/92606-fake-hdmi-21-doesnt-really-bother-hdmi-licensing.html

https://today.in-24.com/technology/722383.html

 

Honestly, what you're seeing is "products that are not licensed as HDMI 2.1".

 

Cables must support all HDMI 2.1 features, but devices do not have to if that's not a feature of that device. It's like saying a 1080p240 TV must support 8K. It might down-scale it if it has the logic for it, but realistically the TV is going to tell the device it only supports 1080p and to send only a 1080p image.
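
A toy sketch of that negotiation: the sink (TV/monitor) advertises the modes it supports via its EDID, and the source simply picks from that list. The mode strings here are made up for illustration:

```python
# What the display reports in its EDID (a 1080p240 panel):
SINK_MODES = ["1920x1080@240", "1920x1080@120", "1920x1080@60"]

def pick_mode(source_modes):
    """The source sends the best mode the sink also advertises."""
    for mode in source_modes:  # source's preference order, best first
        if mode in SINK_MODES:
            return mode
    raise RuntimeError("no common mode")

# An 8K-capable source still ends up sending 1080p240 to this display:
print(pick_mode(["7680x4320@60", "3840x2160@120", "1920x1080@240"]))
```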

 

It's more of a lot of nothing. Basically, a device can say the port is mechanically and electrically HDMI 2.1, but that doesn't mean the device supports any 2.1 features.

 

 

 


On 12/14/2021 at 11:26 AM, dilpickle said:

It defeats the whole purpose of having a "standard".

"There is no standard standard."


Watching this week's WAN Show, I saw this.

 

This is yet another example of the entire display industry's incredibly misleading advertisements and spec sheets.


All I can say to those surprised by this is... welcome to the display industry. It's a sh*t show, and the reason why I eventually made a guide in the Display section.

The amount of times I had to repeat myself to normal people asking for advice with displays - going over and over explaining that they need to more or less ignore advertised specs, why, and then what they should look for and where - was crazy.

 

While I would prefer LTT to create a full-blown review website for tested reviews, a la rtings etc., the idea they have of simply posting 'real' specs vs advertised specs is a good middle ground.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Huge win for manufacturers, huge loss for consistency and clarity. Gonna have to pore over them spec sheets now.


Honestly, who cares?

Who buys a display by only looking at the version of HDMI it has? Might as well buy a CPU by only looking at which socket it uses.

 

 

Also, since I've seen a lot of people bring up USB, I'd like to point out that the official naming from the USB-IF is actually super clear and easy to follow. Here are the official guidelines for how USB ports should be marked and described according to the USB 3.2 specifications:

Quote

The USB 3.2 specification absorbed all prior 3.x specifications. USB 3.2 identifies three transfer rates, USB 3.2 Gen 1 at 5Gbps, USB 3.2 Gen 2 at 10Gbps and USB 3.2 Gen 2x2 at 20Gbps. It is important that vendors clearly communicate the performance signaling that a product delivers in the product’s packaging, advertising content, and any other marketing materials.

 

USB 3.2 Gen 1

  • Product capability: product signals at 5Gbps
  • Marketing name: SuperSpeed USB

USB 3.2 Gen 2

  • Product capability: product signals at 10Gbps
  • Marketing name: SuperSpeed USB 10Gbps

USB 3.2 Gen 2x2

  • Product capability: product signals at 20Gbps
  • Marketing name: SuperSpeed USB 20Gbps

 

If companies like motherboard manufacturers actually used the naming standards written in the USB specifications, nobody would have any issues knowing what speed their USB ports functioned at.

 

This is what Asus' website looks like:

[Image: screenshot of Asus' spec page listing its USB ports]

 

Not that hard to follow, but more complicated than it needs to be, since you need to know that Gen 1 means 5Gbps, Gen 2 means 10Gbps, and Gen 2x2 means 20Gbps.
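
That mental lookup, as a toy snippet (the names are taken from the USB-IF guidelines quoted above):

```python
# Spec-internal "Gen" labels -> official USB-IF marketing names.
USB32_NAMES = {
    "Gen 1":   ("5Gbps",  "SuperSpeed USB"),
    "Gen 2":   ("10Gbps", "SuperSpeed USB 10Gbps"),
    "Gen 2x2": ("20Gbps", "SuperSpeed USB 20Gbps"),
}

def marketing_name(gen_label):
    speed, name = USB32_NAMES[gen_label]
    return f"{name} ({speed})"

print(marketing_name("Gen 2x2"))  # SuperSpeed USB 20Gbps (20Gbps)
```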

 

This is what Asus' website would look like if they followed the naming conventions laid out in the USB specs:

[Image: mock-up of the same spec page using the official USB marketing names]

 

A bit more text, and reading "SuperSpeed" so many times is annoying, but it becomes way clearer which ports run at what speed.

USB is confusing because component and device manufacturers make it confusing, not because the specs make it confusing.


Does this affect DisplayPort, and DP ports forwarding an HDMI signal?

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


21 hours ago, LAwLz said:

The USB 3.2 specification absorbed all prior 3.x specifications. USB 3.2 identifies three transfer rates, USB 3.2 Gen 1 at 5Gbps, USB 3.2 Gen 2 at 10Gbps and USB 3.2 Gen 2x2 at 20Gbps. It is important that vendors clearly communicate the performance signaling that a product delivers in the product’s packaging, advertising content, and any other marketing materials.

I fully agree vendors should communicate clearly, but I also think we're at the point where, if the standards bodies care about manufacturers reporting the correct things, they should investigate ways of making this better, or even enforce it, instead of just saying "well, the manual says they shouldn't". Clearly both USB and HDMI don't want people using the version numbers. Why can't they then enforce that, and make e.g. passing the USB 3.2 Gen 2 certification mean that you have to report it as SuperSpeed 10 Gbps (or whatever they define), as in your example?

 

21 hours ago, LAwLz said:

USB is confusing because component and device manufacturers make it confusing, not because the specs make it confusing.

So if I understand USB correctly, then "USB 3.2" also doesn't actually exist, given how they "absorbed" 3.0 into 3.1 years ago and don't want you to market it as such. Yet they still list 2.0 alongside 3.2 and 4 prominently at the bottom of their homepage and, confusingly(?), call it the "USB X specification", while they also (to me) seem to imply "USB specification X". HDMI is not communicating clearly on this front either, using both "HDMI 2.1 specification" and "version 2.1 of the HDMI specification" on their website.

 

I feel we might be a little biased with USB as well since, at least for me personally, it doesn't really matter whether your USB "3.2" port does 5 or 10 Gbps for most things. Your new HDMI "2.1" device not being capable of 4K 120 Hz RGB, because that's optional and yours isn't full bandwidth, is quite an unwelcome surprise. Non-gamers may have the reverse argument, though. Both are equally bad/confusing from a technical perspective. I think it might be good if HDMI expanded their certification for cables to ports as well, and maybe took a cue from USB by having certain marketing terms available for certain sets of features. This could be that one case where "Gaming-ready" could be a genuine, unironic label for something. Or even just a literal "38Gbps", like with "SuperSpeed 5 Gbps". Or better yet, just make version X specs with no optional features, so that versions can mean something relevant to the consumer again.

15 hours ago, williamcll said:

Does this affect DisplayPort, and DP ports forwarding an HDMI signal?

Maybe? DP versioning and capabilities seem to be a similar mess.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


1 hour ago, tikker said:

I fully agree vendors should communicate clearly, but I also think we're at the point where, if the standards bodies care about manufacturers reporting the correct things, they should investigate ways of making this better, or even enforce it, instead of just saying "well, the manual says they shouldn't". Clearly both USB and HDMI don't want people using the version numbers. Why can't they then enforce that, and make e.g. passing the USB 3.2 Gen 2 certification mean that you have to report it as SuperSpeed 10 Gbps (or whatever they define), as in your example?

I think it would be good if the USB-IF mandated proper labelling of ports rather than just having it as a suggestion. I'm not sure why they don't, but when I was writing that post I went to different motherboard manufacturers' websites and, guess what, they list the info necessary to figure out what speeds the ports run at.

As you can see in the screenshot above, Asus for example lists the generation of each port, and those generations correspond to different bandwidths, just like "USB 3.0" and "USB 3.1" would have corresponded to different speeds.

 

 

 

1 hour ago, tikker said:

So if I understand USB correctly, then "USB 3.2" also doesn't actually exist, given how they "absorbed" 3.0 into 3.1 years ago and don't want you to market it as such. Yet they still list 2.0 alongside 3.2 and 4 prominently at the bottom of their homepage and, confusingly(?), call it the "USB X specification", while they also (to me) seem to imply "USB specification X". HDMI is not communicating clearly on this front either, using both "HDMI 2.1 specification" and "version 2.1 of the HDMI specification" on their website.

USB 3.2 exists. USB 3.2 is the latest document detailing how to implement the third major version of USB and its various transfer modes, which currently range from 5Gbps to 20Gbps. If they make future changes to the USB 3 standard, they will most likely publish a document called USB 3.3 which replaces the old document for how to implement USB.

 

USB 3.2 can operate in different modes.

  • A 5Gbps mode which is described in the spec as "Gen 1" and should be marketed as SuperSpeed USB.
  • A 10Gbps mode which is described in the spec as "Gen 2" and should be marketed as SuperSpeed USB 10Gbps.
  • Another 10Gbps mode which is described in the spec as "Gen 1x2" (which is just the Gen 1 spec but using twice as many wires, thanks to USB-C) and should be marketed as SuperSpeed USB 10Gbps.
  • A 20Gbps mode which is described in the spec as "Gen 2x2" (which is just the Gen 2 spec but using twice as many wires, thanks to USB-C) and should be marketed as SuperSpeed USB 20Gbps.

 

All of these are described in the USB 3.2 document. By the way, did you notice that there were two 10Gbps modes? In USB 3.1, there was only one. This means that USB 3.2 introduced a new way of implementing the old maximum transfer speed of 10Gbps that existed in USB 3.1. This is one of the reasons why you can't just tie the revision number to a specific transfer speed like people want. I think a lot of people want USB to look like this:

USB 3.0 = 5Gbps

USB 3.1 = 10Gbps

USB 3.2 = 20Gbps

 

But that would not work, because the 10Gbps mode in USB 3.2 might not work the same way the 10Gbps USB 3.1 mode does. A USB 3.2 device operating in 10Gbps mode might try to send 5Gbps over 2 lanes, while a USB 3.1 device might want to send 10Gbps over a single lane.

Also, different USB standards use different encoding schemes that result in different effective data transfer rates. USB 10Gbps cannot send 10Gbps of actual data. 10Gbps USB has an effective bandwidth of 1 or 1.212 GB/s, depending on which encoding is used.
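
Those two figures fall straight out of the encoding overhead; a quick sketch (protocol and packet overhead ignored):

```python
# Effective throughput of the two 10Gbps USB 3.2 modes, which differ
# only in their line encoding.
def effective_gb_per_s(line_rate_gbps, payload_bits, total_bits):
    """Usable bytes per second after line-encoding overhead."""
    return line_rate_gbps * payload_bits / total_bits / 8

# Gen 1x2: two 5Gbps lanes with 8b/10b encoding (as in USB 3.0)
print(effective_gb_per_s(10, 8, 10))      # 1.0 GB/s
# Gen 2: one 10Gbps lane with 128b/132b encoding
print(effective_gb_per_s(10, 128, 132))   # ~1.212 GB/s
```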

 

 

 

1 hour ago, tikker said:

Your new HDMI "2.1" device not being capable of 4K 120 Hz RGB, because that's optional and yours isn't full bandwidth, is quite an unwelcome surprise.

But is this going to actually be a thing?

If someone builds a monitor that's 4K 120Hz, then do people honestly expect them to cheap out on the HDMI connector so that it becomes impossible to drive it at the full spec?

I could kind of see it on devices that output HDMI, but:

1) Most devices already describe the maximum output resolution and frame rate alongside the HDMI version. If I'm interested in knowing the maximum video output of a device, I will look up the "maximum output resolution" section in the specs. I won't go looking for which HDMI version it uses and then hopefully remember that "HDMI version X corresponds to Y bandwidth, which means it can drive a monitor with Z resolution".

2) We already have a problem with devices overselling their capabilities in scenarios where this might matter, like how the Xbox One was touted as a "4K console" because it could technically display 4K resolutions, but in reality not even the menus ran at that resolution, and certainly no games did either.

 

 

I just feel like people don't really get why standards are written the way they are; they blame poor choices by manufacturers on the standards bodies, and they are getting riled up over something that quite frankly doesn't matter.

If someone makes a portable SSD that can do 20Gbps data transfers then they will most likely implement the 20Gbps version of the USB 3.2 port, not the 5Gbps version. In that case, it doesn't matter that both 5Gbps and 20Gbps modes are in the USB 3.2 spec.

If someone makes a 4K 120Hz monitor then they will implement the "full speed" HDMI 2.1 version, not the one limited to whatever HDMI 2.0 does.

If someone is looking at a laptop and wants to know the maximum output resolution, it will be the same scenario as it is right now. Either the manufacturer doesn't even specify which version of HDMI the port is (Apple for example just says "HDMI" on their spec page, not HDMI 2.0, even though that's what it is), or they will list the maximum output.

 

 

Anyway, a super long and rambly post. My point is that the specs are written the way they are for a reason, and everyone complaining is just making a mountain out of a molehill.

If you are technically literate enough to know that "in order to use a 4K monitor at 120Hz you need HDMI 2.1", then you are also technically literate enough to actually go to the product page and look for whatever detail you are interested in. The average Joe that tech-bros are claiming to want to protect from sleazy, misleading manufacturers is not affected by this change, because they barely know what HDMI is, let alone that HDMI version 2.1 is what's required for 4K 120Hz.


17 minutes ago, LAwLz said:

USB 3.2 exists. USB 3.2 is the latest document detailing how to implement the third major version of USB and its various transfer modes, which currently range from 5Gbps to 20Gbps. If they make future changes to the USB 3 standard, they will most likely publish a document called USB 3.3 which replaces the old document for how to implement USB.

If it's about the document, then it means USB 3.2 indeed doesn't exist. It means we are currently at version 3.2 of the USB spec, and that when 3.3 comes around, every 3.x port out there will be 3.3.

19 minutes ago, LAwLz said:

But is this going to actually be a thing?

If someone builds a monitor that's 4K 120Hz, then do people honestly expect them to cheap out on the HDMI connector so that it becomes impossible to drive it at the full spec?

It's already happening. My old ASUS VG248QE had HDMI 1.4, which "should" be capable of 1920x1080@120Hz, but that unit is limited to 60 Hz over HDMI. Two current examples that pop into my head are LG's CX/C1 and Lenovo's Legion 5. The HDMI 2.1 ports on LG's CX and C1 are only 40 Gbps instead of the full 48. On the other hand, they do explicitly list e.g. VRR support, so that's nice. Lenovo's Legion 5 laptops list "HDMI 2.1" without further specification, but nobody seems to be able to get a full 10-bit 4K 120 Hz 4:4:4 RGB signal to work. The RTX 3000 series should be capable of that, so their HDMI port is <40 Gbps. Then there's the monitor in this topic. I do think monitors typically list their supported resolutions and refresh rates in the manual nowadays, so the information isn't completely absent, but there are still expectations after seeing a version number.

36 minutes ago, LAwLz said:

If someone makes a 4K 120Hz monitor then they will implement the "full speed" HDMI 2.1 version, not the one limited to whatever HDMI 2.0 does.

That's what the monitor at the start of this thread did: saying "HDMI 2.1" but being built to the 2.0 spec, and thus completely not "2.1".

39 minutes ago, LAwLz said:

I just feel like people don't really get why standards are written the way they are; they blame poor choices by manufacturers on the standards bodies, and they are getting riled up over something that quite frankly doesn't matter.

I agree that most don't understand the standards and their versioning, but I do blame the companies and the marketing around it as well.

1 hour ago, LAwLz said:

If someone makes a portable SSD that can do 20Gbps data transfers then they will most likely implement the 20Gbps version of the USB 3.2 port, not the 5Gbps version. In that case, it doesn't matter that both 5Gbps and 20Gbps modes are in the USB 3.2 spec.

If someone makes a 4K 120Hz monitor then they will implement the "full speed" HDMI 2.1 version, not the one limited to whatever HDMI 2.0 does.

If someone is looking at a laptop and wants to know the maximum output resolution, it will be the same scenario as it is right now. Either the manufacturer doesn't even specify which version of HDMI the port is (Apple for example just says "HDMI" on their spec page, not HDMI 2.0, even though that's what it is), or they will list the maximum output.

That's true. If you just saw "USB 3.2" on your SSD, though, you wouldn't know what speed it had. That's more the issue here. Even if Apple now said "HDMI 2.1" on their port, that would tell you nothing about what it can do, because there is no "HDMI 2.1 port". There is only an HDMI port that was built according to, or meets, the HDMI specification currently at version 2.1, with the features Apple wanted.

45 minutes ago, LAwLz said:

Anyway, a super long and rambly post. My point is that the specs are written the way they are for a reason, and everyone complaining is just making a mountain out of a molehill.

If you are technically literate enough to know that "in order to use a 4K monitor at 120Hz you need HDMI 2.1", then you are also technically literate enough to actually go to the product page and look for whatever detail you are interested in.

But the product pages aren't listing that currently. They just say HDMI 1.4, HDMI 2.0 or HDMI 2.1 and call it a day, which technically is all wrong. I can see that it may be a bit of a mountain-molehill situation, and ultimately it stems from people and companies wrongly using the USB and HDMI versions to describe the things we mean. I think Glenwing's blog entry touched nicely on that: you can't build an "HDMI 2.0 port" anymore, because there is no HDMI 2.0 anymore. However, if a customer sees HDMI 1.4/2.0/2.1, or USB 3.2, or any other versioned thing, they think it means a certain set of features. Instead, what it actually means is nothing more than "it satisfies the basic requirements, and maybe some of the optional ones, of this list of possible features that was at version 1.4/2.0/2.1 at the moment of building".

 

This whole "it's the spec, not the port" debate probably sounds pedantic, but I do think it would help if the versions we see on products met the expectation of telling us about functionality.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB

