
Why is HDMI (seemingly) the industry standard vs Displayport?

Hey all! After watching the Techquickie video on the various display options available, I don't understand why HDMI is found on everything from TVs and sound systems to consoles, basically anything that isn't a PC. Not having to pay royalties seems like a no-brainer reason for companies to put DisplayPort on their products... to me, at least. The only explanation I was able to find is that HDMI is an older standard. Any insight, links, or whatever is greatly appreciated!


Older, plus (years ago) backwards compatible with DVI via passive adapters. For many years it was good enough for PC/TV users.

 

Also - better standards rarely win.

 

 

 i7-6950x @4.5GHz | ASRock X99 Taichi | 64GB @3200MHz CL15 | RTX 2080Ti -> RTX 3080 
 Samsung 970 Evo Plus 2TB | PNY CS3030 500GB | Kingston A2000 500GB | Crucial MX500(GB) 
 FD Define R4 | HR-02 MACHO Rev. C EU 
 EVGA G2 850W Gold | Win 10 Pro x64 
 DELL U2715H | Acer Predator XB271HU | DELL U2715H 

 


While DisplayPort is superior to HDMI, HDMI was first.

Because of that, by the time DisplayPort was released, HDMI had already garnered worldwide support and popularity, and it was more than enough at the time.

My launch-model PlayStation 3 from 2006 has HDMI and supports 1080p at 60 FPS, so there was no reason at the time to switch.

 

In short: The popularity of HDMI is hindering the adoption of DisplayPort.

A PC Enthusiast since 2011
AMD Ryzen 5 2600@4GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2040MHz Memory 5000MHz
Cinebench R15: 1382cb | Unigine Superposition 1080p Extreme: 3439

Well, since HDMI is everywhere, and some of the major differences between monitors and the better TVs are starting to fade away with 4K, it can be hard to find DisplayPort on consumer-level monitors.

58 minutes ago, Vishera said:

In short: The popularity of HDMI is hindering the adoption of DisplayPort.

I find that a curious way of putting it...

 

From an observer's viewpoint, VGA was the in thing when I started, until DVI took over on the digital side. Somewhere along the line HDMI appeared, which was compatible with single-link DVI, so that helped with compatibility between them.

 

I think my first encounter with DP was with a wide-gamut monitor used for photography. I don't recall any benefit to picking DP or HDMI on that. The first time it made a difference was when I went ultrawide: at 3440x1440, DP offered the display's 60 Hz, but HDMI only had the bandwidth for 50 Hz. The HDMI of the time was showing its limits. It only supported 4k30, so DP was also showing its superiority there, allowing 4k60. I went G-Sync, which was a DP-only technology until "G-Sync Compatible" kinda happened, allowing it to work on some TVs. My current TV has 4 HDMI ports. There's still often a compatibility gap, in that both ends have to support the newer version to enable the newer features.
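The bandwidth gap described above is easy to sanity-check with back-of-the-envelope math. This is a rough sketch, not spec-exact: the link limits below are the approximate usable data rates of HDMI 1.4/2.0 and DP 1.2 after 8b/10b encoding overhead, and real signals also carry blanking intervals that add roughly 10-20% on top of the raw pixel payload.

```python
# Rough uncompressed video bandwidth estimate. Illustrative only: real
# links also transmit blanking intervals, so actual requirements run
# ~10-20% above the raw pixel payload computed here.

def min_gbps(width, height, refresh_hz, bits_per_channel=8):
    """Minimum payload bandwidth in Gbit/s for an uncompressed RGB signal."""
    bits_per_pixel = 3 * bits_per_channel          # R, G and B channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Approximate maximum usable data rates of the standards discussed above:
LINK_LIMITS_GBPS = {
    "HDMI 1.4": 8.16,   # 10.2 Gbit/s TMDS minus 8b/10b overhead
    "HDMI 2.0": 14.4,   # 18 Gbit/s TMDS minus 8b/10b overhead
    "DP 1.2":   17.28,  # HBR2: 21.6 Gbit/s minus 8b/10b overhead
}

def fits(width, height, refresh_hz):
    """Which links can carry the payload (before blanking overhead)."""
    need = min_gbps(width, height, refresh_hz)
    return {name: need <= cap for name, cap in LINK_LIMITS_GBPS.items()}

print(fits(3440, 1440, 60))   # ultrawide 60 Hz
print(fits(3840, 2160, 60))   # 4k60: needs HDMI 2.0 or DP 1.2
```

3440x1440 at 60 Hz needs about 7.1 Gbit/s of payload; once blanking is added that lands right around HDMI 1.4's ceiling, which is consistent with the 50 Hz cap described above, while DP 1.2 clears it comfortably.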

 

More of my devices support HDMI than DP, so in a way I wish more GPUs would have two native HDMI outs, regardless of how many DP outputs they also have.

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Gigabyte 2070, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p144 G-sync
TV Gaming system: Gigabyte Z490 Elite AC, i5-10600k, Noctua D15, Kingston HyperX RGB 4000@3600 2x8GB, EVGA 2080Ti Black, EVGA 850W, Corsair 230T, Crucial P1 1TB + MX500 1TB, LG OLED55B9PLA 4k120 G-Sync Compatible
Streaming system: Asus X299 TUF mark 2, i9-7920X, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Asus Strix 1080Ti, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, Crucial BX500 1TB, BenQ XL2411 1080p144 + HP LP2475w 1200p60
Former Main system: Asus Maximus VIII Hero, i7-6700k, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, Acer RT280k 4k60 FreeSync [link]
Gaming laptop: Asus FX503VD, i5-7300HQ, DDR4 2133 2x8GB, GTX 1050, Sandisk 256GB + 480GB SSD [link]


 

6 minutes ago, Riccardo Cagnasso said:

If we go down that route, why aren't we using SDI for everything?

Because there are useful features like the ability to send commands through HDMI and DisplayPort cables, dedicated network wires, wires to communicate with the monitor and determine its capabilities, etc.

 

DVI was cool for its time, but there was a definite need for smaller connectors with fewer limitations... the connector itself was kinda bad, with a low insertion count; problematic.

 

DisplayPort is better, but until the majority of products have DisplayPort, a video card will continue to have HDMI... For example, a TV will continue to have HDMI because there are so many devices (game consoles, DVD players, Blu-ray players, HD cameras, etc.) that only have HDMI or YPbPr (component) output, so the cheapest TVs may have only HDMI to save pennies on DisplayPort connectors.
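The "wires to communicate with the monitor and determine capabilities" part works the same way on both connectors: the display exposes its capabilities as a 128-byte EDID block, read over the DDC/I2C wires on HDMI and DVI and over the AUX channel on DisplayPort. Here's a minimal sketch of how a source device might sanity-check one; the sample block built at the end is hypothetical, not dumped from a real monitor.

```python
# A base EDID block is exactly 128 bytes: a fixed 8-byte header,
# capability data, and a final checksum byte chosen so that all
# 128 bytes sum to 0 mod 256.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_valid(edid: bytes) -> bool:
    """Check the fixed header and the mod-256 checksum of a base EDID block."""
    return (len(edid) == 128
            and edid[:8] == EDID_HEADER
            and sum(edid) % 256 == 0)

# Build a hypothetical block: header, an all-zero payload standing in
# for real capability bytes, and a correcting checksum byte.
payload = EDID_HEADER + bytes(119)
block = payload + bytes([(-sum(payload)) % 256])
print(edid_valid(block))   # True
```

A source that reads garbage over the capability wires (bad cable, flaky adapter) fails exactly this kind of check, which is one reason "no signal" problems show up on both HDMI and DP.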

 


Have you ever used DP? It's clunky as all hell compared to HDMI.

I can't even access my BIOS while connected over DP; the screen stays black. I had to connect my monitor to my PC over both HDMI and DP because of that.

 

DP has some neat features and all, but god damn it can be annoying sometimes. There are times I have to turn my monitor off and back on after booting my PC, because it seemingly didn't realize the computer was powered on and kept showing no signal.

HDMI just works. You plug it in, you instantly get a picture.

CPU: AMD Ryzen 3600 / GPU: Radeon HD7970 GHz 3GB(upgrade pending) / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


HDMI arrived earlier than DisplayPort did. HDMI was finalized in 2002 and started appearing on devices like D-Theater tape players and cable boxes in late 2004 and early 2005. DisplayPort wouldn't even make its introduction until the end of 2007 on a Dell monitor (the very same one I have, funny enough), with GPUs that supported it coming later in 2008. 

Local dickhead and VHS collector. Collector of random copies of Fantavision and Gran Turismo 3: A-spec.


 

 

@handymanshandle x @pinksnowbirdie | Jake x Brendan :^


HDMI drove out SCART in the consumer market, that's how old HDMI is!

 

It is a fairly solid standard and I don't see it going anywhere for a long time.

 

Why add DisplayPort to the mix when we have something that just works?

 

I certainly think DisplayPort is great, but it isn't needed outside GPU-to-monitor connections.

i5 8600 - RX580 - Fractal Nano S - 1080p 144Hz

1 hour ago, mariushm said:

because there's useful features like ability to send commands through the hdmi and displayport cables - snip -

 

To my knowledge, there's an extensive "control" protocol over SDI.


Because you are talking about two completely different industries.

 

Home theater and PCs have had nothing to do with each other until very recently.

 

Even now, the electronics companies that talk about gaming support have no clue what they are doing. Just look at the fiasco with AVRs not working with the new Xbox, or LG TVs not working with new Nvidia cards even when they had a specific cross-promotion between the two companies.


DP's only "real" benefit is higher bandwidth for higher resolutions and frame rates, something that doesn't matter to the huge majority of people.

 

You could maybe argue that it should have been on the XBSX and PS5, but given that most people are going to connect them to a TV, there's no reason to put DP on them.

 

Judge the product by its own merits, not by the Company that created it.

 

Don't dilute <good thing> by always trying to focus on, and drag conversation back to, <bad thing>.


On the other hand, why do seemingly all video cards come with 3 DP outs and only 1 HDMI out, when all I'll ever use is HDMI?

RYZEN 5 3600 | GIGABYTE 3070 VISION OC | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 4TB TOSHIBA X 300 | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 

11 minutes ago, Mark Kaine said:

On the other hand, why do seemingly all video cards come with 3 DP outs and only 1 HDMI out, when all I'll ever use is HDMI?

See, that's "Big GPU's" way of getting you to buy more GPUs... one GPU per display xD xD jk!

 

Probably because they target gamers who, until quite recently, didn't really use TVs (I guess most gamers use monitors?)

edit: clear up wording

3 minutes ago, Pumpkin said:

See, that's "Big GPU's" way of getting you to buy more GPUs... one GPU per display xD xD jk!

 

Probably because they target gamers who, until quite recently, didn't really use TVs (I guess most gamers use monitors?)

Uhh, there are monitors with HDMI, not just TVs.

 

Mine is analog, so I'm using a converter to plug it into the HDMI out, but the downside is that I can't plug in anything else, like a TV or projector, without using, you've guessed it, more converters.

1 minute ago, Caroline said:

uhh there are monitors with HDMI, not just TVs - snip -

Ah yes, sorry about that. I meant that the higher-end GPUs mostly tend to be used by enthusiasts, who probably tend to use DP monitors? Idk, just a theory. It would actually be quite an interesting topic for a YT video (like, is it a cost thing, contracts, or something else?)

But yeah, I know the pain of using adapters... I still have old monitors (DVI and VGA), so I have to use adapters for those... still cheaper than getting a whole new monitor setup, I guess.

59 minutes ago, Pumpkin said:

Probably because they target gamers - snip - 

See, that would be a good explanation in theory... the problem is I'm a gamer, and I've used a monitor for *everything* since the PS360 era, mainly because at the time TVs all had absolutely terrible input lag. And my monitor has *2* HDMI ins, 1 DP (maybe) and 1 DVI in...

 

So this is also a good example of how the industry designs things for a virtual "target audience", completely missing the actual needs of users, and then probably tries to make its failed design the de facto "standard" after the fact.

 

There's no reason why GPUs can't have 2 HDMI, 2 DP, and 1 DVI out to cover basically all needs, besides the aforementioned misunderstanding of the market and the stubbornness of "the industry" (basically dudes that "had an idea" lol).


1 hour ago, Mark Kaine said:

On the other hand, why do seemingly all video cards come with 3 DP outs and only 1 HDMI out, when all I'll ever use is HDMI?

Because it's cheaper. DisplayPort requires no licensing, period. It's a royalty-free standard, if I remember correctly.

HDMI requires licensing, and there's a fee for each HDMI port on a device, last I remember.


2 minutes ago, handymanshandle said:

Because it's cheaper. DisplayPort requires no licensing, period. It's a royalty-free standard, if I remember correctly.

HDMI requires licensing, and there's a fee for each HDMI port on a device, last I remember.

 

It keeps getting worse lol...

 

So they save a few cents on their $500+ cards, and consumers have to buy dumb adapters, or a new TV/monitor altogether, because of it.

 

At least some GPUs come with 2 HDMI outs, iirc...

 

Don't get me wrong, it's not a big deal for me because I only use one monitor anyway, but if I used more than that I'd be pissed lol...


1 minute ago, Mark Kaine said:

 

it keeps getting worse lol... - snip -

At the end of the day, only a small group is looking to use their GPU with more than one display, and an even smaller subset of that group has to use HDMI. It's simply more cost-efficient not to throw multiple HDMI ports on a card that doesn't need them.

Hell, I have to use DisplayPort on my RX 580, as my monitor predates the HDMI versions that can drive it at 2560x1600 at 60 Hz.
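For the curious, the arithmetic behind that: HDMI versions before 1.3 shared single-link DVI's 165 MHz TMDS pixel-clock ceiling, and 2560x1600 at 60 Hz needs well over that once blanking is included. A rough sketch (the 20% blanking figure below is a loose CVT-style estimate, not an exact timing):

```python
# Why 2560x1600@60 was out of reach for early HDMI: pre-1.3 versions
# topped out at a 165 MHz TMDS pixel clock, same as single-link DVI.

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.20):
    """Approximate pixel clock in MHz, including ~20% blanking overhead."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

clk = pixel_clock_mhz(2560, 1600, 60)
print(round(clk), "MHz,", "exceeds" if clk > 165 else "fits", "a 165 MHz link")
```

By the same estimate, 1920x1080 at 60 Hz comes in under 165 MHz, which is why 1080p worked fine even on the earliest HDMI ports.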


10 hours ago, tigalicious said:

also, in hindsight I could have put this in a better thread category... sorry :/

Moved to Displays.

 

FYI, you can use the Report button for any threads that need to be moved in the future. 

If you ever need help with a build, read the following before posting: http://linustechtips.com/main/topic/3061-build-plan-thread-recommendations-please-read-before-posting/
Also, make sure to quote a post or tag a member when replying or else they won't get a notification that you replied to them.

