
144hz only compatible with DVI?

39 minutes ago, kilrath81 said:

Pissed me off to find it out too. DP has been around a long time. I don't know if it was planned obsolescence or short-sightedness on their part. I don't know when the models came out. I bought my 144Hz DVI-D Acer only 2 years ago and now I can't really use it, but had it already been on the market for 4 or 5 years by then? They may have thought at the time that DVI-D was going to survive the test of time...

Look,

 

BenQ doesn't make any part of the monitor except maybe the plastic case and the firmware modifications; they just assemble monitors the way Dell and HP build computers out of parts.

The power supply is from some OEM like FSP or Delta or another cheaper one, the panel is from Samsung or LG or others, and the decoder board is made using various components from other companies.

 

Here's a possible scenario for you all:

 

Hey BenQ, we made this awesome digital decoder chip for high-refresh-rate monitors. We worked on it for 2 years and made 10k pieces. Unfortunately, we only now discovered a serious hardware bug in the DisplayPort interface, so we have to disable it and only allow dual-link DVI and HDMI.

It will take us around 3-4 months to get a new revision made and tested, so here's a proposition: take these 10k chips off our hands at 50% off the regular price ($25 each instead of $50, for example) so we recover at least the wafer costs, and we'll sweeten the deal by giving you priority purchase or exclusivity for 6-12 months and an extra 10% discount when the new revision of the chip is ready, at least 4 months from now.

 

BenQ tested the chips, the firmware was super stable with dual-link DVI, so they went for it: they got the decoder chips cheaper, so more profit on each monitor, etc.

 

Just a scenario.

 

It could also be that the controller chip simply didn't have DisplayPort in the first place. It could very well have been a 120 Hz controller chip overclocked by firmware to handle 144 Hz. Controller chips typically handled 120 Hz, since it's needed for 60 Hz 3D, so binning and overclocking some chips would not be impossible.
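Rough numbers back that up (reduced-blanking timing totals assumed, so treat the exact figures as ballpark): going from 120 Hz to 144 Hz at the same resolution is only a 20% higher pixel clock, which is plausible binning/overclocking territory.

```python
# Quick sanity check on the "overclocked 120 Hz controller" idea.
# 2080 x 1085 is an assumed reduced-blanking total for 1920x1080, not a datasheet figure.
clk_120 = 2080 * 1085 * 120 / 1e6   # ~271 MHz pixel clock for 1080p 120 Hz
clk_144 = 2080 * 1085 * 144 / 1e6   # ~325 MHz pixel clock for 1080p 144 Hz
print(f"120 Hz: ~{clk_120:.0f} MHz, 144 Hz: ~{clk_144:.0f} MHz "
      f"(+{100 * (clk_144 / clk_120 - 1):.0f}%)")
```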

 


10 minutes ago, Bombastinator said:

I thought I was getting that but when I looked at the type it was interpolated.

I was using HDMI to DVI, now that I think of it; maybe DP to DVI is different.


15 minutes ago, mariushm said:

Look,

 

BenQ doesn't make any part of the monitor except maybe the plastic case and the firmware modifications; they just assemble monitors the way Dell and HP build computers out of parts.

The power supply is from some OEM like FSP or Delta or another cheaper one, the panel is from Samsung or LG or others, and the decoder board is made using various components from other companies.

 

Yeah, I don't work in any part of that business, so it's all basically wizard-behind-the-curtain shit to me. From my limited point of view I would want my monitor to support as many input standards as possible, but on the other hand a lot of these 144Hz monitors that are now basically obsolete because of the DVI-D requirement were, from what I can tell, on the economy side. So I can see your argument for taking a cheaper route that supports 144Hz for the marketing but without the cost of additional input support. I guess in the end the blame is about 50/50 to me: we either bought older hardware or cheaper hardware to save money and lost the gamble, they didn't warn anyone, and anyone not in the industry won't know which standards are on the brink.

 

The other thing I don't understand is that these monitors typically say they are HDMI 1.4b, which technically is capable of 1080p 144Hz. But even if they weren't, they should still be able to do 120Hz, and I just couldn't get my Acer anywhere near that.

 

The scary thing to me is that you can still buy these monitors listed as 144Hz, and I have never seen a warning anywhere stating that you need a card with DVI-D to get the 144Hz capability.


2 hours ago, kilrath81 said:

Yeah, I don't work in any part of that business, so it's all basically wizard-behind-the-curtain shit to me. From my limited point of view I would want my monitor to support as many input standards as possible, but on the other hand a lot of these 144Hz monitors that are now basically obsolete because of the DVI-D requirement were, from what I can tell, on the economy side. So I can see your argument for taking a cheaper route that supports 144Hz for the marketing but without the cost of additional input support. I guess in the end the blame is about 50/50 to me: we either bought older hardware or cheaper hardware to save money and lost the gamble, they didn't warn anyone, and anyone not in the industry won't know which standards are on the brink.

 

The other thing I don't understand is that these monitors typically say they are HDMI 1.4b, which technically is capable of 1080p 144Hz. But even if they weren't, they should still be able to do 120Hz, and I just couldn't get my Acer anywhere near that.

 

The scary thing to me is that you can still buy these monitors listed as 144Hz, and I have never seen a warning anywhere stating that you need a card with DVI-D to get the 144Hz capability.

There are several standards that support it now. My monitor is quite old, though. If cards still had DVI-D DL, as it is apparently called, I would be fine. DisplayPort does it, and apparently so do some forms of HDMI. The problem seems to be that stuff that should convert it doesn't.


16 hours ago, Bombastinator said:

There are several standards that support it now. My monitor is quite old, though. If cards still had DVI-D DL, as it is apparently called, I would be fine. DisplayPort does it, and apparently so do some forms of HDMI. The problem seems to be that stuff that should convert it doesn't.

The issue is that conversion to DVI-D from any other standard seems to require active adapters. HDMI 1.4b and DP support high refresh rates of 144Hz+, but for one reason or another they decided that their HDMI ports have an imposed limit on them. My Acer, for instance, says 1.4b, which the standard says is capable of 1080p 144Hz, but they must have either locked that out at the hardware level, or there were issues and AMD/NVIDIA locked it out at the driver level. It seems stupid to put an HDMI 1.4b spec on the monitor, then install a chip that maybe has issues and lock it out to save a few cents.

 

On a side note, it seems there is no better time than now to buy a 144Hz monitor if you have DVI-D on your card. In the last 2 days I have seen the number for sale used jump 600%, and of course none of them say anything about DVI-D requirements in their ads, lol.


2 minutes ago, kilrath81 said:

The issue is that conversion to DVI-D from any other standard seems to require active adapters. HDMI 1.4b and DP support high refresh rates of 144Hz+, but for one reason or another they decided that their HDMI ports have an imposed limit on them. My Acer, for instance, says 1.4b, which the standard says is capable of 1080p 144Hz, but they must have either locked that out at the hardware level, or there were issues and AMD/NVIDIA locked it out at the driver level. It seems stupid to put an HDMI 1.4b spec on the monitor, then install a chip that maybe has issues and lock it out to save a few cents.

It's because they're very old monitors from around 2013. Pretty much all the 144 Hz monitors from that time (Acer GN246HL, ASUS VG248QE, BenQ XL2411, XL2420, etc.) had the same limitation, because that's the hardware that was available at the time from display controller manufacturers.

 

HDMI 1.4 allows up to 340 Mpx/s (1080p 144 Hz), but there's no requirement that all hardware must implement the maximum allowed. 1440p+ displays and >60 Hz displays were quite rare at the time, so there was virtually no demand for HDMI controllers capable of more than 150 Mpx/s or so (1080p 60 Hz).
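If you want to see where those figures come from, here's a rough sketch. The timing totals are approximations (real monitors use somewhat different blanking), and the "typical 2013-era monitor HDMI input" ceiling is an assumption based on how those scalers behaved, not something from a datasheet:

```python
# Rough sketch: compare the pixel clock a mode needs against nominal link ceilings.
# Timing totals below are approximations, not taken from any particular monitor.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1080p 60 Hz (standard CEA timing)":        pixel_clock_mhz(2200, 1125, 60),   # 148.5 MHz
    "1080p 144 Hz (reduced blanking, approx.)": pixel_clock_mhz(2080, 1085, 144),  # ~325 MHz
}

limits_mhz = {
    "single-link DVI / passive HDMI-DVI adapter": 165,  # one TMDS link
    "dual-link DVI": 330,                               # two TMDS links
    "HDMI 1.4 specification maximum": 340,              # allowed by the spec, not mandatory
    "typical 2013-era monitor HDMI input": 165,         # assumed, based on observed behaviour
}

for mode, clk in modes.items():
    print(f"\n{mode}: needs ~{clk:.1f} MHz")
    for link, ceiling in limits_mhz.items():
        print(f"  {link:44s} ({ceiling} MHz): {'fits' if clk <= ceiling else 'does not fit'}")
```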


8 minutes ago, Glenwing said:

It's because they're very old monitors from around 2013. Pretty much all the 144 Hz monitors from that time (Acer GN246HL, ASUS VG248QE, BenQ XL2411, XL2420, etc.) had the same limitation, because that's the hardware that was available at the time from display controller manufacturers.

 

HDMI 1.4 allows up to 340 Mpx/s (1080p 144 Hz), but there's no requirement that all hardware must implement the maximum allowed. 1440p+ displays and >60 Hz displays were quite rare at the time, so there was virtually no demand for HDMI controllers capable of more than 150 Mpx/s or so (1080p 60 Hz).

Then why advertise what might have been a bleeding-edge HDMI standard and not implement the support? I'm not a hardware guru, more like a severely handicapped golf player in comparison if anything, so maybe there is a great reason, but it seems to me you don't put an 800HP motor into a car and then restrict it down to 400HP...


9 minutes ago, kilrath81 said:

Then why advertise what might have been a bleeding-edge HDMI standard and not implement the support? I'm not a hardware guru, more like a severely handicapped golf player in comparison if anything, so maybe there is a great reason, but it seems to me you don't put an 800HP motor into a car and then restrict it down to 400HP...

I wholeheartedly agree, but it's what happened.

Part of the issue was that there were so many kinds of DVI that all looked the same, and it was expensive to implement.

 


1 hour ago, kilrath81 said:

Then why advertise what might have been a bleeding-edge HDMI standard and not implement the support? I'm not a hardware guru, more like a severely handicapped golf player in comparison if anything, so maybe there is a great reason, but it seems to me you don't put an 800HP motor into a car and then restrict it down to 400HP...

There is only one HDMI specification active at any given time; there really is no picking and choosing which "version" of HDMI you implement. There is no "should we implement the HDMI 1.2 or 1.4 feature set today?"; there is only "the HDMI specification", and in 2013 the latest edit to the HDMI specification was revision 1.4. All devices are built to comply with the latest version of the specification; you don't say "well, the latest additions don't affect our product, so ours is really a version 1.3 device".

 

The HDMI specification (whatever the latest version is) covers all possible implementations of HDMI, from the lowest (25 MHz) to the highest (340 MHz in HDMI 1.4), and all possible combinations of features. In effect, all HDMI devices are "compliant with HDMI 1.4", because it's just another way of saying "our HDMI device is compliant with the HDMI specification". The reality is that the use of "version numbers" for describing a device's features or capabilities is simply incorrect. There are no "HDMI version numbers" in hardware.

 

Basically the description of a monitor as "it has an HDMI 1.4b port" is actually completely meaningless. Any HDMI device compliant with the HDMI 1.2 specification is also compliant with the HDMI 1.4 specification, and is also compliant with the HDMI 2.0 specification. Which is exactly why using "version numbers" to describe HDMI devices was banned almost 10 years ago, although enforcement of that is another story :P
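What a given HDMI port can actually accept is advertised in its EDID (supported timings, plus an optional maximum TMDS clock in the HDMI vendor-specific data block), not by any "version number". A minimal sketch of reading that field, assuming you already have the raw EDID bytes; the Linux sysfs path is just an example:

```python
# Sketch: pull the advertised Max_TMDS_Clock out of a raw EDID dump.  The HDMI
# vendor-specific data block (VSDB) in the CEA-861 extension carries it in units
# of 5 MHz; if the field is absent or zero, the sink hasn't declared a limit.

HDMI_OUI = bytes([0x03, 0x0C, 0x00])  # IEEE OUI 00-0C-03, stored least significant byte first

def max_tmds_clock_mhz(edid):
    """Return the Max_TMDS_Clock in MHz from the HDMI VSDB, or None if not declared."""
    # Walk the 128-byte extension blocks looking for a CEA-861 extension (tag 0x02).
    for base in range(128, len(edid), 128):
        block = edid[base:base + 128]
        if len(block) < 4 or block[0] != 0x02:
            continue
        dtd_start = block[2]        # offset where detailed timing descriptors begin
        i = 4                       # data block collection starts at byte 4
        while i < dtd_start:
            tag = block[i] >> 5
            length = block[i] & 0x1F
            payload = block[i + 1:i + 1 + length]
            if tag == 0x03 and payload[:3] == HDMI_OUI:   # vendor-specific block with HDMI OUI
                if length >= 7 and payload[6] != 0:
                    return payload[6] * 5                 # field is in units of 5 MHz
                return None                               # VSDB present, but no limit given
            i += 1 + length
    return None

if __name__ == "__main__":
    # Example path only (Linux DRM sysfs); substitute however you dump the EDID.
    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        edid = f.read()
    clk = max_tmds_clock_mhz(edid)
    print("Max TMDS clock:", f"{clk} MHz" if clk else "not declared")
```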


1 hour ago, Glenwing said:

There is only one HDMI specification active at any given time; there really is no picking and choosing which "version" of HDMI you implement. There is no "should we implement the HDMI 1.2 or 1.4 feature set today?"; there is only "the HDMI specification", and in 2013 the latest edit to the HDMI specification was revision 1.4. All devices are built to comply with the latest version of the specification; you don't say "well, the latest additions don't affect our product, so ours is really a version 1.3 device".

 

The HDMI specification (whatever the latest version is) covers all possible implementations of HDMI, from the lowest (25 MHz) to the highest (340 MHz in HDMI 1.4), and all possible combinations of features. In effect, all HDMI devices are "compliant with HDMI 1.4", because it's just another way of saying "our HDMI device is compliant with the HDMI specification". The reality is that the use of "version numbers" for describing a device's features or capabilities is simply incorrect. There are no "HDMI version numbers" in hardware.

 

Basically the description of a monitor as "it has an HDMI 1.4b port" is actually completely meaningless. Any HDMI device compliant with the HDMI 1.2 specification is also compliant with the HDMI 1.4 specification, and is also compliant with the HDMI 2.0 specification. Which is exactly why using "version numbers" to describe HDMI devices was banned almost 10 years ago, although enforcement of that is another story :P

Great information, thanks. All things I didn't have any clue about so far. It's interesting how marketing departments try to use any bit of information, even if it's completely irrelevant, and go so far as to post what revision of what version just to make a product look better to an uneducated consumer. A lot are likely like me or even more clueless. I googled the 1.4b spec before I bought mine and, like an ass, assumed that 144Hz would work over both DVI-D and HDMI. These guys should have been clearer and stated that if you don't have DVI-D, you don't have 144Hz capability.


45 minutes ago, kilrath81 said:

Great information, thanks. All things I didn't have any clue about so far. It's interesting how marketing departments try to use any bit of information, even if it's completely irrelevant, and go so far as to post what revision of what version just to make a product look better to an uneducated consumer. A lot are likely like me or even more clueless. I googled the 1.4b spec before I bought mine and, like an ass, assumed that 144Hz would work over both DVI-D and HDMI. These guys should have been clearer and stated that if you don't have DVI-D, you don't have 144Hz capability.

Well, if you do have DVI-D DL (because single-link DVI-D doesn't do it either), you have it as long as you have both a card and a monitor that do DVI-D DL. The problem is there aren't any cards with DVI-D DL being made any more. There IS DisplayPort, BUT there is ONLY DisplayPort. And so far as I can find, it is more or less impossible to convert a 1440p 144Hz DisplayPort output to a DVI-D DL input, which is what basically everyone with an old high-performance digital monitor (such as myself) wants to do.
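Rough numbers hint at why (CVT reduced-blanking approximations assumed; the old 1440p DVI monitors already ran the link well past its official rating, which is presumably why no adapter maker wants to promise that conversion):

```python
# Sketch: why DisplayPort -> dual-link DVI at 2560x1440 144 Hz is such a problem.
# Blanking totals are CVT reduced-blanking approximations (assumed, not measured).
h_total, v_total = 2640, 1542             # 2560x1440 plus approximate blanking
dl_dvi_limit_mhz = 330                    # nominal dual-link DVI ceiling (2 x 165 MHz)

needed = h_total * v_total * 144 / 1e6                      # ~586 MHz for 144 Hz
max_in_spec = dl_dvi_limit_mhz * 1e6 / (h_total * v_total)  # highest refresh within the rating

print(f"1440p 144 Hz needs ~{needed:.0f} MHz; dual-link DVI is rated for {dl_dvi_limit_mhz} MHz")
print(f"Within that rating, 1440p only fits up to ~{max_in_spec:.0f} Hz")
```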

 

If there IS a way I would love to know it.
