
Monitor Manufacturers are missing the boat - The Rant

IPD

Let me start this off by saying that if you feel 1080p on a 20" monitor is too high of a resolution, you probably can skip this thread.  And apologies in advance, because this is probably going to take a bit.

 

PART 1 - Laptops

---

I'm an old school gamer.  I started in the era of PC gaming when 1280x960 was unobtanium, when 1024x768 was baller status, 800x600 / 640x480 was for the average joe, and poor folk had to settle for 400x300.  The bottleneck was ALWAYS the GPU.  Sure, there were improvements in RAM, in bus speed, etc.--but the GPU was always the limiting factor.  Anti-Aliasing (AA) was developed to help mitigate all the "jaggies" from the abysmal resolutions.  Anisotropic Filtering (AF) was also devised to help with this.  Yet the conventional wisdom was always the same--and had been for years: "No AA/AF at a higher resolution > AA/AF on at a lower resolution".

 

Then came the GeForce4 Ti 4600, and the ability to actually play (some) games in 1600x1200.  AGP was revolutionary for the time.  And I upgraded to a cherished 22" NEC with a glorious 1920x1440 resolution - pretty much the best available at that time.  Games like Serious Sam 1.0 looked phenomenally better in 1920x1440 than they did on the average 1024x768 monitor of the time.  And at that point, I didn't care about AA or AF (or their hit to performance) because the improvement in graphics was just so superior in every way.

 

Next up was the 16:10 migration.  And I gotta be honest, I was stoked.  I'm a huge believer in the golden rectangle.  16:10 is the closest aspect ratio to it (yes, even closer than 16:9).  I feel that the golden rectangle makes for the best overall, most aesthetically pleasing viewing experience.  My first laptop in this aspect ratio was 1280x800 (WXGA), but I was overjoyed when I was able to find a 1920x1200 (WUXGA) laptop.  In fact, even regular business laptops of this era often defaulted to WUXGA.  I was exceedingly hopeful that WQXGA would become the new mainstream standard.  That never happened.

 

And then came the unfortunate mess of 1080p.  Manufacturers concocted a way to fool consumers into thinking that 1080p was an upgrade (it was actually a downgrade of 120 horizontal lines of resolution).  The industry conspired to push 16:10 out entirely, and 1080p was the absolute highest resolution you could get on a laptop for many, many years.  There wasn't even an OPTION--not even one with an exorbitant price tag.  And then MSI answered the call: a 3k, 15.6" laptop (2880x1620).  I had to have one.  And I could IMMEDIATELY see the resolution difference (vs. a 1080p MSI laptop)--clear as day.  At that moment, I knew that everyone who said UHD was imperceptible on a 15.6" laptop was full of malarkey.

 

Now here comes the math, so hopefully I don't lose you:

(FHD) 1920x1080 - 15.6" = ~141 PPI

(3k) 2880x1620 - 15.6" = ~212 PPI

(4k / UHD) 3840x2160 - 15.6" = ~282 PPI

(WQHD) 2560x1440 - 15.6" = ~188 PPI

 

But just for the sake of comparison, let's assume your average smartphone has a viewing distance of roughly half that of a laptop (~5-12" for a smartphone, ~20-32" for desktop monitors; laptops fall somewhere in between).

1920x1080 - 6" = ~367 PPI

 

So we would assume that half that PPI, on a display twice as far away, would deliver a similar viewing experience.  That works out to ~184 PPI on a 15.6" display--clearly higher than 1080p's ~141.  And while I can't tell much difference between 3k and 4k on a 15.6" laptop, the jump from 1080p to 3k is quite visible--so this seems to bear out.  And since it doesn't make sense for manufacturers to offer a myriad of resolution options, jumping directly to 4k (rather than a 3k midrange option) to consolidate manufacturing costs is logical.  Everything above roughly 3.1k might be a perceptual waste, but it holds up from a manufacturing standpoint.
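For anyone who wants to poke at the numbers, here's a rough Python sketch of the PPI arithmetic above. The panel list and the "half the PPI at twice the distance" rule are just my assumptions from this post, not a vision-science model.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 15.6" laptop panels from the post, plus a 6" 1080p phone for comparison
panels = {
    'FHD 15.6"':    (1920, 1080, 15.6),
    '3k 15.6"':     (2880, 1620, 15.6),
    'UHD 15.6"':    (3840, 2160, 15.6),
    'WQHD 15.6"':   (2560, 1440, 15.6),
    'FHD 6" phone': (1920, 1080, 6.0),
}

for name, (w, h, d) in panels.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# Crude rule of thumb from the post: a screen viewed twice as far away only
# needs half the PPI to look about as sharp.
phone_ppi = ppi(1920, 1080, 6.0)   # ~367 PPI on a 6" 1080p phone
laptop_target = phone_ppi / 2      # ~184 PPI needed at roughly double the distance
print(f"Laptop PPI needed for phone-like sharpness: ~{laptop_target:.0f}")
```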

 

So why, then, are 2160p laptops virtually unobtanium with high-end GPUs?  This was the case even before the semiconductor shortage.  Trying to get an RTX 3070 and a 4k screen is an exercise in wallet abuse (let alone a 3080).  I can spec out a 3070 with a 1080p screen for a reasonable price, but trying to pair it with 4k (if it's even available) is somewhere north of a $600 upgrade.  Pure insanity.  Having to settle for 1440p just seems like an overpriced consolation prize.  It's pretty clear that 1080p shouldn't even be an option on laptops, given the PPI and relative viewing distances above.  1440p should be the minimum spec option (refresh rates variable).

 

http://www.transvideointl.com/assets/TechDensity_Mobi92620.pdf

 

Given that my average distance to a 15.6" laptop screen is frequently > 2 feet while gaming, 1080p simply isn't cutting the mustard.  I can see the aliasing on text and icons at 1080p on a 15.6" screen.

 

Continued...

 

 


I'd imagine it's because laptop buyers don't really care about PPI and buy something anyway, and laptops are just a lot of dead money if they don't sell, so that's what we get now. I can't imagine 4k laptops being mainstream.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


People would rather have a high refresh rate at 1440p than low fps at 4K, which is at best what you'll get with the mobile versions of GPUs. The extra resolution won't really be visible anyway: high DPI is great for photos, text smoothing, etc., but you're not going to notice it in something that moves fast.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


PART 2 - Desktops

 

Ok, so if 1440p is what I would call "minimally acceptable" on a laptop, what about desktops?  Back at the start of my first post, I mentioned 1080p on a 20".  I've already established that I can distinguish resolutions above 1080p on a 15.6", so unless I'm using a 20" from considerably further away, I would expect the reduced PPI to have a negative impact.

 

1920x1080 - 20" = ~110 PPI

 

Here's a sample of viewing distances vs. PPI--note that 120 DPI is listed at ~5 feet, virtually double the distance most people would use a 20" display at.

 

http://resources.printhandbook.com/pages/viewing-distance-dpi.php

 

While I realize this is for artwork/print, the math checks out.  And I know of virtually no one who can make a scientific case for why 1080p is too high a resolution on a 20" display.  Fair enough--1440p should probably again be the minimum standard--but I digress.

 

Why do I care about a 20"?  Because a 2x2 array of 4, 20" monitors is the same viewing space and resolution as a single, 40", 4k monitor.  And if 1080p is justified on a 20", then 4k is equally justified on a 40".  So why then, are we getting 35"+ offerings with--at best--1440 horizontal lines?  If what I said about laptops at 1440p holds true, then a 2x2 array of those screens would be a 31.2" display with a 5k resolution (5120x2880).  Dell actually offers the UP2715K, which is a 27" monitor--so this not unreasonable.  And yet, here we are fighting for 4k resolutions on a larger display.

 

HDMI 2.0 has been out since 2015.  Why is the market for 60Hz, 4k, 40"+ monitors just now catching up?

 

And then there's curved displays.  No, curved displays are not a gimmick.  No one has ever built a multiple-monitor setup and had all of their displays perfectly parallel to each other.  There's a reason for that.  Your eyeball is curved, and if you are staring at a flat object, the parts of that object further into your peripheral vision will be further away from your eyeball than the center of the object.  This is why people often turn their heads: it reduces eyestrain and helps bring objects into focus by recentering vision over the intended viewing area.

 

Manufacturers know this, and that's why super ultrawide (32:9) displays are curved.  It would be horrible using one that was perfectly flat.  Such a monitor would look more like an ellipse to the viewer (bigger center, smaller edges), due to viewing distances/angles.  While there is some room for debate on proper monitor curvature, the meat and potatoes is that curvature should be matched to viewing distance.  That's because the r-value (the radius of the circle that would be created by continuing the curvature of the monitor) should place the viewer as close to the center of said circle as possible.

 

When this is done correctly, the screen no longer appears curved to the user.  It reads like a flat panel, because the edges are equidistant from the eye, just like the center.  Seems simple enough, but it's quite lacking in the monitor segment.
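Here's a small sketch of that geometry, assuming a ~40" 16:9 panel (roughly 885 mm wide) viewed from 1000 mm. The 1000R radius is just an example chosen to match the viewing distance, not a spec from any particular monitor.

```python
import math

def edge_distance_flat(view_dist_mm, panel_width_mm):
    """Eye-to-edge distance for a flat panel, viewer centered at view_dist_mm."""
    return math.hypot(view_dist_mm, panel_width_mm / 2)

def edge_distance_curved(radius_mm, panel_width_mm, view_dist_mm):
    """Eye-to-edge distance for a panel curved with radius radius_mm,
    viewer on the axis at view_dist_mm from the screen center."""
    half_arc = (panel_width_mm / 2) / radius_mm   # half-angle subtended by the panel (radians)
    x = radius_mm * math.sin(half_arc)            # lateral offset of the edge
    z = radius_mm * (1 - math.cos(half_arc))      # how far the edge curves toward the viewer
    return math.hypot(view_dist_mm - z, x)

# Assumed example: ~40" 16:9 panel is ~885 mm wide; viewer sits 1000 mm away
width, dist = 885.0, 1000.0
print(f"flat edge distance:  {edge_distance_flat(dist, width):.0f} mm")    # ~1094 mm
print(f"1000R edge distance: {edge_distance_curved(1000.0, width, dist):.0f} mm")  # ~1000 mm

# When the curve radius matches the viewing distance (1000R at 1000 mm), the
# viewer sits at the center of curvature, so the edges end up the same
# distance from the eye as the center of the screen.
```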

 

In fact, if I WANTED to buy a 40" (or larger), 3840x2160, curved, 60-120hz display--there are 0 options.  Not even stupidly overpriced ones.

 

I believe this to be intentional atrophy in the monitor segment.  Manufacturers have found all sorts of creative ways to "spice up" otherwise tepid offerings and fool/bait consumers into a sense of contentment.  HDR 1400--or whatever we're up to these days--will eventually have even more craziness thrown at it.  Same with 300Hz displays.  Sure, you can make the argument for 240Hz (3D gaming @ 120Hz per eye), but we're into the absurd refresh-rate range for all but the bleeding edge of competitive gamers.  And then they add more RGB to make things look fancier.

 

But like I said, I'm old school.  The first, most important thing is resolution.  Sure, all of those bells and whistles are nice to have, but not at the cost of resolution.  I will take HDR8, 60Hz, and no RGB...if I can get true 4k, 40", and curved.  I want to at least have the OPTION...and right now, that simply doesn't exist.


7 minutes ago, Kilrah said:

People would rather have a high refresh rate at 1440p than low fps at 4K, which is at best what you'll get with the mobile versions of GPUs. The extra resolution won't really be visible anyway: high DPI is great for photos, text smoothing, etc., but you're not going to notice it in something that moves fast.

 

Is native resolution gaming really that paramount?  Just because I have a 4k display doesn't mean I will inherently use that resolution to game in.  I don't use a laptop exclusively for gaming or exclusively for productivity.  GPUs are improving all the time.  And up until 2005 or so, it was quite common for monitor offerings to have native resolutions far in excess of what desktop GPUs could handle.  Sure, not all consumers would be buying a 1600x1200 monitor, but the OPTIONS existed.  Fast forward to today, and options aren't even there.

 

The fact that 1080p has been "standard" for nearly 2 decades is indicative of how badly the display segment has languished.  1080p is fine for smartphones--and abysmal for anything > 10".


I like 1600 vertical pixels and dislike 1440 and 1080, so I get it.

My favorite non-4k monitor is my 38" 3840x1600 ultrawide.  The text looks as good as 4k.

 

Viewing distance is meaningless to me since I think focusing distance is more important.

For me to use any laptop, or a monitor 32" or smaller, I have to wear prescription glasses, and it has been that way since the '80s.  So I am using 4k 120Hz TVs at 36" away.  I don't need to wear glasses at that distance, and the text looks sharper than on my 1440p monitors.

It started as an experiment, since there was more negative than positive info out there, but I was buying the TVs to upgrade the old 1080p TVs in the bedrooms anyway, so if it failed they would just go on a wall.

It was a big success.

 

36 minutes ago, IPD said:

The fact that 1080p has been "standard" for nearly 2 decades is indicative of how badly the display segment has languished.  1080p is fine for smartphones--and abysmal for anything > 10".

I got my first 1080i TV in 2005, but going by screenshots I only started using 1080p monitors in 2010.  By 2015 I was using 4k and 3440x1440 for gaming.

 

In 2017 I bought a 32" 1440p 144Hz G-Sync monitor to see what the fuss was about and made the mistake of putting it right beside my 32" 4k monitor.  It looked terrible.  I also realized I did not play anything that really benefits from high refresh rates.  So I gave the monitor to my son.

 

This year I upgraded my 20-series GPUs to 30-series cards, so my TVs have gone from 60Hz 4k to 120Hz 4k, but I am still not playing anything that benefits from it.

It does look cooler when I type it in though.

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


1 hour ago, IPD said:

Is native resolution gaming really that paramount?  Just because I have a 4k display doesn't mean I will inherently use that resolution to game in.

You can use non-native but then you get somewhat ugly rescaling blur.

 

1 hour ago, IPD said:

Fast forward to today, and options aren't even there.

They are there, just in product lines aimed at the type of people who are likely to want that and at the price reflecting that category and its size...

 

1 hour ago, IPD said:

HDMI 2.0 has been out since 2015.

2014, actually. I've been using curved 4K TVs as monitors since that first became possible (the release of the GTX 900 series).

 

Would like to upgrade to a >60Hz one, but with curved TVs having fallen out of favor it's unlikely I'll get an option soon...



https://www.amazon.com/Sceptre-Curved-Class-DisplayPort-C408B-QWN168W/dp/B094LDMHJS

 

Found this today.  So SOME progress is being made.  I'd much rather have this than an Odyssey G9, because of the extra vertical real estate.  Still, 1440p on a 40" is too low for my taste.

 

I'd love to know what these options are that you speak of.  Please post links, as I am unable to find any.  The Odyssey G9 and Sceptre G505B are far too short.

 

Even the Dell UltraSharp 40, a nice 5120x2160, is about 4" shorter than my 40" 16:9.  If it were offered in a 50" size, it would be a fantastic option, given that I could downscale to 1440p for gaming and have a great refresh rate.
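The "about 4 inches shorter" figure is easy to check from the diagonals and aspect ratios. This assumes the UltraSharp 40's panel is roughly a 39.7" diagonal at 21:9, which is my reading of the spec sheet rather than gospel.

```python
import math

def panel_dims(diagonal_in, aspect_w, aspect_h):
    """Width and height (inches) of a panel from its diagonal and aspect ratio."""
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

w_169, h_169 = panel_dims(40.0, 16, 9)   # 40" 16:9 -> ~34.9" x ~19.6"
w_219, h_219 = panel_dims(39.7, 21, 9)   # ~40" 21:9 ultrawide -> ~36.5" x ~15.6"
print(f"16:9 height {h_169:.1f} in vs 21:9 height {h_219:.1f} in "
      f"-> {h_169 - h_219:.1f} in shorter")   # roughly 4 inches
```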


7 hours ago, IPD said:

HDMI 2.0 has been out since 2015.  Why is the market for 60Hz, 4k, 40"+ monitors just now catching up?

For the very simple reason that the market doesn't--or didn't until now--exist. At 40+ inches pretty much everybody will be looking at TVs for content consumption, not monitors for gaming. From there follows the second reason: the vast majority doesn't need a >60 Hz refresh rate. Movies are 24 FPS and there's little content at 60. TV gaming is mostly consoles, which have only recently started outputting >60 FPS. Then there's the slight issue of price. This all makes large 4k 120 Hz monitors a rather niche market.

7 hours ago, IPD said:

Fast forward to today, and options aren't even there.

What do you mean options aren't there?

[attached screenshot: list of available monitor resolution options]

Plenty of resolutions to choose from. The more odd/specific it gets, the less variety in models you'll have. There's no real practical benefit from 16k or 32k monitors at the moment, as you'd likely not use that at native scaling on, say, a <40" monitor, so the era of resolutions "bigger than some GPUs can handle" is over.

7 hours ago, IPD said:

In fact, if I WANTED to buy a 40" (or larger), 3840x2160, curved, 60-120hz display--there are 0 options.  Not even stupidly overpriced ones

Not curved, but there are the LG OLEDs, Asus ROG Strix XG43UQ, Swift PG43UQ and PG65UQ, HP Omen Emperium 65, and Gigabyte Aorus FV43U. The fact that these BFGDs have all but fallen out of fashion pretty much immediately is a dead giveaway that the market doesn't exist (yet).

7 hours ago, IPD said:

but we're into the absurd refresh-rate range for all but the bleeding edge of competitive gamers.

This argument is the same for resolution, though. Just replace "refresh rate" with "resolution" and "competitive gamers" with "tech and movie enthusiasts". HDR has a much bigger impact on movies than 4k, for example, in my opinion. An excellent 1080p Blu-ray can look pretty much as good as a 4k one.

6 hours ago, Kilrah said:

Would like to upgrade to a >60Hz one, but with curved TVs having fallen out of favor it's unlikely I'll get an option soon...

Did you feel like the curve actually added something? I never saw the point of it for a TV.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


2 hours ago, IPD said:

I'd love to know what these options are that you speak of.  Please post links, as I am unable to find any.  The Odyssey G9 and Sceptre G505B are far too short.

I was referring to the laptop part.

 

20 minutes ago, tikker said:

Did you feel like the curve actually added something? I never saw the point of it for a TV.

It's pointless for a TV, but when using a 49" TV as a monitor and sitting 2 ft away, the curve does significantly reduce the otherwise massive difference in distance and viewing angle between the center and the edges/corners.

 

BTW, while I LOVE 4K on a big desktop display, my laptop also has a 4K panel and it's mostly a PITA--almost never an advantage except in very specific use cases (one of which made me go for the 4K option, but in the end I never really do it, so I kinda regret it...).



I haven't owned a 1440p 15.6".  It might well be OK, given its DPI vs. a WQXGA panel of similar size.

 

That's one of the reasons I'm seriously interested in the Zephyrus M16, given the 3070 and the WQXGA offering.  WQXGA and a top-end GPU is what I had been praying for in the 15" form factor--10+ years ago.  I might be missing something, but it seems like all the other WQXGA offerings have midrange GPUs...but we'll see.  If it's overpriced (imho) like the Lenovo Legion 5, then I'll pass.

 

To me the sweet spot is $2000 +/- $200.  That's what I paid for a Sager 8662 (screw Nvidia's shite 200 series).  It's what I paid for my MSI GT60.  It's what I paid for my Gigabyte P35X (never again).  And it's what I paid for my HP Omen (and that had 4k and an RTX 2080).

 

$2600-2700+ is not in the ballpark.


The Acer Predator Triton 500 SE with an RTX 3070 appears to have come out recently as well--and much closer to my budget range.

 

https://www.bestbuy.com/site/acer-predator-triton-500-se-16-2560x1600-165hz-g-sync-intel-11th-gen-i7-geforce-rtx-3070-16gb-ddr4-1tb-ssd/6469299.p?skuId=6469299

 

Any others in this price range with 4k/WQXGA and a 3070?

 

p.s.

It looks like the market has finally corrected a bit.  It was a trainwreck just a few months ago--with nil options like this.


P.S.

AOC C4008VU8 (40")

 

This is what I"m currently using.  I realized I had the *derp* and mistakenly plugged in my HDMI into the wrong port, and was getting only 30hz.  Used the other port and am at 60hz now.

 

But this brings up my next point.  The entire reason I'm not using any of the DP inputs, and am using HDMI instead, is that the engineers designed this monitor with the DP inputs directly impeded by the VESA stand pillar.  Could we please--for the love of all that's holy--include right-angle cables with your products if you are going to put the inputs in inconvenient places like this?  I realize that not everyone is going to use a VESA stand out of the box, but this is a 40" monitor.  Let's have some common sense.

 

And oh, btw: since monitors like this one can read simultaneous inputs from the same computer (i.e. the HDMI input and DP input both show up under display settings as separate monitors when both are connected), it might be worth considering adding a $0.02 switch on the front that can more easily toggle between inputs.  I could make use of that.

