
Dell announces UP3017Q 4K OLED Ultrasharp display that costs $4999 - CES 2016

kameshss

Eh, huge input lag and GTG response times compared to other options like Samsung and Eizo.

they're beautiful though

and I've never had input lag

Shipping sucks


Finally. Oh, Dell, I have waited for so long. Waited, always waited, for a monitor worthy of my money. Ever since you discontinued the U2711 I have been in a constant state of searching, but finally, the day has come.

Cheers,

Linus


they're beautiful though

and I've never had input lag

 

It's 3x the input lag of the ROG Swift, which is unacceptable for FPS tournament play. I got one for my little brother for Christmas, but since he's one of the top 50 CS:GO players, he can't play on it. He's resorted to shortening the cables of his mouse, monitor, and headphones, all to give himself the tiniest edge in reaction time. That's the granularity he plays at now.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It's 3x the input lag of the ROG Swift, which is unacceptable for FPS tournament play. I got one for my little brother for Christmas, but since he's one of the top 50 CS:GO players, he can't play on it.

Why would you buy an UltraSharp for professional gaming...?

Shipping sucks


Eh, huge input lag and GTG response times compared to other options like Samsung and Eizo.

Response time is a meaningless figure. There is no standard way of measuring it; every manufacturer measures in their own way, and the method can even vary between product series.

Manufacturers pick two shades of gray and measure how long the panel takes to switch a pixel from one shade to the other. Which grays are picked is what varies: the further apart they are, the slower the result. Manufacturers claim they pick shades that reflect the monitor's intended use. For example, in gaming you don't see black-on-white very much, so they pick two grays that are very close together, achieving "1ms" response times. For an office-focused monitor they pick grays further apart, since black text on a white background is common, resulting in a slower quoted response time.

That is why the figure is completely meaningless. There is also no standard for the equipment used or for how the speed is measured, so each manufacturer does and uses whatever it wants. If the equipment rounds anything below 16ms down to 1ms, then too bad for you: they'll put 1ms on the box.

Their input lag circuitry is pretty fast, though. The slower ones are those with color processors, so in exchange you do get nicer color reproduction.
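To make the gray-to-gray convention concrete, here is a minimal sketch (assuming a photodiode trace sampled at a known rate; the 10%-90% thresholds are the common convention, but as noted above, nothing forces a manufacturer to use them) of how a transition time is read off a captured luminance trace:

```python
import numpy as np

def gtg_response_ms(samples, sample_rate_hz, lo_frac=0.1, hi_frac=0.9):
    """Estimate gray-to-gray response time from a luminance trace.

    `samples` is a 1-D array of photodiode readings captured while the
    panel switches from one gray level to another. The conventional
    figure is the time spent between 10% and 90% of the total swing.
    """
    samples = np.asarray(samples, dtype=float)
    start, end = samples[0], samples[-1]
    swing = end - start
    lo = start + lo_frac * swing
    hi = start + hi_frac * swing
    # Find where the transition first crosses each threshold.
    if swing >= 0:
        i_lo = np.argmax(samples >= lo)
        i_hi = np.argmax(samples >= hi)
    else:
        i_lo = np.argmax(samples <= lo)
        i_hi = np.argmax(samples <= hi)
    return (i_hi - i_lo) / sample_rate_hz * 1000.0

# Example: a simulated exponential transition with a 2 ms time constant,
# sampled at 100 kHz. An exponential's 10-90 time is tau * ln(9) ~ 4.4 ms.
t = np.arange(0.0, 0.03, 1e-5)
trace = 100.0 * (1.0 - np.exp(-t / 0.002))
```

Picking closer thresholds (say 30%-70%) on the same trace yields a much smaller number, which is exactly the trick described above.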


Hope that when 4K becomes way more mainstream, 120Hz+ is adopted very quickly and isn't insanely priced. I'm using 1440p at 96Hz+, and anything 60Hz makes my eyes bleed lol.


Response time is a meaningless figure. There is no standard way of measuring it; every manufacturer measures in their own way, and the method can even vary between product series.

Manufacturers pick two shades of gray and measure how long the panel takes to switch a pixel from one shade to the other. Which grays are picked is what varies: the further apart they are, the slower the result. Manufacturers claim they pick shades that reflect the monitor's intended use. For example, in gaming you don't see black-on-white very much, so they pick two grays that are very close together, achieving "1ms" response times. For an office-focused monitor they pick grays further apart, since black text on a white background is common, resulting in a slower quoted response time.

That is why the figure is completely meaningless. There is also no standard for the equipment used or for how the speed is measured, so each manufacturer does and uses whatever it wants. If the equipment rounds anything below 16ms down to 1ms, then too bad for you: they'll put 1ms on the box.

Their input lag circuitry is pretty fast, though. The slower ones are those with color processors, so in exchange you do get nicer color reproduction.

 

You can do a proper measurement with X-Rite i1Pro equipment. Since I have access to one through my research, we verified it: the variance was only 1, 1, 1 across runs, and it takes about 7.8ms on the Dell vs 2.63ms on the ROG Swift.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


You can do a proper measurement with X-Rite i1Pro equipment. Since I have access to one through my research, we verified it: the variance was only 1, 1, 1 across runs, and it takes about 7.8ms on the Dell vs 2.63ms on the ROG Swift.

Yes, IPS panels tend to be slower than TN panels.

Yes, IPS panels tend to be slower than TN panels.

 

No, this is the ASUS IPS (AHVA) monitor. Like me, he's colorblind, so the greater the gamut, the better for him.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Yeah they do. I actually had to account for that in my research. The change is less than 2%, but it's there.

I'm surprised it's in the 1-2% range. Though I then have to ask 2% of what?

Also... (not trying to pick apart your statement, just interested)

What was the nature of your research? What coatings did you test? What wavelengths were affected most, or was it a broad spectrum issue? What sort of measuring equipment was used?

I will have to take measurements if I ever finish my Arducorder.


There is a huge difference in the lifespan of OLED subpixels. You can see this very clearly on some Samsung phones. My Galaxy S 2, for example, is much more blue where the status bar is. That's because the pixels in the status bar have been turned off most of the time, while the blue subpixels in the rest of the display have died out faster than the rest.

I don't know if Samsung still does it, but on their older phones they actually shifted the color balance toward the blue. By making everything slightly more blue they made the display less accurate out of the box, but it countered some of the decay.

I don't have any exact numbers (and the numbers have changed over the generations) but I believe I read that blue subpixels degrade about 30% faster.

 

Samsung OLED screens use different-colored subpixels, but LG does not. Instead of red/green/blue emitters, LG uses only white OLED subpixels and passes that white light through color filters to get the correct color. That is how they bypassed the differential decay rates of colored OLED subpixels: they aren't using that method at all.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


I heard rumors of this announcement and I'm drooling. Just add G-Sync and drop the price by $3000 and I'm ready. :^)

 

 

Can you imagine if this monitor had FreeSync support built in? If it were gifted to Linus, he'd probably still not use it, since it's not G-Sync and he'd have to use AMD cards in his personal rigs. But maybe he's changed.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Nope. A good display will have a good anti-glare film. Those films aren't just textured transparent plastic. The good ones are formed to let light pass through relatively straight in one direction, while light coming from the other direction gets deflected in all directions.

Graphics artists tend to go with an iMac because they are not tech-oriented people. They know their software and tools and have really good drawing skills, but their knowledge of computers is limited. For the longest time, Apple was the only computer manufacturer offering a simple solution with the specs they need and an IPS panel (a true 8-bit panel at that, which most people here who buy IPS monitors aren't willing to pay for, opting for a 6-bit panel instead), with good color calibration out of the box. That is how Apple took over.

 

 

I hate anti-glare displays because, compared to glossy, matte finishes never seem as crisp and sharp. Light still reflects off the display, but instead of bouncing off in clean vectors, it scatters in the plane of the display, creating a hazy, cloudy effect and disrupting sharpness and clarity. I think the real reason people hate glossy is either that they have lights right behind them and choose not to reorient their display, or that they are so frightened of catching a glimpse of their own reflection that they refuse to go glossy to avoid the horror.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


I didn't need my other kidney anyways.

 

Hope that when 4K becomes way more mainstream, 120Hz+ is adopted very quickly and isn't insanely priced. I'm using 1440p at 96Hz+, and anything 60Hz makes my eyes bleed lol.

 

I didn't think 4K would break 60Hz this soon; monitor competition is getting intense.

why do so many good cases only come in black and white


Samsung OLED screens use different-colored subpixels, but LG does not. Instead of red/green/blue emitters, LG uses only white OLED subpixels and passes that white light through color filters to get the correct color. That is how they bypassed the differential decay rates of colored OLED subpixels: they aren't using that method at all.

 

The different colors will still degrade at different rates because the usages are different. OLEDs degrade by being used. With all-white OLEDs, they may all degrade equally per volt per second, but the volts and seconds are different for each OLED too, so it's a moot(ish) point. If you light up your screen blue, all the OLEDs providing blue color are degrading while all the other OLEDs are not, because they're off. It doesn't matter whether the OLEDs are actually producing only blue light, or producing a full white spectrum and having blue filtered through. The issue is that not all the OLEDs are used equally, and thus degrade unequally regardless of half-life.

 

That being said, LG's approach of using white OLEDs with color filters is significantly better simply because the blue OLEDs degraded extremely quickly, and this eliminates those. Ultimately the issue with pixels and colors degrading out of balance will never be totally solved unless degradation can be eliminated entirely (unlikely). The different colors and different pixels will always degrade unequally. The key is really to make OLEDs that degrade so slowly that the difference becomes only theoretical, and even if all the pixels degrade unequally, they stay so close that the difference between them is marginal.

 

I'd agree having white OLEDs will probably make the degradation more tightly bound so the difference between pixels takes much longer to become apparent, but it's worth noting this doesn't "solve" burn-in or differential degradation rates.
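The unequal-usage argument above can be made concrete with a toy model (pure illustration; the decay constant and drive levels are made up, not measured values). Even if every subpixel shares the same decay rate per unit of drive, a blue-heavy usage pattern still wears them unevenly:

```python
import numpy as np

# Toy model: each subpixel loses efficiency in proportion to how long
# and how hard it is driven. All subpixels share the SAME decay rate,
# yet unequal usage still produces unequal degradation.
DECAY_PER_HOUR = 1e-4  # fractional efficiency loss per hour at full drive (made up)

def remaining_efficiency(avg_drive, hours):
    """avg_drive: average drive level (0..1) for a subpixel over its lifetime."""
    return float(np.exp(-DECAY_PER_HOUR * avg_drive * hours))

# A UI that shows mostly blue content drives blue subpixels hardest.
avg_drive = {"red": 0.2, "green": 0.2, "blue": 0.9}
after_10k_h = {c: remaining_efficiency(d, 10_000) for c, d in avg_drive.items()}
```

After 10,000 hours the blue subpixels in this model retain roughly 41% efficiency versus roughly 82% for red and green, purely from usage, which is the "moot(ish) point" in the post above: identical half-lives don't save you if the duty cycles differ.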


The different colors will still degrade at different rates because the usages are different. OLEDs degrade by being used. With all-white OLEDs, they may all degrade equally per volt per second, but the volts and seconds are different for each OLED too, so it's a moot(ish) point. If you light up your screen blue, all the OLEDs providing blue color are degrading while all the other OLEDs are not, because they're off. It doesn't matter whether the OLEDs are actually producing only blue light, or producing a full white spectrum and having blue filtered through. The issue is that not all the OLEDs are used equally, and thus degrade unequally regardless of half-life.

 

That being said, LG's approach of using white OLEDs with color filters is significantly better simply because the blue OLEDs degraded extremely quickly, and this eliminates those. Ultimately the issue with pixels and colors degrading out of balance will never be totally solved unless degradation can be eliminated entirely (unlikely). The different colors and different pixels will always degrade unequally. The key is really to make OLEDs that degrade so slowly that the difference becomes only theoretical, and even if all the pixels degrade unequally, they stay so close that the difference between them is marginal.

 

I'd agree having white OLEDs will probably make the degradation more tightly bound so the difference between pixels takes much longer to become apparent, but it's worth noting this doesn't "solve" burn-in or differential degradation rates.

 

 

I just looked this up and it seems you're right.

 

https://www.youtube.com/watch?v=3qRhTKOu9Pw#t=3m30s

 

For some reason I thought LG was ONLY using a white OLED subpixel and passing that through color filters, but they are still using RGB+W. They mentioned some sort of color-refiner technology to raise the accuracy of blue, but I have no idea how that works or affects longevity.

 

 

I can't imagine the TVs will be as bad as early OLED screens on phones, though. Their engineers have to know the issues and use cases of a TV and would want to avoid such things. But by the time OLED gets cheap enough for mass adoption we'll have more definitive answers on longevity. The original 1080p 55" LG OLED TV was sold in 2014, was it not? Let's see how that is holding up by the end of this year, and then 2017/2018. If they can get a solid 6+ years of display quality, I think they'll be fine.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Eh, huge input lag and GTG response times compared to other options like Samsung and Eizo.

That highly depends on which monitor you are comparing. The U2312HM had incredibly low input lag.

http://www.tftcentral.co.uk/reviews/dell_u2312hm.htm

The input lag of the U2312HM was incredibly low, and in fact the lowest we have ever seen from a TFT display. There was practically no delay at all with most measurements showing 0ms lag compared with the CRT. There was an occasional lag of up to 10ms but over many measurements we had an average lag of only 0.6ms. This was lower than the already very good U2311H (10.3ms) and the new Dell U2412M (9.4ms). Excellent work here from Dell to bring lag down to such a low level. This will present no problem, even to high end gamers.

And that shortening of the mouse cable stuff is just placebo. It does not actually help.

The cable would have to be about 400 kilometers long to add 2ms of delay.
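For reference, a quick back-of-envelope sketch of cable propagation delay, assuming a typical velocity factor of ~0.7c for copper cable (the exact factor depends on the cable, but any plausible value gives the same conclusion):

```python
C = 299_792_458        # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.7  # typical for copper cable (assumption)

def cable_delay_ms(length_m):
    """Signal propagation delay for a cable of the given length."""
    return length_m / (C * VELOCITY_FACTOR) * 1000.0

# Shaving 30 cm off a mouse cable saves on the order of a nanosecond:
saved_ns = cable_delay_ms(0.3) * 1e6

# Length needed to accumulate a whole 2 ms of delay:
length_for_2ms_km = 2e-3 * C * VELOCITY_FACTOR / 1000.0
```

The 2 ms figure works out to roughly 420 km of cable, while a desk-length trim saves about 1.4 ns, around a million times smaller than a single millisecond of monitor input lag.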

Samsung OLED screens use different-colored subpixels, but LG does not. Instead of red/green/blue emitters, LG uses only white OLED subpixels and passes that white light through color filters to get the correct color. That is how they bypassed the differential decay rates of colored OLED subpixels: they aren't using that method at all.

If the subpixels are white, how do they produce colors?

Just got confirmation from Dell itself.

I tweeted them,

and the monitor supports 4K @ 120Hz.

https://twitter.com/DellCares/status/684886541441855490

Times are a-moving... a great time is coming.

Spartan 1.0

Spoiler

CPU: Intel Core i7-4770K 3.5GHz Quad-Core Processor

CPU Cooler: Cooler Master Seidon 120XL 86.2 CFM Liquid CPU Cooler

Motherboard: Asus Maximus VI Extreme ATX LGA1150 Motherboard
Memory: Corsair Dominator 32GB (4 x 8GB) DDR3-1600 Memory
Storage: OCZ Vector Series 512GB 2.5" Solid State Drive
Storage: Seagate Desktop HDD 4TB 3.5" 7200RPM Internal Hard Drive

Video Card: EVGA GeForce GTX 980 4GB Classified ACX 2.0 Video Card
Case: Thermaltake Urban S41 ATX Mid Tower Case
Power Supply: Corsair 1200W 80+ Platinum Certified Fully-Modular ATX Power Supply
Optical Drive: LG BH16NS40 Blu-Ray/DVD/CD Writer
Optical Drive: LG BH10LS30 Blu-Ray/DVD/CD Writer
Operating System: Microsoft Windows 10 Pro 64-bit
Sound Card: Creative Labs ZXR 24-bit 192 KHz Sound Card
Monitor: 2x Asus VG278HE 27.0" 144Hz Monitor
Keyboard: Logitech G19s Wired Gaming Keyboard
Keyboard: Razer Orbweaver Elite Mechanical Gaming Keypad Wired Gaming Keyboard
Mouse: Logitech G700s Wireless Laser Mouse
Headphones: Creative Labs EVO ZxR 7.1 Channel  Headset
Speakers: Creative Labs GigaWorks T40 Series II 32W 2ch Speakers

Hades 1.0

Spoiler

Laptop: Dell Alienware 15 2015

CPU: i7-4720HQ CPU

Memory: 16GB DDR3 SODIMM RAM

Storage: 256GB M.2 SSD

Storage: 1TB 5400rpm 2.5" HDD

Screen: 15.6" FHD Display

Video Card: Nvidia GTX 970M with 3GB

Operating System: Windows 10 Pro

Project: Spartan 1.2 PLEASE SUPPORT ME NEW CHANNEL > Tech Inquisition


I'm surprised it's in the 1-2% range. Though I then have to ask 2% of what?

Also... (not trying to pick apart your statement, just interested)

What was the nature of your research? What coatings did you test? What wavelengths were affected most, or was it a broad spectrum issue? What sort of measuring equipment was used?

I will have to take measurements if I ever finish my Arducorder.

See my signature :P

Using Augmented Reality to correct colorblindness.

I tested the matte coating on my Viewsonic vs. the raw panel. I tested several lenses provided by the local optometrist with anti-glare vs. no coating, and of course the MacBook Pro Retina that we're using for human testing. Since the goal is to deterministically shift colors based on context and human-specific calibration factors, we don't need to deploy it on Google Glass or on a phone yet to use it to look at a room. Also the code of my predecessor was all for iOS :P
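A deterministic color shift of the kind described can be sketched as a fixed linear remap of RGB values. To be clear, the matrix below is a placeholder for illustration only, not the project's actual per-user calibration; a real one would be fitted to each subject's colorblindness:

```python
import numpy as np

# Hypothetical calibration matrix: moves some green energy into the red
# channel so red/green contrasts become distinguishable. A real matrix
# would come from per-user calibration, not be hard-coded like this.
SHIFT = np.array([
    [1.00, 0.15, 0.00],  # red_out   = r + 0.15*g
    [0.00, 0.85, 0.00],  # green_out = 0.85*g
    [0.00, 0.00, 1.00],  # blue_out  = b
])

def shift_colors(rgb_image):
    """Apply the calibration matrix to an image.

    rgb_image: float array of shape (H, W, 3) with values in [0, 1].
    """
    shifted = rgb_image @ SHIFT.T
    return np.clip(shifted, 0.0, 1.0)
```

Because the remap is a plain matrix multiply, it is deterministic and cheap enough to run per-frame on a camera feed, which fits the "shift colors based on calibration factors" goal described above.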

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


See my signature :P

Using Augmented Reality to correct colorblindness.

I tested the matte coating on my Viewsonic vs. the raw panel. I tested several lenses provided by the local optometrist with anti-glare vs. no coating, and of course the MacBook Pro Retina that we're using for human testing. Since the goal is to deterministically shift colors based on context and human-specific calibration factors, we don't need to deploy it on Google Glass or on a phone yet to use it to look at a room. Also the code of my predecessor was all for iOS :P

That sounds really cool, I'll check it out when I'm on my laptop. My issue is that 95% of the time I'm on this forum I'm using tapatalk on a phone so I can't see signatures, or half of people's profiles if I click on them (they run off the side of the screen and you can't scroll).


Is anyone from the Linus media empire at CES now?!

If they are, is there a way to get one of them to head over to the Dell booth and check/ask what connectors this monitor is using? And whether it supports FreeSync over DisplayPort 1.3 (assuming they added that)?

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Passive colour filters:

[Image: diagram comparing RGB OLED subpixels vs. white OLED with passive colour filters]

 

 

Damn, that's insane. Good job, Dell. Now make it 21:9, add Adaptive Sync, and don't charge the cost of a black-market kidney.

 

I would like to know how they have done this, since it should be impossible with DP 1.2 (they could have used dual DP 1.2). I will ask them now.
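A quick bandwidth sanity check shows why a single DP 1.2 link can't carry 4K at 120Hz (the blanking-overhead factor here is a rough assumption; exact timings vary):

```python
def uncompressed_gbps(width, height, hz, bits_per_pixel=24, blanking=1.2):
    """Rough pixel-data rate in Gbit/s; `blanking` pads for timing overhead."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

# DP 1.2 at HBR2 over 4 lanes: 21.6 Gbit/s raw, ~17.28 Gbit/s usable
# after 8b/10b line coding.
DP12_DATA_RATE = 17.28

needed = uncompressed_gbps(3840, 2160, 120)
# needed comes out near 29 Gbit/s, well over what one DP 1.2 link carries,
# hence the speculation about dual DP 1.2 (or DP 1.3, ~25.92 Gbit/s usable).
```

Even with zero blanking overhead the raw pixel data alone (~23.9 Gbit/s) exceeds DP 1.2's usable rate, so some form of dual-link, a newer standard, or chroma subsampling would be required.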

GPU[Two GTX 980 Ti Lightnings, overclocked]-CPU[Core i7 5930K, 4.4GHz]-Motherboard[MSI X99 Godlike]-Case[Corsair 780T Black]-RAM[32GB Corsair Dominator 3000MHz, light bars]-Storage[Samsung 950 Pro 512GB, Samsung 850 Pro 1TB and Samsung 850 Pro 512GB]-CPU Cooler[EK Predator 360mm]-PSU[EVGA 1600W T2, individual cables, Red]-Monitor[ASUS PG348Q]-Keyboard[Corsair K70 Red]-Mouse[Corsair M65 RGB]-Headset[Sennheiser G4me One]-Case Fans[be quiet! Silent Wings 2]


I'm sorry, but a curved 55" 4K OLED TV @ 120Hz is $1000 less than this monitor. Am I missing something here?


This topic is now closed to further replies.