
Seiki's 40-inch 4K display is a desk-dominating beauty for under $1000

BiG StroOnZ

You have no idea how completely wrong your second quote is.

 

All three say basically the same thing. Here's more (straight from Wikipedia, in line with what the second quote says as well as the third):

 

Most TN panels represent colors using only 6 bits per RGB color, or 18 bits in total, and are unable to display the 16.7 million color shades (24-bit truecolor) that are available from graphics cards. Instead, they use a dithering method that combines adjacent pixels to simulate the desired shade.

 

18-bit TN+film panels with dithering are sometimes advertised as having "16.2 million colors."

 

 

 

http://en.wikipedia.org/wiki/Frame_rate_control
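For reference, here's the raw arithmetic behind those figures (a quick Python sketch of my own, not taken from any of the sources above): 6 bits per channel gives 262,144 addressable colors, while 8 bits per channel gives the 16.7 million of 24-bit truecolor.

# Colors a panel can address natively, given its bits per channel (bpc).
for bpc in (6, 8):
    print(f"{bpc} bpc: {(2 ** bpc) ** 3:,} colors")
# -> 6 bpc: 262,144 colors
# -> 8 bpc: 16,777,216 colors (the "16.7 million" of 24-bit truecolor)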

 

Are you really trying to say that all four sources are incorrect? You can't possibly be that egotistical?


All three say basically the same thing. Here's more (straight from Wikipedia, in line with what the second quote says as well as the third):

 

 

http://en.wikipedia.org/wiki/Frame_rate_control

 

Are you really trying to say that all four sources are incorrect? You can't possibly be that egotistical?

 

6-bit with dithering is considered 8-bit color depth, not 6-bit. The articles you quoted must be quite old; any decent modern dithering technique can reproduce all 16.7 million colors very effectively. Like I said, most monitors today are 6-bit with dithering, including nearly all TN panels and nearly all low- and mid-range IPS panels. For example, my U2414H is a 6-bit+FRC panel, and so is the popular Acer H236HLbid. So is my ASUS PA248Q. It is difficult to tell a 6-bit panel with FRC apart from a true 8-bit panel. The majority of monitors with true 8-bit panels are using 8-bit with dithering to produce 10-bit color ;)
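For anyone wondering how FRC actually bridges that gap, here's a rough, hypothetical sketch (not how any particular panel's controller is implemented): the panel flips a pixel between its two nearest 6-bit levels over successive frames, so the time-averaged output lands on the requested 8-bit value.

# Rough illustration of temporal dithering (FRC): approximate an 8-bit level
# by cycling between the two nearest 6-bit levels over a 4-frame period.
def frc_frames(level_8bit, frames=4):
    low = level_8bit >> 2               # nearest 6-bit level below (0-63)
    frac = level_8bit & 0b11            # leftover 2 bits, made up over time
    return [min(low + 1, 63) if f < frac else low for f in range(frames)]

target = 130                            # an 8-bit value a 6-bit panel can't show directly
cycle = frc_frames(target)              # [33, 33, 32, 32]
average = sum(cycle) / len(cycle)       # level the eye perceives, time-averaged
print(cycle, average * 4)               # scaled back to 8-bit terms: 130.0

In practice the controller also varies the pattern spatially from pixel to pixel to hide flicker, but that's the basic idea.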


6-bit with dithering is considered 8-bit color depth, not 6-bit. The articles you quoted must be quite old; any decent modern dithering technique can reproduce all 16.7 million colors very effectively. Like I said, most monitors today are 6-bit with dithering, including nearly all TN panels and nearly all low- and mid-range IPS panels. For example, my U2414H is a 6-bit+FRC panel, and so is the popular Acer H236HLbid. So is my ASUS PA248Q. It is difficult to tell a 6-bit panel with FRC apart from a true 8-bit panel. The majority of monitors with true 8-bit panels are using 8-bit with dithering to produce 10-bit color ;)

 

Then why would Seiki advertise this monitor as only 6-bit? Like you said, if it used dithering it would be advertised as 8-bit, not 6-bit.


Are you really trying to say that all four sources are incorrect? You can't possibly be that egotistical?

 

6-bit with dithering is considered 8-bit color depth, not 6-bit. The articles you quoted must be quite old; any decent modern dithering technique can reproduce all 16.7 million colors very effectively. Like I said, most monitors today are 6-bit with dithering, including nearly all TN panels and nearly all low- and mid-range IPS panels. For example, my U2414H is a 6-bit+FRC panel, and so is the popular Acer H236HLbid. So is my ASUS PA248Q. It is difficult to tell a 6-bit panel with FRC apart from a true 8-bit panel. The majority of monitors with true 8-bit panels are using 8-bit with dithering to produce 10-bit color ;)

☐ rekt ☐ not rekt Tyrannosaurus rekt ☑


Then why would Seiki advertise this monitor as only 6-bit? Like you said, if it used dithering it would be advertised as 8-bit, not 6-bit.

That looks more like the PC Gamer journalist not understanding how this works.


Yes, in the article.

 

 

Hardly at all; all the facts point otherwise.

That article isn't by Seiki; it's the words of that journalist, who probably does not understand how this works. I am asking if Seiki itself said it somewhere.

 

Published specs can be misleading.


I'm guessing that someone somewhere mistyped or misheard 1.07 billion as 1.07 million, and the author interpreted that to mean 6-bit color since it's less than the 16.7 million of a standard 8-bit panel. I've never heard 1.07 million colors being associated with 6-bit color depth, or any color depth for that matter; 10-bit color depth with 1.07 billion colors makes much more sense. Now that I think about it, the last time I heard about this monitor it supposedly had a 12-bit LUT, so that also makes more sense if it has 10-bit color depth.
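Just to put numbers on that (same quick math as before, my own sketch): 10 bits per channel is exactly where the "1.07 billion colors" figure comes from, which is why "1.07 million" reads like a slip.

# Color counts by bits per channel; 10 bpc is the "1.07 billion" figure.
for bpc in (8, 10):
    print(f"{bpc} bpc: {(2 ** bpc) ** 3:,} colors")
# -> 8 bpc: 16,777,216 colors (~16.7 million)
# -> 10 bpc: 1,073,741,824 colors (~1.07 billion)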


 

That looks more like the PC Gamer journalist not understanding how this works.

 

 

That article isn't by Seiki; it's the words of that journalist, who probably does not understand how this works. I am asking if Seiki itself said it somewhere.

 

Published specs can be misleading.

 

 

So now a writer who gathered his facts directly from the company for his CES article somehow misheard them say 6-bit? Even though he clearly says 6-bit? He might not be correct about how many colors it supports, but he clearly heard 6-bit, which is obviously something relayed to him at CES.

 

 

I know the article author said 6-bit; does Seiki actually say that somewhere?

 

No, Seiki does not have this product listed on their site. This is PC Gamer's coverage of CES, so this is obviously the information he got from the booth.

 

I'm guessing that someone somewhere mistyped or misheard 1.07 billion as 1.07 million, and the author interpreted that to mean 6-bit color since it's less than the 16.7 million of a standard 8-bit panel. I've never heard 1.07 million colors being associated with 6-bit color depth, or any color depth for that matter; 10-bit color depth with 1.07 billion colors makes much more sense. Now that I think about it, the last time I heard about this monitor it supposedly had a 12-bit LUT, so that also makes more sense if it has 10-bit color depth.

 

If it was 1.07 billion, then wouldn't that make this actually a 10-bit panel, or an 8-bit with dithering? And for a 40-inch 4K monitor under $1000. Sounds improbable.


If it was 1.07 billion, then wouldn't that make this actually a 10-bit panel, or an 8-bit with dithering? And for a 40-inch 4K monitor under $1000. Sounds improbable.

A 6-bit panel at that price also sounds improbable.


A 6-bit panel at that price also sounds improbable.

 

Well, according to Glenwing, whom you so happily agreed with, it's standard practice for monitors to actually be 6-bit with dithering. So this would be no different: just another 6-bit monitor with dithering.


If it was 1.07 billion, then wouldn't that make this actually a 10-bit panel, or an 8-bit with dithering?

 

Right.

 

Seiki press release: http://www.seiki.com/company/news/2014-06-25-Seiki-CE-Week-4K-Display-Launch.php

 

Seiki Pro 4K displays are designed to deliver the ultimate desktop computing experience for intensive computer graphics, photo and video editing, and programming applications, as well as fast-paced 4K PC gaming at up to 60 frames per second.

Seiki will introduce three display sizes, including 28-inch (28U4SEP-G02), 32-inch (32U4SEP-G02) and 40-inch (40U4SEP-G02) models. The current specification list includes:

  • Vertical Alignment (VA) LED panel technology with 3,840 by 2,160 4K Ultra HD resolution
  • 12-bit color processing and 14-bit gamma mode

 

6-bit color depth seems extremely far-fetched. I think it's more likely someone at the booth told him 1.07 million color support by accident, and he didn't make the connection that they meant 1.07 billion.


Yeah, 40" on my desk, sitting less than a meter away from it... No thanks.

It's 40 inches; you don't need to sit that close.


Well, according to Glenwing, whom you so happily agreed with, it's standard practice for monitors to actually be 6-bit with dithering. So this would be no different: just another 6-bit monitor with dithering.

So absolutely not 1.07 million colors, like you have been so adamantly saying for the past few minutes.

 

I wasn't saying it to agree with Glenwing. I actually have some understanding of how this works (not a ton, but clearly more than you).

 

Keep being defensive. It looks great.


Right.

 

Seiki press release: http://www.seiki.com/company/news/2014-06-25-Seiki-CE-Week-4K-Display-Launch.php

 

 

6-bit color depth seems extremely far-fetched. I think it's more likely someone at the booth told him 1.07 million color support by accident, and he didn't make the connection that they meant 1.07 billion.

 

The monitors in that press release are from June and support HDMI 2.0; the monitor in this article does not support HDMI 2.0, so they aren't the same monitor. Those also supposedly support DP 1.3 (which is obviously not out yet), while this one only supports DP 1.2. So, not the same monitor.

 

 

So absolutely not 1.07 million colors, like you have been so adamantly saying for the past few minutes.

 

I wasn't saying it to agree with Glenwing. I actually have some understanding of how this works (not a ton, but clearly more than you).

 

Keep being defensive. It looks great.

 

 

I never once said it was 1.07 million colors. When did I say that? Did you even read anything I wrote? And who's being defensive? LOL.

 

Keep being arrogant. It looks great.


It's 40 inches; you don't need to sit that close.

 

Not the point, mate. If it's a desk monitor, it's going to be used at my desk connected to my PC. My desk is 1 meter deep, and that's the furthest I sit from my monitor when I'm not watching a movie.

 

I'm sure this could be great for content creation, but for my preferences, not so great for gaming. I like having everything on the monitor in my FOV without having to move my head, and that's not possible on a 40" monitor I sit less than 1 meter from.

Bert & Ernie before squirting spermie. 


The monitors in that press release are from June and support HDMI 2.0; the monitor in this article does not support HDMI 2.0, so they aren't the same monitor. Those also supposedly support DP 1.3 (which is obviously not out yet), while this one only supports DP 1.2. So, not the same monitor.

They are the same.

http://hd-report.com/2015/01/07/seiki-intros-pro-4k-monitors-at-ces-2015/

"Right now, the monitors only offer HDMI 1.4, but Seiki promises HDMI 2.0 will be available in the second quarter of 2015. Just as well, DisplayPort 1.2 will be replaced by DisplayPort 1.3 in Q2 2015. Hence, we advise waiting to purchase one of these monitors until those ports are updated."

Also, keep in mind that most of the cheap 28" 4K monitors support 10-bit color depth, and there are some 4K TVs under $1000 that do as well. So it's not that improbable.


They are the same.
http://hd-report.com/2015/01/07/seiki-intros-pro-4k-monitors-at-ces-2015/

"Right now, the monitors only offer HDMI 1.4, but Seiki promises HDMI 2.0 will be available in the second quarter of 2015. Just as well, DisplayPort 1.2 will be replaced by DisplayPort 1.3 in Q2 2015. Hence, we advise waiting to purchase one of these monitors until those ports are updated."

Also, keep in mind that most of the cheap 28" 4K monitors support 10-bit color depth, and there are some 4K TVs under $1000 that do as well. So it's not that improbable.

 

Well, they definitely support 10-bit color, so I stand corrected about it being 6-bit with dithering. At the moment, though, they are technically different monitors: the models in that press release (the ones that will support HDMI 2.0 and DP 1.3) are called 28-inch (28U4SEP-G02), 32-inch (32U4SEP-G02) and 40-inch (40U4SEP-G02) and come in three sizes, while the models in the HD Report article you posted are called 32-inch (SM32UNP) and 40-inch (SM40UNP) and come in only two sizes.


40 inches? Good.

60hz? Good.

Now all it needs to do is support g-sync or freesync and I'll be interested.


Using a TV is fine as long as it has a fast enough refresh rate. If it's 4K/120 Hz it's using HDMI 2.0, so you need to make sure your GPU supports HDMI 2.0; otherwise, use a DisplayPort to HDMI 2.0 adapter and you'll be fine.
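To put rough numbers on why the HDMI version matters for a panel like this (a back-of-envelope sketch of my own; the link figures are approximate usable video data rates, so treat them as ballpark):

# Uncompressed data rate needed for 3840x2160 @ 60 Hz, 8 bpc RGB, using the
# common CTA 4K60 timing (4400x2250 total pixels incl. blanking = 594 MHz clock).
pixel_clock = 4400 * 2250 * 60          # ~594 million pixels per second
needed_gbps = pixel_clock * 24 / 1e9    # 24 bits per pixel -> ~14.3 Gbit/s

links = {                               # approximate usable video bandwidth
    "HDMI 1.4 (340 MHz TMDS)": 8.16,
    "HDMI 2.0 (600 MHz TMDS)": 14.4,
    "DisplayPort 1.2 (HBR2 x4)": 17.28,
}
for name, rate in links.items():
    verdict = "enough" if rate >= needed_gbps else "not enough"
    print(f"{name}: {rate} Gbit/s -> {verdict} for ~{needed_gbps:.1f} Gbit/s")

Which is why HDMI 1.4 tops out at 4K/30 for full RGB, and you need HDMI 2.0 or DisplayPort 1.2 for 4K/60.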

But how do the colors hold up against an IPS 4K display? Color matters more to me than FPS, but I also need a 4K TV at the same time!

 

Not all LG 4K TVs have an IPS panel; the 40" posted above does not have an IPS display. It seems they're only using IPS on the larger sizes.

http://www.lg.com/us/tvs/lg-49UB8200-led-tv - big enough for a monitor or for lying on the futon (I kinda put my bed in the garage for extra desk space).

 

Between the 40" Seiki and the LG, the LG is far superior even if it's not using an IPS panel (LG knows its stuff when it comes to display panels).

 

 

40 inches? Good.

60hz? Good.

Now all it needs to do is support g-sync or freesync and I'll be interested.

You won't find that on a TV. Get a 240 Hz if it matters to you, but 4K/240 Hz costs a premium right now. I'm debating whether I should save up and get an IPS 4K TV that doubles as a TV, or get a high-end IPS 4K monitor.

I'm Batman!

Steam: Rukiri89 | uPlay: Rukiri89 | Origin: XxRukiriXx | Xbox LIVE: XxRUKIRIxX89 | PSN: Ericks1989 | Nintendo Network ID: Rukiri

Project Xenos: Motherboard: MSI Z170a M9 ACK | CPU: i7 6700k | Ram: G.Skil TridentZ 16GB 3000mhz | PSU: EVGA SuperNova 850w G2 | Case: Caselabs SMA8 | Cooling: Custom Loop | Still in progress 


Until I see some reviews on the input lag, I'd hold off.

Can't wait until someone releases a graphics card that can handle 4K graphics without me having to buy more than one.

If you don't use AA at 4K (pretty much unnecessary if you get a 24/25" monitor), a 290X pretty much can handle it after the latest driver updates.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

