
LightBoost; say good bye to your CRT

KaareKanin
Solved by mdrejhon:
Buddy, I know what you're saying.
But, we're both correct -- you misunderstood me.
 

So in conclusion, pursuit camera only gets you crap visuals.

 

It's not for visuals; it's for more accurately photographically comparing motion between different displays in a format that's *closer* to what the human eye saw, *than* a static photograph.  As in motion blur, motion artifacts, etc.
 
Here are example pursuit camera photographs, that more properly demonstrates the purpose of pursuit camera photographs:
During motion with framerate matching refresh rate, the photos below are of www.testufo.com/ghosting -- view that webpage in a VSYNC-capable web browser (make sure it runs at full framerate), and note the remarkable similarity between what your human eyes see and the photos below.
 
[Photo: CROPPED_60Hz] 60 Hz LCD

[Photo: CROPPED_120Hz] 120 Hz LCD -- 50% less motion blur than 60 Hz

[Photo: CROPPED_LightBoost50] LightBoost
 
A great example is running the ToastyX Strobelight utility (a new, easy LightBoost utility) on a LightBoost LCD, and then toggling LightBoost ON/OFF (Control+Alt+Plus and Control+Alt+Minus) while viewing www.testufo.com/ghosting in a stable/fluid web browser -- every human who has done this says the photos agree quite accurately with what they saw. There are minor variances between monitors (e.g. IPS vs TN vs VA), but the result always most closely matched the correct photo above, since the sample-and-hold effect is the dominant factor in motion blur on today's monitors. People sensitive to motion blur have confirmed that they see the same benefits in their games (especially when the game runs at a framerate matching the refresh rate). Obviously, you need framerate matching refresh rate to get the maximum possible fluidity on the display (best case scenario).

 

I don't care what the human eye can see. The human eye sees reality. Our visual cortex does the heavy processing and image manipulation to fix errors so the rest of the brain can make sense of the world. And this is the core reason why:

 

No disagreement there.  But you missed the point, again.

 

Scientists already know why motion blur happens on flickerfree displays -- it's called the sample-and-hold effect.  When your eyes are tracking moving objects on a flickerfree display, the static frames mean your eyes are in a different position at the beginning of a refresh than at the end of a refresh.  That means the frames are blurred across your retinas.  The amount of motion blur is directly proportional to the length of the visible part of the refresh.  Mathematically, 1ms of persistence translates to 1 pixel of tracking-based motion blur at 1000 pixels/sec motion (1 pixel per millisecond).  The only way to reduce this type of motion blur is either to add flicker (CRT / plasma / black frames / strobe backlight) or to add extra intermediate frames (interpolation or genuine frames).  Both methods shorten the static period of a frame.
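The persistence-to-blur relationship above is simple enough to sketch in a few lines of Python (the function name and example values are mine, not an official Blur Busters formula):

```python
# Sample-and-hold tracking blur, per the rule above:
# blur (pixels) = persistence (ms) * motion speed (pixels/sec) / 1000

def motion_blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    """Tracking-based motion blur in pixels for a given pixel visibility time."""
    return persistence_ms * speed_px_per_sec / 1000.0

# 60 Hz sample-and-hold LCD: each frame is visible ~16.7 ms.
print(motion_blur_px(16.7, 1000))   # ~16.7 pixels of blur at 1000 px/sec
# LightBoost-style strobe, assuming a ~1.4 ms backlight flash:
print(motion_blur_px(1.4, 1000))    # ~1.4 pixels of blur
```

The strobe length of 1.4ms is an assumed example value; the point is only that blur scales linearly with persistence.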

 

The average user doesn't understand how the "sample-and-hold effect" works (the educational motion tests at www.testufo.com/eyetracking and www.testufo.com/blackframes do help to an extent).  However, it's all well-established science that explains how displays interact with vision.  Here are scientific references covering the well-known vision science of sample-and-hold:

 

List of Science Papers

 

What is needed in LCD panels to achieve CRT-like motion portrayal?
by A. A. S. Sluyterman (Journal of the SID 14/8, pp. 681-686, 2006.)
This is an older 2006 paper that explains how a scanning backlight can help bypass the "hold effect".

 

Temporal Rate Conversion (Microsoft Research)
Information about frame rate conversion, that also explains how eye tracking produces perceived motion blur on a sample-and-hold display, including explanatory diagrams.

Correlation between perceived motion blur and MPRT measurement
by J. Someya (SID05 Digest, pp. 1018-1021, 2005.)
Covers the relationship between human-perceived motion blur and the Motion Picture Response Time (MPRT) of the display. This also accounts for motion blur caused by eye tracking on a sample-and-hold display, a separate factor from pixel persistence.

Frame Rate conversion in the HD Era
by Oliver Erdler (Stuttgart Technology Center, EuTEC, Sony Germany, 2008)
Page 4 has very useful motion blur diagrams, comparing sample-and-hold versus impulse-driven displays.

Perceptually-motivated Real-time Temporal Upsampling of 3D Content for High-refresh-rate Displays
by Piotr Didyk, Elmar Eisemann, Tobias Ritschel, Karol Myszkowski, Hans-Peter Seidel
(EUROGRAPHICS 2010, guest editors T. Akenine-Möller and M. Zwicker)
Section 3. Perception of Displays (and Figure 1) explains how LCD pixel response blur can be separate from hold-type (eye-tracking) motion blur.

Display-induced motion artifacts
by Johan Bergquist (Display and Optics Research, Nokia-Japan, 2007)
Many excellent graphics and diagrams of motion blur, including impulse-driven and sample-and-hold examples.

 

 

....Anyway, it is kind of nonsensical for you to say that I am wrong when I'm just quoting facts, all from pre-established vision research.  It may be a matter of "diminishing returns" (certainly a legitimate wallet concern :)), but I'm certainly not scientifically wrong here -- when you said "while we sometimes can't put our finger on it to pin point what is wrong with what we are seeing", you didn't realize there are already established science papers on this matter that explain a lot about sample-and-hold.

 

In fact, speaking of "wallets", it's already very profitable for ASUS (an ASUS rep mentions its popularity in their NewEgg YouTube video).  The VG248QE is one of ASUS' better-selling monitors, and very few monitors have over 100 Amazon reviews.  Hit Control+F while viewing that page and search for the word "LightBoost"; you'll see numerous users praising it in those customer reviews.  Now Sony and Eizo are following suit with strobe backlights similar to LightBoost, and making them easier to turn on than nVidia's own LightBoost (nVidia originally force-bundled the LightBoost strobe backlight feature with 3D Vision, and made it hard to enable without a 3D Vision kit).  Blur Busters now gets tens of thousands of visitors a week.  Motion-blur-sensitive people exist in numbers big enough to create such demand.  Certainly not "millions of people" demand, but certainly big enough for quite a number of parties who want to get closer to the Holodeck experience.

 

Sincerely,

Mark Rejhon

Owner of Blur Busters / TestUFO 

(Frequent collaborator with review sites including Adam of pcmonitors, Simon of TFTCentral, etc).

No.

You are talking about how flickering a light is a good way to solve the slow response time issue.

I say that this is stupid and has, in my opinion, major flaws; R&D must be done to improve LCD panel response time instead, flicker-free.


No.

You are talking about how flickering a light is a good way to solve the slow response time issue.

I say that this is stupid and has, in my opinion, major flaws; R&D must be done to improve LCD panel response time instead, flicker-free.

 

CRTs aren't flickerfree (at least under a high speed camera).

All CRTs flicker, even if our eyes can't detect it.

 

Instant-pixel-transition displays can still have motion blur (caused by sample-and-hold).  So making pixel transitions infinitely fast won't eliminate motion blur, because transition time isn't the same thing as true response time.  For a related article, see Why Do Some OLEDs Have Motion Blur?  (HDGuru agreed with this in their review as well).

 

Today, it is scientifically impossible to get holodeck quality on a finite-refresh-rate flickerfree display (definition: getting 100% of motion blur to be caused solely by your brain, rather than externally enforced upon you by the display), unless you go to really high Hz (and framerates), which is impractical.  A compromise could be 1000fps @ 1000Hz, like what Michael Abrash of Valve Software mentioned.  But that probably won't happen in our lifetimes.

 

Regardless of CRT/LCD/Plasma/OLED/etc, mathematically, 1ms of persistence (pixel visibility time) translates to 1 pixel of motion blur for every 1000 pixels per second of motion.  (That's because your eyes track across 1/1000th of 1000 pixels in 1 millisecond, blurring 1 pixel over the course of that millisecond.)

 

Let's go to extremes here: even a mathematically perfect, infinite-fast-pixel-transition, 4K 1000fps @ 1000Hz sample-and-hold display (no flicker even under a highspeed camera) is mathematically guaranteed to create an absolute minimum of 4 pixels of motion blur during 4000 pixels/sec panning motion (about one screen width per second).  Vision researchers are already experimenting with 1000Hz DLPs today.  Those experimental projectors cost well into five figures.
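That worked example follows directly from the 1ms-per-1000px/sec rule; here is the arithmetic as a tiny sketch (variable names are mine):

```python
# Even an ideal 1000 Hz sample-and-hold display holds each frame for 1 ms.
persistence_ms = 1000 / 1000        # frame hold time in ms at 1000 Hz
speed = 4000                        # pixels/sec panning (~one 4K screen width/sec)

# blur (px) = persistence (ms) * speed (px/sec) / 1000
blur_px = persistence_ms * speed / 1000
print(blur_px)                      # 4.0 pixels of guaranteed motion blur
```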


Yes, our computers of today were scientifically impossible back in the early 1900's. But hey look, we have them.

Also, I am not talking about Hz, I am talking about response time.

Faster response time reduces blurring. The real fix is a new panel technology that doesn't have this effect. Until that technology reaches the mass market, money should be invested in R&D to make panels faster, not into taking horrible age-old TN panels, sticking a strobe light on the back, and calling it a "fix" -- 'cause that is what LightBoost is, and what you are promoting all day.


Also I am not talking about Hz, I am talking about response time.

 

Read again.  That's what I said too.

-- I already said CRT 60fps@60Hz has less motion blur than regular LCD 120fps@120Hz.

 

If you have ever compared CRT 60fps @ 60Hz, CRT 120fps @ 120Hz, LCD 60fps @ 60Hz, and LCD 120fps @ 120Hz, you already understand what I am talking about.  The motion blur of CRTs does not vary much between refresh rates, because the short phosphor persistence (the brief flash length seen under a high speed camera) is the same at all refresh rates.  There can be fewer stroboscopic effects and less visible flicker at higher refresh rates, but motion blur remains fairly constant at all refresh rates on a CRT during panning motion tests.  By contrast, the blur of non-strobed LCDs varies a lot with refresh rate, because the effective response time resembles the LCD hold time in this situation (e.g. 16.7ms of motion blur even on a 2ms LCD at 60Hz).  Even at 120Hz, motion blur is still worse than a CRT: 1/120sec is 8.3ms, creating approximately 8 times more motion blur than a CRT with about 1 millisecond of persistence.  So while refresh rate is different from true response time, I understand when and how the two interact in specific display technologies.
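The CRT-versus-LCD comparison above can be tabulated with the same persistence rule; the ~1ms CRT phosphor persistence is an assumed round number:

```python
# Approximate hold/persistence times (ms) for the four cases compared above.
# CRT persistence is assumed ~1 ms (phosphor decay), independent of refresh rate;
# a non-strobed LCD holds each frame for the full refresh period.
configs = {
    "CRT 60 Hz":  1.0,
    "CRT 120 Hz": 1.0,
    "LCD 60 Hz":  1000 / 60,    # ~16.7 ms hold time
    "LCD 120 Hz": 1000 / 120,   # ~8.3 ms hold time
}

for name, persistence_ms in configs.items():
    blur_px = persistence_ms * 1000 / 1000  # blur at 1000 px/sec motion
    print(f"{name}: {blur_px:.1f} px of tracking blur at 1000 px/sec")
```

Note how the two CRT rows come out identical while the LCD rows scale with refresh rate, which is exactly the behavior described above.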

 

 

Faster response time reduces blurring. The real fix is a new panel technology that doesn't have this effect. Until that technology reaches the mass market, money should be invested in R&D to make panels faster, not into taking horrible age-old TN panels, sticking a strobe light on the back, and calling it a "fix" -- 'cause that is what LightBoost is.

 

Agree, strobing is a bandaid until 1000fps @ 1000Hz displays arrive.

(For a perfect "better-than-CRT" display that doesn't even flicker under a high speed camera.)

We're not going to get those for a long time.  

And even if that arrives, we're not going to be able to run 1000fps in current games with a GPU for a long time.  Perhaps later this century, of course.

 

So, for now, the "bandaid" of (optional) strobing is a good solution for many of us, even if not for you. Having (optional) strobing is a far easier solution, especially when using strobe rates far above the human flicker detectability threshold.

 

For moving images, true response time is not determined by pixel transition speed.  True response time is determined by pixel visibility time (aka pixel "persistence", aka pixel "hold time").


I got a better idea... OLED.

While they have a "turn off" phase-out, technology advancement can reduce it.

It's already known that OLED can theoretically have less than 0.01ms response time, if memory serves correctly.

While you might say that in order to reach such speed you need some really high refresh rate: getting a TRUE <=8ms response time is possible, and algorithms can be made so that, if a certain pixel is black in the next frame, it can simply be turned off, reaching 0.01ms for the color -> black response, with possibly other tricks for bright-to-dark transitions. Other algorithmic tricks could give 8ms worst-case response time and 0.01ms at best.

~8ms response time is 120Hz, if I am not mistaken (it's actually about 8.3ms). That is incredible, and is faster than all LCD panels. And no flickering!

Also, anything is possible. Remember when they designed the 32-bit CPU: "4GB of RAM? HAHA, we will never reach such a ludicrous amount of memory! And even if we do, nothing will ever use more than that!" (not an actual quote, obviously). Well... someone was wrong.


That post, I do LIKE (clicked "Like").

A few replies, though.

 

I got a better idea... OLED.

 

Agree that OLED is the correct direction to go in.  Lots of improvement is needed.
We need to give it time.  OLED is still worse than CRT right now, though:

 

[Photo: oled-response]

 

 
 

 

It's already known that OLED can theoretically have less than 0.01ms response time, if memory serves correctly.

Not quite just yet, though.  Alas, it needs more R&D.

 

 

While you might say that in order to reach such speed you need some really high refresh rate: getting a TRUE <=8ms response time is possible, and algorithms can be made so that, if a certain pixel is black in the next frame, it can simply be turned off, reaching 0.01ms for the color -> black response, with possibly other tricks for bright-to-dark transitions. Other algorithmic tricks could give 8ms worst-case response time and 0.01ms at best.

 

Yup, but don't forget you must handle both the leading and the trailing edges of motion.  For example, if the pixel was black in the previous refresh and black in the next refresh, then we need to shorten both the leading and trailing edges to pull off a scientifically accurate 0.01ms motion picture response time measurement.  That means the pixel needs to be visible for only 0.01ms in order to meet a true 0.01ms response by scientific standard (google link).

 

The only two scientifically possible ways to pull off a "scientifically true 0.01ms response":

[A] Flash that specific pixel for 0.01ms (quick strobe)

[B] Use 100,000fps @ 100,000Hz (1 second / 0.01ms = 100,000) to fill all the 0.01ms time slots with unique frames (a bit overkill)
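The refresh rate in option [B] falls out of a one-line calculation (function name is mine, for illustration):

```python
# Refresh rate needed so that each frame is visible for only `persistence_ms`
# on a flickerfree (non-strobed) display: rate (Hz) = 1000 / persistence (ms).
def required_hz(persistence_ms: float) -> float:
    return 1000.0 / persistence_ms

print(required_hz(0.01))  # 100000.0 Hz for a true 0.01 ms flickerfree persistence
print(required_hz(1.0))   # 1000.0 Hz for 1 ms persistence (CRT-like clarity)
```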

 

Yes, one could theoretically keep non-moving pixels continuously shining, and handle moving pixels differently as above (either [A] or [B]) -- something that more closely "resembles" the equivalent of the "infinite refresh rate" of real life (real life has no "frames"; everything is continuous), eliminating anything that looks "off" or "wrong" to human vision.  All sorts of complex engineering could eventually pull that off.  What you can't do is keep a pixel continuously shining for long periods (e.g. 8ms) inside a moving object and expect zero motion blur.

 

You can also replace the word "black" with "blue" or another color X.  E.g. if you have fast-moving fine yellow details on a blue background, the color value of the previous refresh and the next refresh may be blue, so you may end up with a blue-yellow-blue cycle that's 0.01ms in length.  To do that without strobing requires option [B] (the 100,000Hz refresh rate), since you can't otherwise get a scientifically true 0.01ms response time on a flickerfree display.  Perhaps a refreshrate-free display, where pixels can be updated at any point within a microsecond (e.g. each pixel as an independently controlled element on a microsecond time basis, without an artificial refresh schedule), could do it.  You can't do a scientifically true 0.01ms flickerfree response on a forced "pixel update schedule" (aka refresh rate) of every 8ms (120Hz) or 16.7ms (60Hz), without the ugly strobing/flicker/CRT band-aid.  But new tech could be invented.  Who knows.

 

Anyway, a scientifically true 0.01ms measurement (persistence time, not transition time) is overkill.

 

More realistically, let's use 1ms instead of 0.01ms.  A good eventual technological compromise, within a few decades, will be a 1000fps @ 1000Hz (true native refresh rate) OLED, backed by a GPU capable of 1000fps.  This will allow CRT motion clarity on a display that doesn't flicker at all for the entire human population (e.g. zero flicker seen even under a high speed camera).

 

Or, who knows, before the end of this century we might come up with a way to simulate infinite refresh rates / frame rates.  Holodeck perfection where "nothing looks off".


I said: theoretically, not practically. This means we know it's possible for sure, but we can't do it with the technology and knowledge we have now; we need more R&D, or it would cost a fortune to make it happen with current technology and knowledge.

OLED is not ready. So pointing out flaws in OLED is pointless.


Right, agreed on that point.

 

So we're stuck with stopgap solutions for now.

Even Sony's Trimaster OLED uses a strobe mode, as does the LG OLED HDTV.  It has a black frame insertion mode (aka strobing, in long flashes).

Some of us think strobing is an elegant stopgap solution for the OLED / LCD problem.  Honestly, it's not as inelegant as you think.  

 

Sometimes almost more elegant than the beautiful Rube Goldberg contraption of vacuum-filled glass balloons ("CRT") containing an electron gun that shoots a beam of electrons, cleverly steered by magnetic fields to aim through the holes of a shadowmask/grille, slewing the beam back and forth (raster scanning), hitting luminescent material (phosphor) of three primary colors, to finally paint an image that's only useful to trichromat beings sensitive to certain wavelengths (human eyes).  Hypothetically, you can imagine an alien civilization that never invented CRTs witnessing a CRT tube and being puzzled by how that contraption works.  From that perspective, CRTs are a bandaid technology on the route to Holodeck perfection.  I loved CRTs too, by the way, but I'm just saying they're no less Rube Goldberg than strobing an OLED or LCD.


  • 2 years later...

Is there any way I can use LightBoost technology with my Xbox One? I have an ASUS VG248QE 3D monitor.


This topic is now closed to further replies.
