CONFIRMED: G-SYNC includes LightBoost sequel. (nVidia sanctioned, no hack)

mdrejhon

When AndyBNV of nVidia was asked whether LightBoost could be combined with G-SYNC, he confirmed on NeoGAF:

 

“We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”

 

This scientifically confirms that strobing is used, by simple vision physics: there is no other way to achieve a LightBoost-matching low-persistence mode without ultra-high refresh rates (e.g. 1000fps @ 1000Hz) or frame interpolation (e.g. 200fps -> 1000fps). Since both are unlikely with nVidia G-SYNC, this officially confirms backlight strobing. In addition, John Carmack confirmed on Twitter that a better backlight strobe driver is included:
 

John Carmack (@ID_AA_Carmack) tweeted:

“@GuerillaDawg [they] didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”


Both statements, by Andy and by John, confirm that official backlight strobing (a LightBoost sequel) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia. The question becomes: can both be combined into adaptive-rate backlight strobing?

 

UPDATE: Your existing ASUS VG248QE monitor is already upgradeable to G-SYNC!

 

______________

 

For those not aware: strobe backlights eliminate motion blur on LCDs. The backlight is turned off while waiting for pixel transitions (unseen by human eyes), and is strobed only on fully refreshed LCD frames (seen by human eyes). The strobes can be shorter than the pixel transitions themselves, breaking the pixel-transition speed barrier! This allows an LCD to have motion as clear as a CRT.

Since GtG (pixel transition time) is now shorter than persistence (how long each refresh stays displayed), most motion blur today is caused by persistence, as demoed by www.testufo.com/eyetracking
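A rough back-of-envelope for the eye-tracking effect above (numbers are illustrative, not from the post): for a tracked moving object, perceived blur width is roughly tracking speed multiplied by persistence, independent of GtG once GtG is shorter than persistence.

```python
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate motion blur width in pixels for an eye-tracked object."""
    return speed_px_per_s * (persistence_ms / 1000.0)

# 960 px/s panning motion (one screen width per second on a 960px-wide view):
full = blur_px(960, 1000 / 120)   # 120 Hz sample-and-hold: ~8.3 ms persistence
strobed = blur_px(960, 1.4)       # ~1.4 ms LightBoost-style strobe flash
print(f"sample-and-hold: {full:.1f} px, strobed: {strobed:.2f} px")
```

At the same refresh rate, shortening persistence from ~8.3 ms to ~1.4 ms cuts tracked motion blur roughly sixfold, which is why strobing matters more than GtG here.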

You have my attention Nvidia...

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball

I still don't know what LightBoost is (don't hurt me)

Cpu: Intel i7 4770k @4.4 Ghz | Case: Corsair 350D | Motherbord: Z87 Gryphon | Ram: dominator platinum 4X4 1866 | Video Card: SLI GTX 980 Ti | Power Supply: Seasonic 1000 platinum | Monitor: ACER XB270HU | Keyboard: RK-9100 | Mouse: R.A.T. 7 | Headset : HD 8 DJ | Watercooled

I still don't know what LightBoost is (don't hurt me)

 

For those living under a rock: LightBoost (for 2D) is a strobe backlight that eliminates motion blur in a CRT-style fashion.  It was designed to eliminate crosstalk for 3D, but it also eliminates motion blur (for 2D too), and it is now more popular for 2D.  Just google "lightboost".   See the "It's like a CRT" testimonials, the LightBoost media coverage (AnandTech, ArsTechnica, TFTCentral, etc.), the improved Battlefield 3 scores from LightBoost, the photos of 60Hz vs 120Hz vs LightBoost, the science behind strobe backlights, and the LightBoost instructions for existing LightBoost-compatible 120Hz monitors.   It is truly an amazing technology that allows LCD to have less motion blur than plasma/CRT.  John Carmack has been using a LightBoost monitor since December 2012 (Blur Busters broke the news before John did, though!), and Valve Software has talked about strobing solutions too.  Now you're no longer living under a rock!

I am planning a whole new build next year, and this is sounding more appealing than it did before.

| Case:  NZXT H440 White| Motherboard: ASUS Sabertooth Z97 Mark I| CPU: Intel core i7 4770k | Cooling: NZXT Kraken X60  | Graphics Card: EVGA 780ti Classified| SSD: Samsung 840 EVO 250gb and 500gb | HDD: WD Black 2tb| RAM: Corsair Dominator Platinum 16gb| PSU: Corsair RM1000 | Monitor: LG 34UM95| Keyboard: Logitech G15 | Mouse: Razer Mamba| Audio: HyperX Cloud |

Currently the backlight strobes once per refresh cycle; it works with high-refresh-rate monitors because the strobing is fast enough that you don't see the flicker. G-SYNC introduces dynamic refresh rates, so what happens when frames dip to 40 or 30fps? Surely the backlight wouldn't strobe as low as 30Hz. Maybe it dynamically adjusts in higher multiples of these refresh rates? But if the strobing is dynamic, doesn't the brightness need to be adjusted on the fly too? The faster it strobes, the brighter the backlight will appear.

 

Guess we'll see.
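On the brightness question above: average light output of a strobed backlight is roughly strobe rate times pulse width, so a hypothetical controller could hold brightness constant across a varying refresh rate by widening the pulse as the rate drops. The 15% duty-cycle figure below is an illustrative assumption, not a known G-SYNC parameter.

```python
def pulse_width_ms(refresh_hz: float, target_duty: float = 0.15) -> float:
    """Pulse width needed to keep average brightness constant.

    Average brightness ~ refresh_hz * pulse_width, so holding the duty
    cycle fixed means the pulse width scales inversely with refresh rate.
    """
    return (target_duty / refresh_hz) * 1000.0

for hz in (120, 85, 60, 40):
    print(f"{hz:3d} Hz -> {pulse_width_ms(hz):.2f} ms pulse")
```

Note the trade-off this exposes: at low refresh rates the pulse must grow long (more persistence, more blur) to avoid dimming, which is one reason strobing and variable refresh are hard to combine.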

Currently the backlight strobes once per refresh cycle; it works with high-refresh-rate monitors because the strobing is fast enough that you don't see the flicker. G-SYNC introduces dynamic refresh rates, so what happens when frames dip to 40 or 30fps? Surely the backlight wouldn't strobe as low as 30Hz. Maybe it dynamically adjusts in higher multiples of these refresh rates? But if the strobing is dynamic, doesn't the brightness need to be adjusted on the fly too? The faster it strobes, the brighter the backlight will appear.

 

Look closely at John Carmack's tweet -- it's a choice, an "either-or" option:

-- G-SYNC mode: better for variable framerates (fewer stutters, but more blur)
-- Strobe mode: better for consistent max framerates, 120fps @ 120Hz (zero motion blur)

However, I believe I've invented a variable-rate strobing algorithm that gradually becomes PWM-free below 60Hz: creative strobe curve shaping that eliminates flicker (for most people).  I'm hoping nVidia adopts this.
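The actual curve isn't described in this thread, so purely as a hypothetical sketch: such an algorithm could blend from short strobe pulses at high refresh rates toward an always-on (PWM-free, duty cycle 1.0) backlight below some threshold. All thresholds and duty values here are made-up illustrations.

```python
def strobe_duty(refresh_hz: float, low: float = 40.0, high: float = 60.0,
                strobed_duty: float = 0.15) -> float:
    """Backlight duty cycle: short strobes at high refresh rates, blending
    linearly toward always-on (duty 1.0, i.e. PWM-free) as the refresh
    rate falls below `high`, reaching steady light at or below `low`."""
    if refresh_hz >= high:
        return strobed_duty          # full strobing: crisp motion
    if refresh_hz <= low:
        return 1.0                   # steady backlight: no visible flicker
    t = (high - refresh_hz) / (high - low)   # 0 at `high`, 1 at `low`
    return strobed_duty + t * (1.0 - strobed_duty)
```

The design idea is that flicker sensitivity rises as strobe rate drops, so the curve trades motion clarity for flicker-free light exactly in the range where strobing would become visible.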

And do I get to choose between these two, or does the monitor automatically select between them depending on how the frames vary?

And do I get to choose between these two, or does the monitor automatically select between them depending on how the frames vary?

 

You get to choose via nVidia menus (OSD or Control Panel; not sure which).

An easy option, like VSYNC ON/OFF.

If the VG278HE won't support it, I'm gonna be very disappointed. Does the monitor need a DisplayPort before the upgrade? If so, I'm boned...

AMD FX8320 @3.5ghz |  Gigabyte 990FXA-UD3  |  Corsair Vengeance 8gb 1600mhz  |  Hyper 412s  |  Gigabyte windforceR9 290  |  BeQuiet! 630w  |  Asus Xonar DGX  |  CoolerMast HAF 912+  |  Samsung 840 120gb


2 WD red 1tb RAID0  |  WD green 2tb(external, backup)  |  Asus VG278He  |  LG Flatron E2240  |  CMstorm Quickfire TK MXbrown  |  Sharkoon Fireglider  |  Audio Technica ATH700X


#KILLEDMYWIFE
