
Is G-Sync Possible on IPS/PLS Panels?

Let's start with what we know:
#1 Asus has exclusive rights to G-Sync until Q3 2014, and we've heard no official news or even rumors suggesting that Asus is working on a G-Sync enabled IPS panel.
#2 Backlight strobing/LightBoost is officially available on every G-Sync monitor: SOURCE

#3 Backlight strobing/LightBoost requires a high refresh rate (120Hz or higher) to mitigate flickering: SOURCE

 

There are technical reasons why G-Sync requires a fast panel, and they have to do with backlight strobing, or "LightBoost": this technology requires very high refresh rates to eliminate motion blur (i.e. ghosting) without causing uncomfortable flicker.

 

The vast majority of IPS and PLS panels support only up to 60Hz. We currently do not have any 120Hz IPS panels; there are a few specific Korean IPS monitors that can reach over 90Hz, but only after overclocking.
And since Nvidia has already confirmed that all G-Sync monitors support LightBoost, and are therefore natively 120Hz or higher, we can safely assume that a G-Sync enabled IPS panel does not exist.

However, it is still possible that in a few years' time, once IPS/PLS panels reach suitably low persistence for 120Hz, G-Sync can be implemented on them.
It's also possible that a new panel type emerges that offers image quality comparable to IPS/PLS and speeds comparable, or at least close, to TN.

There is another reason why G-Sync requires high refresh rate monitors to work as intended, and it has to do with input lag.
When the game is running at a frame rate lower than the maximum refresh rate of the monitor (e.g. 55 FPS on a 60Hz panel), G-Sync works as intended, meaning it displays the frames as soon as they're ready.

But when the game is running at a frame rate higher than the maximum refresh rate of the monitor (e.g. 65 FPS on a 60Hz panel), G-Sync can no longer display the frames as soon as they're ready, because the monitor isn't refreshing quickly enough to show 65 FPS, only 60, and here we run into input lag.

[Diagram: the GPU finishes drawing the third frame mid-refresh, but the 60Hz panel has to wait for its next refresh before it can display it]
Here you can see that the third frame has already been completely drawn by the GPU, but the panel wasn't ready to display it because it can only refresh 60 times a second, i.e. once every ~16 milliseconds, so it simply has to wait, and that wait directly results in input lag.
However, if this were a 120Hz panel, the second frame would have been displayed within ~8ms, so the panel would have been ready to show the third frame almost instantly; again, it comes down to the panel's speed.
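
To put rough numbers on that argument, here is a small Python sketch. It is purely illustrative: it only models the rule that a panel needs at least one full refresh interval between scan-outs, not NVIDIA's actual scheduling logic.

def gsync_presents(fps: float, refresh_hz: float, frames: int = 5):
    """For each rendered frame: when it finishes vs. the earliest time the panel can scan it out."""
    min_interval = 1000.0 / refresh_hz   # 60 Hz ~= 16.7 ms, 120 Hz ~= 8.3 ms
    last_scan = 0.0
    rows = []
    for i in range(1, frames + 1):
        ready = i * (1000.0 / fps)                   # rendering finishes here
        shown = max(ready, last_scan + min_interval)  # but the panel may still be mid-refresh
        rows.append((ready, shown, shown - ready))
        last_scan = shown
    return rows

# 65 FPS on a 60 Hz panel: frames finish faster than the panel can scan them out,
# so each one waits a little longer than the last (the input lag described above).
for ready, shown, wait in gsync_presents(65, 60):
    print(f"ready {ready:6.1f} ms  shown {shown:6.1f} ms  waited {wait:4.1f} ms")

# 65 FPS on a 120 Hz panel: every frame can be scanned out the moment it is ready.
for ready, shown, wait in gsync_presents(65, 120):
    print(f"ready {ready:6.1f} ms  shown {shown:6.1f} ms  waited {wait:4.1f} ms")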

 

For anyone who's interested: what backlight strobing does is turn off the panel's backlight in between refresh cycles (while the pixels are switching), to prevent the viewer from seeing the blur, or "ghosting", that occurs during pixel transitions.

So the viewer only sees fully-refreshed frames, bypassing pixel persistence as a motion blur limiting factor.
If the pixels don't transition quickly enough (as in IPS & PLS panels) the refresh rate has to be lowered to prevent blurring.

 

But here and now, G-Sync on IPS/PLS is simply not possible yet.
A good alternative to TN does exist, though, which can achieve similarly high refresh rates and response times while offering better color reproduction and viewing angles: VA, or vertical alignment. While still not as good as IPS or PLS, it is certainly worlds better than TN.

At low refresh rates a full backlight strobe would cause very noticeable flickering, and that's why backlight strobing is currently not possible on IPS and PLS panels.
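
As a rough illustration of that flicker argument: with one backlight strobe per refresh, the flicker frequency equals the refresh rate. The threshold below is an assumed ballpark for when full on/off strobing stops being noticeable, not an official figure.

# Toy model of the flicker argument above: one strobe per refresh means the
# flicker frequency equals the refresh rate.
FLICKER_COMFORT_HZ = 75.0   # assumption: rough threshold where full strobing stops being visible

def strobe_is_comfortable(refresh_hz: float) -> bool:
    return refresh_hz >= FLICKER_COMFORT_HZ

for hz in (60, 85, 120, 144):
    print(f"{hz} Hz strobing comfortable? {strobe_is_comfortable(hz)}")
# 60 Hz -> False (today's IPS/PLS panels), 120/144 Hz -> True (LightBoost-class TN monitors).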


What the literal heck?! You were watching the live stream, I know, everybody knows; you were spamming Linus's Twitter. He clearly said they're compatible and there's no limitation to panel technology.



Hahaha, is this because of what Linus said on the WAN Show? Anyway, G-Sync IPS monitors will probably start appearing soon; there's barely even a prototype out yet.


What the literal heck?! You were watching the live stream, I know, everybody knows; you were spamming Linus's Twitter. He clearly said they're compatible and there's no limitation to panel technology.

There is: a G-Sync monitor has to support a refresh rate of 120Hz or higher, and there is no IPS monitor on the market that natively supports 120Hz.

 


What the literal heck?! You were watching the live stream, I know, everybody knows; you were spamming Linus's Twitter. He clearly said they're compatible and there's no limitation to panel technology.

 

When Andy of nVidia was asked whether LightBoost could be combined with G-SYNC, AndyBNV of nVidia confirmed on NeoGAF:

“We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”

 

John Carmack (@ID_AA_Carmack) tweeted:

“@GuerillaDawg they didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”

Both statements, by Andy and John, confirm that official backlight strobing (LightBoost) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia. The question becomes: can both be combined into adaptive-rate backlight strobing without visible flicker?

 

LightBoost monitors are ALL 120Hz or higher; IPS is limited to 60Hz.


I feel as if they should be putting all their effort into OLED instead of getting an older standard, i.e. IPS/PLS, to work with it.


ROFL Linus, you're wrong, you're just wrong!

You won't get this unless you watched the stream.


It should be fully compatible, but we might have to wait, because it seems like 120/144Hz monitors will be getting G-Sync first.


What you're capable of running a panel at through stock firmware is entirely different from what you can do with direct access to the hardware.



Matter of time :)

That's appropriate, because what I said in my tweet during the WAN Show was, and I quote:

Luke is right, we have actual numbers for Mantle but we did not see G-Sync because we can't.

And it's still only on TN.                                                                                     

Still as in right now.

So my point stands.


Not according to Nvidia.

It will be, and there is no reason for it not to work; all they would really have to do is lower the maximum frequency/screen updates.


It will be, and there is no reason for it not to work; all they would really have to do is lower the maximum frequency/screen updates.

Full backlight strobes do not work as intended at lower refresh rates and will cause flickering, which leads to eye strain and an unpleasant image.

Also, having G-Sync on 60Hz monitors defeats the purpose, since once you reach 60+ FPS on a 60Hz monitor G-Sync will function exactly like V-Sync.

i.e. display older frames -> input lag.


That's appropriate, because what I said in my tweet during the WAN Show was, and I quote:

Still as in right now.

So my point stands.

So basically you believe the marketing guys from AMD, but you don't believe the reporters and reviewers (Linus included) who HAVE seen Gsync?

I know you personally haven't seen Gsync, but have you personally seen/felt the performance difference of enabling Mantle?

Both technologies have great potential, but in my opinion gsync is the easiest/cheapest/fastest to implement. It will be widely available soon, and won't rely on game developers to support it in games.


Also, having G-Sync on 60Hz monitors defeats the purpose, since once you reach 60+ FPS on a 60Hz monitor G-Sync will function exactly like V-Sync.

i.e. display older frames -> input lag.

You clearly don't fully understand what Gsync is, because it doesn't work like V-sync at all...


You clearly don't fully understand what Gsync is, because it doesn't work like V-sync at all...

When the frame rate drops below the maximum refresh rate of the monitor, G-Sync resumes working as intended, i.e. displaying frames whenever they're ready.

However, when a game runs at a frame rate higher than the refresh rate of the monitor, G-Sync does in fact begin to function like V-Sync: it can only push a frame to the display every ~16ms (60Hz = one refresh every ~16ms), so even if a frame was ready at the 8th millisecond, the display has to wait roughly 8 additional milliseconds to show it, because its refresh rate simply cannot keep up. This directly results in input lag.

[Diagram repeated from above: a finished frame waiting on a 60Hz panel's next refresh]

 

So basically you believe the marketing guys from AMD, but you don't believe the reporters and reviewers (Linus included) who HAVE seen Gsync?

I know you personally haven't seen Gsync, but have you personally seen/felt the performance difference of enabling Mantle?

Both technologies have great potential, but in my opinion gsync is the easiest/cheapest/fastest to implement. It will be widely available soon, and won't rely on game developers to support it in games.

That is not the case at all. I do not really question that G-Sync is an improvement; however, how much exactly is this improvement worth to you? Are you willing to pay $400 for a TN 1920x1080 panel? I'm pretty sure no one is willing to make such a decision without experiencing G-Sync first hand.

I would still be very hesitant to replace my IPS or PLS monitor with a TN panel, no matter how free of frame tearing the image might be.

And G-Sync is definitely not the cheapest option since Mantle is completely free.


You clearly don't fully understand what Gsync is, because it doesn't work like V-sync at all...

You obviously have no idea what gsync does.

 

NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps although that's not a limit of G-Sync. There's a lower bound of 30Hz as well, since anything below that and you'll begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.

If you're trying to run more than 60 FPS on your 60Hz screen, it does NOT matter if you have gsync, because the screen will be limited to 60 frames per second either way.

That's why gsync doesn't make sense on IPS screens: they will limit you to 60 FPS no matter how powerful your card is, and if you exceed 60 FPS the screen will be too slow to catch up and you will get input lag.
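
A toy model of the lower bound described in the quoted AnandTech excerpt above. The ~33 ms hold time is simply 1/30 s; the real re-scan behaviour is NVIDIA's and may differ.

MAX_HOLD_MS = 1000.0 / 30.0   # ~33.3 ms: the 30 Hz lower bound mentioned above

def scans_per_frame(fps: float) -> int:
    """How many times each rendered frame gets scanned out if the panel
    cannot hold an image longer than MAX_HOLD_MS."""
    frame_interval = 1000.0 / fps
    scans = 1
    while frame_interval > scans * MAX_HOLD_MS:
        scans += 1
    return scans

for fps in (60, 40, 25, 12):
    print(f"{fps} FPS -> {scans_per_frame(fps)} scan(s) per frame")
# 25 FPS -> 2 scans per frame (the "duplicate" case), 12 FPS -> 3, and so on.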


Regardless of LightBoost, a monitor has to be fast to take advantage of G-SYNC, because G-SYNC is only as fast as the panel you install it on. It makes no sense to install a G-SYNC module on a 30Hz 4K TV because of the flickering and lag: according to anandtech.com, below 30 fps the monitor will display each frame twice, which doubles the input latency.
It makes no sense to install it on a 60Hz 1080p monitor either, because of the input latency when you are running 60+ fps. Simply put, IPS isn't fast enough to make use of G-SYNC properly.
If you want G-SYNC to play nicely with your graphics card, you need to maintain a minimum of 30+ fps and never touch the max refresh rate of your monitor.


When the frame rate drops below the maximum refresh rate of the monitor, G-Sync resumes working as intended, i.e. displaying frames whenever they're ready.

However, when a game runs at a frame rate higher than the refresh rate of the monitor, G-Sync does in fact begin to function like V-Sync: it can only push a frame to the display every ~16ms (60Hz = one refresh every ~16ms), so even if a frame was ready at the 8th millisecond, the display has to wait roughly 8 additional milliseconds to show it, because its refresh rate simply cannot keep up. This directly results in input lag.

That is not the case at all. I do not really question that G-Sync is an improvement; however, how much exactly is this improvement worth to you? Are you willing to pay $300 for a TN 1920x1080 panel? I'm pretty sure no one is willing to make such a decision without experiencing G-Sync first hand.

I would still be very hesitant to replace my IPS or PLS monitor with a TN panel, no matter how free of frame tearing the image might be.

And G-Sync is definitely not the cheapest option since Mantle is completely free.

 

I wasn't talking about the cheapest to buy into; I said the cheapest to implement (i.e. for companies). What do you think will cost more: putting a chip in a monitor that the consumer will pay for, or developing for both Mantle and DirectX while the revenue of your game stays the same?

 

I'm not interested in a monitor that has a higher refresh rate than 60, or that is anything but IPS. Gsync will be available on IPS; it is only a matter of time. If you ask me if I am willing to pay extra for that 60Hz IPS panel with Gsync without having seen it myself: I know what it does, and I've seen the reactions from people like Linus. That is enough for me to decide whether I will buy into the tech. Whether it is for you is up to you, I guess.

Last but not least: I understand perfectly that a 60Hz monitor can't produce more than 60 FPS, and that when you are over that 60Hz, Gsync will act a lot like V-sync, but still not the same. Sure, V-sync helps with the tearing in most cases, but I have almost never seen a game run perfectly smooth with V-sync. Gsync can be a permanent fix for that. "Input lag" is irrelevant, since V-sync has the same limitation...

(People see the word "lag" and automatically think "= BAD!". Input lag is always there, whether you like it or not, what matters is how much of it there is)

Gsync = the GPU tells the monitor when to draw a frame (with the maximum FPS equal to the max refresh rate of your monitor)

V-sync = the monitor keeps refreshing at 60Hz as always, and your GPU "tries" to sync every drawn frame to that same 60Hz (and, in my opinion, fails most of the time); it can't do it perfectly, since the monitor and GPU are not communicating...
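
A rough side-by-side of those two descriptions, for a game rendering at a steady 45 FPS on a 60Hz panel (an illustrative timing model only, not either technology's exact pipeline):

import math

def vsync_times(render_ms: float, refresh_hz: float, frames: int):
    # Fixed refresh grid: each finished frame waits for the next ~16.7 ms tick.
    interval = 1000.0 / refresh_hz
    return [round(math.ceil(i * render_ms / interval) * interval, 1) for i in range(1, frames + 1)]

def gsync_times(render_ms: float, refresh_hz: float, frames: int):
    # The panel refreshes when the frame is ready, as long as scans stay at
    # least one minimum refresh interval apart.
    min_interval = 1000.0 / refresh_hz
    times, last = [], 0.0
    for i in range(1, frames + 1):
        last = max(i * render_ms, last + min_interval)
        times.append(round(last, 1))
    return times

print(vsync_times(1000 / 45, 60, 5))  # uneven 16.7 / 33.3 ms steps -> visible judder
print(gsync_times(1000 / 45, 60, 5))  # steady ~22.2 ms steps -> even pacing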


I wasn't talking about the cheapest to buy into; I said the cheapest to implement (i.e. for companies). What do you think will cost more: putting a chip in a monitor that the consumer will pay for, or developing for both Mantle and DirectX while the revenue of your game stays the same?

 

I'm not interested in a monitor that has a higher refresh rate than 60, or that is anything but IPS. Gsync will be available on IPS; it is only a matter of time. If you ask me if I am willing to pay extra for that 60Hz IPS panel with Gsync without having seen it myself: I know what it does, and I've seen the reactions from people like Linus. That is enough for me to decide whether I will buy into the tech. Whether it is for you is up to you, I guess.

Last but not least: I understand perfectly that a 60Hz monitor can't produce more than 60 FPS, and that when you are over that 60Hz, Gsync will act a lot like V-sync, but still not the same. Sure, V-sync helps with the tearing in most cases, but I have almost never seen a game run perfectly smooth with V-sync. Gsync can be a permanent fix for that. "Input lag" is irrelevant, since V-sync has the same limitation...

(People see the word "lag" and automatically think "= BAD!". Input lag is always there, whether you like it or not, what matters is how much of it there is)

Gsync = the GPU tells the monitor when to draw a frame (with the maximum FPS equal to the max refresh rate of your monitor)

V-sync = the monitor keeps refreshing at 60Hz as always, and your GPU "tries" to sync every drawn frame to that same 60Hz (and, in my opinion, fails most of the time); it can't do it perfectly, since the monitor and GPU are not communicating...

But that's my point: Nvidia has already confirmed that all G-Sync monitors support LightBoost and are thus 120Hz or up. Only TN panels and one VA panel can achieve that refresh rate, so a VA panel with G-Sync is possible right now, unlike IPS/PLS. VA can actually be fairly good in terms of color and viewing angles, so I'll be waiting for one of those; it won't be nearly as good as an IPS 2560x1440 display, but it certainly won't be nearly as terrible as TN.

A G-Sync module costs upwards of $100 per monitor, so it's actually very expensive to implement; in fact, that's about as expensive as the CPU+GPU in the PS4 and Xbox One.

http://www.teamliquid.net/forum/viewmessage.php?topic_id=435625


I'm betting that there will be a 60Hz panel with no LightBoost but with G-Sync on release day; they will want to put that tech in their cheapest compatible panel to cater to the lower end of the market, using the technology to get a leg up on their competitors in the most profitable segment of the market.


This explains a lot! I didn't know G-Sync had all of these limitations; this explains why it requires TN panels.


Cost is still a major problem with G-Sync: Asus wants $410 for its 1080p 144Hz G-Sync monitor, and that's just stupid; you can get a very nice 1440p IPS monitor for that price.


In theory, G-Sync basically makes your refresh rate dynamic, only changing the frame when needed.


