so, what is nVidia's Fast Sync introduced with Pascal GPUs?


TL;DR version

the GPU will have three buffer regions:

  • front buffer - the buffer sent to the display
  • back buffer - where the game engine renders the scene
  • last rendered buffer - where the GPU places the last completely rendered back buffer

how does this work? the game continuously renders into the back buffer, and the last rendered buffer receives each completed scene while the front buffer is displayed on screen

as soon as the image is displayed, the last rendered buffer switches places with the front buffer - thus, there is no tearing; with normal VSync OFF, the back buffer would become the front buffer, and if the scene is not completely rendered when it is scanned out, it will show a tear on the display

the game's engine runs under the same conditions as it would with VSync OFF - it renders scenes as fast as the system can

the main difference is that this adds some latency - the front buffer will only ever contain a completely rendered scene, whereas with VSync OFF the frame is displayed no matter its completion status
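A toy simulation of the three-buffer scheme described above. This is my own reading of the slide, not NVIDIA's driver code; the class and method names are made up for illustration:

```python
class FastSyncBuffers:
    def __init__(self):
        self.front = None          # frame currently scanned out to the display
        self.last_rendered = None  # newest *complete* frame
        # (the back buffer, where the engine renders, is implicit here)

    def render_complete(self, frame):
        # The engine finished a frame: it becomes the "last rendered"
        # frame, silently replacing any older pending frame.
        self.last_rendered = frame

    def vblank(self):
        # At display refresh time, swap the newest complete frame into
        # the front buffer. The front buffer only ever holds whole
        # frames, so nothing can tear.
        if self.last_rendered is not None:
            self.front, self.last_rendered = self.last_rendered, self.front
        return self.front

buf = FastSyncBuffers()
for f in range(1, 6):       # engine renders frames 1..5 between refreshes
    buf.render_complete(f)  # e.g. 300 fps on a 60 Hz panel
shown = buf.vblank()        # display gets frame 5; frames 1-4 are dropped
```

Note how frames 1-4 are simply discarded, which is why the engine can keep running flat out as with VSync OFF.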

 

this new tech is reserved for Pascal GPUs and newer

 

---

 

I will add my own thoughts into the mix

my guess is that this new FastSync will only be beneficial when the game can output a fairly high, stable framerate - otherwise, large framerate fluctuations will introduce stutter

 

---

 

some people thought that FastSync would be nVidia's implementation of VESA's Adaptive Sync - as you can see, that's not the case

Edited by zMeul

Hope this works for you [diagram: "How it works"]

How I see it:
First you render some frames, and only completely rendered frames are saved in the last rendered buffer (i.e. it contains only complete frames).
When it's display refresh time, the newest frame in the last rendered buffer is flipped to the front buffer (older frames get discarded).
After the display... displays (:)) the frame, the whole process starts again for the next refresh.

No tearing - because all "probable display frames" are fully rendered.
Small latency - because actually choosing the newest frame isn't THAT hard.
The picture is/should be smooth, because frames are REALLY close to each other in time (you basically render multiple frames per display refresh cycle).

VSync OFF sends torn or unfinished frames to the display (there is no way to drop them "on the fly").
VSync ON creates lag, because everything must slow down (and that isn't good for fast movement).

CPU : Xeon E5-1680 v2 @ 4,3 GHz + Hydronaut + TRVX + 2x Delta 38mm PWM
MB : Sabertooth X79 (BIOS 4801 + NVMe mod + uCode update)
RAM : 8x A-Data DDR3 XPG 2000X 2GB @ 1868MHz CL8.9.8.24 CR2T, Quad Channel.
GPU : ASUS GTX 1080 (FE)
M.2/HDD : Samsung SM961 256GB (NVMe/OS) + RAID0 2x WD10EZEX (rev. 2013) + HGST Ultrastar 7K6000 6TB

In which scenarios is Fast Sync beneficial? Is this a new type of triple buffering?

 

I wish Nvidia would just support Adaptive Sync already. I have no idea about the engineering cost involved but it seems anti-consumer not to. 

CPU: i5 3570K  4.2 GHz   GPU: MSI GTX970  |  Case: Fractal R4  

Keyboard: Pok3r MX Brown   Mouse:  Razer Deathadder

 

18 minutes ago, Maudima said:

In which scenarios is Fast Sync beneficial? Is this a new type of triple buffering?

this is beneficial for games that render at a higher framerate than your monitor's refresh, where you want latency as low as you can get but don't want tearing

VSync ON gets rid of tearing but adds latency

VSync OFF gets rid of the latency problem but adds tearing

the middle ground is FastSync - some added latency in exchange for no tearing
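As a back-of-envelope illustration of that middle ground (idealized numbers and a simplified model I'm assuming; real pipelines add driver and display delays on top):

```python
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz        # ~16.7 ms per refresh at 60 Hz

# VSync ON (double buffered): a finished frame can sit a full refresh
# waiting for the flip, on top of the refresh it takes to scan out.
vsync_on_worst_ms = 2 * frame_time_ms    # ~33.3 ms

# Fast Sync with the engine at 300 fps: the display always gets the
# newest complete frame, so what you see is at most one refresh old
# plus its own render time.
render_fps = 300
render_time_ms = 1000 / render_fps       # ~3.3 ms
fast_sync_worst_ms = frame_time_ms + render_time_ms   # ~20.0 ms
```

The faster the engine runs above the refresh rate, the fresher the frame that gets shown.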

24 minutes ago, Maudima said:

In which scenarios is Fast Sync beneficial? Is this a new type of triple buffering?

 

I wish Nvidia would just support Adaptive Sync already. I have no idea about the engineering cost involved but it seems anti-consumer not to. 

If Nvidia can engineer a solution with more features and better quality than Adaptive Sync, then that competition is pro-consumer. Just because a standard is open doesn't mean it's the very best.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So this is all on the GPU side, meaning it works on all monitors. It's a good compromise to get something better than VSync, but a better compromise would be FreeSync support.


Kinda is...
The difference is that triple buffering can't 100% kill tearing (since it buffers every frame rendered, regardless of its status), while Fast Sync (in theory) can.
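The distinction described here can be sketched as a FIFO queue versus a single overwritten slot (my own illustration of the claim in this thread, not vendor code; real triple-buffering implementations vary):

```python
from collections import deque

# Classic triple buffering as described here: completed frames queue up
# (FIFO), so the display can lag behind by however many frames wait.
queue = deque(maxlen=2)          # two back buffers behind the front buffer
for frame in [1, 2, 3, 4]:       # engine renders 4 frames between refreshes
    if len(queue) == queue.maxlen:
        continue                 # engine stalls once the queue is full
    queue.append(frame)
shown_triple = queue.popleft()   # refresh displays frame 1 (the oldest)

# Fast Sync: a single "last rendered" slot that is simply overwritten,
# so the refresh always gets the newest complete frame.
last_rendered = None
for frame in [1, 2, 3, 4]:
    last_rendered = frame        # older frames are dropped
shown_fast = last_rendered       # refresh displays frame 4 (the newest)
```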


basically this technology is made for people who run games above the monitor's refresh rate and get heavy tearing; if they turn on VSync they get lag (latency). so someone who doesn't care about "wasted" FPS, like a pro gamer, turns on Fast Sync, renders 200 fps in Dota 2 or League even on a 60Hz or 120Hz monitor, and gets low latency and NO tearing

it's a bit of a niche feature, useful in competitive games; judging by how fast and precise you have to be in MOBAs and FPS like CS:GO, this feature will be welcomed. for the rest of gamers, who are tired after work or school and just want to kill some stuff on screen, this doesn't matter at all.

14 minutes ago, agent_x007 said:

Kinda is... 
Difference is that triple buffering can't 100% kill tearing (since it buffers every frame rendered, regardless of its status), and Fast Sync (in theory) - can.

I've never experienced tearing in windowed mode when the DWM applies triple buffering. Though I have read about people who don't share that experience, which is odd to me.

 

Edit: This post contains a technical error, scroll down for correction! 

Edited by Maudima

2 minutes ago, Maudima said:

I've never experienced tearing in windowed mode when the DWM applies tripple buffering. Though I have read about people who don't share that experience which is odd to me. 

a correction is needed - Triple Buffering does not work in DirectX nor under DWM; it's specific to the OpenGL API

MS has its own implementation of buffering

1 minute ago, DEDRICK said:

How is this different than Borderless Fullscreen?

this has nothing to do with borderless window nor full screen

1 minute ago, DEDRICK said:

Fast Sync - Uncapped, no tearing, added latency

Borderless Fullscreen - Uncapped, no tearing, added latency

borderless window is not as fast as exclusive full screen


I also thought this was very similar to borderless fullscreen or windowed with vsync off. No tearing, no fps cap and reduced latency compared to normal fullscreen vsync at least on AMD cards. Nvidia's fast sync looks like it's much faster though.

 

For comparison from here: http://www.displaylag.com/reduce-input-lag-in-pc-games-the-definitive-guide/

 

Vsync on = 102ms

Windowed vsync = 81ms

Vsync off = 61ms

 

I hope we get to play with this new feature soon.

 

 

10 minutes ago, watt said:

Vsync on = 102ms

Windowed vsync = 81ms

Vsync off = 61ms

 

I hope we get to play with this new feature soon.

 

 

Vsync doesn't work in windowed. 


It's a technology allowing the user to wank to pr0n at 300 fps (faps per second) without tearing.

 

But seriously though, does anyone need that? Is tearing even visible at 200+ fps?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

1 hour ago, Maudima said:

Vsync doesn't work in windowed. 

Did you read the source? VSync is off; Windows is preventing the tearing. I only pasted what was listed in the chart.

 

  • Windowed V-Sync:

It is possible to completely bypass control panel and in-game V-Sync options to utilize Windows’ native V-Sync implementation, which is also triple-buffered. By running the game in a window or borderless window, Windows handles the vertical synchronization, which can lead to smoother gameplay and lower input lag. Make sure you disable V-Sync in all other areas before doing so.


So again... This is effectively a hardware implementation of triple buffering? I wonder how the latency compares to that (as both are obviously better than normal VSync).

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Does it mean that G-Sync becomes useless?

CPU: i7 4790K | MB: Asus Z97-A | RAM: 32Go Hyper X Fury 1866MHz | GPU's: GTX 1080Ti | PSU: Corsair AX 850 | Storage: Vertex 3, 2x Sandisk Ultra II,Velociraptor | Case : Corsair Air 540

Mice: Steelseries Rival | KB: Corsair K70 RGB | Headset: Steelseries H wireless

1 minute ago, IhazHedont said:

Does it mean that G-Sync becomes useless?

no, this has nothing to do with G-Sync

5 minutes ago, Curufinwe_wins said:

So again...  This is effectively a hardware implementation of triple buffering?  I wonder the latency comparisons to that (as both are obviously better than normal vsync). 

Triple Buffering works with VSync ON - FastSync works with VSync OFF

1 minute ago, zMeul said:

no, this has nothing to do with G-Sync

Why ? =)

 

Like, the main goal of G-Sync is to get rid of stutters and tearing.

 

This "fast sync" can remove tearing but not stutters ?

Just now, IhazHedont said:

Why ? =)

 

Like, the main goal of G-Sync is to get rid of stutters and tearing.

 

This "fast sync" can remove tearing but not stutters ?

because G-Sync becomes useful when the frame rate drops below the monitor's normal refresh rate - the G-Sync module brings the monitor's refresh down to what the game's engine is capable of rendering

whereas FastSync works on the opposite side, where the framerate is way higher than the monitor's refresh

 

they can totally work at the same time
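A tiny decision sketch summarizing the two ranges (my own summary of this post; the function name is hypothetical, not an NVIDIA API):

```python
def useful_sync(fps, refresh_hz):
    """Which technique has work to do at a given framerate."""
    if fps < refresh_hz:
        return "G-Sync"     # panel refresh slows down to match the engine
    if fps > refresh_hz:
        return "Fast Sync"  # engine runs ahead; newest complete frame shown
    return "either"
```

For example, `useful_sync(45, 60)` falls in G-Sync territory, while `useful_sync(200, 60)` falls in Fast Sync territory.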

