
Dissecting G-Sync and FreeSync - How the Technologies Differ

werto165

 

Didn't really know where to post this, but I thought it would be best here, as I feel there's a lot of misinformation about; hopefully PCPer can shed some light on it. I don't have time at the moment to watch it, but I thought I might as well post it.

 

There is an article to go along with it, probably builds on what is said in the video: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

CPU: Intel 3570 GPUs: Nvidia GTX 660Ti Case: Fractal Design Define R4  Storage: 1TB WD Caviar Black & 240GB HyperX 3K SSD Sound: Custom One Pros Keyboard: Ducky Shine 4 Mouse: Logitech G500

 


[Image: round-2-fight.png]

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Great video and analysis. Didn't know that both G-Sync and FreeSync had that "minimum FPS" thing...
After this analysis I wouldn't hesitate to go for a G-Sync monitor, just because frame dips are VERY common in gaming, no matter what rig you have.

I really REALLY hope AMD can fix this difference between the technologies through drivers and offer the same experience as Nvidia's G-Sync.

Processor: Intel i7 4790k @ 4.6GHz w/ 1.27v | Gpu: ASUS GTX 980Ti Strix | MoBo: ASUS Maximus Gene VII | Ram: Corsair Vengeance Pro 16GB 2400mhz | Case: Corsair Obsidian 350D | PSU: Corsair AX860i | Cooling: Corsair H100i + 4 Corsair SP 120mm fans | Keyboard: Logitech K800 | Mouse: Logitech Anywhere | Storage (OS):Samsung 840 Evo 500GB | Mass storage: WD Green 2TB + WD Blue 500GB + 1TB | Monitor: ASUS PB27Q  | Sound: Edifier C11 + Sennheiser RS 175


Summary of video: It digs into G-Sync and FreeSync by experimenting with what happens below the lower bound of the variable refresh rate range. G-Sync's module seems to redraw frames when it has been waiting too long to draw the next one. As a result, even when you get a lower FPS, it still feels smooth. FreeSync's implementation did not have this, resulting in a juddery experience.
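(Here's a minimal sketch of that redraw behavior as I understand it from the video; `wait_for_frame`, `draw`, and the 30 Hz floor are hypothetical stand-ins of mine, not anything from Nvidia's actual module.)

```python
# Sketch of the redraw idea: if the next frame misses the panel's
# longest allowed refresh interval, show the previous frame again.
PANEL_MIN_HZ = 30                 # hypothetical VRR floor
MAX_WAIT = 1.0 / PANEL_MIN_HZ     # longest the panel can hold an image

def scanout_loop(wait_for_frame, draw):
    last_frame = None
    while True:
        frame = wait_for_frame(timeout=MAX_WAIT)  # None on timeout
        if frame is None:
            if last_frame is not None:
                draw(last_frame)  # timed out below the floor: redraw
        else:
            draw(frame)           # new frame arrived inside the range
            last_frame = frame
```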

 

Opinion:

I'm still kinda confused. I mean, AMD just sort of introduced variable refresh rate to us. Wouldn't the solution to this be up to the manufacturers, to improve their modules to deal with it? The real difference between G-Sync and FreeSync, in my opinion, is that Nvidia built the solution (for a price, of course) for monitor companies to just drop in. FreeSync seems more like AMD saying "here is a way to have variable refresh rate," and monitor companies do what they want with it (probably why we see all these different refresh rate ranges). Regardless, I still want to see one of these technologies, so I can see how smooth the experience is.


Cool, I was wondering exactly what they meant by handling the lower end better. Good to have an actually thorough explanation available. It's too bad that the first monitors available with Adaptive Sync are not well implemented; I really hope the next few releases address these problems.


This is a very interesting comparison.  

 

I think the more important discussion is to figure out where the industry is headed.  Basically I'd love to see where G-Sync 2.0 and FreeSync 2.0 take us.  

 

First generation is never something that you should jump on. 

@TechBenchTV

 

Ex-NCIX Tech Tips Producer.  Linus hates my scripts. 


Summary of video: It digs into G-Sync and FreeSync by experimenting with what happens below the lower bound of the variable refresh rate range. G-Sync's module seems to redraw frames when it has been waiting too long to draw the next one. As a result, even when you get a lower FPS, it still feels smooth. FreeSync's implementation did not have this, resulting in a juddery experience.

 

Opinion:

I'm still kinda confused. I mean, AMD just sort of introduced variable refresh rate to us. Wouldn't the solution to this be up to the manufacturers, to improve their modules to deal with it? The real difference between G-Sync and FreeSync, in my opinion, is that Nvidia built the solution (for a price, of course) for monitor companies to just drop in. FreeSync seems more like AMD saying "here is a way to have variable refresh rate," and monitor companies do what they want with it (probably why we see all these different refresh rate ranges). Regardless, I still want to see one of these technologies, so I can see how smooth the experience is.

Yeah, monitor manufacturers are like lazy and whatever...

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


GG Freesync. 

 

Imo the two companies should stop this "war" they have against each other and just collaborate to produce an ACTUALLY "FREE" variable refresh technology using the best parts of both the existing technologies. 

 

Nvidia has nothing to lose as they have the bigger market share anyway, and AMD has nothing to lose as it means people who would otherwise buy a G-Sync monitor (because it's obviously the better technology) could now use AMD cards with it.

 

makes sense to me.

This is what I think of Pre-Ordering video games: https://www.youtube.com/watch?v=wp98SH3vW2Y


Yeah, G-Sync handles low frame rates better than FreeSync right now. Thanks PCPer for the thorough explanation.

Hopefully AMD can fix it through drivers.

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


Yeah, G-Sync handles low frame rates better than FreeSync right now. Thanks PCPer for the thorough explanation.

Hopefully AMD can fix it through drivers.

I don't think they can fix it that simply; as I understand it, the G-Sync module has its own storage in which it saves the last frame, which allows for what it does at low frame rates.

In theory you could do that on the PC side, but that would mean giving up some VRAM/RAM, and I don't think anyone wants that.

 

RTX2070OC 


I don't think they can fix it that simply; as I understand it, the G-Sync module has its own storage in which it saves the last frame, which allows for what it does at low frame rates.

In theory you could do that on the PC side, but that would mean giving up some VRAM/RAM, and I don't think anyone wants that.

 

Should not be a problem. One frame does not take up that much; that is basically what a frame buffer already is, so you might not even notice. Now that the secret is out, AMD just needs to get going, because the way G-Sync deals with this is nice.
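(For scale, a rough back-of-the-envelope calculation; the 1440p/24-bit figures are just an assumed example.)

```python
# Size of one uncompressed 24-bit 2560x1440 frame (illustrative only).
width, height, bytes_per_pixel = 2560, 1440, 3
frame_bytes = width * height * bytes_per_pixel
print(frame_bytes / 2**20)  # ~10.5 MiB -- tiny next to a 4 GB card
```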

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


GG Freesync. 

 

Imo the two companies should stop this "war" they have against each other and just collaborate to produce an ACTUALLY "FREE" variable refresh technology using the best parts of both the existing technologies. 

 

Nvidia has nothing to lose as they have the bigger market share anyway, and AMD has nothing to lose as it means people who would otherwise buy a G-Sync monitor (because it's obviously the better technology) could now use AMD cards with it.

 

makes sense to me.

This would actually be bad for us (consumers)... Lack of competition results in less innovation, no price wars, and slower progress in technology.

Like how a world war (a real one where people die and stuff?) slings us ten years forward in terms of technology, competition will always make things progress faster.

PLEASE QUOTE OR TAG (WITH @) ME IF YOU REALLY REALLY REALLY WANT ME TO REPLY!!!!!!!

Also if your issue is solved don't forget to mark the thread as solved!
Peace!!! from a random person in the tech's god forsaken land (named Finland or as I like to call it sarcastically FUNland)


This would actually be bad for us (consumers)... Lack of competition results in less innovation, no price wars, and slower progress in technology.

Like how a world war (a real one where people die and stuff?) slings us ten years forward in terms of technology, competition will always make things progress faster.

Leave it to the internet to make an argument for an open standard being anti-consumer.


That was a really good video. Nice to see what actually happens.

With the Adaptive-Sync standard going all the way from 9 Hz to 240 Hz, I would suspect AMD was hoping for a wider refresh range. Personally I think all Adaptive-Sync screens should work down to 20 Hz, or at least 24 Hz. Then the higher-end ones could try to reach further (but is there really any point in that?)

 

Hopefully AMD can implement a driver solution for it until the panels get better. But if they make a solution for it, that might cause the panel makers not to strive for a better range. Tough choice.

That's a little downside of FreeSync: it is up to the manufacturers how good the FreeSync part of it is. It could end up hurting AMD if many displays don't go down further than 40 Hz (or 45 Hz, where some stop now). Open standards are good, but they can be a bit risky for the company.
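(One way to see why the width of the range matters; a hypothetical check of mine, assuming a driver-side frame doubler like the one G-Sync does in hardware.)

```python
# Frame doubling below the VRR floor only helps if the doubled rate
# still fits inside the panel's supported range: 2 * min_hz <= max_hz.
def can_double_below_floor(min_hz, max_hz):
    return 2 * min_hz <= max_hz

print(can_double_below_floor(40, 144))  # True: 39 fps can be shown at 78 Hz
print(can_double_below_floor(48, 75))   # False: doubling 47 fps needs 94 Hz
```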

 

Let's hope we see 20-144 Hz (IPS) screens soon

Ryzen 7 5800X     Corsair H115i Platinum     ASUS ROG Crosshair VIII Hero (Wi-Fi)     G.Skill Trident Z 3600CL16 (@3800MHzCL16 and other tweaked timings)     

MSI RTX 3080 Gaming X Trio    Corsair HX850     WD Black SN850 1TB     Samsung 970 EVO Plus 1TB     Samsung 840 EVO 500GB     Acer XB271HU 27" 1440p 165hz G-Sync     ASUS ProArt PA278QV     LG C8 55"     Phanteks Enthoo Evolv X Glass     Logitech G915      Logitech MX Vertical      Steelseries Arctis 7 Wireless 2019      Windows 10 Pro x64


So the tl;dr for those that didn't already know.... G-Sync's more mature than FreeSync.

CPU: FX 6300 @ stock Mobo: Gigabyte 990FX UD5 v3.0 GPU: 1 x R9 290 4GB RAM: 24GB DDR3 1600 SSD: Kingston HyperX 3K 120GB HDD: 1 x 1TB & 1 x 500GB PSU: BeQuiet PowerZone 1000W Case: Coolermaster Elite 370 (upside down due to lack of sticky thermal pads for memory heatsinks) CPU Cooler: Thermalright Ultra Extreme 120 GPU Coolers: Thermalright HR03-GT Fans: 5 x Akasa Apache Blacks, 1 x Corsair 120mm SP HP (GPU) & 1 x Noctua 92mm
Most of this was from a mining rig, hence the screwy specs (especially the PSU)


Great video and analysis. Didn't know that both G-Sync and FreeSync had that "minimum FPS" thing...

After this analysis I wouldn't hesitate to go for a G-Sync monitor, just because frame dips are VERY common in gaming, no matter what rig you have.

I really REALLY hope AMD can fix this difference between the technologies through drivers and offer the same experience as Nvidia's G-Sync.

They (AMD) could just double each frame once the frame rate drops to 1/2 of the monitor's max refresh rate, then multiply it by 3-4 once it reaches 1/4 of the max refresh rate, and so on. All they need to do for that is detect the monitor's max refresh rate, and that is easy to detect.
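(A minimal sketch of that idea, illustrative only and not AMD's driver code; it assumes the driver already knows the monitor's max refresh rate.)

```python
# Repeat each frame as many times as fits under the panel's max rate,
# so the effective scanout rate stays up inside the VRR range.
def repeats_for(fps, max_hz):
    n = 1
    while fps * (n + 1) <= max_hz:  # room for one more repeat?
        n += 1
    return n

# On a 144 Hz panel: 30 fps -> each frame shown 4x (120 Hz scanout),
# 70 fps -> each frame shown 2x (140 Hz scanout).
print(repeats_for(30, 144))  # 4
print(repeats_for(70, 144))  # 2
```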

hello!

is it me you're looking for?

ᴾC SᴾeCS ᴰoWᴺ ᴮEᴸoW

Spoiler

Desktop: X99-PC

CPU: i7 5820k

Mobo: X99 Deluxe

Cooler: Dark Rock Pro 3

RAM: 32GB DDR4
GPU: GTX 1080

Storage: 1TB 850 Evo, 1TB HDD, bunch of external hard drives
PSU: EVGA G2 750w

Peripherals: Logitech G502, Ducky One 711

Audio: Xonar U7, O2 amplifier (RIP), HD6XX

Monitors: 4k 24" Dell monitor, 1080p 24" Asus monitor

 

Laptop:

-Overkill Dell XPS

Fully maxed out early 2017 Dell XPS 15, GTX 1050 4GB, 7700HQ, 1TB nvme SSD, 32GB RAM, 4k display. 97Whr battery :x 
Dell was having a $600 off sale for the fully specced out model, so I decided to get it :P

 

-Crapbook

Fully specced out early 2013 MacBook "pro" with a GT 650M and a constant 105°C temperature on the CPU (GPU is 80-90°C) when doing anything intensive...

A 2013 laptop with a regular-sized battery still has better battery life than a 2017 laptop with a massive battery! I think this is a testament to Apple's ability at making laptops, or maybe to how little CPU technology has improved even 4+ years later (at least until the recent introduction of 15W 4-core CPUs). Anyway, I'm never going to get a 35W CPU laptop again unless battery technology becomes ~5x better than it is in 2018.

Apple knows how to make proper consumer-grade laptops (they don't know how to make pro laptops though). I guess this is mostly software power efficiency related, but getting a Mac makes perfect sense if you want a portable/powerful laptop that can do anything you want it to with great battery life.

 

 


They (AMD) could just double each frame once the frame rate drops to 1/2 of the monitor's max refresh rate, then multiply it by 3-4 once it reaches 1/4 of the max refresh rate, and so on. All they need to do for that is detect the monitor's max refresh rate, and that is easy to detect.

 

Indeed. Actually, the Adaptive-Sync standard has a handshake through which the monitor tells the graphics card what min/max Hz it supports. I would love for AMD to make such an implementation in the FreeSync drivers, as Nvidia's solution is quite nice.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


So the tl;dr for those that didn't already know.... G-Sync's more mature than FreeSync.

It's not more mature; it was implemented better. The Swift and the monitor that was upgradeable are using the original G-Sync module and are performing better under the lower limit.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7Ghz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

So G-Sync handles lower FPS better and frame-caps you if you go over the limit?

 

Meh, I never allow my games to dip below 50, and I can just cap my FPS / use V-Sync / raise graphics settings to stay in the FreeSync range.

 

Victory, imo, goes to FreeSync monitors for value.


Yeah, G-Sync handles low frame rates better than FreeSync right now. Thanks PCPer for the thorough explanation.

Hopefully AMD can fix it through drivers.

The frame redrawing is already built into the DisplayPort spec. I think AMD just wanted to release FreeSync as fast as possible, and they are probably going to implement it later.

