
G-sync vs FreeSync

Poggies

So I'm planning to buy a new 144 Hz monitor, but the question is: should it be FreeSync or G-Sync? What's really the difference between those 2? Most people say they are the same, but G-Sync is more expensive. Do I even need one of them, as I get less than 144 fps (around 110)?


FreeSync and "G-Sync Compatible" are both exactly the same thing: a fancy name for the adaptive sync spec already built into DisplayPort (and more recently HDMI, although not many monitors or GPUs support it). Full-blown G-Sync is better than those, since it has a significantly wider framerate window, but it's more expensive and much less common now that Nvidia has implemented standard adaptive sync.

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi  | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 Fractal Design Meshify C Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


14 minutes ago, Poggies said:

people say that they are the same but g sync is more expensive

Yes, but actually no. There are actually three different versions of G-Sync.

 

FreeSync - Just a rebrand of the VRR spec for DisplayPort. It's the cheapest for display manufacturers to implement.

 

G-Sync Compatible - The exact same thing as Freesync, just validated to work with Nvidia GPUs. Most Freesync monitors will work on Nvidia cards, but there are a few that have issues. Buying one of these basically guarantees there won't be any issues. There might be some licensing that the display manufacturer needs to pay for to get this sticker which would explain why they're a little more expensive, but don't quote me on that.

 

G-Sync - There's an actual module installed in the monitor to control VRR. It's more expensive but usually has a larger range of refresh rates that it can vary between. 

 

G-Sync Ultimate - Everything above but also supports HDR and a couple other features. You only really find it on super expensive 4K 144Hz monitors.


1 hour ago, Mel0nMan said:

What GPU do you have? If it's AMD, FreeSync will be better; if it's Nvidia, G-Sync will be better, AFAIK.

I've got a GTX 1650. How will it be better, though?


9 hours ago, Mel0nMan said:

AMD makes FreeSync, Nvidia makes Gsync and it works better with their respective cards. So yeah, Gsync for you. 

Will it be useful even if I get less than 144 fps?


Let's get back to the question: should you get a 144 Hz monitor with adaptive sync technology from either side, even when you're getting below the refresh rate?

I think you have a misconception, and/or are worrying too much that the game's framerate should equal the monitor's refresh rate at all costs.

With adaptive sync technology from either side, the monitor will correctly display that framerate even if it is below the monitor's refresh rate; anything higher gets capped at the maximum refresh rate (144 fps), like Vsync. However, since it's adaptive sync, there will be no screen tearing.

The problem with FreeSync (or G-Sync Compatible) is the adaptive sync range: most FreeSync monitors have a lower bound of 40 or 48 Hz, and if your framerate drops below that range, screen tearing occurs.

G-Sync (the original, with the hardware module) has no such lower bound. While it is the more expensive solution, it guarantees that everything from 1 to 144 Hz is displayed correctly.
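A rough sketch of the behavior described above (a toy model, not any monitor's actual firmware; the 48-144 Hz window is just a typical FreeSync range, and a hardware G-Sync module would behave as if the lower bound were 1 Hz):

```python
def display_refresh(fps, vrr_min=48, vrr_max=144):
    """Toy model of a VRR monitor's refresh behavior.

    vrr_min/vrr_max are illustrative values for a typical FreeSync
    window; a hardware G-Sync module would effectively use vrr_min=1.
    Returns (refresh_rate_hz, tearing).
    """
    if fps > vrr_max:
        return vrr_max, False   # capped at the panel's max, like Vsync
    if fps >= vrr_min:
        return fps, False       # refresh rate tracks the framerate exactly
    return vrr_min, True        # below the window: tearing/stutter

# At 110 fps on a 144 Hz panel, the refresh simply follows the framerate:
print(display_refresh(110))  # (110, False)
print(display_refresh(200))  # (144, False)
print(display_refresh(30))   # (48, True)
```

So for the OP's ~110 fps, both technologies sit comfortably inside the window and behave identically.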


29 minutes ago, AlfaProto said:

GSync (OG) has a buffer module and has no range. While this is more expensive solution, it guarantee that from 1 to 144 Hz, the frame is displaying correctly.

Thanks for the explanation. I will probably get a G-Sync monitor, since the prices are similar to FreeSync monitors.


On 10/16/2021 at 11:57 PM, BobVonBob said:

Freesync and "G-sync compatible" are both exactly the same thing, a fancy name for the adaptive sync spec already built into Displayport (and more recently HDMI, although not many monitors or GPUs support it). Full blown G-sync is better than those, since it has a significantly wider framerate window, but it's more expensive and much less common now that Nvidia has implemented standard adaptive sync.

Most FreeSync monitors use "low framerate compensation", multiplying the refresh rate when you drop below the VRR window. For example, 30 fps would be doubled to run at 60 Hz, bringing it back into the VRR window.
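That compensation logic can be sketched in a few lines (a simplified illustration; the 48-144 Hz window and the multiplier search are assumptions, not how any specific scaler actually works):

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    """Low framerate compensation, roughly: show each frame multiple
    times so the effective refresh rate lands back inside the VRR
    window. Simplified sketch with an assumed 48-144 Hz window."""
    multiplier = 1
    while fps * multiplier < vrr_min and fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(30))   # 60 -> 30 fps shown at 60 Hz, inside the window
print(lfc_refresh(110))  # 110 -> already inside the window, no doubling
```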

 

On 10/17/2021 at 12:02 AM, RONOTHAN## said:

G-Sync Compatible - The exact same thing as Freesync, just validated to work with Nvidia GPUs. Most Freesync monitors will work on Nvidia cards, but there are a few that have issues. Buying one of these basically guarantees there won't be any issues. There might be some licensing that the display manufacturer needs to pay for to get this sticker which would explain why they're a little more expensive, but don't quote me on that.

Even G-Sync compatible monitors are not guaranteed to work. Samsung's Odyssey lineup is known to have bad issues using G-Sync. But somehow they all got certified by Nvidia to be fully compatible, telling us how much this certification is really worth.

 

On 10/17/2021 at 2:33 AM, Mel0nMan said:

AMD makes FreeSync, Nvidia makes Gsync and it works better with their respective cards. So yeah, Gsync for you. 

None of these is specifically "better" with any GPU. They do the same thing.

 

21 hours ago, AlfaProto said:

The problem with FreeSync (or GSync Compatible), is the range of adaptive sync refresh rate, most FreeSync monitor has a based range of 40/48Hz, if your framerate dropped below that range, screen tearing occurs.

GSync (OG) has a buffer module and has no range. While this is more expensive solution, it guarantee that from 1 to 144 Hz, the frame is displaying correctly.

Even if the range only goes down to 40-48 Hz on many FreeSync monitors, most of them still have low framerate compensation. Also, do you really play and enjoy games at 40 fps? If I'm that low, I start turning down settings to get more fps anyway.

 

On 10/17/2021 at 12:02 AM, RONOTHAN## said:

G-Sync Ultimate - Everything above but also supports HDR and a couple other features. You only really find it on super expensive 4K 144Hz monitors.

Sadly, the G-Sync Ultimate certification got changed so more monitors qualify. Nowadays there are G-Sync Ultimate displays that have next to no HDR capabilities, making the certification meaningless. It WAS the term users could look for if they wanted a real HDR monitor.

 

I have used many monitors over the last few years, including all the different forms of VRR technology, and to the end user it simply doesn't make a difference whether your monitor has FreeSync, G-Sync compatibility, or native G-Sync. 99% of the time any of them will work with any GPU (AMD also supports VRR on native G-Sync monitors now, AFAIK). G-Sync had some advantages in its earlier days, like the others mentioned, such as adaptive overdrive or a wider VRR range, but nowadays even FreeSync monitors can have these features.

 

For modern high-end monitors, G-Sync can be more of a problem than a solution. For example, the module needs a fan to cool it, which is by far the most common point of failure, and most manufacturers will not provide help if you want to replace it yourself. (I'm currently going through that with my Asus PG35VQ, which works perfectly but has a whiny fan.) Another problem is that G-Sync is starting to limit I/O: there is no G-Sync module available from Nvidia that supports HDMI 2.1, a standard that is a few years old by now (LG started implementing HDMI 2.1 in its 2019 TV lineup). The result is 4K 144 Hz monitors that can only run at up to 60 Hz over HDMI, for consoles for example.

 

 

TLDR: Don't worry about the certification, or about whether the monitor has a hardware G-Sync module. The certification doesn't prevent incompatible monitors from entering the market (see Samsung), and 99% of the time it will work either way. Nowadays it's a rare exception when a FreeSync monitor doesn't work on an Nvidia GPU. G-Sync hardware is basically obsolete at this point.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


13 hours ago, Stahlmann said:

Even if the range only goes to 40-48Hz for many FreeSync monitors, you still have low-framerate compensation in most monitors. Also, do you really play and enjoy games at 40 fps? If i'm that low i start to turn down settings to get more fps either way.

I don't understand. So if my fps drops to 90 with FreeSync, will I experience tearing? Do I need to get G-Sync if I get a minimum of 90 fps?


10 hours ago, Poggies said:

I don't understand, so if my fps drop to 90 with FreeSync will I experience tearing? Do I need to get G sync as i get minimum 90 fps? 

No. At a minimum of 90 fps you're well inside the VRR window, so both FreeSync and G-Sync will be equally capable of preventing tearing. That's the short of it.


