
AMD FreeSync monitor on Nvidia GPU

Hello.

 

I'm getting a new monitor to add to my setup.

 

I have chosen a monitor which is 24.5 inches and has 144 Hz. It also has AMD FreeSync technology.

 

I was wondering whether having an Nvidia GeForce GTX 1060 6GB with a FreeSync monitor will stop me from getting 100 Hz+ or cause screen tearing.

 

Thanks.


1 minute ago, huilun02 said:

The monitor will function as you expect, as if it didn't support Freesync at all

So having a FreeSync monitor is just a bonus for AMD GPUs, but it behaves like a normal monitor on Nvidia GPUs?


Just now, huilun02 said:

Yup

Okay thanks for your help :)


Also, for the record, you won't notice any screen tearing unless you get PAST 144 fps. If you start running into that issue you can set a frame-rate limit on your card, or use something like Adaptive Vsync, which is similar to FreeSync. G-Sync though... so nice.


1 minute ago, huilun02 said:

Nope

If you don't use FreeSync, Vsync, or G-Sync, you still get screen tearing. It just gets less noticeable the higher the framerate.

Adaptive Vsync is NOT similar to FreeSync. Adaptive Vsync is basically Vsync that only kicks in if you can maintain 60 fps.

FreeSync and G-Sync are similar in terms of end result.

Screen tearing most commonly happens when your FPS goes higher than your monitor's refresh rate, which means the two get out of sync...

 

Let's make this simple (all that matters here is the theory, not the numbers I pick). Let's say your monitor can draw one frame every 1 second, but your GPU can draw one frame every 0.7 seconds. That means that every few seconds the GPU will render and send your monitor a frame while it is still drawing one; this causes it to stop drawing the current frame and draw the new one instead, which makes the displayed frame appear torn.
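That timing collision can be sketched in a few lines of Python. The numbers below are the post's illustrative timings, not real hardware values:

```python
# Illustrative sketch of the example above, using the post's made-up
# timings (NOT real hardware values): the monitor starts a refresh
# every 1.0 s, while the GPU finishes a frame every 0.7 s. Any frame
# that completes mid-refresh interrupts the scanout -> a torn image.
REFRESH_MS = 1000   # hypothetical: one monitor refresh per second
FRAME_MS = 700      # hypothetical: one GPU frame every 0.7 s

torn = []
for n in range(1, 11):              # first 10 GPU frames
    t = n * FRAME_MS                # time (ms) this frame completes
    if t % REFRESH_MS != 0:         # completion lands mid-refresh
        torn.append(t / 1000)       # record it in seconds

print(torn)   # 9 of the first 10 frames land mid-refresh and tear
```

With these numbers, only the frame finishing at exactly 7.0 s lines up with a refresh boundary; every other frame arrives while the monitor is mid-draw.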

 

Now on today's monitors with refresh rates over 200 Hz this isn't as big of an issue, but let's say you are playing on an older 60 Hz monitor with a 1080 Ti... this would occur quite often. That is why Vsync was invented and why it always caps your frames at your monitor's refresh rate: it makes it so you can't send more frames to the monitor than it can draw. The downside is that it introduces input lag, as your GPU is constantly waiting on your monitor.

 

Generally the input lag isn't terrible and most of the time isn't noticeable, but it is there. That is why they introduced Adaptive Vsync, which gives you the best of both worlds: it limits the input lag while also trying to make sure you don't go over your refresh rate. G-Sync is a module that communicates with your GPU, which means the monitor and GPU are always running at the same rate. FreeSync is very similar to G-Sync.

 

Now what G-Sync does eliminate is the noticeable drop you feel when your fps goes from, say, 120 to 60: it makes this transition feel smooth and much less jarring. That being said, the screen tearing being discussed here comes from your FPS going higher than your monitor can refresh the image.


10 minutes ago, huilun02 said:

Screen tearing occurs because the screen refreshes at a constant interval regardless of what the GPU sends. FreeSync and G-Sync change this and make the screen follow the GPU instead. The GPU does not produce new frames at a constant interval, so without G-Sync or FreeSync the display will put out whatever incomplete frame the GPU has drawn when it is time to refresh the image. This is what causes tearing. And you WILL get tearing as long as the display doesn't wait for the GPU to complete frames.

 

Don't make me pull a moderator into the discussion... I have no quarrel with you or your beliefs, but keep them to yourself and don't spread misinformation, because it will only lead others to disappointment when they find out the hard way that you're wrong.

It might technically still be "occurring", but you won't notice it. The screen tearing everyone complains about is 99.9% of the time from the frame rate going over the refresh rate, where you get a string of frames that all have tearing. On, say, a 144 Hz monitor, though, the next refresh comes faster than the tearing can become noticeable. Also, when you are BELOW the refresh rate it occurs much less often, whereas above the refresh rate it happens on consecutive frames.

 

Go test it now if you don't believe me. If you have a weak card, turn your resolution down to 720p and run around with fast movements above the refresh rate; then increase it back to 1080p or supersample it and see if you notice it BELOW the refresh rate. The only time you will really wish you had G-Sync or FreeSync is when you have a game dipping from 80 fps into the 30s, where you will feel that nasty chug effect... that doesn't happen with G-Sync.


2 minutes ago, huilun02 said:

Good, now we are on the same page, but your first post

is flat out wrong. Please don't post this kind of misleading stuff again.

You can keep telling me how wrong I am all you want, but telling me what I should and should not post is, quite honestly, none of your damn business.

 

Also, if you take the time to do some research, you will see I am pretty spot on. I even took a moment to re-educate myself after your first response, and I did discover that technically screen tearing can occur either below or above the refresh rate... but that it is only really noticeable to the human eye when it is above the refresh rate.

 

This is due to the frequency of the occurrence, and also the amount of time the torn frame is actually visible.

 

Below the refresh rate, it occurs maybe every 10-15 frames, and the torn frame is visible for about 0.0069 seconds.

 

Versus 4-6 frames in a ROW, for about 0.027 seconds. The problem when you go above the refresh rate is that it can occur on a LOT of consecutive frames; 4-6 was just a nice average number. So when you have about 3% of a second to see it versus 0.7% of a second, it is a pretty big difference.
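Those figures line up with a quick back-of-envelope check: on a 144 Hz panel one refresh lasts 1/144 of a second. A short Python sketch (the 144 Hz panel and the run length of 4 torn refreshes are assumptions taken from the numbers in the post):

```python
# Back-of-envelope check of the numbers above, assuming a 144 Hz panel
# (the run of 4 consecutive torn refreshes is the post's rough average).
refresh_s = 1 / 144        # one refresh lasts ~0.0069 s

below = 1 * refresh_s      # one isolated torn refresh (fps below cap)
above = 4 * refresh_s      # ~4 consecutive torn refreshes (fps above cap)

print(f"single torn refresh: {below:.4f} s")       # ~0.69% of a second
print(f"run of 4 torn refreshes: {above:.4f} s")   # ~2.8% of a second
```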

