-Sync counterintuitive?

Ergroilnin

Guys,

 

this question came to mind after I asked on these forums about possibly getting AMD instead of GeForce, since my monitor is a 144 Hz FreeSync one, and after some more digging through the internet.

 

I have seen quite a few people say that the higher the refresh rate of the monitor, the less noticeable screen tearing is, even when the FPS does not match or exceed what the monitor can display. This seems counterintuitive to me, but maybe I am just not looking at this from the right point of view, since I have no firsthand experience with FreeSync/G-Sync technology.

 

So is it actually true that, say, 45 FPS on a 60 Hz monitor tears more noticeably than 90 or even 120 FPS on a 144 Hz one? Or is it not really a matter of FPS per se, but of unstable, highly variable FPS (so not a constant 40-45 but swings between 90 and 120) that makes the tearing more pronounced?


17 minutes ago, Ergroilnin said:

Guys,

 

this question came to mind after I asked on these forums about possibly getting AMD instead of GeForce, since my monitor is a 144 Hz FreeSync one, and after some more digging through the internet.

 

I have seen quite a few people say that the higher the refresh rate of the monitor, the less noticeable screen tearing is, even when the FPS does not match or exceed what the monitor can display. This seems counterintuitive to me, but maybe I am just not looking at this from the right point of view, since I have no firsthand experience with FreeSync/G-Sync technology.

 

So is it actually true that, say, 45 FPS on a 60 Hz monitor tears more noticeably than 90 or even 120 FPS on a 144 Hz one? Or is it not really a matter of FPS per se, but of unstable, highly variable FPS (so not a constant 40-45 but swings between 90 and 120) that makes the tearing more pronounced?

I have never used FreeSync/G-Sync. Just use an FPS lock. It does almost the same thing.


I haven't tested my 165 Hz monitor without G-Sync, but with G-Sync, if you configure it well, there is almost zero tearing with 0.5 ms of input lag at most.

You need to cap your FPS a couple of frames (around 2 to 5% of your total) below your max refresh rate; that keeps FreeSync working at all times (you could get tearing if you reach 144 FPS and FreeSync disables itself).
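To put rough numbers on that, here is a quick sketch; the 2-5% margin is just the rule of thumb above, not an official FreeSync/G-Sync figure:

```python
# Sketch: pick an FPS cap a few percent below the panel's max refresh
# so adaptive sync never disengages. The margin is the rule of thumb
# from this post, not an official FreeSync/G-Sync number.

def suggested_fps_cap(max_refresh_hz: int, margin: float = 0.03) -> int:
    """Frame cap slightly below the monitor's maximum refresh rate."""
    return int(max_refresh_hz * (1.0 - margin))

print(suggested_fps_cap(144))  # 139 -> safe cap for a 144 Hz panel
print(suggested_fps_cap(165))  # 160 -> safe cap for a 165 Hz panel
```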

Tearing appears when the input (GPU) delivers frames at a different rate than the output (screen) displays them, and the image tears as the two drift out of sync. A fixed FPS cap could solve that if your game ran 100% perfectly with zero drops, but in reality things like particle-heavy explosions or playing online with a lot of people will reduce your FPS, and if you don't have a sync technology you will see tearing during those dips.

For me, FreeSync is a must if you play newer games, or if you don't care about maxing out FPS in non-competitive games. For competitive games, where you tweak everything for the best experience, you can skip it (but I have G-Sync and would never go back; it's so smooth, as if you had swapped your GPU for a newer one).


17 minutes ago, H20Burner said:

I have never used FreeSync/G-Sync. Just use an FPS lock. It does almost the same thing.

FreeSync/G-Sync is useful when your FPS is unstable and you're not hitting the monitor's refresh rate.

 

Locking the FPS of a game (or "limiting" it, since most games don't actually support a hard lock, only an upper limit, if they support that at all) is useful when you can hit the max refresh rate of the monitor. The only way an FPS lock would be remotely close to the same thing (or "useful") is if you set it at the lowest minimum FPS you ever reach in-game. That severely limits your max FPS and overall performance, since the game may average very high FPS but see occasional drops that can often be significant.
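To make that concrete, here's a minimal sketch of what an FPS limiter actually does (a hypothetical render loop, not any particular game's code). Note that it can only slow frames down, never speed them up, which is exactly why it doesn't help with drops:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame

def render_frame() -> None:
    pass  # stand-in for the game's actual rendering work

for _ in range(600):  # ~10 seconds of frames at the cap
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the frame budget. If rendering
    # already took longer than the budget (an FPS drop), there is
    # nothing left to sleep and the limiter cannot help.
    remaining = FRAME_BUDGET - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)
```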

 

If you're not hitting that max, having Adaptive Sync is a massive improvement. You get buttery smooth frame delivery the entire time, even when the FPS drops (so long as your minimum FPS stays within the Adaptive Sync range your monitor supports, which varies per monitor).

 

It's actually an incredible invention, but like an SSD, it doesn't sound that impressive on paper. And like an SSD, you have to use it to truly understand and appreciate it.


With these two answers, another question comes to mind that I don't actually know the answer to: how does the monitor/GPU/whatever handle an FPS that is not way under, but way over the monitor's refresh rate? I know a lot of people with high-end gaming PCs hit, say, 300 or 500 FPS in a game like CS:GO, and even on a 240 Hz monitor that is still well above what the monitor can output at the low end, and about twice as much at the high end. I also guess that at these ridiculous framerates it is not uncommon to see drops of 50 FPS or more at times.

 

Does the monitor just skip, say, every second frame, so that even fairly large drops (large in absolute value, not percentage-wise; a 50 FPS drop might still be only 10%) do not cause any tearing at all? Or do frame drops still cause tearing even in these cases? My guess is that they do not, since on every single refresh the monitor receives at least one complete new frame, but I am not sure.


3 minutes ago, Ergroilnin said:

With these two answers, another question comes to mind that I don't actually know the answer to: how does the monitor/GPU/whatever handle an FPS that is not way under, but way over the monitor's refresh rate? I know a lot of people with high-end gaming PCs hit, say, 300 or 500 FPS in a game like CS:GO, and even on a 240 Hz monitor that is still well above what the monitor can output at the low end, and about twice as much at the high end. I also guess that at these ridiculous framerates it is not uncommon to see drops of 50 FPS or more at times.

Basically, the monitor can only output frames as fast as its refresh rate. Hence "refresh rate": the rate at which it can refresh. Any frames sent faster than the refresh rate are discarded.

 

You can still get tearing at high framerates, but it is caused by different things. Mostly it happens when the game has rendered 3, 5, 10, or 50 frames before the monitor is able to refresh once (think 300+ FPS on a 60 Hz display). The GPU sends a frame to the monitor, which displays it; by the time the monitor can display the next frame, the GPU has already rendered, say, 28 more, and it sends the 29th frame to the monitor instead of the 2nd. This can cause a form of stuttering.
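A toy model of that mismatch (made-up numbers, purely illustrative): at a steady 300 FPS on a 60 Hz display, only every fifth rendered frame is ever shown.

```python
# Toy model: GPU renders at a steady 300 FPS, monitor refreshes at 60 Hz.
# On each refresh the monitor shows the newest completed frame; every
# frame rendered in between is simply discarded.
GPU_FPS = 300
REFRESH_HZ = 60

for refresh in range(1, 6):
    # Newest frame the GPU has finished by the time of this refresh
    # (integer math keeps the arithmetic exact).
    newest = refresh * GPU_FPS // REFRESH_HZ
    print(f"refresh {refresh}: displays frame {newest}")
# -> frames 5, 10, 15, 20, 25 are displayed; frames 1-4, 6-9, ... never are
```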
 

Sometimes it can cause unexpected things and visual problems, but mostly everything is happening too fast, in too short a time frame for you to notice.

3 minutes ago, Ergroilnin said:

Does the monitor just skip, say, every second frame, so that even fairly large drops (large in absolute value, not percentage-wise; a 50 FPS drop might still be only 10%) do not cause any tearing at all? Or do frame drops still cause tearing even in these cases? My guess is that they do not, since on every single refresh the monitor receives at least one complete new frame, but I am not sure.

To be honest, I'm not too sure what you're trying to say here.

 

The "ideal" outcome is that the GPU outputs exactly the same frames as your monitor's refresh rate. No more, no less. This eliminates tearing, input lag, latency, and stuttering.

 

The best way to do this is to use a GPU that can max out the refresh rate 100% of the time, and then frame-lock the game to that refresh rate. If you see occasional dips, that's where Adaptive Sync (FreeSync, G-Sync) comes in.

 

But since it's hard to drive modern AAA games at max settings above 60 FPS (let alone 120/144 FPS) with no drops and without sacrificing quality (turning settings down to medium, etc.), the "ideal" situation is almost never realized, and Adaptive Sync becomes even more valuable.


2 hours ago, Ergroilnin said:

Guys,

 

this question came to mind after I asked on these forums about possibly getting AMD instead of GeForce, since my monitor is a 144 Hz FreeSync one, and after some more digging through the internet.

 

I have seen quite a few people say that the higher the refresh rate of the monitor, the less noticeable screen tearing is, even when the FPS does not match or exceed what the monitor can display. This seems counterintuitive to me, but maybe I am just not looking at this from the right point of view, since I have no firsthand experience with FreeSync/G-Sync technology.

 

So is it actually true that, say, 45 FPS on a 60 Hz monitor tears more noticeably than 90 or even 120 FPS on a 144 Hz one? Or is it not really a matter of FPS per se, but of unstable, highly variable FPS (so not a constant 40-45 but swings between 90 and 120) that makes the tearing more pronounced?

Well, it's pretty simple, actually: at higher refresh rates the tear is on the screen for a shorter amount of time and is therefore less noticeable.
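Putting rough numbers on it (a tear persists for at most one refresh period before the next scanout overwrites it):

```python
# A tear can stay on screen for at most one refresh period, so the
# higher the refresh rate, the shorter-lived (and less visible) it is.
for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: tear visible for up to {1000 / hz:.1f} ms")
#  60 Hz: up to 16.7 ms
# 144 Hz: up to  6.9 ms
# 240 Hz: up to  4.2 ms
```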


3 minutes ago, Glenwing said:

Well, it's pretty simple, actually: at higher refresh rates the tear is on the screen for a shorter amount of time and is therefore less noticeable.

Oh god... That absolutely makes sense. D'oh! Thanks for the eye opener.


1 hour ago, Ergroilnin said:

With these two answers, another question comes to mind that I don't actually know the answer to: how does the monitor/GPU/whatever handle an FPS that is not way under, but way over the monitor's refresh rate? I know a lot of people with high-end gaming PCs hit, say, 300 or 500 FPS in a game like CS:GO, and even on a 240 Hz monitor that is still well above what the monitor can output at the low end, and about twice as much at the high end. I also guess that at these ridiculous framerates it is not uncommon to see drops of 50 FPS or more at times.

 

Does the monitor just skip, say, every second frame, so that even fairly large drops (large in absolute value, not percentage-wise; a 50 FPS drop might still be only 10%) do not cause any tearing at all? Or do frame drops still cause tearing even in these cases? My guess is that they do not, since on every single refresh the monitor receives at least one complete new frame, but I am not sure.

The first thing that should be understood is that the monitor is sent video data at the exact rate that it needs to be drawn to the screen. If a screen is refreshing at 144 Hz, that's one frame every 6.944 ms. The GPU will send data to the monitor at a constant rate of exactly one frame per refresh period. The GPU may be generating new frames at a faster or slower rate, but this doesn't affect the speed at which frames are sent to the display.

 

Whenever the GPU finishes generating a new frame, it saves that frame in memory (buffers it) and starts working on a new one.

 

At any given time, a previously completed frame is being "scanned out", i.e. transmitted to the display bit by bit. As mentioned, the timing works out so that when one frame finishes sending, it's time to start sending the next, so it is essentially a continuous process, apart from a pause of a few microseconds between frames.

 

The buffer holding the frame currently being scanned out to the display is called the "front buffer". When the GPU finishes rendering a new frame, it can immediately switch the front buffer to that frame; if this happens mid-scanout, the lower part of the image sent to the monitor comes from a different frame than the upper part, which is a tear.

Alternatively, the GPU can wait for the current scan to finish and switch the front buffer to the new frame during the short pause between scans. This is called V-Sync. Generally the GPU will stop rendering new frames once it has one lined up, resulting in a framerate cap, but there are variations (such as NVIDIA's Fast Sync) in which the GPU continues to generate as many new frames as possible. If your framerate is huge, you may render five new frames since the previous scanout began; only the newest one is sent, and the excess ones are discarded.
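A toy sketch of the two flip policies described above (simplified Python with made-up numbers, not how a real driver is written):

```python
import random

SCANLINES = 1080  # lines scanned out top to bottom per refresh

def scanout(vsync: bool, refreshes: int = 5) -> None:
    """Toy model: flip the front buffer immediately (tearing possible)
    or only in the pause between scans (V-Sync)."""
    front = 0   # frame currently being scanned out to the display
    newest = 0  # newest completed frame sitting in a back buffer
    for r in range(refreshes):
        for line in range(SCANLINES):
            if random.random() < 0.002:  # GPU finishes a frame mid-scan
                newest += 1
                if not vsync:
                    front = newest  # immediate flip mid-scanout...
                    print(f"refresh {r}: tear at scanline {line}")
        if vsync:
            front = newest  # flip only during the pause between scans

scanout(vsync=False)  # tears whenever a flip lands mid-scan
scanout(vsync=True)   # no tears; at most one new frame per refresh
```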

 


Hi

 

Back in 2016 I bought a 3440x1440 34" Samsung monitor. At the time I used an i7-2600K with a GTX 1080. Screen tearing was a big issue with that combo. I could tweak out most of the tearing, but not all. With games that had an ENB mod I had a better chance of reducing it.

At the beginning of this year I replaced the i7-2600K with an i7-8700K, and the screen tearing was gone. I don't know if it was the faster RAM or the faster CPU; I am not knowledgeable enough to draw a conclusion, but it may be safe to say that in some cases it is not the GPU/monitor causing the issue.

 

