
Is adaptive sync reeeally that good?

So, TL;DR version: I have a great deal on an RTX 2070. It's $620, and since I'm in Europe that is a banging deal. I'm planning to buy a 144Hz 1440p monitor, and it does have FreeSync but no G-Sync. My question is: should I go with the RTX 2070 and get better performance, or go with the Vega 56 and lose performance but get adaptive sync? Keep in mind Vega is quite old now, and just like the 980 Ti and the 1080, as it gets older it will get slower. The specs are:

16GB of 3000MHz RAM

R5 2600 OC'd to 3.8/3.9GHz

 

So I think it can run fairly optimised games like BF1 and BF5 at over 100FPS even at 1440p. What should I choose? Help me out :D


Adaptive sync is really amazing. I bought a new monitor that supports G-Sync for BF5, and I gotta say, it's amazing. 100FPS with a high refresh rate and G-Sync is a game changer. I tried turning G-Sync off and it felt as if I'd lost half of my FPS.

Main system: i7 8700k 5Ghz / Asus Prime Z370-A / Corsair Vengeance 2x8GB 3000Mhz / Asus TUF RTX3080 / EVGA 750W GQ / Fractal Design Meshify C


11 minutes ago, GoldenLag said:

https://de.pcpartpicker.com/product/h766Mp/gigabyte-radeon-rx-vega-64-8gb-video-card-gv-rxvega64gaming-oc-8gd

 

You can get a Vega 64 for less, though. At least in Germany.

 

Also, compute cards like Vega generally age quite well.

Ah, that's a good deal, but I'm not in Germany, I'm in FYROM, and it's still gonna cost me about the same. The card you showed me was £480 with £10 shipping; that's £490. Add the tax and whatever tariffs we have and you're looking at at least £510, maybe even more. That's why I'm going for local stuff: with the tariffs and shipping, the cost goes up by around £100, and with my local shops I get at least 3 years of warranty on every item except the case.


19 minutes ago, PopsicleHustler said:

Adaptive sync is really amazing. I bought a new monitor that supports G-Sync for BF5, and I gotta say, it's amazing. 100FPS with a high refresh rate and G-Sync is a game changer. I tried turning G-Sync off and it felt as if I'd lost half of my FPS.

You say, eh? Is it enough to tank performance by that much?


Just now, YourNewPalAlex said:

Also, what makes you think Vegas age well? @GoldenLag

Previous history of GCN cards, which have kept pace with or performed better than their competing card over time. I believe the R9 290X, which competed with the GTX 780 Ti and was identical in performance at launch, gained a total of about 15% over the GTX 780 Ti over a three-year period (using data from multi-card benchmarks).

 

GCN is the GPU architecture used in Vega, and it is mostly oriented around compute (or at least Vega and its high-powered predecessors are).

 

2 minutes ago, YourNewPalAlex said:

You say, eh? Is it enough to tank performance by that much?

No, he is saying that when he had G-Sync/FreeSync on it was buttery smooth, but when he turned it off it seemed like the performance was halved. The card was outputting the same number of frames, but with sync on it was a lot smoother.
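To put rough numbers on that smoothness point, here's a toy Python sketch (my own illustration, not data from anyone in the thread): two runs with the exact same average FPS can have very different frame pacing, and it's the pacing, not the FPS counter, that sync smooths over.

```python
# Toy model: same average FPS, very different frame pacing.
from statistics import mean, stdev

steady = [10.0] * 100       # 10 ms per frame -> constant 100 FPS
uneven = [5.0, 15.0] * 50   # alternating 5/15 ms frames -> still 100 FPS average

for name, frames in (("steady", steady), ("uneven", uneven)):
    avg_fps = 1000.0 / mean(frames)
    jitter = stdev(frames)  # frame-time variation, in ms
    print(f"{name}: {avg_fps:.0f} FPS average, {jitter:.1f} ms jitter")
```

Both sequences report 100 FPS, but the second one stutters; that mismatch is why "same frame count, much smoother with sync on" is plausible.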

 

People have different experiences with adaptive sync. Personally I have seen a marginal improvement; others see a major one and can't go back to not having it.

 

6 minutes ago, YourNewPalAlex said:

Ah, that's a good deal, but I'm not in Germany, I'm in FYROM, and it's still gonna cost me about the same. The card you showed me was £480 with £10 shipping; that's £490. Add the tax and whatever tariffs we have and you're looking at at least £510, maybe even more. That's why I'm going for local stuff: with the tariffs and shipping, the cost goes up by around £100, and with my local shops I get at least 3 years of warranty on every item except the case.

Have you used price comparison websites to try getting the best pricing? Also, the used market should be flooded with GTX 1070s or Vega cards.


10 minutes ago, YourNewPalAlex said:

You say, eh? Is it enough to tank performance by that much?

Not at all. Let me explain what adaptive sync, whether G-Sync or FreeSync, really does.

If you crank the graphics settings up to Ultra on a card that can't stay above, say, 60FPS, then in intensive scenes with a lot of explosions happening you'll get FPS drops.

These FPS drops cause that micro-freeze feeling that breaks immersion. Adaptive sync varies the monitor's refresh rate to match the frames as they arrive, in order to keep the experience "smooth".

Sounds great, sure, or you could just adjust your in-game settings so that you stay at 60FPS (or above) at all times with steady gameplay... rendering G-Sync or FreeSync useless.

I actually don't use G-Sync because my system can V-Sync triple-buffered at the native 100Hz refresh rate and never drops from it, so G-Sync does nothing in my case.

So basically adaptive sync is a gimmick that pushes people even harder toward Ultra-everything when you could get away with High, for instance. Whether it's useful depends more on the user than on the feature itself.
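The fixed-refresh vs. adaptive-refresh difference can be sketched in a few lines of Python (a toy model of my own, with made-up frame completion times, not a measurement): with V-Sync on a fixed 60Hz panel, a finished frame waits for the next scanout tick, while an adaptive-sync panel can refresh as soon as the frame is ready.

```python
# Toy model: how long a finished frame waits for the next fixed 60 Hz scanout.
import math

REFRESH_MS = 1000 / 60  # one scanout interval at 60 Hz, ~16.7 ms

def vsync_wait(render_done_ms):
    """Extra ms a frame waits for the next fixed scanout tick."""
    next_tick = math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS
    return next_tick - render_done_ms

# hypothetical, irregular frame completion times (ms) in a heavy scene
finish_times = [12.0, 30.0, 55.0, 70.0, 95.0]
waits = [vsync_wait(t) for t in finish_times]
print(f"avg extra wait at fixed 60 Hz: {sum(waits) / len(waits):.1f} ms")
# with adaptive sync the wait is ~0 ms: the panel refreshes on completion
```

The uneven waits (a few ms on some frames, over 13ms on others) are exactly the judder that adaptive sync removes.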

Personal Desktop:

CPU: Intel Core i7 8700 @4.45ghz |~| Cooling: Cooler Master Hyper 212X |~| MOBO: Gigabyte Z370M D3H mATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: nVidia Founders Edition GTX 1080 Ti |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: Intel Core i7 10700KF @ 5.0Ghz (5.1Ghz 4-core) |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Z490 UD |~| RAM: 32G Kingston HyperX @ 2666Mhz CL13 |~| GPU: AMD Radeon RX 6800 (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

14 minutes ago, GoldenLag said:

Previous history of GCN cards, which have kept pace with or performed better than their competing card over time. I believe the R9 290X, which competed with the GTX 780 Ti and was identical in performance at launch, gained a total of about 15% over the GTX 780 Ti over a three-year period (using data from multi-card benchmarks).

GCN is the GPU architecture used in Vega, and it is mostly oriented around compute (or at least Vega and its high-powered predecessors are).

No, he is saying that when he had G-Sync/FreeSync on it was buttery smooth, but when he turned it off it seemed like the performance was halved. The card was outputting the same number of frames, but with sync on it was a lot smoother.

People have different experiences with adaptive sync. Personally I have seen a marginal improvement; others see a major one and can't go back to not having it.

Have you used price comparison websites to try getting the best pricing? Also, the used market should be flooded with GTX 1070s or Vega cards.

When I asked whether it's enough to tank performance, I meant going from a 2070 to a Vega 56, not turning on G-Sync. I know that all it does is match the refresh rate of the monitor to the FPS achieved in game.

 

 

And the used market scene here is pretty sh*t; even after the mining stuff and the release of the RTX series, there have only been like 1 or 2 Vega deals.


14 minutes ago, Princess Cadence said:

Not at all. Let me explain what adaptive sync, whether G-Sync or FreeSync, really does.

If you crank the graphics settings up to Ultra on a card that can't stay above, say, 60FPS, then in intensive scenes with a lot of explosions happening you'll get FPS drops.

These FPS drops cause that micro-freeze feeling that breaks immersion. Adaptive sync varies the monitor's refresh rate to match the frames as they arrive, in order to keep the experience "smooth".

Sounds great, sure, or you could just adjust your in-game settings so that you stay at 60FPS (or above) at all times with steady gameplay... rendering G-Sync or FreeSync useless.

I actually don't use G-Sync because my system can V-Sync triple-buffered at the native 100Hz refresh rate and never drops from it, so G-Sync does nothing in my case.

So basically adaptive sync is a gimmick that pushes people even harder toward Ultra-everything when you could get away with High, for instance. Whether it's useful depends more on the user than on the feature itself.

I know what G-Sync, FreeSync and adaptive sync are. I just asked whether the difference in smoothness is worth tanking performance by going from a 2070 to a Vega 56.


17 minutes ago, Princess Cadence said:

Not at all. Let me explain what adaptive sync, whether G-Sync or FreeSync, really does.

If you crank the graphics settings up to Ultra on a card that can't stay above, say, 60FPS, then in intensive scenes with a lot of explosions happening you'll get FPS drops.

These FPS drops cause that micro-freeze feeling that breaks immersion. Adaptive sync varies the monitor's refresh rate to match the frames as they arrive, in order to keep the experience "smooth".

Sounds great, sure, or you could just adjust your in-game settings so that you stay at 60FPS (or above) at all times with steady gameplay... rendering G-Sync or FreeSync useless.

I actually don't use G-Sync because my system can V-Sync triple-buffered at the native 100Hz refresh rate and never drops from it, so G-Sync does nothing in my case.

So basically adaptive sync is a gimmick that pushes people even harder toward Ultra-everything when you could get away with High, for instance. Whether it's useful depends more on the user than on the feature itself.

You forgot to mention that adaptive sync eliminates screen tearing without introducing mad input lag compared to V-Sync. G-Sync also works when your GPU can't reach the monitor's refresh rate; with a 144Hz panel, for example, V-Sync won't have any effect below 144FPS.


16 minutes ago, Princess Cadence said:

Not at all. Let me explain what adaptive sync, whether G-Sync or FreeSync, really does.

If you crank the graphics settings up to Ultra on a card that can't stay above, say, 60FPS, then in intensive scenes with a lot of explosions happening you'll get FPS drops.

These FPS drops cause that micro-freeze feeling that breaks immersion. Adaptive sync varies the monitor's refresh rate to match the frames as they arrive, in order to keep the experience "smooth".

Sounds great, sure, or you could just adjust your in-game settings so that you stay at 60FPS (or above) at all times with steady gameplay... rendering G-Sync or FreeSync useless.

I actually don't use G-Sync because my system can V-Sync triple-buffered at the native 100Hz refresh rate and never drops from it, so G-Sync does nothing in my case.

So basically adaptive sync is a gimmick that pushes people even harder toward Ultra-everything when you could get away with High, for instance. Whether it's useful depends more on the user than on the feature itself.

Okay, so do you think I will always keep a steady 60 in games with an R5 2600 OC'd to 3.8, 16GB of 3000MHz dual-channel RAM and a 2070 at 1440p? Or is it worth it to bite the bullet, go with a Vega and have FreeSync working, but lose performance compared to the 2070?


4 minutes ago, PopsicleHustler said:

You forgot to mention that adaptive sync eliminates screen tearing without introducing mad input lag compared to Vsync. Gsync also works when your GPU can't get to monitors refresh rate. So for example when you have 144hz panel, vsync won't have any effect below 144fps.

This is untrue. V-Sync "input lag" is negligible, especially if you V-Sync above 60Hz. With a 144Hz panel you can set whatever refresh rate you want; something like 80~100 already makes things feel very smooth, and it is perfectly possible to lock V-Sync steady at 80~100FPS, making adaptive sync, as explained, unnecessary.

Also, with a Ryzen 5 2600, OP will only be getting 144FPS in e-sports and older titles anyway.

I don't think the RTX 2070 is anything exciting; it's barely any faster than a much cheaper 1070 Ti, and I don't think a V56 is enough for 1440p 144Hz if you want AAA gaming.

So to me personally it doesn't matter which way he goes here.
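The "negligible above 60Hz" claim about V-Sync latency is easy to sanity-check with simple refresh-interval arithmetic (my own back-of-the-envelope numbers, not a benchmark): the worst-case wait V-Sync adds is roughly one refresh interval, which shrinks quickly as the refresh rate rises.

```python
# Worst-case extra wait V-Sync can add is about one refresh interval.
for hz in (60, 100, 144):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz: refresh interval = {interval_ms:.1f} ms worst-case added wait")
```

At 144Hz that ceiling is under 7ms, versus nearly 17ms at 60Hz, which is why high-refresh V-Sync feels so much less laggy.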


5 minutes ago, YourNewPalAlex said:

What I'm basically asking is: 10-15% faster performance, or adaptive sync? @GoldenLag @Princess Cadence @PopsicleHustler

My humble opinion, based on experience: your gaming experience will be much the same either way. I'd go with the nVidia personally, but try a 1080 Ti ~ 2080 instead, as it feels much more justified for 1440p 144Hz; you'd have enough horsepower to render adaptive sync meaningless, since you'll have barely any FPS drops.

 

So, TL;DR: it doesn't matter terribly much.


5 minutes ago, Princess Cadence said:

This is untrue. V-Sync "input lag" is negligible, especially if you V-Sync above 60Hz. With a 144Hz panel you can set whatever refresh rate you want; something like 80~100 already makes things feel very smooth, and it is perfectly possible to lock V-Sync steady at 80~100FPS, making adaptive sync, as explained, unnecessary.

Also, with a Ryzen 5 2600, OP will only be getting 144FPS in e-sports and older titles anyway.

I don't think the RTX 2070 is anything exciting; it's barely any faster than a much cheaper 1070 Ti, and I don't think a V56 is enough for 1440p 144Hz if you want AAA gaming.

So to me personally it doesn't matter which way he goes here.

At 144Hz it's very noticeable. There is a great video made by a YouTuber, Battle(non)sense; he showed that V-Sync doubles input lag in e-sports games.


Just now, PopsicleHustler said:

At 60Hz it's very noticeable. There is a great video made by a YouTuber, Battle(non)sense; he showed that V-Sync doubles input lag in e-sports games.

V-Sync is for non-competitive play. The "input lag" hardly even matters in competitive, but since it does exist, sure, leaving it off even on a 60Hz screen despite the tearing is what you'd want there; for single-player games, though, V-Sync will eliminate all tearing without adding enough input lag to really be noticeable.

 

That is why I'm trying so hard to get past the rigid, plastered "YouTube truths" and show OP use cases that actually exist.


4 hours ago, GoldenLag said:

It's hard to recommend either option, tbh.

 

If the Vega is about $40-50 cheaper, go for that one. If it isn't, go for the RTX 2070.

The Vega ain't cheaper, it's more expensive XD, about 600 euros.


Just now, YourNewPalAlex said:

The vega aint cheaper, its more expensive XD, about 600euros

then go with the 2070......... Pay attention to the wording......


4 hours ago, Princess Cadence said:

This is untrue. V-Sync "input lag" is negligible, especially if you V-Sync above 60Hz. With a 144Hz panel you can set whatever refresh rate you want; something like 80~100 already makes things feel very smooth, and it is perfectly possible to lock V-Sync steady at 80~100FPS, making adaptive sync, as explained, unnecessary.

Also, with a Ryzen 5 2600, OP will only be getting 144FPS in e-sports and older titles anyway.

I don't think the RTX 2070 is anything exciting; it's barely any faster than a much cheaper 1070 Ti, and I don't think a V56 is enough for 1440p 144Hz if you want AAA gaming.

So to me personally it doesn't matter which way he goes here.

I'm not really a triple-A guy either; I'm more into fast-paced racing titles, and a bit of Battlefield and CoD, which can all run at more than 100FPS at 1440p no matter the card. But nothing like Shadow of the Tomb Raider and such.


Just now, GoldenLag said:

then go with the 2070......... Pay attention to the wording......

Just sayin how expensive the vega is :P At least in my local shops.


4 hours ago, Princess Cadence said:

V-Sync is for non-competitive play. The "input lag" hardly even matters in competitive, but since it does exist, sure, leaving it off even on a 60Hz screen despite the tearing is what you'd want there; for single-player games, though, V-Sync will eliminate all tearing without adding enough input lag to really be noticeable.

That is why I'm trying so hard to get past the rigid, plastered "YouTube truths" and show OP use cases that actually exist.

I actually use V-Sync a heck of a lot on my crappy laptop, since it can run old games like CoD4 at over 100FPS but with a shit-ton of tearing, and I have to use V-Sync to smooth it out. That's why, in my opinion, I'd get the Vega and just chill with adaptive sync, but that's just my experience, ya know?


30 minutes ago, geo3 said:

lol no. that's not true at all.

idk about you, but the 980 Ti isn't as fast as the 1080 in recent games. Maybe the same won't happen with the Vegas; just going by logic, ya know.

