
Experiences with NVIDIA G-SYNC + VESA Adaptive-Sync monitors

D13H4RD

Acer XF270H 1080p 144Hz, DisplayPort 1.2, GTX 1070, Windows 10.

 

Works just fine for me. Confirmed no tearing in Borderlands 2, no display blanking, no blur, no noticeable ghosting.

 

Edit: Checked using G-Sync Pendulum demo, still no issues.

 

Edit 2: The screen blanks out momentarily when opening the Spotify desktop app.

Case: Thermaltake Versa H35 | CPU: AMD Ryzen 7 1700x (@4.0Ghz) Cooling: Cooler Master MasterLiquid Lite 240 | MOBO: Gigabyte AB350M-DS3H | RAM: Corsair Vengeance Pro RGB 16GB (2x8GB) 3333Mhz | GPU: MSI ARMOR 8GB OC GTX 1070 | Storage: SAMSUNG 970 EVO 250GB, 1TB Seagate 2.5" 5400RPM | PSU: Corsair CX750M


Got a DisplayPort cable for my ViewSonic VX2457.

 

G-SYNC still doesn't appear in the Nvidia Control Panel. :(

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


1 minute ago, Giganthrax said:

Got a DisplayPort cable for my ViewSonic VX2457.

 

G-SYNC still doesn't appear in the Nvidia Control Panel. :(

It appears when you enable FreeSync on the monitor.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


3 minutes ago, jones177 said:

It appears when you enable FreeSync on the monitor.

Holy crap, I love you!

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


LG 32UD59-B 32" 4K UHD

 

G-Sync shows up in the Nvidia Control Panel, but I don't see a difference between it and regular V-sync on screen.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


3 minutes ago, jones177 said:

LG 32UD59-B 32" 4K UHD

 

G-Sync shows up in the Nvidia Control Panel, but I don't see a difference between it and regular V-sync on screen.

Download the Nvidia G-Sync Pendulum demo; it has options you can change in real time to see the difference. It's most noticeable on the test pattern.
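For context, the difference the demo makes visible comes down to how long each rendered frame is held on screen: with fixed-refresh V-sync a late frame waits for the next refresh boundary, while with G-Sync/FreeSync the refresh simply follows the frame time inside the monitor's VRR window. A minimal Python sketch of the idea (the 40 Hz floor and all timings are illustrative assumptions, not measurements):

```python
# Rough sketch of what the Pendulum demo visualises: how long one rendered
# frame stays on screen with fixed-refresh V-sync versus a VRR display.
# All numbers are illustrative, not measurements.
import math

def display_time_ms(render_ms, refresh_hz, vrr=False, vrr_min_hz=40):
    """How long a frame occupies the screen, in milliseconds."""
    interval = 1000.0 / refresh_hz      # one fixed refresh interval
    if vrr:
        floor = 1000.0 / vrr_min_hz     # slowest refresh the panel allows
        if render_ms <= interval:       # faster than max refresh: capped there
            return interval
        if render_ms <= floor:          # inside the VRR window: the refresh
            return render_ms            # simply tracks the frame time
        return render_ms                # below the floor the driver may repeat
                                        # frames (LFC); kept simple here
    # V-sync: the frame waits for the next fixed refresh boundary, so a late
    # frame is held for a whole extra interval (the visible judder).
    return math.ceil(render_ms / interval) * interval

for render_ms in (10.0, 18.0, 25.0):    # roughly 100, 55 and 40 fps frames
    vs = display_time_ms(render_ms, 60)
    av = display_time_ms(render_ms, 60, vrr=True)
    print(f"{render_ms:4.1f} ms frame -> V-sync holds it {vs:4.1f} ms, VRR {av:4.1f} ms")
```

At 60 Hz, an 18 ms frame is held for a full 33.3 ms under V-sync but only ~18 ms with VRR; that held frame is the stutter the demo makes obvious on the test pattern.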

Case: Thermaltake Versa H35 | CPU: AMD Ryzen 7 1700x (@4.0Ghz) Cooling: Cooler Master MasterLiquid Lite 240 | MOBO: Gigabyte AB350M-DS3H | RAM: Corsair Vengeance Pro RGB 16GB (2x8GB) 3333Mhz | GPU: MSI ARMOR 8GB OC GTX 1070 | Storage: SAMSUNG 970 EVO 250GB, 1TB Seagate 2.5" 5400RPM | PSU: Corsair CX750M


Okay, something I don't quite understand. I tested it with the Pendulum demo and then played Doom 2016 for 10 minutes with G-Sync ON and V-sync OFF to test it out. I'm happy to report I noticed no screen tearing, blurring, or anything weird, really. :)

 

The one thing that confuses me is that the Steam overlay FPS counter shows a healthy 200 FPS. Does this mean that G-Sync/FreeSync divorces the FPS counter from the actual framerate shown on screen? Do I no longer have to turn off V-sync when benchmarking my games to see how well they run on my system? :D

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


20 minutes ago, Giganthrax said:

Okay, something I don't quite understand. I tested it with the Pendulum demo and then played Doom 2016 for 10 minutes with G-Sync ON and V-sync OFF to test it out. I'm happy to report I noticed no screen tearing, blurring, or anything weird, really. :)

 

The one thing that confuses me is that the Steam overlay FPS counter shows a healthy 200 FPS. Does this mean that G-Sync/FreeSync divorces the FPS counter from the actual framerate shown on screen? Do I no longer have to turn off V-sync when benchmarking my games to see how well they run on my system? :D

The FPS counter is always divorced from what is shown on the screen; it shows what is rendered, not what is displayed.

 

You don't have to turn V-sync off because you don't have to turn it on in the first place. G-Sync is meant to remove the need for V-sync and its negative consequences (stutter and desynchronization when below the maximum refresh rate, plus increased input lag) while keeping its advantage (no tearing). It doesn't lock to any framerate by itself, though, so you have to use a framerate limiter to stay within the G-Sync/FreeSync range.

 

Also, that 200 FPS is the framerate cap in Doom 2016, so it seems you were just running at a stable 200 FPS. I don't know what your display's FreeSync range is, but if it doesn't go up to 200 Hz, then you were out of range while playing, so G-Sync wasn't even active.
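To make the "stay within the range" point concrete, here is a rough sketch in Python. The 48-75 Hz window is an assumed value for illustration (the real window comes from the monitor's spec sheet or EDID); the common community advice is to cap a few FPS below the maximum refresh so frame-time spikes don't push you out of the window:

```python
# Rough sketch: check whether a frame rate falls inside a monitor's VRR window
# and pick a frame cap that keeps it there. The 48-75 Hz range is an assumed
# value for illustration; the real window comes from the monitor's spec/EDID.

def in_vrr_window(fps, vrr_min_hz, vrr_max_hz):
    """True if the panel can match this frame rate directly."""
    return vrr_min_hz <= fps <= vrr_max_hz

def suggested_cap(vrr_max_hz, margin=3):
    """Cap a few FPS under the max refresh so frame-time spikes stay inside."""
    return vrr_max_hz - margin

vrr_min, vrr_max = 48, 75               # assumed FreeSync range for a 75 Hz panel
for fps in (60, 75, 200):
    state = ("inside the VRR window" if in_vrr_window(fps, vrr_min, vrr_max)
             else "outside the window -> VRR not active")
    print(f"{fps:3d} fps: {state}")
print("suggested frame cap:", suggested_cap(vrr_max), "fps")
```

With a 75 Hz panel, a 200 FPS game is far above the window, so the variable refresh does nothing; capping at roughly 72 FPS keeps it engaged.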


CPU: i7-6900K 4.5 GHz | Motherboard: X99 Sabertooth | GPU: RTX 2080 Ti SLi | RAM: 32GB DDR4-3400 CL13 | SSD: SX8200 PRO 512GB | PSU: Corsair AX1600i | Case: InWin 805C | Monitor: LG 38UC99-W 85Hz | Keyboard: Wooting One Analog | Keypad: Azeron Compact Analog | Mouse: Swiftpoint Z | Audio: Klipsch Reference 5.1

 


10 hours ago, asand1 said:

Is that documented anywhere by Nvidia? All I've read or heard was that it "was being tested on Pascal and Turing." Not that it would be limited to them.

[Attached screenshot of a PDF page viewed in Adobe Acrobat]

 

Is anyone having problems getting the option to show up in the Manage 3D Settings tab when not on Windows 10?


Tested some more. Interestingly, Overwatch automatically knew to keep the FPS at 75 (my monitor's refresh rate) with V-sync off and max FPS set to 300, and there was no screen tearing I could see with just FreeSync/G-Sync. Disregard this, though; I just remembered I play Overwatch in fullscreen windowed mode. :D

 

Heroes of the Storm had no such aspirations, however, and the FPS rocketed into the ~350 range, which caused visible screen tearing while scrolling the map. Turning V-sync on fixed the problem.

 

The Surge had very tiny amounts of screen tearing/flickering in certain places, mostly near the edges. Turning V-sync on fixed the problem. Otherwise, it was buttery smooth.

 

Will test some more games in the coming days, but on the whole I'm happy with this little boost to my monitor gaming. :]

Ryzen 1600x @4GHz

Asus GTX 1070 8GB @1900MHz

16 GB HyperX DDR4 @3000MHz

Asus Prime X370 Pro

Samsung 860 EVO 500GB

Noctua NH-U14S

Seasonic M12II 620W

+ four different mechanical drives.


So I did more testing, this time in games that aren't difficult to run. It's even stranger this time.

 

In games like TF2 and L4D2, running at 144 FPS or higher was pleasantly smooth. But the moment the framerate dips, it stutters and tears, which is exactly the opposite of what should happen.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


10 hours ago, 420istoday said:

Download the Nvidia G-Sync Pendulum demo; it has options you can change in real time to see the difference. It's most noticeable on the test pattern.

Thanks, I did.

 

Running the demo, I got screen tearing. Switching on FreeSync on the monitor got rid of it.

 

Then I did what I do with all my games: I added it to the "Manage 3D settings" list in the Nvidia Control Panel and turned V-sync on. The demo ran better than it did with FreeSync.

 

My LG 4K monitor probably won't get the G-Sync Compatible sticker, but it is easy to get rid of screen tearing on it.

My other monitor is a Samsung 3440x1440 60Hz, and it is much harder to fix screen tearing on it. When I used an i7 2600K with a P67 motherboard, it was impossible to get rid of all the tearing on that monitor. It likes my Maximus Hero and i7 8700K, so the monitor is now tear-free.

 

The last monitor I bought was an LG 32" 1440p 144Hz with G-Sync. My games did not run better with that technology either. It cost me $530 to find that out.

 

With Nvidia introducing this technology I can't see myself getting a monitor without it. 

1 hour ago, D13H4RD said:

So I did more testing, this time in games that aren't difficult to run. It's even stranger this time.

 

In games like TF2 and L4D2, running at 144 FPS or higher was pleasantly smooth. But the moment the framerate dips, it stutters and tears, which is exactly the opposite of what should happen.

CPU hits seem to upset it. My heavily modded Fallout 4 goes from a GPU bottleneck to a CPU bottleneck within a few feet of movement, and it is also prone to screen tearing. A stutter and a tear happen every time there is a CPU hit. With Assassin's Creed Odyssey I get no CPU bottlenecks, so the experience is smooth.

 

I am not going to knock this technology for that, because my 144Hz G-Sync monitor still got the stutter without the tear, and I paid for that.

 

 

  

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


GTX 1060 6GB

DisplayPort 1.2

AOC G2460PF 144Hz

 

G-Sync Pendulum demo: worked well from 35-144Hz. I dropped it down to 20 FPS and I could tell the difference in the 20-35 range.
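That behaviour fits the 35-144 Hz window you describe: below the floor the driver can only stay tear-free by repeating frames (low-framerate compensation), and 20 fps looks choppy either way. A small Python sketch of the frame-repeating idea, assuming the same 35-144 Hz range:

```python
# Sketch of low-framerate compensation (LFC): when the game drops below the
# VRR floor, the driver can show each frame more than once so the panel's
# refresh stays inside its supported window. Uses an assumed 35-144 Hz range.

def lfc_refresh_hz(fps, vrr_min_hz=35, vrr_max_hz=144):
    """Return (repeat count, effective panel refresh) for a given frame rate."""
    if fps >= vrr_min_hz:
        return 1, min(fps, vrr_max_hz)  # in range: refresh tracks the game
    repeats = 1
    while fps * (repeats + 1) <= vrr_max_hz and fps * repeats < vrr_min_hz:
        repeats += 1
    return repeats, fps * repeats

for fps in (20, 30, 60, 144):
    n, hz = lfc_refresh_hz(fps)
    print(f"{fps:3d} fps -> each frame shown {n}x, panel runs at ~{hz} Hz")
```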

 

I played both The Witcher 3 and Batman: Arkham Origins briefly, and both looked good; no issues that I could tell.

 

So far so good with mine.


 

21 hours ago, D13H4RD said:

Similar problem for my SHG50. 

 

The overdrive is permanently locked to whatever it thinks is best, which in 90% of cases makes it look like a really bad acid trip in terms of motion blur.

 

21 hours ago, kevinisbeast707 said:

I have the same monitor and noticed the same thing happening. It works pretty much flawlessly, except that it changes the overdrive setting to whatever it feels is best for the situation, which unfortunately is often Normal, which is way blurrier than Extreme. Also, at around 50 FPS in GTA at night it likes to flicker, and I don't know why, but that's the only problem I've found.


 

On my panel I was using Normal anyway; Extreme had what looked like purple overshoot. I guess it can differ from panel to panel.

Needs Update


10 hours ago, jones177 said:

CPU hits seem to upset it. My heavily modded Fallout 4 goes from a GPU bottleneck to a CPU bottleneck within a few feet of movement, and it is also prone to screen tearing. A stutter and a tear happen every time there is a CPU hit. With Assassin's Creed Odyssey I get no CPU bottlenecks, so the experience is smooth.

 

I am not going to knock this technology for that, because my 144Hz G-Sync monitor still got the stutter without the tear, and I paid for that.

It's odd. It seems like the adaptive sync isn't working nearly as well as it should.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


9 hours ago, D13H4RD said:

It's odd. It seems like the adaptive sync isn't working nearly as well as it should.

My modded Fallout 4 is a PC killer. On older saves, not even my RTX 2080 Ti can handle it. It is the only game that I have to apply an overclock to play, and it is screaming.

My i7 8086K struggles even with a 5GHz overclock. It replaced my i7 6700K last year because the 6700K was no longer able to play it with the amount of content I like.

My game is not playable at all from a hard drive, and on a SATA SSD it is barely managing the in-game load-ins. Next upgrade: a 970 EVO 2TB.

The reason for this is that I like to add lots of NPCs to the game. Getting attacked by 3 to 5 ghouls is not fun to me. Getting attacked by 20 to 30 ghouls is fun and getting attacked by 60 ghouls is epic. Even more fun are swarms of flying bugs.

I could dial all this back, but what would be the fun in that?

 

I have been doing this type of modding since the year 2000, and because of it, vanilla games are easy to run. I really don't get performance issues unless I cause them. So adaptive sync is working properly, right up to the point where I break it.

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


So how about a Google Sheets spreadsheet as a database for the monitor models that have been tested by members?

"You know it'll clock down as soon as it hits 40°C, right?" - "Yeah ... but it doesnt hit 40°C ... ever  😄"

 

GPU: MSI GTX1080 Ti Aero @ 2 GHz (watercooled) CPU: Ryzen 5600X (watercooled) RAM: 32GB 3600Mhz Corsair LPX MB: Gigabyte B550i PSU: Corsair SF750 Case: Hyte Revolt 3

 


  • 5 weeks later...

Hey guys, I was late to this update and only got it last night. Everything seems to be working fine. I have enabled adaptive sync on both monitors (Asus XG27VQ) at 120Hz, and in the Nvidia Control Panel I have both monitors set to G-Sync Compatible, even though I know this monitor isn't officially on the supported list.

I went and did a UFO test, and the results show that I'm running at 60Hz only. When I game, my frames go up to 120 FPS or lower; I have it capped.

 

My question is: with adaptive sync, is it normal that my refresh rate sits at 60Hz during normal Windows use and only gets turned up during gaming?
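One way to check what refresh rate Windows is actually driving the desktop at (independent of what the browser-based UFO test reports) is to ask the OS directly. A small sketch, assuming the pywin32 package is installed:

```python
# Small sketch: ask Windows for the current display mode of the primary monitor.
# Assumes the pywin32 package is installed (pip install pywin32).
import win32api
import win32con

mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz")
```

If that reports 60 Hz on the desktop, the 120 Hz mode is only being selected when a game switches the display mode; adaptive sync itself varies the refresh while a game is drawing, but it does not change the base refresh rate set in Windows or the Nvidia Control Panel.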

 

 

 

