
NVIDIA finally officially supports Adaptive Sync (with a small catch)

D13H4RD

One of the biggest bombshells in NVIDIA's CES 2019 press conference was the announcement that NVIDIA GPU users will no longer have to buy a G-SYNC monitor to use adaptive sync, as the company now extends official support to a number of FreeSync-capable monitors, dubbed "G-SYNC Compatible".

Quote

Nvidia is crashing AMD’s FreeSync party. The company announced today at CES that it is testing every “adaptive sync” monitor to see if they meet its GSync standards. So far, it has found 12 displays function properly, and it is going to automatically turn on GSync support for those monitors in its GeForce software.

There is a small catch, however. Jensen Huang claims that after testing approximately 400 monitors on the market, only 12 were deemed worthy of bearing the "G-SYNC Compatible" moniker. These 12 are:

Spoiler
  • Acer XFA240
  • Acer XZ321Q
  • Acer XG270HU
  • Acer XV273K
  • Agon AG241QG4
  • AOC G2590FX
  • Asus MG278Q
  • Asus XG248
  • Asus VG258Q
  • Asus XG258
  • Asus VG278Q
  • BenQ XL2740

With all that said, if your monitor is on the "We're not wooooorthyyyyy" list, the setting can still be enabled manually in the NVIDIA Control Panel as part of a driver update releasing on the 15th of January, whereas supported monitors will have it enabled automatically.

 

D13H4RD's opinion 

Spoiler

This is the best news out of the entire press conference by far. While G-SYNC is neat, G-SYNC monitors can be significantly pricier than their FreeSync counterparts. NVIDIA's announcement basically means that even on monitors they haven't certified, adaptive sync should work, even if they can't guarantee the best experience.

Source: VentureBeat

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Freesync and Adaptive sync are two different things.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.


Random thought, there's "good" FreeSync, and there's not so good... the adaptive range may be limited on cheaper models, and there was some frame doubling thing for low fps operation that might not be present on cheaper displays. I'd guess that nvidia would only certify the better monitors that have enough FreeSync feature set to be comparable to G-sync. The lower end models would be at your own risk.
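To make that frame doubling idea concrete, here's a minimal Python sketch of the general mechanism; the VRR ranges in it are made-up examples, not any particular monitor's spec.

```python
# Rough sketch of how "frame doubling" keeps a VRR panel inside its range
# when the game's frame rate drops below the panel's minimum refresh rate.
# The ranges used here are hypothetical examples.

def effective_refresh(fps, vrr_min, vrr_max):
    """Refresh rate the panel could run at for a given frame rate,
    or None if no whole-number frame repeat lands inside the range."""
    if vrr_min <= fps <= vrr_max:
        return fps                      # in range: refresh simply tracks the frame rate
    if fps > vrr_max:
        return vrr_max                  # above range: capped at the panel's maximum
    for repeats in range(2, 10):        # below range: show each frame 2x, 3x, ... times
        if vrr_min <= fps * repeats <= vrr_max:
            return fps * repeats
    return None                         # nothing fits: panel falls back to fixed refresh

print(effective_refresh(30, 48, 144))   # 60   -> wide range, doubling works
print(effective_refresh(47, 48, 75))    # None -> narrow range, 47*2 = 94 overshoots 75
```

A wide range like 48-144 Hz leaves room for whole-number frame repeats, while a narrow range like 48-75 Hz often doesn't, which is roughly why cheaper panels can miss out on that behaviour.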

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


3 minutes ago, Ja50n said:

Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Nvidia doesn't "take a cut"; they sell the FPGA module that is embedded in the monitor, and yes, they do make some profit off of it. G-SYNC is also slightly better than FreeSync in that it works all the way down to 1 fps, and to be G-SYNC approved, the monitor has to go through Nvidia's quality control.


10 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

What are the main differences between VESA Adaptive-Sync and AMD FreeSync?

 

Just wondering 



2 minutes ago, D13H4RD said:

What are the main differences between VESA Adaptive-Sync and AMD FreeSync?

 

Just wondering 

The name? Without digging I don't know either, but there was also FreeSync 2 with I think HDR support. Was that also standards based or an AMD extension to the VESA standard? Without looking it up, it is common for standards to have mandatory and optional parts, so maybe that's just a feature extension?



14 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Easy to see when looking back on it.

Not all format wars head in the same direction, and quality rarely has anything to do with it. Whilst G-Sync is objectively better, most end users these days don't necessarily need any form of sync, making the whole ordeal dead in the water before it began.

 

Also, this is a bad thing for AMD. As it stands now, if you can use your Nvidia card on your FreeSync monitor, then FreeSync adds nothing to the argument for a Navi-based upgrade. Nvidia have essentially brought the competition back to raw horsepower and RT whilst making their money out of G-Sync and investing nothing in FreeSync.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


13 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

From what I know, Freesync is just a marketing name AMD came up with for adaptive sync.

 

15 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel. 

Be careful when looking at it like that, though. A lot of times when people compare prices between G-Sync and FreeSync monitors, they forget things like all G-Sync monitors supporting a strobing backlight, while only some FreeSync monitors support it (because it's something that has to be added separately).

 

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be, if both monitors were otherwise identical. Not sure if anyone makes that though.

 

 

10 minutes ago, porina said:

The name? Without digging I don't know either, but there was also FreeSync 2 with I think HDR support. Was that also standards based or an AMD extension to the VESA standard? Without looking it up, it is common for standards to have mandatory and optional parts, so maybe that's just a feature extension? 

I haven't done that much digging, but from what I understand Adaptive-Sync works independently of the color information being transmitted. If that's the case then Adaptive-Sync supports any past and future color standards, and FreeSync 2 is just a marketing name for a monitor which supports HDR as well as FreeSync. The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.


Spoiler

Also I stole this me- I mean, I made this

 

 

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


Was nice to see my MG278Q show up on the slide. Bring on the 15th!!!!


1 hour ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

But finding a FreeSync panel of comparable quality wasn't always trivial either, in the sea of shitty ones.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


1 hour ago, LAwLz said:

From what I know, Freesync is just a marketing name AMD came up with for adaptive sync.

 

Be careful when looking at it like that, though. A lot of times when people compare prices between G-Sync and FreeSync monitors, they forget things like all G-Sync monitors supporting a strobing backlight, while only some FreeSync monitors support it (because it's something that has to be added separately).

 

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be, if both monitors were otherwise identical. Not sure if anyone makes that though.

 

 

I haven't done that much digging, but from what I understand Adaptive-Sync works independently of the color information being transmitted. If that's the case then Adaptive-Sync supports any past and future color standards, and FreeSync 2 is just a marketing name for a monitor which supports HDR as well as FreeSync. The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

If my memory is accurate, I believe first gen G-sync modules cost monitor OEMs something like $150-$200 per module. I'd imagine that cost went up with versions 2 and 3.


1 hour ago, LAwLz said:

The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

Freesync 2 has low framerate compensation unlike freesync 1.

 

Gsync was better than Freesync, but Freesync 2 has brought them to parity.

 

AFAIK at least, please correct me if I'm wrong.


It would be quite interesting if we had something like a shared Google spreadsheet for users to submit their results with their monitor models.

I have been complaining about FreeSync not making sense from the start, and this might actually be awesome if it works.

Definitely testing mine, since Wildlands is currently giving my 1080 Ti a hard time reaching "good" fps.

"You know it'll clock down as soon as it hits 40°C, right?" - "Yeah ... but it doesnt hit 40°C ... ever  😄"

 

GPU: MSI GTX1080 Ti Aero @ 2 GHz (watercooled) CPU: Ryzen 5600X (watercooled) RAM: 32GB 3600Mhz Corsair LPX MB: Gigabyte B550i PSU: Corsair SF750 Case: Hyte Revolt 3

 


21 minutes ago, GoldenLag said:

Freesync 2 has low framerate compensation unlike freesync 1.

 

Gsync was better than Freesync, but Freesync 2 has brought them to parity.

 

AFAIK at least, please correct me if I'm wrong.

LFC is not monitor-dependent; it works on any FreeSync monitor whose maximum refresh rate is at least 2x the minimum of its range. It was added to AMD's drivers long before FreeSync 2 existed.
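As a rough illustration of that 2x rule, here's a minimal sketch; the ranges below are hypothetical examples, not any specific monitor's spec.

```python
# Minimal sketch of the LFC eligibility rule described above:
# the panel's maximum VRR rate must be at least twice its minimum.

def supports_lfc(vrr_min, vrr_max):
    """True if low framerate compensation can engage on this VRR range."""
    return vrr_max >= 2 * vrr_min

print(supports_lfc(48, 144))  # True  - typical 48-144 Hz gaming panel
print(supports_lfc(48, 75))   # False - narrow range: 2 * 48 = 96 > 75
print(supports_lfc(40, 60))   # False - 2 * 40 = 80 > 60
```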


Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious to see how the "not worthy" monitors play along with this.

Make sure to quote or tag people, so they get notified.


2 minutes ago, Cyberspirit said:

Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious to see how the "not worthy" monitors play along with this.

"Not worthy" will probably play well. Nvidia wants some incentive to still ship Gsync monitors. 


1 minute ago, CorruptedSanity said:

I literally clapped and cheered when I heard him talk about this as I have one of the Asus monitors listed.  My wife thought I was crazy

Don't make it sound like an r/thathappened story.


I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same...

i7-8086K, Strix Z370E-Gaming, G.Skill Trident 32GB 3000MHz CL 14, Strix 1080 Ti OC, Corsair HX1000i, Obsidian 1000D, Corsair Hydro X custom loop, 13x Corsair LL120, Corsair Lighting Node Pro, 2x SSD Adata SU800 3DNand - 1TB and 128GB, 1TB WD Blue, Cable Mod Full Cable Kit, Monitor Asus XG27VQ 144Hz Curved


6 minutes ago, GoldenLag said:

Don't make it sound like an r/thathappened story.

Sorry, I don't get it.  It's probably the biggest announcement in my eyes so I was excited. 


Just now, CorruptedSanity said:

Sorry, I don't get it.  It's probably the biggest announcement in my eyes so I was excited. 

*And everyone clapped and cheered

 

 

Just look up r/thathappened. You'll be in for a fun time.


Interesting.

 

I am very curious to see if manually enabling this on my Samsung C32HG70 will work or not.


11 minutes ago, Enochian said:

I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same...

 

5 minutes ago, Ergroilnin said:

I am very curious to see if manually enabling this on my Samsung C32HG70 will work or not.

We could all give it a shot when the driver releases. 

 

My SHG50 isn't listed but we can try. 


