
NVIDIA finally officially supports Adaptive Sync (with a small catch)

One of the biggest bombshells in NVIDIA's CES 2019 press conference was the announcement that NVIDIA GPU users will no longer need to buy a G-SYNC monitor to use adaptive sync, as the company now extends official support to a number of FreeSync-capable monitors, dubbed "G-SYNC Compatible". 

Quote

Nvidia is crashing AMD’s FreeSync party. The company announced today at CES that it is testing every “adaptive sync” monitor to see if they meet its GSync standards. So far, it has found 12 displays function properly, and it is going to automatically turn on GSync support for those monitors in its GeForce software.

There is a small catch, however. Jensen Huang claims that after testing approximately 400 monitors on the market, only 12 were deemed worthy of bearing the "G-SYNC Compatible" moniker. These 12 are:

Spoiler
  • Acer XFA240
  • Acer XZ321Q
  • Acer XG270HU
  • Acer XV273K
  • Agon AG241QG4
  • AOC G2590FX
  • Asus MG278Q
  • Asus XG248
  • Asus VG258Q
  • Asus XG258
  • Asus VG278Q
  • BenQ Xl2740

With all that said, if your monitor is on the "We're not wooooorthyyyyy" list, the setting can still be enabled manually from the NVIDIA Control Panel via a driver update releasing on the 15th of January, whereas supported monitors will have it enabled automatically. 

 

D13H4RD's opinion 

Spoiler

This is the best news out of the entire press conference by far. While G-SYNC is neat, these monitors can be significantly pricier than their FreeSync counterparts. NVIDIA's announcement basically means that while they can't guarantee the best experience on uncertified monitors, it should work. 

Source: VentureBeat

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Station (Intel-powered Lenovo Yoga Slim 7i)

CPU: Intel Core i5 1135G7 | GPU: Intel Iris Xe 80CU | RAM: 16GB LPDDR4X-4267 | Storage: 512GB PCIe SSD | OS: Microsoft Windows 10 Home

 

The Communicator (Exynos-powered Samsung Galaxy Note8)

SoC: Exynos 8895 | GPU: ARM Mali G71 MP20 | RAM: 6GB LPDDR4 | Storage: 64GB internal + 128GB microSD | Display: 6.3" 1440p "Infinity Display" AMOLED | OS: Android Pie 9.0 w/ OneUI


Freesync and Adaptive sync are two different things.

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.


Random thought: there's "good" FreeSync and there's not-so-good. The adaptive range may be limited on cheaper models, and there's a frame-doubling feature for low-fps operation that might not be present on cheaper displays. I'd guess Nvidia would only certify the better monitors, the ones with enough of the FreeSync feature set to be comparable to G-SYNC. The lower-end models would be at your own risk.

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, EVGA 2080Ti, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p144 G-sync
TV Gaming system: Gigabyte Z490 Elite AC, i7-11700k, Noctua D15, Kingston HyperX RGB 4000@3600 2x8GB, MSI 3070 Gaming Trio X, EVGA Supernova G2L 850W, Corsair 230T, Crucial P1 1TB + MX500 1TB, LG OLED55B9PLA 4k120 G-Sync Compatible
Streaming system: Asus X299 TUF mark 2, i9-7920X, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Asus Strix 1080Ti, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, Crucial BX500 1TB, BenQ XL2411 1080p144 + HP LP2475w 1200p60
Former Main system: Asus Maximus VIII Hero, i7-6700k, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, GTX 980Ti FE, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, Acer RT280k 4k60 FreeSync [link]
Gaming laptop: Asus FX503VD, i5-7300HQ, DDR4 2133 2x8GB, GTX 1050, Sandisk 256GB + 480GB SSD [link]


 

3 minutes ago, Ja50n said:

Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Nvidia doesn't "take a cut"; they sell the FPGA module that is embedded in the monitor, and yes, they do make some profit off of it. G-SYNC is also slightly better than FreeSync in that it works all the way down to 1 fps, unlike FreeSync, and to be G-SYNC approved, the monitor has to pass Nvidia's quality control.

10 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

What are the main differences between VESA Adaptive-Sync and AMD FreeSync? 

 

Just wondering 


2 minutes ago, D13H4RD said:

What are the main differences between VESA Adaptive-Sync and AMD FreeSync? 

 

Just wondering 

The name? Without digging, I don't know either. There was also FreeSync 2, which I think adds HDR support. Was that also standards-based, or an AMD extension to the VESA standard? It's common for standards to have mandatory and optional parts, so maybe it's just a feature extension?



 

14 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Easy to see when looking back on it.

Not all format wars head in the same direction, and quality rarely has anything to do with it. Whilst G-SYNC is objectively better, most end users these days don't necessarily need any form of sync, making the whole ordeal dead in the water before it began.

 

Also, this is a bad thing for AMD. As it stands now, if you can use your Nvidia card on your FreeSync monitor, then FreeSync adds nothing to the argument for a Navi-based upgrade. Nvidia has essentially brought the competition back to raw horsepower and RT whilst making their money from G-SYNC and investing nothing in FreeSync.

QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.

13 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

From what I know, Freesync is just a marketing name AMD came up with for adaptive sync.

 

15 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel. 

Be careful when looking at it like that, though. When people compare prices between G-Sync and FreeSync monitors, they often forget things like all G-Sync monitors supporting a strobing backlight, while only some FreeSync monitors do (because it's something that has to be added separately).

 

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be if both monitors were otherwise identical. Not sure anyone makes such a pair, though.

 

 

10 minutes ago, porina said:

The name? Without digging I don't know either, but there was also FreeSync 2 with I think HDR support. Was that also standards based or an AMD extension to the VESA standard? Without looking it up, it is common for standards to have mandatory and optional parts, so maybe that's just a feature extension? 

I haven't done that much digging, but from what I understand Adaptive-Sync works independently of the color information being transmitted. If that's the case then Adaptive-Sync supports any past and future color standards, and FreeSync 2 is just a marketing name for a monitor which supports HDR as well as FreeSync. The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

1 hour ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

But finding a FreeSync panel of comparable quality wasn't always trivial either, in the sea of shitty ones.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 

1 hour ago, LAwLz said:

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be if both monitors were otherwise identical.

If my memory is accurate, I believe first gen G-sync modules cost monitor OEMs something like $150-$200 per module. I'd imagine that cost went up with versions 2 and 3.

1 hour ago, LAwLz said:

The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

FreeSync 2 has low framerate compensation, unlike FreeSync 1.

 

G-SYNC was better than FreeSync, but FreeSync 2 has brought them to parity.

 

AFAIK at least; please correct me if I'm wrong.


It would be quite interesting if we had a Google Sheets document for users to submit their results with their monitor models.

I have been complaining about FreeSync not making sense from the start, and this might actually be awesome if it works. 

Definitely testing mine, since Wildlands is currently giving my 1080 Ti a hard time reaching "good" fps.

"I know its stupidly overdone and unreasonably unneccesary but wouldnt it be awesome if ..."

 

CPU: Delidded i7 7700k (watercooled) Cooling: 3x 360 rads by Alphacool  MB: ASUS z270i  RAM: 32 GB Corsair Dominator Platinum  GPU: MSI GTX1080 Ti Aero @ 2 GHz (watercooled)  DISPLAY: LG 34UC88-B 21:9 1440p SSD(OS): Samsung 960 EVO 250GB SSD(Games): Corsair MP510 960GB SSD(Applikations): Samsung 850 EVO 500GB  HDD(Scratch): WD Blue 500GB HDD(Downloads): WD Blue 320GB HDD(Long-term): WD Green 2TB (external)   PSU: Corsair SF750 Case: Lian Li PC-O11 D Mouse: Logitech MX Master Keyboard: Logitech G513 Carbon

 

21 minutes ago, GoldenLag said:

FreeSync 2 has low framerate compensation, unlike FreeSync 1.

LFC is not monitor-dependent: it works on any FreeSync monitor whose maximum refresh rate is at least 2x its minimum. It was added to AMD's drivers long before FreeSync 2 existed.
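If my understanding above is right, the 2x rule and the frame-repetition trick behind LFC can be sketched roughly like this (an illustration only; the function names and the 48-144 Hz range are made up for the example, not anything from AMD's actual drivers):

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    # LFC needs the variable refresh range to be wide enough that any
    # frame rate below the minimum can be lifted back into range by
    # repeating frames: max must be at least 2x min.
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    # Show each frame enough times that the display's refresh rate
    # stays inside its variable range [min_hz, max_hz].
    if fps >= min_hz:
        return min(fps, max_hz)  # already in range (capped at max)
    repeats = 1
    while fps * repeats < min_hz:
        repeats += 1
    return fps * repeats

# A 48-144 Hz panel qualifies (144 >= 2 * 48); a 48-75 Hz one doesn't.
# At 20 fps, the 48-144 Hz panel refreshes at 60 Hz (each frame shown 3x).
```

The 2x requirement is what guarantees the repeated rate always lands inside the range: if the range were narrower, some frame rates just below the minimum would overshoot the maximum when doubled.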


Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious to see how the "not worthy" monitors play along with this.

Make sure to quote or tag people, so they get notified.

2 minutes ago, Cyberspirit said:

Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious to see how the "not worthy" monitors play along with this.

"Not worthy" probably won't play that well; Nvidia wants to keep some incentive to still ship G-SYNC monitors. 

1 minute ago, CorruptedSanity said:

I literally clapped and cheered when I heard him talk about this as I have one of the Asus monitors listed.  My wife thought I was crazy

Don't make it sound like an r/thathappened story.


I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same.

i7-8086K, Strix Z370E-Gaming, G.Skill Trident 32gb 3000MHZ CL 14, Strix 1080 Ti OC, Corsair HX1000i, Obsidian 1000D, Corsair Hydro Series h150I, 15x Corsair LL120, Corsair Lighting Node Pro, 2x SSD Adata SU800 3DNand - 1tb and 128gb, 1Tb WD Blue, Cable Mod Full Cable Kit, Monitor Asus XG27VQ 144Mhz Curved

6 minutes ago, GoldenLag said:

Don't make it sound like an r/thathappened story.

Sorry, I don't get it.  It's probably the biggest announcement in my eyes so I was excited. 

Just now, CorruptedSanity said:

Sorry, I don't get it.  It's probably the biggest announcement in my eyes so I was excited. 

*And everyone clapped and cheered.

 

Just look up r/thathappened. You'll be in for a fun time.


Interesting.

 

I am very curious to see if manually enabling this on my Samsung C32HG70 will work or not.

11 minutes ago, Enochian said:

I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same.

 

5 minutes ago, Ergroilnin said:

I am very curious to see if manually enabling this on my Samsung C32HG70 will work or not.

We could all give it a shot when the driver releases. 

 

My SHG50 isn't listed, but we can try. 


