D13H4RD

NVIDIA finally officially supports Adaptive Sync (with a small catch)

Recommended Posts

Posted · Original Poster (OP)

One of the biggest bombshells in NVIDIA's CES 2019 press conference was the announcement that NVIDIA GPU users will no longer need a G-SYNC monitor to use adaptive sync, as the company now extends official support to a number of FreeSync-capable monitors, dubbed "G-SYNC capable". 

Quote

Nvidia is crashing AMD’s FreeSync party. The company announced today at CES that it is testing every “adaptive sync” monitor to see if they meet its GSync standards. So far, it has found 12 displays function properly, and it is going to automatically turn on GSync support for those monitors in its GeForce software.

There is a small catch, however. Jensen Huang claims that after testing approximately 400 monitors on the market, only 12 were deemed worthy of bearing the "G-SYNC CAPABLE" moniker. These 12 are:

Spoiler
  • Acer XFA240
  • Acer XZ321Q
  • Acer XG270HU
  • Acer XV273K
  • Agon AG241QG4
  • AOC G2590FX
  • Asus MG278Q
  • Asus XG248
  • Asus VG258Q
  • Asus XG258
  • Asus VG278Q
  • BenQ XL2740

With all that said, if your monitor is on the "We're not wooooorthyyyyy" list, the setting can still be enabled manually from the GPU Control Panel as part of a driver update releasing on the 15th of January, whereas supported monitors will have it enabled automatically. 

 

D13H4RD's opinion 

Spoiler

This is the best news out of the entire press conference by far. While G-SYNC is neat, those monitors can be significantly pricier than their FreeSync counterparts. NVIDIA's announcement basically means that while they can't guarantee the best performance, it should work. 

Source: VentureBeat


Please tag me if you need assistance or if you want me to contribute to a topic 

 

ASUS RoG STRIX GL502VM

Intel Core i7 7700HQ | GeForce GTX 1060 6GB | 16GB DDR4-2133 | 128GB SanDisk M.2 SATA SSD + 1TB 7200RPM Hitachi HDD | 15.6" 1080p IPS monitor @ 60Hz w/ G-SYNC | Windows 10 64-bit

 

Samsung Galaxy Note8 SM-N950F

Exynos 8895 (4x Mongoose @ 2.3GHz, 4x Cortex A53 @ 1.7GHz) | ARM Mali G71 MP20 | 6GB LPDDR4 | 64GB Samsung NAND flash w/ UFS 2.1 dual-lane controller + 128GB SanDisk C10 UHS-I microSD | 6.3" 1440p "Infinity Display" AMOLED | Android Nougat 7.1.1 w/ Samsung Experience 8.5


Freesync and Adaptive sync are two different things.


My sound system costs more than my PC.        Check out my S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Intel i7 4790k | ASUS GTX770 | ASUS Sabertooth Z97 Mark S | Corsair Vengeance Pro 32GB | NZXT S340 | Seasonic Platinum 760 | modded H100i | Ducky ONE White TKL RGB | Logitech MX Master 2S | 2x Samsung 850 Pro 512GB | WD Red 4TB Samsung 58" 4k TV | 2x Behringer NEKKST K8 | BIC Acoustech H-100II | Scarlett 2i4 | 2x AT2020

 


It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.


Random thought: there's "good" FreeSync, and there's not-so-good. The adaptive range may be limited on cheaper models, and the frame-doubling feature for low-fps operation might not be present on cheaper displays. I'd guess that Nvidia would only certify the better monitors whose FreeSync feature set is comparable to G-Sync; the lower-end models would be at your own risk.
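The range caveat above can be sketched roughly like this (a hypothetical simplified model I made up for illustration, not any vendor's actual driver logic; the function name and categories are invented):

```python
def vrr_behavior(fps, vrr_min, vrr_max, has_lfc):
    """Roughly classify what a variable-refresh panel does at a given
    frame rate. Simplified sketch; not a real driver API."""
    if vrr_min <= fps <= vrr_max:
        return "smooth"         # refresh rate tracks the frame rate
    if fps > vrr_max:
        return "capped"         # above the window: vsync cap or tearing
    # Below the window: a capable panel/driver doubles frames (LFC),
    # a cheap narrow-range one falls back to fixed-refresh judder.
    return "frame-doubled" if has_lfc else "judder"

# A wide-range panel keeps adaptive sync useful at low fps;
# a narrow 48-75Hz budget panel does not.
print(vrr_behavior(30, 48, 144, has_lfc=True))   # frame-doubled
print(vrr_behavior(30, 48, 75, has_lfc=False))   # judder
```

The point being: two "FreeSync" monitors can behave very differently at the frame rates where adaptive sync matters most.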


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance LPX RGB 3000 2x8GB, Gigabyte RTX 2070, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 3600, Noctua D9L, Corsair Vengeance LPX RGB 3000 2x4GB, EVGA GTX 970, Corsair CX450M, NZXT Manta, Crucial MX300 525GB, Acer RT280K

VR rig: Asus Z170I Pro Gaming, i7-6600k stock, Silverstone TD03-E, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB SSD

Total CPU heating: i7-7800X, i7-5930k, i7-5820k, 2x i7-6700k, i7-6700T, i5-6600k, i7-5775C, i5-5675C, i5-4570S, i3-8350k, i3-6100, i3-4360, i3-4150T, E5-2683v3, 2x E5-2650, E5-2667, R7 3700X, R5 3600

3 minutes ago, Ja50n said:

Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Nvidia doesn't "take a cut"; they sell the FPGA module that is embedded in the monitor, and yes, they do make some profit off of it. G-SYNC is also slightly better than FreeSync in that it works all the way down to 1fps, unlike FreeSync, and to be G-SYNC approved, the monitor has to go through Nvidia's quality control.

Posted · Original Poster (OP)
10 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

What are the main differences between VESA A-Sync and AMD FreeSync? 

 

Just wondering 


2 minutes ago, D13H4RD said:

What are the main differences between VESA A-Sync and AMD FreeSync? 

 

Just wondering 

The name? Without digging, I don't know either. There was also FreeSync 2, which I think added HDR support. Was that also standards-based, or an AMD extension to the VESA standard? Without looking it up: it is common for standards to have mandatory and optional parts, so maybe that's just a feature extension?


14 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

Easy to see when looking back on it.

Not all format wars head in the same direction, and quality rarely has anything to do with it. Whilst G-Sync is objectively better, most end users these days don't necessarily need any form of sync, making the whole ordeal dead in the water before it began. 

 

Also, this is a bad thing for AMD. As it stands now, if you can use your Nvidia card on your FreeSync monitor, then FreeSync adds nothing to the argument for a Navi-based upgrade. Nvidia have essentially brought the competition back to raw horsepower and RT, whilst making their money out of G-Sync and investing nothing in FreeSync.


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

13 minutes ago, Enderman said:

Freesync and Adaptive sync are two different things.

From what I know, Freesync is just a marketing name AMD came up with for adaptive sync.

 

15 minutes ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel. 

Be careful when looking at it like that, though. A lot of the time, when people compare prices between G-Sync and FreeSync monitors, they forget things like all G-Sync monitors supporting a strobing backlight, while only some FreeSync monitors support it (because it's something that has to be added separately).

 

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be if both monitors were otherwise identical. I'm not sure anyone makes such a pair, though.

 

 

10 minutes ago, porina said:

The name? Without digging, I don't know either. There was also FreeSync 2, which I think added HDR support. Was that also standards-based, or an AMD extension to the VESA standard? Without looking it up: it is common for standards to have mandatory and optional parts, so maybe that's just a feature extension? 

I haven't done much digging, but from what I understand, Adaptive-Sync works independently of the color information being transmitted. If that's the case, then Adaptive-Sync supports any past and future color standards, and FreeSync 2 is just a marketing name for a monitor that supports HDR as well as FreeSync. The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

1 hour ago, Ja50n said:

It was easy to see where this format war was headed simply by looking at the fact that FreeSync is free while Nvidia takes a $200-$500 cut for every Gsync monitor. Their implementation was far more expensive and no better in quality than a comparable FreeSync panel.

But finding a FreeSync panel of comparable quality wasn't always trivial either, in the sea of shitty ones.


LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Galaxy S9+ - XPS 13 (9343 UHD+) - Samsung Note Tab 7.0 - Lenovo Y580

 

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 

1 hour ago, LAwLz said:

From what I know, Freesync is just a marketing name AMD came up with for adaptive sync.

 

Be careful when looking at it like that, though. A lot of the time, when people compare prices between G-Sync and FreeSync monitors, they forget things like all G-Sync monitors supporting a strobing backlight, while only some FreeSync monitors support it (because it's something that has to be added separately).

It would be interesting to know what the actual price difference between G-Sync and FreeSync would be if both monitors were otherwise identical. I'm not sure anyone makes such a pair, though.

I haven't done much digging, but from what I understand, Adaptive-Sync works independently of the color information being transmitted. If that's the case, then Adaptive-Sync supports any past and future color standards, and FreeSync 2 is just a marketing name for a monitor that supports HDR as well as FreeSync. The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

If my memory is accurate, I believe first gen G-sync modules cost monitor OEMs something like $150-$200 per module. I'd imagine that cost went up with versions 2 and 3.

1 hour ago, LAwLz said:

The actual FreeSync part of FreeSync 2 most likely works exactly the same as before.

FreeSync 2 has low framerate compensation, unlike FreeSync 1.

G-Sync was better than FreeSync, but FreeSync 2 has brought them to parity.

AFAIK, at least; please correct me if I'm wrong.


It would be quite interesting if we had something like a shared Google Sheet for users to submit their results with their monitor models.

I have been complaining about FreeSync not making sense from the start, and this might actually be awesome if it works. 

I'm definitely testing mine, since Wildlands is currently giving my 1080 Ti a hard time reaching "good" fps.


"I know its stupidly overdone and unreasonably unneccesary but wouldnt it be awesome if ..."

 

CPU: Delidded i7 7700k (watercooled) Cooling: 3x 360 rads by Alphacool  MB: ASUS z270i  RAM: G.Skill Trident Z RGB 16GB  GPU: MSI GTX1080 Ti Aero (watercooled)  DISPLAY: LG 34UC88-B 21:9 1440p SSD(OS): Samsung 960 EVO 250GB SSD(Games): Corsair MP510 960GB SSD(Applikations): Samsung 850 EVO 500GB  HDD(Scratch): WD Blue 500GB HDD(Downloads): WD Blue 320GB HDD(Long-term): WD Green 2TB (external)   PSU: Corsair SF600 Case: Lian Li PC-O11 D Mouse: Logitech MX Master Keyboard: Logitech G513 Carbon

 

21 minutes ago, GoldenLag said:

FreeSync 2 has low framerate compensation, unlike FreeSync 1.

G-Sync was better than FreeSync, but FreeSync 2 has brought them to parity.

AFAIK, at least; please correct me if I'm wrong.

LFC is not monitor-dependent; it works on any FreeSync monitor whose maximum refresh rate is at least 2x the minimum. It was added to AMD's drivers long before FreeSync 2 existed.
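That 2x rule exists because LFC works by repeating each frame: the driver picks a multiplier so the effective refresh rate lands back inside the VRR window. A minimal sketch of that idea (my own simplified model with a made-up function name, not AMD's actual algorithm):

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Refresh rate the panel would run at, with each frame repeated
    n times when fps drops below the VRR minimum (LFC).
    Hypothetical simplified model for illustration only."""
    if fps >= vrr_min:
        return min(fps, vrr_max)      # in range: refresh tracks fps
    if vrr_max < 2 * vrr_min:
        return vrr_min                # range too narrow for LFC: clamp
    n = 2
    while fps * n < vrr_min:          # smallest multiplier back in range
        n += 1
    return fps * n                    # e.g. 30fps shown at 60Hz (n=2)

# 48-144Hz panel (144 >= 2*48, so LFC is possible):
print(lfc_refresh(30, 48, 144))   # 60 -> each frame displayed twice
# 48-75Hz panel (75 < 96): LFC impossible, clamps to the minimum:
print(lfc_refresh(30, 48, 75))    # 48
```

The 2x requirement guarantees a doubled frame rate from just below the minimum still fits under the maximum, so there is always a valid multiplier.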


Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious to see how the "not worthy" monitors play along with this.


Make sure to quote or tag people, so they get notified.

 

 

 

UP THE HAMMERS & DOWN THE NAILS
MAY THE LORDS OF LIGHT BE WITH YOU
BLESSED BE
HAIL CROM
HAIL ODIN
HAIL THOR
HAIL THE MANILLAN EMPIRE
HAIL TO THE BRETHREN OF THE HAMMER

Rest in peace Mark \m/

1957-2018

2 minutes ago, Cyberspirit said:

Oh, damn! What a great way to start 2019. I don't have a freesync panel either but, I'll be curious how the "Not worthy" monitors will play along with this.

"Not worthy" monitors will probably play well; Nvidia just wants to keep some incentive to still ship G-Sync monitors. 

1 minute ago, CorruptedSanity said:

I literally clapped and cheered when I heard him talk about this as I have one of the Asus monitors listed.  My wife thought I was crazy

Don't make it sound like an r/thathappened story.


I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same...


i7-8086K, Strix Z370E-Gaming, G.Skill Trident 16gb 3000MHZ CL 14, Strix 1080 Ti OC, Corsair HX1000i, Dark Base Pro 900 Rev.2 Silver, Corsair Hydro Series h150I, 6x Corsair LL120, Corsair Lighting Node Pro, 2x SSD Adata SU800 3DNand - 1tb and 128gb, 1Tb WD Blue, Cable Mod Full Cable Kit, Monitor Asus XG27VQ 144Mhz Curved

6 minutes ago, GoldenLag said:

Dont make it sound like an r/thathappened story.

Sorry, I don't get it. It's probably the biggest announcement in my eyes, so I was excited. 

Just now, CorruptedSanity said:

Sorry, I don't get it.  It's probably the biggest announcement in my eyes so I was excited. 

*And everyone clapped and cheered.

Just look up r/thathappened. You'll be in for a fun time.

Posted · Original Poster (OP)
11 minutes ago, Enochian said:

I wonder if the Asus Strix XG27VQ would work as well. It's not listed, but it's almost the same...

 

5 minutes ago, Ergroilnin said:

I am very curious to see if manually enabling this on my Samsung C32HG70 will work or not.

We could all give it a shot when the driver releases. 

 

My SHG50 isn't listed but we can try. 


