
Nvidia unlocks future G-Sync for AMD GPUs

spartaman64

It looks like good things do come to those who wait. A recent TFT Central report reveals that Nvidia is finally opening up its G-Sync technology, which fights screen tearing on gaming monitors but has been reserved for PCs running an Nvidia graphics card, to AMD graphics card owners.

Choosing between a G-Sync and a FreeSync monitor used to be straightforward. You were basically confined to the former if you owned an Nvidia graphics card and to the latter if you had an AMD graphics card. However, Nvidia in January added Variable Refresh Rate (VRR) support to its GeForce graphics cards, so people could use G-Sync with monitors lacking Nvidia's proprietary chip, as long as the monitor complied with VESA's DisplayPort Adaptive-Sync standard and Nvidia validated it. Nvidia calls these monitors G-Sync Compatible. We've even found that some FreeSync monitors not certified as G-Sync Compatible can run G-Sync if you follow our instructions on how to run G-Sync on a FreeSync monitor. 

However, AMD graphics card owners have still been unable to use a monitor's G-Sync feature. Now, six long years after G-Sync's introduction, Nvidia has finally decided to welcome the Red Team.


TFT Central confirmed with Nvidia that future G-Sync monitors will support HDMI-VRR and Adaptive Sync over HDMI and DisplayPort. Naturally, a new firmware will be required to bring the two new features to the G-Sync displays, and one for both the G-Sync v1 and v2 hardware modules is already in the works. Basically, this means that AMD graphics card and console owners will be able to use a G-Sync monitor with their setup. 

"So if you have an AMD graphics card, you could still enjoy the VRR experience and other additional benefits that the G-sync module brings even from a native G-sync screen, which was previously out of reach to those users," TFT said. However, it's uncertain if users will be able to reap G-Sync's full benefits. 

But there is a caveat. The new firmware will only be compatible with future G-Sync displays, such as the Acer Predator XB273 X. Nvidia told TFT Central that the new firmware will not work with existing G-Sync monitors on the market.

source: https://www.tomshardware.com/news/gsync-monitor-with-amd-graphics-card-nvidia

 

Great move by Nvidia, it's nice to see companies working together on standards in ways that benefit consumers. Now when is Nvidia going to open up RTX? :P It will be interesting to see whether monitor manufacturers will go for the new G-Sync modules or just try to make G-Sync Compatible adaptive sync monitors.


Nvidia:

"Oh shoot, locking our special display technology off makes people less likely to buy our graphics cards for adaptive sync"

Nvidia, later:

"Oh shoot, enabling the open standard Freesync on our GPUs has made people stop buying our really expensive monitors"

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad


So when they finally brought support to FreeSync, pretty much everyone stopped buying G-Sync monitors to save those 200 dollars while retaining the same experience? You don't say!

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

28 minutes ago, Princess Luna said:

So when they finally brought support to FreeSync, pretty much everyone stopped buying G-Sync monitors to save those 200 dollars while retaining the same experience? You don't say!

I still haven't seen anyone actually prove that the G-Sync module just adds an extra 200 dollars to the monitor.

All the comparisons I've seen are G-Sync monitors vs significantly worse FreeSync monitors. Of course inferior products are cheaper than good ones.


I think I saw somewhere that VRR support on the transport (HDMI or DP) was becoming mandatory, and no longer optional. If so, Nvidia has no choice but to support it or be cut off from future versions of the standards. Anyone know different? If what I wrote is correct, that kinda implies VRR also has to be supported, which doesn't necessarily make sense on all displays.

 

That doesn't mean G-Sync has no value. It is still going to be the premium VRR technology. If you want the best, you can still pay for it. If G-Sync Compatible is good enough, you can go for that. You can go lower if you accept the limitations that go with it. Not everyone needs the best of everything.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


36 minutes ago, GoldenLag said:

People will somehow make it seem like a bad thing

Nvidia's so evil! They're doing something that doesn't fit into my worldview of Nvidia being evil!

 

 

 

 

 

 

 

 

 

 

Happy now?

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


12 minutes ago, LAwLz said:

I still haven't seen anyone actually prove that the G-Sync module just adds an extra 200 dollars to the monitor.

All the comparisons I've seen are G-Sync monitors vs significantly worse FreeSync monitors. Of course inferior products are cheaper than good ones.

I've tested and used both G-Sync and FreeSync panels. Like all products, there are going to be good and bad implementations of both technologies. G-Sync in my opinion offers a better experience on average, but that doesn't mean it's perfect either. The first-gen Acer XB273K that I own, which uses the new G-Sync 2 (Ultimate) module, sacrificed Lightboost for FALD, and the FALD feels poorly implemented: it can cause random blanking, especially when watching videos that change from light to dark backgrounds. I'd much rather have Lightboost for the less graphically demanding games that I play.

 

Cheap FreeSync monitors are basically worthless in my opinion, especially when paired with an Nvidia GPU. Some panels I've tested will randomly lose their input and swap between DP and HDMI when FreeSync is enabled, and others had issues with half of the screen outright disappearing on the budget ultrawides. They also suffer from a very limited VRR range, severely reducing their effectiveness. Now with that being said, some of the premium FreeSync panels feel absolutely smooth, just as good as G-Sync. I can't vouch for how it would feel if one were to dip too low in framerate, but for the most part, I cannot tell the difference between a good FreeSync panel and a good G-Sync panel.

 

I am happy to see that Nvidia opened the standard up to everyone and made VRR tech more accessible; I just wish this would translate over to the mobile scene. Mobile is arguably the most important market for VRR tech, as laptops typically have the hardware most in need of it. Right now, G-Sync on mobile is sketchy at best: you can buy a panel straight out of a G-Sync laptop and pair it with a compatible GTX or RTX card, but because your card's VBIOS didn't whitelist the panel, it will not work. I'd rather see it implemented the way things are on desktop systems, and simply let laptop users roll the dice as to whether or not they want to enable VRR tech.

 

This concludes my mini rant.

 

EDIT: Added a part about Freesync's limited VRR range on cheaper panels.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


1 hour ago, GoldenLag said:

Finally we have Freesync compatible monitors!

 

and don't worry about Nvidia finally doing a good thing. People will somehow make it seem like a bad thing

Well, it is very unlike them. Usually they do whatever they can to fuck AMD over.

But considering FreeSync is an open standard that costs less, there are still going to be more of those monitors.

#Muricaparrotgang


That's great to hear; from a customer standpoint it wasn't an easy decision to choose which "ecosystem" one should join.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


I bought the Dell S2719DGF for $280. I don't need Nvidia handouts to get adaptive sync. 

 

Nvidia is not the good guy here; they went from "Adaptive refresh is ours" to "well, I guess we should let other people have it" only because AMD made FreeSync FREE and nobody wanted to pay double for their monitor because it had a BS G-Sync module. 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

4 minutes ago, DrMacintosh said:

I bought the Dell S2719DGF for $280. I don't need Nvidia handouts to get adaptive sync. 

 

Nvidia is not the good guy here; they went from "Adaptive refresh is ours" to "well, I guess we should let other people have it" only because AMD made FreeSync FREE and nobody wanted to pay double for their monitor because it had a BS G-Sync module. 

As far as I know, AMD didn't make FreeSync free; they took an open (already free) standard and slapped a cool name on it. 

 

From what I've seen, G-Sync displays are usually higher quality panels to begin with, and g-sync overall is a better experience. Being an open standard, Freesync has a lot of leeway in implementations, so some are shitty and others are nice, but it varies by monitor. G-Sync panels are held to a higher standard of quality because OEMs have to go through Nvidia to be approved. 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


2 hours ago, Princess Luna said:

So when they finally brought support to FreeSync, pretty much everyone stopped buying G-Sync monitors to save those 200 dollars while retaining the same experience? You don't say!

As someone with both G-Sync and FreeSync monitors, the G-Sync Ultimate experience is still better than G-Sync running on a FreeSync monitor. Is it worth the extra money? Eh.


11 minutes ago, Zando Bob said:

G-Sync panels are held to a higher standard of quality because OEMs have to go through Nvidia to be approved. 

That's not a good thing imo.


10 minutes ago, Zando Bob said:

As far as I know, AMD didn't make FreeSync free; they took an open (already free) standard and slapped a cool name on it. 

 

From what I've seen, G-Sync displays are usually higher quality panels to begin with, and g-sync overall is a better experience. Being an open standard, Freesync has a lot of leeway in implementations, so some are shitty and others are nice, but it varies by monitor. G-Sync panels are held to a higher standard of quality because OEMs have to go through Nvidia to be approved. 

I think it's more like they made the standard and then gave it to VESA. AMD was the first to showcase the technology.


1 minute ago, Rune said:

As someone with both G-Sync and FreeSync monitors, the G-Sync Ultimate experience is still better than G-Sync running on a FreeSync monitor. Is it worth the extra money? Eh.

I own an Acer X34 with G-Sync and a Viotek GN34CB with FreeSync; testing on a 2080 Ti, both adaptive sync implementations work the same for me ':x

 

Sure, it depends greatly on implementation, but high-end FreeSync panels are still cheaper than high-end G-Sync panels... this isn't about buying the cheapest monitor with FreeSync and expecting it to be the same as a high-end monitor featuring G-Sync.


6 minutes ago, Rune said:

Sounds like Apple.

Now you stop that. You stop pointing out that irony.

 

 

 

 

 

 

 

 

 

 


Because that’s my job.

 



1 minute ago, DrMacintosh said:

That's not a good thing imo.

Makes for a more consistent experience for consumers. There's a price bump but that's pretty normal for a more premium product. Basically:

1 hour ago, porina said:

That doesn't mean G-Sync has no value. It is still going to be the premium VRR technology. If you want the best, you can still pay for it. If G-Sync Compatible is good enough, you can go for that. You can go lower if you accept the limitations that go with it. Not everyone needs the best of everything.

Same as people reeing about the 5700 XT vs the 2070 Super: the 2070 Super does actually have reasons to cost more, and it is indeed the better card. 

Or:

9 minutes ago, Rune said:

Sounds like Apple.

MacOS/iOS-supported hardware goes through Apple. Results in a price premium (and fewer choices overall), but usually makes for a better, more consistent experience for consumers. 

 

Also yes:

1 minute ago, Mira Yurizaki said:

So the developer of a standard shouldn't be allowed to enforce a minimum quality?

 

No wonder USB is so fucked.

 



13 minutes ago, Rune said:

As someone with both G-Sync and FreeSync monitors, the G-Sync Ultimate experience is still better than G-Sync running on a FreeSync monitor. Is it worth the extra money? Eh.

That's what I was wondering (so thank you for saying it). I've just upgraded my old monitor to one that supports G-Sync over FreeSync, and the experience with it on, compared to my old non-VRR panel, is outstanding. Everything feels smoother now, even at 60 fps, and I can't imagine G-Sync would have been worth either the extra cash or the downgrade in specs I would have taken if I went with that instead. 


40 minutes ago, DrMacintosh said:

I bought the Dell S2719DGF for $280. I don't need Nvidia handouts to get adaptive sync. 

 

Nvidia is not the good guy here; they went from "Adaptive refresh is ours" to "well, I guess we should let other people have it" only because AMD made FreeSync FREE and nobody wanted to pay double for their monitor because it had a BS G-Sync module. 

To be fair, FreeSync isn't entirely "free", especially FreeSync 2 panels, which now require a certification process similar to Nvidia's. Sure, it does not require additional hardware like Nvidia's expensive FPGA, but that certification does cost money, which can be passed on to consumers in some way.

 

The G-Sync module isn't entirely worthless either. The first-generation module included support for backlight strobing (Lightboost), which was fantastic at alleviating motion blur assuming you could maintain an even framerate (120 Hz). The G-Sync 2 module requires FALD when used in conjunction with HDR, which can be amazing for more accurate blacks and shadows, but pretty awful when the image is rapidly changing from light to dark. 

 

Having had access to several G-Sync and FreeSync panels, I still chose to pay the G-Sync premium. There is peace of mind in the certification process, and some of the added features can be useful depending on what I am doing. Whether or not it's worth it is entirely subjective. The real question is whether or not Nvidia is going to truly expand on this and let laptops with AMD and Intel iGPUs use their G-Sync certified panels. So far, their artificial limitations for mobile GPUs are wearing thin on consumers.



1 hour ago, spartaman64 said:

I think it's more like they made the standard and then gave it to VESA. AMD was the first to showcase the technology.

Co-created, maybe?

VESA does have many members though.

DP 1.2a had the Adaptive-Sync protocol in Jan 2013?

FreeSync was 2014.

Curious myself now that you bring it up.

 

