D13H4RD

NVIDIA finally officially supports Adaptive Sync (with a small catch)

Recommended Posts

Double Doh! I just bought a 1070 (339 CAD) and an S2716DG (449.99 CAD). Pulled the trigger a bit too quickly; I didn't realize the 2060 would be quite that competitive. Good deals nonetheless, just a little bummed. I nearly returned the G-Sync monitor thinking... wtf, I turn off G-Sync and I can hardly tell the diff, so why not get a 144Hz Samsung 1440p 32" at the same price? I went back to check and the Sammy wasn't adaptive; it only offered fixed 60/120/144Hz modes. I have until tomorrow to return this monitor under BB's 14-day return policy. Don't think I will, though. After all the Boxing Day deals are done, there are very few similarly priced FreeSync monitors that come close to the specs of the Dell.

 

Thinking out loud. I'm pretty sure a couple of you did the same on Black Friday and Boxing Day.

53 minutes ago, empyr3al said:

Double Doh! I just bought a 1070 (339 CAD) and an S2716DG (449.99 CAD). Pulled the trigger a bit too quickly; I didn't realize the 2060 would be quite that competitive. Good deals nonetheless, just a little bummed. I nearly returned the G-Sync monitor thinking... wtf, I turn off G-Sync and I can hardly tell the diff, so why not get a 144Hz Samsung 1440p 32" at the same price? I went back to check and the Sammy wasn't adaptive; it only offered fixed 60/120/144Hz modes. I have until tomorrow to return this monitor under BB's 14-day return policy. Don't think I will, though. After all the Boxing Day deals are done, there are very few similarly priced FreeSync monitors that come close to the specs of the Dell.

 

Thinking out loud. I'm pretty sure a couple of you did the same on Black Friday and Boxing Day.

 

You bought a good monitor, that's all really. The bits in bold reflect what I said earlier about G-Sync being higher quality and not always necessary with today's high-performance GPUs.


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.

18 minutes ago, mr moose said:

 

You bought a good monitor, that's all really. The bits in bold reflect what I said earlier about G-Sync being higher quality and not always necessary with today's high-performance GPUs.

Good, yes; perfect, it's not. I'll be swapping it for another unit (same S2716DG G-Sync) as it has 1 dead pixel (constantly white) about 2x2" from the bottom left, plus 1 stuck pixel that's extremely difficult to spot except on a dead-pixel-testing website. At 1440p they don't stick out that badly; I just want something perfect. Dell's policy is that it's covered, so I'll roll the dice and hope I don't get another panel with a dead pixel.

Posted · Original Poster (OP)
8 hours ago, mr moose said:

Actually,  I really think the thing that saved AMD was getting Lisa Su in as the CEO in 2014. 

That alongside AMD getting some breaks and some good releases. 

 

Notably the Polaris RX line, supplying the processors for the 8th-generation home consoles, and especially the successful launch of the Zen microarchitecture.

 

Wasn't too long ago that the general sentiment around AMD was very bleak. 


Please tag me if you need assistance or if you want me to contribute to a topic 

 

ASUS RoG STRIX GL502VM

Intel Core i7 7700HQ | GeForce GTX 1060 6GB | 16GB DDR4-2133 | 128GB SanDisk M.2 SATA SSD + 1TB 7200RPM Hitachi HDD | 15.6" 1080p IPS monitor @ 60Hz w/ G-SYNC | Windows 10 64-bit

 

Samsung Galaxy Note8 SM-N950F

Exynos 8895 (4x Mongoose @ 2.3GHz, 4x Cortex A53 @ 1.7GHz)ARM Mali G71 MP20 | 6GB LPDDR4 | 64GB Samsung NAND flash w/ UFS 2.1 dual-lane controller + 128GB SanDisk C10 UHS-I microSD | 6.3" 1440p "Infinity Display" AMOLED | Android Nougat 7.1.1 w/ Samsung Experience 8.5


So nvidia finally decided to let the cat out of the bag and admit to the lie that everyone already knew was bullshit.

 

G-Sync has always been a proprietary cash grab and completely unnecessary. At the end of the day, customer-friendly open standards always pull through, and Nvidia could no longer justify the overinflated price differential of gimmick-based video cards paired with a G-Sync monitor at a several-hundred-dollar premium.

 

Now we'll just have to give it another 5 years before they admit that the "G-Sync certification" is also a scam.


What do Windows 10 and E.T. have in common?

 

They are both constantly trying to phone home.

25 minutes ago, Hellion said:

So nvidia finally decided to let the cat out of the bag and admit to the lie that everyone already knew was bullshit.

 

 

 

G-Sync has always been a proprietary cash grab and completely unnecessary. At the end of the day, customer-friendly open standards always pull through, and Nvidia could no longer justify the overinflated price differential of gimmick-based video cards paired with a G-Sync monitor at a several-hundred-dollar premium.

 

 

 

Now we'll just have to give it another 5 years before they admit that the "G-Sync certification" is also a scam.

While G-Sync is a premium over a FreeSync display, the feature requirements are higher, such as the wider variable refresh range, so you get a higher-quality monitor. There are a lot of crappy FreeSync monitors with the compatibility slapped on for marketing, so an open standard isn't always better. But Nvidia opening up support to some high-end FreeSync displays is awesome no matter the reason, though of course some always have to make it seem negative because it's Nvidia.

2 hours ago, Blademaster91 said:

While G-Sync is a premium over a FreeSync display, the display quality is still higher, and the feature requirements are higher, such as the wider variable refresh range, so you get a higher-quality monitor. There are tons of crappy FreeSync monitors with the compatibility slapped on for marketing, so an open standard isn't always better. But Nvidia opening up support to some high-end FreeSync displays is awesome no matter the reason, though of course a few people always have to make it seem negative because it's Nvidia.

The display quality is higher under what standards, exactly? I'm willing to bet that in a Pepsi-challenge scenario you, like nearly everyone else, would not be able to tell the difference. Regardless, most of the FreeSync monitors worth buying, once you take into consideration the panel type, refresh rate, response time and color gamut coverage, are essentially the same as the G-Sync variant. Of course an entry-level monitor vs a premium one with a completely different spec sheet is going to have an inferior user experience... Your argument has zero merit without context.

 

Regarding the available range, educate yourself about FreeSync 2, and beyond that, understand that coverage below 60 FPS is going to be a poor experience simply because of the number of dropped frames. You may not see tearing, but at that point it won't matter. It's still going to feel choppy, slow and restrained.

 

At the end of the day this has nothing to do with Nvidia's name; that's a desperate reach. As I've already explained here before, I make purchases on a personal level based exclusively on value, which boils down to price-to-performance ratio. Nvidia has been a poor customer experience for so many reasons that I can't even keep track anymore, and it's gone on for years.



10 hours ago, Syntaxvgm said:

The problem is, I guarantee it was an intentional limit that will be worked around with driver hacks, like usual. It's not a hardware limitation. It doesn't matter if they are 5 years old; the fact is they still play games well and a lot of people still use them. I wouldn't expect them to make it work if the cards were so old they no longer got driver updates, but they're not. I'm 90% sure it's an artificial limitation, and if they didn't have so much 10-series stock to get rid of, I bet it would be 20-series only.

And no, they're a little over 4 years old, btw, not 5. The 980 Ti is about 3 and a half.


I said they "will be" five years old this year, not that they already are. They probably don't have a lot of 10-series cards in stock anymore. They're not excluding the 10-series because they simply can't; those cards are still too new, and Nvidia is still actively optimizing for them. The 9-series is supported by current drivers, but it's not like Nvidia is doing game-specific optimizations for them, and they haven't for a while now. They're still supported because they're not quite old enough to hit legacy status yet. Officially supporting old hardware means spending resources testing on it to make sure everything works. That's extra time and money spent for what would probably be little to no real financial benefit, with no real downside to skipping it.

21 hours ago, Cyberspirit said:

Oh, damn! What a great way to start 2019. I don't have a FreeSync panel either, but I'll be curious how the "not worthy" monitors will play along with this.

It's actually second Christmas for me on the 15th; I'm soooooooooooooo excited to try it with my Asus MG248 and 1070 Ti. Since my monitor "isn't worthy", it makes me more curious to see what'll happen. I don't feel like my monitor is one of the lower-end ones, but I guess I'll see in a week.


PSU Tier List

 

  PCs and other things:


Current PC

CPU: AMD A8-6600k (@4.2GHz)

GPU: GTX 1070Ti 8gb FTW2 iCX

Motherboard: GA-F2A78M-HD2 

RAM: 8GB Patriot DDR3-1600MHz

Storage: Samsung 830 EVO 120gb, WD Blue 1tb 7200RPM 

PSU: SeaSonicM12II 620w (I wish I didn't buy this)

 Peripherals: Logitech G305//Corsair K55 RGB 

Displays: Asus MG248QR//Dell 1905FP


New PC (getting parts)

CPU: i7-8086k [✓]

Cooling: EVGA CLC 280 RGB [✓]

GPU: EVGA GTX 1070Ti 8gb FTW2 iCX [✓]

Mobo: GIGABYTE Z390 AORUS PRO [✓]

RAM: XPG SPECTRIX X41 2666MHz (Grey) [✓]

 Storage: T-Force Delta 250gb, WD Blue 1tb 7200RPM [✓]

PSU: 650w+ and ranked well

Case: Fractal Design Meshify C White

Peripherals: Logitech G305//Corsair K55 RGB [✓]

Displays: Asus MG248QR//Dell 1905FP [✓]

Extra: Sky blue front panel, sky blue cable extensions [✓]


Other

Phone: Huawei Mate 10 Lite (China steal my data???)

Dog: Irish Water Spaniel, Lucy, 9 years old (I didn't know what to write)

School Laptop: Acer Chromebook R11 (less RAM than my phone)

 

Posted · Original Poster (OP)
25 minutes ago, Hellion said:

The display quality is higher under what standards, exactly? I'm willing to bet that in a Pepsi-challenge scenario you, like nearly everyone else, would not be able to tell the difference.

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can range from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.



16 minutes ago, mxk. said:

It's actually second Christmas for me on the 15th; I'm soooooooooooooo excited to try it with my Asus MG248 and 1070 Ti. Since my monitor "isn't worthy", it makes me more curious to see what'll happen. I don't feel like my monitor is one of the lower-end ones, but I guess I'll see in a week.

Make sure to update us on that. I'm very curious.


Make sure to quote or tag people, so they get notified.

 

 

 

UP THE HAMMERS & DOWN THE NAILS
MAY THE LORDS OF LIGHT BE WITH YOU
BLESSED BE
HAIL CROM
HAIL ODIN
HAIL THOR
HAIL THE MANILLAN EMPIRE
HAIL TO THE BRETHREN OF THE HAMMER

Rest in peace Mark \m/

1957-2018

1 hour ago, D13H4RD said:

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can range from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.

The binning is done arbitrarily by a single company: Nvidia. So again, in real-world use you still wouldn't be able to tell the difference between the same panel. Both panels still meet the minimum spec.

 

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing highly diminishing returns.

 

Even on the lower models, monitor performance below 60 FPS is a crapshoot, and having G-Sync available sub-30 FPS is still a terrible experience.



Posted · Original Poster (OP)
4 minutes ago, Hellion said:

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing highly diminishing returns.

Well, no one (no one with a reasonable mindset, anyway) ever said G-SYNC was worth $200+ over an equivalent FreeSync model.

 

Is G-SYNC objectively better? Yes. But is it $200+ better? I'd say no.

 

Regardless, it's still good that they (finally) realized it was better to just let customers choose based on their demands and budget rather than forcing a proprietary option on anyone who wants variable refresh rate, especially when an open standard exists that works well enough for many.



On 1/8/2019 at 1:37 AM, D13H4RD said:

Well, no one ever said G-SYNC was worth $200+ over an equivalent FreeSync model.

 

Is G-SYNC objectively better? Yes. But is it $200+ better? I'd say no.

 

Regardless, it's still good that they (finally) realized it was better to just let customers choose based on their demands and budget.

Which is how it should have been since day one. Instead, Nvidia made a proprietary, overpriced "alternative" and touted nonsense about why FreeSync couldn't be supported, to fleece fanboys without the logical ability to rationalize. Then, 5 years later, after realizing that their bullshit strategy wasn't forcing G-Sync sales over the open standard, they came clean and enabled adaptive sync via nothing more than a driver update.

 

So even if we conclude that the same panel is slightly better sub-30 FPS, where it doesn't actually matter, that doesn't change the fact that this is just another shitty business practice tacked onto a never-ending list from Nvidia.

 

I personally don't praise corporations for doing the right thing after the fact. It's like praising a game developer for not including loot boxes. Games never had that shit 30 years ago; why should they be praised for not having it in the modern age?



1 hour ago, D13H4RD said:

Mostly to do with binning. G-SYNC monitors tend to use highly binned panels, while FreeSync monitors can range from highly binned to lower-binned units. Some FreeSync models also use panels originally meant for G-SYNC models that did not meet standards.

First time I'm hearing that G-Sync panels are the better-binned ones. Got a source on that?

I don't find it that hard to believe, considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

 

 

16 minutes ago, Hellion said:

The binning is done arbitrarily by a single company: Nvidia. So again, in real-world use you still wouldn't be able to tell the difference between the same panel. Both panels still meet the minimum spec.

If the panels for FreeSync monitors are lower-binned ones, then it's not done by Nvidia. The binning happens at the panel manufacturer (like Samsung or Japan Display), and then the makers of the displays (Asus, BenQ, etc.) decide which panel goes into which monitor model. All Nvidia can say is "if you want to be called a G-Sync monitor and be compatible, you need to fulfill these specifications". Doing things like demanding that your competitor only get B-grade panels would be illegal.

 

 

19 minutes ago, Hellion said:

That sure as hell isn't worth a $200+ premium, especially to those who care about performance per dollar rather than chasing highly diminishing returns.

Got any source for that $200+ price premium? I see it thrown around all the time but I've never seen it actually backed up with comparable displays. And no, a FreeSync monitor without things like frame duplication or a strobing backlight is not comparable.

 

20 minutes ago, Hellion said:

Even on the lower models, monitor performance below 60 FPS is a crapshoot, and having G-Sync available sub-30 FPS is still a terrible experience.

What are you on about? It's below 60 FPS that you get the biggest benefit from adaptive sync.

One of the main reasons sub-60 FPS feels bad is that you get very uneven frame timings. When some frames are displayed for 1 refresh cycle and others for 3 or 4 refresh cycles, the animation looks and feels very choppy. Adaptive sync like FreeSync and G-Sync solves that.

Otherwise, FreeSync and G-Sync would just be glorified alternatives to v-sync, and tearing isn't even a big problem at such high frame rates.
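The uneven frame-pacing described above can be sketched numerically. This is an illustrative toy model, not any actual driver behavior: a steady 45 FPS stream on a fixed 60 Hz screen ends up alternating between one and two refresh cycles per frame, while an adaptive-sync display holds every frame for the same ~22 ms.

```python
import math

# Toy model, not actual driver behavior: compare how long each rendered
# frame stays on screen with a fixed 60 Hz refresh (simple v-sync hold)
# vs. an adaptive-sync display that refreshes when the frame is ready.
REFRESH_HZ = 60.0
FPS = 45.0

def fixed_refresh_durations(n_frames):
    """On-screen time (ms) per frame when a steady FPS stream is shown
    on a fixed-rate display: each frame is held until the first refresh
    tick at or after the next frame is ready."""
    tick = 1000.0 / REFRESH_HZ       # 16.67 ms between refresh ticks
    interval = 1000.0 / FPS          # 22.22 ms between rendered frames
    durations, shown_at = [], 0.0
    for i in range(1, n_frames + 1):
        ready = i * interval
        # next refresh tick at or after the frame is ready
        next_tick = math.ceil(ready / tick - 1e-9) * tick
        durations.append(next_tick - shown_at)
        shown_at = next_tick
    return durations

fixed = [round(d, 2) for d in fixed_refresh_durations(6)]
adaptive = [round(1000.0 / FPS, 2)] * 6   # adaptive sync: every frame equal

print(fixed)     # mix of ~16.67 ms and ~33.33 ms holds -> visible judder
print(adaptive)  # a steady 22.22 ms per frame
```

Same average frame rate in both cases, but the fixed-refresh one alternates between one and two refresh cycles per frame, which is exactly the choppiness being described.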

10 minutes ago, Hellion said:

Instead, Nvidia made a proprietary, overpriced "alternative" and touted nonsense about why FreeSync couldn't be supported, to fleece fanboys without the logical ability to rationalize.

Wasn't Nvidia first out with variable refresh technology? FreeSync was a catch-up move by AMD, and the only way they could get traction on it was to give it away by standardizing it. That's the strategy AMD is forced into by their weak position; they're not doing this out of generosity, it's all they can do business-wise. We end up with some good FreeSync panels, but there are many not-so-good ones as well.

 

7 minutes ago, LAwLz said:

It's below 60 FPS that you get the biggest benefit from adaptive sync.

I think I'd disagree there, at least based on my personal preferences. With a 144 Hz G-Sync and a 60 Hz FreeSync display, I think the benefit is in some zone around 50-90 FPS. At very high frame rates, 100+, I'm not sure I feel it. Below 50 FPS, even if pacing is helped, it isn't quite smooth enough for me. Variable refresh in the 60+ FPS area does feel more responsive to me than, say, fixed 60 FPS v-sync. Maybe it's the latency reduction more than the absolute frame rate. To save on heat and noise, I'm actually playing with an in-game limit of 72 FPS.
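The zones described above are easier to see as raw frame times. This is purely arithmetic (it ignores input latency and frame pacing, which also affect "feel"): the per-frame time shrinks quickly at first and then by ever smaller amounts, which is one way to see why gains taper off past ~100 FPS.

```python
# Per-frame time at the frame rates discussed above. The step from 50 to
# 90 FPS saves roughly 9 ms per frame; going from 100 to 144 saves only
# about 3 ms, so the perceptible difference shrinks.
for fps in (50, 60, 72, 90, 100, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:6.2f} ms per frame")
```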


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte Windforce 980Ti, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, G.Skill TridentZ 3000C14 2x8GB, Asus 1080 Ti Strix OC, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 2600, Noctua D9L, Corsair Vengeance LPX 3000 2x4GB, Vega 56, Corsair CX450M, NZXT Manta, Crucial MX300 525GB, Acer RT280K

VR rig: Asus Z170I Pro Gaming, i7-6600k stock, Silverstone TD03-E, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB SSD

Total CPU heating: i7-7800X, 2x i7-6700k, i7-6700HQ, i5-6600k, i5-5675C, i5-4570S, i3-8350k, i3-6100, i3-4360, 2x i3-4150T, E5-2683v3, 2x E5-2650, R7 1700, 1600

Posted · Original Poster (OP)
21 minutes ago, LAwLz said:

First time I'm hearing that G-Sync panels are the better-binned ones. Got a source on that?

I don't find it that hard to believe, considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

I couldn't find the source I originally had, since it's been a while since I last heard of it, but the closest is this one

 

It's Reddit, so hardly the most informative or reliable source, but it's honestly hard to find anything more recent that backs up that theory.

 

Speaking of that, I have to issue an update and a potential correction. G-SYNC monitors tend to be upper-end gaming units that could have highly binned panels, but reports of defective units in Amazon reviews suggest that they are also subject to varying levels of QC.

 

Another update: FreeSync monitors can also be highly binned, although that depends on the type of monitor that features it.



On 1/7/2019 at 8:42 AM, D13H4RD said:

One of the biggest bombshells of NVIDIA's CES 2019 press conference was the announcement that NVIDIA GPU users will no longer have to get a G-SYNC monitor to use adaptive sync, as the company is extending official support to a number of FreeSync-capable monitors, dubbed "G-SYNC Compatible".

There is a small catch, however. Jensen Huang claims that after testing approximately 400 monitors on the market, only 12 were deemed worthy of bearing the "G-SYNC Compatible" moniker. These 12 are:

  • Acer XFA240
  • Acer XZ321Q
  • Acer XG270HU
  • Acer XV273K
  • Agon AG241QG4
  • AOC G2590FX
  • Asus MG278Q
  • Asus XG248
  • Asus VG258Q
  • Asus XG258
  • Asus VG278Q
  • BenQ Xl2740

With all that said, if your monitor is on the "We're not wooooorthyyyyy" list, the setting can still be enabled manually from the GPU control panel as part of a driver update releasing on the 15th of January, whereas supported monitors have it enabled automatically.

 

D13H4RD's opinion 


This is the best news out of the entire press conference by far. While G-SYNC is neat, those monitors can be significantly pricier than their FreeSync counterparts. NVIDIA's announcement basically means that while they can't guarantee the best performance, it should work.

Source: VentureBeat

Too bad I spent $600 on a G-SYNC monitor two months ago... Well, at least I got it on sale so I didn't have to spend $780 on it :) 


CPU: i7-9700K OC @ 4.9 GHz 

CPU Cooler: Corsair Hydro H150i Pro 360mm 
Motherboard: ASUS ROG STRIX Z390-F 
RAM: Corsair 16GB (2x8GB) DDR4 3200MHz CL16 Vengeance 

SSD: Samsung 970 EVO Plus 1TB 
GPU: RTX 2070 FE 
Case: Corsair Obsidian 500D RGB 
PSU: Corsair RM750X 
Monitor: ROG Swift PG278QR G-Sync 165Hz 


PCWorld had a look at Nvidia's booth o_o

EDIT: I'm keeping in mind that Nvidia might be showing worst-case scenarios, although it'll take anecdotal reports to see how many actually work out there (or whether Nvidia manages to take hardware polls via the GeForce driver ._.)

2 hours ago, LAwLz said:

First time I'm hearing that G-Sync panels are the better-binned ones. Got a source on that?

I don't find it that hard to believe, considering G-Sync monitors are usually more expensive, but I'd like some evidence before believing it.

Ever so slightly better binned; they run at 165Hz instead of the usual 144Hz. So, meh.


I'm looking forward to testing my AOC G2460PF

 

It was one of the better FreeSync displays when it was new, with a FreeSync range of 35-144Hz.

 

It can also do it over HDMI at up to 120Hz.
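For a range like that, AMD's drivers handle frame rates below the 35 Hz floor with Low Framerate Compensation (LFC): each frame is repeated enough times that the effective refresh rate stays inside the panel's window. A minimal sketch of the idea; the 35-144 window is from the post above, while the multiplier logic is my simplification, not AMD's actual implementation:

```python
# Hedged sketch of Low Framerate Compensation (LFC): when the game's
# frame rate drops below the adaptive-sync floor, repeat each frame so
# the panel's refresh rate stays inside its supported window.
RANGE_MIN, RANGE_MAX = 35, 144   # the FreeSync window quoted above

def lfc_refresh_rate(fps):
    """Return (refresh_hz, repeats_per_frame) that keeps refresh in range."""
    repeats = 1
    while fps * repeats < RANGE_MIN:
        repeats += 1
    refresh = fps * repeats
    if refresh > RANGE_MAX:
        # can't fit a multiple inside the window: fall back to panel max
        return RANGE_MAX, 1
    return refresh, repeats

print(lfc_refresh_rate(100))  # (100, 1)  in range, no duplication needed
print(lfc_refresh_rate(25))   # (50, 2)   each frame shown twice at 50 Hz
print(lfc_refresh_rate(12))   # (36, 3)   tripled to land above the 35 Hz floor
```

This only works cleanly when the range maximum is at least roughly double the minimum (as it is here), which is why narrow-range FreeSync monitors can't do frame duplication at all.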


System specs:

4790k

GTX 1050

16GB DDR3

Samsung evo SSD

a few HDDs


This was just a matter of time. There was no way NVIDIA could counter an open standard that's as good as and cheaper than G-Sync. Plus, not supporting it meant people were essentially stuck with AMD solutions if they wanted tear-free gaming but didn't want to invest in a god-expensive G-Sync monitor (or replace a FreeSync-capable monitor they already had). Now, with NVIDIA supporting FreeSync too, GeForce is a viable option again, and NVIDIA profits the most, because you can be their customer again.

