
Nvidia unlocks future G-Sync for AMD GPUs

10 minutes ago, spartaman64 said:

If a new phone company came out, said their phone was better than iPhones, and then compared it to an iPhone 2, would that be fair? I think that's what he's getting at: the higher-end FreeSync monitors are just as good as the higher-end G-Sync monitors.

 

G-Sync is basically a quality-control certification with requirements (according to the articles, anyway).

FreeSync has hardly any quality control; there are plenty of monitors where, as we all know, the FreeSync badge means shit, and FreeSync on them actually sucks.

FreeSync 2 is a little stricter, but the same still applies.

It's not fluff when every G-Sync monitor is held to a higher standard from the get-go.

As time goes on, yes, of course FreeSync stuff gets just as good.

Companies want to make quality products cheaply, and dropping the G-Sync module does that.

But consumers are lazy: they want quality features without the work of researching what works for less.

 

 


2 minutes ago, pas008 said:

 

G-Sync is basically a quality-control certification with requirements (according to the articles, anyway).

FreeSync has hardly any quality control; there are plenty of monitors where, as we all know, the FreeSync badge means shit, and FreeSync on them actually sucks.

FreeSync 2 is a little stricter, but the same still applies.

It's not fluff when every G-Sync monitor is held to a higher standard from the get-go.

As time goes on, yes, of course FreeSync stuff gets just as good.

Companies want to make quality products cheaply, and dropping the G-Sync module does that.

But consumers are lazy: they want quality features without the work of researching what works for less.

 

 

Shitty FreeSync is better than no FreeSync, and that's the point of FreeSync: it costs so little to implement that you might as well put it in every monitor, even the lower-end models.


2 hours ago, Neftex said:

Opinions can be wrong; what I call fluff is the article. I see people parroting this "G-Sync is better quality" thing over and over, but when you look at a real test, it's clearly not true.

It's not fluff, especially when FreeSync first came about. G-Sync's requirements offered a better experience because they essentially served as a guarantee: you knew what you were buying and didn't have to roll the dice on a manufacturer's subjective implementation of a technology. AMD's original implementation of FreeSync was basically the wild west. You would get dramatically different VRR ranges depending on the panel you purchased, while Nvidia enforced a strict VRR window with a minimum refresh rate requirement. AMD actually picked up on this and changed their FreeSync 2 requirements to mandate LFC, which guarantees that the panel's maximum refresh rate is at least 2.5x its minimum refresh rate. This helps make sure the VRR range is wide enough to stay enabled even during dramatic frame dips. Many of the original FreeSync panels did not meet this requirement, and dips outside the VRR range were very jarring and noticeable. AMD also added HDR requirements, which helped weed out a lot of the extremely cheap budget panels that were basically guaranteed to carry the original FreeSync support. Buying a FreeSync 2 panel under the new certification requirements offers far more peace of mind than when FreeSync first launched.
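That 2.5x ratio is easy to see with numbers. Below is a minimal sketch (my own illustration, not AMD's or Nvidia's implementation) of LFC-style frame repetition: when fps drops below the window, each frame is shown two or more times so the scan-out rate lands back inside it, which only works if the window is wide enough.

```python
# Sketch of LFC-style frame repetition, assuming the FreeSync 2 rule
# described above (max refresh >= 2.5x min refresh). Illustrative only.

def supports_lfc(vrr_min: float, vrr_max: float, ratio: float = 2.5) -> bool:
    """The certification rule described above: max >= 2.5x min."""
    return vrr_max >= ratio * vrr_min

def effective_refresh(fps: float, vrr_min: float, vrr_max: float):
    """Scan-out rate for a given frame rate, with LFC frame repetition."""
    if fps >= vrr_min:
        return min(fps, vrr_max)    # inside the window: one frame per refresh
    if not supports_lfc(vrr_min, vrr_max):
        return None                 # below the window, no LFC: tearing/stutter
    repeats = 2
    while fps * repeats < vrr_min:  # show each frame 2x, 3x, ... as needed
        repeats += 1
    return fps * repeats

# A 48-75 Hz panel fails the ratio test (75 < 2.5 * 48), so a dip to 30 fps
# falls out of the window; a 40-144 Hz panel passes, and 30 fps scans out at 60 Hz.
for vrr_range in [(48, 75), (40, 144)]:
    print(vrr_range, supports_lfc(*vrr_range), effective_refresh(30, *vrr_range))
```

The reason a narrow window can't simply double frames anyway: on a 48-75 Hz panel, anything between 37.5 and 48 fps would double to above 75 Hz, leaving a dead zone, which is exactly what the ratio requirement rules out.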

 

Honestly, if we are talking FreeSync 2 vs. G-Sync Ultimate, you'd be hard-pressed to feel any difference during use. I personally wasn't able to detect any real difference, and I've owned a G-Sync panel since they first came out. When I upgraded from my S2417DG, I chose another G-Sync panel (XB273K) because, at the time, it was priced only $50 more than the 4K 144 Hz FreeSync 2 version (XV273K), and I already had an Nvidia GPU.


22 minutes ago, spartaman64 said:

Shitty FreeSync is better than no FreeSync, and that's the point of FreeSync: it costs so little to implement that you might as well put it in every monitor, even the lower-end models.

That's the point, and it's why there are diamonds in the rough with FreeSync too.

G-Sync isn't only about VRR; it looks to have other things included, at least according to the articles.

 

I like to think of FreeSync as the USDA organic cert: you're guaranteed organic, and that's it.

With G-Sync you're supposedly guaranteed organic plus some other things, like looks, size, or taste.

 

From the article:

Every G-Sync display needs to pass Nvidia’s strict certification process, which has rejected monitors in the past. Nvidia hasn’t publicly detailed the requirements, but representatives tell me that the company works directly with panel makers like AU Optronics to optimize refresh rates, flicker properties, response times, and visual quality; then works with the display makers (like Asus and Acer) to fine-tune the on-screen display and more. Every monitor is calibrated to the sRGB color gamut. 

Every G-Sync monitor supports the equivalent of AMD’s Low Framerate Compensation, guaranteeing a smooth gaming experience. All G-Sync monitors also support “frequency dependent variable overdrive.” Without diving into too much detail, the technology prevents ghosting on G-Sync displays—an issue that severely affected early FreeSync panels, though the issue’s less prevalent now.
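The "frequency dependent variable overdrive" part is easier to picture with a toy sketch (made-up numbers, my own illustration, not Nvidia's actual algorithm): with VRR the refresh rate follows the frame rate, so the pixel-overdrive strength has to be re-tuned on the fly instead of staying at the tuning for a single refresh rate, which is what produced the overshoot/ghosting on early FreeSync panels.

```python
# Toy model of frequency-dependent variable overdrive: interpolate an
# overdrive gain from a per-refresh calibration table. The table values
# are hypothetical; real panels are tuned per model.

import bisect

CALIBRATION = [(48, 1.8), (75, 1.5), (120, 1.2), (144, 1.0)]  # (Hz, gain)

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate the tuned gain for the current refresh rate."""
    rates = [hz for hz, _ in CALIBRATION]
    gains = [g for _, g in CALIBRATION]
    if refresh_hz <= rates[0]:
        return gains[0]
    if refresh_hz >= rates[-1]:
        return gains[-1]
    i = bisect.bisect_left(rates, refresh_hz)
    t = (refresh_hz - rates[i - 1]) / (rates[i] - rates[i - 1])
    return gains[i - 1] + t * (gains[i] - gains[i - 1])

# A panel with one fixed gain tuned for 144 Hz overshoots at 60 Hz;
# re-tuning the gain per refresh avoids that.
print(round(overdrive_gain(60), 2))  # 1.67 with this made-up table
```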


2 hours ago, spartaman64 said:

Shitty FreeSync is better than no FreeSync, and that's the point of FreeSync: it costs so little to implement that you might as well put it in every monitor, even the lower-end models.

If a feature is shoddily implemented, I'd rather not have it at all. At best it won't be used, so what's the point; at worst it'll get in the way and I'll hate it.


2 minutes ago, Mira Yurizaki said:

If a feature is shoddily implemented, I'd rather not have it at all. At best it won't be used, so what's the point; at worst it'll get in the way and I'll hate it.

You mean at worst you'll disable it, and at best it provides you with a 20 fps window of VRR.


15 minutes ago, spartaman64 said:

You mean at worst you'll disable it, and at best it provides you with a 20 fps window of VRR.

If I have to do work to make sure my performance stays within that 20 fps window, it's not even worth trying. Either it "just works" or it doesn't.

 

EDIT: Especially if that window is still within my acceptable FPS requirements.
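To be fair, checking whether a game actually stays inside a narrow window is easy to script against a frametime log. A minimal sketch (mine, with hypothetical numbers, assuming a log of milliseconds per frame from any frametime capture tool):

```python
# Share of frames whose instantaneous fps lands inside a VRR window.
# The log values below are made up for illustration.

def share_in_window(frametimes_ms, vrr_min=48.0, vrr_max=75.0):
    """Fraction of frames inside [vrr_min, vrr_max] Hz."""
    fps_values = [1000.0 / ft for ft in frametimes_ms]
    inside = sum(vrr_min <= fps <= vrr_max for fps in fps_values)
    return inside / len(fps_values)

log_ms = [14.2, 15.0, 16.8, 13.9, 22.5, 21.0, 15.5]  # ms per frame
print(f"{share_in_window(log_ms):.0%} of frames inside the 48-75 Hz window")
```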


5 hours ago, Neftex said:

Opinions can be wrong; what I call fluff is the article. I see people parroting this "G-Sync is better quality" thing over and over, but when you look at a real test, it's clearly not true.

Depends on what you look at. The bigger issue is that some monitors simply don't have a FreeSync alternative, like the PG27UQ: your only option is G-Sync if you want a 4K HDR10 monitor with 1000-nit peak brightness at 98 Hz.


8 hours ago, Neftex said:

Opinions can be wrong; what I call fluff is the article. I see people parroting this "G-Sync is better quality" thing over and over, but when you look at a real test, it's clearly not true.

 

G-Sync does actually have stricter requirements; like it or not, that makes it better spec'd. Whether you think those specs are worth it is personal opinion, which cannot be wrong.

The thing here is that you think your opinions are right and anyone who disagrees is wrong. It doesn't work like that.

 

5 hours ago, spartaman64 said:

If a new phone company came out, said their phone was better than iPhones, and then compared it to an iPhone 2, would that be fair? I think that's what he's getting at: the higher-end FreeSync monitors are just as good as the higher-end G-Sync monitors.

 

He's not talking about comparing old products to new products, but comparing the technology, and on that front G-Sync has stricter requirements. In your example, Apple can be argued to have much stricter requirements (regarding the App Store, privacy, etc.), but that doesn't mean an iPhone is better for every individual; to others it is. What can be argued is that the requirements are stricter and the price is justified; whether that is of value to the end user is the end user's opinion.

 

 


3 minutes ago, mr moose said:

 

G-Sync does actually have stricter requirements; like it or not, that makes it better spec'd. Whether you think those specs are worth it is personal opinion, which cannot be wrong.

The thing here is that you think your opinions are right and anyone who disagrees is wrong. It doesn't work like that.

He's not talking about comparing old products to new products, but comparing the technology, and on that front G-Sync has stricter requirements. In your example, Apple can be argued to have much stricter requirements (regarding the App Store, privacy, etc.), but that doesn't mean an iPhone is better for every individual; to others it is. What can be argued is that the requirements are stricter and the price is justified; whether that is of value to the end user is the end user's opinion.

 

 

I'm just saying you can't compare your product to the bottom tier of another company's product and claim yours is better than theirs. Maybe a better example is if Intel compared their 10980XE to the new Athlon 200GE and said their CPUs are better than AMD's.


6 minutes ago, spartaman64 said:

I'm just saying you can't compare your product to the bottom tier of another company's product and claim yours is better than theirs. Maybe a better example is if Intel compared their 10980XE to the new Athlon 200GE and said their CPUs are better than AMD's.

I don't think anyone is doing that. They are comparing FreeSync to G-Sync, both the technical requirements of each and how they affect the general performance of the whole market.

 

 


3 hours ago, spartaman64 said:

You mean at worst you'll disable it, and at best it provides you with a 20 fps window of VRR.

A window of only 20 fps isn't enough to work with, and if it's some low-tier monitor without a decent VRR range, it can also have screen tearing or flickering. I'd rather turn off VRR.


2 minutes ago, Blademaster91 said:

A window of only 20 fps isn't enough to work with, and if it's some low-tier monitor without a decent VRR range, it can also have screen tearing or flickering. I'd rather turn off VRR.

If you have a powerful enough computer, you can easily keep the game near 60 fps. And like I said, if you don't want it, turn it off; it's not like it costs you anything extra, while other people have the choice to keep it on.


10 minutes ago, spartaman64 said:

If you have a powerful enough computer, you can easily keep the game near 60 fps. And like I said, if you don't want it, turn it off; it's not like it costs you anything extra, while other people have the choice to keep it on.

Not necessarily. Even the most powerful systems have minor framerate dips that drop below 60. I have an 8700K at 5.2 GHz and an RTX 2080 Ti, and I can still dip below 60 fps at random depending on the game I'm playing. When your VRR range is 40-60 Hz, any dip below 40 will be jarring.

 

This is an even bigger issue in poorly optimized titles like PUBG, where you can have 20% CPU utilization and 100% GPU utilization, then dip for no reason and watch your GPU go underutilized while the rest of your hardware sits idle. It's less of a problem now, but with those 48-75 Hz VRR panels that first launched with FreeSync, people struggled to stay above that minimum window.


1 minute ago, MageTank said:

Not necessarily. Even the most powerful systems have minor framerate dips that drop below 60. I have an 8700K at 5.2 GHz and an RTX 2080 Ti, and I can still dip below 60 fps at random depending on the game I'm playing. When your VRR range is 40-60 Hz, any dip below 40 will be jarring.

 

This is an even bigger issue in poorly optimized titles like PUBG, where you can have 20% CPU utilization and 100% GPU utilization, then dip for no reason and watch your GPU go underutilized while the rest of your hardware sits idle. It's less of a problem now, but with those 48-75 Hz VRR panels that first launched with FreeSync, people struggled to stay above that minimum window.

Well, you don't really want VRR under 30 anyway. It would be nice for the window to be 30-60, but you get what you pay for.


10 minutes ago, spartaman64 said:

Well, you don't really want VRR under 30 anyway. It would be nice for the window to be 30-60, but you get what you pay for.

If it's going to continue to be an afterthought, then FreeSync is merely a checkbox on the list of features to make some product look better.

 

EDIT: It's the same thing with HDR. If the monitor is rated for HDR400, I'm not even going to bother with it.


36 minutes ago, spartaman64 said:

Well, you don't really want VRR under 30 anyway. It would be nice for the window to be 30-60, but you get what you pay for.

Yeah... Even with LFC, sub-30 fps is not a great experience. They claim input lag isn't an issue when this happens, but that is most definitely untrue. This is the case on both G-Sync and FreeSync.


47 minutes ago, MageTank said:

but with those 48-75 Hz VRR panels that first launched with FreeSync, people struggled to stay above that minimum window.

I had that issue myself back when I used an LG 29UM68-P, which is 48-75 Hz... FreeSync on Nvidia worked flawlessly, but the window was too small for certain demanding (poorly optimized) games.


28 minutes ago, Princess Luna said:

I had that issue myself back when I used an LG 29UM68-P, which is 48-75 Hz... FreeSync on Nvidia worked flawlessly, but the window was too small for certain demanding (poorly optimized) games.

I most recently encountered it a few months back in a lab environment, trying to demonstrate the Port Royal benchmark with ray tracing. We were using an RTX 2080 Ti in a custom-loop system with a 32-inch LG 1440p FreeSync monitor. We enabled G-Sync Compatible, but the VRR window was 48-75 Hz, and we kept dipping below 48 fps with RT enabled. We tried to use CRU to edit the FreeSync range, but I couldn't get it to work.

 

Needless to say, that demo did not go so well, lol.


2 hours ago, mr moose said:

G-Sync does actually have stricter requirements; like it or not, that makes it better spec'd. Whether you think those specs are worth it is personal opinion, which cannot be wrong.

Slapping G-Sync on a monitor doesn't magically make it good; you can see that between G-Sync monitors themselves.

G-Sync's and FreeSync's main function is VRR, and you can't say G-Sync is better at that.

I linked a test of the same panel, G-Sync vs. FreeSync, and it shows the FreeSync one was better, so the G-Sync requirements and certification didn't help the product.

So my point is, it doesn't matter whether the monitor is G-Sync or FreeSync; you need to research the monitor anyway.


3 hours ago, Neftex said:

Slapping G-Sync on a monitor doesn't magically make it good.

Nvidia had a list of requirements for G-Sync monitors; FreeSync had none. The end result: significantly higher quality on the lower-end G-Sync monitors than on low-end FreeSync monitors, where most manufacturers tacked on (bad implementations of) FreeSync.

AMD fixed that with the second edition of FreeSync by introducing similar requirements that manufacturers had to follow in order to market their monitors as FreeSync 2.


5 hours ago, Neftex said:

Slapping G-Sync on a monitor doesn't magically make it good; you can see that between G-Sync monitors themselves.

That's not how it works. No one can just slap the G-Sync label on a monitor. When FreeSync first came out you could do that with FreeSync, but not with G-Sync.

5 hours ago, Neftex said:

G-Sync's and FreeSync's main function is VRR, and you can't say G-Sync is better at that.

You don't get it. They both aim to do the same thing, but the G-Sync requirements go above and beyond that, as multiple people have tried to tell you already.

 

5 hours ago, Neftex said:

I linked a test of the same panel, G-Sync vs. FreeSync, and it shows the FreeSync one was better, so the G-Sync requirements and certification didn't help the product.

So my point is, it doesn't matter whether the monitor is G-Sync or FreeSync; you need to research the monitor anyway.

 

Your point was that other people's opinions were wrong and only yours was right. Except in this case that is logically and demonstrably wrong.


2 hours ago, mr moose said:

That's not how it works. No one can just slap the G-Sync label on a monitor. When FreeSync first came out you could do that with FreeSync, but not with G-Sync.

The manufacturer slaps the label on it when it passes this "dubious" certification.

2 hours ago, mr moose said:

Your point was that other people's opinions were wrong and only yours was right. Except in this case that is logically and demonstrably wrong.

No, my point was exactly what I said: "opinions can be wrong." I reacted to an article claiming G-Sync makes sure the panels are optimized and whatnot, and I linked concrete evidence of a FreeSync monitor outperforming a G-Sync one with the same panel. No opinion there.

2 hours ago, mr moose said:

You don't get it. They both aim to do the same thing, but the G-Sync requirements go above and beyond that, as multiple people have tried to tell you already.

I do very much get it. Nvidia wants us to think that G-Sync is some kind of guarantee of better quality than anything labeled FreeSync, which is not true. If you want a good monitor you need to look at other things; neither the G-Sync nor the FreeSync label will help you find one.


20 minutes ago, Neftex said:

The manufacturer slaps the label on it when it passes this "dubious" certification.

What are you talking about?  

20 minutes ago, Neftex said:

No, my point was exactly what I said: "opinions can be wrong."

 

Like yours.

20 minutes ago, Neftex said:

I reacted to an article claiming G-Sync makes sure the panels are optimized and whatnot, and I linked concrete evidence of a FreeSync monitor outperforming a G-Sync one with the same panel. No opinion there.

You linked to the only article you could find that, from a certain point of view, said something supporting your opinions. What you did not do was point to an article showing the person you quoted was wrong. It is their opinion and experience that G-Sync is better; several people have explained to you why that can be true and is not "wrong". You are the one suffering under the delusion that your opinion on G-Sync (an area where you've shown your knowledge is deeply lacking) is the only true one.

20 minutes ago, Neftex said:

I do very much get it. Nvidia wants us to think that G-Sync is some kind of guarantee of better quality than anything labeled FreeSync, which is not true. If you want a good monitor you need to look at other things; neither the G-Sync nor the FreeSync label will help you find one.

Keep telling yourself that.  It does not make everyone else wrong and you right.


2 minutes ago, mr moose said:

You linked to the only article you could find that, from a certain point of view, said something supporting your opinions. What you did not do was point to an article showing the person you quoted was wrong. It is their opinion and experience that G-Sync is better; several people have explained to you why that can be true and is not "wrong". You are the one suffering under the delusion that your opinion on G-Sync (an area where you've shown your knowledge is deeply lacking) is the only true one.

I linked the closest thing to an apples-to-apples comparison: same panel, same manufacturer, same review outlet... I did that exactly so it's fair and not based on opinion. That said, I'm done with this; I shouldn't care about opinions based on... well, nothing concrete, it seems.
