
NVIDIA finally officially supports Adaptive Sync (with a small catch)

D13H4RD
4 hours ago, Ja50n said:

https://www.newegg.com/Product/Product.aspx?Item=N82E16824009769

https://www.newegg.com/Product/Product.aspx?Item=24-236-797

No clue about extra features, but backlight strobing would cut brightness in half and significantly increase the chances of eye strain and headaches...

This is what I meant when I said people are just looking at the very basic specs when comparing G-Sync and FreeSync, and ignoring a lot of the functions and features that are not obvious. Those two monitors are not comparable. The G-Sync one is a higher tier monitor.

 

The G-Sync one has a strobing backlight, unlike the FreeSync one.

The FreeSync monitor also lacks adaptive overdrive, so if you want to avoid ghosting you're going to need to manually tune your overdrive settings, preferably for each individual game.
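
(To illustrate why adaptive overdrive matters with variable refresh rates: the right amount of pixel overdrive depends on how long each frame lasts, so a fixed setting tuned for one refresh rate will overshoot or undershoot at others. A toy sketch where every number is invented purely for illustration:)

```python
# Toy illustration of why overdrive needs tuning per refresh rate.
# Overdrive briefly drives a pixel past its target gray level so the
# liquid crystal settles within one refresh interval. Values are invented.
def overdrive_value(current: int, target: int, frame_time_ms: float) -> int:
    """Pick a drive level: the shorter the frame, the harder we push."""
    gain = 1.0 + 0.04 * (16.7 - frame_time_ms)   # invented tuning curve
    drive = target + (target - current) * (gain - 1.0)
    return max(0, min(255, round(drive)))

# A fixed overdrive tuned for 144 Hz (6.9 ms frames) is too aggressive at
# 60 Hz (16.7 ms); adaptive overdrive recomputes this per frame as the
# VRR rate changes, which is what the FreeSync monitor here can't do.
for hz in (60, 100, 144):
    print(hz, "Hz ->", overdrive_value(current=50, target=200, frame_time_ms=1000 / hz))
```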

 

After looking at tests of the two monitors I can also say that:

The Asus one has better calibration out of the box, for example more accurate colors and grayscale. It's not a huge difference, but enough to be noticeable. The G-Sync monitor also has lower input lag (23 ms vs 28 ms).
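
(For anyone wondering how reviewers put a number on "more accurate colors": they measure test patches with a colorimeter and compute Delta E against the reference values. A minimal sketch with made-up readings:)

```python
# Minimal sketch of how reviewers quantify color accuracy: CIE76 Delta E
# between a colorimeter reading and the reference value. The Lab numbers
# below are made up for illustration, not real measurements of either monitor.
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance in CIELAB space; ~1.0 is roughly the smallest
    difference a trained eye can spot under ideal conditions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (53.2, 80.1, 67.2)   # target Lab for pure sRGB red
measured  = (55.0, 76.5, 64.0)   # hypothetical colorimeter reading

print(f"Delta E: {delta_e_cie76(reference, measured):.2f}")
```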

 

 

It's true that strobing backlight monitors need extra brightness to compensate for the black intervals. However, this is not really an issue on modern displays (at least high quality ones) because they are fitted with eye-searingly bright backlights to begin with. If you run your current display at 100% brightness, and it's a fairly high end one, then chances are you are getting eye strain and headaches because the display is too bright. It's like shining a flashlight into your eyes.
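
(The "cut brightness in half" figure is just duty-cycle arithmetic; a rough sketch with assumed numbers:)

```python
# Rough duty-cycle arithmetic behind the "strobing halves brightness" claim.
# All numbers are illustrative assumptions, not specs of either monitor.
panel_peak_nits = 350          # assumed steady-state backlight brightness
strobe_duty_cycle = 0.5        # backlight on for 50% of each refresh interval

effective_nits = panel_peak_nits * strobe_duty_cycle
print(f"Perceived brightness while strobing: ~{effective_nits:.0f} nits")

# Which is why strobing modes usually drive the backlight harder while on:
boosted_peak_nits = 500        # assumed overdriven strobe brightness
print(f"With a boosted backlight: ~{boosted_peak_nits * strobe_duty_cycle:.0f} nits")
```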

 

The strobing backlight can be turned off, so even if it has downsides for you, you can simply leave it disabled. The point is that you're getting an extra feature you can use if you want (and it's brilliant for reducing motion blur).

 

 

I am not saying the FreeSync monitor is bad. It's a good monitor. But the Asus one is in fact a higher tier monitor: more features and overall a slightly more accurate display.

 

I am getting kind of tired of people doing these types of comparisons, concluding that "FreeSync is cheaper than G-Sync" without looking into anything beyond screen size, resolution and refresh rate. You can't say two monitors are comparable without doing some in-depth reading of reviews by people who have done things like measure them with a spectrophotometer or colorimeter. Once you start looking into it a bit more, you will realize that FreeSync monitors are usually cheaper, especially when the gap is several hundred dollars, because they are simply not as good as the G-Sync monitors they're being compared to.

 

This 200 dollar G-Sync tax myth needs to die.


5 hours ago, LAwLz said:

This is what I meant when I said people are just looking at the very basic specs when comparing G-Sync and FreeSync, and ignoring a lot of the functions and features that are not obvious. Those two monitors are not comparable. The G-Sync one is a higher tier monitor.

[...]

This 200 dollar G-Sync tax myth needs to die.

I feel you're actually proving my point there, because I'm guessing those extra features on G-Sync aren't optional, but required. Therefore someone who's monitor shopping is likely to spend a few hundred more for the NVIDIA solution, regardless of whether they want the extra features they're paying for. It's kind of like justifying RTX cost compared to Vega purely on ray tracing: yes, RTX cards can do more, but they also cost a lot more even when doing the same thing. G-Sync may very well be better; we'll see once comparisons come out! It would still have cost me $600-800, though, when my FreeSync monitor was only $400 (and FreeSync made no impact on my purchase; it was just the best deal on that class of monitor at the time). Features missing, probably, but I like being able to choose that for myself.


On 1/8/2019 at 12:47 PM, Crunchy Dragon said:

my panel isn't on the "worthy" list.

I have a few MG248Qs I'll be testing. I'm curious to see if they play well with FreeSync like their bigger brother supposedly does.

"And I'll be damned if I let myself trip from a lesser man's ledge"


6 hours ago, Ja50n said:

Yes, RTX cards can do more, but they also cost a lot more even when doing the same thing. G-Sync may very well be better; we'll see once comparisons come out!

I'm not too sure I follow your logic here. The consumer's needs and wants determine whether a product with more features is worth it to them. What I believe LAwLz is saying is that G-Sync monitors nearly always have more features, which warrants them being more expensive. Whether that's justified for your specific use is a different story.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


3 minutes ago, mr moose said:

I'm not too sure I follow your logic here. The consumer's needs and wants determine whether a product with more features is worth it to them. What I believe LAwLz is saying is that G-Sync monitors nearly always have more features, which warrants them being more expensive. Whether that's justified for your specific use is a different story.

But that's where you and @LAwLz are so very obviously wrong. If *I* don't need something, then the product with that something shouldn't exist. 

 

 

 

sadly necessary

 

/S

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


4 minutes ago, 79wjd said:

But that's where you and @LAwLz are so very obviously wrong. If *I* don't need something, then the product with that something shouldn't exist. 

 

 


sadly necessary

 

/S

How could I have been so foolish?

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


14 minutes ago, mr moose said:

I'm not too sure I follow your logic here. The consumer's needs and wants determine whether a product with more features is worth it to them. What I believe LAwLz is saying is that G-Sync monitors nearly always have more features, which warrants them being more expensive. Whether that's justified for your specific use is a different story.

Yeah... as a very happy user of one of the early 1440p ultrawide G-Sync panels, it certainly paid off for me at least. These monitors were at least decently color calibrated and color accurate, and were among the fastest end-to-end IPS monitors that existed at the time (still faster than many TN and VA panels out today that use worse scalers and internal hardware). 100 Hz refresh rate, perfectly good with me.

 

Honestly it's been an experience I would have recommended to anyone in my shoes that could afford it.

 

I'm glad the G-Sync Compatible program is making sure to require similarly high standards, and the people down the line who pay less for the same experience I got? Well, good for them. Their lower cost didn't increase mine; this isn't a zero-sum trade.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


8 hours ago, Ja50n said:

I feel you're actually proving my point there, because I'm guessing those extra features on G-Sync aren't optional, but required. [...] Features missing, probably, but I like being able to choose that for myself.

All I was trying to explain is that it isn't as simple as going "there is a 200 dollar price premium for G-Sync", because that implies the monitors are the same in all other aspects, which they aren't.

 

How would you like it if I said Ryzen had a 200 dollar "AMD tax" which Intel didn't have? Don't believe me? Look at the price of the 2700X. It costs 300 dollars. Now look at the price of the Intel Pentium G5600. It's 100 dollars. Clearly AMD is 200 dollars more expensive.

What? You don't think that's a fair comparison because the AMD chip performs better? Well, I am satisfied with the performance of the Pentium chip, so I think it's a fair comparison.

/sarcasm

 

 

If you can link me a few G-Sync monitors and a few FreeSync monitors which all have the same feature sets, quality and performance, but where the G-Sync monitors are 200 dollars more expensive, then I'll agree with your claim. Linking two different monitors with different performance and different feature sets just means you're comparing a G-Sync monitor against a worse FreeSync monitor, and then going "look! The worse monitor is cheaper!".


Lol, well... I feel somewhat vindicated after years of saying NVIDIA has the power to add support for FreeSync monitors with a driver update at any time they choose, and people replying with "no way, if it was just a driver restriction people would have hacked around it by now" :P

 

On 7/4/2015 at 9:04 AM, Glenwing said:

[In response to "can NVIDIA support FreeSync with a driver update if they wanted to"]

 

The first thing to consider here is that not all of AMD's GCN-based cards support dynamic refresh rates, only GCN 1.1 cards or above. Older 1.0 cards like the 7970/280X aren't capable, so there is some hardware involved in making this possible. It's possible Kepler does not have this (G-SYNC hardware may be too different to simply act as a substitute), and if that's the case then Maxwell may not have it either. GPUs are designed on a 3–4 year cycle, so by the time FreeSync was announced, Maxwell was already finished for the most part. It's not unrealistic to think it might not have the required hardware either.

On the other hand, Maxwell mobile GPUs are capable of "Mobile G-SYNC", which is essentially FreeSync; it operates over the same eDP protocol that AMD first used to demonstrate FreeSync. Personally I think Maxwell GPUs at least could be capable of FreeSync given the right software, but only NVIDIA knows for sure.

I do not think NVIDIA will adopt "FreeSync" but I predict they will eventually come out with "G-SYNC 2.0" or something like that which works via DisplayPort Adaptive-Sync like FreeSync, because they will need to compete more on cost. It's just a question of whether they'll take special measures to block monitors with Adaptive-Sync that haven't been certified for "G-SYNC 2.0", which I think is incredibly likely.


It's also worth noting that right now, G-SYNC is superior to any current FreeSync implementation. G-SYNC covers the entire range of each monitor, from 0–144 Hz (or whatever the max is), while FreeSync has a lower and/or upper limit, sometimes quite a restrictive one. G-SYNC is also an entire package, not just the dynamic refresh technology. As it stands, the entire display controller is replaced by the G-SYNC module, which includes NVIDIA's anti-ghosting algorithms optimized for dynamic refresh rates, something that works quite well and beats any FreeSync monitor I've seen so far. If NVIDIA uses Adaptive-Sync for "G-SYNC 2.0", it's likely they'd want to ensure it keeps those advantages, so for a monitor vendor there'd be a bit more effort in creating a "G-SYNC 2.0" compliant monitor than a FreeSync one, and it's difficult to say how cross-compatibility will work out.
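
(Side note on the range limitation mentioned above: when the frame rate falls below a FreeSync monitor's minimum, the driver either falls back to fixed-refresh behavior or multiplies frames to stay inside the supported window, which is the idea behind AMD's later Low Framerate Compensation. A toy sketch of that idea; the range numbers are assumptions, not any specific monitor's spec:)

```python
# Toy sketch of low-framerate compensation (frame multiplication) for a
# monitor with a limited VRR window. Range values are assumptions.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 144

def refresh_for_fps(fps: float) -> tuple[int, float]:
    """Return (frame repeats, panel refresh rate), keeping the panel
    inside its VRR window by repeating each frame when fps is too low."""
    if fps >= VRR_MIN_HZ:
        return 1, min(fps, VRR_MAX_HZ)
    repeats = 1
    while fps * (repeats + 1) <= VRR_MAX_HZ and fps * repeats < VRR_MIN_HZ:
        repeats += 1
    return repeats, fps * repeats

for fps in (30, 40, 60, 160):
    n, hz = refresh_for_fps(fps)
    print(f"{fps} fps -> show each frame {n}x, panel runs at {hz:.0f} Hz")
```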

 

On 7/4/2015 at 9:46 AM, Glenwing said:

With "G-SYNC 2.0" I mean NVIDIA will most likely not use a proprietary controller chip in the monitor in a future version of G-SYNC, it will work via the same protocol as FreeSync. They will in effect switch to FreeSync (which works DisplayPort Adaptive-Sync), but they'll still call it G-SYNC ;) I do not think this is coming any time soon though.

 

Not a bad prediction, considering the date of the post :P

 

Happy to say I was wrong about them blocking compatibility on monitors without "new-G-Sync" certification, instead simply having it disabled by default, which is no big deal at all.

 

Of course, there is that nasty bit about not being supported on Maxwell cards, even though the hardware is clearly capable of it, but maybe this will come in time... I recall some features in a similar situation, like DSR, which I believe was originally only available on Maxwell but was later enabled for Kepler cards in an update. I suppose we'll see.



42 minutes ago, Hellion said:

How does one not understand the basic concept that higher resolutions displayed on smaller screens, sub 32 inches, don't actually make a noticeable difference in real world applications?

I can still see the difference on my phone's screen, buddy. 

 

And this is not because I have "perfect eyesight". It's down to the bitrate and, more importantly, chroma subsampling.
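
(For context on what chroma subsampling costs in raw data terms, here's a back-of-the-envelope sketch; the resolution, frame rate and bit depth are just example numbers:)

```python
# Back-of-the-envelope data rates for 4:4:4 vs 4:2:0 chroma subsampling.
# Resolution, frame rate and bit depth are example assumptions.
width, height, fps, bits_per_sample = 2532, 1170, 60, 8

def gbps(samples_per_pixel: float) -> float:
    return width * height * fps * samples_per_pixel * bits_per_sample / 1e9

# 4:4:4 keeps full chroma (3 samples/pixel); 4:2:0 averages chroma over
# 2x2 blocks (1 luma + 2 chroma per 4 pixels = 1.5 samples/pixel).
print(f"4:4:4 -> {gbps(3.0):.2f} Gbit/s uncompressed")
print(f"4:2:0 -> {gbps(1.5):.2f} Gbit/s uncompressed (half the data)")
```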

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


* Thread cleaned *

 

Please avoid passive-aggressive and antagonizing comments.

If you need help with your forum account, please use the Forum Support form!

