
Nvidia: 5% of adaptive-sync monitors pass G-sync Compatible tests

porina
Quote

Since we announced support for Adaptive-Sync Variable Refresh Rate monitors, we’ve been testing every model available, with those that passed several tests being marked “G-SYNC Compatible”. This validation ensures that buyers receive a baseline Variable Refresh Rate (VRR) experience that makes gaming smoother, clearer, and more enjoyable.

 

We’ve completed our first phase of testing, getting our hands on as many Adaptive-Sync monitors as we could, and we’ve got the results ready to share with you.

 

To date, 503 VRR monitors have passed through our lab, and 28 (5.56%) have received G-SYNC Compatible validation, meaning 475 monitors failed.

https://www.nvidia.com/en-us/geforce/news/g-sync-compatible-validation/

(who's that in the video on that page?...)

 

This made the news previously when Nvidia started their testing; they have now completed their first phase of testing the available monitors. They tested 503 variable refresh monitors: 273 failed due to a limited operating range, and 202 failed due to other image quality problems. Only 28 models passed, making up 5.56% of models tested. They said they identified 33 further models, but were unable to test them as they were no longer available.

 

While we all wonder if the G-sync tax is worth it, their testing does show you generally get a higher level of performance from a G-sync display. That's not to say a FreeSync one isn't good enough, but it seems in many cases you pay less and you get less.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


The biggest category of failures was monitors that didn't have an adaptive-sync range that satisfied Nvidia.

 

So, it could be that most of those tested monitors are actually fully G-Sync compatible, but simply don't have the specs that Nvidia considers premium enough to promote its brand with.

 

[Image: G-Sync Compatible infographic, Computex 2019]

 

Quote

273 failed for lacking a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), meaning you were unlikely to get any of the benefits of VRR as your framerate wasn’t within the tight range offered.

2.4:1 seems like a very high bar to meet. There are a lot of 75Hz FreeSync monitors whose VRR range is 48-75Hz, a ratio of only about 1.56:1. All of those would be automatically failed by Nvidia simply because they don't offer the 2.4:1 VRR ratio that Nvidia has chosen as a requirement for the "G-Sync Compatible" badge.
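To put numbers on it, here's a minimal sketch of that ratio check (the function name and defaults are assumptions for illustration, not Nvidia's actual test procedure):

```python
def meets_ratio_requirement(vrr_min_hz: float, vrr_max_hz: float,
                            required_ratio: float = 2.4) -> bool:
    """Check whether a monitor's VRR range satisfies a max:min ratio bar."""
    return vrr_max_hz / vrr_min_hz >= required_ratio

# A common 48-75Hz FreeSync panel: 75 / 48 = 1.5625, so it fails.
print(meets_ratio_requirement(48, 75))   # False
# The 60-144Hz example Nvidia cites: 144 / 60 = 2.4 exactly, so it passes.
print(meets_ratio_requirement(60, 144))  # True
```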

 

 

My 75Hz HP Omen 32 is going to be one of the monitors automatically disqualified by Nvidia. Yet its G-Sync-compatible performance is perfect, and my FPS never drops below 60. Frame-skipping above 60Hz with an Nvidia GPU is another matter, though (it can be fixed for another long while, until it starts happening again, by cycling through the monitor's display inputs).

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


8 minutes ago, porina said:

.

Naah... I tested out FreeSync on one of those 3440x1440 75Hz ultrawides by LG which, according to Nvidia, wasn't "G-Sync Certified", and compared the experience to my actual G-Sync monitor... it looked/felt the very same, if I'm honest... most people I have talked with also mention that as long as the FreeSync implementation on the display is any good, it already "just works".

 

I really think G-Sync is a dead feature and you're always better off saving those 200 bucks going with the FreeSync variation of the display.

Personal Desktop":

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

Do Nvidia drivers really suck so bad that well-reviewed FreeSync monitors with LFC and good operating ranges cannot play well with GeForce cards like they do with Radeon?

 

Have they implemented driver level low framerate compensation for compatible monitors like AMD has?

i.e. if the minimum monitor refresh is 40Hz and the framerate drops to 30fps, my GPU will double it to 60Hz by displaying each frame twice, thus keeping it within the refresh window. If I drop to 10fps, my GPU will display each frame 4 times, and so on, to stay at 40Hz+... Does Nvidia do this at the driver level like AMD does, if the monitor's input range is wide enough? Or are they totally reliant on the presence of their proprietary G-Sync modules in the monitor to achieve the same effect?
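The frame-multiplication logic described above can be sketched in a few lines (a hypothetical illustration of how an LFC-style repeat count gets picked, not AMD's or Nvidia's actual driver code; the 40-144Hz window is an assumed example):

```python
import math

def lfc_refresh(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> tuple[int, float]:
    """Pick a frame-repeat count that keeps a low framerate inside the VRR window."""
    if fps >= vrr_min_hz:
        return 1, fps  # already in range, no compensation needed
    # smallest whole-number repeat count that lifts the refresh above the floor
    multiplier = math.ceil(vrr_min_hz / fps)
    refresh = fps * multiplier
    if refresh > vrr_max_hz:
        raise ValueError("no frame multiple fits inside the VRR window")
    return multiplier, refresh

# The examples from the post, assuming a 40-144Hz window:
print(lfc_refresh(30, 40, 144))  # (2, 60): each frame shown twice -> 60Hz
print(lfc_refresh(10, 40, 144))  # (4, 40): each frame shown four times -> 40Hz
```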

 

 


One of the things that Nvidia takes into consideration is that FreeSync should be enabled automatically when the monitor is connected to a compatible graphics card; the monitor also needs to have LFC (Low Framerate Compensation) to meet Nvidia's requirements. AFAIK the branding only means that Nvidia approves the adaptive-sync experience as being just as good as it is on normal G-Sync monitors.

Funny how I bought my first FreeSync monitor when Nvidia announced the "G-Sync Compatible" program and it got added to the shortlist like three months after it launched. There were some small quirks here and there early on with the VRR experience on a 1080Ti, but after Nvidia released a driver certifying the monitor and Gigabyte released a firmware update that "improves compatibility with Nvidia GPUs" the experience is pretty much flawless in all games I tried.

6 minutes ago, Humbug said:

Do Nvidia drivers really suck so bad that well-reviewed FreeSync monitors with LFC and good operating ranges cannot play well with GeForce cards like they do with Radeon?

 

Have they implemented driver level low framerate compensation for compatible monitors like AMD has?

i.e. if the minimum monitor refresh is 40Hz and the framerate drops to 30fps, my GPU will double it to 60Hz by displaying each frame twice, thus keeping it within the refresh window. If I drop to 10fps, my GPU will display each frame 4 times, and so on, to stay at 40Hz+... Does Nvidia do this at the driver level like AMD does, if the monitor's input range is wide enough? Or are they totally reliant on the presence of their proprietary G-Sync modules in the monitor?

LFC works flawlessly, at least on the AD27QD that I have. It did even when it wasn't officially certified yet.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


3 minutes ago, Princess Luna said:

I really think G-Sync is a dead feature and you're always better off saving those 200 bucks going with the FreeSync variation of the display.

 

1 minute ago, Misanthrope said:

"Stop trying to make Gsync happen it is NOT gonna happen"

IMO there is always a space for a high-end gaming experience. Of course, that doesn't mean you can't have a good experience with lower-end equipment. My "gaming" laptop has a quad-core (no HT) CPU and a 1050, and with settings turned down it gets the job done. Of course I have a better experience with my desktop at home... I do use a G-sync monitor, and if I were to replace it, I certainly wouldn't rule out getting another one.

 

So G-sync will probably never hit the mainstream, but it has a place at the higher end. I also hope G-sync Compatible will encourage display manufacturers to offer more premium FreeSync displays at moderate cost in future.



I get the idea of a qualification sticker that sets a bar for how good a FreeSync monitor is, but "G-sync Compatible" seemingly is just turning out to be an expensive sticker that barely tells you how good the monitor actually is.

 

 

I wouldn't be mad if there were some sort of Adaptive-Sync "gold" certification which just tells you the FreeSync range goes all the way down to, say, 28Hz.

 

Also, does Nvidia have an official list that says exactly what is required to get the sticker?


1 minute ago, porina said:

So G-sync will probably never hit the mainstream, but it has a place at the higher end. I also hope G-sync Compatible will encourage display manufacturers to offer more premium FreeSync displays at moderate cost in future.

The question then becomes: should AMD add a "FreeSync Gold/Premium" sticker?

 

It essentially says the product is premium and does all of X things. It's what G-Sync does these days. Isn't there FreeSync HDR that does this already?


1 minute ago, porina said:

So G-sync will probably never hit the mainstream, but it has a place at the higher end. I also hope G-sync Compatible will encourage display manufacturers to offer more premium FreeSync displays at moderate cost in future.

Good FreeSync monitors already perform as well as more expensive G-Sync monitors. Now, obviously a cheap-ass FreeSync monitor is not going to compete with something twice the cost (G-Sync or otherwise).

 

However, if AMD wants more premium FreeSync displays, they have to become more competitive in the discrete GPU market... simple as that. Otherwise monitor manufacturers have less incentive to put out high-quality FreeSync displays.


2 minutes ago, GoldenLag said:

The question then becomes: should AMD add a "FreeSync Gold/Premium" sticker?

 

It essentially says the product is premium and does all of X things. It's what G-Sync does these days. Isn't there FreeSync HDR that does this already?

I guess they kinda have it in FreeSync 2:

https://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming

 



Ok, so a bunch of monitors don't qualify for an arbitrary standard set by nVidia to justify their snake oil... why do we care?

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Running a regular 144Hz monitor with Adaptive V-Sync. I literally don't see any reason to chase an X-sync monitor of any kind. Everything is super smooth and without any tearing.


Meanwhile, my monitor just flat-out goes wack with variable refresh rate switched on.

 

Just so much blur, it's not even funny 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


53 minutes ago, porina said:

 

IMO there is always a space for a high-end gaming experience. Of course, that doesn't mean you can't have a good experience with lower-end equipment. My "gaming" laptop has a quad-core (no HT) CPU and a 1050, and with settings turned down it gets the job done. Of course I have a better experience with my desktop at home... I do use a G-sync monitor, and if I were to replace it, I certainly wouldn't rule out getting another one.

 

So G-sync will probably never hit the mainstream, but it has a place at the higher end. I also hope G-sync Compatible will encourage display manufacturers to offer more premium FreeSync displays at moderate cost in future.

The thing is: there is no guarantee that Nvidia will continue to be the main and only provider of said niche, high-end experience. You're betting, a considerably long time into the future, that Nvidia will be the only name you'll need for high-end gaming.

 

Moreover, by doing so, you're also enabling Nvidia to charge basically whatever the fuck they want. AMD releasing nothing but fucking turds for the high end is half of the equation, but the other half is people buying into Nvidia-exclusive hardware: that's at least part of why $1000+ GPUs came to be today.

-------

Current Rig

-------


44 minutes ago, Morgan MLGman said:

One of the things that Nvidia takes into consideration is that FreeSync should be enabled automatically when the monitor is connected to a compatible graphics card; the monitor also needs to have LFC (Low Framerate Compensation) to meet Nvidia's requirements. AFAIK the branding only means that Nvidia approves the adaptive-sync experience as being just as good as it is on normal G-Sync monitors.

Funny how I bought my first FreeSync monitor when Nvidia announced the "G-Sync Compatible" program and it got added to the shortlist like three months after it launched. There were some small quirks here and there early on with the VRR experience on a 1080Ti, but after Nvidia released a driver certifying the monitor and Gigabyte released a firmware update that "improves compatibility with Nvidia GPUs" the experience is pretty much flawless in all games I tried.

LFC works flawlessly, at least on the AD27QD that I have. It did even when it wasn't officially certified yet.

My LG 60Hz and 75Hz monitors have it set up so you have to turn it on in the monitor settings. I like this feature, since some titles like V-sync better.

 

I see Nvidia's FreeSync support as another tool to combat screen tearing.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


Wow. Only 28 of the 500 or so available are able to give us a few more milliseconds. Great. It's really useful research for Nvidia. Unfortunately, everybody else lives in the real world and doesn't care about G-Sync. Why can't they just admit defeat, or at least lower the G-Sync prices? It's ridiculous to have to pay another £100 just for a few milliseconds' improvement over FreeSync.

 

You do pay less and get less. But what you get is so close to what you would get if you paid more that it's just not worth paying the crazy tax they put on it. As long as FreeSync is implemented properly (which it is most of the time), it's pretty much just as good as G-Sync.

 


10 minutes ago, Misanthrope said:

The thing is: there is no guarantee that Nvidia will continue to be the main and only provider of said niche, high-end experience. You're betting, a considerably long time into the future, that Nvidia will be the only name you'll need for high-end gaming.

I can't predict the future, but from what we know today, neither AMD nor Intel will have competing high-end products for the rest of this year, and likely won't until well into next year or even later. If you have deep enough pockets to go high end, chances are you can also refresh more frequently to remain close to the leading edge. What might or might not happen in 1 or 2 years doesn't matter much today.

 

For those who are more careful about how they spend, G-sync Compatible seems to be a fair middle ground. You keep open the option of switching to AMD or Intel devices while maintaining variable refresh, should they become a better choice in future.



1 hour ago, porina said:

I can't predict the future,

 

Quote

For those who are more careful about how they spend, G-sync Compatible seems to be a fair middle ground.

 

You see, no offense, but these statements are, in my opinion, mutually exclusive: a monitor is a piece of hardware that most users (yes, even the really high-end niche ones, IMO) are bound to keep for a while; keeping one for at least 2 or 3 generations of GPUs, if not more, is basically not unheard of (which is in fact the case for adaptive-sync technologies at this point: 4 generations? However many it's been since like 2014 and the GeForce 600 series).

 

So the fact that you indeed can't predict the future means that you aren't being careful about your monitor purchase if you buy G-Sync based on this year alone, when you should be expecting to use a monitor anywhere from 4 to 6 years into the future.



Only down to 48 Hz is near-worthless trash anyways.

 

30 to 75 is acceptable by that standard, and for a low-frequency monitor, I would always expect it to be capable of going down to at least 30.

 

Yes that does mean these monitors are substandard.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


14 minutes ago, Curufinwe_wins said:

Only down to 48 Hz is near-worthless trash anyways.

 

30 to 75 is acceptable by that standard, and for a low-frequency monitor, I would always expect it to be capable of going down to at least 30.

 

Yes that does mean these monitors are substandard.

Who plays games at 48 FPS and lower? If a game is running at 75 FPS normally, in what situation is it going to go as low as 48 or lower? Game designers use graphical budgets to avoid those kinds of FPS swings.

 

Like I said in my post, my FPS almost never goes below 60. If it does, then I'll probably adjust the settings so that it doesn't. With or without adaptive sync, FPS spending any time below 50 means the settings are not optimal or the PC isn't very capable of playing the game.

 

I can tell you from my experience that this range of adaptive sync isn't at all worthless; it gives plenty of leeway and ensures the FPS never dips out of the VRR range.



7 hours ago, Delicieuxz said:

Who plays games at 48 FPS and lower? If a game is running at 75 FPS normally, in what situation is it going to go as low as 48 or lower? Game designers use graphical budgets to avoid those kinds of FPS swings.

 

Like I said in my post, my FPS almost never goes below 60. If it does, then I'll probably adjust the settings so that it doesn't. With or without adaptive sync, FPS spending any time below 50 means the settings are not optimal or the PC isn't very capable of playing the game.

 

I can tell you from my experience that this range of adaptive sync isn't at all worthless; it gives plenty of leeway and ensures the FPS never dips out of the VRR range.

Except that almost no game ever actually runs stably above 60 FPS. Even games running at 100 FPS average tend to drop well into the 30s and 40s on occasion. Just because you don't 'see it' on your time-averaged FPS counter doesn't mean it isn't occurring. And having used both a monitor with a proper VRR range and one with the actually trash 48-75 range, the difference is instantly palpable. Covering at least a 2:1 range means that you always have a good multiplier (for example, 40 to 80 is fine because 39 FPS is just 78Hz with each frame displayed twice). Otherwise, it's less than ideal. In fact, I'll be perfectly blunt in saying that at that point I'd rather use V-sync, if you feel that confident you aren't going to drop frames in the first place.
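To make the multiplier argument concrete, here's a small sketch (a hypothetical helper written for this thread, assuming the simple frame-repeating model described earlier) showing why a sub-2:1 window like 48-75Hz can leave some framerates with no valid frame multiple at all:

```python
def fits_vrr_window(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> bool:
    """Return True if some whole-number frame-repeat count lands inside the window."""
    multiplier = 1
    while fps * multiplier <= vrr_max_hz:
        if fps * multiplier >= vrr_min_hz:
            return True
        multiplier += 1
    return False

# 40 FPS on a 48-75Hz panel: 40 is below the floor and doubling to 80
# overshoots the ceiling, so no multiple fits and VRR drops out entirely.
print(fits_vrr_window(40, 48, 75))  # False
# The same 40 FPS sits natively inside a 30-75Hz window.
print(fits_vrr_window(40, 30, 75))  # True
# A 2:1 window (40-80Hz) always has a fit: 39 FPS doubles to 78Hz.
print(fits_vrr_window(39, 40, 80))  # True
```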



The other problem with FreeSync is that monitors won't necessarily even perform to their listed specs, as there's an element of silicon lottery. At least you can return one if it can't do its listed specs. But you can also get lucky, and it'll do better than the listed range. Either way, there's still an element of getting what you pay for: you don't need to look hard to find reports of crappy FreeSync monitors, and this is the kernel of truth on which Nvidia bases its G-Sync marketing.


I'm not sure what Nvidia is on about regarding "insufficient VRR range", and I think that is up to me to decide. I could not care less if the FreeSync range is only 60 to 144Hz. I have no plans of going that low either way.

Motherboard: Asus X570-E
CPU: 3900x 4.3GHZ

Memory: G.skill Trident GTZR 3200mhz cl14

GPU: AMD RX 570

SSD1: Corsair MP510 1TB

SSD2: Samsung MX500 500GB

PSU: Corsair AX860i Platinum


It seems to me people don't understand the law of diminishing returns. Naturally you will get adequate performance (if not perceived equivalence) with today's FreeSync monitors, but that doesn't mean there is no space or desire for people to go one step further, even if the cost is exponentially higher.

 

The law of diminishing returns applies in every industry and to every product: at the top of the product line, where the cost is higher, consumer demand drops; thus getting those last few touches of features/quality requires a large step up in price. Otherwise the absolute best is just not viable to produce.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

