The reasons there are so few G-Sync monitors compared to FreeSync

I have a G-Sync display and I'm very happy with it. G-Sync is better, but you do pay for it. I got mine when FreeSync wasn't even a thing, and I already had an Nvidia card, so I had to get G-Sync. Not complaining though, because G-Sync is pretty good. It's the best display tech of the past 10 years or so. It completely removed a huge problem that people had accepted for so long.

 

Give Nvidia credit for doing it, because if they hadn't, no one would have bothered.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


16 minutes ago, kuddlesworth9419 said:

I have a G-Sync display and I'm very happy with it. G-Sync is better, but you do pay for it. I got mine when FreeSync wasn't even a thing, and I already had an Nvidia card, so I had to get G-Sync. Not complaining though, because G-Sync is pretty good. It's the best display tech of the past 10 years or so. It completely removed a huge problem that people had accepted for so long.

 

Give Nvidia credit for doing it, because if they hadn't, no one would have bothered.

How is it better when they do the same thing? Aside from G-Sync always covering the panel's full range up to its max refresh rate, I don't believe the experience is any different.

 

And what about those laptops with VRR? Was that before G-Sync or after? I don't have time to scour the depths of the internet right now.

 

 


1 hour ago, xAcid9 said:

We're definitely not on the same page. xD

 

Yes, Nvidia's current implementation of Adaptive Sync needs that module right now, but it might not need it in the future.

its not "their implimentation" of adaptive sync its their implementation of variable refresh rate. and yes they can choose to follow the VESA standard but they cant do the same if they dont jump on board the VESA standard unless they make their own display connector standard


57 minutes ago, kuddlesworth9419 said:

I have a G-Sync display and I'm very happy with it. G-Sync is better, but you do pay for it. I got mine when FreeSync wasn't even a thing, and I already had an Nvidia card, so I had to get G-Sync. Not complaining though, because G-Sync is pretty good. It's the best display tech of the past 10 years or so. It completely removed a huge problem that people had accepted for so long.

 

Give Nvidia credit for doing it, because if they hadn't, no one would have bothered.

It's not really better; they do the same thing. And if you know a monitor's variable refresh range doesn't go up to its max refresh rate, then just don't buy that monitor.


49 minutes ago, CyanideInsanity said:

How is it better when they do the same thing? Aside from G-Sync always covering the panel's full range up to its max refresh rate, I don't believe the experience is any different.

 

And what about those laptops with VRR? Was that before G-Sync or after? I don't have time to scour the depths of the internet right now.

 

 

I think it is better in the sense that it handles low FPS better. FreeSync monitors usually have a very narrow range, like 50-75 FPS, while G-Sync goes well under 40 and well above 100.

Even though FreeSync is rated to handle a much wider range, very few monitors that carry it support the full range.


8 hours ago, Samfisher said:

Nope. Intel will license FreeSync from AMD (it's free anyway).

They wouldn't have to. They could create their own Xyzsync and it should work with the so-called FreeSync monitors on the market (at least the DisplayPort version; not sure about FreeSync over HDMI).


32 minutes ago, WereCat said:

I think it is better in the sense that it handles low FPS better. FreeSync monitors usually have a very narrow range, like 50-75 FPS, while G-Sync goes well under 40 and well above 100.

Even though FreeSync is rated to handle a much wider range, very few monitors that carry it support the full range.

Plenty of FreeSync monitors have a wider range than 50-75; I actually think the great majority do.


2 hours ago, WereCat said:

I think it is better in the sense that it handles low FPS better. FreeSync monitors usually have a very narrow range, like 50-75 FPS, while G-Sync goes well under 40 and well above 100.

Even though FreeSync is rated to handle a much wider range, very few monitors that carry it support the full range.

Not sure how long ago it was added (news to me as of this thread), but FreeSync now does the same thing as G-Sync at low frame rates: Low Framerate Compensation doubles the refresh rate of the display.

 

Meaning the only thing G-Sync has over the poorer implementations of FreeSync is support for the panel's full refresh range at all times. Honestly, an automatic price increase of $100 or so just for that isn't worth it, IMO.
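
For anyone wondering what LFC actually does under the hood, here's a minimal Python sketch (my own illustration of the idea, not AMD's driver code): when the frame rate drops below the panel's VRR floor, each frame is shown two or more times so the panel keeps refreshing inside its supported range. Real LFC also needs the panel's max refresh to be roughly 2.5x its minimum for this to work.

```python
def lfc_refresh_rate(fps, vrr_min, vrr_max):
    """Toy model of Low Framerate Compensation.

    Below the VRR floor, each frame is scanned out 2x, 3x, ... so the
    panel still refreshes inside its supported [vrr_min, vrr_max] window.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)    # normal VRR: refresh tracks frame rate
    repeats = 2
    while fps * repeats < vrr_min:  # smallest multiplier that lands the
        repeats += 1                # refresh back inside the VRR window
    return min(fps * repeats, vrr_max)

print(lfc_refresh_rate(25, 40, 144))   # 50  -> each frame shown twice
print(lfc_refresh_rate(90, 40, 144))   # 90  -> inside the window, untouched
print(lfc_refresh_rate(200, 40, 144))  # 144 -> capped at the max refresh
```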


Nvidia's holding out against non-proprietary sync is probably hurting monitor sales, because there are probably other people who, like myself, are holding off on purchasing a new monitor until it's known where these standards are going and where they will be usable. I have an Nvidia GPU right now, but I'm not buying a G-Sync monitor because they're too expensive, and I don't want to invest in a closed implementation of a free technology.

 

If anything, I'm most likely to sell my Nvidia GPU and buy an AMD GPU when Vega releases, so that I can also buy a FreeSync monitor. And if Nvidia keeps G-Sync expensive and proprietary, then I might stay with AMD, because doing so will perpetually save me money and won't tie me to additional hardware that could cause further problems.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987

Link to comment
Share on other sites

Link to post
Share on other sites

Why is this still a discussion? Anyway, time to clear up a few misconceptions:

 

  • Both G-Sync and Adaptive Sync use a technology called Variable VBlank, where the monitor controller (scaler + TCON) holds an image until the next one is provided, instead of running a fixed refresh cycle (see the sketch after this list). This is standard in most monitors and was made as a power-saving feature for laptops. This is also why AMD first showed a proof of concept on a laptop monitor.
  • G-Sync replaces the entire monitor controller with Nvidia's own proprietary module, an FPGA plus RAM. Essentially a full computer. As a result, AMD made a proof of concept on said laptop to show they could do the same without an expensive proprietary module.
  • AMD then created the Adaptive Sync standard for VESA to implement. I'm sure they made the requirements more lenient to get the VESA members to vote it through.
  • Adaptive Sync is thus the hardware standard, by VESA, that monitor vendors have to support in their monitor controller (scaler/TCON/etc.). This has nothing to do with FreeSync.
  • FreeSync is the driver/software implementation/branding AMD has that utilizes Adaptive Sync. Intel and Nvidia can make their own software implementations to utilize Adaptive Sync. Heck, Nvidia can even call their software/driver implementation G-Sync if they want to: "G-Sync, now with Adaptive Sync support". Nvidia hates consumers and loves money, so they won't. Fuck you Nvidia.
  • Like all standards, you can implement them in a good way or a not-so-good way. AMD has zero influence on this. If a monitor vendor wants to use the FreeSync brand/logo, it has to work within a certain interval set by AMD. LG's monitors are good enough for this.
  • Adaptive Sync is royalty-free, like all other parts of the DisplayPort technology. If a company wants to use the FreeSync brand, that's free too, but they have to live up to AMD's requirements.
  • Yes, Adaptive Sync is a standard. Just like DisplayPort is a standard.
  • No, there's nothing a G-Sync monitor can do that a well-implemented Adaptive Sync monitor cannot do.
  • Yes, there are countless things an Adaptive Sync monitor can do that a G-Sync monitor cannot. The vendors have free rein. As a result, you're not going to see 10-bit/HDR/whatever G-Sync monitors anytime soon, until Nvidia releases a new module.
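
To make the Variable VBlank bullet concrete, here's a toy Python sketch (my own illustration, not anyone's actual firmware) of the scan-out decision: the controller stretches the VBlank and starts a refresh when a new frame arrives, but it can never refresh faster than the panel's max rate, and if no frame shows up before the minimum-rate deadline it repeats the old image (the judder that LFC exists to hide).

```python
def vrr_scanout(frame_times, hz_min=40.0, hz_max=144.0):
    """Toy model of Variable VBlank on a VRR panel.

    frame_times: timestamps (seconds) at which the GPU finishes frames.
    Returns the timestamps at which the panel starts a refresh.
    """
    t_min = 1.0 / hz_max   # shortest allowed gap between refreshes
    t_max = 1.0 / hz_min   # longest the VBlank can be stretched
    refreshes, last = [], 0.0
    for t in frame_times:
        while t - last > t_max:      # no new frame in time: the panel
            last += t_max            # has to repeat the old image
            refreshes.append(last)
        last = max(t, last + t_min)  # new frame: scan it out, but never
        refreshes.append(last)       # faster than the panel's max rate
    return refreshes

# Uneven frame pacing maps 1:1 onto refreshes -- no tearing, no judder:
print(vrr_scanout([0.013, 0.028, 0.045]))  # [0.013, 0.028, 0.045]
```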

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


On 10/14/2016 at 3:52 PM, CyanideInsanity said:

As an Nvidia AND G-Sync user, I see nothing about it that makes it worthwhile over FreeSync.

There is a tradeoff regarding input latency if you start to nitpick.

 

G-Sync has less latency at certain FPS (I believe around 45 FPS) while FreeSync has less latency in another range of FPS.

 

But I think it's nothing to write home about.

\\ QUIET AUDIO WORKSTATION //

5960X 3.7GHz @ 0.983V / ASUS X99-A USB3.1      

32 GB G.Skill Ripjaws 4 & 2667MHz @ 1.2V

AMD R9 Fury X

256GB SM961 + 1TB Samsung 850 Evo  

Cooler Master Silencio 652S (soon Calyos NSG S0 ^^)              

Noctua NH-D15 / 3x NF-S12A                 

Seasonic PRIME Titanium 750W        

Logitech G810 Orion Spectrum / Logitech G900

2x Samsung S24E650BW 16:10  / Adam A7X / Fractal Axe Fx 2 Mark I

Windows 7 Ultimate

 

4K GAMING/EMULATION RIG

Xeon X5670 4.2Ghz (200BCLK) @ ~1.38V / Asus P6X58D Premium

12GB Corsair Vengeance 1600Mhz

Gainward GTX 1080 Golden Sample

Intel 535 Series 240 GB + San Disk SSD Plus 512GB

Corsair Crystal 570X

Noctua NH-S12 

Be Quiet Dark Rock 11 650W

Logitech K830

Xbox One Wireless Controller

Logitech Z623 Speakers/Subwoofer

Windows 10 Pro


I never used to notice screen tearing. It never bothered me in the slightest.

 

Having used a PG348Q for half a year now, I definitely notice screen tearing when I use someone else's PC.

 

That said, screen tearing still doesn't bother me.

 

Bottom line: G-Sync isn't worth $200. Especially not once you start hitting 100+ Hz. At 144 Hz, screen tearing is pretty much unnoticeable unless you go looking for it.

MSI GTX 1080 Gaming X | ASUS PG348Q | i7 4790k @ 4.5Ghz | ASUS Sabertooth Z97 Mark S | 16GB RAM

Corsair RM750i + CableMod C Series | Corsair 750D | 1 TB Samsung 850 EVO SSD + 500GB Crucial MX200 SSD


5 hours ago, CyanideInsanity said:

Not sure how long ago it was added (news to me as of this thread), but FreeSync now does the same thing as G-Sync at low frame rates: Low Framerate Compensation doubles the refresh rate of the display.

 

Meaning the only thing G-Sync has over the poorer implementations of FreeSync is support for the panel's full refresh range at all times. Honestly, an automatic price increase of $100 or so just for that isn't worth it, IMO.

Also, it is not "only" $100. That's probably the manufacturer's cost, but most G-Sync panels sit in much higher price brackets, making the average price difference overall 300 or 400 bucks.

 

Of course you get a lot more features for those 300 to 400 bucks (higher resolutions, higher refresh rates, bigger monitor sizes, more curved displays, etc.), but it still means that Nvidia or the vendors just refuse to make more midrange monitors with G-Sync, keeping most of them high end, versus the FreeSync ones, which run the full range of prices from lower-end 1080p screens all the way to the monstrosity Linus reviewed recently.

-------

Current Rig

-------


On 13/10/2016 at 4:39 PM, N.tony said:

Nvidia's target audience - Nvidia claims they are aiming at the premium end and have no immediate plans of going down to mainstream, charging for the "better experience" as opposed to just the hardware module itself

Fuck off.

 

This is what I object to about NVIDIA. The amount of fucking elitism it takes to say "here's a technology that does the same thing as our competitor's technology, but we're going to restrict its availability to the high end and charge you more for it because there's jack-shit you can do about it."

 

This whole "holier-than-thou, better-than-you" thing that NVIDIA has been carrying on with recently is really pissing me off. Makes me kind of proud that I have an AMD card and a FreeSync display.

Project White Lightning (My ITX Gaming PC): Core i5-4690K | CRYORIG H5 Ultimate | ASUS Maximus VII Impact | HyperX Savage 2x8GB DDR3 | Samsung 850 EVO 250GB | WD Black 1TB | Sapphire RX 480 8GB NITRO+ OC | Phanteks Enthoo EVOLV ITX | Corsair AX760 | LG 29UM67 | CM Storm Quickfire Ultimate | Logitech G502 Proteus Spectrum | HyperX Cloud II | Logitech Z333

Benchmark Results: 3DMark Firestrike: 10,528 | SteamVR VR Ready (avg. quality 7.1) | VRMark 7,004 (VR Ready)

 

Other systems I've built:

Core i3-6100 | CM Hyper 212 EVO | MSI H110M ECO | Corsair Vengeance LPX 1x8GB DDR4  | ADATA SP550 120GB | Seagate 500GB | EVGA ACX 2.0 GTX 1050 Ti | Fractal Design Core 1500 | Corsair CX450M

Core i5-4590 | Intel Stock Cooler | Gigabyte GA-H97N-WIFI | HyperX Savage 2x4GB DDR3 | Seagate 500GB | Intel Integrated HD Graphics | Fractal Design Arc Mini R2 | be quiet! Pure Power L8 350W

 

I am not a professional. I am not an expert. I am just a smartass. Don't try and blame me if you break something when acting upon my advice.

...why are you still reading this?


4 hours ago, Vode said:

There is a tradeoff regarding input latency if you start to nitpick.

 

G-Sync has less latency at certain FPS (I believe around 45 FPS) while FreeSync has less latency in another range of FPS.

 

But I think it's nothing to write home about.

If you want to nitpick, that's not strictly true.

 

That's true with certain implementations of Adaptive Sync but not all of them. It has to do with how the monitor scaler handles things, not with the standards themselves.

 

G-Sync is pretty similar across the board, since all G-Sync monitors use the same scaler. FreeSync has more variation, since the scaler is left up to the individual implementation.


On 10/16/2016 at 5:50 AM, KOMTechAndGaming said:

I meant they will make less money 

I know what you meant and I disagree. The best way to make money is to sell more. Not selling would make less money. They need to do something that will allow monitor manufacturers to put their own tech into the module and design it how they need it to be.

 

The solution is to give the designs to the manufacturers to make their own modules. It allows the manufacturers to be much more flexible and cost-efficient.

 

We know for a fact that ARM makes a LOT of money selling their processor designs to manufacturers, so I don't understand your logic that this would make Nvidia less money.


58 minutes ago, TheCMan said:

I know what you meant and I disagree. The best way to make money is to sell more. Not selling would make less money. They need to do something that will allow monitor manufacturers to put their own tech into the module and design it how they need it to be.

 

The solution is to give the designs to the manufacturers to make their own modules. It allows the manufacturers to be much more flexible and cost-efficient.

 

We know for a fact that ARM makes a LOT of money selling their processor designs to manufacturers, so I don't understand your logic that this would make Nvidia less money.

Oh yeah, I may have misunderstood what you said.

Edited by KOMTechAndGaming
Autocorrect

CPU: Intel i9-9900K 5.0GHz at 1.36v  | Cooling: Custom Loop | MOTHERBOARD: ASUS ROG Z370 Maximus X Hero | RAM: CORSAIR 32GB DDR4-3200 VENGEANCE PRO RGB  | GPU: Nvidia RTX 2080Ti | PSU: CORSAIR RM850X + Cablemod modflex white cables | BOOT DRIVE: 250GB SSD Samsung 850 evo | STORAGE: 7.75TB | CASE: Fractal Design Define R6 Blackout | Display: SAMSUNG OLED 34 UW | Keyboard: HyperX Alloy elite RGB |  Mouse: Corsair M65 PRO RGB | OS: Windows 10 Pro | Phone: iPhone 11 Pro Max 256GB

 


14 hours ago, ThinkWithPortals said:

Fuck off.

 

This is what I object to about NVIDIA. The amount of fucking elitism it takes to say "here's a technology that does the same thing as our competitor's technology, but we're going to restrict its availability to the high end and charge you more for it because there's jack-shit you can do about it."

 

This whole "holier-than-thou, better-than-you" thing that NVIDIA has been carrying on with recently is really pissing me off. Makes me kind of proud that I have an AMD card and a FreeSync display.

This, and GameWorks, were the combined reasons I recycled my GTX 760 in three different places and bought a 290X.

They're Apple. No, they're worse than Apple.

 

Also stoked that the FreeSync monitor I got (Acer XG270HU) ended up measuring the lowest combined input lag (tied with the BenQ XL2730Z) in the 1440p/144Hz sweet spot among both G-Sync and FreeSync monitors. And it goes all the way down to 40Hz if needed.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

