
The reasons there are so few G-Sync monitors compared to FreeSync

I used both daily at 75 Hz for a while and there was no noticeable difference; can't speak for anything above that, though.

i7 8086k @ 5.3Ghz / 32GB DDR4 Trident Z RGB @ 3733Mhz / Aorus GTX 1080 11Gbps / PG348Q


27 minutes ago, Shakaza said:

Well, the problem is that you can't seem to find a reason why G-Sync is superior to FreeSync. All you're saying in your defense is that you own a G-Sync display and therefore know all about its benefits over FreeSync, and that others don't know because all they have is Google. Anecdotal evidence isn't proof, and bias and the placebo effect are very much problems when an individual makes claims such as these. If that is your only defense for why you know what you're talking about, please reconsider making stupid arguments. 

 

But, I'll let @Notional hammer it home. He's much more versed in this subject than me. I'm just the logic guy.

Who is defending? We've talked about G-Sync a lot on LTT. Search for it if you want to learn something. AMD users bitching about G-Sync, what a surprise? But that's OK with me. Have fun doing so.

CPU i7 6700k MB  MSI Z170A Pro Carbon GPU Zotac GTX980Ti amp!extreme RAM 16GB DDR4 Corsair Vengeance 3k CASE Corsair 760T PSU Corsair RM750i MOUSE Logitech G9x KB Logitech G910 HS Sennheiser GSP 500 SC Asus Xonar 7.1 MONITOR Acer Predator xb270hu Storage 1x1TB + 2x500GB Samsung 7200U/m - 2x500GB SSD Samsung 850EVO


3 minutes ago, Praesi said:

Who is defending? We've talked about G-Sync a lot on LTT. Search for it if you want to learn something. AMD users bitching about G-Sync, what a surprise?

Umm... I'm an Nvidia user (Titan X Pascal in my main rig) and have two G-Sync monitors, and I think FreeSync is superior, so I don't think your argument has any basis in reality.


10 hours ago, Praesi said:

Who is defending? We've talked about G-Sync a lot on LTT. Search for it if you want to learn something. AMD users bitching about G-Sync, what a surprise? But that's OK with me. Have fun doing so.

As an Nvidia AND G-Sync user, I see nothing about it that makes it worthwhile over FreeSync.


17 minutes ago, Sniperfox47 said:

Umm... I'm an Nvidia user (Titan X Pascal in my main rig) and have two G-Sync monitors, and I think FreeSync is superior, so I don't think your argument has any basis in reality.

Yeah. I believe that one.



To me, open-source technology is the best. This is why I prefer AMD's current strategy.


1 hour ago, bigneo said:

"...G-Sync works to whatever the max refresh rate of the panel is."

 

This still depends on the card, though, no?

If so, why are we comparing FreeSync and G-Sync performance if they depend on different hardware?

G-Sync is a closed Nvidia standard, so comparing it to another closed standard would be pointless. While FreeSync *is* proprietary, it's a proprietary implementation of an open industry standard (Adaptive-Sync). Intel also has an Adaptive-Sync implementation, and it works on all the same monitors FreeSync does.

 

Nvidia, on the other hand, currently uses G-Sync, but has the option of doing its own Adaptive-Sync implementation, which would have exactly the same features as FreeSync. So comparing FreeSync to G-Sync is useful not as a hardware comparison between the two, but rather as a profit/loss comparison between G-Sync and a potential Nvidia Adaptive-Sync implementation.

 

For the most part:

 

For Nvidia) it's most beneficial to force users into G-Sync, because they haven't gotten much backlash about it, it makes them a boatload of money, and it locks users into their ecosystem.

 

For end users) it would be most beneficial to be allowed to use Adaptive-Sync monitors with Nvidia hardware, since it would give them more hardware options. Note that while Adaptive-Sync and G-Sync are mutually exclusive on the monitor side, there's no reason you couldn't implement both on the GPU side to support both kinds of monitors.
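That last point can be put as a toy compatibility table (the names here are illustrative only, not any real driver API): a monitor speaks exactly one VRR protocol, while a GPU can implement several, so GPU-side Adaptive-Sync support only ever adds options.

```python
# Toy model of VRR compatibility; names are illustrative, not a real API.
def can_do_vrr(gpu_protocols, monitor_protocol):
    """Variable refresh works iff the GPU implements the monitor's protocol."""
    return monitor_protocol in gpu_protocols

nvidia_today = {"gsync"}
nvidia_dual = {"gsync", "adaptive-sync"}  # hypothetical dual implementation

# Today an Nvidia card drives G-Sync monitors only:
assert can_do_vrr(nvidia_today, "gsync")
assert not can_do_vrr(nvidia_today, "adaptive-sync")
# A GPU-side Adaptive-Sync implementation loses nothing and gains monitors:
assert can_do_vrr(nvidia_dual, "gsync")
assert can_do_vrr(nvidia_dual, "adaptive-sync")
```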


13 hours ago, Praesi said:

You guys should experience G-Sync for yourselves before judging it.

We are too poor for that shit.

(⌐■_■) 


15 hours ago, Stefan1024 said:

It's very simple:

I don't buy G-Sync monitors at all; wait a few years for FreeSync to spread through the market and then Nvidia will have to support it. But if we keep buying G-Sync monitors, Nvidia will abuse its market share and refuse to implement FreeSync (it's probably not that big an effort, and there are no HW limitations as far as I know).

And therein lies the problem: most people buying adaptive-sync monitors are still not low- or mid-range customers, AMD doesn't even have a high-end offering, and Vega will likely miss 2016 (and the most lucrative end-of-year quarter) entirely.

 

So how is AMD ever going to pick up momentum? I know most sales come from the low and mid range, but without a high-end offering people just don't pay as much attention, for the same reason most car manufacturers have bought up a "deluxe" or "sports" car maker: it brings prestige and notoriety, and ultimately drives customers even into the lower-tier cars that have none of the engineering from the high-end ones in them.

 

10 hours ago, Sniperfox47 said:

For Nvidia) it's most beneficial to force users into G-Sync, because they haven't gotten much backlash about it, it makes them a boatload of money, and it locks users into their ecosystem.

But for the most part, it doesn't. I am in the market for a monitor and I wouldn't consider any of the G-Sync monitors because they're far too expensive. I am intentionally waiting for Nvidia to give up and support Adaptive-Sync; otherwise I might just skip the fancy refresh-rate thing entirely.

 

Far from this being just anecdotal on my part, I don't think G-Sync sales have really suggested otherwise. Both standards are basically old news at this point: Nvidia because they refuse to sell cheaper G-Sync panels, and AMD because they just don't have the market share and probably won't for a long while.

-------

Current Rig

-------


I thought this had been known for more than a year now...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


1 hour ago, CyanideInsanity said:

Then please feel free to enlighten us on how G-Sync is superior to FreeSync. Because all you've said is that people should try it for themselves (even though G-Sync and FreeSync accomplish the same thing) and that, apparently, high price = superior product.

The standard itself isn't. G-Sync is just supported by more cards: cards that are overall faster (above the 1070 there's no competition whatsoever from AMD, and it will remain so for all of 2016) and cards that are generally better for the money in most games (no, DX12 games are not "most games" yet, and even then you've got games like GoW 4 where the 480 still loses even with async enabled), not to mention far better driver support.



4 minutes ago, Misanthrope said:

But for the most part, it doesn't. I am in the market for a monitor and I wouldn't consider any of the G-Sync monitors because they're far too expensive. I am intentionally waiting for Nvidia to give up and support Adaptive-Sync; otherwise I might just skip the fancy refresh-rate thing entirely.

Far from this being just anecdotal on my part, I don't think G-Sync sales have really suggested otherwise. Both standards are basically old news at this point: Nvidia because they refuse to sell cheaper G-Sync panels, and AMD because they just don't have the market share and probably won't for a long while.

Intel supports (or will soon support) Adaptive-Sync as far as I know, which should definitely help fill the market-share gap, if not completely turn the tables on this particular thing.



4 minutes ago, Misanthrope said:

But for the most part, it doesn't. I am in the market for a monitor and I wouldn't consider any of the G-Sync monitors because they're far too expensive.

The issue is, IF you have gotten a G-Sync monitor, you are very likely to only get Nvidia GPUs going forward. You've paid a price premium for a feature that you don't want to lose out on, even if AMD's offering at the time is better. This is the problem, from the consumer side, with vendor lock-in: you end up with high switching costs to other solutions with similar features. In this case you would have to replace your monitor as well.
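The switching-cost argument is just arithmetic, and it can be sketched with made-up numbers (these prices are purely illustrative, not real market data):

```python
# Back-of-envelope switching cost under vendor lock-in.
# All prices are hypothetical round numbers for illustration.
gpu_price = 400          # a competing vendor's GPU
adaptive_monitor = 500   # an equivalent Adaptive-Sync display

# Staying in the G-Sync ecosystem: the next upgrade is just a GPU.
cost_stay = gpu_price
# Switching GPU vendors without losing VRR: GPU plus a replacement monitor.
cost_switch = gpu_price + adaptive_monitor

# The lock-in premium is the monitor you are forced to re-buy.
lock_in_premium = cost_switch - cost_stay
assert lock_in_premium == adaptive_monitor
```

Whatever the real numbers, the structure is the same: the monitor term appears only on the switching side, which is exactly why lock-in works.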

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


2 minutes ago, Sauron said:

Intel supports (or will soon support) Adaptive-Sync as far as I know, which should definitely help fill the market-share gap, if not completely turn the tables on this particular thing.

Intel is a fucking joke for gaming. They're far below the level where a gaming-centric feature is useful, so they're basically a non-factor until they can jam 460 levels of performance into their iGPU, which is still too far away. Even if they did, they have the worst drivers of the bunch anyway, so no, serious gaming on Intel just isn't viable, and not enough for a serious gaming feature like adaptive sync.



1 minute ago, Misanthrope said:

The standard itself isn't. G-Sync is just supported by more cards: cards that are overall faster (above the 1070 there's no competition whatsoever from AMD, and it will remain so for all of 2016) and cards that are generally better for the money in most games (no, DX12 games are not "most games" yet, and even then you've got games like GoW 4 where the 480 still loses even with async enabled), not to mention far better driver support.

Well yeah, that's my point. Mr. Nvidia fanboy has nothing to back up his claims that G-Sync is superior, and thinks "search for yourself" proves his point.


6 minutes ago, Notional said:

The issue is, IF you have gotten a G-Sync monitor, you are very likely to only get Nvidia GPUs going forward. You've paid a price premium for a feature that you don't want to lose out on, even if AMD's offering at the time is better. This is the problem, from the consumer side, with vendor lock-in: you end up with high switching costs to other solutions with similar features. In this case you would have to replace your monitor as well.

OK, fair enough, but let's follow the more common scenario: IF you have gotten a G-Sync monitor, on average you paid a grand for it. Most consumers spending a grand on a monitor probably have very high-end rigs that cost at least 2 grand, likely even more. So we're talking people with dual 1080s or even Titan Xpees right now.

 

What does AMD sell that can compete with that? Fuck all: no enthusiast is going to fit his rig with 4-way CrossFire 480s instead of dual 1080s or something along those lines. And it's been like this for far too long: Fiji was a bust and Vega took too long, so it's what, 2 or 3 years since AMD offered anything truly high end that can actually compete? The 295X2 was the last time they showed up strongly at that level, but the card wasn't the easiest to fit in a rig due to its power usage.

 

Now, I am ready to accept that if Vega knocks it out of the park then this entire point is null and void, so I can give you that.



7 minutes ago, Misanthrope said:

OK, fair enough, but let's follow the more common scenario: IF you have gotten a G-Sync monitor, on average you paid a grand for it. Most consumers spending a grand on a monitor probably have very high-end rigs that cost at least 2 grand, likely even more. So we're talking people with dual 1080s or even Titan Xpees right now.

What does AMD sell that can compete with that? Fuck all: no enthusiast is going to fit his rig with 4-way CrossFire 480s instead of dual 1080s or something along those lines. And it's been like this for far too long: Fiji was a bust and Vega took too long, so it's what, 2 or 3 years since AMD offered anything truly high end that can actually compete? The 295X2 was the last time they showed up strongly at that level, but the card wasn't the easiest to fit in a rig due to its power usage.

Now, I am ready to accept that if Vega knocks it out of the park then this entire point is null and void, so I can give you that.

Only the 34" ultrawides are in that price bracket, but yes, G-Sync monitors belong to the higher-end market. Right now it's true that only Nvidia has the high-end cards of the generation, but that was not the case with the 900 series and Fury, and it won't be the case next year with Vega vs Pascal v2/Volta. So I'm not sure it's a huge issue. You see people with (x)x60/(x)x70 cards buying the cheaper G-Sync monitors too, so it is very much a feature that spans the market.

 

Thing is, most people replace their monitors a lot less often than their GPUs, so buying a monitor to match an old card might not be the best idea, especially if it causes vendor lock-in. At the end of the day, this debacle is all Nvidia's fault, and based on the article, Nvidia does not seem to give a frack about consumers being in a much worse position, even after they shell out a huge premium.

 

Let's just hope, for all our sakes, that AMD gets competitive in the high end again soon. After all, with HDMI supporting FreeSync, and TVs in the future supporting it too, what would be the best thing for a GPU to support?



23 minutes ago, Misanthrope said:

Intel is a fucking joke for gaming. They're far below the level where a gaming-centric feature is useful, so they're basically a non-factor until they can jam 460 levels of performance into their iGPU, which is still too far away. Even if they did, they have the worst drivers of the bunch anyway, so no, serious gaming on Intel just isn't viable, and not enough for a serious gaming feature like adaptive sync.

Of course, but if either of these standards is going to become commonplace, it's going to be Adaptive-Sync, simply because it will work with a higher number of devices.



11 minutes ago, Misanthrope said:

-snip-

The FreeSync standard can still be there whether it is used or not. It is already known that implementing a G-Sync module adds quite a bit of cost, so it wouldn't be surprising if every new monitor (high-end or low-end) ended up with FreeSync in the future. While Nvidia cards can't use FreeSync, the monitor itself will still work. On the other hand, I think a G-Sync monitor's variable refresh just won't work if the signal is not coming from an Nvidia card.

I do agree that AMD is not really competing at all on the high-end GPU side. Hopefully, when FreeSync becomes more widely available, Nvidia will decide to support the standard. I doubt they would lose many G-Sync monitor sales if they just advertised that the true GeForce experience is on a G-Sync monitor or something. It is like you said: people buying high-end GPU configurations will likely still throw more money at Nvidia's G-Sync monitors, while the people not willing to get those monitors will still be able to get a "similar" experience.


8 minutes ago, dragosudeki said:

The FreeSync standard can still be there whether it is used or not. It is already known that implementing a G-Sync module adds quite a bit of cost

It can cost more, but it doesn't really need to: I remember Wendell (the Neckbeard formerly known for Tek Syndicate) took a "G-Sync" laptop screen and ran it as vanilla Adaptive-Sync. The conclusion was that while Nvidia does more with the module, if you remove the DRM it could actually do normal adaptive sync, within a more limited range of about 40-75 Hz, but still.



On 10/13/2016 at 9:52 PM, Samfisher said:

FreeSync is built into an OPTIONAL VESA standard.  You do NOT need FreeSync support to be VESA certified.

 

IIRC FreeSync has a much more limited FPS range where it will work flawlessly, whereas G-Sync works up to whatever the max refresh rate of the panel is.

9-240 Hz is the FreeSync range. So G-Sync has a 0-100,000 Hz range?


3 hours ago, spartaman64 said:

9-240 Hz is the FreeSync range. So G-Sync has a 0-100,000 Hz range?

Actual FreeSync implementations don't cover that whole range, though, so he has a point. But like I stated, you can just get a monitor with a proper range.



11 minutes ago, Notional said:

Actual FreeSync implementations don't cover that whole range, though, so he has a point. But like I stated, you can just get a monitor with a proper range.

That's not the fault of FreeSync; the monitor manufacturer can make it 9-240 Hz.

 


Just now, spartaman64 said:

That's not the fault of FreeSync; the monitor manufacturer can make it 9-240 Hz.

 

No LCD panels in existence can go below ~30 Hz, so no, they cannot. The same panels don't natively go above 165 Hz either (if even that). The few that overclock beyond that do so at the expense of colour accuracy.
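For what it's worth, this panel floor is what AMD's Low Framerate Compensation (LFC) works around: when the frame rate falls below the panel's minimum, the driver sends each frame more than once so the effective refresh lands back inside the VRR window. This only works when the window is wide enough (roughly max at least 2x min); the exact heuristics are the driver's, so treat this as a rough sketch:

```python
# Rough sketch of low-framerate compensation (LFC) for a VRR panel.
# effective_refresh is a hypothetical helper name, not a real driver API.
def effective_refresh(fps, vrr_min, vrr_max):
    """Repeat each frame enough times to land inside the panel's VRR window.

    Returns the refresh rate actually driven, or None if LFC can't help
    (fps above the window, or the window is too narrow to multiply into).
    """
    if vrr_min <= fps <= vrr_max:
        return fps                      # native VRR, no trickery needed
    if fps < vrr_min:
        multiplier = 2
        while fps * multiplier < vrr_min:
            multiplier += 1
        refresh = fps * multiplier
        if refresh <= vrr_max:
            return refresh              # e.g. 25 fps on a 40-75 Hz panel -> 50 Hz
    return None

assert effective_refresh(25, 40, 75) == 50   # frame-doubled into range
assert effective_refresh(60, 40, 75) == 60   # already in range
assert effective_refresh(35, 48, 75) == 70   # doubled above the 48 Hz floor
```

This is also why a narrow 48-75 Hz FreeSync window is a real complaint: below 37-38 fps, doubling overshoots 75 Hz and the sketch returns None, i.e. you fall back to v-sync or tearing.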



3 hours ago, spartaman64 said:

That's not the fault of FreeSync; the monitor manufacturer can make it 9-240 Hz.

 

It's the fault of the standard that AMD is pushing that there are no minimum performance figures manufacturers have to adhere to. It's a lousy standard if your experience differs from monitor to monitor. Hardly a standard, is it?

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.

