Why don't we have a G-Sync-to-Freesync converter yet?

JT_NC

Given that G-Sync monitors use a proprietary Nvidia controller, the two techs just can't coexist in the same monitor, and the G-Sync controller adds a few hundred dollars to the monitor price.  We all know that.

 

So why hasn't Nvidia (or one of its licensed monitor manufacturers) built an intermediate solution?  Something that sits between the DisplayPort-out of the video card and the DisplayPort-in on the monitor.

 

It could be an external black box with 2 DisplayPort and power connectors (maybe even USB-powered).

 

Or it could be an internal card, like the good old 3dfx Voodoo (if anyone here is old enough to remember the late 90s).  That was one of the first dedicated 3D graphics cards, and it only worked via pass-through.  You still needed a normal video card too.  When you weren't running a 3dfx-enabled program, the signal from the other graphics card would simply pass right through to the monitor.  The VGA-out of the other video card was connected to a VGA-in on the Voodoo card via a really short cable, and then it used its own VGA-out to connect to the monitor.

 

So why can't we get something like that, in which a G-sync card "thinks" it's talking to a G-sync-enabled monitor, but in reality it's talking to another card (or black box) which converts the output to work with a Freesync monitor?

 

Obviously we'd need Nvidia to build (or license) the hardware.  I'm sure they wouldn't mind the opportunity to make more money.  Freesync monitors are so much cheaper and more plentiful that Nvidia isn't gaining market share there.  So why not double-dip?  They don't put Freesync tech into their G-sync cards, so why don't they sell us more cards?

 

Would there be an unacceptable amount of increased latency?  (Even so, the bottleneck would be on the computer side rather than the monitor side.  Plus, a 144 Hz monitor running at 100 Hz still beats 60 Hz.)

Would it require Nvidia and AMD to work together in order to merge their proprietary technologies?  (in which case it'll never happen, but it seems like something Nvidia might be able to do on its own)

Would anyone other than gaming enthusiasts even be interested?  (if not...  marketing!)
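On the latency question above, the raw frame-time arithmetic is easy to check.  This is just arithmetic, not a claim about what an actual converter would add (a real device would have to buffer at least part of a frame, which is a separate cost):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes at a given rate, in milliseconds."""
    return 1000.0 / refresh_hz

# A 144 Hz panel refreshes every ~6.9 ms, a 100 Hz one every 10 ms,
# and a 60 Hz one every ~16.7 ms -- so even a converter that cost you
# ~3 ms per frame (144 Hz pacing degraded to effective 100 Hz) would
# still be well ahead of a fixed 60 Hz monitor.
for hz in (144, 100, 60):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms/frame")
```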

 

On the flip side, Nvidia could probably make a card which pretends to be a Freesync-enabled monitor, and then outputs compatible signals to a G-sync monitor controller pretty easily.  There was even a time when Nvidia sold a G-sync mod kit for a particular monitor.  (Linus did a video of that.)

 

Anyway, this whole "G-Sync vs Freesync hardware limitation" bugs me.  We have adapters and converters for almost everything these days, but not this.  I had high hopes for Vega, especially since I got a 4K Freesync monitor last year, but Vega's price and performance fell short of my expectations, so I went with a 1080 Ti instead.

 

Thanks,

JT

That's an incredible post for a first post!

 

I have very little technical knowledge, but I have a feeling that, at the very least, it would be difficult to avoid input latency, since the device would have to take one stream and re-pace it into a different one.

Sig under construction.

Because NVIDIA doesn't want to. Easy.

PSU Nerd | PC Parts Flipper | Cable Management Guru

Helpful Links: PSU Tier List | Why not group reg? | Avoid the EVGA G3

Helios EVO (Main Desktop) Intel Core™ i9-10900KF | 32GB DDR4-3000 | GIGABYTE Z590 AORUS ELITE | GeForce RTX 3060 Ti | NZXT H510 | EVGA G5 650W

 

Delta (Laptop) | Galaxy S21 Ultra | Pacific Spirit XT (Server)

Full Specs

 

Helios EVO (Main):

Intel Core™ i9-10900KF | 32GB G.Skill Ripjaws V / Team T-Force DDR4-3000 | GIGABYTE Z590 AORUS ELITE | MSI GAMING X GeForce RTX 3060 Ti 8GB GPU | NZXT H510 | EVGA G5 650W | MasterLiquid ML240L | 2x 2TB HDD | 256GB SX6000 Pro SSD | 3x Corsair SP120 RGB | Fractal Design Venturi HF-14

 

Pacific Spirit XT - Server

Intel Core™ i7-8700K (Won at LTX, signed by Dennis) | GIGABYTE Z370 AORUS GAMING 5 | 16GB Team Vulcan DDR4-3000 | Intel UrfpsgonHD 630 | Define C TG | Corsair CX450M

 

Delta - Laptop

ASUS TUF Dash F15 - Intel Core™ i7-11370H | 16GB DDR4 | RTX 3060 | 500GB NVMe SSD | 200W Brick | 65W USB-PD Charger

 


 

Intel is bringing DDR4 to the mainstream with the Intel® Core™ i5 6600K and i7 6700K processors. Learn more by clicking the link in the description below.

Why doesn't GM make their autos with adapters to install Ford engines?

Best Excuses:

        #1(simple) "Well, I never liked that stupid thing anyway!"

        #2(complex) "Obviously there was a flaw in the material, probably due to the inadvertent introduction of contaminants during the manufacturing process."

Just now, MadModder said:

Why doesn't GM make their autos with adapters to install Ford engines?

lmao, nice

 

Ryzen 9 3950x - 64 GB DDR4 - NVME 980 pro SSD - EVGA RTX 3080 FTW Ultra - FAD CASE

Full custom loop / links below out of date

LTT Build Log | PCPP Build Log

_____________________________________________________________________________________________

Sorry if I stop responding, I've probably gotten busy as I mostly am only on here while working.

_____________________________________________________________________________________________

Quote

Obviously we'd need Nvidia to build (or license) the hardware.  I'm sure they wouldn't mind the opportunity to make more money

And there is the flaw in your logic.  I wouldn't be surprised if Nvidia could make their cards work with Freesync with just a driver update, but they are not interested.
There's a reason why they don't want to support FreeSync at all.  It may not make sense to you or me, it may not even make business sense at all, but to someone in power it does, and unfortunately they are in control.
 

@MadModder There is a slew of companies that build adapters to do precisely that.  So your lame answer only validates OP's statements.  Seems to me a kick-ass company like BenQ could build a monitor with both technologies and the ability to detect and apply the correct one.

Black Knight-

Ryzen 5 5600, GIGABYTE B550M DS3H, 16Gb Corsair Vengeance LPX 3000mhz, Asrock RX 6800 XT Phantom Gaming,

Seasonic Focus GM 750, Samsung EVO 860 EVO SSD M.2, Intel 660p Series M.2 2280 1TB PCIe NVMe, Linux Mint 20.2 Cinnamon

 

Daughter's Rig;

MSI B450 A Pro, Ryzen 5 3600x, 16GB Corsair Vengeance LPX 3000mhz, Silicon Power A55 512GB SSD, Gigabyte RX 5700 Gaming OC, Corsair CX430

Well, even if Nvidia doesn't want to do it themselves, I understand there's a certain country on this planet of ours where patents and copyrights don't hold much (if any) sway, which in turn provides us with all sorts of cheap knockoffs and useful gadgets that we might not have otherwise.  With that being the case, it has me wondering if the reason why we haven't seen something turn up yet might be due to a technical reason rather than a business one.

 

When unofficial (and possibly illegal) tech gains enough traction, a lot of companies tend to adopt it into their portfolio.  That hasn't happened in this case.

 

I tend to adopt the "why can't we all just get along" philosophy.  Given that Nvidia recently made a tweet about AMD's CPUs working great with Nvidia's GPUs, maybe there's hope for something official... someday.

 

Thanks for the replies so far everyone.

23 minutes ago, JT_NC said:

Given that G-sync monitors use a proprietary Nvidia controller, the two techs just can't exist in the same monitor, and having the G-sync controller adds a few hundred $$$ to the monitor price.  We all know that.

 

So why hasn't Nvidia (or one of its licensed monitor manufacturers) built an intermediate solution?  Something that sits between the DisplayPort-out of the video card and the DisplayPort-in on the monitor.

 

It could be an external black box with 2 DisplayPort and power connectors (maybe even USB-powered).

 

Or it could be an internal card, like the good old 3dfx Voodoo (if anyone here is old enough to remember the late 90s).  That was one of the first dedicated 3D graphics cards, and it only worked via pass-through.  You still needed a normal video card too.  When you weren't running a 3dfx-enabled program, the signal from the other graphics card would simply pass right through to the monitor.  The VGA-out of the other video card was connected to a VGA-in on the Voodoo card via a really short cable, and then it used its own VGA-out to connect to the monitor.

 

So why can't we get something like that, in which a G-sync card "thinks" it's talking to a G-sync-enabled monitor, but in reality it's talking to another card (or black box) which converts the output to work with a Freesync monitor?

 

Obviously we'd need Nvidia to build (or license) the hardware.  I'm sure they wouldn't mind the opportunity to make more money.  Freesync monitors are so much cheaper and prolific that they're not gaining market share there.  So why not double-dip?  They don't put Freesync tech into their G-sync cards, so why don't they sell us more cards?

 

Would there be an unacceptable amount of increased latency?  (Even so, the bottleneck would be on the computer side rather than the monitor side.  Plus, a 144 Hz monitor running at 100 Hz still beats 60 Hz.)

Would it require Nvidia and AMD to work together in order to merge their proprietary technologies?  (in which case it'll never happen, but it seems like something Nvidia might be able to do on its own)

Would anyone other than gaming enthusiasts even be interested?  (if not...  marketing!)

 

On the flip side, Nvidia could probably make a card which pretends to be a Freesync-enabled monitor, and then outputs compatible signals to a G-sync monitor controller pretty easily.  There was even a time when Nvidia sold a G-sync mod kit for a particular monitor.  (Linus did a video of that.)

 

Anyway, this whole "G-Sync vs Freesync hardware limitation" bugs me.  We have adapters and converters for almost everything these days, but not this.  I had high hopes for Vega, especially since I got a 4K Freesync monitor last year, but Vega's price and performance fell short of my expectations, so I went with a 1080 Ti instead.

 

Thanks,

JT

I agree with you on this.

 

I mean, one example of two competing technologies coexisting in this sector is most enthusiast motherboard chipsets supporting both SLI and Crossfire.

 

I don't believe there should even be latency outside of the initial handshake between the graphics card and the monitor when enabling adaptive sync.

 

Think about it: it's just a simple else-if in the monitor's firmware to determine how to handle the video card's connection request over DisplayPort.
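That dispatch really can be sketched in a couple of branches.  Everything below is hypothetical: the names, the capability flags, and the idea that scaler firmware is structured this way are my own illustration, not anything from an actual monitor's source.

```python
from enum import Enum, auto

class SyncMode(Enum):
    GSYNC = auto()          # proprietary module-driven variable refresh
    ADAPTIVE_SYNC = auto()  # VESA Adaptive-Sync / FreeSync
    FIXED = auto()          # plain fixed-refresh fallback

def select_sync_mode(source_wants_gsync: bool,
                     source_wants_adaptive: bool,
                     has_gsync_module: bool,
                     supports_adaptive: bool) -> SyncMode:
    """Toy model of the if/else-if a hybrid monitor could run at handshake time."""
    if source_wants_gsync and has_gsync_module:
        return SyncMode.GSYNC
    elif source_wants_adaptive and supports_adaptive:
        return SyncMode.ADAPTIVE_SYNC
    else:
        return SyncMode.FIXED
```

The branching itself is trivial; the hard (and license-encumbered) part is what each branch would have to drive underneath.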

 

I imagine there may be license limitations which prevent manufacturers from having both.

Desktop:

AMD Ryzen 7 @ 3.9ghz 1.35v w/ Noctua NH-D15 SE AM4 Edition

ASUS STRIX X370-F GAMING Motherboard

ASUS STRIX Radeon RX 5700XT

Corsair Vengeance LPX 16GB (2x 8GB) DDR4 3200

Samsung 960 EVO 500GB NVME

2x4TB Seagate Barracuda HDDs

Corsair RM850X

Be Quiet Silent Base 800

Elgato HD60 Pro

Sceptre C305B-200UN Ultra Wide 2560x1080 200hz Monitor

Logitech G910 Orion Spectrum Keyboard

Logitech G903 Mouse

Oculus Rift CV1 w/ 3 Sensors + Earphones

 

Laptop:

Acer Nitro 5:

Intel Core I5-8300H

Crucial Ballistix Sport LT 16GB (2x 8GB) DDR4 2666

Geforce GTX 1050ti 4GB

Intel 600p 256GB NVME

Seagate Firecuda 2TB SSHD

Logitech G502 Proteus Spectrum

 

 

@asand1 Yes, 3rd-party manufacturers do make kits to put Ford engines into GM cars. Using those kits automatically voids the warranties/support from both Ford and GM. It has to do with patent protection and competition.

I'm sure a company like BenQ could make a monitor that supports both technologies if they were willing to pay the licensing fees and the cost of engineering and tooling for such a hybrid. How much would that cost the end user? A lot more than anybody would be willing to pay for that convenience of options.


1 hour ago, Sypran said:

And there is the flaw in your logic.  I wouldn't be surprised if Nvidia could make their cards work with Freesync with just a driver update, but they are not interested.
There's a reason why they don't want to support FreeSync at all.  It may not make sense to you or me, it may not even make business sense at all, but to someone in power it does, and unfortunately they are in control.
 

This. Freesync is built into the DisplayPort design, so it would be very easy to activate. There's no reason to disable it besides the fact that they want to, in order to push G-Sync.

System specs:

4790k

GTX 1050

16GB DDR3

Samsung evo SSD

a few HDD's

1 hour ago, MadModder said:

@asand1 Yes 3rd party manufacturers do make kits to put Ford engines into GM cars. The use of those kits will automatically void the warranties/support from both Ford, and GM. It has to do with patent protection, and competition.

I'm sure a company like BenQ could make a monitor that supports both technologies if they were willing to pay the licensing fees, and the cost of engineering, and tooling for such a hybrid. How much would that cost the end user? A lot more than anybody would be willing to pay for that convenience of options.

It would be cheaper and easier to produce one line that supports both than to build two separate lines.


1 minute ago, asand1 said:

It would be cheaper and easier to produce one line that supports both than to build two separate lines.

But then they couldn't justify adding $300 onto the monitor's price for an extra $20-30 worth of silicon.

5 hours ago, JT_NC said:

So why can't we get something like that, in which a G-sync card "thinks" it's talking to a G-sync-enabled monitor, but in reality it's talking to another card (or black box) which converts the output to work with a Freesync monitor?

 

Obviously we'd need Nvidia to build (or license) the hardware.  I'm sure they wouldn't mind the opportunity to make more money.  Freesync monitors are so much cheaper and prolific that they're not gaining market share there.  So why not double-dip?  They don't put Freesync tech into their G-sync cards, so why don't they sell us more cards?

NVIDIA cards already support the "tech" required for FreeSync; it's actually enabled for mobile G-Sync via eDP on their laptops, which use the same silicon as the desktop parts. NVIDIA already has full capability to use FreeSync monitors; it's not an issue of technology or compatibility. NVIDIA has simply decided to deny the ability to use FreeSync monitors, and that's all there is to it.

 

As for why we can't have a device that lets you use AMD cards with G-Sync monitors: all G-Sync devices are co-designed with NVIDIA's help, so this is not something a third party can do on their own, and NVIDIA would not agree to make a device that allows AMD cards to use G-Sync, because they would rather you just buy an NVIDIA card instead. Yes, they could sell the device to make some money off AMD users, but frankly it's unlikely it would even pay for its own development cost. G-Sync monitors are $100-150 more than an equivalent FreeSync monitor, and another G-Sync device in the middle would probably be another $250. Considering the entire market for this device uses AMD graphics cards, I can't see very many of them buying that setup when they could just get a FreeSync monitor and save $300-400. Maybe a few people who already have G-Sync monitors would, but all in all it wouldn't sell very well.
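Plugging those rough numbers in makes the comparison concrete.  The $500 baseline is my own placeholder; the markup range and the $250 device price are the estimates from the post above.

```python
freesync_monitor = 500   # placeholder baseline price, not from the post
gsync_premium    = 125   # midpoint of the quoted $100-150 G-Sync markup
adapter_device   = 250   # the post's guess for a standalone G-Sync adapter box

gsync_monitor   = freesync_monitor + gsync_premium   # equivalent G-Sync panel: 625
amd_via_adapter = gsync_monitor + adapter_device     # AMD card + adapter + G-Sync panel: 875

# What an AMD owner saves by just buying the FreeSync panel instead:
savings = amd_via_adapter - freesync_monitor         # 375, squarely in the quoted $300-400 range
```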

 

And since such a device would require NVIDIA's cooperation, why even bother wishing for it?  If you're going to wish for something that requires NVIDIA's cooperation, you might as well wish for the simpler version: for them to flip a switch and support FreeSync monitors.

The entire reason they don't do it is that they get money every time you buy a G-Sync monitor.  Why would they let you buy a Freesync monitor and then just adapt, when they get more money on the monitor sale?  People buy a G-Sync-capable GPU and go, "Well, I guess I should get a G-Sync monitor too."

How do Reavers clean their spears?

|Specs in profile|

The Wheel of Time turns, and Ages come and pass, leaving memories that become legend. Legend fades to myth, and even myth is long forgotten when the Age that gave it birth comes again.

Given the number of Freesync monitors out there, Nvidia could sell even more cards if they enabled theirs to work with both types of monitor, now that AMD has a *cough* competitor *cough*, at least once there's not a complete shortage of cards *cough cough*.  I should get that checked out.

 

If it's simply a "software switch", I'm really surprised no one's hacked the drivers.

 

You know the difference between an Amazon Kindle without Ads and an Amazon Kindle with Ads?  You pay more for the version without.  Otherwise they're the same.  It's just software.  Nvidia could come up with some "premium membership" program.  For $X (or $X/yr), members could have access to drivers that enable Freesync.  Maybe toss in a game every month so people without Freesync monitors want to subscribe too.

 

Problem solved.  Win-win for everyone.  (except those who don't participate, who won't be any worse off than they are today)

9 hours ago, JT_NC said:

Given the number of Freesync monitors out there, Nvidia could sell even more cards if they enabled theirs to work with both types of monitor, now that AMD has a *cough* competitor *cough*, at least once there's not a complete shortage of cards *cough cough*.  I should get that checked out.

 

If it's simply a "software switch", I'm really surprised no one's hacked the drivers.

 

You know the difference between an Amazon Kindle without Ads and an Amazon Kindle with Ads?  You pay more for the version without.  Otherwise they're the same.  It's just software.  Nvidia could come up with some "premium membership" program.  For $X (or $X/yr), members could have access to drivers that enable Freesync.  Maybe toss in a game every month so people without Freesync monitors want to subscribe too.

 

Problem solved.  Win-win for everyone.  (except those who don't participate, who won't be any worse off than they are today)

I wish they would.

 

I wouldn't be sitting here waiting for Vega to come into a more realistic price range.

 

My Radeon R9 380 is aging, and I want to be able to keep up.

 

I would buy a GTX 1070 in a heart beat if they flipped on the support for it.


18 hours ago, JT_NC said:

But then they couldn't justify adding $300 onto the monitor's price for an extra $20-30 worth of silicon.

Plus all the money that went into engineering/manufacturing it. You have to pay people to design the hardware/software, manage the manufacturing line, etc. If only the world worked in such a way where you only paid for what the hardware cost to manufacture.

 


2 hours ago, jeffmeyer5295 said:

Plus all the money that went into engineering/manufacturing it. You have to pay people to design the hardware/software, manage the manufacturing line, etc. If only the world worked in such a way where you only paid for what the hardware cost to manufacture.

The G-Sync controller was designed enough years ago that they shouldn't still be paying for the R&D.  That technology doesn't need to be redeveloped for every new monitor design.  All they need to design on a per-monitor basis is the layout of the components and the shape of the circuit board (which is more likely dictated by the monitor manufacturer than by Nvidia anyway), and that shouldn't jack up the price so much, considering the process has to be done regardless of whether it's G-Sync or Freesync, yet Freesync monitors cost significantly less.  That price disparity is simply Nvidia's version of the "Apple tax".  If people are willing to pay it, they'll happily keep overcharging for it.

On 8/22/2017 at 2:06 PM, JT_NC said:

 

Obviously we'd need Nvidia to build (or license) the hardware.  I'm sure they wouldn't mind the opportunity to make more money.  Freesync monitors are so much cheaper and prolific that they're not gaining market share there.  So why not double-dip?  They don't put Freesync tech into their G-sync cards, so why don't they sell us more cards?

 

 

I can't comment on the hardware side of things, but I can comment on the business aspect of it:

 

The way I see it, the problem is Nvidia's licensing partners who make G-Sync Monitors.  Companies like Asus who put G-Sync chips into monitors likely (a) pay a wholesale price per G-Sync chip, and (b) pay a larger umbrella-like licensing fee for the 'right' to be a G-Sync partner to begin with.  Between those two things that probably represents a lot of money to Nvidia.

 

If Nvidia starts selling a G-Sync-to-Freesync pass-through device, they just chopped the legs out from under their own partners.  For example: let's say I have $700 to spend on a gaming monitor.  I have the choice to get either a really nice Freesync monitor + G-Sync pass-through device, or just a really nice G-Sync monitor alone.  Just to keep the analogy simple, let's say those two options equate to basically the same price, for basically the same specs on the monitor.  Which are you gonna pick??

 

Smart people would pick the Freesync monitor + G-Sync pass-through device every time, because it's more flexible for the future.  You haven't tied yourself to one video card manufacturer or the other.  Nvidia might make more money from selling these pass-through devices, but their partners would sell fewer G-Sync monitors as a result.  Whether or not the changes would balance each other out is unknown, but it would certainly make the G-Sync monitor manufacturers VERY UNHAPPY.  It's not good business practice to piss off your partners.

 

Besides.  They want you locked into their Gsync ecosystem.  Once you have the monitor . . . you're not gonna change brands of video card.

FreeSync is (basically) built into the VESA standards and really costs little to implement.

G-Sync requires extra hardware, which does cost a lot more to implement. G-Sync monitors are built specifically to work with Nvidia cards, while FreeSync (VESA Adaptive-Sync) monitors are just built to work with whatever supports the standard. On the flip side, I've heard that G-Sync is slightly superior because of that dedicated hardware.
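To make "built into the VESA standards" concrete: a FreeSync-capable monitor advertises its supported refresh range in its EDID, via the display range limits descriptor (tag 0xFD) in the base block. Here's a minimal sketch that pulls the vertical rate range out of a 128-byte base EDID block. Real monitors may also use extension blocks or DisplayID, and high-refresh panels use the rate-offset flags in byte 4 of the descriptor; this sketch ignores both.

```python
def vertical_rate_range(edid: bytes):
    """Return (min_hz, max_hz) from the base block's display range limits
    descriptor, or None if no 0xFD descriptor is present.  Base block only;
    extension blocks and byte-4 rate offsets are out of scope here."""
    if len(edid) < 128:
        raise ValueError("need at least the 128-byte base EDID block")
    # The four 18-byte display descriptors live at fixed offsets.
    for off in (54, 72, 90, 108):
        desc = edid[off:off + 18]
        # Display descriptors (as opposed to detailed timings) start 00 00 00,
        # followed by a tag byte; 0xFD is "display range limits".
        if desc[0:3] == b"\x00\x00\x00" and desc[3] == 0xFD:
            return desc[5], desc[6]  # min/max vertical field rate, in Hz
    return None
```

This is exactly the kind of information an adaptive-sync source reads before deciding what refresh range it can drive; no proprietary module is involved on the monitor side.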

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.

6 hours ago, JT_NC said:

The G-sync controller was designed enough years ago that they shouldn't still be paying for the R&D.  That technology doesn't need to be redeveloped for every new monitor design.  All they need to design on the per-monitor basis is the layout of the components and shape of the circuit board (which is more likely dictated by the monitor manufacturer rather than Nvidia anyway), and that shouldn't jack up the price so much considering the process has to be done regardless of whether it's G-sync or Freesync, yet Freesync monitors cost significantly less.  That price disparity is simply Nvidia's version of the "Apple tax".  If people are willing to pay it, they'll happily keep overcharging for it.

For the original G-Sync controller yes. But the second-gen controller with the added HDMI port took some R&D. And they're still spending R&D money on the yet-unreleased next-gen G-Sync controller with HDMI 2.0 and HDR support. Development is ongoing, the development costs are continuous, they didn't just create the first G-Sync module and have been sitting around since then.

15 hours ago, Glenwing said:

For the original G-Sync controller yes. But the second-gen controller with the added HDMI port took some R&D. And they're still spending R&D money on the yet-unreleased next-gen G-Sync controller with HDMI 2.0 and HDR support. Development is ongoing, the development costs are continuous, they didn't just create the first G-Sync module and have been sitting around since then.

And AMD worked on Freesync 2, yet those monitor prices still remain a lot cheaper.  There's just no justification for the price disparity.  It's simply Nvidia doing it because they can.

http://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming

7 minutes ago, JT_NC said:

And AMD worked on Freesync 2, yet those monitor prices still remain a lot cheaper.  There's just no justification for the price disparity.  It's simply Nvidia doing it because they can.

http://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming

AMD isn't designing hardware, just the protocol; they leave it to display controller designers (Realtek, Novatek, etc.) to actually design the silicon that implements the capabilities AMD has specified.

 

Yes G-Sync is more expensive than it should be, but it does cost more than FreeSync to design and implement (just not by the amount we see in the price difference).
