Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

We don't want talk and interviews... we want side-by-side comparisons from an unbiased source, with actual results from both finished products.

Some FreeSync models are either the same price as, or cheaper than, the non-FreeSync version. The LG 34UM67 has FreeSync and costs the same as the 34UM65, which does not; the 67 version might even have a lower MSRP in the US. Neither AMD nor Nvidia has any say over TV standards or manufacturers, so I don't see how you can make this a console thing. Also, why would that make consoles the master race? It makes no sense.

That's fair enough: FreeSync (eventually) becoming more widely available would make a compelling case. If I wanted an upgrade I'd look at a FreeSync 4K IPS panel for about 400 bucks or less, so that's still some way off, I think.

-------

Current Rig

-------

You're right, they could have taken the same approach... if they wanted to delay the release by 1-2 years and then launch something inferior. Are you really yelling at Nvidia for not sharing all their little secrets? Go to Heinz and try to get their ketchup recipe. Good luck.

 

Lol I know right? 

By his logic, why couldn't AMD work with Nvidia and license their tech? AMD's implementation is about as "open" as KFC telling you what the herbs and spices are.

 

Adaptive Sync ≠ FreeSync and FreeSync ≠ Free, but you know what? Who cares? Are tech enthusiasts actually getting upset that Nvidia spent more time and money, made a superior system (however "proprietary" it is), and offers it alongside what is, right now, the inferior solution?

 

Since when did the 'master race' adopt weaker implementations? Did hell just freeze over?

Adaptive Sync ≠ FreeSync and FreeSync ≠ Free, but you know what? Who cares? Are tech enthusiasts actually getting upset that Nvidia spent more time and money, made a superior system (however "proprietary" it is), and offers it alongside what is, right now, the inferior solution?

NVIDIA is superior and AMD is inferior according to whom? Tom Petersen? PCPer?

Please...

You're right, they could have taken the same approach... if they wanted to delay the release by 1-2 years and then launch something inferior. Are you really yelling at Nvidia for not sharing all their little secrets? Go to Heinz and try to get their ketchup recipe. Good luck.

 

That makes no sense. Nvidia could easily have developed their tech in collaboration with the scaler vendors and opened it up to AMD; they could even have taken royalties on it if they needed to be so greedy. Remember that G-Sync was delayed by half a year because it didn't work properly at launch. They had to reinvent the OSD and so on from the ground up with Asus, which would not have been necessary if they had collaborated with the scaler vendors from the start. That would even have resulted in ASIC scalers instead of the expensive FPGAs that G-Sync uses now.

 

I'm yelling at Nvidia for creating yet more proprietary crap that divides the market and locks consumers into expensive, proprietary ecosystems. No consumer should ever condone this kind of behaviour. It is anti-competitive and anti-consumer.

 

Lol I know right? 

By his logic, why couldn't AMD work with Nvidia and license their tech? AMD's implementation is about as "open" as KFC telling you what the herbs and spices are.

 

Adaptive Sync ≠ FreeSync and FreeSync ≠ Free, but you know what? Who cares? Are tech enthusiasts actually getting upset that Nvidia spent more time and money, made a superior system (however "proprietary" it is), and offers it alongside what is, right now, the inferior solution?

 

Since when did the 'master race' adopt weaker implementations? Did hell just freeze over?

 

Since when has Nvidia ever let AMD license anything? Adaptive Sync is an open industry standard, and FreeSync has nothing to do with anyone else, like Nvidia or Intel; they can make their own driver implementations. FreeSync's drivers have no consequences for any other hardware vendor, and they can use Adaptive Sync just as well as AMD.

 

Also, in what way are Adaptive Sync and FreeSync inferior? A lot of the reviews conclude that the choice between VSync and free-running variable fps above the Adaptive Sync interval is a lot better than G-Sync's forced VSync, especially for professionals, who would rather have the extra mouse input at the cost of a little tearing, which is hard to see above 144 Hz anyway. Performance inside the synced interval seems to be identical between the two. Only below the interval does G-Sync seem to be the best at the moment, and that could change in a FreeSync driver update down the road. Either way, gaming below 30 fps is quite pointless, don't you think?
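Just to put the above/below-range argument in concrete terms, here's a rough sketch (the range values are made up, not taken from any real monitor) of the three regimes any variable-refresh scheme has to handle:

```python
def refresh_behavior(fps, vrr_min=40, vrr_max=144, tear_above=False):
    """Classify what a variable-refresh monitor does at a given frame rate.

    vrr_min/vrr_max are an illustrative sync range; tear_above picks the
    optional "VSync off" mode above the range (the FreeSync choice)
    versus a forced cap at the maximum (the G-Sync behaviour argued
    about above).
    """
    if vrr_min <= fps <= vrr_max:
        # Inside the range: the refresh rate simply follows the frame rate.
        return ("synced", fps)
    if fps > vrr_max:
        # Above the range: run free and tear, or cap with VSync.
        return ("tearing", fps) if tear_above else ("vsync-capped", vrr_max)
    # Below the range: the panel cannot refresh slower, so frames repeat.
    return ("below-range", vrr_min)
```

The whole "forced VSync" complaint only concerns the second branch; inside the sync range the two systems behave the same.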

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

My facts are completely in order. Note how I did NOT state that Adaptive Sync works over HDMI or DVI, only that Adaptive Sync monitors can have those inputs for use with laptops, Blu-ray players, gaming consoles, etc.

That makes these monitors better than G-Sync monitors, by giving more options. Adaptive Sync can be used for power-saving features as well (which is what variable VBlank in eDP was originally invented for). That is also a reason Intel might support Adaptive Sync, as they can use it for power saving, which is very interesting in a professional setting.

G-Sync monitors can also have other inputs; it would just require slightly more engineering. There's no FreeSync monitor with multiple inputs yet either, so your premise is moot.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

How is AMD guilty in this? They made an open industry standard that everyone, including Nvidia, can use. Nvidia made a closed-off proprietary solution that AMD has no access to. All the blame, ALL OF IT, is on Nvidia. They could have taken the same industry-standard approach AMD did and created the standard everyone could use. They chose not to, giving AMD and everyone else zero choice but to make their own version.

Some FreeSync models are either the same price as, or cheaper than, the non-FreeSync version. The LG 34UM67 has FreeSync and costs the same as the 34UM65, which does not; the 67 version might even have a lower MSRP in the US. Neither AMD nor Nvidia has any say over TV standards or manufacturers, so I don't see how you can make this a console thing. Also, why would that make consoles the master race? It makes no sense.

It's called business and profit margins. AMD would do well to abandon their open-standard efforts until they're on financially solid ground. You never give your competitors any quarter in business; Intel and Nvidia know this well. The consumer isn't king. The producers are, and producers have every right to make money on the work they do. Not exercising that right at every opportunity creates room for competitors to gain a financial leg up on you. Thank God AMD is properly pricing the 390X at $700. They need profit margins, and big ones, to get out of debt and get their stock moving upward again.

Also, ASICs are the wrong answer. You can change the algorithms stored on an FPGA, meaning you can upgrade the hardware without having to buy a new unit. If a bug is ever found in the Adaptive-Sync module (yes, there is one), you're out hundreds of dollars. Nvidia made all the correct choices, and they can also add the option of disabling VSync, as they are now planning. Nvidia has proved once again that closed, proprietary standards, at least when competing with others, produce the best quality and investment potential.

G-Sync monitors can also have other inputs; it would just require slightly more engineering. There's no FreeSync monitor with multiple inputs yet either, so your premise is moot.

 

But they don't. Saying that it just takes more engineering is pointless; that argument could be made about every single thing ever.

 

The Acer XG270HU FreeSync monitor has both HDMI and DVI inputs, and the LG 34UM67 has 2x HDMI and a DVI input as well. In fact, how about you link me to a FreeSync monitor that does NOT have more than just a DisplayPort input?

But they don't. Saying that it just takes more engineering is pointless; that argument could be made about every single thing ever.

The Acer XG270HU FreeSync monitor has both HDMI and DVI inputs, and the LG 34UM67 has 2x HDMI and a DVI input as well. In fact, how about you link me to a FreeSync monitor that does NOT have more than just a DisplayPort input?

Every single one Samsung is launching? Furthermore, see the rest of my post.

So really, reading between the lines, none of those problems is one that can't be fixed in drivers. In that case it doesn't really matter which brand you pick.

I cannot be held responsible for any bad advice given.

I've no idea why the world is afraid of 3D-printed guns when clearly 3D-printed crossbows would be more practical for now.

My rig: The StealthRay. Plans for a newer, better version of its mufflers are already being made.

But Adaptive Sync IS a hardware implementation.

 

No. It is a hardware layer sitting in between the driver and the monitor it talks to.

 

Gsync is hardware -> hardware

Freesync is software -> hardware -> hardware

 

DisplayPort only enables the communication to happen; it is not the medium that communicates.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

That makes no sense. Nvidia could easily have developed their tech in collaboration with the scaler vendors and opened it up to AMD; they could even have taken royalties on it if they needed to be so greedy. Remember that G-Sync was delayed by half a year because it didn't work properly at launch. They had to reinvent the OSD and so on from the ground up with Asus, which would not have been necessary if they had collaborated with the scaler vendors from the start. That would even have resulted in ASIC scalers instead of the expensive FPGAs that G-Sync uses now.

 

I'm yelling at Nvidia for creating yet more proprietary crap that divides the market and locks consumers into expensive, proprietary ecosystems. No consumer should ever condone this kind of behaviour. It is anti-competitive and anti-consumer.

 

 

Since when has Nvidia ever let AMD license anything? Adaptive Sync is an open industry standard, and FreeSync has nothing to do with anyone else, like Nvidia or Intel; they can make their own driver implementations. FreeSync's drivers have no consequences for any other hardware vendor, and they can use Adaptive Sync just as well as AMD.

 

Also, in what way are Adaptive Sync and FreeSync inferior? A lot of the reviews conclude that the choice between VSync and free-running variable fps above the Adaptive Sync interval is a lot better than G-Sync's forced VSync, especially for professionals, who would rather have the extra mouse input at the cost of a little tearing, which is hard to see above 144 Hz anyway. Performance inside the synced interval seems to be identical between the two. Only below the interval does G-Sync seem to be the best at the moment, and that could change in a FreeSync driver update down the road. Either way, gaming below 30 fps is quite pointless, don't you think?

I don't have words for you... how dare a company want to make money. And looking at it from Nvidia's perspective, why in God's name would they change their business practices? Nvidia is doing FAR better than AMD, and their first priority is (as it should be) making money.

There's nothing wrong with a proprietary implementation if it's better.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

Never mind, it was covered already. 

if you have to insist you think for yourself, i'm not going to believe you.

I don't have words for you... how dare a company want to make money. And looking at it from Nvidia's perspective, why in God's name would they change their business practices? Nvidia is doing FAR better than AMD, and their first priority is (as it should be) making money.

There's nothing wrong with a proprietary implementation if it's better.

You're right that proprietary is better for Nvidia and other companies, as it normally means more money, but it's worse for the consumer. If one company makes a proprietary implementation, I have to buy into their ecosystem and get locked into it just to use it. Then another company makes its own proprietary implementation, or someone tries to make an open one, and suddenly there are so many implementations saturating the market.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650

Never mind, it was covered already. 

I think AMD heavily influenced VESA to implement Adaptive-Sync in DisplayPort 1.2a; see also http://www.tomshardware.com/news/amd-freesync-nvidia-g-sync-vesa,26483.html

Every single one Samsung is launching? Furthermore, see the rest of my post.

 

Not a single Samsung FreeSync monitor has launched yet, so we have no idea what the specs and inputs will be. Sorry, but you are wrong, and I've given you two examples disproving your point.

The rest of what? Both of those monitors are out and can be bought.

 

 

No. It is a hardware layer sitting in between the driver and the monitor it talks to.

 

Gsync is hardware -> hardware

Freesync is software -> hardware -> hardware

 

DisplayPort only enables the communication to happen; it is not the medium that communicates.

 

I think we are misunderstanding each other. Adaptive Sync is a standard the scaler ASIC has to support, which gives the graphics card (and its drivers) the ability to dictate the monitor's refresh rate.

G-Sync is both a hardware module in the monitor and drivers (see the leaked driver that enabled G-Sync on an Asus laptop).

Adaptive Sync is a standard that has to be supported in the scaler, which then does the same job as the G-Sync hardware module.
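To spell that split out: the flow is roughly a one-time handshake in which the scaler advertises the refresh interval it supports, followed by per-frame requests in which the driver dictates the refresh rate. A toy sketch of that division of labour (all the names here are mine, not taken from the DisplayPort spec):

```python
class Scaler:
    """Toy stand-in for an Adaptive-Sync-capable monitor scaler."""
    def __init__(self, min_hz, max_hz):
        self.min_hz, self.max_hz = min_hz, max_hz

    def advertise_range(self):
        # Handshake: the monitor reports its supported refresh interval.
        return (self.min_hz, self.max_hz)


class Driver:
    """Toy GPU driver that dictates the refresh rate frame by frame."""
    def __init__(self, scaler):
        self.min_hz, self.max_hz = scaler.advertise_range()

    def present(self, frame_time_s):
        # Per frame: request a refresh matching the frame rate,
        # clamped to the range the scaler advertised.
        hz = 1.0 / frame_time_s
        return max(self.min_hz, min(self.max_hz, hz))
```

With a 40-144 Hz scaler, a 16.7 ms frame gets a 60 Hz refresh, while a 5 ms frame is clamped to 144 Hz.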

 

 

I don't have words for you... how dare a company want to make money. And looking at it from Nvidia's perspective, why in God's name would they change their business practices? Nvidia is doing FAR better than AMD, and their first priority is (as it should be) making money.

There's nothing wrong with a proprietary implementation if it's better.

 

That's a nice logical fallacy. I have never claimed a company cannot or should not make money. I'm saying their business practices are anti-competitive and anti-consumer, and I don't understand why anyone would condone behaviour that is bad for consumers like you and me.

 

I disagree with your final point if you look at it from a consumer's view (and why would you ever look at it from Nvidia's view, unless you are an investor?). Proprietary tech usually results in divided markets, higher prices, and fewer choices and options for the consumer. Adaptive Sync already has a greater diversity of solutions than G-Sync (ultrawide monitors, for example).

 

I just don't understand your priorities. Why defend the kinds of business practices that end up hurting you? Like I said, Nvidia could have done the same as AMD and made G-Sync the industry standard. Instead they chose the proprietary route, splitting up the market and locking their customers into a closed ecosystem. What if Nvidia falls on its face and makes another Fermi? Nvidia users couldn't switch to AMD without losing synced frame rates on their G-Sync monitors, and maybe their Shield as well.

I think AMD heavily influenced VESA to implement Adaptive-Sync in DisplayPort 1.2a; see also http://www.tomshardware.com/news/amd-freesync-nvidia-g-sync-vesa,26483.html

 

AMD proposed the Adaptive Sync standard to VESA. AMD invented the standard.

You're right that proprietary is better for Nvidia and other companies, as it normally means more money, but it's worse for the consumer. If one company makes a proprietary implementation, I have to buy into their ecosystem and get locked into it just to use it. Then another company makes its own proprietary implementation, or someone tries to make an open one, and suddenly there are so many implementations saturating the market.

Yeah, ideally everything would be open source, but often the proprietary implementation is better (and sometimes the open-source one is). From a business standpoint, though, you can't fault a company for trying to lock you into its ecosystem. And like I said, I'm perfectly okay with a proprietary solution IFF it is actually superior.

AMD proposed the Adaptive Sync standard to VESA. AMD invented the standard.

No they didn't; they used technology that was already implemented in the embedded DisplayPort (eDP) standard.

Yeah, ideally everything would be open source, but often the proprietary implementation is better (and sometimes the open-source one is). From a business standpoint, though, you can't fault a company for trying to lock you into its ecosystem. And like I said, I'm perfectly okay with a proprietary solution IFF it is actually superior.

I agree, but my point still stands: for consumers, proprietary is worse.

Yes, 30-144 Hz is the limitation of current monitors and has nothing to do with the theoretical limit, as I keep trying to explain and you keep blatantly ignoring. Just as FreeSync is capable of 9-240 Hz, G-Sync is capable of 1-240 Hz. However, as I am also trying to explain, there are no monitors available that cover such a wide variable refresh range, which is why you see 30-144 Hz limits on G-Sync monitors and 40-144 Hz limits on FreeSync monitors.

I'm pretty sure that's a limitation of the ASIC they put in. ASICs need to be really specific in their function, so if Nvidia wants to support a wider range they have to redesign the chip. But if you're saying that pretty much no monitor refreshes above 144 Hz, and that anything under 30 or 40 Hz causes flickering so the wider range isn't needed, then I'd agree with you.
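For what it's worth, the usual way around a narrow lower bound is frame multiplication: if the game runs below the panel's minimum, each frame is redrawn an integer number of times so the effective refresh lands back inside the range. A rough sketch with illustrative numbers (the function name and range are mine):

```python
def frame_multiplier(fps, vrr_min=30, vrr_max=144):
    """Smallest integer multiple that lifts a low frame rate back into
    the monitor's variable refresh range, or None if no multiple fits."""
    if fps >= vrr_min:
        return 1  # already in range, no multiplication needed
    m = 2
    while fps * m < vrr_min:
        m += 1
    # Each source frame is shown m times, for an effective fps * m Hz.
    return m if fps * m <= vrr_max else None
```

So 25 fps on a 30-144 Hz panel becomes 50 Hz with each frame shown twice, and 10 fps becomes 30 Hz with each frame shown three times.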

No they didn't; they used technology that was already implemented in the embedded DisplayPort (eDP) standard.

 

The tech you are talking about is called variable VBlank, used in eDP for power savings. The complete Adaptive Sync standard, which uses variable VBlank for variable frame rates and adds the initial handshake between the monitor and the graphics card (including the supported Hz interval), was indeed defined by AMD and proposed to VESA by AMD.
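Concretely, "variable VBlank" just means stretching the blanking interval at a fixed pixel clock so the next scan-out starts exactly when the next frame is ready. A back-of-the-envelope sketch (the timing figures are common 1080p-style numbers used purely for illustration):

```python
def vblank_lines(target_hz, active_lines=1080,
                 pixel_clock_hz=148_500_000, line_pixels=2200):
    """Blanking lines needed so a panel refreshes at target_hz.

    line_pixels is the total horizontal pixels per scanline (active
    plus horizontal blanking); all figures here are illustrative
    1080p-style timings, not values from any datasheet.
    """
    line_rate = pixel_clock_hz / line_pixels          # scanlines per second
    total_lines = line_rate / target_hz               # lines per refresh
    return max(0, round(total_lines - active_lines))  # lines spent in vblank
```

With these numbers, 60 Hz needs a 45-line vblank, while dropping to 50 Hz pads it out to 270 lines, and the driver can tune that frame by frame.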

The tech you are talking about is called variable VBlank, used in eDP for power savings. The complete Adaptive Sync standard, which uses variable VBlank for variable frame rates and adds the initial handshake between the monitor and the graphics card (including the supported Hz interval), was indeed defined by AMD and proposed to VESA by AMD.

Prove it. I know for sure that their demo ran on some Toshiba laptops using FreeSync based on eDP technology.

I agree, but my point still stands: for consumers, proprietary is worse.

Well, yes and no. It's worse because you're locked into an ecosystem, but it can also be better, because you wouldn't see the same money spent on R&D if it were for an open standard. Take FreeSync vs G-Sync: the proprietary solution is the better performer (albeit at worse price/performance). And if Nvidia had made G-Sync an open standard like FreeSync, it's safe to assume it wouldn't perform the way it does now.
