The real difference between "Free-Sync" and G-Sync

exyia

I think it's hilarious how easily you guys get worked up over something you don't really understand

 

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

 

it pays to not jump to conclusions. there's a reason why "free-sync" was showcased on a laptop and quietly. please stop the sh--storms over reading just a headline

 

 

...

 

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

 

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

 

...
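In plain terms, "variable refresh" means the panel's vblank interval stretches to match each frame's render time instead of ticking at a fixed rate. Here's a minimal sketch of that scheduling idea, assuming a made-up 30-60 Hz panel; none of these names come from Nvidia's or AMD's actual drivers:

```python
# Minimal sketch of variable-vblank scheduling; all names are hypothetical
# and illustrative, not taken from any real GPU driver.
def schedule_scanout_us(frame_time_us, min_hz=30, max_hz=60, variable_refresh=True):
    """Microseconds after the previous scanout at which the next one fires."""
    fastest = 1_000_000 // max_hz  # panel can't refresh faster than max_hz
    slowest = 1_000_000 // min_hz  # vblank can't be held past the minimum rate
    if not variable_refresh:
        # Fixed refresh: a late frame waits for the next full interval
        # (stutter), or tears if the driver doesn't wait.
        return fastest * max(1, (frame_time_us + fastest - 1) // fastest)
    # Variable refresh: scanout fires as soon as the frame is ready,
    # clamped to the range the panel can actually handle.
    return max(fastest, min(frame_time_us, slowest))

# A 45 fps frame (22222 us) scans out immediately instead of waiting
# for the next 60 Hz tick at ~33333 us.
print(schedule_scanout_us(22_222))  # -> 22222
```

The fixed-refresh branch is exactly the stutter-or-tear tradeoff that G-Sync (and AMD's demo) is meant to remove.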

So now it's even less amazing... and Linus said this would change the mobile industry <_<

 

AMD, make your move!

Oh, that's why.

 

Well, waiting for the next AMD move then.


So now it's even less amazing... and Linus said this would change the mobile industry <_<

 

well... it still has huge power-saving potential in the mobile market :)

 

I just find it laughable that so many people keep jumping to conclusions and trying to stir up stuff over something they don't bother looking into


fuck this shit... :(


fuck this shit... :(

Maybe it's my fanboyism, but every time I hear that NVIDEAHHHHHHHH shit before a game I wanna just take my monitor and throw it.


Ok, so how exactly did you expect the Nvidia spokesperson to respond?

"Shit they got us"?

 

No, they are going to defend their decision to license out their module to the very end. Let's see what AMD can do.

 

I think it's hilarious how easily you guys get worked up over something you don't really understand

 

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

 

it pays to not jump to conclusions. there's a reason why "free-sync" was showcased on a laptop and quietly. please stop the sh--storms over reading just a headline

 

And BTW this sounds a tad aggressive and unnecessary. If you have any input on the article other than the half sentence you wrote, please feel free to pipe in. It sounds like AMD hurt your feelings somehow. 


Maybe it's my fanboyism, but every time I hear that NVIDEAHHHHHHHH shit before a game I wanna just take my monitor and throw it.

 

:D i don't really hate on nvidia or anything, i just want g-sync for free because it's super fcking awesome! ;)

and i like my monitor as well as my (nvidia) gpu way too much to throw either of them :P


Ok, so how exactly did you expect the Nvidia spokesperson to respond?

"Shit they got us"?

 

No, they are going to defend their decision to license out their module to the very end. Let's see what AMD can do.

 

 

And BTW this sounds a tad aggressive and unnecessary. If you have any input on the article other than the half sentence you wrote, please feel free to pipe in. It sounds like AMD hurt your feelings somehow. 

 

wut?

 

1. did you even read the article? or did you just read the headline and snippet as well?

 

Tom Petersen did nothing but explain the design differences between a laptop display and a desktop display, and then why they chose to do what they did.

 

where did he "defend" g-sync anywhere?

 

2. AMD hurt my feelings? where did I imply AMD did anything wrong? where did I imply AMD misled anyone?

 

 

 

it pays to not jump to conclusions....please stop the sh--storms over reading just a headline

 

subject: reader

issue: jumping to conclusions

cause: reading just the headline

 

where does AMD hurting my feelings factor in? how did you come to that?


wut?

 

1. did you even read the article? or did you just read the headline and snippet as well?

 

Tom Petersen did nothing but explain the design differences between a laptop display and a desktop display, and then why they chose to do what they did.

 

where did he "defend" g-sync anywhere?

 

2. AMD hurt my feelings? where did I imply AMD did anything wrong? where did I imply AMD misled anyone?

 

 

 

 

subject: reader

issue: jumping to conclusions

cause: reading just the headline

 

where does AMD hurting my feelings factor in? how did you come to that?

 

 

I read it, chill out. My point was it was an Nvidia guy defending his company's decision to license out tech. Defending it by telling us why doing it for free is not possible.

 

Your failure to comprehend what the word "defend" means as applied to what I wrote baffles me. He defends it by arguing against a non-proprietary standard. Do you understand what I just said?

 

You seem asshurt somehow by AMD's showcase. Chill with the generalized blanket critique in your opening and maybe you might get people to have a civilized discussion.


I think it's hilarious how easily you guys get worked up over something you don't really understand

 

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

 

it pays to not jump to conclusions. there's a reason why "free-sync" was showcased on a laptop and quietly. please stop the sh--storms over reading just a headline

There's no scaler chip on my Qnix, any chance that I can use this? ;)

 

I just looked it up; I have a bypass board, which is VERY similar to the tech in laptop screens. So there's a good chance it would work :)

 

So my single dual link DVI connection goes straight to the monitor. I'd love to test this out  :D


Nvidia holding the video game industry back with proprietary crap.


I'm glad I decided to wait rather than jumping into AMD's lap.


I'm glad I decided to wait rather than jumping into AMD's lap.

 

Why? What were you going to do?


Why? What were you going to do?

Make myself look like an idiot by doing so.


All that is needed for this to work, as AMD explained it, was an eDP connection between the discrete GPU and the display, a controller for the screen that understands the variable refresh rate methods of eDP 1.0 specifications and an updated AMD driver to properly send it the signals.  The panel can communicate that it supports this variable refresh technology to the graphics card through the EDID as resolutions and timings are communicated today and then the graphics driver would know to send the varying vblank signals to adjust panel refresh times on the fly.

 

If you aren't familiar with eDP, don't feel bad.  It's a connection type used in tablets and notebooks and isn't used at all in desktop configurations (some all-in-one designs do use eDP).  But here is where it might get interesting: the upcoming DisplayPort 1.3 standard actually includes the same variable refresh rate specification.  That means that upcoming DP 1.3 panels COULD support variable refresh technology in an identical way to what we saw demoed with the Toshiba laptops today.  DP 1.3 is on schedule to be ratified as a standard in the next 60-90 days and from there we'll have some unknown wait time before we begin to see monitors using DP 1.3 technology in them.

 

To be clear, just because a monitor would run with DisplayPort 1.3 doesn't guarantee this feature would work.  It also requires the controller on the display to understand and be compatible with the variable refresh portions of the spec, which with eDP 1.0 at least, isn't required.  AMD is hoping that with the awareness they are building with stories like this display designers will actually increase the speed of DP 1.3 adoption and include support for variable refresh rate with them. That would mean an ecosystem of monitors that could potentially support variable speed refresh on both AMD and NVIDIA cards.  All that would be needed on the PC side is a software update for both Radeon and GeForce graphics cards.

 

Koduri told me that AMD wasn't bringing this demo out to rain on NVIDIA's G-Sync parade but instead to get media interested in learning about this feature of eDP 1.0 and DP 1.3, urging the hardware companies responsible to more quickly produce the necessary controllers and integrate them with upcoming panels in 2014.  While I don't doubt that it is the case for AMD, I'm sure the timing of the demo and NVIDIA's G-Sync releases this week were not an accident.

 

 

Koduri did admit that NVIDIA deserved credit for seeing this potential use of the variable refresh feature and bringing it to market as quickly as they did.  It has raised awareness of the issue and forced AMD and the rest of the display community to take notice.  But clearly AMD's goal is to make sure that it remains a proprietary feature for as little time as possible.

 

As it stands today, the only way to get variable refresh gaming technology on the PC is to use NVIDIA's G-Sync enabled monitors and GeForce graphics cards.  It will likely take until the ratification and release of DisplayPort 1.3 monitors before AMD Radeon users will be able to enjoy what I definitely believe is one of the best new technologies for PC gaming in years.  AMD is hopeful it will happen in Q3 of 2014 but speed of integration has never been a highlight of the DisplayPort standard.  NVIDIA definitely has an availability advantage with G-Sync but the question will be for how many months or quarters it will last.

 

Finally, as a last minute stirring of the pot, I received an email from AMD's Koduri that indicated that there might be some monitors already on the market that could support variable refresh rate TODAY with just a firmware update.  This would be possible if a display was shipping with a controller that happened to coincidentally support variable refresh, perhaps in an early stage of development for the upcoming DP 1.3 standard.  We are trying to find a list of these monitors so we can talk with them and ask for these necessary changes.  

 

 

http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync
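The EDID handshake the article mentions is just a block of bytes the panel exposes to the GPU. As a rough illustration, here's how the refresh range a panel advertises in the standard "Display Range Limits" descriptor (tag 0xFD) could be pulled out; the function is made up for this sketch, and variable-refresh capability itself would need extra signaling beyond this basic descriptor:

```python
# Sketch: read the min/max vertical refresh a panel advertises in its EDID
# "Display Range Limits" descriptor (tag 0xFD). Illustrative only; real
# drivers do far more validation than this.
def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the first range-limits descriptor, or None."""
    assert len(edid) >= 128, "base EDID block is 128 bytes"
    # The four 18-byte descriptor slots sit at offsets 54, 72, 90 and 108.
    for off in (54, 72, 90, 108):
        desc = edid[off:off + 18]
        # Display descriptors start with 00 00 00 <tag>; 0xFD = range limits.
        if desc[0:3] == b"\x00\x00\x00" and desc[3] == 0xFD:
            return desc[5], desc[6]  # byte 5: min vertical Hz, byte 6: max
    return None
```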

 

So basically AMD needs to convince monitor manufacturers to include the controller with upcoming DisplayPort 1.3 panels, and then with AMD & NVIDIA driver updates we could have 'FreeSync'.

 

I'm assuming it will add to the cost of monitors, but hopefully not as much as G-Sync. If it does work as well as G-Sync, NVIDIA will be forced to adopt it or even lower the prices of their own G-Sync monitors.

 

Good news for everybody. 


Nvidia holding the video game industry back with proprietary crap.

Like AMD is any different: "Mantle".

They are both holding PC gaming back with their fighting outside of GPUs.

Stuff like PhysX, Mantle and G-Sync needs to be PC-only, not AMD- or Nvidia-only.

Instead of fighting each other they should fight consoles.

 


There's no scaler chip on my Qnix, any chance that I can use this? ;)

 

I just looked it up; I have a bypass board, which is VERY similar to the tech in laptop screens. So there's a good chance it would work :)

 

So my single dual link DVI connection goes straight to the monitor. I'd love to test this out  :D

It probably won't work. It still needs to support variable vblank, which no desktop displays support. PCPer said AMD told them there might be some desktop displays with support for it, but they haven't found a list of displays yet. The good news is that it is a planned feature for DP 1.3, which will be finalized within the next 2 months. Once DP 1.3 is out, AMD needs to get display manufacturers to implement it, and then it will be possible.


Like AMD is any different: "Mantle".

They are both holding PC gaming back with their fighting outside of GPUs.

Stuff like PhysX, Mantle and G-Sync needs to be PC-only, not AMD- or Nvidia-only.

Instead of fighting each other they should fight consoles.

 

Mantle is open source. AMD also gave Nvidia TressFX. Nvidia just needs to adopt Mantle, although I'm not entirely sure how.


Mantle is open source. AMD also gave Nvidia TressFX.

It's not open source, and it is also specific to GCN GPUs. It is an open standard that anyone can develop on, but the source is not available. Anyone is allowed to use it but not allowed to change it. AMD pretends to play nice a lot better than Nvidia does. Nvidia just says this is ours and no one else can have it, while AMD says you can use it, but you won't, because it won't work well for you.


It's not open source, and it is also specific to GCN GPUs. It is an open standard that anyone can develop on, but the source is not available. Anyone is allowed to use it but not allowed to change it. AMD pretends to play nice a lot better than Nvidia does. Nvidia just says this is ours and no one else can have it, while AMD says you can use it, but you won't, because it won't work well for you.

Oh, my bad.

But what is this? http://wccftech.com/amd-mantle-api-require-gcn-work-nvidia-graphic-cards/ They said it's not tied to the GCN architecture.


I tend to agree with the opinion that Nvidia saw variable refresh rates in the pipeline (i.e. new DisplayPort standards) and decided to bring the functionality to market first. This is not to say that their silicon had the functionality first; AMD is stating here, look, we can do the same thing as long as the display supports it. All AMD is trying to do is say "yes it's a good idea, yes our cards have the ability built in now, yes displays will be sold in the near future that will work with our cards."

 

Nvidia probably knows that their solution will never be widely adopted and that manufacturers will simply make screens based on the standards, since they can then sell them to the whole market, not just the owners of certain Nvidia cards. The reason for Nvidia pushing a half-baked solution early is simple: they wanted to say "we did it first", that's it.

 

As far as I care, Nvidia can say "we did it first" all they want. I won't buy a product that is locked to another product (this is one of the main reasons I don't buy Apple products).

 

I think AMD is the winner with Mantle because it has the potential to become an industry standard, whereas G-Sync never will be.


As far as I care, Nvidia can say "we did it first" all they want. I won't buy a product that is locked to another product (this is one of the main reasons I don't buy Apple products).

 

I think AMD is the winner with Mantle because it has the potential to become an industry standard, whereas G-Sync never will be.

Did you hear that @xrex64?


It's not open source, and it is also specific to GCN GPUs. It is an open standard that anyone can develop on, but the source is not available. Anyone is allowed to use it but not allowed to change it. AMD pretends to play nice a lot better than Nvidia does. Nvidia just says this is ours and no one else can have it, while AMD says you can use it, but you won't, because it won't work well for you.

Mantle is completely open, and since it's a low-level interaction between the GPU and the software, all it would need is a game patch and/or a driver patch to add support for Nvidia and Intel GPUs.

It's similar to how many Intel GPUs didn't support OpenCL until Intel made the drivers for it. Everyone sees OpenCL as AMD-specific, whilst Intel and Nvidia have wide support for it; AMD GPUs are simply better at it due to having more raw power.
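That driver dependence is easy to see for yourself: enumerating OpenCL platforms just reports whatever vendor drivers happen to be installed. A quick sketch, assuming the pyopencl package is available:

```python
# OpenCL "support" is largely a driver question: this lists whatever
# platforms and devices the installed vendor drivers expose.
# Assumes pyopencl is installed (pip install pyopencl).
import pyopencl as cl

for platform in cl.get_platforms():        # one platform per vendor driver
    for device in platform.get_devices():  # devices that driver exposes
        print(f"{platform.name} -> {device.name}")
```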

 

[sarcasm] Obviously, since AMD GPUs are better at it, it's biased [/sarcasm]. All I'm trying to say is that when Intel/Nvidia are on top it's all fine and dandy, but the moment AMD have something to be proud of and a throne to sit on, someone must come and inspect the finest details of that throne to find the one particle that is not gold.


@Kuzma they sit on a throne of Platinum studded with jewels.
