Nvidia Vows to Support AMD's Alternative to G-Sync - Adaptive Sync Gets Green Light

That's surprising but very exciting :D

 

I guess they don't want G-Sync to be an extra cost of going with an Nvidia card.

 

Obviously FreeSync will have an associated markup, but I expect it to be less than G-Sync's and possibly less limiting in terms of monitor choice.


Yes, AMD did something amazing.

They got Nvidia, the king of proprietary bullshit, to drop their technology and use open standards. In the future you won't be locked to a GPU manufacturer by your monitor.

A riddle wrapped in an enigma, shot to the moon and made in China


nvidia only created g sync to push the market,

Ha... no. It's Nvidia, just trying to limit the market with proprietary hardware to milk money out of consumers. I really wanted to love Nvidia, but that godawful podcast with Nvidia employees, as well as its decisions as a company (PhysX, GameWorks, features limited to their own hardware that may not work on competitors' cards, and not letting them optimize for it), is just bad for everyone.

 

 


This is why we need healthy competition. Thanks, AMD, for FreeSync.


This just makes DisplayPort even more the only connector you'll ever need for video. Hopefully it will kill off the terribleness that is HDMI.

How about we kill bulky connectors like VGA and DVI first? HDMI 2.0 is a great improvement.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


VGA is to support businesses that refuse to upgrade their computer hardware more than once every 20 years; DVI is a "safe" bet that somebody's graphics card will have it. We have to tolerate HDMI because, for some arcane reason, things like consoles, DVD players and satellite set-top boxes are still shipping with that connector.

If 4K Blu-ray content becomes mainstream, HDMI should die out though, what with not supporting the bitrate required by that resolution and framerate.

HDMI 2.0 does support it, so you're stuck with it existing.



Nvidia must have a lot of faith in their technology to be doing this. We still don't even know if FreeSync will be the same as, better than, or worse than G-Sync.

Case: Phanteks Evolve X with ITX mount  cpu: Ryzen 3900X 4.35ghz all cores Motherboard: MSI X570 Unify gpu: EVGA 1070 SC  psu: Phanteks revolt x 1200W Memory: 64GB Kingston Hyper X oc'd to 3600mhz ssd: Sabrent Rocket 4.0 1TB ITX System CPU: 4670k  Motherboard: some cheap asus h87 Ram: 16gb corsair vengeance 1600mhz


How about we kill bulky connectors like VGA and DVI first? HDMI 2.0 is a great improvement.

Don't Nvidia's new GPUs have 3 DisplayPort, 1 HDMI and just 1 DVI port?

I just wish everyone would just standardize on DisplayPort. Although it'll probably never happen...


to completely shift focus

 

I certainly hope not.

 

Anyway in all seriousness, yes. This is good. This is very, very good. Unfortunately it looks like Maxwell cards are sporting DisplayPort 1.2, not 1.3...


Adaptive Sync was proposed by AMD, based on FreeSync. Adaptive Sync is AMD's doing, no one else's.

 

 

So much wrong here:

  1. No, eDP does not have Adaptive Sync. You cannot use a variable refresh rate synced to your fps on a laptop (yet, I guess).
  2. Both Adaptive Sync and G-Sync are based on variable VBLANK, which is a power savings feature from eDP (hence your confusion). If Adaptive Sync is just an eDP feature, then so is G-Sync.
  3. AMD has worked on FreeSync for years.
  4. FreeSync was announced and created before Adaptive Sync.

Yes, Adaptive Sync was a feature in eDP.

"VESA sends word today that they have done just that with the latest update to the DisplayPort 1.2a standard. Adaptive-Sync (not to be confused with NVIDIA’s Adaptive V-Sync), the eDP feature that allows for variable refresh monitors, has been added to the DisplayPort 1.2a standard as an optional feature"

The only thing it did in eDP was, for example, lower the refresh rate to 30Hz when you're on the desktop or something, for extended battery life.

FreeSync, on the other hand, calculates (approximates) the frames per second your GPU is outputting and sends that data to the Adaptive Sync chip/whatever it is, which in turn changes the monitor's refresh rate.

 

Adaptive Sync works by changing the VBLANK value, hence variable VBLANK.

VBLANK = the time difference between the last line of one frame and the beginning of the first line of the next frame.
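To put rough numbers on that definition (illustrative timings, not from any spec): refresh rate is just pixel clock divided by total pixels per frame, so stretching the VBLANK interval lowers the effective refresh rate:

```python
# Rough illustration of how stretching VBLANK lowers the refresh rate.
# The timing numbers are hypothetical examples, not from the DisplayPort spec.

def refresh_hz(pixel_clock_hz, h_total, v_active, v_blank):
    """Refresh rate = pixel clock / total pixels scanned per frame."""
    return pixel_clock_hz / (h_total * (v_active + v_blank))

# A 1080p-style timing: 148.5 MHz pixel clock, 2200 pixels per line.
base = refresh_hz(148_500_000, 2200, 1080, 45)       # nominal VBLANK
stretched = refresh_hz(148_500_000, 2200, 1080, 1170)  # much longer VBLANK
print(round(base), round(stretched))  # 60 30
```

Same panel, same pixel clock; only the blanking interval changed, and the refresh rate halved.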

 

eDP came out in 2008. I don't think AMD was working on FreeSync even before that.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Oh my god, has the world gone crazy? Nvidia pushing open standards? What's next, Putin apologizing for Crimea?


I am almost at the point now where I want both Nvidia and AMD to go bankrupt so the fanboys can stop messing up these threads with BS fanboy rants. If you don't like what either company has done, or you think one company is intrinsically better, then you are a fanboy and should stop posting, so people can actually discuss a product on its merits without having to suffer pointless insinuations about the merits of a company (or worse, assumptions about their activities).

 

 

I am glad to hear this is happening. I remember not too long ago almost everyone was saying they never would. I think I said it was not likely, given that's not how companies make money, but I'm glad to see I was wrong on this one.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Nvidia has been there since last year. AMD has the VESA vice chairman, Syed Athar Hussain, as their display domain architect; Nvidia has their display technical marketing manager, Pablo Ortega, as a board member.

Who do you think has more pull on the Board?  http://www.vesa.org/news/vesa-announces-new-vice-chairman-to-board-of-directors/



Yes, Adaptive Sync was a feature in eDP.

"VESA sends word today that they have done just that with the latest update to the DisplayPort 1.2a standard. Adaptive-Sync (not to be confused with NVIDIA’s Adaptive V-Sync), the eDP feature that allows for variable refresh monitors, has been added to the DisplayPort 1.2a standard as an optional feature"

The only thing it did in eDP was, for example, lower the refresh rate to 30Hz when you're on the desktop or something, for extended battery life.

FreeSync, on the other hand, calculates (approximates) the frames per second your GPU is outputting and sends that data to the Adaptive Sync chip/whatever it is, which in turn changes the monitor's refresh rate.

 

Adaptive Sync works by changing the VBLANK value, hence variable VBLANK.

VBLANK = the time difference between the last line of one frame and the beginning of the first line of the next frame.

 

eDP came out in 2008. I don't think AMD was working on FreeSync even before that.

 

You are almost correct, but still confusing some details. The eDP feature allowing variable refresh rate is variable VBLANK, but it is NOT synced to the fps of the graphics card. The entire point of Adaptive Sync is to sync the refresh rate and fps together; variable VBLANK as used in eDP is not capable of doing this. No sync, no Adaptive Sync. Furthermore, the standardization of AS is not part of eDP either: there is no initial handshake for the display controller to provide the graphics card with min/max supported refresh rates, for instance. Just because Adaptive Sync and G-Sync are both based on variable VBLANK technology does not mean eDP supports either, nor that either was ever a feature in eDP; it was not.

 

FreeSync does not calculate anything; it works like this:

  1. On connection/startup, the monitor handshakes with the graphics card, supplying its supported min/max refresh rates, for instance 20 min, 144 max.
  2. The graphics card, in games, will then supply frames within that min/max interval. Like G-Sync, if fps drops below the supported min, you will get stuttering. We don't know how FreeSync will react above max, but G-Sync craps out when fps goes over 120/144Hz.
  3. It works like this: the graphics card sends an image to the monitor controller, followed by a VBLANK start signal, telling the monitor to hold off on scanning a new image (after it has scanned and displayed the one just sent). Then, when a new image is ready, the graphics card sends a VBLANK end signal, followed by the new image, telling the monitor to start scanning again instead of just holding the previous image on the panel. It's very simple, which is why it does not add a lot of cost to monitors.
  4. It is this lean and simplified communication between controller and graphics card that makes AS cheaper, simpler and therefore better than G-Sync. No monitor RAM needed, and no complicated 2-way communication for each frame sent.
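The handshake-then-one-way flow described in those steps can be sketched as a toy simulation (all class and method names are made up for illustration; this is not AMD's actual driver or scaler API):

```python
# Toy simulation of the flow above: one handshake for min/max rates, then
# one-way "here's a frame" signalling. Names and numbers are illustrative only.

class Monitor:
    def __init__(self, min_hz, max_hz):
        self.min_hz, self.max_hz = min_hz, max_hz  # advertised capabilities
        self.displayed = None

    def handshake(self):
        # Step 1: tell the graphics card the supported refresh window.
        return self.min_hz, self.max_hz

    def vblank_end(self, frame):
        # Step 3: a new frame arrives, so stop holding and scan it out.
        self.displayed = frame

class GraphicsCard:
    def __init__(self, monitor):
        self.monitor = monitor
        self.min_hz, self.max_hz = monitor.handshake()

    def present(self, frame, fps):
        # Step 2: stay inside the supported window; outside it the driver must
        # fall back (repeat frames below min, cap or tear above max).
        effective_hz = max(self.min_hz, min(fps, self.max_hz))
        self.monitor.vblank_end(frame)
        return effective_hz

mon = Monitor(20, 144)
gpu = GraphicsCard(mon)
print(gpu.present("frame A", 90))   # in range: refresh follows fps -> 90
print(gpu.present("frame B", 200))  # above max: clamped to 144
```

Note there is no per-frame reply from the monitor here; that is the "lean" part the post is arguing makes AS cheaper than G-Sync's 2-way scheme.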

It is no news that both FreeSync and G-Sync are based on this eDP feature, variable VBLANK. But both AS and GS provide an added feature (the sync part), which does not exist in eDP (yet; I hope it will come later on).

Where Nvidia chose to brand their entire solution, both software and hardware, under one name, G-Sync, AMD chose to split FreeSync into software AND hardware: the software part being their drivers controlling this tech, and the hardware part being the supported display controllers. AMD did this because they wanted the hardware part to become an industry standard. That is why Adaptive Sync was announced quite a bit later than FreeSync, even though AMD is responsible for both (AMD proposed AS to VESA as a new standard).

 

As for 2008: AMD has been working on FreeSync for years. Maybe not since 2008, but remember AMD is part of the VESA group, so they were obviously involved in the specification of eDP back before 2008.

I hope this clears up the confusion.

 

No, you got it the other way around. FreeSync was based on the VESA standard that became Adaptive-Sync.

FreeSync is based on Adaptive-Sync, not the other way around. Pretty sure even the Adaptive-Sync FAQ from VESA says this. Hang on I'll try to find it.

 

Source: VESA FAQ

 

Read the above; it should explain the difference. Adaptive Sync is based on the FreeSync standard and was proposed by AMD to VESA after FreeSync was announced. The VESA FAQ does not dispute this, but focuses on its own branding and IP instead of mentioning AMD (remember VESA is a standards organization, so what it makes, it owns itself, and it will not name other companies; after all, Nvidia is also in the VESA group).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Nvidia has been there since last year. AMD has the VESA vice chairman, Syed Athar Hussain, as their display domain architect; Nvidia has their display technical marketing manager, Pablo Ortega, as a board member.

Who do you think has more pull on the Board?  http://www.vesa.org/news/vesa-announces-new-vice-chairman-to-board-of-directors/

 

Boards aren't dictatorships that work on hierarchy; they work by democratic votes and working groups.

Realistically, neither company has more sway than the other.



I certainly hope not.

Anyway in all seriousness, yes. This is good. This is very, very good. Unfortunately it looks like Maxwell cards are sporting DisplayPort 1.2, not 1.3...

1.3 was JUST released as a standard >:( As if anyone would have a product ready that fast.



Who would have thought it? Very, very cool. Companies should support competitors' systems more often. More of this please :)

Never trust a man, who, when left alone with a tea cosey... Doesn't try it on. Billy Connolly
Marriage is a wonderful invention: then again, so is a bicycle repair kit. Billy Connolly
Before you judge a man, walk a mile in his shoes. After that, who cares? He's a mile away and you've got his shoes. Billy Connolly

FreeSync does not calculate anything; it works like this:

  • On connection/startup, the monitor handshakes with the graphics card, supplying its supported min/max refresh rates, for instance 20 min, 144 max.
  • The graphics card, in games, will then supply frames within that min/max interval. Like G-Sync, if fps drops below the supported min, you will get stuttering. We don't know how FreeSync will react above max, but G-Sync craps out when fps goes over 120/144Hz.

What happens? How/why does it crap out? I haven't seen anything about this in the reviews I've read.


What happens? How/why does it crap out? I haven't seen anything about this in the reviews I've read.

[chart: CS:GO input lag at different framerate caps]

 

The input lag spikes when the game outputs more frames than the monitor can handle. The monitor used can handle 144Hz, but it derps at 143 as well; not sure if that is the game derping or what. But you can see the input lag dropping a lot when the game is limited to 120Hz rendering.

G-Sync introduces more input lag than no G-Sync because of its overcomplicated 2-way communication and 768MB monitor RAM buffer. Another reason Adaptive Sync should be better (we will see). The difference is very small, so it should not prove a big problem, but there is an issue here.

http://www.blurbusters.com/gsync/preview2/

 

For most people, this will never be an issue on a 144Hz monitor. But when we get Adaptive Sync on 60Hz monitors, it could be, so it will be interesting to see how AS deals with this.
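The practical workaround in that Blur Busters test was simply capping the game's framerate below the monitor's max. A minimal sketch of such a limiter (a hypothetical helper, not from any real game engine):

```python
import time

# Minimal frame-rate limiter sketch: sleep between frames so the game never
# outpaces the monitor's max refresh (e.g. cap at 120 fps on a 144Hz panel).
# Purely illustrative; real engines use busy-wait hybrids for tighter pacing.

def limited_loop(render_frame, cap_hz, n_frames):
    interval = 1.0 / cap_hz
    deadline = time.perf_counter()
    timestamps = []
    for _ in range(n_frames):
        render_frame()                       # simulate the rendering work
        deadline += interval                 # next frame due one interval later
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)            # wait out the rest of the interval
        timestamps.append(time.perf_counter())
    return timestamps

# Five frames capped at 120 fps should take roughly 5/120 of a second in total.
stamps = limited_loop(lambda: None, 120, 5)
```

With the cap below the panel's max refresh, every frame lands inside the variable-refresh window, which is why the input lag in the chart drops at 120 fps.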



At least now there is a standard, so you don't have to pledge to a certain brand of graphics card if you have an Adaptive Sync monitor!

Very good news!

Build: Sister's new build |CPU i5 2500k|MOBO MSI h61m-p23 b3|PSU Rosewill 850w  |RAM 4GB 1333|GPU Radeon HD 6950 2GB OCedition|HDD 500GB 7200|HDD 500GB 7200|CASE Rosewill R5|Status online


Build: Digital Vengeance|CPU i7 4790k 4.8GHz 1.33V|MOBO MSI z97-Gaming 7|PSU Seasonic Xseries 850w|RAM 16GB G.skill sniper 2133|GPU Dual R9 290s|SSD 256GB Neutron|SSD 240GB|HDD 2TB 7200|CASE Fractal Design Define R5|Status online


HDMI 2.0 does support it, so you're stuck with it existing.

 

:(

 

 

At least DVI has the fact that the connector can be screwed on firmly going for it.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


You are almost correct, but still confusing some details. The eDP feature allowing variable refresh rate is variable VBLANK, but it is NOT synced to the fps of the graphics card. The entire point of Adaptive Sync is to sync the refresh rate and fps together; variable VBLANK as used in eDP is not capable of doing this. No sync, no Adaptive Sync. Furthermore, the standardization of AS is not part of eDP either: there is no initial handshake for the display controller to provide the graphics card with min/max supported refresh rates, for instance. Just because Adaptive Sync and G-Sync are both based on variable VBLANK technology does not mean eDP supports either, nor that either was ever a feature in eDP; it was not.

 

FreeSync does not calculate anything; it works like this:

  1. On connection/startup, the monitor handshakes with the graphics card, supplying its supported min/max refresh rates, for instance 20 min, 144 max.
  2. The graphics card, in games, will then supply frames within that min/max interval. Like G-Sync, if fps drops below the supported min, you will get stuttering. We don't know how FreeSync will react above max, but G-Sync craps out when fps goes over 120/144Hz.
  3. It works like this: the graphics card sends an image to the monitor controller, followed by a VBLANK start signal, telling the monitor to hold off on scanning a new image (after it has scanned and displayed the one just sent). Then, when a new image is ready, the graphics card sends a VBLANK end signal, followed by the new image, telling the monitor to start scanning again instead of just holding the previous image on the panel. It's very simple, which is why it does not add a lot of cost to monitors.
  4. It is this lean and simplified communication between controller and graphics card that makes AS cheaper, simpler and therefore better than G-Sync. No monitor RAM needed, and no complicated 2-way communication for each frame sent.

From my understanding, a monitor, even if it has Adaptive Sync, still needs a FreeSync module or something like that.

If that is the case, then it means that Adaptive Sync can't communicate with the GPU without the help of FreeSync.



From my understanding, a monitor, even if it has Adaptive Sync, still needs a FreeSync module or something like that.

If that is the case, then it means that Adaptive Sync can't communicate with the GPU without the help of FreeSync.

FreeSync is the driver support to control the VBLANK timings. The reason some AMD cards do not fully support FreeSync is that their display controller is not capable; all graphics cards have such a chip on them, but only the newer AMD GPUs and APUs have a capable one. There is no such thing as a FreeSync module. Only G-Sync has a module, which is a mini computer replacing the monitor's display controller (scaler).

 

With the exception of the 2-way handshake at computer startup/display connection, Adaptive Sync/FreeSync is a 1-way communication, where the graphics card sends frames within the supported Hz range of the monitor, thus telling the monitor when to display a frame.


