[CES 2015] AMD FreeSync Monitors

bogus

Partially. VSync is used as a backup for when the frame rate is above the highest refresh rate, so if you had a 120Hz G-Sync panel playing Minecraft at 300 fps you should get screen tearing (I haven't seen it in person). It appears that both solutions can only work within a certain range without screen tearing.

That hasn't been my experience. Playing LoL on a GTX 780 Ti, it's capped at 144 fps on the ROG Swift and I have never experienced tearing. It's also very noticeable when I turn off G-Sync: way more jitter. I haven't really played any other games where I hit the monitor's refresh rate cap, so I can't comment on more demanding games. Gonna go play around with that now :)


Does G-Sync have to use VSync when the frame rate is above 60 fps, or is it just the FreeSync implementation that has to do it? EDIT: Is it just me, or does G-Sync sound a lot better than FreeSync? I thought the whole point of G-Sync was to eliminate the use of VSync. But FreeSync has to use VSync (to eliminate tearing), so won't that introduce input lag, or do I have it all wrong? (Assuming your frame rate changes from around 40 fps to around 70-80 fps.)

No, you just go into the Nvidia Control Panel and change Vertical Sync to G-Sync to use G-Sync. If you enable VSync in the game, it doesn't do anything because the driver overrides it. FreeSync sounds better, but anyway we'll see which does better (like it matters).


FreeSync is the name of AMD's driver support for the Adaptive-Sync standard. If Intel added driver support for Adaptive-Sync, it would work on computers with Intel graphics and a compatible DisplayPort output. If Nvidia were super awesome, they could add driver support for Adaptive-Sync too.

 

We all know that the last one won't happen :(

Laptops with FreeSync >> YAY <<

Desktop support >> slap on a DisplayPort << not sure if vendors would put in the time and money to have DisplayPort outputs on their mobos



FreeSync is the name of AMD's driver support for the Adaptive-Sync standard. If Intel added driver support for Adaptive-Sync, it would work on computers with Intel graphics and a compatible DisplayPort output. If Nvidia were super awesome, they could add driver support for Adaptive-Sync too.

 

We all know that the last one won't happen :(

You've hit the nail right on the head. Nvidia will die before they support FreeSync, especially when they have G-Sync. That would just kill their market, IMO.

 

I really hope that this destroys the G-Sync competition. I'm an Nvidia user, but I can't stand the price difference for G-Sync.

Damn straight, man. I'm the exact same. I absolutely refuse to pay a £200 premium for G-Sync.

 

 

Below 30 fps (you're screwed :) ), between 30 and 60 fps it's FreeSync, and above 60 it's VSync.

 

When you say above 60 it's VSync, you mean the maximum refresh rate the monitor supports, right? I think Ryan selected the wrong monitor as an example, because to me it kind of sounded like FreeSync is useless above 60.

What he meant was that if you go above the maximum refresh rate of the monitor, VSync will be enabled; it's just that the monitor he talked about had a maximum refresh rate of 60. (Just making sure we're on the same page, I think that's how he meant it.)

Which to me is fine, because the maximum frames per second is controllable in most games.
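
Rough sketch of the behaviour being described, purely for illustration (C++, not any real driver's logic; the 40-144 Hz window, the mode names and the pickMode() helper are all made up for the example):

#include <iostream>

// Hypothetical modes, for illustration only.
enum class SyncMode { BelowRange, VariableRefresh, VSyncCapped };

// Decide behaviour from the current frame rate and the panel's variable-refresh window.
SyncMode pickMode(double fps, double panelMinHz, double panelMaxHz) {
    if (fps < panelMinHz) return SyncMode::BelowRange;   // panel cannot refresh this slowly
    if (fps > panelMaxHz) return SyncMode::VSyncCapped;  // above the window: cap (or tear)
    return SyncMode::VariableRefresh;                    // refresh rate follows the frame rate
}

int main() {
    const double minHz = 40.0, maxHz = 144.0;  // assumed example panel range
    for (double fps : {25.0, 60.0, 120.0, 300.0}) {
        SyncMode m = pickMode(fps, minHz, maxHz);
        std::cout << fps << " fps -> "
                  << (m == SyncMode::BelowRange      ? "below range (fallback)" :
                      m == SyncMode::VariableRefresh ? "variable refresh" :
                                                       "capped at panel max (VSync-style)")
                  << '\n';
    }
}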


Dude, I'm an AMD fan, but FreeSync sounds terrible. G-Sync has a minimum fps but not a maximum. For FreeSync to work in such a small window is terrible. I saw the PC Perspective review and wanted to see what people were saying on the LTT forums.

 

Let's clear this up though. FreeSync is not better than G-Sync. It is free, and we hope it brings down the price of G-Sync monitors, but why would you switch to VSync above 60 fps and below 30 fps? Just don't have adaptive sync then. If you don't think there will be in-game lag from switching from FreeSync to VSync, you're crazy. If you turn off VSync, you get tearing outside the range. Let's just hope that Zen is a great CPU for AMD, or this whole thing is crashing down.

 

*This is off topic, but Intel just stole the whole of CES, and if they continue on this path, who can even challenge them? Nvidia, Qualcomm, AMD, and Samsung are all dependent on GlobalFoundries and TSMC. Intel has the best R&D and they do the manufacturing on their own. I hate Intel, but if they master dedicated graphics, every device will be powered by Intel.


FreeSync is the name of AMD's driver support for the Adaptive-Sync standard. If Intel added driver support for Adaptive-Sync, it would work on computers with Intel graphics and a compatible DisplayPort output. If Nvidia were super awesome, they could add driver support for Adaptive-Sync too.

We all know that the last one won't happen :(

I just hope Nvidia driver modders can patch in support and bypass Nvidia being stubborn about supporting the new standard.



Dude, I'm an AMD fan, but FreeSync sounds terrible. G-Sync has a minimum fps but not a maximum. For FreeSync to work in such a small window is terrible. I saw the PC Perspective review and wanted to see what people were saying on the LTT forums.

FreeSync doesn't have a max either.

The limitation is the panel's max refresh rate, which varies from monitor to monitor.


If FreeSync requires an AMD card... this doesn't sound free... unless they're talking about being free from tearing...

FreeSync is just a feature in AMD's driver software, so it doesn't cost anything for the user.

They will need Adaptive-Sync capable monitors though; hopefully those aren't too pricey. And anyone is free to add in support for Adaptive-Sync, since it's a DisplayPort standard, not AMD intellectual property.

BTW

OP: change your title to Adaptive-Sync monitors.



Dude, I'm an AMD fan, but FreeSync sounds terrible. G-Sync has a minimum fps but not a maximum. For FreeSync to work in such a small window is terrible. I saw the PC Perspective review and wanted to see what people were saying on the LTT forums.

Let's clear this up though. FreeSync is not better than G-Sync. It is free, and we hope it brings down the price of G-Sync monitors, but why would you switch to VSync above 60 fps and below 30 fps? Just don't have adaptive sync then. If you don't think there will be in-game lag from switching from FreeSync to VSync, you're crazy. If you turn off VSync, you get tearing outside the range. Let's just hope that Zen is a great CPU for AMD, or this whole thing is crashing down.

 

Adaptive-Sync has a max of 240 Hz, so that really is not an issue.

I must admit it does sound odd that these monitors will use VSync when the frame rate goes above the max Hz supported by the panel, but it really does seem to be the best solution, and also what I sort of expected would happen. I did ask whether there is any benefit to rendering frames that will never be shown on the monitor, but I never got any answers.

The low 30 Hz border is a hardware limitation, and really, who games below 30 fps?

For G-Sync, this is what happens when the fps goes above the max Hz supported by the monitor:

[attached image: CS:GO input lag chart]

So really this is the best way to deal with it.

Not sure what you mean about Adaptive-Sync not being better than G-Sync. It's cheaper, has a wider Hz interval (9-240 Hz in the spec), is an open standard, has many different implementations from different vendors, etc.

Instead, let me ask: why do you think G-Sync is better than Adaptive-Sync?

I do like how LG has improved their 29" 21:9 monitor to 75 Hz instead of 60 Hz, although I'd rather have a 34" 1440p curved monitor with Adaptive-Sync. But we will see 144 Hz Adaptive-Sync monitors too.



I must admit it does sound odd that these monitors will use VSync when the frame rate goes above the max Hz supported by the panel, but it really does seem to be the best solution, and also what I sort of expected would happen. I did ask whether there is any benefit to rendering frames that will never be shown on the monitor, but I never got any answers.

As long as it's VSync with triple buffering, then it would be good (no input lag), as it would be displaying an up-to-date frame.

http://www.anandtech.com/show/2794/2


As long as it's VSync with triple buffering, then it would be good (no input lag), as it would be displaying an up-to-date frame.

http://www.anandtech.com/show/2794/2

 

It makes no sense to use any buffer at all. A massive three-frame buffer would create a lot of "input lag". The point of VSync here is not that the card cannot keep up, but to set an artificial limiter the card will not go above. Introducing latency via frame buffers makes no sense, especially when the fps then drops back down into the panel's Hz interval. Then the buffer would have to be flushed and you get stutter, thus removing one of the key points of synced fps to begin with.



It makes no sense to use any buffer at all. A massive three-frame buffer would create a lot of "input lag". The point of VSync here is not that the card cannot keep up, but to set an artificial limiter the card will not go above. Introducing latency via frame buffers makes no sense, especially when the fps then drops back down into the panel's Hz interval. Then the buffer would have to be flushed and you get stutter, thus removing one of the key points of synced fps to begin with.

Isn't the point of triple buffering to allow the GPU to render faster than the refresh rate, with only the most up-to-date frame being put into the output buffer for the monitor?

It's not a three-frame buffer. The most up-to-date frame which the GPU has finished drawing gets displayed, at least according to AnandTech.


Isn't the point of triple buffering to allow the GPU to render faster than the refresh rate, with only the most up-to-date frame being put into the output buffer for the monitor?

It's not a three-frame buffer. The most up-to-date frame which the GPU has finished drawing gets displayed, at least according to AnandTech.

Kind of, but if all three (or more) buffers are filled, then the graphics card will usually idle. These buffers are generally there to keep up VSync when the fps drops below the VSync fps/Hz. All the same, these are still buffers that will result in added latency, which is (amongst other things) what Adaptive-Sync and G-Sync try to get rid of.

But of course it depends on whether the triple buffering is used on its own or combined with VSync. http://en.wikipedia.org/wiki/Multiple_buffering#Triple_buffering
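
For what it's worth, here's a toy model (C++, purely illustrative and clearly not a real swap chain) of the difference being argued about: "swap newest" triple buffering always flips the most recently completed frame, while a render-ahead queue shows frames strictly in order and so drifts behind when the GPU outruns the display.

#include <deque>
#include <iostream>

// Toy model: the GPU completes two frames per display refresh ("vblank").
int main() {
    int nextFrame = 0;
    std::deque<int> renderAheadQueue;  // frames shown strictly in completion order
    int newestCompleted = -1;          // "swap newest": the older back buffer gets overwritten

    for (int vblank = 0; vblank < 5; ++vblank) {
        for (int i = 0; i < 2; ++i) {  // GPU runs ahead of the display
            renderAheadQueue.push_back(nextFrame);
            newestCompleted = nextFrame;
            ++nextFrame;
        }
        const int shownFromQueue = renderAheadQueue.front();  // oldest queued frame
        renderAheadQueue.pop_front();
        std::cout << "vblank " << vblank
                  << ": queue shows frame "       << shownFromQueue
                  << ", swap-newest shows frame " << newestCompleted << '\n';
    }
    // Note: real render-ahead queues are capped (often around three frames), at which
    // point the GPU stalls; the uncapped growth here is just to show the latency drift.
}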



Kind of, but if all three (or more) buffers are filled, then the graphics card will usually idle.

It's supposed to overwrite one of the back buffers as soon as possible, not idle. That's the point of having two buffers on the back end.

Did you read the AnandTech link?


Nvidia won't support VESA Adaptive-Sync at all, right? I once read that there was a chance Nvidia would do it... but a new war has begun.

If G-Sync and FreeSync/Adaptive-Sync don't merge, it will be a really bad thing for us consumers.



It's supposed to overwrite one of the back buffers as soon as possible, not idle. That's the point of having two buffers on the back end.

Did you read the AnandTech link?

 

I skimmed it, but did you read my link? In your own link, read the update on the last page. There are many ways of doing "triple buffering", and one is render-ahead, which DX11 uses. That is a queue system.

But all of this is irrelevant, as it creates latency either way. There is no reason for a GPU to render frames faster than the VSync max is set to (like 60 Hz/fps), so why use a triple buffer when the monitor will just lower its Hz in sync when the fps drops below the 60 Hz VSync?

Again, VSync is only used here as an fps limiter set to the panel's max Hz, not as a buffered way to sustain a fixed, or at least fairly stable, fps.
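
To make the "VSync as an fps limiter" point concrete, here is a minimal sketch of a frame cap (C++, illustrative only; a sleep-based loop and the 144 Hz figure are assumptions, and real drivers and engines pace frames far more precisely):

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double panelMaxHz = 144.0;  // assumed panel limit
    const std::chrono::duration<double> frameBudget(1.0 / panelMaxHz);

    for (int frame = 0; frame < 10; ++frame) {
        const auto start = clock::now();

        // ... render the frame here ...

        // If we finished early, wait out the rest of the budget instead of
        // starting a frame the panel could never show anyway.
        const auto elapsed = clock::now() - start;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);

        std::cout << "frame " << frame << " presented\n";
    }
}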



Nvidia won't support VESA Adaptive-Sync at all, right?

Technically they can, once they start making cards with the new version of DisplayPort.


Technically they can, once they start making cards with the new version of DisplayPort.

I mean... they don't want to. This is understandable, but kinda sad for us.

I hope to have the money to get a new PC this year, and I'm not going to get anything less than 4K 60 Hz or 1440p 120+ Hz... possibly with G-Sync/FreeSync. But it depends too much on which GPU brand will "win" this year...

And going from my 21.5" 1080p to a 27" 1440p doesn't sound very good, since the PPI will barely increase and my distance from the monitor will probably stay the same.



I mean... they don't want to. This is understandable, but kinda sad for us.

I hope to have the money to get a new PC this year, and I'm not going to get anything less than 4K 60 Hz or 1440p 120+ Hz... possibly with G-Sync/FreeSync. But it depends too much on which GPU brand will "win" this year...

And going from my 21.5" 1080p to a 27" 1440p doesn't sound very good, since the PPI will barely increase and my distance from the monitor will probably stay the same.

If the industry (monitor manufacturers) picks up Adaptive-Sync in a big way, then I don't think they can afford NOT to support it. So we will see.



Adaptive-Sync has a max of 240 Hz, so that really is not an issue.

I must admit it does sound odd that these monitors will use VSync when the frame rate goes above the max Hz supported by the panel, but it really does seem to be the best solution, and also what I sort of expected would happen. I did ask whether there is any benefit to rendering frames that will never be shown on the monitor, but I never got any answers.

The low 30 Hz border is a hardware limitation, and really, who games below 30 fps?

For G-Sync, this is what happens when the fps goes above the max Hz supported by the monitor:

[attached image: CS:GO input lag chart]

So really this is the best way to deal with it.

Not sure what you mean about Adaptive-Sync not being better than G-Sync. It's cheaper, has a wider Hz interval (9-240 Hz in the spec), is an open standard, has many different implementations from different vendors, etc.

Instead, let me ask: why do you think G-Sync is better than Adaptive-Sync?

I do like how LG has improved their 29" 21:9 monitor to 75 Hz instead of 60 Hz, although I'd rather have a 34" 1440p curved monitor with Adaptive-Sync. But we will see 144 Hz Adaptive-Sync monitors too.

OK, I guess I misunderstood what they were saying. If the limitation of FreeSync is the max Hz of the monitor, then it doesn't matter; just cap the fps to the max Hz of your monitor. The idea of switching from VSync to FreeSync and back is worthless though. AMD just needs an override to set a max fps if FreeSync is enabled.


More great news: apparently the Asus MG279Q, a 27" 1440p IPS at 120 Hz, supports DP 1.2a+ Adaptive-Sync, and an AMD representative confirmed that any monitor with variable refresh rate (obviously, since it's the VESA standard AMD pushed forward, not supported by NVIDIA) will work with FreeSync.

Asus, for obvious reasons (yes, NVIDIA G-Sync), isn't branding it as a FreeSync monitor. But yes, it is indeed a FreeSync-capable monitor, a 40 Hz to 120 Hz IPS panel, for €599!

You go AMD!

Source: http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor


More great news: apparently the Asus MG279Q, a 27" 1440p IPS at 120 Hz, supports DP 1.2a+ Adaptive-Sync, and an AMD representative confirmed that any monitor with variable refresh rate (obviously, since it's the VESA standard AMD pushed forward, not supported by NVIDIA) will work with FreeSync.

Asus, for obvious reasons (yes, NVIDIA G-Sync), isn't branding it as a FreeSync monitor. But yes, it is indeed a FreeSync-capable monitor, a 40 Hz to 120 Hz IPS panel, for €599!

You go AMD!

Source: http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor

Ahhh... the benefits of open standards.

So buying this monitor ensures you are not locked into Nvidia or AMD; you can buy the best GPU for your budget at the time.


Ahhh... the benefits of open standards.

So buying this monitor ensures you are not locked into Nvidia or AMD; you can buy the best GPU for your budget at the time.

I don't think that's what he meant... he just wrote that this Asus monitor isn't directly marketed as "FreeSync" (AMD only) but rather under the general "Adaptive-Sync" VESA standard, which is currently not supported by Nvidia (and who knows if it will be any time soon).

I bet it's because of Nvidia pressure on Asus over G-Sync (since they had the exclusive to be the first brand to sell a G-Sync product).



Ahhh... the benefits of open standards.

So buying this monitor ensures you are not locked into Nvidia or AMD; you can buy the best GPU for your budget at the time.

I don't think that's what he meant... he just wrote that this Asus monitor isn't directly marketed as "FreeSync" (AMD only) but rather under the general "Adaptive-Sync" VESA standard, which is currently not supported by Nvidia (and who knows if it will be any time soon).

I bet it's because of Nvidia pressure on Asus over G-Sync (since they had the exclusive to be the first brand to sell a G-Sync product).

You can't use FreeSync with Nvidia because Nvidia doesn't support it. They are happy that the Asus panel is supported for AMD cards even though the panel isn't on the approved list. Now, the consequences of not being on the supported list aren't something we'll know until it's out.

Remember too, guys, that G-Sync does use a chip in the monitor where FreeSync does not. We can talk until we are blue in the face, but the true test will be when it comes out and we can compare the two.

I just really hope that AMD isn't so stupid that their real solution to being out of FreeSync's range is using VSync. That is what I don't think anyone can explain, and if they could it would be great. Adaptive sync is supposed to get rid of VSync. This has nothing to do with triple-buffered VSync; it is solely adaptive sync vs. VSync. No one is asking for a solution that switches between the two.

Also, I would be curious whether you guys would be willing to have a hard cap on fps when you use FreeSync?


kn1ghtnsh1narmr said:

You can't use FreeSync with Nvidia because Nvidia doesn't support it. They are happy that the Asus panel is supported for AMD cards even though the panel isn't on the approved list. Now, the consequences of not being on the supported list aren't something we'll know until it's out.

If it has a VRR-compatible scaler, I don't know of a reason why it shouldn't work.

Remember too, guys, that G-Sync does use a chip in the monitor where FreeSync does not. We can talk until we are blue in the face, but the true test will be when it comes out and we can compare the two.

It does have a chip. It has a scaler (the bit of the PCB that handles what is sent to the panel and when) that can handle variable refresh rates.

I just really hope that AMD isn't so stupid that their real solution to being out of FreeSync's range is using VSync. That is what I don't think anyone can explain, and if they could it would be great. Adaptive sync is supposed to get rid of VSync. This has nothing to do with triple-buffered VSync; it is solely adaptive sync vs. VSync. No one is asking for a solution that switches between the two.

It isn't FreeSync hitting a ceiling, but rather the panel. FreeSync has a frame-rate ceiling of 240 Hz. If, however, you push frames to the monitor faster than it can display them, you WILL get tearing if you don't throttle the delivery rate in some way.

Also, I would be curious whether you guys would be willing to have a hard cap on fps when you use FreeSync?

You have to have a cap on the displayed frame rate, because there is a physical limit to what your monitor can display. And screen tearing is, in my opinion, more of a problem than the bit of input latency from VSync.


