
[CES 2015] AMD Freesync Monitors

bogus

kn1ghtnsh1narmr

If it has a VRR-compatible scaler, I don't know of a reason why it shouldn't work.

Do you really think Nvidia would allow this? FreeSync is a cheaper, easier option, and Nvidia just spent a ton of money on G-Sync. The reason it won't work is that Nvidia will say: if you want to use our drivers, which are closed source (and you have to), then if you want variable refresh rate, buy a G-Sync monitor.

 

If in a year or two FreeSync equals or surpasses G-Sync, then Nvidia will cave and claim they created variable refresh rate, which they did not.


Dude, I'm an AMD fan, but FreeSync sounds terrible. G-Sync has a minimum FPS but not a maximum. Having such a small window in which FreeSync works is terrible. I saw the PC Perspective review and wanted to see what people were saying on the LTT forums.

 

Let's clear this up, though. FreeSync is not better than G-Sync. It is free, and we hope it brings down the price of G-Sync monitors, but why would you switch to V-Sync above 60 FPS and below 30 FPS? Just don't have adaptive sync, then. If you don't think there will be in-game lag from switching from FreeSync to V-Sync, you're crazy. If you turn off V-Sync, you get tearing outside the range. Let's just hope Zen is a great CPU for AMD, or this whole thing comes crashing down.

 

*This is off topic, but Intel just stole the whole CES, and if they continue on this path, who can even challenge them? Nvidia, Qualcomm, AMD, and Samsung are all dependent on GlobalFoundries and TSMC. Intel has the best R&D, and they do their manufacturing on their own. I hate Intel, but if they master dedicated graphics, every device will be powered by Intel.

Sorry, dunno if you've been corrected yet, but

when Ryan said above 60 you need V-Sync, he meant that if you go above the maximum refresh rate of the monitor, V-Sync will be enabled; it's just that the monitor he was talking about had a maximum refresh rate of 60 Hz. I think he confused a lot of people by talking about that specific monitor.

To me it's fine, because the maximum frames per second is controllable in most games.
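Capping frame rate in software is simple in principle. Here is a toy sketch (not any engine's actual limiter) of a render loop that sleeps away the leftover frame budget so it never outruns a chosen cap; keeping the cap just under the panel's maximum refresh rate keeps a variable-refresh monitor inside its VRR window:

```python
import time

def run_frame_loop(render_frame, fps_cap, num_frames):
    """Call render_frame num_frames times, never exceeding fps_cap.

    With fps_cap just below the panel's maximum refresh rate, the
    V-Sync fallback at the top of the VRR window never has to kick in.
    """
    frame_budget = 1.0 / fps_cap  # seconds available per frame
    timestamps = []
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # burn off the spare budget
        timestamps.append(time.perf_counter())
    return timestamps

# A fast fake "render" capped at 60 fps: frame intervals settle near 16.7 ms.
stamps = run_frame_loop(lambda: None, fps_cap=60, num_frames=5)
intervals = [b - a for a, b in zip(stamps, stamps[1:])]
```

Real games expose this as an in-engine FPS limit; the idea is the same.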


Nvidia won't support VESA Adaptive-Sync at all, right? I once read that there was a chance Nvidia would do it... but a new war has begun.

 

If G-Sync and FreeSync/Adaptive-Sync don't merge, it will be a really bad thing for us consumers.

Well, every new monitor with G-Sync on it will have a DisplayPort 1.2a port, which means it will be compatible. It's not necessarily G-Sync and FreeSync merging that's the problem; the problem is if Nvidia chooses not to support it. But that's kind of like shooting themselves in the foot. The only way G-Sync will survive this war is if it works better, which isn't clear yet :/ We'll see.

 

If the industry (monitor manufacturers) picks up Adaptive-Sync in a big way, then I don't think they can afford NOT to support it. So we will see.

But either way, if that happens, they lose. If Adaptive-Sync works just as well as G-Sync (emphasis on IF), then why would anyone pay extra money for G-Sync? Their own business will slowly but surely die.

They can't afford to support it, because it makes their own product irrelevant, and they can't afford NOT to support it if most manufacturers pick it up. Like you said, we'll have to wait and see.

 

More great news: apparently the Asus MG279Q 27" 1440p IPS @ 120 Hz supports DP 1.2a+ Adaptive-Sync, and an AMD representative confirmed that any monitor with Variable Refresh Rate (obviously, since it's the VESA standard AMD pushed forward, not supported by NVIDIA) will work with FreeSync.

Asus, for obvious reasons (yes, NVIDIA G-Sync), isn't branding it as a FreeSync monitor. But yes, it is indeed a FreeSync-capable IPS monitor with a 40 Hz to 120 Hz range for €599!

You go AMD!

Source: http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor

Awesome, but unfortunately it's not really true IPS; it's AHVA (Advanced Hyper-Viewing Angle). I haven't done enough research on the panel, but I know only one exists on the market at this time. If colour reproduction is just as good as an AH-IPS panel's, then amen, but I don't think it will be.

 

You have to have a cap on the displayed frame rate, because there is a physical limit to what your monitor can display. And screen tearing is, in my opinion, more of a problem than the bit of input latency from V-Sync.

I feel the same way


Linus on AMD Freesync:

 

i7-5820k  |  MSI X99S SLI-Plus  |  4x4GB HyperX 2400 DDR4  |  Sapphire Radeon R9 295X2  |  Samsung 840 EVO 1TB x2  |  Corsair AX1200i  |  Corsair H100i  |  NZXT H440 Razer


I heard that the 280 is not supported. Is this right?

CPU:AMD FX-8320, RAM:Kingston Fury Black Series 8GB (1 x 8GB) DDR3-1600, GPU:MSI Radeon R9 280, Case:Cooler Master 690 III, PSU:Thermaltake Smart 650W

Storage:WD Purple 1TB, CPU Cooler:Hyper 212 EVO, Keyboard:SCI Granite, OS:win 8.1 64bit, Display:Samsung SyncMaster SA100, Mouse:CM Storm Spawn


Partially. V-Sync is used as a backup for when the frame rate is above the highest refresh rate, so if you had a 120 Hz G-Sync panel playing Minecraft at 300 FPS, you should get screen tearing (I haven't seen it in person). It appears both sides can only work within a certain range without screen tearing.

G-Sync caps the framerate at 144 FPS, or 143.7 or something. I have a Swift. You never see tearing. So please don't spread misinformation.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


I heard that the 280 is not supported. Is this right?

 

Seems to be. The GPUs that support FreeSync in games right now seem to be the 295X2, 290X, 290, 285, 260X, and 260.
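For reference, the support split people are quoting in this thread can be written as a tiny lookup table. This is a hypothetical helper based only on the lists posted here; AMD's own compatibility list is the authoritative source:

```python
# Support tiers as quoted in this thread (not an official AMD list).
FREESYNC_SUPPORT = {
    "R9 295X2": "games", "R9 290X": "games", "R9 290": "games",
    "R9 285": "games", "R7 260X": "games", "R7 260": "games",
    "R9 280": "video", "R9 280X": "video",  # video playback only
}

def freesync_mode(gpu: str) -> str:
    """Return 'games', 'video', or 'unsupported' for a given Radeon card."""
    return FREESYNC_SUPPORT.get(gpu, "unsupported")

print(freesync_mode("R9 290"))   # games
print(freesync_mode("R9 280"))   # video
print(freesync_mode("HD 7970"))  # unsupported
```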



I heard that the 280 is not supported. Is this right?

Only for video playback.

Only the 260X, 285, and 290/290X support it in games.

One of the biggest reasons I'm upgrading my 7970 this summer.

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


G-Sync caps the framerate at 144 FPS, or 143.7 or something. I have a Swift. You never see tearing. So please don't spread misinformation.

It doesn't introduce tearing because G-Sync enforces V-Sync, so it will just introduce lag.

At a high level the sweet spot for G-Sync is going to be a situation where you have a frame rate that regularly varies between 30 and 60 fps. Game/hardware/settings combinations that result in frame rates below 30 fps will exhibit stuttering since the G-Sync display will be forced to repeat frames, and similarly if your frame rate is equal to your refresh rate (60, 120 or 144 fps in this case) then you won’t really see any advantages over plain old v-sync.

Source: http://www.anandtech.com/show/7582/nvidia-gsync-review/2
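The behaviour AnandTech describes reduces to three regimes, which can be sketched as a toy decision function (illustrative only; the actual `vrr_min`/`vrr_max` values vary per panel):

```python
def display_behaviour(fps, vrr_min=30, vrr_max=144):
    """Toy model of what a variable-refresh panel does at a given frame rate."""
    if fps < vrr_min:
        # The panel can't refresh this slowly: it repeats frames (stutter).
        return "repeat frames"
    if fps > vrr_max:
        # The GPU outruns the panel: fall back to V-Sync (or tear if it's off).
        return "v-sync fallback"
    # Inside the window, the panel refreshes in lockstep with the GPU.
    return "variable refresh"

print(display_behaviour(45))   # variable refresh
print(display_behaviour(20))   # repeat frames
print(display_behaviour(300))  # v-sync fallback
```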


Only for video playback.

Only the 260X, 285, and 290/290X support it in games.

One of the biggest reasons I'm upgrading my 7970 this summer.

 

 

Seems to be. The GPUs that support FreeSync in games right now seem to be the 295X2, 290X, 290, 285, 260X, and 260.

Son of a biscuit. I bought my 280 five months ago, and I really want to try FreeSync.

I guess I'll have to get a new GPU.



Son of a biscuit. I bought my 280 five months ago, and I really want to try FreeSync.

I guess I'll have to get a new GPU.

The reason your R9 280 (or any R9 280/280X) won't work is that it has a DP 1.2 connector. Only the newer designs (such as the R9 285 or the R9 290) have DP 1.2a.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


The reason your R9 280 (or any R9 280/280X) won't work is that it has a DP 1.2 connector. Only the newer designs (such as the R9 285 or the R9 290) have DP 1.2a.

I know; FreeSync requires DisplayPort 1.2a or 1.3.

I asked not because I didn't know, but in the hope that I was wrong.



I think it's hilarious how many people still don't seem to understand how this all works. All that's going on here is that instead of the GPU waiting for the screen to be ready, the screen waits for the GPU. And when the GPU can push out frames faster than the screen can physically display them? Well, then naturally you'll have the GPU waiting again. The same goes for the lower threshold, which has to exist because of the way LCD panels work. That's all there is to it. So unless something is broken in either implementation, I suspect they'll both have about the same end result, with the same sort of limitations, as dictated by the limitations of the panels themselves.
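That "who waits for whom" description can be made concrete with a small simulation. This is a sketch under simplified assumptions (it ignores scanout time and the frame-repeat behaviour below the VRR floor):

```python
import math

def vsync_display_times(ready_times, refresh=1/60):
    """Fixed refresh: a finished frame waits for the next scanout tick
    (the GPU waits for the screen)."""
    return [math.ceil(t / refresh) * refresh for t in ready_times]

def adaptive_display_times(ready_times, fastest=1/144):
    """Adaptive sync: the panel refreshes as soon as a frame is ready,
    but never faster than its quickest refresh (the screen waits for
    the GPU, until the GPU outruns it)."""
    shown, last = [], float("-inf")
    for t in ready_times:
        last = max(t, last + fastest)  # GPU waits again above the cap
        shown.append(last)
    return shown

frames = [0.020, 0.045]  # seconds at which frames finish rendering
fixed = vsync_display_times(frames)        # each snaps to the 16.7 ms grid
adaptive = adaptive_display_times(frames)  # shown the moment they're ready
```

On a fixed 60 Hz display both frames wait for a tick; with adaptive sync they're shown immediately, since they arrive slower than the panel's fastest refresh.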

Fools think they know everything, experts know they know nothing


I know; FreeSync requires DisplayPort 1.2a or 1.3.

I asked not because I didn't know, but in the hope that I was wrong.

We all know that feeling. Fortunately, free hugs of condolence are available upon request from fellow and former bleeding-edge participants here.


I think it's hilarious how many people still don't seem to understand how this all works. All that's going on here is that instead of the GPU waiting for the screen to be ready, the screen waits for the GPU. And when the GPU can push out frames faster than the screen can physically display them? Well, then naturally you'll have the GPU waiting again. The same goes for the lower threshold, which has to exist because of the way LCD panels work. That's all there is to it. So unless something is broken in either implementation, I suspect they'll both have about the same end result, with the same sort of limitations, as dictated by the limitations of the panels themselves.

While I haven't seen confirmation of this one way or the other, one difference might be the "two-way communication" possible with G-Sync between the GPU and the module. This may allow for a lower-input-lag option when capping the framerate (e.g. not using traditional V-Sync above the max panel refresh rate).

 

This is something that will need to be tested once reviewers can get their hands on FreeSync monitors to compare against G-Sync ones.


 


While I haven't seen confirmation of this one way or the other, one difference might be the "two-way communication" possible with G-Sync between the GPU and the module. This may allow for a lower-input-lag option when capping the framerate (e.g. not using traditional V-Sync above the max panel refresh rate).

This is something that will need to be tested once reviewers can get their hands on FreeSync monitors to compare against G-Sync ones.

There's a lot of stuff yet to be checked, like: is FreeSync limited to full screen, as G-Sync is? How about multi-monitor setups? It's something AMD is surely leading and aiming to keep pushing forward, but I haven't read anything about this.

Also, something I remember seeing in forums about G-Sync was a performance loss (maybe it was fixed); that also needs to be checked.


Does anyone know if there could be any issues downsampling 4K content (games and videos) on a 1440p display with FreeSync/G-Sync?

On a mote of dust, suspended in a sunbeam


Does anyone know if there could be any issues downsampling 4K content (games and videos) on a 1440p display with FreeSync/G-Sync?

None that we're aware of. AMD now has VSR downsampling as a feature of their drivers, so I assume they've thought about that.

 

'Course, we won't know these nitty-gritty answers until someone gets their hands on one and starts playing around.


 


There's a lot of stuff yet to be checked, like: is FreeSync limited to full screen, as G-Sync is? How about multi-monitor setups? It's something AMD is surely leading and aiming to keep pushing forward, but I haven't read anything about this. Also, something I remember seeing in forums about G-Sync was a performance loss (maybe it was fixed); that also needs to be checked.

 

I asked Thracks about that on Twitter. His reply was:

 

"Dynamic refresh, any tech, must use full screen. Windows controls refresh in any other mode."

 

https://twitter.com/Thracks/status/553637895534546945

 

I then asked him the follow-up question of whether this is the same in all OS. The answer was a simple "si" (i.e. "yes").

 

So that's a limitation you can't get around, because you can't have part of the screen refreshing at one frame-rate and part of it refreshing at another frame-rate.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


I asked Thracks about that on Twitter. His reply was:

 

"Dynamic refresh, any tech, must use full screen. Windows controls refresh in any other mode."

 

https://twitter.com/Thracks/status/553637895534546945

 

I then asked him the follow-up question of whether this is the same in all OS. The answer was a simple "si" (i.e. "yes").

 

So that's a limitation you can't get around, because you can't have part of the screen refreshing at one frame-rate and part of it refreshing at another frame-rate.

 

 

Seems legit



Anyone heard if/when AMD is going to release desktop APUs that are FreeSync-able? Seems like something that would go over pretty well, though it may eat into some of their much lower-end graphics card sales.

I've seen some speculation that the console APUs may be FreeSync-able, and I recall one or two laptops being used for the whole windmill FreeSync demo, but not a whisper beyond that... actually, I think it may have been the same source...

 

:edit: http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

Hmm... same source indeed. But it's Anand, so...

 

:edit2: derp, a year-old article mistaken for a new one

 

#2014VS2015


Anyone heard if/when AMD is going to release desktop APUs that are FreeSync-able? Seems like something that would go over pretty well, though it may eat into some of their much lower-end graphics card sales.

I've seen some speculation that the console APUs may be FreeSync-able, and I recall one or two laptops being used for the whole windmill FreeSync demo, but not a whisper beyond that... actually, I think it may have been the same source...

 

:edit: http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

Hmm... same source indeed. But it's Anand, so...

 

:edit2: derp, a year-old article mistaken for a new one

 

#2014VS2015

The console GPUs themselves can probably drive FreeSync, but the current-gen consoles will never support it. They don't even have DisplayPort outputs at all, let alone DP 1.2a+. Consoles primarily output via HDMI.

 

It's possible that a future HDMI spec might be Adaptive-Sync compatible, or that future-gen consoles may include DisplayPort as well, but as it currently stands, that's a no-go. Though FreeSync on the consoles would actually help a great deal with their terrible framerates.


 


Anyone heard if/when AMD is going to release desktop APUs that are FreeSync-able? Seems like something that would go over pretty well, though it may eat into some of their much lower-end graphics card sales.

I've seen some speculation that the console APUs may be FreeSync-able, and I recall one or two laptops being used for the whole windmill FreeSync demo, but not a whisper beyond that... actually, I think it may have been the same source...

 

:edit: http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

Hmm... same source indeed. But it's Anand, so...

 

:edit2: derp, a year-old article mistaken for a new one

 

#2014VS2015

 

One of the models on display at CES was being driven by the A10-7850K, so it would seem that at least one of their existing APUs already supports FreeSync.



One of the models on display at CES was being driven by the A10-7850K, so it would seem that at least one of their existing APUs already supports FreeSync.

Yep, the only major requirement is DP 1.2a. With APUs it's easier to support that, since they just update the motherboard spec and drop the APU into the new motherboard.


 

