FreeSync Monitors Sampling Next Month, Arriving Early Next Year.

Richard Huddy, AMD's Gaming Scientist, has revealed in an interview with The Tech Report that FreeSync monitors will begin sampling next month and arrive for consumers in early 2015.

According to Huddy, adding FreeSync support to a monitor would add somewhere between $10 and $20 in total cost, roughly one tenth of what adding G-Sync support costs.

...according to Huddy, who told us he'd be surprised if FreeSync compatibility added more than $10-20 to a display's bill of materials. Even taking additional validation costs into consideration, monitor makers should be able to support adaptive refresh rates fairly cheaply.

 

Huddy also went on to say that we'll see a very wide range of refresh rates supported, from 24Hz all the way up to 144Hz.

There are no requirements surrounding the range of refresh rates that monitor makers must support. However, Huddy expects entry-level models to start at 24Hz, which is the most desirable update frequency for typical video. Higher-end implementations could scale up to 144Hz and beyond.
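The "supported range" idea above can be made concrete with a small sketch. This is a hypothetical illustration only: the 24Hz and 144Hz bounds come from the article, but the clamping behavior is a simplification of real panel hardware, not AMD's specification.

```python
def clamp_refresh_interval(frame_time_s, min_hz=24, max_hz=144):
    """Clamp a frame interval into a panel's supported adaptive-sync window.

    The panel can only refresh as fast (max_hz) or as slow (min_hz) as its
    window allows; frame intervals outside it fall back to the nearest
    supported rate (a simplification of real hardware behavior).
    """
    shortest = 1.0 / max_hz  # fastest interval the panel can refresh at
    longest = 1.0 / min_hz   # slowest interval the panel can refresh at
    return min(max(frame_time_s, shortest), longest)

# A 60 fps frame (16.7 ms) fits inside the 24-144 Hz window unchanged:
print(clamp_refresh_interval(1 / 60) == 1 / 60)   # → True
# A 10 fps frame (100 ms) exceeds the 24 Hz limit, so the panel
# refreshes early at its minimum rate:
print(clamp_refresh_interval(1 / 10) == 1 / 24)   # → True
```

An entry-level panel would simply ship with a narrower window (e.g. `min_hz=24, max_hz=60` in this sketch) rather than different logic.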

 

You can find the GPU & APU FreeSync compatibility list here.


BUT FREE SYNC WAS SUPPOSED TO BE FREE!?!?!

 

/s

It is, there is no licensing fee or proprietary AMD hardware that must be added.

In G-Sync's case, the monitor maker HAS TO pay Nvidia a licensing fee AND buy a $150 G-Sync module from Nvidia to add to the monitor.


Wow, this is happening much faster than I would have ever guessed.

 

I guess the licensing fees and memory required for Gsync made up the majority of the cost. This is great news.

 

Very cool, thanks TechFan@ic!


get me at least a 1440p monitor with Adaptive V-Sync and i'd be happy.

Desktop: CM Elite 130 - Corsair CX600M PSU - Asus Maximus VI Impact - Intel Core i7-4790K (@4.4GHz) - Corsair H80i - 2x8GB Crucial Ballistix Sport DDR3-1600 - Asus DirectCUII Radeon R9 290 - 250GB Samsung EVO SSD + 4TB WD Red HDD

Laptop: Asus N56DP-DH11 (AMD A10-4600M - Radeon HD7730M) -------------------------------------------------------- I know, I'm a bit of an AMD fanboy --------------------------------------------------------

"It's not what you drive; it's how you drive it."   ~~Jeremy Clark, TopGear


It is, there is no licensing fee or proprietary AMD hardware that must be added.

In G-Sync's case, the monitor maker HAS TO pay Nvidia a licensing fee AND buy a $150 G-Sync module from Nvidia to add to the monitor.

 

Thank you for missing my sarcasm, old sport. 


Wow, this is happening much faster than I would have ever guessed.

 

I guess the licensing fees and memory required for Gsync made up the majority of the cost. This is great news.

 

Very cool, thanks TechFan@ic!

I honestly can't wait; I've already made the upgrade to a compatible GPU and am just waiting for the monitors to start rolling out.

Also, you're quite welcome! :)

 

get me at least a 1440p monitor with Adaptive V-Sync and i'd be happy.

Agreed, although I'm still waiting for 24" or smaller 1440p monitors. We already have 24" 4K monitors, so why the heck don't they release 24" 1440p panels!?

I feel the sharpness added by the additional pixels would be negated through inflating the size of the panel.


I honestly can't wait; I've already made the upgrade to a compatible GPU and am just waiting for the monitors to start rolling out.

Also, you're quite welcome! :)

 

Agreed, although I'm still waiting for 24" or smaller 1440p monitors. We already have 24" 4K monitors, so why the heck don't they release 24" 1440p panels!?

I feel the sharpness added by the additional pixels would be negated through inflating the size of the panel.

Because the trend is towards 27" now; plus, it's cheaper to design for 27" as you don't have to cram the pixels in as tight.


I will believe it when I see it.

20 dollars might only be for the lowest-end version (which only has a few different refresh rates), and that's before the manufacturers add their usual profit margin.


I will believe it when I see it.

20 dollars might only be for the lowest-end version (which only has a few different refresh rates), and that's before the manufacturers add their usual profit margin.

It's funny, you say the exact same thing with every single AMD announcement.
You said that with Mantle, and it nearly doubled the performance in BF4 multiplayer; you said that with the 290X, and it beat the Titan.
And you said FreeSync was for laptops only, and it was showcased on desktop monitors just a couple of months later.

We know for a fact that FreeSync will NOT add a bare minimum of $150+ to monitor costs, plus additional profit margins, like G-Sync has done.


get me at least a 1440p monitor with Adaptive V-Sync and i'd be happy.

 

Mmm, yess.. With a good IPS panel too.. ^_^

waffle waffle waffle on and on and on


I like the fact that it can go down to 24Hz. Since there is no buffer, it doesn't hold a frame, unlike G-Sync; holding a frame for too long will cause degradation of the frame.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


Well, that is pretty much on schedule with what Huddy said in that interview earlier this year, when everybody said "that's never going to happen".

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Really good stuff from AMD, between this and sharing Mantle with Khronos to help improve OpenGL.

 

With regard to FreeSync, though: although in the long term it looks like AMD will win the battle against G-Sync, Nvidia still deserves credit for pursuing this. FreeSync would never have happened so fast if not for G-Sync.


get me at least a 1440p monitor with Adaptive V-Sync and i'd be happy.

Adaptive V-Sync is Nvidia's proprietary tech; don't confuse it with Adaptive-Sync. They are different.

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


I will believe it when I see it.

20 dollars might only be for the lowest-end version (which only has a few different refresh rates), and that's before the manufacturers add their usual profit margin.

FreeSync doesn't work like that. You can't just pick and choose a few refresh rates to make the monitor cheaper.

Finally my Santa hat doesn't look out of place


FreeSync doesn't work like that. You can't just pick and choose a few refresh rates to make the monitor cheaper.

That's what it sounds like to me:

 

However, Huddy expects entry-level models to start at 24Hz, which is the most desirable update frequency for typical video. Higher-end implementations could scale up to 144Hz and beyond.

This sentence implies that there will be different tiers of implementations, some more limited than others. The way the 7XXX series supports FreeSync also indicates that there will be different tiers: some that just lock at different refresh rates, and some that dynamically change them.


 

 

It's funny, you say the exact same thing with every single AMD announcement.

You said that with Mantle, and it nearly doubled the performance in BF4 multiplayer; you said that with the 290X, and it beat the Titan.

And you said FreeSync was for laptops only, and it was showcased on desktop monitors just a couple of months later.

We know for a fact that FreeSync will NOT add a bare minimum of $150+ to monitor costs, plus additional profit margins, like G-Sync has done.

 

 

 

Please, why must people always resort to picking and posting these graphs? They mean hardly anything, because you can always find a graph that says the opposite. What's wrong with letting people be skeptical of marketing claims? Nine times out of ten, products aren't as grandiose as the companies make out.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Please, why must people always resort to picking and posting these graphs? They mean hardly anything, because you can always find a graph that says the opposite. What's wrong with letting people be skeptical of marketing claims? Nine times out of ten, products aren't as grandiose as the companies make out.

Well, it doesn't "say the opposite".

From the link you posted, the R9 290X beats the Titan in Crysis 3, 35 FPS to 31 FPS, and ties it in BF3 at 68 FPS.

The 290X does beat out the Titan; this was established at launch and is well documented.

More in tandem with the topic at hand, FreeSync creates a really interesting dynamic in which there is absolutely no reason why panel makers wouldn't integrate FreeSync into every single monitor, because it's so affordable and relatively simple.

I'd be quite surprised if we don't see the majority of panels supporting variable refresh rates in a few years' time.


Well, it doesn't "say the opposite".

From the link you posted, the R9 290X beats the Titan in Crysis 3, 35 FPS to 31 FPS, and ties it in BF3 at 68 FPS.

The 290X does beat out the Titan; this was established at launch and is well documented.

More in tandem with the topic at hand, FreeSync creates a really interesting dynamic in which there is absolutely no reason why panel makers wouldn't integrate FreeSync into every single monitor, because it's so affordable and relatively simple.

I'd be quite surprised if we don't see the majority of panels supporting variable refresh rates in a few years' time.

 

That's nice, but my point still stands: every time someone says company A is better than company B and posts a graph, we end up with pages of graphs and no real insight into the topic at hand. It just results in a fanboy war.

There was nothing wrong with what LawLz said. If he wants to wait for monitors to actually hit the market and know the price, instead of just believing what AMD tells us, that's not only his prerogative but quite a mature approach to take, especially considering how volatile the industry can be. He wasn't saying it will be shit, and he wasn't outright calling it a lie and saying it won't happen; he simply said he'll wait for it to happen before he believes it.

And to be quite honest, I'm in the same boat. There has been a lot (and I mean a LOT) of talk from AMD over the last 5 years; so far we have a handful of cheaper GPUs and maybe an API. Until FreeSync actually resolves itself into something tangible, what's there to get excited about? In my experience (more than 30 years), you get what you pay for. I'm happy to be proven wrong, but that hasn't happened yet.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


That's what it sounds like to me:

This sentence implies that there will be different tiers of implementations, some more limited than others. The way the 7XXX series supports FreeSync also indicates that there will be different tiers: some that just lock at different refresh rates, and some that dynamically change them.

I'm pretty sure that just means there will be monitors with maximum refresh rates of 24Hz, 60Hz, 144Hz, etc. If the monitors could only choose between 24Hz, 60Hz, and 120Hz, that would make absolutely no difference to tearing, because the whole way FreeSync works is that the refresh rate of the monitor equals the frame rate. If they are not exactly identical, some tearing can occur.
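The point about the refresh rate having to equal the frame rate can be shown with a toy model. This is an illustration only, with made-up integer tick times; it is not how any real panel or driver is implemented.

```python
def count_tears(frame_times, refresh_period, adaptive):
    """Toy tearing model (all times in arbitrary integer ticks).

    Fixed refresh: the panel scans out every `refresh_period` ticks, so a
    buffer swap that doesn't land exactly on a scanout boundary shows
    parts of two frames on screen (a tear).  Adaptive sync: the panel
    starts its scanout when the frame is ready, so every swap lands on a
    boundary and no refresh mixes two frames.
    """
    tears, t = 0, 0
    for ft in frame_times:
        t += ft  # the moment the new frame becomes ready
        if not adaptive and t % refresh_period != 0:
            tears += 1
    return tears

frames = [120, 120, 110, 150]  # hypothetical frame times, in ticks
print(count_tears(frames, 100, adaptive=False))  # → 3 (swaps mid-scanout)
print(count_tears(frames, 100, adaptive=True))   # → 0 (panel waits for the frame)
```

In this model, having a choice of several fixed periods wouldn't help: any frame that doesn't land exactly on a boundary still tears, which is why the rate has to track the frame rate continuously.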

Finally my Santa hat doesn't look out of place


That's what it sounds like to me:

This sentence implies that there will be different tiers of implementations, some more limited than others. The way the 7XXX series supports FreeSync also indicates that there will be different tiers: some that just lock at different refresh rates, and some that dynamically change them.

By 7XXX series, are you referring to AMD's APUs or the HD 7000 series graphics cards? Only a couple of cards will be compatible with dynamic refresh rates in games:

The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming

All of AMD's APUs (at least the ones from the last few years) will support dynamic refresh rates, not just the 7XXX series... so I'm a bit confused.

HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes

This to me basically means it will run at the refresh rate of video playback (generally 24fps/24Hz) and at low refresh rates while idle, but can't handle the extreme variable refresh rates of gaming, where fps changes drastically from second to second.

 

Most likely we'll just be seeing some "non gaming" monitors with low refresh rates for much cheaper.
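The video-playback case mentioned above is easy to quantify with a toy pulldown calculation (a hypothetical helper, not anything from AMD): on a fixed 60Hz panel, 24fps video frames must be held for an uneven mix of 2 and 3 refreshes (judder), while a panel that drops to 24Hz holds every frame equally.

```python
def hold_counts(video_fps, display_hz, n_frames):
    """How many display refreshes each video frame is held for.

    Uneven counts mean judder; a display whose refresh rate matches the
    video cadence holds every frame for exactly one refresh.
    Integer arithmetic: frame i is on screen from i/fps to (i+1)/fps,
    so its refresh count is the number of scanout boundaries inside.
    """
    return [(i + 1) * display_hz // video_fps - i * display_hz // video_fps
            for i in range(n_frames)]

print(hold_counts(24, 60, 4))  # → [2, 3, 2, 3]: 3:2 pulldown judder
print(hold_counts(24, 24, 4))  # → [1, 1, 1, 1]: cadence matched, no judder
```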

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


It's funny, you say the exact same thing with every single AMD announcement.

You said that with Mantle, and it nearly doubled the performance in BF4 multiplayer; you said that with the 290X, and it beat the Titan.

And you said FreeSync was for laptops only, and it was showcased on desktop monitors just a couple of months later.

We know for a fact that FreeSync will NOT add a bare minimum of $150+ to monitor costs, plus additional profit margins, like G-Sync has done.

First of all, Mantle is useless at the moment. It barely adds any performance on the GPU side, and the amount of CPU bottlenecking Mantle reduced was easily achievable for Nvidia by reducing driver overhead, which AMD is apparently completely ignoring, as you can see from a few of these DX11 tests. In Star Swarm, a demo made mainly to advertise Mantle, there was a good 30-40% difference between the 290X and the 780 Ti, and that test is mainly CPU bound; once Nvidia released 337.50, it's no longer the case that AMD is pulling ahead.

http://cdn.wccftech.com/wp-content/uploads/2014/02/battlefield-4-mantle-benchmarks.gif

http://i.imgur.com/9sdrOU9.png

You aren't getting anywhere by offering people a new API that is apparently only supported in 2 games when your DX11 drivers suffer from a bunch of overhead that might cause the CPU bottlenecking, and we've seen almost no proper performance gains in half a year of Mantle. It's been rumored for a while that Nvidia had been tweaking this for a pretty long time, but nobody could prove it until now; at the moment, it's clear that Nvidia achieved more than AMD did in terms of reducing CPU bottlenecking. Sure, Nvidia can only reduce driver overhead and a new API can do much more, but there's no magic happening on AMD's side with Mantle.

About FreeSync: there's no proof, so it's pretty much just a claim they're making. Using the name FreeSync, and afterwards excuses like "it's the license that's free", when you're targeting millions of people, you're clearly intending to brainwash people with "don't buy G-Sync, we offer it for free". Also, he never said that FreeSync was only for laptops. You don't get dynamic refresh rates with eDP (VESA Adaptive-Sync) like G-Sync offers; Intel offers something similar called DRRS, which they improved for their own products.

"The demo runs at a refresh rate of 47-48 frames per second, and thanks to the adjusted refresh rate the monitor delivers a flawless picture that looks very similar to Nvidia's G-Sync. The demo does not yet allow a varying framerate to which the monitor could react. While the monitor caused no problems during the presentation, AMD's tech demo also crashed several times." (translated from German)

http://www.computerbase.de/2014-06/amd-freesync-monitore-guenstiger-als-g-sync/

AMD hasn't shown the framerates at all with something like RivaTuner's OSD or Fraps; why the hell wouldn't you show the framerates if your "technology" can do it? Find me an excuse and let's make a meme of it. I just can't stand them falsely presenting things they claim to have but don't. Let them first achieve dynamic refresh rates like G-Sync instead of believing it's only $20 more for a technology that doesn't exist yet; you might as well tell people you're selling flying cars for just $300 more than a normal car.

This was their latest presentation: youtube.com/watch?v=cK-aV4ryKdE

