
AMD FreeSync Drivers And Monitors Are Available

Opcode

As of now it does not work with Eyefinity or CrossFire setups.

 

Those with multiple cards and monitors might want to hold their horses for a while.

 

And good news for those in green shirts as well: FreeSync monitors being substantially cheaper might drive G-Sync monitor prices down.

 

Three FreeSync monitors are available on OCUK, all under £500.

I attended an online conference and they said it works with Eyefinity if all the monitors are the same.


I have a feeling Nvidia will need to adopt this to stay competitive in any imaginable way.

 

 

Maybe not; there are throngs of Nvidia fanboys who would gladly pay the premium to stick with team green.

 

 

The PCPer guys are not going to switch to AMD cards in their main rigs; the same probably goes for Linus/Slick.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


TTL Customs (OC3D) have a bit of info on it now... with results/conclusions.

Source: http://www.overclock3d.net/reviews/gpu_displays/amd_freesync_review/1

[Image: OC3D FreeSync benchmark results]

I don't know if just seeing averages is enough to give a yay or nay to this. I won't make any judgments until I see a graph of all the spikes and dips from an in-game benchmark run.

PC: CPU - FX 8350 @4.5 GHz | GPU - 3x R9 290 @1100 core/1300 memory | Motherboard - Asus Crosshair V Formula Z | RAM - 16 GB Mushkin Redline 1866 MHz | PSU - Corsair AX860 | SSD - ADATA SX900 256 GB | HDD - Seagate 3TB 7200RPM | CPU Cooler - Noctua NH-D14 | Case - Cooler Master HAF Stacker 935

Peripherals: Monitor - ASUS VN248H-P IPS | Keyboard - Corsair K70 | Mouse - Corsair M65 | Headphones - ASUS ROG Orion Pro


I was under the impression that FreeSync monitors exist (and will exist in greater numbers) with multiple inputs, audio, OSDs and so forth...

Never take PR material at its word.

TTL Customs (OC3D) have a bit of info on it now... with results/conclusions.

Source: http://www.overclock3d.net/reviews/gpu_displays/amd_freesync_review/1

"Love that overhead!"

"What overhead?"

"Exactly!"

Does G-Sync have a performance penalty? How? I thought it was all done by the scaler chip on the monitor and didn't put any load on the GPU. Can someone educate me on this? Thank you //@Victorious Secret

Makes sense, but as a consumer I'm confused.

Nvidia says G-Sync has no performance hit, it's all in the scalers. I can assume the PR team is selling us a lie and there actually is a performance hit.

AMD says FreeSync has no performance hit and G-Sync does. Now I have to assume AMD's PR is overselling by claiming their competition has a performance hit, so it's probably a lie.

The question is: which PR team should I not trust?

I have heard that in testing there is a 3%-5% performance hit, but those numbers seem to be based on the G-Sync "DIY kit" from a while ago. I don't know if anyone has rerun the test on the current crop of G-Sync monitors to see what the hit is.

PCPer claims that G-Sync has a performance hit of up to 2.38% when tested with a GTX 780; for reference, they claim that FreeSync has a performance hit of up to 0.46%.

[Image: performance comparison chart]
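For a sense of scale, here is a quick back-of-the-envelope calculation of what those percentages cost per frame (the 100 FPS baseline is an assumed example, not PCPer's data):

```python
# What a small sync overhead means per frame.
# The 100 FPS baseline is an assumed example, not a measured figure.
BASELINE_FPS = 100.0

for name, hit_pct in [("G-Sync (2.38% hit)", 2.38), ("FreeSync (0.46% hit)", 0.46)]:
    fps = BASELINE_FPS * (1 - hit_pct / 100)
    added_ms = 1000 / fps - 1000 / BASELINE_FPS  # extra time per frame
    print(f"{name}: {fps:.1f} FPS, +{added_ms:.3f} ms per frame")
```

Either way the per-frame cost is a fraction of a millisecond, which is why run-to-run noise matters so much in these comparisons.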
Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552


 

All AMD Radeon graphics cards in the AMD Radeon HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.
 
AMD APUs codenamed "Kaveri," "Kabini," "Temash," "Beema" and "Mullins" also feature the necessary hardware capabilities to enable dynamic refresh rates for video playback, gaming and power-saving purposes.

 

 

Only the cards I bolded can do what G-Sync does; the rest can only do it for video playback, which is kinda pointless if you ask me. So the fact is that a fair chunk of current AMD users and all Nvidia users will have to buy a new GPU if they want freedom of sync, as opposed to the already larger chunk of Nvidia users on Kepler who only need a new monitor.
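To keep the support tiers straight, here is a small lookup sketch encoding the AMD statement quoted above (the part names come straight from the quote; the function itself is just illustrative):

```python
# Support tiers per the AMD statement quoted above (illustrative sketch).
FULL_VRR_GPUS = {"R9 295X2", "R9 290X", "R9 290", "R7 260X", "R7 260"}
FULL_VRR_APUS = {"Kaveri", "Kabini", "Temash", "Beema", "Mullins"}

def freesync_support(part: str) -> str:
    """Classify a Radeon part by its FreeSync capability."""
    if part in FULL_VRR_GPUS or part in FULL_VRR_APUS:
        return "dynamic refresh for gaming, video and power saving"
    return "video playback and power saving only"

print(freesync_support("R9 290"))   # full gaming VRR
print(freesync_support("R9 280X"))  # playback/power saving only
```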

 

 

 

Also, I don't see the point in posting FPS graphs; G-Sync will have similar FPS results. It's latency that we need to know. Why no latency graphs?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Are there any 1080p monitors? 16:9?

The year is 20XX. Everyone plays Fox at TAS levels of perfection. Because of this, the winner of a match depends solely on port priority. The RPS metagame has evolved to ridiculous levels due to it being the only remaining factor to decide matches.

Only Abate, Axe, and Wobbles can save us.


Maybe not; there are throngs of Nvidia users who don't want to pay again for another GPU and monitor when they can just get the monitor for the same end result.

 

 

The PCPer guys are not going to switch to AMD cards in their main rigs; the same probably goes for Linus/Slick.

 

Fixed that for you; there is a difference between being a fanboy and being stupid.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Are there any 1080p monitors? 16:9?

According to AMD's Meet the Experts conference yesterday, there will be 1080p monitors; ViewSonic and Nixeus in particular have announced 1080p FreeSync monitors.
Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552


Does G-Sync have a performance penalty? How? I thought it was all done by the scaler chip on the monitor and didn't put any load on the GPU. Can someone educate me on this? Thank you //@Victorious Secret

[Image: AMD FreeSync slide]

 

9-240 Hz? All I'm seeing is 30-144 Hz.

Proprietary module? Um, it's an Altera Arria V GX, which anyone can buy.

[Image: G-Sync module board with Altera Arria V GX FPGA]

A performance penalty below 5% is within the margin of error; run the test 100 times and the results will always vary, especially in games that don't even have a built-in benchmark run.
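As a toy illustration of that point, simulate a benchmark with ordinary run-to-run noise and watch a ~2% delta hide inside it (the 3 FPS standard deviation is an assumed noise level, not measured data):

```python
# Toy demo: a ~2% performance delta vs. ordinary run-to-run noise.
# The 3 FPS standard deviation is an assumption for the demo.
import random

random.seed(0)

def bench(mean_fps, noise_sd=3.0, runs=100):
    samples = [random.gauss(mean_fps, noise_sd) for _ in range(runs)]
    mean = sum(samples) / runs
    sd = (sum((s - mean) ** 2 for s in samples) / (runs - 1)) ** 0.5
    return mean, sd

for label, fps in [("sync off", 100.0), ("sync on, 2% hit", 98.0)]:
    mean, sd = bench(fps)
    print(f"{label}: {mean:.1f} +/- {sd:.1f} FPS")
```

The spread of individual runs (about +/- 3 FPS here) dwarfs the 2 FPS gap between the means, so only many averaged runs can separate the two cases.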

 

 

G-Sync suffers from bidirectional communication (latency) as the module and the GPU pipe information back and forth for each frame. With FreeSync the GPU just sends out whatever frames are available, and the screen itself waits until it receives the next frame before refreshing (one-way communication). So far we've seen that G-Sync increases input lag because of this (we have benchmarks on the forum proving so).

Explain the ghosting, then:

[Image: ghosting comparison on a FreeSync monitor]

Also, your explanation is completely wrong. When the monitor is done drawing the current frame it will wait for the GPU to deliver the next frame (if none arrives, it will draw the last frame again). The delay is controlled with the VBLANK interval, which is the same technique FreeSync applies, so both will introduce some lag. The GPU just polls the monitor to see whether it is in the VBLANK state so you don't end up with bad scans, which FreeSync does as well. It's a one-way system.


Also, your explanation is completely wrong. When the monitor is done drawing the current frame it will wait for the GPU to deliver the next frame (if none arrives, it will draw the last frame again). The delay is controlled with the VBLANK interval, which is the same technique FreeSync applies, so both will introduce some lag. The GPU just polls the monitor to see whether it is in the VBLANK state so you don't end up with bad scans, which FreeSync does as well. It's a one-way system.

FreeSync works off what Adaptive-Sync does display-side. It's no different from how frames are delivered to all displays today: frames are piped out and the display handles them. With Adaptive-Sync the display can adapt its refresh rate dynamically, meaning the refresh interval is updated for every single frame. Without a frame, the display simply ceases to refresh until it receives the next one (variable vblank). You're thinking of VSync, which redraws the last frame to fill the gap until the next frame.

 

[Image: AMD FreeSync slide]
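To make the two refresh models in this argument concrete, here is a toy timing simulation contrasting fixed-interval VSync (a finished frame waits for the next tick, and a miss repeats the old frame) with a variable-vblank display that holds its blanking period until the new frame arrives. All render times are invented for illustration, and a 144 Hz panel ceiling is assumed:

```python
import math

# Toy timing model: fixed-refresh VSync vs. variable vblank (Adaptive-Sync).
# Per-frame render times (ms) are invented for illustration.
RENDER_MS = [10.0, 22.0, 13.0, 30.0, 11.0]

def vsync_times(render_ms, refresh_ms=16.7):
    """Fixed refresh: a finished frame waits for the next tick; on a missed
    tick the display simply repeats the previous frame."""
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(math.ceil(t / refresh_ms) * refresh_ms)
    return shown

def vrr_times(render_ms, min_interval_ms=1000 / 144):
    """Variable vblank: the display holds its blanking interval until the new
    frame arrives, limited only by the panel's maximum refresh rate."""
    t, last, shown = 0.0, 0.0, []
    for r in render_ms:
        t += r
        last = max(t, last + min_interval_ms)  # can't refresh faster than 144 Hz
        shown.append(last)
    return shown

print("VSync, frames on screen at:", [f"{x:.1f} ms" for x in vsync_times(RENDER_MS)])
print("VRR,   frames on screen at:", [f"{x:.1f} ms" for x in vrr_times(RENDER_MS)])
```

Every frame that misses a 16.7 ms tick under VSync slips a whole interval, while the variable-vblank display shows it as soon as it is ready.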


FreeSync works off what Adaptive-Sync does display-side. It's no different from how frames are delivered to all displays today: frames are piped out and the display handles them.

That won't work at all; you have to manipulate the VBLANK interval to get VRR. If you just mindlessly dump all of your frames to the monitor, you create more input lag instead, as the display has more frames to process. Even your understanding of how Adaptive-Sync works is flawed.

 

 

With Adaptive-Sync the display can adapt its refresh rate dynamically, meaning the refresh interval is updated for every single frame. Without a frame, the display simply ceases to refresh until it receives the next one (variable vblank). You're thinking of VSync, which redraws the last frame to fill the gap until the next frame.

Dynamically? It's variable. The GPU is not mindlessly sending frames to the monitor; that's not the purpose of a sync.


Fixed that for you; there is a difference between being a fanboy and being stupid.

 

 

Obviously we are talking about people who do not already have a variable refresh rate monitor and are looking at new GPUs. The vast majority of the market does not have VRR monitors, but those who did buy G-Sync displays have to keep buying green to make use of the price premium they paid. That was part of the point: the lock-in, the Apple model. Lap it up.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


In theory I can force my old CRT-style LCD screen to use FreeSync. Before my 650 Ti decided it wouldn't allow it, I had the refresh rate cycling from 15 Hz to 85 Hz at 1024x768 (the physical resolution of the screen, which can display any resolution up to the limit of VGA) using my custom driver.
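That range is plausible if you work out the timings: vertical refresh is just the pixel clock divided by the total (active plus blanking) pixels per frame, which is why the refresh rates land in the tens of Hz while the clock doing the work runs at tens of MHz. A rough check, using assumed CVT-style blanking totals for 1024x768:

```python
# Rough check of a 15-85 Hz range at 1024x768 over VGA.
# Blanking totals are assumed, typical CVT-style values.
H_TOTAL = 1344  # 1024 active + assumed horizontal blanking
V_TOTAL = 806   # 768 active + assumed vertical blanking

def refresh_hz(pixel_clock_mhz):
    """Vertical refresh = pixel clock / pixels per full frame."""
    return pixel_clock_mhz * 1e6 / (H_TOTAL * V_TOTAL)

for clk in (16.5, 65.0, 92.0):  # slow clock, the standard 60 Hz clock, fast clock
    print(f"{clk:5.1f} MHz pixel clock -> {refresh_hz(clk):4.1f} Hz")
```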

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


That won't work at all; you have to manipulate the VBLANK interval to get VRR. If you just mindlessly dump all of your frames to the monitor, you create more input lag instead, as the display has more frames to process. Even your understanding of how Adaptive-Sync works is flawed.

That's exactly what I said in my previous post. The previous frame is not displayed again with Adaptive-Sync, as the screen will not refresh until it's queued to do so by the GPU.

 

Dynamically? It's variable. The GPU is not mindlessly sending frames to the monitor; that's not the purpose of a sync.

They mean one and the same in this context.


Obviously we are talking about people who do not already have a variable refresh rate monitor and are looking at new GPUs. The vast majority of the market does not have VRR monitors, but those who did buy G-Sync displays have to keep buying green to make use of the price premium they paid. That was part of the point: the lock-in, the Apple model. Lap it up.

And conversely, anyone who buys a freedom-of-sync monitor will be tied to AMD GPUs. Buying one brand over another doesn't make one a fanboy. Nvidia currently has the lion's share of the market, not because 60% of GPU buyers are fanboys but because at the time of purchase Nvidia made a better sales pitch/product/price point.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


The ghosting problem PCPer found on FreeSync monitors needs to be looked at. This is a huge problem if it really exists.


And conversely, anyone who buys a freedom-of-sync monitor will be tied to AMD GPUs. Buying one brand over another doesn't make one a fanboy. Nvidia currently has the lion's share of the market, not because 60% of GPU buyers are fanboys but because at the time of purchase Nvidia made a better sales pitch/product/price point.

 

 

Nothing is stopping Intel or Nvidia from enabling Adaptive-Sync support; if they don't, it's because they CHOSE not to. Nvidia stops the others from utilizing G-Sync, because the entire point is to lock people into their GPUs with the feature. This is not to be contested. As for market share, I don't know how many people with discrete GPUs are so untethered to a particular brand that they don't mind which company they go for, but I do know that Nvidia has a larger base of fanboys than AMD does. Why else do you think the 960 does so well?

 

 http://www.amazon.com/gp/bestsellers/electronics/284822/ref=pd_zg_hrsr_e_1_4_last

 

What drives that affinity? Objective analysis of performance levels? Really? No, it's the same thing that makes people think Apple computers and phones are the bee's knees: their peers have them, so they want them too. Large chunks of the user base (not all) are filled with followers who want something similar to their peers, or go off what they think is best, reality be damned. There is a belief that THEIR hardware and software is "special" and that the other stuff is just peasant stock. You can easily spot these people when they start talking about "drivers," the same code word Apple people use when they claim OS X is inherently superior to Windows. That way they can shield themselves from the fact that the Dell XPS 13 or Spectre x360 are far better machines than a 13" MacBook Air: even if the other hardware is better, they always have something to fall back on to maintain some quasi-plausible sense of superiority. They can't just admit that part of their decision has nothing to do with objective analysis and more to do with emotional affinity.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Makes sense, but as a consumer I'm confused.

Nvidia says G-Sync has no performance hit, it's all in the scalers. I can assume the PR team is selling us a lie and there actually is a performance hit.

AMD says FreeSync has no performance hit and G-Sync does. Now I have to assume AMD's PR is overselling by claiming their competition has a performance hit, so it's probably a lie.

The question is: which PR team should I not trust?

I heard G-Sync actually has a bit of input lag in some games.


The ghosting problem PCPer found on FreeSync monitors needs to be looked at. This is a huge problem if it really exists.

FreeSync doesn't add ghosting. Just like G-Sync doesn't eliminate ghosting. That depends solely on the panel and the manufacturer's implementation.


From the PCPer article: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-

 

 

NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

 

From that quote, G-Sync does fix ghosting problems, because every G-Sync display requires the per-panel-tuned G-Sync module.
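A crude sketch of what that per-refresh-rate tuning amounts to: pick an overdrive strength from the current frame interval. Every number in this table is made up; real tuning is panel-specific, as the quote says.

```python
# Illustrative sketch of variable overdrive: choose pixel-drive strength from
# the current (variable) frame interval. All table values are made up; real
# tuning is panel-specific, per the NVIDIA quote above.
OVERDRIVE_LUT = [
    # (max frame interval in ms, overdrive gain)
    (7.5, 1.35),   # ~144 Hz: pixels must settle fast, drive hard
    (12.0, 1.20),  # ~90 Hz
    (18.0, 1.10),  # ~60 Hz
    (34.0, 1.00),  # ~30 Hz: back off to avoid overshoot (inverse ghosting)
]

def overdrive_gain(frame_interval_ms: float) -> float:
    for max_interval, gain in OVERDRIVE_LUT:
        if frame_interval_ms <= max_interval:
            return gain
    return 1.0  # below the VRR floor: no extra drive

print(overdrive_gain(6.9), overdrive_gain(16.7), overdrive_gain(40.0))
```

A panel tuned with a single fixed gain for 144 Hz will drive too hard or too soft as the interval swings around, producing either overshoot or ghosting, which is exactly the behavior being argued about here.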

 

 

 

For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (TCON, scalers, etc.) and by driving a "fast evolution" in this area.

 

PCPer has shown that two FreeSync monitors from different manufacturers have ghosting problems; that is a fact. AMD has even acknowledged that there is a ghosting problem on the FreeSync monitors.

 

Edit: hardware.fr is claiming in their review that the Acer XG270HU also has ghosting. That is now three FreeSync monitors with reported ghosting.

Link to the article (in French): http://www.hardware.fr/focus/108/freesync-disponible-premiers-ecrans-decoivent.html


FreeSync doesn't add ghosting. Just like G-Sync doesn't eliminate ghosting. That depends solely on the panel and the manufacturer's implementation.

The BenQ XL2730Z and the Swift both use the same panel, made by AUO (BenQ is to AUO what Crucial is to Micron), so yes it does.


The BenQ XL2730Z and the Swift both use the same panel, made by AUO (BenQ is to AUO what Crucial is to Micron), so yes it does.

Ghosting has existed for years, long before either of these technologies. People are reporting terrible ghosting on the Swift as well.


but I do know that Nvidia has a larger base of fanboys than AMD does. Why else do you think the 960 does so well?

 

Well done: analyzing market share and concluding fanboys. Get with reality. If someone has a Kepler or later GPU, it is cheaper for them to buy a G-Sync monitor than to buy an AMD GPU and a FreeSync monitor. That is not being a fanboy; that is being rational.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Ghosting has existed for years, long before either of these technologies. People are reporting terrible ghosting on the Swift on Nvidia's forums as well.

Yeah, a few things: either they don't know what ghosting is, or they abuse the term the way "lag" gets used for FPS.

Only one thread: https://www.google.be/?gws_rd=ssl#q=site:geforce.com+swift+ghosting

That's in 3D mode (G-Sync doesn't work in 3D), so stop making shit up as you always do. You've got a video of two monitors using the exact same panel with FreeSync causing ghosting; that's enough to jump off your zeppelin.

