AMD FreeSync Drivers And Monitors Are Available

Opcode

Yeah, a few things: either they don't know what ghosting is, or they abuse the term the same way "lag" gets used in place of FPS.

Only one thread: https://www.google.be/?gws_rd=ssl#q=site:geforce.com+swift+ghosting

That's in 3D mode (G-Sync doesn't work in 3D), so stop making shit up as you always do. You've been given a video comparing two monitors that use the exact same panel, where FreeSync causes ghosting; that's enough to jump off your zeppelin.

Actually Google is flooded with reports of ghosting on the Swift.

Actually Google is flooded with reports of ghosting on the Swift.

Nope, you're just making claims you can't prove.

Confused.... How is ghosting an adaptive sync issue? Isn't it a panel issue? It's not like tearing and such.

The BenQ XL2730Z and the Swift both use the same panel, made by AUO (BenQ is to AUO what Crucial is to Micron), so yes, it does.

NOPE, don't spread that lie; it's not the same panel.

 

@TFTCentral

Confirmation that the BenQ XL2730Z is using an AU Optronics M270DTN01.0 TN Film panel. Different to the Asus ROG Swift PG278Q (M270Q002 V0)

https://twitter.com/TFTCentral/status/575701197010649088

Nope, you're just making claims you can't prove.

Using LMGTFY is against the CoC, so just go to Google and key in "PG278Q ghosting". I don't need any more evidence to counter your claim that ghosting is an Adaptive-Sync problem. So far the only ghosting we have is in AMD's technology demo, which was designed to showcase tearing. We need legitimate evidence from several games to prove that ghosting is truly an issue. If you can show me 20+ games all ghosting, then you'd have something more believable, but a single technology demo isn't going to cut it, as ghosting can also be software-induced. Saying ghosting is a FreeSync problem is no different from saying ghosting is a G-Sync problem (ghosting happens on G-Sync displays too).

Confused.... How is ghosting an adaptive sync issue? Isn't it a panel issue? It's not like tearing and such.

"NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

That's Nvidia's response.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

That's AMD's response, and it makes no sense. Monitor manufacturers know better than AMD. G-Sync doesn't replace the TCON (timing controller) at all, so it's quite sad.
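
For what it's worth, what the Nvidia quote is describing is usually called variable overdrive: the pixel-voltage boost is re-tuned as the refresh rate (and therefore the frame time) changes. Here's a minimal toy sketch of the idea in Python; the gain table and numbers are invented for illustration and aren't anything from Nvidia or a panel vendor:

# Toy illustration of refresh-rate-dependent overdrive (variable overdrive).
# The gain table is made up for illustration only.
OVERDRIVE_GAIN = {144: 0.35, 120: 0.30, 90: 0.22, 60: 0.15, 40: 0.10}

def overdrive_target(current, desired, refresh_hz):
    # Longer frame times (lower Hz) give the pixel more time to settle,
    # so less boost is needed; the drive value overshoots the desired
    # value so the liquid crystal twists faster.
    gain = OVERDRIVE_GAIN.get(refresh_hz, 0.2)       # fall back to a middle value
    boost = gain * (desired - current)               # proportional to the transition size
    return max(0, min(255, round(desired + boost)))  # clamp to the 8-bit range

print(overdrive_target(100, 180, 144))  # 208: bigger boost at 144 Hz
print(overdrive_target(100, 180, 60))   # 192: smaller boost at 60 Hz

The point of contention in this thread is simply who does that re-tuning when the refresh rate varies from frame to frame: the G-Sync module, or the monitor's own scaler/TCON.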

I'd happily join team red if some nice high-quality displays come from this. G-Sync is ludicrously expensive right now.

"NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

That's Nvidia's response.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

That's AMD's response, and it makes no sense. Monitor manufacturers know better than AMD. G-Sync doesn't replace the TCON (timing controller) at all, so it's quite sad.

Do you have a source? And not a bad one (PCPer). Also, that would mean the fault lies with the display manufacturers, not with FreeSync or AMD. I'd also like to hear how a technology designed only to stop tearing gets the blame for something that has been happening (especially with high-refresh-rate displays) for years now. How come the Swift also has bad ghosting? Is G-Sync to blame? It's easy for anyone to fabricate rumors.

Do you have a source? And not a bad one (PCPer). Also, that would mean the fault lies with the display manufacturers, not with FreeSync or AMD. I'd also like to hear how a technology designed only to stop tearing gets the blame for something that has been happening (especially with high-refresh-rate displays) for years now.

You'll only believe something if it comes straight from AMD; yet if there's any negativity about Intel/Nvidia or any positivity about AMD, you'll take even the shittiest sources seriously.

You'll only believe something if it comes straight from AMD; yet if there's any negativity about Intel/Nvidia or any positivity about AMD, you'll take even the shittiest sources seriously.

Nope, I am real, and so far all you've provided the community with are fabricated stories about how biased I am.

 

You got anything to contribute to this thread? Otherwise you're wasting space.

Nope, I am real, and so far all you've provided the community with are fabricated stories about how biased I am.

Thank God you're real.

 

You got anything to contribute to this thread? Otherwise you're wasting space.

Yeah, AMD blaming the scalers they've been co-designing with ASIC companies such as Realtek is stupid.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

Do you have anything better to do than backpedal after being asked to provide a link, offering instead some PR pictures that do less to confirm you're "real" than anyone has managed to do to prove God exists?

I heard G-Sync actually has a bit of input lag on some games. 

You heard? Where, the internet? Sources, please? I really want to know.

 

AMD 5000 Series Ryzen 7 5800X| MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce GTX 3080Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 

My dream:

 

27" 4K IPS 60Hz with Freesync.

 

10/10 would buy

CPU: i7 2600 @ 4.2GHz  COOLING: NZXT Kraken X31 RAM: 4x2GB Corsair XMS3 @ 1600MHz MOBO: Gigabyte Z68-UD3-XP GPU: XFX R9 280X Double Dissipation SSD #1: 120GB OCZ Vertex 2  SSD #2: 240GB Corsair Force 3 HDD #1: 1TB Seagate Barracuda 7200RPM PSU: Silverstone Strider Plus 600W CASE: NZXT H230
CPU: Intel Core 2 Quad Q9550 @ 2.83GHz COOLING: Cooler Master Eclipse RAM: 4x1GB Corsair XMS2 @ 800MHz MOBO: XFX nForce 780i 3-Way SLi GPU: 2x ASUS GTX 560 DirectCU in SLi HDD #1: 1TB Seagate Barracuda 7200RPM PSU: TBA CASE: Antec 300

You heard? Where, the internet? Sources, please? I really want to know.

 

At 140+ FPS you will most likely see some lag with G-Sync, but under 120 FPS the latency is within the margin of error, maybe 2 or 3 ms slower. To put this in perspective, average human reaction time is about 190 ms (for a visual stimulus), and the ability to perceive a 5 FPS difference when gaming at 70+ FPS is nonexistent; not even elite athletes have nervous systems able to work at that speed.

 

http://www.blurbusters.com/gsync/preview2/
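
To put those numbers in perspective with some quick arithmetic of my own (nothing from the article itself): at 120 FPS a single frame already takes about 8.3 ms, so an extra 2-3 ms of sync overhead is around a third of one frame and well under 2% of a ~190 ms visual reaction time.

# Back-of-the-envelope: how big is ~3 ms next to a frame time and a reaction time?
fps = 120
frame_time_ms = 1000 / fps   # ~8.33 ms per frame at 120 FPS
sync_overhead_ms = 3         # worst-case figure quoted above
reaction_time_ms = 190       # average human visual reaction time

print(f"frame time at {fps} FPS: {frame_time_ms:.2f} ms")
print(f"overhead vs one frame:   {sync_overhead_ms / frame_time_ms:.0%}")    # ~36%
print(f"overhead vs reaction:    {sync_overhead_ms / reaction_time_ms:.1%}")  # ~1.6%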

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

Thank God you're real.

 

Yeah, AMD blaming the scalers they've been co-designing with ASIC companies such as Realtek is stupid.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

Do you have anything better to do than backpedal after being asked to provide a link, offering instead some PR pictures that do less to confirm you're "real" than anyone has managed to do to prove God exists?

 

OK, I've got something new here about this "ghosting", from the "AMD FreeSync owners thread" at OcUK:

 

 

shankly1985: That ghosting on the BenQ is AMA being switched on. I noticed it in Battlefield 4, where moving lamp posts would show a glow effect. Switch off AMA and it was gone.

 

MrMD: I've just got a 1080p BenQ monitor (1 ms), but with AMA turned on I definitely experience ghosting. Turn it off and it's fine.

 

MrMD: It's a puzzler, as it's meant to reduce ghosting, but I experience none with it turned off and loads with it turned on.

I always figured it was maybe a bug in the firmware that got on/off back to front, and only affected my model/batch or whatever.

It's a GL2460.

shankly1985: It's the same on both my BenQ monitors, the FreeSync one and my older one. Although on this FreeSync one, with AMA on High the ghosting is very slight; you really need to look for it.

On the stock setting it's very bad, and I expect this is why it's showing up in the PCPer review, since it's on by default (the Premium setting).

drunkenmaster: It's a terrible review, and it's not surprising to see you posting it. It's a borderline retarded review, blaming ghosting on different settings available on different monitors. It's also difficult to say, without the screen in front of you, whether it's ghosting or overshoot; many screens have excessively aggressive settings that cause awful overshoot, and under-aggressive settings that don't push the panel hard enough and do show ghosting.

As others have pointed out, they turned off a setting and, boom, no ghosting. They are effectively blaming FreeSync for ghosting that comes from a panel-based setting. The thing is, any reviewer who has read even one monitor review would know about overdrive settings and what was responsible. He instead posited that the G-Sync module was supplying more voltage to the pixels than on FreeSync screens. The guy is a moron, and he's claiming G-Sync is better because he's ignorant; that isn't a good review.

http://forums.overclockers.co.uk/showthread.php?p=27799365#post27799365

 

Source: OCN

 

So, what's the final word? Well, I guess we'll just have to wait for a review from someone who knows monitor reviews better, like TFT Central or Blur Busters.
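
If it helps anyone picture why one AMA setting ghosts and another overshoots, here's a rough, purely illustrative sketch of the trade-off described above; the numbers are made up and aren't measurements of any BenQ panel:

# Toy model of a pixel chasing a new target value over one frame.
# Too little drive = ghosting (falls short); too much = overshoot (bright glow).
def pixel_after_frame(start, drive, response=0.6):
    # Very rough first-order response: the pixel closes a fixed fraction
    # of the gap to the driven value within one frame.
    return start + response * (drive - start)

start, target = 50, 200
print(pixel_after_frame(start, target))         # 140.0: no overdrive, falls short of 200 (ghost trail)
print(pixel_after_frame(start, target + 120))   # 212.0: heavy overdrive, flies past 200 (overshoot halo)
print(pixel_after_frame(start, target + 50))    # 170.0: moderate overdrive, closer without overshooting

That's the same dial the AMA setting is turning, which is why it matters so much which overdrive mode a review was run at.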

I am still unclear on how, if the framerate is within spec for the panel and its respective hardware limitations, ghosting could be occurring. It would seem that there is either something else at play in the rest of the hardware implementation, or the panel is being driven beyond its capabilities. Are there any sources that describe why ghosting occurs on FreeSync-enabled setups and how it can be directly caused by the variable refresh rate implementation? The PCPer review linked in the thread doesn't appear to know either.

 

"NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

That's Nvidia's response.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

That's AMD's response, and it makes no sense. Monitor manufacturers know better than AMD. G-Sync doesn't replace the TCON (timing controller) at all, so it's quite sad.

I get that people might say it causes it, but does anyone have any idea how FreeSync could actually cause the ghosting? It still sounds like a design or implementation problem on the display-hardware side of things. I did read the supplied PCPer piece, but even they don't know; they only have what Nvidia has claimed.

Has anybody reliably tested the input lag caused by FreeSync? I know that having G-Sync on adds 2-4 ms of input lag.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

Wow, lol... just wow... from that sentence you concluded that FreeSync introduces ghosting? I mean...

 

It's like Tesla saying they want to help car manufacturers who use Tesla tech pick the right parts to lessen CO2 emissions... and you concluding that cars using Tesla tech will therefore increase CO2 emissions, simply because Tesla said it wants to tackle the problem and drive a fast evolution in the area.

That logic is just... WOWing material.

Keep fighting the good fight, oh brave green knight. Keep telling us the tales of nvidia fanboyism... Amuse us with the songs of Jen. 

You still have to buy an AMD graphics card, though.

I'm going to keep a monitor for a damn sight longer than I keep a graphics card.

Considering I personally favour both sides of the GPU field, I could not fathom purchasing a variable refresh rate monitor that doesn't work with both Nvidia and AMD GPUs.

Therefore, unless FreeSync (which isn't free, really) becomes an open standard, I couldn't care less, tbh.

This is what I think of Pre-Ordering video games: https://www.youtube.com/watch?v=wp98SH3vW2Y

You still have to buy an AMD graphics card, though.

I'm going to keep a monitor for a damn sight longer than I keep a graphics card.

Considering I personally favour both sides of the GPU field, I could not fathom purchasing a variable refresh rate monitor that doesn't work with both Nvidia and AMD GPUs.

Therefore, unless FreeSync (which isn't free, really) becomes an open standard, I couldn't care less, tbh.

You do realise that FreeSync makes use of an open standard called Adaptive-Sync? So if you want to be sure of something, get a monitor and/or GPU that supports the DisplayPort 1.2a spec. Currently you will only get that on AMD GPUs (yep, GPUs released in 2013), since Nvidia GPUs only support the 2011 standard.

And freesync is free of licensing fees. 

You still have to buy an AMD graphics card, though.

I'm going to keep a monitor for a damn sight longer than I keep a graphics card.

Considering I personally favour both sides of the GPU field, I could not fathom purchasing a variable refresh rate monitor that doesn't work with both Nvidia and AMD GPUs.

Therefore, unless FreeSync (which isn't free, really) becomes an open standard, I couldn't care less, tbh.

 

G-Sync is the closed standard; nobody is preventing Nvidia from using the open Adaptive-Sync standard to make their own version of FreeSync. But they won't, because they like proprietary stuff. See also: PhysX.

BRILLIANT! Now I just need $500...

 

Please, LMG really needs to check some of these out!

Build: Sister's new build |CPU i5 2500k|MOBO MSI h61m-p23 b3|PSU Rosewill 850w  |RAM 4GB 1333|GPU Radeon HD 6950 2GB OCedition|HDD 500GB 7200|HDD 500GB 7200|CASE Rosewill R5|Status online


Build: Digital Vengeance|CPU i7 4790k 4.8GHz 1.33V|MOBO MSI z97-Gaming 7|PSU Seasonic Xseries 850w|RAM 16GB G.skill sniper 2133|GPU Dual R9 290s|SSD 256GB Neutron|SSD 240GB|HDD 2TB 7200|CASE Fractal Design Define R5|Status online

LG's 29-inch FreeSync monitor: what GPU would I need to run it? It's 75 Hz, 2560x1080.

LG's 29-inch FreeSync monitor: what GPU would I need to run it? It's 75 Hz, 2560x1080.

In terms of GPU horsepower, at least the R9 290.

 

[Image: amd-freesync-slide-9-645x363.jpg]
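
As a rough sanity check of my own (simple arithmetic, not anything from the slide): 2560x1080 at up to 75 Hz pushes roughly 1.7x the pixels per second of ordinary 1080p60, so a card that's comfortable at 1080p with some headroom should manage, and the R9 290 suggested above has plenty of margin for the resolution itself.

# Rough pixel-throughput comparison (plain arithmetic, nothing vendor-specific).
def mpix_per_sec(width, height, hz):
    return width * height * hz / 1e6

lg_ultrawide = mpix_per_sec(2560, 1080, 75)   # ~207 Mpix/s
plain_1080p  = mpix_per_sec(1920, 1080, 60)   # ~124 Mpix/s

print(f"2560x1080 @ 75 Hz: {lg_ultrawide:.0f} Mpix/s")
print(f"1920x1080 @ 60 Hz: {plain_1080p:.0f} Mpix/s")
print(f"ratio: {lg_ultrawide / plain_1080p:.2f}x")  # about 1.67x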

Thank God you're real.

 

Yeah, AMD blaming the scalers they've been co-designing with ASIC companies such as Realtek is stupid.

"For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."

Do you have anything better to do than backpedal after being asked to provide a link, offering instead some PR pictures that do less to confirm you're "real" than anyone has managed to do to prove God exists?

 

Anyway, I found an old PCPer video that highlights more of their own Nvidia bias:

 

https://www.youtube.com/watch?v=FGl1Udpz1gY#t=16m30s

 

 

Hear that talk about the Asus IPS FreeSync-capable display? Before FreeSync was even released, Allen came out of the gate and said G-Sync was going to perform better than FreeSync because of some magical fairy pixie dust the G-Sync module provides. This was BEFORE any actual tests. He favors Nvidia and presumes that most things they do or make have a technical superiority.

 

In his actual comments about FreeSync he was speculating up the wazoo about areas where G-Sync supposedly provides a better experience than FreeSync, like ghosting, confirming his own innate bias FOR Nvidia and against AMD. Now we hear reports that the ghosting is a byproduct of a monitor setting and not something sourced from the FreeSync implementation versus G-Sync. He might be right that having frames doubled or tripled once they dip below the monitor's minimum refresh rate is a better trade-off, but based on what I know about his inclinations and biases, I DO NOT TRUST his analysis and need to see other confirmations, because that guy is doing everything he can to find ways in which Nvidia is superior for its own sake. FreeSync performs IDENTICALLY to G-Sync within the VRR window, so he goes out and finds the edge cases and beyond, where there might be a noticeable difference, to justify his theology that Nvidia is the superior technical entity.
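
On the frame doubling/tripling point, here's a minimal sketch of the general idea as low-framerate compensation schemes are usually described, not AMD's or Nvidia's actual implementation: when the game's framerate would fall below the panel's minimum VRR rate, the same frame is repeated two or three times so the refresh interval sent to the panel stays inside its supported window.

# Minimal sketch of low-framerate compensation (illustrative; panel limits are made up).
def refresh_plan(fps, panel_min_hz=40, panel_max_hz=144):
    # Repeat each frame 2x, 3x, ... until the effective refresh rate
    # lands back inside the panel's VRR window.
    multiplier = 1
    while fps * multiplier < panel_min_hz:
        multiplier += 1
    return multiplier, min(fps * multiplier, panel_max_hz)

for fps in (90, 45, 30, 15):
    m, hz = refresh_plan(fps)
    print(f"{fps} FPS -> each frame shown {m}x, panel refreshed at {hz} Hz")

Whether that repetition really is a better trade-off than what the panel does on its own below the window is exactly the part I'd want independently tested.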

I am impelled not to squeak like a grateful and frightened mouse, but to roar...
