
AMD caught lying as well, falsely presenting a working variable refresh rate monitor

Faa

Well that was kind of an asshole move.

 

Most lies come out eventually anyway.

 

I had believed AMD when they said you wouldn't need a new monitor.

Had they ever said consumers wouldn't need one though? I was always under the impression that the monitor would need new firmware, which is obviously not something a consumer is going to be able to do at home.

Intel i5-4690K @ 3.8GHz || Gigabyte Z97X-SLI || 8GB G.Skill Ripjaws X 1600MHz || Asus GTX 760 2GB @ 1150 / 6400 || 128GB A-Data SX900 + 1TB Toshiba 7200RPM || Corsair RM650 || Fractal 3500W


At least AMD haven't sold a single FreeSync monitor.

You can't really compare this to the 970 case. AMD was showing an "alpha" project; it's just to say "hey, it works!" You can't buy it yet, and I think they're still working on improving it.

Nvidia released the 970 as a finished product, with wrong specs, sold thousands and "admitted" the mistake after 3+ months. It's not the same thing.

Before you call me an "AMD fanboy": I have a freaking 970.


Gotta quote the man, Gabe Newell, on this one: "One of the things we learned pretty early on is, don't ever, ever try to lie to the internet - because they will catch you. They will de-construct your spin. They will remember everything you ever say for eternity."

Now, I say this, but I'm not on the AMD hate train, nor the Nvidia hate train, or even the Intel hate train, and what have you. So many companies do this that I just take everything with a grain of salt these days.

CPU: i7 7700k @ 4.9GHz Cooling: NZXT Kraken X62 RAM: 16 GB GSkill Trident Z RGB GPU: Sapphire Nitro R9 390 Case:Phanteks Evolv TG


Ok. Please stay on topic. Robert, if you could please send a private message to @Slick to verify yourself and get an Industry Affiliate badge :)

 

PM sent. Thanks!

Robert Hallock

Global Head of Technical Marketing

Advanced Micro Devices, Inc.


Laptops can do variable refresh rate (on eDP) because AMD took advantage of power-saving features that lower the refresh rate.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


The tool didn't support roaming framerates then. It does now, and the footage from CES at TechReport et al. will show you that. Even so, a fixed framerate can still demonstrate stuttering and tearing. There is a common misconception that matching the framerate to the refresh rate will solve all tearing, but that is patently false.

 

Furthermore, please see the comments from Nixeus that confirm my own.
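(A minimal simulation sketch of that claim; this is an illustration with made-up numbers, not AMD's demo tool. Even when the framerate exactly matches the refresh rate, a tear appears every refresh if frames finish partway through the scanout, because rendering is not phase-aligned with the refresh.)

```python
REFRESH_HZ = 60.0
SCANOUT = 1.0 / REFRESH_HZ      # seconds to scan out one full refresh
PHASE = 0.4 * SCANOUT           # assume frames finish 40% into each scanout

for refresh in range(5):
    scan_start = refresh * SCANOUT
    frame_done = scan_start + PHASE     # same 60 fps cadence, offset in phase
    # Without any sync, the buffer flips the moment the frame is ready, so
    # the tear line sits wherever the scanout has reached at that instant.
    tear_pos = (frame_done - scan_start) / SCANOUT
    print(f"refresh {refresh}: tear at {tear_pos:.0%} down the screen")
```

Every refresh tears at the same spot; only synchronizing the flip to the refresh (V-Sync) or the refresh to the flip (variable refresh) removes it.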

First off, I have to say it's very interesting to see you here; it's a pleasure, and the little geek in me is thrilled to be able to type directly to an AMD representative so casually.

 

Second off, then: is FreeSync (and I assume G-Sync as well) only a sort of band-aid for the problem of screen tearing? Is it not something that can easily be gotten rid of? Is this something that can be worked on, or is it just going to be a flaw that only more graphical horsepower can solve?

if you have to insist you think for yourself, i'm not going to believe you.



The primary benefit of FreeSync and G-Sync is to remove stuttering and duplicate frames, since the monitor will only refresh when there is a new frame to show. On a 60 Hz monitor, by contrast, if you have 55 frames per second, you're going to see 5 of those frames for twice as long as the others.

 

Screen tearing occurs when the monitor has the data for two separate frames. For a simple visual representation:

 

|------|------|------|-

111222333444

 

Just because they are refreshing the same number of times per second doesn't mean all the data for a given frame arrives during the refresh window, so, for example, the monitor might refresh with half of the image from frame 1 and half from frame 2 in the example above. At least this is my understanding of it.

 

FreeSync and G-Sync cause the monitor to not start its refresh cycle until it receives the information from the GPU, as I understand it, whereas V-Sync causes the GPU to wait for the monitor, resulting in input lag.

 

Feel free to let me know if I'm an idiot and I've got it all wrong... I'm a bit of a layman when it comes to monitors.
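(A quick sketch of the duplicate-frame arithmetic from the post above, assuming a perfectly steady 55 fps feed and a fixed 60 Hz refresh:)

```python
REFRESH_HZ = 60
FPS = 55

shown = []
for refresh in range(REFRESH_HZ):       # one second's worth of refreshes
    t = refresh / REFRESH_HZ            # time of this refresh
    shown.append(int(t * FPS))          # newest complete frame at time t

duplicates = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(f"{duplicates} of {FPS} frames are displayed for two refresh cycles")
# -> 5, matching the "55 fps on a 60 Hz monitor" example above
```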

Intel i5-4690K @ 3.8GHz || Gigabyte Z97X-SLI || 8GB G.Skill Ripjaws X 1600MHz || Asus GTX 760 2GB @ 1150 / 6400 || 128GB A-Data SX900 + 1TB Toshiba 7200RPM || Corsair RM650 || Fractal 3500W


The primary benefit of FreeSync and G-Sync is to remove stuttering and duplicate frames [...]

In simple terms, MAGIC.

 

Joking aside, I'm fairly certain you have your stuff correct.


The primary benefit of FreeSync and G-Sync is to remove stuttering and duplicate frames [...]

Honestly it sounds like you have it more right than I do lol.

Interesting stuff though. I assumed that in the case of G-Sync/FreeSync, or just regular ol' V-Sync, the GPU could detect the monitor's refresh windows and attempt to output frames within them, but typing it out I can see how that might not be the case.
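(A small timing sketch of the two behaviors being discussed, with made-up frame times: V-Sync defers the flip to the next fixed refresh boundary, while G-Sync/FreeSync lets the monitor wait for the GPU.)

```python
import math

REFRESH = 1 / 60                        # fixed 60 Hz refresh interval (s)
frame_times = [0.014, 0.018, 0.021]     # assumed render times per frame (s)

t = 0.0
for ft in frame_times:
    done = t + ft                                     # frame finishes rendering
    vsync_flip = math.ceil(done / REFRESH) * REFRESH  # V-Sync waits for the boundary
    print(f"ready {done * 1000:5.1f} ms | "
          f"V-Sync shows it at {vsync_flip * 1000:5.1f} ms | "
          f"adaptive sync shows it at {done * 1000:5.1f} ms")
    t = done
```

The third frame misses a boundary by a hair and waits almost a full refresh under V-Sync; that wait is the input lag mentioned above.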

if you have to insist you think for yourself, i'm not going to believe you.


Is there any particular reason you put this in the GPU section, aside from attempting to start a shitfest?

It was in News previously, I believe; a mod moved it to the GPU section.

RIP in pepperonis m8s


They said that you won't need new monitors.

That you can do FreeSync with a firmware update.

If that ends up being wrong, I will be sad.

We don't know that yet; you are right.

 

I think it is supposed to work with any DP 1.2a monitor, but only if the manufacturer releases firmware supporting it.

Curing shitposts by shitposts


I think it is supposed to work with any DP 1.2a monitor, but only if the manufacturer releases firmware supporting it.

 

It works with any monitor that supports DisplayPort Adaptive-Sync, which was defined in the DP 1.2a spec but isn't required by it. A display doesn't necessarily have to have Adaptive-Sync to be DP 1.2a compliant, so not all DP 1.2a monitors will support it.


Good try.

CPU: Intel i7 3970X @ 4.7 GHz  (custom loop)   RAM: Kingston 1866 MHz 32GB DDR3   GPU(s): 2x Gigabyte R9 290OC (custom loop)   Motherboard: Asus P9X79   

Case: Fractal Design R3    Cooling loop:  360 mm + 480 mm + 1080 mm,  tripple 5D Vario pump   Storage: 500 GB + 240 GB + 120 GB SSD,  Seagate 4 TB HDD

PSU: Corsair AX860i   Display(s): Asus PB278Q,  Asus VE247H   Input: QPad 5K,  Logitech G710+    Sound: uDAC3 + Philips Fidelio x2

HWBot: http://hwbot.org/user/tame/


The primary benefit of FreeSync and G-Sync is to remove stuttering and duplicate frames [...]

This is accurate.

 

I think it is supposed to work with any DP 1.2a monitor, but only if the manufacturer releases firmware supporting it.

 

Sort of. What most people probably don't think about is what kind of electronics run their monitor. We all go gaga over great panels, but great scalers--those electronics--are just as vital. There are three vendors in the world for those scalers: Realtek, MStar and Novatek. All three of them support FreeSync in their portfolio.

 

This support was predominantly achieved by developing new firmware that utilized latent capabilities in the silicon of the units already in mass production. That is to say, if a monitor vendor is using one of these scalers in their units or supply chain, now they only need to obtain a different ROM for their monitor and validate the monitor for DRR.

 

Tada: a FreeSync monitor is born, with no licensing or material costs beyond what they were already going to pay to build that monitor. Nor do we charge any sort of royalty fee. This is what puts the "free" in FreeSync. We made it as simple and as economical as possible for monitor guys.

 

There are certainly other scalers in the world that were not so lucky. For whatever reason, the silicon doesn't support DRR, or doesn't support a range that would be "useful" for gaming. Monitors based on these scalers could not be retooled or upgraded, so the manufacturer would need to bring a new SKU into their lineup with a different scaler.

 

Finally, the scaler guys are also working on new SKUs to entice monitor vendors into sourcing them for new designs. These scalers are adopting FreeSync also, because the expertise and the Adaptive-Sync standard are already established and ready to incorporate.

 

Like 4K, audio, daisychaining, and a whole host of other features, Adaptive-Sync is an optional feature of the 1.2a spec. Not every 1.2a monitor can be FreeSync, for the technical reason I described above, or simply because the vendor does not want to adopt the optional feature.
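(Restating the eligibility logic above as a toy model; the field names and the "useful range" threshold are invented for illustration:)

```python
from dataclasses import dataclass

@dataclass
class Monitor:
    dp_12a_compliant: bool
    scaler_supports_drr: bool        # latent DRR capability in the scaler silicon
    drr_range_hz: tuple              # (min, max) refresh the silicon can sustain
    vendor_ships_freesync_rom: bool  # vendor sourced a new ROM and validated DRR

def freesync_eligible(m: Monitor) -> bool:
    lo, hi = m.drr_range_hz
    useful_range = lo <= 40 and hi >= 60    # illustrative "useful for gaming" cut
    return (m.dp_12a_compliant and m.scaler_supports_drr
            and useful_range and m.vendor_ships_freesync_rom)

# Adaptive-Sync is optional: a fully DP 1.2a-compliant display can still fail.
print(freesync_eligible(Monitor(True, False, (60, 60), False)))  # False
print(freesync_eligible(Monitor(True, True, (30, 144), True)))   # True
```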

 

I hope that clears things up.

Robert Hallock

Global Head of Technical Marketing

Advanced Micro Devices, Inc.


Sort of. What most people probably don't think about is what kind of electronics run their monitor. [...] I hope that clears things up.

If monitors that support FreeSync in silicon are already in mass production and all that's needed is a firmware update, then why is it taking so long to get to market? Why don't we have firmware updates already?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  



A good question, and a couple reasons:

 

1) An industry standard just takes time. There are rounds of submission, editing, voting, drafts, etc. You have to finish the spec (Adaptive-Sync) and get it ratified in the body before you can ever truly begin the software development of new firmware. Big multi-party consortiums are by necessity pretty slow, because the standard has to be fair and equitable to all participating parties, and everyone has an opinion. ;)

 

2) Okay, now the spec is done. Now you have to get support from scaler vendors, which means weeks and months of meetings and pitches and proposals and technical demonstrations.

 

3) Okay, now there's scalers. Repeat the same pitch process for monitor vendors using those scalers.

 

4) Okay, now you've convinced the industry's biggest vendors. Software development isn't lightning fast, either. A monitor scaler is a complicated piece of silicon, and it takes time to design an entirely new piece of firmware against an entirely new specification. Then, and most people don't know this, the monitor QA period is incredibly long.

 

That's it. It takes time to build an industry standard, then to get people on board, and then they have to build things.

Robert Hallock

Global Head of Technical Marketing

Advanced Micro Devices, Inc.


If monitors that support FreeSync in silicon are already in mass production and all that's needed is a firmware update [...] why don't we have firmware updates already?

To be honest, I don't think a firmware update will materialize (just being skeptical). If you owned a company that produced/sold monitors, would you rather give customers an update that adds a new feature, or add a new monitor with the new feature to your lineup? I could be wrong, but releasing no firmware update would be better business.


I have a G-Sync monitor, so it has adaptive sync. I'm hoping I can update its firmware somehow in the future so I have the best of both worlds. Not that G-Sync and FreeSync are a big deal to me; tearing isn't something that stops me playing games, or something I ever really notice.

Gaming PC: • AMD Ryzen 7 3900x • 16gb Corsair Vengeance RGB Pro 3200mhz • Founders Edition 2080ti • 2x Crucial 1tb nvme ssd • NZXT H1• Logitech G915TKL • Logitech G Pro • Asus ROG XG32VQ • SteelSeries Arctis Pro Wireless

Laptop: MacBook Pro M1 512gb


 

1) An industry standard just takes time. There are rounds of submission, editing, voting, drafts, etc. You have to finish the spec (Adaptive-Sync) and get it ratified in the body before you can ever truly begin the software development of new firmware. Big multi-party consortiums are by necessity pretty slow, because the standard has to be fair and equitable to all participating parties, and everyone has an opinion.  ;)

You just said you finished the spec by upgrading the firmware lol.

 

4) Okay, now you've convinced the industry's biggest vendors. Software development isn't lightning fast, either. A monitor scaler is a complicated piece of silicon, and it takes time to design an entirely new piece of firmware against an entirely new specification. Then, and most people don't know this, the monitor QA period is incredibly long.

But it took you guys a month to make a firmware, and you claimed most monitors are capable of variable refresh rate. As for that 10 Hz difference (30-60 Hz vs 40-60 Hz): nobody is playing below 40 fps, so any reason it's not released? Two options here. Either you rushed and lied as much as you could (you never provided any proof of working variable refresh rate) to mock Nvidia with a "free" solution that isn't free after all, and are now hiding behind the BS of "it's just the license that's free". Or, second option, many people here should get a firmware update for their current monitor, since you said most monitors are capable of it. Give people the firmware update instead of making them pay for a new monitor you're probably making a profit on. I don't see AMD doing this for free.

Also, eDP and DP 1.2 both have variable VBLANK, which is enough to solve tearing/stuttering given that you match the frame rate to the refresh rate of the monitor, except you won't have variable refresh rate. Your demos back at CES/Computex 2014 were pretty much the usual marketing we see from AMD: trollish.
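(For reference, a back-of-the-envelope sketch of how VBLANK-based variable refresh works, using standard 1080p60 timing numbers: the source holds the panel in vertical blanking until the next frame is ready, which stretches the refresh interval.)

```python
PIXEL_CLOCK = 148_500_000   # Hz, the standard 1080p60 pixel clock
H_TOTAL = 2200              # pixels per line, including horizontal blanking
V_ACTIVE = 1080             # visible lines
V_BLANK = 45                # baseline vertical blanking lines for 1080p60

line_time = H_TOTAL / PIXEL_CLOCK   # seconds per scanline

def refresh_hz(extra_vblank_lines: int) -> float:
    # Extending VBLANK adds idle lines, lengthening the refresh interval.
    v_total = V_ACTIVE + V_BLANK + extra_vblank_lines
    return 1.0 / (v_total * line_time)

print(f"no extension: {refresh_hz(0):.1f} Hz")     # ~60 Hz
print(f"+375 lines:   {refresh_hz(375):.1f} Hz")   # ~45 Hz
print(f"+1125 lines:  {refresh_hz(1125):.1f} Hz")  # ~30 Hz
```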

By the way, I assume FreeSync is all about variable refresh rate, right? Then how did you guys manage variable refresh rate on eDP, which isn't even capable of Adaptive-Sync, given that laptop screens don't even use a scaler? You just agreed that variable refresh rate won't work without one.

"Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. "

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

Also, it's seamless refresh rate switching; according to VESA the range is 40-60 Hz, yet you guys were showing 30 fps, lower than your prototype could do (it couldn't go below 40 Hz). Another question: any reason you guys renamed Adaptive-Sync monitors to FreeSync monitors? It's not AMD's tech after all; it should be Adaptive-Sync, and your GPU/driver support shouldn't be named FreeSync just because the license is free. I don't see Nvidia calling their GPU/driver support for G-Sync "PaidSync".
 

 

I've lurked for a very long time.  ;) But nothing rustles my jimmies like incorrect conspiracy theories.

Can you correct your boss Richard Huddy then?

[embedded video]

At 30:00 he claims GameWorks is crippling AMD's performance. If you think your PR was pure honesty, can you explain this to me?

[image: 2013-09-25_17-35-31.jpg]

"8-core parallel rendering", so lets see what processexplorer has to say on a 8350 with Mantle;

[screenshot: 2mnoz2u.jpg]

Eh, 8-core parallel rendering with 5 threads? Even your latest Mantle game isn't doing it: http://linustechtips.com/main/topic/293392-fx-cpu/page-7#entry4000410

Also, why aren't you guys fixing your massive driver overhead issues? Mantle is only good against your own DirectX:

[chart: i7_sw_1920.png]

If Nvidia can cut as much CPU overhead from DirectX as Mantle does, so can AMD. Can we please get a fix for this? Don't tell me your driver team wasn't aware of this issue; it's still there with the Omega drivers. Also, inform Richard Huddy about this; that should answer why Nvidia wasn't interested in adopting Mantle.

You clearly have your opinions on the matter and are entitled to them. I've explained the facts as they are. What you do with those facts is your discretion. :)

Robert Hallock

Global Head of Technical Marketing

Advanced Micro Devices, Inc.


 

 

You just said you finished the spec by upgrading the firmware lol. [...] Also inform Richard Huddy about this; that should answer why Nvidia wasn't interested in adopting Mantle.

There isn't anything wrong with AMD drivers, you pleb; there hasn't been since 2010/2011... You clearly have a grudge against AMD for reasons unknown, but stop bringing irrelevant facts into your conspiracy.

Regular human bartender...Jackie Daytona.


There isn't anything wrong with AMD drivers, you pleb; there hasn't been since 2010/2011...

There is, http://www.overclock.net/t/1528559/directx-driver-overhead-and-why-mantle-is-a-selling-point-bunch-of-benchmarks/0_100

 

You clearly have a grudge against AMD for reasons unknown, but stop bringing irrelevant facts into your conspiracy.

Conspiracy? I'm not sure why you wouldn't demonstrate your new technology, aka FreeSync, which was all about VARIABLE refresh rate, actually working. It's like BMW coming up with a new tech, e.g. their cars being able to fly, except they don't show it.

 

The name alone was a clear attempt at mocking Nvidia. If they call their SSDs/RAM Radeon, they should call their monitors RadeonSync too. What does it matter if they had it working or not, as long as they could mock Nvidia? That was the whole point of their marketing. Just to be real, I've never seen any brand name a product that costs money "free". Have you?

 

You clearly have your opinions on the matter and are entitled to them. I've explained the facts as they are. What you do with those facts is your discretion.  :)

That you had variable refresh rate working during CES/Computex 2014 is not a fact. Also, can you tell your driver team to get their DirectX CPU overhead on par with Mantle, like Nvidia did, instead of gimmicking Mantle? http://pclab.pl/art57235.html


 


But G-Sync is literally a rip on V-Sync, is it not? And you still have to pay a bollock or two to get a "monitor" with a piece of hardware in it just to use it. Face it: Nvidia monetized an already free built-in technology [V-Sync], whereas AMD has enabled it on DP 1.2a monitors. The worst you have to do is buy a monitor on the DP 1.2a iteration, rather than spend an ungodly amount of money on a £200 piece of hardware JUST to make it run on a specific brand of GPUs. And these are early days with early firmware, so it's not going to be "balls to the walls" amazing right off the bat, but there will be noticeable improvements.

 

And no driver is completely error-free; in fact, in the last year there have been more issues with Nvidia's drivers than with AMD's, so you can't just take a shot at AMD's drivers because there are issues with them.

 

As for the name jab, why do you think they are taking a shot at Nvidia? Maybe they thought it reflects the fact that you don't have to pay £200+ extra on a monitor to get it to work? I would call that free in the strictest sense: all you need is a monitor compatible with DP 1.2a, and even then they aren't asking or forcing you to go and spend money on a monitor; it's more of a choice thing if anything.

Regular human bartender...Jackie Daytona.

