
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

That's not an opinion; that's ignorance, because you have nothing to base it on.

Linus hasn't even posted his review of the FreeSync monitor, and you already assume it's better because of the brand?

 

LOL, you need to calm down and wait for SOMEONE WHO ACTUALLY OWNS BOTH MONITORS to review it, aka Linus.

BTW, profanity is unnecessary if you actually have something other than bad language to back up your claims.

 

 

 

Hey Ender, you threw the first punch by calling him closed-minded. ;)

Gonna cuff 'em and stuff 'em. QUE QUE QUE. I love it, I love it. :P

 

i7 4790K, Asus Z97 Sabertooth S, Crucial M.2 120GB, 32GB Corsair Dominators, Corsair H100i, Seagate ST750XL, 2x MSI R9 290X Lightnings, Corsair Air 540


It really does amaze me sometimes how easily some people suck up marketing BS and consume it like it's the only information in the world.

 

People, you are being lied to from all sides. The only thing that regurgitating manufacturers' marketing BS proves is that you are gullible.

 

Take in the reviews, make up your own mind, and live with your choices. You neither need to nor can convince others to accept marketing hype; we will each look at the information as it's presented and make up our own minds. Share your opinions, but don't let yourself be fooled by marketing BS.

 

Also, if you can't make a post without a degrading comment, you are a fanboy who has missed the point.

Dude, I love this thread

 

There are two people talking about two different things: basically, one asks about A, the other answers about B, then says A is wrong because of D, as C proves, none of which is even remotely close to the topic, and so the argument just goes on and on, evolving (slowly) page by page

 

and it just goes on and on

 

And there are some factions who feel obliged to attack or defend a company out of brand loyalty, going as far as using insults to make their points stronger.

 

 

Comedy gold. :)


I don't understand what you mean by 'my reason'. Regardless, I made it the title because that was the title of the article, so personally I don't think it had to be changed. Was it kind of sensationalist? Sure, but that's what grabs a potential reader's attention.

I believe you are talking about the 4K panel, and the only reason it only goes up to 60Hz is that no panel or interface is yet capable of displaying a 4K signal above 60Hz. But mind you, that is still inside their range of 30-144Hz, so I don't see how that is a problem. NVIDIA's range is 30-144Hz, whereas AMD's is being stated as 9-240Hz, and their existing panels are nowhere near that range.

Well, HDMI is offered on products along with DisplayPort, so it's not like VESA is trying to make one product seem better than the other. Whereas with G-Sync or FreeSync you can only use one, not both. Meanwhile, everyone has a video card with both HDMI and DisplayPort, so nobody can force a user onto one connector through advertising, since both can be offered on the same device.
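A rough sanity check of that interface limit (a back-of-the-envelope sketch; the 1.2x blanking overhead is an assumption and exact figures depend on timings, but DP 1.2's ~17.28 Gbit/s effective payload is the commonly cited number):

# Rough uncompressed bandwidth check for 4K over DisplayPort 1.2.
# Effective DP 1.2 payload: 4 lanes * 5.4 Gbit/s * 0.8 (8b/10b encoding).
DP12_EFFECTIVE_GBPS = 4 * 5.4 * 0.8  # ~17.28 Gbit/s

def required_gbps(width, height, hz, bits_per_pixel=24, blanking_overhead=1.2):
    # blanking_overhead roughly accounts for horizontal/vertical blanking
    return width * height * hz * bits_per_pixel * blanking_overhead / 1e9

print(required_gbps(3840, 2160, 60))   # ~14.3 Gbit/s -> fits within DP 1.2
print(required_gbps(3840, 2160, 144))  # ~34.4 Gbit/s -> far beyond DP 1.2

So 4K at 60Hz squeezes through DP 1.2, while anything much faster has to wait for a new DisplayPort revision.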

 

No, it's not wrong, but it still means AMD is promoting one of their products, which people here seem to think is not the case.

 

-snip'd the rest-

I was really just trying to understand how you came by your opinion, and I don't understand why you're so convinced that AMD did something wrong.

 

To be frank, most of what you just said was contradictory: the title of the article says loudly and exactly what you claim AMD is implying with their comparisons. If you can say the 4K 60Hz panel is "inside of their range" of 30-144Hz, then wouldn't 48-75Hz and 40-144Hz be inside the range of 9-240Hz? And if the full range being unsupported at 4K is permitted as an exception, why is 240Hz not? (Bear in mind 240Hz-refreshing panels exist, even if their implementation is cut in half; they exist.) As for input methods: implementing both HDMI and DP comes at a cost (manufacturing costs); that they are cheap enough to implement together doesn't really counter my assertion. Shouldn't all your criticisms of AMD so far apply to every company? Why does Nvidia get away with it, in your opinion?


Until adaptive sync becomes a standard for all monitors, both are just money sinks (har har). Either get G-Sync and deal with flickering, or get FreeSync and deal with keeping frame rates inside the variable refresh envelope. AMD can implement the doubling/tripling/quadrupling of low frame rates with driver updates, and Nvidia can enable V-Sync off for G-Sync as well. It's too early in the game to be spending that much money on the first releases.
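For what it's worth, the doubling/tripling/quadrupling idea is simple in principle. A minimal sketch (the 40-144Hz window and the multiply-up rule are illustrative assumptions, not AMD's actual driver logic):

# Sketch: keep the panel inside its refresh window by repeating frames
# when the game's frame rate drops below the window's minimum.
def effective_refresh(fps, window_min=40, window_max=144):
    if fps >= window_min:
        return min(fps, window_max), 1  # in the window: one refresh per frame
    multiplier = 2
    while fps * multiplier < window_min:
        multiplier += 1  # double, triple, quadruple... as needed
    return fps * multiplier, multiplier

print(effective_refresh(29))  # -> (58, 2): each frame shown twice
print(effective_refresh(12))  # -> (48, 4): each frame shown four times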

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 



 

AMD's panels only support 40-144Hz (and 48-75Hz). So when you state you can do up to 240Hz, that is way outside of 144Hz (96Hz outside, to be exact), so no, it doesn't do up to 240Hz. When you state you can go down to 9Hz but can only go down to 40Hz, that's a 31Hz difference. How are they doing 9-240Hz if they can only do 40-144Hz? It's not like NVIDIA's claim outruns its monitors: every single G-Sync monitor is able to do 30-144Hz, except one that does 30-60Hz, and that's a 4K monitor limited by the DP spec. 4K above 60Hz won't be possible until a DP revision, there aren't any 144Hz 4K monitors planned yet, and 4K 144Hz gaming isn't really possible right now without 8-way Titan X's (/s). I don't see what you don't get. No AMD monitor does 9-240Hz; they do 40-144Hz or 48-75Hz.

 

240Hz monitors don't actually exist; the Eizo Foris FG2421 "240Hz" is really just a 120Hz panel with a strobing backlight that they are calling 240Hz. Its 240Hz mode is just frame doubling or black frame insertion, which merely lessens blur.

 

How doesn't it counter your assertion? You can have both DP and HDMI on a graphics card. You can have both DP and HDMI on a monitor. If NVIDIA and AMD are competing with each other, how are you going to get G-Sync and FreeSync in the same monitor? How are you going to get G-Sync on an AMD card?

 

NVIDIA didn't get away with anything; the 970 outrage was evidence enough of that. Even after multiple sources benchmarked again and again, showing no adverse side effects from the 500MB section, people still completely destroyed them. So nobody gets away with anything. I just don't think AMD should get away with trashing NVIDIA in their marketing when their actual products are out and we know what they are capable of.


No, it won't. FreeSync still has visible downsides and worse ghosting problems than G-Sync in most cases. There were always going to be limits to what software can do: even if you can emulate every feature, there will be a performance hit as resources are used to do it. G-Sync modules store the algorithms at the hardware level, and the algorithms can be changed because the module is an FPGA. G-Sync can keep evolving on the same hardware until the module is completely full and can fit no more, and then software can be added on top, giving a minimal performance hit versus AMD's ever-growing software-based solution to the problem.

Most of these issues will be fixed with time. And I disagree on the performance part, because I believe the difference will be so tiny nobody can notice it, just like playing at 100fps vs 120fps.



So why is DP allowed to be the limiting factor while panel technology is not, since, as you have said, there are no real 240Hz monitors? Are you saying AMD graphics cards/drivers cannot go to 240Hz? Cannot work with the full range of Adaptive Sync?

 

Someone can have an Nvidia card and an AMD card in their computer as well; does that mean they don't compete? G-Sync and Adaptive Sync can work on the same monitor; look at the BenQ monitor mentioned earlier in this thread. A monitor can have multiple scalers connected to the panel; it would just add more cost to make, obviously. I have no idea how an AMD card would get G-Sync support, though, lol.

 

This isn't a 970 thread; don't go too far off topic. You seem to be more upset about how AMD is doing its marketing than about the limitations of the technology. I'm sorry you feel that way, but I just can't understand this issue you seem to be having.



 

Man, am I glad I got to see this post before your Great Wall of China text.

 

AMD's panels? You obviously have no clue what you are on about, so do yourself a favour and read this post I made on page 5: http://linustechtips.com/main/topic/333694-nvidia-pitches-advantages-of-g-sync-over-amds-freesync/page-5#entry4529520

 

Have you read it yet? Good! Now it should be easy to conclude that AMD FreeSync is a driver for the graphics card in the computer, one that utilizes the Adaptive Sync hardware in the monitor. This means that:

  • FreeSync is not implemented in any monitor. Adaptive Sync by VESA is.
  • AMD has no influence on the scalers, panels, or anything else used in monitors, in any way whatsoever.
  • AMD has no influence on panel technology or its limitations.
  • AMD does not make its own panels.

What AMD does have is a license-free program that ensures Adaptive Sync monitors have a wide enough Hz interval to be useful for gaming (which means they can use the FreeSync logo), plus support for the full 9-240Hz interval that is in the VESA Adaptive Sync standard. That means that if a monitor with a native 180Hz Adaptive Sync range were launched tomorrow, the FreeSync driver on the graphics card would support it fully, just as the Adaptive Sync industry standard would.
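Put another way, the usable window is just the overlap between what the driver/spec supports and what the monitor reports. A sketch (the EDID-style range reporting is my simplification, not AMD's actual code):

# Sketch: usable VRR window = intersection of the driver's supported
# range and the range the monitor reports (e.g., via EDID).
SPEC_RANGE = (9, 240)  # the VESA Adaptive Sync interval the driver supports

def usable_window(monitor_range, driver_range=SPEC_RANGE):
    low = max(driver_range[0], monitor_range[0])
    high = min(driver_range[1], monitor_range[1])
    return (low, high) if low <= high else None

print(usable_window((40, 144)))  # -> (40, 144): today's panels
print(usable_window((30, 180)))  # -> (30, 180): a future panel, no driver change needed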

 

I hope this helps with your confusion, because you are indeed confusing brands, standards, drivers, hardware, etc. into one giant clusterfrack.

 

The 30-144Hz range is an LCD panel limitation (although LCD should be able to go above 144Hz); look to OLED if you want to go below 30Hz. You understand that standards such as Adaptive Sync need to be very future-proof, right? The spec might go unchanged for a decade.

 

If you are not sure what I mean, then what's the point of continuing? I made it very clear what I mean and repeated myself multiple times. So now you say I don't know what I'm talking about; how convenient of you to pull a straw-man argument out of thin air when I made my point loud and clear and multiple people understood it. The only person not understanding it is clearly you. So maybe you should go back and read a little more closely, because I shouldn't have to repeat my point 1000 times for the one person who is confused to understand it.

 

Of course I don't understand what you are saying when you are mashing everything together. I think the ultimate issue is that you compare standards, and support for standards, with actual products, instead of comparing product to product.

 

So now it's irrelevant what monitors support? Why is it irrelevant? Because AMD gets caught stretching the truth, now it's irrelevant? They say 9-240Hz but their panels only do 40-144Hz, and that's OK in your opinion? To basically lie to make yourself look better than the competition? FreeSync can't utilize what doesn't exist and won't exist for many years, so let's nip that in the bud.

 

It's irrelevant because we are talking about standards, and support for standards, not actual products. No monitor has FreeSync implemented. If an OLED monitor with a 9-240Hz interval were released tomorrow with an Adaptive Sync-capable scaler covering that interval, both Adaptive Sync as a standard and FreeSync as a driver would support it out of the gate. We have no idea about G-Sync, or whether they would have to reprogram it, change it, or do additional R&D.

 

Let's go through your bullets:

 

 

• What it costs to implement G-Sync in a monitor is not a licensing fee. That's what it was saying, but you didn't seem to get that.

• So now you are going to lie and claim that Adaptive-Sync isn't for gaming? Really, that's your bullet point: Adaptive-Sync isn't just for gaming but is also for "power savings"? LMAO. Next.

• You seem to be pushing the blame onto VESA, when VESA isn't competing with G-Sync; FreeSync is. But let's pretend AMD gets off scot-free here for promoting the VESA standard instead of what monitors are actually capable of.

• I posted numerous sources, multiple times in this thread, with information proving G-Sync can do 1-240Hz: from Tom, Malventano, and NVIDIA. If you choose to ignore that information, that's your problem. Remember: "Tom's comments are useless, Malventano is a fanboy, and NVIDIA didn't state specific numbers," so that automatically means it's not true or accurate.

• Well, first, let's read the article you took that chart from. Firstly, they are talking about input lag, not performance loss (which would mean a lower framerate with G-Sync enabled). Secondly, if you continue to read, the author concludes with the following: "The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON." Before that, they were running the test with fps_max at 300, which is why the results in the chart above happened. So no, you didn't even read the article, and no, it doesn't force V-Sync on; the author said it felt like V-Sync was turned on when going over the cap. Reading further, he concludes: "As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older source engine games, provided the games were configured correctly. G-SYNC gives the game player a license to use higher graphics settings in the game, while keeping the gameplay smooth"

 

  • We don't know whether the G-Sync hardware module comes with a fee or not; maybe monitor vendors just pay a standard price premium for the module. The point is that AMD does not get a single cent for any Adaptive Sync-capable monitor on the market, not even for the ones that have gotten the FreeSync logo and verification. Nvidia gets a lot of money for each G-Sync monitor. That is one of the reasons G-Sync is more expensive and will not have the same availability as Adaptive Sync.
  • See, THAT is a straw man; I never claimed such a thing. Variable VBlank, used by both G-Sync and Adaptive Sync, was used in eDP as a power-saving feature, and Adaptive Sync can still use it as such. For instance, the 280X can only use Adaptive Sync for power saving and video playback. This is relevant because it could make Adaptive Sync interesting for non-gaming players, like Intel and their integrated graphics solutions.
  • I'm not blaming anyone, because I don't see anything wrong. But you are criticizing AMD for fully supporting the VESA Adaptive Sync standard and marketing that. If you have a problem with the Hz interval in the spec, you have to criticize the source that made it: VESA.
  • I will get back to this under your sources further down.
  • I have read the article. The point is that G-Sync forces V-Sync when reaching the monitor's upper limit, and that creates latency. On FreeSync, you can choose either to do the same and enable V-Sync, or just let the fps run free, which will create tearing; the latter is preferable for pro players and is not that noticeable at 144+Hz. But that is NOT AMD's point: G-Sync uses an overcomplicated two-way handshake between each frame, which has a performance hit of 1-1.5%: https://youtu.be/EFQGuYx2q7k?t=102 (1:42 in)


 

Now let's go through your quotes:

 

Forbes: Let’s talk about the minimum response times that both G-Sync and Adaptive Sync support.

Tom Petersen: “First of all, the spec ‘Adaptive Sync’ has no minimum. Both have the ability to communicate any range, so there’s nothing about the base specs that are different. What’s interesting though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh, you kinda paint the screen and it slowly fades. That fade is just related to the panel. The reason there’s an Adaptive Sync spec and G-Sync module is because that lower limit is variable depending on the technology inside the panel. But games don’t know about that! So what do you do when a game has a lower FPS than the minimum rate you want to run your panel? Because when they run below that minimum rate things start to flicker, and that’s a horrible experience.”

 

Tom's comments on the decay of the LCD panel are why no monitor goes below 30Hz at the moment. Now, if it is indeed true that G-Sync supports wider Hz intervals, so be it; I did write several times that it might very well be possible. What perplexes me is that Nvidia never mentioned this before. The FPGA in the G-Sync module should be able to handle it; the question is whether the module needs a firmware update to support it. If so, the support is not native. Nvidia should have announced this before, about G-Sync as a standard instead of only as an implementation.

 

But then again, what does "communicate any range" even mean? Can G-Sync do a synced framerate of 5Hz if the panel supports it? That is the question. Either way, you can give Nvidia the benefit of the doubt. Based on all the PCPer interviews with Tom, he does seem to have a knack for spinning comments to sound like something they're not quite.

Tom Petersen: “I can’t go into too much detail because it’s still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that’s required by the panel. PC Perspective wrote an article guessing how we did that, and they’re not that far off…”

 

The seamless transition is about how G-Sync works when dropping below the panel's 30Hz minimum, which is essentially V-Sync ON (or redrawing the previous image from monitor RAM, which is pretty much the same). It has nothing to do with support below 30Hz.

 

Malventano is biased. He claimed that FreeSync would never be able to do what G-Sync does; he was wrong. If he has inside information from Nvidia, he might be even more biased. Either way, he is not an official source, and I do not see him as credible. You might; good on you, but that does not make his statements facts.

 

I do find it interesting that G-Sync can supposedly go down to 1 fps (Hz), or so he claims. That would be useful for power saving (if G-Sync supports that?). For gaming, going below 30Hz is pretty useless; for video playback, 24Hz could be useful (or 23.976Hz).

 

Now, you can criticize AMD all you want, but all Adaptive Sync monitors do show their max Hz setting. Finding the Hz interval should be easy in reviews, but honestly I think it should be mandatory to show.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


@Notional, it's hardly established academia when the academics don't agree in any sizeable majority. It's not arrogance; it's being an academic and knowing how the system works. The arrogant one here is you. I did plenty of searching around the past day; the theory has far too many counterexamples to be considered an authoritative truth.

You are allowed use of it, but you cannot release developed software with it, because it's Nvidia's property and you have no development agreement with them. Basic property rights would be a good thing to learn. Also, AMD can come up with their own version of Gameworks... duh... If they can sufficiently prove their libraries don't disadvantage Nvidia GPUs, then that directly proves Nvidia's libraries disadvantage AMD unfairly. Until such a time, AMD fanchildren can continue to bitch while it's still mostly the fault of their own favorite company's failings.

Also there's 0 proof that Gameworks is responsible for canceling AMD AA, especially since there's only one example. It's just AMD fans looking for anything to slander Nvidia with.

The graphics card market is demand driven in every sense of the word. Chip makers would not bother to innovate if demand for more performance didn't exist. 100% a demand-based market.

Please try to get a sense of integrity.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


 


 

We know the Adaptive Sync spec supports a range up to 240Hz; that's about it. We don't know whether it's actually achievable, because there are no 240Hz panels planned as of now.

 

Yes, they can have an NVIDIA and an AMD card in their computer, but it's not the easiest thing to do. And with my comparison, both DP and HDMI are offered on one card; you are not getting an NVIDIA-AMD card, it's one or the other. Putting both inside your computer is not the same as HDMI and DP being offered on both monitors and GPUs.

 

I know anyone can support Adaptive-Sync, but what I'm saying is: do you really think NVIDIA is going to support Adaptive-Sync alongside their G-Sync? Most likely not, because right now NVIDIA is competing with AMD's implementation of it, FreeSync.

 

It's more or less a combination of both: the marketing and the limitations of the technology.

 

 


 

You obviously are still pretending that AMD has nothing to do with the panels that are labeled FreeSync. So do yourself a favor, since you have no idea what you are talking about, and look at this:

 

[image: monitor box art carrying the AMD FreeSync logo]

 

 

• Yes, but it comes with the FreeSync name; we've been over this, look at the above picture.

• So you are actually saying that AMD has not a single say in anything that gets released? Please. That's a lie.

• No, they don't have control over the available technology, but they do have control over their marketing, which you still don't seem to get.

• Nobody said they did, but they are still putting the FreeSync name, and their own name, on the panels:

 

[image: monitor spec listing that names AMD FreeSync]

 

Yes, we understand that, except there are no 180Hz panels, and none planned, so it's pointless to use an irrelevant specification like 9-240Hz in your advertising; you still don't seem to get the point. Everyone understands that 9-240Hz is an interface specification for Adaptive-Sync. What you are not understanding is that AMD is using that base spec in their advertising, not the realistic specification they actually have, which is 40-144Hz. But you still don't seem to get it, or you get it but are blind to the fact that it is wrong.

 

I hope this helps with your confusion, because you seem very lost to me. You keep responding with information that is hardly even related to anything I said.

 

If 30-144Hz is a panel limitation, as you say, then why is AMD saying 9-240Hz and comparing it to G-Sync's 30-144Hz? If 30-144Hz is a panel limitation, then AMD can also only do 30-144Hz at best, not 9-240Hz. Oh wait, I forgot: it's not AMD's fault for advertising as much while only delivering 40-144Hz, because apparently they have no say in their advertising.

 

I'm not confusing anything; you are just confusing yourself because you aren't paying attention, which is why most of your responses go off the deep end.

 

We aren't just talking about standards, which is why I am saying you are confusing yourself and not paying attention. We are also talking about advertising here: what your advertising claims versus what your actual product delivers. Yes, we understand that if a 9-240Hz monitor is ever released, Adaptive-Sync will support it, because that is the base spec for Adaptive-Sync. But for AMD to promote that base spec as their own is not correct, because it is just an interface spec, not a real spec. So, like I said for the millionth time, you cannot compare yourself to the competition like in this chart:

 

[image: AMD marketing chart comparing FreeSync's 9-240Hz range with G-Sync's 30-144Hz]

 

When you can only do 40-144Hz at most yourself. But you keep ignoring that fact and skipping over it, regurgitating the same known information instead and ignoring my base point.

 

• They still get free advertising for their company on all the monitors, and there's also the fact that you need an AMD card to use FreeSync or Adaptive-Sync at the moment, so they are getting money in that sense too. Even if the monitors are cheaper, you still need to buy an AMD card to utilize FreeSync/Adaptive-Sync; therefore someone has to be locked into an AMD ecosystem regardless, if they want to use a FreeSync/Adaptive-Sync monitor. Also, nobody really knows what AMD gets; they can come out and claim they don't get anything, but you really don't know. Does it seem likely that a company does anything for free? Be honest about that and don't be a fanboy.

• Not really. First you made it seem like extra input methods play no part in input lag; everyone knows they do. Why do you think all those Korean monitors only come with one input? But then you pivoted to it being a power-saving feature. Just because one of a product's benefits is power savings doesn't necessarily mean that was the product's intention, and just because it's a side feature doesn't mean anyone is going to buy a $630 monitor along with a capable graphics card just to save some power.

• It's not VESA's fault, though; they are just creating an interface specification. But AMD is the one using it in their advertising to counter NVIDIA's G-Sync. That's the issue here; why can you not grasp that?

• It says nowhere in the article that G-Sync forces V-Sync on. If you actually read the Forbes article, Tom says this is a feature of FreeSync/Adaptive-Sync that he would like to bring to G-Sync, so it is not doing that now. In the chart you posted, in the article it comes from, he says it "feels like" V-Sync is enabled, not that V-Sync actually gets enabled. And do you know what 1-1.5% is? That's nothing. That's the difference between 55 fps and about 54.2 fps, a 0.8 fps drop (using the maximum of 1.5%); at 1% it's the difference between 55 fps and about 54.5 fps, a 0.55 fps drop (see the quick check below). AMD's marketing makes it seem like a huge loss in framerate; calling it a penalty or performance loss is a moot point. If it were 10-20% I could understand, but it's not even close to that.
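A quick check of that arithmetic:

# Effective fps after a given percentage overhead.
def effective_fps(fps, overhead_pct):
    return fps * (1 - overhead_pct / 100)

print(effective_fps(55, 1.5))  # -> 54.175 (~0.8 fps drop)
print(effective_fps(55, 1.0))  # -> 54.45 (~0.55 fps drop)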

 

I don't see how he is distorting anything; he is just trying to clear the air after AMD claimed 9-240Hz in their marketing. Tom is just trying to say: hey, look, 9-240Hz is a base specification, not an actual current specification, and both Adaptive-Sync and G-Sync are capable of going that low or that high.

 

It's a seamless transition above and below the required refresh rate of the panel. The required refresh rates of these panels are 30-144Hz; if you can go below that, you can go below 30Hz, and if you can go above that, you can go above 144Hz. What don't you understand about that? And as I stated earlier, NVIDIA doesn't have a V-Sync on/off option for G-Sync; you can even see Tom saying they want to implement that feature eventually:

 

Tom Petersen: “There's also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you're above the zone they have a feature which I like (you can either leave V-Sync On or V-Sync Off), that we're going to look at adding because some gamers may prefer that.”

 

 

Well, if you don't want to see someone who works at a respected tech review site as credible, that's your prerogative, but most people would treat information from people working inside the industry on a daily basis as reliable. If you want to ignore that information, like I said, that's your personal feeling towards the information and who it is coming from. It's not like it's coming from "Joe Blow" on the Guru3D forums; it's from someone who works at PCPerspective, not just a random nobody off some internet forum.

 

Going below 30Hz is actually very important; the low side is where you need it, especially at high resolutions. It's definitely a great feature for gaming too, since achieving framerates over 30 fps becomes more difficult at higher resolutions with a single card.

 

I think showing the actual intervals should be mandatory in marketing too; none of this "we can do this but our competitors can only do that" nonsense, which is the point I'm trying to make. They might have a base spec of 9-240Hz, but it's nothing more than an interface specification, not what is actually possible. All I'm trying to say is that in their advertising they should have put 40-144Hz, because that's their maximum range at the moment.


Oh well, at least this thread has been keeping me entertained while on the afternoon shift.

 

It really is a simple one, this; however, a lot of you don't, cannot, or simply will not understand the issue because it opposes the kit you have already bought. AMD fully supports the Adaptive Sync standard, and the Adaptive Sync standard supports 9-240Hz, regardless of what hardware is currently available. The important word here is 'supports'. So AMD are basically saying to monitor manufacturers that FreeSync (AMD's take on VRR technology, which supports the Adaptive Sync standard) will support the 9-240Hz range, and that the monitor manufacturers are free to build whatever range they see fit into their FreeSync-supported monitors.

 

That's it....nothing else to say.

 

AMD decided to put this in their advertising because it is a truthful fact: FreeSync supports refresh rates between 9 and 240Hz. That there isn't a monitor on the shelves supporting that kind of refresh range does not matter.

 

Yes, the bit that is hurting Nvidia and its fanboys is that AMD used the supported refresh range in their advertising, which may suggest or give the impression that FreeSync is better than G-Sync. It all comes down to numbers... my dad is bigger than your dad, etc... at least on a first reading anyway: bigger numbers, better, more powerful product. But we as intelligent techies know that even that sometimes isn't the case, and a lot of other factors may or may not come into it. However, that is why AMD used it... it's a bit crafty, yes, but they aren't lying, because FreeSync does support it.

 

Anyway, I am avidly keeping my eyes open on the current technology, as I am about to upgrade my monitor, and at this point in time I have an R9 280X (ironically, the damn thing is the only AMD card that does not support FreeSync; just my luck, eh!). Yes, I am a bit pissed about that, as it means I have a few choices to make. Do I go FreeSync or G-Sync? I will have to get another card regardless of which I choose if I want to use any of this VRR technology. But I will not be swayed by half of the ranting on this thread, and I will go with the monitor and GFX card that suit me best at this time... I am not loyal to AMD or NVIDIA; I will go with what suits my needs and budget at the time I choose to buy. I have supported both companies over the years... they both make good products at varying price ranges, and thank god they both do, as this is what keeps technology going from strength to strength. I certainly wouldn't want either of them to go bust, as competition (in whatever form they both choose to use) keeps the tech going, and we would all be sorry if those continuous improvements slowed or stopped.

 

What would make this more interesting is if another GFX company started to knock out some serious competition for both of them. MATROX, are you listening?

 

Have fun, people, and as Sgt. Esterhaus used to say... "Be careful out there."

:-)

Kind Regards Always

 

Mayo



 

Oh, come on now, are you really discrediting VESA? You have no idea what monitors or panels they have at their disposal, OLED included. You know Nvidia is a VESA member too? Of course VESA would not make a standard that doesn't work.

 

We know Nvidia doesn't want to support Adaptive Sync, but that is their own choice. They are free to do so if they want to, or are forced to, down the road; you know Nvidia could just make a G-Sync-branded driver that utilizes Adaptive Sync, right?

 


 

Do you even know what a panel is? Are you confusing 'panel' with 'monitor'? A panel cannot be labelled FreeSync; FreeSync is graphics driver tech. A panel does not need to be special in any way. It is the scaler (TCON) that has to support Adaptive Sync; the panel has no clue what's going on. It's all controlled by Adaptive Sync on the hardware side in the monitor, and FreeSync then drives the Adaptive Sync scaler.
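To make the division of labour concrete, here is a toy model (the class names and structure are mine, purely illustrative, not anyone's real driver architecture):

# Toy model of who does what in the FreeSync/Adaptive Sync chain.
class Panel:
    # Dumb glass: only imposes physical refresh limits.
    def __init__(self, min_hz, max_hz):
        self.min_hz, self.max_hz = min_hz, max_hz

class Scaler:
    # Monitor-side chip: implements VESA Adaptive Sync over DisplayPort.
    def __init__(self, panel):
        self.panel = panel
    def set_refresh(self, hz):
        return max(self.panel.min_hz, min(hz, self.panel.max_hz))

class FreeSyncDriver:
    # GPU-side software: asks the scaler to track the game's frame rate.
    def present(self, scaler, fps):
        return scaler.set_refresh(fps)

monitor_scaler = Scaler(Panel(40, 144))
print(FreeSyncDriver().present(monitor_scaler, 90))  # -> 90: refresh follows frame rate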

 

  • If you'd actually read my posts, you'd know that AMD has a FreeSync branding program. It is entirely possible to make an Adaptive Sync monitor with a Hz interval of 40-42Hz; that would be useless for gaming, and a mockery of Adaptive Sync, but possible. So AMD can review your monitor, and if it has an acceptable interval and performs well, you get to use the FreeSync logo and branding on the box/monitor. None of this makes AMD responsible for, or gives it any power over, any hardware. AMD also bears no responsibility for how retailers plaster the brand all over to make things easier for their customers. Asus is making an Adaptive Sync monitor that is not FreeSync-branded.
  • PROVE IT. AMD is not responsible for other companies' hardware.
  • Their marketing is not lying: the FreeSync driver software supports the full Hz interval/range of Adaptive Sync. The problem is that their marketing compares what AS supports with what Nvidia has said G-Sync supports. It's only within the last couple of weeks that Nvidia suddenly maybe/perhaps/might support more than 30-144Hz. You cannot blame AMD for Nvidia's lack of communication.
  • Not the panels. Again, do you mean monitors? Do you know the difference? Do you understand that the panel only sets the Hz limitations, based on how fast the crystals can move? No panel knows about, and thus does not have to "support", any sync tech like G-Sync or Adaptive Sync.

 


 

So many logical fallacies! Don't tell me what I'm saying; I know that better than you, thanks.

And it's completely fair for AMD to say that their driver software supports a hardware spec 100% and that it is future-proof.

I am not at all confused. You seem to be the only person in this thread who is not getting it.

 


 

  • Oh wow. So now free advertising is equal to licensing and/or proprietary hardware pricing? That is so far-fetched, I doubt you even buy it yourself. If you're going to use my arguments against me, at least make sure you understand what you are talking about: how is it possible to lock someone into an ecosystem that is not an ecosystem, but an open industry standard?!? You have completely lost it now. Nvidia is free to support Adaptive Sync all they want, just like any other VESA member.
  • Those Korean monitors are like laptop panels: they have no scaler, only a TCON to drive the panel. Adaptive Sync only works over DisplayPort, so we have no idea if it will create any latency on the other inputs when you have to use DP exclusively; PCPer themselves conclude that Adaptive Sync works just as well. We will have to see. These are new scalers, so we cannot draw conclusions from older technology. And again with the logical fallacy: 100% of all 2015 Samsung 4K monitors will support AS. This is already an industry standard, and it will be widely implemented in non-gamer monitors as well. My point was, and is, that Intel and others might support AS for non-gaming reasons too.
  • I can; I just don't see it as a problem. If G-Sync supports a wider Hz interval, they should have said so.
  • Do you even know how G-Sync works? Yes, at the monitor's max supported Hz, G-Sync forces V-Sync on; even PCPer is a source on that. I agree that the difference is minuscule, but it's still factual, and the V-Sync is still a bad choice for pro players: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind

"AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. For both above and below the variable refresh area, AMD allows gamers to continue to select a VSync enabled or disabled setting. Though it doesn't exist, imagine if you would a matching 2560x1080 75 Hz monitor with G-Sync support. When your game is capable of rendering above 75 Hz, say the same 85 FPS mentioned above, then you would be forced into a VSync enabled state. NVIDIA has currently decided that the experience of VSync on at the peak refresh rate of the monitor is the best experience for the gamer."
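The three zones PCPer describes boil down to a simple decision. A sketch of the logic as I read it (the window numbers and the vsync_above switch are illustrative, not vendor code):

# Sketch: behaviour at a given frame rate relative to the VRR window.
def vrr_behavior(fps, window=(40, 144), vsync_above=True):
    low, high = window
    if fps < low:
        return "below the zone: flicker risk, or frame multiplication if supported"
    if fps <= high:
        return "in the zone: refresh follows frame rate, no tearing, no added lag"
    # Above the zone: G-Sync clamps with V-Sync on; FreeSync lets you choose.
    if vsync_above:
        return "above the zone: V-Sync wait (added latency)"
    return "above the zone: tearing, minimal latency"

print(vrr_behavior(85, window=(40, 75)))                     # PCPer's 75Hz example
print(vrr_behavior(85, window=(40, 75), vsync_above=False))  # FreeSync with V-Sync off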

 


 


 

So what do you think he means when he says "seamless transition"? Because with FreeSync you can either enforce V-Sync or run a standard variable framerate (non-synced). When G-Sync reaches 30Hz or below, it doubles the monitor's refresh rate and shows each frame twice; so at 29 fps, the monitor runs at 58Hz, showing each frame twice. This is a clever trick I hope AMD will implement in their drivers (it should be easy enough). It gives a "seamless transition" that reduces stutter and avoids tearing, as the frame rate still runs synced, just with each frame doubled.

[image: diagram of G-Sync doubling the refresh rate below the panel's minimum]

 

 


 

Like I said, I think he is biased, based on his ignorant comments on Adaptive Sync/FreeSync. I still see PCPer as a reputable source, but I am wary of Allyn's comments on Nvidia and AMD.

 

Sure, low Hz is definitely a good place for synced frame rates to work. But we can both agree that 30 fps, or worse, sub-30 fps, is a bad gaming experience, right? Right now LCD tech does not support it, though. I can tell you why it is not possible, if you want.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Oh come on now, are you really discrediting VESA now? You have no idea what monitors or panels they have at their disposal, OLED inclusive. You know Nvidia is a VESA member too? Of course VESA would not make a standard, that doesn't work.

 

We know Nvidia don't want to support Adaptive Sync, but that is their own choice. They are free to do so, if they want or are forced to down the road; you know Nvidia can just make a Gsync branded driver, that utilizes Adaptive Sync, right?

 

 

Do you even know what a panel is? Are you confusing panel with monitor? A panel cannot be labelled Freesync; Freesync is graphics driver tech. A panel does not need to be special in any way. It is the scaler (TCON) that has to support Adaptive Sync; the panel has no clue what's going on. It's all controlled by Adaptive Sync on the hardware side in the monitor, and Freesync then controls the Adaptive Sync scaler.

 

  • If you'd actually read my posts, you'd know that AMD has a Freesync branding program going. It is entirely possible to build an Adaptive Sync monitor with a hz interval of 40-42hz. That would be useless for gaming, and a mockery of Adaptive Sync, but possible. So AMD can review your monitor, and if it has an acceptable interval and performs well, you get to use the Freesync logo and branding on the box/monitor. None of this makes AMD responsible for, or gives them any power over, anyone's hardware. AMD also has no responsibility for how retailers plaster the brand all over to make it easier for their customers. Asus is making an Adaptive Sync monitor that is not Freesync branded.
  • PROVE IT. AMD is not responsible for other companies' hardware.
  • Their marketing is not lying. The Freesync driver software supports the full hz range of Adaptive Sync. The problem is that their marketing compares what Adaptive Sync supports with what Nvidia has said Gsync supports. It's only within the last couple of weeks that Nvidia has suggested Gsync maybe/perhaps/might support more than 30-144hz. You cannot blame AMD for Nvidia's lack of communication.
  • Not the panels. Again, do you mean monitor? Do you know the difference? Do you understand that the panel only sets the hz limitations, based on how fast the crystals can move? No panel knows about, and thus does not have to "support", any sync tech like Gsync or Async.

 

 

So many logical fallacies! Don't tell me what I'm saying; I know that better than you, thanks.

And it's completely fair for AMD to say that their driver software supports a hardware spec 100%, and that it is future-proof.

I am not at all confused. You seem to be the only person in this thread who is not getting it.

 

 

  • Oh wow. So now free advertising is equal to licensing and/or proprietary hardware pricing? That is so far-fetched, I doubt you even buy it yourself. If you're going to use my argumentation against me, at least make sure you understand what you are talking about: how is it possible to lock someone into an ecosystem that is not an ecosystem, but an open industry standard?! You have completely lost it now. Nvidia is free to support Adaptive Sync all they want, just like any other VESA member.
  • Those Korean monitors are like laptop panels: they have no scalers, only a TCON to drive the panel. Adaptive Sync only works over DisplayPort, so we have no idea if it will create any latency on the other inputs when you have to use DP exclusively. PCPer themselves conclude that Adaptive Sync works just as well. We will have to see; these are new scalers, so we cannot draw conclusions from older technology. Again with the logical fallacy. 100% of Samsung's 2015 4K monitors will support Adaptive Sync. This is already an industry standard, and it will be widely implemented on non-gamer monitors as well. My point was, and is, that Intel and others might support Adaptive Sync for non-gaming reasons too.
  • I can, I just don't see it as a problem. If Gsync supports a wider hz interval, they should have said so.
  • Do you even know how Gsync works? Yes, at the max supported monitor hz, Gsync forces Vsync on (see the sketch below). Even PCPer is a source on that. I agree that the difference is minuscule, but it's still factual, and Vsync is still a bad choice for pro players: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind
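A rough sketch of that top-of-range difference (my own simplification in Python; the 144hz cap and the frame times are illustrative assumptions, not measured behaviour of either product):

```python
# Sketch: what happens when the game renders faster than the panel's max refresh.
# Gsync (as described above) waits for the next refresh, Vsync style;
# Freesync lets the user choose Vsync on (wait) or off (tear immediately).

PANEL_MAX_HZ = 144
MIN_FRAME_TIME = 1.0 / PANEL_MAX_HZ  # ~6.94 ms: fastest the panel can refresh

def time_on_screen(frame_time_s: float, vsync_above_max: bool) -> float:
    """Seconds a finished frame is held before the next one can be shown."""
    if frame_time_s >= MIN_FRAME_TIME:
        return frame_time_s      # inside the VRR window: shown immediately
    if vsync_above_max:
        return MIN_FRAME_TIME    # held until the next refresh (slight latency)
    return frame_time_s          # scanned out mid-refresh (possible tearing)

# A 200 fps game (5 ms frames) on a 144 Hz panel:
print(time_on_screen(0.005, vsync_above_max=True))   # capped to ~0.00694 s
print(time_on_screen(0.005, vsync_above_max=False))  # 0.005 s, tearing possible
```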

 


 

 


 

No, nobody is discrediting VESA; not sure where you got that idea, or why you even stated it. We do know what panels are planned, though, as stated by Malventano. He wouldn't just pull that out of thin air unless it had some basis. And what does NVIDIA being a VESA member have to do with anything? Stop straying off topic.

 

They already have a G-Sync-like driver that basically works like Adaptive-Sync. They are using this method in their laptops; remember the leak from a few months ago about being able to use G-Sync without the hardware module in laptops. But again, I'm not sure why that is relevant.

 

Yes, I know what a panel is; no, you don't need to argue semantics here. Because that's really all you have left: arguing semantics and nothing more than baseless arguments that aren't even tied to any of my points. A monitor can be labeled FreeSync, and they have been, as I posted above two times. Then I love how you give more widely-known information in your rebuttal as if it has any relation to what we are talking about.

 

What are you talking about? Now you are running out of replies, so you are responding with complete mindlessness. What does a 40-42Hz interval have to do with what I said? AMD is 100% responsible for what they put in their own advertising; that is what I am talking about, but yet again you ignore that fact because you have nothing left to respond with except empty arguments.

I did prove it: they have their AMD branding on every single Adaptive-Sync monitor. All the ads they have carry AMD or AMD Adaptive Sync branding. You think that branding is free? Please.

If their marketing is not lying, then what is it? They market 9-240Hz but only offer 40-144Hz. If that's not lying, then please explain to me what it is. AMD doesn't support 9-240Hz; as you said before, that is the VESA interface specification. So what are you even saying now? How can AMD use 9-240Hz in their marketing if they aren't responsible for that specification? The actual monitors released only do 40-144Hz, and these same monitors have AMD FreeSync branding on them. How are they not responsible for that?

Here we go arguing semantics again. People use panel or monitor, and vice versa, all the time; the words can be interchangeable. Only when you have nothing left in your argument do you nitpick about that. Yes, we understand that a panel is the actual screen used, but when talking about monitors you can use both. Stop acting like you've never seen anyone use both in conversation. Like I said before, you're arguing semantics because you have nothing else left in your argument.

 

You clearly don't know better than me; you ran out of arguments a long time ago. Now you are just nitpicking and spouting information that is already known. It's not completely fair, because they aren't telling the truth. You know it, and everyone knows it. It misleads a potential customer who doesn't know any better: they might think that AMD is better because they do 9-240Hz and NVIDIA only does 30-144Hz, except AMD only does 40-144Hz. So no, it's not fair. Multiple people understand me in this thread, even people who are not agreeing with me. You are the only one who does not.

 

What it costs to implement G-Sync into a monitor is not a licensing fee. It requires a module, which obviously costs money to implement, but it's not a licensing fee. And nothing is free; you still don't seem to get that point. Because AMD says "no licensing fee," you automatically believe that means nobody is getting paid, which I find hilarious. I know what I am talking about; you clearly don't, which is why your responses have less and less substance every single time you reply.

Oh, so now, in order to utilize FreeSync, you are saying you can use any graphics card? But I don't know what I am talking about? Firstly, you require an AMD card to use FreeSync, so yes, you are locked into an ecosystem. If you want G-Sync you have to use NVIDIA; if you want FreeSync you have to use AMD. Sure, Adaptive-Sync is open to anyone, but that doesn't mean anyone will be able to use it, since there are only two discrete graphics card manufacturers in this space at the moment (there are others, but I'm talking realistically). But I've completely lost it... I'm starting to wonder whether you even read what you write after you write it.

• You do realize that just because you call something a logical fallacy doesn't necessarily make it one, right? No matter how many times you use the term. People point out that in that chart AMD says FreeSync is compatible with standard monitor features, one of those being multiple inputs. You think that if it was better to put more inputs on a G-Sync monitor, NVIDIA wouldn't have done it? Of course they would. But like I said before, you'll find that not having multiple inputs helps with input lag. You may say that doesn't matter because it's older technology and may only once have been the case, but that doesn't mean it's not the case now. Like you said, these are "new scalers"; the same can hold true, which is probably one of the reasons NVIDIA chose to go with a single input.

• Ok, you don't see it as a problem, but I do, because it's not completely honest. When they talk about G-Sync in other articles, they do state that the current panel limits are not G-Sync limits; they just don't promote it as being able to do intervals that don't exist. Which is how it should be.

• But my original point is that it shouldn't really be considered a "performance loss." Performance loss should be understood as a loss in framerate caused by the feature being enabled, and 1-1.5% just isn't enough to justify calling it a legitimate performance loss.

 

But doesn't that simply confirm what I'm saying: that G-Sync is able to handle rates below the monitor's refresh range, and that the G-Sync module is capable of handling such refresh rates? Even if it makes up for the loss by doubling or tripling, etc., it still has a way to maneuver when the framerate drops that low.

 

Well, if you're wary of Allyn's comments, there is nothing I can do to stop you from feeling that way, but if you consider PCPer a reputable source, I would normally assume you consider the people who work there reputable also.

 

Yes, sub-30 is a bad gaming experience, but that's not to say sub-30 fps won't happen with, say, a single GPU on a 4K or 5K monitor. Anyone can name an instance when they were playing a game and their framerate dropped substantially during an intensive scene, sometimes into the 10s or 20s. I have a pretty good idea of why they don't go that low right now, but feel free to enlighten me.


I gotta ask: does everyone in here have an equally hard time understanding me? Or is it just Stroonz?

 


 

Semantics are important when discussing; otherwise we don't know what the other person actually means.

 

The problem is you don't seem to understand why these monitors are called Freesync. Here are the two reasons:

  1. Retailers use the branding to help consumers know what to get. AMD has no responsibility for this. The Asus MG279Q will not carry the Freesync branding officially, but retailers might still use it. The reason the Asus won't have it is because:
  2. AMD has an official, license-free certification program, so monitor vendors with Adaptive Sync-supporting monitors can freely apply to use the Freesync branding. Only if the supported hz interval is wide enough and gives the end user a good experience will the vendor get to use the branding. If, for instance, the interval is only 40-42hz, then the experience is bad and you won't be allowed to use the branding: http://techreport.com/news/27650/here-240-fps-footage-of-amd-freesync-tech-and-some-new-info 
    Certification of FreeSync monitors will be handled by AMD directly. The company says it wants to ensure its brand is synonymous with a "good experience." The certification process will be free of charge, the company tells us, so it hopefully won't add to the cost of FreeSync panels. That said, AMD says its drivers will also allow variable-refresh mojo with non-FreeSync-certified panels, provided those panels support the DisplayPort 1.2a Adaptive-Sync specification.

     

Do you understand why the monitors are called Freesync now? There is NO Freesync software or hardware inside the monitors, nor does AMD/Freesync have any influence over what hardware is inside them either.

 

 


  • It was an example for the Freesync certification program; read above. 
  • You proved nothing. You did not prove that AMD has any influence on hardware, nor that AMD gets any money from it, and yes, we know they don't: http://support.amd.com/en-us/search/faq/225

    AMD has undertaken efforts to encourage broad adoption for AMD FreeSync™ technology, including:

    Royalty-free licensing for monitor vendors;

    Along with my source above as well. So yeah, that branding IS free. Feel free to disprove it.
  • Freesync is still just a software driver; it does not provide any hardware, nor is it responsible for any hardware limitations. It is, and as a technology always will be, a driver implementation of Adaptive Sync, which supports 9-240hz. That does not mean that ALL Adaptive Sync monitors, or even the Freesync-branded ones, need to support that. Not all Gsync monitors support 144hz either.
  • No, people only use "monitor". Do not project your mistakes onto everyone. The panel is the physical part inside the monitor that holds all the pixels. That is why semantics are important: when you talk gibberish and confuse terms, I won't understand what you say, because it is wrong. Either way, AMD still has no control over monitors OR panels OR scalers OR TCONs OR input options OR anything else.

 

 


 

  • We know nothing of Nvidia's pricing model for Gsync. What we do know is that Nvidia makes a lot of money on all Gsync monitors, and that AMD makes absolutely nothing on any monitors. The technology is an industry standard in DisplayPort, which is license-free for VESA members, and the certification program for using Freesync branding is also free of charge. If you believe otherwise, feel free to provide proof.
    Please do not use terminology you don't understand. There is no closed/locked ecosystem, because it is all based on open standards. AMD is not responsible for others, like Intel and Nvidia, not using an industry standard made by an industry body they are both members of. You cannot use terminology any way you please; terms have definitions. Be angry at Nvidia for not supporting an industry standard.
  • Except it was a logical fallacy. You misrepresented my views to support your own argument; that is spot-on the definition right there.
    Nvidia did not put additional inputs on, because it would require a lot more R&D for no good reason, as only DP can be used for Gsync. It might also require additional hardware. With Adaptive Sync, the scaler vendors only needed to add functionality; they could reuse all their tech, inputs, OSDs, colour settings, etc. So including those made sense, as it came at little to no cost. Like I said, HDMI connectivity is a plus, even if synced frame rates do not work on that input. Any latency penalty is beyond our knowledge, as we don't know how these scalers behave in Adaptive Sync mode.
  • That is your subjective view.
  • Again, I agree it's minuscule, but it is factual as well. The Vsync part still applies.

 


 

Depends on how you define "able to handle below". Both Freesync and Gsync work below the minimum hz of the monitor; they just do it in different ways. But I agree that the Gsync version seems better, and I hope Freesync will be updated to do the same (Gsync does this at a hardware level in the monitor; Freesync could do it at a software level on the graphics card). Either way, you need to read up on Gsync, because it does in fact enable Vsync when reaching, or going above, the max hz of the monitor. Not based on the article from Blur Busters, but from PCPer and others, who have tested this.
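The software-level approach suggested here might look something like the following (pure speculation for illustration, not AMD's actual driver code; the 40hz floor matches the current Freesync monitors discussed in this thread):

```python
# Speculative sketch: a driver keeping a Freesync panel inside its VRR window
# by re-sending the last rendered frame, mimicking Gsync's hardware doubling.
import math

VRR_MIN_HZ = 40.0
MAX_HOLD_S = 1.0 / VRR_MIN_HZ  # 25 ms: longest the panel may hold one frame

def sends_per_frame(frame_time_s: float) -> int:
    """How many times to (re)send a frame so no single image is held too long."""
    return max(1, math.ceil(frame_time_s / MAX_HOLD_S))

for fps in (60, 30, 15):
    n = sends_per_frame(1.0 / fps)
    print(f"{fps} fps -> send each frame {n}x, panel runs at {fps * n} Hz")
# 60 fps -> 1x (60 Hz); 30 fps -> 2x (60 Hz); 15 fps -> 3x (45 Hz)
```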



 

But you do know what I mean; you are just purposefully pointing things out to distract from the fact that you really have no actual argument.

 

• If AMD has no responsibility for a monitor carrying the FreeSync name, why do you contradict yourself in the very next bullet point?

• Here you say AMD has a certification program through which monitor vendors can use the FreeSync name. So obviously AMD has responsibility for them using the FreeSync name if there is a certification process to use it. And that is not a reason why the ASUS won't carry the FreeSync name: the Asus MG279Q has a minimum refresh rate of 40Hz and a maximum of 120Hz, so by your own reasoning it is clearly high enough and will provide a good experience.

• Lastly, I'm not sure how any of this is relevant to what you quoted. More baseless arguments, I suppose.

 

The fact that there is no hardware inside the monitors has nothing to do with the fact that AMD willingly puts their name on them, which you seem not to get, or the concept goes right over your head. As you said, there is an actual certification that has to be passed to carry the AMD name, and AMD willfully puts their name on a monitor after it passes. Therefore you cannot say AMD has no influence over what is put inside the monitors, after all the information you just provided. Clearly, everyone now knows AMD willingly puts their name on the monitors even though they know they cannot do 9-240Hz.

 

• It was an example that was unwarranted and unneeded because it had nothing to do with what I was talking about. It was one of your random asides thrown in to deflect from the actual topic.

• No, but what I did prove, based on your own information, is that they have a process for deciding which monitors get the FreeSync branding. Therefore they know exactly what hardware is inside those monitors. If the monitors do not do 9-240Hz, they know that. Therefore they are responsible for giving a monitor the FreeSync branding even though it doesn't do 9-240Hz, like they claim in their advertising.

• If branding wasn't free, why would AMD have such a process to begin with for putting the FreeSync name on a monitor? If it was free, anyone could put the FreeSync name on their Adaptive-Sync monitors. Why even have a process at all if it is free? What would be the point? It doesn't make sense to have an entire certification process for the FreeSync branding if it was simply free. Just because AMD says they are doing it all for free doesn't make it true. When does any company do anything for free? Companies exist to make profit; that's their purpose. If something is free, there is obviously some other loose end where they acquire money in the process.

• Firstly, all G-Sync monitors support 144Hz except one, a 4K panel. It does not support 144Hz because there are no 4K panels that can do 144Hz yet; they are limited by the DP 1.2 spec and current panel limitations. Not one AMD FreeSync-branded monitor offers 9-240Hz. Not one. If you think you are allowed to claim in your advertising that your FreeSync-branded monitors support 9-240Hz, and then your monitors come out only able to do 40-144Hz, and that's okay, then clearly you are nothing more than a fanboy. If you are touting a specification as why yours is better than the competition's, you need to back up that claim, not outright lie about it.

• People use panel and monitor. It wasn't a mistake. Here's Malventano using panel himself:

 

I'm sorry, but this little bit of AMD marketing crap needs to stop being repeated. The adaptive sync spec range is 9-240, and AMD is repeating it to make themselves look better than the competition. If NV did the same, they could probably claim 1-240 (as they have a functional method to push lower than any panel physical limit). The AMD claimed spec is nowhere near what any panels are going to be capable of for a very long time - it's just what the interface standard supports. If NV claimed 1-240 as if it was so much better than everything else, everyone here would be calling BS (myself included), so you guys should really stop repeating that spec in that context. The real specs are those of the available panels for FreeSync, and for G-Sync (with it understood that they rate at the minimum of the panel but are capable of pushing lower with a method not (yet?) employed by FreeSync). I say 'yet' because if AMD's driver devs were sharp enough, they could implement frame redraws at the driver / GPU level.

 

 

Oh, now you didn't understand what I was saying. How convenient: when you get proven wrong, suddenly you don't understand what I am saying. Nobody was confusing terms; stop being ridiculous. You really lost your argument a long time ago, when you started nitpicking about panels and monitors. I guess Malventano is wrong too, then, for using panel. And according to you, they do have a say, because they require a certification to put the FreeSync name on a product.

 

• If we know nothing of NVIDIA's pricing model for G-Sync, how can AMD say that NVIDIA charges a licensing fee, like they did in their chart? You don't actually believe that AMD makes no money on FreeSync monitors, do you? The proof is in the pudding: they allow people to use their name on products, and there is an actual system that has to be followed in order to do so. You also require an AMD graphics card to utilize the technology. Their logos are painted across the advertising for the monitors and used on the websites where the monitors are purchased; FreeSync is everywhere. You cannot actually believe that AMD is making no money on this, can you? Especially knowing how much debt the company is in, would they really just hand things out for free?

• So you are actually going to deny that you need an AMD card to use Adaptive-Sync? If I don't understand the terminology, how do you not understand why it was used? LMAO. You need a compatible AMD card to use Adaptive-Sync/FreeSync; there is no way around it, and no other way to describe it. How much of an open standard is it if you require a specific company's card alongside a capable monitor (also carrying that company's branding) in order to use it?

• If I misrepresent your views to support my argument, what do you do? Maybe your views are just a misrepresentation of the truth.

• But didn't you say in a previous post that having more inputs would allow you to do other things, like hook up an Xbox or PS4 or a TV? Why wouldn't NVIDIA want to do that, unless there was a specific reason not to, like input lag? Also, Adaptive-Sync can only be used over DP as well, so isn't that a contradictory statement?

• It's not really subjective; dishonesty in marketing is not subjective.

• Facts and factoids are quite different, in that a factoid is mostly useless or trivial, although true. And saying that G-Sync has a performance penalty is nothing more than a factoid.

 

They do it in different ways, and those ways make all the difference. I suggest you read this article:

 

Below that 40 FPS mark though things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.

 

Zoomed in on the area of interest, you get a better view of how G-Sync and FreeSync differ. Effectively, G-Sync has no bottom window for variable refresh and produces the same result as if the display technology itself was capable going to lower refresh rates without artifacting or flickering. It is possible that in the future, as display technologies improve, the need for this kind of frame doubling algorithm will be made unnecessary, but until we find a way to reduce screen flicker at low refresh rates, NVIDIA's G-Sync VRR implementation will have the edge for this scenario.

 

[Graph: effective refresh rate of G-Sync vs. FreeSync below the VRR window]
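Putting the numbers from that quote into a quick comparison makes the difference obvious (the ~37 and ~19 fps thresholds come from PCPer's description; everything else is my own approximation, not their data):

```python
# Approximate reconstruction of the behaviour PCPer describes above.
# FreeSync (current monitors): refresh clamps at the 40 Hz floor below 40 fps.
# G-Sync: inserts a second frame below ~37 fps and a third below ~19 fps.

def freesync_refresh(fps: float) -> float:
    return max(fps, 40.0)   # panel stays static at 40 Hz under the window

def gsync_refresh(fps: float) -> float:
    if fps < 19:
        return fps * 3      # frame tripling (per PCPer's description)
    if fps < 37:
        return fps * 2      # frame doubling kicks in around 37 fps
    return fps

for fps in (60, 35, 25, 15):
    print(f"{fps} fps: FreeSync {freesync_refresh(fps):.0f} Hz, "
          f"G-Sync {gsync_refresh(fps):.0f} Hz")
# 35 fps: FreeSync 40 Hz (static) vs G-Sync 70 Hz (each frame shown twice)
```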

 

http://www.youtube.com/watch?v=VkrJU5d2RfA&t=15m30s

 

"In practical terms the variable refresh range of G-Sync Panels is almost nothing... there is not a bottom"

 

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

 

So G-Sync is actually capable of going below 30Hz, even beyond the panel's limits, whereas AMD claims 9-240Hz but is not capable of delivering it.



 

No, I did not, because you used the wrong terminology. Panel and monitor are still not the same. Feel free to make a poll on this site.

 

  • AMD has no responsibility for how retailers use it. Several of your pictures were from retailers.
  • License-free; don't manipulate, please. Just because a monitor can be used with Freesync still does not mean the monitor HAS to support the full range. A monitor with DisplayPort 1.2a does not have to support Adaptive Sync either, nor does it have to be 4K, just because both are supported by the VESA standard AND the AMD graphics card. You are still confusing brands, software, and industry standards with actual hardware; you don't seem able to learn this. The Asus monitor will not be called a Freesync monitor because Asus chose not to have it certified. Probably because they help Nvidia make Gsync; who knows.

Just because AMD allows their branding to be used does not give them any power over what hardware the monitor vendor uses. Just because Freesync supports 9-240hz does not mean every single Adaptive Sync monitor (or even every Freesync-branded one) has to support the entire interval. Jeez.

 


 

  • It was an example of why the certification program exists, which is one of the two reasons a monitor might be Freesync branded. It is not difficult to understand.
  • No, you did not prove anything, because I told YOU about it. Only you demand that all Freesync monitors support 100% of the possible features of Freesync/Adaptive Sync. Only you.
  • Omfg. Seriously, dude? I have answered this twice already. The irony is that the very first point in your quote above is EXACTLY why I made the example. You call it unneeded, and then proceed to ask this question! OMFG.
    I will answer for the third and last time: they have this certification program because any monitor vendor could make an Adaptive Sync monitor with a hz interval of just 40-42 (or whatever uselessly small interval), which would give a bad gaming experience, as you would have tearing or stutter everywhere except between those two rates. Have you finally understood it now? After three times?
    If you still don't get it, feel free to prove it's a license, even after AMD said it wasn't and the media says it isn't either. The burden of proof is on you.
  • All 4K TN panels should be able to do 144hz; it's more of a DisplayPort/scaler issue. Only you demand that a Freesync monitor support the entire range of the Adaptive Sync interval.
  • No, they don't. And the ultimate irony is that Malventano didn't either: he literally means the panel, as the panel sets the limitations of the hz interval, not necessarily the scaler, and not the monitor in itself. Epic facepalm!

 


 

You made a mistake, you keep making it, and you try to blame me for it. Get over yourself. Malventano did not make a mistake; you did, again, by not understanding what he said. AMD has a say in who gets to use the branding, as decided by AMD themselves. AMD still has no say in what hardware monitor vendors use, and still no responsibility for how retailers use the branding.

 


 

  • Just because we don't know doesn't mean no one does. AMD knows the monitor vendors and gets a lot more info than we ever will. Nvidia has never refuted this statement; I wonder why? I know AMD doesn't make money on this, because they have clearly stated so several times. After all, the branding makes no difference, as the monitors implement Adaptive Sync, not Freesync. If Nvidia doesn't want AMD to make money off of Adaptive Sync, they can just support it themselves, license-free as well.
  • Again, a nice logical fallacy. Adaptive Sync is an open, license-free standard. It's not my or AMD's fault that no one else supports it; my point is that anyone can, and probably will down the line too. Also not my fault that you fail to understand theories and terminology, and use them wrongly.
  • That is your subjective opinion. Prove it.
  • Because Nvidia doesn't make any money off the consoles (unlike AMD). Gsync is made by Nvidia, for Nvidia, and their closed proprietary ecosystem; it fits perfectly with Nvidia's current and previous business strategies. Extra inputs would also increase the R&D cost at no benefit to Nvidia. The monitor scaler, on the other hand, isn't designed or made by AMD; they have no say on the matter, and it makes sense for the monitor vendors to include these inputs.
  • You calling it dishonest is subjective. I don't think it is dishonest; that is my subjective opinion.
  • Indeed, a factoid is unverified, false, or fabricated; you even used that word wrong. Now do you understand the necessity of semantics? Neither is true of AMD's statement. The Vsync issue is still an issue for pro players.

 


 

Which means FreeSync is not currently capable of going below 40Hz at all, while G-Sync has an implementation for when you do go below 30Hz. So where's that 9-240Hz spec again?

 

I have read it, and I know exactly how both work. Now that you have, do you understand that Gsync uses Vsync at (and above) the max hz of the monitor? Did you also notice that the physical hz of the Gsync monitor did not go below 30hz? On non-G/A/F-sync monitors you can also go below 30 fps, but the graphics card will never drive the refresh rate below that, afaik. So it all depends on how you define "go below x hz". I've already stated that I prefer Gsync's way, and I assume we will see an updated Freesync driver doing the same thing. Conclusion: Gsync deals with the minimum hz better, and Freesync deals with the maximum hz better; both work pretty well in between. Gsync is expensive and proprietary to a closed ecosystem; Adaptive Sync is a free, open standard any VESA member can use (including Nvidia).



 

 

You knew what I meant; you are just saying you did not, to try to make a point out of it, like the epic straw man that you are. It's not wrong terminology. You have to be kidding at this point; if that's seriously all you have left to stand on, I consider this conversation done. You are being quite ridiculous. I find it hilarious that you have nothing left but panel versus monitor, actually claiming that they cannot be interchangeable in some cases. I'm wondering if English is not your native language, because people call monitors a lot of things, like "screens." Should people be hassled for using the term screen, when screen actually refers to the panel? Or are you going to nitpick with them too for using screen? The fact that I even have to explain this to you is quite sad.

 

• Most of the pictures were from AMD's own advertising, like this one:

 

[Image: AMD FreeSync marketing slide]

 

So that's your argument: it's the retailers who are doing this? LOL. Does the above picture look like it's from retailers?

 

• A certification to use a specific branding or logo is essentially a type of license. Do you not understand that? You are licensing out your branding to another company. LOL, I can't believe you don't understand that. So now the monitor doesn't have to support the full range? Why is that, because it's convenient for you and AMD to run bogus advertising and then not live up to their own standard? Who is saying that any monitor supporting DisplayPort 1.2a has to support Adaptive-Sync? Nobody is; this is just more of your straw-man fallacies, trying to steer the conversation in a direction it is not going. More mindlessness from you. Nobody is confusing anything; really, you have to stop saying that. The only one who is confused is you, as I have pointed out numerous times. You really don't know what we are talking about, and the second you finally understand it, you dismiss it and carry on with senselessness.

 

Nobody is saying they have power over what hardware is being used. Nobody said that. What was said was that they put their branding on products that don't support the intervals that they claim. Jeez...

 

• It wasn't necessary or needed, you quoted me saying one thing then responded with something completely different. Stay on topic or don't respond to me at all.

• Yes, that should be considered normal. If you say you have a feature, you have to implement it when the product releases. How else do you hold companies to standards if they can just make up crap and then release a product that doesn't hold up to the previous claims?

• It wasn't an actual question, which is why I assume English is not your native language. It was a rhetorical question, to make a point. OMFG, JEEZ!

• There's no burden of proof; anyone who believes AMD isn't making money on FreeSync is just ignorant.

• There is a burden of proof, though, on the claim that all 4K panels can do 144Hz. Why don't you provide a source for that? I don't even demand that FreeSync support the entire range; I'm actually fine with them only supporting 40-144Hz. The problem is that it's not what they advertised, and that's where the problem lies.

 

It's not a mistake; it just appears you aren't fluent enough in English to understand that people use many words to describe the same things. I understood what he said; you are just making peevish criticisms and objections about petty matters at this point because, as I said before, you have nothing left in your argument.

 

Well, AMD has a say in which vendors get to use their branding, and there is a process by which monitors get that branding. That means they have a say in which products will have their name on them. Therefore, if the products do not support the range of 9-240Hz, they will know that. They have a say in which products carry their name, which in essence means they have a say in the hardware that has an AMD name on it. They might not dictate the actual hardware inside it; of course we understand that. What they do dictate is putting their name on that hardware. But you keep saying that doesn't matter, that they are allowed to claim 9-240Hz without having to actually support it in the products with their branding. They have no say in how retailers use their branding? You cannot be serious. Do you not have the slightest understanding of how retailers work in tandem with companies?

 

• Maybe NVIDIA hasn't refuted the statement because they don't care what AMD is saying. NVIDIA trumps AMD in discrete GPU sales; they don't need to refute anything AMD says. And just because a company claims they don't make any money, that means they don't make any money? I guess you haven't heard of "non-profit" organizations. Jeez.

• It's not a logical fallacy; please stop using that term. You really don't know what it means, and it doesn't make you appear more intelligent, so I suggest you stop using it. There was nothing about what I said that was a fundamental error in logic; the only error in logic is your reading of my statements. Who said anything about it being anyone's fault? I'm just stating the facts, how it is: G-Sync is NVIDIA, FreeSync is AMD. Those are the facts. It's not that I don't understand theories or terminology; it's that you use "logical fallacy" where it doesn't even apply, and the only error in logic present is your lack of understanding of what is being said.

• See above, where I show that your use of the term "logical fallacy" doesn't even apply and is used incorrectly. The only thing that happened was that you didn't understand something, so you called it a logical fallacy to make yourself appear like you know what you are talking about. Or maybe you did understand what I was saying (I really don't see how anyone couldn't), but you felt the need to call it a logical fallacy to distract from the fact that you got proven wrong. Like I said, your views are a misrepresentation of the truth.

• So now you need to make money off of consoles to have more inputs on your monitors? People like to use their monitors for things other than just consoles, like Blu-ray players, or as a television. I believe that if NVIDIA could provide additional inputs without any degradation of performance, they would; but they didn't, which leads me to believe it plays a part in input lag. And why would there be additional cost to NVIDIA? I don't see how that would be possible. Some of the panels used in G-Sync monitors are used in other monitors without G-Sync, and those do have other inputs. Your point was that G-Sync only works over DisplayPort (which is why additional inputs weren't included); my point is that Adaptive-Sync also only works over DisplayPort, yet those monitors include additional inputs, so I don't see how that can be a point.

• So if that's not dishonest, was it dishonest how NVIDIA portrayed the memory layout and specs of the GTX 970 in advertising? Afterwards we discovered a lot of it was a misrepresentation of what they had said. If you say yes, then you are being a hypocrite, because by your standards what you advertise doesn't have to match the products you actually release (or, in this case, the products that carry your branding).

• Well, you obviously don't use the dictionary, which is why I think English is not your native language, and why I don't think you know that English words have multiple meanings. But you say I use words incorrectly (you didn't even look it up in the dictionary to see whether I used it correctly; you just blurted out, like a moron, that I used it incorrectly):

 

 

factoid
noun fac·toid \ˈfak-ˌtȯid\

: a brief and usually unimportant fact

: a briefly stated and usually trivial fact

 

 

http://www.merriam-webster.com/dictionary/factoid

 

No, your necessity for semantics is unwarranted, because you don't know what you are talking about the majority of the time, as in this instance and many others. It might be because English is not your native tongue; if that's the case, I guess I can forgive your mistakes and confusion. But then I don't see why you would attempt to point out flaws in other people's English, or in their understanding of the language, when it is their native language and not yours. I see you are from Copenhagen. Even if I had studied Danish throughout all my years of schooling, I still wouldn't attempt to correct a native Dane's word usage in Danish, because it wouldn't be my place to do so, Danish not being my primary language. Even if I were good enough to hold a conversation or write, there would probably still be a lot I did not know or understand. And if English is your native tongue and you just moved to Denmark from the US or Canada or the UK, I don't see how you cannot know that English has many polysemic words, and that many words are used loosely to describe the same thing.


Which means FreeSync is not currently capable of going below 40Hz at all. So where's that 9-240Hz spec again?

Your argument has had no leg to stand on since my initial reply to this accusation days ago. Unless you can provide substantial evidence for your claims, just concede the argument, because I can tell you right now that you're genuinely wrong. You've been spewing the same nonsense for days, blaming FreeSync because display manufacturers aren't producing 9-240 Hz displays. What are you going to do next? Blame G-Sync for not being able to support 240 Hz just because it's limited by panel technology? How many times does the community need to tell you that FreeSync has absolutely no implementation on the display side? That in itself is concrete evidence that the current batch of displays is not limited by FreeSync. Claiming FreeSync is only capable of 40-144 Hz just because a display doesn't happen to support anything outside that range is like saying it's impossible for a car to go over 55 MPH with a restrictor plate even though the speedometer reads 180. 9-240 Hz is the official supported range of FreeSync, as it has no limitations other than the display interface. There is no benefit to secrecy; they need to disclose the right info at the right time. That doesn't sit too well with you (and only you), although it's cold hard facts.
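
To put the restrictor-plate analogy in concrete terms, here is a rough Python sketch (my own illustration, nothing from AMD's driver; the panel numbers are just an example): the interface spec and the panel each impose a range, and what a given monitor actually delivers is the overlap of the two.

SPEC_RANGE = (9, 240)    # VESA Adaptive-Sync interface specification
PANEL_RANGE = (40, 144)  # what a current shipping panel reports (example)

def usable_range(spec, panel):
    # The variable-refresh window a buyer really gets is the intersection.
    return (max(spec[0], panel[0]), min(spec[1], panel[1]))

print(usable_range(SPEC_RANGE, PANEL_RANGE))  # -> (40, 144)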


 

Your argument has had no leg to stand on since my initial reply to this accusation days ago.

- snip -

 

Well substantial evidence is right here:

 

[Image: gsync2.png]

 

 

 

Below that 40 FPS mark though things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.

 

 

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ
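
For what it's worth, the frame-multiplication behaviour that excerpt describes can be sketched in a few lines of Python (my own approximation; the 37 FPS threshold comes from PCPer's measurements above, and the 40 Hz target is an assumption of mine, not an NVIDIA figure):

import math

DUPLICATION_START_FPS = 37  # below this, duplicate frames are inserted
TARGET_HZ = 40              # assumed refresh the module tries to stay above

def gsync_effective_refresh(fps):
    # Returns (frames shown per rendered frame, physical refresh in Hz).
    if fps >= DUPLICATION_START_FPS:
        return 1, fps               # refresh tracks the frame rate 1:1
    n = math.ceil(TARGET_HZ / fps)  # 2x, 3x, ... repeats per frame
    return n, fps * n

for fps in (45, 35, 19):
    print(fps, gsync_effective_refresh(fps))
# 45 -> (1, 45); 35 -> (2, 70); 19 -> (3, 57)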

 

 

FreeSync cannot currently go below 40Hz and has no implementation below 40Hz, whereas G-Sync has an implementation for when the framerate drops below the panel's minimum (kicking in around 37 FPS, to be exact). In actuality it has no lower limit, as stated by Malventano. So while G-Sync only claimed to go down to 30Hz, it is still able to go below the panel's refresh rate, and it has a system it follows when that happens.

 

I don't see how I'm genuinely wrong. FreeSync claimed to be able to go down to 9Hz, while in practice it only goes down to 40Hz.

 

It's not really nonsense. NVIDIA gave a realistic specification with their G-Sync range, while AMD went ahead and used the interface spec from VESA as theirs. Then the products released with AMD's branding, and they were nowhere near 9-240Hz. I don't see how anyone can see it differently: if you claim to do one thing, back up your claim. I wouldn't blame G-Sync for not supporting 240Hz, because NVIDIA never released specifications saying it can go up to 240Hz. It just happens that G-Sync can, and has no hard limits, but they never claimed it could at this current time. Which is why I don't see why you don't understand that.

 

And don't start about the community, because it's like two people. We understand FreeSync has no implementation on the display side; this is understood. But AMD still puts its branding on products which carry the FreeSync name.

 

Well, how is it concrete evidence? G-Sync only claimed 30-144Hz, but it turns out it can actually go below 30Hz without any issues. FreeSync went right ahead and said it can go down to 9Hz, but it can only go down to 40Hz, which is what we discovered after the products released with FreeSync branding.

 

I don't see why you AMD fanboys don't understand this. Everyone knows that 9-240Hz is the Adaptive-Sync interface specification; nobody is claiming otherwise. The problem is that they are using that specification rather than what is actually implemented by current technology, which will deter potential buyers from G-Sync, who might think that FreeSync is better because it can do 9-240Hz while G-Sync can only do 30-144Hz. This is what AMD compares in their own advertising.

 

It's nothing more than a marketing tactic to make yourself look better than the competition. I don't think it has anything to do with "secrecy"; I think they are just using dishonest marketing tactics. Pretty simple, really.

 

If you disagree with this, there's nothing more to talk about. I guess we will have to "agree to disagree" at this point because I don't think this thread is going anywhere.


Well substantial evidence is right here:

 

- snip -

You're still basing FreeSync's capabilities on displays, which is an invalid argument. Like I said, do you have any substantial evidence?


You're still basing FreeSync's capabilities on displays, which is an invalid argument. Like I said, do you have any substantial evidence?

How are you not understanding the info given here? Lol


I don't see how I'm genuinely wrong. FreeSync claimed to be able to go down to 9Hz, while in practice it only goes down to 40Hz.

 

At this moment in time, pending new display releases, BiG StroOnZ.



How are you not understanding the info given here? Lol

I understand the information completely; that is why I am still seeking a valid answer. Maybe you can provide more insight to back up his claims? With FreeSync being a GPU/driver technology, using display technology as evidence is in every way invalid. So unless he can prove that FreeSync is not capable of going below 40 Hz or above 144 Hz without referencing displays, his whole argument that FreeSync is not capable of 9-240 Hz is invalid.


I understand the information completely; that is why I am still seeking a valid answer.

- snip -

Well I mean, they have nothing to show for it. It's like me claiming I can jump ten feet, but I only ever do three because I'm limited by gravity or something.


Well I mean, they have nothing to show for it. It's like me claiming I can jump ten feet, but I only ever do three because I'm limited by gravity or something.

 

That doesn't mean the FreeSync technology is incapable of doing what they say it can. And your analogy is flawed. It would be more like you saying you could run 100 metres in 10 seconds, if only you weren't wearing high heels.


