
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

Well I mean, they have nothing to show for it. It's like me claiming I can jump ten feet, but I only ever do three because I'm limited by gravity or something.

Just because there's not a product available on the market to showcase the range doesn't mean that it's simply a fabricated number. It seems like people are under the impression that there is such thing as a "FreeSync display".


 

You're still basing FreeSync's capabilities on displays, which is an invalid argument. Like I said, do you have any substantial evidence?

 

How about this: provide proof that FreeSync-capable displays can go down to 9Hz and above 144Hz. Since there is no such proof, I'll sit here and wait while you just post the VESA interface specification of 9-240Hz. While this is the only piece of evidence available, it does not qualify as proof. Until AMD provides proof, their claims are misleading, because the displays are out and they have a minimum of 40Hz and a maximum of 144Hz. Nothing anywhere dictates that there are FreeSync-capable displays that can go below 40Hz or above 144Hz.

 

At this moment in time, pending new display releases, BiG StroOnZ.

 

Yes, at this moment in time, and for many years down the road. The technology doesn't exist and isn't currently planned. When it does exist, three years down the road, sure, AMD can state 9-240Hz all day long and I won't have a problem with them doing that. However, right now, current technology cannot reach those ranges. Therefore, it discredits their claims.

 

I understand the information completely, which is why I am still seeking a valid answer. Maybe you can provide more insight to back up his claims? With FreeSync being a GPU/driver technology, the use of display technology is in every way invalid. So unless he can provide evidence that FreeSync is not capable of going below 40 Hz or above 144 Hz without referencing displays, his whole argument that FreeSync is not capable of 9-240 Hz is invalid.

 

No, you don't understand the information completely, because if you did, you would have stopped replying a long time ago. Valid answers were given; you just choose to ignore them.

 

Why is it invalid? Because you want it to be? AMD puts their FreeSync branding on displays knowing full well that they aren't able to go down to 9Hz or above 144Hz. That is the evidence itself. There is no evidence available that FreeSync is currently capable of going below 40Hz or above 144Hz, not one single piece of evidence. Based on these facts, nobody can conclude that FreeSync is able to go down to 9Hz or above 144Hz. If you feel the need to reference the interface spec, that just proves that you are wrong. We know what the interface specification is. It doesn't tell us, however, what is actually possible for them right now.

 

That doesn't mean the FreeSync technology is incapable of doing what they say it can. And your analogy is flawed. It would be more like you saying you could run 100 metres in 10 seconds, if only you weren't wearing high heels.

 

How so? They said 9-240Hz. Their products are out and they don't go below 40Hz or above 144Hz. That's quite a huge gap in your proposed range. Your analogy is flawed. It's like a company saying their next graphics card will feature DirectX 14 when DirectX 14 isn't even planned or being worked on, and then their graphics card finally releases and all it supports is DirectX 12.

 

I mean, sure, eventually we will see a DirectX 14. But the question is when? Three years from now? Four? Five? Do you not understand that? The 9-240Hz range is completely baseless and isn't substantiated by any current technology or future planned technology.

 

Just because there's not a product available on the market to showcase the range doesn't mean that it's simply a fabricated number. It seems like people are under the impression that there is such thing as a "FreeSync display".

 

Then why even use that range? What's the point? NVIDIA only used 30-144Hz because they obviously know what current panel and planned panel limitations are going to be in the near future. Luckily, they figured out an implementation of G-Sync that allows it to go below 30Hz without any problems, even beyond the panel limits. FreeSync, however, claims a range of 9-240Hz, but their products are out and this couldn't be further from the truth. Not only are they not able to go below 40Hz, they don't even have an implementation for when they go below 40Hz. So it isn't even panel limitations; they don't even have an implementation beyond panel limitations like G-Sync does. Meanwhile, G-Sync never even claimed to be able to go below 30Hz, but it still can. While AMD claims to be able to go down to 9Hz, they don't even have an implementation below 40Hz.

 

Nobody is under any impressions, so stop saying that. Everyone understands that panels labeled with FreeSync are simply carrying the FreeSync branding. That doesn't change the fact that AMD consciously puts its branding on panels that aren't currently capable of the ranges or intervals it claimed in its advertising (and unlike G-Sync, they don't even have an implementation that takes their technology beyond panel limits).


How about this: provide proof that FreeSync-capable displays can go down to 9Hz and above 144Hz. Since there is no such proof, I'll sit here and wait while you just post the VESA interface specification of 9-240Hz. While this is the only piece of evidence available, it does not qualify as proof. Until AMD provides proof, their claims are misleading, because the displays are out and they have a minimum of 40Hz and a maximum of 144Hz. Nothing anywhere dictates that there are FreeSync-capable displays that can go below 40Hz or above 144Hz.

It seems like you're under the impression that there is such a thing as a "FreeSync display". I'm willing to bet these panels will not go below 40 Hz without interlacing. I have a fairly new display and the lowest it will go is 50 Hz before it reaches interlacing. What we are looking at here with these 40-144 Hz ranges is likely a panel limitation. Even G-Sync will not go below this range: once it starts nearing its baseline, the module simply starts double buffering to push the panel's refresh rate above the minimum threshold. You can continue to blame FreeSync for displays not going below 40 or above 144 Hz, but it's not a limitation of FreeSync. The argument has been pointless from the start; people are not going to buy an expensive display with Adaptive-Sync or G-Sync to play at low FPS, forgoing the tear-free and smoothness advantages of those displays. Better said, no one would buy a display to play below 30 FPS; they would buy a new GPU.
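For anyone trying to picture the mechanism being described here, the sketch below shows the general idea of frame repetition (the "double buffering" mentioned above): when the source frame rate falls under the panel's floor, each frame is scanned out more than once so the panel itself never runs below its minimum. The 40-144 Hz window and the repeat rule are illustrative assumptions, not any vendor's actual firmware.

```python
# Illustrative sketch of frame repetition below a panel's minimum refresh rate.
# The 40-144 Hz window is an assumed example, not a specific monitor's spec.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def refresh_plan(source_fps):
    """Return (panel_refresh_hz, repeats_per_frame) for a given frame rate."""
    if source_fps >= PANEL_MIN_HZ:
        # Inside the window: one panel refresh per rendered frame.
        return min(source_fps, PANEL_MAX_HZ), 1
    # Below the floor: show each frame enough times to stay above the minimum.
    repeats = 2
    while source_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return source_fps * repeats, repeats

print(refresh_plan(25))   # (50, 2): each frame is scanned out twice, panel runs at 50 Hz
print(refresh_plan(12))   # (48, 4): each frame is shown four times
```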


 

It seems like you're under the impression that there is such a thing as a "FreeSync display". I'm willing to bet these panels will not go below 40 Hz without interlacing. I have a fairly new display and the lowest it will go is 50 Hz before it reaches interlacing. What we are looking at here with these 40-144 Hz ranges is likely a panel limitation. Even G-Sync will not go below this range: once it starts nearing its baseline, the module simply starts double buffering to push the panel's refresh rate above the minimum threshold. You can continue to blame FreeSync for displays not going below 40 or above 144 Hz, but it's not a limitation of FreeSync. The argument has been pointless from the start; people are not going to buy an expensive display with Adaptive-Sync or G-Sync to play at low FPS, forgoing the tear-free and smoothness advantages of those displays. Better said, no one would buy a display to play below 30 FPS; they would buy a new GPU.

 

How am I under that impression? I said FreeSync capable displays. Meaning displays that are capable of carrying the FreeSync branding. 

 

Yes, we understand that 40-144Hz is a panel limitation, so there is no sense in repeating what I have said before. The problem here is touting an interface specification as an actual possible specification for the technology that is currently available. That is the issue here. Also, have you even watched the PCPer video? They take G-Sync below 40Hz without any interlacing issues:

 

 

So it's more than just a panel limitation but also a limitation of the implementation of FreeSync.

 

While G-Sync does have a method employed when it goes below the panel's limits, it is a process that allows G-Sync to keep working seamlessly, which results in a stutter-free and screen-tearing-free experience even below 30 fps.

 

Well it is also an issue with FreeSync because they don't even have a method at the driver level, with an algorithm similar to G-Sync, that would allow them to implement Adaptive-Sync even below the panel's limitations. People have said this shouldn't be really too difficult to do, and I will applaud AMD when they do that, if they do. Because then it will at least make their claims of 9-240Hz somewhat relevant. 

 

Sure, FreeSync capable displays could very well be able to go below 40 and above 144. But it just doesn't exist right now. Which is all I'm pointing out. So while they could in the future, it doesn't mean that they can currently. But I'm basing my opinion off of what they claim now, not what will be possible in the future.

 

The problem here is that a stutter-free and screen-tearing-free experience becomes important below 30 fps: as high-resolution monitors become more popular and more affordable, dips below 30 fps will become more frequent. So having coverage at the low end is equally important.

 

 


An unstoppable force meets an immovable object. Y'all are obviously at an impasse. Best if we move on.

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347


 

You knew what I meant, you are just saying you did not to try to make a point out of it. Like the epic straw man that you are. It's not wrong terminology. You have to be kidding at this point. If that's seriously all you have left to stand on. I consider this conversation done. You are being quite ridiculous at this point. I find it hilarious that you have nothing left but using panel and monitor. And actually claiming that they cannot be interchangeable in some cases. I'm wondering if English is not your native language? Because people call monitors a lot of things. Like "Screens." Should people be hassled for using the term Screen, when Screen is actually referring to the Panel? Or you are going to nitpick with them too for using Screen? The fact that I have to even explain this to you is quite sad.

 

• Most of the pictures were from AMD's own advertising like this one:

 

 

So that's your argument, it's the retailers who are doing this. LOL Does that above picture look like it's from retailers?

 

• A certification to use a specific branding or logo is essentially a type of license. Do you not understand that fact? You are licensing out your branding to another company. LOL. Can't believe you don't understand that. So now the monitor doesn't have to support the full range? Why is that, because it's convenient for you and AMD to have bogus advertising and then not live up to their own standard? Who is saying any monitor that supports DisplayPort 1.2a has to support Adaptive-Sync? Nobody is; this is just more of your straw man fallacies to drag the conversation in a direction it is not going in. More mindlessness from you. Nobody is confusing anything. Really, you have to stop saying that. The only one who is confused is yourself, as I pointed out numerous times. You really don't know what we are talking about at this time, and the second you finally understand it, you dismiss it and continue on with senselessness.

 

Nobody is saying they have power over what hardware is being used. Nobody said that. What was said was that they put their branding on products that don't support the intervals that they claim. Jeez...

 

• It wasn't necessary or needed, you quoted me saying one thing then responded with something completely different. Stay on topic or don't respond to me at all.

• Yes that should be considered normal. You say you have a feature, you have to implement it when the product releases. How else do you hold standards to companies if they can just make up crap? Then when the product is actually released it doesn't hold up to the previous claims

• It wasn't an actual question. Which is why I assume English is not your native language. It was a rhetorical question to make a point. OMFG, JEEZ! 

• There's no burden of proof, anyone who believes AMD isn't making money on FreeSync is just ignorant.

• There's a burden of proof though, that all 4K panels can do 144Hz. Why don't you provide a source for that. I don't even demand FreeSync support the entire range, I'm actually fine with them only supporting 40-144Hz. But the problem is it's not what they advertised. So that's where the problem lies.

 

It's not a mistake, it just appears you aren't fluent enough in English to understand that people use many words to describe the same things in English. I understood what he said, you are just making peevish criticisms and objections about petty matters at this point, because as I said before you have nothing left in your argument.

 

Well, AMD has a say in which vendors get to use their branding and has a process by which monitors get the branding. That means they have a say in what products will have their name on them. Therefore, if the products do not support the range of 9-240Hz, they will know that. Therefore they have a say in what products carry their name, which in essence means they have a say in the hardware that has an AMD name on it. They might not dictate the actual hardware inside of it, of course, we understand that. What they do dictate is putting their name on the hardware. But you keep saying that doesn't matter, that they are allowed to claim 9-240Hz but they don't have to actually support it in the products with their branding. They have no say in how retailers use their branding? You cannot be serious? Do you not have the slightest understanding of how retailers work in tandem with companies?

 

• Maybe NVIDIA hasn't refuted the statement because they don't care what AMD is saying. NVIDIA trumps AMD in discrete GPU sales. They don't need to refute anything AMD says. So just because a company claims they don't make any money that means they don't make any money? I guess you haven't heard of "non-profit" organizations. Jeez.

• It's not a logical fallacy, please stop using that term. You really don't know what it means. It also doesn't make you appear more intelligent. So I suggest you stop using it. There was nothing about what I said that was a fundamental error in logic. The only error in logic is your thinking towards my statements. Who said anything about it being anyone's fault? I'm just stating the facts, how it is. G-Sync is NVIDIA, FreeSync is AMD. That's how it is. Those are the facts. It's not that I don't understand theories or terminology; you use "logical fallacy" when it doesn't even apply, and the only errors in logic present are your lack of understanding towards what is being said.

• See above, where I prove that your use of the term logical fallacy doesn't even apply, and is used incorrectly. The only thing that happened was you didn't understand something. So you call it a logical fallacy, to make yourself appear like you know what you are talking about. Or maybe you did understand what I was saying, I really don't see how anyone couldn't but perhaps you felt the need to call it a logical fallacy to deter from the fact that you got proven wrong. Like I said, your views are misrepresentation of the truth. 

• So now you need to make money off of consoles to have more inputs on your monitors? People like to use their monitors for things other than just consoles, like Blu-ray players or as a television. I believe that if NVIDIA could provide additional inputs without any degradation of performance, they would. But they didn't, which leads me to believe that it plays a part in input lag. And why would there be additional cost to NVIDIA? I don't see how that would be possible. Some of the panels used in G-Sync monitors are used in other monitors without G-Sync, and those do have other inputs. Your point was that G-Sync only works with DisplayPort (which is why additional inputs weren't included); my point is that Adaptive-Sync also only works with DisplayPort, yet those monitors include additional inputs, so I don't see how that can be a point.

• So, if that's not dishonest, was it dishonest how NVIDIA portrayed the memory layout and specs of the GTX 970 in its advertising? Afterwards we discovered a lot of it was a misrepresentation of what they said. If you say yes, then you are being a hypocrite, because by your standards, what you advertise doesn't have to be the same as the products you actually release (or, at least in this case, that carry your branding).

• Well, you obviously don't use the dictionary, which is why I think English is not your native language. Which is why I don't think you know that English words have multiple meanings. But you say, I use words incorrectly (I mean you didn't even look it up in the dictionary either to see if I used it correctly, you just blurted out like a moron that I used it incorrectly): 

 

 

 

http://www.merriam-webster.com/dictionary/factoid

 

No, your insistence on semantics is unwarranted, because you don't know what you are talking about the majority of the time, as in this instance and many others. It might be because English is not your native tongue, and if that is the case, I guess I can forgive you for your mistakes or confusion. But if that is the case, I don't see why you would attempt to point out other people's flaws in their English or their understanding of the language when it is their native language and not yours. I see you are from Copenhagen. Even if I had studied Danish throughout all my years of schooling, I still wouldn't at any point attempt to correct a native Dane's word usage in Danish, because it just wouldn't be my place to do so, it not being my primary language. And even if I had a good understanding of Danish because I had studied it in school for many years, it still wouldn't be appropriate for me to do such a thing, considering there is probably a lot that I do not know or understand, even if I'm good enough to hold a conversation or write. If it happens that English is your native tongue and you just moved from the US or Canada or the UK to Denmark, I don't see how you cannot know that English has many polysemic words and that many words are used loosely in the language to describe the same thing.

 

Well this is going round in circles, getting more and more dumb, so I will try and keep it brief.

 

You are the only person I've ever seen call a monitor, a panel. Even your example does not do that. In an extreme case of layman's term, maybe; like calling a computer a cpu. But this is a tech site, where we discuss details, and terminology matters. In this case, where the limitation lies, and why.

 

That image is from the launch of the Freesync driver, showing off monitors that can be used. But it doesn't matter. Freesync is still a graphics side driver, and has nothing to do with monitor hardware or limitations.

Let me ask you though: If a monitor is labelled as an Adaptive Sync monitor, and only supports from 46-75hz, is that also unacceptable for VESA to put its name on it? If not, then why is it bad for AMD to do so, with their Adaptive Sync, utilizing driver?

 

  • No a monitor does not have to support the full range to be an Adaptive Sync monitor, or be branded Freesync either. You are the only one, claiming it should be.
  • The claim is only of the support of Freesync's capabilities, using the Adaptive Sync standard. It has nothing to do with monitors or their limitations, that are outside the power of AMD.
  • I can talk, write and comprehend English just fine. You, however, seem to have certain comprehension issues. Either on purpose or not. Being a rhetorical question, is exactly why it is a logical fallacy. You don't seem to understand what that means, since you talk about "logic errors"?!
  • You claim that AMD makes money off of it, that is YOUR claim, so YOU have the burden of proof. Calling people ignorant is your subjective opinion, that is all.
  • Again a logical fallacy. I said 4K TN panels. ALL TN panels should be able to do 144 Hz, as TN usually has a response time quite a bit under 7 milliseconds.

 

  • Don't care why NVidia hasn't refuted it. They haven't. Either way, license or not, NVidia makes money off of the G-Sync module; AMD makes no money off of Adaptive Sync. That is the point.
  • The irony here is that YOU don't seem to understand what a logical fallacy is. I used it correctly and for a reason. Freesync is indeed AMD; Adaptive Sync is not. Monitors have Adaptive Sync functionality in the hardware.
  • Your idea of "proof" is lacking, to say the least. Your subjective opinion, your burden of proof.
  • Just keep those logical fallacies coming. Nowhere have I ever stated such a thing. Fact is that monitor vendors make their monitors for their own sake, not AMD's, so they include several input types for convenience and broader adoption rates. NVidia couldn't care less about consoles or Blu-ray players. Adding input types that make G-Sync irrelevant makes no sense from an R&D perspective. Designing an interface to handle different hardware inputs, as well as OSDs and controls to toggle between them, is not free. Unless we are talking Adaptive Sync, where the scaler vendors could just recycle their existing tech and design.
  • NVidia did mislead about the 970, as the last 512MB could not run at full bandwidth. That has nothing to do with third-party hardware, hardware standards or drivers. If an AMD graphics card supporting the full scale of Adaptive Sync (that would be GCN 1.1 and newer) could not do 9-240Hz on a monitor capable of it, then yeah, I would agree with you. But I doubt that is the case. The Freesync driver is not responsible for any and all limitations of a monitor.
  • How about going to your own link on the dictionary, go to the full description, and see the first (and primary) definition? Or how about this: http://en.wikipedia.org/wiki/Factoid
    The word is defined by the Compact Oxford English Dictionary as "an item of unreliable information that is repeated so often that it becomes accepted as fact"

    Or we could just see what the inventor of the word himself, define it as:

    Mailer described a factoid as "facts which have no existence before appearing in a magazine or newspaper"

    Shall I go on? Or can we get back to the actual point of the argument, that you claimed AMD was lying/manipulating about gsyncs performance hit, and "correcting" their powerpoint slide? They weren't, so you dismissed it as trivial/unimportant.


 

My insistence on semantics is very warranted when you don't know what you are talking about. You confuse hardware standards, hardware parts, terminology, products and branding into one giant clusterfrack, and get annoyed when I call you out on it? How can we communicate if we use the wrong words?

 

Talking about my language skills because of my nationality is unwarranted. Like I said, my English skills might not be perfect, but they are damn well good enough to read/write and comprehend anything being said.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Well this is going round in circles, getting more and more dumb, so I will try and keep it brief.

 

You are the only person I've ever seen call a monitor, a panel. Even your example does not do that. In an extreme case of layman's term, maybe; like calling a computer a cpu. But this is a tech site, where we discuss details, and terminology matters. In this case, where the limitation lies, and why.

 

That image is from the launch of the Freesync driver, showing off monitors that can be used. But it doesn't matter. Freesync is still a graphics side driver, and has nothing to do with monitor hardware or limitations.

Let me ask you though: If a monitor is labelled as an Adaptive Sync monitor, and only supports from 46-75hz, is that also unacceptable for VESA to put its name on it? If not, then why is it bad for AMD to do so, with their Adaptive Sync, utilizing driver?

 

  • No a monitor does not have to support the full range to be an Adaptive Sync monitor, or be branded Freesync either. You are the only one, claiming it should be.
  • The claim is only of the support of Freesync's capabilities, using the Adaptive Sync standard. It has nothing to do with monitors or their limitations, that are outside the power of AMD.
  • I can talk, write and comprehend English just fine. You, however, seem to have certain comprehension issues. Either on purpose or not. Being a rhetorical question, is exactly why it is a logical fallacy. You don't seem to understand what that means, since you talk about "logic errors"?!
  • You claim that AMD makes money off of it, that is YOUR claim, so YOU have the burden of proof. Calling people ignorant is your subjective opinion, that is all.
  • Again a logical fallacy. I said 4K TN panels. ALL TN panels should be able to do 144 Hz, as TN usually has a response time quite a bit under 7 milliseconds.


 

  • Don't care why NVidia hasn't refuted it. They haven't. Either way, license or not, NVidia makes money off of the G-Sync module; AMD makes no money off of Adaptive Sync. That is the point.
  • The irony here is that YOU don't seem to understand what a logical fallacy is. I used it correctly and for a reason. Freesync is indeed AMD; Adaptive Sync is not. Monitors have Adaptive Sync functionality in the hardware.
  • Your idea of "proof" is lacking, to say the least. Your subjective opinion, your burden of proof.
  • Just keep those logical fallacies coming. Nowhere have I ever stated such a thing. Fact is that monitor vendors make their monitors for their own sake, not AMD's, so they include several input types for convenience and broader adoption rates. NVidia couldn't care less about consoles or Blu-ray players. Adding input types that make G-Sync irrelevant makes no sense from an R&D perspective. Designing an interface to handle different hardware inputs, as well as OSDs and controls to toggle between them, is not free. Unless we are talking Adaptive Sync, where the scaler vendors could just recycle their existing tech and design.
  • NVidia did mislead about the 970, as the last 512MB could not run at full bandwidth. That has nothing to do with third-party hardware, hardware standards or drivers. If an AMD graphics card supporting the full scale of Adaptive Sync (that would be GCN 1.1 and newer) could not do 9-240Hz on a monitor capable of it, then yeah, I would agree with you. But I doubt that is the case. The Freesync driver is not responsible for any and all limitations of a monitor.
  • How about going to your own link on the dictionary, go to the full description, and see the first (and primary) definition? Or how about this: http://en.wikipedia.org/wiki/Factoid

    Or we could just see what the inventor of the word himself, define it as:

    Shall I go on? Or can we get back to the actual point of the argument, that you claimed AMD was lying/manipulating about gsyncs performance hit, and "correcting" their powerpoint slide? They weren't, so you dismissed it as trivial/unimportant.


 

My insistence on semantics is very warranted when you don't know what you are talking about. You confuse hardware standards, hardware parts, terminology, products and branding into one giant clusterfrack, and get annoyed when I call you out on it? How can we communicate if we use the wrong words?

 

Talking about my language skills because of my nationality is unwarranted. Like I said, my English skills might not be perfect, but they are damn well good enough to read/write and comprehend anything being said.

 

If I'm the only person, that's your problem. I've seen plenty of people call a monitor a panel. I've also seen them call it a display, or a screen. There are many words for the same things here. Maybe you would know that if you lived in the US. Which you don't.

 

It does have to do with hardware limitations, they are the ones making claims in their advertising and then putting their branding on said products with these hardware limitations.

 

Do you see VESA going around making ad campaigns, comparing themselves to G-Sync? No you do not. All they are touting is an interface specification. Nothing more. So terrible question.

 

• I'm just holding companies to standards you however do not, which is why you think it is okay for them to do that.

• What is not outside of AMD's power, is their advertising. Which you seem to not be getting. 

• I disagree, as seen throughout most of your posts. You lack basic comprehension skills and you use terms incorrectly while criticizing other people's word usage. A rhetorical question doesn't make it a logical fallacy. Like I said before, stop using that word; you don't know what it means.

• It's not subjective to believe that a company makes no money off of a product. That is what you describe as ignorance.

• No not a logical fallacy. Stop using that word, you don't know what it means. For the millionth freaking time. I said provide proof that 4K panels can do 144Hz. You haven't provided such, all you did was say it's a logical fallacy. Which is what you do when you get proven wrong. As I stated previously.

 

• The problem here lies, that you actually believe AMD is making no money in all of this. 

• It's easy to point out when someone else doesn't understand what something is, when in actuality you don't understand what something is. Which is the case here. In order to utilize Adaptive-Sync currently, you must use an AMD card. There is no other way to use Adaptive-Sync currently. Hence, ecosystem. Stop calling things you don't understand logical fallacies. It doesn't make you appear more intelligent. I tried to explain this to you before.

• LOL Again you use logical fallacy. Now I'm starting to believe you must be 12 years old and you learned a new word on the internet. Now you feel the need to use it repeatedly. You must have ignored the part where I said there are panels used in G-Sync monitors that are also used in non G-Sync monitors. So your entire explanation is nullified and useless.

• You doubt it is the case? FreeSync is not responsible? NVIDIA has a method of employing below 30 frames that keeps G-Sync active, while FreeSync has absolutely nothing. So not only are they limited by panel technology, they don't even have anything that can take them below panel limitations like G-Sync can. You are being a hypocrite here. Like I said you would.

• How about that was taken directly from my link, and from the primary definition:

 

[screenshot: Merriam-Webster's definition of "factoid"]

 

Now you use Wikipedia as a source for definitions as opposed to Merriam-Webster? LMAO. 

 

Shall we go on, you just blatantly lied to prove a point? Like I said, you don't even know that words have multiple meanings. Then you wonder why I question your English capabilities. 

 

They were, it was a misrepresentation of the truth. 1-1.5% is so minuscule it might as well not even be stated or suggested. Especially when trying to be used as a comparison to another company's technology. 

 

It's easy to say someone doesn't know what they are talking about, when in fact, you are the one who doesn't know what they are talking about. Nobody confuses anything, the only one who is confused is yourself. On top of your confusion, you spin things around and make up lies to prove points. You don't even respond to what is being said half the time. Not once did I confuse hardware standards, hardware parts, terminology or products. Not once, but you need to lie at this point to prove a point because you lost the argument ages ago.

 

It's not unwarranted? You just tried to prove me wrong again, EVEN AFTER I POSTED THE DEFINITION WITH THE LINK for factoid, that's how deluded you are. I mean, your ignorance is only second to your arrogance. 


Do you see VESA going around making ad campaigns, comparing themselves to G-Sync? No you do not. All they are touting is an interface specification. Nothing more. So terrible question.

VESA doesn't have to sell anything to consumers, only to manufacturers.

 

It's not unwarranted? You just tried to prove me wrong again, EVEN AFTER I POSTED THE DEFINITION WITH THE LINK for factoid, that's how deluded you are. I mean, your ignorance is only second to your arrogance.

His definition of "factoid" is in the definition you provided. It is actually the first one in the list, yours being the second.

 

[screenshot: Merriam-Webster's definition of "factoid"]

I said provide proof that 4K panels can do 144Hz.

 

A year ago 4K couldn't do more than 30Hz because of the DisplayPort standard; you couldn't push enough data through the cable to do it. That is a restriction that has nothing to do with FreeSync, Adaptive Sync or indeed G-Sync. Nobody uses 4K yet because you need to spend about 2000 dollars on graphics cards to game at anything resembling a usable frame rate. There are plenty of 2560x1440 panels that are capable of 144Hz. There are hundreds of 1920x1080 panels capable of 144Hz. That's where most people game, and that's where you'll see things like high refresh rates: 1 - because the cable is actually capable of pushing that much data per second, and 2 - because everybody can afford the hardware to play games at that resolution.
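To put rough numbers on the bandwidth point, here is the back-of-the-envelope arithmetic; it ignores blanking intervals and protocol overhead, so the real link requirements are somewhat higher than these figures.

```python
# Raw pixel bandwidth for 4K at various refresh rates (24 bits per pixel),
# ignoring blanking and protocol overhead.

def pixel_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(pixel_gbps(3840, 2160, 30))    # ~6.0 Gbit/s
print(pixel_gbps(3840, 2160, 60))    # ~11.9 Gbit/s, needs a DisplayPort 1.2-class link
print(pixel_gbps(3840, 2160, 144))   # ~28.7 Gbit/s, beyond DP 1.2's ~17.3 Gbit/s of usable bandwidth
```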

 

LOL Again you use logical fallacy. Now I'm starting to believe you must be 12 years old and you learned a new word on the internet

 

That just makes you sound petty.

 

None of what you said has anything to do with what the AdaptiveSync technology can do. It is currently limited by output devices. Output devices are limited by the manufacturer of the device.

 

Up until CES 2015, there were no IPS G-Sync panels available. Heck, there weren't even half a dozen G-Sync monitors available for purchase. Is that nVidia's fault too? That's the standard you are applying to AMD.

 

None of them were available for under 600 dollars. That is nVidia's fault.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


VESA doesn't have to sell anything to consumers, only to manufacturers.

 

His definition of "factoid" is in the definition you provided. It is actually the first one in the list, yours being the second.

 

A year ago 4K couldn't do more than 30Hz because of the DisplayPort standard; you couldn't push enough data through the cable to do it. That is a restriction that has nothing to do with FreeSync, Adaptive Sync or indeed G-Sync. Nobody uses 4K yet because you need to spend about 2000 dollars on graphics cards to game at anything resembling a usable frame rate. There are plenty of 2560x1440 panels that are capable of 144Hz. There are hundreds of 1920x1080 panels capable of 144Hz. That's where most people game, and that's where you'll see things like high refresh rates: 1 - because the cable is actually capable of pushing that much data per second, and 2 - because everybody can afford the hardware to play games at that resolution.

 

 

That just makes you sound petty.

 

None of what you said has anything to do with what the AdaptiveSync technology can do. It is currently limited by output devices. Output devices are limited by the manufacturer of the device.

 

Up until CES 2015, there were no IPS G-Sync panels available. Heck, there weren't even half a dozen G-Sync monitors available for purchase. Is that nVidia's fault too? That's the standard you are applying to AMD.

 

None of them were available for under 600 dollars. That is nVidia's fault.

 

Isn't that the point I'm making?

 

Words can have multiple meanings, which is my point. Which if you read his original reply, you would understand why I made such a statement.

 

I said one of the reasons why 4K monitors over 60Hz don't exist is the cost of driving that refresh rate, and also the DP spec; you would know that if you actually read what I posted instead of just trying to refute it. I also said I believe it has to do with panel limitations, because we haven't heard of any planned panels that can do 4K over 60Hz. Just because the other two reasons exist doesn't mean it isn't also a panel limitation.

 

I don't see how it makes me sound petty, he keeps using the term over and over again not knowing there are two types of logical fallacies with many sub-types. Neither of which apply.

 

But what you seem to not be getting is, AMD is responsible for giving that product their own FreeSync branding.

 

The reason for that was because IPS panels aren't traditionally able to go way over 60Hz unless you overclock them, therefore NVIDIA waited until the technology progressed to be able to offer G-Sync IPS panels. Now we see with the progression of AHVA-IPS panels 120Hz/144Hz is now possible.

 

Not really, the standard I'm applying to AMD is the one they represent in their own advertising. NVIDIA hasn't misrepresented all that much in their advertising; any of their misrepresentations have actually turned out to be welcome additions, like discovering that G-Sync is capable of going below 30Hz.

 

Maybe they weren't, but now they are:

 

http://www.newegg.com/Product/Product.aspx?Item=N82E16824160226&ignorebbr=1 - $399

http://www.amazon.com/dp/B00NUCRBCU/ref=asc_df_B00NUCRBCU3607044?smid=ATVPDKIKX0DER&tag=iceleadscom-20&linkCode=df0&creative=395093&creativeASIN=B00NUCRBCU&ascsubtag=s1427413836485256ra52378 - $499

http://www.ncixus.com/products/?sku=101107&promoid=1202 - $524


How am I under that impression? I said FreeSync capable displays. Meaning displays that are capable of carrying the FreeSync branding. 

It became bluntly obvious days ago, when you started claiming that FreeSync only supports a 40-144 Hz window and blaming FreeSync for it.

 

Yes, we understand that 40-144Hz is a panel limitation, so there is no sense in repeating what I have said before. The problem here is touting an interface specification as an actual possible specification for the technology that is currently available. That is the issue here. Also, have you even watched the PCPer video? They take G-Sync below 40Hz without any interlacing issues:

 

- snip -

There is no repeating something you never stated. To be frank, I think you're a bit clueless as to how either of these technologies actually works.

Luckily, they figured out an implementation of G-Sync that allows it to go below 30Hz without any problems, even beyond the panel limits. 

Meanwhile, G-Sync never even claimed to be able to go below 30Hz, but it still can. While AMD claims to be able to go down to 9Hz, they don't even have an implementation below 40Hz.

As I've clarified above, even G-Sync cannot go beyond panel limits. Once it starts to close in on the panel's bottom limit, the module starts double buffering (the PCPer video demonstrates this).

 

So it's more than just a panel limitation but also a limitation of the implementation of FreeSync.

Panel limitations are not a FreeSync limitation. That's the same as saying G-Sync cannot support 240 Hz even though Tom Petersen came out and said that it's a panel limitation.

 

While G-Sync does have a method employed when it goes below the panel's limits, it is a process that allows G-Sync to keep working seamlessly, which results in a stutter-free and screen-tearing-free experience even below 30 fps.

G-Sync doesn't have a method employed to go below the panel's limits. What happens is the module starts double buffering to keep the panel from ever going below its limit.

 

Well it is also an issue with FreeSync because they don't even have a method at the driver level, with an algorithm similar to G-Sync, that would allow them to implement Adaptive-Sync even below the panel's limitations. People have said this shouldn't be really too difficult to do, and I will applaud AMD when they do that, if they do. Because then it will at least make their claims of 9-240Hz somewhat relevant. 

FreeSync was never designed to include double buffering. Although it could be added to the driver model simply by dispatching the same frame in a timely manner. Personally I don't see the point as my minimum FPS doesn't dip below 40 FPS even with my ancient HD 5870. That or wait for 21-144 Hz QHD or 40-144 Hz FHD displays to hit the market.
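For what it's worth, the driver-side idea described here (re-sending the last completed frame before the panel's maximum frame interval runs out) could look roughly like the sketch below. The 40 Hz floor and every name in it are illustrative assumptions, not AMD's driver code.

```python
# Sketch of re-dispatching the previous frame so the effective refresh rate
# never drops below an assumed 40 Hz panel floor. Illustrative only.

PANEL_MIN_HZ = 40
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ   # 25 ms between refreshes at most

def schedule_flips(frame_times_ms):
    """frame_times_ms: timestamps (ms) at which the game finishes new frames.
    Returns the timestamps at which the display is refreshed."""
    flips = []
    last_flip = 0.0
    for t in frame_times_ms:
        # Fill the gap with repeats of the previous frame if the game is slow.
        while t - last_flip > MAX_INTERVAL_MS:
            last_flip += MAX_INTERVAL_MS
            flips.append(last_flip)
        flips.append(t)            # flip as soon as the new frame is ready
        last_flip = t
    return flips

# A frame that takes 60 ms (about 16 fps) gets two filler refreshes in between,
# so no refresh interval ever exceeds 25 ms.
print(schedule_flips([60.0]))      # [25.0, 50.0, 60.0]
```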

 

Sure, FreeSync capable displays could very well be able to go below 40 and above 144. But it just doesn't exist right now. Which is all I'm pointing out. So while they could in the future, it doesn't mean that they can currently. But I'm basing my opinion off of what they claim now, not what will be possible in the future.

You've been stating that 9-240 Hz is a fabricated number that's impossible for AMD to support for over four days now. When in fact 9-240 Hz is officially supported in their latest driver. At this point I would encourage you to disassemble the latest driver and look for yourself. Your argument has been focused towards simply blaming FreeSync for display limitations.

 

The problem here is that a stutter-free and screen-tearing-free experience becomes important below 30 fps: as high-resolution monitors become more popular and more affordable, dips below 30 fps will become more frequent. So having coverage at the low end is equally important.

As stated above if you're dipping below 30-40 FPS then you need to stop investing money into displays and invest more into your machine. Going out and buying a G-Sync display doesn't justify having a machine that's only capable of 15-40 FPS. Both technologies are aimed towards gaming enthusiasts that typically already have a beefy machine.


It became bluntly obvious days ago, when you started claiming that FreeSync only supports a 40-144 Hz window.

 

 

There is no repeating something you never stated. To be frank, I think you're a bit clueless as to how either of these technologies actually works.

 
 

As I've clarified above, even G-Sync cannot go beyond panel limits. Once it starts to close in on the panel's bottom limit, the module starts double buffering (the PCPer video demonstrates this).

 

 

Panel limitations are not a FreeSync limitation. That's the same as saying G-Sync cannot support 240 Hz even though Tom Petersen came out and said that it's a panel limitation.

 
 

G-Sync doesn't have a method employed to go below the panel's limits. What happens is the module starts double buffering to keep the panel from ever going below its limit.

 

 

FreeSync was never designed to include double buffering. Although it could be added to the driver model simply by dispatching the same frame in a timely manner. Personally I don't see the point as my minimum FPS doesn't dip below 40 FPS even with my ancient HD 5870.

 
 

You've been stating that 9-240 Hz is a fabricated number that's impossible for AMD to support for over four days now. When in fact 9-240 Hz is officially supported in their latest driver. At this point I would encourage you to disassemble the latest driver and look for yourself.

 

 

As stated above if you're dipping below 30-40 FPS then you need to stop investing money into displays and invest more into your machine.

 

Maybe I stand corrected? What refresh rate ranges are FreeSync-capable displays currently capable of delivering? If it is below 40Hz or above 144Hz, please enlighten me.

 

Oh I have stated it multiple times in this thread, but like most of you AMD fanboys you don't even read what I write. But somehow find it possible to respond to me. I've demonstrated that I have quite a good idea of how these technologies work.

 

Yes, which means they have a method employed that allows G-Sync to still offer a smooth stutter free experience even beyond panel limits.

 

They aren't? Then why is AMD putting FreeSync branding on these monitors? I mean after all they advertised as being able to do one thing, but this is not the reality. No it's not the same, G-Sync never officially stated they can do over 144Hz. They just comment in interviews that these limitations have nothing to do with G-Sync but with the panels themselves.

 

This method that you keep describing allows G-Sync to still offer the same experience as if it were above the monitor's minimum refresh rate. Which means, even when it goes below 30Hz, it still offers a G-Sync-like experience. Whereas with AMD, as soon as you go below 40Hz, Adaptive-Sync turns off, and tearing, stuttering, and judder are present.

 

That also is dependent on what resolution you are running at, and what games you are playing.

 

No if you actually read what I wrote, I said 9-240Hz is not a relevant spec to use at this current time considering what the technology is currently capable of. Why not just post, where it says 9-240Hz is officially supported in the driver? What I'm not sure about is how that even has anything to do with what I said. We already know the VESA Specification is capable of doing 9-240Hz. I'm not sure how that is relevant. I want to see FreeSync actually capable of going up to 240Hz or down to 9Hz. Not just a specification or standard in writing.

 

I don't see how that is not an opinion. There are people who are running high resolutions on a single card and not until recently with a Titan X has that been really possible in Triple A games. I'm sure there are moments where they dip below 40 fps, just look at the average of Crysis 3 on a GTX 980:

 

[benchmark chart: Crysis 3 at 2560x1600 on a GTX 980]

 

Average is 26.8, so there's what I'm talking about.


lol sure, let's pitch something where there's an entirely free alternative from a good competitor

 

G-sync should really die. There's no reason for it to exist anymore

"It seems we living the American dream, but the people highest up got the lowest self esteem. The prettiest people do the ugliest things, for the road to riches and diamond rings."- Kanye West, "All Falls Down"

 


How has this thread not been locked yet...

This has been 20 pages of fanboys refuting other fanboys' claims...

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD



How has this thread not been locked yet...

This has been 20 pages of fanboys refuting other fanboys' claims...

I'm actually pretty interested in the differences between G-sync and Freesync as I'm in the market for a new PC and adaptive sync monitor but this thread is just noise.


How has this thread not been locked yet...

This has been 20 pages of fanboys refuting other fanboys' claims...

Please learn the difference between spirited debate (based in fact, usually with good citation and strong logic) and fanboy wars which are based on irrational approaches to product analysis which tend to end up in the production of straw man arguments.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


[benchmark chart: Crysis 3 at 2560x1600 on a GTX 980]

 

Average is 26.8, so there's what I'm talking about.

You hardly need AA at that resolution, turning it down to 2x would likely double the frame-rate. Or turn down other settings like particle density or the several kinds of multisampling. Or get an R9-295x2 for less money than a GTX 980 and have 50FPS.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


 

[benchmark chart: Crysis 3 at 2560x1600 on a GTX 980]

 

Average is 26.8, so there's what I'm talking about.

 

You do realize that at this current moment there are no G-Sync or FreeSync monitors at this resolution. Close, yes, but this resolution, no.

 

Also if your frames are dropping to these kinds of levels you are doing things very wrong, turn stuff down or get more GPU power. Like we have been doing, prior to variable refresh.

 

Benchmarks are nice and all, but no one here would be foolish enough to run a game at such a low framerate, Hell you'd be better off going to console.


I fail to see why people are getting so rustled up over this wave of initial AS monitor releases. As it is right now:

 

-Freesync works well within the operational window, and allows the user to select Vsync on/off above the threshold

-Freesync does not double/triple the frames below the threshold, so Gsync is superior in that respect (see the sketch below)
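Summed up as a rough sketch (behaviors as described in this thread for the first wave of monitors, not taken from vendor documentation), the difference around the window looks like this:

```python
# Rough comparison of behavior relative to a display's variable-refresh window,
# based on how this thread describes the early 2015 monitors. Illustrative only.

def freesync_behavior(fps, window=(40, 144), vsync_above=True):
    lo, hi = window
    if fps > hi:
        return "v-sync" if vsync_above else "tearing"   # user-selectable above the window
    if fps >= lo:
        return "variable refresh"                       # refresh tracks the frame rate
    return "v-sync or tearing"                          # no frame doubling below the floor

def gsync_behavior(fps, window=(30, 144)):
    lo, hi = window
    if fps > hi:
        return "capped at max refresh"
    if fps >= lo:
        return "variable refresh"
    return "variable refresh via frame doubling"        # module repeats frames below the floor

print(freesync_behavior(25), "vs", gsync_behavior(25))
```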

 

What kind of consumer would see this presentation slide from AMD and be deceived into believing all AS monitors have the capability of running at 9-240Hz? I think very few, if any at all; no such monitors exist regardless of the VRR tech included. I am not a fan of misleading marketing, but it is really stretching it to claim consumers will somehow make an uninformed purchase because of the slide, IMO. When a potential buyer compares his/her options, the VRR ranges are clearly shown. I think it is far more likely for a customer to buy an AS monitor with an incompatible video card (with a Tahiti-based GPU, for example) than it is to be tricked by that slide into believing they are purchasing a monitor that has a refresh range of 9-240Hz.

 

IMO, the IPS 48-75hz implementation is not great, and the VRR operational window is not wide enough. Despite this not being a product I would buy, its faults are certainly not the result of something AMD did or with the AS/Freesync spec. On the other hand, the 40-144hz TN monitor gets really close to the features many have praised the Swift for, and at a much lower cost. What is wrong with that?

 

When we were learning about the memory structure of the GTX 970, a very common and valid argument was that in order to experience the limitations of the last .5GB memory section, unrealistic settings/resolutions were required that pushed the performance into frame rates no one would play at anyway. I find this situation to be similar with the 40-144Hz AS monitors. Who the heck would game on settings that would drop below 40fps if they have a 144Hz max refresh monitor?

 

Honestly I think that the initial AS/Freesync release has been impressive. My biggest worry was that it would somehow be a lesser experience compared to Gsync within the operational window. I am thrilled this is not the case.

 

I think Nvidia should get huge kudos for bringing VRR to consumers. I also think AS/Freesync should not be slammed into the ground over its (current) lack of a solution below the VRR range of a given AS monitor.


If I'm the only person, that's your problem. I've seen plenty of people call a monitor a panel. I've also seen them call it a display, or a screen. There are many words for the same things here. Maybe you would know that if you lived in the US. Which you don't.

 

It does have to do with hardware limitations, they are the ones making claims in their advertising and then putting their branding on said products with these hardware limitations.

 

Do you see VESA going around making ad campaigns, comparing themselves to G-Sync? No you do not. All they are touting is an interface specification. Nothing more. So terrible question.

 

• I'm just holding companies to standards you however do not, which is why you think it is okay for them to do that.

• What is not outside of AMD's power, is their advertising. Which you seem to not be getting. 

• I disagree, as seen throughout most of your posts. You lack basic comprehension skills and you use terms incorrectly while criticizing other people's word usage. A rhetorical question doesn't make it a logical fallacy. Like I said before, stop using that word; you don't know what it means.

• It's not subjective to believe that a company makes no money off of a product. That is what you describe as ignorance.

• No not a logical fallacy. Stop using that word, you don't know what it means. For the millionth freaking time. I said provide proof that 4K panels can do 144Hz. You haven't provided such, all you did was say it's a logical fallacy. Which is what you do when you get proven wrong. As I stated previously.

 

• The problem here lies, that you actually believe AMD is making no money in all of this. 

• It's easy to point out when someone else doesn't understand what something is, when in actuality you don't understand what something is. Which is the case here. In order to utilize Adaptive-Sync currently, you must use an AMD card. There is no other way to use Adaptive-Sync currently. Hence, ecosystem. Stop calling things you don't understand logical fallacies. It doesn't make you appear more intelligent. I tried to explain this to you before.

• LOL Again you use logical fallacy. Now I'm starting to believe you must be 12 years old and you learned a new word on the internet. Now you feel the need to use it repeatedly. You must have ignored the part where I said there are panels used in G-Sync monitors that are also used in non G-Sync monitors. So your entire explanation is nullified and useless.

• You doubt it is the case? FreeSync is not responsible? NVIDIA has a method of employing below 30 frames that keeps G-Sync active, while FreeSync has absolutely nothing. So not only are they limited by panel technology, they don't even have anything that can take them below panel limitations like G-Sync can. You are being a hypocrite here. Like I said you would.

• How about that was taken directly from my link, and from the primary definition:

 

 

Now you use Wikipedia as a source for definitions as opposed to Merriam-Webster? LMAO. 

 

Shall we go on, you just blatantly lied to prove a point? Like I said, you don't even know that words have multiple meanings. Then you wonder why I question your English capabilities. 

 

They were, it was a misrepresentation of the truth. 1-1.5% is so minuscule it might as well not even be stated or suggested. Especially when trying to be used as a comparison to another company's technology. 

 

It's easy to say someone doesn't know what they are talking about, when in fact, you are the one who doesn't know what they are talking about. Nobody confuses anything, the only one who is confused is yourself. On top of your confusion, you spin things around and make up lies to prove points. You don't even respond to what is being said half the time. Not once did I confuse hardware standards, hardware parts, terminology or products. Not once, but you need to lie at this point to prove a point because you lost the argument ages ago.

 

It's not unwarranted? You just tried to prove me wrong again, EVEN AFTER I POSTED THE DEFINITION WITH THE LINK for factoid, that's how deluded you are. I mean, your ignorance is only second to your arrogance. 

 

 

For the sake of simplicity, let's get all the language and semantics out of the way first:

  • This is a tech site, not Bob's woodshed. When discussing how hardware works, it is important to use the right terminology. Identifying limitations and discussing them is impossible if you use the wrong words. If a limit on frame rates lies somewhere in the computer, you call it a computer, not a "CPU" (in layman's terms). By now you should know the difference between a panel (the hardware part containing the pixels, that actually makes the picture) and a monitor. Why continue to be wrong, when you know better now?
  • Most of your logical fallacies used against me are straw men: http://en.wikipedia.org/wiki/Straw_man
  1. Straw men are a type of informal fallacy
  2. Informal fallacy is a type of logical fallacy
  3. All straw men are logical fallacies.
  4. Feel free to criticize me for not being specific, but calling them logical fallacies is factually correct. I might even point some of them out. Most of your misrepresentations of my statements are in previous posts though.

 

• It's not subjective to believe that a company makes no money off of a product. That is what you describe as ignorance.

• No, not a logical fallacy. Stop using that word, you don't know what it means. For the millionth freaking time. I said provide proof that 4K panels can do 144Hz. (There's the straw man. I specifically said 4K TN panel. HUGE difference) You haven't provided such, all you did was say it's a logical fallacy. Which is what you do when you get proven wrong. As I stated previously. 

 

  • It is very subjective to believe a company makes money off of another hardware vendor's product, when no licensing fees, hardware or patents are included. If you believe otherwise, feel free to prove it. I have given you sources showing AMD takes no license fees; that Adaptive Sync is license free, and that AMD has no direct income from a vendor's monitor sale, unlike NVidia.
  • Ok, now that you should know what a panel is, let's get into the technical part. I know you don't understand how a monitor and its components work, because you didn't react to the "under 7ms" part. So allow me to elaborate:


 

The reason why all 4K TN panels should be able to support 144hz:

  • At 144hz, each refresh cycle is just under 7ms long: 1 second = 1000 milliseconds, and 1000/144 ≈ 6.94ms. (I know, I know, logical scales like the metric scale are difficult for Americans to comprehend: http://cdn.zmescience.com/wp-content/uploads/2015/03/imperial_vs__metric_by_nekit1234007-d5p0ou5.png )
  • If the pixel response time of a panel is lower than 7ms, it can do 144hz. If it is close to that limit, however, the image will constantly be blurry in motion. A modern TN panel usually has a pixel response time of 1ms, which leaves almost 6ms to show and hold the given colour (I know pixel response varies with colour).
  • As such all TN panels are able to update at 144hz (or more). The resolution of the panel is irrelevant; being 2K, 4K or 32K makes no difference to the pixel response time.
  • Instead the limitation lies elsewhere, mainly in the bandwidth of the source input (HDMI/DisplayPort, etc.) and the monitor/display controller* http://www.paradetech.com/products/dp_monitor_controllers/dp703-lcd-monitor-controller/ (called a scaler in layman's terms, or even worse a scalar, which is a term from linear algebra).

Do you understand now why I claim any 4K TN PANEL (not monitor) is capable of 144hz?
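To make the arithmetic above concrete, here is a minimal Python sketch of the same check (the 1ms TN figure comes from the point above; it deliberately models only the panel, not the controller or link bandwidth):

```python
# Minimal sketch: does a panel's quoted pixel response time fit inside one
# refresh cycle? This is the only thing the argument above checks; it says
# nothing about the monitor controller or the source link bandwidth.

def refresh_interval_ms(refresh_hz: float) -> float:
    """Length of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

def response_time_fits(pixel_response_ms: float, refresh_hz: float) -> bool:
    """True if the quoted pixel response time is shorter than one refresh cycle."""
    return pixel_response_ms < refresh_interval_ms(refresh_hz)

print(f"144hz refresh interval: {refresh_interval_ms(144):.2f} ms")   # ~6.94 ms
print("1ms TN panel fits 144hz:", response_time_fits(1.0, 144))       # True
print("5ms panel fits 144hz:", response_time_fits(5.0, 144))          # True, but close to the limit = blur
```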

 


 

An interesting fact you might actually enjoy. Because a liquid crystal in an LCD (liquid crystal display) cannot hold its charge for more than 30+ms or so, any LCD would have its pixels return to idle state, which is off (closed crystals, shutting out the backlight = black). At 30hz, each refresh lasts about 33ms, longer than the crystal can hold its charge, which is why the Asus laptop, with the leaked alpha gsync driver, flickered black screen constantly when the FPS went down to, or lower than, 30.

 

This is also why Gsync starts doubling frames at 37hz, and not at 30 like NVidia claim their Gsync tech can go down to (wait a minute, would that not mean that NVidia made dishonest claims about their gsync tech?) :o

 

The really interesting part is that NVidia could emulate the gsync module's double frame rate in their drivers (it just needs a buffer of the last processed frame), since laptops don't have a scaler (yes, a scaler in this instance) built in.

Now the really interesting thing is that AMD should be able to make an equivalent driver implementation of frame doubling just as easily as NVidia, as it just needs a buffer like Vsync.
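For illustration only, here is a rough Python sketch of that buffer-based doubling idea (the 37hz floor is the figure mentioned above; real drivers are obviously far more involved, so treat this as a sketch of the principle, not anyone's actual implementation):

```python
# Sketch of driver-side frame doubling: when the game's frame rate falls below
# the monitor's minimum variable refresh rate, re-send the buffered last frame
# enough times that the panel keeps refreshing inside its supported window.

MIN_VRR_HZ = 37  # assumed floor, matching the charge-hold margin discussed above

def effective_refresh(frame_rate_hz: float, min_vrr_hz: float = MIN_VRR_HZ):
    """Return (refresh rate sent to the panel, number of times each frame is shown)."""
    multiplier = 1
    while frame_rate_hz * multiplier < min_vrr_hz:
        multiplier += 1  # replay the buffered frame one more time
    return frame_rate_hz * multiplier, multiplier

for fps in (60, 40, 25, 12):
    hz, times = effective_refresh(fps)
    print(f"{fps} fps -> panel refreshed at {hz} Hz, each frame shown {times}x")
```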

 

 

• The problem here lies, that you actually believe AMD is making no money in all of this. 

• It's easy to point out when someone else doesn't understand what something is, when in actuality you don't understand what something is. Which is the case here. In order to utilize Adaptive-Sync currently, you must use an AMD card. There is no other way to use Adaptive-Sync currently. Hence, ecosystem. Stop calling things you don't understand logical fallacies. It doesn't make you appear more intelligent. I tried to explain this to you before.

• LOL Again you use logical fallacy. Now I'm starting to believe you must be 12 years old and you learned a new word on the internet. (yay ad hominem, that is also an informal fallacy, more specifically, a genetic fallacy) Now you feel the need to use it repeatedly. You must have ignored the part where I said there are panels used in G-Sync monitors that are also used in non G-Sync monitors. So your entire explanation is nullified and useless.

• You doubt it is the case? FreeSync is not responsible? NVIDIA has a method of handling frame rates below 30 that keeps G-Sync active, while FreeSync has absolutely nothing. So not only are they limited by panel technology, they don't even have anything that can take them below panel limitations like G-Sync can. You are being a hypocrite here. Like I said you would.

 

  • AMD makes money off of selling their hardware. AMD has themselves stated that they have no licensing fees on their Freesync certification program. Even though AMD proposed the Adaptive Sync standard, they have no right to any royalties. The point was (and is), that NVidia makes money off of every single gsync monitor sold, and AMD does not on any Adaptive Sync monitor. You believe otherwise, fair enough, but you hold the burden of proof.
  • It's easy to point out when other people don't know or understand something, when you do and they don't. How about being open-minded and learning when someone tells you something you don't know? You even admitted to not knowing terminology and theories. Why not ask instead and learn from the people who know more? I agree AMD is the only player on the market supporting Adaptive Sync right now (that is a fact), but it is open to anyone to support it. You cannot call an open, license-free industry standard vendor lock-in or a closed ecosystem.
  • Yes it was a straw man, when you claimed I said you had to make money off of consoles to have more inputs on a monitor. AMD still has no influence or say in how a monitor vendor chooses to make their monitors. LG, Asus, BENQ, etc. do what they see best for them and their customers.
  • I think the problem here is discussing FPS or Hz. I don't deny Gsync's way of dealing with sub-37 FPS is better than AMD's forced Vsync. But in the end, you will have a bad stuttery experience playing at that low FPS. Displaying the same image several times will still result in some lag. But you are right that the synced frame rate still works below 37 FPS, although not completely, as the monitor will run at twice the hz. Whether this makes any qualitative difference when gaming, I don't know. But I've stated several times on the LTT forum that I would like for AMD to implement a similar buffer based solution.

 

Just to wrap things up:

 

Shall we go on, you just blatantly lied to prove a point? Like I said, you don't even know that words have multiple meanings. Then you wonder why I question your English capabilities. 

 

Panel has never meant monitor, as it is a specific component. You know better now, why do you keep on showing your ignorance? Also please prove that I don't know words can have multiple meanings. Even small children understand this. You calling a monitor a panel, is wrong by all definitions, stop the ad hominem please.

 

They were, it was a misrepresentation of the truth. 1-1.5% is so minuscule it might as well not even be stated or suggested. Especially when trying to be used as a comparison to another company's technology. 

 

Something being minuscule is a far cry from misrepresenting the truth. You edited out the AMD slide, claiming there was no performance hit. There is, and now you call it minuscule. Whether it has a real life effect or not makes no difference: the constant two-way handshake is overcomplicated, and comes at a cost.

 

It's easy to say someone doesn't know what they are talking about, when in fact, you are the one who doesn't know what they are talking about. Nobody confuses anything, the only one who is confused is yourself. On top of your confusion, you spin things around and make up lies to prove points. You don't even respond to what is being said half the time. Not once did I confuse hardware standards, hardware parts, terminology or products. Not once, but you need to lie at this point to prove a point because you lost the argument ages ago.

 

What is being said half the time is wrong, and shows clear signs of ignorance. You don't get to dictate how I answer something. If I need to educate you on something (like the Freesync certification program), then that is relevant to answering your question, so you understand why I hold a different opinion than you do. That should not be difficult to understand.

 

You misused terminology, hardware parts, etc. several times. Comparing a driver, which utilizes an open industry standard, to a monitor, and the limitations of panels, makes absolutely no sense. Either you are wilfully manipulating, creating propaganda, or you are very confused. At least that is MY subjective conclusion of your statements.

 

*Edit: The "scaler" of a monitor has different names, depending on the vendor. They are all called controllers, but either display, monitor or interface controller. No wonder people call it a scaler, even though a controller board contains a lot more than just the scaling chip. Edited the post to reflect this.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

Link to comment
Share on other sites

Link to post
Share on other sites

All nvidia has to do is open up Gsync communication and they start shitting gold.

Link to comment
Share on other sites

Link to post
Share on other sites

 
 

You hardly need AA at that resolution, turning it down to 2x would likely double the frame-rate. Or turn down other settings like particle density or the several kinds of multisampling. Or get an R9-295x2 for less money than a GTX 980 and have 50FPS.

 

Well, you do realize you can be averaging 50 FPS even with a dip below 30.
 
(57 + 58 + 60 + 25) / 4 = 200 / 4 = 50
 
Which is my main point: not everyone expects dips like that to happen, but they can happen. And if they do and you have G-Sync enabled, for instance, it will offer a smooth transition and make the dip less noticeable. Which is one of the main reasons for using G-Sync (to avoid stuttering, tearing and jarring). Adaptive-Sync on the other hand stops at about 40Hz, and to say that dips below 40 fps don't happen or won't happen at all is just not true, even with a high-end graphics card. This happens in triple-A titles even with a really nice graphics card (just look at minimum framerates for most triple-A games).
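A tiny Python sketch of that point, using the numbers from the example above (the 40Hz floor is the current Adaptive-Sync monitor minimum discussed in this thread):

```python
# An average frame rate can look comfortable while individual seconds still
# dip below a monitor's variable refresh floor.

samples = [57, 58, 60, 25]   # per-second frame rates from the example above
vrr_floor = 40               # minimum refresh of current Adaptive-Sync monitors

average = sum(samples) / len(samples)
dips = [s for s in samples if s < vrr_floor]

print(f"average: {average:.0f} fps")                      # 50 fps
print(f"seconds below the {vrr_floor}Hz floor: {dips}")   # [25]
```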
 

 

You do realize at this current moment there are no G-Sync or FreeSync monitors at this resolution. Close, yes, but this resolution, no.

 

Also if your frames are dropping to these kinds of levels you are doing things very wrong, turn stuff down or get more GPU power. Like we have been doing, prior to variable refresh.

 

Benchmarks are nice and all, but no one here would be foolish enough to run a game at such a low framerate. Hell, you'd be better off going to console.

 

Yes, but there are G-Sync monitors @ 4K and 1440p. So a bit above, and a little below (and 1600p is very similar to 1440p as far as how taxing it is).
 
Like in the example above, you can be averaging 50 fps even with a dip below 30.
 
Well, nobody intends to run at that FPS, but with modern triple-A titles you cannot control when dips happen. A perfect example is Dying Light: you can be averaging around 50-60fps just fine with a modern graphics card and there are instances when you get severe dips. Being able to have coverage down that low will smooth those situations out and make them less noticeable. There are other games this happens in, like Dragon Age: Inquisition. 
 

 

For the sake of simplicity, let's get all the language and semantics out of the way first:

  • This is a tech site, not Bob's woodshed. When discussing how hardware works, it is important to use the right terminology. Identifying limitations and discussing them is impossible if you use the wrong words. If a limit on frame rates lies somewhere in the computer, you call it a computer, not a "CPU" (in layman's terms). By now you should know the difference between a panel (the hardware part containing the pixels, that actually makes the picture) and a monitor. Why continue to be wrong, when you know better now?
  • Most of your logical fallacies used against me are straw men: http://en.wikipedia.org/wiki/Straw_man
  1. Straw men are a type of informal fallacy
  2. Informal fallacy is a type of logical fallacy
  3. All straw men are logical fallacies.
  4. Feel free to criticize me for not being specific, but calling them logical fallacies is factually correct. I might even point some of them out. Most of your misrepresentations of my statements are in previous posts though.

 

 

  • It is very subjective to believe a company makes money off of another hardware vendor's product, when no licensing fees, hardware or patents are included. If you believe otherwise, feel free to prove it. I have given you sources showing AMD takes no license fees; that Adaptive Sync is license free, and that AMD has no direct income from a vendor's monitor sale, unlike NVidia.
  • Ok, now that you should know what a panel is, let's get into the technical part. I know you don't understand how a monitor and its components work, because you didn't react to the "under 7ms" part. So allow me to elaborate:


 

The reason why all 4K TN panels should be able to support 144hz:

  • At 144hz, each refresh cycle is just under 7ms long: 1 second = 1000 milliseconds, and 1000/144 ≈ 6.94ms. (I know, I know, logical scales like the metric scale are difficult for Americans to comprehend: http://cdn.zmescience.com/wp-content/uploads/2015/03/imperial_vs__metric_by_nekit1234007-d5p0ou5.png )
  • If the pixel response time of a panel is lower than 7ms, it can do 144hz. If it is close to that limit, however, the image will constantly be blurry in motion. A modern TN panel usually has a pixel response time of 1ms, which leaves almost 6ms to show and hold the given colour (I know pixel response varies with colour).
  • As such all TN panels are able to update at 144hz (or more). The resolution of the panel is irrelevant; being 2K, 4K or 32K makes no difference to the pixel response time.
  • Instead the limitation lies elsewhere, mainly in the bandwidth of the source input (HDMI/DisplayPort, etc.) and the monitor/display controller* http://www.paradetech.com/products/dp_monitor_controllers/dp703-lcd-monitor-controller/ (called a scaler in layman's terms, or even worse a scalar, which is a term from linear algebra).

Do you understand now why I claim any 4K TN PANEL (not monitor) is capable of 144hz?

 


 

An interesting fact you might actually enjoy. Because a liquid crystal in an LCD (liquid crystal display) cannot hold its charge for more than 30+ms or so, any LCD would have its pixels return to idle state, which is off (closed crystals, shutting out the backlight = black). At 30hz, each refresh lasts about 33ms, longer than the crystal can hold its charge, which is why the Asus laptop, with the leaked alpha gsync driver, flickered black screen constantly when the FPS went down to, or lower than, 30.

 

This is also why Gsync starts doubling frames at 37hz, and not at 30 like NVidia claim their Gsync tech can go down to (wait a minute, would that not mean that NVidia made dishonest claims about their gsync tech?) :o

 

The really interesting part is that NVidia could emulate the gsync module's double frame rate in their drivers (it just needs a buffer of the last processed frame), since laptops don't have a scaler (yes, a scaler in this instance) built in.

Now the really interesting thing is that AMD should be able to make an equivalent driver implementation of frame doubling just as easily as NVidia, as it just needs a buffer like Vsync.

 

 

 

  • AMD makes money off of selling their hardware. AMD has themselves stated that they have no licensing fees on their Freesync certification program. Even though AMD proposed the Adaptive Sync standard, they have no right to any royalties. The point was (and is), that NVidia makes money off of every single gsync monitor sold, and AMD does not on any Adaptive Sync monitor. You believe otherwise, fair enough, but you hold the burden of proof.
  • It's easy to point out when other people don't know or understand something, when you do and they don't. How about being open-minded and learning when someone tells you something you don't know? You even admitted to not knowing terminology and theories. Why not ask instead and learn from the people who know more? I agree AMD is the only player on the market supporting Adaptive Sync right now (that is a fact), but it is open to anyone to support it. You cannot call an open, license-free industry standard vendor lock-in or a closed ecosystem.
  • Yes it was a straw man, when you claimed I said you had to make money off of consoles to have more inputs on a monitor. AMD still has no influence or say in how a monitor vendor chooses to make their monitors. LG, Asus, BENQ, etc. do what they see best for them and their customers.
  • I think the problem here is discussing FPS or Hz. I don't deny Gsync's way of dealing with sub-37 FPS is better than AMD's forced Vsync. But in the end, you will have a bad stuttery experience playing at that low FPS. Displaying the same image several times will still result in some lag. But you are right that the synced frame rate still works below 37 FPS, although not completely, as the monitor will run at twice the hz. Whether this makes any qualitative difference when gaming, I don't know. But I've stated several times on the LTT forum that I would like for AMD to implement a similar buffer based solution.

 

Just to wrap things up:

 

Panel has never meant monitor, as it is a specific component. You know better now, why do you keep on showing your ignorance? Also please prove that I don't know words can have multiple meanings. Even small children understand this. You calling a monitor a panel, is wrong by all definitions, stop the ad hominem please.

 

 

Something being minuscule is a far cry from misrepresenting the truth. You edited out the AMD slide, claiming there was no performance hit. There is, and now you call it minuscule. Whether it has a real life effect or not makes no difference: the constant two-way handshake is overcomplicated, and comes at a cost.

 

 

What is being said half the time is wrong, and shows clear signs of ignorance. You don't get to dictate how I answer something. If I need to educate you on something (like the Freesync certification program), then that is relevant to answering your question, so you understand why I hold a different opinion than you do. That should not be difficult to understand.

 

You misused terminology, hardware parts, etc. several times. Comparing a driver, which utilizes an open industry standard, to a monitor, and the limitations of panels, makes absolutely no sense. Either you are wilfully manipulating, creating propaganda, or you are very confused. At least that is MY subjective conclusion of your statements.

 

*Edit: The "scaler" of a monitor has different names, depending on the vendor. They are all called controllers, but either display, monitor or interface controller. No wonder people call it a scaler, even though a controller board contains a lot more than just the scaling chip. Edited the post to reflect this.

 

Here's the problem with that explanation: you are taking what I said out of the context that I used it in, and trying to make an argument out of it to prove a point. Which is what you would call a logical fallacy. In the context that I used it in, it was more than acceptable. However, I also demonstrated throughout our conversations even prior to that, that I have a clear understanding of the differences between panels and monitors. However, in that one instance, when I used panel freely, you decided to make an argument out of it. Which is why you are being a straw man in this instance. Since I have used monitor and panel previously, in the proper context that you claim it must be used in. However, surprise, you have nothing left in your argument except this. So you are going to cling to it like someone like yourself would. It was not akin to saying "computer" when discussing a CPU, because in the context that I was using it, it was clear and understood what I was talking about. Then you proceed to act as if I don't know the difference, even though in conversations before, when context was explicitly required, I made it clear that I knew the difference. So now you are just being ridiculous.
 
• Not quite. Maybe you should do yourself a favor and go back to read what was said. Then try to elaborate on how that was being a straw man. I'm not sure you really understand how this terminology works. It's more of a defense mechanism of yours.
 
1) Yes
2) Yes
3) Yes 
4) Nobody is criticizing you for not being specific. I am criticizing you for overusing the terms to escape an argument (and when they barely apply). That's the irony of your arguments. You are so quick to point out other people are being a straw man or using logical fallacies, yet, you yourself are the one who is guilty of such things constantly. I guess it's your personal method in taking some of the heat off of yourself. I see it as a sort of reverse psychology that you use. You make logical fallacies yourself, and then when someone is presenting you with a more than understandable and solid argument. Then you quickly say it's a logical fallacy and therefore it is not acceptable. Then you don't even have to reply to the actual argument. It's your way of escaping.
 
Here's the problem. You criticized how I used the word and said it was incorrect. I gave you a link and a definition of said word. You then had the audacity, even after I gave you the definition, to still say it is not correct. That's how deluded you are. Here you are continuing on, even though I posted the link, with a screenshot (highlighting the definitions), trying to dismiss the way I used it. 
 
I did not ignore that definition, the definition has more than one meaning (what do you not understand? you cannot be this stupid!). And I was using one of its alternative meanings. What don't you understand about that? You aren't that close-minded that you actually believe the nonsense you are currently saying? I didn't ignore the definition of the inventor of the word. I was using a different meaning of the same word. And as words evolve they acquire more meanings. Do you have any sense of how etymology works? 
 
 
The original definition was coined in 1973. The creator, Norman Mailer, described a factoid as "facts which have no existence before appearing in a magazine or newspaper." He came up with the word by adding the suffix "oid", as an "oid" ending implies "similar but not the same" or more succinctly "like" or "resembling". However, because CNN and the BBC from the 1980s to today have included "factoids" in their newscasts referring to trivial bits of factual information, there is now a second "official" definition of "factoid" as follows (from Merriam-Webster): "a briefly stated and usually trivial fact"

 

 

 
See, you even try to be snide, and in the process you are wrong, making yourself look stupid. A humanoid is something resembling a human (looking or acting). That's what the suffix "oid" means: "similar but not the same," "like," "resembling." Therefore a factoid, because of the suffix, could mean like a fact, or similar but not the same as a fact, or resembling a fact. Therefore the Webster's definition of factoid remains intact. 
 
Now my definition is wrong? But my definition is from the Dictionary, so basically what you are saying is Merriam-Webster's Dictionary is wrong? LMAO. This is what I mean, you are so entirely deluded you actually believe half of the crap you say. It's just insanity at this point. My own source supports my claim as well. It was to show you that words have multiple meanings, something you clearly do not understand. I can't believe you are actually saying this, it just proves your insanity (oh no is that a logical fallacy, no, it's a fact based on your current argument). Seriously, learn from it and move on. You are wrong, you have been proven wrong. Give up on it already, because all you are doing is digging yourself a deeper hole at this point.
 
• It's not a straw man argument, we already all saw you said TN (stop acting crazy). Which is why there was no reason to be redundant. I said give me evidence of why a 4K panel would be able to do 144Hz I waited for it (you quickly called it a logical fallacy and dismissed the original question).
• I proved it to you before, but you quickly called it a logical fallacy. Do you not have any memory of this event? I said in order to use Adaptive-Sync/FreeSync you must be locked into an AMD ecosystem. Which means you are required to purchase an AMD card to use an Adaptive-Sync/FreeSync monitor. There is no other way to use Adaptive-Sync/FreeSync currently unless you use an AMD card. Therefore AMD is making money off of the fact that in order to utilize an Adaptive-Sync/FreeSync monitor a consumer needs to purchase an AMD card.
• It's funny how you say I don't understand how a monitor works, the funniest part is neither do you have the slightest clue about what you are talking about. And all of this nonsense about response times is completely meaningless and means absolutely nothing.
 

I knew that if I called you out on your bullshit response time math, you wouldn't believe me. So I instead decided to contact the head Moderator of the Monitors and Displays section at Overclock.net to find out exactly why everything you said is wrong. This is what happened:

 

 
My question:
 
Are all 4K TN panels able to do 144Hz?
 
A person stated that all 4K TN panels can do 144Hz, this was their reasons why:
 
The reason why all 4K TN panels should be able to support 144hz:
 
-At 144hz, 1hz (or fluctuations pr. second) is just under 7ms long. but 1 second = 1000 milliseconds.
 
-If the pixel response time of a panel is lower than 7ms, it can do 144hz. If it is close to it however, the image will constantly be blurry, at motion. A modern TN panel usually has a pixel response time of 1 ms, which leaves almost 6ms to show and hold the given colour (I know pixel response varies with colour).
 
-As such all TN panels are able to update at 144hz (or more). The resolution of the panel is irrelevant, being 2K, 4K or 32K makes no difference to the pixel response time.
 
Is this entirely correct? And/or are there other reasons why 4K 144Hz panels don't exist?
 
Answer: 
 
No.
 
1) Pixel response times are all marketing and not meaningful.
2) There has to be hardware in the monitor capable of handling the frequency properly.
 
Question:
 
So basically all of that explanation that was given is practically meaningless and has no correlation?
 
Answer:
 
Yeah. Pixel response time on package is the gray-to-gray time. It's the fastest of the possible pixel times.
 
 
Even if you have a display that can refresh fast enough, the display needs the proper hardware to handle the necessary bandwidth and refresh fast enough. You can overclock any LCD already. How it handles the overclock depends.... it may work, it may appear to work (but doesn't), or the screen may go black.
 
Read up on "frame skipping" as an example of how it doesn't always actually work: http://www.testufo.com/#test=frameskipping
 
I have a X-Star 27" 1440p monitor.... it's 60Hz by default like most LCDs. However, many can overclock to 96Hz or 120Hz.... but not all.

 

 

Do you understand now why I say current panel limitations are also what restrict 4K 144Hz monitors? I know you won't believe this, but I will post a screenshot of the conversation if I have to.
 
Not really, at all, how is it being dishonest? They have a method employed that gives an experience like G-Sync is enabled even way below 37 fps. It's not like FreeSync where as soon as it dips below 40 Adaptive-Sync is disabled and stuttering, jarring, or screen-tearing is present.
 
• You can call it a logical fallacy, but you are acting sporadic. You seem to dismiss arguments by instead calling them logical fallacies. Whereas, I'm just stating the obvious. You overuse the term to not have to debate about a topic. Notice how we are barely even talking about any of my original arguments made? That was straw man tactic that you used this entire time to sway the conversation in a direction that you needed it to go in order to regain control after you had lost the argument ages ago.
• What you don't realize is that retailers using the AMD FreeSync name or AMD in any instance (when selling these monitors labeled "AMD FreeSync" or "FreeSync") usually calls for some sort of agreement, and usually those agreements are monetary. You are licensing out your name to be used (having the name on a product). This very rarely is a free service. You have to understand that there has to be a way somehow for AMD to make money on this, right? You don't actually believe they are not making any money on FreeSync, right (aside from the fact that you need an AMD card to utilize Adaptive-Sync/FreeSync)? The fact that they even used "Free" in FreeSync is a way to lead consumers into believing that they aren't making money on it. It's another marketing ploy. Is FreeSync actually free? No, not really, because not any person can use it. Only people with an R9 295x2, R9 290X, R9 290, R9 285, R7 260X or R7 260 can utilize the support for dynamic refresh rates during gaming. Which means you have to go out and buy one of these cards to currently use it. Which means all the owners of cards that are not in this tiny bracket must now go replace their card in order to use FreeSync, on top of the $600 some odd dollars they have to spend on the monitor itself. And remember, they are putting the FreeSync name and/or AMD FreeSync name on the monitors. Even if AMD has no proprietary module inside, or doesn't dictate what manufacturers put inside the monitor itself, they are still allowing the manufacturer to use the FreeSync logo and AMD logo. Which, as I stated previously, makes it impossible for it to be free. Just going by logic (you know, something that you claim to have so much of), you would have to know that in AMD's current financial situation as a company, and now even more with the securities fraud lawsuit, they need to be able to make a profit on their products. Therefore, because of this fact, it would be only ignorance that would lead someone to believe that they aren't accumulating a single dollar from FreeSync. 
• Now you are back to your reverse psychology. Using the exact statement I said to you back to me, how ironic. After I proved you wrong about factoid having more than one meaning (which you still deny as true), and now after I proved you wrong about all 4K TN panels being able to do 144Hz, you still (and will) act in this manner. I never admitted anything. So now you are even straight up lying like you always do in an argument to prove a point. You are nothing more than a liar. Please quote me where I admitted not knowing terminology or theories (those were your accusations). Please do. The funny thing here is you don't know more, stop acting so condescending. You can call it a closed ecosystem because that's how it is right now and that's how it is going to be. In order to use FreeSync you need to use an AMD card. This is fact and you even admitted so. Denying that it is a closed ecosystem is just arguing for argument's sake. The only other company that can utilize Adaptive-Sync is Intel, but do we know if Intel is going to do that? Has Intel hinted at doing that? No. So as it currently stands, although Adaptive-Sync is an open standard, it is currently limited to AMD. Nobody denied that Adaptive-Sync is open for use by anyone. What I am saying is the way it is offered currently is limited to AMD.
• See, now you are forgetting very easily. This is what you said (it was not a straw man, you just overuse the term; anything you can say to get out of getting proven wrong):

 

 

 

  • Because NVidia doesn't make any money off of the consoles (unlike AMD).

 

 

 

This was one of your points being made as to why additional inputs aren't a necessity for NVIDIA to include on their monitors, after I made the point that additional inputs could be beneficial for other things and that NVIDIA would include them if they could. You said, because NVIDIA doesn't make any money off of consoles (unlike AMD). That is why I asked the RHETORICAL QUESTION, "So now you need to make money off of consoles to have more inputs on your monitors?" You made the point, I was following up with that same point. You just don't happen to remember what you said. That was one of your reasons, because NVIDIA doesn't make any money off of consoles (unlike AMD). That was your point, not mine. You said that because NVIDIA doesn't make consoles, that was one of YOUR points as to why they don't need to include additional inputs, not mine. And don't try to play it off like that wasn't what you were saying. That was exactly what you were trying to convey.
 
How could AMD not have influence or say in what the monitor vendors produce? They are using their logos and branding on the monitors. Trust me, they have a say, or they wouldn't allow them to freely use their logos. This is just common sense.
 
• I don't think you understand the point of this technology if you think you will have a bad stuttery experience at that FPS. The point of these technologies is to eliminate stuttering, screen-tearing and jarring. Their having this implementation is for those instances when you do have dips under 37 fps, which everyone has experienced in a game at some point. Having coverage even below 37 fps means you will get a smooth transition. So if you are at 50, and you dip below 30, with G-Sync enabled you will barely notice it compared to if you had it disabled or had V-Sync on. Having coverage below 37 fps is not so much for actually playing at those frames but rather for what happens when you do get down to those framerates (more so for brief and quick instances or transitions).
 
Just to wrap things up.
 
I have used panel and monitor in the correct context that you require them to be used many, many times. So saying this just proves your craziness. You can go back and look at when I was specifying panel and when I was specifying monitor. The fact that this is even one of your arguments proves that you are nothing more than a straw man, chock full of the logical fallacies you claim others to have. And yet, you still are acting like a pompous asshole even after you know I know how to use panel and monitor correctly. You don't know words have multiple meanings; after I proved to you that factoid has multiple meanings you denied it having more than one meaning and accepted your own personal definition as true. Because like I said before, your ignorance is only second to your arrogance. If even small children understand this, then you should understand this. Factoid has more than one meaning, accept it or don't. Either way, if you don't, you are wrong. If I need to stop the ad hominem, you need to stop the straw man fallacies.
 
Yes, it is a misrepresentation of the truth. If someone sees one of AMD's advertisements or one of their slides showing that FreeSync has no performance hit and NVIDIA does, people will automatically assume that the performance hit is great enough to even bring up as a point. Which is the problem: it is not great enough to even make a point. 1-1.5% is so minuscule it's like cents out of a hundred dollars. Secondly, that wasn't my slide, that was from someone else. Whether it has a real life effect makes all the difference. If it really makes no difference at all performance wise, why even call it a performance hit? So it does make a difference.
 
Yes, I know what you say half of the time is wrong. Educate me? You didn't educate anyone, this is just your delusional thinking taking over. Please cut the pompous act, if you want to gain any respect around here. Just drop the act entirely. 
 
No, see, nobody misused any terms. You just don't read properly or thoroughly, and confuse yourself in the process. You are the only person who thought such things. Which must tell you something: if you are the only person who thinks I am speaking out of context, misusing terminology and hardware parts, and comparing things improperly, what does that say? If I'm willfully manipulating, creating propaganda, or am very confused, what are you doing? You defend AMD's lies and misrepresentations of truth like you work for the company. And make sure you realize the importance of the statement you made: subjective, very.
Link to comment
Share on other sites

Link to post
Share on other sites

come on guys, we don't need walls of text.  Here's how it works:

 

AMD fanboys are pissed because freesync isn't shaping up to be better in general.

 

Nvidia fanboys are pissed because FreeSync has the potential to be better long term.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.  

Link to comment
Share on other sites

Link to post
Share on other sites

Here's the problem with that explanation: you are taking what I said out of the context that I used it in, and trying to make an argument out of it to prove a point. Which is what you would call a logical fallacy. In the context that I used it in, it was more than acceptable. However, I also demonstrated throughout our conversations even prior to that, that I have a clear understanding of the differences between panels and monitors. However, in that one instance, when I used panel freely, you decided to make an argument out of it. Which is why you are being a straw man in this instance. Since I have used monitor and panel previously, in the proper context that you claim it must be used in. However, surprise, you have nothing left in your argument except this. So you are going to cling to it like someone like yourself would. It was not akin to saying "computer" when discussing a CPU, because in the context that I was using it, it was clear and understood what I was talking about. Then you proceed to act as if I don't know the difference, even though in conversations before, when context was explicitly required, I made it clear that I knew the difference. So now you are just being ridiculous.

 
• Not quite. Maybe you should do yourself a favor and go back to read what was said. Then try to elaborate on how that was being a straw man. I'm not sure you really understand how this terminology works. It's more of a defense mechanism of yours.
 
(...)
4) Nobody is criticizing you for not being specific. I am criticizing you for overusing the terms to escape an argument (and when they barely apply). That's the irony of your arguments. You are so quick to point out other people are being a straw man or using logical fallacies, yet, you yourself are the one who is guilty of such things constantly. I guess it's your personal method in taking some of the heat off of yourself. I see it as a sort of reverse psychology that you use. You make logical fallacies yourself, and then when someone is presenting you with a more than understandable and solid argument. Then you quickly say it's a logical fallacy and therefore it is not acceptable. Then you don't even have to reply to the actual argument. It's your way of escaping.

 

The entire starting point was discussing where the limitations lay. In that context, the correct terminology mattered. Especially whether it was in the panel, the monitor controller (scaler), or not even in the monitor itself.

You know the correct terminology now (you might have before too, but why then use it wrongly?), and by now you should know why the right terminology is so important when discussing technical matters. I cannot read your mind, only the words you type; use the wrong ones, and I will misunderstand. At least in this case, where that mistake made a difference.

 

The problem is that instead of just writing your points, showing sources, etc., you have a tendency to "recap" or paraphrase what I have said. In doing so a lot of the meaning/details disappear, and you end up misconstruing my arguments and points, and passing them off as mine. That results in a straw man, that you then continue to criticize me for not defending myself against. That is the entire point of a straw man, and the reason why straw men are shitty fallacies to use in a discussion. You create an argument based on a straw man. If I try to argue against that, I end up arguing against the straw man.

 

Here's the problem. You criticized how I used the word and said it was incorrect. I gave you a link and a definition of said word. You then had the audacity, even after I gave you the definition, to still say it is not correct. That's how deluded you are. Here you are continuing on, even though I posted the link, with a screenshot (highlighting the definitions), trying to dismiss the way I used it. 

 
I did not ignore that definition, the definition has more than one meaning (what do you not understand? you cannot be this stupid!). And I was using one of its alternative meanings. What don't you understand about that? You aren't that close-minded that you actually believe the nonsense you are currently saying? I didn't ignore the definition of the inventor of the word. I was using a different meaning of the same word. And as words evolve they acquire more meanings. Do you have any sense of how etymology works? 
 

 

See, you even try to be snide, and in the process you are wrong, making yourself look stupid. A humanoid is something resembling a human (looking or acting). That's what the suffix "oid" means: "similar but not the same," "like," "resembling." Therefore a factoid, because of the suffix, could mean like a fact, or similar but not the same as a fact, or resembling a fact. Therefore the Webster's definition of factoid remains intact. 
 
Now my definition is wrong? But my definition is from the Dictionary, so basically what you are saying is Merriam-Webster's Dictionary is wrong? LMAO. This is what I mean, you are so entirely deluded you actually believe half of the crap you say. It's just insanity at this point. My own source supports my claim as well. It was to show you that words have multiple meanings, something you clearly do not understand. I can't believe you are actually saying this, it just proves your insanity (oh no is that a logical fallacy, no, it's a fact based on your current argument). Seriously, learn from it and move on. You are wrong, you have been proven wrong. Give up on it already, because all you are doing is digging yourself a deeper hole at this point.

 

 

The point was that I stated AMD's claim was factual. You said it wasn't, and produced (or sourced) an image with the claim crossed over, and replaced with the opposite, as if AMD has stated something incorrect. They have not. You then proceed to call it a factoid, which it isn't.

 

The author of the word factoid, as well as the Oxford dictionary, defines it differently than Merriam-Webster. I choose to take the definition made by the original author as a prime source (like any academic would). Merriam-Webster's own first definition supports my two sources, and Merriam-Webster's second definition supports yours. How you can believe that Merriam-Webster's second definition holds true above their primary definition, as well as the definitions of the author and Oxford, is beyond me. Like I said, many people using something wrong does not make it right/a fact. Ironically, that is what a factoid was used to explain, when it came to Marilyn Monroe's life, as portrayed in the media.

 

The thing about facts is that they are binary: either they are objectively correct or not. So something being like a fact, or resembling a fact, does not make it a fact (= true). Calling it a factoid when it's factual is therefore incorrect. If you want to believe otherwise, go ahead. The entire point is still that there is a performance hit with Gsync and not Freesync.

 

Yes, it is a misrepresentation of the truth. If someone sees one of AMD's advertisements or one of their slides showing that FreeSync has no performance hit and NVIDIA does, people will automatically assume that the performance hit is great enough to even bring up as a point. Which is the problem: it is not great enough to even make a point. 1-1.5% is so minuscule it's like cents out of a hundred dollars. Secondly, that wasn't my slide, that was from someone else. Whether it has a real life effect makes all the difference. If it really makes no difference at all performance wise, why even call it a performance hit? So it does make a difference.

 

It does not state the extent of the performance hit, correct. Actually it's 1-1½ dollars out of a hundred. You can argue how much of a difference it makes in practice; and I agree it's probably not a lot, but it is still a fact, and it still points to the issue of having an overly complex two-way handshake communication, with seemingly no advantage to it. That was the point.

 

 

• It's not a straw man argument, we already all saw you said TN (stop acting crazy). Which is why there was no reason to be redundant. I said give me evidence of why a 4K panel would be able to do 144Hz I waited for it (you quickly called it a logical fallacy and dismissed the original question).

• I proved it to you before, but you quickly called it a logical fallacy. Do you not have any memory of this event? I said in order to use Adaptive-Sync/FreeSync you must be locked into an AMD ecosystem. Which means you are required to purchase an AMD card to use an Adaptive-Sync/FreeSync monitor. There is no other way to use Adaptive-Sync/FreeSync currently unless you use an AMD card. Therefore AMD is making money off of the fact that in order to utilize an Adaptive-Sync/FreeSync monitor a consumer needs to purchase an AMD card.
• It's funny how you say I don't understand how a monitor works, the funniest part is neither do you have the slightest clue about what you are talking about. And all of this nonsense about response times is completely meaningless and means absolutely nothing.

 

  • Good, but recapping my point without it might confuse the reader, and to me it looks like a missed important detail on your part.
  • I'm still not agreeing to your use of "locked" and "ecosystem" here, when talking about open standards. We are talking about an industry standard that is optional, and AMD supporting it. Vendor lock-in and closed proprietary ecosystems are created in a different way. I understand your point, but 1. Everyone can support the standard; AMD is not responsible for what Intel/NVidia/etc. does or doesn't do. 2. You are using specific terminology incorrectly. You don't need to use those terms to explain your point. You are only setting yourself up for failure that way.
I knew that if I called you out on your bullshit response time math, you wouldn't believe me. So I instead decided to contact the head Moderator of the Monitors and Displays section at Overclock.net to find out exactly why everything you said is wrong. This is what happened:

 

  • So now I have to argue by proxy, via a third party? Come on! Well let me try:

Being a head moderator does not necessarily make him competent in the field, so I don't know why you emphasized that particular part.

 

 
My question:
 
Are all 4K TN panels able to do 144Hz?
 
A person stated that all 4K TN panels can do 144Hz, this was their reasons why:
 
The reason why all 4K TN panels should be able to support 144hz:
 
-At 144hz, 1hz (or fluctuations pr. second) is just under 7ms long. but 1 second = 1000 milliseconds.
 
-If the pixel response time of a panel is lower than 7ms, it can do 144hz. If it is close to it however, the image will constantly be blurry, at motion. A modern TN panel usually has a pixel response time of 1 ms, which leaves almost 6ms to show and hold the given colour (I know pixel response varies with colour).
 
-As such all TN panels are able to update at 144hz (or more). The resolution of the panel is irrelevant, being 2K, 4K or 32K makes no difference to the pixel response time.
 
Is this entirely correct? And/or are there other reasons why 4K 144Hz panels don't exist?

 

The biggest problem for both me, and the moderator, is that you left out one of the most important parts, that I wrote: The very last point:

 

 

 

 

 See this is the very key to my point. So let's go through his points:

 

Answer: 
 
No.
 
1) Pixel response times are all marketing and not meaningful.
2) There has to be hardware in the monitor capable of handling the frequency properly.
 
Question:
 
So basically all of that explanation that was given is practically meaningless and has no correlation?
 
Answer:
 
Yeah. Pixel response time on package is the gray-to-gray time. It's the fastest of the possible pixel times.
 
 
Even if you have a display that can refresh fast enough, the display needs the proper hardware to handle the necessary bandwidth and refresh fast enough. You can overclock any LCD already. How it handles the overclock depends.... it may work, it may appear to work (but doesn't), or the screen may go black.
 
Read up on "frame skipping" as an example of how it doesn't always actually work: http://www.testufo.c...t=frameskipping
 
I have a X-Star 27" 1440p monitor.... it's 60Hz by default like most LCDs. However, many can overclock to 96Hz or 120Hz.... but not all.

 

 

 

Answer 1:

  1. I disagree somewhat, but I see his point. He makes it again below in the second answer: pixel response time varies with colour, so the published response time is grey to grey, which is a bit misleading. That is why I was so specific about TN panels, as they have the fastest pixel response times, and that is the reason why they have been the only 144hz panels on the market for ages (I know AHVA at 144hz is a thing now, and I am very excited for it).
  2. Which I clearly stated in the point you left out. That "hardware" he is mentioning is the monitor controller (aka. scaler), which controls the panel. That was the ENTIRE point, and you chose to leave it out?!?

Answer 2:

 

Exactly, which is why I wrote "(I know pixel response varies with colour)."

Everything else he states, is exactly what I wrote in the point, you left out: It's up to the monitor controller to support/handle.

 

You did yourself a massive disservice by leaving out my last point. Feel free to go back, put that last part in, and see what he has to say about that.

 

As for his monitor, it uses whatever panels X star can get a hold of. Seems like PLS or AHVA panels. The latter should handle 120hz, the former, not so much. The actual overclocking is of the monitor controller. If that controller forces the panel out of spec, the panel will go black or do weird shit, just like he says.


And no, I don't understand why you say current panel tech does not support 4K@144hz, because, like I said, resolution has no effect on pixel response time in the panel. The entire limitation on 4K TN panels is the monitor controller. But the fact that you still use panel in that question either means you still don't understand what a panel is, and how it works, or you just continue to use wrong terminology.
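To put a rough number on why the controller and source link are the bottleneck rather than the panel, here is a back-of-the-envelope Python sketch (uncompressed 8-bit RGB, blanking intervals ignored; the DisplayPort 1.2 figure is the usable data rate after 8b/10b coding):

```python
# Back-of-the-envelope: raw pixel data rate for 4K at 144hz versus what the
# source link can carry. The panel's pixel response time never enters into it;
# the limit is link bandwidth and the monitor controller driving the panel.

def pixel_data_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed pixel data rate in gigabits per second (ignores blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

required = pixel_data_rate_gbps(3840, 2160, 144)
dp12_payload_gbps = 17.28   # DisplayPort 1.2 (HBR2) usable data rate after 8b/10b coding

print(f"4K @ 144hz needs roughly {required:.1f} Gbps of pixel data")  # ~28.7 Gbps
print(f"DisplayPort 1.2 carries about {dp12_payload_gbps} Gbps")      # not nearly enough
```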

 

 

Not really, at all, how is it being dishonest? They have a method employed that gives an experience like G-Sync is enabled even way below 37 fps. It's not like FreeSync where as soon as it dips below 40 Adaptive-Sync is disabled and stuttering, jarring, or screen-tearing is present.

 

• You can call it a logical fallacy, but you are acting sporadic. You seem to dismiss arguments by instead calling them logical fallacies. Whereas, I'm just stating the obvious. You overuse the term to not have to debate about a topic. Notice how we are barely even talking about any of my original arguments made? That was straw man tactic that you used this entire time to sway the conversation in a direction that you needed it to go in order to regain control after you had lost the argument ages ago.

• What you don't realize is that retailers using the AMD FreeSync name or AMD in any instance (when selling these monitors labeled "AMD FreeSync" or "FreeSync") usually calls for some sort of agreement, and usually those agreements are monetary. You are licensing out your name to be used (having the name on a product). This very rarely is a free service. You have to understand that there has to be a way somehow for AMD to make money on this, right? You don't actually believe they are not making any money on FreeSync, right (aside from the fact that you need an AMD card to utilize Adaptive-Sync/FreeSync)? The fact that they even used "Free" in FreeSync is a way to lead consumers into believing that they aren't making money on it. It's another marketing ploy. Is FreeSync actually free? No, not really, because not any person can use it. Only people with an R9 295x2, R9 290X, R9 290, R9 285, R7 260X or R7 260 can utilize the support for dynamic refresh rates during gaming. Which means you have to go out and buy one of these cards to currently use it. Which means all the owners of cards that are not in this tiny bracket must now go replace their card in order to use FreeSync, on top of the $600 some odd dollars they have to spend on the monitor itself. And remember, they are putting the FreeSync name and/or AMD FreeSync name on the monitors. Even if AMD has no proprietary module inside, or doesn't dictate what manufacturers put inside the monitor itself, they are still allowing the manufacturer to use the FreeSync logo and AMD logo. Which, as I stated previously, makes it impossible for it to be free. Just going by logic (you know, something that you claim to have so much of), you would have to know that in AMD's current financial situation as a company, and now even more with the securities fraud lawsuit, they need to be able to make a profit on their products. Therefore, because of this fact, it would be only ignorance that would lead someone to believe that they aren't accumulating a single dollar from FreeSync. 
 

 

If AMD lies when saying their Freesync graphics driver can go down to 9 Hz, then what does NVidia do when they claim their hardware can go down to 30 Hz, when in fact it only goes down to 37 Hz? Again, we are talking hertz, not frames per second. I still like NVidia's implementation, but you are holding AMD responsible for hardware limitations that they have no control over.

 

  • I doubt a retailer is paid any money for putting a selling point of a product they sell on its product description page. Come on now.

This point is a big one, so it needs a little more space. The problem is that AMD has to counter NVidia's closed, proprietary ecosystem, which causes vendor lock-in. That is why they were so quick to make a Freesync demo, and why they proposed the Adaptive Sync standard to VESA. Making money is also about keeping market share. In this case, AMD has worked with the monitor vendors, scaler vendors and VESA to make Adaptive Sync a reality. Giving all of this away for free, including the certification of the monitors, helps AMD compete with features like Gsync, which can cost market share (and might already have; we know the 900 series has taken a big chunk, though whether Gsync has any influence on that is difficult to conclude). In this case, gaining a high market share of Adaptive Sync monitors can make AMD graphics cards more desirable, especially if people already have an Adaptive Sync monitor.

 

Just because NVidia takes a large chunk of money on everything they do does not mean all companies operate this way. AMD makes money off their cards, and they do so by ensuring market share, and hopefully gaining some, by using cheaper technologies like Adaptive Sync, thanks to component and vendor competition and no royalty on the tech.

 

This is why I say you confuse brands, software, hardware, etc. Freesync is a driver: a piece of software that controls the graphics card. AMD drivers are free for gaming graphics cards. Adaptive Sync, which is the monitor's ability to do variable refresh rates, has nothing to do with AMD or pricing. I've linked to several monitors that had Adaptive Sync functionality and came with a similar price tag to the same models without Adaptive Sync. Sounds pretty free to me. It is from AMD's side either way. AMD also has free support for 4K on their graphics cards. You still have to get a 4K monitor, but the support is not something you pay extra for.

 

Putting the Freesync logo on the boxes just shows that AMD Freesync can be utilized on the monitor, and that AMD finds the Adaptive Sync implementation good. But I have already stated all of this several times.

 

Again, it is YOUR claim that AMD makes money directly off the Freesync logo on monitors. AMD says it's license-free, and I believe them. This is a marketing issue. Marketing is a cost, not a source of revenue.

 

 

• Now you are back to your reverse psychology, using the exact statement I said to you back at me; how ironic. After I proved you wrong about factoid having more than one meaning (which you still deny), and now after I proved you wrong about all 4K TN panels being able to do 144 Hz, you still (and will) act in this manner. I never admitted anything, so now you are straight-up lying, like you always do in an argument to prove a point. You are nothing more than a liar. Please quote me where I admitted not knowing terminology or theories (those were your accusations). Please do. The funny thing here is you don't know more, so stop acting so condescending. You can call it a closed ecosystem because that's how it is right now and that's how it is going to be. In order to use FreeSync you need to use an AMD card. This is fact, and you even admitted so. Denying that it is a closed ecosystem is just arguing for argument's sake. The only other company that can utilize Adaptive-Sync is Intel, but do we know if Intel is going to do that? Has Intel hinted at doing that? No. So as it currently stands, although Adaptive-Sync is an open standard, it is currently limited to AMD. Nobody denied that Adaptive-Sync is open for use by anyone. What I am saying is that, the way it is offered currently, it is limited to AMD.

 

 

  • You never claimed that factoid had more than one meaning, only that your meaning was correct. Also you have by no means "proved" that I am wrong about 4K TN panels, but by misrepresenting my posts once again, you skewed the outcome. Try again! Here is the quote you asked for:

     

    I don't understand theories or terminology, but you use logical fallacy when it doesn't even apply, and the only errors in logic present is your lack of understanding towards what is being said. 

    Now if you don't understand the theories and terminology, then how can you conclude what open and closed ecosystems are? AMD uses industry standards, which makes any ecosystem across hardware vendors open, as opposed to a proprietary, closed ecosystem that only works with one vendor.

 

• See, now you are forgetting very easily; this is what you said (it was not a straw man, you just overuse the term; anything you can say to get out of being proven wrong):

 

This was one of the points you made as to why additional inputs aren't a necessity for NVIDIA to include on their monitors, after I made the point that additional inputs could be beneficial for other things and that NVIDIA would include them if they could. You said it was because NVIDIA doesn't make any money off of consoles (unlike AMD). That is why I asked the RHETORICAL QUESTION, "So now you need to make money off of consoles to have more inputs on your monitors?" You made the point; I was following up on that same point. You just don't remember what you said. That NVIDIA doesn't make any money off of consoles (unlike AMD) was one of YOUR reasons for why they don't need to include additional inputs, not mine. And don't try to play it off like that wasn't what you were saying; that was exactly what you were trying to convey.
 
How could AMD not have influence or a say in what the monitor vendors produce? They are using their logos and branding on the monitors. Trust me, they have a say, or they wouldn't allow them to freely use their logos. This is just common sense.
 
• I don't think you understand the point of this technology if you think you will have a bad, stuttery experience at that FPS. The point of these technologies is to eliminate stuttering, screen tearing and jarring. Having this implementation covers those instances when you do dip under 37 fps, which everyone has experienced in a game. Having coverage even below 37 fps means you get a smooth transition. So if you are at 50 fps and dip below 30, with G-Sync enabled you will barely notice it compared to having it disabled or having V-Sync on. Coverage below 37 fps is not so much for actually playing at those framerates, but for what happens when you do drop down to them (more so for brief instances or transitions).

 

  • Point was very clear: NVidia designed Gsync to benefit themselves. They designed the entire monitor controller with no prior knowledge, so making a controller board that could handle several inputs would increase R&D cost, complexity and development time, and come with no benefit for NVidia hardware (and thus NVidia themselves). That is the point, and I stand by it.

Your rhetorical question was a straw man, built on the notion that AMD has any say or control over what any monitor vendor does or chooses when they include Adaptive Sync functionality. The point was, and is, that Adaptive Sync monitors have all the standard inputs, for more options and convenience for the monitor vendors' end users, and that this is essentially free, as it is just reused design and tech from the scaler vendors' previous non-Adaptive-Sync models. AMD has no say in this, but it is a plus for Adaptive Sync vs. Gsync.

 

I have explained how AMD's certification program works and what purpose it serves (ensuring an acceptable Hz interval, or VRR range, as it is now called).

  • This is why semantics matter. Maybe stutter is not the correct term? But if you play a game at 10 fps, it will feel like a sped-up slideshow. I would call that a stuttery feeling. It doesn't matter if Gsync actually runs at 40 Hz, since you will still only see 10 unique frames per second. Now at 37 Hz, or over 30 Hz, I agree it's not a stuttery mess, but below that I would say it is. That was the point. Either way, I look forward to AMD implementing the same functionality of frame doubling/tripling at low Hz; it should be fairly easy to do driver side (a rough sketch of the idea follows below).
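For what it's worth, here is a minimal sketch of what that driver-side frame multiplication could look like. It is purely illustrative: the thresholds and function name are made up, and a real driver or module obviously does far more (timing, frame pacing, etc.):

```python
# Purely illustrative sketch of low-framerate frame multiplication, the idea
# behind G-Sync's behaviour below ~37 fps and what a driver-side equivalent
# might look like. The limits below are hypothetical example values.

PANEL_MIN_HZ = 40      # lowest refresh rate the panel/controller accepts
PANEL_MAX_HZ = 144     # highest refresh rate the panel/controller accepts

def refresh_plan(fps: float) -> tuple[int, float]:
    """Return (times each frame is shown, resulting refresh rate in Hz)."""
    if fps >= PANEL_MIN_HZ:
        return 1, min(fps, PANEL_MAX_HZ)      # normal VRR: one scan-out per frame
    multiplier = 2
    while fps * multiplier < PANEL_MIN_HZ:    # double, triple, ... until back in range
        multiplier += 1
    return multiplier, fps * multiplier

for fps in (120, 45, 30, 19, 10):
    repeats, hz = refresh_plan(fps)
    print(f"{fps:3d} fps -> show each frame {repeats}x, panel refreshes at {hz:.0f} Hz")
```

The gist is just that the smallest multiplier is picked that keeps the effective refresh rate inside the panel's VRR window, which is why a dip to 25 fps can still be scanned out at 50 Hz instead of the panel dropping out of variable refresh.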

Just to wrap things up.

 
I have used panel and monitor in the correct context, as you require, many, many times. Saying this just proves your craziness. You can go back and look at when I was specifying panel and when I was specifying monitor. The fact that this is even one of your arguments proves that you are nothing more than a straw man, chock-full of the logical fallacies you claim others have. And yet you are still acting like a pompous asshole (ad hominem), even after you know that I know how to use panel and monitor correctly. You don't know that words have multiple meanings (straw man); after I proved to you that factoid has multiple meanings, you denied it having more than one meaning and accepted your own personal definition as true (another straw man; I accepted the definition of the word's author). Because, like I said before, your ignorance is only second to your arrogance (another ad hominem). If even small children understand this, then you should understand this. Factoid has more than one meaning; accept it or don't. Either way, if you don't, you are wrong. If I need to stop the ad hominem, you need to stop the straw man fallacies.

 

Good, so why did you use panel wrongly when you know better? It just created confusion and became an incorrect statement, one that, to me, showed that you do not understand the inner workings of a monitor. If you do, fine, then use the correct terminology. However, several later statements tell me that your understanding of a lot of monitor hardware is severely lacking, especially with monitor controllers and panels (4K@144 Hz, for instance).

 

Your fallacies are just a joke. Stop it. Continue to use factoid wrong; I don't care. It's just odd that it never seems to have crossed your mind that maybe Merriam-Webster is actually wrong on this one. Or maybe the media (as you quoted) just misused the term factoid, so everyone believes something incorrect. That is indeed very ironic, as that is the definition of factoid. How very meta!

 

Yes, I know that what you say half of the time is wrong. Educate me? You didn't educate anyone; this is just your delusional thinking taking over. Please cut the pompous act if you want to gain any respect around here. Just drop the act entirely.

 
No, see, nobody misused any terms. You just don't read properly or thoroughly, and you confuse yourself in the process. You are the only person who thought such things, which must tell you something: if you are the only person who thinks someone is speaking out of context, misusing terminology and hardware terms, and comparing things improperly, what does that say? If I'm willfully manipulating, creating propaganda, or am very confused, then what are you doing? You defend AMD's lies and misrepresentations of the truth like you work for the company. And make sure you realize the importance of the statement you made: subjective, very.

 

 

You continue to compare a software driver (Freesync) to a piece of hardware (the Gsync module). You still call an open industry standard a closed ecosystem, etc. You haven't learned. I would, however, very much like for you to state what you think AMD is lying about. And since you are the one stating it, you have the burden of proof, so please supply that too (we both know you won't/can't).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


  • 3 weeks later...

I'm guessing that I am not the only one who is now wearing out his scroll wheel bypassing the posts of both Notional & BiG StroOnZ. :-P

 

Come on guys, give it up. Neither of you is adding anything positive to this thread anymore, and most of us are scrolling past each and every WALL of text. How can so little be said with so many words? LOL

 

No-one who is interested in the new technology is bothered about what happens if the framerate dips below 40 Hz (or 37 Hz in Nvidia's case) or goes above 144 Hz, simply because no-one in their right mind would want to game outside the current VRR windows. Why would you want to game below 30 FPS?

 

I think the reason you are arguing about the outer ranges is that reviewers have said both technologies produce similar results while staying within the VRR ranges. So within those ranges everything is smooth and THE SAME. You lot can't argue if you are getting THE SAME results within the ranges... so you have decided to search elsewhere for something to argue about.

 

As said in a previous post, I am in the market for a new monitor which supports 2560x1440 @ 144 Hz with VRR technology. My current card, an AMD/ATI 280X, does not support Freesync, so I am looking at both technologies and not being biased either way. However, I have two monitors in mind right now, and I am only concerned with how those monitors perform within the currently supported ranges of both technologies. I am not remotely interested in how they perform outside of those ranges, because I will do my best to make sure they stay inside the ranges... why would I do anything else? I want smooth gameplay FFS.

 

I have previously been a member of both the Green and Red teams over the years, but I am loyal to none. I will buy the setup that offers the best bang for buck I can get at the time I am ready to buy, based on reviews which are still not out yet (TFT Central, please get a move on with the BenQ XL2730Z and Asus ROG Dominator reviews).

 

So with those two monitors in mind, I guess you know which way I am leaning at the moment  ;-)

 

It really doesn't make any sense to spend over $400 more to get the same results.

 

So right now, as it currently stands...

 

RED TEAM WINS  :-)

Kind Regards Always

 

Mayo

