
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

In what cases? There is literally ONE review of ONE monitor and that is "most cases" to you? AMD has no control over the range that manufacturers decide to use and yet somehow you're blaming the ghosting on freesync. Seeing as you claimed that no freesync monitor has multiple inputs, it's safe to say you're also pulling this out of your ass.

At the time I wrote that there were no monitors in the US with FreeSync and multiple inputs.

Also, there are plenty of European reviews.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


@Notional, it's a theory and nothing more, real study of the world reveals much more at work. Nvidia presents no switching costs to most consumers. GSync monitors can include FreeSync if the producer so chooses. Nvidia can't force them not to. You can stream with any GPU. AMD doesn't have a handheld and frankly the Shield is already out of style. CUDA only represents a cost because of software makers, not because of Nvidia. Nvidia produces no vendor lock-ins, which is probably a big reason why they get in no trouble with trade authorities.

It's not difficult to understand the theory, but the theory does not apply here. Why is that so difficult for you to understand? Where are the switching costs that Nvidia itself imposes?

PhysX is IP that Nvidia acquired (from Ageia) and continues to develop. Frankly, it's their right to enforce exclusive use of that tech within their own software. If AMD came up with a competing standard that won over more of the market, Nvidia would respond and improve their own, and competition would proceed as normal. If AMD's GPU performance is on par or better at a lower price and the products represent the same quality in reliability and use, then eventually AMD will win out. Unfortunately for AMD, it has no products which fulfill this requirement. The 290/X still have high failure rates and a decent number of negative reviews on tech websites. They also don't have even half-decent marketing. AMD is the one falling down on the job. Nvidia is just moving along as usual, innovating for the sake of future sales.

Nvidia deserves exclusive use of its innovations and research. We've had IP laws since the U.S. was founded, if not earlier. Without that protection no one would innovate, because the richest would simply steal and sell those ideas and cut the inventors out. We saw exactly this problem in Weimar Germany and in China.

Intel beat IBM into the dirt, bloody and half-dead, fair and square. Intel oversold the P4. Boohoo; AMD did the same with Bulldozer. And before you bring up rigged benchmarks, Intel did not pay anyone to use ICC in Cinebench. Furthermore, ICC is built on the premise that Intel knows the clock counts for every single instruction of every single chip family. Unless AMD and VIA are willing to hand over that information, Intel cannot be expected to optimize code for their architectures. Even if they had provided code paths for newer instructions by querying the CPUs, the generated code would not be optimal for those chips, and that petty lawsuit would have stuck anyway. The Cinebench fiasco is the single biggest legal failure I have ever seen stemming from technological ignorance on the part of the legal parties.
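For anyone unfamiliar with what the compiler-dispatch complaint was actually about, here is a toy sketch of the two ways a runtime dispatcher can choose an optimized code path. This is only an illustration; the `cpu` dict is a stand-in for real CPUID queries (the sketch performs none), and the flag values are made up:

```python
# Two ways a runtime dispatcher can choose an optimized code path.
# The `cpu` dict stands in for real CPUID queries; values are illustrative.
cpu = {"vendor": "AuthenticAMD", "sse4_2": True, "avx": True}

def pick_path_by_vendor(cpu: dict) -> str:
    # Dispatch on the vendor string: non-Intel CPUs fall back to a baseline
    # path even if they support the newer instructions.
    if cpu["vendor"] == "GenuineIntel" and cpu["avx"]:
        return "AVX path"
    return "baseline x86 path"

def pick_path_by_features(cpu: dict) -> str:
    # Dispatch on reported feature flags: any CPU that advertises the
    # instructions gets the faster path.
    if cpu["avx"]:
        return "AVX path"
    if cpu["sse4_2"]:
        return "SSE4.2 path"
    return "baseline x86 path"

print("vendor-based dispatch:  ", pick_path_by_vendor(cpu))
print("feature-based dispatch: ", pick_path_by_features(cpu))
```

That difference (checking the vendor string versus checking the feature flags) is the whole dispute; whether a feature-based path would actually be well tuned for a non-Intel chip is the separate point made above.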

Now, Intel is also no longer managed by Otellini, who orchestrated the exclusivity deals with Dell and others. Krzanich's record is squeaky clean, and that's a long record.

OriginPC is weird, but without hard evidence it's nothing but hearsay. There could have been a service failure on AMD's part poisoning the well for a while. More has been done over less in the corporate world. It doesn't mean Nvidia bought loyalty from anyone.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


At the time I wrote that there were no monitors in the US with FreeSync and multiple inputs.

Also, there are plenty of European reviews.

LOL. I give up. This is pointless.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


You are so full of BS, seriously man:

 

At the time I wrote that there were no monitors in the US with FreeSync and multiple inputs.

Also, there are plenty of European reviews.

 

G-Sync monitors can also have other inputs. It would just require slightly more engineering. There's no Freesync monitor yet with multiple inputs either, so your premise is moot.

 

(...) how about you link me to a Freesync monitor, that does NOT have more than just Displayport inputs?

 

You keep making this claim. You have yet to link to any Freesync monitor, US or non US, that only has displayport connectivity. Why do you keep making the same false statement, when you know it's not true? It is dishonest and pointless.

 

Like I asked: link to any FreeSync monitor that only has DisplayPort. You can't, can you? Moving the goalposts, like suddenly saying US-only, does not make a difference.

 

LOL. I give up. This is pointless.

 

Ugh, I know right?

 

@Notional, it's a theory and nothing more, real study of the world reveals much more at work. Nvidia presents no switching costs to most consumers. GSync monitors can include FreeSync if the producer so chooses. Nvidia can't force them not to. You can stream with any GPU. AMD doesn't have a handheld and frankly the Shield is already out of style. CUDA only represents a cost because of software makers, not because of Nvidia. Nvidia produces no vendor lock-ins, which is probably a big reason why they get in no trouble with trade authorities.

It's not difficult to understand the theory, but the theory does not apply here. Why is that so difficult for you to understand? Where are the switching costs that Nvidia itself imposes?

 

So because you don't understand it, or how it affects markets, it's only a theory? No, it is a proven market mechanic; that's why it's an accepted theory used by companies, etc. It's fine if you choose not to believe it, but that does not make it untrue or not a fact.

I have given several examples of switching barriers if you buy into the ecosystem. I doubt you are this ignorant. I honestly think you choose to ignore it all. Why?

As an academic, is it not more desirable to learn and understand, than to just be right, no matter what? I have said it before, and I'll say it again: Your behaviour is not compatible with that of a proper academic.

 

There is a switching cost with Nvidia IF you bought into the closed ecosystem. If you did not, then of course not, but that scenario is outside of the discussion, as you are NOT in the closed ecosystem as such. The problem occurs when you have several proprietary Nvidia products/technologies, like a graphics card plus a G-Sync monitor.

 

Oh, you know that other companies can just add in Adaptive Sync support? And you know Nvidia allows it? Nvidia can have plenty of limitations in their contracts that we know nothing of. They do when it comes to GameWorks: we know devs are not allowed to optimize or share code with AMD.

 

BenQ's XL2420G has both a Gsync module and a standard scaler. Oddly enough that scaler does not support Adaptive Sync. I WONDER WHY?

It's an expensive and clunky approach that we will hardly see gaining traction in the market.

 

 

PhysX is IP that Nvidia acquired (from Ageia) and continues to develop. Frankly, it's their right to enforce exclusive use of that tech within their own software. If AMD came up with a competing standard that won over more of the market, Nvidia would respond and improve their own, and competition would proceed as normal. If AMD's GPU performance is on par or better at a lower price and the products represent the same quality in reliability and use, then eventually AMD will win out. Unfortunately for AMD, it has no products which fulfill this requirement. The 290/X still have high failure rates and a decent number of negative reviews on tech websites. They also don't have even half-decent marketing. AMD is the one falling down on the job. Nvidia is just moving along as usual, innovating for the sake of future sales.

Nvidia deserves exclusive use of its innovations and research. We've had IP laws since the U.S. was founded, if not earlier. Without that protection no one would innovate, because the richest would simply steal and sell those ideas and cut the inventors out. We saw exactly this problem in Weimar Germany and in China.

Now, Intel is also no longer managed by Otellini, who orchestrated the exclusivity deals with Dell and others. Krzanich's record is squeaky clean, and that's a long record.

 

There is no such thing as "deserve" in a free competitive market. People buy what they want or are "forced" to for whatever reason. Nvidia has every RIGHT to do what they want; no one is disputing this. But exercising that right is still anti-competitive and bad for the consumer, whether you want to acknowledge this or not.

 

Standards are always desirable when several products/companies are involved. Nvidia is against this. With all the format wars we techies have seen just in the last decade, does anyone really still think that proprietary closed ecosystems are a good way to go?

 

Who cares? Intel did something bad, gained market share from it, and was punished. Stating that they deserved their market gain is bonkers when it was outright illegal.

Like I said, criticize the source all you want, I truly understand doing so, but you have to admit, it's fishy as hell.

 

No comment on the GameWorks catastrophe that is Arkham Origins? As in this horrible thing: http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/2 that we know disables AA on AMD cards on top of that?

 

Dealing in facts is preferable, but sometimes the facts are not publicly available, so you look at the general behaviour of a company and either give them the benefit of the doubt or not. Based on all the BS we've seen from Nvidia, I will not give them the benefit of the doubt. Other people can do so if they truly feel like it, for all I care, as long as they take all the BS into consideration.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


@Notional, most of science is a theory, ideas as to why or how things behave. Understanding the theory does not make it the truth. There are plenty of holes in this theory you're citing when it comes to Nvidia's situation. It's a theory many people subscribe to. That does not make it correct. Newton's theory of gravitation was right, until it wasn't. Economists change theories like babies change diapers. You're improperly applying the theory in the first place. Once you buy an Nvidia card, you can keep it. If you buy a Shield after that, you can buy an AMD GPU and keep the Nvidia one! There is not a single switching cost Nvidia imposes. Find me one that Nvidia imposes, not a third party. The monitor and Shield ideas are BS and you know it. The monitor vendors have way more power than Nvidia, and if Nvidia were trying to force ugly contracts they'd be ratted out instantly. BenQ doesn't have FreeSync because, omg, the monitor was developed before DP 1.2a was a thing!

GSync is an approach that allows growth and adaptation (benefits of an FPGA). If a problem comes up in FreeSync/Adaptive Sync, the users are screwed. GSync can be modified on both the hardware and software side. It's the more functional approach.

Edit: sorry, replying on a phone, so I have to go piecemeal.

GameWorks is Nvidia's work. AMD should not be allowed to have it for free. AMD should come up with their own optimization libraries. That will prove whether or not GW actually gimps their cards.

Producers are not beholden to consumers. It's entirely the other way around regardless of the market. If you can't produce something yourself, you are at the mercy of producers, regardless of competition levels.

Yes, proprietary standards are still good because they get the ball rolling. Do you not see how damn slow these standards committees work? VESA is half a decade behind where they should be, as is the FSF. Nvidia was first to the table on dynamic refresh rate control. Nvidia gets the benefits of being first to market with a workable solution. It's good for Nvidia and the consumer. Supporting Adaptive Sync supports planned obsolescence (there is an ASIC on the DP 1.2a PCB responsible for handling AS). GSYNC can be modified ever onward regardless of how many problems may arise.

Intel gained no market share from it. They were already the market leader and AMD gained ground those years. Furthermore by the time the floating point problem was discovered new chips from both companies were rolling onto store shelves. That's hardly the fault of Intel.

Also, did you read that entire ExtremeTech article? That's not Nvidia's fault. The game maker made a choice about refusing code changes. Prove Nvidia bribed them or get over it. If AMD sucks at tessellation, that's their problem.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


LOL. I give up. This is pointless.

This thread became pointless days ago.


 

Your argument is that if nothing uses it, then they are lying. My example is the same thing: if DisplayPort 1.4 comes out and it supports 10K displays, VESA is lying because there are no 10K displays.

 

Here, let me give you a better comparison of what AMD is doing. Since you are giving comparisons that aren't similar.

 

AMD stating the spec of 9-240Hz when their monitors only support 40-144Hz is like AMD saying their R9 390X is going to support DirectX 14. You see what I mean? Yes, eventually there will be a DirectX 14, but it is not planned as of now and we shouldn't expect it for years to come. Nobody is working on DirectX 14 because they are still trying to get DirectX 12 out the door. The same is true with the 9-240Hz interface specification. Sure, the interface supports it, but there are no monitors that will support it for many years. So there is no point saying you use 9-240Hz and comparing it to NVIDIA's 30-144Hz, because 9-240Hz is just not plausible at the moment and isn't going to be plausible for quite some time.

 

 

Yes, speculation. How you choose to interpret Tom's comment is purely subjective, and is by no means official, nor fact. Very odd that Nvidia never claimed that before. I don't doubt it's possible to update G-Sync down the line, but there are no official statements on this, nor on this widened hertz interval. Like I said, your subjective interpretation, and Tom's vague statement, are useless.

 

VESA is responsible for the 9-240Hz interval, just like they are responsible for 8K support, etc. FreeSync just supports the full extent of the Adaptive Sync standard. It is ignorant to criticize AMD for something VESA is claiming, especially considering Nvidia is a member of VESA.

 

Either way, what is your point? That AMD are liars for claiming something, that you then credit Nvidia for extremely vaguely claiming as well? And you say my glasses are fanboy tainted. Seriously.

 

There's no way to choose how to interpret Tom's comment, and it's not subjective. Tom said what he said, end of story. It might not be "officially stated" with legitimate numbers but it is very close to a fact because the engineer himself is stating this. To consider Tom's statement useless might be one thing, but I also posted Malventano's statements along with NVIDIA's own statements in this thread backing up the very so called, "useless" statements that Tom made. 

 

It's not ignorant to criticize AMD for putting 9-240Hz in their advertising against NVIDIA when their monitors only support 40-144Hz maximum; it's anything but ignorant. It would be considered normal to find it peculiar for a company to use such a baseless spec in its advertising when comparing to NVIDIA, when its actual monitors only support 40-144Hz. VESA just made a claim based on the interface spec, which has nothing to do with AMD promoting FreeSync; remember, Adaptive-Sync is from VESA, FreeSync is from AMD. NVIDIA could support Adaptive-Sync at any point if they wanted to. It is not proprietary.

 

If you don't get my point, then I'm afraid you aren't really reading very closely. Or you do understand what I'm saying but choose to ignore the facts because it's AMD. AMD is stretching the truth about what their FreeSync monitors are capable of. Me telling you that NVIDIA is capable of doing basically the same specs, doesn't have anything to do with what AMD is doing. All I'm trying to say is NVIDIA could easily have promoted G-Sync as being able to do 1-240Hz if they wanted to. But they know better, there are no monitors available that would allow such a wide range yet. So there is no point saying, look our technology is capable of doing this, when it won't actually be available for some time to come (which is why you do see them making off hand comments about the panel's limitations not being G-Sync limitations). However, this is not the case with AMD, they are touting the 9-240Hz interface spec, but yet are only offering products with 40-144Hz ranges. Which is basically lying. So I'm not sure why you choose to ignore that or why anyone would choose to ignore that fact. But like I stated before, I guess it's okay because AMD is doing it but God forbid NVIDIA did it, it would be the end of the world.
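For anyone who wants to see what those Hz ranges actually mean, here is a quick back-of-the-envelope sketch converting them into frame times. This is just illustrative arithmetic, not anything from AMD or NVIDIA; the ranges are the ones quoted in this thread:

```python
# Rough arithmetic: a refresh-rate range translates into a window of frame
# times the display can sync to (frame time in ms = 1000 / Hz).
ranges = {
    "Adaptive-Sync interface spec (VESA)": (9, 240),
    "Current FreeSync monitors (as reviewed)": (40, 144),
    "G-Sync as marketed": (30, 144),
}

for name, (low_hz, high_hz) in ranges.items():
    slowest_ms = 1000.0 / low_hz    # longest frame time still synced
    fastest_ms = 1000.0 / high_hz   # shortest frame time still synced
    print(f"{name}: {low_hz}-{high_hz} Hz "
          f"-> frame times of {fastest_ms:.2f} ms to {slowest_ms:.2f} ms")

# e.g. 144 Hz works out to ~6.94 ms per frame (the figure AnandTech quotes
# later in the thread); 40 Hz is 25 ms, 30 Hz is ~33.3 ms, 9 Hz is ~111 ms.
```

Nothing here settles the argument either way; it just puts the numbers being thrown around into the same units.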


I've never used a FreeSync monitor yet, but I have friends who said they do experience stuttering and ghosting. I personally have a G-Sync monitor (got it for a really good deal), and my friend prefers it over his.


I'm willing to believe that G-Sync has better anti-ghosting capabilities than your average FreeSync panel. 

I saw PCper's video earlier though, and you can't draw many conclusions from that. The ROG and the FreeSync monitors use different panels. The only way to really measure the difference between just G-Sync or a FreeSync monitor would be to have the exact same panel in two monitors and the only difference would be FreeSync or a G-Sync module. 

I'm not surprised that the ROG performs very well. It is a really good monitor, but you also pay for that quality.

 

Not having IPS, G-Sync or high refresh rate monitors myself, all this does make me more excited for the things I'll see in my next PC build :)


 

 

 

Here, let me give you a better comparison of what AMD is doing. Since you are giving comparisons that aren't similar.

 

AMD stating the spec of 9-240Hz when their monitors only support 40-144Hz is like AMD saying their R9 390X is going to support DirectX 14. You see what I mean? Yes, eventually there will be a DirectX 14, but it is not planned as of now and we shouldn't expect it for years to come. Nobody is working on DirectX 14 because they are still trying to get DirectX 12 out the door. The same is true with the 9-240Hz interface specification. Sure, the interface supports it, but there are no monitors that will support it for many years. So there is no point saying you use 9-240Hz and comparing it to NVIDIA's 30-144Hz, because 9-240Hz is just not plausible at the moment and isn't going to be plausible for quite some time.

 

 

There's no way to choose how to interpret Tom's comment, and it's not subjective. Tom said what he said, end of story. It might not be "officially stated" with legitimate numbers but it is very close to a fact because the engineer himself is stating this. To consider Tom's statement useless might be one thing, but I also posted Malventano's statements along with NVIDIA's own statements in this thread backing up the very so called, "useless" statements that Tom made. 

 

It's not ignorant to criticize AMD for putting 9-240Hz in their advertising against NVIDIA when their monitors only support 40-144Hz maximum; it's anything but ignorant. It would be considered normal to find it peculiar for a company to use such a baseless spec in its advertising when comparing to NVIDIA, when its actual monitors only support 40-144Hz. VESA just made a claim based on the interface spec, which has nothing to do with AMD promoting FreeSync; remember, Adaptive-Sync is from VESA, FreeSync is from AMD. NVIDIA could support Adaptive-Sync at any point if they wanted to. It is not proprietary.

 

If you don't get my point, then I'm afraid you aren't really reading very closely. Or you do understand what I'm saying but choose to ignore the facts because it's AMD. AMD is stretching the truth about what their FreeSync monitors are capable of. Me telling you that NVIDIA is capable of doing basically the same specs, doesn't have anything to do with what AMD is doing. All I'm trying to say is NVIDIA could easily have promoted G-Sync as being able to do 1-240Hz if they wanted to. But they know better, there are no monitors available that would allow such a wide range yet. So there is no point saying, look our technology is capable of doing this, when it won't actually be available for some time to come (which is why you do see them making off hand comments about the panel's limitations not being G-Sync limitations). However, this is not the case with AMD, they are touting the 9-240Hz interface spec, but yet are only offering products with 40-144Hz ranges. Which is basically lying. So I'm not sure why you choose to ignore that or why anyone would choose to ignore that fact. But like I stated before, I guess it's okay because AMD is doing it but God forbid NVIDIA did it, it would be the end of the world.

 

I really don't get why this matters to you, no matter how hard I read through what you've had to say. AMD doesn't sell monitors or scalers; they sell processing units. So how can they take the blame for what other manufacturers produce?


I really don't get why this matters to you, no matter how hard I read through what you've had to say. AMD doesn't sell monitors or scalers; they sell processing units. So how can they take the blame for what other manufacturers produce?

Same reason Intel should be blamed for "5%" performance gains each generation: it's easier to blame the bigger companies rather than the small ones responsible (the problem is software).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Me telling you that NVIDIA is capable of doing basically the same specs, doesn't have anything to do with what AMD is doing. All I'm trying to say is NVIDIA could easily have promoted G-Sync as being able to do 1-240Hz if they wanted to. But they know better, there are no monitors available that would allow such a wide range yet. So there is no point saying, look our technology is capable of doing this, when it won't actually be available for some time to come (which is why you do see them making off hand comments about the panel's limitations not being G-Sync limitations).

 

Slightly off-topic maybe, but I have the feeling that Nvidia does that more often. Example that comes to mind is the memory bus on their video cards. AMD will give you a whopping 512-bit bus nowadays, whereas the Titan X gives you only a 384-bit. I'm thinking they could make it higher, but it won't necessarily give you an advantage in real-world scenarios. But then again I'm not knowledgeable enough about this to make a great assessment of it.


 

I really don't get why this matters to you, no matter how hard I read through what you've had to say. AMD doesn't sell monitors or scalers; they sell processing units. So how can they take the blame for what other manufacturers produce?

 

Because they are accompanying the FreeSync name with 9-240Hz, when it's nothing more than an Adaptive-Sync interface specification. Their FreeSync monitors don't support 9-240Hz but 40-144Hz. So what's the freaking point in saying we are better than NVIDIA because of our 9-240Hz range when they don't have any monitors capable of such a range either? That's why it matters. 9-240Hz is an interface specification from VESA for Adaptive-Sync; it has nothing to do with AMD.

 

 

Slightly off-topic maybe, but I have the feeling that Nvidia does that more often. Example that comes to mind is the memory bus on their video cards. AMD will give you a whopping 512-bit bus nowadays, whereas the Titan X gives you only a 384-bit. I'm thinking they could make it higher, but it won't necessarily give you an advantage in real-world scenarios. But then again I'm not knowledgeable enough about this to make a great assessment of it.

 

I'm not sure how any of that correlates to what I'm saying. A 512-bit bus is available right now, as is a 384-bit bus. It would be like AMD saying we offer a 1024-bit bus on our 390X, and then it comes out and it's only a 512-bit bus. But they put that spec in there because the 490X was actually going to have a 1024-bit bus, even though it won't be out for another 2-3 years.


Slightly off-topic maybe, but I have the feeling that Nvidia does that more often. Example that comes to mind is the memory bus on their video cards. AMD will give you a whopping 512-bit bus nowadays, whereas the Titan X gives you only a 384-bit. I'm thinking they could make it higher, but it won't necessarily give you an advantage in real-world scenarios. But then again I'm not knowledgeable enough about this to make a great assessment of it.

For gaming it would make pretty much no difference. Games are bottlenecked on the cores and dedicated hardware. Scientific compute would benefit (lots of simple ops on tons of data moving back and forth), which is why PCIe 4.0 is coming out on Skylake-E only, to go into the server market first.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Slightly off-topic maybe, but I have the feeling that Nvidia does that more often. Example that comes to mind is the memory bus on their video cards. AMD will give you a whopping 512-bit bus nowadays, whereas the Titan X gives you only a 384-bit. I'm thinking they could make it higher, but it won't necessarily give you an advantage in real-world scenarios. But then again I'm not knowledgeable enough about this to make a great assessment of it.

Maxwell juiced up the L2 cache, which helps mitigate the dependency on a wide interface and conserves power and cost. Although, the wider the interface, the better.
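To put rough numbers on the bus-width point: peak memory bandwidth is roughly bus width (in bits) divided by 8, times the effective memory data rate. A quick sketch with approximate figures for the two cards mentioned (the data rates are the commonly quoted reference specs, so treat the results as ballpark):

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8) * effective
# data rate in GT/s -> GB/s. Figures are approximate reference specs.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

cards = {
    "R9 290X (512-bit, ~5 GT/s GDDR5)": (512, 5.0),
    "Titan X (384-bit, ~7 GT/s GDDR5)": (384, 7.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(bus, rate):.0f} GB/s")

# ~320 GB/s vs ~336 GB/s: the narrower bus ends up with slightly more
# bandwidth because of the faster memory, which is why bus width alone
# doesn't tell you much.
```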


Same reason Intel should be blamed for "5%" performance gains each generation: it's easier to blame the bigger companies rather than the small ones responsible (the problem is software).

I posed some questions earlier that I'm still confused by; maybe you can help me with a few ideas on them:

 

The way I understand it, the G-Sync module replaces the monitor scaler. So Nvidia only needs "tuning" information for a single scaler on their graphics card side of things (driver updates). Does that also mean G-Sync modules themselves get tuned for the individual panel they connect to, by Nvidia(?), when the manufacturers make a G-Sync monitor?

 

So does AMD need to adjust for each scaler that manufacturers provide for their "FreeSync enabled" monitors, that their graphics cards will connect to (driver implementation)? Or are the scalers responsible for the panel adjustments for each implementation of the Adaptive Sync functionality, i.e. the manufacturer's adjustment for their monitor?


 

 

 

Because they are accompanying the FreeSync name with 9-240Hz, when it's nothing more than an Adaptive-Sync interface specification. Their FreeSync monitors don't support 9-240Hz but 40-144Hz. So what's the freaking point in saying we are better than NVIDIA because of our 9-240Hz range when they don't have any monitors capable of such a range either? That's why it matters. 9-240Hz is an interface specification from VESA for Adaptive-Sync; it has nothing to do with AMD.

 

 

 

I'm not sure how any of that correlates to what I'm saying. A 512-bit bus is available right now, as is a 384-bit bus. It would be like AMD saying we offer a 1024-bit bus on our 390X, and then it comes out and it's only a 512-bit bus. But they put that spec in there because the 490X was actually going to have a 1024-bit bus, even though it won't be out for another 2-3 years.

 

Do you mean "Their," as in what is AMD's? AMD may have a partnership with manufacturers to get Adaptive Sync pushed out as fast as possible, but how can they be held responsible for what gets made by those partners?

 

AMD has no ownership of FreeSync-supported monitors. For them, stating they support 9-240Hz on their graphics card/driver is no different from what VESA supports on DisplayPort, is it?


@Notional, most of science is a theory, ideas as to why or how things behave. Understanding the theory does not make it the truth. There are plenty of holes in this theory you're citing when it comes to Nvidia's situation. It's a theory many people subscribe to. That does not make it correct. Newton's theory of gravitation was right, until it wasn't. Economists change theories like babies change diapers. You're improperly applying the theory in the first place. Once you buy an Nvidia card, you can keep it. If you buy a Shield after that, you can buy an AMD GPU and keep the Nvidia one! There is not a single switching cost Nvidia imposes. Find me one that Nvidia imposes, not a third party. The monitor and Shield ideas are BS and you know it. The monitor vendors have way more power than Nvidia, and if Nvidia were trying to force ugly contracts they'd be ratted out instantly. BenQ doesn't have FreeSync because, omg, the monitor was developed before DP 1.2a was a thing!

GSync is an approach that allows growth and adaptation (benefits of an FPGA). If a problem comes up in FreeSync/Adaptive Sync, the users are screwed. GSync can be modified on both the hardware and software side. It's the more functional approach.

Edit: sorry, replying on a phone, so I have to go piecemeal.

GameWorks is Nvidia's work. AMD should not be allowed to have it for free. AMD should come up with their own optimization libraries. That will prove whether or not GW actually gimps their cards.

Producers are not beholden to consumers. It's entirely the other way around regardless of the market. If you can't produce something yourself, you are at the mercy of producers, regardless of competition levels.

Yes, proprietary standards are still good because they get the ball rolling. Do you not see how damn slow these standards committees work? VESA is half a decade behind where they should be, as is the FSF. Nvidia was first to the table on dynamic refresh rate control. Nvidia gets the benefits of being first to market with a workable solution. It's good for Nvidia and the consumer. Supporting Adaptive Sync supports planned obsolescence (there is an ASIC on the DP 1.2a PCB responsible for handling AS). GSYNC can be modified ever onward regardless of how many problems may arise.

Intel gained no market share from it. They were already the market leader and AMD gained ground those years. Furthermore by the time the floating point problem was discovered new chips from both companies were rolling onto store shelves. That's hardly the fault of Intel.

Also, did you read that entire ExtremeTech article? That's not Nvidia's fault. The game maker made a choice about refusing code changes. Prove Nvidia bribed them or get over it. If AMD sucks at tessellation, that's their problem.

 

Choose not to acknowledge the theory all you want. That is your problem not mine.

Market research is very much research of human behaviour. Human behaviour, and thus markets, change a lot as new tech and opportunities come out. Of course new theories will be created.

Saying shit like "Economists change theories like babies change diapers" is snobbish and disrespectful. It's like saying IT theories change like hookers change condoms, because new programming languages and APIs pop up all the time. Does that not sound completely retarded to you? This kind of disrespect for acknowledged academia, well founded at that, just supports my claim that you are not academically minded. The arrogance!

 


 

What are you even talking about? AMD getting it for free? GameWorks is a software API for game devs. If I buy a game infested with GameWorks, I pay for the use of it. Why am I not allowed to use it then? Again, if Nvidia licensed it out to AMD, more game devs would use it without getting paid for it in underhanded, anti-competitive deals (see Arkham Origins again).

How can AMD come up with their own optimization libraries? They are legally not allowed any access to GameWorks, nor any communication with the devs, as the devs are bound hand and foot by draconian NDAs.

 


 

I still don't buy the idea that the producer is king. Not one bit. The entire entertainment industry is unimportant and can be replaced with many other things. Like I said, only essential things could be described as that in a monopoly, like medicine or essential assets in a company.

 

But it all depends on supply and demand. Where is it biggest? Don't tell me you believe the graphics card market is not supply driven.

 

Here, let me give you a better comparison of what AMD is doing. Since you are giving comparisons that aren't similar.

 

AMD stating the spec of 9-240Hz when their monitors only support 40-144Hz is like AMD saying their R9 390X is going to support DirectX 14. You see what I mean? Yes, eventually there will be a DirectX 14, but it is not planned as of now and we shouldn't expect it for years to come. Nobody is working on DirectX 14 because they are still trying to get DirectX 12 out the door. The same is true with the 9-240Hz interface specification. Sure, the interface supports it, but there are no monitors that will support it for many years. So there is no point saying you use 9-240Hz and comparing it to NVIDIA's 30-144Hz, because 9-240Hz is just not plausible at the moment and isn't going to be plausible for quite some time.

 

(...)

 

It's not ignorant to criticize AMD for putting 9-240Hz in their advertising against NVIDIA when their monitors only support 40-144Hz maximum; it's anything but ignorant. It would be considered normal to find it peculiar for a company to use such a baseless spec in its advertising when comparing to NVIDIA, when its actual monitors only support 40-144Hz. VESA just made a claim based on the interface spec, which has nothing to do with AMD promoting FreeSync; remember, Adaptive-Sync is from VESA, FreeSync is from AMD. NVIDIA could support Adaptive-Sync at any point if they wanted to. It is not proprietary.

 

What are you even talking about? Freesync utilizes Adaptive Sync, so of course, Freesync is capable of supporting the full interval range of Adaptive Sync (9-240hz). How is that dishonest?

"When their monitors"? Who's? AMD doesn't have any monitors, and has no influence on what scalers and panels are used by the monitor vendors. So call VESA liars instead, since "their" adaptive sync monitors only support 40-144hz. You are being silly to say it nicely.

 

No one has claimed otherwise than what you state in your last two sentences. On the contrary, I have defined both FreeSync and Adaptive Sync multiple times in here.

 

There's no way to choose how to interpret Tom's comment, and it's not subjective. Tom said what he said, end of story. It might not be "officially stated" with legitimate numbers but it is very close to a fact because the engineer himself is stating this. To consider Tom's statement useless might be one thing, but I also posted Malventano's statements along with NVIDIA's own statements in this thread backing up the very so called, "useless" statements that Tom made. 

 

 

If you don't get my point, then I'm afraid you aren't really reading very closely. Or you do understand what I'm saying but choose to ignore the facts because it's AMD. AMD is stretching the truth about what their FreeSync monitors are capable of. Me telling you that NVIDIA is capable of doing basically the same specs, doesn't have anything to do with what AMD is doing. All I'm trying to say is NVIDIA could easily have promoted G-Sync as being able to do 1-240Hz if they wanted to. But they know better, there are no monitors available that would allow such a wide range yet. So there is no point saying, look our technology is capable of doing this, when it won't actually be available for some time to come (which is why you do see them making off hand comments about the panel's limitations not being G-Sync limitations). However, this is not the case with AMD, they are touting the 9-240Hz interface spec, but yet are only offering products with 40-144Hz ranges. Which is basically lying. So I'm not sure why you choose to ignore that or why anyone would choose to ignore that fact. But like I stated before, I guess it's okay because AMD is doing it but God forbid NVIDIA did it, it would be the end of the world.

 

 

Your interpretations are very much subjective. What did Tom mean? "Close to the fact"? What is that? 1 hz off? 10 hz? 40 hz? Did he mean above or below Gsync's current announced hz interval? Or both? None of this is defined or stated clearly in any way, so they are all your subjective interpretations.

 

Malventano is an Nvidia fanboy and is in no way a credible source. Like I said, I don't necessarily doubt that G-Sync could support it, only that there is no tangible proof in any way. Extremely vague comments from Tom are not proof of anything.

 

AMD has only stated what FreeSync and Adaptive Sync are capable of, now and in the future. Stating that your DisplayPort standard supports 8K with no monitors on the market would be equally dishonest then. All Adaptive Sync monitors clearly state what range they support. If not, that is on the monitor vendors, not AMD, as they have no say in the matter.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


What are you even talking about? Freesync utilizes Adaptive Sync, so of course, Freesync is capable of supporting the full interval range of Adaptive Sync (9-240hz). How is that dishonest?

"When their monitors"? Who's? AMD doesn't have any monitors, and has no influence on what scalers and panels are used by the monitor vendors. So call VESA liars instead, since "their" adaptive sync monitors only support 40-144hz. You are being silly to say it nicely.

 

No one has claimed otherwise than what you state in your last two sentences. On the contrary, I have defined both FreeSync and Adaptive Sync multiple times in here.

 

 

Your interpretations are very much subjective. What did Tom mean? "Close to the fact"? What is that? 1 hz off? 10 hz? 40 hz? Did he mean above or below Gsync's current announced hz interval? Or both? None of this is defined or stated clearly in any way, so they are all your subjective interpretations.

 

Malventano is an Nvidia fanboy and is in no way a credible source. Like I said, I don't necessarily doubt that G-Sync could support it, only that there is no tangible proof in any way. Extremely vague comments from Tom are not proof of anything.

 

AMD has only stated what FreeSync and Adaptive Sync are capable of, now and in the future. Stating that your DisplayPort standard supports 8K with no monitors on the market would be equally dishonest then. All Adaptive Sync monitors clearly state what range they support. If not, that is on the monitor vendors, not AMD, as they have no say in the matter.

 

Because there are no freaking monitors that support 9-240Hz, and there won't be for many freaking years. What is so hard to understand about that concept? Why can you not wrap your head around that?

 

AMD is putting FreeSync side by side on the monitors available with Adaptive-Sync. The monitors are not just being released as Adaptive-Sync capable monitors but also FreeSync monitors:

 

[Image: retail monitor box bearing the AMD FreeSync logo]

 

Notice the huge FreeSync trademark symbol? And you want to act like AMD isn't putting their name on any monitors. LMAO. If I'm being silly, what are you being? Bonkers? VESA isn't promoting FreeSync as better than G-Sync, AMD is:

 

[Image: AMD marketing slide comparing FreeSync to G-Sync]

 

Does it matter, specifically? No it doesn't. He said G-Sync is capable of going above and below the panel limits. So what's the difference what the actual number is? We know that means below 30Hz and above 144Hz. 

 

I didn't just post Tom's comments; I also posted a separate interview in which NVIDIA states the following:

 

The upper bound is limited by the panel/TCON at this point, with the only G-Sync monitor available today going as high as 6.94ms (144Hz). NVIDIA made it a point to mention that the 144Hz limitation isn’t a G-Sync limit, but a panel limit.

 

 

http://www.anandtech.com/show/7582/nvidia-gsync-review

 

Coupled with Tom's comments, we now know for sure that, at the upper end, over 144Hz is not a G-Sync limitation. And you are only choosing to ignore Malventano's comments because he's apparently a "fanboy," as if that somehow makes him any less credible, even though he works for a highly reputable tech review website that has no biases, because they review both products and aren't paid to sway anything.

 

The problem is, I guess, that I have to repeat myself because you choose to ignore the facts. They aren't just stating what the Adaptive-Sync spec is; in their advertising, they are making it look like they offer a wider range than NVIDIA, when in actuality they don't; they offer less of a range than NVIDIA. The available FreeSync panels only offer 40-144Hz, while G-Sync offers 30-144Hz. Regardless of this fact, it's not 9-240Hz. So there is no point in releasing the above chart, because it's useless and far from the truth. And you keep saying it's on the monitor vendors; what don't you get about AMD painting their name on these products?


Do you mean "Their," as in what is AMD's? AMD may have a partnership with manufacturers to get Adaptive Sync pushed out as fast as possible, but how can they be held responsible for what gets made by those partners?

 

AMD has no ownership of FreeSync-supported monitors. For them, stating they support 9-240Hz on their graphics card/driver is no different from what VESA supports on DisplayPort, is it?

 

You know what I meant, and I corrected it as soon as I posted it. So there's no point in arguing semantics right now when you knew exactly what I was saying. 

 

Anyway, they get held responsible because of their advertising. They are the ones trying to downplay G-Sync as inferior in their advertising:

 

[Image: AMD FreeSync vs. G-Sync comparison chart]

 

It is different, because AMD is not VESA. AMD is a company with competition; VESA is an organization. AMD is putting FreeSync on the monitors with Adaptive-Sync, so saying they have no ownership is just stupid:

 

[Image: retail monitor box bearing the AMD FreeSync logo]


The cards miners did use were not cool cards, though; they were reference 290Xs, and they ran really hot and loud while consuming large amounts of juice. I would think continuous power draw and 100% usage would kill a card much quicker than otherwise. Not just the GPU but the memory also. I don't really know, but I am sure that the memory isn't meant to be under full load for extended periods of time.

 

 

100% usage forever...I would say that probably has some effects. 

 

Also, the fans are the larger issue with this: the fans seize up because they are spinning at max RPM for literally the whole time (you do realize some cards have a lifetime warranty, right?) and stop working, and then the GPUs overheat.

 

I agree that long term it will reduce the life of the card; I just don't think it will reduce its life quickly enough to affect RMA statistics. I think the only way they could burn out an otherwise healthy card that quickly would be to bypass the thermal limiter on the GPU.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


You know what I meant, and I corrected it as soon as I posted it. So there's no point in arguing semantics right now when you knew exactly what I was saying. 

 

Anyway, they get held responsible because of their advertising. They are the ones trying to downplay G-Sync as inferior in their advertising:

 

 

 

It is different, because AMD is not VESA. AMD is a company with competition; VESA is an organization. AMD is putting FreeSync on the monitors with Adaptive-Sync, so saying they have no ownership is just stupid:

 

 

I didn't mean it offensively; I understand what you're saying, but your reason is what I'm trying to find out. I really hope I don't have to tell you what the original title of this thread was and why it needed to be changed, do I? There are G-Sync monitors that are only 30-60Hz as well; wouldn't both companies require the same labeling treatment then? If VESA doesn't compete, what about HDMI? The FreeSync label can also be considered a certification; is it wrong for monitor manufacturers and stores to label their qualifications to market their products?


BiG StroOnZ

 

I'm not sure I understand what you mean, because I'm not sure you understand what you are even talking about.

 

FreeSync, as you even wrote yourself, is the implementation of Adaptive Sync, right? So FreeSync can support everything Adaptive Sync can, right? So why is it dishonest for AMD to write that their FreeSync drivers support the full Hz interval of Adaptive Sync, the hardware standard? Again, AMD has no say over scaler, panel or monitor vendors, nor anything that has to do with them. FreeSync is all about support/utilization of the Adaptive Sync standard. You even wrote it yourself!?

 

AMD has a royalty-free review program, so monitor vendors can use the FreeSync brand and logo if they so choose. Asus has chosen not to so far. AMD is not responsible for what other companies do or do not do.

 

Why would VESA promote Freesync? Makes no sense. I'm not entirely sure you understand what Adaptive Sync is as a standard.

 

It's irrelevant what current monitors support, when the standard Adaptive Sync supports 9-240hz, and Freesync can utilize all of that in the driver and the hardware.

 

And what's up with the edited picture? That's just some odd biased fanboyism, come on. Freesync implementation? Didn't you just say yourself, that the hardware standard was called Adaptive Sync? Then how is the Freesync implementation only 40-144hz, when it is a hardware implementation on the monitor? You clearly don't seem to quite understand what is what, and what controls what?

 

But let's go through this image of yours:

 

[Image: AMD FreeSync vs. G-Sync comparison chart]

 

 

  • The fee has nothing to do with AIB vendors or graphics cards. It's about monitor vendors having to pay licensing fees and/or buy the proprietary Gsync module.
  • Adaptive Sync only works over DisplayPort. I doubt extra input methods have any influence on input lag. Either way, it's good for a monitor to support HDMI, for instance, so you can connect consoles, laptops, etc. Adaptive Sync/FreeSync is not just for gaming, but also power savings.
  • Adaptive Sync supports 9-240 Hz. FreeSync supports the full range as well. The panels or scalers in the monitors might not; that has nothing to do with AS or FS as standards or implementations. FreeSync is not implemented in any monitor; it is an AMD driver. Only Adaptive Sync is implemented in monitors. Go yell at VESA for making a standard that supports intervals and resolutions that don't exist yet.
  • Prove that G-Sync can support 1-240 Hz. Tom's quote does not in any way prove that whatsoever, and you know it.
  • G-Sync has a performance penalty when going above the max Hz of the monitor. It forces V-Sync, which is bad for competitive gamers, and introduces massive latency: //ADDENDUM: Richard Huddy speaks about how G-Sync syncs communication after each frame, which made them observe a 1-1.5% performance hit. Adaptive Sync is one-directional, from graphics card to monitor; only the Hz interval goes the other way, at plug-and-play initialization.

[Image: CS:GO input-lag chart]
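Since the in-range/out-of-range behaviour keeps coming up, here is a toy model of the decision a variable-refresh source has to make when a frame's timing falls outside the monitor's reported range. This is a simplified illustration only, not AMD's or NVIDIA's actual algorithm; the 40-144 Hz default range is just the one quoted in this thread:

```python
# Toy model of variable refresh: the monitor advertises a min/max refresh
# range, and the source decides what to do when a frame falls outside it.
# Simplified illustration only -- not either vendor's real driver/module.
def present(frame_time_ms: float, min_hz: float = 40.0, max_hz: float = 144.0) -> str:
    max_frame_ms = 1000.0 / min_hz  # slowest the panel can refresh (25 ms at 40 Hz)
    min_frame_ms = 1000.0 / max_hz  # fastest the panel can refresh (~6.94 ms at 144 Hz)

    if frame_time_ms > max_frame_ms:
        # Frame arrives too slowly for the panel: re-send the previous frame
        # so the panel never drops below its minimum refresh rate.
        return "below range: repeat last frame, then show the new one"
    if frame_time_ms < min_frame_ms:
        # Frame arrives too quickly: either wait for the next refresh
        # (v-sync-like, adds latency) or present immediately and tear.
        return "above range: wait for refresh (latency) or tear"
    return "in range: refresh exactly when the frame is ready"

for ms in (5.0, 10.0, 30.0):
    print(f"{ms:5.1f} ms frame -> {present(ms)}")
```

The disagreement above is essentially about which component (G-Sync module, Adaptive Sync scaler, or driver) owns those out-of-range decisions, and what the real supported ranges are.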

 


 

In your own quote, Tom does not state that G-Sync can go below 30Hz, just that it can go above 144Hz, without stating the limit. And YOU conclude that G-Sync can now support 1-240Hz?! Am I missing something here? Can you not understand why I question your subjective interpretations, when your Tom quote in no way matches your statement?

 

AMD has ownership of the FreeSync implementation. That is how the graphics card behaves out of bounds of the Hz interval; it has nothing to do with Adaptive Sync. It is also how the graphics card knows how to utilize Adaptive Sync.

 

Monitors clearly state how high they can go. It makes no sense to criticize AMD for stating what they support.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


It really does amaze me sometimes how easily some people suck up the marketing BS and consume it like it's the only information in the world.

 

People, you are being lied to from all sides. The only thing regurgitating marketing BS from manufacturers proves is that you are gullible.

 

Take in the reviews and make up your own mind; live with your choices. You neither need to nor can convince others to accept marketing hype. We will each look at the information as it's presented and make up our own minds. Share your opinions, but don't let yourself be fooled by marketing BS.

 

Also if you can't make a post without making a degrading comment,  you are a fanboy who has missed the point.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


 

I didn't mean it offensively; I understand what you're saying, but your reason is what I'm trying to find out. I really hope I don't have to tell you what the original title of this thread was and why it needed to be changed, do I? There are G-Sync monitors that are only 30-60Hz as well; wouldn't both companies require the same labeling treatment then? If VESA doesn't compete, what about HDMI? The FreeSync label can also be considered a certification; is it wrong for monitor manufacturers and stores to label their qualifications to market their products?

 

I don't understand what you mean by my reason. Regardless, I made it the title because that is what the title of the article was, so personally I don't think it had to be changed; was it kind of sensationalist? Sure, but that's what grabs a potential reader's attention. I believe you are talking about the 4K panel, and the only reason it only goes up to 60Hz is that there is no panel or interface yet capable of displaying a 4K signal over 60Hz. But mind you, that is still inside their range of 30-144Hz, so I don't see how that is a problem. NVIDIA's range is 30-144Hz, whereas AMD's is being stated as 9-240Hz, but the panels that exist are not really close to that range. Well, HDMI is offered on products along with DisplayPort, so it's not like VESA is trying to make one product seem better than the other, whereas with G-Sync or FreeSync you can only use one, but not both. Meanwhile, everyone has a video card that has HDMI along with DisplayPort, so it's not like they are able to force a user to only use one with advertising, since both can be offered on the same device.

 

No, it's not wrong, but it still means AMD is promoting one of their products, which people here seem to think is not the case.

 

 

BiG StroOnZ

 

I'm not sure I understand what you mean, because I'm not sure you understand what you are even talking about.

 

FreeSync, as you even wrote yourself, is the implementation of Adaptive Sync, right? So FreeSync can support everything Adaptive Sync can, right? So why is it dishonest for AMD to write that their FreeSync drivers support the full Hz interval of Adaptive Sync, the hardware standard? Again, AMD has no say over scaler, panel or monitor vendors, nor anything that has to do with them. FreeSync is all about support/utilization of the Adaptive Sync standard. You even wrote it yourself!?

 

AMD has a royalty-free review program, so monitor vendors can use the FreeSync brand and logo if they so choose. Asus has chosen not to so far. AMD is not responsible for what other companies do or do not do.

 

Why would VESA promote Freesync? Makes no sense. I'm not entirely sure you understand what Adaptive Sync is as a standard.

 

It's irrelevant what current monitors support, when the standard Adaptive Sync supports 9-240hz, and Freesync can utilize all of that in the driver and the hardware.

 

And what's up with the edited picture? That's just some odd biased fanboyism, come on. Freesync implementation? Didn't you just say yourself, that the hardware standard was called Adaptive Sync? Then how is the Freesync implementation only 40-144hz, when it is a hardware implementation on the monitor? You clearly don't seem to quite understand what is what, and what controls what?

 

But let's go through this image of yours:

 

[Image: AMD FreeSync vs. G-Sync comparison chart]

 

 

  • The fee has nothing to do with AIB vendors or graphics cards. It's about monitor vendors having to pay licensing fees and/or buy the proprietary Gsync module.
  • Adaptive Sync only works over DisplayPort. I doubt extra input methods have any influence on input lag. Either way, it's good for a monitor to support HDMI, for instance, so you can connect consoles, laptops, etc. Adaptive Sync/FreeSync is not just for gaming, but also power savings.
  • Adaptive Sync supports 9-240 Hz. FreeSync supports the full range as well. The panels or scalers in the monitors might not; that has nothing to do with AS or FS as standards or implementations. FreeSync is not implemented in any monitor; it is an AMD driver. Only Adaptive Sync is implemented in monitors. Go yell at VESA for making a standard that supports intervals and resolutions that don't exist yet.
  • Prove that Gsync can support from 1-240 hz. Tom's quote does not in any way prove that what so ever, and you know it.
  • Gsync has a performance penalty when going above the max hz of the monitor. It forces VSync, which is bad for competitive gamers, and introduces massive latency:

lag-csgo.png

 


 

In your own quote, Tom does not state that G-Sync can go below 30Hz, just that it can go above 144Hz, without stating the limit. And YOU conclude that G-Sync can now support 1-240Hz?! Am I missing something here? Can you not understand why I question your subjective interpretations, when your Tom quote in no way matches your statement?

 

AMD has ownership of the FreeSync implementation: it determines how the graphics card behaves outside the Hz interval, and how the graphics card utilizes Adaptive Sync in the first place. That behaviour has nothing to do with Adaptive Sync itself.
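To make that division of labour concrete, here's a purely hypothetical sketch (not AMD's or NVIDIA's actual driver code) of the kind of decision the GPU driver owns when the game's frame rate lands outside the panel's reported VRR window:

```python
# Hypothetical illustration of driver-side behaviour outside the panel's
# variable-refresh window (the range the monitor reports, e.g. 40-144Hz).
# Not AMD's or NVIDIA's actual implementation.

def present_policy(frame_rate_hz, panel_min_hz, panel_max_hz):
    if frame_rate_hz > panel_max_hz:
        # Above the window: wait for the next refresh (VSync-like) or tear.
        return "cap/wait or tear"
    if frame_rate_hz < panel_min_hz:
        # Below the window: fall back to a fixed refresh, or redraw the last
        # frame often enough to keep the panel above its minimum.
        return "fixed-refresh fallback or frame redraw"
    return "variable refresh tracks the frame rate"

for fps in (25, 90, 200):
    print(fps, "->", present_policy(fps, panel_min_hz=40, panel_max_hz=144))
```

How each vendor fills in those out-of-range branches is exactly what the rest of this argument is about.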

 

Monitors clearly state how high they can go. It makes no sense to criticize AMD for stating what they support.

 

If you are not sure what I mean, then what's the point of continuing? I made it very clear what I mean and repeated myself multiple times. So now you say I don't know what I'm talking about; how convenient of you to pull a straw man out of thin air when I made my point loud and clear and multiple people understood it. The only person not understanding it is clearly you. So maybe you should go back and read a little more closely, because I shouldn't have to repeat myself a thousand times for one confused person to understand it.

 

It's dishonest when you compare against the competition to make it seem like yours is better, when yours only supports a current range of 40-144Hz. What they do have a say in is what they advertise, as you can see from the chart posted. You can clearly see them trying to make their implementation seem better by making up lies or stretching the truth.

 

So now companies that allow other companies to use their branding aren't responsible for that? You have to be joking, right? You don't actually believe that, do you?

 

So now it's irrelevant what monitors support? Why is it irrelevant? Because AMD gets caught stretching the truth, now it's irrelevant? They say 9-240Hz but their panels only do 40-144Hz, and that's OK in your opinion? To basically lie to make yourself look better than the competition? FreeSync can't utilize what doesn't exist, and won't exist for many years. So let's nip that in the bud.

 

I'm starting to believe you really don't understand what we are talking about or what is actually going on. I guess it's easy to point out that someone else doesn't know what they are talking about when you don't know what you are talking about yourself.

 

The term "FreeSync" implementation is referring to the actual implementation that exists in current FreeSync monitors. You understand what it means, let's not over-analyze things here. It's referring to the fact that AMD touts the AdaptiveSync interface specification, but then their monitors release and they are still limited by the panel. Which means again, the interface spec is nothing more than a spec. It has nothing to do what is actually possible and implemented currently. Which means, AMD has no right to suggest their technology offers 9-240Hz compared to G-Sync's 30-144Hz considering their technology is no better with 40-144Hz. Oh but wait, I forgot, AMD has no say in what they advertise or what they promote, or companies putting their name on products. That just all happens randomly without any say. 

 

Let's go through your bullets:

 

 

 What it costs to implement G-Sync into a monitor is not a licensing fee. That's what the chart was saying, but you didn't seem to get that.

 So now you are going to lie and claim that Adaptive-Sync isn't for gaming? Really, that's your bullet point: Adaptive-Sync isn't just for gaming but also for "power savings"? LMAO. Next.

 You seem to be pushing the blame onto VESA, when VESA isn't competing with G-Sync; FreeSync is. But let's try to make it seem like AMD gets off scot-free here for promoting the VESA standard instead of what monitors are actually capable of.

 I posted numerous sources with information that prove G-Sync can do 1-240Hz, from Tom, Malventano, and NVIDIA, multiple times in this thread. If you choose to ignore said information, then that's your problem. Remember: "Tom's comments are useless, Malventano is a fanboy, and NVIDIA didn't state specific numbers," so that automatically means it's not true or accurate.

 Well first, let's read the article that chart comes from. Firstly, they are talking about input lag, not performance loss (which would mean a lower framerate with G-Sync enabled). Secondly, if you continue to read, the author concludes with the following: "The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON." Before that they were running the test with fps_max at 300, which is why the chart above looks the way it does. So no, you didn't even read the article, and no, it doesn't force V-Sync on; the author said it merely felt like V-Sync was turned on when the framerate went over the cap. If you continue reading the article, he concludes with the following: "As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older source engine games, provided the games were configured correctly. G-SYNC gives the game player a license to use higher graphics settings in the game, while keeping the gameplay smooth"
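For anyone following along, the arithmetic behind that fps_max result is straightforward; these are my own back-of-envelope numbers, not the article's:

```python
# Frame-time math behind the fps_max discussion above, for a 144Hz panel.
PANEL_MAX_HZ = 144
REFRESH_INTERVAL_MS = 1000 / PANEL_MAX_HZ  # ~6.94 ms fastest refresh

for fps_cap in (300, 144, 120):
    frame_time_ms = 1000 / fps_cap
    if frame_time_ms < REFRESH_INTERVAL_MS:
        verdict = "frames arrive faster than the panel can refresh, so they queue up (VSync-like lag)"
    else:
        verdict = "every frame gets its own refresh, so G-Sync stays in its variable range"
    print(f"fps_max {fps_cap}: {frame_time_ms:.2f} ms/frame vs {REFRESH_INTERVAL_MS:.2f} ms/refresh -> {verdict}")
```

Which is exactly why capping at 120 removed the lag: the GPU never outruns the panel.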

 

Well, he does. I didn't think I would have to post this again, but I guess I do:

 

 

 

Forbes: Let’s talk about the minimum response times that both G-Sync and Adaptive Sync support.

Tom Petersen: “First of all, the spec ‘Adaptive Sync’ has no minimum. Both have the ability to communicate any range, so there’s nothing about the base specs that are different. What’s interesting though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh, you kinda paint the screen and it slowly fades. That fade is just related to the panel. The reason there’s an Adaptive Sync spec and G-Sync module is because that lower limit is variable depending on the technology inside the panel. But games don’t know about that! So what do you do when a game has a lower FPS than the minimum rate you want to run your panel? Because when they run below that minimum rate things start to flicker, and that’s a horrible experience.”

 

Tom Petersen: “I can’t go into too much detail because it’s still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that’s required by the panel. PC Perspective wrote an article guessing how we did that, and they’re not that far off…”

 

 

http://www.forbes.com/sites/jasonevangelho/2015/03/23/nvidia-explains-why-their-g-sync-display-tech-is-superior-to-amds-freesync/2/

 

Oh and just to put it all together, why not include Malventano's comments:

 

 

 

I'm sorry, but this little bit of AMD marketing crap needs to stop being repeated. The adaptive sync spec range is 9-240, and AMD is repeating it to make themselves look better than the competition. If NV did the same, they could probably claim 1-240 (as they have a functional method to push lower than any panel physical limit). The AMD claimed spec is nowhere near what any panels are going to be capable of for a very long time - it's just what the interface standard supports. If NV claimed 1-240 as if it was so much better than everything else, everyone here would be calling BS (myself included), so you guys should really stop repeating that spec in that context. The real specs are those of the available panels for FreeSync, and for G-Sync (with it understood that they rate at the minimum of the panel but are capable of pushing lower with a method not (yet?) employed by FreeSync). I say 'yet' because if AMD's driver devs were sharp enough, they could implement frame redraws at the driver / GPU level.

 

 

 

You used 9-240 to make a point. It's an irrelevant spec. Use something real like 40-144 (the widest range of an available FreeSync panel), but stop using 9-240, which is just the interface spec. G-Sync actually sends frames as low as 1 per second, but you don't see me using their theoretical interface spec of 1-240 as a counter to your use of 9-240.

 

http://www.overclock.net/t/1546934/various-amd-freesync-reviews/740
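For what it's worth, the "frame redraws at the driver / GPU level" idea Malventano mentions is conceptually simple. Here's a hypothetical sketch of the math (my own illustration, not PC Perspective's guess or NVIDIA's actual method):

```python
# Hypothetical sketch of low-framerate frame redraws: when the game renders
# slower than the panel's minimum refresh, scan out the previous frame extra
# times so the panel never drops below its minimum. Illustration only.
import math

def redraws_needed(frame_rate_hz, panel_min_hz):
    """How many times each frame must be scanned out to stay at or above panel_min_hz."""
    if frame_rate_hz >= panel_min_hz:
        return 1  # already inside the panel's range
    return math.ceil(panel_min_hz / frame_rate_hz)

for fps in (20, 25, 35, 60):
    n = redraws_needed(fps, panel_min_hz=40)
    print(f"{fps} fps -> scan each frame {n}x -> effective panel refresh {fps * n} Hz")
```

The game still delivers new frames as slowly as it likes; the panel just gets reminded of the last one often enough to avoid flicker.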

 

And just to cover being able to do over 144Hz, I'll include this too:

 

The upper bound is limited by the panel/TCON at this point, with the only G-Sync monitor available today going as high as 6.94ms (144Hz). NVIDIA made it a point to mention that the 144Hz limitation isn’t a G-Sync limit, but a panel limit.

 

 

http://www.anandtech.com/show/7582/nvidia-gsync-review
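(For reference, that 6.94ms figure is just 1/144 of a second. The 240Hz ceiling in the Adaptive Sync interface spec would correspond to roughly 1/240 s ≈ 4.17ms per refresh, which, as far as I know, no desktop panel shipping at the time could scan.)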

 

Still subjective? Still an interpretation? Or do you just not read everything before you reply to a post?

 

It only makes no sense to criticize AMD for using the interface spec, as opposed to the actual monitors' specs, if you are a die-hard AMD fanboy who ignores all the facts.

Link to comment
Share on other sites

Link to post
Share on other sites
