Nvidia says Adaptive-Sync support rumour is untrue

Samfisher

G-Sync has its own processing unit, PCB and 768MB buffer, so I'd say it's not unreasonable to think it will be much better than a free alternative.

 

750MB; it has 3x 2Gbit Hynix memory modules.
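A quick unit check on the two figures floating around this thread, assuming the module count above is right: the 768MB and 750MB numbers describe the same three chips counted in different units.

3 x 2 Gbit = 6 Gbit
6 x 10^9 bits / 8 = 750 MB (decimal megabytes)
3 x 2^31 bits / 8 = 3 x 256 MiB = 768 MiB (binary mebibytes)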

 

Actually, if you know how G-Sync and Adaptive Sync work, it's very unreasonable to believe that Nvidia's overcomplicated two-way-communication buffer system performs better than Adaptive Sync. It is just a lot more expensive with nothing to show for it. As it is proprietary, it is not based on display-controller know-how, nor will it ever have any competition in manufacturing. Like most Nvidia products, it's overpriced for the sake of milking customers, nothing else.
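For anyone following along, here is a minimal toy sketch (Python, with invented frame times; it models neither G-Sync's nor Adaptive-Sync's internals) of what any variable-refresh scheme buys you over a fixed 60 Hz refresh: the display no longer sits idle waiting for the next tick when a frame misses it.

import math

REFRESH_MS = 1000 / 60                           # fixed 60 Hz display
frame_times_ms = [14.0, 18.5, 22.0, 15.3, 19.7]  # hypothetical render times

def wait_fixed(frame_ms):
    # Fixed refresh: scanout only starts on the next 60 Hz tick
    # (simplified: assumes each frame starts rendering on a tick).
    return math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS - frame_ms

def wait_variable(frame_ms):
    # Variable refresh: scanout starts as soon as the frame is ready.
    return 0.0

for t in frame_times_ms:
    print(f"frame ready at {t:5.1f} ms -> fixed waits {wait_fixed(t):4.1f} ms, "
          f"variable waits {wait_variable(t):.1f} ms")

Whether Nvidia's module-based approach or the scaler-based Adaptive-Sync approach gets there with less overhead is exactly what this thread is arguing about.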

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


750MB; it has 3x 2Gbit Hynix memory modules.

 

Actually, if you know how G-Sync and Adaptive Sync work, it's very unreasonable to believe that Nvidia's overcomplicated two-way-communication buffer system performs better than Adaptive Sync. It is just a lot more expensive with nothing to show for it. As it is proprietary, it is not based on display-controller know-how, nor will it ever have any competition in manufacturing. Like most Nvidia products, it's overpriced for the sake of milking customers, nothing else.

 

Actually, there is a lot we know, because it has been out and tested. It works, everyone raves about it, lag is not an issue, we know the cost, and we know what it works with. What we don't know is the real-world performance and cost of FreeSync, but that will come. So maybe if you tried to be positive rather than just shitting on a product unnecessarily, that would help the discussion remain positive.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Actually, there is a lot we know, because it has been out and tested. It works, everyone raves about it, lag is not an issue, we know the cost, and we know what it works with. What we don't know is the real-world performance and cost of FreeSync, but that will come. So maybe if you tried to be positive rather than just shitting on a product unnecessarily, that would help the discussion remain positive.

 

So I'm not allowed to critique a company, for fear of ruining the positivity in the duplicate thread? Come on.

G-Sync is a proprietary tech that has pushed gaming monitors up to price points we've never seen before, as well as fragmenting the market and thus making competitive mechanisms less potent. None of this helps the end consumer, who ends up paying too much for too little while being forced to stick with one vendor, regardless of the competitive state of that vendor in the future. Proprietary tech is at its worst when it starts to pull in other vendors and products.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


So I'm not allowed to critique a company, for fear of ruining the positivity in the duplicate thread? Come on.

G-Sync is a proprietary tech that has pushed gaming monitors up to price points we've never seen before, as well as fragmenting the market and thus making competitive mechanisms less potent. None of this helps the end consumer, who ends up paying too much for too little while being forced to stick with one vendor, regardless of the competitive state of that vendor in the future. Proprietary tech is at its worst when it starts to pull in other vendors and products.

 

You're allowed to have an opinion, but when your opinions are constantly negative about one brand, not really founded in any conventional logic, and always read like the product is the spawn of Satan, all it does is drag the threads down.

 

E.g.:

 

 

 

G-Sync is a proprietary tech that has pushed gaming monitors up to price points... Only G-Sync monitors have increased in price.

 

...we've never seen before. You haven't looked very hard, then.

 

...as well as fragmenting the market and thus making competitive mechanisms less potent. Bollocks, the market is not fragmented, and if anything it has spurred on competitiveness, because AMD is now pushing FreeSync.

 

None of this helps the end consumer, who ends up paying too much for too little... Unfounded personal opinion.

 

...while being forced to stick with one vendor, regardless of the competitive state of that vendor in the future. How is anyone forced to stick with one vendor? You're really clutching at straws here. Do you really think G-Sync prevents you from buying AMD?

 

Proprietary tech is at its worst when it starts to pull in other vendors and products. Proprietary tech spurs the market just as much as open source, if not more. You don't have to look far back in history to find that most of the most common devices were originally proprietary.

 

Examples of proprietary devices that are now open or so cheap they may as well be free:

 

WiFi

Bluetooth

DVD

USB drive (originally the Sony Memory Stick)

 

And if I used Google I could probably find more.

 

There is a difference between sharing an opinion and shit blasting a product with statements that just aren't true.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


G-Sync is a proprietary tech that has pushed gaming monitors up to price points... Only G-Sync monitors have increased in price. ...we've never seen before. You haven't looked very hard, then.

That was my point; I must not have expressed myself properly. All G-Sync monitors carry a hefty price premium, and the Asus ROG sets a new standard for high prices on a gaming-oriented monitor.

 

...as well as fragmenting the market and thus making competitive mechanisms less potent. Bollocks, the market is not fragmented, and if anything it has spurred on competitiveness, because AMD is now pushing FreeSync.

I disagree. By introducing proprietary tech into other product categories, you lock consumers into one technology. It is essentially a sunk cost you cannot get back, and you lose either the functionality when changing from Nvidia to AMD or the choice between the two. You have to pick either G-Sync or A-Sync (until Nvidia decides to support the industry standard), which is, by definition, fragmentation.

 

None of this helps the end consumer, who ends up paying too much for too little... Unfounded personal opinion.

Paying $150+ for a tech that could be delivered with a $10 increase on existing display controllers/scalers is hardly unfounded, nor a personal opinion: http://linustechtips.com/main/topic/208028-freesync-monitor-80-100-cheaper-than-gsync-monitors/?hl=huddy $80-100 more for what, exactly?

 

...while being forced to stick with one vendor, regardless of the competitive state of that vendor in the future. How is anyone forced to stick with one vendor? You're really clutching at straws here. Do you really think G-Sync prevents you from buying AMD?

Like I wrote further up, it's a sunk cost when you buy into one tech. If you want to go from Nvidia to AMD (or vice versa), you either have to miss out on the variable refresh rate you paid a price premium for, or you are bound to stick with one of the two graphics-card makers.

 

Proprietary tech is at its worst when it starts to pull in other vendors and products. Proprietary tech spurs the market just as much as open source, if not more. You don't have to look far back in history to find that most of the most common devices were originally proprietary.

Sony was the king of proprietary formats (who remembers the horrible ATRAC3 sound format?). Look how they are doing now. Proprietary tech fragments the market and subsequently reduces competition (the competitive mechanisms I talked about). Adaptive Sync, which is now the industry standard for variable refresh rates, started out as proprietary tech in FreeSync as well. The difference is that AMD decided to propose the hardware specs to VESA so it could be adopted as an industry standard, open to all, including Nvidia. Nvidia only needs to rewrite their G-Sync drivers to support it. Hardly a huge bump for Nvidia to overcome.

 

However, my point was about when proprietary tech starts to include other vendors and products as third parties. Look at all the speaker docks for Apple gear: if you don't have Apple hardware, and thus their proprietary connector, you cannot use the speaker. Heck, even if you have the newest Apple gear, you cannot use your old accessories properly anymore.

Anyone buying a piece of hardware that is based on another company's proprietary tech (like G-Sync monitors or Apple dock speakers) is bound to that brand if they want to use the functionality of that third-party hardware. That is in itself an incentive to keep using the brand that owns the proprietary tech.

 

You're allowed to have an opinion, but when your opinions are constantly negative about one brand, not really founded in any conventional logic, and always read like the product is the spawn of Satan, all it does is drag the threads down.

 

E.g.:

Examples of proprietary devices that are now open or so cheap they may as well be free:

 

WiFi

Bluetooth

DVD

USB drive (originally the Sony Memory Stick)

 

And if I used Google I could probably find more.

 

There is a difference between sharing an opinion and shit blasting a product with statements that just aren't true.

I am often negative about Nvidia because their business strategies benefit only them, at the cost of the consumer, both price-wise and tech-adoption-wise (fragmentation). That is a very legitimate basis for critique.

 

None of those examples got wide adoption until they became industry standards open to many vendors. DVD only happened because Philips and Sony decided not to have a format war and combined their MMCD tech with the competing Super Density (SD) format.

Ironic that you should mention Sony's Memory Stick. That was a proprietary tech that fragmented the market and carried a price premium over MMC/SD cards without any benefit whatsoever to the consumer in functionality, speed or capacity.

 

What statements have I made that are not true?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


This is disappointing. I want to be able to change GPU brand without losing A-Sync or G-Sync functionality.

CPU amd phenom ii x4 965 @ 3.4Ghz | Motherboard msi 970a-g46 | RAM 2x 4GB Team Elite | GPU XFX Radeon HD 7870 DD | Case NZXT Gamma Classic | HDD 750 GB Hitachi | PSU ocz modxstream pro 600w


 

 

I disagree. By introducing proprietary tech into other product categories, you lock consumers into one technology. It is essentially a sunk cost you cannot get back, and you lose either the functionality when changing from Nvidia to AMD or the choice between the two. You have to pick either G-Sync or A-Sync (until Nvidia decides to support the industry standard), which is, by definition, fragmentation.

 

Where's your evidence? When people don't want to be locked to a specific product, they don't buy it, just like every other product on the market. Your argument is flawed. You claim Nvidia should support both to avoid this; why not monitor makers? Why not AMD? Please, there are a million reasons for each of them to go in any direction.

 

 

 

 

 

Paying $150+ for a tech that could be delivered with a $10 increase on existing display controllers/scalers is hardly unfounded, nor a personal opinion: http://linustechtips.com/main/topic/208028-freesync-monitor-80-100-cheaper-than-gsync-monitors/?hl=huddy $80-100 more for what, exactly?

 

How much is too much is up to the individual. You might consider it too much; you might also consider that A-Sync is not out yet, so you are making arguments based on unknown conditions.

 

Like I wrote further up, it's a sunk cost when you buy into one tech. If you want to go from Nvidia to AMD (or vice versa), you either have to miss out on the variable refresh rate you paid a price premium for, or you are bound to stick with one of the two graphics-card makers.

 

 

How is that any different from motherboards and CPUs? Games and OSes? Many products lock you in; it's not new, and it doesn't actually damage the industry. Again, it drives it further, because the next company has to make something better or more appealing. Consider Apple: it would be hard to use Apple products if you hated iTunes, or to use an iPod/iPad without iTunes. But it still works, millions of people still buy it, and the fact that it's a locked system doesn't hurt the industry or the consumers.

 

 

Sony was the king of proprietary formats (who remembers the horrible ATRAC3 sound format?). Look how they are doing now. Proprietary tech fragments the market and subsequently reduces competition (the competitive mechanisms I talked about). Adaptive Sync, which is now the industry standard for variable refresh rates, started out as proprietary tech in FreeSync as well. The difference is that AMD decided to propose the hardware specs to VESA so it could be adopted as an industry standard, open to all, including Nvidia. Nvidia only needs to rewrite their G-Sync drivers to support it. Hardly a huge bump for Nvidia to overcome.

 

 

You're using an example of how proprietary hardware has driven the industry forward as an example of how bad it is? Sony's current financial situation was not born of making proprietary products; if you think that, you don't understand global business.

 

 

However, my point was about when proprietary tech starts to include other vendors and products as third parties. Look at all the speaker docks for Apple gear: if you don't have Apple hardware, and thus their proprietary connector, you cannot use the speaker. Heck, even if you have the newest Apple gear, you cannot use your old accessories properly anymore.

Anyone buying a piece of hardware that is based on another company's proprietary tech (like G-Sync monitors or Apple dock speakers) is bound to that brand if they want to use the functionality of that third-party hardware. That is in itself an incentive to keep using the brand that owns the proprietary tech.

 

 

That doesn't say much except that you don't want companies to improve their products because it means you might have to buy new hardware. I have news for you: this is what it's been like since before the '80s, and nothing will change, so you'd better get used to it. It's one of the facts of the evolution of technology. No one is forcing you to use either FreeSync or G-Sync, and no one is forcing you to buy one product over another. It is simple: a company produces a product and charges a price for it; if no one likes it or thinks the price is too high, that company goes bust; if people like it, they buy it. You are not forced into either decision; it is solely yours to make.

 

 

I am often negative about Nvidia because their business strategies benefit only them, at the cost of the consumer, both price-wise and tech-adoption-wise (fragmentation). That is a very legitimate basis for critique.

 

 

To criticize something means you can explain in a rational manner why you think certain attributes only apply to one company. When you claim "their business strategies benefit only them", you don't seem to acknowledge that AMD does exactly the same; their strategies are designed only to improve their bottom line. That is the core MO of every business. Just because one has more market share, more money and seems able to demand a premium for its products doesn't mean it is more evil or more selfish than the other; it just means it is doing better as a business at the moment.

 

 

 

None of those examples got wide adoption until they became industry standards open to many vendors. DVD only happened because Philips and Sony decided not to have a format war and combined their MMCD tech with the competing Super Density (SD) format.

Ironic that you should mention Sony's Memory Stick. That was a proprietary tech that fragmented the market and carried a price premium over MMC/SD cards without any benefit whatsoever to the consumer in functionality, speed or capacity.

 

What statements have I made that are not true?

 

But they still became the standard and only exist because a company invested time and money into developing them. If it wasn't for the Memory Stick, other companies might not have developed memory with a USB interface for quite some time. It is the push to compete that drives innovation, and that competition comes from proprietary products. It's not ironic; it's how it is.

 

The statements you make that are not true are the ones you base on flawed concepts. You speak like you think AMD is giving things away for the benefit of others. Your hatred of Nvidia prevents you from being open-minded about how companies operate and how technology evolves.

 

This is a pretty big article but it is well worth the read:

 

http://www.google.com/url?sa=t&rct=j&q&esrc=s&source=web&cd=2&ved=0CCgQFjAB&url=http%3A%2F%2Fehlt.flinders.edu.au%2Feducation%2Fiej%2Farticles%2Fv4n1%2Fparis%2Fpaper.pdf&ei=jBAkVMWuEIyn8AXKsYLoBA&usg=AFQjCNEpGL7g0CIoiKtRaChxQnQI_RE5Nw&sig2=AE4qSuPnnxX6MsCGzHzs0w&bvm=bv.76247554%2Cd.dGc

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Where's your evidence? When people don't want to be locked to a specific product, they don't buy it, just like every other product on the market. Your argument is flawed. You claim Nvidia should support both to avoid this; why not monitor makers? Why not AMD? Please, there are a million reasons for each of them to go in any direction.

 

It's actually a common strategy for companies to keep customers inside the brand environment. The biggest example is Apple, but Bang & Olufsen and Sonos are big in that field as well. There is a reason they don't open up to others: simply to keep customers coming back, since all their tech ends up only working with that one brand.

Yes, Nvidia should support an industry standard instead of an overly expensive (costs more, delivers less, e.g. no native 24/23.976 fps video playback) proprietary solution.

I doubt any monitor vendor would pay $150 for a G-Sync module and then add a high-end display controller supporting A-Sync on top of that. The end price would be insane. And for what? To support a proprietary solution that offers no additional benefit? AMD cannot get access to G-Sync, you know that.

 

How much is too much is up to the individual. You might consider it too much; you might also consider that A-Sync is not out yet, so you are making arguments based on unknown conditions.

If people want to waste their money, sure, it's their choice. But paying $80-100 for nothing additional really is a waste of money, don't you think? But yes, if you absolutely want variable refresh rate RIGHT NOW, then G-Sync is the only option, and you will pay a high premium for that impatience.

 

How is that any different from motherboards and CPUs? Games and OSes? Many products lock you in; it's not new, and it doesn't actually damage the industry. Again, it drives it further, because the next company has to make something better or more appealing. Consider Apple: it would be hard to use Apple products if you hated iTunes, or to use an iPod/iPad without iTunes. But it still works, millions of people still buy it, and the fact that it's a locked system doesn't hurt the industry or the consumers.

In practice it is very different, as monitors are usually universal and have a lifespan independent of any GPU, while a CPU and motherboard usually have similar lifespans. However, I agree: would it not be better for consumers if all sockets on all motherboards supported both Intel and AMD? Right now the selection of motherboards is poor for AMD CPUs/APUs.

Personally I use Winamp for my old iPod, but do you honestly think it benefits the consumer to be forced to use iTunes instead of something better? It very much hurts the consumer, especially if you want to buy music online cheaper than on iTunes.

 

You're using an example of how proprietary hardware has driven the industry forward as an example of how bad it is? Sony's current financial situation was not born of making proprietary products; if you think that, you don't understand global business.

Not really. "FreeSync" is only becoming the success it should become because it is an industry standard now (Adaptive Sync). AMD did not invent a proprietary solution, but essentially outsourced that to the companies who know about this (display-controller vendors). The result is that you already have three different vendors creating their own controllers, with their own features and prices. Adaptive Sync already has more competition than G-Sync, and it's not even out yet. That is possible because VESA made it a standard (we don't know how much influence these controller vendors had on the standard). We don't know if AMD made the tech specs for these controllers or if the vendors did that themselves. So no, it really is not an example of proprietary goodness.

 

There are many factors responsible for Sony's situation: the financial crisis, higher taxation in Japan than in South Korea, products being one step behind feature-wise, etc. But focusing on proprietary formats hardly did them any good. It is one of the reasons their old digital music players failed to sell (ATRAC3, and yes, I had one).

 

That doesn't say much except that you don't want companies to improve their products because it means you might have to buy new hardware. I have news for you: this is what it's been like since before the '80s, and nothing will change, so you'd better get used to it. It's one of the facts of the evolution of technology. No one is forcing you to use either FreeSync or G-Sync, and no one is forcing you to buy one product over another. It is simple: a company produces a product and charges a price for it; if no one likes it or thinks the price is too high, that company goes bust; if people like it, they buy it. You are not forced into either decision; it is solely yours to make.

 

Oh come on, ad hominems are for people without any argumentative skill; you are better than that. I have no problem with ShadowPlay, for instance: there are alternatives, and I don't have to pay extra money for a feature that then locks me to Nvidia cards only. I can just switch and get another ShadowPlay "clone" without having to spend anything on top of the graphics card itself.

It's not news to anyone that you need new hardware for new features.

It IS news to me that proprietary solutions, which limit the selection of products, increase the price and lock you into only using that brand, should somehow be a good thing for the consumer.

Do you think it benefits a consumer to have to replace their monitor when they change graphics-card vendor, just to keep a feature that is actually an industry standard which one of the vendors refuses to support?

 

To criticize something means you can explain in a rational manner why you think certain attributes only apply to one company. When you claim "their business strategies benefit only them", you don't seem to acknowledge that AMD does exactly the same; their strategies are designed only to improve their bottom line. That is the core MO of every business. Just because one has more market share, more money and seems able to demand a premium for its products doesn't mean it is more evil or more selfish than the other; it just means it is doing better as a business at the moment.

Oh come on, just because you don't agree does not mean my arguments are irrational. My critique is very rational and supported by my arguments. If you disagree, fine, just explain WHY you disagree instead of just calling me irrational.

All businesses, except non-profits I guess, are obviously trying to make as much money as possible. But you're forgetting the rest of the argument, so here it is again: "because their business strategies benefit only them, at the cost of the consumer". That is the difference between Nvidia and AMD in my mind. A lot of AMD's strategies benefit ALL consumers, not just AMD's customers, like the next-gen OpenGL based on Mantle and of course Adaptive Sync. Calling companies evil is not rhetoric I use or like.

 

But they still become the standard and only exist becasue a company invested time and money into developing them, If it wasn't for the memory stick other companies may not have developed memory with a USB interface for quite some time. It is the push to compete that drives innovation, that competition comes from proprietary products.  It's not Ironic, it's how it is.

Do you have any source that Sony is responsible for USB sticks? Because here is what Wikipedia says:

 

USB flash drives were invented by Amir Ban, Dov Moran and Oron Ogdan, all of the Israeli company M-Systems, who filed US patent 6,148,354 in April 1999.[8] However, the patent describes a product that has a cable between the memory unit and the USB connector.[citation needed] Released later the same year, IBM Patent Disclosure RPS8-1999-0201 from September 13, 1999 by Shimon Shmueli accurately describes the USB flash drive.[9] IBM partnered with M-Systems to bring the product to market.

 

Products and solutions can come from both push and pull strategies. Not all innovations are push, or developed by big corporations. In the tech world, proprietary tech always loses out to industry standards in the end (Apple could be an exception; time will tell).

 

The statements you make that are not true are the ones you base on flawed concepts. You speak like you think AMD is giving things away for the benefit of others. Your hatred of Nvidia prevents you from being open-minded about how companies operate and how technology evolves.

 

Just because you don't agree with them does not make them untrue. You are confusing subjectivity and objectivity. Argue against my arguments, no problem, but disagreeing does not mean I speak untruthfully.

Again, AMD DID give the hardware part of FreeSync to VESA for free, as well as Mantle tech to the Khronos Group for next-gen OpenGL.

 

I don't hate Nvidia; they make very good GPU chips. But I think their business strategies are anti-consumer and anti-competitive (just look at the black-boxed GameWorks, which distorts performance between Nvidia and AMD; that alone should make people disgusted with Nvidia's ethics). Read more here: http://linustechtips.com/main/topic/170584-nvidia-hairworks-–-new-video-shows-off-fur-hair-includes-animals-from-the-witcher-3/?p=2269073

 

*Sigh* Really? Back to the "irrational" argument? That is too low. :rolleyes:

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


A-Sync isn't a must-implement DP standard.  It's an OPTIONAL standard manufacturers can choose to implement.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


inb4 fanboy war

 

I was myself surprised when I read that they were going to be supporting other technologies, and am relatively unsurprised that it has been revealed that they will not.

 

A lot of people seemed to think that G-Sync was dead in the water; perhaps Nvidia thinks that is not the case.

 

 

It's dead in the water in the way that things requiring the Apple ecosystem are dead in the water. As in, not really. There are enough Nvidia fanboys who will only buy Nvidia that there is a large enough market for a locked-in variable-refresh-rate monitor to survive perfectly fine... though they do need to get the cost down. But that will happen with time. And even if there is always a hefty premium, Nvidia lovers are not going to bolt. Whether they have the performance crown or the best performance per dollar or not, Nvidia has a larger number of die-hard fans. You can find them in forums when they mouth off about ShadowPlay or drivers or PhysX, as if the absence of such things would matter a damn to them; reasons plucked from the air. The attachment and fanboyism come first, then the justifications. And like I said, Nvidia has more of such people. I am more in the AMD camp: if a current offering was not enticing enough, I would just wait until something else from AMD came along, because I am rooting for them, not Nvidia. I am the same way, you see, but again, since Nvidia has more people like me in their camp, they can afford to be the Apple of the GPU world.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


It's dead in the water in the way that things requiring the Apple ecosystem are dead in the water. As in, not really. There are enough Nvidia fanboys who will only buy Nvidia that there is a large enough market for a locked-in variable-refresh-rate monitor to survive perfectly fine... though they do need to get the cost down. But that will happen with time. And even if there is always a hefty premium, Nvidia lovers are not going to bolt. Whether they have the performance crown or the best performance per dollar or not, Nvidia has a larger number of die-hard fans. You can find them in forums when they mouth off about ShadowPlay or drivers or PhysX, as if the absence of such things would matter a damn to them; reasons plucked from the air. The attachment and fanboyism come first, then the justifications. And like I said, Nvidia has more of such people. I am more in the AMD camp: if a current offering was not enticing enough, I would just wait until something else from AMD came along, because I am rooting for them, not Nvidia. I am the same way, you see, but again, since Nvidia has more people like me in their camp, they can afford to be the Apple of the GPU world.

 

 

 

 

 

 

Here you go. I think you need this. [attached image: mirror-06.jpg]

 

 

The Mistress: Case: Corsair 760t   CPU:  Intel Core i7-4790K 4GHz(stock speed at the moment) - GPU: MSI 970 - MOBO: MSI Z97 Gaming 5 - RAM: Crucial Ballistic Sport 1600MHZ CL9 - PSU: Corsair AX760  - STORAGE: 128Gb Samsung EVO SSD/ 1TB WD Blue/Several older WD blacks.

                                                                                        


It's actually a common strategy for companies to keep customers inside the brand environment. The biggest example is Apple, but Bang & Olufsen and Sonos are big in that field as well. There is a reason they don't open up to others: simply to keep customers coming back, since all their tech ends up only working with that one brand.

Yes, Nvidia should support an industry standard instead of an overly expensive (costs more, delivers less, e.g. no native 24/23.976 fps video playback) proprietary solution.

I doubt any monitor vendor would pay $150 for a G-Sync module and then add a high-end display controller supporting A-Sync on top of that. The end price would be insane. And for what? To support a proprietary solution that offers no additional benefit? AMD cannot get access to G-Sync, you know that.

 

Again, you are pointing to extremely successful companies and saying they should do something different that benefits you and not them. There is a reason they are still around improving their products.

 

If people want to waste their money, sure, it's their choice. But paying $80-100 for nothing additional really is a waste of money, don't you think? But yes, if you absolutely want variable refresh rate RIGHT NOW, then G-Sync is the only option, and you will pay a high premium for that impatience.

 

It's only your opinion that it's a waste of money, and again an unfounded one, because no one knows how FreeSync is going to perform. How can you make a rational argument that something is better without actually seeing it in action?

 

In practice it is very different, as monitors are usually universal and have a lifespan independent of any GPU, while a CPU and motherboard usually have similar lifespans. However, I agree: would it not be better for consumers if all sockets on all motherboards supported both Intel and AMD? Right now the selection of motherboards is poor for AMD CPUs/APUs.

Personally I use Winamp for my old iPod, but do you honestly think it benefits the consumer to be forced to use iTunes instead of something better? It very much hurts the consumer, especially if you want to buy music online cheaper than on iTunes.

 

Again, the consumer is not forced to buy Apple or any proprietary products. Your arguments are based on what you want to see happen, not on what is happening in reality. You think it hurts the customer, but what hurts the customer hurts the company, unless they have a monopoly. There are no monopolies in this discussion.

 

Not really. "FreeSync" is only becoming the success it should become because it is an industry standard now (Adaptive Sync). AMD did not invent a proprietary solution, but essentially outsourced that to the companies who know about this (display-controller vendors). The result is that you already have three different vendors creating their own controllers, with their own features and prices. Adaptive Sync already has more competition than G-Sync, and it's not even out yet. That is possible because VESA made it a standard (we don't know how much influence these controller vendors had on the standard). We don't know if AMD made the tech specs for these controllers or if the vendors did that themselves. So no, it really is not an example of proprietary goodness.

 

What has that got to do with what I said?

 

There are many factors responsible for Sony's situation: the financial crisis, higher taxation in Japan than in South Korea, products being one step behind feature-wise, etc. But focusing on proprietary formats hardly did them any good. It is one of the reasons their old digital music players failed to sell (ATRAC3, and yes, I had one).

 

But you implied they were struggling because of proprietary hardware. Changed your mind?

 

Oh come on, ad hominems are for people without any argumentative skill; you are better than that. I have no problem with ShadowPlay, for instance: there are alternatives, and I don't have to pay extra money for a feature that then locks me to Nvidia cards only. I can just switch and get another ShadowPlay "clone" without having to spend anything on top of the graphics card itself.

It's not news to anyone that you need new hardware for new features.

It IS news to me that proprietary solutions, which limit the selection of products, increase the price and lock you into only using that brand, should somehow be a good thing for the consumer.

Do you think it benefits a consumer to have to replace their monitor when they change graphics-card vendor, just to keep a feature that is actually an industry standard which one of the vendors refuses to support?

 

If someone wants dynamic refresh rates, then they have to replace their monitor regardless of which tech they use; ergo it's not an ad hominem argument, but a classic example of how you attack one without applying the same rationale to the other.

 

Oh come on, just because you don't agree does not mean my arguments are irrational. My critique is very rational and supported by my arguments. If you disagree, fine, just explain WHY you disagree instead of just calling me irrational.

All businesses, except non-profits I guess, are obviously trying to make as much money as possible. But you're forgetting the rest of the argument, so here it is again: "because their business strategies benefit only them, at the cost of the consumer". That is the difference between Nvidia and AMD in my mind. A lot of AMD's strategies benefit ALL consumers, not just AMD's customers, like the next-gen OpenGL based on Mantle and of course Adaptive Sync. Calling companies evil is not rhetoric I use or like.

 

No, your arguments are irrational because they are irrational; what I think is irrelevant to that end.

 

Do you have any source that Sony is responsible for USB sticks? Because here is what Wikipedia says:

 

USB flash drives were invented by Amir Ban, Dov Moran and Oron Ogdan, all of the Israeli company M-Systems, who filed US patent 6,148,354 in April 1999.[8] However, the patent describes a product that has a cable between the memory unit and the USB connector.[citation needed] Released later the same year, IBM Patent Disclosure RPS8-1999-0201 from September 13, 1999 by Shimon Shmueli accurately describes the USB flash drive.[9] IBM partnered with M-Systems to bring the product to market.

 

Products and solutions can come from both push and pull strategies. Not all innovations are push, or developed by big corporations. In the tech world, proprietary tech always loses out to industry standards in the end (Apple could be an exception; time will tell).

 

What came before USB sticks? A: proprietary Memory Sticks from Sony. In fact they were launched in 1998, one year before the USB drive was patented. They may not have been ideal, but they pushed the development of USB storage. *

 

 

Just because you don't agree with them does not make them untrue. You are confusing subjectivity and objectivity. Argue against my arguments, no problem, but disagreeing does not mean I speak untruthfully.

Again, AMD DID give the hardware part of FreeSync to VESA for free, as well as Mantle tech to the Khronos Group for next-gen OpenGL.

 

I never said anything about AMD giving away hardware. I was saying that your opinions are stated as facts, which they are not; they are opinions founded on what appears, for all intents and purposes, to be one-sided Nvidia hate.

 

I don't hate Nvidia; they make very good GPU chips. But I think their business strategies are anti-consumer and anti-competitive (just look at the black-boxed GameWorks, which distorts performance between Nvidia and AMD; that alone should make people disgusted with Nvidia's ethics). Read more here: http://linustechtips.com/main/topic/170584-nvidia-hairworks-–-new-video-shows-off-fur-hair-includes-animals-from-the-witcher-3/?p=2269073

 

*Sigh* Really? Back to the "irrational" argument? That is too low. :rolleyes:

 

Really? Learning is something everyone should strive for, but dismissing it out of hand because you find it too low, now that's a sign that one will never learn.

 

**

Quote from wiki:

 

 

Memory Stick is a removable flash memory card format, launched by Sony in October 1998

 

A year before the USB storage device.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


 

Again, you are pointing to extremely successful companies and saying they should do something different that benefits you and not them. There is a reason they are still around improving their products.

 

I'm not saying they should do things differently as a company; what I'm saying is that their business strategies come at the expense of the end consumer. From a business perspective they are making more money off it in the end, but it hurts the consumer. Why? Read further down.

 

 

It's only your opinion that it's a waste of money, and again an unfounded one, because no one knows how FreeSync is going to perform. How can you make a rational argument that something is better without actually seeing it in action?

 

Getting nothing for $80-100 is bad value, no matter how rich you are. I understand your scepticism about A-Sync, because it is not out yet; you don't give it the benefit of the doubt, which is completely OK. But it IS an industry standard, which I very much doubt VESA would have adopted if it did not work properly. Based on the official information about how both G-Sync and A-Sync work, I find it hard to believe G-Sync would work any better, but I guess we will have to wait 3-4 months to see. Pretty much every university education has you read, process and draw conclusions from theory and information on paper without seeing anything in action. Bachelor's and master's degrees, and their scientific methodology, are very much rational, whether you agree or not.

 

 

Again, the consumer is not forced to buy Apple or any proprietary products. Your arguments are based on what you want to see happen, not on what is happening in reality. You think it hurts the customer, but what hurts the customer hurts the company, unless they have a monopoly. There are no monopolies in this discussion.

 

I would advise you to read about the endowment effect (http://en.wikipedia.org/wiki/Endowment_effect), which is actually what creates fanboys. It is the basis for my point about loss aversion, a.k.a. the sunk cost fallacy (http://www.logicallyfallacious.com/index.php/logical-fallacies/174-sunk-cost-fallacy), which explains my point completely. Let me give you an example:

  1. You buy an Nvidia graphics card
  2. You buy a G-sync monitor
  3. You buy an Nvidia 3D kit

Now your graphics card needs to be upgraded/replaced. Let's say, for the sake of argument, that you get 20% more performance for the price on an AMD card compared to the equivalent Nvidia card. Any rational person would pick the AMD card from a price/performance standpoint. However, they would lose variable refresh rate (no G-Sync support) and 3D (I assume Nvidia's 3D kit does not work on AMD; for the sake of the argument, let's assume that). So now that you have bought into the Nvidia ecosystem, you are locked in. You cannot just go from Nvidia to AMD without losing a lot of features you paid extra money for; you would have to replace the monitor and 3D kit as well, on top of the graphics card itself. This locks you into the ecosystem and becomes a sunk cost fallacy: you will continue to use Nvidia even if you get less performance for the price (in this example).
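To put rough numbers on that lock-in (a minimal Python sketch; every price below is a made-up placeholder, not a real market price), the switching cost is the new card plus everything in the ecosystem that has to be re-bought:

# Hypothetical prices, purely to illustrate the sunk-cost argument above.
nvidia_card = 400        # next Nvidia card
amd_card = 400           # AMD card, assumed 20% faster for the same money
gsync_monitor = 450      # owned G-Sync monitor: must be replaced to keep VRR on AMD
nvidia_3d_kit = 100      # owned 3D kit: assumed not to work with AMD

cost_to_stay = nvidia_card
cost_to_switch = amd_card + gsync_monitor + nvidia_3d_kit

print(f"stay with Nvidia: ${cost_to_stay}")    # keeps G-Sync and the 3D kit working
print(f"switch to AMD:    ${cost_to_switch}")  # re-buys the locked-in extras

Even with the faster card, the switch only makes sense once the performance gain outweighs the cost of the replaced peripherals.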

 

Now please tell me how locking someone into a sunk cost fallacy via a closed ecosystem does not hurt the end consumer, compared to open standards that everyone can and should support.

 

 

What has that got to do with what I said?

 

You said I used a proprietary solution as an example of how bad proprietary solutions are. My point was that it is not proprietary, as it was finalized in the VESA forum and developed with three display-controller vendors. Or, put another way, I'm saying it was not developed as a fully proprietary solution.

 

 

But you implied they were struggling because of proprietary hardware. Changed your mind?

 

Not at all, but an international business usually doesn't fall because of just one issue. All I'm saying is that their proprietary approach hurt them in the end, as they stopped being competitive.

 

 

If someone wants dynamic refresh rates, then they have to replace their monitor regardless of which tech they use; ergo it's not an ad hominem argument, but a classic example of how you attack one without applying the same rationale to the other.

 

This is the ad hominem I was referring to: "you don't want companies to improve their products because it means you might have to buy new hardware" (it's actually also a logical fallacy). And you end off with another logical fallacy? Dude.

 

Sure, if you want new tech it requires new hardware. But let's say I buy an Nvidia card and a G-Sync monitor. Two years later I want an AMD card, so now I have to buy another monitor because of Nvidia's closed proprietary ecosystem. So in two years, in my example, I have to buy the same monitor twice for a feature that is an industry standard in one and an overpriced proprietary add-on in the other. That leads back to the sunk cost fallacy; that is the problem.

 

 

No, your arguments are irrational because they are irrational; what I think is irrelevant to that end.

 

Basing your arguments on scientific theory is as far from irrational as you can get. My criticism is from a consumer standpoint, not the company's, nor what is economically wisest for Nvidia. Feel free to disagree, but your subjective opinions do not render my arguments irrational.

 

 

What came before USB sticks? A: proprietary Memory Sticks from Sony. In fact they were launched in 1998, one year before the USB drive was patented. They may not have been ideal, but they pushed the development of USB storage. *

 

Quote from wiki:

Memory Stick is a removable flash memory card format, launched by Sony in October 1998

A year before the USB storage device.

 

You must be confused by the name "Memory Stick". It is NOT a USB stick but a flash card (just like your own wiki link says). Sony's Memory Stick has nothing to do with USB or USB sticks; it competed with MMC/SD flash cards (MMC came out in 1997), both of them replacing the aging CF card that came out in 1994. Sony created nothing but an overpriced proprietary card to compete with industry standards (CF, MMC, SD, etc.). I'm sorry, but you are wrong on this one, and your own link proves it. See more here: http://en.wikipedia.org/wiki/Comparison_of_memory_cards

 

 

I never said anything about AMD giving away hardware. I was saying that your opinions are stated as facts, which they are not; they are opinions founded on what appears, for all intents and purposes, to be one-sided Nvidia hate.

 

I'm basing it on what AMD and VESA have published, that's all. I prefer industry standards to proprietary closed solutions any day of the week, simply because they ensure competition and prevent sunk cost fallacies for end consumers. And like I said, I don't hate Nvidia, I'm not a fanboy, and I don't hate corporations. But I will criticize any company for being anti-consumer and anti-competitive, and that is exactly what Nvidia is, because of their proprietary approach.

 

 

Really? Learning is something everyone should strive for, but dismissing it out of hand because you find it too low, now that's a sign that one will never learn.

 

I agree everyone should strive to learn; I'm one master's thesis away from my master's degree (it will take some time, sadly), so I'm all for learning. But there is nothing in that link I need to know that I don't already know. My arguments are not irrational just because you don't understand or agree with them.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I have a feeling that G-Sync (on the monitor's end) won't ever work on an AMD GPU, but Nvidia will find a way to make their GPUs run G-Sync on both G-Sync and FreeSync monitors.


 

 

Again, you are pointing to extremely successful companies and saying they should do something different that benefits you and not them. There is a reason they are still around improving their products.

1.

I'm not saying they should do things differently as a company; what I'm saying is that their business strategies come at the expense of the end consumer. From a business perspective they are making more money off it in the end, but it hurts the consumer. Why? Read further down.

 

 

It's only your opinion that it's a waste of money, and again an unfounded one, because no one knows how FreeSync is going to perform. How can you make a rational argument that something is better without actually seeing it in action?

2.

Getting nothing for $80-100 is bad value, no matter how rich you are. I understand your scepticism about A-Sync, because it is not out yet; you don't give it the benefit of the doubt, which is completely OK. But it IS an industry standard, which I very much doubt VESA would have adopted if it did not work properly. Based on the official information about how both G-Sync and A-Sync work, I find it hard to believe G-Sync would work any better, but I guess we will have to wait 3-4 months to see. Pretty much every university education has you read, process and draw conclusions from theory and information on paper without seeing anything in action. Bachelor's and master's degrees, and their scientific methodology, are very much rational, whether you agree or not.

 

 

Again, the consumer is not forced to buy Apple or any proprietary products. Your arguments are based on what you want to see happen, not on what is happening in reality. You think it hurts the customer, but what hurts the customer hurts the company, unless they have a monopoly. There are no monopolies in this discussion.

3.

I would advise you to read about the endowment effect (http://en.wikipedia.org/wiki/Endowment_effect), which is actually what creates fanboys. It is the basis for my point about loss aversion, a.k.a. the sunk cost fallacy (http://www.logicallyfallacious.com/index.php/logical-fallacies/174-sunk-cost-fallacy), which explains my point completely. Let me give you an example:

  1. You buy an Nvidia graphics card
  2. You buy a G-sync monitor
  3. You buy an Nvidia 3D kit

Now your graphics card needs to be upgraded/replaced. Let's say, for the sake of argument, that you get 20% more performance for the price on an AMD card compared to the equivalent Nvidia card. Any rational person would pick the AMD card from a price/performance standpoint. However, they would lose variable refresh rate (no G-Sync support) and 3D (I assume Nvidia's 3D kit does not work on AMD; for the sake of the argument, let's assume that). So now that you have bought into the Nvidia ecosystem, you are locked in. You cannot just go from Nvidia to AMD without losing a lot of features you paid extra money for; you would have to replace the monitor and 3D kit as well, on top of the graphics card itself. This locks you into the ecosystem and becomes a sunk cost fallacy: you will continue to use Nvidia even if you get less performance for the price (in this example).

 

Now please tell me how locking someone into a sunk cost fallacy via a closed ecosystem does not hurt the end consumer, compared to open standards that everyone can and should support.

 

 

What has that got to do with what I said?

4.

You said I used a proprietary solution as an example of how bad proprietary solutions are. My point was that it is not proprietary, as it was finalized in the VESA forum and developed with three display-controller vendors. Or, put another way, I'm saying it was not developed as a fully proprietary solution.

 

 

But you implied they were struggling because of proprietary hardware. Changed your mind?

5.

Not at all, but an international business usually doesn't fall because of just one issue. All I'm saying is that their proprietary approach hurt them in the end, as they stopped being competitive.

 

 

If someone wants dynamic refresh rates, then they have to replace their monitor regardless of which tech they use; ergo it's not an ad hominem argument, but a classic example of how you attack one without applying the same rationale to the other.

6.

This is the ad hominem I was referring to: "you don't want companies to improve their products because it means you might have to buy new hardware" (it's actually also a logical fallacy). And you end off with another logical fallacy? Dude.

 

Sure, if you want new tech it requires new hardware. But let's say I buy an Nvidia card and a G-Sync monitor. Two years later I want an AMD card, so now I have to buy another monitor because of Nvidia's closed proprietary ecosystem. So in two years, in my example, I have to buy the same monitor twice for a feature that is an industry standard in one and an overpriced proprietary add-on in the other. That leads back to the sunk cost fallacy; that is the problem.

 

 

No, your arguments are irrational because they are irrational; what I think is irrelevant to that end.

7.

Basing your arguments on scientific theory is as far from irrational as you can get. My criticism is from a consumer standpoint, not the company's, nor what is economically wisest for Nvidia. Feel free to disagree, but your subjective opinions do not render my arguments irrational.

 

 

What came before USB sticks? A: proprietary Memory Sticks from Sony. In fact they were launched in 1998, one year before the USB drive was patented. They may not have been ideal, but they pushed the development of USB storage. *

 

Quote from wiki:

Memory Stick is a removable flash memory card format, launched by Sony in October 1998

A year before the USB storage device.

8.

You must be confused by the name "Memory Stick". It is NOT a USB stick but a flash card (just like your own wiki link says). Sony's Memory Stick has nothing to do with USB or USB sticks; it competed with MMC/SD flash cards (MMC came out in 1997), both of them replacing the aging CF card that came out in 1994. Sony created nothing but an overpriced proprietary card to compete with industry standards (CF, MMC, SD, etc.). I'm sorry, but you are wrong on this one, and your own link proves it. See more here: http://en.wikipedia.org/wiki/Comparison_of_memory_cards

 

 

I never said anything about AMD giving away hardware. I was saying that your opinions are stated as facts, which they are not; they are opinions founded on what appears, for all intents and purposes, to be one-sided Nvidia hate.

9.

I'm basing it on what AMD and VESA have published, that's all. I prefer industry standards to proprietary closed solutions any day of the week, simply because they ensure competition and prevent sunk cost fallacies for end consumers. And like I said, I don't hate Nvidia, I'm not a fanboy, and I don't hate corporations. But I will criticize any company for being anti-consumer and anti-competitive, and that is exactly what Nvidia is, because of their proprietary approach.

 

 

Really? Learning is something everyone should strive for, but dismissing it out of hand because you find it too low, now that's a sign that one will never learn.

10.

I agree everyone should strive to learn; I'm one master's thesis away from my master's degree (it will take some time, sadly), so I'm all for learning. But there is nothing in that link I need to know that I don't already know. My arguments are not irrational just because you don't understand or agree with them.

 

 

1. So you say they do things only to milk the customer and that they are a bad company, but now you don't think they should do things differently? I'm confused about what your criteria are for appraising a company.

 

2. You said:

 

 

750MB, it has 3x 2gbit hynix memory modules.

 

Actually if you know how g-sync and adaptive sync works, its very unreasonable to believe, that Nvidias overcomplicated 2way comm, buffer system, is better performing than Adaptive Sync. It is just a lot more expensive with nothing to show for it. As it is proprietary, it is not based on display controller know how, nor will it ever have any competition on manufacturing. Like most Nvidia products, its overpriced for the sake of milking customers, nothing else.

 

You actually say it is inferior, has nothing to show for the cost, and has been designed purely to milk customers by locking them to a system. Yet you have absolutely nothing to compare it to. You claim to be basing your arguments on rational scientific data, but this is a ludicrous claim. There are no A-Sync in-game demonstrations or reviews out there, so how can you claim to properly critique a product? How can you claim G-Sync to be overcomplicated, expensive, and have nothing to show for it when it's the only working model on the market?

 

3. A red herring; you're not forced to buy any of those products in the first place.

 

4. You still used an example of a proprietary system (Apple) that was highly successful, and also gave examples of how it didn't impede the consumer, to demonstrate how proprietary systems are bad.

 

5. Not even relevant to the discussion anymore.

 

6. The reverse of your statement is also true: you could go and buy an A-Sync monitor, but then you are locked to using AMD cards until you want a new monitor. You can't use one set of criteria to claim company A is inferior and then use completely different criteria to extol the benefits of another; that is the difference between critical thinking (critiquing) and shit bashing.

 

7. What scientific theorem did you base your arguments on? All I see is shit bashing of Nvidia and G-Sync because a competing product that has yet to be realised is in the wings. That, I am afraid, is the exact opposite of a scientific theorem: it is presenting assumptions you have made as fact.

 

8. Glad to know you have been reading my posts:

 

 

 

...

 

What came before USB sticks? A: proprietary memory sticks from Sony; in fact they were launched in 1998, one year before the USB drive was patented. They may not have been ideal, but they pushed the development of USB storage.

 

...

A year before the USB storage device.

 

At what point did I say the Memory Stick was USB? I didn't. I said it drove people to develop the USB memory stick, I said it was proprietary, I said it wasn't ideal, and I said it was what drove the development of USB storage. And last time I looked, CF, SD and xD were not USB storage, but internal storage designed to be removable.

 

No wonder you think point 9 means something.

 

10. As I have stated, you are claiming a product is inferior based on:

 

1. comparison to a product that we haven't yet seen working and can't yet buy.

2. comparison to an investment fallacy that has nothing to do with consumerism.

3. an endowment argument about perceived value. You'll never stop fanboys, but that doesn't make everybody incapable of rational judgment, nor does it make the reverse untrue.

4. you say you already know what you need to know, and thus won't actively seek to learn more.

 

If you consider that to be rational then we have no more to discuss.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1. So you say they do things only to milk the customer and that they are a bad company, but now you don't think they should do things differently? I'm confused about what your criteria for appraising a company are.

 

I have only said that I criticize companies for anti-competitive and anti-consumer strategies. That is 100% a view from a consumer's perspective and has nothing to do with what is good business sense or economics. As a consumer I have every right to critique that. What makes economic sense for Nvidia is irrelevant in that matter. Proprietary tech is anti-consumer and anti-competitive, which in my mind is unethical.

 

2. You said:

 

You actually say it is inferior, has nothing to show for the cost, and has been designed purely to milk customers by locking them to a system. Yet you have absolutely nothing to compare it to. You claim to be basing your arguments on rational scientific data, but this is a ludicrous claim. There are no A-Sync in-game demonstrations or reviews out there, so how can you claim to properly critique a product? How can you claim G-Sync to be overcomplicated, expensive, and have nothing to show for it when it's the only working model on the market?

 

Yes, it is a fact that G-Sync only has a refresh-rate range of 30-144 Hz, which means that it cannot play back 23.976/24 fps video natively. A-Sync supports a range of 9-240 Hz (realistically the monitors will be around 20+ to 144 Hz). That is all the official specs and information given to us by Nvidia, AMD and VESA. Adaptive Sync is not a product, it is an industry standard made by VESA. You do know Nvidia is part of VESA, right? Again, you have a right not to give A-Sync the benefit of the doubt, but this is a VESA standard. When has VESA ever released a DisplayPort feature that does not work as described?
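To make the refresh-range point concrete, here is a minimal sketch of what "native" playback means inside a variable-refresh window. The 30-144 Hz and 9-240 Hz windows are just the figures quoted in this thread, and the frame-repetition helper is a simplified illustration, not a description of how either implementation actually handles low frame rates.

```python
def native(fps, vrr_min, vrr_max):
    """True if the display can refresh exactly once per source frame."""
    return vrr_min <= fps <= vrr_max

def repeats_needed(fps, vrr_min, vrr_max):
    """Smallest whole-number frame repetition that lands inside the window,
    or None if no multiple fits."""
    n = 1
    while fps * n <= vrr_max:
        if fps * n >= vrr_min:
            return n
        n += 1
    return None

for fps in (23.976, 24.0, 30.0, 60.0):
    print(f"{fps:>7} fps | native in 30-144 Hz: {native(fps, 30, 144)} | "
          f"native in 9-240 Hz: {native(fps, 9, 240)} | "
          f"repeats to fit 30-144 Hz: {repeats_needed(fps, 30, 144)}")
```

24 fps content falls below a 30 Hz floor, so in a 30-144 Hz window it can only be shown by repeating frames, whereas it sits comfortably inside the wider spec range.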

G-Sync is overcomplicated, needing an FPGA, 750 MB of RAM and a complex two-way communication system, none of which is needed for A-Sync. If you know of any advantage G-Sync has that A-Sync cannot match, feel free to enlighten me.
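For reference, the buffer figure comes from the three 2 Gbit Hynix modules mentioned earlier in the thread; the back-of-envelope arithmetic below shows they add up to 768 MB, so the ~750 MB figure quoted above is the same three chips, just rounded.

```python
# Three 2 Gbit DRAM chips, as described earlier in this thread.
chips = 3
bits_per_chip = 2 * 1024**3            # 2 Gbit, binary prefix
total_mb = chips * bits_per_chip // 8 // 1024**2
print(total_mb, "MB")                  # -> 768
```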

 

3. A red herring; you're not forced to buy any of those products in the first place.

 

I never mentioned "forcing", that is your rhetoric. You obvously did not understand what I wrote, nor the links I provided, as you don't seem to give me any indication of understanding the problem with sunk cost fallacy. That is not my problem, but it really is not a difficult theory to understand. The implications of the sunk cost fallacy is quite large for consumers though. See my answer to your 10.2 point further down.

 

4. You still used an example of a proprietary system (Apple) that was highly successful, and also gave examples of how it didn't impede the consumer, to demonstrate how proprietary systems are bad.

 

First off, I said Apple might be an exception to the rule, but the sunk cost fallacy applies very much to Apple users as well (like Sonos). Once you are inside the ecosystem of this tech, it gets increasingly more expensive to switch brands, as you would need to replace a lot of other tech at the same time.

 

6. The reverse of your statement is also true: you could go and buy an A-Sync monitor, but then you are locked to using AMD cards until you want a new monitor. You can't use one set of criteria to claim company A is inferior and then use completely different criteria to extol the benefits of another; that is the difference between critical thinking (critiquing) and shit bashing.

 

Do you honestly believe that yourself? Honestly? You do understand the difference between choice and option, right? AMD does NOT have the option to support Nvidia's proprietary G-Sync tech. Nvidia HAS every option to support A-Sync, but chooses NOT TO. There is a world of difference between industry standards and proprietary solutions that block off competition. It's not just AMD that Nvidia blocks off; it's also display controller vendors, and (maybe) the monitor vendors who don't want to pay the high price of the G-Sync module, or maybe just won't be allowed by Nvidia to make a G-Sync product.

That is not the case with A-Sync. Any monitor vendor can support it by buying a display controller from a VESA member (like one of the three now making A-Sync controllers).

Do you honestly not see the difference here? This is not AMD vs. Nvidia; this is Nvidia vs. everyone else, because they won't give anyone access to G-Sync.

 

7. What scientific theorem did you base your arguments on? All I see is shit bashing of Nvidia and G-Sync because a competing product that has yet to be realised is in the wings. That, I am afraid, is the exact opposite of a scientific theorem: it is presenting assumptions you have made as fact.

 

The sunk cost fallacy and loss aversion. I even linked them; I guess you didn't bother clicking the links. The rest is a spec comparison between A-Sync and G-Sync.

 

8. Glad to know you have been reading my posts:

  

At what point did I say the Memory Stick was USB? I didn't. I said it drove people to develop the USB memory stick, I said it was proprietary, I said it wasn't ideal, and I said it was what drove the development of USB storage. And last time I looked, CF, SD and xD were not USB storage, but internal storage designed to be removable.

 

Not so glad to see that you neither read YOUR OWN LINK nor any of mine. You still don't seem to understand that the Sony Memory Stick IS A FLASH MEMORY CARD, exactly like MMC/SD cards. Let me make it easy for you:

[Image: a Sony Memory Stick, front and back.] "Internal storage designed to be removable"? What does this look like?

 

Sorry, but you have not provided ANY proof whatsoever that Sony or their proprietary Memory Stick has driven the development of USB memory. On the contrary, I have very much proved you wrong, not just by showing who is credited with the invention of USB memory, but also by showing that flash cards existed for years before Sony's own proprietary flash card. There is no innovation, nor any development drive, in this whatsoever.

 

No wonder you think point 9 means something.

 

10. As I have stated, you are claiming a product is inferior based on:

 

1. comparison to a product that we haven't yet seen working and can't yet buy.

Not a product, a standard, one that defines a refresh-rate range capable of doing things G-Sync cannot (23.976/24 fps native video playback), at a lower announced price point (for the controllers, not the final product).

 

2. comparison to an investment fallacy that has nothing to do with consumerism.

That is incorrect. The sunk cost fallacy is very much a theory applicable to consumers, and it is even used as a business strategy by companies creating these closed ecosystems. Why do you believe it is not applicable? Because the example in my link was about a company? Here, have some consumer examples then, from a research paper on the psychology of the sunk cost fallacy (there called the sunk cost effect), Arkes and Blumer, 1985:

http://www.researchgate.net/publication/4812596_The_psychology_of_sunk_cost/links/0046351cc39c9efc60000000

If you don't understand how the sunk cost fallacy can apply to consumers, then you obviously do not understand the theory itself.

 

3. an endowment argument about perceived value. You'll never stop fanboys, but that doesn't make everybody incapable of rational judgment, nor does it make the reverse untrue.

I'm not talking about perceived value, but about a spec-to-spec comparison between two competing techs, one proprietary and one an industry standard. Price-wise (from what has been announced), there is another point of comparison. On both points G-Sync is at a disadvantage.

 

4. you say you already know what you need to know, and thus won't actively seek to learn more.

I skim-read the overview of your document. It contains nothing about scientific methodology that I haven't already learned. Correct sourcing is a fundamental necessity in research papers, and I know all I need to know about that for my needs. Your link does not provide anything beyond the education I have already received on the matter. That does not mean I don't want to learn more. Again, a logical fallacy.

 

If you consider that to be rational then we have no more to discuss.

You have yet to prove any irrational behaviour from me. Yet again, you disagreeing or not understanding does not render my arguments irrational.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Video response and some more information about the plans for G-Sync and A-Sync from Nvidia

G-Sync Info starts at 1:26:00


 

snip

 

 

Edit: You know what, I am kinda over this discussion; there are more important things to talk about. I'll come back to it next year and see how things have panned out before commenting again.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

