
Nvidia Pitches Advantages of G-Sync Over AMD's FreeSync

BiG StroOnZ

Easy now, I have not defended the 285 in any way. I agree it's an odd card that underperforms. That does not make the 960 any less overpriced.

As for the second part, I'm not sure what your point is. The largest IT company in the world is Apple, known for its closed ecosystem. You are charged a hefty price premium and usually receive less for it too. Consumers being dumb/sheepish does not negate Apple's behaviour being anti-competitive either.

Your last point simply is not true. If you bought an Nvidia graphics card, G-Sync monitor, Shield, etc., you're not going to switch to AMD just because the next generation favours AMD. Why? Because of the sunk cost invested in the ecosystem. You would have to buy a new monitor, new streaming box, etc. as well. So it's no longer just a case of price/performance, but the replacement of an entire ecosystem. You're simply too invested in it.

So you are stuck either buying a potentially subpar product or paying a high price premium.

Actually most of us would switch because we have no company loyalties and handhelds go out of style every 8 months. What do you mean new streaming box? Either company will work. You're forgetting people buy on the margin.

I'm not forced to do anything. My demand is my demand and I will go with whatever is best all around. Whether or not that's Nvidia's closed system or AMD's hodgepodge will not be determined by the one I'm currently using.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


EU law is warped beyond reason, and there's really no reason for Microsoft to be forced to remove IE and WMP. Together they are a product package, and consumers can install anything else they want. In truth the EU is just far too controlling and biased against US companies.

CUDA reigns as the top scientific computing language because AMD and Intel cannot match it in performance via OpenCL/OpenACC (we examined this extensively in my heterogeneous computing class in which we used all 3 languages on multiple platforms of the same generation and tier). Scientific computing programmers can reimplement an algorithms library in a week. Nvidia has done practically nothing to lock them into the ecosystem.

If the software was made by a third party as CUDA-exclusive, that's on them (Adobe). Nvidia developed a superior standard for the time and it was adopted. Forcing Nvidia to use the same Adaptive Sync-based solutions limits its capability to expand, and ideas it comes up with should not be given away to the competition through VESA standards. Let AMD compete on merits instead of coattails. Also, you completely overblow the cost of switching; people upgrade every 3-5 years anyway.

You've forgotten your fundamental laws of marginal purchasing and demand. The 960 is not overpriced for what it is, or it would barely sell. If AMD's FreeSync is equal or superior at a cheaper price, the market will correct in about 4 years, when everyone will be buying new cards anyway.

My education on markets may not be as expansive as yours, but I challenge your mastery of the fundamentals.

 

I don't much agree with the treatment of MS either, but that is mostly because Google and Apple did not get the same treatment, and they are by far much, much worse. Bias against American companies is a moot point, since all the huge IT companies are US-based anyway.

 

CUDA was just an example. Say OpenCL 2 or 3 turns out better than CUDA; a lot of people will still be stuck using CUDA anyway. But I'm talking about consumers, not professionals. Two very different markets with different needs, use cases, etc.

 

Overblow? A new graphics card + good monitor + streaming box can easily be over $1,000. That is a hell of a lot more than $300 for just a new graphics card. Most people keep their monitors for years, and when they do replace them, they just get a higher-quality one.

 

The last point is exactly what I mean. No, people will not go for FreeSync, even if Adaptive Sync is superior, if they are too invested in Nvidia's closed, proprietary ecosystem. That is the entire point, and also why Nvidia doesn't want to support Adaptive Sync: it would make the market too volatile for their liking. They want consumer loyalty, even if they have to force it.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Easy now, I have not defended the 285 in any way. I agree it's an odd card that underperforms. That does not make the 960 any less overpriced.

 

As for the second part, I'm not sure what your point is. The largest IT company in the world is Apple, known for its closed ecosystem. You are charged a hefty price premium and usually receive less for it too. Consumers being dumb/sheepish does not negate Apple's behaviour being anti-competitive either.

 

Your last point simply is not true. If you bought an Nvidia graphics card, G-Sync monitor, Shield, etc., you're not going to switch to AMD just because the next generation favours AMD. Why? Because of the sunk cost invested in the ecosystem. You would have to buy a new monitor, new streaming box, etc. as well. So it's no longer just a case of price/performance, but the replacement of an entire ecosystem. You're simply too invested in it.

 

So you are stuck either buying a potentially subpar product or paying a high price premium.

Okay, so next time you call out the 960 as being an overpriced card, be sure to include the R9 285 as well, since it's actually worse seeing as it's less efficient while still costing the same and yielding similar performance. 

 

'Usually receive less'... Apple's AIOs/MacBooks (excluding the new one, anyway) are usually priced more or less competitively with other prebuilt manufacturers. Not to mention, part of what you pay for when buying an Apple product is the excellent service you get, something which is pretty much undeniable; Apple's support system is well above that of its competition (Dell, Asus, MSI, etc.). And one of the reasons Apple products sell so well is that they don't require a lot of maintenance over time and they last. My 8-year-old Mac still works perfectly (albeit slow for a power user like myself nowadays, seeing as it only has a C2D, but for most people it would still be plenty).

 

You're partially correct: no one would switch to AMD if the performance benefit was within a few percent; if the performance benefit was well beyond that, then yes, people would switch regardless of the level of investment. Keep in mind, there are two main types of users who will buy into a GPU ecosystem: the average gamer who knows nothing about hardware, and users like those on this forum. The former aren't likely to know enough about either to care/decide either way, and the latter isn't exactly known for making rational decisions that optimize price/performance; see custom loops, yearly upgrade cycles, etc.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


I don't much agree with the treatment of MS either, but that is mostly because Google and Apple did not get the same treatment, and they are by far much, much worse. Bias against American companies is a moot point, since all the huge IT companies are US-based anyway.

CUDA was just an example. Say OpenCL 2 or 3 turns out better than CUDA; a lot of people will still be stuck using CUDA anyway. But I'm talking about consumers, not professionals. Two very different markets with different needs, use cases, etc.

Overblow? A new graphics card + good monitor + streaming box can easily be over $1,000. That is a hell of a lot more than $300 for just a new graphics card. Most people keep their monitors for years, and when they do replace them, they just get a higher-quality one.

The last point is exactly what I mean. No, people will not go for FreeSync, even if Adaptive Sync is superior, if they are too invested in Nvidia's closed, proprietary ecosystem. That is the entire point, and also why Nvidia doesn't want to support Adaptive Sync: it would make the market too volatile for their liking. They want consumer loyalty, even if they have to force it.

No one's locked to CUDA. There are OpenCL alternatives to just about anything and everything. When OpenCL becomes the stronger standard (2.1 with direct STL support and unified memory may get there), I'm happy to have this conversation again. Until such time, you can't use it as an Nvidia anti-competitive mechanism, because it's the best thing on the market.

You don't need to replace the whole computer. $X graphics card. Done. As for the monitor, I don't know anyone other than my dad who keeps a monitor longer than 4 years. In other words, these are already existing, planned expenses.

Your problem is that your premise is built on falsehoods. No one is so invested in the Nvidia ecosystem that the market is impervious to corrections in the medium to long term based on quality changes. Additionally, AMD should work to get FreeSync in alongside G-Sync in some monitors if it wants to compete on raw merit and be such a consumer champion; otherwise it's just as guilty, and there's nothing stopping it from doing just that.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Actually most of us would switch because we have no company loyalties

 

 

Holy moly, another thing we agree on (checks to see if the world is ending). Yes, some of us have no loyalty to one brand or another. We are enthusiasts and get excited over any tech advances. We mostly call it as it is because we are more interested in discussing reality than blindly defending an emotional conviction.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Holy moly, another thing we agree on (checks to see if the world is ending). Yes, some of us have no loyalty to one brand or another. We are enthusiasts and get excited over any tech advances. We mostly call it as it is because we are more interested in discussing reality than blindly defending an emotional conviction.

Indeed, it seems like there are very few of us who target the product and not the brand.


Well, recorded failure rates would suggest that AMD cards are more likely to fail. Take all statistics with a grain of salt, though.

 

http://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/

Not to bash on this, but AMD cards have also been used by miners VERY heavily, so I would expect it to be this way, if not WORSE.

The Vinyl Decal guy.

Celestial-Uprising  A Work In-Progress


Not to bash on this, but AMD cards have also been used by miners VERY heavily, so I would expect it to be this way, if not WORSE.

Yeah, so would I. But AMD could have put it in their return policy that if you have mined heavily with a card you can't return it. It's not consumer friendly, but these aren't server-grade GPUs.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


Not to bash on this, but AMD cards have also been used by miners VERY heavily, so I would expect it to be this way, if not WORSE.

 

 

Yeah, so would I. But AMD could have put it in their return policy that if you have mined heavily with a card you can't return it. It's not consumer friendly, but these aren't server-grade GPUs.

 

I have been thinking about this, and I don't think mining actually pushes a card hard enough to cause a failure within the warranty period anyway (I am sure it would reduce the life of the card, but for it to encroach on the warranty period it would have to be on the order of an 80%+ shorter lifespan). So long as the temperature stays below 100°C, mining should have next to zero effect on RMA statistics.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


The module can be tuned for the exact monitor it is inside of, while AMD does this in the drivers, meaning that the drivers would have to take into account every single FreeSync monitor and successfully pick the configuration that best matches that monitor. I'm not saying that AMD can't do it, but as a software developer I can guarantee that it's a nightmare.

 

That's probably why they do the FreeSync certification program. It's a way to tune their drivers for the most popular monitors, or the ones with the best FreeSync experience. But as that number grows, so do the drivers.

 

 

How would that be any more of a nightmare than a G-Sync module? Assuming the displays do need to be tuned to the module/drivers, it seems like the work would need to be done for each display either way. The difference just seems to be that G-Sync would have that work done ahead of time, before the monitor is released for sale, while the FreeSync variants could have it implemented after the fact (or before, if display makers are proactive).

 

I don't understand why monitor makers trying to implement FreeSync for gamers could not flesh out the proper settings and ship specific profiles tuned to specific monitors. It's not like the settings depend on the specific GPU you are using; they should be general settings that all FreeSync-capable GPUs can implement, so how is that any worse than G-Sync?

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


How would that be any more of a nightmare than a G-Sync module? Assuming the displays do need to be tuned to the module/drivers, it seems like the work would need to be done for each display either way. The difference just seems to be that G-Sync would have that work done ahead of time, before the monitor is released for sale, while the FreeSync variants could have it implemented after the fact (or before, if display makers are proactive).

 

I don't understand why monitor makers trying to implement FreeSync for gamers could not flesh out the proper settings and ship specific profiles tuned to specific monitors. It's not like the settings depend on the specific GPU you are using; they should be general settings that all FreeSync-capable GPUs can implement, so how is that any worse than G-Sync?

 

I'm not saying you'd have to do less tuning with G-Sync. I'm saying that you tune each G-Sync module once for the monitor it's in, and that's it. The next G-Sync module, going into a different monitor, is tuned differently according to that monitor.

 

If you're doing this in a driver on AMD's side (not on the monitor side, but in the GPU drivers), then you still have to tune for each monitor just like G-Sync, but you also have to store every configuration for every monitor in those drivers and be able to select the right tune for the monitor the card is connected to. That is more complex and requires a much larger driver. It would be like G-Sync keeping the tune for every G-Sync monitor inside the module that goes into the ROG Swift, which is unnecessary for G-Sync since you won't be taking the module out and moving it.

 

That's why I was saying that if AMD does tune for specific monitors, I could see them having you download drivers specifically for your monitor. That way you don't get driver bloat. Of course, this isn't necessary if the monitor manufacturers can do it on their side.
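To make the bookkeeping being described concrete, here is a minimal, purely illustrative Python sketch of that kind of driver-side lookup: one tuning profile per certified monitor, keyed by the monitor's EDID vendor/product ID, with a conservative fallback for unknown panels. The profile fields, IDs, and numbers are invented for illustration; this is not AMD's actual driver code.

```python
# Hypothetical sketch of per-monitor variable-refresh tuning profiles
# selected by EDID identity. Field names and values are invented.
from dataclasses import dataclass

@dataclass
class VrrProfile:
    min_hz: int           # lowest refresh rate the panel handles cleanly
    max_hz: int           # highest refresh rate
    overdrive_level: int  # panel overdrive setting used to limit ghosting

# The "driver" would need one entry per certified monitor, and this
# table grows with every new FreeSync display on the market.
PROFILES = {
    ("XYZ", 0x26F1): VrrProfile(min_hz=40, max_hz=144, overdrive_level=2),
    ("ABC", 0x279A): VrrProfile(min_hz=48, max_hz=75,  overdrive_level=1),
}

# Conservative defaults for monitors the driver has never seen.
FALLBACK = VrrProfile(min_hz=48, max_hz=60, overdrive_level=0)

def pick_profile(edid_vendor: str, edid_product: int) -> VrrProfile:
    """Return the tuning profile for the connected monitor, or a safe fallback."""
    return PROFILES.get((edid_vendor, edid_product), FALLBACK)

if __name__ == "__main__":
    print(pick_profile("XYZ", 0x26F1))  # known monitor -> its tuned profile
    print(pick_profile("QQQ", 0x0001))  # unknown monitor -> fallback defaults
```

The point of contention is simply that a table like this has to ship with, and grow alongside, the driver for every new FreeSync monitor, whereas each G-Sync module only ever carries the single configuration for the panel it sits in front of.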

Turnip OC'd to 3Hz on air


It's not G-Sync vs FreeSync, it's ASUS's Swift vs BenQ's and LG's monitors. The sharpness is in the panel, and maybe somewhat in the driver and firmware, both of which can be updated, so I don't see how FreeSync is inferior.


I'm not saying you'd have to do less tuning with G-Sync. I'm saying that you tune each G-Sync module once for the monitor it's in, and that's it. The next G-Sync module, going into a different monitor, is tuned differently according to that monitor.

 

If you're doing this in a driver on AMD's side (not on the monitor side, but in the GPU drivers), then you still have to tune for each monitor just like G-Sync, but you also have to store every configuration for every monitor in those drivers and be able to select the right tune for the monitor the card is connected to. That is more complex and requires a much larger driver. It would be like G-Sync keeping the tune for every G-Sync monitor inside the module that goes into the ROG Swift, which is unnecessary for G-Sync since you won't be taking the module out and moving it.

 

That's why I was saying that if AMD does tune for specific monitors, I could see them having you download drivers specifically for your monitor. That way you don't get driver bloat. Of course, this isn't necessary if the monitor manufacturers can do it on their side.

The way I understand it, the G-Sync module replaces the monitor scaler, so Nvidia only needs "tuning" information for a single scaler on their graphics card side of things (driver updates). Does that also mean G-Sync modules themselves get tuned, by Nvidia(?), for the individual panel they connect to when manufacturers make a G-Sync monitor?

 

So does AMD need to adjust for each scaler that manufacturers provide for their "FreeSync enabled" monitors that their graphics cards will connect to (driver implementation)? Or are the scalers responsible for the panel adjustments in each implementation of the Adaptive Sync functionality, i.e. the manufacturer's adjustment for their own monitor?


It's amusing that PCPer actually did CES coverage on a FreeSync-capable display, the ASUS MG279Q, which has a 1440p IPS panel with a range of 40-120 Hz.

I wonder if this will support my R9 280X.


Actually most of us would switch because we have no company loyalties and handhelds go out of style every 8 months. What do you mean new streaming box? Either company will work. You're forgetting people buy on the margin.

I'm not forced to do anything. My demand is my demand and I will go with whatever is best all around. Whether or not that's Nvidia's closed system or AMD's hodgepodge will not be determined by the one I'm currently using.

 

I don't think you quite understand. This has nothing to do with brand loyalty as such. This is about switching barriers when you are locked into a closed ecosystem (vendor lock-in): http://en.wikipedia.org/wiki/Switching_barriers

 

 

Switching costs affect competition. When a consumer faces switching costs, the rational consumer will not switch to the supplier offering the lowest price if the switching costs in terms of monetary cost, effort, time, uncertainty, and other reasons, outweigh the price differential between the two suppliers. If this happens, the consumer is said to be locked-in to the supplier. If a supplier manages to lock-in consumers, the supplier can raise prices to a certain point without fear of losing customers because the additional effects of lock-in (time, effort, etc.) prevent the consumer from switching.
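As a purely illustrative sketch of the rule in that quote, here is the switch-or-stay comparison in a few lines of Python. All the prices and the switching cost are made-up numbers, not real market data.

```python
# Toy model of the switching-barrier rule quoted above.
# All numbers are invented for illustration only.

def should_switch(price_incumbent: float,
                  price_competitor: float,
                  switching_cost: float) -> bool:
    """Rational buyer switches only if the price saving exceeds the switching cost."""
    saving = price_incumbent - price_competitor
    return saving > switching_cost

# GPU-only comparison: a $50 saving with no locked-in extras -> switch.
print(should_switch(price_incumbent=350, price_competitor=300, switching_cost=0))    # True

# Same $50 saving, but a G-Sync monitor and Shield would also need replacing -> stay.
print(should_switch(price_incumbent=350, price_competitor=300, switching_cost=700))  # False
```

With no locked-in extras, even a small saving is enough to switch; once the ecosystem-specific hardware would also have to be replaced, the same saving no longer is.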

 

 

Vendor lock-in can already be seen in Nvidia's pricing, which is generally quite high. In fact, so high that Nvidia created a new price point with their Titan series, thus forcing up the pricing on their non-Titan cards too.

 

Feel free to read about vendor lock-in as well, as that is what I've been talking about.

 

My point still stands: if you buy into the Nvidia ecosystem (a graphics card, G-Sync monitor, Shield streaming gear), you need to replace all of it if you switch to AMD. It's no longer just a graphics card switch. People don't necessarily buy everything at once, and a lot might have new monitors and old graphics cards as well. I don't know anyone who replaces both graphics card and monitor at the same time.

 

Okay, so next time you call out the 960 as being an overpriced card, be sure to include the R9 285 as well, since it's actually worse seeing as it's less efficient while still costing the same and yielding similar performance. 

 

'Usually receive less'... Apple's AIOs/MacBooks (excluding the new one, anyway) are usually priced more or less competitively with other prebuilt manufacturers. Not to mention, part of what you pay for when buying an Apple product is the excellent service you get, something which is pretty much undeniable; Apple's support system is well above that of its competition (Dell, Asus, MSI, etc.). And one of the reasons Apple products sell so well is that they don't require a lot of maintenance over time and they last. My 8-year-old Mac still works perfectly (albeit slow for a power user like myself nowadays, seeing as it only has a C2D, but for most people it would still be plenty).

 

You're partially correct: no one would switch to AMD if the performance benefit was within a few percent; if the performance benefit was well beyond that, then yes, people would switch regardless of the level of investment. Keep in mind, there are two main types of users who will buy into a GPU ecosystem: the average gamer who knows nothing about hardware, and users like those on this forum. The former aren't likely to know enough about either to care/decide either way, and the latter isn't exactly known for making rational decisions that optimize price/performance; see custom loops, yearly upgrade cycles, etc.

 

Depends on where in the world you live. We have excellent consumer protection laws in Denmark, and to some extent in the EU as well. I can tell you, Apple has been slammed a lot more than those other companies you mentioned for not abiding by consumer laws. So I cannot agree with you on that one, as it differs geographically.

People generally buy Apple because of the nice design, and because OS X/iOS is usually easier for noobs to use. And that is OK. Apple managed to make information technology sexy, and they got a gigantic chunk of the market for it.

 

Samsung is probably the best. Your TV breaks, they come and collect it, lend you a TV in the meantime, fix yours, bring it back, and collect the loaner. I don't think you have that service level in the US?

 

For your last point, read further up about switching barriers. But let me ask you: do you think the average user is more or less inclined to get locked into a closed ecosystem?

 

 

You don't need to replace the whole computer. $X graphics card. Done. As for the monitor, I don't know anyone other than my dad who keeps a monitor longer than 4 years. In other words, these are already existing, planned expenses.

Your problem is that your premise is built on falsehoods. No one is so invested in the Nvidia ecosystem that the market is impervious to corrections in the medium to long term based on quality changes. Additionally, AMD should work to get FreeSync in alongside G-Sync in some monitors if it wants to compete on raw merit and be such a consumer champion; otherwise it's just as guilty, and there's nothing stopping it from doing just that.

 

Who said anything about the whole computer? I'm talking about Nvidia-specific tech that stops working the second you use an AMD card. So it's not just a new "graphics card. Done."

Just because you disagree does not make my premise a falsehood. Stop being arrogant. The pricing Nvidia has followed over the last couple of years supports my claim, as does all the added proprietary tech Nvidia keeps coming out with. It is sad that you cannot see or understand this happening to the average consumer (as well as enthusiasts).

 

How many in here with a G-Sync monitor would buy an AMD graphics card, even if it was cheaper and better performing?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


They have to launch the extra features or FreeSync will destroy G-Sync.


@Notional You're deluded. The prices of Nvidia's other cards have dropped since Kepler, though I suspect a small rise with Pascal due to the use of HBM, but no matter. Nvidia's pricing is fair, or they'd attract no new customers. Your idea of vendor lock-in generally only applies to markets with inelastic demand, and graphics tech has elastic demand. There's no barrier to switching: you can have multiple GPUs in a system, so if you want to stream to your Shield, keep your old Nvidia card.

I don't know anyone who DOESN'T replace both the monitor and GPU at the same time, but of course the more intelligent route has always been buying around the $250-300 sweet spot to get the latest ports and good display tech, and then in 3-4 years buying, at the same price, a new one that's just as good as the $1000 one others bought the first time around.

I'm sure the cost of those Samsung TVs is greatly increased in Denmark due to the extra costs incurred by that service which is unnecessarily imposed. There's no free lunch.

My disagreement is solely because your premise is a falsehood. It's not a bi-directional relationship. Nvidia is competing but not letting AMD in on the action by riding on Nvidia's coattails. There's nothing wrong with that.

Also, I have the ACER 1440p G-Sync monitor and I'd happily switch to AMD GPUs if the performance was better by a fair (10%+) margin. You're just being a shallow AMD apologist. Intel and Nvidia earned their market positions and maintain them fair and square. AMD's 300 series pricing is also rising, to $700+ for the 390X, in response. It's a fine line to walk between undercutting your competition and looking cheap. It seems AMD finally understands that.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


They have to launch the extra features or FreeSync will destroy G-Sync.

No it won't. FreeSync still has visible downsides and worse ghosting problems than GSYNC in most cases. There were always going to be limits to what software can do. Even if you can emulate every feature, there will be a performance hit as resources are used to do it. GSync modules store the algorithms at the hardware level and the algorithms can be changed because it's an FPGA. GSYNC can always evolve on the same hardware until the module is totally filled and can fit no more, and then software can be added on top, giving a minimal performance hit vs. AMD's ever growing software-based solution to the problem.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


@Notional

I don't know anyone who DOESN'T replace both the monitor and GPU at the same time...

Wow, more intellectual dishonesty. You're on a roll, man.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Wow, more intellectual dishonesty. You're on a roll, man.

If there are people on this site who don't, I'm unaware of them or their habits. It's how my family and friends roll.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I have been thinking about this, and I don't think mining actually pushes a card hard enough to cause a failure within the warranty period anyway (I am sure it would reduce the life of the card, but for it to encroach on the warranty period it would have to be on the order of an 80%+ shorter lifespan). So long as the temperature stays below 100°C, mining should have next to zero effect on RMA statistics.

The cards miners did use weren't cool cards, though; they were reference 290Xs, and they ran really hot and loud while consuming large amounts of juice. I would think continuous power draw and 100% usage would kill a card much quicker than otherwise, and not just the GPU but the memory too. I don't really know, but I am sure the memory isn't meant to be under full load for extended periods of time.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


 I don't think mining actually pushes a card hard enough

100% usage forever...I would say that probably has some effects. 

 

Also, the fans are the larger issue with this: they seize up because they are spinning at max RPM for literally the whole time (you do realize some cards have a lifetime warranty, right?) and stop working, and then the GPUs overheat.

The Vinyl Decal guy.

Celestial-Uprising  A Work In-Progress


@Notional You're deluded. The prices of Nvidia's other cards have dropped since Kepler, though I suspect a small rise with Pascal due to the use of HBM, but no matter. Nvidia's pricing is fair, or they'd attract no new customers. Your idea of vendor lock-in generally only applies to markets with inelastic demand, and graphics tech has elastic demand. There's no barrier to switching: you can have multiple GPUs in a system, so if you want to stream to your Shield, keep your old Nvidia card.

I don't know anyone who DOESN'T replace both the monitor and GPU at the same time, but of course the more intelligent route has always been buying around the $250-300 sweet spot to get the latest ports and good display tech, and then in 3-4 years buying, at the same price, a new one that's just as good as the $1000 one others bought the first time around.

I'm sure the cost of those Samsung TVs is greatly increased in Denmark due to the extra costs incurred by that service which is unnecessarily imposed. There's no free lunch.

My disagreement is solely because your premise is a falsehood. It's not a bi-directional relationship. Nvidia is competing but not letting AMD in on the action by riding on Nvidia's coattails. There's nothing wrong with that.

Also, I have the ACER 1440p G-Sync monitor and I'd happily switch to AMD GPUs if the performance was better by a fair (10%+) margin. You're just being a shallow AMD apologist. Intel and Nvidia earned their market positions and maintain them fair and square. AMD's 300 series pricing is also rising, to $700+ for the 390X, in response. It's a fine line to walk between undercutting your competition and looking cheap. It seems AMD finally understands that.

 

I'm at a loss. Deluded? Because you don't agree with me on something you don't seem to grasp even on a basic level WITH sources?

 

It is not "my idea" of vendor lock in. It is established academia that I rely on the matter. Price elasticity has nothing to do with the matter. That is the entire point with vendor lock in, that switching barriers annul normal market competition mechanics. How is this so difficult to understand?

 

Vendor lock-in has nothing to do with riding on coattails. That would not be an issue if you create industry standards: you share the R&D and also the results of it, which gives broader adoption in the market and benefits everyone. Nvidia's proprietary ways only benefit Nvidia, at the cost of their own consumers as well as AMD's (and even Intel's).

 

AFAIK Nvidia has actively locked out the use of their graphics cards if AMD cards are present; see the debacle over using an Nvidia card as a dedicated PhysX card in an AMD system. I doubt it would work with Shield streaming.

 

Samsung costs the same as LG, Toshiba, etc., and is cheaper than Sony in general. Prices are higher than in the US, but so are incomes, even at the low end. Either way, service is not a pro for Apple in Denmark (or necessarily the EU). That was the point.

 

LOL, Intel deserves their position? Come on, even you know about all the antitrust court cases against Intel. Even the Pentium 4 case has been on the WAN Show:

http://www.nytimes.com/2014/06/13/business/international/european-court-upholds-1-06-billion-fine-against-intel.html?_r=0 (antitrust towards AMD)

Or the extremely fishy situation where OriginPC dropped AMD for a while to be Nvidia Exclusive?

http://semiaccurate.com/2013/10/07/nvidias-program-get-oems-like-origin-pc-dump-amd-called-tier-0/ (criticize the source all you want, that origin pc debacle was fishy as fuck).

 

http://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs.html (No AA on AMD GPU's on TWIWMTBP title)

 

These companies fight dirty, and closed ecosystems are definitely one of their tactics. Not my problem that you do not understand that.

 

Most people don't pay $100-200 extra for G-Sync support and then go buy a graphics card that cannot use it. You might, but that is not representative of the general market or of general consumer behaviour.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


No it won't. FreeSync still has visible downsides and worse ghosting problems than GSYNC in most cases. There were always going to be limits to what software can do. Even if you can emulate every feature, there will be a performance hit as resources are used to do it. GSync modules store the algorithms at the hardware level and the algorithms can be changed because it's an FPGA. GSYNC can always evolve on the same hardware until the module is totally filled and can fit no more, and then software can be added on top, giving a minimal performance hit vs. AMD's ever growing software-based solution to the problem.

In what cases? There is literally ONE review of ONE monitor, and that is "most cases" to you? AMD has no control over the range that manufacturers decide to use, and yet somehow you're blaming the ghosting on FreeSync. Seeing as you claimed that no FreeSync monitor has multiple inputs, it's safe to say you're also pulling this out of your ass.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


I personally find it amusing how much Tom contradicts his own words and ties things like V-Sync into problems with ghosting.

 

Nvidia, the way you're meant to be extorted.

 

 

I think you need to find some reading comprehension skills.  Just because the two points were in adjoining sentences doesn't mean they're "tied" together. 

