
AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

 

There's a lot of economics we could get into on this subject, but in the end it's a question of the money we make being spent on items we hope to get a certain amount of enjoyment from, with many factors over time extending or paring back that return on our investment.

 

A good post. I feel this last sentence is the crux of it. Money is the driving force of all advancement; that is, no company will invest in tech if it can't get a big enough return on it (a matter of survival, not greed). So while consumers feel oppressed or dictated to because a company maintains tight control over its products, we have to remember that just as we want to get every last drop of value out of our dollar, companies need to make enough money from their products to cover the cost of developing current and new technology. This is where the balance between closed proprietary IP, licensed IP and open source comes into play.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Can you explain this further?

Intel would be forced to sell the x86 license to a competitor, or worse, get split into two companies. Neither situation is beneficial to Intel. Nvidia actually wants in on it, and I'm sure IBM would love to apply its Power 8 IP to a product with a much bigger user base. Apple integrates vertically as best it can, so being able to maintain compatibility with existing software while kicking Intel to the curb would be great for Apple.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So what's the performance delta without HairWorks? It shouldn't have touched or affected anything in the game outside of the parts that use GameWorks.

I can also say that I remember when TressFX launched and I didn't use it because it ran like shit on Nvidia. It might not be the exact same scenario, but it's pretty close.


Can't wait for the antitrust lawsuit that's inevitably on the horizon; it just needs that first step from someone with the facts.

GameWorks was conceived, designed and delivered with one goal in mind: sabotage.

For now, even if I do use an Nvidia card, I'm not touching anything with GameWorks in it.

It's just like anything else that was ever brand-specific. I'm not saying this is how it should be with vendor-specific stuff. However, that is how it is, and you can't really fault anyone for it since it's their tech.

Of the two options on the table (no HairWorks, or HairWorks optimized for newer Nvidia cards), I for one would like to see new tech used even if I couldn't utilize it.

This makes me think of the early days of DX10 and Vista. A bunch of titles were coming out using DX10, 64-bit, and multithreading. At the time there was a very short list of cards, all of them top tier, that were DX10 capable, and you needed Vista to have access to DX10. When it came to 64-bit, XP was an unsupported joke, so Vista was your only option. All of these things enabled a bunch of amazing features (at the time), and you needed a very specific combination of hardware and software to actually utilize them. At least in this case the percentage of people who can use it is much higher, and nobody is locked out of the game entirely.


But he is right, that's exactly what happened; even my Nvidia GPU runs badly, it ruined the game. Any gamer with half a working brain should boycott Nvidia's GameWorks, even if it won't have much effect anyway.

Any game with GameWorks has been pretty much ruined.

Is that the fault of GameWorks directly, though? Or is it how the developer handled it? What about the possibility of the community creating a stigma that in turn ruins the game?


Wasn't there the same issue the other way around with TressFX (the hair in Tomb Raider)?

Yep.

Ideally, neither AMD nor Nvidia should be producing technologies which perform proprietary graphics calculations/algorithms. The case for AMD doing it is somewhat better due to them open-sourcing all their innovations; however, it probably still shouldn't be a thing.

Hardware manufacturers should just build hardware and the drivers which implement specific APIs; the fact that GPU makers have to release and optimise drivers for specific applications is a broken situation in itself. Hopefully DX12 and Vulkan will help with this, but I'm not particularly hopeful.

If GameWorks were a product of a third party, that third party could make sure their algorithms worked well on all GPU vendors, perhaps switching rendering technologies depending on the hardware available, e.g. CUDA vs OpenCL.
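For what it's worth, here is a minimal sketch of that idea (C++; the backend classes and probe functions are entirely hypothetical placeholders, not any real GameWorks, TressFX, CUDA or OpenCL API): the effects library codes against a common interface and decides at run time which compute path to use.

```cpp
#include <iostream>
#include <memory>

// Common interface the game codes against; the vendor-specific implementation
// is chosen at run time instead of being hard-wired to one GPU vendor.
struct HairSimBackend {
    virtual ~HairSimBackend() = default;
    virtual const char* name() const = 0;
    virtual void simulateFrame() = 0;
};

// Hypothetical CUDA-based path (stubbed out here).
struct CudaBackend : HairSimBackend {
    const char* name() const override { return "CUDA"; }
    void simulateFrame() override { /* CUDA kernels would run here */ }
};

// Hypothetical OpenCL-based path (stubbed out here).
struct OpenCLBackend : HairSimBackend {
    const char* name() const override { return "OpenCL"; }
    void simulateFrame() override { /* OpenCL kernels would run here */ }
};

// Stub probes: a real library would query the CUDA runtime / OpenCL platform
// list here. Hard-coded values keep this sketch self-contained.
bool cudaAvailable()   { return false; }
bool openclAvailable() { return true;  }

std::unique_ptr<HairSimBackend> pickBackend() {
    if (cudaAvailable())   return std::make_unique<CudaBackend>();
    if (openclAvailable()) return std::make_unique<OpenCLBackend>();
    return nullptr;  // fall back to a CPU path, or disable the effect
}

int main() {
    if (auto backend = pickBackend()) {
        std::cout << "Hair simulation using " << backend->name() << " backend\n";
        backend->simulateFrame();
    } else {
        std::cout << "No GPU compute backend available; effect disabled\n";
    }
}
```

The point is only that the "which vendor's tech do we use" decision would live inside the neutral middleware rather than being baked into the game by a sponsorship deal.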

Intel got a *huge* fine (though it was still far less than the amount they profited by) for making their x86 C++ compiler cripple performance on AMD chips, and I am finding it difficult to see a difference between that and what GameWorks does.
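For readers who don't know that case: the issue was that the compiler's runtime dispatcher picked code paths based on the CPU vendor string rather than on the feature flags the CPU actually reports. A rough illustrative sketch of the distinction (not Intel's actual dispatcher code) looks like this:

```cpp
#include <cpuid.h>   // GCC/Clang wrapper for the x86 CPUID instruction
#include <cstring>
#include <iostream>
#include <string>

// Read the 12-byte CPU vendor string ("GenuineIntel", "AuthenticAMD", ...).
static std::string cpuVendor() {
    unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    char v[13];
    std::memcpy(v + 0, &ebx, 4);
    std::memcpy(v + 4, &edx, 4);
    std::memcpy(v + 8, &ecx, 4);
    v[12] = '\0';
    return v;
}

int main() {
    // A fair dispatcher branches on feature bits (does the CPU report SSE2,
    // AVX, ...?). The behaviour Intel was penalised over amounted to branching
    // on the vendor string instead, so non-Intel CPUs were routed to the slow
    // generic path even when they supported the fast instructions.
    if (cpuVendor() == "GenuineIntel")
        std::cout << "taking the optimised SSE/AVX code path\n";
    else
        std::cout << "taking the generic fallback path\n";
}
```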

TressFX is open source and runs well(ish) on both hardware platforms. Granted, it probably doesn't do everything HairWorks does, but being open source, Nvidia could have easily contributed to the project, benefiting the industry as a whole.

The thing is, Nvidia creates this stuff to take advantage of their hardware because other people aren't, and they want to show it off. Ideally the game developers would already be doing this stuff in their games, or the engine developers would have it built in and cross-platform.


What is the source on free PhysX and CUDA?

And have you ever heard of http://en.m.wikipedia.org/wiki/Embrace,_extend_and_extinguish ?

 

Interesting link. Problem is, there are a few people here who won't fully understand the implications and will try to use it to support any argument they make.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


What is the source on free PhysX and CUDA?

And have you ever heard of http://en.m.wikipedia.org/wiki/Embrace,_extend_and_extinguish ?

 

I'd never heard of this specific term, but the result and the inherent ideology is creating vendor lock-in. For an industry that functions best on industry standards, it really is a problem for everyone.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I'd never heard of this specific term, but the result and the inherent ideology is creating vendor lock-in. For an industry that functions best on industry standards, it really is a problem for everyone.

Doesn't really apply to this scenario. Also, it still works, unlike what's cited there, which doesn't work at all unless you have the thing that can read it.

Interesting link. Problem is, there are a few people here who won't fully understand the implications and will try to use it to support any argument they make.

 

EEE is a Microsoft staple; it's what they are doing right now with Windows 10 by allowing Android and iOS apps to be easily ported over (natively to boot, none of this emulation nonsense).

 

EEE, however, doesn't really apply in this scenario. The only thing that applies here is AMD not being rich enough to support game development on the level that Nvidia can, and I think it's pretty hypocritical to judge Nvidia for putting their resources to good use (as opposed to dank memes, stupid ads and lots of hype).


EEE, however, doesn't really apply in this scenario. The only thing that applies here is AMD not being rich enough to support game development on the level that Nvidia can, and I think it's pretty hypocritical to judge Nvidia for putting their resources to good use (as opposed to dank memes, stupid ads and lots of hype).

Well, that's the typical narrow vision of a fanboy who points the problem at the party that has no power... you know Intel hardware is also affected by this, and they are rich enough to buy AMD & NVIDIA while supporting game development and juggling fireballs, so no, it's not about being rich enough.

No one criticizes NVIDIA for their optimizations, just like AMD does. No one criticizes NVIDIA for helping game developers, just like AMD does. No one criticizes NVIDIA for their differentiation through this kind of software, just like AMD does.

What is criticized is the penalty a game takes on a huge segment of the market, NVIDIA cards included, for nothing other than over-tessellation that may or may not be used with malicious intent. This is arguable since it's NVIDIA's product, and they will always say "Well, it has the level of tessellation we think is necessary for the intended effect, even if it takes a toll on our own older hardware", and we will never know, because everything is locked under NDA contracts. One thing we do know: NVIDIA isn't interested in optimizing the code, or even letting players choose the level of tessellation required, or using dynamic tessellation (close vs far distances). But again, it's their black-boxed code wrapped under a tight NDA contract, so they don't have to do shit.
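To illustrate what "dynamic tessellation (close vs far distances)" could mean in practice, here is a hypothetical sketch (C++; the constants and the function are invented for illustration, since HairWorks' actual code is not public): the tessellation factor falls off with camera distance and is clamped to a user-selectable cap.

```cpp
#include <algorithm>
#include <iostream>

// Hypothetical per-patch tessellation factor: full detail up close, falling
// to a single subdivision far away, never exceeding a user-chosen cap
// (e.g. 8x, 16x, 64x). Not how HairWorks is actually implemented.
float tessellationFactor(float distanceToCamera, float userCap) {
    const float nearDist = 2.0f;   // assumed: full detail closer than this (metres)
    const float farDist  = 30.0f;  // assumed: minimum detail beyond this
    float t = (distanceToCamera - nearDist) / (farDist - nearDist);
    t = std::clamp(t, 0.0f, 1.0f);             // 0 = near, 1 = far
    return std::max(1.0f, (1.0f - t) * userCap);
}

int main() {
    for (float d : {1.0f, 5.0f, 15.0f, 40.0f})
        std::cout << "distance " << d << " m -> factor x"
                  << tessellationFactor(d, 64.0f) << "\n";
}
```

Exposing something like userCap as an in-game slider is, from the outside, roughly what AMD's driver-level tessellation override ends up doing anyway.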

 

What is a shame is that game developers accept such conditions.

 

Now imagine if AMD were a bit more closed source and well funded and decided to swing dicks around... and Intel joined the party as well... yay for shitty experiences for everyone! "Who gives a fuck! If everyone were rich enough to support... oh wait... they are... huh..."

The shit fanboys say... their logic is just... wow...


Who said I'm a fanboy? Because I don't worship at the feet of Huddy, I am a fanboy? Okay. 

Nice dodging, old chap!

So if I believe an official source because it's exposed to liability, then I worship it? :)

 

The two times I believed the man (Mantle being present in DX12 and in OpenGL), it turned out to be true... is that worshiping?

 

You just look desperate.

 

Edit: BTW I know Huddy is also a bullshitter, I mean... we are grown boys here.


Well, that's the typical narrow vision of a fanboy who points the problem at the party that has no power... you know Intel hardware is also affected by this, and they are rich enough to buy AMD & NVIDIA while supporting game development and juggling fireballs, so no, it's not about being rich enough.

No one criticizes NVIDIA for their optimizations, just like AMD does. No one criticizes NVIDIA for helping game developers, just like AMD does. No one criticizes NVIDIA for their differentiation through this kind of software, just like AMD does.

What is criticized is the penalty a game takes on a huge segment of the market, NVIDIA cards included, for nothing other than over-tessellation that may or may not be used with malicious intent. This is arguable since it's NVIDIA's product, and they will always say "Well, it has the level of tessellation we think is necessary for the intended effect, even if it takes a toll on our own older hardware", and we will never know, because everything is locked under NDA contracts. One thing we do know: NVIDIA isn't interested in optimizing the code, or even letting players choose the level of tessellation required, or using dynamic tessellation (close vs far distances). But again, it's their black-boxed code wrapped under a tight NDA contract, so they don't have to do shit.

 

What is a shame is that game developers accept such conditions.

 

Now imagine if AMD were a bit more closed source and well funded and decided to swing dicks around... and Intel joined the party as well... yay for shitty experiences for everyone! "Who gives a fuck! If everyone were rich enough to support... oh wait... they are... huh..."

The shit fanboys say... their logic is just... wow...

Key thing here is that it's an additional feature and they allow you to turn it off. These additional things are meant for people who already have rocking rigs and can enable this on top of high settings for the rest of the game. Just because someone has one decent card doesn't mean they can max out the game with all options checked and have a playable framerate. Tech like this has been and will always be about marketing. Do consoles have HairWorks? No they don't, so that's already one thing to market on.

I don't criticize either of them and have no problem with it. It's how the market is, and if you want it to change it's going to take a lot to do it. That is, there is nothing wrong with what Nvidia or AMD are doing. In Nvidia's eyes they are giving a better experience to their customers and offering something that wouldn't have been in the game without their help. Why would they care, as a company, about how well it does or doesn't run on a competitor's hardware? To me that is up to the dev or the competitor to deal with if they want that platform to be utilized the best it can be.

Yeah, Nvidia's launch driver only optimized for newer cards. What is their tessellation performance in comparison? Also, is there possibly a driver coming to address Kepler and even Fermi? No one but Nvidia knows. I think it should have been supported better from day 1, but maybe they have a reason?

As for what Nvidia could have allowed us to adjust, they may not have known that people would want to adjust the quality, especially since for the majority on and off is more than enough, and most of them will likely have it off, unsurprisingly to me. This is something that could potentially be patched into the game if there was enough demand, I imagine. I can also say that most of the stuff like this in the past didn't have settings; for the most part it's usually on or off, sometimes a generic high, low, or off with no idea what they mean. Just because it has been that way, I'm not saying it should stay that way, but it does set a precedent.

They're short on change and can easily add features to the game and gain marketing material as well as a marketing ally; wouldn't you do it? I mean, they already said that money from consoles allowed the game to be where it is today, did they not?

That's all up to the devs to allow. They could say no brand-locked stuff etc.; the thing is, this isn't brand-locked. It runs on competitor hardware, just not well. Now, lots of people are saying this is on purpose, but unless someone actually finds out empirically there is no way of truly knowing.

I'll agree with "the shit fanboys say", about everything ever. I try to always stay neutral and give companies the benefit of the doubt unless there is evidence proving otherwise.

The last thing I'll say is that very few people just run out and buy a new GPU when their GPU doesn't do great with the latest tech or on benchmarks. They'll keep it until it either can't play stuff or can't play the games they want on what they consider decent settings. This is once again a case of the few making a huge ruckus. Additionally, how many of these people actually have the game and said affected hardware? I could go even further into this, but it's another issue altogether.


Can't win the hardware war, might as well win the PR war. 

 

Although the reason why isn't clear: they lost the high-end GPU market, but they remain competitive in mid-range and better in low-range GPUs. Those two segments make up most of the market; not everyone buys a 980. Plus, even the 290X in CrossFire is only slightly behind SLI 980s at 4K resolutions (the gap between them gets smaller at higher resolutions), the biggest difference being power draw, although considering you can get two 290Xs for the price of roughly one 980, you will never make up that difference in energy savings. Not to mention their drivers are now at least on par with Nvidia's; they really don't seem to be in that bad of a position.

 

The only reason I can think of for why they aren't doing as well, especially at the lower end, is that everyone just recommends Nvidia hardware. The strangest thing is how many people suggest the 750 Ti, when the R7 265 slightly outperforms it at a lower price and the R9 270 blows it out of the water for only $10-15 more. The 750 Ti sits in an incredibly awkward middle ground, getting pushed out of its market on both ends, yet for some reason it is still bought and recommended by a huge number of people.

 

On another note AMD is gaining market share in professional grade GPUs rather quickly so they do have that going for them.

 

That is just the GPU market, though; they are kinda piss out of luck with CPUs. Only at the cheaper price points do AMD CPUs make sense, like $100 for the FX-6300. They don't have much to go up against the better i5s and up, though.


Key thing here is that it's an additional feature and they allow you to turn it off. These additional things are meant for people who already have rocking rigs and can enable this on top of high settings for the rest of the game. Just because someone has one decent card doesn't mean they can max out the game with all options checked and have a playable framerate. Tech like this has been and will always be about marketing. Do consoles have HairWorks? No they don't, so that's already one thing to market on. - I agree 100%, but then again the developer should market it as an extra, and not as a main feature of the game, else it is misleading to people who buy the game.

I don't criticize either of them and have no problem with it. It's how the market is, and if you want it to change it's going to take a lot to do it. That is, there is nothing wrong with what Nvidia or AMD are doing. In Nvidia's eyes they are giving a better experience to their customers and offering something that wouldn't have been in the game without their help. Why would they care, as a company, about how well it does or doesn't run on a competitor's hardware? To me that is up to the dev or the competitor to deal with if they want that platform to be utilized the best it can be. - I agree 100% with you, yet I think the market adjusts itself, and we clearly see the adoption of such proprietary features: their traction over the years is just terrible. Like you said, it's fueled by marketing, nothing else. That's why I point to the developers as the ones to blame: why would they use something that they cannot optimize for their own clients? Why would they leave their product's optimization in the hands of an IHV? Makes no sense to me, even with the financial boost they get.

Yeah, Nvidia's launch driver only optimized for newer cards. What is their tessellation performance in comparison? Also, is there possibly a driver coming to address Kepler and even Fermi? No one but Nvidia knows. I think it should have been supported better from day 1, but maybe they have a reason?

As for what Nvidia could have allowed us to adjust, they may not have known that people would want to adjust the quality, especially since for the majority on and off is more than enough, and most of them will likely have it off, unsurprisingly to me. This is something that could potentially be patched into the game if there was enough demand, I imagine. I can also say that most of the stuff like this in the past didn't have settings; for the most part it's usually on or off, sometimes a generic high, low, or off with no idea what they mean. Just because it has been that way, I'm not saying it should stay that way, but it does set a precedent. - Well, this comes down to the PC experience... if the goal is to strip people of options then we are merging into some "console-ish" environment, where people have no choice and they get what someone else thinks is the best option for them. I'm not saying it's bad or good. It is what it is. We have no information on whether this is a game development choice or a feature of GameWorks itself.

They're short on change and can easily add features to the game and gain marketing material as well as a marketing ally; wouldn't you do it? I mean, they already said that money from consoles allowed the game to be where it is today, did they not?

 

That's all up to the devs to allow. They could say no brand-locked stuff etc.; the thing is, this isn't brand-locked. It runs on competitor hardware, just not well. Now, lots of people are saying this is on purpose, but unless someone actually finds out empirically there is no way of truly knowing.

 

I'm a product manager, and I've worked with a lot of known brands. I would never let the quality of my product be in the hands of any third party; and trust me, I've dealt with my share of sponsorships, both on the client side and on the agency side. If it's symbiotic, I surely would! If my product and my clients were the only parties who gained from it? I surely would! If I had clients with a subpar experience for the sake of a sponsor? Pardon my French, fuck no. At the end of the day, they are MY CLIENTS using MY PRODUCT, so it's our brand in the line of fire when people spend their hard-earned money. I don't work for shareholders; those are the first ones to fuck off when the shit hits the fan.

I'll agree with "the shit fanboys say", about everything ever. I try to always stay neutral and give companies the benefit of the doubt unless there is evidence proving otherwise. - Amen!

The last thing I'll say is that very few people just run out and buy a new GPU when their GPU doesn't do great with the latest tech or on benchmarks. They'll keep it until it either can't play stuff or can't play the games they want on what they consider decent settings. This is once again a case of the few making a huge ruckus. Additionally, how many of these people actually have the game and said affected hardware? I could go even further into this, but it's another issue altogether. - Another story, for another topic :)

 

I addressed some of the points in your post, but mostly I agree with you.

 

 

Although the reason why isn't clear: they lost the high-end GPU market, but they remain competitive in mid-range and better in low-range GPUs. Those two segments make up most of the market; not everyone buys a 980. Plus, even the 290X in CrossFire is only slightly behind SLI 980s at 4K resolutions (the gap between them gets smaller at higher resolutions), the biggest difference being power draw, although considering you can get two 290Xs for the price of roughly one 980, you will never make up that difference in energy savings. Not to mention their drivers are now at least on par with Nvidia's; they really don't seem to be in that bad of a position.

 

 

The funny part is that the 290X is the 780 Ti competitor from late 2013, and from the benchmarks I saw, the 290X is kicking ass in that respect, and it's way cheaper than the 780 Ti tbh. The 980's competition comes in June :)


Well, that's the typical narrow vision of a fanboy who points the problem at the party that has no power... you know Intel hardware is also affected by this, and they are rich enough to buy AMD & NVIDIA while supporting game development and juggling fireballs, so no, it's not about being rich enough. You missed the bit where he said "on the same level as Nvidia", meaning they both do it; it's just that Nvidia can afford to do it more. That's not an attack or criticism, that's just reality.

No one criticizes NVIDIA for their optimizations, just like AMD does. No one criticizes NVIDIA for helping game developers, just like AMD does. No one criticizes NVIDIA for their differentiation through this kind of software, just like AMD does. No one is criticizing AMD for optimization or for helping game devs. We are mostly criticizing the blame game and the AMD fanboys who want to make it everybody else's fault instead of accepting that it most likely is simply AMD not being able to afford to keep up.

What is criticized is the penalty a game takes on a huge segment of the market, NVIDIA cards included, for nothing other than over-tessellation that may or may not be used with malicious intent. This is arguable since it's NVIDIA's product, and they will always say "Well, it has the level of tessellation we think is necessary for the intended effect, even if it takes a toll on our own older hardware", and we will never know, because everything is locked under NDA contracts. One thing we do know: NVIDIA isn't interested in optimizing the code, or even letting players choose the level of tessellation required, or using dynamic tessellation (close vs far distances). But again, it's their black-boxed code wrapped under a tight NDA contract, so they don't have to do shit. Except it can be turned off, and any reviewer worth his salt will make that clear in benchmarks; if they don't, then they are at fault for biased reviews and misleading the public, not the game dev for giving you the option to use a feature that only works on some cards, nor is it Nvidia's fault for making it possible in the first place. Your argument comes awfully close to the "guns kill people" argument.

 

What is a shame is that game developers accept such conditions. Again, they don't; you can turn the feature off, and as an AMD user you can change the tessellation level to minimize the performance hit.

 

Now imagine if AMD were a bit more closed source and well funded and decided to swing dicks around... and Intel joined the party as well... yay for shitty experiences for everyone! "Who gives a fuck! If everyone were rich enough to support... oh wait... they are... huh..." You mean how it used to be, when the companies that innovated, built the best products and catered to their consumers' needs best were the ones that survived? Yeah, I'd hate to go back to those days.

The shit fanboys say... their logic is just... wow... Don't throw stones when you live in a glass house.

 

Replied in your quote.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Well, that's the typical narrow vision of a fanboy who points the problem at the party that has no power... you know Intel hardware is also affected by this, and they are rich enough to buy AMD & NVIDIA while supporting game development and juggling fireballs, so no, it's not about being rich enough.

No one criticizes NVIDIA for their optimizations, just like AMD does. No one criticizes NVIDIA for helping game developers, just like AMD does. No one criticizes NVIDIA for their differentiation through this kind of software, just like AMD does.

What is criticized is the penalty a game takes on a huge segment of the market, NVIDIA cards included, for nothing other than over-tessellation that may or may not be used with malicious intent. This is arguable since it's NVIDIA's product, and they will always say "Well, it has the level of tessellation we think is necessary for the intended effect, even if it takes a toll on our own older hardware", and we will never know, because everything is locked under NDA contracts. One thing we do know: NVIDIA isn't interested in optimizing the code, or even letting players choose the level of tessellation required, or using dynamic tessellation (close vs far distances). But again, it's their black-boxed code wrapped under a tight NDA contract, so they don't have to do shit.

 

What is a shame is that game developers accept such conditions.

 

Now imagine if AMD were a bit more closed source and well funded and decided to swing dicks around... and Intel joined the party as well... yay for shitty experiences for everyone! "Who gives a fuck! If everyone were rich enough to support... oh wait... they are... huh..."

The shit fanboys say... their logic is just... wow...

It doesn't affect Intel hardware. Intel's iGPUs post-IvyBridge all support CUDA 6.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It doesn't affect Intel hardware. Intel's iGPUs post-IvyBridge all support CUDA 6.

 

And even if they didn't support it, it's a completely moot point, because no one benchmarks iGPUs on Witcher 3, so using Intel iGPUs to support the claim that Nvidia did this on purpose to skew benchmarks is not only fallacious but also idiotic. On top of that, the number of people who would try to play any AAA title from the last 2 years on an iGPU has got to be so small you couldn't blame any company for overlooking them.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


You missed the bit where he said "on the same level as Nvidia", meaning they both do it; it's just that Nvidia can afford to do it more. That's not an attack or criticism, that's just reality.

The level NVIDIA is affording is to bind a black box with an NDA contract. AMD in the recent past hasn't done anything like this; I have no idea if they've ever done it.

 

No one is criticizing AMD for optimization or for helping game devs. We are mostly criticizing the blame game and the AMD fanboys who want to make it everybody else's fault instead of accepting that it most likely is simply AMD not being able to afford to keep up.

 

Nah, what you are doing is questioning the reply AMD gave accusing a brand you guys worship (for some odd reason), after public allegations were made about them. It seems quite far-fetched to say they cannot afford to keep up when the R9 290X performs better than a 780 Ti, and an R9 290 matches its performance or is above it... the direct competitor costing twice as much... (you can check: http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html) - and it's not the first NVIDIA title where this happens.

Unless you are talking about keeping up with proprietary software that can be used with malicious intent; to that I can only reply that it's not in AMD's business model atm.

 

Except it can be turned off, and any reviewer worth his salt will make that clear in benchmarks; if they don't, then they are at fault for biased reviews and misleading the public, not the game dev for giving you the option to use a feature that only works on some cards, nor is it Nvidia's fault for making it possible in the first place. Your argument comes awfully close to the "guns kill people" argument.

 

Now you hit the nail right on the head. You said it right: a feature that only works on some cards, yet it's available for everyone to turn on and off... that says it all. Why would they even give such an option to everyone if it only works on some cards? Such a feature is only meant to run on NVIDIA hardware, because that's how NVIDIA wants it. In typical NVIDIA fashion of bullshitting and beating around the bush, they just can't come straight out and say it: "Listen, we developed this, we invested in this, we don't want anyone else to be able to run it."

My argument would only come close to such a thing if we were talking about a gun that automatically shoots when aimed at some specific people. Then yes.

 

Again, they don't; you can turn the feature off, and as an AMD user you can change the tessellation level to minimize the performance hit.

 

Now you can; not the best solution in terms of user experience, but it's something indeed.

 

 

You mean how it used to be, when the companies that innovated, built the best products and catered to their consumers' needs best were the ones that survived? Yeah, I'd hate to go back to those days.

 

You've lost me there... there is no innovation today lol? I'm talking about market fragmentation, something that fucked up game development (like consoles did for a long time), and you talk about innovation and Darwinism? But hey, you can taste that with Apple! Maybe you are a customer of theirs already, where everyday hardware gets proprietary connectors just because "innovation" lol.

Yeah the glass houses... everyone seems to have one here.


The level NVIDIA is affording is to bind a black box with an NDA contract. AMD in the recent past hasn't done anything like this; I have no idea if they've ever done it.

 

 

Nah, what you are doing is questioning the reply AMD gave accusing a brand you guys worship (for some odd reason), after public allegations were made about them. It seems quite far-fetched to say they cannot afford to keep up when the R9 290X performs better than a 780 Ti, and an R9 290 matches its performance or is above it... the direct competitor costing twice as much... (you can check: http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html) - and it's not the first NVIDIA title where this happens.

Unless you are talking about keeping up with proprietary software that can be used with malicious intent; to that I can only reply that it's not in AMD's business model atm.

 

 

Now you hit the nail right on the head. You said it right: a feature that only works on some cards, yet it's available for everyone to turn on and off... that says it all. Why would they even give such an option to everyone if it only works on some cards? Such a feature is only meant to run on NVIDIA hardware, because that's how NVIDIA wants it. In typical NVIDIA fashion of bullshitting and beating around the bush, they just can't come straight out and say it: "Listen, we developed this, we invested in this, we don't want anyone else to be able to run it."

My argument would only come close to such a thing if we were talking about a gun that automatically shoots when aimed at some specific people. Then yes.

 

 

Now you can; not the best solution in terms of user experience, but it's something indeed.

 

 

 

 

You've lost me there... there is no innovation today lol? I'm talking about market fragmentation, something that fucked up game development (like consoles did for a long time), and you talk about innovation and Darwinism? But hey, you can taste that with Apple! Maybe you are a customer of theirs already, where everyday hardware gets proprietary connectors just because "innovation" lol.

Yeah the glass houses... everyone seems to have one here.

 

Wow, you make some very lateral leaps to make an argument out of nothing.

 

It seems, according to you, that having a feature you can turn off is proof Nvidia did this on purpose to cripple AMD, even though AMD have cards that perform better than Nvidia's. You seem to forget that a loss is a loss, and while AMD are posting losses it means they have no money to invest further. You can't put more into something if you don't have more to give. Why they are posting losses is quite frankly moot. Until they have the resources to spend on the level that Nvidia does, they are not going to be able to keep up with the development and investment that Nvidia has. This is basic business 101, and for someone who claims to be a manager, a marketer, an NDA expert, etc. etc., you should know that.

 

What an immense argument you stack upon us. We are not questioning AMD's response, we are outright calling it what it is, i.e. BS.

 

Hah, a gun that automatically shoots. This is pretty simple; you are adding more lateral complexity to it than it deserves. Nvidia created HairWorks, you can turn it off, and developers themselves say it does not in and of itself cripple AMD cards unless you use it specifically for that purpose. That is exactly like a gun: it does not shoot people in and of itself unless the person using it wants it to. You are blaming Nvidia because you perceive the devs to be assassinating AMD, even though the only people purporting this to be true are you, your AMD fanboy friends, and Richard Huddy.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Snip

Wow man you're such a fanboy for being logical!!!!!

/s

I wonder what else he will reach for in his next "argument". Or will he say, "I've seen the supply lines and confidential info, I know what's going on, trust me"? Or will he just call you a fanboy for not being a fanboy?

Tune in next time on the 3 Stooges of LTT.

Mods might as well lock this thread too. We're going in circles over the same garbage all over again. One side won't reconcile with the other, and it's bringing the entire forum to the ground with it.

I don't get what it takes before people go with logic over blind loyalty. But whatever. Can't convince them all. The only reality from all this is that the entire forum knows point-blank what some people are truly like and how fanatical they get when their prized brands are threatened.


This topic is now closed to further replies.

