NVIDIA Responds to AMD Allegations

bogus

I never said they lied. What I said was that the only thing we have is their word for it until they actually release it.

The problem is that until they actually do something, they are hypocrites for complaining about Nvidia's proprietary technologies. AMD is just as bad until they actually deliver something good, instead of just promising to do good things.

You never explicitly said they lied, but your tone was hostile and openly disbelieving of their claims until Mantle is actually opened up.

 

You said claims of opening Mantle up mean "jack shit" until they actually do. There is an assumption of dishonesty until proven honest in that tone, whereas I assume they are truthful until they have shown otherwise.

Nvidia has an argument that Mantle will not be open in the way other standards are open and controlled by a standards body. They are correct there: the direction of Mantle is fully in AMD's control, and as such one would expect them to use it as a testing ground for some of their newer, fancier rendering techniques. Perhaps that is why Johan wants it to continue to be developed, so HE and ALL OF EA can have a testing ground for newer, more exotic rendering/computational techniques that does not depend on Microsoft's timetable. How much that would actually hurt Nvidia or set them behind AMD, I do not know, but I can understand them not wanting to hitch their wagon to a standard that AMD holds the keys to, where AMD can tune performance to the specific structure of their own hardware most of all.

 

But then modern GPUs are pretty flexible, so maybe that is not such a large constraint. I don't know, and most of the people reading here don't know either. It would be more interesting to hear GPU hardware/software engineers talk about these details.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Seeing the performance drop in people's YouTube videos of Watch Dogs on R9 290s with 4 GB of VRAM well after release, I am going to say Nvidia is full of @$%^. Add to that that the game ran like crap on everything, so I see no point to the GameWorks thing other than creating an artificial separation in performance. It is supposed to get our games more optimized, and I would say it is doing a pretty piss-poor job.

 

Nvidia might as well pull support from SteamOS with all the MS ass they kiss. Someone might also want to ask Nvidia, if DX12 is so great now, why the hell do we have to wait until Holiday 2015 for it? Oh yeah... because MS wants us to buy a new OS and/or buy their crappy console.

 

When APIs are two years apart and a 100-dollar OS upgrade away (seeing that the vast majority of PC gamers are still on Win 7), "Mantle not being needed" is an invalid and idiotic statement from Nvidia.

 

Two years of getting screwed by MS, who has prevented us from even getting DirectX 11.2 games by withholding the API update, is not awesome, Nvidia. That is WHY Mantle is needed. Support like this (below list) makes it very easy for me to recommend an AMD GPU over an Nvidia GPU for anyone who doesn't have an overclocked i5 or i7. I think it is pretty funny that Nvidia thinks we are all on 4770Ks, 4790Ks, and OC hexa-cores, though. Nvidia might want to look at the average gaming PC on Steam, because the majority sure as hell aren't on those CPUs. Nvidia also obviously hasn't played an MMO, where my 4770K at 4.5 GHz can sink to 30 fps lows in Guild Wars 2 or WildStar.

Guild Wars 2 is just a consummate failure in programming. After running objdump on it and decompiling the C-based code, one discovers a litany of poorly chosen algorithms. I love the game, but as a programmer I feel compelled to rage at ArenaNet for the level of incompetence they show in the basic architecture of their game.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


But you can disable GameWorks features if you want. Mirror's Edge is an example of this, where there's a simple on/off switch for PhysX.

It's 100% up to the developers how they want to implement it, not Nvidia. It's the same deal as if someone decided to only support Mantle, except that then, instead of getting lower performance, Nvidia users would be unable to play the game entirely.

The blame is on lazy developers that misuse the tools, not AMD or Nvidia.
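
For what it's worth, this kind of switch is cheap to build: a middleware effect is usually just a feature flag the engine checks before dispatching the extra work. A minimal sketch in C++ (all names hypothetical; this is not the actual GameWorks API):

```cpp
#include <iostream>

// Hypothetical settings struct; real engines expose something similar
// through their options menu.
struct GraphicsSettings {
    bool vendorParticles = false;  // middleware effect, off by default
};

void renderFrame(const GraphicsSettings& s) {
    // ... engine draws the scene with its own renderer here ...
    if (s.vendorParticles) {
        // Only users who opt in pay for the extra effect; everyone
        // else runs the common code path, whatever GPU they own.
        std::cout << "dispatching vendor particle simulation\n";
    }
    std::cout << "frame done\n";
}

int main() {
    GraphicsSettings settings;
    settings.vendorParticles = true;  // toggled from the options menu
    renderFrame(settings);
}
```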

Gameworks =/= Physx.

It's 100% up to the developers who get paid by NVIDIA to implement it. What are you going to say to your CFO? "Hey, NVIDIA wants to pay us 1.5 million to use GameWorks, do we say no?" :|

AMD isn't paying anyone to use Mantle; developers are the ones who want to use it, there is no financial support - nor does AMD have the financial muscle to pay millions to each of the current 47+ developers in the closed beta.

No one is going to support only Mantle - at least in a Windows environment - because Direct3D is literally Mantle. It would be simply stupid to have the same renderer and limit a game to Mantle when they just have to make minor tweaks to make the game playable in DX12.
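
Whether a port like that really is "minor tweaks" depends on how the renderer is structured. If a game talks to a thin backend interface instead of calling one API directly, adding a second low-level API mostly means writing another backend. A rough sketch of that architecture, with hypothetical interfaces (neither Mantle's nor Direct3D's real ones):

```cpp
#include <iostream>
#include <memory>

// Hypothetical abstraction layer; the real APIs are far more involved.
class RenderBackend {
public:
    virtual ~RenderBackend() = default;
    virtual void submitFrame(int drawCalls) = 0;
};

class MantleBackend : public RenderBackend {
public:
    void submitFrame(int drawCalls) override {
        std::cout << "Mantle path: " << drawCalls << " draws\n";
    }
};

class D3D12Backend : public RenderBackend {
public:
    void submitFrame(int drawCalls) override {
        std::cout << "D3D12 path: " << drawCalls << " draws\n";
    }
};

int main() {
    // Game code only ever sees RenderBackend; "porting" means adding
    // one more subclass, not rewriting the renderer.
    std::unique_ptr<RenderBackend> backend = std::make_unique<D3D12Backend>();
    backend->submitFrame(5000);
}
```

If a renderer instead bakes one API's concepts in everywhere, the port is anything but minor - and nobody outside the studios shipping Mantle titles knows which situation applies.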

I blame legislation that allows hardware vendors to pay their way into software, giving them the freedom to cripple other hardware vendors. This is unethical and should be considered illegal.

Apple did it with SMS services on iPhones, and had to admit it and fix it. There is regulation on mobile hardware; where is the regulation for the rest of the hardware?

 


1. How are AMD hypocrites? No one is saying Mantle will be open source, only that other companies will have access to the source once it's out of beta. That is nowhere near the same thing. Sure, before the fact it will be a question of trust, but like I've stated earlier, AMD's actions seem to support their claims.

A lot of people have said Mantle will be open source (and many claim it already is). I put some of that blame on AMD because of their liberal use of "free" and "open", but in the end, it's those people's fault for not looking into it.

Huddy has said that Mantle is an "open-source API", which sounds really vague to me, but again, we will just have to wait and see what they do.

If you reread my posts you will realize that I have never said that I think they will break their promise. All I have been saying is that they make a lot of promises, but so far haven't really delivered on any of them. That will probably change in the future, but right now they haven't actually done anything good.

Actions speak louder than words.

 

2. I guess it depends on how you define mandatory, and to what extent the GameWorks effects are being incorporated into a game (if the effects are only for high/ultra, or used in all quality settings). But if you choose ultra settings, a GameWorks-based game will always use GameWorks effects on both AMD and Nvidia (like HBAO+, smoke, bokeh DoF, etc. for Watch Dogs). AMD users want to have the best effects in their GameWorks games, just like Nvidia users wanted TressFX in Tomb Raider.

[Citation needed]

As far as I know, you are free to use whichever effects from GameWorks you want. For example, Mirror's Edge can run on max without using PhysX. You could have TXAA as an option alongside MSAA and CSAA. Again, as far as I know it is up to the developers, and they don't have to use all of the 15 or so effects included in GameWorks. Just because it is that way in one game does not mean it's impossible to do it some other way.
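
Conceptually, tying effects to quality presets (the high/ultra question above) is just a per-game table from preset to enabled effects, so nothing technical forces "ultra" to switch everything on. A hypothetical sketch (these flags are illustrative, not the real GameWorks feature list):

```cpp
#include <cstdint>
#include <iostream>

// Illustrative effect flags, not the real GameWorks feature list.
enum EffectFlags : std::uint32_t {
    kHBAO     = 1u << 0,
    kTXAA     = 1u << 1,
    kBokehDoF = 1u << 2,
};

// The developer decides, per preset, which middleware effects run.
std::uint32_t effectsForPreset(int preset) {
    switch (preset) {
        case 2:  return kHBAO | kTXAA | kBokehDoF;  // ultra
        case 1:  return kHBAO;                      // high
        default: return 0;                          // medium/low
    }
}

int main() {
    std::uint32_t active = effectsForPreset(1);
    std::cout << "HBAO enabled: " << ((active & kHBAO) != 0) << "\n";
}
```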

You never explicitly said they lied, but your tone was hostile and openly disbelieving of their claims until Mantle is actually opened up.

It only seems that way because my posts are a strong contrast to all the AMD ass kissing in this thread. If you take my posts one by one and look at them in an unbiased way, you will see that I am barely criticizing anyone, and when I do, I also say that the alternative is bad as well. The worst thing I have said about AMD in this thread is that they are about as bad as Nvidia. From the POV of a fanboy that might be a huge insult, but I'd like to think of it as a very neutral comment.

 

You said claims of opening Mantle up mean "jack shit" until they actually do. There is an assumption of dishonesty until proven honest in that tone, whereas I assume they are truthful until they have shown otherwise.

Yep and I stand by that. I would tell Nvidia the same thing if they claimed they would make all the features of GameWorks open source in the future.

Actions always prove why words mean nothing... Talk is cheap...

Call me pessimistic if you want but to me a promise from some gigantic for-profit corporation means next to nothing. It's just very cheap PR.

Gameworks =/= Physx.

PhysX is one part of GameWorks.

 

It's 100% up to the developers who get paid by NVIDIA to implement it. What are you going to say to your CFO? "Hey, NVIDIA wants to pay us 1.5 million to use GameWorks, do we say no?" :|

AMD isn't paying anyone to use Mantle; developers are the ones who want to use it, there is no financial support - nor does AMD have the financial muscle to pay millions to each of the current 47+ developers in the closed beta.

I'd like a source on the claim that Nvidia pays developers, and that they pay 1.5 million dollars, to use GameWorks.

 

No one is going to support only Mantle - at least in a Windows environment - because Direct3D is literally Mantle. It would be simply stupid to have the same renderer and limit a game to Mantle when they just have to make minor tweaks to make the game playable in DX12.

That's just flat out wrong. Direct3D is not "literally" Mantle. They are different. We don't know how much tweaking is needed to port DirectX12 to Mantle or vice versa. Your entire argument is based on ignorance and assumptions.

 

I blame legislation that allows hardware vendors to pay their way into software, giving them the freedom to cripple other hardware vendors. This is unethical and should be considered illegal.

Do we have any proof of this? Last time I checked, we only had AMD's word vs Nvidia's word. Neither of them is to be trusted, if you ask me.

That is illegal, by the way. Those are called antitrust laws, and Intel was found guilty of breaking them when their compiler didn't optimize as well as it could on non-Intel CPUs. This is why I am asking for a source, and it has to be trustworthy and not just "well, X said so and therefore it must be true!", because if what you say is true, then Nvidia is breaking the law. I'd like to warn you that making false statements in order to harm the reputation of someone or something is called defamation, and it is also illegal.

So if you don't have proof for your claim, you could potentially be breaking the law with your posts (depending on which country you are in).


PhysX is one part of GameWorks.

 

I'd like a source on the claim that Nvidia pays developers, and that they pay 1.5 million dollars, to use GameWorks.

 

That's just flat out wrong. Direct3D is not "literally" Mantle. They are different. We don't know how much tweaking is needed to port DirectX12 to Mantle or vice versa. Your entire argument is based on ignorance and assumptions.

 

Do we have any proof of this? Last time I checked, we only had AMD's word vs Nvidia's word. Neither of them is to be trusted, if you ask me.

That is illegal, by the way. Those are called antitrust laws, and Intel was found guilty of breaking them when their compiler didn't optimize as well as it could on non-Intel CPUs. This is why I am asking for a source, and it has to be trustworthy and not just "well, X said so and therefore it must be true!", because if what you say is true, then Nvidia is breaking the law. I'd like to warn you that making false statements in order to harm the reputation of someone or something is called defamation, and it is also illegal.

So if you don't have proof for your claim, you could potentially be breaking the law with your posts (depending on which country you are in).

Physx existed long before Gameworks. But hey, names are just names. NVIDIA called the first Shield, Shield, now it calls it Shield Portable.

You can disable Physx and not much else.

I said 1.5 million as an example. NVIDIA neither confirmed nor denied that they do such practices; they say, and I quote, "It's part of our business" in the video being commented on here. One thing is sure: if they didn't pay developers to use it, they would come out and say it. But they'd rather not comment. And only someone naive thinks that developers will make an effort to give special effects to something like 16% of the market lol. With Mantle, devs will be in touch with the next standard API and will have easy portability to consoles and DX12 (since DX12 is literally Mantle).

About the flat out wrong:

- Direct3D was presented by Mantle developers. So if you think it's made by Microsoft and NVIDIA, they sure didn't make any technical presentation of it to developers. NVIDIA was side by side with AMD at OpenGL presentations - they were nowhere to be seen for Direct3D.

- Johan Andersson said it on Twitter: https://twitter.com/repi/status/446718535616585730

- Then Richard Huddy said: "We had a lot of conversations with Microsoft about what we were doing with Mantle, and in those conversations, they said, 'OK, if you really can solve this problem of building a better throughput system that runs on Windows, then we'd like to take that to Windows as well and we'll make that the extra software functionality that comes in DX 12.' So that's how DX 12 has come about."

Source: http://www.techradar.com/news/computing/pc/amd-s-richard-huddy-on-the-state-of-pc-graphics-mantle-2-and-apus-1255575/2

- Then in this video, after talking about Mantle, one of the NVIDIA reps let his tongue slip to the truth... something like: "DX12 is coming and a lot of the features of... uuuuuh... (Mantle LOL)... the benefits of having a low-level API are present in DX12"

About the proofs, if you don't count the poor results in some GW games: the contracts are under NDA, and NVIDIA said they couldn't ask such a thing of developers... only ask them, and I quote, "Do you have access to the GW source code".

Keep being a blind fanboy and supporting these practices.


They said explicitly that they were waiting for it to leave a sort of initial beta before they made it available for wider release to companies like intel, and then they could support it as well.

 

So before you claim they lied, wait to see if what they said is true.   

Except that it still being in development doesn't stop it from being open. As I said earlier, SFML 2.0 was open despite being in development for quite some time.


I can see nVidia, within the next 5 years, coming out with a Shield portable that is more powerful than the "next-gen" consoles. I know it's not AMD's fault MS & Sony didn't push for better hardware, but it just goes even further to show the situation AMD faces in the gaming industry. They do sometimes try their hardest to come up with new tech, but without the millions of dollars of R&D to back it, they are forced to rely on 3rd-party companies helping them bring their ideas to market, and this means their "tech" comes off as subpar and even behind the times by the time it hits market. I have owned both AMD & Nvidia GPUs, but at this time I own an Nvidia tablet, a Shield, & run 3 systems all with Nvidia GPUs. The experiences I want as a gamer are just not available at the same standard from any other competitor. But having said that, if AMD were to bring a new miracle tech to market that is a total game changer, I will change in a heartbeat. I want the best PC experience, and Nvidia is on that cutting edge in all aspects.

My fiancée asked me to build her a PC, and she asked what the differences are between AMD and NV GPUs. The simplest way I could explain it is this: the games will look better with extra features on Nvidia, and the PC will consume less power for the price point we were aiming for & be a lot quieter. AMD is slightly cheaper at most price points, but is not well known for leading-edge driver compatibility, nor for running cooler and drawing less power. And from my personal experience, I have done more warranty claims for AMD cards dying and having early-in-life issues. But on that point, I know everyone has different experiences than I have had.

The main thing I took from this interview was this: their future goal is a subscription-based game streaming service - to be the world's first and best provider of "dumb device" game streaming. They alluded to this right out of the box by comparing their goal to the industry's evolutionary next step in the world of streaming: that the next step is a streaming service.

What if you didn't have to pay $700-1000 every 2 years to get bleeding-edge GPU horsepower, but could pay a fixed amount of $14.99 a month? And the GPU horsepower streaming service you are paying for evolved and adapted over time. This means you would basically end up needing a device that only had to handle, for example, h.264 decoding. And that was your "beast" top-end gaming machine?

This would obviously only be a thing once low-latency internet was available globally. Otherwise the worldwide market would still require the equivalent standalone cards. Just an idea, and what I read into their comments on their direction and future.

I can see nVidia, within the next 5 years, coming out with a Shield portable that is more powerful than the "next-gen" consoles. I know it's not AMD's fault MS & Sony didn't push for better hardware, but it just goes even further to show the situation AMD faces in the gaming industry. They do sometimes try their hardest to come up with new tech, but without the millions of dollars of R&D to back it, they are forced to rely on 3rd-party companies helping them bring their ideas to market, and this means their "tech" comes off as subpar and even behind the times by the time it hits market. I have owned both AMD & Nvidia GPUs, but at this time I own an Nvidia tablet, a Shield, & run 3 systems all with Nvidia GPUs. The experiences I want as a gamer are just not available at the same standard from any other competitor. But having said that, if AMD were to bring a new miracle tech to market that is a total game changer, I will change in a heartbeat. I want the best PC experience, and Nvidia is on that cutting edge in all aspects.

My fiancée asked me to build her a PC, and she asked what the differences are between AMD and NV GPUs. The simplest way I could explain it is this: the games will look better with extra features on Nvidia, and the PC will consume less power for the price point we were aiming for & be a lot quieter. AND is slightly cheaper at most price points, but is not well known for leading-edge driver compatibility, nor for running cooler and drawing less power. And from my personal experience, I have done more warranty claims for AMD cards dying and having early-in-life issues. But on that point, I know everyone has different experiences than I have had.

The main thing I took from this interview was this: their future goal is a subscription-based game streaming service - to be the world's first and best provider of "dumb device" game streaming. They alluded to this right out of the box by comparing their goal to the industry's evolutionary next step in the world of streaming: that the next step is a streaming service.

What if you didn't have to pay $700-1000 every 2 years to get bleeding-edge GPU horsepower, but could pay a fixed amount of $14.99 a month? And the GPU horsepower streaming service you are paying for evolved and adapted over time. This means you would basically end up needing a device that only had to handle, for example, h.264 decoding. And that was your "beast" top-end gaming machine?

This would obviously only be a thing once low-latency internet was available globally. Otherwise the worldwide market would still require the equivalent standalone cards. Just an idea, and what I read into their comments on their direction and future.

 

You might want to change that to AMD before someone has a shit storm thinking you mean Nvidia is cheaper with driver issues, etc.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


I like the part where these guys say they invented Physx.  WOOOOOOW.... I must be getting old.

 

I remember when PhysX was first being discussed, and it was going to be a 3rd-party add-in card (à la a discrete GPU) that the ORIGINAL company saw as the "holy trinity" of processors - 1 compute, 1 graphics, 1 physics. It was an interesting concept at the time. And then Nvidia bought the company that was working on PhysX. They didn't invent it at all.

 

I don't have a dog in this fight, and don't own stock in either company.  But these reps make me really not like Nvidia as a company.

 

So many statements in this interview make me angry as a somewhat intelligent consumer, and it really paints Nvidia in a bad light.

 

"we invest in CUDA" - cuda gives them lower level access via api....

"lower level api hasn't been proven to improve anything" - the reason consoles lifespans are so long is due to the lower level access.....

"we invented physx!" - uh, that was Ageia. You just bought them.

 

Sorry, but I can't take reps like this seriously when they really don't know what they're talking about, or if they do, pretend that no one else knows what they're talking about.
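
The low-level API point deserves unpacking, since it is the crux of the whole Mantle/DX12 argument: the claimed win is that draw calls get recorded into command buffers with minimal per-call driver work, instead of each call paying validation overhead immediately. A toy model of the difference (hypothetical types, not any real API):

```cpp
#include <iostream>
#include <vector>

// Toy model only; Mantle, D3D12, and console APIs all differ in detail.

// High-overhead model: the driver does expensive work on every call.
void drawImmediate(int object) {
    // imagine per-call state validation and kernel transitions here
    (void)object;
}

// Low-level model: cheap recording now, one batched submission later.
struct CommandBuffer {
    std::vector<int> draws;
    void record(int object) { draws.push_back(object); }  // very cheap
    void submit() {
        // the driver processes the whole batch in one go
        std::cout << "submitted " << draws.size() << " draws\n";
    }
};

int main() {
    const int kObjects = 10000;
    CommandBuffer cb;
    for (int i = 0; i < kObjects; ++i) cb.record(i);  // CPU-light loop
    cb.submit();  // one expensive boundary crossing instead of 10000
}
```

Whether that actually buys much on PCs with fast CPUs is exactly what this thread is arguing about.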


 

Guys, stop b*tching about amd not giving mantle to intel.  They said explicitly that they were waiting for it to leave a sort of initial beta before they made it available for wider release to companies like intel, and then they could support it as well.

 

So before you claim they lied, wait to see if what they said is true.   

 

Again, the point people are raising is that we should be skeptical of companies making promises. Not all companies are truthful, and there's not really anything pushing them to actually follow through.

 

But let me ask you: Would you prefer that AMD made their own "gameworks", which would be exclusively used in certain games, and would run badly on Nvidia hardware? 

It's problematic basing it on just one game. So here's Crysis 2 and Arkham Origins as well:

 

Point 1 - Honestly, if it came to that, and AMD was crushing Nvidia because of something like Gameworks, I'd buy AMD cards. Or put up with the fact I bought an otherwise inferior product.

 

Point 2 - I used that as the premise of my example because that appeared to be the one causing the stir. The thing I was pointing out is that the Forbes article showed a different batch of results to Kotaku. Arkham Origins, I'll concede that point. Crysis 2 is 3 years old, being used in a discussion about a development a little more recent. It's not really the same thing. That being said, I have a point to raise regarding tessellation.

 

http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/2

 

So, between that article and the one you posted about Crysis 2, I have a point I wish to raise about tessellation. It killed Radeon cards in Crysis 2, released in 2011. That article states that tessellation, again, is a factor here, and AMD cards aren't handling it well in Arkham Origins either, released 2 years later. Maybe a little OT, but still something to think about.

 

Guys, stop b*tching about amd not giving mantle to intel.  They said explicitly that they were waiting for it to leave a sort of initial beta before they made it available for wider release to companies like intel, and then they could support it as well.

 

So before you claim they lied, wait to see if what they said is true.   

 

Now this post I find disturbing. People have come into this thread with the assumption that Nvidia are intentionally trying to topple AMD, and AMD themselves are making that claim. Specifically, page 2 of this article I came across looking into this:

http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy/1

 

Developers have the option of licensing the source code if they choose; that's what Nvidia has done in response to developer feedback, so they can gain access to the source code.

 

Page 3: AMD claims Nvidia is trying to cripple them. Nvidia claims it was to help developers gain better access to NV's rendering capabilities, and by extension, provide a better experience for its customers. Developer consensus is that hurting AMD wasn't NV's motive.

What if you didn't have to pay $700-1000 every 2 years to get bleeding-edge GPU horsepower, but could pay a fixed amount of $14.99 a month? And the GPU horsepower streaming service you are paying for evolved and adapted over time. This means you would basically end up needing a device that only had to handle, for example, h.264 decoding. And that was your "beast" top-end gaming machine?

 

This was the point I found the most interesting from what I've watched of the overall market, and to be honest, it makes sense, and I'm glad someone else didn't lose that in amongst the "Nvidia are a-holes" debate going on here. I mean, here in Australia it's probably not that viable, at least until the NBN is finished, and I wouldn't dare touch a multiplayer game if there was going to be latency. But music and movies have both had good success with subscription-based streaming services, and I wouldn't be surprised if it's led to a reduction in piracy. Just some food for thought, I guess.

CPU: Intel Core i7-4770k | Mobo: MSI Mpower Max | Cooling: Cryorig R1 Ultimate w/ XT140 front Fan | GPU: EVGA GTX 770 Dual SC SLI | Case: NZXT H440 | Case Fans: Phanteks PH-140SP x5 | PSU: EVGA Supernova P2 1000W | RAM: 16GB Crucial Ballistix Tactical Tracer | SSD: Kingston HyperX 3k 120GB | HDD: Seagate Barracuda

Keyboard: Razer Blackwidow Ultimate 2013 | Mouse: Razer Deathadder 2013 | Headphones: Sennheiser HD438s | Mousepad: Razer Goliathus Control | Monitor 1: Benq XL2430T | Monitor 2: BenQ RL2455HM 

 


I like the part where these guys say they invented Physx.  WOOOOOOW.... I must be getting old.

 

I remember when PhysX was first being discussed, and it was going to be a 3rd-party add-in card (à la a discrete GPU) that the ORIGINAL company saw as the "holy trinity" of processors - 1 compute, 1 graphics, 1 physics. It was an interesting concept at the time. And then Nvidia bought the company that was working on PhysX. They didn't invent it at all.

 

I don't have a dog in this fight, and don't own stock in either company.  But these reps make me really not like Nvidia as a company.

 

So many statements in this interview make me angry as a somewhat intelligent consumer, and it really paints Nvidia in a bad light.

 

"we invest in CUDA" - cuda gives them lower level access via api....

"lower level api hasn't been proven to improve anything" - the reason consoles lifespans are so long is due to the lower level access.....

"we invented physx!" - uh, that was Ageia. You just bought them.

 

Sorry, but I can't take reps like this seriously when they really don't know what they're talking about, or if they do, pretend that no one else knows what they're talking about.

You could argue the same for AMD: they didn't invent their GPUs, that was ATI; they just bought them. And until recently, lower-level APIs have been difficult on the PC due to the wide range of hardware they had to accommodate. Over the last few years we have seen the two GPU lines become refined, each into a longer-running series that varies mainly in processing ability rather than features/architecture, which has meant that it has only been in the last 3-odd years that a lower-level API for the PC has become viable. You can thank Nvidia for sticking with CUDA and AMD for sticking with GCN, because that is what makes a low-level API possible on the PC.

 

If you ignore the obvious marketing hype in this video, the request for and explanation of advancement goes some way to explaining why it has been difficult to create a low-level API, how previous versions were supposed to work, and why they failed.

 

http://channel9.msdn.com/Blogs/DirectX-Developer-Blog/DirectX-Evolving-Microsoft-s-Graphics-Platform

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


Physx existed long before Gameworks. But hey, names are just names. NVIDIA called the first Shield, Shield, now it calls it Shield Portable.

I don't see how that's relevant.

You can disable Physx and not much else.

FXAA is another example of a GameWorks feature some developers have let you toggle.

I said 1.5 millions has an example. NVIDIA neither confirmed, neither denied they do such practices, they say and I quote "It's part of our business" in the video that is being commented here. One thing is sure, if they didn't pay to use it they would come out and say it. But they rather not comment. And only somone naive thinks that developers will make an effort to give special effects to something like 16% of the market lol. With Mantle devs will be in touch with the next standard API, will have easy portabilty to consoles and DX12 (since DX12 is literally Mantle).

So you got no evidence? Got it. It might very well be the same though, and it's very likely that AMD sometimes does the same. I don't even think it's bad as long as there is no deliberate attempt at crippling the performance of your competitor.

It's not an effort to use GameWorks. It's designed to make development easier since they are just premade effects. I don't think it's unreasonable to make development easier if 16% also get some cool effects. Almost the same thing can be said about mantle. I don't think it's unreasonable to spend a little more effort (because Mantle does require some effort) to give <however many percent of users that support Mantle, like 10% maybe> higher performance.

DirectX 12 is not "literally Mantle". I don't know where you got that from, but it won't become true just because you keep repeating it. I am not sure if you are misusing the word "literally" or if you have access to some secret info, but last time I checked we knew next to nothing about DirectX 12, and we certainly did not have access to the source code for both Mantle and DirectX so that we could compare them.

About the flat out wrong:

- Direct3D was presented by Mantle developers. So if you think it's made by Microsoft and NVIDIA, they sure didn't made any technical presentation it to developers. NVIDIA was side by side AMD on OpenGL presentations - they were not to be seen in Direct3D.

- Johan Andersson said it on Twitter: https://twitter.com/repi/status/446718535616585730

- Then Richard Huddy said: "We had a lot of conversations with Microsoft about what we were doing with Mantle, and in those conversations, they said, 'OK, if you really can solve this problem of building a better throughput system that runs on Windows, then we'd like to take that to Windows as well and we'll make that the extra software functionality that comes in DX 12.' So that's how DX 12 has come about."

Source: http://www.techradar.com/news/computing/pc/amd-s-richard-huddy-on-the-state-of-pc-graphics-mantle-2-and-apus-1255575/2

- Then in this video, one of the NVIDIA reps after talking about Mantle, his tongue slipt to the truth... something like: " DX12 is coming and alot of the features of...uuuuuh... (Mantle LOL)... the benefits of having a low lvl api are present on DX12"

That's not evidence that DirectX is a copy/paste of Mantle. All that shows is that DirectX offers a lot of similar functionality compared to Mantle. Do you realize what "literally" means? You said that DirectX is literally, line for line and word for word, a 100% copy of Mantle.

It's ridiculous to draw that conclusion from your "proof". It's pretty safe to say that DirectX 12 was in development before Mantle, which is one nail in the coffin for the "DirectX 12 is just a copy of Mantle" theory. Want another nail in the coffin? Here is a quote from the article I linked earlier:

At GDC, AMD's Corpus elaborated a little bit on that message. He told me Direct3D 12's arrival won't spell the end of Mantle. D3D12 doesn't get quite as close to the metal of AMD's Graphics Core Next GPUs as Mantle does, he claimed, and Mantle "will do some things faster." Mantle may also be quicker to take advantage of new hardware, since AMD will be able to update the API independently without waiting on Microsoft to release a new version of Direct3D. Finally, AMD is talking to developers about bringing Mantle to Linux, where it would have no competition from Microsoft.

No matter how you twist and turn it, DirectX 12 is not the same as Mantle. They share a lot of similarities in the same way OpenGL and DirectX share some functionality, but that's probably it.

About the proofs, if you don't consider the poor results on some GW games: the contracts are under NDA, and NVIDIA said they couldn't ask such thing to developers... only to ask them and I quote "Do you have access to GW source code". 

Keep being a blind fanboy and support this practices.

Let me guess which GW game you are referring to. Watch_Dogs, right? The game where benchmarks were all over the place, and some places (like LinusTechTips) showed results that indicated no bias, while some places (like Forbes) reported very weird results that no other site managed to duplicate to the same extent? Yeah... Great example, buddy.

Look, I am not trying to say that GameWorks games perform the same on Nvidia and AMD cards because chances are they don't. It's ridiculous to jump to the conclusion that Nvidia are deliberately crippling the performance for AMD though. There is a difference between

A) Making AMD cards perform worse on purpose.

and

B) Spending more time optimizing for Nvidia while only putting in a small amount of effort into making it run well on AMD cards.

 

There are huge differences, and without very good evidence you can't say one or the other is true. You're jumping to conclusions because you are biased towards AMD.

 

I think it's hilarious that you're calling me a fanboy when all I have been saying in this thread is "be skeptical towards everyone; AMD and Nvidia are probably just as bad" and you have been saying "I have no evidence, but since I believe in AMD and I believe Nvidia are bad, I must be correct!". Again, I only look like an Nvidia fanboy because I am neutral, and that's a very strong contrast compared to the extreme AMD ass kissing going on in this thread.

Oh and I am strongly against company A deliberately crippling the performance of competitor B. I am not against developer X spending more time optimizing for manufacturer A though.


IMHTWO here's the truth: you're gonna 1. die, 2. revive better, OR 1. die, OR revolutionize (actually, exclude this one because at your estimated age you're dead already), OR stay gifted with negative qualities which won't be named. It is already known and understood that your chosen fate will be a combination of the second and fourth.

 

Drones will replace your pitiful, disgusting kind. Take your wares and go elsewhere.

/fun writing.

 

I did limit my fun, but that doesn't justify what it already is.

You should get a hobby instead of wasting time on these posts.

You are not funny, you are not baiting anyone, just find something else to do.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


snip

The only thing the contract does not allow is letting AMD access the source code. The developers have access to the source code and can optimize it for AMD hardware; I don't think there's anything that specifically prohibits them from contacting AMD for help, as long as they don't give AMD direct access to the source.

Also, the developers could implement an on/off switch for some of the GameWorks features that are problematic for AMD, or for everything altogether.

 

I just find that AMD's actions are extremely childish, a huge company throwing around accusations and allegations, playing the victim card. It's a pretty good publicity stunt.

The thing is that Nvidia invests millions of dollars and thousands of man-hours to develop all this tech, to help developers implement it, and to optimize the games for their hardware. In the meantime AMD is free to do the same, but they don't; they just cross their arms and point fingers.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


You know, you can decompile dynamic-link libraries into actual code...

Derp AMD.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


I like the fun game where nobody is objective and you can tell what GPU people are running just by the comments they leave, trying as hard as they can to self-justify the purchase they've made.

That said, imma go ahead and not fight that trend by saying this interview was more about dodging questions than answering them.

Also, the whole "Mantle is useless, but DX12 is going to have AMAZING features borrowed from Mantle" thing made me giggle a little.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


You should get a hobby instead of wasting time on these posts.

You are not funny, you are not baiting anyone, just find something else to do.

I succeeded in my intention; you misled yourself.

CPU: Ryzen 2600 GPU: RX 6800 RAM: ddr4 3000Mhz 4x8GB  MOBO: MSI B450-A PRO Display: 4k120hz with freesync premium.


I don't see how that's relevant.

FXAA is another example of a GameWorks feature some developers have let you toggle.

So you got no evidence? Got it. It might very well be the same though, and it's very likely that AMD sometimes does the same. I don't even think it's bad as long as there is no deliberate attempt at crippling the performance of your competitor.

It's not an effort to use GameWorks. It's designed to make development easier since they are just premade effects. I don't think it's unreasonable to make development easier if 16% also get some cool effects. Almost the same thing can be said about mantle. I don't think it's unreasonable to spend a little more effort (because Mantle does require some effort) to give <however many percent of users that support Mantle, like 10% maybe> higher performance.

DirectX 12 is not "literally Mantle". I don't know where you got that from, but it won't become true just because you keep repeating it. I am not sure if you are misusing the word "literally" or if you have access to some secret info, but last time I checked we knew next to nothing about DirectX 12, and we certainly did not have access to the source code for both Mantle and DirectX so that we could compare them.

That's not evidence that DirectX is a copy/paste of Mantle. All that shows is that DirectX offers a lot of similar functionality compared to Mantle. Do you realize what "literally" means? You said that DirectX is literally, line for line and word for word, a 100% copy of Mantle.

It's ridiculous to draw that conclusion from your "proof". It's pretty safe to say that DirectX 12 was in development before Mantle, which is one nail in the coffin for the "DirectX 12 is just a copy of Mantle" theory. Want another nail in the coffin? Here is a quote from the article I linked earlier:

No matter how you twist and turn it, DirectX 12 is not the same as Mantle. They share a lot of similarities in the same way OpenGL and DirectX share some functionality, but that's probably it.

Let me guess which GW game you are referring to. Watch_Dogs, right? The game where benchmarks were all over the place, and some places (like LinusTechTips) showed results that indicated no bias, while some places (like Forbes) reported very weird results that no other site managed to duplicate to the same extent? Yeah... Great example, buddy.

Look, I am not trying to say that GameWorks games perform the same on Nvidia and AMD cards because chances are they don't. It's ridiculous to jump to the conclusion that Nvidia are deliberately crippling the performance for AMD though. There is a difference between

A) Making AMD cards perform worse on purpose.

and

B) Spending more time optimizing for Nvidia while only putting in a small amount of effort into making it run well on AMD cards.

 

There are huge differences, and without very good evidence you can't say one or the other is true. You're jumping to conclusions because you are biased towards AMD.

 

I think it's hilarious that you're calling me a fanboy when all I have been saying in this thread is "be skeptical towards everyone; AMD and Nvidia are probably just as bad" and you have been saying "I have no evidence, but since I believe in AMD and I believe Nvidia are bad, I must be correct!". Again, I only look like an Nvidia fanboy because I am neutral, and that's a very strong contrast compared to the extreme AMD ass kissing going on in this thread.

Oh and I am strongly against company A deliberately crippling the performance of competitor B. I am not against developer X spending more time optimizing for manufacturer A though.

1) You don't see relevance because it doesn't suit you.

2) Do you have evidence they are not crippling AMD performance?

If you were talking about an indie developer who doesn't have the time or resources to build effects and PhysX into his own engine, I would buy it. Now if you say a multi-million-dollar studio with hundreds of employees "saves" resources on effects... you gotta be kidding me. Well, it doesn't seem to have worked with Watch Dogs: Ubisoft has 800+ employees, "SAVED" resources by using GameWorks (according to you), and look at the final product. That's just a crap argument.

Comparing GameWorks to Mantle is plain ridiculous. One is an API, the other is glitter. And no, I'm not misusing words - if it was made by the ones who made Mantle, if it was technically presented to devs by the ones who made Mantle, if it was claimed by the ones who made it as being Mantle, and if developers also commented that watching the Direct3D presentation was the same as Mantle - then it must be Mantle.

3) I don't know that first blog, but a 12-year-old kid sure can make you a more convincing list LOL... I had to stop reading after this: "(despite the Xbox One connection and the low-level API work done there)." LOL

Then you have one vague sentence, made by the same guys who made the video we are commenting on here today because of their BS - NVIDIA PR reps, who manipulate and avoid the truth - and you value that vague sentence more than SEVERAL STATEMENTS FROM DIFFERENT PARTIES, MANY OF THEM INDEPENDENT FROM AMD, technical presentations, and a blatant resemblance?!

Now it all makes sense and it's a waste of time talking to you: in your opinion NVIDIA, and NVIDIA only, tells the truth - everyone else is a liar. Richard Huddy, Johan Andersson, all the developers who attended the GDC 2014 D3D presentation. All liars. The truth, with ZERO EVIDENCE, is the one NVIDIA stated: "DX12 has been in the works for longer than Mantle". When there is public evidence that DX was on hold for a long time. You, the one who demands evidence, take that for granted in this scenario.

Don't you find it a bit odd that the company that was working for 4 YEARS on DX12 didn't make ONE SINGLE DEVELOPER PRESENTATION about Direct3D? Link me their TECHNICAL presentation, please! The same kind of presentation made by Oxide, Johan Andersson, etc., where they show the tech behind Direct3D... not the one on a stage showing a game running on a PC, which was done at E3 2013 lol.

4) Actually, I was talking about Batman: Arkham Origins, where you have several unbiased sites showing a 290X getting worse results than a 770, and a 7970 falling below a 760... lol. Just google it.

The thing is: even if they don't do it on purpose - which I don't completely dismiss - they have a way to do it, and AMD is limited in fixing it, since they can't see what is wrong because developers can't show them the issues. NVIDIA brags that they send a team to developers to help them with the game, and AMD can't do the same to optimize for their customers, because developers that use GW can't disclose the contents of GameWorks. NVIDIA praises pro-activity, yet they deny it to AMD.

That is called hypocrisy.

You are far from being neutral, trust me lol.

 


1 - If you were talking about an indie developer who doesn't have the time or resources to build effects and PhysX into his own engine, I would buy it. Now if you say a multi-million-dollar studio with hundreds of employees "saves" resources on effects... you gotta be kidding me. Well, it doesn't seem to have worked with Watch Dogs: Ubisoft has 800+ employees, "SAVED" resources by using GameWorks (according to you), and look at the final product. That's just a crap argument.

2 - Comparing GameWorks to Mantle is plain ridiculous. One is an API, the other is glitter. And no, I'm not misusing words - if it was made by the ones who made Mantle, if it was technically presented to devs by the ones who made Mantle, if it was claimed by the ones who made it as being Mantle, and if developers also commented that watching the Direct3D presentation was the same as Mantle - then it must be Mantle.

3 - Now it all makes sense and it's a waste of time talking to you: in your opinion NVIDIA, and NVIDIA only, tells the truth - everyone else is a liar. Richard Huddy, Johan Andersson, all the developers who attended the GDC 2014 D3D presentation. All liars. The truth, with ZERO EVIDENCE, is the one NVIDIA stated: "DX12 has been in the works for longer than Mantle". When there is public evidence that DX was on hold for a long time. You, the one who demands evidence, take that for granted in this scenario.

4 - The thing is: even if they don't do it on purpose - which I don't completely dismiss - they have a way to do it, and AMD is limited in fixing it, since they can't see what is wrong because developers can't show them the issues. NVIDIA brags that they send a team to developers to help them with the game, and AMD can't do the same to optimize for their customers, because developers that use GW can't disclose the contents of GameWorks. NVIDIA praises pro-activity, yet they deny it to AMD.

 

1) So GameWorks is what we might call middleware - a set of libraries or features designed to simplify game development. And it's inherently a bad decision when a AAA developer (Ubisoft AAA, lol) decides to make use of it? BUT WAIT!!!!

-Havok Physics

-Unreal Engine

-Gamebryo

-Basically any game engine that someone is going to license

 

So, yeah, there's that point. Just because they're a AAA developer doesn't mean they aren't going to take advantage of shortcuts. If anything, that's an efficient business model.

 

2) True as that may be, the fact still stands that both companies have their own proprietary technologies, which give them an edge in various places.

 

3) And again, zero evidence Mantle will be open, but AMD always tells the truth, right? Because how could a company ever lie, lol, that'd be a first. Don't make me laugh hahahahahaha. (See, two can play this game.)

 

4) Again, you've jumped straight to the conclusion that Nvidia and AMD wear devil horns and a halo, respectively. And again, you act like this is the first time a game runs better on one vendor's cards than another's. It will happen, for a variety of reasons. But developers can access the source code, so arguably, couldn't they have optimized the games better?

Oh, by the way, I watched part of that interview with Huddy. Again, tessellation, from a few years ago, with Crysis 2. Why do they not make their cards better at tessellation?
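
For context on that point: the usual complaint about Crysis 2 wasn't raw tessellation throughput but tessellation factors high enough to produce sub-pixel triangles, which waste work on any GPU. A back-of-the-envelope illustration (my own arithmetic, not measurements from the game):

```cpp
#include <initializer_list>
#include <iostream>

int main() {
    // A quad patch tessellated at factor N splits into roughly N*N
    // sub-quads, i.e. about 2*N*N triangles.
    for (int factor : {4, 16, 64}) {
        long long tris = 2LL * factor * factor;
        std::cout << "factor " << factor << " -> ~" << tris
                  << " triangles per patch\n";
    }
    // At factor 64 that's ~8192 triangles from a single patch; if the
    // patch covers only a few hundred pixels on screen, most triangles
    // end up smaller than a pixel and rasterizer efficiency collapses.
}
```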

CPU: Intel Core i7-4770k | Mobo: MSI Mpower Max | Cooling: Cryorig R1 Ultimate w/ XT140 front Fan | GPU: EVGA GTX 770 Dual SC SLI | Case: NZXT H440 | Case Fans: Phanteks PH-140SP x5 | PSU: EVGA Supernova P2 1000W | RAM: 16GB Crucial Ballistix Tactical Tracer | SSD: Kingston HyperX 3k 120GB | HDD: Seagate Barracuda

Keyboard: Razer Blackwidow Ultimate 2013 | Mouse: Razer Deathadder 2013 | Headphones: Sennheiser HD438s | Mousepad: Razer Goliathus Control | Monitor 1: Benq XL2430T | Monitor 2: BenQ RL2455HM 

 


Can you see the different treatment Nvidia and AMD got on this show? :rolleyes:

this is how a good interview is done

 

wow... the maturity is really obvious in the way they conduct themselves. AMD seems much more professional than Nvidia in these videos.  


Locked... has devolved into the normal back and forth... plus the language, some personal attacks, etc...

Forum Links - Community Standards, Privacy Policy, FAQ, Features Suggestions, Bug and Issues.

Folding/Boinc Info - Check out the Folding and Boinc Section, read the Folding Install thread and the Folding FAQ. Info on Boinc is here. Don't forget to join team 223518. Check out other users Folding Rigs for ideas. Don't forget to follow the @LTTCompute for updates and other random posts about the various teams.

Follow me on Twitter for updates @Whaler_99


This topic is now closed to further replies.
