
Richard Huddy, AMD Gaming Scientist, Interview - Mantle, GameWorks, FreeSync and More!

thewhitestig

I was talking about this part:

You can't just say that Nvidia pays developers to implement features (even going so far as to be pretty specific about how much they pay) and then say that Nvidia has purposely made it so that these features run worse on AMD hardware.

If you don't have proof that they have code in GameWorks specifically made to make AMD GPUs do completely useless calculations, which Nvidia skips in their GPUs, then please don't say they do.

The bold claims were taken directly from the PCPer interview and the Maximum PC podcast. I have no proof that what Richard Huddy is claiming is 100% true, but I suspect it is, considering how low the Batman: Arkham Origins performance is. There is another game using GameWorks that nobody made a fuss about: Splinter Cell: Blacklist.

[Splinter Cell: Blacklist benchmark chart]

Keep in mind that this is not initial testing; these results have accumulated over the last few months. The drivers used here are Catalyst 14.2, released in February. The game itself was released in August of 2013. Surely there could be many reasons for these results, and I'm not saying that it's 100% because of GameWorks. But in the case of Batman, we have AMD staff calling it out, and we have the obvious poor performance on AMD hardware. So I'm more inclined to believe that this really is the case. It's not a 50/50 issue. On one side we have evidence of poor performance and a very detailed explanation of why that's the case, and on the other side we have complete denial. Well, the first obviously outweighs the second by a considerable margin.

And no, I'm not picking favourites here. I'm being objective and examining all the available evidence, while some of the others posting here are just in complete denial because they can't ever believe that their favourite Nvidia could do something wrong. And no, I'm not an AMD fanboy. I do currently have an AMD GPU, but I was thinking of jumping ship when the 800 series comes.

My PC: CPU: Intel Core i3 3220, MB: ASUS P8P67 LE, GPU: Palit Jetstream GTX 670, RAM: 4GB 1333mhz DDR3, Storage: 750GB Hitachi, PSU: CoolerMaster G650M 80+ Bronze, Cooling: Coolermaster Hyper 212 Plus, Case: Multirama, Display: Acer x229w 22" 1680x1050, Keyboard: Logitech K120, Mouse: Steelseries Kinzu v2, Sound: Logitech 2.1 system


Hopefully Nvidia will wake up and start suing the shit out of AMD for this libel crap. If GameWorks actually functioned the way AMD claims, AMD would be the one suing Nvidia. It is 100% illegal for Nvidia to do something that prevents a developer from optimizing their games for AMD hardware. Also, no developer would ever pay Nvidia to use GameWorks if that were true. If anyone honestly believes this bullshit, they need to get off the internet. Nvidia has a 2-to-1 market share advantage in dedicated GPUs not because they illegally force developers to shut out AMD, but because AMD refuses to innovate and just cries like a little kid whenever Nvidia refuses to give away their innovations and technology for free.

What a horribly misinformed statement.

You accuse AMD of failing to innovate, while Nvidia shoves the Titan Z under your nose and expects you to pay £1500 more for a card that has a 10% increase in performance over the competition's 295X2? Please.

AMD's current niche is in BUDGET hardware. Regardless of ATI's place in the past, that is where AMD is currently focusing, as it rightly knows that there are far more people buying on a budget than there are buying consumer hardware costing thousands of pounds. It's a sensible strategy, and it's how Mantle was born - they are in the process of open-sourcing it.

They rely on optimisation to provide the most for their market, and when something Nvidia does hinders that process of optimisation (which it clearly did, no ifs or buts - the benchmarks alone make it clear AMD did not get as much input into optimisation as it should have), then AMD has a right to call them out on it. Legality is a whole separate issue; there are plenty of places where this sort of practice can be shrugged off, notably the US.

AMD have done the most in recent years for budget hardware, and that is the area in which they are innovating. You clearly are not the target market for AMD, but surely you are not so blind as to dismiss that innovation as nothing? You are showing your bias clear as day.

Nvidia specialise in the high end. AMD work much better in the low-end market. Neither is free from faults, but in this case I would definitely side with AMD, because it is clear something went wrong in communication with the developers over optimisation, and after looking at their statements it is completely reasonable to conclude that Nvidia was in some part responsible, even if unintentionally.

Everything said by me is my humble opinion and nothing more, unless otherwise stated.


Not this again, and opened by the same dude. This is like a cry for help from AMD, filled with marketing crap. They can't just "demand" that the other company make its technology available.

 

It's actually in Nvidia's best interest even if it seems counterintuitive at first: if Nvidia continues to close down everything, they might end up with their own version of Mantle, an Nvidia-only API with no plans to make it available to everybody. That means devs would have to choose between questionable performance improvements and locking themselves out of 40% of the market, basically turning PC gaming into consoles: you'd have either an AMD system or an Nvidia system, with exclusive titles for each. Except without the billions of dollars to back such an adversarial system, because unlike Microsoft or Sony, they can't afford to buy endless exclusive titles and such.

Is that what you want for PC gaming? No. Trust me, even if you say yes, you don't really want that; it would end up killing both companies in favour of either non-3D games or another company with crappier graphics, since graphics are the only differentiation point the PC has over consoles.

-------

Current Rig

-------


[Citation needed]

If you don't have a reliable source then you should not make bold claims like that because THAT is in fact illegal.

 

 

Haven't watched the video yet so I won't comment on that, but I will follow my rule of not trusting anything a spokesperson from company X says about X or about their main competitor Y.

Take everything said in this interview with a big grain of salt, just like you should take anything someone from Nvidia says with a big grain of salt. At least if the statement can't be objectively proven.

 

Do watch the video though: he was controversial, yes, but he was very careful to make it clear this is just speculation on his part. He didn't disclose any third-party devs, he gave alternative "theories", and he spoke in terms like "what could happen" and such. People just get irritated when they hear PR that's cleverly stated to appear factual, but he didn't state any of this as fact; he acknowledges it's all speculation on his part.

-------

Current Rig

-------


Just to make sure I am on the same page, here's what I understand.

 

Gameworks is basically code that is ready to be implemented by developers, and developers can see and change this code to make it run better. However, AMD can't see this code, and is thus in a bad position when it comes to optimising for the game / releasing better drivers.

 

If this is how it worked, then I'm a little on AMD's side. AMD can't just make a competing program, as that would cause fragmentation in the industry, which *is* bad. You could argue that Mantle was similar, but then the counter argument is that AMD wanted everyone to adopt it and left it open.

 

Anyway, that's how I see it. Please correct me if I am wrong somewhere.

Tea, Metal, and poorly written code.


Huddy has mentioned in a couple of interviews how Nvidia uses tessellation where it isn't required to bog down AMD cards, while their own cards can just about handle it (due to the CUDA implementation). So, for example, they use tessellation on hair or fur when both companies have an equally optimized TressFX implementation that is more efficient and takes less time to compute for a better end result.

 

Now, I'm not a software engineer so I can't verify his statements from personal/professional experience, but the examples he mentions do work better on nVidia hardware (Batman's cape in Arkham Origins, dog fur in CoD, water in Crysis 2 iirc).
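To see why over-tessellation is such an effective lever, here's a back-of-the-envelope sketch in C (my own illustration, not something from the interviews): for a uniformly tessellated triangle patch, a factor of n yields roughly n*n triangles, so raising the factor multiplies the geometry work whether or not the extra triangles are ever visible.

#include <stdio.h>

int main(void) {
    /* D3D11 allows tessellation factors up to 64 */
    int factors[] = {1, 8, 16, 32, 64};
    for (int i = 0; i < 5; i++) {
        int n = factors[i];
        /* uniform tessellation of a triangle patch yields about n^2 triangles */
        printf("factor %2d -> ~%4d triangles per patch\n", n, n * n);
    }
    return 0;
}

Going from factor 8 to factor 64 is roughly 64x the triangle count (4096 vs 64 per patch), and whichever card is slower per triangle falls further behind as the factor climbs - which is exactly the asymmetry Huddy is alleging.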

Having more tessellation than needed is not the same as hiding completely useless code which only AMD cards will run, though.

Nvidia cards are simply better at tessellation. They still have to do the same work, but Nvidia cards are more efficient at doing it.

There is a huge, huge difference between 1) taking advantage of a weakness in the competitor's cards by adding a lot of those calculations, and 2) adding completely useless code which does absolutely nothing except waste resources, and then making it so that AMD cards have to run it while Nvidia cards skip it.

 

Scenario 1 might be happening (or maybe the developers of those games just preferred using tessellation instead of TressFX for some reason), but thewhitestig claimed that scenario 2 was happening, which I think is bullshit.

 

On one side we have evidence of poor performance and a very detailed explanation of why that's the case, and on the other side we have complete denial. Well, the first obviously outweighs the second by a considerable margin.

Can you please give me the "very detailed explanation"? Saying "Nvidia are hiding completely useless code in GameWorks which AMD cards have to run!" is not a detailed explanation.

AMD cards being inferior to Nvidia cards at tessellation is an explanation, but we don't know why the developers added it. How "poor" is "poor performance"? I haven't really looked up benchmarks for Batman, so I honestly don't know how big the difference is. Maybe the developers thought it made the game look good, and when they tested it the game ran well enough on AMD cards and well on Nvidia cards, and they thought "it is worth the tradeoff to make the game look better"? Claiming that they got paid 5 million dollars to do so, without any evidence whatsoever, is not a proper explanation. You can't deny that you are jumping to conclusions and assuming the worst from Nvidia and the best from AMD without having all the facts needed to make a proper judgement. You're not as unbiased as you think you are.

 

Do watch the video though: he was controversial, yes, but he was very careful to make it clear this is just speculation on his part. He didn't disclose any third-party devs, he gave alternative "theories", and he spoke in terms like "what could happen" and such. People just get irritated when they hear PR that's cleverly stated to appear factual, but he didn't state any of this as fact; he acknowledges it's all speculation on his part.

The [citation needed] wasn't aimed at the guy in the video; it was aimed at thewhitestig. Like I said before, I haven't watched the video, but even if it was the AMD guy that said it, I still think it's completely idiotic of thewhitestig to blindly trust AMD and present what they say as facts. You simply can't say:

It's actually the other way around. Nvidia pays the developer to implement their features. Those deals are often upwards of a million, and up to 5 million. The features they implement might be exclusive to the Nvidia platform, or, in the case of GameWorks, they run on AMD hardware too but purposefully cripple performance, because they make AMD run code that Nvidia are not actually running, and on top of that Nvidia are locking AMD out of optimizing this code. This is quite malicious. I noticed that you keep saying "illegal". There is nothing illegal here. It's just plain simple evilness.

without any proof whatsoever. You have to make clear that it is just speculation and not actual fact. Even if the AMD guy made it clear that he was just speculating, thewhitestig did not make it clear that the content of his post was grabbed out of thin air, without any proof whatsoever that it is true.

If you're going to post an opinion or speculation, then make it clear that's what you're posting. Don't present it as fact and then go "oh, I was just speculating in my post, even though I said it as if it was fact".

I am perfectly okay with speculation as long as you make it clear that what you are posting is just that: speculation.

 

 

Just to make sure I am on the same page, here's what I understand.

 

Gameworks is basically code that is ready to be implemented by developers, and developers can see and change this code to make it run better. However, AMD can't see this code, and is thus in a bad position when it comes to optimising for the game / releasing better drivers.

 

If this is how it worked, then I'm a little on AMD's side. AMD can't just make a competing program, as that would cause fragmentation in the industry, which *is* bad. You could argue that Mantle was similar, but then the counter argument is that AMD wanted everyone to adopt it and left it open.

 

Anyway, that's how I see it. Please correct me if I am wrong somewhere.

As far as I know, the developers of the game can't see the code either. It's like with DirectX: you call a function, your input goes into a kind of black box, and the black box then spits out a result.

Mantle works the same way, which is why I say again and again that Mantle is not open (it is as "open" as DirectX, i.e. completely closed source and proprietary). Developers get access to functions but have no idea how the functions work.

 

 

Here is an example: let's say I use a function which increases a number by a certain percentage. So if I call the function with the inputs (100, 30), I get what 100 + 30% is.

 

number = PercentIncrease(100, 30);

This would set "number" to 130, i.e. 130% of 100.

 

I have no idea how the function "PercentIncrease" works, though. I only see what I put into it and what the result is. PercentIncrease might work like this:

100 * 1.3 = 130

number = 130

or maybe it works like this:

100 * 0.3 = 30

100 + 30 = 130

number = 130

 

Both achieve the same result but in different ways, and I as a developer don't know which way it's achieved. This is obviously a really simple example, and the functions in GameWorks, Mantle, DirectX etc. are much more advanced, but hopefully you get the point.

PercentIncrease is a black box from the POV of the developer.

Obviously the best thing would be if all code were available for free (both free as in freedom and free as in beer) to everyone, but sadly that's not how the world works. Both Nvidia and AMD are equally guilty of keeping their secret sauce hidden from one another. AMD is just more outspoken about how upset they are that they can't look at how Nvidia's functions work (like PhysX).
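To make the black-box idea concrete, here is a minimal sketch in C (the file names and both implementations are hypothetical, purely for illustration): the developer compiles against the header alone, and either implementation can be shipped behind it without the caller ever knowing which.

/* percent.h - the interface: all the developer ever sees */
int PercentIncrease(int value, int percent);

/* percent_a.c - one possible hidden implementation */
int PercentIncrease(int value, int percent) {
    return value * (100 + percent) / 100;  /* 100 * 130 / 100 = 130 */
}

/* percent_b.c - a different hidden implementation, same observable result */
int PercentIncrease(int value, int percent) {
    int delta = value * percent / 100;     /* 100 * 30 / 100 = 30 */
    return value + delta;                  /* 100 + 30 = 130 */
}

/* main.c - the developer's code, linked against whichever .c was shipped */
#include <stdio.h>
#include "percent.h"

int main(void) {
    printf("%d\n", PercentIncrease(100, 30));  /* prints 130 either way */
    return 0;
}

A closed library like GameWorks ships only the compiled equivalent of percent_a.c or percent_b.c, so neither the game developer nor AMD can see which path the work actually takes.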


Having more tessellation than needed is not the same as hiding completely useless code which only AMD cards will run, though.

Nvidia cards are simply better at tessellation. They still have to do the same work, but Nvidia cards are more efficient at doing it.

There is a huge, huge difference between 1) taking advantage of a weakness in the competitor's cards by adding a lot of those calculations, and 2) adding completely useless code which does absolutely nothing except waste resources, and then making it so that AMD cards have to run it while Nvidia cards skip it.

 

Scenario 1 might be happening (or maybe the developers of those games just preferred using tessellation instead of TressFX for some reason), but thewhitestig claimed that scenario 2 was happening, which I think is bullshit.

 

Can you please give me the "very detailed explanation"? Saying "Nvidia are hiding completely useless code in GameWorks which AMD cards have to run!" is not a detailed explanation.

AMD cards being inferior to Nvidia cards at tessellation is an explanation, but we don't know why the developers added it. How "poor" is "poor performance"? I haven't really looked up benchmarks for Batman, so I honestly don't know how big the difference is. Maybe the developers thought it made the game look good, and when they tested it the game ran well enough on AMD cards and well on Nvidia cards, and they thought "it is worth the tradeoff to make the game look better"? Claiming that they got paid 5 million dollars to do so, without any evidence whatsoever, is not a proper explanation. You can't deny that you are jumping to conclusions and assuming the worst from Nvidia and the best from AMD without having all the facts needed to make a proper judgement. You're not as unbiased as you think you are.

 

The [citation needed] wasn't aimed at the guy in the video; it was aimed at thewhitestig. Like I said before, I haven't watched the video, but even if it was the AMD guy that said it, I still think it's completely idiotic of thewhitestig to blindly trust AMD and present what they say as facts. You simply can't say:

without any proof whatsoever. You have to make clear that it is just speculation and not actual fact. Even if the AMD guy made it clear that he was just speculating, thewhitestig did not make it clear that the content of his post was grabbed out of thin air, without any proof whatsoever that it is true.

If you're going to post an opinion or speculation, then make it clear that's what you're posting. Don't present it as fact and then go "oh, I was just speculating in my post, even though I said it as if it was fact".

I am perfectly okay with speculation as long as you make it clear that what you are posting is just that: speculation.

 

 

As far as I know, the developers of the game can't see the code either. It's like with DirectX: you call a function, your input goes into a kind of black box, and the black box then spits out a result.

Mantle works the same way, which is why I say again and again that Mantle is not open (it is as "open" as DirectX, i.e. completely closed source and proprietary). Developers get access to functions but have no idea how the functions work.

 

 

Here is an example: let's say I use a function which increases a number by a certain percentage. So if I call the function with the inputs (100, 30), I get what 100 + 30% is.

 

number = PercentIncrease(100, 30);

This would set "number" to 130, i.e. 130% of 100.

 

I have no idea how the function "PercentIncrease" works, though. I only see what I put into it and what the result is. PercentIncrease might work like this:

100 * 1.3 = 130

number = 130

or maybe it works like this:

100 * 0.3 = 30

100 + 30 = 130

number = 130

 

Both achieve the same result but in different ways, and I as a developer don't know which way it's achieved. This is obviously a really simple example, and the functions in GameWorks, Mantle, DirectX etc. are much more advanced, but hopefully you get the point.

PercentIncrease is a black box from the POV of the developer.

Obviously the best thing would be if all code were available for free (both free as in freedom and free as in beer) to everyone, but sadly that's not how the world works. Both Nvidia and AMD are equally guilty of keeping their secret sauce hidden from one another. AMD is just more outspoken about how upset they are that they can't look at how Nvidia's functions work (like PhysX).

 

I code a little, so I understand what you're saying: interface vs implementation. However, the point remains that it could be considered a problem. I'll have to check up on Mantle though; I thought nVidia would be allowed to optimise the API for their GPUs.

Tea, Metal, and poorly written code.


So many people here are completely hopeless. The majority of what Richard Huddy has said can be, and HAS BEEN, validated by previously leaked emails from developers, and almost everything Nvidia said in defense of GameWorks has been refuted not just by AMD but by the game developers themselves.


So many people here are completely hopeless. The majority of what Richard Huddy has said can be, and HAS BEEN, validated by previously leaked emails from developers, and almost everything Nvidia said in defense of GameWorks has been refuted not just by AMD but by the game developers themselves.

Yeah, even devs don't like GameWorks... I found these threads from a while back which might give people more insight...

 

http://linustechtips.com/main/topic/157592-forbes-why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/?hl=gameworks

http://linustechtips.com/main/topic/137965-developers-criticze-nvidias-gameworks-program-on-twitter-for-its-blackbox-nature/?hl=gameworks

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbide 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


...

 

Everything I said in my posts was taken directly either from Richard Huddy's interviews or from independent benchmarks. That's why I explicitly said several times that what AMD is claiming is in no way, shape, or form 100% proof of GameWorks being at fault for the poor performance, but it's nevertheless a very strong indicator that something fishy is going on behind the scenes, and the best explanation was offered directly by AMD.

 

Can you please give me the "very detailed explanation"? Saying "Nvidia are hiding completely useless code in GameWorks which AMD cards have to run!" is not a detailed explanation.

http://youtu.be/fZGV5z8YFM8?t=30m20s

 

And about the benchmarks you requested, here's some initial testing of Batman (because I can't find anything newer).

[Batman: Arkham Origins benchmark chart]

And here's some testing on Splinter Cell: Blacklist with Catalyst 14.2 and Nvidia drivers from the same February-March period. The game itself was released in August of 2013, so there was plenty of time for AMD to optimize. Right?

[Splinter Cell: Blacklist benchmark chart]

Again, poor performance could be attributed to many things, but BECAUSE AMD made a big fuss about GameWorks, it leads me to believe that their theory might not be that far-fetched. Quite the opposite, actually. If they were coming out of nowhere accusing Nvidia of doing malicious things, then I would've called bullshit on their part. But they're not. We have performance-based evidence that supports their claims. And although we can never really know whether or not the poor performance is precisely because of malicious GameWorks code, we at least have this outcry from AMD. And it would be wrong to dismiss everything they say when you don't have evidence pointing to the contrary.

 

 

So many people here are completely hopeless. The majority of what Richard Huddy has said can be, and HAS BEEN, validated by previously leaked emails from developers, and almost everything Nvidia said in defense of GameWorks has been refuted not just by AMD but by the game developers themselves.

Can you please give the sources? I wanna check them out.

My PC: CPU: Intel Core i3 3220, MB: ASUS P8P67 LE, GPU: Palit Jetstream GTX 670, RAM: 4GB 1333mhz DDR3, Storage: 750GB Hitachi, PSU: CoolerMaster G650M 80+ Bronze, Cooling: Coolermaster Hyper 212 Plus, Case: Multirama, Display: Acer x229w 22" 1680x1050, Keyboard: Logitech K120, Mouse: Steelseries Kinzu v2, Sound: Logitech 2.1 system


It's actually in Nvidia's best interest even if it seems counterintuitive at first: if Nvidia continues to close down everything, they might end up with their own version of Mantle, an Nvidia-only API with no plans to make it available to everybody. That means devs would have to choose between questionable performance improvements and locking themselves out of 40% of the market, basically turning PC gaming into consoles: you'd have either an AMD system or an Nvidia system, with exclusive titles for each. Except without the billions of dollars to back such an adversarial system, because unlike Microsoft or Sony, they can't afford to buy endless exclusive titles and such.

Is that what you want for PC gaming? No. Trust me, even if you say yes, you don't really want that; it would end up killing both companies in favour of either non-3D games or another company with crappier graphics, since graphics are the only differentiation point the PC has over consoles.

Please don't get me wrong; I think a lot of other people misinterpreted me.

 

I pointed out 2 different things:

1- AMD's approach of making a lot of stuff "open source" (Mantle isn't yet...) or free to use is a good thing. What I mean is that they can't demand Nvidia make their own technology available just because AMD thinks that way. Nvidia doesn't, and AMD can't call them out on that. The reason Nvidia didn't give in to Mantle was DX12. Could be pride, could be because, as announced by Microsoft, DX was getting low-level implementations and was being worked on; we really don't know. Let's not forget that there is a third party here: Microsoft.

2- Watch Dogs, Ubisoft, and the "AMD left out" thing. This is plain bullshit. AMD is crying about this. Let's see some facts: the game itself is poorly optimized for PC (like most previous games from them) and it ran like shit on both Nvidia and AMD cards at launch. AMD guys keep saying that GameWorks was made to cripple AMD hardware... What is this? Why does AMD have to use the Nvidia eye-candy technology? They can just use standard DX11 and make it run butter-smooth. Also, as stated before, it was Ubi developers choosing to side with Nvidia, not Nvidia dictating what Ubisoft does or doesn't do. AMD calling out GameWorks and saying it cripples AMD is kinda ridiculous. No one said that AMD needs to use those features in order to perform well...

To be honest, this game shouldn't even be used in benchmarks and reviews for new GPUs/PCs anymore. It's just a bad example to be testing with. It's generating so much nonsense for both AMD and Nvidia when the culprit here is clearly Ubisoft.

 

To sum this up: I don't see the point in speculating and making a big fuss out of this. I've followed the Nvidia/ATI stuff since around 2002 and it has always been like this. I don't know why all of a sudden this is a big deal. The circlejerks need to stop: this one, the "Z87/Z97 are the only chips for gaming (and overclocking)" one, the "ASRock sucks" one, and so on. People need to start being smart, reading the product and not the marketing, and deciding if they think it's worth the money. If yes, then buy and enjoy. This fanboyism/hatred is old.

 

(As I mentioned before, I'm not a fanboy of either of those two. I'm just a consumer who buys what he likes at a certain point in time.)


Please don't get me wrong; I think a lot of other people misinterpreted me.

 

I pointed out 2 different things:

1- AMD's approach of making a lot of stuff "open source" (Mantle isn't yet...) or free to use is a good thing. What I mean is that they can't demand Nvidia make their own technology available just because AMD thinks that way. Nvidia doesn't, and AMD can't call them out on that. The reason Nvidia didn't give in to Mantle was DX12. Could be pride, could be because, as announced by Microsoft, DX was getting low-level implementations and was being worked on; we really don't know. Let's not forget that there is a third party here: Microsoft.

2- Watch Dogs, Ubisoft, and the "AMD left out" thing. This is plain bullshit. AMD is crying about this. Let's see some facts: the game itself is poorly optimized for PC (like most previous games from them) and it ran like shit on both Nvidia and AMD cards at launch. AMD guys keep saying that GameWorks was made to cripple AMD hardware... What is this? Why does AMD have to use the Nvidia eye-candy technology? They can just use standard DX11 and make it run butter-smooth. Also, as stated before, it was Ubi developers choosing to side with Nvidia, not Nvidia dictating what Ubisoft does or doesn't do. AMD calling out GameWorks and saying it cripples AMD is kinda ridiculous. No one said that AMD needs to use those features in order to perform well...

To be honest, this game shouldn't even be used in benchmarks and reviews for new GPUs/PCs anymore. It's just a bad example to be testing with. It's generating so much nonsense for both AMD and Nvidia when the culprit here is clearly Ubisoft.

 

To sum this up: I don't see the point in speculating and making a big fuss out of this. I've followed the Nvidia/ATI stuff since around 2002 and it has always been like this. I don't know why all of a sudden this is a big deal. The circlejerks need to stop: this one, the "Z87/Z97 are the only chips for gaming (and overclocking)" one, the "ASRock sucks" one, and so on. People need to start being smart, reading the product and not the marketing, and deciding if they think it's worth the money. If yes, then buy and enjoy. This fanboyism/hatred is old.

 

(As I mentioned before, I'm not a fanboy of either of those two. I'm just a consumer who buys what he likes at a certain point in time.)

 

1) Fair enough

 

2) That is the one thing I would like to point out, though: this decision is up to the game developer. It is ultimately the game developer that decides to use Nvidia-specific technology AMD can't properly optimize for (for whatever reason; let's not get into that, since I don't think it's helpful to take either company at face value: too much is hidden from this debate for us to really know what's going on). So in that regard, I agree that AMD should be taking its "fight" to the game devs who decided to favor one company massively, except they can't because of Mantle: so far it is closed, and the promises of opening it up are just that: promises.

But overall, fair enough: I don't think your points are unreasonable.

-------

Current Rig

-------


I'll have to check up on Mantle though; I thought nVidia would be allowed to optimize the API for their GPUs.

They might be allowed to do that in the future. Some developers might have access to the source code for Mantle, but the majority of developers do not at this point in time.

To me it seems like AMD are huge hypocrites right now. They love talking about how they will make Mantle open and all that in the future, but until they actually do something, and not just talk about what they might do, they are just as bad as Nvidia in my eyes. When I can download the source code for Mantle to my computer and am free to modify it however I want, then I will call AMD better than Nvidia in terms of open and free (as in freedom) software.

 

 

So many people here are completely hopeless. The majority of what Richard Huddy has said can be, and HAS BEEN, validated by previously leaked emails from developers, and almost everything Nvidia said in defense of GameWorks has been refuted not just by AMD but by the game developers themselves.

Can you please post a source on that?

 

 

That first link is bullshit. The Forbes benchmarks were really, really strange compared to pretty much all other benchmarks on the Internet, and Watch_Dogs was awful on both Nvidia and AMD cards.

The biggest reason I think it is bullshit is that AMD cried about not being able to optimize for Watch_Dogs, and yet they released a driver update which boosted performance by 25% at 1920x1080 and 28% at 2560x1600 on the same day the game launched.

It is obvious that they had access to the game before launch and that they could optimize for it, yet they said that they couldn't.

 

 

Everything I said in my posts was taken directly either from Richard Huddy's interviews or from independent benchmarks. That's why I explicitly said several times that what AMD is claiming is in no way, shape, or form 100% proof of GameWorks being at fault for the poor performance, but it's nevertheless a very strong indicator that something fishy is going on behind the scenes, and the best explanation was offered directly by AMD.

That's not how you worded your post, though. You said "here is how it is" and then started parroting what someone from AMD had said was speculation.

Anyway I am just repeating myself over and over so let's leave it.

 

And about the benchmarks you requested, here's some initial testing of Batman. (because I can't find any newer)

<benchmark>

And here's some testing on Splinter Cell: Blacklist with Catalyst 14.2 and Nvidia drivers from the same February-March period. The game itself was released in August of 2013. So there was plenty of time for AMD to optimize. Right?

 

<benchmark>

Again, poor performance could be attributed to many things, but BECAUSE AMD made a big fuss about GameWorks, it leads me to believe that their theory might not be that far-fetched. Quite the opposite, actually. If they were coming out of nowhere accusing Nvidia of doing malicious things, then I would've called bullshit on their part. But they're not. We have performance-based evidence that supports their claims. And although we can never really know whether or not the poor performance is precisely because of malicious GameWorks code, we at least have this outcry from AMD. And it would be wrong to dismiss everything they say when you don't have evidence pointing to the contrary.

Yes, there is some strange performance difference there, but it could be because of a ton of different things. In Batman, AMD says it is because of heavy use of tessellation. That has nothing to do with GameWorks; it's just that AMD cards are bad at it.

I don't see that as Nvidia trying to harm AMD. I see that as the developers using a standard technology in DirectX 11 which Nvidia cards are simply superior at.

I mentioned before that Nvidia had issues with the YUV to RGB conversion (the lowest value was 16 by default instead of 0). Blaming Nvidia because the developers of Batman used tessellation on the cape would be like blaming AMD because VLC used the hardware YUV -> RGB conversion when Nvidia had issues with that.

If your product (AMD with tessellation or Nvidia with YUV->RGB conversion) is not that good at some standard feature then don't blame the competitor when a developer decides to use that standard feature.
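As a side note on that conversion issue, here is a minimal sketch in C of the two standard conventions, using the common BT.601 coefficients (the function names are mine, purely for illustration): limited "TV" range expects black at Y = 16, full "PC" range expects black at Y = 0, so picking the wrong default lifts or crushes blacks exactly as described above.

#include <stdio.h>

/* clamp a result into the displayable 0-255 range */
static int clamp8(double x) { return x < 0 ? 0 : (x > 255 ? 255 : (int)(x + 0.5)); }

/* limited ("TV") range: Y in [16, 235], so Y = 16 decodes to black */
static int r_limited(int y, int cr) { return clamp8(1.164 * (y - 16) + 1.596 * (cr - 128)); }

/* full ("PC") range: Y in [0, 255], so Y = 0 decodes to black */
static int r_full(int y, int cr) { return clamp8(y + 1.402 * (cr - 128)); }

int main(void) {
    /* a limited-range black pixel (Y = 16) decoded with the wrong formula
       comes out grey instead of black: blacks are lifted to 16 */
    printf("decoded as full range:    R = %d\n", r_full(16, 128));    /* 16 */
    printf("decoded as limited range: R = %d\n", r_limited(16, 128)); /* 0  */
    return 0;
}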

 

The Splinter Cell results obviously favor Nvidia as well, but I don't think it's fair to jump to the conclusion that it is Nvidia's fault and that they are purposely making it run worse on AMD hardware. The issue could be with the developer of the game, or it could be something on AMD's side, or maybe a mixture of the two (or three). I won't say that what the spokesperson from AMD said is false, but I will not take anything he says as truth either as long as we don't have solid evidence for it.

It's easy to jump to conclusions but it's hard to find the truth.


They might be allowed to do that in the future. Some developers might have access to the source code for Mantle, but the majority of developers do not at this point in time.

To me it seems like AMD are huge hypocrites right now. They love talking about how they will make Mantle open and all that in the future, but until they actually do something, and not just talk about what they might do, they are just as bad as Nvidia in my eyes. When I can download the source code for Mantle to my computer and am free to modify it however I want, then I will call AMD better than Nvidia in terms of open and free (as in freedom) software.

Can you please post a source on that?

That first link is bullshit. The Forbes benchmarks were really, really strange compared to pretty much all other benchmarks on the Internet, and Watch_Dogs was awful on both Nvidia and AMD cards.

The biggest reason I think it is bullshit is that AMD cried about not being able to optimize for Watch_Dogs, and yet they released a driver update which boosted performance by 25% at 1920x1080 and 28% at 2560x1600 on the same day the game launched.

It is obvious that they had access to the game before launch and that they could optimize for it, yet they said that they couldn't.

That's not how you worded your post, though. You said "here is how it is" and then started parroting what someone from AMD had said was speculation.

Anyway I am just repeating myself over and over so let's leave it.

Yes, there is some strange performance difference there, but it could be because of a ton of different things. In Batman, AMD says it is because of heavy use of tessellation. That has nothing to do with GameWorks; it's just that AMD cards are bad at it.

I don't see that as Nvidia trying to harm AMD. I see that as the developers using a standard technology in DirectX 11 which Nvidia cards are simply superior at.

I mentioned before that Nvidia had issues with the YUV to RGB conversion (the lowest value was 16 by default instead of 0). Blaming Nvidia because the developers of Batman used tessellation on the cape would be like blaming AMD because VLC used the hardware YUV -> RGB conversion when Nvidia had issues with that.

If your product (AMD with tessellation or Nvidia with YUV->RGB conversion) is not that good at some standard feature then don't blame the competitor when a developer decides to use that standard feature.

The Splinter Cell results obviously favor Nvidia as well, but I don't think it's fair to jump to the conclusion that it is Nvidia's fault and that they are purposely making it run worse on AMD hardware. The issue could be with the developer of the game, or it could be something on AMD's side, or maybe a mixture of the two (or three). I won't say that what the spokesperson from AMD said is false, but I will not take anything he says as truth either as long as we don't have solid evidence for it.

It's easy to jump to conclusions but it's hard to find the truth.

As far as I knew, nVidia chose not to embrace Mantle. Also, it wasn't going to negatively impact nVidia users, as it was completely separate.

Additionally, Mantle is pretty much gone at this point, due to DX12.

Contrast this with GameWorks, where people with AMD cards have no choice but to run the game with it. If what AMD says is true, that they can't see and optimise the code, then this is pretty poor for the industry.

Tea, Metal, and poorly written code.


As far as I knew, nVidia chose not to embrace Mantle. Also, it wasn't going to negatively impact nVidia users, as it was completely separate.

Additionally, Mantle is pretty much gone at this point, due to DX12.

Contrast this with GameWorks, where people with AMD cards have no choice but to run the game with it. If what AMD says is true, that they can't see and optimise the code, then this is pretty poor for the industry.

Well, you can twist it however you want, since we don't have that much info about it.

We don't know if Nvidia would lose something if they agreed to implement Mantle. For example AMD might have demanded something in return, such as money or access to things like PhysX and Nvidia simply did not want that deal.

AMD might have agreed to license Mantle to Nvidia but been very strict in how it was implemented. Maybe Nvidia would have to make major changes to their GPUs in order for it to work. Maybe Nvidia knew about DirectX 12 and thought "we will just focus on that instead". We just don't know how everything went and why decisions were made.

I would be glad if Mantle died with the release of DirectX 12. We don't need more proprietary bullshit. I would be even happier if DirectX died out and something like OpenGL (preferably a better version of it) replaced it. The same goes for GameWorks.

I think GameWorks is bad and should not exist. I won't say that Nvidia deliberately cripples performance for AMD with it, though, and I won't blame Nvidia for providing it to developers. I blame developers for using it. You don't blame the gun manufacturer if someone shoots someone else with their weapon. You don't say "well, they deliberately designed their gun to kill that specific person; they wanted him dead!" without any proof either.

 

You could even argue that the developers spending time implementing Mantle took away time from other development and therefore had a negative effect on Nvidia users. That's pretty far-fetched, I know, but something that requires extra work will always impact the overall product if it is implemented.

The developers can choose to not use GameWorks items, just like a developer can choose not to include Mantle. In the end it is up to the developer to decide what they want to include and exclude from the game.

 

AMD can optimize the code for GameWorks games, by the way. They did it for Watch_Dogs, for example. It's up to the developer to take help from AMD if they so desire. Even if AMD doesn't help with the actual game, they can still optimize their drivers afterwards if they want. Don't get me wrong, GameWorks is bad, but AMD is trying to make it sound like Nvidia are buddies with Satan and AMD is a glorious angel who only tries to help everyone. They are both pretty horrible and don't deserve even half the praise they get.


Well, you can twist it however you want, since we don't have that much info about it.

We don't know if Nvidia would lose something if they agreed to implement Mantle. For example AMD might have demanded something in return, such as money or access to things like PhysX and Nvidia simply did not want that deal.

AMD might have agreed to license Mantle to Nvidia but been very strict in how it was implemented. Maybe Nvidia would have to make major changes to their GPUs in order for it to work. Maybe Nvidia knew about DirectX 12 and thought "we will just focus on that instead". We just don't know how everything went and why decisions were made.

I would be glad if Mantle died with the release of DirectX 12. We don't need more proprietary bullshit. I would be even happier if DirectX died out and something like OpenGL (preferably a better version of it) replaced it. The same goes for GameWorks.

I think GameWorks is bad and should not exist. I won't say that Nvidia deliberately cripples performance for AMD with it, though, and I won't blame Nvidia for providing it to developers. I blame developers for using it. You don't blame the gun manufacturer if someone shoots someone else with their weapon. You don't say "well, they deliberately designed their gun to kill that specific person; they wanted him dead!" without any proof either.

 

AMD can optimize the code for GameWorks games, by the way. They did it for Watch_Dogs, for example. It's up to the developer to take help from AMD if they so desire. Even if AMD doesn't help with the actual game, they can still optimize their drivers afterwards if they want. Don't get me wrong, GameWorks is bad, but AMD is trying to make it sound like Nvidia are buddies with Satan and AMD is a glorious angel who only tries to help everyone. They are both pretty horrible and don't deserve even half the praise they get.

 

 

You're adding far too much speculation by introducing Mantle as a counter argument.

 

We don't know if Nvidia would lose something if they agreed to implement Mantle. For example AMD might have demanded something in return, such as money or access to things like PhysX and Nvidia simply did not want that deal.

AMD might have agreed to license Mantle to Nvidia but been very strict in how it was implemented. Maybe Nvidia would have to make major changes to their GPUs in order for it to work. Maybe Nvidia knew about DirectX 12 and thought "we will just focus on that instead". We just don't know how everything went and why decisions were made.

I would be glad if Mantle died with the release of DirectX 12.

 

 

That's why I say: ignore Mantle for a moment. It will probably die out due to DX12, and there weren't many games supporting it. Going back to GameWorks:

 

 

Logically, if AMD is telling the truth that they indeed can't see the code, then they're right: it is a problem.

Adding speculation regarding Mantle isn't going to change that, as, thus far, there is no issue present, and there isn't likely to be any issue present.

 

I agree with you, I think it is blown out of proportion, but it is something that needs solving, and the solution isn't as simple as allowing AMD to create their own competitor to GameWorks.

Tea, Metal, and poorly written code.


I agree with you, I think it is blown out of proportion, but it is something that needs solving, and the solution isn't as simple as allowing AMD to create their own competitor to GameWorks.

According to Richard Huddy, the problem in a nutshell is:

- GameWorks implements modules into some games for certain effects

- The source code of these modules is often not available to the devs (a black box), and even when it is, the devs are not allowed to share it with other graphics vendors

- This gives Nvidia the leverage to prevent it from being optimized for Intel and AMD graphics, and even gives them the power to cripple it on competing hardware without being called out

 

If this is indeed the problem, then an open-source alternative to GameWorks would be a good counter-move.

The challenge, of course, would be to popularize it. Somebody mentioned in this thread that Nvidia pays devs to implement GameWorks. I don't know if that is true, but if it is, it's a problem for AMD because they don't have that kind of money to throw around...


According to Richard Huddy, the problem in a nutshell is:

- GameWorks implements modules into some games for certain effects

- The source code of these modules is often not available to the devs (a black box), and even when it is, the devs are not allowed to share it with other graphics vendors

- This gives Nvidia the leverage to prevent it from being optimized for Intel and AMD graphics, and even gives them the power to cripple it on competing hardware without being called out

 

If this is indeed the problem, then an open-source alternative to GameWorks would be a good counter-move.

The challenge, of course, would be to popularize it. Somebody mentioned in this thread that Nvidia pays devs to implement GameWorks. I don't know if that is true, but if it is, it's a problem for AMD because they don't have that kind of money to throw around...

 

Sorry for not being specific. What I meant was that it would be bad if AMD felt pressured to produce their own proprietary alternative. This would result in a net negative effect for the consumer: I could very well end up buying a card that runs some games nicely and others poorly.

 

However, an open-source version of GameWorks, like you suggested, could alleviate that possibility.

Tea, Metal, and poorly written code.


Well, you can twist it however you want, since we don't have that much info about it.

We don't know if Nvidia would lose something if they agreed to implement Mantle. For example AMD might have demanded something in return, such as money or access to things like PhysX and Nvidia simply did not want that deal.

AMD might have agreed to license Mantle to Nvidia but been very strict in how it was implemented. Maybe Nvidia would have to make major changes to their GPUs in order for it to work. Maybe Nvidia knew about DirectX 12 and thought "we will just focus on that instead". We just don't know how everything went and why decisions were made.

I would be glad if Mantle died with the release of DirectX 12. We don't need more proprietary bullshit. I would be even happier if DirectX died out and something like OpenGL (preferably a better version of it) replaced it. The same goes for GameWorks.

I think GameWorks is bad and should not exist. I won't say that Nvidia deliberately cripples performance for AMD with it, though, and I won't blame Nvidia for providing it to developers. I blame developers for using it. You don't blame the gun manufacturer if someone shoots someone else with their weapon. You don't say "well, they deliberately designed their gun to kill that specific person; they wanted him dead!" without any proof either.

 

You could even argue that the developers spending time implementing Mantle took away time from other development and therefore had a negative effect on Nvidia users. That's pretty far-fetched, I know, but something that requires extra work will always impact the overall product if it is implemented.

The developers can choose to not use GameWorks items, just like a developer can choose not to include Mantle. In the end it is up to the developer to decide what they want to include and exclude from the game.

 

AMD can optimize the code for GameWorks games, by the way. They did it for Watch_Dogs, for example. It's up to the developer to take help from AMD if they so desire. Even if AMD doesn't help with the actual game, they can still optimize their drivers afterwards if they want. Don't get me wrong, GameWorks is bad, but AMD is trying to make it sound like Nvidia are buddies with Satan and AMD is a glorious angel who only tries to help everyone. They are both pretty horrible and don't deserve even half the praise they get.

Considering Mantle is being given to developers as an open-source API, and if AMD stick to their timing it will soon be available to everyone as such, that argument is completely ridiculous. Mantle was handed to Nvidia on a silver platter; Nvidia refused. Not all the facts have been revealed, but personally I can think of no compelling reason Nvidia would deny Mantle other than pure spite towards AMD, which is completely understandable, as they are competing corporations, but not in the best interest of consumers.

This whole argument is about whether Nvidia had profit or consumer interest in mind, and I can think of three reasons Nvidia have been in a dodgy position recently in this regard:

Marketing a content-creation GPU at a gaming audience in order to essentially rip them off for £1500 more than the comparable competition: the Titan Z.

Denying Mantle for no conceivable reason, despite considerable performance improvements for budget buyers AT NO R&D COST, which is what really takes the cash out of the GPU industry.

GameWorks working better on Nvidia cards than on comparable AMD cards in THREE SEPARATE CASES and with REPRODUCIBLE RESULTS.

Nvidia are, in my eyes and until more info comes out, in the wrong. Defending them in spite of the evidence against Nvidia, and the utter lack of evidence Nvidia has given to the community, is the very definition of fanboyism.

Everything said by me is my humble opinion and nothing more, unless otherwise stated.


You're adding far too much speculation by introducing Mantle as a counter argument.

I am not really using it as a counter argument. I am using it to point out that AMD are hypocrites.

I am not trying to argue that GameWorks is good. It sucks ass and I want it to go away, just as much as I want AMD to stop trying to push their own proprietary solutions. What really pisses me off is that AMD try to act as if they are angels who are all for open standards and solutions, when they are guilty of the same things as Nvidia. Everyone should stop pointing fingers going "look at how bad company Y is! We at company X are much better than they are because they do bad things!" and instead do something constructive. If AMD want to whine about Nvidia having closed-source APIs, then maybe they should make their own closed-source APIs open source first.

The really sad thing is that people believe AMD's marketing bullshit. You have no idea how many times I've heard that Mantle is open source even though it isn't.

 

 

Logically, if AMD is telling the truth that they indeed can't see the code, then they're right: it is a problem.

Adding speculation regarding Mantle isn't going to change that, as, thus far, there is no issue present, and there isn't likely to be any issue present.

 

I agree with you, I think it is blown out of proportion, but it is something that needs solving, and the solution isn't as simple as allowing AMD to create their own competitor to GameWorks.

AMD most likely can't see the code for GameWorks, yes. Not even the developers of the games can. Yes, it is bad, and it would be best if it stopped.

Sadly there will always be proprietary APIs, though, and I don't think AMD is in any position to talk badly about Nvidia over GameWorks. It's like Stalin telling people "that Hitler guy sure sucks. We should all unite against him, right?", only with fewer people dying.

I want people to stop siding with Stalin and see that both of them are doing some pretty stupid things which ultimately affect all of us in a negative way.

In the end, though, both of them (as well as Microsoft) are companies, and the main goal of any company is to make money. Giving away things you put lots of R&D money into is not the best way to make money, and we will therefore always have proprietary stuff like GameWorks.

One way to show that you disapprove of proprietary solutions is to not buy games that use them. In the end, it is the developers who choose to implement them, and if we as customers continue to buy their products they will think it is okay. Nvidia and AMD won't stop as long as their tools are being used either.


I can't wait to see these contracts... Oh wait, Nvidia won't show them...

CPU: i7 4770k | GPU: Sapphire 290 Tri-X OC | RAM: Corsair Vengeance LP 2x8GB | MTB: GA-Z87X-UD5H | COOLER: Noctua NH-D14 | PSU: Corsair 760i | CASE: Corsair 550D | DISPLAY: BenQ XL2420TE


Firestrike scores - Graphics: 10781 Physics: 9448 Combined: 4289


"Nvidia, Fuck you" - Linus Torvald


-SNIP- Post #46

What are you talking about? What closed APIs do AMD have? With Mantle, you've got to realise it is still in beta. Nvidia and Intel will get to play with it at the end of this year when it is released into the wild. Did you not watch the videos?

CPU: i7 4770k | GPU: Sapphire 290 Tri-X OC | RAM: Corsair Vengeance LP 2x8GB | MTB: GA-Z87X-UD5H | COOLER: Noctua NH-D14 | PSU: Corsair 760i | CASE: Corsair 550D | DISPLAY: BenQ XL2420TE


Firestrike scores - Graphics: 10781 Physics: 9448 Combined: 4289


"Nvidia, Fuck you" - Linus Torvald


You're adding far too much speculation by introducing Mantle as a counter argument.

I disagree: AMD goes on the offensive about how closed-source and secretive Nvidia is, while at the same time only promising to open up Mantle but not actually doing so. What's the fucking holdup? What's the difference between opening it up now and "later this year" or whatever? They keep saying that open code would help everybody, including them, so why not release it?

 

Likely they're hoping for another major title to be released with Mantle support (I'm thinking BF: Hardline) so that they can capitalize on it before "opening" the API for all. That's quite fucking hypocritical of them.

-------

Current Rig

-------


What are you talking about? What closed APIs do AMD have? With Mantle, you've got to realise it is still in beta

 

Nope: he said everyone would benefit if anyone could take the code and improve on it. This is typical of open source, as everybody gets to actively participate in development, even at pre-alpha stages. Well, you know, this is true for devs that actually give a fuck about open source, like Linux devs, and not AMD, who try to embrace this supposedly "open source" ethos while keeping shit proprietary anyway.

-------

Current Rig

-------

