
Richard Huddy, AMD Gaming Scientist, Interview - Mantle, GameWorks, FreeSync and More!

thewhitestig

You should really check this out. 

My PC: CPU: Intel Core i3 3220, MB: ASUS P8P67 LE, GPU: Palit Jetstream GTX 670, RAM: 4GB 1333mhz DDR3, Storage: 750GB Hitachi, PSU: CoolerMaster G650M 80+ Bronze, Cooling: Coolermaster Hyper 212 Plus, Case: Multirama, Display: Acer x229w 22" 1680x1050, Keyboard: Logitech K120, Mouse: Steelseries Kinzu v2, Sound: Logitech 2.1 system


Ahh, PCPer. I will never forget Ryan Shrout's name after the whole AMD frame-pacing issue.

 

I will likely watch the entirety of this video at some point, thanks for posting.


Excellent interview! I especially like the part about GameWorks. Holy shitsnacks, this is bad. How can anyone support a company like Nvidia when it pulls stuff like this on the entire gaming industry? Pardon my language, but this is infuriating.

The Mantle stuff is very interesting. Would be nice to see it on Linux.

Added: So he basically said Adaptive-Sync is superior to G-Sync, because the communication between the monitor and the GPU is better and the monitor defines a min/max FPS range it can handle. Yeah, G-Sync is stillborn. Great technologies, though. Hope Adaptive-Sync becomes a de facto standard in monitors within a year.
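A rough sketch of that min/max window idea, assuming made-up panel numbers (40-144 Hz) and a simplified low-framerate-compensation rule; real drivers and panels negotiate this over DisplayPort, so treat this purely as illustration:

```python
import math

def effective_refresh_hz(frame_hz, vrr_min=40.0, vrr_max=144.0):
    """Keep the effective refresh inside the monitor's advertised VRR window.

    Below the minimum, repeat each frame (low-framerate compensation);
    above the maximum, cap at the panel's fastest refresh.
    """
    if frame_hz < vrr_min:
        repeats = math.ceil(vrr_min / frame_hz)  # show each frame this many times
        return frame_hz * repeats
    return min(frame_hz, vrr_max)

print(effective_refresh_hz(60.0))   # 60.0 -> inside the window, untouched
print(effective_refresh_hz(200.0))  # 144.0 -> capped at the panel's max
print(effective_refresh_hz(25.0))   # 50.0 -> each frame shown twice
```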

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


A very well put together interview with interesting information. Adaptive-Sync is better than G-Sync? Oh snap.


I'm gonna feel guilty buying an 880 GTX.

I really hate their tactics.

Edit: I'll just buy a next-gen AMD GPU. Completely forgot about FreeSync.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


Excellent interview! I especially like the part about GameWorks. Holy shitsnacks, this is bad. How can anyone support a company like Nvidia when it pulls stuff like this on the entire gaming industry? Pardon my language, but this is infuriating.

The Mantle stuff is very interesting. Would be nice to see it on Linux.

Added: So he basically said Adaptive-Sync is superior to G-Sync, because the communication between the monitor and the GPU is better and the monitor defines a min/max FPS range it can handle. Yeah, G-Sync is stillborn. Great technologies, though. Hope Adaptive-Sync becomes a de facto standard in monitors within a year.

I know what I'm buying after I upgrade to a new CPU: an AMD video card, once again. I have never had an Nvidia GPU, and I was thinking of leaving the AMD camp just to try something new, buuut... no. After this I know for sure that I'm not going to give my money to the evil Nvidia. Plus, all the games I want to play are going to have excellent AMD support because of Mantle: Battlefield: Hardline, Star Citizen, Mirror's Edge, Battlefront, the next Mass Effect, etc. Basically everything from EA that runs on the Frostbite engine, and that's 90% of their lineup, with the exception of their sports titles of course. R9 390 it is! Plus, FreeSync monitors are coming in Q1 2015, and they won't carry a price premium just for supporting the technology.



Excellent interview! I especially like the part about GameWorks. Holy shitsnacks, this is bad. How can anyone support a company like Nvidia when it pulls stuff like this on the entire gaming industry? Pardon my language, but this is infuriating.

The Mantle stuff is very interesting. Would be nice to see it on Linux.

Added: So he basically said Adaptive-Sync is superior to G-Sync, because the communication between the monitor and the GPU is better and the monitor defines a min/max FPS range it can handle. Yeah, G-Sync is stillborn. Great technologies, though. Hope Adaptive-Sync becomes a de facto standard in monitors within a year.

 

What a surprise, an employee of one company has problems with a competitor's technology during an interview. ;)

 

Part of the reason you're not seeing as much outcry right now is that there was a fair bit of it a few weeks ago, in the Forbes article about how GameWorks was potentially bad for the industry. The writer posted benchmarks showing a GTX 770 beating an R9 290X in Watch Dogs, and everybody was worried.
 
And then Nvidia responded shortly afterward, saying it was the developer’s choice whether they wanted to give AMD access to better optimize their drivers. To my knowledge, Ubisoft hasn’t made a comment on the issue, so it’s up to you who you want to believe. As far as I’m concerned, it’s just a whole lot of “he said, she said”.
 
But not too long after Nvidia’s response, we began getting Watch Dogs benchmarks from more legitimate tech websites, and none of the ones I’ve seen were able to corroborate Forbes’ posts about a 770 beating a 290X. In fact, the ‘GPU hierarchy’ of 780 Ti > 290X > 290 = 780, etc. remained pretty much intact. So the way I see it, we’ve had GameWorks implemented in approximately four games already, and aside from the typical small bump Nvidia-optimized games give to Nvidia GPUs (the same bump AMD ‘gaming evolved’ titles give to AMD GPUs), nothing seems to suggest that the industry is in peril.
 
Don’t get me wrong, even though I’m currently using a 780, I’m not totally gung-ho for GameWorks. But until I see some legit data of one side getting “gimped”, I’m not going to get myself too worked up. 

My opinion on the matter: I have no problem with Nvidia implementing technology like this, but if it's going to cause issues on competitors' hardware because of how they implement it, then there should be an in-game option to disable those graphical features, so Nvidia customers can enjoy the effects without it being detrimental to AMD customers. Similar, in a way, to how PhysX, TressFX, etc. work.
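The in-game toggle idea is easy to picture. Here's a minimal sketch (the effect names and the `settings` dict are made up, not from any real engine) where vendor-heavy effects stay off unless the player explicitly enables them:

```python
# Hypothetical effect names -- stand-ins for vendor-specific eye candy.
ALL_EFFECTS = {"hbao_plus", "tessellated_fur", "txaa"}

def enabled_effects(settings):
    """Return only the effects the player has explicitly switched on."""
    return {fx for fx in ALL_EFFECTS if settings.get(fx, False)}

print(enabled_effects({"txaa": True}))  # {'txaa'}
print(enabled_effects({}))              # set() -- everything off by default
```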

Antec 1100 | Asus P8Z87-V | Silverstone Strider 850W 80+ Silver | Intel i5 3570k 4.3Ghz | Corsair h80 | Asus Xonar DGX | Sapphire HD 7850 1000 Mhz | 16 GB Patriot 1600MHz | Intel 330 180GB | OCZ Agility 3 60GB (Cache for HDD) | Seagate Barracuda 2TB | Asus VE247H x2 | Ducky Shine 2 - Cherry MX Brown | Razer Deathadder 3.5G | Logitech G430


No-bullshit interview from MaximumPC with Richard Huddy, from a couple of days ago. There's more heat in this one. :lol:

 

 

https://www.youtube.com/watch?v=fZGV5z8YFM8

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Hopefully Nvidia will wake up and start suing the shit out of AMD for this libel crap. If GameWorks actually functioned the way AMD claims, AMD would be the one suing Nvidia. It would be 100% illegal for Nvidia to do something that prevents a developer from optimizing their games for AMD hardware. Also, no developer would ever pay Nvidia to use GameWorks if that were true. If anyone honestly believes this bullshit, they need to get off the internet. Nvidia has a 2-to-1 market-share advantage in dedicated GPUs not because they illegally force developers to shut out AMD, but because AMD refuses to innovate and just cries like a little kid whenever Nvidia refuses to give away their innovations and technology for free.

i7 2600K @ 4.7GHz/ASUS P8Z68-V Pro/Corsair Vengeance LP 2x4GB @ 1600MHz/EVGA GTX 670 FTW SIG 2/Cooler Master HAF-X

 

http://www.speedtest.net/my-result/3591491194


Hopefully Nvidia will wake up and start suing the shit out of AMD for this libel crap. If GameWorks actually functioned the way AMD claims, AMD would be the one suing Nvidia. It would be 100% illegal for Nvidia to do something that prevents a developer from optimizing their games for AMD hardware. Also, no developer would ever pay Nvidia to use GameWorks if that were true. If anyone honestly believes this bullshit, they need to get off the internet. Nvidia has a 2-to-1 market-share advantage in dedicated GPUs not because they illegally force developers to shut out AMD, but because AMD refuses to innovate and just cries like a little kid whenever Nvidia refuses to give away their innovations and technology for free.

 

He was careful not to name the dev in question, so it's hearsay and not really admissible anyway; by not disclosing the dev he's basically admitting he's just speculating.

And really, I don't think it's as much bullshit as you'd think. I'm sure he's stretching and exaggerating quite a bit, but I would be surprised if he was 100% incorrect.

-------

Current Rig

-------


He was careful not to name the dev in question, so it's hearsay and not really admissible anyway; by not disclosing the dev he's basically admitting he's just speculating.

And really, I don't think it's as much bullshit as you'd think. I'm sure he's stretching and exaggerating quite a bit, but I would be surprised if he was 100% incorrect.

Really? He named Nvidia, then specifically complained about nonsense like "Batman's cape is tessellated more than it needs to be." That's him naming both Nvidia and Warner Brothers, and it is 100% total crap. He is directly accusing Nvidia of designing software that is not optimized on purpose, even claiming that it's not deliberately de-optimized for Nvidia hardware, just worse on AMD. He is a lying sack of crap and I hope Nvidia goes after not just AMD but him personally. Again, NO developer would use GameWorks if what he said was true. Not a single one.

For the record, I'd be calling bullshit against Nvidia if they were saying this sort of crap about AMD too.

 



Hopefully Nvidia will wake up and start suing the shit out of AMD for this libel crap. If GameWorks actually functioned the way AMD claims, AMD would be the one suing Nvidia. It would be 100% illegal for Nvidia to do something that prevents a developer from optimizing their games for AMD hardware. Also, no developer would ever pay Nvidia to use GameWorks if that were true. If anyone honestly believes this bullshit, they need to get off the internet. Nvidia has a 2-to-1 market-share advantage in dedicated GPUs not because they illegally force developers to shut out AMD, but because AMD refuses to innovate and just cries like a little kid whenever Nvidia refuses to give away their innovations and technology for free.

 

They also have the market share because AMD suffered greatly in the first part of the '00s, at the hands of a downturn in the economy and some shoddy business practices by Intel. They have not had the cash to put into R&D since and are still playing catch-up.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


They also have the market share because AMD suffered greatly in the first part of the '00s, at the hands of a downturn in the economy and some shoddy business practices by Intel. They have not had the cash to put into R&D since and are still playing catch-up.

That's Nvidia's fault how? That's right: it's not. Also, your argument is invalid, since it was ATI who crashed, and they were saved by AMD. If they can't get their shit together after a decade, they don't deserve to be in business anymore.



That's Nvidia's fault how? That's right: it's not. Also, your argument is invalid, since it was ATI who crashed, and they were saved by AMD. If they can't get their shit together after a decade, they don't deserve to be in business anymore.

WTF? I never said it was Nvidia's fault, and ATI going arse-up has nothing to do with AMD's current predicament.

Learn some history before making silly remarks. I wasn't disagreeing with you; I was just adding to the reasons why Nvidia has twice the market share.



Really? He named Nvidia, then specifically complained about nonsense like "Batman's cape is tessellated more than it needs to be." That's him naming both Nvidia and Warner Brothers, and it is 100% total crap.

Well, that's not really code-related. That's like saying, "Oh, BMW puts way too much horsepower in this specific engine; it's mostly wasted, and the cars would work better with less horsepower but a more efficient transmission." That's just criticism of overall design decisions. You can't sue them for it, any more than you can sue TotalBiscuit for saying a game runs badly and is poorly optimized.

 

He is directly accusing Nvidia of designing software that is not optimized on purpose, even claiming that it's not deliberately de-optimized for Nvidia hardware, just worse on AMD.

Well, to start, that was a separate point (which is why I separated the quote). Also, he's just speculating from what he can observe: at one point there was just a DLL, and as a rumor he heard from unnamed sources that Nvidia enforces an NDA on the code they are now sharing. Even now, for whatever reason, they can't really optimize properly and devs have to fend for themselves. As an unconfirmed rumor he points at an NDA, but he mentions that he cannot say who or give any other circumstances. He also points out that Nvidia is free to come out, say he's wrong, and offer the code so they can take a look.

 

He is a lying sack of crap and I hope Nvidia goes after not just AMD but him personally. Again, NO developer would use GameWorks if what he said was true. Not a single one.

That I won't address at all, nor do I think there's any point in continuing this conversation. You go ahead and keep raging if you want, but leave me out of it if you're going to have this attitude.



Really? He named Nvidia, then specifically complained about nonsense like "Batman's cape is tessellated more than it needs to be." That's him naming both Nvidia and Warner Brothers, and it is 100% total crap.

How do you know it's nonsense?


No-bullshit interview from MaximumPC with Richard Huddy, from a couple of days ago. There's more heat in this one. :lol:

 

 

 

 

Everyone needs to watch the entirety of this interview. It's quite clear Nvidia is not innocent here. AMD uses Mantle to buff their own hardware; Nvidia uses GameWorks to cripple AMD hardware performance. One tactic is fair play: features like ShadowPlay, which allows low-cost video recording while playing games, are a straight-up perk of Nvidia hardware (although apparently AMD just released a similar update that does the same on theirs), and that is a legitimate kind of addition. But building in excessive tessellation that isn't needed, and that slows down AMD hardware more than Nvidia hardware, is not kosher. They hurt their own customers' performance just to spite AMD's performance even more.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Not this again, and opened by the same dude. This is like a cry for help from AMD, filled with marketing crap. They can't just "demand" that stuff made by the opposing company be made available. If Nvidia creates something, it's theirs to do with whatever they want. Dangerous to the ecosystem? Not really, in my opinion, since it pushes AMD to actually care enough to make something similar. It's basically two-sided competition.

That said, I would be OK with AMD doing stuff that doesn't work on Nvidia hardware too. That's how this shit has worked so far. Why is it only a problem now?

And why did this come up after the Watch Dogs "thingie"? I don't understand how people keep complaining about Nvidia or calling what they do "tactics". It's their technology and their way of doing business. AMD is no one to call them out on this. People just don't want to see that most developers choose a side to work with, which is what happened here.

Why the fuck does AMD keep insisting that Nvidia GameWorks screws them over? It didn't. AMD doesn't need Nvidia's eye-candy technology to make a game run well on their hardware with plain DX11. People, especially the fucking OP, need to stop this circlejerk. Stuff like this "incident" has been happening for years, but only now is it a problem.

Stop, think, and fucking get it into your heads that Watch Dogs and most Ubisoft games run terribly on PC in general. I don't see how Nvidia is crippling AMD here. AMD doesn't need to implement Nvidia technology in their drivers or whatever, god. The game is mediocre by default, and AMD needs to stop fucking crying about this and making people go all pity-and-hate on Nvidia.

And please, refrain from even thinking I'm an Nvidia fanboy. I own a 280X and have an AMD GPU in my laptop (which has had broken video acceleration since the 13.3 drivers or so; thanks, AMD).


Also, no developer would ever pay Nvidia to use GameWorks if that was true.

It's actually the other way around: Nvidia pays the developer to implement their features. Those deals are often upwards of a million dollars, and up to 5 million. The features they implement might be exclusive to the Nvidia platform, or, in the case of GameWorks, they run on AMD hardware too but purposefully cripple performance, because they make AMD run code that Nvidia isn't actually running, and on top of that Nvidia locks AMD out of optimizing this code. This is quite malicious. I noticed that you keep saying "illegal". There is nothing illegal here. It's just plain, simple evilness.

 

 

Not this again, and openned by the same dude. 

Are you being serious? Go look at the date of the original post before you start crapping all over the place: it was posted on June 19th, and my Watch Dogs thread was from yesterday. This thread had nothing to do with bashing GameWorks; the members of the forum started discussing GameWorks in here. This was just a post about an interview. I can't stand your bullshit.



It's actually the other way around: Nvidia pays the developer to implement their features. Those deals are often upwards of a million dollars, and up to 5 million. The features they implement might be exclusive to the Nvidia platform, or, in the case of GameWorks, they run on AMD hardware too but purposefully cripple performance, because they make AMD run code that Nvidia isn't actually running, and on top of that Nvidia locks AMD out of optimizing this code. This is quite malicious. I noticed that you keep saying "illegal". There is nothing illegal here. It's just plain, simple evilness.

[Citation needed]

If you don't have a reliable source, then you should not make bold claims like that, because THAT is in fact illegal.

 

 

Haven't watched the video yet, so I won't comment on that, but I will follow my rule of not trusting anything a spokesperson from company X says about X or about their main competitor Y.

Take everything said in this interview with a big grain of salt, just like you should take anything someone from Nvidia says with a big grain of salt, at least when the statement can't be objectively proven.


[Citation needed]

If you don't have a reliable source, then you should not make bold claims like that, because THAT is in fact illegal.

No no, jmaster299 was talking about how the GameWorks practices would be illegal "if they were true".



No no, jmaster299 was talking about how the GameWorks practices would be illegal "if they were true".

I was talking about this part:

It's actually the other way around: Nvidia pays the developer to implement their features. Those deals are often upwards of a million dollars, and up to 5 million. The features they implement might be exclusive to the Nvidia platform, or, in the case of GameWorks, they run on AMD hardware too but purposefully cripple performance, because they make AMD run code that Nvidia isn't actually running, and on top of that Nvidia locks AMD out of optimizing this code. This is quite malicious. I noticed that you keep saying "illegal". There is nothing illegal here. It's just plain, simple evilness.

You can't just say that Nvidia pays developers to implement features (even going so far as to be pretty specific about how much they pay) and then claim that Nvidia has purposely made these features run worse on AMD hardware.

If you don't have proof that they have code in GameWorks specifically made to make AMD GPUs do completely useless calculations, which Nvidia skips in their GPUs, then please don't say they do.


I was talking about this part:

You can't just say that Nvidia pays developers to implement features (and even going so far as to be pretty specific in how much they pay) and then say that Nvidia has purposely made it so that these features run worse on AMD hardware.

If you don't have proof that they have code in GameWorks specifically made to make AMD GPUs do completely useless calculations, which Nvidia skips in their GPUs, then please don't say they do.

 

Huddy has mentioned in a couple of interviews that Nvidia uses tessellation where it isn't required, to bog down AMD cards while their own cards can just about handle it (due to their CUDA implementation). For example, they use tessellation on hair or fur when both companies have an equally optimized TressFX implementation that is more efficient and takes less time to compute for a better end result.

 

Now, I'm not a software engineer, so I can't verify his statements from personal/professional experience, but the examples he mentions do work better on Nvidia hardware (Batman's cape in Arkham Origins, dog fur in CoD, water in Crysis 2, IIRC).
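One back-of-the-envelope way to see why "more tessellation than it needs" matters: with uniform tessellation, each patch edge gets subdivided `factor` times, so triangle count grows roughly with the square of the factor. The base mesh size below is invented purely for illustration:

```python
def tessellated_triangles(base_triangles, factor):
    """Approximate triangle count after uniform tessellation at `factor`."""
    return base_triangles * factor ** 2

# The same made-up 10,000-triangle mesh at increasing tessellation factors:
for factor in (1, 8, 16, 64):
    print(factor, tessellated_triangles(10_000, factor))
# Factor 64 pushes ~16x more triangles than factor 16 for the same silhouette.
```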

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II

