AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

The reason why I hate Nvidia, and the reason why I won't buy AMD.

 

Ok, have fun with your Intel iGPUs ayyyy


This guy is always saying Nvidia sabotaged AMD performance, yet there have been no lawsuits or complaints filed against Nvidia. If sabotage were actually occurring, AMD would have already done something about it.

| Currently no gaming rig | Dell XPS 13 (9343) |

| Samsung Galaxy Note5 | Gear VR | Nvidia Shield Tab | Xbox One |


Ideally, neither AMD nor Nvidia should be producing technologies which perform proprietary graphics calculations/algorithms. The case for AMD doing it is somewhat better because they open source their innovations; however, it probably still shouldn't be a thing.

Hardware manufacturers should just build hardware and the drivers which implement specific APIs. The fact that GPU makers have to release and optimise drivers for specific applications is a broken situation in itself. Hopefully DX12 and Vulkan will help with this, but I'm not particularly hopeful.

If GameWorks were the product of a 3rd party, that 3rd party could make sure their algorithms work well on all GPU vendors, perhaps switching rendering technologies depending on the hardware available, e.g. CUDA vs. OpenCL.
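
A minimal sketch of what that runtime hardware check could look like (illustrative only; it assumes both the CUDA and OpenCL SDK headers are installed, and is not how any existing middleware actually does it):

```cpp
#include <cuda_runtime.h>   // cudaGetDeviceCount
#include <CL/cl.h>          // clGetPlatformIDs

enum class ComputeBackend { Cuda, OpenCL, CpuFallback };

// Pick a compute backend at runtime instead of hard-coding one vendor's API.
ComputeBackend selectBackend() {
    int cudaDevices = 0;
    if (cudaGetDeviceCount(&cudaDevices) == cudaSuccess && cudaDevices > 0)
        return ComputeBackend::Cuda;        // Nvidia hardware present
    cl_uint numPlatforms = 0;
    if (clGetPlatformIDs(0, nullptr, &numPlatforms) == CL_SUCCESS && numPlatforms > 0)
        return ComputeBackend::OpenCL;      // AMD/Intel (or any other OpenCL) hardware present
    return ComputeBackend::CpuFallback;     // no GPU compute available
}
```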

Intel got a *huge* fine (though it was still far less than the amount they profited by) for making their x86 C++ compiler cripple performance on AMD chips, and I am finding it difficult to see a difference between that and what GameWorks does.

TressFX is open source and runs well(ish) on both hardware platforms. Granted, it probably doesn't do everything HairWorks does, but being open source, Nvidia could have easily contributed to the project, benefiting the industry as a whole.

The Intel suit was BS too. ICC optimizes per architecture based on the clock counts per instruction. It can't be expected to optimize to the same level for AMD.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Very taxing does not equal unoptimized. Some things, like computing individual hair movements based on wind and motion, are very resource heavy even if well optimized.

 

Your red herring with adaptive sync is completely irrelevant. AMD is behind in terms of tessellation performance and has been ever since the 5000 series (about 5 years ago). What tier of DirectX compatibility they are at is irrelevant, because it doesn't change the fact that AMD cards don't perform well with high tessellation on. Even their own "fix" is to override game settings and force less tessellation.

 

We will most likely be able to set tessellation to whatever we want when the patch comes, but for now we have an on/off switch for HairWorks.

 

When you use an excessive amount of tessellation (64x) without any added graphical fidelity, it's just wasteful, and wasteful is by definition unoptimized. I bet no one can see any difference between 32x and 64x, and I think most can't see any difference between 32x and 16x.
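
To put rough numbers on that, here's a quick back-of-envelope sketch (a rule of thumb for a tessellated patch; HairWorks' isoline tessellation scales somewhat differently, but the diminishing-returns point is the same, and the cap is essentially what a driver-side override does):

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical user-set cap, like the tessellation slider in AMD's driver control panel.
    const float driverCap = 16.0f;
    for (float factor : {8.0f, 16.0f, 32.0f, 64.0f}) {
        // Rule of thumb: triangles per patch grow roughly with the square of the factor.
        float capped = std::min(factor, driverCap);
        std::printf("requested %2.0fx -> ~%4.0f tris/patch | capped %2.0fx -> ~%3.0f tris/patch\n",
                    factor, factor * factor, capped, capped * capped);
    }
    // Going from 32x to 64x roughly quadruples the triangle load for a
    // difference most people can't see on something as thin as a hair strand.
    return 0;
}
```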

 

Just a general comment on one side being behind the other: Maxwell is very good at tessellation, but HairWorks is wasteful with it, causing both AMD and non-Maxwell Nvidia cards to suffer, and that should be criticized by all parties.

 

For future performance, full DX12 support will probably be quite crucial.

 

AMD has to override tessellation, because it's wasteful and they cannot optimize the HairWorks effect itself.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I still fail to see how the game is "ruined" because of hairworks. Here's an idea, just turn it off.

Well, just because you turn it off doesn't mean that none of the code from Nvidia GameWorks is running; it's not like a plugin.

WHY!!!


This guy is always saying Nvidia sabotaged AMD performance, yet there have been no lawsuits or complaints filed against Nvidia. If sabotage were actually occurring, AMD would have already done something about it.

 

Huddy is a loudmouth that communicates back and forth between AMD engineers and game developers and figures out what game developers need, which lets AMD software/hardware engineers know what to improve in future GPU architectures, drivers, etc. He's venting because CDPR developers inadvertently slipped a knife from Nvidia into his back 2 months before The Witcher 3 released, and he's a little salty about it.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


The problem with Intel's ICC was not the fact that it intentionally made AMD processors perform worse, just that they weren't open about it.

To this day, I believe that ICC will favor an Intel processor over an AMD one; however, it is now made clear that there is an unfair advantage.


These technologies are not even THAT good most of the time. They may look cool, but the physics are often somewhat off.

I don't particularly care for those specific technologies, since they're not that great but are resource hogs. I'm sure they will get better on both fronts in the future.

 

AMD didn't even release a new driver though. The last one was quite a while ago.

I'm running the latest stable driver on my R9 290 and both GTA 5 and The Witcher 3 run pretty well.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Of course he said that; he doesn't know how to open his mouth without blaming, criticizing or otherwise simply being derisive toward Nvidia.

 

Huddy is a loudmouth who gets his talking points from reddit users and spends more time talking shit than he does delivering new products.

 

AMD can't win the product war so they have to try the PR war, since that's the only thing they seem to be good at these days.

 

AMD: Gaming Shitposting Evolved

 

 

Huddy is a loudmouth that communicates back and forth between AMD engineers and game developers and figures out what game developers need, which lets AMD software/hardware engineers know what to improve in future GPU architectures, drivers, etc. He's venting because CDPR developers inadvertently slipped a knife from Nvidia into his back 2 months before The Witcher 3 released, and he's a little salty about it.

 

 

I'd be salty too if developers were dumping my platform and going to the other side because the other side offered more money, more engineers, more support or just in general cared more about helping with optimization on the game and driver side to give people better experiences. 

 

Huddy is an embarrassment these days. 


WTF is a Chief Gaming Scientist? That sounds like a made up title...

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Am I the only one just waiting for TotalBiscuit to do a video on this topic and introduce a bit of sanity into this argument?

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


WTF is a Chief Gaming Scientist? That sounds like a made up title...

 

It's an insult to the term scientist.

 

That baboon wouldn't know science if the dictionary definition slapped him in the face. 

 

the intellectual and practical activity encompassing the systematic study of the structure and behavior of the physical and natural world through observation and experiment.

 

 

Dunno about everyone else, but talking shit about your competitor without much proof and off the backs of rumours and deceit hardly says "scientist". Rather, it says "fanboy". 


Huddy is a loudmouth who gets his talking points from reddit users and spends more time talking shit than he does delivering new products.

 

AMD can't win the product war so they have to try the PR war, since that's the only thing they seem to be good at these days.

 

AMD: Gaming Shitposting Evolved

 

I'd be salty too if developers were dumping my platform and going to the other side because the other side offered more money, more engineers, more support or just in general cared more about helping with optimization on the game and driver side to give people better experiences. 

 

Huddy is an embarrassment these days. 

 

Your rhetoric is getting more and more extreme. But then again, you already failed Godwin's law, so no surprise there.

 

AMD has talked fine with CDPR, and the game is very well optimized on AMD. The only exception is the black-boxed GameWorks effect, HairWorks, which neither CDPR nor AMD has access to optimize.

 

Something is embarrassing here, but it's not Huddy.

 

WTF is a Chief Gaming Scientist? That sounds like a made up title...

 

All titles are made up ;)

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Am I the only one just waiting for TotalBiscuit to do a video on this topic and introduce a bit of sanity into this argument?

 

He said he's not going to do a video on Witcher 3, and he already talked about GameWorks in his Watch Dogs port report, I believe.

 

I'd like to hear people in the industry talk about HairWorks vs. TressFX though...


WTF is a Chief Gaming Scientist? That sounds like a made up title...

 

lol, not only that, but even if that title exists,

it really sounds very logical that he has time to speak to just an ordinary LTT user...

 

yeah right. :P


The problem with Intel's ICC was not the fact that it intentionally made AMD processors perform worse, just that they weren't open about it.

To this day, I believe that ICC will favor an Intel processor over an AMD one; however, it is now made clear that there is an unfair advantage.

There is a warning on the product page and in the manual that it only supports optimization for i386 and Intel64 architectures. If you change your CPUID string, the front end will query the chip for available instructions. From there it determines the architecture family and then builds the code based on the cycle count and cache latency rules, which were painstakingly developed from an optimization program (yes, a program was used to build the core component of another program) that likely took the form of an integer linear program or genetic algorithm.
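
For reference, the vendor/feature query being described boils down to a CPUID call; a minimal sketch (illustrative only, not ICC's internals), using the GCC/Clang <cpuid.h> helper:

```cpp
#include <cpuid.h>    // __get_cpuid (GCC/Clang); MSVC has __cpuid in <intrin.h>
#include <cstdio>
#include <cstring>

int main() {
    unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
    char vendor[13] = {};
    if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
        // CPUID leaf 0 returns the 12-byte vendor string in EBX, EDX, ECX (in that order).
        std::memcpy(vendor + 0, &ebx, 4);
        std::memcpy(vendor + 4, &edx, 4);
        std::memcpy(vendor + 8, &ecx, 4);
    }
    // A dispatcher that keys on the vendor string behaves the way the ICC complaint describes;
    // one that keys on feature bits (SSE2/AVX flags in other leaves) treats AMD chips fairly.
    if (std::strcmp(vendor, "GenuineIntel") == 0)
        std::puts("vendor check: take the fully optimized code path");
    else
        std::puts("vendor check: fall back to a generic code path");
    return 0;
}
```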

Did Cinebench decide to switch to GCC or Clang?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Your rhetoric is getting more and more extreme. But then again, you already failed Godwin's law, so no surprise there.

AMD has talked fine with CDPR, and the game is very well optimized on AMD. The only exception is the black-boxed GameWorks effect, HairWorks, which neither CDPR nor AMD has access to optimize.

Something is embarrassing here, but it's not Huddy.

All titles are made up ;)

AMD has two problems, one of which is very easy to solve.

1) It has no CUDA license (despite being offered one for free twice).

2) It has a terrible tessellation engine, which is basically all HairWorks runs on. Is it Nvidia's fault AMD is just that bad at it?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


All titles are made up ;)

 

True, but legitimate titles give insight into what you do and are generally widely accepted.

 

E.g. Chief Engineer, Director of Regional Sales, etc. His title sounds like it was given to him by a six-year-old.

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Why reinvent the wheel? They used DirectX as well because it is more convenient than writing everything in machine code. They knew that HairWorks wouldn't run well so they gave us an on/off switch.

I think CDPR handled this flawlessly. They gave all the power to the users.

I do not understand the context of your question. I am only speaking in regards to what CDPR said in past interviews. I myself understand why someone would use code available to them, but it contradicts their previous statements regarding in-house development. I think the blame on Nvidia is rather silly, as they only make the software available; they are not forcing developers to use it. I do, however, find it very plausible that this software is contributing to shortcomings on AMD's side. CDPR did say that certain optimizations just could not be done for AMD because of how Nvidia's software works. That being said, if you can turn it off and remove the performance hit, then I do not see why this is such a big fuss.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


I guess we have to wait for Arkham Knight and see if the same pattern happens (on both the 700 series and AMD) to tell whether GW is the problem.

 

Whatever is happening, it's extremely bad for gamers.

FX-8120 | ASUS Crosshair V Formula | G.Skill Sniper 8GB DDR3-1866 CL9 | Club3D Radeon R9 290 RoyalAce |Thermaltake Chaser MkIII | 128GB M4 Crucial + 2TB HDD storage | Cooler Master 850M | Scythe Mugen 3 | Corsair Strafe RGB | Logitech G500


AMD has two problems, one of which is very easy to solve.

1) It has no CUDA license (despite being offered one for free twice).

2) It has a terrible tessellation engine, which is basically all HairWorks runs on. Is it Nvidia's fault AMD is just that bad at it?

 

We have no clue about the consequences of such a CUDA license. The only other company that has one (Intel) doesn't seem to use it for anything.

AMD's tessellation is not quite as optimized as Maxwell's, but we don't know how well the new GCN handles it.

 

But is it necessary? If you just waste excessive amounts of tessellation on HairWorks at 64x, or on the world's most detailed concrete slab in Crysis 2, then what is the point of spending resources on it? There does not seem to be any difference in graphical fidelity between 64x and 32x in HairWorks, and not noticeably at 16x either. So what is the point of extreme tessellation when it is that taxing and the law of diminishing returns hits so hard?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


*sigh*

 

Look, as an AMD owner, I still think Richard needs to, well:

 

[Penn Jillette "shut the fuck up" gif]

 

He has made his point before and there is nothing to gain from throwing more flames on the fucking fire. The feature can be fucking turned off, for fuck's sake. That makes it close to a non-fucking-issue; it's not like the non-HairWorks hair looks disgusting or makes all the characters go fucking bald. This actually doesn't help us AMD gamers and just furthers his fucking agenda.

 

And for what? Is Nvidia gonna change? Fat fucking chance. This would just marginalize AMD gamers even further. Wait for a fucking game where you can't turn off GameWorks to make a big stink; otherwise this is just pandering.

-------

Current Rig

-------


-snip-

-snip-

 

 

 

AMD are struggling and they seem to be trying to blame everything else for their issues... I was really considering getting a 300 series GPU, but now I'm not so sure. Looks like I'll get a 980.

Edited by Blade of Grass

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


 

 

 

 

AMD are struggling and they seem to be trying to blame everything else for their issues... I was really considering getting a 300 series GPU, but now I'm not so sure. Looks like I'll get a 980.

 

Word on the street is that the 390/X that people were looking forward to is nothing more than a 290/X rebrand. The HBM card is going after the Titan, which is nice, but maybe 5% of the market is even remotely interested in those cards, much less buying them.

 

So I will probably get really, really cheap 290s (maybe even the 8GB versions) and call it a day. AMD gets no money from me, I get cheap cards. Win-win?


This topic is now closed to further replies.