
AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

Of course he said that; he doesn't know how to open his mouth without blaming, criticizing, or otherwise being derisive toward Nvidia.

 

Sadly, games are made in the direction of Nvidia's architecture. Not a lot of game developers are behind AMD, so they say "fuck AMD users," because AMD only has about 20% of the gaming card market, which is nothing compared to Nvidia's 80%.

EOC folding stats - Folding stats - My web folding page stats

 

Summer Glau quote: "The future is worth fighting for." - Serenity

 

My Linux setup: CPU: i7 2600K @ 4.5GHz, RAM: Corsair Vengeance 16GB @ 1600MHz, GPU: 2-way HIS IceQ X2 Radeon 7970, MB: Asus Sabertooth Z77, PSU: Corsair 750 Plus Gold modular

 

My gaming setup: CPU: i7 3770K @ 4.7GHz, RAM: Corsair Vengeance 32GB @ 1600MHz, GPU: 2-way Gigabyte RX580 8GB, MB: Asus Sabertooth Z77, PSU: Corsair 860i Platinum modular


Sadly, games are made in the direction of Nvidia's architecture. Not a lot of game developers are behind AMD, so they say "fuck AMD users," because AMD only has about 20% of the gaming card market, which is nothing compared to Nvidia's 80%.

 

The issue really seems to come back to tessellation, though. HairWorks is heavily dependent on tessellation processing power, which AMD hardware typically has less of. That is not the same as proprietary monopolizing or intentionally crippling your opposition. The issue here is not the effect it has on AMD cards, but whether or not the devs can alleviate some of the performance issues by dialing down the effect. They say they can't, while AMD says they can with driver updates.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


Why is a setting that runs like shit on both cards, and that can be turned off, such a big deal to everyone?

I am wondering this myself. How many times can someone say "turn it off"? It doesn't add much to the game anyhow.


Nvidia sabotaged themselves, as a 960 runs better than a 780.


The Witcher 3 release is already the most interesting release of the year, IMO.

Unless Ubisoft releases that new AC this year :)

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/


Nothing wrong with The Witcher 3 performance.

i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor

Putting the blame on Nvidia or AMD aside, what I really want to know is why CD Projekt Red decided to use GameWorks in the first place. Weren't they the same company that stated "we kept everything in-house, including the engine, because we felt we could do better with our own work," or something along those lines? It just seems odd to me for a company to put quite a bit of importance on developers learning to stand on their own feet, then adopt technology that may affect some of their consumers' hardware. I hope this practice stops soon. Game developers need to start optimizing their own products more before handing them off to the driver engineers, as it is getting old for Nvidia and AMD to pick up the slack of lazy coding.

 

Sorry for the off-topic rant; I just think this entire Witcher 3 situation is extremely odd.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Putting the blame on Nvidia or AMD aside, what I really want to know is why CD Projekt Red decided to use GameWorks in the first place. Weren't they the same company that stated "we kept everything in-house, including the engine, because we felt we could do better with our own work," or something along those lines? It just seems odd to me for a company to put quite a bit of importance on developers learning to stand on their own feet, then adopt technology that may affect some of their consumers' hardware. I hope this practice stops soon. Game developers need to start optimizing their own products more before handing them off to the driver engineers, as it is getting old for Nvidia and AMD to pick up the slack of lazy coding.

 

Sorry for the off-topic rant; I just think this entire Witcher 3 situation is extremely odd.

Why reinvent the wheel? They used DirectX as well because it is more convenient than writing everything in machine code. They knew that HairWorks wouldn't run well so they gave us an on/off switch.

I think CDPR handled this flawlessly. They gave all the power to the users.


I think it would be bigger news if Ubisoft had a year they didn't release an AC game. :blink:

True that ^^

 

Why reinvent the wheel? They used DirectX as well because it is more convenient than writing everything in machine code. They knew that HairWorks wouldn't run well so they gave us an on/off switch.

I think CDPR handled this flawlessly. They gave all the power to the users.

"Flawlessly" is a stretch, but they did a decent job.

The Nvidia 'sponsoring' seems a bit obvious (with Nvidia releasing those wallpapers before release and such), but it is indeed good that they gave 'us' the option.

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/


Putting the blame on Nvidia or AMD aside, what I really want to know is why CD Projekt Red decided to use GameWorks in the first place. Weren't they the same company that stated "we kept everything in-house, including the engine, because we felt we could do better with our own work," or something along those lines? It just seems odd to me for a company to put quite a bit of importance on developers learning to stand on their own feet, then adopt technology that may affect some of their consumers' hardware. I hope this practice stops soon. Game developers need to start optimizing their own products more before handing them off to the driver engineers, as it is getting old for Nvidia and AMD to pick up the slack of lazy coding.

Sorry for the off-topic rant; I just think this entire Witcher 3 situation is extremely odd.

It's just that CDPR is starting to slide down the AAA slope. Soon they'll be no better than the rest of 'em.

- snip-


The issue really seems to come back to tessellation though,  Hairworks is heavily dependent on tessellation processing power, while AMD hardware typically has less.  This is not the same as proprietary monopolizing, or intentionally crippling your opposition.  The issue here is not the effect it has on AMD cards but whether or not the dev's can alleviate some of the performance issues by dialing down the effect?  They say they can't while AMD say they can with driver updates. 

 

Well, the question is *how* they send those tessellation commands to the GPU. DX11 has an API that handles tessellation (and I assume OpenGL has a tessellation API somewhere too), and both GPU makers have designed their cards to execute those commands at a decent level.

 

I have no proof of this, but I strongly suspect the HairWorks code is sending tessellation commands to the GPU in a non-standard way that relies heavily on the specific architecture of Nvidia's Maxwell cards.

 

Even if AMD cards were amazing at tessellation, I expect it would still run poorly on them.
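
For context on what the "standard way" looks like: in plain D3D11, an application enables tessellation by binding a hull shader and a domain shader and drawing a control-point patch list, and each vendor's driver maps those same API calls onto its own hardware tessellator. Below is a minimal sketch of that path; the shader bytecode parameters are placeholders, error handling is omitted, and this only illustrates the public API, not anything from HairWorks itself.

```cpp
// Minimal sketch of the standard D3D11 tessellation path (illustrative only).
// hsBytecode/dsBytecode are assumed to hold already-compiled HLSL shaders.
#include <d3d11.h>

void DrawTessellatedPatches(ID3D11Device* device, ID3D11DeviceContext* ctx,
                            const void* hsBytecode, SIZE_T hsSize,
                            const void* dsBytecode, SIZE_T dsSize,
                            UINT controlPointCount)
{
    ID3D11HullShader* hullShader = nullptr;
    ID3D11DomainShader* domainShader = nullptr;

    // The hull shader is where the tessellation factors live (e.g. 8x vs 64x);
    // the domain shader positions the vertices the tessellator generates.
    device->CreateHullShader(hsBytecode, hsSize, nullptr, &hullShader);
    device->CreateDomainShader(dsBytecode, dsSize, nullptr, &domainShader);

    ctx->HSSetShader(hullShader, nullptr, 0);
    ctx->DSSetShader(domainShader, nullptr, 0);

    // Tessellation requires drawing a control-point patch list topology.
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

    ctx->Draw(controlPointCount, 0);

    if (hullShader) hullShader->Release();
    if (domainShader) domainShader->Release();
}
```

As long as a library sticks to this path, both vendors' drivers see the same standard calls; the performance difference then comes down to how well each GPU's fixed-function tessellator handles the requested factors.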

Link to comment
Share on other sites

Link to post
Share on other sites

Huddy is such a bulldog sometimes; he needs to pick his battles better. Nvidia cards that aren't Maxwell-based are hit with massive performance problems in Witcher 3, while all GCN-based AMD cards (HD 7000 and newer) are doing fine and performing well, even being able to run HairWorks through manual tessellation adjustments (although HairWorks doesn't really look so much better that it's worth turning on anyway).

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


The reason why I hate Nvidia, and the reason why I won't buy AMD.

If your grave doesn't say "rest in peace" on it, you are automatically drafted into the skeleton war.


The reason why I hate Nvidia, and the reason why I won't buy AMD.

So you're waiting for Intel to make a PCIe GPU?

"The of and to a in is I that it for you was with on as have but be they"


Runs fine on my 280X with HairWorks disabled. So whatever...

 

If you go to your Catalyst suite and into the gaming tab, you can create a profile for witcher3.exe. Then set a manual tessellation level of 4x-16x (probably 4x or 8x), and you can use HairWorks without having your performance tank. The standard setting is 64x, btw. Note that 2x will make Geralt look like a bald bear, and 4x might not be enough.
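
To give a rough sense of why capping the factor helps, here is a back-of-the-envelope sketch (my own illustration, not anything from CDPR or AMD): assuming uniform integer partitioning, a tessellation factor of N splits each patch edge into N segments and produces roughly N^2 triangles per patch, so 64x generates about sixteen times the geometry of 16x and 256 times that of 4x.

```cpp
// Rough estimate of triangle amplification per tessellation factor,
// assuming uniform integer partitioning (~N^2 triangles per patch).
#include <cstdio>

int main() {
    const int factors[] = {2, 4, 8, 16, 64};
    for (int n : factors) {
        long trianglesPerPatch = static_cast<long>(n) * n;
        printf("factor %2dx -> ~%5ld triangles per patch (%.1f%% of 64x)\n",
               n, trianglesPerPatch, 100.0 * trianglesPerPatch / (64.0 * 64.0));
    }
    return 0;
}
```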

 

But what if the reason the code can't be optimised is because of AMD drivers and not because of HairWorks? Not saying it is, but if AMD is confident they can fix the issue with drivers, then why couldn't CDPR fix it with code?

 

Although I do wish they would take it to court; then the developers would be under oath to tell the truth about it all.

 

Optimization should always lie in the game, not in the driver itself. But even if that were the case, AMD cannot really optimize, because GameWorks is a black box to them. And even if they could, there is still very little you can do at the very end of the graphics stack, as stated by an ex-Valve programmer.

 

This is why Catalyst now has a tessellation setting (due to the Crysis 2 concrete slabs and underground water, and Batman's cape in Origins).

 

I kinda do too. What little we hear about these GameWorks contracts is quite fucked.

 

I don't know about you, Dick, but I think it is more likely that your cards are just behind in terms of tessellation performance. You know, tessellation: the thing you have been way behind on for about 5 years now (ever since the 5000 vs 400 series days). The thing you were heavily criticized for, and promised a quadrupling of when going from the 5000 to the 6000 series, because the performance was so abysmal.

You know, the thing you "fixed" in drivers by simply setting a limit on how much tessellation can be going on at any time.

If you spent less time criticizing Nvidia and more time fixing that, we might not have needed this debate.

 

A Titan X gains about 24% more performance without HairWorks, so I think we can conclude it's extremely taxing and very unoptimized. Maxwell does seem to do well, but Kepler is already falling behind a LOT.

 

As for falling behind, the fact is that even 7000-series AMD cards (GCN 1) are 100% compatible with DX12; all GCN cards are tier 3 DX12, while even the Titan X (Maxwell v2) is only tier 2. And even the Titan X uses an old, obsolete DisplayPort 1.2 controller from 2012, which is why Nvidia can't support Adaptive Sync even if they wanted to. Nvidia is the company that is technologically behind.

 

As for the last part, read my answer to moose above. ↑

 

The issue really seems to come back to tessellation though,  Hairworks is heavily dependent on tessellation processing power, while AMD hardware typically has less.  This is not the same as proprietary monopolizing, or intentionally crippling your opposition.  The issue here is not the effect it has on AMD cards but whether or not the dev's can alleviate some of the performance issues by dialing down the effect?  They say they can't while AMD say they can with driver updates. 

 

I generally agree with this, but the problem is that it seems CDPR cannot optimize it, which I assume includes setting the tessellation multiplier. After all, it is either off or 64x on Nvidia systems, whereas on AMD you can set it manually in Catalyst (or have AMD set it). Catalyst has full low-level control of the GPU, so AMD can do pretty much whatever is possible.

When we see AMD's Witcher 3 driver later in the week, I think we will see tessellation being lowered from 64x down to around 16x on high-end cards.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Huddy is such a bulldog sometimes; he needs to pick his battles better. Nvidia cards that aren't Maxwell-based are hit with massive performance problems in Witcher 3, while all GCN-based AMD cards (HD 7000 and newer) are doing fine and performing well, even being able to run HairWorks through manual tessellation adjustments (although HairWorks doesn't really look so much better that it's worth turning on anyway).

 

This.

 

If anything, Nvidia gimped their own cards (Kepler).

GCN GPUs perform fine in the game.


Why are people even caring about a feature that is either shit or still in development?

... Life is a game and the checkpoints are your birthdays; you will face challenges where you may not get rewarded afterwards, but those are the challenges that help you improve yourself. Always live for tomorrow, because you may never know when your game will be over ... I'm totally not going insane in any way, shape or form ... I just have broken English and an open mind ...


They should have just made Ultra have HairWorks off by default, just like HBAO+ isn't on at the High setting. I wonder how many people would even have noticed performance issues and been crying because they "couldn't run Ultra".


 

Why are people even caring about a feature that is either shit or still in development?

 

Because we need reasons to blame either Nvidia or AMD... that's just how it works when you only have two manufacturers for CPUs or GPUs.

Why isn't there any "flame war" at this scale in, say, motherboards, RAM, SSDs, etc.?

'Course we know why: because the CPU and GPU are the two parts that matter most for performance, so we need them to work flawlessly.

MARS_PROJECT V2 --- RYZEN RIG


 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


Because we need reasons to blame either Nvidia or AMD... that's just how it works when you only have two manufacturers for CPUs or GPUs.

Why isn't there any "flame war" at this scale in, say, motherboards, RAM, SSDs, etc.?

'Course we know why: because the CPU and GPU are the two parts that matter most for performance, so we need them to work flawlessly.

I always dislike flame wars ... They are too one-sided ... It needs a third party to confirm whether something works or not, someone who is unbiased ...

... Life is a game and the checkpoints are your birthdays; you will face challenges where you may not get rewarded afterwards, but those are the challenges that help you improve yourself. Always live for tomorrow, because you may never know when your game will be over ... I'm totally not going insane in any way, shape or form ... I just have broken English and an open mind ...


...and the GPU war rages on...

 

It's very difficult for any of us to be completely objective in matters such as this. We don't have any inside insight into what happens behind the scenes at AMD or Nvidia. I wouldn't put it past either of them to say and/or do things to discredit the other. They are, after all, in direct competition.

 

It would not surprise me if Nvidia knowingly put code or features in the game that purposely handicapped performance on AMD cards, but I also take Richard Huddy's words with an extremely large grain of salt. 

 

Has there ever been a case where Nvidia cards suffered a similar performance hit in an AMD-sponsored title?

My Systems:

Main - Work + Gaming:


Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:


FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:


SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:


MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:


Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


It's been said before. The best thing to do is disable tessellation and then see what the performance differential is. Tessellation is dependent on a fixed-function unit. AMD does need to catch up to Nvidia with their tessellator, but if AMD wants to mitigate the issue for older architectures, they or CD Projekt could do some tweaks or tessellate parts of the hair ahead of time.

In 5 years AMD will be dead. We'll all have to use Nvidia cards and pay $500 minimum.

 

The thing is that the tessellation level is way beyond reasonable. It's built to be demanding, not built to look amazing; 16x or so should be plenty for it. Sure, you can always put more stuff on the die or improve it in other ways (Tonga, aka the 285, is much better at tessellation). Nvidia is willing to take the performance hit since it is far greater on the opposing side, and as collateral damage the Kepler cards are taking it as well. Or hell, have a slider option for the hair tessellation. It's getting really hard to explain why the option isn't offered, because even Nvidia users wouldn't leave it at max.

 

Can't wait for the antitrust lawsuit that is inevitably on the horizon; it just needs that first step from someone with the facts.

GameWorks was conceived, designed and delivered with one goal in mind: sabotage.

 

For now, even if I do use an Nvidia card, I'm not touching anything with GameWorks in it.

 

Yes, no, maybe. The devil is in the details. AMD does have worse tessellation performance; that's a given. But using 64x tessellation makes no sense whatsoever: even if you have good tessellation performance, the visual improvement is pretty much zero and you'll still take a performance hit, just a bit less of one with good hardware/software.

 

What got Intel sued is that their compiler literally checked for the "GenuineIntel" vendor string and served AMD processors slower code paths than their Intel counterparts. That is not the same thing as what Nvidia is doing. Imagine a race. Nvidia owns the technology, so they set the rules. They are better at doing things with diesel engines, so they demand that everyone use diesel as the fuel in the race, and AMD cannot use their gas-powered implementation but has to make it through the race on diesel. The audience would also prefer the race to be run on gas (= higher framerate, and the cars are just as shiny). Both companies are faster with their gas cars than with their diesel cars, but Nvidia is significantly better than AMD with diesel, and Nvidia wants the easy win as much as possible.

 

What Intel did was send an undercover mechanic to AMD's garage to sabotage the car before the race even started.

 

If they are just eye-candy effects that developers can use, they won't do much harm, like pointless leaves flying around the battlefield. Sure, the developer could have used a solution that doesn't care about the hardware, but at least they're getting money from Nvidia to fund development. Yes, Nvidia is paying in some way, be it marketing or otherwise, to get GameWorks into games.

 

What I'm most worried about is whether stuff like Crysis 2 is going to happen again, where the game itself gimps your performance, with certain cards in particular, and it isn't just some eye-candy option #400 that you can turn off.


A Titan X gains about 24% more performance without HairWorks, so I think we can conclude it's extremely taxing and very unoptimized. Maxwell does seem to do well, but Kepler is already falling behind a LOT.

 

As for falling behind, the fact is that even 7000-series AMD cards (GCN 1) are 100% compatible with DX12; all GCN cards are tier 3 DX12, while even the Titan X (Maxwell v2) is only tier 2. And even the Titan X uses an old, obsolete DisplayPort 1.2 controller from 2012, which is why Nvidia can't support Adaptive Sync even if they wanted to. Nvidia is the company that is technologically behind.

 

As for the last part, read my answer to moose above. ↑

Very taxing does not equal unoptimized. Some things, like computing individual hair movements based on wind and character motion, are very resource-heavy even when well optimized.
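
To illustrate the point, here is a generic sketch of per-strand simulation (assuming a simple Verlet-integration-plus-constraints scheme; this is not HairWorks' actual algorithm): every control point of every strand has to be integrated and re-constrained every frame, and a full head of hair multiplies that work by tens of thousands of strands.

```cpp
// Generic per-strand hair simulation sketch: Verlet integration plus
// distance constraints. Illustrative only; strand counts and forces are made up.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

struct Strand {
    std::vector<Vec3> pos, prevPos;  // one entry per control point
    float segmentLength;
};

// One simulation step for a single strand: integrate each free point under
// gravity and wind, then relax distance constraints so segments keep length.
void simulateStrand(Strand& s, Vec3 gravity, Vec3 wind, float dt) {
    for (size_t i = 1; i < s.pos.size(); ++i) {       // point 0 is pinned to the scalp
        Vec3 velocity = s.pos[i] - s.prevPos[i];
        s.prevPos[i] = s.pos[i];
        s.pos[i] = s.pos[i] + velocity + (gravity + wind) * (dt * dt);
    }
    for (int iter = 0; iter < 4; ++iter) {             // constraint relaxation passes
        for (size_t i = 1; i < s.pos.size(); ++i) {
            Vec3 delta = s.pos[i] - s.pos[i - 1];
            float dist = length(delta);
            if (dist > 1e-6f) {
                float correction = (dist - s.segmentLength) / dist;
                s.pos[i] = s.pos[i] - delta * correction;  // pull back toward the root
            }
        }
    }
}

int main() {
    // One strand with 16 control points; a real head of hair would repeat this
    // for tens of thousands of strands, every frame.
    Strand strand;
    strand.segmentLength = 0.1f;
    for (int i = 0; i < 16; ++i)
        strand.pos.push_back({0.0f, -0.1f * i, 0.0f});
    strand.prevPos = strand.pos;

    for (int frame = 0; frame < 60; ++frame)
        simulateStrand(strand, {0.0f, -9.8f, 0.0f}, {1.0f, 0.0f, 0.0f}, 1.0f / 60.0f);
    return 0;
}
```

And that is only the simulation half; rendering the result still has to tessellate and shade every strand, which is where the tessellation factor argument comes back in.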

 

Your red herring about Adaptive Sync is completely irrelevant. AMD is behind in terms of tessellation performance and has been ever since the 5000 series (about 5 years ago). What tier of DirectX compatibility they are at is beside the point, because it doesn't change the fact that AMD cards don't perform well with high tessellation on. Even their own "fix" is to override game settings and force less tessellation.

 

We will most likely be able to set tessellation to whatever we want when the patch comes, but for now we have an on/off switch for HairWorks.


Because some people need someone to blame, while others want the feature on but are getting annoyed because it seems too hard to make it work properly without everyone running around blaming everyone else.

Or maybe people are just worried this might be a trend, like DLC and microtransactions, which aren't so micro anymore.

But maybe you're right. Maybe no one has a valid point. Just you.


This topic is now closed to further replies.