
NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

CtW

What's worse is the downgrade is probably going to be blamed on Gameworks. Just you watch.

 

That will happen, for sure. You would think GTA 5 would have put an end to this discussion, given how well optimized that game is and how great it looks. Whatever happened to game devs pushing hardware to its limits, to the point where maximum settings weren't playable on a high-end single-card system? Who will design the next 'Crysis'? Who will release the next game whose maximum settings require the hardware manufacturers to catch up? It looked like that would have been Witcher 3, but it isn't.

 

I'm still hopeful that the disabled effects are hidden in the code and someone will find a way to turn them on.


 

 

GameWorks features use methods that are not necessary to achieve higher graphical fidelity, but are still done that way because the performance hit is higher on AMD cards (i.e. far more tessellation than is actually needed). Even before GameWorks had the name, Crysis 2 had very weird tessellation behavior.

 

You say it's unnecessary; why? Crysis 2 looked great because of tessellation. Why was it too much tessellation, and who says it didn't improve the visuals? Did it suffer from diminishing returns?

 

 

AMD cannot modify the GameWorks components to optimize them for their hardware. They are a black box: the license forbids all modifications without consulting nVidia, and developers cannot share the code with AMD.

 

Which is because GameWorks libraries are the property of nVidia, and the idea that companies should hand all material that gives them a competitive/aggressive position over to the competitor out of some vague sense of "moral fairness" is really appalling. That's so anti-free-market it hurts.

 

TressFX is an example of a solution that nVidia could look at and optimize for their hardware, and now it runs just as well on nVidia hardware as it does on AMD. I'm not saying it was perfect; nVidia probably didn't have access to the code early enough for Tomb Raider, but all things considered the end result doesn't affect anyone today. Meanwhile, GameWorks has been going on for quite a while now, and they won't be opening it up at all.

 

But I don't see why this should be the default (for the reasons in the previous point), nor did anyone force AMD to disclose the TressFX software freely but AMD themselves. Like a bunch of lemons. This is exactly why AMD is doing so poorly: a lack of understanding of how to run a successful business. They apparently hate money and can't seem to turn a buck or stick by their products. They just create something and throw it out there for others to finish, rather than committing to it and selling it to developers or the competition for royalties.

 

Again, AMD is free to do so, but I don't agree that it's the smart thing to do, or that it should in any way be the norm, because it isn't healthy for a company. If they're not getting a return on investment, they eventually lose the ability to invest in or create new technology, which is the point AMD has been at for years now. I don't know what is keeping them afloat.

 

I think Project CARS has taken a bigger PR hit right now for using PhysX. It could be that AMD's cards being more CPU-intensive means there just isn't enough oomph left in the CPU for the CPU-bound PhysX. An nVidia rep on reddit said that PhysX in Project CARS is CPU-only, but then again nVidia's own list of "hardware supported PhysX" titles includes it. Some users report that disabling GPU PhysX on nVidia tanks performance, others say it has no effect. AMD seems to have much higher performance with WDDM 2.0 on Windows 10. Who knows what the final answer really is.

 

I won't know unless I get a copy and am able to test for myself. The findings of people on reddit are always subjective, anecdotal and to be taken with a grain of salt.


You say it's unnecessary; why? Crysis 2 looked great because of tessellation. Why was it too much tessellation, and who says it didn't improve the visuals? Did it suffer from diminishing returns?

 

 
 

 

Which is because GameWorks libraries are the property of nVidia, and the idea that companies should hand all material that gives them a competitive/aggressive position over to the competitor out of some vague sense of "moral fairness" is really appalling. That's so anti-free-market it hurts.

 
 

 

But I don't see why this should be the default (for the reasons in the previous point), nor did anyone force AMD to disclose the TressFX software freely but AMD themselves. Like a bunch of lemons. This is exactly why AMD is doing so poorly: a lack of understanding of how to run a successful business. They apparently hate money and can't seem to turn a buck or stick by their products. They just create something and throw it out there for others to finish, rather than committing to it and selling it to developers or the competition for royalties.

 

Again, AMD is free to do so, but I don't agree that it's the smart thing to do, or that it should in any way be the norm, because it isn't healthy for a company. If they're not getting a return on investment, they eventually lose the ability to invest in or create new technology, which is the point AMD has been at for years now. I don't know what is keeping them afloat.

 
 

 

I won't know unless I get a copy and am able to test for myself. The findings of people on reddit are always subjective, anecdotal and to be taken with a grain of salt.

 

 

You should check the articles regarding the Crysis 2 tessellation: there was stuff you couldn't even see, like water under the level, and plenty of objects were tessellated like crazy. nVidia is willing to take a 5% performance hit from crazy amounts of tessellation if it means AMD is going to take a higher hit, let's say 25-35%, because of it. This happens because nVidia over-provisions the tessellation capability of their cards; AMD doesn't, because it makes more sense to spend the wafer on things that pay off in performance and visual quality.
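For anyone wondering what "more tessellation than needed" looks like in practice: renderers normally scale the subdivision level to how large a patch actually appears on screen, instead of forcing the maximum everywhere. Below is a minimal CPU-side sketch of that idea; the function name and the pixels-per-triangle knob are made up for illustration, and this is not code from Crysis 2 or GameWorks.

```cpp
#include <algorithm>

// Hypothetical helper: derive a tessellation factor from how long a patch
// edge is on screen, rather than hard-coding the maximum.
//   screenEdgePx      - projected edge length of the patch, in pixels
//   pixelsPerTriangle - quality knob: how many pixels each generated
//                       triangle edge should roughly cover (e.g. 8-16)
//   hardwareMax       - API/GPU cap (64 for D3D11 hull shaders)
float ChooseTessFactor(float screenEdgePx,
                       float pixelsPerTriangle,
                       float hardwareMax = 64.0f)
{
    // A patch that covers only a handful of pixels gets a factor near 1;
    // nothing ever exceeds the hardware cap.
    float factor = screenEdgePx / pixelsPerTriangle;
    return std::clamp(factor, 1.0f, hardwareMax);
}
```

A flat concrete slab a hundred metres away projects to a few pixels and would get a factor close to 1 here; pushing it to the cap anyway is exactly the "diminishing returns" case being argued about.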

 

They might be the property of nVidia, but they've essentially made them completely proprietary black boxes. AMD can't do anything about it. Effects libraries shouldn't be used like that, and some devs agree. The free market never existed in absolute terms; otherwise things like Intel's compiler checking for the GenuineIntel vendor string and steering AMD hardware onto slower code paths would have been legal.
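For context, the GenuineIntel issue refers to dispatcher code that picks the optimised path based on the CPU's vendor string rather than on the feature flags that actually matter, so an AMD chip supporting the exact same instructions still lands on the slow path. A minimal sketch of that pattern is below; the ProcessBuffer* functions are hypothetical stand-ins, and this is not Intel's actual dispatcher.

```cpp
#include <cpuid.h>   // GCC/Clang, x86 only
#include <cstddef>
#include <cstring>

// Hypothetical stand-ins for a vectorised fast path and a plain fallback.
static void ProcessBufferFast(float* data, std::size_t n) { for (std::size_t i = 0; i < n; ++i) data[i] *= 2.0f; }
static void ProcessBufferSlow(float* data, std::size_t n) { for (std::size_t i = 0; i < n; ++i) data[i] *= 2.0f; }

// CPUID leaf 0 spreads the vendor string across EBX, EDX, ECX.
static bool IsGenuineIntel()
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return false;
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);
    std::memcpy(vendor + 4, &edx, 4);
    std::memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

void ProcessBuffer(float* data, std::size_t n)
{
    // The contentious pattern: gate the fast path on the vendor string.
    // Dispatching on the relevant CPUID feature bits (SSE2, AVX, ...) would
    // treat any CPU with those capabilities the same.
    if (IsGenuineIntel())
        ProcessBufferFast(data, n);
    else
        ProcessBufferSlow(data, n);
}
```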

 

nVidia doesn't sell developers anything. Devs are free to use it, and, well, if they don't implement other things... marketing money. Witcher 3 GPU bundles.

 

AMD's actions are smart or dumb depending on your perspective. Regardless of the viewpoint, nVidia is being the parasite here. Making advancements benefits everybody; locking things down like this hinders the whole market, and nVidia takes the benefit at the expense of competitors and consumers. Where nVidia is stronger is marketing; innovation is subjective, since both brands have had plenty of it.


You say it's unnecessary; why? Crysis 2 looked great because of tessellation. Why was it too much tessellation, and who says it didn't improve the visuals? Did it suffer from diminishing returns?


Tessellation is good. The problem was that Crysis 2 used redundant tessellation in order to penalize AMD GPUs.

Check out The Tech Report's investigation:

Using more tessellation than is required to achieve any visual pay-off:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Asking the GPU to render tessellated water even though it is not in the scene:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3

i.e. lowering your framerate with a fake workload.
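Worth noting that the second point is really just a missing visibility check: geometry whose bounds never intersect the view frustum shouldn't be submitted at all, let alone tessellated. A rough sketch of such a check, assuming a plane-based frustum and axis-aligned bounding boxes (the names are illustrative, not engine code):

```cpp
#include <array>

// Frustum plane in the form ax + by + cz + d >= 0 for points inside.
struct Plane { float a, b, c, d; };
// Axis-aligned bounding box of a terrain/water patch.
struct AABB  { float min[3], max[3]; };

// Returns true when the box lies entirely outside at least one frustum
// plane, i.e. the patch can be skipped before it ever reaches the
// tessellation stage.
static bool OutsideFrustum(const AABB& box, const std::array<Plane, 6>& frustum)
{
    for (const Plane& p : frustum) {
        // Test the box corner furthest along the plane normal; if even that
        // corner is behind the plane, the whole box is outside.
        float x = (p.a >= 0.0f) ? box.max[0] : box.min[0];
        float y = (p.b >= 0.0f) ? box.max[1] : box.min[1];
        float z = (p.c >= 0.0f) ? box.max[2] : box.min[2];
        if (p.a * x + p.b * y + p.c * z + p.d < 0.0f)
            return true;
    }
    return false;
}

// Hypothetical draw loop:
//   for (const Patch& patch : waterPatches)
//       if (!OutsideFrustum(patch.bounds, cameraFrustum))
//           drawTessellated(patch);
```

With a check like this in place, an ocean plane hidden under the level would simply never be drawn, tessellated or not.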


I just can't comprehend how people cannot see that visual options are the only thing that will differentiate Nvidia from AMD. Without GameWorks, what would compel someone with a 1440p 60fps monitor to purchase Nvidia over AMD? It's simply adding optional features to a game. A game lacks visual options and they rage; a game gives several options, including additional ones if you have a capable GPU, and they rage.

 

I can completely understand raging about a graphics downgrade, but I just cannot fathom why some are raging about this. I just can't see things how they see them, which is odd because I can usually see both sides of the argument, but I just can't on this one.

People are mad because they assume this game is going to perform just like Project CARS did, as that is a GameWorks title and performance is noticeably worse if you are using an AMD GPU. Also, the GameWorks visual options, when turned on, do cause performance issues for those GPUs too. The people who are raging want to have the same experience with an AMD GPU as they would with an Nvidia GPU, which simply won't happen for the above reasons. Honestly, people are just going to have to get over it, as no amount of whining is going to change how GameWorks tech is integrated into the game.


From the benchmarks I've seen, the only people getting screwed in Witcher 3 are the players using Kepler GPUs.


AMD made HBM with Hynix and gave it to Nvidia; they made TressFX and gave it to Nvidia; they made Mantle, which became the better Vulkan, and gave that to Nvidia too.

 

Nvidia is a bitch! They buy PhysX and don't give it to AMD, they pull this kind of shit and don't give it to AMD, they make G-Sync for $150 when the same tech on AMD is fucking free!

 

I think you mean Nvidia bought PhysX and offered it to AMD, but AMD being AMD, they thought they were above supporting a competing standard, said no thank you, and now they suffer for it. Why do you think AMD is losing money even when 100% of the next-gen consoles use their chips? Because of decisions like this. Their decisions never make sense. Making stuff open source does not gain you karma; making something that works well does.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


I'm not unbiased. I'll never claim to be in this matter. I see a closed box like GameWorks purposefully segregating the PC market, and it does nothing but hurt the community as a whole. I am adamantly against it. I would be adamantly against it even if I ran Nvidia cards. My statement may seem very cynical and blindly biased, but the number of people (on this forum and others) who will basically just say "awesome! I have an Nvidia card! AMD should just get good lolololol!" and are apparently 100% OK with them forcing the community into segments is astonishing.

 

In the same way that AMD should open up TressFX, Mantle, FreeSync etc. to Nvidia and Intel? Yes, I'm an nVidia fan, but I have no issue with GameWorks or "Nvidia: the way it's meant to be played" titles, because the same thing happened when AMD released their AMD-only technology. Did Mantle and TressFX get a lot of hate? No. Does GameWorks? Yes.

 

Why?

 

TL;DR: You can't expect Nvidia to spend millions developing technology just to give it away to AMD and Intel, nor can you expect the same from AMD. Each vendor has their set of unique technologies; pick which one you prefer. I didn't get the "full experience" with Radeon titles like Battlefield 4 with Mantle, and I think this is perfectly fair.

 

Each vendor should be allowed to have their own special technology; if Nvidia spends more time and money working with devs to make their cards and drivers optimized, AMD has nothing to complain about. 

 

 

People are mad because they assume this game is going to perform just like Project CARS did, as that is a GameWorks title and performance is noticeably worse if you are using an AMD GPU. Also, the GameWorks visual options, when turned on, do cause performance issues for those GPUs too. The people who are raging want to have the same experience with an AMD GPU as they would with an Nvidia GPU, which simply won't happen for the above reasons. Honestly, people are just going to have to get over it, as no amount of whining is going to change how GameWorks tech is integrated into the game.

 

 

I agree with what you say, but I think that rather than saying "playing with an AMD card is noticeably worse", it'd be fairer to say "playing with an Nvidia card is noticeably better". Imagine Card 1 from Nvidia and Card 2 from AMD both achieve a stable 60fps. If Nvidia spends time with the game developer to make Card 1 get 70fps instead, did they make Card 2 perform worse? No. Card 2 performs the same; they just made theirs run at roughly 117% instead.

NCASE M1 i5-9600k  GTX 1080 FE Z370N-WIFI SF600 NH-U9S LPX 32GB 960EVO

I'm a self-identifying Corsair Nvidia Fanboy; Get over it.


In the same way that AMD should open up TressFX, Mantle, FreeSync etc. to Nvidia and Intel? Yes, I'm an nVidia fan, but I have no issue with GameWorks or "Nvidia: the way it's meant to be played" titles, because the same thing happened when AMD released their AMD-only technology. Did Mantle and TressFX get a lot of hate? No. Does GameWorks? Yes.

 

Why?

 

All of the examples you mentioned are NOT like GameWorks. TressFX is open source for both dev teams and NVidia (the latter after release). Adaptive Sync is an industry standard (FreeSync is a driver, not a tech), and Mantle is also open source and is now implemented in Vulkan (OpenGL Next), which is open to all Khronos members (this includes NVidia). None of your examples are AMD-exclusive tech; THAT is why no one hates on AMD for them.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


All of the examples you mentioned are NOT like GameWorks. TressFX is open source for both dev teams and NVidia (the latter after release). Adaptive Sync is an industry standard (FreeSync is a driver, not a tech), and Mantle is also open source and is now implemented in Vulkan (OpenGL Next), which is open to all Khronos members (this includes NVidia). None of your examples are AMD-exclusive tech; THAT is why no one hates on AMD for them.

 

FreeSync is MORE than a driver. It needs specialised scalers. It is very much tech. Sure, Vulkan uses a big part of Mantle, but when was the last time you played a game in OpenGL that wasn't a Source game? And DX12 does the same things OpenGL will do.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


FreeSync is MORE than a driver. It needs specialised scalers. It is very much tech. Sure, Vulkan uses a big part of Mantle, but when was the last time you played a game in OpenGL that wasn't a Source game? And DX12 does the same things OpenGL will do.

The scaler used is from the VESA Adaptive-Sync standard, so yes, FreeSync is a driver.

DX12 also uses a big part of Mantle.

 


I don't think these sorts of graphics libraries should be tied to a vendor at all. The game engines themselves ought to have their own tools built in, or open source tools should be able to be ported to specific game engines.

Note we don't hear about GameWorks issues with the Frostbite 3 engine that EA uses for games like Battlefield, Dragon Age: Inquisition, and basically all their new games. That's because EA has enough money and talent to pay for their own physics, rendering, and other special effects that work in their engine. They don't NEED to go begging Nvidia for table scraps of added effects.

Unity

Unreal 4 (actually, this is kind of a street whore engine for Nvidia now, with the built-in GameWorks)

CryEngine

etc. Those seem to be the larger open engines that THRONGS of developers use. Why not build the rendering code for physics effects, hair effects and everything else into the BLANKING ENGINE!!!

It's so god damn obvious.

What's obvious is you have no idea what's going on.


FreeSync is MORE than a driver. It needs specialised scalers. It is very much tech. Sure, Vulkan uses a big part of Mantle, but when was the last time you played a game in OpenGL that wasn't a Source game? And DX12 does the same things OpenGL will do.

 

@bogus already answered. All the hardware parts of FreeSync are an open, royalty-free industry standard by VESA called Adaptive-Sync, an optional part of DP 1.2a. Sure, AMD proposed that standard, but NVidia is completely free to support it if they choose. They only need to make a driver like the FreeSync driver. They could even call it G-Sync if they wanted to. But they won't, since NO NVidia cards, including the Titan X, have a DP display controller newer than 1.2, which is from 2012! Not exactly new tech.

 

OpenGL sucks. It's full of redundancy and obsolete features. Vulkan, however, is written from the ground up and is completely new: no old crap that doesn't work anymore. I bet we will see a lot of games supporting Vulkan in the future, now that the OpenGL side is actually useful again.

 

Remember that DX12 does not run on Linux or OSX.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


DX12 also uses a big part of Mantle.

 

 

Citation?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Citation?

There are several, but they could at least have tried to change the documentation a bit more instead of copy-pasting. Example:

https://pbs.twimg.com/media/CBBu9COWwAAPzZB.jpg:large

 

You can also check what some people who developed/are developing DX12 say: https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-9


There are several, but they could at least have tried to change the documentation a bit more instead of copy-pasting. Example:

https://pbs.twimg.com/media/CBBu9COWwAAPzZB.jpg:large

 

You can also check what some people who developed/are developing DX12 say: https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-9

So, a pretty picture and some forum posts? Nice work.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


So, a pretty picture and some forum posts? Nice work.

 

No, two quotations from official documentation and a thread where the developers of the API are commenting.


Citation?

He doesn't have one, because there isn't any proof. Just conjecture and hearsay. Nothing concrete that would hold up to rational, critical thinkers.

I guess he's next going to tell me that Mantle was responsible for Apple developing their "Metal" API for iOS coding. No, wait! Apple coming up with "to the metal" coding clearly inspired the rest of the industry! It's a fact, trust me.

/s


He doesn't have one, because there isn't any proof. Just conjecture and hearsay. Nothing concrete that would hold up to rational, critical thinkers.

I guess he's next going to tell me that Mantle was responsible for Apple developing their "Metal" API for iOS coding. No, wait! Apple coming up with "to the metal" coding clearly inspired the rest of the industry! It's a fact, trust me.

/s

 

I remember during a WAN Show, Linus was saying that AMD released something that would supposedly help developers with DX12, since DX12 is similar to Mantle in a way... Not sure how much of Mantle was actually used though... :blink:

 

Oh, found an article: http://wccftech.com/amd-ends-revolutionary-mantle-api-10-asks-devs-focus-directx-12-releasing-450-page-programming-guide-developers-public-sdk/

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


There are several, but they could at least have tried to change the documentation a bit more instead of copy-pasting. Example:

https://pbs.twimg.com/media/CBBu9COWwAAPzZB.jpg:large

 

You can also check what some people who developed/are developing DX12 say: https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-9

 

You do know Nvidia had been working with MS on DX12 for 5 years before Mantle was even announced, right?

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


I'm sure we all will appreciate your invaluable input in this matter.

I did, it was funny.

 

It's an Nvidia title, isn't it? Lol, GeForce till I die.

GeForce bros till we die. SLI bros? lol

Love cats and Linus. Check out linuscattips-fan-club. http://pcpartpicker.com/p/Z9QDVn and Asus ROG Swift. I love anime as well. Check out Heaven Society heaven-society. My own personal giveaway thread http://linustechtips.com/main/topic/387856-evga-geforce-gtx-970-giveaway-presented-by-grimneo/.


I did, it was funny.

GeForce bros till we die. SLI bros? lol

I will be an SLI bro in a couple months lol

Intel i5-4690K, Asus Z97-A, G.Skill Trident X 2400 Mhz

Asus GTX 970 Strix OC, Corsair RM 850, Corsair H105 AIO Water Cooler

Cooler Master HAF XB EVO Case 


I find it hilarious so many people are defending this


Look, whether you like it or not, it's not fair to make not only another company's cards run worse, but also your own older cards run worse, just to make users want to upgrade. The whole thing reeks of anti-consumer behavior, no matter how you try to spin it.


I find it hilarious so many people are defending this

Look, whether you like it or not, it's not fair to make not only another company's cards run worse, but also your own older cards run worse, just to make users want to upgrade. The whole thing reeks of anti-consumer behavior, no matter how you try to spin it.

 

People defend everything NVIDIA does.

 

I am not surprised anymore.

 

If NVIDIA used slave labor to make their GPUs, people would still defend them.

 

The number of people licking their boots after they lied to their entire customer base over the 970 VRAM issue is staggering.


This topic is now closed to further replies.

