
NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt


Like in BioShock Infinite, where Elizabeth's hair and dress would spontaneously start breakdancing at times, because REALISM.


idk...seems pretty solid to me.

 

Oh really, point me to the driver that allows me to use Mantle with my Geforce GTX card. Because that's what he was referring to when he said API. TressFX isn't an API.


This Thread's Logic

Nvidia has GPU specific visual features: ANTI CONSUMER PC GAMING SEGREGATION INCOMING.

AMD has GPU specific API: Not anti consumer at all, unifying PC Gamers.

No one has claimed the latter. In fact, I specifically acknowledged it and said it was just as bad when they did it.


Coolio idea, but I feel it does have that TressFX 1.0 feel of moving way more than it should.


Oh really, point me to the driver that allows me to use Mantle with my Geforce GTX card.

Nvidia could give you one, now that they have full and unrestricted access to the source code. Yes it took some time and it's now obsolete anyway but it did happen.


It does clip a bit, but I'd take it over the original look of the brick hair. We'll see what TressFX 3.0 brings with the new Deus Ex.

Is HairWorks better? I've not seen it used on longer hair, only fur. If it were better I'd gladly use it, assuming it was available to me.

 

And that's a valid argument. Tomb Raider was using AMD's first release of TressFX. Despite that, I found it looked pretty good, especially if you go back to when Tomb Raider was released and compare it to any other game with physics-based hair. It was quite revolutionary. TressFX 2.0 added fur and grass, but I'm not aware of any games that use 2.0. Can't wait for the new Deus Ex.


Coolio idea, but I feel it does have that TressFX 1.0 feel of moving way more than it should.

That's because it's just Skyrim modders playing around with HDT physics, not really a hair-specific thing. It does look rather bad with many hairstyles though, spasms everywhere, freaking out; only a few hairstyles really look OK with it. Not relevant to the discussion or anything, I just remembered it and had a good laugh.


Now, topics like these are where the fanboys really shine.


Why grumble over TressFX?

It's new tech implemented for a game.

Regardless of the outcome, the tech is available for everyone to use and optimize.

Nvidia always tries to lock out its competitors with its own terms & licenses.

Helping developers build the game is a good thing, but not allowing others to optimize it is not a good thing.

Why do you support the idea of locking performance down to one company?

You know, there are already devices for that purpose; they're called "CONSOLES".

They don't lock down the performance or any optimizations for AMD to make.

What's "locked down" is AMD being able to use Gameworks effects on their cards since they're Cuda accelerated effects.

That's the thing everyone is bitching about.

So if you can't run the effect, do what any normal person would do and turn it off.
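For anyone wondering what "CUDA-accelerated effect" actually means in practice, here's a minimal sketch of a hair-strand integration step written against NVIDIA's CUDA runtime. This is purely illustrative and assumes nothing about real GameWorks or PhysX source; the point is simply that every call here (cudaMalloc, the <<<...>>> kernel launch) is part of CUDA, which only targets NVIDIA GPUs, and that is the lock-in being argued about.

```cuda
// Hypothetical sketch, not actual GameWorks/PhysX code: a trivial CUDA kernel
// that integrates hair-strand vertices under gravity with an explicit Euler step.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void integrateStrands(float3* pos, float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].y -= 9.81f * dt;        // gravity
    pos[i].x += vel[i].x * dt;     // explicit Euler position update
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main() {
    const int n = 1 << 16;
    float3 *pos = nullptr, *vel = nullptr;

    // Device allocations and the launch syntax below are CUDA-runtime-only,
    // which is why code written this way runs exclusively on NVIDIA GPUs.
    cudaMalloc((void**)&pos, n * sizeof(float3));
    cudaMalloc((void**)&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    integrateStrands<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```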


Oh really, point me to the driver that allows me to use Mantle with my Geforce GTX card. Because that's what he was referring to when he said API. TressFX isn't an API.

If I'm not mistaken, the code for Mantle is out there. If Nvidia wanted to, they could write a driver and have their GPUs run the Mantle API.

If I'm completely mistaken, then I'll gladly say that, despite what it did to kick-start the move toward DX12, its existence as a whole is not a fantastic thing, specifically if there ever comes a point where a game is built only on the Mantle API.

EDIT: Regardless of whether DX12 makes it "relevant" or not, it's still being added to new games and it's still open source. The point still stands.


Nvidia could give you one, now that they have full and unrestricted access to the source code. Yes it took some time and it's now obsolete anyway but it did happen.

 

I guess I should've added "a driver that allowed me to use mantle when it was still relevant". And not a dead tech like it is now.

 

@Trik'Stari save some for me will ya.


Nvidia could give you one, now that they have full and unrestricted access to the source code. Yes it took some time and it's now obsolete anyway but it did happen.

 

Nvidia would have to do massive rewrites of the code since AMD developed it specifically for their own architecture. By the time AMD released the source code it was too late to even matter. It's nice that it's out there but there was never any danger of Nvidia using it.


And that's a valid argument. Tomb Raider was using AMD's first release of TressFX. Despite that, I found it looked pretty good, especially if you go back to when Tomb Raider was released and compare it to any other game with physics-based hair. It was quite revolutionary. TressFX 2.0 added fur and grass, but I'm not aware of any games that use 2.0. Can't wait for the new Deus Ex.

Lichdom: Battle Mage, I linked a video somewhere on the last page. It looked much better than 1.0.

Less stringy and less extreme movement.


This Thread's Logic

Nvidia has GPU specific visual features: ANTI CONSUMER PC GAMING SEGREGATION INCOMING.

AMD has GPU specific API: Not anti consumer at all, unifying PC Gamers.

Mantle is becoming open source; Nvidia refuses to use it (kind of moot now because of DX12).

The only thing that needs to be done with PhysX is much better CPU utilization/multithreading, or it could outright be made to run on AMD's stream processors. Surely they wouldn't be afraid of being beaten at their own creation?
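As a rough illustration of what "better CPU utilization/multithreading" could mean, here's a hedged sketch of the same kind of per-vertex update from the earlier CUDA sketch, run instead on the host across all available CPU cores with std::thread. This is hypothetical code, not PhysX's actual CPU path; it only shows that the math itself isn't tied to any vendor's hardware.

```cuda
// Hypothetical host-side counterpart (compiles as plain C++ or as the host part
// of a .cu file): the same strand integration, split across CPU threads.
#include <cstdio>
#include <thread>
#include <vector>

struct Vec3 { float x, y, z; };

static void integrateRange(Vec3* pos, Vec3* vel, int begin, int end, float dt) {
    for (int i = begin; i < end; ++i) {
        vel[i].y -= 9.81f * dt;      // gravity
        pos[i].x += vel[i].x * dt;   // explicit Euler position update
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

int main() {
    const int n = 1 << 16;
    std::vector<Vec3> pos(n, Vec3{0.f, 0.f, 0.f});
    std::vector<Vec3> vel(n, Vec3{0.f, 0.f, 0.f});

    // Split the work evenly across however many hardware threads are available.
    unsigned hc = std::thread::hardware_concurrency();
    const int workers = hc ? static_cast<int>(hc) : 1;
    const int chunk = (n + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (int w = 0; w < workers; ++w) {
        const int begin = w * chunk;
        const int end = (begin + chunk < n) ? begin + chunk : n;
        if (begin >= end) break;
        pool.emplace_back(integrateRange, pos.data(), vel.data(), begin, end, 1.0f / 60.0f);
    }
    for (auto& t : pool) t.join();

    printf("updated %d strand vertices on %d CPU threads\n", n, static_cast<int>(pool.size()));
    return 0;
}
```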


Just like I said in the Project Cars topic - look at the developer, not at the IHV.

 

Don't mind what NVIDIA says; besides being a very biased source, they're known for some of the worst and shadiest PR stunts, comparable only to BP oil spills. Yet they have the right to do so, especially in a two-player market where the other party doesn't want to go into a lawsuit over this marketing shit (trust me, if this were another competitive industry, NVIDIA would have been paying by the word, not by the claim). Either way, people still support them; fuck, I supported them with my last laptop purchase, just because there was nothing better at the time.

 

Now this goes beyond NVIDIA.

 

NVIDIA pays for these implementations in these titles. No one forces developers to accept the money, or to sign an NDA contract that doesn't allow other parties to see the source code so they can develop improvements. (Yes, if you're an indie dev and you want to implement GameWorks, NVIDIA is going to ship an engineer right to your door with a backpack full of graphics cards, all expenses paid, to stay as long as needed to fuse GameWorks into your poorly developed game engine... RIGHT! This is only done for valuable titles, where NVIDIA themselves invest marketing money; those NVIDIA logos while the game loads aren't there for free, bros.)

 

So bashing NVIDIA is just pointless; just don't support the developer. You can say: "Oh, like that's going to make a difference, since NVIDIA's marketing investment is probably in the order of thousands if not a couple of million USD... on that contract alone it's like they sold thousands of units," to which I'll reply, "You are damn right it ain't going to make a difference!" Look at Ubisoft, like they give a fuck.

 

The managers look at the injection of capital and their eyes shine.

 

Just make sure you DON'T FORGET, so the next time they (Ubisoft, the Project Cars devs whose name I don't know, Projekt Red, etc.) come to social media with the typical "Oh, we are for PC gaming!" bullshit, let them know you didn't forget. Make some noise. Make a community manager have a fucking bad day over it at least.


I guess I should've added "a driver that allowed me to use mantle when it was still relevant". And not a dead tech like it is now.

Was it ever really that relevant? You got to use it in Battlefield for a bit and in Dragon Age: Inquisition, but those games were OK on Nvidia anyway.

It was stillborn tech if you ask me; AMD didn't have the money to keep at it and just bowed down to Microsoft.


Mantle is becoming open source; Nvidia refuses to use it (kind of moot now because of DX12).

The only thing that needs to be done with PhysX is much better CPU utilization/multithreading, or it could outright be made to run on AMD's stream processors. Surely they wouldn't be afraid of being beaten at their own creation?

 

Nvidia would have to do massive rewrites of the code since AMD developed it specifically for their own architecture. By the time AMD released the source code it was too late to even matter. It's nice that it's out there but there was never any danger of Nvidia using it.

 

 

And do they really need to? Their DX11 drivers have such low overhead that they're basically on par with AMD running Mantle, especially in games like Battlefield 4.


It doesn't bother me too much if there is an "off" button. I just wish Nvidia embraced open source so AMD could adapt their drivers to run GameWorks-related stuff decently.


idk...seems pretty solid to me.

What you don't seem to realize is that GameWorks effects are CUDA-accelerated effects, including the PhysX effects... and Nvidia licenses CUDA out to other companies, including AMD.

Which AMD declined...


Why the fuck is Nvidia under attack for this? Just because AMD is too lazy to make fancy hair possible on their GPUs?

TressFX, dawg.


Don't mind what NVIDIA says; besides being a very biased source, they're known for some of the worst and shadiest PR stunts, comparable only to BP oil spills -

LMFAO

You're comparing all of this to an environmental hazard?

I'm fucking done. lolololololololololololololololololololololol


Just like I said in the Project Cars topic - look at the developer, not at the IHV.

[...]

So bashing NVIDIA is just pointless; just don't support the developer.

So we should support no games at all then? As has been pointed out, any game sponsored by Nvidia must use GameWorks now, and developing without sponsorship from either one (AMD or Nvidia) just does not happen. Most devs can't afford to make their own engines, and the engines they license come with the sponsorship and related tech built in, so they have little choice but to use it.

Realistically, trying to avoid GPU-vendor-sponsored games would be harder than dodging rain.


Mantle is becoming open source; Nvidia refuses to use it (kind of moot now because of DX12).

The only thing that needs to be done with PhysX is much better CPU utilization/multithreading, or it could outright be made to run on AMD's stream processors. Surely they wouldn't be afraid of being beaten at their own creation?

 

GPU-accelerated PhysX is built around CUDA. That's like telling AMD to completely redesign Mantle to run on CUDA cores. It makes zero business sense to develop something specifically for your competitor.


What you don't seem to realize is that GameWorks effects are CUDA-accelerated effects, including the PhysX effects... and Nvidia licenses CUDA out to other companies, including AMD.

Which AMD declined...

At what cost? Again, it's a bit of an argument-from-ignorance situation, but I'd argue that if it were pure CUDA acceleration inside that "black box", the black box wouldn't be there. There would be no reason not to put the code out if AMD could do nothing with it, since they have no CUDA. This makes me think Nvidia believes AMD cards are more than capable of running GameWorks if they had the code, even without CUDA.

EDIT: My other thought on this: if Nvidia's GameWorks were truly pure CUDA, which AMD could do nothing about, they could have easily released the code when the accusations about slowing down AMD hardware were made. There would be no downside (that I'm aware of): AMD could do nothing with it without CUDA, and Nvidia would have proven their innocence to any doubters. They did not, and so we can jump to the conclusion that there was either malicious code targeting AMD hardware, or they were aware that the edge it would give AMD (who could then optimize for it, potentially taking away a "selling point") was too much of a loss versus the bad press.

Also, AMD offered Mantle to Nvidia before it was open source; Nvidia declined. So if we want to play that game, it cancels out, I guess?


This topic is now closed to further replies.