Project CARS devs address AMD performance issues, AMD drivers to blame entirely, PhysX runs on CPU only, no GPU involvement whatsoever.

How is that competition when you're basically preventing others from even competing by artificial means? That's actually the opposite of competition.

What year are you living in? This isn't the Stone Age anymore; you can't be serious. I know it sucks that Apple has a bunch of patents and doesn't allow other companies to use them, but that's business.

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard

PCPartPicker 


You do not understand PhysX at all. On systems with AMD cards, PhysX is flat-out disabled if the Nvidia driver is installed, or not there at all, since it's part of the Nvidia driver.

 

Some games will allow you to turn on PhysX even if you have a Radeon card though.

 

I don't have Project Cars so I can't test this with my AMD rig. Thing is, I keep seeing people with Radeon cards say they're playing the game just fine except when it rains. The rain could be a PhysX simulation, and if it is and there isn't an alternative simulator that isn't as heavy as PhysX, something should be done about that.


Competition...deal with it

 

And I should believe that because you said so... /s

 

You obviously did not understand anything in my post. That's a shame. The gist of it, though, is that vendor lock-in is inherently ANTI-competitive, so no, it's not competition.

 

No, you should believe it, because it is a fact. A fact known for years! http://www.gamespot.com/forums/system-wars-314159282/tressfx-patched-to-work-on-nvidia-cards-amd-exclus-29363863/

AMD themselves have made official statements on this too.

 

You do not understand PhysX at all. On systems with AMD cards, PhysX is flat-out disabled if the Nvidia driver is installed, or not there at all, since it's part of the Nvidia driver.

 

Not correct. PhysX is primarily a physics engine that the most fundamental game mechanics are built on (just like Intel's Havok engine, used in many games, such as Half-Life 2). PhysX has two layers: a primary low-level layer, the physics engine itself, which runs on the CPU when you're not using an Nvidia card, and a higher-level layer containing additional physics-based effects, exclusive to Nvidia users.

 

If you completely disabled PhysX in this game, all the cars would clip through each other and wouldn't be able to touch, bump, push, or crash, for that matter.
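To make that two-layer split concrete, here's a minimal C++ sketch of the gating idea. Note that hasNvidiaGpuPhysX() and the two step functions are hypothetical illustrations of the concept, not actual PhysX SDK calls:

```cpp
#include <iostream>

// Hypothetical capability check -- NOT a real PhysX SDK call. A real game
// would query the driver / CUDA context here; assume "no" for the sketch.
bool hasNvidiaGpuPhysX() { return false; }

// Layer 1: core rigid-body simulation (collisions, suspension).
// Always runs, and falls back to the CPU when no Nvidia GPU is present.
void stepCorePhysics(float dt) { std::cout << "core physics step (" << dt << " s)\n"; }

// Layer 2: optional GPU-accelerated extras (turbulence smoke, debris particles).
void stepGpuEffects(float dt) { std::cout << "GPU effects step (" << dt << " s)\n"; }

int main() {
    const float dt = 1.0f / 60.0f;   // one frame at 60 fps
    stepCorePhysics(dt);             // gameplay always depends on this layer
    if (hasNvidiaGpuPhysX())
        stepGpuEffects(dt);          // the eye candy is gated on vendor hardware
}
```

That's why disabling "PhysX" entirely isn't an option here: the core layer is the collision model itself.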

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


What year are you living in? This isn't the Stone Age anymore; you can't be serious. I know it sucks that Apple has a bunch of patents and doesn't allow other companies to use them, but that's business.

Except Apple offers its own platform, Apple only. You buy an iPhone that works with its iTunes and its store for your music, your apps, etc. Nvidia, however, is not operating in that kind of vacuum, because they're simply not there yet in terms of market dominance or a compelling enough product with mass appeal. If they were, they'd be launching a console to compete with Sony and Microsoft (other examples of closed, walled-garden environments). What Nvidia is doing is taking PC gaming, which is largely an open standard, and trying to sneak in bits that are exclusive to them. They're not truly exclusive (because that would piss off devs for good), but it's the next best thing they can get away with, short of saying "Yep, this game just won't work at all on AMD": basically tripping up AMD with closed source (or so AMD claims; the effects are mostly theoretical and in practice close to irrelevant so far).

The thing is, it's better for ALL OF US, including Nvidia fans, for PC to remain as close to an open standard as possible. Otherwise you end up with situations where a complacent force like Microsoft gets lazy and releases crap like Windows ME and Vista (and, to a lesser extent, 8). It took serious competition from Apple on the desktop and the rise of mobile devices as the main computing platform (both with Unix-like systems, in the form of Android and iOS) for MS to wake up and apparently try to right the ship (though I'm still not convinced of this myself until it happens).

-------

Current Rig

-------


Performance issues due to AMD's drivers? Yeah... I'll believe that when game devs actually start making games that don't require specific driver releases to work correctly.

 

Almost ALWAYS the correct answer, and can't be said enough.

 


Some games will allow you to turn on PhysX even if you have a Radeon card though.

 

I don't have Project Cars so I can't test this with my AMD rig. Thing is, I keep seeing people with Radeon cards say they're playing the game just fine except when it rains. The rain could be a PhysX simulation, and if it is and there isn't an alternative simulator that isn't as heavy as PhysX, something should be done about that.

I was just reading this article on performance, and it seems like these devs, for all the AMD blaming they did, only really optimized for Maxwell cards specifically: even the 780 Ti is taking it up the ass at the moment.



No, you should believe it, because it is a fact. A fact known for years! http://www.gamespot.com/forums/system-wars-314159282/tressfx-patched-to-work-on-nvidia-cards-amd-exclus-29363863/

AMD themselves have made official statements on this too.

Yes, you should believe everything a CEO/PR says; it's always the truth.

 




Not correct. PhysX is primarily a physics engine that the most fundamental game mechanics are built on (just like Intel's Havok engine, used in many games, such as Half-Life 2). PhysX has two layers: a primary low-level layer, the physics engine itself, which runs on the CPU when you're not using an Nvidia card, and a higher-level layer containing additional physics-based effects, exclusive to Nvidia users.

 

If you completely disabled PhysX in this game, all the cars would clip through each other and wouldn't be able to touch, bump, push, or crash, for that matter.

 

Right, but if the game has the higher-level PhysX effects (I don't have the game, so I don't know about the graphics options) and they're being forced to run even when you don't have an Nvidia GPU, it's going to make Radeon setups eat shit. If that's the case, something should be done about it on the development side.

 

 

I was just reading this article on performance, and it seems like these devs, for all the AMD blaming they did, only really optimized for Maxwell cards specifically: even the 780 Ti is taking it up the ass at the moment.

 

Yeah, if those benchmarks are accurate then I don't think the game is properly made. Looks like it runs like shit across the board when maxed out.

 

I don't have the game: do the looks justify the performance hit?


Yeah, if those benchmarks are accurate then I don't think the game is properly made. Looks like it runs like shit across the board when maxed out.

 

I don't have the game: do the looks justify the performance hit?

Well, it does look amazing. I'm thinking this will be a cruel mistress for a while: the Crysis of the racing-sim world.



Some games will allow you to turn on PhysX even if you have a Radeon card though.

 

I don't have Project Cars so I can't test this with my AMD rig. Thing is, I keep seeing people with Radeon cards say they're playing the game just fine except when it rains. The rain could be a PhysX simulation, and if it is and there isn't an alternative simulator that isn't as heavy as PhysX, something should be done about that.

 

The rain might just be it. 

Found this article from 2013 talking about PhysX in Star Citizen and Project Cars. (Article link: http://physxinfo.com/news/11822/star-citizen-and-project-cars-will-include-gpu-accelerated-physx-and-apex-effects/)

"The physics in the game(Project Cars) is already utilizing PhysX SDK 3.2, and the extra efffects will consist of dynamic particles and APEX Turbulence based smoke."

 

The rain droplets might have been the dynamic particles mentioned.   
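If the droplets really are dynamic particles falling back to the CPU, the cost model would explain the rain complaints. A toy C++ sketch (not actual PhysX/APEX code, and the droplet count is an assumed figure) of what one frame of serial per-droplet integration looks like:

```cpp
#include <cstdio>
#include <vector>

struct Droplet { float x, y, z, vy; };

int main() {
    const int   kDroplets = 50000;   // assumed particle count, for illustration
    const float dt = 1.0f / 60.0f;   // one frame at 60 fps
    const float g  = -9.81f;

    std::vector<Droplet> rain(kDroplets, {0.0f, 100.0f, 0.0f, 0.0f});

    // One simulation frame: integrate every droplet serially on the CPU.
    // A GPU spreads this across thousands of threads; a CPU fallback pays
    // for every droplet, every frame, on top of the rest of the game.
    for (Droplet& d : rain) {
        d.vy += g * dt;                                  // gravity
        d.y  += d.vy * dt;                               // integrate position
        if (d.y < 0.0f) { d.y = 100.0f; d.vy = 0.0f; }   // respawn at "cloud" height
    }
    std::printf("integrated %d droplets for one frame\n", kDroplets);
}
```

Real droplets would also need collision queries against the car and track, which is where the per-particle cost really bites.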

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Well, it does look amazing. I'm thinking this will be a cruel mistress for a while: the Crysis of the racing-sim world.

 

Yeah.

 

Even if that's so, I don't think the 290X should be sitting that low on performance, if those benchmarks are accurate. Even the 780 is eating shit, like you mentioned.

 

Something's wrong there.


Yes, you should believe everything a CEO/PR says; it's always the truth.

 

How about you actually read the link I posted? Or do you think the actual developer of TR, AND Nvidia themselves, are lying?

 

Right, but if the game has the higher-level PhysX effects (I don't have the game, so I don't know about the graphics options) and they're being forced to run even when you don't have an Nvidia GPU, it's going to make Radeon setups eat shit. If that's the case, something should be done about it on the development side.

 

I thought that was impossible, as PhysX supposedly disables the advanced effects in the middleware itself when it doesn't detect an Nvidia card. But then again, a lot has happened in the last couple of years, and it's getting difficult to distinguish what is a separate GameWorks effect and what is tied to PhysX itself these days.

 

It does, however, look like rain completely destroys frame rates on all cards. It's odd, since the rain effects in Metro: Last Light were amazing but not very taxing. But then again, looking at the screenshot in the OP, it does look damn good.



How about you actually read the link I posted? Or do you think the actual developer of TR, AND Nvidia themselves, are lying?

 

 

I thought that was impossible, as PhysX supposedly disables the advanced effects in the middleware itself when it doesn't detect an Nvidia card. But then again, a lot has happened in the last couple of years, and it's getting difficult to distinguish what is a separate GameWorks effect and what is tied to PhysX itself these days.

 

It does, however, look like rain completely destroys frame rates on all cards. It's odd, since the rain effects in Metro: Last Light were amazing but not very taxing. But then again, looking at the screenshot in the OP, it does look damn good.

 

I'm not 100% sure. All I know is that without a driver update for PhysX in Metro: Last Light, performance would tank even on high-end systems with the effect enabled. It could have been because the game was using the CPU to run the simulation instead of the Nvidia card. If that's the case, it could be the same issue with Mirror's Edge (I experienced a similar issue there, but I'm not 100% sure it was tied to the PhysX drivers, because I only vaguely remember what happened), all while you had PhysX enabled.

 

Now, if there's no way to disable the PhysX simulation for the things that are destroying performance on setups without an Nvidia card in Project Cars, there's something wrong on the development side.


How about you actually read the link I posted? Or do you think the actual developer of TR, AND Nvidia themselves, are lying?

I did read the very credible forum discussion link you gave me /s, but you didn't seem to grasp what I was trying to say to you.

 




Maybe you're unfamiliar with the term TressFX. HairWorks was Nvidia's answer to TressFX, and needless to say it looks like shit in comparison (like a washed-out texture pasted on a crude mesh). They could have done a way better job with it but didn't. We'll see how TressFX stacks up with animal fur eventually.

 

Uh, nah. HairWorks looks superior as it sits. I haven't seen "TressFX 2.0" yet, but those examples you posted clearly show HairWorks looking superior. TressFX looked odd and out of place in Tomb Raider especially. The physics of it were fine, but the aesthetic was so out of place.


-snip-

How many times did you edit that comment to add stuff?

Anyway, are you here to complain about how Nvidia's HairWorks sucks as well? Do you just hate anything other companies work on, except AMD's?

Yeah, people hate it when they're told the truth. Some say I'm arrogant for saying it, and some people just say it's regurgitated BS. But some people just can't handle the truth at the end of the day.

 




Uh, nah. HairWorks looks superior as it sits. I haven't seen "TressFX 2.0" yet, but those examples you posted clearly show HairWorks looking superior. TressFX looked odd and out of place in Tomb Raider especially. The physics of it were fine, but the aesthetic was so out of place.

I would argue the exact opposite. VisualFX needs a lot of polishing, as shown in the videos I've posted above. There's only about one place in those two videos where HairWorks looks alright. From then on it's extremely washed out and pixelated. So bad it looks like they've added motion blur to the effect to cover up the defects.


Uh, nah. HairWorks looks superior as it sits. I haven't seen "TressFX 2.0" yet, but those examples you posted clearly show HairWorks looking superior. TressFX looked odd and out of place in Tomb Raider especially. The physics of it were fine, but the aesthetic was so out of place.

 

Lichdom uses TressFX 2, which was a fairly small update, I believe. Deus Ex: Mankind Divided will have TressFX 3.0. If you want to learn more about it, watch the GDC presentation about it here: http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2012/10/Augmented-Hair-in-Deus-Ex-Universe-Projects-TressFX-3.0.ppsx

 

Also, the animals in Far Cry 4, in Opcode's link, looked like a mesh. Absolutely horrible. But it is a bit difficult to compare fur to actual hair, especially long hair like Lara's. Personally, I loved TressFX in Tomb Raider. I thought it looked really nice, and it gave her more character and was nicer to look at.



Yeah, people hate it when they're told the truth. Some say I'm arrogant for saying it, and some people just say it's regurgitated BS. But some people just can't handle the truth at the end of the day.

 

Hypocrisy at its finest.  


Opcode, what was your intention in posting the two Nvidia and then two TressFX videos? If it was to try to say TressFX looks better, all you did was further my liking of GameWorks features. The fur looks way better than the hair in Tomb Raider; it makes the animals come to life, almost as if instead of just hair swinging around on top of them, I see actual muscles flexing under the fur. The hair in the Tomb Raider video looks like floppy strings of spaghetti flung around a head. Hair doesn't move like that at all; it's as if they weighted the end of every strand to make it move more than it should. And Lichdom is the same: the hair doesn't follow a natural path when it moves. It looks more like when I'm underwater and my hair drifts around me than like dry hair in the air.


I can only recommend that you, and everyone else, read this article, as you get feedback from AMD, Nvidia, and a dev (a Valve programmer): http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy

 

The original Titan was literally a 780 with more VRAM, nothing else.

 

Not just because of the lack of driver optimization, but also because the 780 series only had 3GB of VRAM, which everyone should have known would become a bottleneck within a year (I argued that point many times back in the day).

 

 

Nvidia did not invent PhysX; they bought it, then made the non-Nvidia-exclusive CPU path the shittiest software the world has ever seen, by using the x87 instruction set. Read more here: http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/
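For context on the x87 point: you can reproduce the difference yourself, since compilers can still target either FPU path. A minimal sketch, assuming GCC on x86 (-mfpmath=387 and -mfpmath=sse are real GCC flags; the exact gap will vary by CPU):

```cpp
// The same scalar float loop, compiled against either FPU path:
//   g++ -O2 -m32 -mfpmath=387 bench.cpp -o bench_x87         (legacy x87 FPU)
//   g++ -O2 -m32 -mfpmath=sse -msse2 bench.cpp -o bench_sse  (SSE scalar)
// The Ars Technica article's claim is that the PhysX 2.x CPU path
// effectively shipped the first variant.
#include <cstdio>

int main() {
    float acc = 0.0f;
    // A long chain of dependent float ops -- the kind of per-body,
    // per-step arithmetic a physics solver grinds through.
    for (int i = 1; i < 100000000; ++i)
        acc += 1.0f / static_cast<float>(i);
    std::printf("acc = %f\n", acc);  // print so the loop isn't optimized away
    return 0;
}
```

Time both binaries and you get a rough feel for how much a CPU-only fallback built on x87 leaves on the table.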

 

Nvidia users have always been able to use TressFX, though it was not properly optimized when Tomb Raider launched. However, AMD gave Nvidia full source-code access for optimization after launch. The result is that TressFX in Tomb Raider runs equally well on both platforms. Oddly enough, Nvidia suddenly had their own version, called HairWorks/FurWorks, a year or so later. Odd, that.

 

The problem is not Nvidia inventing middleware. The issue is having a black-boxed, proprietary piece of tech that is fundamentally anti-consumer due to vendor lock-in. Vendor lock-in diminishes competition and the choice of products, and causes price increases (which we already see across all Nvidia products), because vendor lock-in results in high switching costs.

 

 

The tables were turned with TressFX, and AMD did not behave in this manner.

 

 

What an absolutely horrible comparison. I simply cannot fathom how you don't understand the problem with GameWorks and its consequences for the entire industry, including Nvidia's own users, in the long run. It's disturbing and sad, really.

How about this: company A bribes company C (the game dev) into using proprietary, black-boxed tech that punishes the customers of company C who are also customers of company B, by delivering a sub-par experience through either bad performance or downgraded graphics. Customers of company C's products pay for this.

 


For all those who support the business strategy behind GameWorks: how would you feel if AMD made the same kind of middleware package, which game devs would heavily base their graphics on, and which would run sub-par on your PCs?

 

There is no real issue with GameWorks; that has been debunked over and over again. We all know that on paper it is harder for a dev to optimize a GameWorks title for AMD, but it can be done, because in reality we see very few GameWorks titles that favour Nvidia. The other thing you must remember is that if a dev wants to use GameWorks, there must be a reason for it; obviously the benefits of using GameWorks outweigh the drawbacks.

 

 

http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy

http://linustechtips.com/main/topic/316992-eidos-amd-to-show-off-tressfx-30-in-new-deus-ex-engine/page-2#entry4323846 <- no point reinventing the wheel.

http://www.tech24.biz/nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd/

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I would argue the exact opposite. VisualFX needs a lot of polishing, as shown in the videos I've posted above. There's only about one place in those two videos where HairWorks looks alright. From then on it's extremely washed out and pixelated. So bad it looks like they've added motion blur to the effect to cover up the defects.

 

 

Lichdom uses TressFX 2, which was a fairly small update, I believe. Deus Ex: Mankind Divided will have TressFX 3.0. If you want to learn more about it, watch the GDC presentation about it here: http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2012/10/Augmented-Hair-in-Deus-Ex-Universe-Projects-TressFX-3.0.ppsx

 

Also, the animals in Far Cry 4, in Opcode's link, looked like a mesh. Absolutely horrible. But it is a bit difficult to compare fur to actual hair, especially long hair like Lara's. Personally, I loved TressFX in Tomb Raider. I thought it looked really nice, and it gave her more character and was nicer to look at.

 

Lara's hair did not match the game's aesthetic with TressFX enabled. It looked heavily out of place.

 

I don't understand how the HairWorks examples that Opcode presented are not convincing you that the effect looks better than TressFX's implementation in Tomb Raider. In the HairWorks examples, the hair looked like it was actually growing from the subject and matched the scene and design aesthetic. Lara's hair looked like something from some dumbass anime hair mod for Skyrim, which also looks out of place and doesn't match the overall aesthetic.


I'll just leave a tidbit here.

Why do you think the only games that run well regardless of the GPU used are the ones that required no driver-side optimization?


I've been let down by shoddy AMD/ATI drivers in the past, which is why I stick to Nvidia; I've yet to have a problem.

Corsair 650D, i5-4690K (Corsair h50 cooler), Asus ROG maximus VII Ranger, Asus GTX 980ti Strix, 16GB Corsair Vengeance pro series Red (2x8gb 2400mhz), Corsair HX850W, Asus Xonar Essence STX, Sennheiser HD595 / Beyerdynamic dt 770 pro 250 ohm, Samsung 850 pro 128GB / WD black/blue/green, OCZ vertex 3 60gb . Corsair M40/Corsair k70.



 

I'll just leave a tidbit here.

Why do you think the only games that run well regardless of the GPU used are the ones that required no driver-side optimization?

Nearly all games require some sort of driver optimization, although it's not something you will always see in release notes. This is why Intel, even if they made GPUs, would never catch up with Nvidia/AMD, for the simple fact that drivers need to be hacked up for most games. It would take Intel a decade to do all the work Nvidia and AMD have done over the years; that's how much time gets invested into drivers. From that perspective, you could say graphics drivers are some of the shittiest pieces of software on any computer, for the simple fact that they have to be constantly manipulated for new games.
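To make "hacked up for most games" concrete: both vendors ship per-application profiles keyed on the running game's executable. Here's a toy C++ sketch of that lookup idea; the executable names and workaround fields are invented for illustration, and real driver internals are far more involved:

```cpp
#include <cstdio>
#include <cstring>

// All names and flags below are made up for the sketch.
struct GameProfile {
    const char* exe_name;
    bool        disable_buggy_fastpath;  // hypothetical workaround toggle
    int         shader_replacement_set;  // hypothetical optimized shader pack id
};

static const GameProfile kProfiles[] = {
    { "pcars.exe",   true,  42 },
    { "crysis3.exe", false, 17 },
};

// Match the running executable against the shipped profile database.
static const GameProfile* lookup(const char* exe) {
    for (const GameProfile& p : kProfiles)
        if (std::strcmp(p.exe_name, exe) == 0)
            return &p;
    return nullptr;  // unknown game: generic, unoptimized path
}

int main() {
    if (const GameProfile* p = lookup("pcars.exe"))
        std::printf("profile found: fastpath_off=%d shaders=%d\n",
                    p->disable_buggy_fastpath, p->shader_replacement_set);
    else
        std::printf("no profile; generic driver path\n");
}
```

Every new game potentially needs a new entry (or shader replacements, or outright workarounds), which is exactly the treadmill a newcomer would have to catch up on.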


This topic is now closed to further replies.