
Nvidia Disables GPU PhysX when Second non-Nvidia GPU is Installed

Najuno

Back when I had my 650 Ti and A8-3850, I tried using one of the 8800 GTSs I revived as a PhysX card. In Mafia 2 I gained 10+ FPS, and the 8800 GTS never went above about 60% usage.

 

What's the point of having a dedicated PhysX card? Are there really people who play PhysX games enough to warrant having another GPU just for another ~10 FPS gain?

 

(I think the only PhysX game I own is Mafia 2, and I get good performance there anyway. I haven't played it with my new hardware at all, so...)

NZXT Phantom|FX-8320 @4.4GHz|Gigabyte 970A-UD3P|240GB SSD|2x 500GB HDD|16GB RAM|2x AMD MSI R9 270|2x 1080p IPS|Win 10

Dell Precision M4500 - Dell Latitude E4310 - HTC One M8

$200 Volvo 245

 


Is this a joke? I have Intel integrated graphics that I can't physically remove. How can they disable this?


I understand that they have competition and that it's their technology. BUT... if I'm a paying customer, whether it's AMD or Nvidia, I expect my product to work as advertised. It's like Nissan saying I can't charge my Leaf because I also have a Chevy Volt and a Honda Civic in my driveway. What is wrong with them?

Well, to be fair, it's more like Nissan not letting you charge your Leaf with a Chevy Volt charger because they can't guarantee whether it will work properly or the car will explode. Nvidia doesn't mind if you have an AMD card; it minds when they're in the same system. It does complicate matters if AMD drivers interfere with Nvidia PhysX.

Finally my Santa hat doesn't look out of place


I have a serious problem with this, for the simple fact that Nvidia is screwing at least some people who legitimately use an Nvidia card for gaming and PhysX but also have an AMD card driving other displays. For example, I don't see why they should care if I run a GTX 480 for gaming and PhysX (primary card) with something like an HD 6450 pushing two other displays.

 

Though I can see why they want to do this.

Spoiler

Desktop <dead?> 

Spoiler

P8P67-WS/Z77 Extreme4/H61DE-S3. 4x4 Samsung 1600MHz/1x8GB Gskill 1866MHzC9. 750W OCZ ZT/750w Corsair CX. GTX480/Sapphire HD7950 1.05GHz (OC). Adata SP600 256GB x2/SSG 830 128GB/1TB Hitachi Deskstar/3TB Seagate. Windows XP/7Pro, Windows 10 on Test drive. FreeBSD and Fedora on liveboot USB3 drives. 

 

Spoiler

Laptop <Works Beyond Spec>

Spoiler

HP-DM3. Pentium U5400. 2x4GB DDR3 1600MHz (Samsung iirc). Intel HD. 512GB SSD. 8TB USB drive (Western Digital). Coil Whine!!!!!! (Is that a spec?). 

 

 


HAHAHAHA, wolf PhysX and more! U mad, AMD?

I hope you are aware that TW3 uses DirectCompute for its fur physics, so it should run on AMD as well.

Edit: Granted, it's still an Nvidia-developed effect, so they could artificially disable support for AMD GPUs at any point.


Isn't this really old news? It's why Hybrid PhysX was made.

Asrock 890GX Extreme 3 - AMD Phenom II X4 955 @3.50GHz - Arctic Cooling Freezer XTREME Rev.2 - 4GB Kingston HyperX - AMD Radeon HD7850 - Kingston V300 240GB - Samsung Spinpoint F3 1TB - Chieftec APS-750 - Cooler Master HAF912 PLUS


osu! profile


This isn't new; you haven't been able to do this without tweaking drivers for the longest time, and Nvidia killed the PPU. And I don't understand why: even if people use an AMD GPU as their main card, they still need an Nvidia GPU if they want PhysX, so Nvidia still gets money.

 

They could bring back the PPU like Ageia had before they were bought out by NV, and it would sell.

 

Just a little history lesson for all the fanboying going on in this thread: PhysX was not invented or brought to market by NV; Ageia is whom you have to thank, and they also made the PPU (Physics Processing Unit). So that all gamers could enjoy PhysX, you bought yourself a little PCI-e add-in card, installed some drivers, and you were away. Then in 2008 NV bought out Ageia and things instantly went to shit: once you updated your PPU drivers past a certain version, the card stopped working. So NV forced existing Ageia customers to buy their product. That was a bad day for gamers.

 

@martinrox1568 it doesn't, and if it does, it's because NV made it that way; Ageia managed it just fine.

AMD Ryzen 5900x, Nvidia RTX 3080 (MSI Gaming X-trio), ASrock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200mhz CL16, Corsair MP600 1TB, Intel 660P 1TB, Corsair HX1000, Corsair 680x, Corsair H100i Platinum

 

 

 


So now I can't have PhysX because I have an APU?

Not that I use it a lot (not even a little xD), but it should still work.

I don't know how that GPU could shut the other one down when it's not even in use.


What I find more worrying than this is that PhysX runs a lot worse on Windows 8.1 than on Windows 7. Seriously, on Windows 8.1 there is simply no way to run PhysX smoothly with one GPU, and I have a 780 Ti, which means either it's not optimized properly or some weird monopoly business is going on behind our backs. Because of it I reverted to Windows 7, and PhysX works as intended.


So now I can't have PhysX because I have an APU?

Not that I use it a lot (not even a little xD), but it should still work.

I don't know how that GPU could shut the other one down when it's not even in use.

This is a good point, but I think you can disable the APU's onboard graphics the same way you would when installing a dedicated GPU, unless this new Nvidia check catches that as well...

 

EDIT: Read the rest of the article, and yeah... this is just ridiculous. Way to go, Nvidia; another reason I won't support you.

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


What about Kaveri users? Can't they use it either?

Also, Nvidia, stop it! Just stop it!

The same goes for you, AMD; this rivalry is childish!

Is this a joke? I have Intel integrated graphics that I can't physically remove. How can they disable this?

You can disable the iGPU if you want (at least on Intel). You do it in the BIOS, so Windows won't even know the iGPU exists (you can confirm this in Task Manager). This will only affect people with both an Nvidia graphics card and an AMD graphics card (or people who haven't properly turned off their iGPU).
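For anyone wondering what "properly turned off" means here: the driver can only key off the GPUs the OS actually enumerates, so an iGPU disabled in the BIOS simply never shows up. A rough, hypothetical sketch of the rule being described in the article (illustrative Python only, not Nvidia's actual driver logic):

```python
def physx_available(visible_gpus):
    """Hypothetical model of the reported behaviour: GPU PhysX stays
    enabled only if an Nvidia GPU is present AND no non-Nvidia GPU is
    visible to the OS. An iGPU disabled in the BIOS is never enumerated,
    so it cannot trip this check."""
    names = [gpu.lower() for gpu in visible_gpus]
    has_nvidia = any("nvidia" in n for n in names)
    has_other = any("nvidia" not in n for n in names)
    return has_nvidia and not has_other

# An Nvidia card alone: PhysX works.
print(physx_available(["NVIDIA GTX 480"]))                  # True
# Nvidia for rendering + AMD card driving extra displays: PhysX disabled.
print(physx_available(["NVIDIA GTX 480", "AMD HD 6450"]))   # False
# iGPU left enabled alongside the Nvidia card: also disabled.
print(physx_available(["Intel HD Graphics", "NVIDIA GTX 780 Ti"]))  # False
```

The GPU names and the `physx_available` helper are made up for illustration; the point is just that the check is a blanket vendor test on whatever the OS reports, which is why a BIOS-disabled iGPU is invisible to it while a merely unplugged monitor is not.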


You can disable the iGPU if you want (at least on Intel). You do it in the BIOS, so Windows won't even know the iGPU exists (you can confirm this in Task Manager). This will only affect people with both an Nvidia graphics card and an AMD graphics card (or people who haven't properly turned off their iGPU).

"In the Bright Side labs, testing is under way to further investigate this issue to see if it happens on AMD APU systems or systems with Intel integrated GPUs. Nvidia says it is currently working on a resolution for this issue."

Most consumers won't know how to do this... as it stands, you're f*cked if you have an integrated GPU on your CPU... one more way Nvidia is anti-consumer, since there is no benefit to doing this other than screwing over the competition (and actually hurting their own sales).

 

EDIT: To add, a friend of mine has a laptop that uses both the iGPU of an Intel CPU and a dedicated GPU for gaming (it switches automatically when needed), so I guess he's fucked.

 

Nvidia: here's a bunch of exclusive features we can turn off whenever we want, cause FU.

Processor: AMD FX8320 Cooler: Hyper 212 EVO Motherboard: Asus M5A99FX PRO 2.0 RAM: Corsair Vengeance 2x4GB 1600Mhz

Graphics: Zotac GTX 1060 6GB PSU: Corsair AX860 Case: Corsair Carbine 500R Drives: 500GB Samsung 840 EVO SSD & Seagate 1TB 7200rpm HDD

 


This is old news, mayne; it's been going on for years.


I think it's a totally justifiable move. I don't see what all the fuss is about ^^

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

Nvidia may be overpriced, but its cards fail far less often than AMD's, use less electricity, run cooler, and yes, they have features Nvidia owns the rights to and has an obligation to protect vis-à-vis its shareholders. People need to stop complaining about sensible business practices and push AMD to produce competitive products.

PhysX is actually extremely important for running simulation calculations. It's a great hardware engine if you know how to use it properly. AMD is fine for unrefined gaming and some low-end mining, but otherwise how exactly has it pushed the industry forward lately, and how has that warranted this bashing of Nvidia?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nvidia may be overpriced, but its cards fail far less often than AMD's, use less electricity, run cooler, and yes, they have features Nvidia owns the rights to and has an obligation to protect vis-à-vis its shareholders. People need to stop complaining about sensible business practices and push AMD to produce competitive products.

PhysX is actually extremely important for running simulation calculations. It's a great hardware engine if you know how to use it properly. AMD is fine for unrefined gaming and some low-end mining, but otherwise how exactly has it pushed the industry forward lately, and how has that warranted this bashing of Nvidia?

 

Because if you don't have an "AMD SUCKS" or "NVIDIA SUCKS" thread once a day, this forum and its 12-year-olds have failed their purpose in life. 


So with this, Nvidia doesn't let anyone with an Intel or AMD APU use their streaming/recording functionality?

 


"In the Bright Side labs, testing is under way to further investigate this issue to see if it happens on AMD APU systems or systems with Intel integrated GPUs. Nvidia says it is currently working on a resolution for this issue."

Most consumers won't know how to do this... as it stands, you're f*cked if you have an integrated GPU on your CPU... one more way Nvidia is anti-consumer, since there is no benefit to doing this other than screwing over the competition (and actually hurting their own sales).

 

EDIT: To add, a friend of mine has a laptop that uses both the iGPU of an Intel CPU and a dedicated GPU for gaming (it switches automatically when needed), so I guess he's fucked.

 

Nvidia: here's a bunch of exclusive features we can turn off whenever we want, cause FU.

They are testing whether it happens, and if it does, it's because the iGPU is not properly turned off. Again, if it's turned off properly, then not even Windows will know the iGPU exists, and neither will the Nvidia driver.

The only way I can see this being an issue is if your motherboard has a poor BIOS that doesn't turn the iGPU off automatically. Nvidia will fix any false positives, though. The people who don't know how to disable the iGPU in the BIOS probably don't update to the latest drivers either, so I doubt this will trigger many false positives.

 

Don't get me wrong, I think it's stupid of Nvidia to disable features on their own cards (which they have been doing for a long, long time; this just enforces that policy even more), but I doubt more than a handful of legitimate users will suffer from this.

 

 

So with this, Nvidia doesn't let anyone with an Intel or AMD APU use their streaming/recording functionality?

It only disables PhysX, not the entire card, and they are already working on fixing any potential false positives.


Why do people care about Physx in 2014 anyway?

Witcher 3 ...

and games look better with PhysX.

Too many ****ing games!  Back log 4 life! :S


Witcher 3 ...

and games look better with PhysX.

 

 

I guess I should say I support the concept of PhysX these days, but I protest the practice: it stifles competition, with one product getting benefits over another instead of the technology being shared to benefit all gamers.

 

 


It only disables PhysX, not the entire card, and they are already working on fixing any potential false positives.

Well, I think this is more about streaming/recording than anything else...

What I meant is:

Imagine you want to stream or record yourself playing The Witcher 3 using Intel Quick Sync or Raptr (depending on whether you have Intel or AMD)... you won't be able to use PhysX in the game, right?

If that's so, it's kind of messed up. They are forcing you to use ShadowPlay if you want to fully enjoy the game with what the hardware offers... Nvidia is limiting your choice. It doesn't matter whether you prefer Quick Sync or Raptr, whether for taste, performance loss, whatever; they won't let you use any of those apps if you want to use PhysX.

Now imagine game streamers, reviewers, and tech journalists being forced to use ShadowPlay if they want to show the game with PhysX.

In fact, I might ask CD Projekt Red directly whether I can fully enjoy what their game offers while streaming with an app of my choice, to expose this PhysX situation.


You can disable the iGPU if you want (at least on Intel). You do it in the BIOS, so Windows won't even know the iGPU exists (you can confirm this in Task Manager). This will only affect people with both an Nvidia graphics card and an AMD graphics card (or people who haven't properly turned off their iGPU).

There is no way for me to completely disable the iGPU; the only thing I can do is not connect a monitor to it, but that's not the same as disabling it the way unplugging a video card from its slot is. If I connect a monitor, it instantly recognizes it and extends the desktop to my second screen...

I don't want to connect my second monitor to the main GPU because it doesn't have enough VRAM; I run other stuff like YouTube on the second screen while I game, and that uses VRAM.

Anyway, there aren't many PhysX games anymore, but if I ever want to play one, this will hurt.


I'm not sure how this is even legal; it goes beyond petty GPU wars. Somebody with an Nvidia GPU could sue them for intentionally turning off features for no good reason. It wouldn't be easy to prove it's not just a technical issue, but since it used to work, it wouldn't be impossible either.

-------

Current Rig

-------

