Ubuntu 20.04 - NVIDIA GPU consuming power even when using only integrated graphics card (Intel iGPU)

The issue is pretty simple: switching from the NVIDIA GPU to the Intel iGPU with the NVIDIA X Server Settings GUI doesn't fully work. Even after rebooting, the NVIDIA GPU still draws some power and, as a consequence, generates unnecessary heat. The same thing (obviously) happens if I use prime-select from the terminal. powertop shows that with everything idle I am still drawing around 18-22 W, which is at least 10 W more than I would expect.

This seems to be a rather old bug that supposedly got "fixed". I can find at least two Launchpad bug reports (https://bugs.launchpad.net/ubuntu/+source/nvidia-prime/+bug/1765363) with some workarounds, but at this point I am not even sure what works and what doesn't, since most posts are over 3 years old. For example, I tried installing Ubuntu without selecting "install third-party drivers", since someone suggested that would fix the problem, but it didn't help in my case.
Someone else here recently posted about this issue again -> https://discourse.ubuntu.com/t/nvidia-prime-not-powering-off-the-dgpu/21856
I used to have another Optimus laptop years ago and I remember it worked fine on older Ubuntu versions, but now something seems wrong.

 

Config:
- Zephyrus M16, 11800H, 3070
- Ubuntu 20.04 (but same problem with 21.10)
- NVIDIA drivers 470 and 460 tested.
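
In case it helps, this is roughly how I'm checking whether the dGPU actually powers down. The PCI address 0000:01:00.0 is what lspci reports for the 3070 on my machine, so adjust it for your hardware:

    # switch the PRIME profile to the iGPU, then reboot
    sudo prime-select intel

    # find the dGPU's PCI address
    lspci | grep -i nvidia

    # runtime PM state of the dGPU: "suspended" means powered down, "active" means it isn't
    cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status

    # what the NVIDIA driver itself reports (only exists while the nvidia module is loaded)
    cat /proc/driver/nvidia/gpus/0000:01:00.0/power

    # overall idle power draw
    sudo powertop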

3 hours ago, TommyBirba said:

Ubuntu 20.04 (but same problem with 21.10)

I hate to be "that person" but have you tried running a live install of a different distro?

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e-mail will alert me. 

Thread cleaned. If you have nothing useful / helpful to say or don't intend to say it in a friendly way, don't. 

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2

22 minutes ago, Kilrah said:

Thread cleaned. If you have nothing useful / helpful to say or don't intend to say it in a friendly way, don't. 

Fine. But I'll still say that, as a Linux user, Nvidia is no good to use. It's a fact people need to face. That's me being helpful. If it doesn't work, don't use it. Don't support a company that won't help you.

2 hours ago, Radium_Angel said:

I hate to be "that person" but have you tried running a live install of a different distro?

Not yet; I might try this weekend, but from reading online I think it might work right away on Fedora and Pop!_OS. This seems to be more of an Ubuntu problem than an NVIDIA one (although I'm sure their Linux approach doesn't help).

I found this on the Arch Wiki: https://wiki.archlinux.org/title/PRIME#PCI-Express_Runtime_D3_(RTD3)_Power_Management
Maybe you can search for how to apply the same thing on Ubuntu? I've pasted the gist of it below.
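
At a glance it boils down to a modprobe option plus a udev rule, roughly like this (copied from the wiki, I haven't tested it on Ubuntu myself, so the file names and exact values may need adjusting, and Ubuntu's nvidia packages might already ship something similar):

    # /etc/modprobe.d/nvidia-pm.conf
    # allow the driver to put the dGPU into its lowest power state when idle (Turing and newer)
    options nvidia "NVreg_DynamicPowerManagement=0x02"

    # /etc/udev/rules.d/80-nvidia-pm.rules
    # enable PCI runtime power management for NVIDIA VGA/3D controllers
    ACTION=="bind", SUBSYSTEM=="pci", ATTR{vendor}=="0x10de", ATTR{class}=="0x030000", TEST=="power/control", ATTR{power/control}="auto"
    ACTION=="bind", SUBSYSTEM=="pci", ATTR{vendor}=="0x10de", ATTR{class}=="0x030200", TEST=="power/control", ATTR{power/control}="auto"

You'd probably need to update the initramfs and reboot after changing those.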

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

18 hours ago, TommyBirba said:

The issue is pretty simple: switching from the NVIDIA GPU to the Intel iGPU with the NVIDIA X Server Settings GUI doesn't fully work. Even after rebooting, the NVIDIA GPU still draws some power and, as a consequence, generates unnecessary heat. The same thing (obviously) happens if I use prime-select from the terminal. powertop shows that with everything idle I am still drawing around 18-22 W, which is at least 10 W more than I would expect.

This seems to be a rather old bug that supposedly got "fixed". I can find at least two Launchpad bug reports (https://bugs.launchpad.net/ubuntu/+source/nvidia-prime/+bug/1765363) with some workarounds, but at this point I am not even sure what works and what doesn't, since most posts are over 3 years old. For example, I tried installing Ubuntu without selecting "install third-party drivers", since someone suggested that would fix the problem, but it didn't help in my case.
Someone else here recently posted about this issue again -> https://discourse.ubuntu.com/t/nvidia-prime-not-powering-off-the-dgpu/21856
I used to have another Optimus laptop years ago and I remember it worked fine on older Ubuntu versions, but now something seems wrong.

 

Config:
- Zephyrus M16, 11800H, 3070
- Ubuntu 20.04 (but same problem with 21.10)
- NVIDIA drivers 470 and 460 tested.

The best thing you can do is trade that laptop (or sell it) for one that's all-AMD. It will spare you a great deal of grief. Nvidia has never worked right on Linux; as a Linux user, you are a second-class citizen to Nvidia. It's no surprise Valve is going full AMD with the Steam Deck either.

 

This is why I don't use anything Nvidia anymore: over the years it has just given me a bunch of problems on Linux. Like I said, don't support a company that won't help you. If anything, Nvidia is more focused on selling to crypto miners (which has been contributing to global warming). They don't care about gamers either.

 

Videos like this wouldn't exist otherwise:

Or this by Linus Sebastian himself:

Funny how both of these Linuses have something bad to say about Nvidia. 🤣
