
How to use my iGPU without using my graphics card?

lerodemmy

Isn't there some setting that will allow me to use my CPU's integrated graphics for basic tasks, and allow my graphics card to remain idle?  I think there is, but I don't know where to find it.  Is it good to use the iGPU for non-gaming stuff (web browsing, Office, Photoshop, YouTube)?

CPU: Intel Core i9-14900K | GPU: EVGA RTX 3090 FTW3 Ultra Gaming | Motherboard: Asus ROG Strix Z790-E Gaming | RAM: Corsair Dominator Titanium 64GB | SSD: Samsung 980 Pro 2TB | PSU: Corsair RM1000x (2021) | Cooler: Corsair iCUE Link H150i | Case: Corsair 5000D Airflow | Fans: Corsair QX120 x10 | Cables: Corsair Premium Individually Sleeved | Keyboard: Corsair K100 RGB | Mouse: Corsair Nightsabre Wireless | Mouse Pad: Asus ROG Sheath | Monitor: Aorus FV43U


Is the dedicated GPU AMD or Nvidia? If you are on a laptop, using the iGPU will save battery life. On a desktop it really doesn't make much of a difference.


2 minutes ago, Pixelfie said:

Is the dedicated GPU AMD or Nvidia? If you are on a laptop, using the iGPU will save battery life. On a desktop it really doesn't make much of a difference.

Nvidia.  Desktop.



Just now, manikyath said:

The question is 'why'... there's not really a reason to do it on a desktop. Even if your power cost is ridiculous, the difference in power draw is negligible.

Just because I notice the GPU temp goes up a couple of degrees no matter what I'm doing.  I thought maybe it would be better to avoid that.  Though I suppose the CPU temp would go up if I was using the integrated graphics.



Just now, lerodemmy said:

Just because I notice the GPU temp goes up a couple of degrees no matter what I'm doing.  I thought maybe it would be better to avoid that.  Though I suppose the CPU temp would go up if I was using the integrated graphics.

The GPU will still use a small amount of power no matter what. I wouldn't bother trying to use the iGPU as well. 

Phobos: AMD Ryzen 7 2700, 16GB 3000MHz DDR4, ASRock B450 Steel Legend, 8GB Nvidia GeForce RTX 2070, 2GB Nvidia GeForce GT 1030, 1TB Samsung SSD 980, 450W Corsair CXM, Corsair Carbide 175R, Windows 10 Pro

 

Polaris: Intel Xeon E5-2697 v2, 32GB 1600MHz DDR3, ASRock X79 Extreme6, 12GB Nvidia GeForce RTX 3080, 6GB Nvidia GeForce GTX 1660 Ti, 1TB Crucial MX500, 750W Corsair RM750, Antec SX635, Windows 10 Pro

 

Pluto: Intel Core i7-2600, 32GB 1600MHz DDR3, ASUS P8Z68-V, 4GB XFX AMD Radeon RX 570, 8GB ASUS AMD Radeon RX 570, 1TB Samsung 860 EVO, 3TB Seagate BarraCuda, 750W EVGA BQ, Fractal Design Focus G, Windows 10 Pro for Workstations

 

York (NAS): Intel Core i5-2400, 16GB 1600MHz DDR3, HP Compaq OEM, 240GB Kingston V300 (boot), 3x2TB Seagate BarraCuda, 320W HP PSU, HP Compaq 6200 Pro, TrueNAS CORE (12.0)
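On the "small amount of power" point above: if you want actual numbers rather than guesses, the Nvidia driver's own nvidia-smi tool reports idle power draw and temperature directly. A minimal sketch in Python, assuming nvidia-smi is on the PATH (it ships with the driver); the helper name is just for illustration:

# Query the card's power draw, temperature and utilization via nvidia-smi.
# Assumes the standard Nvidia driver is installed and nvidia-smi is on the PATH.
import subprocess

def gpu_idle_stats() -> str:
    # Ask for plain CSV so the output is easy to read or parse.
    result = subprocess.run(
        [
            "nvidia-smi",
            "--query-gpu=name,power.draw,temperature.gpu,utilization.gpu",
            "--format=csv,noheader",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(gpu_idle_stats())

Run it once with the desktop idle and again while browsing or watching a video; the gap between the two readings is the power that is actually at stake here.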


Just now, wat3rmelon_man2 said:

Look up 'graphics settings' in the Start menu.

It will come up with something like this:

[screenshot of the Windows Graphics settings page]

And you can choose which GPU to use for each app.

Thanks!
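For reference, the per-app choice that the Graphics settings page makes can reportedly also be scripted: Windows appears to store it as a string value under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, keyed by the executable's full path. A hedged sketch in Python (the registry location is an assumption based on how recent Windows 10/11 builds commonly store this setting, and the Chrome path is only an example):

# Hedged sketch: write the same per-app GPU preference the Settings > Graphics page uses.
# Assumed registry location: HKCU\Software\Microsoft\DirectX\UserGpuPreferences
# Value data: "GpuPreference=1;" = power saving (iGPU), "GpuPreference=2;" = high performance (dGPU).
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

def set_gpu_preference(exe_path: str, preference: int) -> None:
    # preference: 0 = let Windows decide, 1 = power saving, 2 = high performance
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, f"GpuPreference={preference};")

if __name__ == "__main__":
    # Example (hypothetical path): prefer the integrated GPU for Chrome.
    set_gpu_preference(r"C:\Program Files\Google\Chrome\Application\chrome.exe", 1)

As the replies below note, this only sets a preference for where each app renders; it does not power the dedicated card off.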



Just now, lerodemmy said:

Just because I notice the GPU temp goes up a couple of degrees no matter what I'm doing.  I thought maybe it would be better to avoid that.  Though I suppose the CPU temp would go up if I was using the integrated graphics.

The temp changing a few degrees won't have any sort of measurable impact on anything.

Trying to switch graphics back and forth, on the other hand, can be quite problematic for some software to deal with.


Just now, wat3rmelon_man2 said:

Look up 'graphics settings' in the Start menu.

It will come up with something like this:

[screenshot of the Windows Graphics settings page]

And you can choose which GPU to use for each app.

But that won't actually stop the dedicated GPU from running, so it will make even less of a difference.

You're essentially just making the software side more complicated, for the sake of making things more complicated.


Just now, manikyath said:

But that won't actually stop the dedicated GPU from running, so it will make even less of a difference.

You're essentially just making the software side more complicated, for the sake of making things more complicated.

I'll just leave it alone and let the GPU do its thing.  I just wanted to find out if there was any benefit to using the iGPU and it appears I got my answer.



Just now, lerodemmy said:

I'll just leave it alone and let the GPU do its thing.  I just wanted to find out if there was any benefit to using the iGPU and it appears I got my answer.

With laptops that have dual graphics, it's worth the effort to switch them, mainly for power savings, but it won't make any appreciable difference on a desktop.


