
does running a monitor at a lower refresh rate use less power?

Tom_nerd

hey guys

 

I was wondering if running a monitor at 30Hz instead of 60Hz would use less power. I know the GPU would draw less power at a lower refresh rate, but if you exclude that and only measure the monitor's power consumption, would it make a difference?

 

thanks in advance,

Tom.


Monitors don't consume much power. If you want to save, I would start with a more efficient power supply and system components.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566


Your GPU would consume less power for a given task, yes. This is why some laptops drop from 60Hz to 40Hz when idle. The saving isn't meaningful on a desktop PC, though, and you won't see a noticeable drop in your electricity bill.



Actually, most of the power saving will come from the GPU, not the monitor. The monitor will use less power, but the difference between 60Hz and 30Hz is so small it wouldn't matter. If the frame rate is lower (hertz and frame rate are two different things; here we're talking about the number of frames displayed per second, so frame rate), the monitor shows each frame for longer, so it won't really use less power. For a typical 24-inch 1080p screen it takes around 0.003 amps per frame to change the orientation of the liquid crystals to get the right color. All of that applies to LCDs; OLEDs use the same amount of power per second regardless of refresh rate.
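To put that figure in perspective, here's a rough back-of-the-envelope sketch in Python. The only number taken from the post above is the ~0.003 A per-frame figure; the 12 V panel rail and the ~18 W backlight draw are assumptions for illustration, since it's the backlight that dominates an LCD's consumption.

# Back-of-the-envelope: monitor power at 30Hz vs 60Hz.
# Hypothetical values: only the 0.003 A per-frame figure comes from
# the post above; the 12 V rail and 18 W backlight are assumptions.

PIXEL_DRIVE_CURRENT_AT_60HZ_A = 0.003  # per-frame drive current (from the post)
PANEL_RAIL_V = 12.0                    # assumed panel supply voltage
BACKLIGHT_W = 18.0                     # assumed; dominates a 24" LCD's draw

def monitor_power_w(refresh_hz):
    # Pixel-drive power scales with how often the crystals are re-driven;
    # the backlight doesn't care about refresh rate at all.
    drive_w = PIXEL_DRIVE_CURRENT_AT_60HZ_A * (refresh_hz / 60.0) * PANEL_RAIL_V
    return BACKLIGHT_W + drive_w

for hz in (30, 60):
    print(f"{hz}Hz: ~{monitor_power_w(hz):.3f} W")
# 30Hz: ~18.018 W
# 60Hz: ~18.036 W  -> roughly 0.02 W apart, lost in the noise.

Even if the per-frame figure were off by a factor of ten, the backlight would still swamp the difference, which is why nobody chases refresh rate for monitor power savings on a desktop.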

