
Monitor won't display signal, but is detected by PC (solved)

I built a PC about two months ago and recently found out the monitor (LG 2560x1080 25-inch ultrawide) could be overclocked to 75 Hz (it was 60 Hz before). I overclocked it and it worked fine through multiple restarts for around two weeks. Earlier today, I started a game for the first time (Project Argo, if relevant) and the monitor went black and wouldn't detect a signal. I assumed this meant the game was outputting at a resolution the monitor didn't support, and I was right. I changed the resolution and display frame rate via the config files and started the game again. The monitor lost connection again, but this time, even after closing the game, the monitor wouldn't detect a signal from my PC. I could tell the PC still thought the monitor was connected, because my second monitor was not acting as the main monitor and my cursor could still move off onto the other screen. I tested the monitor on another PC, and it worked right away. Any ideas?

 

TL;DR: Monitor with a stable overclock won't display a signal from one PC, but that PC still thinks it is connected.

Specs / other relevant things

Windows 10, 64-bit

GTX 970 / i5-6500 / H170 Mobo / 500 Watt EVGA PSU

 

EDIT:

Solved: the game I was troubleshooting changed my monitor's refresh rate in the NVIDIA Control Panel to 85 Hz, which the monitor couldn't handle, so it lost signal whenever video was sent to it. Setting the refresh rate back down fixed it.
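For anyone who runs into this with only the affected monitor attached (so you can't see the NVIDIA Control Panel at all), a blind fallback is to force the refresh rate back to 60 Hz programmatically. Below is a minimal sketch using the Win32 EnumDisplaySettings/ChangeDisplaySettings calls. This isn't what I did (I just fixed it in the NVIDIA Control Panel once I could see my second screen); it's only an illustration of the same reset, and it only touches the primary display.

```c
/* Sketch only: read the primary display's current mode and force it back
 * to 60 Hz. Compile with: cl reset_hz.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Read the current mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "Could not read current display settings\n");
        return 1;
    }
    printf("Current: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Drop back to 60 Hz, keeping resolution and everything else unchanged. */
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_DISPLAYFREQUENCY;

    LONG res = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    if (res != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "ChangeDisplaySettings failed (code %ld)\n", res);
        return 1;
    }
    printf("Refresh rate set back to 60 Hz\n");
    return 0;
}
```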
