So I'll have multiple questions, and I'm not sure if it's a graphics card, monitor, or cable issue, so bear with me.
I've been using a BenQ XL2720Z with an MSI GTX 970 for over a year now. I bought a DisplayPort cable for it because I wasn't sure (still not sure, to be honest; I've never tried anything other than DisplayPort) whether the cable that came with the monitor would work at a 144Hz refresh rate.
Since none of the listed resolutions had a 144Hz option, I set up my own. Nothing fancy: I just went into the control panel and made a custom 1920x1080 resolution at 144Hz, changing nothing else.
The first issue goes back to when I first plugged in the DisplayPort cable. The resolution works and the refresh rate is what it's supposed to be, but sometimes when I start up my PC and turn on the monitor, I get a "no signal detected" message. A quick reset fixes it, so I never really looked into it, but since I'm still getting it and now have other issues too, I figured I should finally ask. Why does that happen? Any ideas? Is it normal with DisplayPort, or with the monitor and graphics card I'm using?
The second issue is a fresh one; it happened an hour or two ago. Some backstory first: I have a laptop that I mainly use for a TeamSpeak server. It's half a laptop, meaning it's missing the screen panel, but otherwise working. We had a blackout and the spare monitor I was using with the laptop died, so I plugged the laptop into my main monitor (the BenQ) with a D-sub cable to restart the server and whatnot (the laptop is ancient), while the DisplayPort cable was still connected to my graphics card and the PC was still running. I figured I could swap between the two inputs without restarting or turning anything off.

I wasn't paying attention, and for some reason, instead of unplugging anything from the laptop, I unplugged the graphics card end of the DP cable. No big deal, I thought, I'll just plug it back in. I did, and the screen came back, but at a much smaller aspect ratio with something like a 1024x768 resolution. "I'll just re-select the resolution I was using," says I, but the control panel only showed the first two options or so, kind of like when it doesn't detect your video card. After a quick restart I had to re-select the "full" display mode on my monitor, and when I went to enable my original resolution, it had disappeared along with all the other ones. I only had some default factory resolutions capped at 60Hz, nothing I could raise higher than that.

I figured something had been reset and I just had to make a new custom resolution. I did that, but I kept getting "out of range" messages, and the screen wouldn't revert after testing the new resolution (it would just show a "no signal detected" error and I'd have to reset). I got scared that I had broken my graphics card, so I kept restarting, but it wouldn't revert from that 144Hz resolution for some reason. After three or four restarts and trying different things, I did a fresh install of the NVIDIA driver so I could try again after it reset my resolution to the default.
I did it again, and it still wasn't working. I tried again with 120Hz, and the screen flickered like it was being shocked by a taser (no sound, but the words were bouncing around; they looked like sound waves, though still readable). No error message this time, so I could revert to the original. 144Hz was still not working. I pretty much gave up at that point and figured I'd have to buy a new graphics card, or cable, or both, I don't know. I did another restart so I could at least write this topic and see if you guys could help me, checked the resolutions, and all my original ones, along with the custom 144Hz one, were back. I selected it and it's working now. I did nothing, really, but it came back. ¯\_(ツ)_/¯
So I guess my question is: did I fuck something up? What fixed it? Should I never, ever, EVER plug two different cables into my monitor, or accidentally unplug the DisplayPort cable from the graphics card while the PC is running?
I'm not even sure everything is working properly anymore; maybe I'm just lucky and, even though I get this "no signal detected" error, it still starts for some reason, while the cable, the monitor, or the graphics card is actually faulty. I guess I should test the cable that came with the monitor to see how it works. It's a DVI-D, I think? It says Dual Link on top. Can that handle 144Hz, or should I definitely stick with DisplayPort if I want that?
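For what it's worth, dual-link DVI tops out around a 330 MHz pixel clock (two 165 MHz TMDS links), and 1920x1080 at 144Hz lands just under that, which is why some monitors can do 1080p 144Hz over dual-link DVI. Here's a rough back-of-the-envelope check; the blanking values are assumed CVT reduced-blanking-style figures for illustration, not the monitor's actual EDID timings:

```python
# Rough pixel-clock estimate: can dual-link DVI-D drive 1920x1080 @ 144 Hz?
# H_BLANK/V_BLANK below are assumed CVT Reduced Blanking-style values;
# the real timings come from the monitor's EDID and may differ slightly.

H_ACTIVE, V_ACTIVE = 1920, 1080
H_BLANK, V_BLANK = 160, 5           # assumed blanking intervals
REFRESH_HZ = 144

h_total = H_ACTIVE + H_BLANK        # 2080 pixels per scanline
v_total = V_ACTIVE + V_BLANK        # 1085 lines per frame
pixel_clock_hz = h_total * v_total * REFRESH_HZ

DUAL_LINK_DVI_MAX_HZ = 330e6        # ~2 x 165 MHz TMDS links

print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")    # ~325.0 MHz
print("Fits dual-link DVI:", pixel_clock_hz <= DUAL_LINK_DVI_MAX_HZ)
```

So it's close to the limit, but with reduced blanking it should fit; with full (non-reduced) blanking it wouldn't.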
Yes, I obviously can't keep my framerate above 140 in every game, so you could argue I don't need that high a refresh rate. Mainly it's because it's a 144Hz monitor, but I'll also get a 1080 or 1080 Ti later, once it's available here and doesn't cost a fortune.
Any help?