Thunderbolt/Mini DisplayPort requires a VGA connection to work???

gagral

Hi! :D

So, I've got a strange situation here. I have an old Asus ROG G751JM laptop connected to an old Samsung monitor (model S20C301L) through a VGA connection. 👴
This monitor supports both DVI-D and VGA inputs, while my laptop can output through VGA, HDMI or Thunderbolt/Mini DisplayPort. I wanted to upgrade to a digital signal, so I bought a Thunderbolt/Mini DisplayPort to DVI cable.

At first, it was a nightmare. There was absolutely nothing I could do to get it to work. I spent hours troubleshooting, but the monitor just couldn't detect any signal, only the VGA one. But then this happened.
I was testing with only one connection at a time, and I got tired of plugging and unplugging the cables every time. So I had the idea to keep both cables plugged in and just switch the input source in the monitor's menu when I wanted to. So, with the TB/MiniDP cable already connected, I then connected the VGA cable (while DVI was selected as the input source). The display started flashing, and after 2 seconds, voilà, it was working. And I was like, "what the f***???" :S I then unplugged the VGA cable and the signal was lost. Plugged it back in, and I got the digital signal again.
Well, I'm happy that it's working now, but I simply don't understand what's happening here. Why is it that I need both connections simultaneously to make it work? Will that still happen if I use an HDMI to DVI cable?
Fun fact: the image only displays after Windows startup. So, if I want to enter the BIOS setup, for example, I need to switch back to VGA as the input source.

Sorry for my English.

 

Update: Can't run games with it; I get a BSOD. 🙃
