
Problem setting up two monitors. :?

DZONS

Description:
So recently I decided to add another monitor to my PC so I can watch streams and do more stuff on the side.

I got this Dell 1907FP monitor from my friend; he was using it perfectly fine before.

Since this Dell monitor doesn't have an HDMI input, I used an HDMI-to-VGA adapter.

 

The problem:

OK, so when I added the Dell monitor, at first everything seemed fine, but after a minute or two the computer's wallpapers turned grey (only the wallpapers; the icons and everything else stayed). About 5 minutes after plugging the second monitor in, the PC just turns off and then turns back on, and this happens in a loop. I'm not sure what the problem could be here. I really want this to work, as having two monitors would make my life much easier.

I used two different VGA cables, and this happens with both. I have the latest drivers for the RX 570.

 

My PC specs:

Ryzen 5 3500X

500 W PSU (not sure of the brand)

RX 570 4 GB

16 GB RAM

2 TB hard drive

 



Try an older driver ... I have an RX 570 at home and my drivers are probably from January, or December of the previous year.

There are very few fixes or improvements for these older generations; most of the driver development goes into the newer cards, so you wouldn't miss much.

I doubt it's the VGA cable, but the HDMI-to-VGA converter could be the problem. The converter uses an active chip which draws its power from the HDMI connector, and that may be more than what the video card can supply (or the video card can't sustain what the converter needs, because the regulator on the video card gets too hot due to too much power being drawn by the converter).
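To put rough numbers on that, here's a minimal sketch. The 55 mA figure is the minimum an HDMI source is required to supply on the +5 V pin; the converter's draw is an assumption, since cheap no-name adapters rarely publish specs:

```python
# Back-of-the-envelope +5 V power budget for an active HDMI-to-VGA adapter.
# An HDMI source is only required to supply 55 mA on the +5 V pin;
# the DAC chip in a cheap converter can easily want more (assumed figure).

HDMI_5V_VOLTS = 5.0
SOURCE_MIN_SUPPLY_MA = 55    # minimum a compliant HDMI source must provide
CONVERTER_DRAW_MA = 150      # assumption: typical cheap DAC chip + level shifter

budget_mw = HDMI_5V_VOLTS * SOURCE_MIN_SUPPLY_MA
demand_mw = HDMI_5V_VOLTS * CONVERTER_DRAW_MA

print(f"Guaranteed budget: {budget_mw:.0f} mW, converter demand: {demand_mw:.0f} mW")
if CONVERTER_DRAW_MA > SOURCE_MIN_SUPPLY_MA:
    print("Converter may be over budget -> brownouts, resets, lost sync.")
```

Many cards supply well above the required minimum, which is why the same adapter can be fine on one GPU and misbehave on another.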

 


1 minute ago, mariushm said:

Try an older driver ...

 

OK, which driver do you suggest I try?

I hope it's not the adapter's problem, as I spent 10 bucks on it. 😄

If an older driver doesn't help, then I guess I'm going to have to buy a DVI-to-DVI cable.


1 minute ago, mariushm said:

I doubt it's the VGA cable, but the HDMI-to-VGA converter could be the problem.

 

I doubt it's a power issue. I have what looks like that exact HDMI adapter and it's always been fine for me. The trouble with VGA is that it's not a digital standard, so you may need to play around with the display settings in Windows to get it to work properly. That said, I'm not sure why the PC would turn off after some time. Perhaps use DVI if you can, as it's digital and will likely still work reasonably well. I suspect that the VGA chip in that adapter isn't talking to the monitor properly for some reason.


I've tried two different VGA cables; this happens with both.

 

Link to comment
Share on other sites

Link to post
Share on other sites

2 minutes ago, DZONS said:

I've tried two different VGA cables; this happens with both.

 

I was referring to the adapter, and the monitor settings you are attempting to use: resolution, refresh rate, etc. In VGA there is no "resolution"; it's an analog signal for X time, then a blank, then another signal, from top to bottom, to make the image. Digitally you would say pixel X is color Y, rinse and repeat for the whole resolution. So if the adapter is sending signals in the wrong way, the new monitor won't look right, or won't display at all.
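In other words, a VGA source describes the picture purely with timings. As a minimal sketch, here's roughly what the adapter has to generate for the 1907FP's native 1280x1024 @ 60 Hz; the figures are the standard VESA DMT numbers for that mode:

```python
# VESA DMT timing for 1280x1024 @ 60 Hz, the Dell 1907FP's native mode.
# On the VGA wire there are no "pixels", only these timings; if the
# adapter generates a mode the monitor can't lock onto, you get no image.

h_active, h_total = 1280, 1688   # visible pixels vs. total pixel clocks per line
v_active, v_total = 1024, 1066   # visible lines vs. total lines per frame
refresh_hz = 60

pixel_clock_hz = h_total * v_total * refresh_hz
h_scan_khz = pixel_clock_hz / h_total / 1000

print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")   # ~108 MHz
print(f"Horizontal scan rate: {h_scan_khz:.2f} kHz")    # ~64 kHz
```

If the converter outputs a mode outside what the monitor accepts, the monitor drops sync, which can look exactly like a flaky cable.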


2 minutes ago, MedievalMatt said:

I was referring to the adapter, and the monitor settings you are attempting to use.

I have played around with the resolutions and stuff. I also tried updating my drivers to the latest. At this moment I'm considering buying a DVI-to-DVI cable, hoping that would fix the problem.

 


Either way, the PC should NOT turn off because of the second monitor or because of a VGA cable. 

 

The PC turning off is a sign of something more serious going on, like the video card drivers crashing, or some other bigger issue.

 

If it were just a problem of the analogue signal not being correct for the monitor, the system would simply receive "monitor turned off" and "monitor turned on" signals as the monitor loses sync with the analogue signal, and the monitor would show "no signal" on screen.

 

If the active chip in the converter loses power for some reason (like consuming too much power from the HDMI connector so the regulator shuts down due to overheating, or maybe the converter is just defective and overheats and restarts randomly), then it's like pulling the HDMI cable out and in repeatedly. The video card could see it as cable inserted / cable removed several times a second, and that can mess with the driver and with the Windows GUI (which enables or disables the second monitor, maybe tries to move windows over to the active monitor, and so on).

Such "glitches" could also cause issues with HDCP encryption, if such encryption is enabled (I doubt it is).

 

 

@DZONS If that old monitor has DVI, there are cheap passive DVI-HDMI adapters which simply rearrange the wires between connectors, so there are no active chips or any complications. It may be cheaper to use such a passive adapter with an HDMI cable instead of buying a DVI-DVI cable you'll never use otherwise.


I've seen this happen before, and I've had it happen to me. These HDMI-to-VGA adapters are active adapters, as you need to turn a digital signal into an analog one. The problem here is that some of these adapters are just kinda crap or semi-defective and feed back a small current, causing a power trip in the GPU and thus a crash. Not uncommon, really.


So I should get a DVI-to-HDMI adapter rather than just a DVI-to-DVI cable?

 


It's entirely up to you. A DVI-DVI cable is single-purpose: you can only use it between two DVI connectors. It will work for you for this exact job, connecting the monitor to the video card.

A passive DVI-HDMI adapter simply converts HDMI to DVI, or DVI to HDMI. It works both ways because, like I said, there's only a bunch of wires between the connectors, and HDMI is designed to be backwards compatible.

So you could attach the adapter to a monitor and turn it into an HDMI monitor, or you could attach the adapter to a video card and get a second HDMI port on the card (for example, to connect the video card to a TV with HDMI inputs).

And if you get rid of that monitor, you'd still be left with an HDMI cable you can reuse with another monitor, instead of ending up with a DVI-DVI cable you can't use with anything.

You get more possible uses with a DVI-HDMI adapter, AND HDMI cables are potentially cheaper because they're more mass-produced, lighter, take up less space, and can be bought from various local shops.
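For the curious, here's a minimal sketch of why the passive adapter needs no chip: single-link DVI-D and HDMI carry the same electrical signals, so the adapter is just wires. The lists use signal names from the DVI/HDMI specs rather than pin numbers:

```python
# Signals a passive HDMI<->DVI-D adapter wires straight through.
# Both connectors carry TMDS, so no conversion chip is needed;
# HDMI-only extras simply have no counterpart on the DVI side.

PASSED_THROUGH = [
    "TMDS Data0 +/-", "TMDS Data1 +/-", "TMDS Data2 +/-",
    "TMDS Clock +/-",
    "DDC SCL/SDA (EDID, so the PC can read the monitor's supported modes)",
    "+5 V", "Hot Plug Detect",
]

HDMI_ONLY = [
    "CEC (remote-control bus)",
    "Audio (embedded in the TMDS stream; a DVI monitor just ignores it)",
]

print("Wired straight through:", *PASSED_THROUGH, sep="\n  ")
print("HDMI-only, lost on DVI:", *HDMI_ONLY, sep="\n  ")
```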

 

 

