Hmm, why does my secondary monitor have a blue tint?

theuserjohnny

So here's the story: right now I'm using my HDTV as my main monitor, so it's (obviously) occupying the HDMI port on my GTX 770. My brother recently left for college and left one of his monitors behind, so I took it to use as a secondary display. The issue is that whenever I connect it, the picture has a blue tint. Everything looks fine in terms of resolution, but it has this nasty tint to it. I've tried fixing the color settings in Windows 8 and even switched over to the Nvidia Control Panel to try and change this, and nothing has worked. Even changing the settings on the monitor itself does no good.
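As an aside for anyone diagnosing the same kind of tint: a full-screen solid-color test makes it obvious which channel is misbehaving. Below is a minimal Win32 sketch (not from any existing tool, and untested against this exact setup): drag the window onto the tinted monitor, maximize it, and press any key to cycle red, green, blue, and white. If pure red or green comes out dark or discolored, that analog line is the suspect.

```c
#include <windows.h>

/* Cycles full-window solid colors so you can see which channel the
   monitor is actually receiving. Any key advances the color; Esc quits. */

static const COLORREF kColors[] = {
    RGB(255, 0, 0),    /* pure red   */
    RGB(0, 255, 0),    /* pure green */
    RGB(0, 0, 255),    /* pure blue  */
    RGB(255, 255, 255) /* white      */
};
static int g_current = 0;

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        HDC dc = BeginPaint(hwnd, &ps);
        HBRUSH brush = CreateSolidBrush(kColors[g_current]);
        FillRect(dc, &ps.rcPaint, brush);
        DeleteObject(brush);
        EndPaint(hwnd, &ps);
        return 0;
    }
    case WM_KEYDOWN:
        if (wp == VK_ESCAPE) {
            DestroyWindow(hwnd);
            return 0;
        }
        g_current = (g_current + 1) % 4;   /* next test color */
        InvalidateRect(hwnd, NULL, TRUE);  /* trigger repaint */
        return 0;
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
{
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = inst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.lpszClassName = "ColorTest";
    RegisterClass(&wc);

    /* Plain resizable window: drag it to the tinted monitor and maximize. */
    CreateWindow("ColorTest", "VGA channel test",
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                 NULL, NULL, inst, NULL);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return 0;
}
```

If all three primaries look right on their own but the desktop is still tinted, the problem is more likely gamma/driver settings than the cable.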

 

The display is VGA-only, so to connect it to the card I'm using a DVI-to-VGA adapter.

 

At first I thought maybe something was faulty with the GPU, so before my brother left I tried his main monitor, which can connect to the DVI and DVI-D ports directly (without any adapter), and it worked fine with no color issue (while my HDTV was still connected to the HDMI port).

 

So I'm kind of confused as to what's going on, since my brother was able to use this display and its adapter with his computer and he didn't have any color issues like I'm having. I even hooked it up to my Retina MacBook Pro with a Thunderbolt-to-VGA adapter and had no color issues whatsoever.

 

The only difference between my brother's computer and mine is that he's running an AMD GPU (not sure of its model) and Windows 7.

 

Here is a picture of the adapter. I'm not sure if it's supposed to be missing the other half of the pins? But even if that is an issue, why did it work fine when my brother used it with his computer?

 

Sorry for the longish story, and thank you for any advice! I'll try to answer any questions to the best of my ability!

[Image: photo of the DVI-to-VGA adapter]


HOW?

You should find where the other half of the pins went and just replace the adapter.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card. - Total Biscuit
 


It could possibly be the cable; I had this same problem. Try using a different cable or adapter.

Planning on trying Star Citizen (highly recommended)? STAR-NR5P-CJFR is my referral link.


snip

Had the same thing happen to me on a secondary VGA monitor. In my case it had something to do with the monitor itself, because if I moved the cable it would flicker in and out of the blue-tinted state.

CPU: Intel i7 4770k w/ Noctua NH-D15, Motherboard: Gigabyte Z97 Ultra Durable, RAM: Patriot 8GB 1600MHz (2x4GB), GPU: MSI R9 390X Gaming,
SSD: Samsung 840 EVO 1TB, HDD: Caviar Black 1TB, Seagate 4TB Hybrid, Case: Fractal Design Define R4, PSU: Antec Earthwatts 750W
Phone: LG G2 32GB Black (Verizon) Laptop: Fujitsu Lifebook E754 w/ 1TB Samsung 840 EVO SSD Vehicle: 2012 Nissan Xterra named Rocky


The missing pins aren't important: VGA only uses a few pins from the DVI side (hot-plug detect, things like that), and the flat bar plus the four pins around it on the left of the picture are where the analog signals are. VGA connections tend to develop a color tint when a cable wears out. Most likely you'll need to replace one of the VGA portions of your connection setup.
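If replacing the cable or adapter has to wait, a software stopgap (my own sketch, not a proper fix) is to pull the blue channel down with GDI's gamma ramp so the tint is less glaring. SetDeviceGammaRamp and EnumDisplayDevices are real Win32 calls, but the device index, the 0.6 scale factor, and whether your driver honors a per-display ramp are all assumptions here, and no software can restore a red or green signal the cable is physically dropping.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Grab the second display's DC by device name; index 1 is an
       assumption -- enumerate and print DeviceName to be sure. */
    DISPLAY_DEVICE dd;
    dd.cb = sizeof(dd);
    if (!EnumDisplayDevices(NULL, 1, &dd, 0)) {
        fprintf(stderr, "No second display device found\n");
        return 1;
    }
    HDC screen = CreateDC(NULL, dd.DeviceName, NULL, NULL);
    if (!screen) {
        fprintf(stderr, "CreateDC failed for %s\n", dd.DeviceName);
        return 1;
    }

    /* One 16-bit ramp per channel: [0]=red, [1]=green, [2]=blue. */
    WORD ramp[3][256];
    for (int i = 0; i < 256; i++) {
        WORD linear = (WORD)(i * 257);     /* identity ramp, 0..65535 */
        ramp[0][i] = linear;               /* red untouched           */
        ramp[1][i] = linear;               /* green untouched         */
        ramp[2][i] = (WORD)(linear * 0.6); /* blue scaled down; 0.6 is a guess */
    }

    if (!SetDeviceGammaRamp(screen, ramp))
        fprintf(stderr, "SetDeviceGammaRamp failed (driver may block custom ramps)\n");

    DeleteDC(screen);
    return 0;
}
```

The ramp resets on reboot (or whenever the driver reloads), so nothing here is permanent.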

