
Computer doesn't show the color red

Edster30

So, I just finished building my computer, and after I installed everything I needed, I noticed that the color red doesn't show on my monitor. I tried my friend's monitor and it still wouldn't show the color red; everything looked green. Then I bought a new VGA cable to see if that was the problem, but it still didn't work.

My computer has:
- GPU: EVGA GTX 970
- PSU: Corsair 430W
- CPU: AMD FX-6300 Vishera 6-core, 3.5GHz (4.1GHz Turbo)
- RAM: 16GB HyperX
- Motherboard: ASUS M5A97 R2.0 AM3+ (AMD 970 + SB950, SATA 6Gb/s, USB 3.0, ATX, UEFI BIOS)
- SSD: Intel 530 Series 240GB 2.5" SATA III (SSDSC2BW240A4K5)
- HDD: Western Digital Blue 1TB, 7200 RPM, 64MB cache, SATA 6.0Gb/s, 3.5" (WD10EZEX)
- CPU cooler: Cooler Master Hyper 212 EVO with 120mm PWM fan
- Case: NZXT Source 210 Elite, white steel with painted interior, ATX mid tower, black front trim

Please help,

Thank you.


Is the cable plugged in all the way?

CPU: Xeon 1230v3 - GPU: GTX 770  - SSD: 120GB 840 Evo - HDD: WD Blue 1TB - RAM: Ballistix 8GB - Case: CM N400 - PSU: CX 600M - Cooling: Cooler Master 212 Evo

Update Plans: Mini ITX this bitch


Try moving the male end of the cable around while it is connected.

Indus Monk = Indian+ Buddhist


The monitor might not be receiving the red color channel from that monstrosity of an analog VGA cable. As someone suggested before, check the cable pins on both ends (make sure they are clean and straight), get a new cable, or, better yet, upgrade to a digital cable standard.
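
For context, VGA carries each color on its own dedicated wire, which is why a single bent or dirty pin can knock out exactly one color. A minimal sketch of the idea in Python, using the standard DE-15 VGA pin assignments (the diagnose helper is just an illustration, not a real tool):

# Rough illustration of why one bad VGA pin can kill exactly one color.
# Pin numbers follow the standard DE-15 VGA connector pinout.
VGA_COLOR_PINS = {
    1: "red video",
    2: "green video",
    3: "blue video",
    6: "red ground",
    7: "green ground",
    8: "blue ground",
}

def diagnose(bad_pins):
    """Map damaged or dirty pins to the color channel they take out."""
    return [VGA_COLOR_PINS[p] for p in bad_pins if p in VGA_COLOR_PINS]

# A bent pin 1 (or its return on pin 6) drops the red channel entirely,
# which matches the symptom in this thread: no red, everything looks green.
print(diagnose([1]))  # ['red video']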

AMD FX8350  2x ASUS Radeon R9 270X 4GB  ASUS Sabertooth 990FX  -  Crucial Ballistix Sport DDR3 16GB


The solution is to not use VGA; this is exactly the reason it should have disappeared ages ago...


Why are you using VGA?

n0ah1897, on 05 Mar 2014 - 2:08 PM, said:  "Computers are like girls. It's whats in the inside that matters.  I don't know about you, but I like my girls like I like my cases. Just as beautiful on the inside as the outside."


VGA?? This isn't 1990.

My Rig:  CPU: Core i7 4790K @4.8GHz  Motherboard: Asus Maximus VII Hero  RAM: 4x4GB Corsair Vengeance Pro 2400MHz (Red)  Cooling: Corsair H105, 2x Corsair SP120 High Performance Edition, Corsair AF140 Quiet Edition  PSU: Corsair RM 850  GPU: EVGA GTX 980 SC ACX 2.0  Storage: Samsung 840 EVO 120GB, WD Blue 1TB  Case: Corsair 760T (Black)  Keyboard: Razer Blackwidow Chroma  Mouse: Razer Deathadder Chroma  Headset: ATH-M50x  Mic: Blue Yeti Blackout

 


Also, my monitors are really old. Should I buy a new one?


Please use one of the following cables: DVI or DisplayPort.

If your current monitor doesn't have either, please look into purchasing a new monitor with at least one of those video inputs.

Forget VGA, forget HDMI. Why?

VGA is a video signal that was introduced in 1987. The idea was to deliver "LOW COST", full, rich 16.7-million-color video at 60Hz for CRT monitors (the tube-based monitors). I say "LOW COST" because it was OK cost-wise back then, but implementing a digital video signal would have been significantly more expensive, and monitors already cost a fortune at the time, especially if you wanted one that didn't flicker; a "high-resolution" model (1024x768 or 1280x1024) would cost even more. And I don't mean $10 more, I mean possibly $1000 more, or even more than that, as a processor (a required component in a digital circuit design) was very costly. Remember, back in 1997 the fastest Pentium II ran at only 300MHz, and that was near the end of the 90s; things were far slower in the early 90s. So analogue was used. The monitor itself was analogue from A to Z. That is why, until much later, CRT monitors had knobs to adjust brightness and so on rather than on-screen menus: menus would have required digital circuitry, and that simply wasn't an option for anyone. Like I said, way too costly.

But today, and even back in the early days of LCD monitors, things are different. Digital is cheap. LCDs were designed from the ground up to be digital; in fact, they can only operate with digital circuitry. So when you use VGA, this is what happens: your graphics card converts the digital signal to analogue, and it passes through a long cable (your VGA cable), where it picks up all sorts of interference, especially these days when wireless routers, cellphones, satellite TV, and a crap ton of other wireless signals all bombard your VGA cable. Yes, it is shielded, but shielding only goes so far. And besides all this interference, you may be pushing (I don't know your monitor exactly) a very high resolution that is simply too much for the VGA cable. When the signal reaches your monitor, the monitor, being digital, converts it back to digital. But unlike your graphics card, the monitor's conversion circuit isn't high-end; in fact, you can be sure it isn't very good. Each conversion costs quality.
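
To make the "each conversion costs quality" point concrete, here is a toy Python sketch (not the real signal path, just an illustration) of a pixel value going through a digital-to-analogue conversion, picking up noise on the cable, and being re-digitized at the monitor:

import random

def dac(value, bits=8, v_max=0.7):
    """Digital to analogue: map an 8-bit color value onto a VGA-style 0-0.7V level."""
    return value / (2**bits - 1) * v_max

def adc(voltage, bits=8, v_max=0.7):
    """Analogue to digital: quantize the received voltage back to 8 bits."""
    level = round(voltage / v_max * (2**bits - 1))
    return min(max(level, 0), 2**bits - 1)  # clamp to the valid range

def vga_round_trip(value, noise_mv=5.0):
    """One digital -> analogue -> noisy cable -> digital round trip."""
    v = dac(value) + random.uniform(-noise_mv, noise_mv) / 1000.0  # cable interference
    return adc(v)

print(200, "->", vga_round_trip(200))  # often off by a count or two; a digital link returns 200 exactly

Run it a few times and the received value wanders around the original; over a whole frame, that wander shows up as fuzz and color shift.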

Oh, and did I mention that this assumes a perfect VGA cable?

So you see, VGA is really not a good option for anyone. Always use digital.

So, why not HDMI? HDMI is digital, no? HDMI's problem is that it was designed for TVs, not for computers. A TV only needs to support a few internal specs standardized by the HDMI group, which TV manufacturers implement. In the PC space that is not the case. If you are lucky, everything will work perfectly, but usually you need to play with a set of options to get the picture to appear correctly on your monitor: size, resolution, and full color range.
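
"Full color range" here refers to the full-versus-limited RGB range problem: TVs traditionally use levels 16-235 while PC monitors expect 0-255, so a wrong setting crushes blacks or washes the picture out. A quick Python illustration of the mapping, just to show the size of the difference:

def full_to_limited(v):
    """Map a full-range (0-255) value into TV limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

# Pure black and pure white are no longer 0 and 255 after the remap:
print(full_to_limited(0), full_to_limited(255))  # 16 235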

DVI has been out for years; it has been tried and tested. It just works.

DisplayPort has been out for a while now, but was only recently adopted on the mass market, and it can have small issues, like the graphics card thinking you are using a TV instead of a monitor and limiting the color range because of that (fixable in the graphics card control panel), and some graphics cards don't show the BIOS/UEFI screen via DisplayPort. These are all issues that DVI had in its early days, and they will be fixed in time. The mentioned issues are already unusual, and most people don't have problems, but I expect the next generation of GPUs and monitors to make them truly rare.



The reason I used a VGA cable is because I tried connecting a DVI cable to the monitor and it said no signal. I don't know the exact specs of the monitor right now, but I know it's something like a 2010 Dell monitor that is not that good. Do you suggest getting a new monitor? If so, what is a good monitor in the $200-$300 price range?



If you get that, it is because of one of the following:

-> Your graphics card's DVI connector is broken, or the card has a problem (uninstall the drivers and see if that helps; try safe mode; try the latest drivers. If nothing helps, then it could be the card that is busted).

-> The DVI cable is faulty. It's digital: either it all works or nothing does. There are no "2"s in digital; either you have current (1) or not (0).

-> The monitor's DVI plug is broken.


OK, so I just tried the DVI cable and it shows the Dell logo, and then after like 5 seconds it says "no VGA cable".


Good thing I use a TV... HDMI works great for me!

Case: HAF XB | CPU: 4690k | CPU Cooler: NH D15 | Motherboard: Gigabyte Gaming 7 | RAM: HyperX Fury | Video Card: G1 Gaming 970 | SSD: 850 EVO | PSU: Supernova 550 G2


Did you switch the input on the monitor?

 

Ohh, I didn't know you had to do that. How do I do that?


So, what cable do you recommend I use?

DVI, HDMI, DP

We can't Benchmark like we used to, but we have our ways. One trick is to shove more GPUs in your computer. Like the time I needed to NV-Link, because I needed a higher HeavenBench score, so I did an SLI, which is what they called NV-Link back in the day. So, I decided to put two GPUs in my computer, which was the style at the time. Now, to add another GPU to your computer, costs a new PSU. Now in those days PSUs said OCZ on them, "Gimme 750W OCZs for an SLI" you'd say. Now where were we? Oh yeah, the important thing was that I had two GPUs in my rig, which was the style at the time! They didn't have RGB PSUs at the time, because of the war. The only thing you could get was those big green ones. 



Plug in the DVI cable, then swap inputs on the monitor. It should have an "input" or "source" button.



OK, so I just plugged in the DVI cable and changed the input, and the Windows logo showed. Then it said "going into power saving mode", and when I shook the mouse it would not turn on.



OK, great, we are getting somewhere. Press Win+P twice on your keyboard, let go of all keys, and wait 5 seconds. If nothing happens, do it again (Win+P twice). Does your screen come out of sleep mode and show you an image?
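
For reference, Win+P steps through Windows' projection modes. If the hotkey doesn't respond, the same modes can be forced with the DisplaySwitch utility that ships with Windows 7 and later; a minimal Python sketch (running "DisplaySwitch.exe /extend" from the Run dialog does the same thing):

import subprocess

# DisplaySwitch.exe ships with Windows 7+; its flags mirror the Win+P menu:
# /internal = PC screen only, /clone = duplicate, /extend = extend, /external = second screen only
subprocess.run(["DisplaySwitch.exe", "/extend"])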

OK, hold up, let me plug in my DVI cable. I'll be back soon with the response.


OK, so I tried doing Windows key + P but nothing really worked; the power saving mode went away but nothing showed up. When I connected the VGA cable to reply, it showed "Project to a different screen", and the options were "PC screen only" (which it is on now), "Duplicate", "Extend", and "Second screen only".

