
My resolution changed

reitze10
Go to solution: Solved by mariushm

In my previous build I had 2 monitors:

1x Samsung SyncMaster

1x BenQ XL2411

 

They were both 1080p.

 

I built a new PC with an R9 280X and a Xeon X3450.

When I installed the drivers and everything, my Samsung monitor wouldn't go to 1080p, not in the Windows settings and also not with a custom resolution from the AMD software.

When I checked the specs in the monitor's on-screen menu it said 1600x1200, and on the internet it's listed as 1680x1050,

but I'm pretty sure I had two 1080p monitors.

 

What's going on??

 

 

Previous build:

Core 2 Quad

R9 280X

some old mobo

 

New build:

Xeon X3450

R9 280X

GA-H55-UD3H v1

Samsung 850 EVO

EVGA 500 W PSU

 

 


10 minutes ago, reitze10 said:

1x Samsung SyncMaster

When I installed the drivers and everything, my Samsung monitor wouldn't go to 1080p, not in the Windows settings and also not with a custom resolution from the AMD software.

When I checked the specs in the monitor's on-screen menu it said 1600x1200, and on the internet it's listed as 1680x1050,

but I'm pretty sure I had two 1080p monitors.

What model number is the Samsung? That will let us search its specifications and find its resolution.

Chances are you just didn't notice it was a different resolution until you went in to adjust it with the new system.



21 hours ago, Glenwing said:

Have you restarted since installing the graphics card drivers?

Yes, I did.


My guess... the Samsung 2243SN has only a VGA input (D-sub), so most likely you're using a passive DVI-to-VGA adapter to connect this monitor to your card.

A few contacts in the VGA connector let the monitor send its list of supported resolutions (the EDID) to the video card, and that's how the OS typically knows which resolutions to offer. Passive DVI-to-VGA adapters simply rearrange the contacts between the DVI and VGA sides without any processing; there are basically just some wires inside making connections between contacts.

If the adapter doesn't connect those pins on the VGA side that are used to transfer the resolutions, or if the DVI side is not seated properly in the video card (so those pins are not connected), then the video card may not receive the resolutions and will default to some lower ones.

Unplug it and plug it in again, try several times, and blow some air into the DVI connectors in case there's dust inside causing a bad connection.

If that doesn't solve anything, look up which monitors were detected by Windows... if you see a Samsung SyncMaster something there, make sure it's the same model. If it's not, try manually selecting your monitor, or if it isn't listed, maybe pick a monitor that you know for sure supports 1080p.
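If you want to see exactly what the card received from the monitor, Windows keeps the raw EDID block for every display it has detected (including ones connected in the past) in the registry, normally under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\...\Device Parameters\EDID. Here's a rough Python sketch, assuming that registry layout, which lists each detected monitor ID and decodes the native resolution from the first detailed timing descriptor of its EDID:

import winreg

def native_resolution(edid):
    # The first detailed timing descriptor (bytes 54-71 of the EDID) normally
    # describes the panel's preferred/native mode.
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active

base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
    i = 0
    while True:
        try:
            model = winreg.EnumKey(display, i)  # PnP ID; Samsung monitors start with "SAM"
        except OSError:
            break
        i += 1
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        print(model, "->", "%dx%d" % native_resolution(edid))
                except OSError:
                    pass  # this instance has no stored EDID

If your SyncMaster shows up there as 1920x1080, the monitor side is fine and it's the adapter, cable or driver picking the wrong mode; if it shows something else or doesn't appear at all, the EDID never made it to the card.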


17 hours ago, mariushm said:

My guess... the Samsung 2243SN has only a VGA input (D-sub), so most likely you're using a passive DVI-to-VGA adapter to connect this monitor to your card.

A few contacts in the VGA connector let the monitor send its list of supported resolutions (the EDID) to the video card, and that's how the OS typically knows which resolutions to offer. Passive DVI-to-VGA adapters simply rearrange the contacts between the DVI and VGA sides without any processing; there are basically just some wires inside making connections between contacts.

If the adapter doesn't connect those pins on the VGA side that are used to transfer the resolutions, or if the DVI side is not seated properly in the video card (so those pins are not connected), then the video card may not receive the resolutions and will default to some lower ones.

Unplug it and plug it in again, try several times, and blow some air into the DVI connectors in case there's dust inside causing a bad connection.

If that doesn't solve anything, look up which monitors were detected by Windows... if you see a Samsung SyncMaster something there, make sure it's the same model. If it's not, try manually selecting your monitor, or if it isn't listed, maybe pick a monitor that you know for sure supports 1080p.

My video card also has a DVI-D port, so I used a DVI-D to HDMI adapter and then an HDMI to VGA converter.

That worked! Now I have working 1080p,

so I think there is something wrong with my DVI-I port or my DVI-to-VGA adapter.

Thanks!


It's good that it works, but you should be aware that HDMI to VGA converters have a chip inside which decodes the digital signal sent over HDMI and creates the analogue output typical of VGA, and these chips don't always produce the highest image quality. The result may be noisier and often not as "sharp".

The video card itself has the circuitry to produce the analogue signal and routes the wires for that signal to one of your DVI ports; that's how those DVI-to-VGA adapters work, as I said, they only rearrange wires between connectors. What I'm trying to say is that the image quality of the analogue signal coming straight from the DVI connector should always be better than the signal produced by the HDMI-to-VGA converter. If you're watching movies it won't matter, but it may matter if you're doing something that requires good color reproduction and so on.

 

You should try finding another passive DVI-to-VGA adapter, they're cheap, 1-2 bucks... and maybe try using the other DVI port, if both ports are DVI-I (the ones with those 4 extra holes by the side of the flat blade, which are the analogue outputs).

 

