
Installing a graphics card in an old server

ttop191

Hey guys, I posted this over in the graphics card section before realizing it really belongs here, so here it is.

I was gifted a Dell PowerEdge T310 server and am looking to turn it into a media hub for my house, with possibly some lighter gaming (720p, low-to-mid settings).

It's got a Xeon X3460, 16 GB of 1333 MHz DDR3, a 1 TB WD Blue HDD, and a 250 GB WD Blue HDD.

It's currently running a fresh install of Windows 10.

The motherboard's slot 2 is a full-length PCIe Gen 2 x8 slot, and it's the only full-length slot.

The issue I'm having is adding a graphics card. I was also given a GT 1030 (not sure whether it's the DDR4 or GDDR5 version).

I can get into the OS just fine using the onboard graphics, and everything runs fine. However, Device Manager does not show the GT 1030 under Display adapters or anywhere else. When I attempt to install the NVIDIA driver for the card, it fails the compatibility scan, saying the card does not exist.
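
For reference, here's a quick way to list what display adapters Windows actually enumerates, beyond just eyeballing Device Manager. This is a minimal Python sketch, assuming Python is installed; it just shells out to the built-in wmic tool, so take it as a convenience check rather than anything definitive:

```python
import subprocess

# List every display adapter Windows has enumerated via WMI.
# If the GT 1030 is absent here as well as in Device Manager,
# the OS isn't seeing the card on the PCIe bus at all.
result = subprocess.run(
    ["wmic", "path", "win32_videocontroller", "get", "name,status"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

In my case the output only shows the onboard adapter, which matches what Device Manager and the NVIDIA installer are telling me.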

Booting into the BIOS, I can see the card in the IRQ priority settings as Slot 2 "VGA adapter", and I also see a Slot 2 "multimedia" entry.

Additionally, if I disable the onboard graphics, I can boot into and navigate the BIOS without issue using the HDMI out on the 1030. But after the BIOS, I just get a never-ending black screen and never boot into the OS.

I imagine it is something I've got wrong in the BIOS settings but I haven't got a clue where to look.

Sorry for being long-winded, but I wanted to give all the details I could.

Additionally, here is a post on Tom's Hardware where someone seems to have had the same issue, which went unresolved.

 

Thanks
