
All about server rooms

Hey guys,

I'm posting this to ask if there are people on here who run server rooms, or have knowledge about them.

I'm studying IT technology in Denmark, and we have been asked to come up with a theoretical server room surveillance product for a small/medium company. We have no idea what makes a server room tick, so I'm going to ask a few questions, and if you have an answer, please reply to this post. It may be updated as our project develops. Thank you in advance.

What are the main concerns when talking about maintenance? I'm talking both server maintenance and keeping the server room clean. Is that a necessity? Does dust build up?


Server rooms are sealed, with air filtration and air conditioning.

It gets extremely hot in there with all the servers running. Liquid-cooled server rooms don't have this problem, but most aren't liquid cooled.

It's also extremely loud.

NEW PC build: Blank Heaven (minimalist white and black PC) | Old S340 build log "White Heaven" | The "LIGHTCANON" flashlight build log | Project AntiRoll (prototype) | Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


Well, I have a small server "room" at home.

 

There are a few important aspects to a server room:

- stable power, with a UPS if budget allows (see the quick calc below).

- temperature and moisture control (don't let it be an oven, and don't put your servers in a moldy basement).

- good wiring is key, as well as options for upgrading and maintaining that wiring.

- keep it separated from areas that require quiet; 1U servers scream until you're deaf.
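If you want a sanity check on the "stable power" point, here's a rough back-of-envelope sketch in Python. The wattage figures and the 120 V / 15 A circuit are made-up assumptions for illustration (Denmark runs 230 V), and the 80% continuous-load derate is just the common rule of thumb:

```python
# Rough power-budget check for a small server room (hypothetical numbers).
# Assumes North American 120 V / 15 A circuits and the common "load a
# breaker to only 80% for continuous loads" rule of thumb; adjust for
# 230 V regions like Denmark.

CIRCUIT_VOLTS = 120
BREAKER_AMPS = 15
CONTINUOUS_DERATE = 0.8

server_draw_watts = [350, 350, 450, 450]  # made-up per-server draw figures

budget_w = CIRCUIT_VOLTS * BREAKER_AMPS * CONTINUOUS_DERATE  # 1440 W
total_w = sum(server_draw_watts)                             # 1600 W

print(f"Circuit budget: {budget_w:.0f} W, planned load: {total_w} W")
if total_w > budget_w:
    print("Over budget: split across more circuits or get higher-amp breakers.")
```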


I used to work at a datacenter (DataHive in Calgary, Alberta). If there's any advice I can give you, it's this:

 

1) Make sure your server room is adequately cooled, and look at having a redundant cooling system. God forbid your AC breaks down; you'll be cooking your servers.

 

2) Dust can be a major issue in a server room. Look at getting an electric dust/micro filter that can handle twice the room's air volume, if possible.

 

3) Remember, the way you run a server room is an impression of who you are as a person. Keeping it clean, clutter-free, and organized is key. It makes diagnosing issues a breeze when you can walk the floor without stepping all over crap.

 

4) I'm sure you've heard this before, but: CABLE MANAGEMENT. I don't care if it takes forever; label cables and tie them into groups. Trust me, this makes it easier if your core router goes out, or a switch randomly calls it quits (especially if you have 10-20 switches deployed). If you're making your own cables, remember this: measure twice, cut once. Datacenters/server rooms can't afford to have slack/excess cable around, and it makes cleanup much easier.

 

5) Proper power requirements. You can't run many servers on a basic 15A 120V circuit; you need to get a technician in to install multiple high-amp breakers for your server room.

 

6) UPS (Uninterruptible Power Supply) - these are crucial. If you're running backups and the power goes out, that's it, all that data is gone. UPS setups can range from plug-in units to diesel generators, and they will save your ass in the long run.

 

7) If the server room is already built, then take this with a grain of salt, but having floor drains and fire suppression is a must. We all know water doesn't like electricity and vice versa, so if a flood hits or a water line breaks, it'll have somewhere to go. And if your server room ever catches fire, it'll be much easier to contain instead of watching the room burn.

 

8) Humidity/temperature controlled rooms. We already talked about temperature in point 1, but humidity is a killer. Our humidity control was set to hold 45-55% RH, with alarms triggering below 30% or above 70% to either add or remove humidity from the air. Remember this: too much humidity will cause condensation in cold server rooms, and too little humidity will cause static buildup. (A rough monitoring sketch for points 1, 6, and 8 follows below.)

 

If there's anything I missed, I'll update this.
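Since the OP is designing a surveillance product anyway, here's a minimal sketch of the alarm logic behind points 1, 6, and 8. The sensor functions are stand-in stubs (hypothetical, not a real API), the UPS check assumes Network UPS Tools (`upsc`) is installed, the 30-70% band comes from point 8, and the temperature threshold is a placeholder:

```python
# Minimal server-room alarm sketch (hypothetical sensor hookups).
import subprocess
import time

TEMP_ALARM_C = 27.0                        # placeholder; tune to your room
HUMIDITY_LOW, HUMIDITY_HIGH = 30.0, 70.0   # alarm band from point 8

def read_temperature_c() -> float:
    return 22.5  # stub: replace with a real sensor read

def read_humidity_pct() -> float:
    return 48.0  # stub: replace with a real sensor read

def ups_battery_charge(ups: str = "myups@localhost") -> int:
    # Queries Network UPS Tools, assuming NUT is installed and configured.
    out = subprocess.check_output(["upsc", ups, "battery.charge"], text=True)
    return int(out.strip())

def check_once() -> list[str]:
    alarms = []
    t, h = read_temperature_c(), read_humidity_pct()
    if t > TEMP_ALARM_C:
        alarms.append(f"temperature high: {t:.1f} C")
    if not HUMIDITY_LOW <= h <= HUMIDITY_HIGH:
        alarms.append(f"humidity out of band: {h:.0f}%")
    try:
        charge = ups_battery_charge()
        if charge < 50:
            alarms.append(f"UPS battery low: {charge}%")
    except (OSError, subprocess.CalledProcessError):
        alarms.append("UPS unreachable")
    return alarms

if __name__ == "__main__":
    while True:
        for alarm in check_once():
            print("ALARM:", alarm)  # a real product would page someone here
        time.sleep(60)
```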


It's not that loud, but you do have to yell if you're standing right next to the racks.

 

MUST-have feature: trench flooring for cable management, i.e. grates in the floor. It's like a sewer full of Cat 6 goodness.

 

Rack bridges: the opposite of the floor trenches, except they run across the top of the racks so cables are easy to swap and run.

 

One of my pet peeves is labeling: switch port --> patch panel --> network port in the wall --> computer. Without this, I kill people.
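For what it's worth, that label chain is easy to keep in a flat file next to the physical labels. A tiny sketch, where the CSV columns, port names, and hosts are all made up for illustration:

```python
# One way to record the chain above: a flat CSV mapping
# switch port -> patch panel port -> wall jack -> machine.
import csv
import io

PORT_MAP = """switch_port,patch_panel,wall_jack,machine
sw1/gi0/1,pp1-01,room201-A,alice-desktop
sw1/gi0/2,pp1-02,room201-B,bob-desktop
sw1/gi0/3,pp1-03,room202-A,printer-2f
"""

def find_by_wall_jack(jack: str) -> dict | None:
    # Linear scan is fine at server-room scale.
    for row in csv.DictReader(io.StringIO(PORT_MAP)):
        if row["wall_jack"] == jack:
            return row
    return None

print(find_by_wall_jack("room201-B"))
# -> {'switch_port': 'sw1/gi0/2', 'patch_panel': 'pp1-02', ...}
```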

 

Patch panels without short (<2 ft) Ethernet patch cables are a nightmare. There can be no slack cable in the rack, or it looks terrible.

 

Yes, dust buildup is a problem, but not if the room's air is filtered. This is pretty important, since most servers can't just go offline for cleaning.

 

Make sure your air conditioner is badass, and that its intake is filtered if it's drawing air from outside the room (it should be).

 

P.S. Don't put the UPS at the top of the rack, unless you want to kill someone - then it's perfectly fine.

Current: R2600X@4.0GHz\\ Corsair Air 280x \\ RTX 2070 \\ 16GB DDR3 2666 \\ 1KW EVGA Supernova\\ Asus B450 TUF

Old Systems: A6 5200 APU -- A10 7800K + HD6670 -- FX 9370 + 2X R9 290 -- G3258 + R9 280 -- 4690K + RX480


Proper ventilation is a must. Dust is a killer. Redundancy is also a must.

Rig: Thermaltake Urban S71 | MSI Z77 G45-Gaming Intel Core i5 3570K (4.4Ghz @ 1.4v) CM Hyper 212 EVO | Kingston HyperX Fury 8GB | MSI GTX 660 | Kingston 120GB SSD | Seagate 3TB HDD | EVGA 850W B2


Quoting the datacenter post above: "4) I'm sure you've heard this before, but: CABLE MANAGEMENT. I don't care if it takes forever; label cables and tie them into groups. [...]"

So.... like this?

[attached image: 2.jpg]


Documentation is always key imo (especially for cabling).

Everything else has pretty much been said: a raised "double layer" floor for cabling and the cool air from the AC (redundant if possible), and a good UPS system with a generator if it's really important that those servers keep running. Security and access control are also pretty important (a good surveillance system and keycards). Filtering the room's air matters too, because you don't want any downtime for cleaning. Keeping everything you can redundant (maybe even a backup server room) can also save your life!

Backups are also a major concern: keep a backup at another location, do archival backups, etc. We were always told: "think about the restore case!" Essentially, don't think "that backup method is easy to run"; think about how difficult it is to do a restore! (A small sketch of an automated restore test follows below.)

Also, finding the right temperature for your AC can be pretty difficult; 1°C can make a HUGE difference in energy cost!
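On "think about the restore case": here's a minimal sketch of what an automated restore test could look like. The paths, archive naming, canary file, and checksum are all hypothetical placeholders; the point is to actually extract the newest archive and verify a known file, rather than trusting the backup job's exit code:

```python
# Minimal restore test: extract the newest backup into a scratch directory
# and verify one known "canary" file. All paths/values are hypothetical.
import glob
import hashlib
import os
import tarfile
import tempfile

BACKUP_GLOB = "/backups/daily-*.tar.gz"   # hypothetical backup location
CANARY_FILE = "etc/app/config.yaml"       # file expected in every backup
EXPECTED_SHA256 = "..."                   # placeholder: known-good digest

def latest_backup() -> str:
    archives = sorted(glob.glob(BACKUP_GLOB))
    if not archives:
        raise RuntimeError("no backups found - a restore would fail!")
    return archives[-1]

def test_restore() -> bool:
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(latest_backup()) as tar:
            # filter="data" is the safe-extraction mode on newer Pythons.
            tar.extractall(scratch, filter="data")
        restored = os.path.join(scratch, CANARY_FILE)
        with open(restored, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest == EXPECTED_SHA256

if __name__ == "__main__":
    print("restore test passed" if test_restore() else "RESTORE TEST FAILED")
```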


  • 2 weeks later...

Thank you all for the great replies; they have been a great help for our project!

