17 hours ago, scottyseng said:

Also, the other issue is that in some SuperMicro Boards, the min fan speed is 50% or higher (I don't know why they did that). So the fans are running faster than they really need to be.

Because they're designed to be used in a rack probably -- so they don't want to encourage you to be stupid. (just my guess)

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


48 minutes ago, djdwosk97 said:

Because they're designed to be used in a rack probably -- so they don't want to encourage you to be stupid. (just my guess)

Yeah, according to SuperMicro support, they're like that because they prioritize reliability over noise. So even if everything is within temperature limits, they'd rather have the fans run faster just in case. Their desktop boards aren't like that, though. I've heard that if you have a board with IPMI you can hack the min fan speed, but I'm not sure how true that is.
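For reference, the IPMI "hack" mentioned here is usually done with ipmitool. The raw OEM byte sequences below are the ones commonly reported by users for Supermicro X9/X10-era BMCs, not a documented API, and they vary by board and firmware, so treat this as a sketch. By default it only echoes the commands:

```shell
# DRY_RUN=1 (default) prints the commands instead of executing them; the
# "raw 0x30 ..." bytes are Supermicro OEM commands reported by users, not a
# documented API -- verify against your own board before running for real.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "$@" || "$@"; }

# Put the BMC fan control into "Full" (manual) mode:
run ipmitool raw 0x30 0x45 0x01 0x01

# Set fan zone 0 to a 50% duty cycle (last byte = duty in hex, 0x32 = 50):
run ipmitool raw 0x30 0x70 0x66 0x01 0x00 0x32

# Alternatively, lower the fan sensor thresholds so slow aftermarket fans
# don't trip the BMC's full-speed "failsafe" ramp:
run ipmitool sensor thresh FAN1 lower 300 400 500
```

The threshold approach is often the safer one: the BMC keeps its normal profile, it just stops panicking when a quiet fan reports a low RPM.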


4 hours ago, scottyseng said:

Yeah, according to SuperMicro support, they're like that because they prioritize reliability over noise. So even if everything is within temperature limits, they'd rather have the fans run faster just in case. Their desktop boards aren't like that, though. I've heard that if you have a board with IPMI you can hack the min fan speed, but I'm not sure how true that is.

 

5 hours ago, djdwosk97 said:

Because they're designed to be used in a rack probably -- so they don't want to encourage you to be stupid. (just my guess)

Yea, I mean... it's server hardware. There is NO need to be quiet in a server environment, and I fully understand that, but since I'm not in that environment (even though that was the design intention), I want to change it, as many of you do/have as well lol.

Rig: i7 13700k - - Asus Z790-P Wifi - - RTX 4080 - - 4x16GB 6000MHz - - Samsung 990 Pro 2TB NVMe Boot + Main Programs - - Assorted SATA SSD's for Photo Work - - Corsair RM850x - - Sound BlasterX EA-5 - - Corsair XC8 JTC Edition - - Corsair GPU Full Cover GPU Block - - XT45 X-Flow 420 + UT60 280 rads - - EK XRES RGB PWM - - Fractal Define S2 - - Acer Predator X34 -- Logitech G502 - - Logitech G710+ - - Logitech Z5500 - - LTT Deskpad

 

Headphones/amp/dac: Schiit Lyr 3 - - Fostex TR-X00 - - Sennheiser HD 6xx

 

Homelab/ Media Server: Proxmox VE host - - 512 NVMe Samsung 980 RAID Z1 for VM's/Proxmox boot - - Xeon E5 2660 V4 - - Supermicro X10SRF-i - - 128 GB ECC 2133 - - 10x4 TB WD Red RAID Z2 - - Corsair 750D - - Corsair RM650i - - Dell H310 6Gbps SAS HBA - - Intel RES2SC240 SAS Expander - - TrueNAS + many other VMs

 

iPhone 14 Pro - 2018 MacBook Air


4 hours ago, {EAC} Shoot em UP said:

 

Yea, I mean... it's server hardware. There is NO need to be quiet in a server environment, and I fully understand that, but since I'm not in that environment (even though that was the design intention), I want to change it, as many of you do/have as well lol.

Haha, I know. I have this in my room. Yeah, the SuperMicro hot-swap fans are proprietary, so you can't really swap them for slower fans, only for the same model (they have a different pinout/connector). I think the other option was to run a fan controller inside the server, but the RAID or HBA controller will throw an error about there being no fans on the backplane, which got annoying. The fan headers on the backplane and motherboard are normal 4-pin headers, so you can use your own fans there, but yeah, just accept that the backplane and motherboard have aggressive fan profiles.

 

The other thing to note is that if you plan on filling all of the bays with hot 7200 RPM drives, you will need the stock fans, as the drives get really hot; my ghetto be quiet! fan setup isn't strong enough to move the air fast enough. I had to space out my drives because they were overheating.


15 hours ago, scottyseng said:

Haha, I know. I have this in my room. Yeah, the SuperMicro hot-swap fans are proprietary, so you can't really swap them for slower fans, only for the same model (they have a different pinout/connector). I think the other option was to run a fan controller inside the server, but the RAID or HBA controller will throw an error about there being no fans on the backplane, which got annoying. The fan headers on the backplane and motherboard are normal 4-pin headers, so you can use your own fans there, but yeah, just accept that the backplane and motherboard have aggressive fan profiles.

 

The other thing to note is that if you plan on filling all of the bays with hot 7200 RPM drives, you will need the stock fans, as the drives get really hot; my ghetto be quiet! fan setup isn't strong enough to move the air fast enough. I had to space out my drives because they were overheating.

Yea, the 24 bays will not all be used up for quite some time, so I'm not too worried about that. Thanks for the info!



2 minutes ago, {EAC} Shoot em UP said:

Yea, the 24 bays will not all be used up for quite some time, so I'm not too worried about that. Thanks for the info!

Have a fun time with the build. Yeah, I had six WD Re drives stacked in one column...and yeah...under load they got to 60C+

The WD Reds chug along at 45C, even under load.


Is the ranking still a thing? It doesn't seem to have been updated in a few months...

Gaming Rig - ASUS ROG Crosshair VIII Dark Hero, AMD Ryzen 7 5800X (stock), NH-D15 Chromax Black, MSI Gaming X Trio RTX 3070, Corsair Vengeance LPX 32Gig (2 x 16G) DDR4 3600 (stock), Phanteks Eclipse P500A, 5x Noctua NF-P14 redux-1500, Seasonic FOCUS PX-850, Samsung 870 QVO 2TB (boot), 2x XPG SX8200 Pro 2 TB NVMe (game libraries), 2x Seagate BarraCuda ST8000DM004 8TB (storage), 2x Dell (27") S2721DGF, 2x Asus (24") VP249QGR, Windows 10 Pro, SteelSeries Arctis 1 Wireless, Vive Pro 2, Valve Index

NAS /Plex Server - Supermicro SC826TQ-R800LPB (2U), X8DTN+, 2x E5620 (Stock), 72GB DDR3 ECC, 2x Samsung 860 EVO (500GB) (OS & Backup), 6x WD40EFRX (4TB) in RaidZ2, 2x WD 10TB white label (Easy Store shucks), 2x Q Series HDTS225XZSTA (256GB) ZIL & L2ARC mirrored, Ubuntu 20.04 LTS

Other Servers -2x Supermicro CSE-813M ABC-03 (1U), X9SCL, i3-2120 (stock), 8 Gigs DDR3, 4x Patriot Burst 120GB SSD (Raid10 OS array), Mushkin MKNSSDHL250GB-D8 NVMe (game drive), Ubuntu 20.04 LTS - RAID 10 failed after a power outage... dang.


31 minutes ago, Ziggidy said:

Is the ranking still a thing? It doesn't seem to have been updated in a few months...

I think it got updated about a month ago? There hasn't been a huge number of submissions since then.

Looking to buy GTX690, other multi-GPU cards, or single-slot graphics cards: 

 


Might as well remove the old configuration I posted; almost everything about this system is different from the original.
 
Hardware
CASE: Fractal Define R5
PSU: EVGA SuperNOVA 650

CPU: Intel Core i3-6100 3.7 GHz

RAM: 32GB - G.SKILL Ripjaws V Series 

MB: ASRock H170M Mini-ITX

 

HBA CARD: LSI 9207-8i

INTEL SAS 36 port expander

HDD 1: 1x 500G WD Blue 2.5" (OS)

HDD 2: 12x 1TB  WD Red 2.5"

HDD 3: 4x 3TB WD Green 3.5"

 
 
Software and Configuration:
My server is running CentOS 6.8 
I'm using mdadm to make my arrays.
The OS drive is the WD Blue (encrypted with luks).
My main array consists of 2 sets of six 1TB drives in RAID 5.
The WD Green drives will be in RAID 5 and used as storage for backups.
 
Usage:
I use the reds as storage for iscsi targets and/or kvm storage.
The greens are for keeping second copies of all the files I want backed up.
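A sketch of how that mdadm layout might be created. The device names are hypothetical (the post doesn't give the real ones), and the helper only echoes the commands, so the sketch is safe to run as-is:

```shell
# Hypothetical device names: the twelve 1TB 2.5" Reds as sdb..sdm, the four
# 3TB Greens as sdn..sdq -- substitute your own. run() echoes rather than
# executes, so nothing is touched by running this script.
run() { echo "$@"; }

# Two six-drive RAID 5 sets from the 1TB Reds, as described above:
run mdadm --create /dev/md0 --level=5 --raid-devices=6 \
    /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf /dev/sdg
run mdadm --create /dev/md1 --level=5 --raid-devices=6 \
    /dev/sdh /dev/sdi /dev/sdj /dev/sdk /dev/sdl /dev/sdm

# The four 3TB Greens as the backup array:
run mdadm --create /dev/md2 --level=5 --raid-devices=4 \
    /dev/sdn /dev/sdo /dev/sdp /dev/sdq

# Persist the layout so the arrays assemble at boot (CentOS 6 path):
run mdadm --detail --scan   # append this output to /etc/mdadm.conf
```

Each six-drive RAID 5 set gives 5 x 1TB usable, so the two Red arrays come to roughly 10TB before filesystem overhead.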
 
 
Photos:
 

Can Anybody Link A Virtual Machine while I go download some RAM?

 


  • 2 weeks later...

Just to let you all know, I have bought a SuperMicro 6017R-N3RF4+ and put my NAS system in it.

I plan to buy another eight 8TB drives soon, which will get me well over 200TB :P

Respect the Code of Conduct!

>> Feel free to join the unofficial LTT teamspeak 3 server TS3.schnitzel.team <<

>>LTT 10TB+ Topic<< | >>FlexRAID Tutorial<< | >>LTT Speed wave<< | >>LTT Communities and Servers<<


2 minutes ago, looney said:

Just to let you all know, I have bought a SuperMicro 6017R-N3RF4+ and put my NAS system in it.

I plan to buy another eight 8TB drives soon, which will get me well over 200TB :P

Haha, just when I thought someone would have you beat for the #1 spot...

 

Congratulations though.


On 9/29/2016 at 0:06 PM, brwainer said:

I think it got updated about a month ago? There hasn't been a huge number of submissions since then.

Perhaps I didn't meet some requirement. I posted just about 3 months back, but it hasn't been added to the rankings, unless I'm just overlooking it.



5 hours ago, Ziggidy said:

Perhaps I didn't meet some requirement. I posted just about 3 months back, but it hasn't been added to the rankings, unless I'm just overlooking it.

One of the mods usually adds a note saying when they update the listings


 


  • 2 weeks later...
On 10/17/2016 at 7:55 PM, brwainer said:

One of the mods usually adds a note saying when they update the listings

Looks like it may be a while...


 


  • 2 weeks later...

Hardware

CASE: Fractal Design Node 804

PSU: Seasonic SS-760XP

MB: Asus P9D-M

CPU: Intel Xeon E3-1231 v3

HS:  Noctua NH-C14S

RAM: 2 * 4GB DDR3L ECC (Kingston KVR16LE11S8/4KF)

RAID CARD 1: LSI SAS9211-8i (flashed to IT mode)

SSD: 2 * Samsung SSD 850 PRO 128GB

HDD 1: 8 * WD 6TB red (WDC WD60EFRX-68L0BN1)

Software and Configuration:

My server is running FreeBSD 11 and I'm using ZFS to manage a RAID 10 pool. 

My array consists of eight 6TB WD drives in RAID 10. This gives about 21TB of usable space from the pool.

Currently I am using Samba to connect to my desktop; I will be setting up iSCSI to share space with an ESXi host.

Usage:

I use the storage for movies and series; Plex runs on the server and I connect to it for local playback.

It will be used as a datastore for an ESXi host. 

Backup:

Hopes and dreams.

Additional info:

I'm looking at increasing the memory and running bhyve since the CPU is woefully underutilized. I would mostly be shifting the services into VMs and possibly running related VMs on it. 

I used RAID 10 for ease of recovery and lower overhead on high write operations. 

I use identical sized partitions in my ZFS pool rather than direct drive size as this prevents possible issues if a replacement drive is a different size by a MB or two.
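The partition-instead-of-whole-disk trick can be sketched like this on FreeBSD. The device names (da0..da7), GPT labels, and the exact partition size are illustrative, and the helper echoes the commands rather than executing them:

```shell
# Striped-mirror ("RAID 10") pool on fixed-size partitions. Device names,
# labels, and the -s size are illustrative placeholders; run() only echoes,
# so this is safe to execute as a dry run.
run() { echo "$@"; }

# Partition each disk a little under its nominal size so a replacement
# drive that is a few MB smaller still fits (repeat for da1..da7):
run gpart create -s gpt da0
run gpart add -t freebsd-zfs -a 1m -s 5588g -l disk0 da0

# Four two-way mirrors striped together -- ZFS's RAID 10 equivalent:
run zpool create tank \
    mirror gpt/disk0 gpt/disk1 \
    mirror gpt/disk2 gpt/disk3 \
    mirror gpt/disk4 gpt/disk5 \
    mirror gpt/disk6 gpt/disk7
```

Using the GPT labels rather than raw device nodes also means the pool survives disks being shuffled onto different controller ports.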

Photos:


4 deep on each stack. The SSDs are installed in a section in the front panel. I installed the PSU upside down and couldn't be bothered to rewire it all after I realized. 

On the plus side, I never have problems with heat. 


Nothing fancy, but it keeps it off the floor so it doesn't suck up as much dust and lint. 


Pool status and capacity. 

 

EDIT: Yes, I'm bad for running as root. I was lazy and had to install and modify some services.


Just for 'fun' I thought I'd throw in the SAN I'm using at work (I just upgraded our infrastructure from an old VMware nightmare to a shiny new Hyper-V beastie that doesn't fall over if you look at it too hard).

 

So yes, this is a just for fun, because, well, this is small/medium business/enterprise grade stuff:

 

Hardware

HP MSA2040
2 x Controllers
Each with: 1 x 1 Gbit Ethernet (management), 4 x SFP+ adaptors (populated with 2 x 16 Gb Fibre Channel and 1 x 1 Gbit iSCSI)
Redundant power supplies
24 SFF SAS drive bays connecting to a 12Gb/s SAS controller. The controller has a supercapacitor and flash memory backup for writes during power loss.
 
Bays 1 - 16: HP 2.5" 600GB SAS 15K Drives.
Bays 17 - 24: HP 2.5" 1.2TB SAS 10K Drives.
 
Software and Configuration:
 
HP MSA Storage Management Utility v3 (web based). I think it's BSD based, but it might be Linux.
 
Bays 1 - 8 Configured as a storage volume, RAID 5 for about 4.2TB
Bays 9 - 16 Configured as a storage volume, RAID 5 for about 4.2TB
Bays 17 - 24 Configured for remote archive (from our DR system) for about 8.4TB.
 
Total storage about 16.8TB
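Those volume sizes follow from RAID 5's single-disk parity overhead: usable = (drives - 1) x drive size. A quick sanity check with a one-line awk helper (no vendor tools assumed):

```shell
# RAID 5 usable capacity in GB = (drive count - 1) * per-drive size in GB.
raid5_gb() { awk -v n="$1" -v size="$2" 'BEGIN { print (n - 1) * size }'; }

raid5_gb 8 600    # bays 1-8:   7 * 600 GB  = 4200 GB (~4.2 TB)
raid5_gb 8 600    # bays 9-16:  same again  = 4200 GB (~4.2 TB)
raid5_gb 8 1200   # bays 17-24: 7 * 1200 GB = 8400 GB (~8.4 TB)
```

4.2 + 4.2 + 8.4 lines up with the ~16.8TB total quoted above.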
 
Controllers are cross connected (no FC switch) to 2 x HP ProLiant DL 360p Gen8 servers running Windows Server 2012 R2 (each server has 1 connection from controller A and one from controller B) using 16Gb Fibre Channel HBAs.
 
Usage:
 
Hyper-V storage for 14 Windows servers, including failover clustering/AlwaysOn for SQL Server 2016, Exchange 2016, IIS and so on.
It also remotely clones itself once an hour to an identical device in our DR data centre.
 
Photos:
I thought I had some on my phone, but apparently I don't!
 
All told it's good, solid, and stable, and it deals with all the bandwidth we can throw at it. It's difficult to properly bench something like this (while it's in a live environment, anyway), but I've seen IOPS over 4000, and a 5GB file copies at about 300-500MB/second, obviously more than enough to saturate our 1Gbit network.
 
Anyway, that config (without servers, just the SAN + Drives) comes in at around $12,000USD (at today's exchange rate anyway).
 
Thought you guys might enjoy seeing how much cheaper and probably more capable your builds are compared to that!
 
The price we pay for 'zero' down time...

Alrighty then, time for an update!

 

Note: The list of noteworthy builds will now also hold the decommissioned builds.

 

@MyNameIsNicola Updated. I hope I got it right, so many drives. :D

@scottyseng Updated

@maxtch Added your second system, updated your first one.

@b3nno Added system to list. Nice box!

@Jonny Updated.

@Ziggidy Nope, rankings are not dead, but yes, it does usually take me a while to get around to them. Added your system to the list. Thanks for the entry!

@username6465 Updated.

@leadeater Answered your question in chat already, but just for public info: We'd count each system in a clustered file system as a single entry, if it qualified for the list.

@FattyDave Added your system to the list. Thanks!

@{EAC} Shoot em UP Added your new system to the list, relegated the old one to the secondary list (no, we don't delete posts). The pics in your new post seem to be broken though! :( As for the noise: I put noise dampening material into our server, it actually made quite a difference. Wasn't cheap though.

@unijab Added your new system, moved old one to secondary list.

@kerradeph Added system to list. Very neat.

 

 

It would also appear that there are a few entries which I've overlooked. The thread seems to be becoming a bit unwieldy; I'm going to think about whether I can come up with a solution for that. Wouldn't want this to keep happening. Apologies to everyone below.

@paps511 Added.

@brwainer Fixed that and added to list. Apologies.

@weeandykidd Overlooked your last update, fixed that as well.

@Ramaddil Updated.

 

Lastly:

@Bhav If you post the rest of the system info as per @looney's template, that seems like it would qualify for the list.

@saitir Obviously we won't be starting to add work systems to the rankings (would be a bit unfair), but that does sound pretty neat, so if you ever do get a hold of some pictures... wouldn't mind seeing those. ;)

 

BUILD LOGS: HELIOS - Latest Update: 2015-SEP-06 ::: ZEUS - BOTW 2013-JUN-28 ::: APOLLO - Complete: 2014-MAY-10
OTHER STUFF: Cable Lacing Tutorial ::: What Is ZFS? ::: mincss Primer ::: LSI RAID Card Flashing Tutorial
FORUM INFO: Community Standards ::: The Moderating Team ::: 10TB+ Storage Showoff Topic


3 minutes ago, wrathoftheturkey said:

Why?

Primarily to simplify things (though some people have such a hodgepodge of drives it can still be pretty time-consuming to enter their config into the system). But yes, it is of course completely arbitrary.



@alpenwasser I actually have two submissions in this post, completely separate systems. In the rankings you linked to my first system's post, but only put in the statistics of the second, smaller system :( And by now I've actually increased the capacity of both those systems and added a third separate system that is also eligible, so I will make a new post making the specs of all 3 clear.


 


5 minutes ago, brwainer said:

@alpenwasser I actually have two submissions in this post, completely separate systems. In the rankings you linked to my first system's post, but only put in the statistics of the second, smaller system :( And by now I've actually increased the capacity of both those systems and added a third separate system that is also eligible, so I will make a new post making the specs of all 3 clear.

Haha, alright, sounds good. I'll update the list when you make the new post.



On 11/19/2016 at 4:36 AM, alpenwasser said:

 

@{EAC} Shoot em UP Added your new system to the list, relegated the old one to the secondary list (no, we don't delete posts). The pics in your new post seem to be broken though! :( As for the noise: I put noise dampening material into our server, it actually made quite a difference. Wasn't cheap though.

 

 

Fixed the pix. As for noise, I am about to swap out the PSUs for my "old" RM750. I recently upgraded to an RM650i, but my rig doesn't turn the 750's fan on, which I know is fine for the PSU, but it was dumping heat into my case, and it was noticeable! I don't run my fans at high RPM since everything is on water, so the heat was just stagnant in there.



On 11/19/2016 at 1:36 PM, alpenwasser said:

Obviously we won't be starting to add work systems to the rankings (would be a bit unfair),

Probably a good thing.... They just installed a new storage closet at work... 2 PB redundant space, to put alongside the existing 1.2 PB closet :D


1 hour ago, MG2R said:

Probably a good thing.... They just installed a new storage closet at work... 2 PB redundant space, to put alongside the existing 1.2 PB closet :D

I'm not saying you can't post pics though. You know, for science, or stuffs. :D


