Alexey Koznov

Member
  • Posts

    7
  • Joined

  • Last visited

Reputation Activity

  1. Funny
    Alexey Koznov reacted to 101m4n in Repurposing old dl180 server motherboard, ATX PSU?   
    Massive necro incoming, but I figured I'd leave the solution here anyway for anyone else that comes along.
    I too ended up running into the BIOS problem, then solved it and promptly forgot about this thread (my bad).
    Anyway, the dl180/se316m/x1600 are actually all the same server (more or less), with the same BIOS. The BIOS update comes packaged as a utility that runs from within Windows; it's a bit weird, but it does actually work.
    You can find the updater on the HPE website here.
     
    P.S. There was some confusion for me early on in this project as to which board is which, this should clarify:
    dl180/se316m/se326m/se1220/x1600(?):
    Some of these have the Integrated Lights-Out (iLO) connector (RJ45) built into the board; others have a small daughter board that plugs into the board separately (see the small plug next to the button midway up the right side of the board in this picture). I tried briefly to use it, but it's a nightmare: it's flaky as hell, I couldn't get it to open in anything but IE, and a good chunk of the functionality is locked behind a license. So if you ask me, it's no great loss. I ended up using ordinary wake-on-LAN to power up my nodes.
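For anyone curious, ordinary wake-on-LAN is just a broadcast "magic packet": 6 bytes of 0xFF followed by the target NIC's MAC address repeated 16 times, usually sent to UDP port 9. A minimal Python sketch (the MAC below is a placeholder, not one of the actual nodes):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a WoL magic packet: 6 bytes of 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the packet on the conventional WoL port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# Placeholder MAC -- substitute the target NIC's real address before calling wake().
pkt = magic_packet("00:11:22:33:44:55")
print(len(pkt))  # 102 bytes: 6 + 6*16
```

The target's BIOS/NIC has to have WoL enabled for this to do anything, and the packet only reaches machines on the local broadcast domain.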
     
    dl160 (and probably a slew of other names too...):
    I have no experience with this board, though greenie214 (from earlier in this thread) does.
     
    P.P.S. The cluster, janky though it may be, is very much still kicking!
    It has been (and continues to be) an interesting project:
     
  2. Agree
    Alexey Koznov got a reaction from leadeater in Looking for a Backup solutions   
    Yea, +1 to Veeam. It's pretty good for backing up all kinds of stuff, plus if you want to get into virtualization, Veeam will be a good solution to work with.
  3. Like
    Alexey Koznov reacted to NoxOps in LTT Storage Rankings   
    Hardware:
    SERVER: IBM System x3650 M4 v2
    CPU: 2 x Intel Xeon CPU E5-2680 @ 2.70GHz
    RAM: 384GB
    NIC: 3 x Intel X540-T2 10GbE Dual Port Ethernet Adapter
    RAID CARD 01: LSI SAS9207-8e 8-Port External HBA with "IT" firmware
    RAID CARD 02: LSI SAS9207-8e 8-Port External HBA with "IT" firmware
    RAID CARD 03: IBM M1015 with "IT" firmware
    STORAGE: 3 x 36 Bay SuperMicro CSE-847E26-R1400LPB
    SSD 01: 2 x 960GB SanDisk Ultra II (SDSSDHI-960G)
    SSD 02: 6 x 1TB Samsung SSD 860 EVO (MZ-76E1T0B/AM)
    HDD 01: 66 x 5TB Toshiba X300 (HDWE150XZSTA)
    HDD 02: 18 x 8TB WD My Book (WDBBGB0080HBK-NESN) shucked Ultrastar He10-8 SATA (WD80EZAZ)
    SSD total capacity: 7.92TB (raw)
    HDD total capacity: 474TB (raw)
     
    Software and Configurations:
    My IBM server is running FreeNAS 11.1-U7. The IBM backplane is connected to the internal IBM M1015 with the eight SSDs, which FreeNAS has set up as four mirrored vdevs with two drives per vdev; that pool is used for the virtual machines.
     
    Next I have two LSI SAS9207-8e cards that connect externally with three SAS cables to the three 36-bay SuperMicro CSE-847E26-R1400LPB chassis, which hold a total of 84 drives. The 84 drives are set up as RAID-Z2, with 14 vdevs of six drives each.
     
    I present both zpools to VMware using iSCSI, and the RAID-Z2 pool is also shared over SMB.
     
    Usage:
    I use the storage to run all my virtual machines in VMware and to hold my large media; basically I run my own personal Dropbox (Seafile), email, game servers, etc.
     
    Backup:
    I back up all the virtual machines to my RAID-Z2 pool, an external QNAP device, Google Drive, and an LTO-6 tape drive.
     
    Additional info:
    I currently have three ESXi hosts connected to this storage; each ESXi host has two Intel X540-T2 10GbE cards to give me 40Gb of throughput per host. My ISP provides a 10Gb internet connection to the house.
     
    Photos/FreeNAS Info:
      pool: Raidz_2_SATA
     state: ONLINE
      scan: scrub repaired 0 in 1 days 19:02:24 with 0 errors on Tue Jul 16 19:02:30 2019
    config:
        NAME          STATE READ WRITE CKSUM
        Raidz_2_SATA  ONLINE 0 0 0
          raidz2-0  ONLINE 0 0 0
            gptid/c0d5e371-15a7-11e8-ab59-a0369f484a68  ONLINE 0 0 0
            gptid/731d04df-9fd7-11e7-8ebe-3440b5b0233c  ONLINE 0 0 0
            gptid/25c4fcf6-9283-11e7-8ebe-3440b5b0233c  ONLINE 0 0 0
            gptid/e078d3f2-1bee-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/707e7c41-891c-11e7-8d2a-3440b5b0233c  ONLINE 0 0 0
            gptid/dd4e684f-9f14-11e7-8ebe-3440b5b0233c  ONLINE 0 0 0
          raidz2-1  ONLINE 0 0 0
            gptid/cf88ceb0-165c-11e8-ab59-a0369f484a68  ONLINE 0 0 0
            gptid/08c08b57-1724-11e8-ab59-a0369f484a68  ONLINE 0 0 0
            gptid/f57817a3-17f3-11e8-ab59-a0369f484a68  ONLINE 0 0 0
            gptid/c266fc1a-18c9-11e8-ab59-a0369f484a68  ONLINE 0 0 0
            gptid/a1d42969-193e-11e8-b2bc-a0369f484a68  ONLINE 0 0 0
            gptid/ce9ac33c-19a5-11e8-b2bc-a0369f484a68  ONLINE 0 0 0
          raidz2-2  ONLINE 0 0 0
            gptid/8a305bc3-b5fd-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
            gptid/7e1eadaa-b64c-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
            gptid/fafc4062-b6d7-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/41a15fff-0a22-11e9-890f-a0369f484a68  ONLINE 0 0 0
            gptid/bdd63c8a-b77e-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/a4449699-b7d1-11e8-b121-a0369f484a68  ONLINE 0 0 0
          raidz2-3  ONLINE 0 0 0
            gptid/58a70f71-b763-11e6-9118-3440b5b0233c  ONLINE 0 0 0
            gptid/8769c8e0-c325-11e6-9118-3440b5b0233c  ONLINE 0 0 0
            gptid/e190fb70-c29e-11e6-9118-3440b5b0233c  ONLINE 0 0 0
            gptid/b4313200-c0a8-11e6-9118-3440b5b0233c  ONLINE 0 0 0
            gptid/bfe5c8f8-a796-11e6-9118-3440b5b0233c  ONLINE 0 0 0
            gptid/eb24c94c-1d82-11e8-b2bc-a0369f484a68  ONLINE 0 0 0
          raidz2-4  ONLINE 0 0 0
            gptid/0ebeae35-d9f5-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/5c9b81c3-e50b-11e7-a57a-a0369f484a68  ONLINE 0 0 0
            gptid/e1738310-daf0-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/e2c55c3f-d8cb-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/62de80ff-d79d-11e7-a3c8-a0369f484a68  ONLINE 0 0 0
            gptid/ca314311-d62e-11e7-a3c8-a0369f484a68  ONLINE 0 0 0
          raidz2-5  ONLINE 0 0 0
            gptid/c55c76b7-d9f4-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/beacf4e9-dbf8-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/6297aa25-daf0-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/9f689504-d8cb-11e7-b15a-a0369f484a68  ONLINE 0 0 0
            gptid/722cf68d-d71e-11e7-a3c8-a0369f484a68  ONLINE 0 0 0
            gptid/85d6f320-d62e-11e7-a3c8-a0369f484a68  ONLINE 0 0 0
          raidz2-6  ONLINE 0 0 0
            gptid/2a5228d8-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
            gptid/2b8c0b23-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
            gptid/2cc53e86-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
            gptid/2e0403b9-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
            gptid/2f3e74bf-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
            gptid/3076d876-8a12-11e5-9905-3440b5b0233c  ONLINE 0 0 0
          raidz2-7  ONLINE 0 0 0
            gptid/70c3d74e-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
            gptid/72c3e734-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
            gptid/74c1d3e1-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
            gptid/76a320ea-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
            gptid/787028e4-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
            gptid/7a4744e8-b33a-11e5-83e2-3440b5b0233c  ONLINE 0 0 0
          raidz2-8  ONLINE 0 0 0
            gptid/df118278-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/e00ae5d7-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/e10dc1c8-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/e2121190-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/e306c84f-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/e3f9b4dc-ed2e-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
          raidz2-9  ONLINE 0 0 0
            gptid/456df18a-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/465657e6-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/47418cad-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/4826457b-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/490ee34f-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
            gptid/4a01eb1f-ed2f-11e5-aaaf-3440b5b0233c  ONLINE 0 0 0
          raidz2-10  ONLINE 0 0 0
            gptid/39be88a8-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/3b34d341-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/3c5d28df-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/3d7920e3-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/3edca0cf-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
            gptid/3ff30819-1beb-11e7-a096-3440b5b0233c  ONLINE 0 0 0
          raidz2-11  ONLINE 0 0 0
            gptid/fa7ec530-b64c-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
            gptid/6e60fc6e-b6d8-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/e2265afd-b77f-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/b8ed3d10-b724-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/162c05d7-b7d3-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/f4ed5ea8-b5fc-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
          raidz2-12  ONLINE 0 0 0
            gptid/71c4646d-b77f-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/444ad7d3-b725-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/f952906a-b6d8-11e8-b121-a0369f484a68  ONLINE 0 0 0
            gptid/85674958-b64d-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
            gptid/8788fd86-b5fc-11e8-bc7a-a0369f484a68  ONLINE 0 0 0
            gptid/1d9eb7c6-b7d2-11e8-b121-a0369f484a68  ONLINE 0 0 0
          raidz2-13  ONLINE 0 0 0
            gptid/af9f90f6-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0
            gptid/b12bb3dd-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0
            gptid/b2c15ad1-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0
            gptid/b44ffec6-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0
            gptid/b5ebeadd-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0
            gptid/b7aa1120-9069-11e8-b9d5-a0369f484a68  ONLINE 0 0 0

      pool: Raidz_VMware_SSD
     state: ONLINE
      scan: resilvered 467G in 0 days 00:24:06 with 0 errors on Fri Jul 19 19:45:32 2019
    config:
        NAME              STATE READ WRITE CKSUM
        Raidz_VMware_SSD  ONLINE 0 0 0
          mirror-0  ONLINE 0 0 0
            gptid/e4a59571-aa7e-11e9-80cf-a0369f484a68  ONLINE 0 0 0
            gptid/4584cc52-aa84-11e9-80cf-a0369f484a68  ONLINE 0 0 0
          mirror-1  ONLINE 0 0 0
            gptid/4dfca7d9-8da5-11e9-b1aa-a0369f484a68  ONLINE 0 0 0
            gptid/799d7a4c-8e7a-11e9-b1aa-a0369f484a68  ONLINE 0 0 0
          mirror-2  ONLINE 0 0 0
            gptid/00ae5d7e-ec99-11e5-a41c-3440b5b0233c  ONLINE 0 0 0
            gptid/02072c05-ec99-11e5-a41c-3440b5b0233c  ONLINE 0 0 0
          mirror-3  ONLINE 0 0 0
            gptid/3a47f312-a266-11e8-826f-a0369f484a68  ONLINE 0 0 0
            gptid/3b4a66a9-a266-11e8-826f-a0369f484a68  ONLINE 0 0 0

  4. Agree
    Alexey Koznov reacted to brandishwar in Slow 10GB network transfer speed with SFP+   
    You haven't wasted anything. You've at least gained the ability to saturate the HDDs on your NAS, whereas the Gigabit connection was holding you back. That's the single main reason to go 10GbE. And if you made the leap and bought a 10GbE switch (though I doubt that, since you said you've only spent $150), everything on your network will be able to talk without running into Gigabit caps all over the place.
     
    I have a Gigabit Internet connection, so having a 10GbE backbone for my home network means I can do whatever I want on the home network without throttling whatever my wife is doing online. We've actually tested that: she downloaded a game from Steam while I copied several large video files from the NAS, and neither of us was throttled. For me the movies copied at about 300MB/sec, but again that's because you can't saturate 10Gb with only a few HDD clusters. And my wife's game download nearly saturated our Internet connection. I call that a WIN even if I'm not getting 1 gigabyte per second to or from the NAS.
     
    And I could probably copy video files from my HDD storage on my desktop to the NAS while downloading a game from Steam and see a combined throughput of several hundred megabytes per second. I haven't tried that, but I'm sure I could do that without issue.
     
    And that, again, is the main reason to go 10Gb. Even with just HDD storage, the fact you're not capped at gigabit speeds means you're still coming out ahead, even if you're not saturating the network connection to your NAS. Now you just need to figure out how to expand your NAS storage.
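The arithmetic behind "coming out ahead" is easy to sketch: gigabit caps you at roughly 125MB/s, below what even a modest HDD array can deliver, while 10GbE leaves headroom. A rough back-of-the-envelope in Python (line rates only, ignoring protocol overhead; the 50GB batch is a made-up example size):

```python
def transfer_seconds(size_gb: float, link_gbps: float) -> float:
    """Time to move size_gb gigabytes over a link running at link_gbps gigabits/s."""
    return size_gb * 8 / link_gbps

batch_gb = 50  # hypothetical batch of large video files

print(transfer_seconds(batch_gb, 1))    # 400s at gigabit line rate (~125 MB/s cap)
print(transfer_seconds(batch_gb, 10))   # 40s at 10GbE line rate
# At the ~300 MB/s (2.4 Gb/s) the HDDs actually delivered:
print(round(transfer_seconds(batch_gb, 2.4), 1))  # ~166.7s -- the disks, not the network, are the limit
```

So the 10GbE link takes gigabit's hard 125MB/s ceiling out of the picture, and the transfer runs at whatever the disks can sustain.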