Reputation Activity

  1. Like
    alpenwasser reacted to MafiaMoe for a blog entry, Water Cooling: The Build   
    Now that the water cooling components have arrived, the building has really begun.

    First out of the boxes were the heat sinks. The CPU water block was the easiest I have installed to date, while the GPU predictably took some time. The overall GPU size shrank significantly, revealing the nice lower motherboard heat sink once again (Yay!).

    Next step was to build the custom cables and rework the wiring through the case. I think these white/blue accents add quite a bit of style to this build, and being able to customize the look makes the build a little more unique.

    Finally, the water cooling prep truly began. Installing the last major components of the loop hit a snag with the large radiator. The selected radiator has a built-in reservoir, which added to its length, so it could not be installed correctly along the top. Thankfully, the NZXT case's extra-long screw mounts let me shift the radiator forward. I couldn't get all the screws in, but there should be enough to keep the fans from vibrating and causing noise. This did mean that one of the front case fans had to be permanently uninstalled.

    The pump and reservoir were purchased as a set and assembly was simple. They will be quite hidden from view. Before I started bending, I looked for a way to position the pump in a way that would minimize the number of bends in the case. This would make my job of bending the tubes easier and keep the look of the case as clean as possible. I found that mounting the pump on a drive tray, as originally planned, worked well and only needed to be raised about 7mm on washers.
    Installing the fittings also added that final bit of chrome, and the selected blue fittings brought a little more of the color accent into the case. I believe this gave the build a fairly consistent look.

    The tube bending took a considerable amount of time. Hardline tube bending is not a trivial matter and requires lots of measuring. Thankfully I purchased the complete bending kit from Monsoon, which came with measuring tools, but it took over an hour until I had enough confidence in the measurement to bend and trim my first tube. The process of measurement was tricky, but I believe that simply has to do with the awkwardness of making measurements in a confined 3D space where the rulers can't be secured to anything.
    As for the bending, it took a few tries until I was able to make bends with millimeter accuracy. I kept to the simple bends at first, making sure that the minor mistakes I was making wouldn't ruin the looks. I saved the most noticeable double bend for last, and the result was more than acceptable. I would say that anyone who has worked around power tools and is a little patient would have about the same experience after bending and cutting only 5 tubes.

    I test fit the bent tubes before attaching the tube ends. I went with Monsoon fittings, which require the extra step of gluing the ends onto the tubes with adhesive before the tubes are secured in the case. Once an end is on, there is no undoing the adhesive and getting it back, and the same goes for the tube it is connected to. I would highly advise a full test fit before gluing any ends onto a tube. A Dremel tool may also come in handy, as some tube ends might need to be sanded down and flattened to ensure the end caps adhere straight.
    Once the test fit was done, I followed the instructions on securing the end caps from YouTube videos published by Performance PCs ( https://www.youtube.com/channel/UCEHPVLaMY9zeDOynltKhgdA ).

    This is also about the time I noticed that a good number of my fittings had spots where the blue paint had chipped off, revealing the chrome underneath. I had worked over a tile floor and had dropped two fittings, but somehow 5 of the fittings had significant chipping, which must have mainly come from handling them. Oils from hands can be quite harsh on finishes, but the quality of these 'premium' hardline fittings was simply not up to my expectations. I placed the blemished fittings around the case in locations that would be less noticeable, but I wonder if this is the norm for Monsoon Free Center colored fittings or if this set skipped a step during manufacturing...
    Once the tube ends were cured to the tubes, I installed them and realized just how clean this particular build ended up. I managed to keep the water cooling loop toned down and simple, while still color coordinating it with the rest of the build. The water cooling loop still dominates the center of the case, but the Y created between the pump and the water blocks is symmetric and looks like the rather elegant configuration I was aiming for. This was my first time doing a custom water cooling loop at all, so a clean result was far from guaranteed.
    At this point I tried to pressure test the loop, but made a horrible rookie mistake in my eagerness to complete the system. I forgot to attach anything to the bottom of the GPU water block... When my pressure test was inconclusive and I decided to proceed with the 'paper towel' method, a fair bit of water doused the SSD just below the GPU after I turned on the pump for a quick second. I was using an external power brick to run the pump, so thankfully the power supply, which was also under the GPU, was not powered or plugged in. I quickly cleaned up and found that no water had hit any crucial components. The coolant is PrimoChill and is not conductive, but I will still be letting everything dry out just in case.

    I continued filling the loop by filling the res and then running the pump a second at a time until there was enough water in the loop to keep the pump on. While watching the bubbles flow through the loop, I noticed a bit of water leaking from below one of the GPU's rotary fittings. Second rookie mistake: I hadn't tightened the rotary fitting down to the block enough, and I could clearly see space below the o-ring... Another reason to wait for the system to dry out, even though it looks like the water didn't touch any part of the GPU. I tightened the rotary fitting down, cleaned up the drops of water, and continued cycling the water.
    I ran the system like this for a little over an hour with no more signs of leaks anywhere. It looks like it may be a day or two before I try to boot again, but I am willing to be a little patient for this project. Overall, I am very happy with how this computer looks and would go so far as to say it might pass as a professional build. I'll post a summary when/if the computer is up and running.

  2. Like
    alpenwasser reacted to wpirobotbuilder for a blog entry, A Potential Dual-1366 Machine   
    In terms of raw performance, you can pick up some older dual-socket CPUs that'll beat out modern quad cores at the same price point (though they'll use more power).
    The build will probably be a ZFS storage box at first, and if I upgrade later on I might use it to host virtual machines or as a computing node or something.
    Intel X5650 @2.66 GHz (x2) - Each one consumes about the same amount of power as a 4670K and has the same raw performance (though much less performance per clock)
    24GB DDR3 1333 ECC Memory (3x8GB) (x2) - The motherboard I'm looking at has 6 slots and supports up to 96GB of memory. However, 96GB of registered ECC memory would cost an arm and a leg, and I found 24GB kits for relatively little, so I'm going with 48GB to fully populate the board. For a ZFS build, a lot of RAM will make it speedy, and if it becomes an ESXi machine, the extra RAM will help with VM hosting.
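    The RAM math above can be sanity-checked with a quick sketch (a back-of-envelope only; the "~1GB of RAM per TB of raw storage" figure is a common ZFS rule of thumb for the ARC, not a hard requirement, and the 4TB drive size is an assumption for illustration):

```python
# Two 3x8GB kits across the board's 6 slots (numbers from the build notes).
dimms_per_kit, gb_per_dimm, kits = 3, 8, 2
total_gb = kits * dimms_per_kit * gb_per_dimm
print(total_gb)  # 48 -- fully populates the board, half its 96GB ceiling

# Common ZFS rule of thumb: ~1GB of RAM per TB of raw storage, so 48GB
# would comfortably cover, say, six hypothetical 4TB disks (24TB raw)
# with plenty left over for the ARC cache and a few VMs later on.
raw_tb_covered = total_gb * 1  # TB of storage the rule of thumb supports
```

With those numbers the board is maxed out slot-wise while staying well under the cost of a full 96GB of registered ECC.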
    ASUS Z8NR-D12 - I ended up not being able to find a Z8NA-D6, but Newegg is selling this model for $200. What's more, it's being sold from a reputable reseller with some RMA support. Unfortunately it limits the cases I can install it in.
    RM 850 - It's got two 8-pin CPU connectors, which are required for the motherboard I'm using. I'm looking for one with a lower wattage, and the Seasonic 520W might fit the bill (I'm looking for confirmation).
    NVIDIA GT 520 - The board doesn't come with a graphics adapter, and I have one lying around.
    I'll probably pick up an LSI 9211 if I end up needing more than 6 drives.
    The drives will probably be WD Red or Seagate NAS; my research shows little reason to choose an Se over a Red, with the exception of peak performance. However, my intention is to have a lot of data hit the cache first before going to disk, so peak disk performance doesn't matter as much.
    If the system performs well enough, I might pick up an Intel I350-T4 (4x1Gb) or an Intel X540 (10Gb NIC). A network iSCSI target for game footage, video editing, and backups would help a good deal if the array can push upwards of 300 MB/s.
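    A quick line-rate calculation (a sketch only: raw bit-rate conversion, ignoring TCP/iSCSI protocol overhead) shows why a single gigabit port would be the bottleneck for a 300 MB/s array:

```python
# 300 MB/s from the array, converted to link terms (1 byte = 8 bits).
array_mb_s = 300
needed_gbit_s = array_mb_s * 8 / 1000  # 2.4 Gb/s required at the NIC

per_link_mb_s = 1000 / 8               # a single 1Gb link tops out at ~125 MB/s
i350_t4_mb_s = 4 * per_link_mb_s       # 4x1Gb aggregated: 500 MB/s theoretical
x540_mb_s = 10_000 / 8                 # 10GbE: 1250 MB/s theoretical

print(needed_gbit_s, i350_t4_mb_s, x540_mb_s)  # 2.4 500.0 1250.0
```

Note the 4x1Gb figure is an aggregate: a single iSCSI session over plain link aggregation still caps at one link's speed unless multipath I/O spreads it across ports, which is one argument for going straight to 10GbE.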
    I need an SSI EEB case. A full tower ATX seems like a waste, so I'm looking at the GD07 from Silverstone. I might go rackmount, but I'm having trouble finding a rackmount case with SSI EEB compatibility. Here's a 2U one from Norco. A 1U would be ideal but is much harder to find power supplies for (and is also much louder).

  3. Like
    alpenwasser reacted to Whaler_99 for a blog entry, The Network...   
    Every good system boils down to the network. This is the underlying foundation for your entire setup. If this isn't rock solid and supports everything you need to do, it doesn't matter how fast that rig of yours is, you will still lag as your switch drops packets due to a network flood.
    Most people seem to think that when they need one, the thirty dollar switch is just as good as the fifty dollar one. In most cases they are correct, when you only have a few connections not pushing the limits of the switch. But a two hundred dollar one? In networking, most times you get what you pay for. There are reasons why we have five hundred dollar switches and five thousand dollar ones. Of course, some brand names carry their own costs, <cough> Cisco <cough>, but that is another story... Why do I rant away on this? So you might better understand why I have some of the gear I have. Because when it comes down to it, I do not want the network to be the cause of any issues in the house.
    So, first off, I have my main network hub in the crawl space. Coming into the house I have my cable connection, whose modem I have reset to bridge mode (don't get me started on these crap cable and DSL modem/router units), and I use my own firewall.
    The firewall is a higher end SonicWall unit. Why this? Due to my job, certifications, and partnerships, I have access to this and other gear at very low cost for personal use and training. So voila... For those of you that know, I am running all the security services on the unit.
    Feeding off the firewall is what I'm referring to as my "server" switch. This is a mid range HP 24-port gigabit switch with a very decent backplane. Off this switch are all my secondary connections, as well as my Folding farm (which also resides in the crawl space), my Hyper-V server, my unRAID server, and my voice server. I also have a redundant one sitting there. Why... just in case, and because I could.
    From this switch I lead out to:
      • My wireless AP, an Aruba unit, powered via PoE, which is nice. I ran cabling up through the wall and roof, and this sucker sits nicely hidden away on the main floor.
      • An HP gigabit switch, a specific edge model, at my main desk. I have a few of my desktops there, test beds, etc.
      • A low end HP gigabit switch at the main media area; hooked into this are an HTPC, a media streaming device, and a few other things.
      • A powerline adapter for one of the bedrooms. I always wanted to test this and was having wireless issues in this one spot, so voila. And it works really well.
    All the networking gear in the crawl space runs off a nice UPS, mostly to protect against power surges. Don't forget a good power surge unit for that shiny gear.
    All in all, a fairly complicated setup and more than what 99% of home users need, but I work a lot with this gear, so I also do lots of testing and such with it. You should hear people scream when I reboot the firewall to test something.
    As you can see, my underlying network is more than adequate to support the throughput I may have from various sources. The system gets hit hardest when, say, all of the kids are streaming movies from the unRAID server (I use Plex, what an awesome product), along with some buddies and lots of folks on the voice system, and then myself playing some game or other... But everything works without a hiccup.
    Now I just need the cable company to stop calling me because I am flooding their network.
    The "network room"
    Closer in...
    Here you can see the firewall, switch, cable modem and VoIP unit, along with a spare switch. PDU unit...
    And here is what my Aruba AP looks like...
    Yep.. in the floor... In case you're wondering, that is not a ducted vent - it's just open to the basement to let air pass between floors. Very handy.