
BHJohnson

Member
  • Posts

    76
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Male
  • Location
    United States
  • Interests
    Computers (Team Red, though I swing both ways with GPUs), Powerlifting/Olympic weightlifting, cars (JDM with or without LS swaps), many fantasy/sci-fi universes, playing jazz, listening to rock, anything you can make in a slow cooker
  • Biography
    The initials stand for Big Hairy Johnson, because I am big, I am very hairy, and my name is Johnson. Not sure what else you could possibly think that means. I've got a wide array of interests, and I geek out about all of them. My educational background is mostly self-taught, with half of a mechanical engineering degree and half of an engineering physics degree (which unfortunately don't add up to a whole degree), plus my training as a nuclear power operator, which is seriously not as cool as it sounds.
  • Occupation
    Nuclear operator. It's not that cool.

System

  • CPU
    Ryzen 5 2600
  • Motherboard
    MSI X370 Gaming Pro Carbon
  • RAM
    G Skill Trident Z 2x8 GB
  • GPU
    EVGA FTW ACX 3.0 (water cooled, so no ACX)
  • Case
    Thermaltake View 71
  • Storage
    WD Black NVMe M.2 500 GB boot drive, 4 x WD Blue 1 TB 7200 RPM HDDs on a RocketRAID controller (RAID 5)
  • PSU
    Antec HCP-1000W
  • Display(s)
    3 x Asus VS248, 1 x Asus VK248
  • Cooling
    420 mm radiator, EK CPU block, Bykski GPU block, EK D5 pump/reservoir combo
  • Keyboard
    G Skill Ripjaws KM780
  • Mouse
    Roccat Nyth
  • Sound
    Cheap Logitech Speakers and Turtle Beach headset
  • Operating System
    Windows 10

Recent Profile Visitors

509 profile views
  1. This seems at least interesting to look at. I might even find a use for it doing external backups to cheap 2.5" hard drives? Idk, it's cheap enough, it'll fill the slot, and it might turn out useful down the road. Thanks!
  2. I'd save the money and replace the board/CPU/RAM outright. The best CPU upgrade you can do is to a 3770, which doubles your threads, and going from 4 to 8 makes a difference in most modern stuff. But saving that money, finding another 50-100$, and getting a Ryzen 2600 or something similar with a dirt cheap B350 board and a 2x8GB kit of garbage bin 3200/CL16 memory will serve you better. I guess that wouldn't work if the Optiplex 7010 doesn't use standard PSU cables, which I'm gonna guess is true. You might be kinda stuck. Regardless, I recommend you put the money you would've spent under your mattress, do that a couple more times over the next few months, and then get an upgrade you'll actually notice.
  3. You might want to check for a 3070 Ti on eBay. They're pretty consistently in the 500$ shipped range. That's US, so your options might be more limited or more expensive, but that's what I've seen in the past week or so. Another option is a 21:9 monitor. I use a Viotek that was pretty reasonably priced and is a massive upgrade over any of my 16:9 panels. Hardware Unboxed did a review of it, and they liked it too. I like Minecraft at the wider resolution, and the higher pixel count doesn't significantly impact performance.
  4. Because the build is honestly most of the fun. Gonna do some angle iron, some grinding, it'll be a good time. My favorite build I've done was in an old-school Antec Dark Fleet from 2012, and I'm trying to take that build one step further in the modding challenges. And the 4000D is boring. Great for someone looking to build a computer in half an hour flat, boring for someone who already has a View 71 and just wants something different to work on. Aquaero 6. I am not aware of a better fan controller, and definitely not one that fits in a 3.5" bay.
  5. I'm rebuilding a computer for the first time in literal years. I'm downsizing from a View 71 to an 80$ Rosewill Challenger. I'm using Jayz's motto of "where there's a Dremel, there's a way", putting a 420mm radiator on it, and shoving a 5900X and a 3070 Ti in it, at least until I can get a 6800 XT for a decent price and the 3070 Ti moves to the HTPC. I'm also putting a bunch of hard drives in the internal drive slots, an Aquaero 6 in one of the 5.25" bays, a dual 5.25" reservoir with a D5 pump, and a card reader/USB-C hub in one of the external 3.5" slots. That leaves one external 3.5" slot empty, and that feels wrong. Currently, I'm thinking of just putting this in and making it a single 2.5" SSD scratch drive. The NVMe drive is 1TB, but it might be nice to add a 2TB SATA SSD as a middle ground for game installs and active working files between the spinning rust array and the zoomzoom drive. Aside from that, my other thought is maybe another USB hub, but that doesn't seem super useful. I haven't found anything else that's particularly fun looking, and I'm wondering what other cool stuff people have found for 3.5" external drive bays. And no, I'm not putting a floppy on it. That's on the Bulldozer computer. I don't need two. Budget (including currency): 100$ for dumb things, up to 500$ scaling with the usefulness. Country: US Games, programs or workloads that it will be used for: Normal daily driver gaming and video editing.
  6. Problem with the base PC: HP makes literal garbage. See Craft Computing's recently posted video for a general summary of what you might be fighting through by trying to save a buck or two. Expect a lot of proprietary stuff that will be a pain to replace or that won't work with other parts you try to add to it. Even something as simple as the cooler will be proprietary. GamersNexus has also done some coverage of HP systems in more detail. I would be very wary of anything from them.
  7. I let out a hard "oof" when I saw that 970 chipset. Though the UD3 isn't a terrible board, at least for that era of AMD. Cooler: Noctua NH-U12. If that's too much money, Hyper 212. If you want something more high end, get a real CPU before you worry about getting a better cooler. Anything more expensive (and arguably even the NH-U12 itself) is wasted on Bulldozer/Excavator era CPUs. And I say that having built a full open loop with dual radiators and lots of case modding around an FX 8350 and RX 580 (it's a Nitro+, so I feel justified). The cost to benefit just isn't there past a basic 120mm tower cooler. Fans: Gonna stay on the Big Brown... Fan. Definitely fan. No other noun works in that sentence. Noctua NF-A12. There is no better general purpose fan made. They even come in black now, if that's your thing for your... fans. If that price tag is too much, you can generally get an EK fan cheap on clearance. Check here: https://www.ekwb.com/shop/clearance/clearance-radiators-fans. Vardars are currently 13.00$ each, and easily do 13.00$ worth of work. The Meltemis and Furies are good too, if you don't mind your PC sounding like it's spinning up for takeoff. But a Noctua fan will last you several builds, so I wouldn't flinch at the $30+ price on one. Memory: G.Skill Ripjaws. Just search "g. skill ripjaws ddr3" on eBay and you'll find lots of very affordable options. Generally, you should be able to get 1600/CL9 or better.
  8. So what exactly is "radically" different about the drive designs? I know their intended purposes, but are the drive heads different? Is the firmware different? Does it use cache differently, or prioritize writes over reads to a detrimental point? What is actually different? I haven't found any specifics about it, just the generic "surveillance vs NAS" framing that doesn't really tell me much. Yes, for now I'm capturing/encoding and editing on the same machine. I'll have a 1TB NVMe drive with the OS and program files on it, so I was thinking of just copying the raw footage onto the NVMe for use as a scratch drive while editing to smooth out the process. Or using a 2TB NVMe if needed. And yes, I will be using either an Avermedia or Elgato capture card to take 21:9 1440p and encode it/pass it through.
  9. I'm going with 2TB drives for budget. I could in fact just throw 250$-400$ per drive at the problem for 12-15 drives and have a lot of storage, but two arrays at 6-8TB each will give me plenty to work with for now, I can always scale up later, and I'll be using a third array of 4TB drives to act as an in-chassis backup for the time being. I was using a single 2TB drive as my capture storage and it was doing fine, so going to 12TB total storage will be plenty for now. I'm not doing a single 12TB drive, which would in fact be cheaper, because this is as much a learning experience as anything else, and I just want to try it. I'm not using ZFS or TrueNAS because this will be primarily a video capture/encoding and editing machine, so I want to run Windows natively and not in a VM. I could use Windows's built-in drive array setup, but no. Eventually, the goal is to have a second machine for those mundane non-NAS tasks, so I can just use this machine to run the NAS and whatever network service VMs I feel like implementing. At that point a ZFS or TrueNAS setup in a VM is in fact the right answer. But I want to, and I quote myself, "use used enterprise hardware raid cards, which isn't a great plan but I want to fight inanimate objects." I can't fight inanimate solid state objects that just work, with no firmware, driver, and UI compatibility issues. I find I learn stuff when I do things, so I don't expect this to go smoothly at all. Also, yes, if I was doing this for professional or practical reasons, you make good points. To get 12TB of storage in 2TB drives, it's 600$ for RAID 5, not counting controllers. To get it in 12TB drives, it's 450$ for RAID 1 (rough math sketched after this list). The big-drive approach gets relatively more cost-effective once you move up to RAID 5 or 6. Software RAID makes it easier to replace drives and expand array volumes. It also makes hot spares easier and more practical, and means you can make a share drive that you don't have to manage through Windows's file sharing system. But this is not for practical purposes. I'm also watercooling the CPU and GPU, in case you had any remaining doubt about what this build is definitely not.
  10. I'm picking drives for a NAS/video capture rack. I'm planning on making an attempt to use used enterprise hardware RAID cards, which isn't a great plan, but I want to fight inanimate objects. I bought one 2TB drive each of: WD Red Plus, WD Red Pro, WD Purple, and Seagate IronWolf. I then benchmarked them all using CrystalDiskMark. The results were pretty much stacked the same as their prices, except that the WD Purple, despite being the cheapest, was only beaten by the Red Pro. So I'm kinda leaning towards the Purples. They're 25$ cheaper than the Pros, which over the scale of 12-15 drives adds up. A couple reviews with RAID benchmarks show they don't have any scaling issues. Is there a downside that I'm missing to using the Purples over Red Pluses?
  11. Again, the NVMe card was just for seeing if a solution existed, not for purchasing in the next six months. I'll cross that bridge when/if I get there. And I had a couple Mellanox cards open on eBay, just hadn't gotten around to checking compatibility. Do they generally just drop in and go?
  12. I lied, I still had the product page open for that NVMe RAID card. It's only an x8 connection, but otherwise here it is.
  13. I closed that tab a few hours ago and don't care enough to dig it up again. It's a PCIe 3.0 x16 card with 4 M.2 x4 slots on it that supports 0/1/10/JBOD. It was 200-300$ for the card, maybe a bit more. Wasn't really looking to buy it, just looking to see what options are out there. The network as a whole will only ever be 10 GbE. The only chunk that I am thinking of doing as 40 GbE is a direct connection from my NAS server made out of workstation parts to my workstation that will be made out of prosumer parts, with no switches and only one crossover cable between their NICs. I had a whole bunch of math to show why I wanted 40 GbE for this, before I realized that it's way simpler than I was making it (the numbers are sketched after this list): a 40 GbE NIC takes up about 60-70% of a PCIe 3.0 x8 pipe on a single connection. That PCIe 3.0 x8 pipe has the same throughput as a PCIe 4.0 x4 pipe, which happens to correspond to the fastest consumer NVMe drives on the market. And while the NAND on those drives doesn't saturate that connection, I would rather not have my network be the limiting factor on transfer speed, which a 10 GbE connection very well could be, depending on how fast the SSD is. In RAID, they definitely can be bottlenecked by the connection. Also, this is not supposed to be practical. This is literally all just an amusing project. I will never make money on this setup; I am just doing this to build some stuff and do some stuff I've never done before. So while I don't "need" it, I also don't "need" a Miata with a Silverado's V8 in it, and yet there's one in my garage. Why? Because I can, that's why. Also also, it'll be fun to test what network speeds and latencies various configurations of hardware and settings can get.
  14. You're talking about their QSFP cards? I've found a couple of them, just want to make sure that they'll work with relative ease. This one seems relatively easy to find and pretty cheap, even after the 8" fiber cable, so if I can find drivers that people have gotten to work on Windows 10 I'd be down to try it and at worst burn 60$.
  15. I'm considering using an NVMe RAID controller at some point, which uses a PCIe 3.0 x16 connection. Four NVMe drives in RAID 5 can absolutely saturate a 10 GbE connection, and the cost difference in networking hardware isn't much (the storage hardware for that setup is kinda dumb, admittedly). I'm gonna do some more research to make sure whatever I buy is more or less compatible, and if I can't find a relatively simple 40 GbE solution then I'll drop down. My fight with the H310 RAID card is kinda taking the fighting spirit out of me. We'll see. The actual connecting of the two machines is a very long way off, as the desk this is all going in doesn't exist yet, much less the chassis themselves. I was just gathering data on potential solutions for whenever I do get there.
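
A quick sketch of the RAID cost comparison from post 9. The per-drive prices here are assumptions (roughly 85$ per 2 TB drive and 225$ per 12 TB drive), picked only so the totals land near the 600$ and 450$ figures quoted in that post; the capacity formulas are the standard ones for RAID 5 and RAID 1.

```python
# Back-of-envelope version of the RAID cost math in post 9.
# Drive prices are rough assumptions, not quotes.

def raid5_usable(n_drives, drive_tb):
    """RAID 5 loses one drive's worth of capacity to parity."""
    return (n_drives - 1) * drive_tb

def raid1_usable(drive_tb):
    """RAID 1 mirrors, so usable space is a single drive's capacity."""
    return drive_tb

# Option A: seven 2 TB drives (~85$ each) in RAID 5 -> 12 TB usable
n_small, price_small = 7, 85
cost_a = n_small * price_small
usable_a = raid5_usable(n_small, 2)

# Option B: a mirrored pair of 12 TB drives (~225$ each) -> 12 TB usable
cost_b = 2 * 225
usable_b = raid1_usable(12)

print(f"RAID 5: {usable_a} TB usable for ~{cost_a}$ ({cost_a / usable_a:.0f}$/TB)")
print(f"RAID 1: {usable_b} TB usable for ~{cost_b}$ ({cost_b / usable_b:.0f}$/TB)")
```

The mirrored big drives come out cheaper per usable TB here; parity only starts favoring them further once you can spread it over more large drives in RAID 5 or 6, which seems to be the point of the "gets more cost-effective" remark above.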
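
And the bandwidth arithmetic behind post 13, as a sketch. These are nominal figures (PCIe 3.0/4.0 per-lane throughput after 128b/130b encoding, Ethernet at raw line rate) and ignore protocol and driver overhead, so real-world numbers will sit a bit lower.

```python
# Back-of-envelope version of the bandwidth comparison in post 13.
# Nominal line rates only; overhead is ignored.

GBIT_TO_GBYTE = 1 / 8                      # 8 bits per byte

# PCIe per-lane throughput after 128b/130b encoding, in GB/s
PCIE3_LANE = 0.985
PCIE4_LANE = 1.969

pcie3_x8 = 8 * PCIE3_LANE                  # ~7.9 GB/s
pcie4_x4 = 4 * PCIE4_LANE                  # ~7.9 GB/s, same size pipe as above

nic_40gbe = 40 * GBIT_TO_GBYTE             # ~5.0 GB/s line rate
nic_10gbe = 10 * GBIT_TO_GBYTE             # ~1.25 GB/s line rate

print(f"PCIe 3.0 x8 : {pcie3_x8:.2f} GB/s")
print(f"PCIe 4.0 x4 : {pcie4_x4:.2f} GB/s")
print(f"40 GbE      : {nic_40gbe:.2f} GB/s "
      f"(~{nic_40gbe / pcie3_x8:.0%} of a PCIe 3.0 x8 link)")
print(f"10 GbE      : {nic_10gbe:.2f} GB/s")
```

That works out to 40 GbE filling roughly 63% of a PCIe 3.0 x8 link, matching the 60-70% figure above, while 10 GbE at ~1.25 GB/s is well below what even a single fast NVMe drive can push sequentially, let alone an array.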