Optimoos

Member
  • Posts

    12
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Optimoos's Achievements

  1. I'm considering replacing my current triple monitor setup with a pair of ultrawides. I have two old Dell U2410s, one on each side, and an Asus PG279 as my main monitor. The PG279 will move to a dedicated gaming rig and this system will become my workstation - primarily network and systems administration, so color accuracy is nice but not a requirement for my work. I really can't deal with fewer than 1200 vertical pixels, so 1080p panels are out. I'd prefer a refresh rate higher than 60Hz, but 144Hz is not a necessity; I think anything over 75Hz would probably be fine. Similarly, I'm an IPS panel snob, but as that tends to skyrocket the price I'm willing to look at more budget-friendly options. The last consideration is that I'll likely stack them vertically, so VESA mounts are a necessity. An alternative to a dual ultrawide setup would be picking up three 16:9 2560x1440 monitors - if I can do that for less than two ultrawides, I'd probably consider it as well. All of this is complicated by current global supply issues, as it seems everything is out of stock. For now, I'm happy with recommendations that fit the requirements above so that I can pounce when supply sorts itself out. Thanks!
  2. Seems like the perfect time to drop this info - I've been doing a bunch of evaluation on home storage as I prepared to rebuild my home NAS. This included going way overboard and doing a bunch of diskspd testing against FreeNAS, Unraid, and Windows. I was going to do a long write-up on the whole process, but when the results started coming back I figured my nerd cred would get obliterated, so I dropped it. This video is begging for the details to be shared, though. Essentially, I wanted to test what an "amateur hobbyist" could expect to get performance-wise out of these three platforms. This meant mostly next-next-nexting through install and configuration, finding a document that described the setup I was going for, following it, and testing the results. The goal was to use the SSDs to provide enhanced performance in whatever way made sense to the platform. For FreeNAS, this meant a pair of them acting as cache and a pair acting as journal drives; in Unraid they were assigned as cache disks; in Windows they were configured as tiered storage in a storage space (mirrored SSD tier, parity HDD tier). Tests were run multiple times with various options to diskspd; these options are all recorded in the results spreadsheet. In all cases I was testing against a CIFS/SMB share.

     Here are the hardware details:

     Server system:
     • AMD 3400G on ASUS Prime X570-Pro
     • 32GB DDR4-3200
     • WDC WDS250G1B0C (250GB boot drive)
     • 4 x Samsung 860 EVO 500GB SSD (on motherboard SATA)
     • 8 x WDC WD2003FYYS 2TB HDD (on LSI 9305-16i HBA)
     • Intel X520 10Gb NIC

     Testing system:
     • i7-4790K @ 4GHz on Z97X-UD3H
     • 16GB DDR3-1600
     • Windows 10 Pro
     • Intel X520 10Gb NIC

     Network switch: Cisco 3750X w/C3KX-NM-10G

     Without further ado, here are the results: https://docs.google.com/spreadsheets/d/17S0K3_VM2vIDYFKhjz4QJS18S_VaMHRjhbQ75CBTaHI/edit?usp=sharing Multi and Single denote whether diskspd was using a single target file for reads/writes or multiple (4) files.
If people are interested I can share the PowerShell script I wrote for the tests, but it's basically just a bunch of loops iterating over the config options listed in the spreadsheet. As for the results... yes, that's right: at the very least for my testing setup, Storage Spaces kind of wiped the floor with everything else.
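For the curious, the shape of that test harness is roughly this - a minimal Python sketch of the same looping idea (the actual script was PowerShell; the parameter values, paths, and share name here are illustrative assumptions, not the options recorded in the spreadsheet):

```python
# Sketch of the diskspd test loop: enumerate parameter combinations
# and build one command line per run. Values below are examples only.
import itertools

DISKSPD = r"C:\tools\diskspd.exe"        # assumed install path
TARGET = r"\\nas\share\testfile.dat"     # hypothetical CIFS/SMB target

block_sizes = ["4K", "64K", "1M"]        # -b: I/O block size
write_ratios = [0, 50, 100]              # -w: percentage of writes
outstanding_io = [1, 8, 32]              # -o: outstanding I/Os per thread

def build_command(bs, w, o):
    """Assemble one diskspd invocation for a given configuration."""
    # -t4: 4 threads, -d60: 60s duration, -Sh: disable caching,
    # -L: collect latency stats, -c10G: create a 10GB test file
    return (f"{DISKSPD} -b{bs} -w{w} -o{o} -t4 -d60 "
            f"-Sh -L -c10G {TARGET}")

commands = [build_command(bs, w, o)
            for bs, w, o in itertools.product(block_sizes,
                                              write_ratios,
                                              outstanding_io)]
# 3 x 3 x 3 = 27 runs; the real script would execute each command,
# capture diskspd's output, and record the numbers for the spreadsheet.
```

In the PowerShell original this is just nested loops invoking diskspd directly; the point is that each spreadsheet row corresponds to one combination of options.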
  3. Alright, I'll keep that in mind, thanks again. Good point, that makes it unsuitable. Too bad, although you're right about the case problem there too.
  4. Hah, amazing! That's probably overkill for what I need but I'll definitely keep it in mind, thanks for pointing that one out.
  5. Yeah, then it's just figuring out a case to make that work in.
  6. Yeah, Threadripper might be the better route. The ASUS ROG Strix X399 has 4 x16 slots and 1 x4 slot. Of course, no integrated graphics means giving one of those up to video. What's the PCIe lane situation in Intel land? Should I be investigating that route instead?
  7. Looking at potentially 4-6 cards, as I initially mentioned. While there are plenty of motherboards with 3x x16 slots, which I understand I can install x4 cards into, additional slots beyond that on any motherboard I've seen have been x1, and therefore pretty useless for my purposes.
  8. I'm trying to build a workstation machine which is going to require multiple PCIe x4 cards. I've searched what feels like high and low and have been unable to find any AM4 motherboards with anything more than 3 x16 slots. Do any AM4 boards exist with fewer (or even no) x16 slots but 4-6 x4 slots? Am I relegated to finding a riser/splitter for this task? If so, does anyone have recommendations on a riser or splitter, and furthermore a good tower case to install it all in?
  9. What I was trying to get across there is that I can't simply swap the OS and use FreeNAS.
  10. I've got a machine acting as my home storage; it's got 8 drives attached to an old Adaptec card as the storage array and 3 SSDs attached to the onboard controller for the OS and applications. I'd like to remove the Adaptec card from the scenario because if it fails I'm going to have a fun time replacing it. Unfortunately, I've also got video capture hardware in this system that only works under Windows, which somewhat limits my options, and the system is currently running Windows 10 Pro. Is Microsoft's software RAID effective? Are there better options?
  11. Huh, that's super random but also precisely what was going on for me. Unfortunately it seems like I've fixed one problem and moved on to the next - now the system will post and boot fine, but during adapter initialization the card spits out "Microsemi BIOS is unable to continue due to insufficient EBDA memory space". Once in the OS it appears fine and even sees the drives attached, but it's not detecting the previous logical configuration which it should. I'll spend some more time troubleshooting this later, just wanted to say thanks for getting me over the first hurdle.
  12. I've got a system I use as a server which has an Adaptec 5805 in it. The 5805 is a bit ancient and not officially supported under Windows 10, which is what I'm using as the OS, and I've been experiencing device failures under heavy I/O - the drives presented by the 5805 just stop being accessible, although the device doesn't complain in the event log or anything like that. In hopes of resolving this, I thought I'd modernize the card with the suggested replacement, a Microsemi (Adaptec) 3152-8i. However, I ran into a completely different problem when trying to install the new card - my system will no longer POST.

      The motherboard is a GA-Z77X-UD3H with an i5-2500 on BIOS F20e. This is a beta BIOS, but it's also the only one that deals with booting off the on-board controller properly while any additional storage controllers are connected. The error code off the board indicates the system is halting during memory initialization, at which point it power cycles, swaps the active BIOS, tries again, fails, and repeats. I'm typically seeing code 08 or 15 when this happens. I tried pulling memory and swapping combinations in all flavours of single and dual channel, but the result is the same every time - the system boots fine without the card and gets nowhere with it.

      I'm open to the possibility of this truly being a memory/motherboard problem, but it's a very strange manifestation, as the system is rock solid otherwise (apart from the aforementioned occasional storage failure under load, which doesn't smell like a memory problem). I asked Microsemi, as their compatibility charts (while incomplete in the consumer board space) only mention the Z87 chipset as supported, not the Z77. As expected, their response was essentially that it should be fine but no guarantees, so not particularly helpful. Anyone have any thoughts? Any other information I can provide?