
looking to build m.2 only server

ZaziNabu
7 minutes ago, mariushm said:

If you get a motherboard that supports bifurcation, you can use an adapter board to split the PCI-E x16 slot into 4 M.2 PCI-E x4 slots. Since it's a server, the video card can go into a PCI-E x1/x2/x4 slot off the chipset.

So you'd have 4 + 1 dedicated M.2 connector from the CPU, and on boards with 2 connectors the 2nd connector gets PCI-E lanes from the chipset... so you get 6 M.2 slots in total.

 

Keep in mind that you'll be limited by the network speed... 10 Gbps ethernet is 1.25 GB/s, and a SATA drive can do 500 MB/s, so for some Steam caching even cheap QLC SATA drives would serve a cached game at 1 Gbps or 2.5 Gbps no problem.

 

For fun, you could experiment with some old stuff, for example socket G34 and power-hungry Opterons - you can get a board and 2 CPUs for $130, maybe even less. This is just one listing, a random search result: https://www.ebay.com/itm/285024203344

You get 6 PCI-E slots, but they're PCI-E 2.0, so if you use PCI-E to M.2 adapters you'd get a max of 2 GB/s from each M.2 SSD - still faster than SATA. You also get a bunch of registered DDR3 slots, and registered DDR3 is relatively cheap, around $1 per GB.

The board MAY support bifurcation on some slots, but I don't know.
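The bandwidth figures quoted above are easy to sanity-check. A minimal sketch (all values are approximate line rates, ignoring protocol overhead, so treat them as ballpark numbers):

```python
# Back-of-the-envelope bandwidth check for the figures quoted above.
# Values are approximate line rates, ignoring protocol overhead.

def gbps_to_gbs(gbps: float) -> float:
    """Convert a link speed in gigabits/s to gigabytes/s (8 bits per byte)."""
    return gbps / 8

ten_gbe = gbps_to_gbs(10)   # 10 GbE -> 1.25 GB/s
sata_ssd = 0.5              # typical SATA III SSD, ~500 MB/s
pcie2_x4 = 4 * 0.5          # PCIe 2.0 is ~500 MB/s per lane, so x4 -> ~2 GB/s

print(f"10GbE {ten_gbe} GB/s | SATA SSD {sata_ssd} GB/s | PCIe 2.0 x4 {pcie2_x4} GB/s")
# A single SATA SSD already covers a 2.5 Gbps link (0.3125 GB/s),
# and a PCIe 2.0 x4 M.2 adapter outruns even 10GbE.
```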

 

 

 

Hmm, that could be interesting to try. I'll look at old servers on eBay (and might ask for an old one at work, as I work at IBM lol)
 

Quote

I’m sure that is possible, yes. Any old PC you have laying around would also work, but a Mac mini could be nice because of its very small size. 

I think someone in my family has an old Mac mini with a Thunderbolt port on it that I could play with.

 

 

Another Q: how do I make the Steam cache and the streaming service work at the same time? Do I use VMs?


I have some older 6 TB SAS drives and I'm running unRAID, which means generally only a single disk is being used at a time.

It saturates my 1 Gbps link no problem. My server has a 10G uplink to my switch, and I do have a 1.2 TB cache drive for my VMs and write caching; I could stash some shares on there if I wanted, if I was expecting multiple 1-gig links to pull from a subset of the data. It was cheap and used, in an old 10-core Xeon server chassis.

 

If I had 10-gig networking everywhere, I'd probably just look at the unRAID beta that supports ZFS. Get more disk speed that way.


21 minutes ago, ZaziNabu said:

Hmm, that could be interesting to try. I'll look at old servers on eBay (and might ask for an old one at work, as I work at IBM lol)

Old servers, yes; trying to use bifurcation for M.2 SSDs….. no. I really would advise against this. 
 

It will be cheaper, easier, and cause fewer headaches to just use hard drives. I would argue that once you have been “doing the homelab thing” for a few years, and really get into the hobby of it, then you could look into maybe running an SSD-based system for a specific purpose. But you will run into enough headaches and troubleshooting just getting a homelab up and running smoothly; no need to introduce an entirely separate can of worms with bifurcation and PCIe's weird intricacies.
 

If you had a genuine need for the speed, sure, then I would say you have a lot of learning ahead of you to get that to work smoothly. But since you don’t have a need in the slightest, I would start with something much more standard, where you will more easily find resources to help you get things working correctly. Then once you have a decent handle on things (this will take years), you can potentially change things up for your next server setup. 
 

Going from no server to a server is a big enough step as it is. You can start playing with VMs, docker containers, and virtual networking, all while you need to take security into consideration… there is a lot to jump into, but there are plenty of great resources to help with all of this. No need to try and build an even more complicated setup than needed.
 

Homelabs are best set up 1 step at a time; going from 0-100 will result in massive frustration, I promise. Even 1 step at a time will result in frustration… but that’s part of the fun 🙂

Rig: i7 13700k - - Asus Z790-P Wifi - - RTX 4080 - - 4x16GB 6000MHz - - Samsung 990 Pro 2TB NVMe Boot + Main Programs - - Assorted SATA SSD's for Photo Work - - Corsair RM850x - - Sound BlasterX EA-5 - - Corsair XC8 JTC Edition - - Corsair GPU Full Cover GPU Block - - XT45 X-Flow 420 + UT60 280 rads - - EK XRES RGB PWM - - Fractal Define S2 - - Acer Predator X34 -- Logitech G502 - - Logitech G710+ - - Logitech Z5500 - - LTT Deskpad

 

Headphones/amp/dac: Schiit Lyr 3 - - Fostex TR-X00 - - Sennheiser HD 6xx

 

Homelab/ Media Server: Proxmox VE host - - 512 GB NVMe Samsung 980 RAID Z1 for VM's/Proxmox boot - - Xeon e5 2660 V4 - - Supermicro X10SRF-i - - 128 GB ECC 2133 - - 10x4 TB WD Red RAID Z2 - - Corsair 750D - - Corsair RM650i - - Dell H310 6Gbps SAS HBA - - Intel RES2SC240 SAS Expander - - TrueNAS + many other VM’s

 

iPhone 14 Pro - 2018 MacBook Air


5 minutes ago, LIGISTX said:

Old servers yes, trying to use bifurcation for m.2 SSD’s….. no. I really would advise against this. 
 

It will be cheaper, easier, and cause fewer headaches to just use hard drives. I would argue that once you have been “doing the homelab thing” for a few years, and really get into the hobby of it, then you could look into maybe running an SSD-based system for a specific purpose. But you will run into enough headaches and troubleshooting just getting a homelab up and running smoothly; no need to introduce an entirely separate can of worms with bifurcation and PCIe's weird intricacies.
 

If you had a genuine need for the speed, sure, then I would say you have a lot of learning ahead of you to get that to work smoothly. But since you don’t have a need in the slightest, I would start with something much more standard where you will more easily find resources to help you get things working correctly. Then once you have a decent handle on things (this will take years), your next server setup you can potentially change it up. 
 

Going from no server to a server is a big enough step as it is. You can start playing with VMs, docker containers, and virtual networking, all while you need to take security into consideration… there is a lot to jump into, but there are plenty of great resources to help with all of this. No need to try and build an even more complicated setup than needed.
 

Homelabs are best set up 1 step at a time; going from 0-100 will result in massive frustration, I promise. Even 1 step at a time will result in frustration… but that’s part of the fun 🙂

Hmm, that's a fair point.

OK, let's go back to basics. The Steam cache is less important than the media server, since the Steam cache works kind of weirdly: I'd need to download all the games one by one over time, and it won't self-update (I don't see much point without that, as games get updates every week nowadays, often more than 20 GB each).

So the first part is a simple HDD server with an M.2 cache (for a bit more speed) for movies and shows.

On another note, I found this guide for building a Steam cache (if I end up doing it). Is the guide any good?
https://arstechnica.com/gaming/2017/01/building-a-local-steam-caching-server-to-ease-the-bandwidth-blues/


33 minutes ago, LIGISTX said:

It will be cheaper, easier, and cause less headaches to just use harddrives. I would argue once you have been “doing the homelab thing” for a few years, and really get into the hobby of it, then you could look into maybe running an SSD based system for a specific purpose. But you will run into enough headaches and require troubleshooting just to get a homelab up and running smoothly, no need to introduce an entire separate can of worms with bifurcation and PCIe weird intricacies.

Isn't part of it to learn? Troubleshooting can sometimes be a good thing. I don't agree with HDDs being "less headache"; they die so easily. I've lost so many of them in 10 years vs ONE SSD (and I've owned more SSDs as well; I'm a bit addicted... probably 30+). At work we have gone through thousands, and they barely ever die before the system is replaced. The only one I can think of was DOA. Even servers tend to go through several hard drives before the warranty expires.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


1 hour ago, ZaziNabu said:

Hmm, that's a fair point.

OK, let's go back to basics. The Steam cache is less important than the media server, since the Steam cache works kind of weirdly: I'd need to download all the games one by one over time, and it won't self-update (I don't see much point without that, as games get updates every week nowadays, often more than 20 GB each).

So the first part is a simple HDD server with an M.2 cache (for a bit more speed) for movies and shows.

On another note, I found this guide for building a Steam cache (if I end up doing it). Is the guide any good?
https://arstechnica.com/gaming/2017/01/building-a-local-steam-caching-server-to-ease-the-bandwidth-blues/

A steam cache I believe can be configured to pull down updates automatically, as well as be able to download many games at once (this will obviously be bottlenecked by your internet connection). So that could still be useful. 
 

As far as an SSD cache for a media-serving server…. there is no need. A hard drive is plenty fast enough. If you hate money and want to get rid of it, sure. But for a media-serving server (or even a homelab doing many things, including media and Steam cache, plus PC backups, NVR streams, and some VMs hitting the array) an SSD cache is still not needed. 1 hard drive can easily do 150 MB/s, and a few in RAID can do 150 x however many "a few" is (with some loss to overhead). Gigabit LAN is only 125 MB/s at absolute best. Your limitation is basically always networking in a home setup.
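To put rough numbers on that, here's a minimal sketch; the 150 MB/s per drive and the 20% overhead factor are assumptions for illustration, not measurements:

```python
import math

HDD_SEQ_MBPS = 150     # assumed sequential throughput of one spinning drive, MB/s
RAID_OVERHEAD = 0.8    # assumed: keep only ~80% of raw throughput after RAID/fs overhead

def drives_to_saturate(link_mbps: float) -> int:
    """How many drives are needed before the array outruns the network link."""
    return math.ceil(link_mbps / (HDD_SEQ_MBPS * RAID_OVERHEAD))

print(drives_to_saturate(125))    # gigabit LAN: 2 drives already saturate it
print(drives_to_saturate(1250))   # even 10GbE only needs ~11 drives
```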

 

The homelab in my signature uses 10x4TB 5900 rpm drives in RAID Z2. I can easily write to it at full gigabit (125 MB/s) indefinitely, and remember, writing is the hard part. Reading is much easier, so if I can write to that array at max network speed, I can easily pull data off of it at the same speed. 
 

I have multiple deluge clients running 24/7, sonarr, Plex with multiple users, rsync jobs from remote servers, and VM backups all hitting that array, and it never even breaks a sweat. I also do photography as a hobby, and it houses my archive of multiple TB of Nikon RAW images at ~50 MB each. Network limitations mean I don't edit off the NAS directly, but I'll throw my data onto it at full gigabit while doing other things with the server, and all is perfectly happy. 
 

1 hour ago, ewitte said:

Isn't part of it to learn? Troubleshooting can sometimes be a good thing. I don't agree with HDDs being "less headache"; they die so easily. I've lost so many of them in 10 years vs ONE SSD (and I've owned more SSDs as well; I'm a bit addicted... probably 30+). At work we have gone through thousands, and they barely ever die before the system is replaced. The only one I can think of was DOA. Even servers tend to go through several hard drives before the warranty expires.

I totally agree, and I specifically said learning and troubleshooting are part of it. But going from 0 to sysadmin is a really big step… starting with a simpler setup is very much advised. 
 

Yes, drives do fail. I have been unlucky and have had 2 of my 10 fail over 5 years of use. A buddy of mine who also uses WD Reds and has a 24-bay box filled with them hasn't had a single failure in the same timespan, and his were accidentally running at 50 °C for years….. so some of that is just luck, and some of it, as in your experience, is just simple statistics. But this is why we run Z2 arrays, specifically to mitigate this. 
 

Back to SSD vs HDD: hard drives will be much cheaper per GB of space, will perform the same for this workload, and will be more straightforward to set up (no need to worry about PCIe lanes, or bifurcation, or which card works with which SSD, how many cards fit on a mobo, and which slot goes to the chipset vs direct to the CPU). All of this can be overcome, yes. But why introduce this extra level of complexity into what otherwise would be a very simple and standard homelab/NAS setup? And like I said, once OP has more knowledge about all of this and has more defined requirements, which really only come with time spent doing this, they will have a better idea of what they actually want. 
 

For a good analogy: I also race cars, and a lot of people struggle with the idea that their car, as it sits, is probably more capable than they are. People will ALWAYS be looking for the next mod to make them faster on track, when the correct answer is that they likely don't know what the car needs or doesn't need, because they, the driver, are not at the level to know what is lacking, and they are nowhere near the limit of what the current parts will provide. In this situation, the correct answer is to spend money on seat time and the consumables that go along with lots of seat time (tires and brakes). THEN, once you have a good understanding of the car's dynamics and you are pushing it to the point where you no longer gain much time each event, you can start spending money on parts. People rarely follow this advice, but it is the correct advice 🙂  

 

TL;DR: learn what you don't know yet by playing around with a homelab, then actually teach yourself what you don't know. Then down the line, when you have a better set of requirements for what you want your homelab to do for you, iterate. 



20 minutes ago, LIGISTX said:

A steam cache I believe can be configured to pull down updates automatically, as well as be able to download many games at once (this will obviously be bottlenecked by your internet connection). So that could still be useful. 
 

Huh, can you please show me a way to do that? A link to a guide would be good too. All I found was a really slow grabbing system where you need to download from Steam and then the NAS grabs the image for later use.

On another note, I saw some Synology NASes with 4-7 bays and 2 M.2 slots (https://www.synology.com/en-us/products/DS923+). But the RAM amount on them is very, very low (8 GB), the upgrade they demand for official support is really overpriced (even for ECC memory), and I'm not sure the CPU is fast enough for streaming and cache usage at the same time. 


18 minutes ago, ZaziNabu said:

Huh, can you please show me a way to do that? A link to a guide would be good too. All I found was a really slow grabbing system where you need to download from Steam and then the NAS grabs the image for later use.

I have never done it myself, but I am fairly certain I have seen it work this way in other threads on here. 
 

 

I want to say they may have been docker containers. Maybe read through this? https://lancache.net/docs/installation/docker/
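Going by the linked lancache docs, the docker setup is roughly a cache container plus a DNS container. A sketch as a compose file; the host paths and the LANCACHE_IP value below are placeholders you'd adapt to your own network, so double-check against the docs before using it:

```yaml
version: "3"
services:
  monolithic:                      # the actual cache (nginx-based)
    image: lancachenet/monolithic:latest
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /srv/lancache/cache:/data/cache   # placeholder: point at your big array
      - /srv/lancache/logs:/data/logs
  dns:                             # answers CDN hostnames with the cache's IP
    image: lancachenet/lancache-dns:latest
    restart: unless-stopped
    ports:
      - "53:53/udp"
    environment:
      - USE_GENERIC_CACHE=true
      - LANCACHE_IP=192.168.1.10          # placeholder: your server's LAN IP
```

Clients (or your router's DHCP settings) then point their DNS at the server, and game downloads get cached transparently.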

Rig: i7 13700k - - Asus Z790-P Wifi - - RTX 4080 - - 4x16GB 6000MHz - - Samsung 990 Pro 2TB NVMe Boot + Main Programs - - Assorted SATA SSD's for Photo Work - - Corsair RM850x - - Sound BlasterX EA-5 - - Corsair XC8 JTC Edition - - Corsair GPU Full Cover GPU Block - - XT45 X-Flow 420 + UT60 280 rads - - EK XRES RGB PWM - - Fractal Define S2 - - Acer Predator X34 -- Logitech G502 - - Logitech G710+ - - Logitech Z5500 - - LTT Deskpad

 

Headphones/amp/dac: Schiit Lyr 3 - - Fostex TR-X00 - - Sennheiser HD 6xx

 

Homelab/ Media Server: Proxmox VE host - - 512 NVMe Samsung 980 RAID Z1 for VM's/Proxmox boot - - Xeon e5 2660 V4- - Supermicro X10SRF-i - - 128 GB ECC 2133 - - 10x4 TB WD Red RAID Z2 - - Corsair 750D - - Corsair RM650i - - Dell H310 6Gbps SAS HBA - - Intel RES2SC240 SAS Expander - - TreuNAS + many other VM’s

 

iPhone 14 Pro - 2018 MacBook Air


18 minutes ago, LIGISTX said:

I have never done it myself, but I am fairly certain I have seen it work in this way based in other threads I have seen on here. 
 

 

I want to say they may have been docker containers. Maybe read through this? https://lancache.net/docs/installation/docker/

Just read through it; it says the same thing: you need to download via Steam, and it just grabs the download and caches it while it's downloading.
I wonder if Steam blocks caching for piracy reasons...


47 minutes ago, ZaziNabu said:

Just read through it; it says the same thing: you need to download via Steam, and it just grabs the download and caches it while it's downloading.
I wonder if Steam blocks caching for piracy reasons...

First google result for “lancache auto update”:

 

 

Link in there to your solution, at least it would seem 🙂 

 

Welcome to homelab….. better get good at googling things 😉



20 minutes ago, LIGISTX said:

First google result for “lancache auto update”:

 

 

Link in there to your solution, at least it would seem 🙂 

 

Welcome to homelab….. better get good at googling things 😉

Hmm, https://github.com/tpill90/steam-lancache-prefill is really epic! 
And it runs on macOS, so I might just get a Mac mini M1 with 32 GB of RAM and a Thunderbolt-connected drive bay and be done with most of the work using macOS (until Linux on ARM is a thing).

But that would take the fun out of the learning
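If the prefill route works out, the workflow would be roughly the following. This is a sketch from my reading of the tool's README; the command names and paths are assumptions, so check `--help` before relying on them:

```shell
# One-time: log in and pick which games to keep warm in the cache
./SteamPrefill select-apps

# Then run a prefill whenever you want the cache refreshed
./SteamPrefill prefill

# Or schedule it - e.g. a crontab entry to re-pull updated depots at 4am daily:
# 0 4 * * * /opt/steamprefill/SteamPrefill prefill >> /var/log/steamprefill.log 2>&1
```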

 


1 minute ago, ZaziNabu said:

Hmm, https://github.com/tpill90/steam-lancache-prefill is really epic! 
And it runs on macOS, so I might just get a Mac mini M1 with 32 GB of RAM and a Thunderbolt-connected drive bay and be done with most of the work using macOS (until Linux on ARM is a thing).

But that would take the fun out of the learning

 

What do you mean “until linux on arm will be a thing”?

 

1) Why do you want to run Linux on ARM?

2) Raspberry Pis are ARM… and Raspbian is extremely popular.

 

What exactly do you mean by this? 



7 hours ago, LIGISTX said:

What do you mean “until linux on arm will be a thing”?

 

1) why do you want to run linux on arm?

2) raspberry pi’s are arm…. And raspbian is extremely popular.

 

What exactly do you mean by this? 

I meant OSes like CentOS or Ubuntu 


1 minute ago, ZaziNabu said:

I meant OSes like CentOS or Ubuntu 

Ubuntu has an ARM version… but what ARM chip are you trying to run Linux on? 
 

The most common is the Raspberry Pi, which again ships with Raspbian, and that's “basically Ubuntu” since it's just their own spin of Debian.

 

What exactly do you want Ubuntu on ARM for, though? Why is this a consideration?



2 minutes ago, LIGISTX said:

Ubuntu has an ARM version… but what ARM chip are you trying to run linux on? 
 

The most common is a raspberry pi, which again, raspbian is what it ships with, and it’s “basically ubuntu” since it’s just their version of Debian.

 

What exactly do you want Ubuntu on arm for tho? Why is this a consideration?

I'm considering getting an M1 Mac mini and making it my NAS with a Thunderbolt HDD bay 


2 minutes ago, ZaziNabu said:

I'm considering getting an m1 mac mini and making it my NAS with a thunderbolt HDD bay 

Even if they made a full-fat Linux distro work on Apple's custom chip (not all ARM is the same… that thing is as proprietary as can be), I wouldn't do this… Mac minis are GREAT at being Macs. 

 

Just use some old PC as your NAS. You don’t want hacky solutions for your NAS/server, you want something that natively works. Uptime is critical for such appliances. 



1 minute ago, LIGISTX said:

Even if they made a full-fat Linux distro work on Apple's custom chip (not all ARM is the same… that thing is as proprietary as can be), I wouldn't do this… Mac minis are GREAT at being Macs. 

 

Just use some old PC as your NAS. You don’t want hacky solutions for your NAS/server, you want something that natively works. Uptime is critical for such appliances. 

Hmm OK, I'll scratch that option then :).

I was thinking of getting https://www.synology.com/en-us/products/DS923+ as it looks like a cheap solution that I can expand in the future.
But the specs look so freaking low. Do I need more than that, or is it enough for 4K HDR and a Steam cache? 


17 minutes ago, ZaziNabu said:

Hmm OK, I'll scratch that option then :).

I was thinking of getting https://www.synology.com/en-us/products/DS923+ as it looks like a cheap solution that I can expand in the future.
But the specs look so freaking low. Do I need more than that, or is it enough for 4K HDR and a Steam cache? 

Can you run a Steam cache on a Synology? If you can, it would likely be fine as long as you don't need to transcode. 



5 minutes ago, LIGISTX said:

Can you run a steam cache on synology? If you can, it would likely be fine as long as you don’t need to transcode. 

You should be able to, as you can install whatever you want on the NAS.
I'm just worried about the RAM options there; 8 GB is a really low amount of RAM for a cache plus streaming server 


10 minutes ago, ZaziNabu said:

You should be able to, as you can install whatever you want on the NAS.
I'm just worried about the RAM options there; 8 GB is a really low amount of RAM for a cache plus streaming server 

It's only low if you compare it to what Windows on x86 needs, and this is neither of those. Less RAM is not as much of an issue.


$600 for a DiskStation DS923+ doesn't sound cheap to me.

 

You can get a Ryzen 2200GE or a 2400GE for $60

ex : https://www.ebay.com/itm/185267529909  or https://www.ebay.com/itm/134424354860

An A320 or B450 motherboard is another $50-70 ... B450 is better because you get 4 memory slots, so you could have 4 x 16 GB sticks, but you can start with just one (though two are really recommended for dual channel, which adds 5-10% performance on its own)

example motherboard : https://www.ebay.com/itm/255954013873

RAM is $20-30; add a CPU cooler for $10-15, a cheap $30-40 case, and a 430 W power supply like the Thermaltake Smart 430W reviewed by GamersNexus - https://www.amazon.com/Thermaltake-Continuous-Active-Supply-PS-SPD-0430NPCWUS-W/dp/B07BFJ91TY/ - and you have your NAS for under $200-250
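Tallying that parts list (using rough midpoints of the price ranges quoted above, just to check the claim):

```python
# Rough midpoints of the price ranges quoted above.
parts = {
    "Ryzen 2200GE/2400GE (used)": 60,
    "A320/B450 motherboard":      60,
    "16 GB DDR4 RAM":             25,
    "CPU cooler":                 12,
    "Case":                       35,
    "430 W power supply":         40,
}
total = sum(parts.values())
print(f"Total: ${total}")   # comfortably inside the quoted $200-250 envelope
```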

 

If you want lower power consumption, a bit less performance, and a smaller form factor, there are also boards with soldered CPUs and passive heatsinks.

For example, for $160 you can get a board like this: https://www.newegg.com/asrock-j5040-itx-mini-itx/p/N82E16813157967?Item=9SIA1K6JH98596&quicklink=true

Add a standard ATX power supply and at least one DDR4 SO-DIMM stick (like the ones for laptops) and you're done. 

It has 4 SATA ports, an M.2 slot good for wireless cards, and one PCI-E x1 slot, so you could add another SATA controller should you want to, or a 10G network card (though that would be limited to around 950 MB/s out of 1.25 GB/s because of the single PCI-E lane)

CPU performance is nothing to brag about, around 60-70% of a quad-core Ryzen like the 2200GE, but it's a bit more powerful than the R1600 in the DiskStation by... I don't know, maybe around 5%.

It also has hardware encoders for HEVC, H.264, and VP9; if the software you use supports Intel hardware encoders, that would reduce the load on the CPU. 

 

 


Hi there! Also new here.

From my perspective you just want to experiment with / learn about building an all-M.2 system, is that correct? (Let me assume this means M.2 NVMe, not SATA.)

I have several questions before I go any further:

  • How much storage do you ideally need?
  • Do you have a budget ceiling for this system?
  • Are you concerned about the overall power consumption of the system?
  • Are you going to use this system long term (deploying it permanently, etc.)?
  • Are you going to use this system for more than a basic NAS (running Docker/Plex/Nextcloud, etc.)?
  • What is the main purpose of this build? Does it need to handle many clients at the same time?

I have personally been experimenting with "ancient" Haswell-EP Xeons for mixed/all-flash NAS systems for my clients, who usually use them as CCTV DVR storage. Here's a quick summary of my experiments from the past 4 years:

  • Haswell-EP have the best "bang for the buck" combo in terms of PCIe lanes and atleast basic features for DIY SMB NAS (In my opinion). you can get 40 PCIe lanes CPU for less than $30 (like E5-2640v3).
  • it's quite difficult to find a motherboard that fit especially for tons of NVMe drives without using additional PCIe bifurfication card (like PLX, etc, those thing is freaking expensive even in used condition). most server-grade motherboard support this features, but for almost all gaming focused/mid-low end workstation boards and prebuilts does'nt have the features, that's bring me to the next point.
  • Using chinese (AKA AliExpress) "X99" boards as NAS is kinda bad idea (most of them event did'nt using X99 chipset LOL), especially if you need a robust motherboard with all the features needed available. Well there's some good chinese board for running 24/7 like Huananzhi X99-TF/X99-TF-Q that have x4x4x4 mode PCIe bifurfication mode on all it's PCIe x16 slot, even that having problem on it's own. If you want to skip the headache i recommend to buy from reputable brand like Supermicro or for complete prebuilts, HP Z840 system.
  • It's kinda power hungry compared to it's modern NAS system counterpart, especially when used as bare NAS without any other service running (like a hypervisor, etc).
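If you go the bifurcation-adapter route, it's worth verifying that the slot is really being split: each SSD on the adapter should enumerate as its own PCIe device. A quick check on a Linux host (a sketch; assumes `lspci` from pciutils and `nvme` from the nvme-cli package are installed):

```shell
# With x4x4x4x4 bifurcation enabled in the BIOS, every M.2 drive on a
# passive x16-to-4x-M.2 adapter appears as a separate NVMe controller.
lspci -nn | grep -i 'non-volatile memory'

# List the detected NVMe drives and their namespaces.
sudo nvme list
```

If only one drive shows up, the slot is still running as a single x16 link, meaning the board either lacks bifurcation support or the BIOS option isn't set.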

 

LENNYYYYYYYYY!!!!! WHERE R U BUOYYYYY??????

  •  Laptop:
    • Current
      • Spoiler

        Lenovo ThinkPad X1 Yoga Gen 8 | i5-1335U | 16GB LPDDR 6400MT/s Soldered | WD SN810 1TB NVMe (Win11) | 14 inch 1200p IPS 500 nits panel | Intel AX211 | 65W USB-C PD adapter | Aliexpress DIY Thunderbolt 3 eGPU dock with JieShuo RTX 3060M 12GB

         

    • Retired
      • Spoiler

        Heavily modified Lenovo ThinkPad T480 | i7-8550U | 2x 16GB Teamgroup DDR4 3200MT/s | Custom FCBGA 595 to Oculink 2.0 @ PCIe 3.0 x4 (using removed nvidia MX150 dGPU pinout) | 14 inch 1080p IPS 350 nits panel | 7 Row classic keyboard mod (from T25) | 512GB Samsung PM981a NVMe SSD (Win10) + 256GB WD SN520 2242 NVMe SSD (Manjaro; on WWAN slot) | Intel AX210NGW | 65W USB-C PD adapter  | Oculink 2.0 to PCIe x16 DIY dock eGPU with JieShuo RTX 3060M 12GB 

 

  • Workstation:
    • Current - Repurposed as Homelab Dev Server
      • Spoiler

        AMD EPYC 9454P 48C/96T | Tyan Tomcat HX S8050 (S8050GM2NE) Socket SP5 | 8x32GB (256GB) Kingston Fury Renegade DDR5 4800MT/s RDIMM | Colorful iGame RTX 4080 Advanced OC (windows VM) | PowerColor RX 6600XT Fighter (MacOS VM) | 2x Intel Optane M10 64GB (RAID 1, Proxmox) | 4x Samsung 980 Pro 1TB (RAIDZ1, VM storage) | Nvidia Mellanox ConnectX-4 LX 25Gb SFP+ PCIe x8 (MCX4111A) | Seasonic Prime TX 1600W | Thermaltake Core W100 case | Enermax LIQTECH TR4 II 280mm AIO (SP5 Kit) |8x Deepcool TF140S Fan

         

    • Retired
      • Spoiler

        Intel Xeon E5-2650v3 10C/20T (locked all core 3GHz) | Dell Alienware Area 51 R2 Motherboard (MSI MS-7862) LGA 2011v3 with modded BIOS (ReBAR enabled) | 4x16GB (64GB) SK hynix ECC LRDIMM 2666MT/s Quad Channel | Colorful iGame GTX 1070 Flame Ares U-TOP | Samsung PM981a 512GB NVMe SSD (Win10) + TeamGroup MS30 M.2 SATA 512GB (Manjaro) | Mellanox ConnectX-3 OCP 2.0 NIC (with PCIe adapter) | Thermalright AXP-100H Muscle CPU cooler (hacksawed to work with LGA 2011v3) | PowerUp Raptor 1633 case + 6x Deepcool RF-120FS case fan | IndoCase 500W 80+ Silver PSU (Rebranded & binned up Gamemax GM-500).

         

  • Homelab AIO server:
    • Current - Repurposed as Backup server
      • Spoiler

        AMD Ryzen 9 3900 (OEM CPU, got it from AliExpress) | Asus Prime B550-Plus AC-HES | 4x16GB (64GB) Samsung DDR4 3200MT/s ECC UDIMM | 2x Intel Optane M10 64GB (RAID1, TrueNAS Scale) | 6x Seagate SkyHawk 8TB (Bulk storage; RAID Z1; Alhua rebranded drive) | Asrock ARC A380 6GB Challenger OC (Terminal + Transcoding GPU) | Mellanox ConnectX-3 OCP 2.0 NIC | Deepcool Gammax C40 (dual fan) | IndoCase IC4008 4U rackmount case + 5x Deepcool XFAN120 + 2x Deepcool XFAN80 | IndoCase 800W 80+ Silver PSU (Rebranded & binned up Gamemax GM-800) | Samsung Galaxy S10+ 5G (5G backup uplink for Router VM).

         

    • Retired
      • Spoiler

        Intel Xeon E5-2695v3 14C/28T (force dynamic turbo all core up to 3GHz) | Huananzhi X99-TF-Q (Q87 chipset) LGA 2011v3 with modded BIOS (ReBAR & all core turbo enabled) | 4x32GB (128GB) Samsung ECC LRDIMM 2133MT/s Quad Channel | Nvidia Quadro NVS 295 (via M.2 E key riser; for basic display) + Dell GTX 1070 OEM (transcoding GPU) + Sapphire Pulse RX 470D 4GB (MacOS Ventura VM) | 2x Kingston A400 120GB (mirrored; TrueNAS SCALE) + 6x Samsung PM981a 1TB (VM storage; RAID Z1) + 7x WD Purple 4TB (Bulk storage; RAID Z1) | Mellanox ConnectX-3 OCP 2.0 NIC (with PCIe adapter) | Alseye M90 CPU cooler (hacksawed to work with LGA 2011v3) | IndoCase IC4008 4U rackmount case + 5x Deepcool XFAN120 + 2x Deepcool XFAN80 | IndoCase 800W 80+ Silver PSU (Rebranded & binned up Gamemax GM-800).

         

  • DIY Router:
    • Current
      • Decommissioned; the current router system runs as a VM on the current AIO Homelab server (Modemsofmen ROOTer GoldenORB OpenWRT 22.10).
    • Retired
      • Spoiler

        HP EliteDesk 800 G1 SFF Prebuilt | Intel core i5-4460T | 4x 4GB Samsung DDR3L 1600MT/s UDIMM Dual Channel | 1x Kingston A400 120GB (ROOTer GoldenORB OpenWRT 22.10) | Mellanox ConnectX-3 OCP 2.0 NIC (with PCIe adapter) + Realtek RTL8125 2.5GBe PCIe NIC + Intel i210 SFP PCIe card with Mikrotik SFP ONU module (for WAN, my default ISP modem sucks) + Fibocom L860 4G modem (with USB 3.0 adapter, for WAN redundancy).

 

  • Desk setup:
    • Spoiler

      ViewSonic VA2732-H 27 inch + VA2215H 22 inch (portrait) 1080p 75Hz IPS monitors | DIY sound system with 2x 50W DIY bookshelf speakers | Sennheiser HD 600 headphones | Keychron K1 SE keyboard (low profile Gateron Brown) | Lenovo ThinkLife WLM210 mouse | Samsung Galaxy S8+ (used as a webcam LOL).

       


11 minutes ago, TimedPing said:


How much storage do you ideally need? About 10TB usable.

Do you have a budget ceiling for this system? About $1200-$1500.

Are you concerned about the overall power consumption? Nope, but low power and quiet operation would be a huge plus.

Are you going to use this system long term? It's going to be my playground, but it would be nice if I could keep using it long term.

Are you going to use it for anything beyond a basic NAS? It's only going to be a Steam cache and a Plex/other streaming service server.

What's the main purpose, and does it need to handle many clients at the same time? Not many, up to 2-3 at a time on the local network. It's going to be a fun playground to learn and try new things, and in the end it will be used as a plain Plex server and Steam cache.

 

Many people suggested just going with HDDs, since they're much cheaper and I don't really need SSD speeds (and that's true).

All in all, after this project I want to know how to set up a basic NAS for streaming, data, and caching.
If possible it should also be really compact and really quiet.
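On running the Steam cache and the streaming server at the same time: you don't strictly need VMs; both can run as containers on one host. A rough sketch assuming Docker, the commonly used lancachenet/monolithic image, and Plex's official plexinc/pms-docker image (the host paths here are made up; lancache also needs its DNS component so LAN clients resolve CDN hostnames to this box, see the lancache.net docs):

```shell
# Game download cache: lancache wants ports 80/443 on this host, and LAN
# clients must resolve CDN hostnames to it via the lancache DNS container.
docker run -d --name lancache \
  -v /srv/lancache/cache:/data/cache \
  -v /srv/lancache/logs:/data/logs \
  -p 80:80 -p 443:443 \
  lancachenet/monolithic:latest

# Plex on its own port, coexisting with the cache on the same host.
docker run -d --name plex \
  -e TZ=Etc/UTC \
  -v /srv/plex/config:/config \
  -v /srv/media:/media \
  -p 32400:32400 \
  plexinc/pms-docker:latest
```

If you'd rather isolate the two services harder, Proxmox VMs work too, but containers are lighter and simpler for just these two workloads.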


17 minutes ago, ZaziNabu said:


Well, if your budget is around $1200-$1500 including the drives, that's actually really tight, and I don't want to recommend used drives even if you don't care about data loss. Here's my spec suggestion for your needs:

  1. Main hardware: you can choose between a workstation prebuilt like the HP Z840 or, the better option, a DIY system. Here's a quick checklist to follow:
    • The motherboard needs to support x4x4x4x4 bifurcation. Both the HP Z840 and almost all Intel C612-based motherboards support it, and so do some Chinese boards like the Huananzhi X99-TF/X99-TF-Q (for more info on AliExpress motherboard bifurcation compatibility and durability, see the Miyconst YouTube channel or similar channels/sites).
    • Make sure the motherboard has a sensible PCIe slot arrangement so every slot is actually usable and cooling is slightly better; the Huananzhi X99-TF is a good example of that.
    • You need at least 32GB of RAM, especially if you run TrueNAS/Proxmox as a hypervisor; I recommend 64GB of ECC registered RAM in quad channel. Sadly, Haswell-EP only supports 2133-2400 MT/s RAM (hey, at least you get quad channel as the trade-off), and since mainstream servers are now moving from DDR4 to DDR5-based systems, used DDR4 ECC RAM is pretty cheap.
    • USE A GOOD POWER SUPPLY! You know why. A config like this usually runs comfortably on a good 550-750W PSU.
    • For the processor, choose between the E5-2650/2660 v3 (10C/20T) and the E5-2680 v3 (12C/24T), or if you want to max out the cores, the E5-2699 v3 (18C/36T) at the cost of higher power draw and a premium price. I personally use an E5-2650 v3 in my so-called "home server".
    • For the CPU cooler, just make sure it can handle your CPU's TDP and, most importantly, MAKE SURE THE COOLER SUPPORTS THE LGA 2011-3 SOCKET.
    • Remember that this platform has no integrated GPU, so you need a dedicated GPU to access the OS console; something like a GT 710 / HD 5450 will do just fine. Some motherboards, like Supermicro's, provide basic graphics via the IPMI/iKVM chip (usually an ASPEED AST2500/2600).
  2. Boot storage: use a pair of reputable-brand 128GB SATA SSDs in a mirror for redundancy; I personally use 2x 128GB Teamgroup GX2 drives.
  3. L2ARC/SSD cache/SSD VM storage: this is where the x4x4x4x4 bifurcation support matters. While TrueNAS/Proxmox already use RAM as the primary storage cache (correct me if I'm wrong), you can add a second cache tier of faster NVMe drives. You'll also need a PCIe x16 to 4x M.2 NVMe adapter for this to work, usually $25-$40 on AliExpress/eBay. More on the SSD configuration:
    • For cache, I really recommend 4x 64GB Intel Optane M10 drives. Yes, they have lower throughput because they only use 2 lanes of PCIe 3.0 bandwidth, but their price and random-IO performance can't be matched, especially when run in RAID 0.
    • For fast storage (VM boot drives, for example), pick several matching NVMe drives, like 4x 256GB Samsung 970 Pro in RAID-Z1.
    • Remember that most LGA 2011-3 motherboards have 2 PCIe x16 slots that actually run at x16, unlike most consumer platforms, so if you have the extra budget you can use both configurations mentioned above.
  4. Bulk storage: various configs work here; for example, 4x 4TB WD Red CMR drives give you 12TB usable in RAID-Z1. To save a few bucks, you can use refurbished enterprise drives or shuck WD MyBook externals, which often go on sale; on some of these drives (especially shucked MyBook HDDs), you need to mask the 3.3V pin on the SATA power connector, otherwise the drive won't be detected by the system. If your motherboard doesn't have enough SATA ports, a used Dell PERC H310 or another LSI SAS2008-based card flashed to IT mode adds 8 more SAS/SATA 6Gbit ports.
  5. GPU(s): for Plex/Jellyfin transcoding acceleration you have multiple options; just make sure a PCIe slot is still free for it. If you're using a low-profile case, something like a low-profile GTX 1050 Ti / GTX 1650 is a nice fit. You just need to pass the GPU through to the Plex/Jellyfin VM.
  6. Other stuff: choose whatever case you want, just make sure every component fits and is properly cooled, and since you want it quiet, there are several budget PWM fans that fit the bill.
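A sanity check on the capacity math in points 3 and 4: a RAID-Z1 vdev gives roughly (number of drives - parity) times the drive size in usable space, before ZFS metadata and padding overhead:

```shell
# RAID-Z usable capacity ~= (drives - parity) * drive size; this ignores
# ZFS metadata/padding and the keep-some-free-space guideline, so treat
# it as an upper bound rather than what df will report.
drives=4 parity=1 size_tb=4
echo "$(( (drives - parity) * size_tb )) TB usable"   # prints: 12 TB usable
```

With 4x 4TB WD Red in RAID-Z1, that lands right at the 12TB from point 4, comfortably above the ~10TB target.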

Sorry for the long explanation; I hope it helps you on your journey to find the perfect system for your needs. Also, sorry for my English, I'm not a native speaker.

 

