Threadripper vs Epyc, for an unRAID home server

Gday all,

 

I 'recently' discovered that a Ryzen 1700 doesn't fit my needs in terms of PCI-Express lanes for my unRAID server.

Since I need at least 2 video cards (VGA output and en/decoding), I'm running out of PCIe lanes for a 3rd video card to pass through to a VM.

The 'easier' option for my current use case would be switching to an APU, but like most hardware at the moment, they're hard to come by.

 

So I was looking at Threadripper and Epyc options, and found that the overall cost of an Epyc-based system is currently actually lower than a Threadripper one.

In turn, the Epyc would offer more PCIe lanes, but lose clock speed compared to a Threadripper. And there are other factors.

Like the form factor of an Epyc mainboard, available cooling solutions, and maybe other things I haven't taken into account.

 

At the same time, I'm wondering what the impact would be on hosting game servers.

I know that in the past I've noticed clock speed making a big difference when hosting them.

The FX-4300 (yes yes, Piledriver) showed a noticeable performance increase in games like Factorio and ARK after setting it to a fixed 4300 MHz and upping the memory speed.

Would this be noticeable between Threadripper and Epyc as well?

 

So I'd like to hear your thoughts on the following subjects:

 

  1. How much impact do you think clock speed has on game servers (and VM passthrough)?
  2. Do you see any other practical issues, like cooling solutions and form factor?
  3. Would an Epyc / Threadripper handle software encoding for VM graphics better than a 1700, eliminating the need for a large number of graphics cards?
  4. Other things I might have forgotten that you have run into using a similar build at home?

 

P.S. Yes, I'm also part enthusiast and getting more and more interested in heavier hardware, like advanced home networking and servers.

Gamesystem: X3700, 32GB memory @3200mhz, GTX1080 Hybrid

Unraid system: Epyc 7352, 24/48, 96GB ECC buffered @2666mhz, 2x GT710, GTX1050Ti

8 minutes ago, Caennanu said:

How much impact do you think clock speed has on game servers

Pretty big actually. Some server software (Minecraft especially) can't scale across cores properly, so clock speed can matter more than core count.

8 minutes ago, Caennanu said:

Would an Epyc / Threadripper handle software encoding for VM graphics better than a 1700, eliminating the need for a large number of graphics cards?

If your work can be GPU accelerated, always grab a GPU if possible. And you don't need a lot of it, really. Something like a 1660 or 1650 Super is fast enough for video rendering, and most engineering apps I know of support render nodes, so you can use multiple systems to boost one project.

1 minute ago, SorryClaire said:

If your work can be GPU accelerated, always grab a GPU if possible. And you don't need a lot of it, really. Something like a 1660 or 1650 Super is fast enough for video rendering, and most engineering apps I know of support render nodes, so you can use multiple systems to boost one project.

It can. Currently I'm using a 1050 Ti to en/decode for Shinobi and Plex. But I've noticed my 'demands' are growing quickly, because I want to try more and more haha.
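
For context, the kind of GPU-offloaded transcode that Plex/Shinobi drive through ffmpeg under the hood boils down to roughly this (a minimal sketch, not either app's exact command; assumes an ffmpeg build with NVDEC/NVENC support, and the file names are hypothetical):

import subprocess

# Rough sketch of a GPU-offloaded transcode: decode and encode both happen on
# the card, so the CPU mostly just shuffles compressed data around.
subprocess.run([
    "ffmpeg", "-y",
    "-hwaccel", "cuda",        # decode on the GPU (NVDEC)
    "-i", "camera_feed.mp4",   # hypothetical source file
    "-c:v", "h264_nvenc",      # encode on the GPU (NVENC)
    "-b:v", "8M",              # target video bitrate
    "-c:a", "copy",            # leave audio untouched
    "output.mp4",
], check=True)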

 

3 minutes ago, SorryClaire said:

Pretty big actually. Some server software (Minecraft especially) can't scale across cores properly, so clock speed can matter more than core count.

Yeah, that's what I was afraid of. Sure, more and more games are starting to use more cores in their main thread design, but it's still not where I would like it to be.

Let's see what others have to say 😉

3 minutes ago, Caennanu said:

available cooling solutions,

SP3 uses the same cooler mounting as TR4, so that wouldn't realistically be an issue. If you are using a large amount of registered high-capacity DIMMs, you should have a fan pointed at the memory.

 

A fan pointed at the VRMs is also likely a good idea.

6 minutes ago, Caennanu said:

Would this be noticeable between Threadripper and Epyc as well?

If you pick up something like the EPYC 7443P (or a 16-core offering), the difference would be smaller, but each instance will likely still be worse.

 

12 minutes ago, Caennanu said:

Do you see any other practical issues, like cooling solutions and form factor?

To utilize the amount of PCIe the Epyc platform offers, you might want to prep for using riser extensions and U.2-to-PCIe adapters.

 

Also, depending on what is actually available at the time you are buying, I'd suggest looking at Intel Ice Lake Xeon. Some of their low-end products are fairly nice offerings, though with only 64 lanes of PCIe.

Thanks for the feedback, GoldenLag. Really helpful.

 

3 minutes ago, GoldenLag said:

To utilize the amount of PCIe the Epyc platform offers, you might want to prep for using riser extensions and U.2-to-PCIe adapters.

Yeah, I was thinking about that; I already have some risers lying around. And when this actually starts becoming a reality, I'm likely building it in a 4U or home-built case, so that I have space and proper spacing.

 

4 minutes ago, GoldenLag said:

Also, depending on what is actually available at the time you are buying, I'd suggest looking at Intel Ice Lake Xeon. Some of their low-end products are fairly nice offerings, though with only 64 lanes of PCIe.

64 PCIe lanes should be sufficient. Currently I'm not expecting to use more than x8 on a graphics card; heck, my GT 710s are x1 cards. So the 1050 Ti on x8, maybe my current 1080 on x8 or x16 if 30-series cards actually become available for decent prices, and 1 SAS controller currently using 4 lanes. Which brings my current total to 22 - 38 lanes for expansion slots.
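
Just to show how I get to that 22 - 38 range, a rough tally (my own reading of my list above; the upper bound assumes both big GPUs end up at x16):

# Rough PCIe lane budget behind the 22 - 38 figure. The low end assumes both
# big GPUs at x8, the high end assumes both at x16; the GT 710s are x1 cards
# and the SAS controller takes 4 lanes either way.
low  = {"GTX 1050 Ti": 8,  "GTX 1080": 8,  "SAS HBA": 4, "GT 710 #1": 1, "GT 710 #2": 1}
high = {"GTX 1050 Ti": 16, "GTX 1080": 16, "SAS HBA": 4, "GT 710 #1": 1, "GT 710 #2": 1}
print(sum(low.values()), "-", sum(high.values()), "lanes")  # 22 - 38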

4 minutes ago, Caennanu said:

64 PCIe lanes should be sufficient. Currently I'm not expecting to use more than x8 on a graphics card; heck, my GT 710s are x1 cards. So the 1050 Ti on x8, maybe my current 1080 on x8 or x16 if 30-series cards actually become available for decent prices, and 1 SAS controller currently using 4 lanes. Which brings my current total to 22 - 38 lanes for expansion slots.

You are free to add a cheap used 40 Gbit card and use the server as external storage to load everything from. Plug it into a chipset PCIe slot on your main system and you should have pretty high-speed storage.

 

You can also find fairly cheap U.2/PCIe 2-bit MLC/DLC SSDs, if you ever wanted stupidly high endurance and sustained write speeds. However, I can't recall what such drives are called.

14 minutes ago, Caennanu said:

Yeah, I was thinking about that; I already have some risers lying around. And when this actually starts becoming a reality, I'm likely building it in a 4U or home-built case, so that I have space and proper spacing.

If you are doing it in a normal-ish case with decent airflow, I'd suggest picking up a 5-pack of fans (whatever size you want, 120mm down to 40mm; 92mm is a nice middle ground) to be able to stick anywhere that might need airflow.

 

Server boards may also not have decent fan control.

2 minutes ago, GoldenLag said:

You are free to add a cheap used 40 Gbit card and use the server as external storage to load everything from. Plug it into a chipset PCIe slot on your main system and you should have pretty high-speed storage.

Who knows what the future will bring... 🙂 Options, options. Not enthusiast enough yet to actually start a homelab where I would want / need this. But yeah, options.

 

3 minutes ago, GoldenLag said:

You can also find fairly cheap U.2/PCIe 2-bit MLC/DLC SSDs, if you ever wanted stupidly high endurance and sustained write speeds. However, I can't recall what such drives are called.

The current system is using 2x M.2 NVMe drives for cache (also the reason I'm running into my issue in the first place); for now that gives me most of the speed I need. But I hear you. For long-term storage I'm using 4TB WD Reds in an array. When unRAID supports a secondary array, I'm planning to upgrade with 2 Purple drives for the CCTV recordings, which are currently buffered to cache before being written to the array. And for the VMs, I'm really just using leftover SSDs I have no other use for currently.

 

6 minutes ago, GoldenLag said:

If you are doing it in a normal-ish case with decent airflow, I'd suggest picking up a 5-pack of fans (whatever size you want, 120mm down to 40mm; 92mm is a nice middle ground) to be able to stick anywhere that might need airflow.

 

Server boards may also not have decent fan control.

Plenty of fans lying around, bit of a hoarder in that regard. But good point about fan control. Maybe I have an old 5.25" fan controller lying around somewhere, so that I at least have 'some' control over the fans. The location the server is currently in is not entirely ideal in terms of temperatures.

51 minutes ago, Caennanu said:

Gday all,

 

I 'recently' discovered that a Ryzen 1700 doesn't fit my needs in terms of PCI-Express lanes for my unRAID server.

Since I need at least 2 video cards (VGA output and en/decoding), I'm running out of PCIe lanes for a 3rd video card to pass through to a VM.

 

You can buy a cheap $5-10 PCIe video card from eBay for VGA output only, and cut the edge connector until you get it down to PCIe x1. PCIe is designed to work like that; with the exception of some devices whose firmware complains, all PCIe devices are supposed to work on as little as a single PCIe lane.

So then you can plug that video card into a PCIe x1 slot and be done with it.

Now you should have the first PCIe x16 slot available for something that actually needs it, and you would probably have the bottom PCIe x16 slot (which is electrically x4) available as well.

A video card that's used for hardware decoding and encoding won't benefit from 16 PCIe lanes or from being connected directly to the CPU... 4 PCIe lanes are more than enough to support multiple simultaneous encodes and decodes of content.

 

Why you can't have one video card doing both VGA output AND hardware encoding/decoding... it's not clear to me.

 

I don't know... it just seems like you're too quick to let go of a perfectly functional 1700, which still has quite a bit of performance.

 

Edit: saw the comments added while I was writing this message... If the current mobo supports bifurcation, you could put an M.2 adapter card in the PCIe x16 slot and have 4 M.2 NVMe slots. Plug the hardware encoding/decoding video card into the bottom PCIe slot, or maybe leave that slot for a 10G Ethernet card.

 

Consider using some basic PCIe x1-to-x16 riser cables to plug the hardware encoder/decoder card into a PCIe x1 slot without having to cut the edge connector. You'll probably find that a single PCIe lane giving you 900+ MB/s is still plenty for the hardware encoders on the card.
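
To put some rough numbers on that (per-lane figure is approximate, and the stream bitrates are just hypothetical examples):

# Back-of-the-envelope check: one PCIe 3.0 lane vs. a few compressed video
# streams. When both decode and encode happen on the card, only compressed
# data crosses the bus.
LANE_MB_S = 985                                           # ~ usable MB/s per PCIe 3.0 lane
streams_mbit = {"4K remux": 50, "1080p": 15, "CCTV cam": 8}
needed_mb_s = sum(b / 8 for b in streams_mbit.values())   # Mbit/s -> MB/s
print(f"~{needed_mb_s:.0f} MB/s of streams vs ~{LANE_MB_S} MB/s on a single lane")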

Gday Marius,

 

I'm aware that you can cut off parts of the slots/connectors to make cards fit, but I don't want to do that, to be honest.

The SAS controller I have needs its 4 lanes; the slot it's in runs at x8 and I have no further control over it, since the 1050 Ti is on the same 'bus' and also running at x8.

From what I understand, the other x1 slots hang off the chipset, which only has 4 lanes available, and those are pretty much taken up by the 2 NVMe drives as is.

 

13 minutes ago, mariushm said:

Why you can't have one video card doing both VGA output AND hardware encoding/decoding... it's not clear to me

Simultaneous passthrough and use by Docker does not currently work. When unRAID boots it grabs a GPU, and I have not found proper control over this to make it run completely headless (and with the instability issues I'm currently experiencing due to oversaturation of the system, I don't really want to at this point either).

So I have the option to assign it either to the Docker engine or to a VM, if it's not actually being used as VGA output for unRAID itself.

 

 

And with this post, I'm simply exploring options, as I seem to have run into a dead end trying to fix my issue in the first place. Which has its own topic too:

Currently this is my internal setup; when adding the 2nd GT 710 (see arrow), my SAS controller goes haywire. So you tell me why, and I'll try to fix it.

What I understood from the other post is that the 4 PCIe 3.0 lanes going to the chipset are turned into 8 PCIe 2.0 lanes downstream, making 4 of those 2.0 lanes available for NVMe and 3 for the x1 slots.
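
If that understanding is right, the rough bandwidth picture looks like this (approximate per-lane numbers, just to illustrate that everything downstream shares the one x4 uplink):

# Rough sketch of the chipset bottleneck described above: a PCIe 3.0 x4 uplink
# to the CPU, with PCIe 2.0 lanes handed out downstream, so the NVMe drives and
# the x1 slots all share that uplink. Per-lane figures are approximate.
GEN3_LANE_GB_S = 0.985
GEN2_LANE_GB_S = 0.5
uplink     = 4 * GEN3_LANE_GB_S    # chipset <-> CPU link
downstream = 8 * GEN2_LANE_GB_S    # 4 lanes for NVMe + the x1 slots
print(f"uplink ~{uplink:.1f} GB/s, downstream devices could demand ~{downstream:.1f} GB/s")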

 

[attached image: internal PCIe slot layout, with an arrow marking the 2nd GT 710]

@Caennanu why not an old dual Xeon E5 first gen? It's cheap used. For example, the Xeon E5-2690 8-core (3.8 GHz single-core turbo / 3.3 GHz all-core turbo) can be found for around $50 per unit. With dual Xeons you get a total of 80 PCIe 3.0 lanes. And used DDR3 ECC Reg sticks are very cheap.

Just now, X-System said:

@Caennanu why not an old dual Xeon E5 first gen? It's cheap used. For example, the Xeon E5-2690 8-core at 3.3 GHz all-core turbo can be found for around $50 per unit. With dual Xeons you get a total of 80 PCIe 3.0 lanes. And used DDR3 ECC Reg sticks are very cheap.

Thank you for the suggestion. The reason I'm looking at Threadripper and Epyc is that I already have most of the hardware.

For example, I cannot use my M.2s on that platform unless I buy an additional PCI Express adapter card, which leads me 'back' to my original problem of having too few PCIe lanes.

With Epyc I know I will have enough lanes for the immediate and longer-term future. The only reason I'm even considering Threadripper is the higher clocks.
