jkirkcaldy

Member
  • Content Count: 109
Everything posted by jkirkcaldy

  1. Install the WireGuard plugin on the unraid box and forward the port from your router to unraid. That's what I do to gain access to my unraid system away from home. I also have a VPN on my router as a backup in case my unraid system crashes or needs to be rebooted. Just a word of warning on Nextcloud: once your files are in it, you can only access them from within the Nextcloud system, i.e. you can't have an SMB share with your files and have them available in Nextcloud.
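As a rough sketch of the server side of that setup (the addresses, keys, and port here are placeholders for illustration, not taken from the post), the unraid end of a WireGuard tunnel looks something like this, with the matching UDP port forwarded from the router to the unraid box:

```
[Interface]
# unraid end of the tunnel (placeholder address/key)
Address = 10.253.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# the phone/laptop you connect from (placeholder key)
PublicKey = <client-public-key>
AllowedIPs = 10.253.0.2/32
```

The port forward on the router then sends UDP 51820 to the unraid box's LAN IP.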
  2. You can't use port numbers in DNS entries as far as I am aware. But if I am wrong, please send a link to where you can because I would like to see that.
  3. That doesn't really work, not unless you add the port to the domain (e.g. plex.example.com:32400), in which case there is literally no point using different subdomains, because plex.example.com:32400 would go to the same place as ts.example.com:32400. Also, if services need specific ports to be open, those ports will need to be open regardless of whether you use a reverse proxy or not.
  4. A caching server is a good way to go about this, probably the only way really. As far as I know, you can't use the same files hosted on a server to serve the game to multiple client PCs, so you would still need to download the game onto each PC at the LAN party. The caching server can take some of the strain off your internet connection; you would need a dedicated machine to act as the cache. Basically, you would download PUBG on your computer and the server would save the files as they came from Steam. Then, in theory, anyone else who needs to download the game on your network would pull the files from your server rather than from the Steam servers. With the correct setup you can get some stupid speeds from the cache server. A word of warning though: I tried to do this with a virtual machine, and you need to dedicate a large amount of resources to the caching server to get really fast download speeds.
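For reference, the usual way to run this kind of cache is a lancache-style container plus a DNS trick that points the game CDN hostnames at the cache box. A minimal sketch (the LAN IP is made up, and you should check the lancache documentation for the current image names and options):

```
# docker-compose.yml -- assumes 192.168.1.10 is the cache machine's LAN IP
services:
  monolithic:
    image: lancachenet/monolithic:latest
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./cache:/data/cache
  dns:
    image: lancachenet/lancache-dns:latest
    ports:
      - "53:53/udp"
    environment:
      LANCACHE_IP: "192.168.1.10"
```

Clients (or the router's DHCP settings) then use the cache box as their DNS server, so Steam downloads get redirected through the cache.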
  5. Depending on how comfortable you are with PC hardware and building your own machine, you may be better off building something rather than buying. I built my first Plex server for about £250. There are some advantages to buying a NAS, but unless you spend a little more you will be missing out on the more advanced features. I started off with a Pentium dual core and 8GB RAM (RAM was cheap when I built my machine). The idea was that I could scale up various parts as and when my needs grew, and since then I have gone from a small 4TB tower PC to a rackmount beast with 52TB (40 usable). You just don't have that scalability with a store-bought NAS. Also, unless you are spending money on a very good CPU, get the idea of transcoding 4K footage out of your mind. You can optimise the media ahead of time, but I - as well as most people, I think - have found it is generally best to keep both a 4K version and a 1080p version. Then if you do need to transcode, you can do so from the 1080p copy and have a much better time. Most 4K files also have HDR colour toning, and as soon as you transcode you will lose it and your file will look very washed out and flat.
  6. A reverse proxy is what you're after. I have a single static public IP address and a load of services running on different virtual machines. Here's a brief list of some of them: Plex, Organizr, Monitorr, Sonarr, Radarr, Ombi, Nextcloud, Hastebin, GitLab, Home Assistant, and a BookStack wiki. There are probably more, but you get the idea. I have a single VM dedicated to being the reverse proxy, so all my traffic is forwarded to that internal IP address, and it then separates the traffic out and forwards it to where it needs to go. It can be quite a daunting task at first, but it gets easier as you go, and there are a shed load of tutorials online.
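To make that concrete, the reverse-proxy VM just maps hostnames to internal IP:port pairs. A minimal nginx sketch (the hostnames and backend addresses are invented for illustration, not the poster's actual setup):

```
# /etc/nginx/conf.d/proxy.conf -- all port-80 traffic from the router hits this VM
server {
    listen 80;
    server_name plex.example.com;
    location / {
        proxy_pass http://192.168.1.20:32400;  # Plex VM
    }
}

server {
    listen 80;
    server_name ombi.example.com;
    location / {
        proxy_pass http://192.168.1.21:3579;   # Ombi VM
    }
}
```

nginx picks the server block by the Host header of the request, which is why every subdomain can share ports 80/443 externally while landing on different internal machines.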
  7. Depending on how much of an issue aesthetics are for you, you could always buy a rack-mount case full of hot-swap bays, take the rack ears off, and store it on its side. They are roughly the same sort of size as a desktop case, though usually a little longer. Most rack-mount cases look alright from the front but are usually bare metal on the sides, as you wouldn't normally see the sides in a rack. But I got a rack-mount case for my white-box server long before I ever got a rack to mount it in.
  8. The problem with workstation equipment is that it often doesn't leave much room for expansion. The HP Z series workstations, for example, only have space for something like 5 drives in the 840 series. So whilst they are very powerful and very quiet, you can only add a few more drives, compared to rackmount equipment where you can fit 24+ drives in a case of a similar size.
  9. You need to include a bit more information. Are you hosting this at home? What level of redundancy are you aiming for? Do you want full redundancy? That will cost way more money than you think, as it involves a minimum of two of everything: 2x internet connections, UPSes, generators, a minimum of two servers, etc. What's your budget? Once we have a better understanding of what you are trying to do and what sort of tools/budget you have to work with, we can help more than we can now.
  10. I use Transparent RAID from FlexRAID. From what I can gather, it works in much the same way as unraid in terms of storage, but it runs on Windows, as well as other platforms. I have 40TB of usable space with 2x 6TB parity drives, although I'm considering removing one of those as I need the storage space. You can pool random disks from different vendors and of different sizes, delete and recreate the array with no data loss, and add a write cache using SSDs if you want. It's worked well for me for a couple of years now. Having said that, in that time I haven't lost a disk, so I'm not sure what the rebuild process is like. But because everything is stored on normal NTFS drives, you can pull the drives out and read them on another computer.
  11. I think on newer Hyper-V you can pass through a graphics card, but you need to pass the entire card through to a single VM, so you would need one GPU per VM.
  12. I ran an Ark server on a Windows 10 virtual machine with 4 cores and 8GB RAM. It was only for a handful of friends, so it wasn't a huge server, but it wasn't that resource intensive. The VM was running on Windows Server 2016 using Hyper-V; the hardware was a Dell R710 with low-powered Xeons, L5630s or something similar, I think. Ran perfectly.
  13. The same reason people go to Best Buy or PC World and buy a desktop computer there: most people aren't comfortable building their own systems, and buying pre-built usually comes with some sort of support, as well as a single place to return the hardware if there is a fault or to deal with the warranty.
  14. If he's an indie filmmaker I'm going to assume he mostly works/edits as a one-man team, in which case I wouldn't get a NAS; I would get a DAS. If a NAS is needed, QNAP do one that I have used on a feature film before, which had 2x 10GbE RJ45 and 2x Thunderbolt 2, as well as a couple of USB3 ports and 1GbE. I can't remember the exact model as it was a couple of years ago now. It was about $2500 for the NAS, and the disks were extra. It had 8x 8TB disks, and in a RAID array it had a throughput of around 700Mbps.
  15. There is also Hyper-V Server, a standalone OS you can install for free. The catch is that realistically you need another Windows PC to manage it. But it is completely free: no limits, no trial, just free.
  16. This ^ You can run a web server on a Raspberry Pi with little difficulty. Even with a larger website you don't need that much power. You could buy a Celeron or Pentium, which would save you far more than $10 compared with the 1800X. You need to say what you actually want the web server to do: how many visitors are you expecting? Are you going to need a huge SQL database? Is it just for media streaming? Any advice without that information is useless.
  17. Grafana and Telegraf. There's a bit of a learning curve, but you can monitor everything from one web page. This is my dashboard, for some useful info: https://grafana.themainframe.co.uk
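For anyone starting out with that stack: Telegraf collects host metrics and ships them to a database (commonly InfluxDB), and Grafana graphs them. A minimal telegraf.conf sketch (the InfluxDB URL and database name are assumptions, not the poster's setup):

```
# telegraf.conf -- collect basic host metrics every 10s
[agent]
  interval = "10s"

[[outputs.influxdb]]
  urls = ["http://192.168.1.30:8086"]  # your InfluxDB instance
  database = "telegraf"

[[inputs.cpu]]
[[inputs.mem]]
[[inputs.disk]]
[[inputs.net]]
```

Grafana is then pointed at the same InfluxDB instance as a data source and the dashboards are built on top.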
  18. The problem with benchmark tests like these is that they usually look great because the files they are writing and reading are all still on the SSD. If you read a lot of files shortly after writing them, you will see a huge increase in performance, like you did above. But I would wager that if you copied a large file onto the array and immediately copied it back, recorded the results, then waited until the file was no longer on the cache and copied it off the array again, you would see far less impressive numbers. Still, if it's as much a technical exercise as a practical one, and you can spare the SSD, then it's worth it.
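As a generic illustration of the caching effect described here (this is a rough sketch, not the benchmark tool from the thread): timing writes without forcing them to disk mostly measures RAM, which is the same reason quick benchmarks flatter a cache.

```python
import os
import tempfile
import time

def write_throughput_mb_s(size_mb=64, chunk_mb=4, fsync=True):
    """Write roughly size_mb of data to a temp file and return MB/s.

    With fsync=False the figure mostly reflects the OS page cache
    (RAM), the same effect that inflates read-back-after-write tests.
    """
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    writes = max(1, size_mb // chunk_mb)
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(writes):
                f.write(chunk)
            f.flush()
            if fsync:
                os.fsync(f.fileno())  # force the data out of RAM onto the device
        elapsed = time.perf_counter() - start
    finally:
        os.unlink(path)
    return (writes * chunk_mb) / elapsed
```

Comparing the fsync=True and fsync=False numbers on the same machine usually shows a large gap, which is the cache effect the post is warning about.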
  19. No, H265 is still way too CPU-intensive for my CPU; I keep most things as H264 Blu-ray rips. That's always the problem when people ask what specs they should use for a Plex server: there are so many variables that there isn't a one-size-fits-all answer.
  20. I don't think that's good advice. I'm running a Plex server with an Intel 3770 and that works perfectly as a Plex media server; it can transcode around 5-6 streams before the CPU is the bottleneck. Everything else I would agree with, though: use the E3 quad-core for gaming and the 10-core for a media server. There is also no reason you couldn't run some VMs on the box with the E5. Plex doesn't really need that much RAM. I have no idea about the system requirements for Blue Iris, but I think you could easily allocate 48GB RAM and a few cores to the VM. The way I would set it up is:
     System #1 - Server 2016/Hyper-V, running as the NAS/backup server:
     1 VM for Plex/Handbrake: 8-10 vCPU, 4GB RAM
     1 VM for Blue Iris: 4 vCPU, 4-8GB RAM (depends on how many cameras)
     That leaves you at least 52GB RAM for various other VMs (don't forget you can over-provision your CPU; it's not a 1:1 allocation ratio like RAM).
     System #2 - as you have it. You could easily have a few VMs running whilst gaming and not see much of a performance hit (depending on the game/workload).
  21. I wouldn't bother adding an SSD, especially as you will only really see a performance increase in write speeds, and unless you are backing up loads of clients at once, the speed of a normal mechanical HDD will be fine.
  22. Everyone always talks about transcoding 4K; that just seems so insane to me. Until Plex can transcode 4K > 4K, there is no point in transcoding it; you would be better off with two separate files. And if the idea of that puts you off because it will use too much storage space, you shouldn't be looking at a 4K server: those files can reach 100GB per 2-hour film in some cases. As it stands, as soon as Plex transcodes a 4K file (which is usually HDR) it will downscale it to 1080p and you lose the HDR. So I would look at whether your client can direct stream. If it can't, it doesn't matter how powerful your system is; it will look like crap. A 1080p remux will look 100 times better than any 4K transcode.
  23. My two cents: I work at a TV production company in England, so I have a fair bit of experience with what you are trying to do. First of all, as has been said, do not try to edit remotely. It doesn't work; there is too much latency and compression involved to make it worthwhile unless you have specific infrastructure in place (I'm talking dedicated lines from your ISP, which I can guarantee you don't have now, as they cost thousands per month). Now onto the solution: you could have a central file server from which your editors download the footage, if your upload speeds are any good. But honestly, with €600 I think you are going to struggle to buy anything new. I would check eBay and pick up a used workstation; have a look at the HP Z820, Z620, Z420 or newer.
  24. Tried all that. I have a feeling it's a dead PSU. My next plan of action is to plug a normal ATX PSU into just the motherboard to see if it will POST; at least that way I can see where the problem lies. But if it is a power supply issue, I think that will be the end of it, unfortunately. The replacement PSUs are around £350 each, so it would be over £1k to replace all three, and I don't think they will be willing to spend that sort of money on getting it fixed. Unless I can replace one PSU with another brand, as long as all three are the same model?