Everything posted by jkirkcaldy

  1. So I am having issues when trying to transfer files between my two servers. One server is a Dell R710 and the other is a whitebox build. Both are equipped with Mellanox ConnectX-2 10GbE adaptors (the only NIC the server can see, all others are dedicated to Hyper-V), both are running Windows Server 2016, and both are plugged into a Quanta LB4M switch (2x 10Gb SFP+, 48x 1Gb RJ45). Transfer speeds from any other PC to either of the servers are fine and seem to be working as expected. The issue is when trying to transfer files between the two servers themselves. I also have an issue now where I can't set up an iSCSI connection between them using the 10GbE connections, but have no problem connecting to other machines using the same connections. It's almost like, whilst the servers can see each other on the network, they can't talk to each other, or there is something stopping them from transmitting too much data. I am sure it's something in the configuration that I have messed up somewhere, or a limitation of the switch (which would be annoying), but I have no idea where to start looking. Any help would be greatly appreciated.
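(If it helps anyone debugging the same thing: a quick way to narrow it down is to take SMB/iSCSI out of the picture and test the raw TCP link between the two servers. Below is a throwaway sketch - Python 3, the port number is arbitrary and needs opening in Windows Firewall, and a single Python stream won't quite saturate 10Gb, so iperf3 is the proper tool if you can install it. If raw TCP is fast but SMB crawls, the problem is in the OS/protocol config rather than the switch.)

```python
# Minimal raw TCP throughput test between two machines.
# Run "python tput.py server" on one box, "python tput.py client <ip>" on the other.
import socket, sys, time

PORT = 5201          # arbitrary test port - open it in the firewall first
CHUNK = 1024 * 1024  # 1 MiB send/receive buffer
TOTAL = 2 * 1024**3  # client sends 2 GiB then stops

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        received, start = 0, time.time()
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            received += len(data)
        secs = time.time() - start
        print(f"{received / secs / 125e6:.2f} Gbit/s from {addr[0]}")

def client(host):
    buf = b"\0" * CHUNK
    sent, start = 0, time.time()
    with socket.create_connection((host, PORT)) as conn:
        while sent < TOTAL:
            conn.sendall(buf)
            sent += CHUNK
    print(f"{sent / (time.time() - start) / 125e6:.2f} Gbit/s sent")

if __name__ == "__main__":
    server() if sys.argv[1] == "server" else client(sys.argv[2])
```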
  2. I have a Dell R710. It's great, but it can use quite a lot of power. Mine sits at about 250W on average, which works out to about £25-30 per month where I am (2x L5640, 48GB RAM, 6x 2TB 3.5" SATA drives). Alternatively, my other server, which I built from a collection of random leftover parts, sits much lower and costs about £10-15 per month (i7-3770, 14x various 3.5" SATA, 32GB RAM). Buying a used server is definitely easier in some ways, i.e. it's much easier to find compatible parts etc., and it can often come with awesome features like iDRAC management, but there are also some pretty compelling arguments for building your own. Like someone else said, building a server is the same as building a normal PC, it's just got a different purpose. You can run 10Gb speeds, but realistically it's pretty difficult to hit them constantly; you will need fast enough drives in a RAID array on both sides. This would all be independent of your home router, though. You can either plug the two computers into each other directly (cheap and not really scalable) or get a switch that is 10Gb capable (more expensive but expandable). As with most things, it really depends on your budget. If you are sitting around the £/$200-300 mark you will get a lot more for your money buying used servers from eBay. If you have a bigger budget you may be better off building something more power efficient and saving in the long term.
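(For anyone wanting to sanity-check their own numbers, the maths is just watts x hours x unit price. A rough sketch - the £0.14/kWh is my assumption, plug in your own tariff:)

```python
# Rough monthly running cost; the unit price is an assumed example tariff.
def monthly_cost_gbp(watts, price_per_kwh=0.14, hours=24 * 30):
    return watts / 1000 * hours * price_per_kwh

print(monthly_cost_gbp(250))  # R710 at ~250W -> ~£25/month
print(monthly_cost_gbp(100))  # lower-power build at ~100W -> ~£10/month
```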
  3. Depending on how deep you want to go, you could look into something like Nextcloud. This is a self-hosted alternative to Dropbox/Google Drive. It installs on top of a Linux OS, so you would likely need a server to run it on; although you could run it on a low-power NUC and keep the storage on a NAS device, that may be complicating things. You can share documents via links or email, protect them with passwords and set download expiration dates. You can give each employee a user account and share files internally between users. You can also integrate open source office software to let people edit Word documents etc. online. A word of warning though: this can be a very involved process, and if you install it - as with anything you install - you become the tech support for it. So if it ever goes down, you will probably have to fix it.
  4. This doesn't really wash with server builds. Unlike a workstation/gaming system, where you just buy this year's most powerful CPU and GPU, slap in as much RAM as you can afford, throw it all on a decent motherboard and you will have spent around 5K, servers can go up to the hundreds of thousands of £££/$$$. Some indication of budget can vastly change the nature of the advice given, e.g. for £500 I'd say go second hand; for £5000 there are a lot more options available to you. Buy second hand - something like a Dell R510 with 12x 3.5" drive bays would be good, although if you can afford it, going for the newer R520 or R720xd would be better on power. Then you have Xeon CPUs, ECC RAM, and there is a shedload of support for them on forums etc. If you want to build for the experience of building a server, InWin have some pretty sexy 2U cases that are kind of like the Dells but without all the proprietary connections, so you just use standard off-the-shelf parts. Or do you already have a chassis? They also have SAS expander backplanes that take SATA drives, so you can throw in a RAID/HBA card and Bob's your uncle. Either way, I would look for a Supermicro board with IPMI support. This can be really useful for troubleshooting when you are out of the house. Look for a board that can hold a lot of RAM, as you will probably need somewhere around 64GB+ for running a lot of VMs. I would also recommend looking at redundant power supplies. They can be quite a bit more expensive than normal power supplies, but they don't use any more power (maybe a watt here or there) and they give you a little more protection from hardware failures.
  5. What's your Storage Spaces setup? I ask because back when I ran Storage Spaces for a while, I jumped in without really understanding them; I just wanted to pool my drives so I saw one big drive instead of loads of smaller ones. I didn't realise at the time, but that setup basically turns your pool into the equivalent of a RAID 0. One of my drives died and I lost everything. If you have a similar setup I would recommend getting a backup solution in place pronto, or doing like you said and getting some drives to keep a second copy. You can get external RAID enclosures, but to get one that's worth it you will be spending a similar amount to a prebuilt NAS unit anyway, which is what I would recommend over the external RAID. Synology and QNAP seem to be fan favourites at the moment. Something I have come to accept is that with computers a lot of things can be done cheaply, but not storage. (I mean you can, but it will be inconvenient and you will lose data eventually.) It's best to think about your long-term goals and invest in something that will see you through the next few years, or at least has a clear upgrade path for when you need it. I.e. if you need 5TB today, buy 10 so you have room to grow. And if you are filling an 8-bay NAS, do it with 3x 5TB drives in a RAID 5, giving you space to add 5 more drives in the future (rough capacity maths below).
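(To make the capacity planning concrete, here's a rough sketch of usable space per RAID level. Simplified: it assumes identical drive sizes and ignores filesystem overhead.)

```python
# Quick usable-capacity check when planning drive purchases.
def usable_tb(drives, size_tb, level):
    if level == "raid0":
        return drives * size_tb                 # striped, no redundancy
    if level == "raid1":
        return size_tb                          # mirror: one drive's worth
    if level == "raid5":
        return (drives - 1) * size_tb           # one drive of parity
    if level == "raid6":
        return (drives - 2) * size_tb           # two drives of parity
    raise ValueError(f"unknown level: {level}")

print(usable_tb(3, 5, "raid5"))  # 3x 5TB in RAID 5 -> 10TB usable
print(usable_tb(8, 5, "raid5"))  # the same array grown to 8 bays -> 35TB
```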
  6. That is completely up to you. You can install Plex on nearly any OS. I run mine on Windows Server because I wanted to play around with other things it can do as well, and I am most comfortable with Windows and Mac and didn't have a spare Mac. But Linux is a good choice too. Or, depending on whether there is anything else you want or need to do, you can install the server on top of FreeNAS or Unraid. Apart from Plex, is there anything else you want the server to do? Do you know how to use Linux at all? Are you looking for a project that would be a great way to fill a weekend? If Windows is what you know and you don't mind buying another licence or have an old Windows box lying around, go for Windows. If you are a Mac guy, go for Mac (but this can be a bit of a pain to expand storage at a later date). If you know or want to learn Linux, install Ubuntu, or Ubuntu Server for a lower-impact OS. If you want a project or to learn something new, go with FreeNAS or Unraid. There is good documentation and there are tutorial videos out there to help with the process. You will get basically the same performance from all of the OSes because they will be running on the same hardware. For lower-spec hardware I'd probably recommend Linux, FreeNAS or Unraid. Something worth noting is that FreeNAS can have some high requirements, i.e. it's recommended that you run 1GB of RAM per 1TB of storage, but there are people on the forums running with slightly less than that. It's also recommended that you run ECC RAM, but that's a whole debate that you can read up on elsewhere. But it's free, whereas Unraid costs about $60.
  7. I think you would be better served by putting a very lightweight OS on the servers and connecting to the drives via SMB/CIFS/NFS or iSCSI, depending on your needs or what's best for the files you're storing. If you install Windows Server on all the physical servers, you can control all of them from a Windows 10 PC or from one of the servers directly, so you wouldn't necessarily need to go into all the servers themselves once you are all set up. I'm not sure if this would require you to set up Active Directory or not. You could look into something like FreeNAS. That runs from a USB pen drive, so you wouldn't lose any of the raw hard drive space to the OS. But if memory serves me correctly it can be very hungry for RAM; I seem to remember something like 1GB of RAM for every 1TB of storage. But I could be wrong.
  8. Yes to all. Plex can handle files with multiple audio streams embedded in the file, and it can read subtitles in the video file too, allowing you to easily select the audio and subtitles you want either before you start playing the video or whilst streaming. You can set the default language, so it will select English audio and forced English subtitles by default, but you can change this like I said earlier.

     Plex can do 4K currently, although it should be noted that 4K is much more taxing on the server running Plex, and not many clients support it. As far as I know (and I'm sure there are probably more), the Xbox One S and the Shield TV will do 4K playback. My 4K LG OLED TV transcodes the 4K to 1080p for some reason, so it may not be supported on my TV.

     In terms of your network bandwidth, if you are using a hardwired connection it shouldn't really be a factor: even a 4K Blu-ray has a maximum bitrate of 128Mbps, just over a tenth of gigabit Ethernet. Wi-Fi is a different matter, and so many factors play into the performance, such as the quality of the router/access point, the position of your device and the router, whether someone is using a microwave, how many devices are connected, and other Wi-Fi networks around you. It can work fine with the right setup, but full 4K Blu-ray quality playback can be a little more challenging; it really depends. I would say that for most 1080p content you should be fine if you have a semi-decent network (see the sketch below this post).

     Playback of 4K media is fine when your device supports it, because it can be direct streamed and won't be any more taxing on your system than an SD direct stream, as it is just serving the file. When you need to transcode 4K, that becomes a whole different beast: the CPU in a NAS or other lower-powered CPUs just won't cut it. If you are going to be building something as a server and 4K is something you want to be ready for without upgrades, you should read up on hardware transcoding for Plex. With newer generations of processors from Intel (not sure about AMD) you can use the built-in GPU on the CPU to give a substantial performance boost, which should help with 4K transcoding.

     Alternatively, one of the other cool things Plex will do is optimise your media. So you can have a full 4K copy of the film, then before you go to bed one night, or when your server is quiet, tell Plex to optimise the file for playback on another device (you can change the settings of this too) and it will create a 1080p/720p/480p copy of the file. Then you can select the 4K quality for your 4K TV, and for anything else you can use the lower, easier-to-play versions.
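(To put the bandwidth point in numbers, a rough sketch - the Wi-Fi figure is a guessed real-world number, not a spec, and protocol overhead is ignored:)

```python
# Worst-case simultaneous streams per link, ignoring protocol overhead.
LINK_MBPS = {"gigabit ethernet": 1000, "real-world wi-fi (assumed)": 200}
STREAM_MBPS = {"4K Blu-ray max": 128, "1080p Blu-ray max": 40}

for link, cap in LINK_MBPS.items():
    for stream, rate in STREAM_MBPS.items():
        print(f"{link}: ~{cap // rate} x {stream} streams")
```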
  9. Transcoding on the fly is useful for a number of reasons. When you set up Plex you can stream from your NAS to a local device just to watch your media, but you can also stream to pretty much any device, anywhere with an internet connection. Having the ability to transcode on the fly means that if you wanted to watch an MKV file on your iPhone, for example, normally you would need to transcode the file to an .mp4 or a .mov; with Plex it will transcode to these formats as you are watching the stream. (This means you can just remux your Blu-rays to an MKV container and have the full-quality file play on any device, rather than compressing it.) This may be less of an issue if you have only a handful of movies and will be transcoding them from Blu-ray to mp4 anyway, but when your collection starts getting as big as mine (500 movies and over 5000 TV episodes), you're not going to want to transcode all of that media to device-friendly codecs, especially as there isn't always one codec that plays nicely on all devices. Plex doesn't always transcode: if your device can play the codec of the video, it will just stream it as is, but if your device doesn't support it, it will transcode automatically. You don't need a really powerful CPU for Plex either; it is supported on loads of NAS boxes and even a router from Netgear. So if you build/buy a NAS and are planning on converting everything beforehand anyway, you can get something with a low-powered CPU.
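(For reference, remuxing - unlike transcoding - just swaps the container, so it's quick and lossless. A minimal sketch, assuming ffmpeg is on your PATH and the streams inside are MP4-compatible, e.g. H.264 + AAC:)

```python
# Remux an MKV into an MP4 container without re-encoding.
import subprocess

def remux(src, dst):
    # "-c copy" copies the existing streams as-is: seconds, not hours
    subprocess.run(["ffmpeg", "-i", src, "-c", "copy", dst], check=True)

remux("movie.mkv", "movie.mp4")  # example filenames
```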
  10. No, I don't think you did. It's more that I have seen this question pop up a few times and DIY is always suggested. If places like DDP are still well over budget (which I think they could be for 300TB), check out Indistor. They may have something that could suit your needs. TL;DR - DIY is great, but it has a time and a place, and professional video editing is usually not it. DIY is fine in and of itself; it's just that the advice that often comes with that suggestion tends to be off. There is a very big difference between creating a SAN for a homelab and creating a SAN for a business. And as much as I love the videos where Linus and the team show off their servers, and as jealous as I get at the ridiculous power they have at their disposal, I'd probably categorise their setup as a homelab (an amazing setup that I'd give a kidney for) rather than a setup I would recommend anyone try to replicate. Which is why I think people often suggest stupidly fast drives are needed to edit from. I can't think of many situations where editors would need a server with 48 NVMe SSDs to edit from; even 10Gb (from server to workstation) is overkill in most situations. I often wonder if they would go down a similar route if they weren't sponsored by Intel, or didn't have the relationships with manufacturers that they can use in place of tech support. That being said, Premiere is a PITA on a 1Gb network, and it can be a PITA on a 10Gb network too. It's just the way it works; it's just not designed to run on a network drive. It's getting better as they take more and more market share, and you can see this in the way they are trying to implement things like sharing projects between multiple editors at the same time. So in a year or two this may be a very different conversation.
  11. It depends on what you want. If you want network storage, then building/buying a NAS with or without RAID is fine, but whilst you may be protected from a drive failure, it's still not a backup. Then again, if you want something that is just going to replace Google Drive, that's not really a backup either, so you'd be in exactly the same position; the difference being you would be responsible for the hardware and infrastructure. You could build your NAS with a couple of 2TB drives in RAID 1 for hardware failure protection, then buy an external 2TB HDD and back up your NAS to it. Then you have both protection from hardware failure and a backup. If your backup disk is formatted as NTFS or exFAT or HFS, then if your NAS dies you can plug in your drive and still have access to your data.
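(The backup itself can be as simple as a scheduled one-way mirror. A bare-bones sketch - the paths are made-up examples, and a proper tool like robocopy or rsync handles deletions, retries and logging far better:)

```python
# One-way mirror from a NAS share to an external drive.
import os, shutil

SRC = r"\\mynas\share"   # hypothetical NAS share
DST = r"E:\nas-backup"   # hypothetical external HDD

for root, dirs, files in os.walk(SRC):
    target = os.path.join(DST, os.path.relpath(root, SRC))
    os.makedirs(target, exist_ok=True)
    for name in files:
        src_file = os.path.join(root, name)
        dst_file = os.path.join(target, name)
        # copy only files that are new or newer than the backup copy
        if not os.path.exists(dst_file) or os.path.getmtime(src_file) > os.path.getmtime(dst_file):
            shutil.copy2(src_file, dst_file)
```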
  12. You can run Hyper-V as a bare-metal install, where it is basically its own operating system, or you can run it on top of a normal Windows installation. If you install Hyper-V as a bare-metal install it is free, but if you want to install it on Windows you will need to buy a Windows 10 Pro licence or a Windows Server licence. The upside of Hyper-V is that if you have a Windows 10 Pro machine already, you can install Hyper-V for free and either install virtual machines on that PC to test, or connect it to your Hyper-V server and use it to view and control the VMs on the server, even use it as a console KVM to remote into a VM. I personally favour Hyper-V, but I am running a Windows Active Directory network so it fits in well. My hardware also did not support ESXi, as you need to have a compatible NIC and I didn't.
  13. Here it is: https://www.ddpsan.com/ I agree. I am all for building servers and getting the most for my money, but if my work asked me to build a server I'd run a mile. It's the same with workstations. I love this type of conversation as much as the next guy/gal in here, but there is a time and place for it. If the OP had a budget of £5000 then it would be a different story and the advice would be very different. You should sit down and think about your workflow before spending any amount of money on a SAN/NAS. There are many questions being asked here that could quite possibly lead you down the wrong path. 300TB in a proper SAN for video editing will probably be out of reach, but do you need to have footage from 5 years ago online and ready immediately? Designing a good workflow with the right equipment will probably be more valuable in the long run than "cheaping out" now in order to get more "bang for the buck". Speak to the vendors before spending a penny. You don't necessarily need to buy their products, but it may give you a better understanding of how realistic your expectations are. They can probably put together some sort of deal as well.
  14. These questions always get me worked up, as a lot of the information in the replies can be very misleading. If this is for a business, and you rely on being able to edit and deliver videos on a deadline, buy video editing equipment. Don't build your own, and definitely don't build your own storage system. Don't get me wrong, gaming hardware is great and the performance-to-price ratio these days is amazing. But when something goes wrong, being able to call various support teams will be invaluable. (Also, for the company I work for there are specific requirements that the insurance company says we must meet.)

      If you have a budget of 50K, go and speak to the vendors who specialise in video editing network storage. Look at companies like DDP (Dynamic Drive Pool) or EditShare. The problem with building your own SAN or NAS is that this is for business use! If you are paying editors to come in and work, and something goes wrong, you need to be able to get everything back up as soon as possible. If you are your own support, you're going to run into problems; editors are not cheap.

      It also depends on what editing software you use. Premiere can be a nightmare on network storage for anything longer than a short video. Avid is the industry standard and is designed to run on a network, so I would recommend looking into Avid; you can get a subscription for the same sort of price as Adobe these days. One caveat with Avid, though, is that you need to be working on approved machines: if you call for support, the first thing they will ask is whether you are using an Avid-supported machine, and if you say no, they tell you to use an Avid-supported machine.

      The workstations do not need 10Gb. You will likely need 10Gb from the SAN to a switch, but after that 1Gb should be plenty. I work in broadcast television in the UK. In the production company I work for currently there are up to 10 editors working at once. All the footage is transcoded to XDCAM at 50Mbps and the TV shows are all around 45 minutes long. Each edit machine is working off a single slow mechanical hard drive, there are no SSDs anywhere in the workflow, and apart from the fact that the machines boot fairly slowly and it can take a couple of minutes for Avid to open in the mornings, there is no noticeable bottleneck.

      You may want to think about some sort of workflow that means you don't have to have everything on your storage machine all the time. Backing up to tape is a good plan. Alternatively, for video you should look at Sony's ODA (Optical Disc Archive) machines. They work the same way as tape in a lot of ways, but they work via USB 3 and can be read and written to by any machine with no special software, as they appear as an external disk.
  15. You should check out the video JayzTwoCents just released about bottlenecking. Overwatch isn't particularly hard to run, but without hyperthreading, 2 cores may be a little too little to work with. Not to mention that you would likely need to run three virtual machines (not at the same time): one for you and one for your brother, each with 2 cores, a GPU and probably around 6GB RAM (you will still need some for what I assume will be Unraid), and then a third VM for when you are playing alone, giving yourself all your cores and RAM back. Alternatively, you could dual boot, with Unraid on one drive and Windows on another, then use the BIOS boot drive selector when you boot up to choose between them. TL;DR - You can probably get this working and get a passable 1080p experience, but you will take a fair performance hit, not to mention the technical headache it will be to troubleshoot if/when something doesn't work. (Linus has a special relationship with the guys at Unraid, so he can call them and speak to their engineers to troubleshoot. You probably can't do that.)
  16. That site was very helpful. I think I identified the big one as SCSI-3 or HD68, and the smaller as SCSI-5 or VHDCI.
  17. Hey guys, I just got hold of a rackmount tape drive for dirt cheap, but I'm struggling to identify some ports. I know that they are SCSI ports (it's LTO-4, so fairly old now) but I just wanted a second opinion. I think that the bigger port is an HD68 - SCSI-3 connection, and maybe the smaller is an HD50? Can I get an adapter to connect the two? Here's the cable for the PCIe card (the smaller of the two connectors). Thanks. P.S. Not sure if it matters, but I'm in the UK.
  18. A lot of NAS devices will have some sort of option to access your data across the internet; some are better than others. My only experience with a 'proper' NAS device was a QNAP system, and the way that worked was it went through QNAP's servers and then to your device, so they didn't really want people hammering the connection. (I'm sure there were other ways of setting this up, but it was a production device in a video editing environment, so even connecting it to the internet so I could download individual clips was not really ideal.) If you are going DIY, or if you are looking at the more expensive NAS units from QNAP/Synology etc. - the ones with proper CPUs and a decent amount of RAM (the QNAP we were using was an i7 with 16GB RAM) - then you could look into an Ubuntu VM running Nextcloud. Nextcloud is a free and open source application that you host yourself. It works much like Dropbox and Google Drive, with a web interface that you can use to download and look through your files, and it even offers mobile and desktop clients to automatically sync your documents across different devices. You can also scale this from whatever you have lying around to multiple TB, depending on your needs. For this to work you will need a domain name, a Linux VM, a lot of patience and some general idea of how to get around Ubuntu. This guide from DigitalOcean is pretty good for getting you started: https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-nextcloud-on-ubuntu-16-04 You will also probably want to be running your own DNS server inside your network. This can be a pretty complicated install, but if you are looking for a weekend project then look into it; it's pretty fun and interesting, and pretty rewarding when it all works properly. A fast internet connection is recommended, but it depends on what you're using your cloud for. If you are using it to store massive video files you are going to need a good connection; if you are just storing Word docs and backing up the pictures from your phone, you won't need too much. (Especially as, when you are uploading away from home, the server receives on your home download connection and you are limited by the upload speed of whatever local connection you're on.)
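(Once Nextcloud is running, you can also script against it over WebDAV, which is handy for automated uploads. A sketch only - the URL, username and app password below are placeholders, and it assumes the requests library, i.e. pip install requests:)

```python
# Upload a file to Nextcloud over its WebDAV endpoint.
import requests

BASE = "https://cloud.example.com/remote.php/dav/files/alice"  # placeholder server/user
AUTH = ("alice", "app-password-here")                          # placeholder credentials

with open("report.docx", "rb") as f:
    resp = requests.put(f"{BASE}/Documents/report.docx", data=f, auth=AUTH)
resp.raise_for_status()  # 201 Created on first upload, 204 No Content on overwrite
```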
  19. Yeah, it's really thrown a spanner into my backup solution. I'm thinking about creating a VM running Windows 7, or maybe Ubuntu if it's supported, and running Backblaze on that for file backups. I just took delivery of an R710 as well, to replace another one of my old DIY server builds. Nothing fancy: a Pentium CPU with 8GB RAM and 12TB raw storage. I'm now planning to turn this from RAID 1 with 6TB storage into JBOD, or maybe RAID 5 (or a RAID 5 equivalent with two data drives and one parity drive), stick it in my parents' house 100 miles away, and use that to back up all my important files (no Plex media etc.). That should be fine, as I only have about 2-3TB of data if you exclude the Plex stuff.
  20. I have a spare set of 16GB RAM (known good) lying around, so I will swap that in and see if it's more stable. Then I can test her memory in my spare system at the same time.
  21. A bit more info from my friend:
  22. I have a feeling that it could be a combination of a few things. To be honest, I'm not sure that she has even bothered with things like chipset drivers or anything like that. I should be getting a new server delivered today, so once I migrate my old server I can pull the PSU from that and see if it makes any difference, update all the drivers (including all the mobo drivers etc.) and put it through a FurMark stress test for a couple of hours to see how it does.
  23. Hey, I wonder if you guys can help me troubleshoot my friend's PC. She put this system together herself; well, a friend of hers who I don't know built it for her. I can't remember the exact parts off the top of my head, but it's an AMD GPU and an Intel CPU with 8GB RAM (probably DDR3; I think we are looking at Haswell-generation CPUs). She has had non-stop problems with the system: whenever she plays a game, she gets about half an hour in and then the game crashes and she has to relaunch. This has been going on a while. She thinks it's her GPU, but I'm not sure. She has done a clean install of Windows and updated her graphics drivers, and this seemed to help a little. Any ideas on how I can/should test and troubleshoot her system to see what's causing the crashes? Whenever I try to help her with it, the system works flawlessly, then as soon as I leave it's back to its old tricks again. Any ideas would be greatly appreciated. Thanks
  24. Should be possible in theory, but I think you would be better served by making the gaming servers individual machines that are hidden away in a (well ventilated) rack or cupboard, then using Steam streaming or something similar to play the games on lower-powered machines. These client machines don't need to be very powerful at all; they just need a semi-decent CPU (a fairly recent i3 should be fine), as they only need to decode the H.264 stream from Steam. I will be looking into virtualising a gaming PC though. I have a gaming PC and two of my flatmates have gaming PCs, but my girlfriend and another friend both own Macs, so they can't really join in. So my plan is to create a fairly able gaming VM so they could stream games from the server to their Macs and join in on games like Ark, and maybe a couple of other games that don't depend too much on low latency. Use what you have now: install a hypervisor and test to see if and how you can get GPU passthrough working and what the performance is like; then you can see whether or not it was a massive PITA, and if it's worth the effort to achieve what you want.