
AceScottie

Member
  • Posts: 11


AceScottie's Achievements

  1. How well does this run as a VM? I use a standard cloned Windows 10 VM all the time for temporary system access at work, and allocating 4GB of RAM to each VM just for RDP, Chrome and a small application is extremely overkill, but anything less stutters and lags too much for it to be useful.
  2. I'd really like to see an implementation of this system that doesn't lean on networking infrastructure, removing the Ethernet bottleneck. I had this same idea a while ago too, but lack the funding to really make it a reality. Whether it's possible yet I don't know, but if you could get an 8 or 10 port USB 3.0 dock along with 8 or 10 GPUs then you could try a different type of solution. Firstly, the Belkin B2B122-BLK USB 3.0 Dual DVI Display Video Dock: I bought this device to go along with the research and it works really well. It would replace the thin clients you are currently using, and it has support for audio, dual DVI (1 DVI-I and 1 DVI-D), an Ethernet port and some USB 3.0 ports. When testing the device I used all the USB 3.0 ports for mouse, keyboard and headset, the Ethernet port on a 1Gbps LAN connection, and both DVI ports (one DVI to DVI and one DVI to HDMI). After a full day of using it I honestly could not tell the difference, regardless of what task I was doing. If you could pair this with some USB 3.0 optical cables you should be able to get 30m-50m runs, which would not only remove the huge strain on your existing network, it would also stop the onboard NIC of the server board having to deal with 8-10 virtual machines all streaming at the same time and dragging down system performance. The cables are pretty expensive, but tbh if you take the cost of the thin clients and the 10Gbps switches/networking stuff it's not huge (still pretty costly though). USB 3.0 Active Optical Cable, 50m: $499.00. A rough bandwidth comparison is sketched just below this post.
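A minimal back-of-the-envelope comparison of the bandwidth involved, to illustrate the NIC-bottleneck point above. It assumes uncompressed 1080p60 at 24 bits per pixel and nominal link rates with no protocol overhead (real RDP/VNC/dock traffic is compressed), so treat the output as a ballpark only (Python):

    # Rough bandwidth estimate: can one uncompressed 1080p60 desktop stream fit
    # down gigabit Ethernet vs. a dedicated USB 3.0 link? Figures are nominal
    # link rates with no protocol overhead, so this is a ballpark only.

    def stream_gbps(width, height, fps, bits_per_pixel=24):
        """Raw (uncompressed) video bandwidth in gigabits per second."""
        return width * height * fps * bits_per_pixel / 1e9

    links_gbps = {
        "Gigabit Ethernet": 1.0,    # nominal
        "10GbE": 10.0,              # nominal
        "USB 3.0 (5 Gbps)": 5.0,    # nominal SuperSpeed rate
    }

    per_client = stream_gbps(1920, 1080, 60)   # one 1080p60 desktop
    print(f"One uncompressed 1080p60 stream: {per_client:.2f} Gbps")

    for name, capacity in links_gbps.items():
        print(f"{name}: ~{capacity / per_client:.1f} uncompressed streams per link")

On those assumptions a single raw desktop stream is roughly 3 Gbps, which is why a shared 1Gbps NIC chokes on 8-10 seats while a dedicated 5Gbps USB 3.0 link per seat still has headroom even before compression.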
  3. So, just thought I would post an update here, as a lot of stuff has happened since I started this thread. For starters I went and got a Belkin USB 3.0 video dock (2x DVI and 3x USB 3.0, plus headphone and mic inputs) and gave it a quick test drive. The results are really good: playing games is perfectly fine and there seem to be no issues at all with lag/latency from using the dock (playing CS:GO, LoL, Just Cause 3). I attempted to copy the setup over to my server (with some issues using unRAID, which I started another thread for). As for the dock itself (link: http://www.belkin.com/us/p/P-B2B122/ ), it works well after the initial setup. There are some issues with the plug-and-play drivers not working, but I managed to find some "third party" drivers (basically Belkin use their own drivers for hardware they got from DisplayLink and call the DisplayLink drivers third party), and after that everything works really well. Chances are it will reset your desktop layout and resolution the first time you plug it in, but apart from that I forgot I was using it, even when I booted up Just Cause 3 (a game my PC can barely handle normally). The USB cables, however, are another story. To go above the standard 3.3m range offered by standard cables you have to switch from copper to fibre connections, which costs an average of £180 for a 30m cable. This is expensive, however from a cost-to-cost standpoint (PC versus server) it's actually quite minimal; a rough per-seat comparison is sketched just below this post. I plan to release my full report, which covers not only the setup cost but also the long-term running cost of using servers rather than PCs.
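A rough per-seat sketch of the cost comparison made above. Only the £180 figure for a 30m optical USB cable comes from the post; every other price below is a placeholder assumption used to show the shape of the calculation, not a real quote (Python):

    # Back-of-the-envelope per-seat cost comparison: dock over optical USB vs.
    # networked thin client. Replace the placeholder prices with real quotes
    # before drawing any conclusions.

    SEATS = 8  # example seat count (assumption)

    dock_based = {
        "USB 3.0 dual-DVI dock": 100.0,        # placeholder, not a quoted price
        "30 m optical USB 3.0 cable": 180.0,   # figure quoted in the post
    }

    network_based = {
        "thin client": 150.0,                      # placeholder
        "10GbE switch share (per seat)": 120.0,    # placeholder: switch cost / seats
        "10GbE NIC share (per seat)": 40.0,        # placeholder
    }

    def per_seat(parts):
        return sum(parts.values())

    print(f"Dock + optical USB per seat:  £{per_seat(dock_based):.0f}")
    print(f"Thin client + 10GbE per seat: £{per_seat(network_based):.0f}")
    diff = (per_seat(dock_based) - per_seat(network_based)) * SEATS
    print(f"Difference over {SEATS} seats: £{diff:.0f}")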
  4. Hello all. I'd like to start by saying I currently do not have access to the equipment, logs or other information which might help in this situation. This is a uni project and it is currently a holiday, so the place is shut down for a couple of weeks. I have an HP ProLiant DL585 G6 server with which I basically wanted to copy the 7 Gamers, 1 CPU project a little bit. I am writing a report which considers the cost-to-performance aspect of such a project using pure server-grade hardware. So the project is simple enough; however, there is an issue with SAS controllers that I don't quite understand. Basically I have 5 SAS drives (in 2 RAID 0 arrays), all connected via the SAS backplane to the SAS controller. This works fine for operating systems like Windows and Linux, however unRAID cannot detect the drives in order to assign them. I have tried deleting the RAID, however the SAS controller just defaults back and re-creates each drive as its own single-drive RAID 0. So, my question: has anyone had this issue of SAS drives on a server not being detected by unRAID, and what steps did you take to resolve it? Additional notes: the first 3 drives (in one RAID 0) are for a separate project and have CentOS plus a lot of configuration on them, so I can't delete that RAID. I can't provide any logs or configs, however I do know (by running lspci) that the SAS controller is being detected by unRAID. I'm not looking for exact solutions, simply some things I can take a look at and try (see the quick diagnostic sketch just below this post). I am extremely limited in the time I have access to the equipment (less than 7 hours left before the deadline).
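One quick thing to try from the unRAID console: check whether the kernel is exposing any block devices from the controller's logical volumes at all. This is a minimal sketch, assuming a standard Linux sysfs layout (unRAID is Linux-based); device naming can differ for HP Smart Array style controllers (Python):

    # List block devices with their controller-reported model and size.
    # If the single-drive RAID 0 volumes show up here but not in the unRAID
    # web UI, the issue is likely the array-assignment layer rather than the
    # driver; if they don't show up at all, look at the controller mode
    # (RAID vs. HBA/JBOD, where available) or the driver.

    import os

    SYSFS = "/sys/block"

    def read(path):
        try:
            with open(path) as f:
                return f.read().strip()
        except OSError:
            return "?"

    for dev in sorted(os.listdir(SYSFS)):
        if dev.startswith(("loop", "ram")):
            continue  # skip pseudo-devices
        base = os.path.join(SYSFS, dev)
        sectors = read(os.path.join(base, "size"))           # 512-byte sectors
        model = read(os.path.join(base, "device", "model"))  # may be "?" on some controllers
        size_gb = int(sectors) * 512 / 1e9 if sectors.isdigit() else 0
        print(f"{dev:12s} {model:24s} {size_gb:8.1f} GB")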
  5. Yeah, using a Thunderbolt dock was an idea I had, but I'm not sure how it would work using USB 3.0; the problem there is the length. From briefly looking at the specs, it's recommended to stay around 3m, though you can add boosters and get to about 15m, which is still quite short. Thunderbolt, on the other hand, can support about 30m, which would cover most of the tasks I need it for. Also, the idea is to not be physically connected, so using long HDMI runs plus boosters is out. As a simple solution it can work, but can you imagine trying to plug in a mouse, keyboard, USB drive or a second monitor from 30m away? Using VNC you could forward additional inputs across easily, and using docks all of that is there for you at the desk, but with long individual cables you would need to run an additional cable every time you wanted to change something small. (A rough sketch of forwarding inputs over the network follows just below this post.)
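For the input-forwarding side mentioned above, here is a minimal sketch of the idea: read raw input events on the seat machine and replay them on the gaming box over UDP. It uses the python-evdev library, which is Linux-only, so this is a Linux-to-Linux illustration of the concept rather than something that drops straight into a Windows VM setup; the device path, port and capability list are assumptions for the example (Python):

    import socket
    import struct
    import sys

    from evdev import InputDevice, UInput, ecodes  # pip install evdev (Linux only)

    PORT = 5600   # arbitrary example port
    FMT = "iii"   # (event type, code, value) packed as three ints

    def send(device_path, host):
        """Seat side: stream events from one input device to the gaming box."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        dev = InputDevice(device_path)   # e.g. /dev/input/event3 (assumption)
        for ev in dev.read_loop():
            if ev.type in (ecodes.EV_KEY, ecodes.EV_REL, ecodes.EV_SYN):
                sock.sendto(struct.pack(FMT, ev.type, ev.code, ev.value), (host, PORT))

    def receive():
        """Gaming-box side: replay received events through a virtual input device."""
        caps = {
            ecodes.EV_KEY: list(ecodes.keys.keys()),                        # all key/button codes
            ecodes.EV_REL: [ecodes.REL_X, ecodes.REL_Y, ecodes.REL_WHEEL],  # basic mouse motion
        }
        ui = UInput(caps, name="forwarded-input")
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", PORT))
        size = struct.calcsize(FMT)
        while True:
            etype, code, value = struct.unpack(FMT, sock.recv(size))
            if etype == ecodes.EV_SYN:
                ui.syn()                 # flush the batch of events to the virtual device
            else:
                ui.write(etype, code, value)

    if __name__ == "__main__":
        # usage: python forward_input.py send /dev/input/eventX <gaming-box-ip>
        #        python forward_input.py receive
        if sys.argv[1] == "send":
            send(sys.argv[2], sys.argv[3])
        else:
            receive()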
  6. Anywhere between 10 and 50m, and forwarding the mouse and keyboard over the network was in my original concept.
  7. The goal is to use a server to run multiple gaming machines without being in the same room as the clients, and at this point I'm pretty sure networking is out unless I use 10Gbps networking equipment.
  8. At the moment I have a virtualisation-enabled server and a lot of PCs + monitors. Budget is basically £0, lol, but I can get quite a lot of stuff without too much effort. The server is a DL585 G6 (at the moment I'm not even sure if it supports PCIe), but I have a lot of beastly PCs I can use (50-60). This is basically proof of concept at the moment, so it doesn't need to be 100% working; the main thing is to prove it works and demo it on a small scale.
  9. For the graphics passthrough: VNC uses the graphics card to encode the frames and then streams them. Thunderbolt/capture cards take the physical output of the graphics card and then stream it. RDC uses its own soft driver to grab the screen and streams it. So how does this USB device grab the screen, and what resources does it use to get it? It doesn't have a video input or connect to the graphics card at all. As for the games: if you ever walk into a gaming cafe, what games would you want to play? (Basically everything from Pac-Man to GTA V, Fallout 4, Just Cause 3, etc...)
  10. I did have a look at USB docks before, but I couldn't understand how they can pass the graphics information through. The main problem with using VNC was that the graphics card has to do the encoding to stream the video, which takes away from performance in the games. By using Thunderbolt/capture cards you're basically taking the output from the graphics card and using the device to encode and stream it. Do you know anything about the performance of the USB dock? Also note this is for servers, where USB 3.0 ports are not generally included. I also have to consider how to pass this through to 5-7 virtual machines using the available ports. Capture cards can be external devices and Thunderbolt can be in a different server, but USB has to be attached to the server, with enough ports for every client.
  11. So this doesn't exactly fit into any section as it's more of an R&D area, but after watching the 7 Gamers, 1 CPU videos I thought I'd post about it, as I was working on something similar before and it kind of ties in. (This project was mainly aimed at a gaming cafe, so bear that in mind while reading through this.) The whole point of this project was to have multiple high-powered games running off a single server that multiple people can access. There is, however, one huge difference: my server is going to be locked in a rack in another room so no one can touch it. Due to the problems with HDMI, USB and general communication over long distances, I couldn't be happy with just having a 50m HDMI cable running from the server to a monitor, because of signal loss, noise, etc. I came up with a plan to find other ways of sending the picture across that distance without massive costs getting in the way. I have 3 plans to solve the problem and each has its own drawbacks.
      1: VNC. This was one of the earliest options I thought of. Basically, create 5 virtual machines, link each up to a graphics card, then stream the result using a combination of VNC and VirtualGL. The problem with this, however, is that the encoding of the video has to be done by the graphics card while it is also running the game. After a lot of trial and error I managed to get somewhat of a solution working, using jsmpeg (from GitHub), streaming over UDP and viewing in a browser (a minimal sketch of this capture-encode-stream pipeline is included after this post). There were quite a few drawbacks: the browser didn't forward mouse/keyboard commands correctly (this can be solved another way), and the frame rate was about 10-15fps with quite a lot of latency. I feel the solution was mainly developed for external networks, as during the entire streaming process my network utilization barely went above 1MB/s upload and the quality was severely reduced.
      2: Capture card. There are a few capture cards that support network streaming. Having one of these set up for each graphics card is a possible solution, but due to the cost I can't exactly test it. The drawbacks of capture cards are more to do with the manufacturers: these devices are normally meant to stream to the internet, so they will be tuned to reduce quality and not treat latency as an issue, which means that although they could work, the lag may be the deciding factor.
      3: Thunderbolt. This is the only option that does not require using a network. Unlike its standard implementation, I planned to build a second server with 5 Thunderbolt cards which would be separate from my main graphics server. Again there would need to be some way of forwarding the mouse and keyboard back to the gaming server, but I'm sure that could be achieved simply. The main benefit of using Thunderbolt is the lack of network usage, something that would be a problem in the other 2 solutions. The drawbacks are rather simple: distance and cost. While a good capture card can set you back about £100-£150, a decent Thunderbolt dock can set you back about £200-£300. Also, the longest cable length I have seen supported in the small amount of research I have done is about 30m, which would not be great for large-scale adoption (perhaps there is a way of relaying it to extend the range).
      So what do you guys think? Has anyone tried anything like this before, and if so, what were your results?
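For completeness, here is a minimal sketch of the option-1 pipeline referred to above (grab the desktop, encode, stream over UDP). It is not the exact jsmpeg/VirtualGL setup from the post, just the same capture-encode-stream shape, using the mss library for screen capture and an ffmpeg subprocess for encoding; the receiver address, port and frame rate are arbitrary example values (Python):

    # Capture the primary display, hand raw RGB frames to ffmpeg on stdin,
    # and push an H.264 MPEG-TS stream over UDP. The encode step running on
    # the same machine as the game is exactly where the 10-15fps and latency
    # problems described above come from.

    import subprocess
    import time

    import mss  # pip install mss

    HOST, PORT = "192.168.0.10", 1234   # example receiver address (assumption)
    FPS = 30

    with mss.mss() as sct:
        monitor = sct.monitors[1]        # primary display
        width, height = monitor["width"], monitor["height"]

        ffmpeg = subprocess.Popen(
            [
                "ffmpeg", "-loglevel", "error",
                "-f", "rawvideo", "-pix_fmt", "rgb24",
                "-s", f"{width}x{height}", "-r", str(FPS), "-i", "-",
                "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
                "-f", "mpegts", f"udp://{HOST}:{PORT}",
            ],
            stdin=subprocess.PIPE,
        )

        frame_time = 1.0 / FPS
        while True:
            start = time.time()
            frame = sct.grab(monitor)       # screen capture (the "soft driver" step)
            ffmpeg.stdin.write(frame.rgb)   # raw pixels handed to the encoder
            time.sleep(max(0.0, frame_time - (time.time() - start)))

On the receiving end a player such as ffplay listening on that UDP port can display the stream; the point of the sketch is simply that the capture and encode work happen on the same machine that is running the game, which is the core drawback of option 1.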