2 Gamers 1 Tower on steroids

Before we start, here's a parts list of my computer. (I have 90% of the parts, just waiting on the mobo. Currently using an ASRock X99X, which may be giving me problems with PCIe lanes since I'm using a 5820K.)

 

Parts

 

 

 

I'm currently in the process of upgrading my rig so that my brother and I can use it at the same time for gaming using unRAID. He's purchased a GTX 970 FTW+ to use. We were able to successfully set up two VMs before, but had a fair amount of lag using them to play games, even with GTX 980 Ti's and only playing at 1080p. I was also using a GT 710 as the graphics card for the server.

 

My question is: is my 5820K bottlenecking performance here, or was the sloppily set-up disk array the culprit of the lag? It looked like both systems were using the same SSD as their main drives in the unRAID disk setup menu before, which may have caused problems.

 

If the 5820K is the problem, I'll most likely find a decent 10-16 core ES v3 Xeon on eBay for us to use.

 

Thanks for any responses, and sorry for the sloppy post. I've been researching and troubleshooting for the past few weeks and decided, before I dozed off, that I should ask for help setting things up.

 

 

Edit: here's the current mobo

motherboard

7 minutes ago, yakiddenme said:

Before we start, here's a parts list of my computer. (I have 90% of the parts, just waiting on the mobo. Currently using an ASRock X99X, which may be giving me problems with PCIe lanes since I'm using a 5820K.)

 

Parts

 

 

 

I'm currently in the process of upgrading my rig so that my brother and I can use it at the same time for gaming using unRAID. He's purchased a GTX 970 FTW+ to use. We were able to successfully set up two VMs before, but had a fair amount of lag using them to play games, even with GTX 980 Ti's and only playing at 1080p. I was also using a GT 710 as the graphics card for the server.

 

My question is: is my 5820K bottlenecking performance here, or was the sloppily set-up disk array the culprit of the lag? It looked like both systems were using the same SSD as their main drives in the unRAID disk setup menu before, which may have caused problems.

 

If the 5820K is the problem, I'll most likely find a decent 10-16 core ES v3 Xeon on eBay for us to use.

 

Thanks for any responses, and sorry for the sloppy post. I've been researching and troubleshooting for the past few weeks and decided, before I dozed off, that I should ask for help setting things up.

 

 

Edit: here's the current mobo

motherboard

First off, you don't have a CPU cooler listed. Second, your CPU has 28 PCIe lanes, which means both your cards will run at x8 speeds. I would say go with an E5-1650 v3, which has 40 PCIe lanes.


6 minutes ago, Kyle Manning said:

First off, you don't have a CPU cooler listed. Second, your CPU has 28 PCIe lanes, which means both your cards will run at x8 speeds. I would say go with an E5-1650 v3, which has 40 PCIe lanes.

x8 is more than enough for a card to run at full speed. I can't guarantee it won't cause any odd issues with unRAID though, but don't forget about your chipset's PCIe lanes! I think you should be fine with your CPU.

Daym what a build


 

1 hour ago, Beeeyeee said:

x8 is more than enough for a card to run at full speed. I can't guarantee it won't cause any odd issues with unRAID though, but don't forget about your chipset's PCIe lanes! I think you should be fine with your CPU.

@yakiddenme The parts list shows you have two 980 Ti's, a 970, and a 710; does this mean the system is going to have four graphics cards? If so, 28 lanes will actually end up being a real issue. Even with a 40-lane CPU, which would be enough, slot speeds will only be x8 or less depending on the motherboard and configuration.

 

If you are going to be putting that many PCIe devices in a single system using unRAID, then a 40-lane CPU is highly advised.

 

MSI X99A GODLIKE GAMING:

Quote

- 4-way mode: x8/ x8/ x0/ x16/ x8*, x8/ x8/ x0/ x8/ x4**

* For the CPU that supports 40 PCIe lanes
** For the CPU that supports 28 PCIe lanes

 

• Supports 4-Way NVIDIA® SLI™ Technology (For the CPU that supports 40 PCIe lanes)
• Supports 3-Way NVIDIA® SLI™ Technology (For the CPU that supports 28 PCIe lanes)
* Supports Windows 7 and Windows 8/ 8.1

The first slot would have to be the 710 server GPU, so the two 980 Ti's get x8 but the 970 will only get x4. I know the SLI support specs above don't directly apply to what you are doing, but the slot speeds are important to note.
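
If you want to double-check what each card actually negotiates once everything is installed, lspci on the unRAID console will tell you. Rough sketch below; the bus address is only an example, grab yours from the first command, and check it while the card is under load since GPUs drop their link width when idle.

# list the GPUs and their PCIe bus addresses
lspci | grep -i vga

# compare the maximum (LnkCap) vs currently negotiated (LnkSta) link width for one card
# 01:00.0 is just an example address, substitute your own
lspci -vv -s 01:00.0 | grep -E 'LnkCap|LnkSta'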

2 hours ago, yakiddenme said:

We were able to successfully set up two VMs before, but had a fair amount of lag using them to play games, even with GTX 980 Ti's and only playing at 1080p. I was also using a GT 710 as the graphics card for the server.

 

My question is: is my 5820K bottlenecking performance here, or was the sloppily set-up disk array the culprit of the lag? It looked like both systems were using the same SSD as their main drives in the unRAID disk setup menu before, which may have caused problems.

A single SSD will be able to run more than the two VMs you were using just fine; it could still be a storage issue, but the SSD itself shouldn't be the cause. Run some long disk benchmarks on one VM at a time, then on both VMs at once, and see if it affects performance.
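
Something like fio works well for this (there's a Windows build too); the exact numbers below are just a sketch, tune the size and runtime to taste.

# ~10 minute mixed random I/O test; run it in one VM alone, then in both VMs at once
# on a Windows guest, swap --ioengine=libaio for --ioengine=windowsaio
fio --name=vmtest --filename=fio-test.bin --size=4G --rw=randrw --rwmixread=70 --bs=4k --ioengine=libaio --iodepth=32 --direct=1 --runtime=600 --time_based --group_reporting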

 

Do the same with CPU and GPU benchmarks. Before you can properly diagnose the issue, you first need to verify that a single VM is performing correctly, within about 10% of bare metal.
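
For the CPU side, any benchmark you can run both bare metal and in the guest will do; sysbench is a quick one on Linux (Cinebench or similar does the same job on the Windows guests). Just a sketch, and older sysbench versions want the --test=cpu syntax instead.

# quick single- and multi-threaded CPU runs; repeat bare metal and inside the VM and compare
sysbench cpu --cpu-max-prime=20000 --threads=1 run
sysbench cpu --cpu-max-prime=20000 --threads=6 run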

 

Also did you make sure that each VM got 3 real cores each?
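
You can check the pinning from the unRAID console with virsh if you want to verify it outside the web UI. The VM names below are made up, use whatever yours are called, and lscpu shows which logical CPUs are hyperthread siblings of the same core.

# show which logical CPUs belong to which physical core
lscpu -e

# show the current vCPU-to-host-CPU pinning for each VM
virsh vcpupin Gaming-VM-1
virsh vcpupin Gaming-VM-2

# example: pin vCPU 0 of one VM to host CPU 3, applied live and saved to the config
virsh vcpupin Gaming-VM-2 0 3 --live --config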

5 hours ago, leadeater said:

@yakiddenme The parts list shows you have two 980 Ti's, a 970, and a 710; does this mean the system is going to have four graphics cards? If so, 28 lanes will actually end up being a real issue. Even with a 40-lane CPU, which would be enough, slot speeds will only be x8 or less depending on the motherboard and configuration.

Thanks for the replies, everyone. I was planning on using four GPUs; that's why the GODLIKE was purchased, as it's one of the only motherboards that can run four graphics cards with the 5820K, even if the lanes are x8 and one is x4.

 

The 970 running at x4 shouldn't be too much of a problem, as my brother is only going to be playing at 1080p (please correct me if I'm wrong).

 

I'm not too concerned about the performance of his VM, to be honest; as long as he gets somewhere near 60 FPS at 1080p he'll be fine. On the other hand, I'm going to try to SLI the 980 Ti's to run at 3440x1440 and get at least around 60-100 FPS. Looking at benchmarks at that resolution, even after subtracting ~10% I should be able to reach those numbers.

 

5 hours ago, leadeater said:

 

If you are going to be putting that many PCIe devices in a single system using unRAID, then a 40-lane CPU is highly advised.

I've been looking at a variety of Haswell-EP Xeons as I've had luck with them in the past. The only thing I'm worried about is that most of them have fairly low clock speeds. Let's say I was able to get a solid 3.0 GHz overclock on a 12-14 core Xeon by adjusting the BCLK frequency. If each machine were dedicated 5-6 hyperthreaded cores all running at 3 GHz, I really shouldn't have a performance problem with anything, correct?

 

5 hours ago, leadeater said:

A single SSD will be able to run more than the two VMs you were using just fine; it could still be a storage issue, but the SSD itself shouldn't be the cause. Run some long disk benchmarks on one VM at a time, then on both VMs at once, and see if it affects performance.

 

Do the same with CPU and GPU benchmarks. Before you can properly diagnose the issue, you first need to verify that a single VM is performing correctly, within about 10% of bare metal.

 

Also did you make sure that each VM got 3 real cores each?

I'll be running benchmarks for the rest of today to sort out the problem. I did make sure each VM got 3 real cores, so that shouldn't be the issue. I'll report back with anything I find. Thanks for the help troubleshooting so far.

2 minutes ago, yakiddenme said:

I've been looking at a variety of Haswell-EP Xeons as I've had luck with them in the past. The only thing I'm worried about is that most of them have fairly low clock speeds. Let's say I was able to get a solid 3.0 GHz overclock on a 12-14 core Xeon by adjusting the BCLK frequency. If each machine were dedicated 5-6 hyperthreaded cores all running at 3 GHz, I really shouldn't have a performance problem with anything, correct?

You don't have to go Xeon to get the extra lanes, a 5930K has 40, but I suspect you want the cores. I do agree with you though, the current CPU should work fine, so unless there is actually a problem or you want more cores for the VMs, the extra cost won't give any benefit to gaming.

 

The low clock on the Xeons really won't be a problem unless it's below ~2.4 GHz; everything is so heavily GPU-bound currently. Every time I have overclocked a CPU it has done practically nothing for FPS in games, all the way back to the E7300 era. Pentium 4 overclocking actually did something, but those were just so bad to start with.

 

I'd be interested to know whether 3 cores and 6 threads actually bottlenecks your 980 Ti's; most games until recently level out at 3 cores, with 4 being the point where you stop seeing gains. This is basically due to the majority of gaming systems having 4 cores, and developers optimize for the masses >.<

 

Also, my reasoning for checking single-VM performance is that it may show up an underlying issue that you don't notice until you use both VMs; if the performance isn't correct running just one, then two could be the tipping point where you notice it in game.

Is there a reason why you want to do this? While it sounds cool and may be cool to show off, it is impractical and a waste of money.

Before you go spending thousands, I hope you know that SLI isn't really supported through VMs? While you may be able to pass both video cards through to a single VM, I don't believe there is currently a way to set up any SLI communication between the cards.


If you are going to spend a bit of money on a high-core-count Xeon, you would probably be better off getting a whole other system for the second person and throwing the 970 in that. Then you wouldn't have to worry about the PCIe lanes and SLI support, unless there is a specific need for the two users to share one system.

 

The cheapest 10-core with a decent clock I could find is http://m.newegg.com/Product/index?itemnumber=9SIA4GH3EY0500

and that is $900. 

 


1 minute ago, Jarsky said:

Before you go spending thousands, I hope you know that SLI isn't really supported through VMs? While you may be able to pass both video cards through to a single VM, I don't believe there is currently a way to set up any SLI communication between the cards.

I actually thought the same thing, so I decided to test it. When dedicating both GPUs to a single VM with an SLI bridge installed, both of the GPUs showed up in the NVIDIA Control Panel, but I was running into some issues in terms of performance. I don't think the GPUs will be able to utilize SLI for increased performance in gaming, but since the system can still detect and use both, the additional GPU should help with productivity. Though with NVLink coming with Pascal, it sounds like SLI in VMs may become possible, which is exciting.
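
For anyone wanting to check the same thing from the command line, nvidia-smi (installed with the driver on both Windows and Linux guests) at least shows whether both passed-through cards are visible and how they're connected; a rough sketch, not unRAID-specific.

# list every GPU the guest OS can see
nvidia-smi -L

# show how the visible GPUs are connected to each other
nvidia-smi topo -m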

 

1 hour ago, beavo451 said:

Is there a reason why you want to do this? While it sounds cool and may be cool to show off, it is impractical and a waste of money.

The only thing I've purchased for this is the $60 program; I already owned everything else. It is really cool, and it is even better to show off :D! But for my uses (game and software development mainly, though I still fairly often use Adobe and Autodesk programs) it is not impractical or a waste of money. Most of these parts will be written off anyway, so don't worry about the price. My brother is my main game/software guinea pig as well, so it would be nice if he could have his "own" system instead of having to use mine, since he doesn't own a computer.

 

@leadeater You've been a tremendous help so far, so thank you. I've gotten most issues sorted out, but I can almost guarantee there will be more later. Even with many workarounds in place I just can't seem to get SLI to work inside a VM; everything in the system shows SLI as enabled and working, but benchmarks seem to indicate otherwise. The 5820K has hampered me for a while in terms of content creation, so I've been considering this processor as an upgrade path. I'll make sure to post a build log when I've finished.

 

16 minutes ago, SLAYR said:

If you are going to spend a bit of money on a high-core-count Xeon, you would probably be better off getting a whole other system for the second person and throwing the 970 in that. Then you wouldn't have to worry about the PCIe lanes and SLI support, unless there is a specific need for the two users to share one system.

 

The cheapest 10-core with a decent clock I could find is http://m.newegg.com/Product/index?itemnumber=9SIA4GH3EY0500

and that is $900. 

Shh, I've been looking for an excuse to upgrade my processor! I only want to dedicate a small portion of the system to another VM besides my main one. When I bought the 5820K it was great for what I did, but work demands have changed since then and it doesn't quite cut it anymore.

@yakiddenme Last time I was talking to an unRAID rep, NVIDIA SLI was not working but AMD CrossFire was. Pretty sure that was only the newer AMD cards that do CrossFire over the PCIe bus, but this sort of thing isn't something they actively test or specifically try to make work.

 

Also, on the NVLink thing: when I was reading up on it, it sounded like it would not be coming to desktop computing but is aimed at being a proprietary connection bus in multi-GPU accelerated servers. Basically, don't expect to see this on desktop gaming boards; running both PCIe and NVLink would be too costly, the CPU also has to support it, and running only NVLink is likely not possible if you want anything other than an NVIDIA GPU.

11 hours ago, yakiddenme said:

snip

Please don't waste money on a CPU upgrade. First try it without SLI. Run CPU, GPU, and disk benchmarks in the VM and on bare metal to figure out what the issue is.

17 hours ago, bobhays said:

Please don't waste money on a CPU upgrade. First try it without SLI. Run CPU, GPU, and disk benchmarks in the VM and on bare metal to figure out what the issue is.

Lanes were the issue. Everything pointed to them being the problem, and once my new Broadwell-EP processor got here along with my motherboard, everything besides SLI worked perfectly inside unRAID. For work the VM performs very well; for gaming with SLI I have to switch to Windows as my main OS instead of unRAID. Everything's working well. I also set up some cloud storage and a Plex VM, along with my own VPN. Pretty cool what the software can do once you get into it.
