Windows Server Network Performance Slow

I have a PC running Windows Server 2012. I created a network share, but when reading from or writing to it I only get ~55MB/s (tested with a large movie file). I previously had FreeNAS on this PC and was getting ~100MB/s, and the client computer I am testing with has stayed the same. I even ran a CrystalDiskMark speed test on the disk and got ~120MB/s read and write. Right now I am using a single disk until my new disks come in the mail; I am just trying to get things set up and ready for when the new drives arrive, but this is quite a big issue. I should be able to get full gigabit speed just fine. I have done a lot of reading online and can't find a solution that works for me. Any advice is welcome.

I am also open to better ways to share files across the network.

6 minutes ago, Electronics Wizardy said:

What files are you sending? Small files and folders transfer much more slowly.

I was testing with a large movie, 13GB.

Even connecting the server directly to the client with a patch cable (no network connected to either computer), the speeds were ~20MB/s.

Are you also running the Active Directory role on the server OS? If you are, the signing and encryption policies enforced by default require a lot of CPU resource and will reduce throughput. A number of people have been posting about similar problems lately, and almost all of them were also running a DC on the same OS.

1 minute ago, leadeater said:

Are you also running the Active Directory role on the server OS? If you are, the signing and encryption policies enforced by default require a lot of CPU resource and will reduce throughput. A number of people have been posting about similar problems lately, and almost all of them were also running a DC on the same OS.

I am not running active directory. Right now it is acting only as a file server

Create a RAM disk on the server and also on the client and do a file copy test to and from the server, just to rule out any disk limitations. I know the benchmark showed high enough performance, but it is a worthwhile test.

 

Also use something like iperf to test the actual network throughput.

 

Doing both of these will hopefully show something useful to further diagnose the issue.
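To illustrate the kind of timed-copy test being suggested here (this sketch is mine, not from the thread, and the file names are hypothetical), measuring throughput is just bytes transferred divided by elapsed time. Point `dst` at a path on the network share to test it end to end:

```python
import os
import shutil
import tempfile
import time

def copy_throughput(src: str, dst: str) -> float:
    """Copy src to dst and return the observed throughput in MB/s."""
    size = os.path.getsize(src)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size / elapsed / 1_000_000

# Local demo: create a 20 MB test file and copy it within a temp dir.
# For the real test, dst would be a path on the SMB share (or RAM disk share).
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "testfile.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(20 * 1_000_000))
    mbps = copy_throughput(src, os.path.join(d, "copy.bin"))
    print(f"{mbps:.1f} MB/s")
```

Copying a single large file like this matches the movie-transfer scenario; many small files would show far lower numbers because of per-file overhead.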

54 minutes ago, leadeater said:

Create a RAM disk on the server and also on the client and do a file copy test to and from the server, just to rule out any disk limitations. I know the benchmark showed high enough performance, but it is a worthwhile test.

 

Also use something like iperf to test the actual network throughput.

 

Doing both of these will hopefully show something useful to further diagnose the issue.

Ok, I will try both of those right now

54 minutes ago, leadeater said:

Create a RAM disk on the server and also on the client and do a file copy test to and from the server, just to rule out any disk limitations. I know the benchmark showed high enough performance, but it is a worthwhile test.

 

Also use something like iperf to test the actual network throughput.

 

Doing both of these will hopefully show something useful to further diagnose the issue.

Hmmm, well with the RAM disk I transferred a 1GB file and got ~82MB/s, so I guess the bottleneck is the drive. It just seems weird because locally it gets perfectly fine test results. Not that 82MB/s is perfect, but it shows that the network is not the current bottleneck? Maybe?

I threw a different drive into the system and got the exact same results as the first drive: ~55MB/s. I tried to use iperf but couldn't figure it out.

16 minutes ago, fanchazstic said:

Hmmm, well with the RAM disk I transferred a 1GB file and got ~82MB/s, so I guess the bottleneck is the drive. It just seems weird because locally it gets perfectly fine test results. Not that 82MB/s is perfect, but it shows that the network is not the current bottleneck? Maybe?

I threw a different drive into the system and got the exact same results as the first drive: ~55MB/s. I tried to use iperf but couldn't figure it out.

The 82MB/s you mentioned is actually pretty bad for a RAM disk network transfer; you should be getting ~110MB/s. Have a look at this: http://45drives.blogspot.co.nz/2016/05/how-to-tune-nas-for-direct-from-server.html.

 

Scroll to the bottom to "Example 3" and check those settings, along with making sure any offload settings are enabled; do not enable jumbo frames.
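For what it's worth, the ~110MB/s ceiling cited here follows from gigabit Ethernet framing overhead at the standard 1500-byte MTU. A rough back-of-the-envelope check (my sketch, not from the thread):

```python
# Gigabit Ethernet line rate, expressed in MB/s:
LINE_RATE = 1_000_000_000 / 8 / 1_000_000  # 125 MB/s raw

# Per-frame bytes on the wire at a 1500-byte MTU:
# preamble(8) + MAC header(14) + payload(1500) + FCS(4) + inter-frame gap(12)
wire = 8 + 14 + 1500 + 4 + 12          # 1538 bytes per frame
tcp_payload = 1500 - 40                # TCP/IP headers eat 40 bytes of payload
efficiency = tcp_payload / wire        # ~0.949

print(f"max TCP payload rate = {LINE_RATE * efficiency:.1f} MB/s")  # prints 118.7
```

SMB adds its own protocol overhead on top of TCP, which is why real-world SMB copies over gigabit usually land around 105-115MB/s rather than the full ~118.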

4 minutes ago, leadeater said:

The 82MB/s you mentioned is actually pretty bad for a RAM disk network transfer; you should be getting ~110MB/s. Have a look at this: http://45drives.blogspot.co.nz/2016/05/how-to-tune-nas-for-direct-from-server.html.

 

Scroll to the bottom to "Example 3" and check those settings, along with making sure any offload settings are enabled; do not enable jumbo frames.

I already tried increasing the buffers. They are currently set at their max on both ends.

16 hours ago, fanchazstic said:

 I tried to use iperf but I couldn't figure it out

 

For basic usage, download it to both computers and unzip it.

Open a Command Prompt and navigate to the iperf folder.

Start one with the command 'iperf -s' to act as the server, and the other with 'iperf -c <hostname of iperf server>'.

 

On 5/16/2016 at 2:44 PM, Jarsky said:

 

For basic usage, download it to both computers and unzip it.

Open a Command Prompt and navigate to the iperf folder.

Start one with the command 'iperf -s' to act as the server, and the other with 'iperf -c <hostname of iperf server>'.

 

Sorry it took so long, I had a very busy couple of days, but here it is: GruSAPI.png

8 hours ago, fanchazstic said:

Sorry it took so long, I had a very busy couple of days, but here it is: GruSAPI.png

Your network connection looks fine from a network point of view.

I'd say it is going to be something else. Normally, without any other information, I'd point you in the direction of the storage array. My guess is you're using software RAID of some type? And if you're also using a parity RAID, then that's your issue.

 

 

10 hours ago, Blake said:

Your network connection looks fine from a network point of view.

I'd say it is going to be something else. Normally, without any other information, I'd point you in the direction of the storage array. My guess is you're using software RAID of some type? And if you're also using a parity RAID, then that's your issue.

 

 

But the RAM disk had speeds of less than 100MB/s, so I'm not too sure the array is at fault.

You created a RAM disk on both ends, correct? Does either box use a really long cable? Have you tried swapping the cable out?

1 hour ago, Mikensan said:

You created a RAM disk on both ends, correct? Does either box use a really long cable? Have you tried swapping the cable out?

I created a RAM disk on both ends. On the server I created a folder in the RAM disk and made it a share. I put a 1GB file in the RAM disk on the client, then copied the file from the client RAM disk to the shared folder on the server RAM disk.

3 hours ago, .:MARK:. said:

But the RAM disk had speeds of less than 100MB/s, so I'm not too sure the array is at fault.

I am currently using a single drive for this until my new drives get here. I have an OS drive and a data drive, both 500GB WD Caviar Blue drives. Nothing fancy, just what I had lying around to do some tests until my 2TB drives arrive.

14 hours ago, Blake said:

Your network connection looks fine from a network point of view.

I'd say it is going to be something else. Normally, without any other information, I'd point you in the direction of the storage array. My guess is you're using software RAID of some type? And if you're also using a parity RAID, then that's your issue.

 

 

 

Like I said, single disk. Right now the server is sitting right next to the client. I have a gigabit switch they are both connected to with ~5ft Cat6 cables, so they are up to spec. When I was testing out FreeNAS I had an issue with an older cable limiting the link speed to 100Mbps, but that was fixed by swapping in a new cable. Both the client and the server are showing a gigabit connection.

 

I'll try out the RAM disk again and see what happens. Is this issue just overhead in the protocol? It seems a little excessive, but maybe I should just stick with FreeNAS. FreeNAS works fine, I just wanted to be able to run things like Plex on the same machine, and that is surprisingly difficult to do in FreeNAS. I was hoping to run a dynamic DNS client, a Plex server, an FTP server, and Arq 5 backup software to back up the whole NAS to Amazon Cloud Drive (offsite backup). I have tried using FreeNAS jails, but I only got Plex to work once; it worked for a couple of minutes and then stopped. No matter how many times I removed the plugin, deleted everything, and tried again, it would not work. So I gave up on FreeNAS and figured Server 2012 would be much easier to work with.

But maybe I should install ESXi and run both FreeNAS and a Windows Server VM. I could use FreeNAS for the file server and Windows for all the apps and other servers I need. I know you aren't supposed to run FreeNAS in a VM, but it is possible. I'm just throwing out ideas at this point and any advice would be appreciated.

I used this guide to set up a Plex jail. The plugin just doesn't work for me either; instead, using the package from FreeBSD worked great. If you have a Plex Pass, a little further down in the comments it lists the Plex Pass package name.

https://forums.freenas.org/index.php?threads/tutorial-how-to-install-plex-in-a-freenas-9-3-jail-updated.19412/

 

Unless you're doing something else in the background, like watching YouTube or downloading a file while you also try to transfer the movie between RAM disks, there's no reason you shouldn't be seeing over 100MB/s.

47 minutes ago, Mikensan said:

I used this guide to set up a Plex jail. The plugin just doesn't work for me either; instead, using the package from FreeBSD worked great. If you have a Plex Pass, a little further down in the comments it lists the Plex Pass package name.

https://forums.freenas.org/index.php?threads/tutorial-how-to-install-plex-in-a-freenas-9-3-jail-updated.19412/

 

Unless you're doing something else in the background, like watching YouTube or downloading a file while you also try to transfer the movie between RAM disks, there's no reason you shouldn't be seeing over 100MB/s.

I was using TeraCopy to copy the files and decided to try the default Windows copy instead. To the RAM disk it stayed at about 95MB/s, occasionally dropping to 91. When I went back to copying the movie to the disk share, it transferred at 50MB/s, so I'm just going to assume my disk is the issue. 95MB/s isn't perfect, but I'm totally fine with it if I can get that on a consistent basis.

Also, I was not doing anything in the background on either machine.

I would say it's your disk that's limiting performance.

The iperf results look perfectly fine, which proves that the network and the OS/memory/etc. are not at fault.

So it can only be something related to your SATA bus/disks.

Whether that be a dodgy SATA connection, a SATA configuration issue (e.g. a port set to SATA 1 speeds), something to do with an array/software RAID configuration, or something to do with the hardware disk controller.

95 is reasonable; there's also your switch to take into account (unless you were getting more than 95 with FreeNAS on the same switch). Some switches can only handle 1Gbps of total throughput.

You could also use iperf to test between the two machines as well, which takes the SMB protocol out of the picture as a possible issue.

Just thought of another thing: I wonder if FreeNAS enables device polling while Windows has it disabled, or vice versa. If you go into the device configuration for the NIC, one of the many settings may be device polling; toggle it on/off and see if there's any improvement.

But overall, yes, it does sound like your disk is the cause of the 50MB/s. Weird that the benchmarks show 120MB/s, but then again they are synthetic tests.

10 hours ago, Mikensan said:

95 is reasonable; there's also your switch to take into account (unless you were getting more than 95 with FreeNAS on the same switch). Some switches can only handle 1Gbps of total throughput.

You could also use iperf to test between the two machines as well, which takes the SMB protocol out of the picture as a possible issue.

Just thought of another thing: I wonder if FreeNAS enables device polling while Windows has it disabled, or vice versa. If you go into the device configuration for the NIC, one of the many settings may be device polling; toggle it on/off and see if there's any improvement.

But overall, yes, it does sound like your disk is the cause of the 50MB/s. Weird that the benchmarks show 120MB/s, but then again they are synthetic tests.

I went into the device config for both NICs and neither had that option.
