
NAS Cache

Go to solution: solved by 2bitmarksman

For reference, if you connect a drive via iSCSI, the CPU time needed to write data to the iSCSI drive is spent on the PC that is using it as local storage, not on the FreeNAS host.

 

Also, to reiterate: unless you have 10G networking, don't bother with the SSD cache, and if you need a high-speed cache, try to get an old server you can cram a ton of RAM into instead, like a Dell R710. All the spare RAM that ZFS doesn't need on FreeNAS is used as the ARC, which is basically a RAM cache filled with the most-used data you access. Once it fills, it evicts the least-used/oldest data to the L2ARC if there is one, or just drops it if there isn't. For reference, I bought a Dell R710 with 144 GB of RAM, without drives, for 538 bucks. A Chelsio 10G NIC is like 25-50 USD and PERC H200s are around 32-40 USD. HP servers can also be used and are a little cheaper, but they usually have lower RAM capacities (128 GB) and you would need an HBA (they don't have any compatible with FreeNAS that aren't for external enclosures).
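The ARC and L2ARC described above can be inspected and extended from the FreeNAS (FreeBSD) shell. A minimal sketch; the pool name `tank` and device name `ada4` are placeholders for your own pool and SSD:

```shell
# Show how much RAM the ARC is currently using (FreeBSD sysctl)
sysctl kstat.zfs.misc.arcstats.size

# ARC hit/miss counters help judge whether an L2ARC would even be used
sysctl kstat.zfs.misc.arcstats.hits kstat.zfs.misc.arcstats.misses

# Attach an SSD as L2ARC (a "cache" vdev) to an existing pool
zpool add tank cache ada4

# Verify the cache device shows up under the pool
zpool status tank
```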

A dedicated ZIL device (SLOG) is good for lots of small sync writes (like running a bunch of VMs that all want to commit small bits of data at once). Generally, though, it isn't needed for large-file sharing in a home, like Plex, a Steam library, or anything that writes large files all at once.
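If sync-heavy workloads like VMs do show up later, a separate log device can be attached the same way as a cache device; mirroring it protects in-flight sync writes. A sketch with placeholder pool, device, and dataset names:

```shell
# Add a mirrored SLOG (separate ZIL device) for sync-write-heavy workloads
zpool add tank log mirror nvd0 nvd1

# A SLOG only matters for sync writes; check how a dataset handles them
zfs get sync tank/vms
```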

 

Oh, and make sure you set up jumbo frames if you can. That's a big performance boost right there for lots of tiny files, like game textures, over the network.
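On FreeNAS (FreeBSD) the jumbo-frame setup is just an MTU change, but every NIC and switch in the path has to support it. A sketch; `ix0` and the target IP are placeholders:

```shell
# Raise the MTU to 9000 on the 10G interface (placeholder name ix0)
ifconfig ix0 mtu 9000

# Verify end to end: 8972 = 9000 minus 20 (IP) and 8 (ICMP) header bytes.
# -D sets "don't fragment", so this fails if any hop can't pass jumbo frames.
ping -D -s 8972 192.168.1.10
```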

Hello,

I have been looking at getting a NAS and using FreeNAS on it. If I make an SSD cache, will it ever dump the contents of the cache, or does it just keep the data that you use a lot?

I would make a RAID 1 cache, maybe with NVMe drives, and use 10 GbE.

For context, I would be using it as a file server for the house and running a VM on it to host a server.

 

Thanks.


There are two types of SSD cache in ZFS: the SLOG and the L2ARC. The L2ARC is a read cache and is discarded on reboot.

 

Are you using sync?

 

You can try an L2ARC; it may help, but it often doesn't do much depending on the setup, and it can even hurt performance, since indexing the L2ARC eats into RAM.

 

I'd just make a second pool with the NVMe drives and use it for the VMs.
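That suggestion, a separate mirrored NVMe pool just for the VMs, would look roughly like this; the pool name `vms` and the `nvd*` device names are placeholders:

```shell
# Create a mirrored pool from the two NVMe drives (placeholder names)
zpool create vms mirror nvd0 nvd1

# Carve out a dataset for the VM disks and confirm the layout
zfs create vms/guests
zpool status vms
```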


48 minutes ago, Overlandr said:

What is sync?

Sync is when the program asks for the write to be confirmed as actually on disk before it continues.
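Whether ZFS honors those confirmations is controlled per dataset by the `sync` property. A sketch, with `tank/share` as a placeholder dataset:

```shell
# standard: honor sync requests from clients (the default)
# always:   treat every write as sync
# disabled: acknowledge immediately (faster, but risks loss on power cut)
zfs get sync tank/share
zfs set sync=standard tank/share
```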

 

What are you doing on the NAS? For this use I'd probably just go with a simple QNAP box; they work well here.


For what I want, it looks like it will be cheaper to build it myself rather than buying a QNAP box. I would be using it as a general storage box for files, photos, and my Plex stuff. The VM would be running on it as well. Also, do you know why it is advised to use Intel over AMD? The higher core count on AMD would probably be better for me with the VM.


1 hour ago, Overlandr said:

For what I want, it looks like it will be cheaper to build it myself rather than buying a QNAP box. I would be using it as a general storage box for files, photos, and my Plex stuff. The VM would be running on it as well. Also, do you know why it is advised to use Intel over AMD? The higher core count on AMD would probably be better for me with the VM.

What hardware is the NAS running on?

 

What type of VMs?

 

I'd look into something like this for the server: https://www.ebay.com/itm/Dell-PowerEdge-C2100-Dual-Intel-XEON-E5620-2-40GHZ-48GB-RAM-HDD-0F3R29-FS12-TY/152977682914?hash=item239e2e31e2:g:MrMAAOSwUS9Z1aUO

 

FreeNAS is pretty bad at VMs; if you want lots of VMs, I'd suggest looking at Proxmox. You get the same ZFS support and much better VM support.

 

 


I would only be using it for one VM; it would be running Windows with a game server on it. The hardware I was looking at was an i3-8100 with a motherboard that has 2 M.2 slots and 6 SATA ports. I have looked at the old servers, and they look like better value, but they are really loud. I would have to make a custom case if I wanted to make it quieter.


48 minutes ago, Overlandr said:

I would only be using it for one VM; it would be running Windows with a game server on it. The hardware I was looking at was an i3-8100 with a motherboard that has 2 M.2 slots and 6 SATA ports. I have looked at the old servers, and they look like better value, but they are really loud. I would have to make a custom case if I wanted to make it quieter.

Yeah, that CPU looks good. I'd also look at the 2200G; it should work well here as well.

 

M.2 really isn't useful here.


You're going to be limited by your network, which for most people is 1 Gb/s. This means the fastest you'll be able to read/write is ~110 MB/s, so there's no real need for a cache normally. Typically you'll want a cache when you need more than 1 Gb/s (either through LACP or 10 Gb+ cards).

 

The other scenario where you'd need a cache was hinted at earlier: if you're going to be using a service (like NFS) that relies on full-time sync writes, you might want a SLOG or a lot of RAM. Which SSD you get for the SLOG matters as well. Though if you're just using NFS for storage and not connecting it to a hypervisor, you could live without it.

 

tl;dr: Start without a cache device; it can be added later.
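Starting without a cache is low-risk because cache vdevs are a type you can attach and detach freely later. A sketch with placeholder pool and device names:

```shell
# Later on: attach an SSD as L2ARC...
zpool add tank cache ada4

# ...and if it doesn't help, detach it again with no data loss,
# since the L2ARC only ever holds copies of data in the pool
zpool remove tank ada4
```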


On 5/1/2018 at 5:27 PM, Electronics Wizardy said:

Yeah, that CPU looks good. I'd also look at the 2200G; it should work well here as well.

It looks like the 2200G is better for performance on something like this, but the FreeNAS documentation says it's better to use Intel because of the VNC-based system they use for VMs.

 

On 5/1/2018 at 8:56 PM, Mikensan said:

You're going to be limited by your network, which for most people is 1 Gb/s. This means the fastest you'll be able to read/write is ~110 MB/s, so there's no real need for a cache normally. Typically you'll want a cache when you need more than 1 Gb/s (either through LACP or 10 Gb+ cards).

 

The other scenario where you'd need a cache was hinted at earlier: if you're going to be using a service (like NFS) that relies on full-time sync writes, you might want a SLOG or a lot of RAM. Which SSD you get for the SLOG matters as well. Though if you're just using NFS for storage and not connecting it to a hypervisor, you could live without it.

 

tl;dr: Start without a cache device; it can be added later.

I was looking at getting a 10 GbE NIC in the future as an upgrade, same with the cache; it would come after I have built the main storage RAID array. The reason for using M.2 over normal SSDs was the better IOPS performance: I will get less storage for the same price, but it will be faster for smaller stuff.

 

Also sorry for the late reply.


You should research more if you want to use Ryzen as your base system.

As far as I know, Ryzen (or the AM4 platform) is still an issue for FreeBSD in general.

 

For the SSD cache, just do it if you have extra money to spend.

I did it recently, and I regret not doing it sooner.

 

An SSD cache isn't going to speed up the network connection if that is the limit, but it will serve lots of smaller files faster than a regular HDD. There are lots of applications that won't even saturate a 1 Gb/s connection, since most of the files they need are small; they just need faster seeks.


12 hours ago, Blebekblebek said:

An SSD cache isn't going to speed up the network connection if that is the limit, but it will serve lots of smaller files faster than a regular HDD. There are lots of applications that won't even saturate a 1 Gb/s connection, since most of the files they need are small; they just need faster seeks.

I am planning on using it for some Plex stuff as well, so it is likely more than one person will be using it at a time. It will also be used to store a load of photos, so I will be dumping stuff on the server occasionally. I'm not planning on using 10 GbE for a while. What are you running for cache? I think I would get 2 drives in RAID 1, or is there a better way?


3 minutes ago, Overlandr said:

I am planning on using it for some Plex stuff as well, so it is likely more than one person will be using it at a time. It will also be used to store a load of photos, so I will be dumping stuff on the server occasionally. I'm not planning on using 10 GbE for a while. What are you running for cache? I think I would get 2 drives in RAID 1, or is there a better way?

If you're not using 10 GbE, don't bother; it won't make a difference, especially as you're not doing anything that is very IO-heavy.

 

RAID 1 works well here.

 

Also, you will have a RAM cache (the ARC) that will speed things up a good amount.


Mostly games, where a single game needs to load thousands of small files immediately.

 

I don't see any benefit in your scenario; an SSD cache isn't going to boost your usage there, since it's mostly static files already.

But then again, just like the previous poster says: do without it first, and you can add it later if you think you need it.


23 hours ago, Blebekblebek said:

Mostly games, where a single game needs to load thousands of small files immediately.

 

I don't see any benefit in your scenario; an SSD cache isn't going to boost your usage there, since it's mostly static files already.

But then again, just like the previous poster says: do without it first, and you can add it later if you think you need it.

OK, can you store games on FreeNAS? If so, it would be really good for bulk game storage, and might even load stuff like DirectX faster from the cache.


7 minutes ago, Overlandr said:

OK, can you store games on FreeNAS? If so, it would be really good for bulk game storage, and might even load stuff like DirectX faster from the cache.

You can, but many games won't install on network storage, so you need iSCSI. A local SSD is a much better and faster option.


I'm guessing that using a NAS in RAID 5 or 6 will make game loading faster, if the games will install. And how would you use iSCSI? How does FreeNAS allow you to install games like that?


2 minutes ago, Overlandr said:

I'm guessing that using a NAS in RAID 5 or 6 will make game loading faster, if the games will install. And how would you use iSCSI? How does FreeNAS allow you to install games like that?

It will be a lot slower than an internal SSD, and about the same as an HDD on gigabit.

 

iSCSI is another way to access a NAS; it shows up as a local disk.


1 minute ago, Electronics Wizardy said:

iSCSI is another way to access a NAS; it shows up as a local disk.

Does that mean it makes a separate partition that you access as a local drive? And would 10 GbE make accessing games faster than using a local HDD that isn't in RAID? I wouldn't use a local hard drive, as it is cheaper to use the bulk storage on HDDs.


iSCSI is what forced me to use an SSD as cache, and I'm glad I did; running over SMB through a mapped network drive is fast enough for me.

Comparing a local SSD to a NAS is just pointless, IMO; even a local SSD vs. a NAS SSD will be different. It's a different subject: the whole idea of a NAS is sharing everything over the network.

 

Most Steam games have no problem with just a mapped network drive. There are a few games that require a local disk to operate, and that's what iSCSI is for: it presents NAS storage as local storage.

 

Origin, Epic Games, Uplay, and Blizzard need to run locally, but most of my library is on Steam.

You just need to install once to your NAS, and any computer can connect and run it simultaneously.

 


7 minutes ago, Overlandr said:

Does that mean it makes a separate partition that you access as a local drive? And would 10 GbE make accessing games faster than using a local HDD that isn't in RAID? I wouldn't use a local hard drive, as it is cheaper to use the bulk storage on HDDs.

You don't use a partition, you use a zvol, but it shows up as an internal HDD on the system. It should be fine for game installs; give it a try.
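Creating the zvol behind an iSCSI extent looks roughly like this from the shell (FreeNAS's GUI wraps the same steps); the pool name, dataset name, and size are placeholders:

```shell
# Create a sparse 500G zvol to back the iSCSI extent
# (-s = sparse: space is only consumed as data is written)
zfs create -s -V 500G tank/games

# The zvol appears as a block device that FreeNAS can export as an
# iSCSI extent; the client then formats it like a local disk.
ls /dev/zvol/tank/games
```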


7 minutes ago, Blebekblebek said:

Most Steam games have no problem with just a mapped network drive. There are a few games that require a local disk to operate, and that's what iSCSI is for: it presents NAS storage as local storage.

That's good; that makes it much easier. It would be possible to run the other games either locally or on iSCSI.

3 minutes ago, Electronics Wizardy said:

You don't use a partition, you use a zvol, but it shows up as an internal HDD on the system.

Does iSCSI show up as a local drive on all computers that see it on the network?


1 minute ago, Overlandr said:

 

Does iSCSI show up as a local drive on all computers that see it on the network?

Not all computers, only the one that you set up. (On Windows, search for "iSCSI Initiator".)

Technically it can be used by multiple machines; however, since every machine can also write to the disk, most of the time the data on the iSCSI volume would end up corrupted.

 

The best way to use iSCSI with multiple machines is to set one machine up as read/write and set the other machines to read-only. It's like a master and clients: updates happen only on the master machine, and all clients see the latest data once the update is done.


4 minutes ago, Blebekblebek said:

Not all computers, only the one that you set up. (On Windows, search for "iSCSI Initiator".)

Technically it can be used by multiple machines; however, since every machine can also write to the disk, most of the time the data on the iSCSI volume would end up corrupted.

 

The best way to use iSCSI with multiple machines is to set one machine up as read/write and set the other machines to read-only. It's like a master and clients: updates happen only on the master machine, and all clients see the latest data once the update is done.

OK, that's good. Thanks for the help.


