NVMe RAID setup?

tjrose91

Should I go for a 5x2TB NVMe RAID setup? It would run at Gen 3 x1 speeds, but it's for mass long-term storage, so I'm not worried about full bandwidth. Or should I go 3x4TB, which would run at full Gen 3 speeds? I am getting a dual NVMe adapter that lets me use two drives without bifurcation. The RAID with more drives would be less expensive, but if I go for the 3x4TB setup I will have almost 10TB filled up right away, since I plan to consolidate my storage with my Steam library or move it back to my RAID 0 SATA setup.


Isn't NVMe raid kinda buggy?

I have concerns about the adapter. I don't know how it would work without bifurcation.

I'd bet you'd be better off using SATA SSDs, cheaper and you probably would never notice the difference.


Is the use case purely a Steam library? Or are there other considerations?

5950X/3080Ti primary rig  |  1920X/1070Ti Unraid for dockers  |  200TB TrueNAS w/ 1:1 backup


45 minutes ago, tjrose91 said:

mass long-term storage, so I'm not worried about full bandwidth

Curious why you would want to spend more on NVMe drives and sacrifice functionality and reliability if you just want storage.

A 10TB HDD is very cheap and will likely be a fraction of the total cost.

Five Samsung 970 EVOs would cost around $800 and get you 10TB.

A 10TB Barracuda costs under $200 and gives you the same storage, with no need to RAID, buy adapters, or take up slots.

Edit: Additionally, if you wanted to upgrade down the line, you could add four more 10TB HDDs, totaling five disks and 50TB of storage.
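For a rough sanity check on those numbers (ballpark street prices from above, not quotes), the cost-per-TB math looks like this:

```python
# Rough cost-per-TB comparison using the ballpark prices above (not quotes).
nvme_total_usd = 800       # ~5x 2TB Samsung 970 EVO
hdd_total_usd = 200        # ~1x 10TB Seagate Barracuda
capacity_tb = 10           # both options land at roughly 10TB usable

nvme_per_tb = nvme_total_usd / capacity_tb   # $80/TB
hdd_per_tb = hdd_total_usd / capacity_tb     # $20/TB

print(f"NVMe: ${nvme_per_tb:.0f}/TB, HDD: ${hdd_per_tb:.0f}/TB, "
      f"roughly {nvme_per_tb / hdd_per_tb:.0f}x the price per TB")
```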

If your question is answered, mark it so.  | It's probably just coil whine, and it is probably just fine |   LTT Movie Club!

Read the docs. If they don't exist, write them. | Professional Thread Derailer

Desktop: i7-8700K, RTX 2080, 16G 3200Mhz, EndeavourOS(host), win10 (VFIO), Fedora(VFIO)

Server: ryzen 9 5900x, GTX 970, 64G 3200Mhz, Unraid.

 


43 minutes ago, tjrose91 said:

Should I go for a 5x2TB NVMe RAID setup? It would run at Gen 3 x1 speeds, but it's for mass long-term storage, so I'm not worried about full bandwidth. Or should I go 3x4TB, which would run at full Gen 3 speeds? I am getting a dual NVMe adapter that lets me use two drives without bifurcation. The RAID with more drives would be less expensive, but if I go for the 3x4TB setup I will have almost 10TB filled up right away, since I plan to consolidate my storage with my Steam library or move it back to my RAID 0 SATA setup.

The best way to do NVMe RAID is with a single device that handles the RAID for you, like a PCIe x16 card that takes four M.2 NVMe drives and has a built-in RAID controller, or with an OS like TrueNAS or Unraid on a dedicated machine that has that many M.2 slots.

 

But that being said, using NVMe drives for mass data storage seems like a waste, as others here have pointed out.

If your goal is mass storage with something you want extremely high bandwidth for, a SATA SSD RAID is going to give you all the throughput you'd need to max out a 10Gb Ethernet adapter. If you had a 40Gb Ethernet adapter, a single Gen 4 NVMe drive could saturate that. If you have a 100Gb Ethernet adapter, then maybe you need NVMe RAID, but by that point you'd likely be into the realm of U.2 devices, not M.2 devices.

It's all about optimizing for your needs:

Need the most storage at okay throughput? HDD RAID
Need lots of storage and decent-to-good throughput? SATA SSD RAID
Need a little more than normal storage with godlike throughput? NVMe RAID

There are other things to consider, like access latency and IOPS, but if you are optimizing for those you typically use a single ultra-fast drive, not a RAID. Endurance is also a consideration: if you are doing lots of writes to the array, then you typically want HDDs, or NAS-rated SSDs, which are much more expensive.
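To put very rough numbers on the network-saturation point above (approximate sequential figures; the per-drive speeds are assumptions, not benchmarks, and protocol overhead is ignored):

```python
# Approximate sequential throughput in GB/s -- ballpark figures, not benchmarks.
GB_PER_GBIT = 0.125  # 1 Gbit/s = 0.125 GB/s, ignoring protocol overhead

network_links = {"10GbE": 10 * GB_PER_GBIT,
                 "40GbE": 40 * GB_PER_GBIT,
                 "100GbE": 100 * GB_PER_GBIT}

storage_setups = {"4x SATA SSD RAID 0": 4 * 0.55,  # ~550 MB/s per SATA SSD
                  "1x Gen 4 NVMe": 7.0,
                  "2x Gen 4 NVMe RAID 0": 14.0}

for link, link_gbs in network_links.items():
    for setup, disk_gbs in storage_setups.items():
        verdict = "can saturate" if disk_gbs >= link_gbs else "cannot saturate"
        print(f"{setup} ({disk_gbs:.1f} GB/s) {verdict} {link} ({link_gbs:.2f} GB/s)")
```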


From my experience, today's hard drives are not reliable at all. I've lost three times as many hard drives as solid-state drives since 2010, even though I've owned more SSDs. HDDs are also by far the most-replaced component we have in servers. I just lost an ESX datastore over the weekend and had to recover from backup to a new server.

 

Also, NVMe drives aren't all that much more expensive than SATA these days, especially if you are avoiding QLC.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


2 hours ago, KingTdiGGiTTy said:

Isn't NVMe raid kinda buggy?

I have concerns about the adapter. I don't know how it would work without bifurcation.

I'd bet you'd be better off using SATA SSDs, cheaper and you probably would never notice the difference.

Windows RAID setup. Linus did a video on the adapters; I'm sure it just converts the NVMe drive to the actual PCIe slot.


2 hours ago, OddOod said:

Is the use case purely a Steam library? Or are there other considerations?

Faster load times, and I'm bored and want something to do, but I want a good setup.


44 minutes ago, ewitte said:

From my experience, today's hard drives are not reliable at all. I've lost three times as many hard drives as solid-state drives since 2010, even though I've owned more SSDs. HDDs are also by far the most-replaced component we have in servers. I just lost an ESX datastore over the weekend and had to recover from backup to a new server.

 

Also, NVMe drives aren't all that much more expensive than SATA these days, especially if you are avoiding QLC.

I figured a RAID 0 would be best, as the array would be rarely accessed but always powered on, since it's my main gaming setup.


2 minutes ago, tjrose91 said:

I figured a RAID 0 would be best, as the array would be rarely accessed but always powered on, since it's my main gaming setup.

I wouldn't do RAID 0. One drive goes, everything goes. If you want it to just show up as one drive, you can use Storage Spaces.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


3 hours ago, tjrose91 said:

Faster load times, and I'm bored and want something to do, but I want a good setup.

As demonstrated by LTT previously, there is essentially no load-time decrease with storage faster than a Gen 3 NVMe drive.

5950X/3080Ti primary rig  |  1920X/1070Ti Unraid for dockers  |  200TB TrueNAS w/ 1:1 backup


25 minutes ago, OddOod said:

As demonstrated by LTT previously, there is essentially no load-time decrease with storage faster than a Gen 3 NVMe drive.

From HDD to SSD, yes, it's considerably faster, which is why we don't recommend HDDs for boot drives. But from SSD to SSD there's very little difference, unless you're video editing, and for that I just have a dedicated scratch disk.


9 hours ago, KingTdiGGiTTy said:

Isn't NVMe raid kinda buggy?

I have concerns about the adapter. I don't know how it would work without bifurcation.

I'd bet you'd be better off using SATA SSDs, cheaper and you probably would never notice the difference.

The adapter is normally fine; it converts the NVMe drive to a normal PCIe interface, since NVMe uses the PCIe protocol, if I'm correct.


2 minutes ago, tjrose91 said:

The adapter is normally fine; it converts the NVMe drive to a normal PCIe interface, since NVMe uses the PCIe protocol, if I'm correct.

Which adapter are you using? Most of the cheap dual M.2 adapters are actually for one NVMe and one SATA drive, not dual NVMe.


1 hour ago, Blue4130 said:

Which adapter are you using? Most of the cheap dual M.2 adapters are actually for one NVMe and one SATA drive, not dual NVMe.

I found one on eBay, and after reading the listing I think it may work; if not, there's always eBay buyer protection ( https://www.ebay.com/itm/225363160171 ).


27 minutes ago, tjrose91 said:

I found one on eBay, and after reading the listing I think it may work; if not, there's always eBay buyer protection ( https://www.ebay.com/itm/225363160171 ).

Yep, that is not one of the cheap ones; you should be good. But I would echo what @ewitte says: skip the RAID and just make a Storage Spaces pool if you want one big drive.


On 1/30/2023 at 8:44 PM, Blue4130 said:

Yep, that is not one of the cheap ones; you should be good. But I would echo what @ewitte says: skip the RAID and just make a Storage Spaces pool if you want one big drive.

I was going to use Disk Management on Windows 10 and do striped, i.e. RAID 0. I had two SATA SSDs and it's only at 98%, but I'm going to upload pics soon; I got the card to work.

 


I think it just took a bit for Windows to update. I thought I had cleared my CMOS but hadn't; I was messing around in the BIOS and apparently didn't change anything, though I thought I did. Both drives appear in Windows and operate at full Gen 3 x4 speeds. I plan to get at least two to three PCIe x1 NVMe adapters and put something together over the next couple of months. That way I can dedicate 2TB drives now, or go 5x2TB. I know I'm not going to get full NVMe speeds; it'll be reduced to Gen 3 x1 per drive, times five, which might equal about 5 GB per second.
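The back-of-the-envelope math behind that estimate (PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so roughly 0.985 GB/s usable per x1 link before protocol overhead):

```python
# Theoretical PCIe 3.0 x1 bandwidth: 8 GT/s * (128/130 encoding) / 8 bits per byte
gen3_x1_gbs = 8 * (128 / 130) / 8        # ~0.985 GB/s per lane

drives = 5
aggregate_gbs = drives * gen3_x1_gbs     # five drives, one lane each
print(f"5 drives at Gen 3 x1: ~{aggregate_gbs:.1f} GB/s theoretical aggregate, "
      f"before filesystem and RAID overhead")
```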

[Attached images: IMG20230202140504.jpg, IMG20230202140613.jpg]


4 minutes ago, tjrose91 said:

RAID 0

RAID 0 with NVMe? Are you running a datacenter?

 

RAID 0 is the worst thing to do unless you need obscene speed. You will lose ALL of your data if one drive fails.
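A minimal sketch of why striping hurts reliability, assuming drives fail independently and using a purely illustrative annual failure rate:

```python
# Chance that a RAID 0 array loses everything within a year: the array only
# survives if every member drive survives, so risk compounds with drive count.
def raid0_annual_loss_prob(drives: int, afr: float) -> float:
    return 1 - (1 - afr) ** drives

afr = 0.015  # illustrative 1.5% annual failure rate per drive, not a measured figure
for n in (1, 2, 3, 5):
    print(f"{n} drive(s) striped: {raid0_annual_loss_prob(n, afr):.1%} "
          f"chance of losing the whole array per year")
```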

If your question is answered, mark it so.  | It's probably just coil whine, and it is probably just fine |   LTT Movie Club!

Read the docs. If they don't exist, write them. | Professional Thread Derailer

Desktop: i7-8700K, RTX 2080, 16G 3200Mhz, EndeavourOS(host), win10 (VFIO), Fedora(VFIO)

Server: ryzen 9 5900x, GTX 970, 64G 3200Mhz, Unraid.

 


1 hour ago, Takumidesh said:

RAID 0 with NVMe? Are you running a datacenter?

 

RAID 0 is the worst thing to do unless you need obscene speed. You will lose ALL of your data if one drive fails.

I'm not that worried about losing data with SSDs; I've been using a RAID 0 of SATA SSDs for a Steam library. I won't get insane speeds either, since it will be limited to Gen 3 x1 speeds. Also, a SATA SSD will fail more, as the connectors are more fragile. I could go SATA SSDs in a 6-bay config, but I don't trust my motherboard's SATA ports; I think they are bad. I tried a RAID 1 on 2x10TB HDDs, even new HDDs, and the array would not stay together. At least with NVMe there are fewer points of failure, and it's more for long-term storage; I don't access my roughly 10TB of data much. I would also keep a clone copy on an HDD, or several of them; I might make a NAS or DAS with all my 10TB HDDs, since I've got at least 5x10TB. I got new drives when the warranty expired, and the old drives still seemed to pass SMART.


Yep, after doing some benchmarking with the adapter card, the read speeds seem fine, but the write speeds seem to be cut by up to 75%. I'm sure in a RAID configuration it's not going to make a difference; it will more or less just let me create a storage pool in Windows, or a striped volume.
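For what it's worth, here's a crude way to sanity-check sequential write throughput from Python (a rough sketch only: the test file path is a hypothetical placeholder, OS and SLC caching skew the result, and a purpose-built tool like CrystalDiskMark or fio is the better choice):

```python
import os
import time

# Crude sequential-write check: write 1 GiB to the target volume and fsync
# so the timing isn't just the OS page cache. The path below is a placeholder.
TARGET = r"E:\throughput_test.bin"   # hypothetical path on the RAID volume
CHUNK = 16 * 1024 * 1024             # 16 MiB per write call
TOTAL = 1024 * 1024 * 1024           # 1 GiB total

buf = bytes(CHUNK)
start = time.perf_counter()
with open(TARGET, "wb") as f:
    for _ in range(TOTAL // CHUNK):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())
elapsed = time.perf_counter() - start

print(f"Sequential write: {TOTAL / elapsed / 1e9:.2f} GB/s")
os.remove(TARGET)
```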


18 hours ago, tjrose91 said:

a SATA SSD will fail more, as the connectors are more fragile

LOL, that's not the type of failure that the folks here are discussing.


1 hour ago, Bad5ector said:

LOL, that's not the type of failure that the folks here are discussing.

Oh, I know, hardware failure. But I've had a SATA RAID 0 on two drives and it's been just fine, with regular backups.

