
Fleet of new SSDs for 42 NAS units.

CoreMuncher

So... I run 42 Synology DS918+ NASes. They are all over the UK and provide recording/live view of traffic signals and junctions. I have had the units for a few years and am noticing failures (which are expected) of the installed SSDs. I plan to replace all the SSDs and move from the Crucial MX300 1TB drives (4 drives per NAS) to either the IronWolf Pro or the WD Red (1TB SSD version of either). Other than not liking Seagate (past experience), is there a compelling reason to use either drive?


With such a large deployment, you'd actually have to be the one to tell us which is best.

You're probably going to have the largest sample size for determining the failure rate of the new drives.

The drives you order most likely won't all even be the same drive from the same batch, so whether or not they are reliable even within their own product range is up in the air.


What failure rate do you see? What causes the failures, controller issues or writes? How many hours do the drives have?

 

I'd look at the Intel and Samsung datacenter drives as well here. I have some and they have all been rock-solid drives. Look at something like a D3 S3510 for a 1 DWPD drive.
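For context on what a "1 DWPD" rating buys you: endurance is quoted as full drive writes per day sustained over the warranty period, which converts to total terabytes written (TBW) roughly like this (a sketch with illustrative figures, not numbers from any specific datasheet):

```python
# Rough DWPD -> TBW conversion (illustrative; check the actual datasheet).
def dwpd_to_tbw(capacity_tb: float, dwpd: float, warranty_years: float) -> float:
    """Total terabytes written implied by a DWPD endurance rating."""
    return capacity_tb * dwpd * 365 * warranty_years

# A hypothetical 1 TB drive rated at 1 DWPD over a 5-year warranty:
print(dwpd_to_tbw(1.0, 1.0, 5))  # 1825.0 TBW
```

That's a fair bit more headroom than typical consumer SSDs in the same capacity, which is why the datacenter parts hold up better under constant recording workloads.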


1 hour ago, Electronics Wizardy said:

What failure rate do you see? What causes the failures, controller issues or writes? How many hours do the drives have?

I'd like to know too.

You may want a datacenter SSD; pricey, but with a longer warranty.

Ryzen 5700g @ 4.4ghz all cores | Asrock B550M Steel Legend | 3060 | 2x 16gb Micron E 2666 @ 4200mhz cl16 | 500gb WD SN750 | 12 TB HDD | Deepcool Gammax 400 w/ 2 delta 4000rpm push pull | Antec Neo Eco Zen 500w


5 hours ago, Electronics Wizardy said:

What failure rate do you see? What causes the failures, controller issues or writes? How many hours do the drives have?

 

I'd look at the Intel and Samsung datacenter drives as well here. I have some and they have all been rock-solid drives. Look at something like a D3 S3510 for a 1 DWPD drive.

Out of the 168 drives:

2 arrived DOA.

2 failed in the first 6 months.

0 failed in 6-12 months.

5 failed in 12-18 months.

16 failed in 18-24 months.

Around 90% of the errors I have seen are writes as opposed to controller issues.

I had not considered datacentre drives. Not sure if I can swing for them, however I will give them a look.
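Totting up the figures above (a quick sketch; the bucket boundaries and the split between DOA and in-service failures are my reading of the numbers):

```python
# Failure tally from the counts reported above (42 NAS units x 4 SSDs each).
failures = {
    "DOA": 2,
    "0-6 months": 2,
    "6-12 months": 0,
    "12-18 months": 5,
    "18-24 months": 16,
}

total_drives = 42 * 4
total_failed = sum(failures.values())
in_service = total_drives - failures["DOA"]          # drives actually deployed
in_service_failed = total_failed - failures["DOA"]   # failures after deployment

print(f"{total_failed}/{total_drives} failed overall "
      f"({100 * total_failed / total_drives:.1f}%)")
print(f"{in_service_failed}/{in_service} failed in service "
      f"({100 * in_service_failed / in_service:.1f}%)")
```

Roughly 14% of the deployed drives gone inside two years, with the failures accelerating in the second year, which fits consumer SSDs being written toward the end of their rated endurance.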


19 hours ago, CoreMuncher said:

Out of the 168 drives:

2 arrived DOA.

2 failed in the first 6 months.

0 failed in 6-12 months.

5 failed in 12-18 months.

16 failed in 18-24 months.

Around 90% of the errors I have seen are writes as opposed to controller issues.

I had not considered datacentre drives. Not sure if I can swing for them, however I will give them a look.

How many total writes do the drives have?

I'd really look at those datacenter drives; not that much more expensive, and generally better.

