
What's the reasoning behind this

Spazthe magician
On 2/22/2023 at 8:37 PM, Needfuldoer said:

It's tech folklore. Back in ye olde days, you could make your OS load ever so slightly faster by corralling it in a small partition on the outside edge of a spinning hard drive's platters, where data throughput was a little better. When Windows 9x would fall on its face, you could erase the C:\ drive and reinstall without touching your data on the other partitions.

 

That's completely irrelevant now in the age of 0ms seek times and versions of Windows that don't need to be reinstalled every other Tuesday (unless you constantly tinker with it thinking you can "tune" it better than the developers).

It did come back in the early days of SSDs: when you had a small SSD (32 or 64 GB), you'd use it for Windows and any programs you really needed fast load times for, and put all your games on a second HDD, because once loaded they typically never suffered from slow storage.

 

Things changed again: SSDs are now big enough, and some games need (or at least benefit from) faster storage.

 

 

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.  


I know this question was posted by a home user but I thought I would give an enterprise perspective.

 

My company has over 30,000 servers that need patching each month. Sometimes more often.

Patching is when we see the highest risk of systems not coming back up after a restart. (We get a few every week.)

By being able to take an image of the OS drive only, prior to patching, we can quickly roll back to a known-good state if we have any issues. (We back up data using separate processes.)
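The snapshot-before-patching workflow described above can be approximated on a Linux host with LVM snapshots. This is a hedged sketch, not the poster's actual tooling (a fleet of 30,000 servers would use imaging/automation software): the volume group `vg0` and volume names are hypothetical.

```shell
# Sketch only: assumes the OS lives on an LVM logical volume vg0/root
# with free extents available in the volume group. Names are hypothetical.

# 1. Take a point-in-time snapshot of the OS volume before patching.
lvcreate --size 10G --snapshot --name root_prepatch /dev/vg0/root

# 2. Apply patches and reboot. If the system comes back healthy,
#    discard the snapshot:
lvremove /dev/vg0/root_prepatch

# 3. If the patched system fails to come up, roll back instead:
lvconvert --merge /dev/vg0/root_prepatch
# (the merge back into vg0/root completes on the next activation/reboot)
```

Note this only works cleanly when the OS sits on its own volume; with data on the same volume, rolling back would also revert the data, which is exactly the point being made.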

 

Also: OS drives do not normally need to be resized often. Large companies run data drives lean, with minimal free space, as a cost-management measure. Imagine having 1 TB free per server; at my company that would be 30 PB wasted (less so with lazy provisioning). We run much leaner and use dynamic resizing as needed through automation.
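The back-of-envelope figure checks out (using decimal units, 1 PB = 1,000 TB):

```shell
# 30,000 servers, each with 1 TB of idle free space:
servers=30000
free_tb=1
echo "$(( servers * free_tb / 1000 )) PB of unused capacity"
# prints "30 PB of unused capacity"
```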

Whilst you can often resize a data volume while the system is running, you cannot do the same for an OS drive. So separation makes sense again for the enterprise.
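As an illustration of that online-resize point: on Linux with LVM, a data volume can typically be grown while mounted and in use. A sketch under assumed names (`/dev/vg0/data` and its filesystem are placeholders, and the volume group is assumed to have free extents):

```shell
# Grow a mounted data volume online; no unmount or downtime needed.
lvextend --size +100G --resizefs /dev/vg0/data   # --resizefs grows the fs too

# Equivalent two-step form for ext4:
#   lvextend -L +100G /dev/vg0/data
#   resize2fs /dev/vg0/data                      # online growth works while mounted
```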

 

Some enterprises will configure servers to boot locally but use SAN storage for data. In this case the apps are installed to OS drive but all data stored externally. This is how LTT does it.

 

I think it still has value for the home user today, but it is dependent on the individual use case.

My home setups mostly have separation, perhaps due to my age and what I'm used to. I generally have a 512 GB or 1 TB flash drive (NVMe or SATA SSD) as the OS drive and a 5-10 TB drive as a mass-data volume in my desktop PCs; my laptops have a single 512 GB volume.

 

I don't back up my OS drives often, as they can generally be rolled back to the last image plus patching, but my mass-data volumes are all backed up to an attic server and/or cloud storage.

