Today we have a very special issue of Just Ask Matt. OK, it’s not really that special. But this question was not actually directed to me specifically. Still, I am going to be offering my thoughts on this matter in addition to asking you for yours.
Today, David writes:
I recently accepted a job maintaining a small network (4 servers, 51 clients) for an educational facility. The configuration that was originally established here forces each client to automatically re-image after each shutdown. While this is great from the standpoint of not having to reset settings and erase downloaded junk after students use the computers, I am concerned that this constant repartitioning / rewriting will wear out the internal HDD parts from excessive use.
I have seen software that allows you to benchmark an HDD's speed, and other software that monitors HDD temperature, but I'm wondering if these metrics will really tell me anything without previous data to compare against as a reference. I vaguely recall seeing a program that claimed to predict the mean time to failure using SMART, but I've forgotten what it was called, and my Google searches didn't turn up any programs that rang a bell.
So, I guess I’ve got two questions for you:
- Do you think that daily re-imaging really puts any more strain on a drive than, for example, a home system where the user is adding, deleting, moving, and defragmenting files on an HDD? And, if so,
- What would you recommend to try to determine which drives are in the worst condition… or in other words, which ones should be replaced first?
As for question number one, I feel that it is six of one, half a dozen of the other. Either way, there is going to be a fair amount of read/write activity. If we want to get technical, though, I would imagine that the active home user generates a little more "usage" than the daily re-imaging does. That, I suspect, is likely to stir quite a bit of debate.
So, you need to determine which drives are worst off and should be attended to before total failure. Sounds like a good, preemptive plan to me. When this question first reached me, a piece of software came to mind that actually did a fair job with this kind of prediction. Unfortunately, the person who was going to remind me of the program's name has long since forgotten it as well. To be honest, that's easy to have happen, given all of the apps that IT pros work with on a daily basis.
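While I can't recall that particular program, the underlying data it almost certainly relied on is SMART, and you can read it directly. As a minimal sketch, assuming the smartmontools package is installed and using /dev/sda as a placeholder device name (substitute each client's actual drive):

```shell
# Quick pass/fail health verdict from the drive's own self-assessment:
smartctl -H /dev/sda

# Full SMART attribute table. The attributes most worth watching for
# impending failure are Reallocated_Sector_Ct, Current_Pending_Sector,
# and Offline_Uncorrectable -- non-zero raw values are an early warning:
smartctl -A /dev/sda

# Pull out just the raw reallocated-sector count from the attribute
# table (the attribute name is field 2, the raw value is the last field;
# exact column layout can vary slightly by drive and firmware):
smartctl -A /dev/sda | awk '$2 == "Reallocated_Sector_Ct" { print $NF }'
```

Run the same commands on each of the 51 clients and rank the drives by those raw counts; the ones climbing fastest are the ones to replace first.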
So this brings us to you, the readers. What are your thoughts? I am interested in getting your input on this matter. Just use the e-mail link above to send us your suggestions.