HDD specification sheets list a reliability figure called MTBF (mean time between failures). For example:
The Seagate 15K.4 has an MTBF of 1,400,000 hours (about 160 years at 24/7 use).
A typical consumer hard drive might be around 500,000 hours (about 57 years at 24/7 use).
My initial reaction is that it makes no difference whatsoever, as you likely won't have the drive around much past ten years. From what I read on one website, they run, say, 10,000 drives for 100 hours and one drive fails. They then calculate MTBF as total drive-hours divided by failures (10,000 drives x 100 hours / 1 failure = 1,000,000 hours). I don't know what statistical evaluation they actually use, but if it's the above formula, I can't see how it represents real-life conditions, because it doesn't factor in wear and tear.
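The arithmetic described above can be sketched like this (this is just my reading of the method; `mtbf_hours` is my own helper name, and the actual vendor test protocol may differ):

```python
def mtbf_hours(drives: int, test_hours: float, failures: int) -> float:
    """Estimate MTBF as total accumulated drive-hours divided by failures."""
    return drives * test_hours / failures

# 10,000 drives run for 100 hours with 1 failure -> 1,000,000 hours MTBF
print(mtbf_hours(10_000, 100, 1))  # 1000000.0

# Converting a quoted MTBF into "years of 24/7 use":
HOURS_PER_YEAR = 24 * 365
print(1_400_000 / HOURS_PER_YEAR)  # ~159.8 years
print(500_000 / HOURS_PER_YEAR)    # ~57.1 years
```

Note how the same 1,000,000-hour figure could come from many short tests on many drives, which is exactly why it says nothing about a single drive surviving 100+ years.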
It's like driving ten identical cars for 100 miles each, seeing one break down, and trying to predict from that how one of those cars will hold up after 100,000 miles.
Does anyone know what the real-life significance of 500,000 vs. 1,400,000 hours MTBF is, and what statistical method is used to arrive at these figures?