
Refreshing data on the disk



I read this thread on VOGONS a while ago and also found this article. Now I'm considering running the process on one or two of my disks (HDDs) that hold the most data that hasn't been touched for years. I wonder whether there are any benefits in the long run, or whether it depends more on external factors like hot/cold temperatures and humidity, which aren't an issue here. The drives aren't stored in a closet; they're plugged into the computer and have data read from or written to them occasionally, since they're used as data disks.

It seems like the kind of topic where it's difficult to get unbiased advice. What's curious to me about that article is that it says to run the process once a year, assuming normal conditions. Obviously, the process itself causes some wear and tear. I did check a few of the old files that had sat on the drive as-is for over 12 years, and their checksums match. Most of the stuff was downloaded from the internet, which is how I was able to verify those few. And the first guy on VOGONS did say the mechanics will probably fail earlier (presumably before one has to worry about the data getting scrambled).
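A minimal sketch of that kind of checksum check, for anyone curious how it can be done. It assumes Python; the folder layout and manifest name are made up, and the script is illustrative only:

# checksum_manifest.py - minimal sketch for spotting silently changed files.
# The manifest name and folder layout are hypothetical; adjust to your setup.
import hashlib
import os
import sys

def sha256_of(path, chunk_size=1024 * 1024):
    # Hash the file in chunks so large files don't have to fit in RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def create_manifest(root, manifest="manifest.sha256"):
    # Record a hash for every file under root.
    with open(manifest, "w", encoding="utf-8") as out:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                out.write(sha256_of(full) + "  " + os.path.relpath(full, root) + "\n")

def verify_manifest(root, manifest="manifest.sha256"):
    # Re-hash every recorded file and report mismatches (possible corruption).
    problems = 0
    with open(manifest, encoding="utf-8") as f:
        for line in f:
            digest, rel = line.rstrip("\n").split("  ", 1)
            full = os.path.join(root, rel)
            if not os.path.exists(full) or sha256_of(full) != digest:
                print("MISMATCH or missing:", rel)
                problems += 1
    print("done,", problems, "problem file(s)")

if __name__ == "__main__":
    # usage: python checksum_manifest.py create|verify <folder>
    action, folder = sys.argv[1], sys.argv[2]
    create_manifest(folder) if action == "create" else verify_manifest(folder)

Run it once with "create" while the files are known to be good, then with "verify" whenever you want to check them again.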

Thoughts?



From my very long experience with various HDDs, a year is too short; I'd say refresh data every 2-3 years or so. IMPORTANT data needs to be on three different backup storage devices, preferably also in different locations (your second/third house, for example). If you have something to hide, then obviously NOT your house. I even used to store data in a very humid environment (with precautions taken). BUT 12-year-old HDDs (if I read your post correctly, they were constantly on): the only place for them is a dumpster, sorry, they aren't safe to store anything on. Don't buy the new ones from Toshiba, and don't buy the ones that have the "new" crappy SMR tech, like the WD60EFAX or any Toshiba model.



SpinRite is a good tool for checking and refreshing old HDDs, presumably smaller ones; I don't know whether there are any size restrictions. I used SpinRite 6.0 in the past, and it helped me refresh an old HDD and check a few pending sectors on it. The tool DiskFresh also sounds promising, though.


This whole thing:

1) makes no sense whatsoever
2) even if it did make sense in some very specific, niche situation, we don't have any meaningful, valid data to support the method, let alone the frequency at which it should be applied

If you feel good refreshing your data, do it.

If you feel good refreshing your data every year, do it every year; if you feel good doing it every 2-3 years, do it every 2-3 years.

You will lose some data sooner or later anyway (or possibly you will never lose any), but there is no way to know in advance, nor any way to know whether this strategy contributed in any meaningful way to the outcome (whatever that happens to be).

Replicating data (having multiple copies, on different media and stored in different locations) is an effective strategy, though it is difficult to implement, let alone maintain over the years.
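As a rough illustration of what "multiple copies plus verification" amounts to in practice, here is a minimal sketch. The drive letters and folder names are hypothetical, and a real setup would rather use robocopy, rsync or dedicated backup software:

# mirror_and_verify.py - minimal sketch of keeping a second copy and checking it.
import filecmp
import os
import shutil

def mirror(src, dst):
    # Copy anything missing or newer from src into dst (one-way, no deletions).
    for dirpath, _dirs, files in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(dirpath, name)
            d = os.path.join(target_dir, name)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)

def verify(src, dst):
    # Byte-compare every source file against its copy and report differences.
    for dirpath, _dirs, files in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        for name in files:
            s = os.path.join(dirpath, name)
            d = os.path.join(dst, rel, name)
            if not os.path.exists(d) or not filecmp.cmp(s, d, shallow=False):
                print("copy differs or is missing:", os.path.join(rel, name))

if __name__ == "__main__":
    mirror(r"D:\data", r"E:\backup\data")    # hypothetical source and destination
    verify(r"D:\data", r"E:\backup\data")

The same script pointed at a third destination gives the third copy; keeping one of the copies off-site is the part no script can automate.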

The only thing that promised (whether it delivered remains to be seen) long enough data retention were (are?) M-DISCs:

https://en.wikipedia.org/wiki/M-DISC

jaclaz

 

 


7 hours ago, jaclaz said:

This whole thing:

1) makes no sense whatsoever

Unfortunately, you are wrong. There are indeed physical reasons for losing data. I'll give you a hint: it's called magnetism when it comes to HDDs. But even with SSDs, there are physical processes that can lead to data loss.
Here is a German article to bring you a little closer to the subject:  
https://www.computerwoche.de/a/der-langsame-tod-von-festplatten-und-ssds,3549906
Use a translator if German is not one of your languages! Anyway, @UCyborg's concerns are fully justified and understandable.

AstroSkipper


3 hours ago, AstroSkipper said:

Unfortunately, you are wrong. There are indeed physical reasons for losing data. I'll give you a hint: it's called magnetism when it comes to HDDs. But even with SSDs, there are physical processes that can lead to data loss.
Here is a German article to bring you a little closer to the subject:  
https://www.computerwoche.de/a/der-langsame-tod-von-festplatten-und-ssds,3549906
Use a translator if German is not one of your languages!

AstroSkipper

AstroSkipper, I agree with you 100%! I'd also add simple mechanical failures due to dried-out motor bearing lubricant! And/or dust inside (yes, dust inside newly made Japanese Toshiba/Hitachi drives). jaclaz may just be too young to remember the "DeathStar" (Deskstar) HDDs from IBM/Hitachi, so let's not be too hard on him...

https://en.wikipedia.org/wiki/Deskstar


46 minutes ago, D.Draker said:

AstroSkipper, I agree with you 100%! I'd also add simple mechanical failures due to dried-out motor bearing lubricant! And/or dust inside (yes, dust inside newly made Japanese Toshiba/Hitachi drives). jaclaz may just be too young to remember the "DeathStar" (Deskstar) HDDs from IBM/Hitachi, so let's not be too hard on him...

https://en.wikipedia.org/wiki/Deskstar

There are many different reasons for failures and data loss with HDDs and SSDs. But this thread is about refreshing data on disks. That's why I didn't mention other failures or problems, which of course can occur at any time. :)


4 hours ago, AstroSkipper said:

There are many different reasons for failures and data loss with HDDs and SSDs. But this thread is about refreshing data on disks. That's why I didn't mention other failures or problems, which of course can occur at any time. :)

Lubricants drying out is related to preventing data loss. One would have to turn on the HDD once in a while; by doing this you prevent it from getting stuck.

 

1 hour ago, D.Draker said:

Lubricants drying out is related to preventing data loss. One would have to turn on the HDD once in a while; by doing this you prevent it from getting stuck.

 

Yep! That's right. And we could list many other strategies for preventing data loss and disk failures. :yes: But this thread is about the periodic refresh of disks to preserve the data on these media as long as possible. And this can indeed be done with tools like DiskFresh, SpinRite and presumably some others. However, the intervals at which something like this should be carried out are quite flexible and rather arbitrary. TBH @D.Draker, that was actually what @UCyborg wanted to discuss here.


Well, you completely missed the point.

There is no doubt that data - over time - may be lost due to failure of the media/device it resides on.

The point is that there is no reliable real world data on what exactly causes the media or device to fail, nor how soon (if ever) this may happen.

ALL other possible reasons are relevant, as the point is whether the data is readable or not.

With CDs and DVDs the media is separate from the device, so if the media is prone to failure, it makes a lot of sense to copy the data from the old media and write it to new media.

On a hard disk the media and the device are the same thing; refreshing data (on the "same" device) at an elemental level (a single byte), you are going to read (say) AA and write the exact same AA in the exact same place.
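For illustration only, a minimal sketch of that read-and-rewrite-in-place idea, applied to an ordinary file rather than the raw device. Tools like DiskFresh or SpinRite work below the filesystem, sector by sector and with proper error handling, which this sketch does not attempt; the path is made up:

# refresh_in_place.py - minimal sketch of the "read AA, write AA back" idea.
import os

def refresh_file(path, block_size=1024 * 1024):
    # Read each block and write the identical bytes back to the same offset.
    # Whether the same physical sectors are actually rewritten depends on the
    # filesystem; copy-on-write filesystems will write the data elsewhere.
    with open(path, "r+b") as f:
        offset = 0
        while True:
            f.seek(offset)
            block = f.read(block_size)
            if not block:
                break
            f.seek(offset)
            f.write(block)              # same bytes, same place
            offset += len(block)
        f.flush()
        os.fsync(f.fileno())            # push the rewrite out of the OS cache

if __name__ == "__main__":
    refresh_file(r"D:\archive\old_backup.zip")   # hypothetical path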

There is no evidence that the newly written AA is "better" than the previous AA or that it will last any longer, and anyway there is no way to know in advance whether any of the tens of other issues the device may experience (both at the hardware and the software/firmware level) may make that data unreadable.

Now, if you copy the data (as files, or as a dd copy or image) to a new device, you have a better chance that the new device will last longer, but no certainty whatsoever.

Since you (we, no one) do not know how often this is needed (it could be a year, 2-3 years, 5 years; we simply don't know), how often are you going to do this refresh?

Let's take two years: you go for your refresh and find out that for *some reason* your AA is no longer readable; your data is lost and you cannot refresh anything.

Let's take one year: you go for the refresh and perform it successfully.

Then you do it again, one year later, but this time you find out that for *some reason* your AA is no longer readable; your data is lost and you cannot refresh anything.

Now, if you had (as theoretically needed) two other copies of the data, you could attempt to make a new third copy from one of the other two; without this you are essentially flipping a coin every n months/years.

So if we had thousands of (accurate/correct) reports by thousands of people that use ALL these approaches:
1) never refresh
2) refresh every year
3) refresh every 2 years
4) refresh every 3 years

with the SAME data, using the SAME make/model of hard disk, kept in the same room/climate, then after a few years we would have some data to decide which strategy is the best one (and that would be accurate only for one, or possibly two, generation(s) of hard disks, and possibly not applicable to the then-current generation of disks).

In a nutshell, choosing this (or that) strategy with the reports we actually have (none) is simply an act of faith in something intangible.

Then we will start with the anecdata: people who lament losing their data (and never did a refresh) will be criticized by those who refresh their data every 3 years and never lost theirs; people who lament losing their data even though they do a refresh every 3 years will be told that they should have done it yearly, etc., etc.

@D.Draker

Unfortunately I am old enough not only to know about the Deskstar/Deathstar, but also to have had quite a few of them fail at the firm I was working with at the time.

@AstroSkipper

Here is one source (recommending DiskFresh) stating that data refresh should be performed much more often (the page is referenced on the DiskFresh page by Puran Software, but it is long dead):

https://www.puransoftware.com/DiskFresh.html

https://web.archive.org/web/20160413062810/http://www.fact-reviews.com/info/diskfresh.aspx

>In order to keep the data signal from fading, you need to re-write the data. This is often known as “hard disk maintenance”, and should be done 3 or 4 times a year.

...

>A regular (quarterly) refresh of all hard disk drives will help the drive detect and fix errors before they turn into problems, and keep the data integrity intact. Don't forget to refresh any external USB drives you may use for backup purposes.

The procedure is recommended every 3-4 months (by the actual seller of the software that does it, who should know how it works).

How were the 1-year or the 2-3-year intervals determined, then? (That is 3 to 12 times the recommended interval.)

Isn't it queer that, if the procedure is so needed and needed so often, there is only one program to do it (besides SpinRite [1])?

jaclaz

[1] which has its own list of doubtful claims; there are endless critiques of Steve Gibson and his works/programs, the most benevolent ones saying that he tends to exaggerate greatly (either the seriousness of the issues or the capabilities of his software to fix them).


1 hour ago, jaclaz said:

Well, you completely missed the point.

Unfortunately not!

1 hour ago, jaclaz said:

There is no doubt that data - over time - may be lost due to failure of the media/device it resides on.

The point is that there is no reliable real world data on what exactly causes the media or device to fail, nor how soon (if ever) this may happen.

ALL other possible reasons are relevant, as the point is whether the data is readable or not.

With CDs and DVDs the media is separate from the device, so if the media is prone to failure, it makes a lot of sense to copy the data from the old media and write it to new media.

This thread is not about common failures of HDDs or SSDs.

1 hour ago, jaclaz said:

There is no evidence that the newly written AA is "better" than the previous AA or that it will last any longer, and anyway there is no way to know in advance whether any of the tens of other issues the device may experience (both at the hardware and the software/firmware level) may make that data unreadable.

That's what the topic is about. And you are wrong, unfortunately. There is evidence, and there are physical reasons, that the newly written data is in better shape than the old data. Read the linked articles and think about magnetism! On HDDs, stored bits can unfortunately reverse their magnetic polarity.

AstroSkipper


 

Evidence is not a (single, apocryphal) article that expresses opinions.

But no problem whatsoever; everyone is free to believe whatever he/she wishes to believe, as long as he/she is happy. :yes:

And now, for no apparent reason, the Get Perpendicular movie by Hitachi (2005) featuring the superparamagnetic effect:

 

jaclaz

 


18 minutes ago, jaclaz said:

Evidence is not a (single, apocryphal) article that expresses opinions.

It is not a single article. The linked articles are just examples to inform users like you. Physics, and how magnetism works, is evidence enough. And that doesn't have much to do with opinions, either. :no: Simply put, magnetic information can change and its original state can unfortunately be lost, which is simply unavoidable, whether you want to believe that or not. Unfortunately, physics does not take your beliefs into account.

AstroSkipper

 

