The same is true for the Zip media I have. Since I have far fewer Zip disks than floppies and the drive is external, I decided to get rid of these first:
- copy the data of each Zip disk with rsync into a separate directory
- back up all this data onto a second hard disk
- calculate the md5sum of each file and remove the duplicates
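The copy-and-deduplicate steps above could be sketched roughly like this. This is my own sketch, not the exact commands from the post; the archive path is an assumption, and the script keeps whichever duplicate sorts first by hash and name:

```shell
#!/bin/sh
# Sketch of the workflow; paths are placeholders, adjust to taste.

# 1. Copy one mounted Zip disk into its own directory (repeat per disk):
#    rsync -a /mnt/zip/ ~/archive/zips/disk01/

# 2. Back the whole archive up onto a second hard disk:
#    rsync -a ~/archive/zips/ /mnt/backup/zips/

# 3. Hash every file, and delete all but the first copy of each
#    duplicate. md5sum prints "HASH  PATH"; the 32-char hash plus two
#    spaces means the path starts at column 35.
ARCHIVE="${1:-$HOME/archive/zips}"
find "$ARCHIVE" -type f -exec md5sum {} + \
  | sort \
  | awk 'seen[$1]++ { print substr($0, 35) }' \
  | while IFS= read -r dup; do
      rm -- "$dup"
    done
```

Note that md5sum only catches byte-identical files; two versions of the same document with a one-character difference are both kept.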
There is no need to spend (lose) time sorting the data. I haven't needed to access it for years, and I won't need to in the future. And if I ever do, the new index and search tools like Beagle will find it anyway, and faster than I could, no matter how the data is sorted.
Why keep the data at all if I think I won't need it in the future? Nostalgia, like the stuff from university. Reference material: some mails, letters, etc. And as I said, the data hardly takes any space. We are still talking about megabytes at a time when hard disks start to be measured in terabytes. Why spend time reducing the space used by the old data from 0.1% to 0.05%?