Really good backup software can help a lot with that. Unfortunately, the backup software I like and rely on (BackInTime - included, or available through the software manager, in most Linux distributions) has only an almost-equivalent among free software for Windows. And one thing the almost-equivalent really falls down on is how to get a new installation of it to recognize and pick up on backups from an old installation.
(I recently replaced my hard drive. The first time I ran the backup software's GUI after the replacement, it said "hey, I don't have a configuration file - is there an old backup?" I told it yes and guided it to the top of the most recent backup. From there it automatically picked itself up and was ready to resume where the previous installation had left off. That's how these things should work.)
--- Warning - I'm about to get somewhat technical. ---
One reason the Windows almost-equivalent falls down is a legacy issue: it's willing to put its backups on disks formatted with the FAT file system characteristic of MS-DOS and of Windows up to and including Windows ME (but not Windows NT except for very early versions, or Windows 2000). The structure of FAT file systems does not support having two directory entries pointing at the same file - disk scans consider this an error, and deleting either directory entry causes the space the file occupies to become "vacant" and available for reuse. More sophisticated file systems, such as those used in Unix a decade before MS-DOS was pulled out of the scrap-pile and made a major OS, intelligently handle having multiple directory entries ("hard links") for a single file.
This matters when you want to have two full backups. If the backups are on a FAT-formatted disk, two full backups take twice the space of one. If they are on NTFS or EXT3 or... um... whatever OSX uses, there can be only one copy of the files that haven't changed, with two (or more) directory entries pointing at it. On a home computer usually the large majority of files - by both count and space - don't change for months or years at a time, so there's a HUGE savings. In the backup job that protects my writing and related stuff, one full backup takes about 1.4 GB, but most additional full backups actually take no more than 1/1000 of that space. (Making a new directory entry pointing at an existing file is also faster than making a new directory entry and copying a file.)
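You can see the multiple-directory-entries trick directly from Python. This is a toy sketch (the file names are made up), run on a Unix-style file system - exactly the thing FAT can't do:

```python
import os
import tempfile

# Work in a scratch directory on a filesystem that supports hard links.
d = tempfile.mkdtemp()
original = os.path.join(d, "draft.txt")
with open(original, "w") as f:
    f.write("chapter one\n")

# A hard link is just a second directory entry for the same file.
backup = os.path.join(d, "draft-backup.txt")
os.link(original, backup)

# Both names point at the same inode; the data exists once on disk.
assert os.stat(original).st_ino == os.stat(backup).st_ino
print(os.stat(original).st_nlink)  # prints 2 - two entries, one file

# Removing one name does NOT free the space; the other entry survives.
os.remove(original)
with open(backup) as f:
    print(f.read(), end="")  # prints "chapter one"
```

Note the last step: unlike FAT, deleting one directory entry doesn't vacate the file's space - the space is only freed when the *last* entry goes away.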
One effect of this is that there is no tradeoff between backup frequency and space, and a minimal tradeoff between backup frequency and system performance. (The space question is reduced to identifying work files that don't need to be backed up because they'll be automatically rebuilt as needed or are used only for short-term storage.) The folder my writing is in gets backed up every two hours - if something has changed. In two hours I don't change enough stuff to notice the momentary slowdown while the computer identifies and copies a tiny handful of files and builds new directory entries for everything that didn't change.
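The core of that snapshot scheme fits in a few lines. This is a hypothetical toy version, not any real tool's code (BackInTime itself is built on rsync), and it only handles a flat directory of files - but it shows the idea: copy what changed, hard-link what didn't:

```python
import filecmp
import os
import shutil

def snapshot(source, prev_snap, new_snap):
    """Toy incremental snapshot: copy changed files, hard-link unchanged ones.

    A sketch of the hard-link snapshot idea; ignores subdirectories.
    """
    os.makedirs(new_snap)
    for name in os.listdir(source):
        src = os.path.join(source, name)
        if not os.path.isfile(src):
            continue  # toy version: files only
        old = os.path.join(prev_snap, name) if prev_snap else None
        if old and os.path.exists(old) and filecmp.cmp(src, old, shallow=False):
            # Unchanged since the last snapshot: a new directory entry,
            # zero additional data blocks.
            os.link(old, os.path.join(new_snap, name))
        else:
            # New or modified: this one actually costs disk space.
            shutil.copy2(src, os.path.join(new_snap, name))
```

Each snapshot directory looks like a complete full backup when you browse it, but only the handful of changed files consume new space - which is why a second "full" backup can cost a thousandth of the first.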
But if you allow the backup to be on FAT, you can't do that.
So the Windows almost-equivalent fakes it, by giving backup copies of all files new names and moving the real directory structure and file names into a database - replicating in software what the more sophisticated file systems do. Which means your backups are pretty much useless unless you have a special plugin installed and configured to recognize those backups. And getting a new installation of the software to recognize old backups is a pain - or at least it was last I looked. (Which was last year, so maybe they've recognized this problem and fixed it.)