We have an NAS that stores critical files. It’s due to be replaced, but in the interim I’d like to automate/improve the backup process. Right now I wait until no one is working off the device, select folders, copy/paste onto another networked drive, let it run for several hours, and then delete the second-to-last backup folder (don’t laugh, at least I’m doing backups). This, obviously, is a giant PITA, imperfect, and takes up much more space than needed. I’m doing it this way because the device has no software/firmware to back itself up.
I have a Windows 7 machine, and am willing to purchase software if necessary. I’ve heard people refer to Rsync, but if that’s the way to go I hope someone here can walk me through exactly what I need to do to have it:
[ol]
[li]Do full, periodic backups, keeping only the two most recent full backups and deleting the older one as it’s replaced;[/li]
[li]Run nightly incremental backups of any file that has changed;[/li]
[li]Safely/securely compress the backups so I don’t run out of space; and[/li]
[li]Wake up my PC in the middle of the night so I don’t have to.[/li]
[/ol]
Again, if this is too much to do safely and reliably and there is software out there that does all this (Acronis?) with customer support, it would be worth the fifty bucks.
Bonus question #1: Does any software out there verify the integrity of backups? Something like an elaborate checksum on the compressed files and a warning if anything is wonky with them or the drive? Or do I need to keep doing this manually?
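(By “manually,” I mean something along these lines: hash the same file on the NAS and on the backup copy and eyeball the results. certutil ships with Windows 7; the paths here are just placeholders for whatever file you spot-check.)

[code]
rem Spot-check one file: run both and compare the two hashes by eye
certutil -hashfile M:\Critical\ledger.qbw SHA1
certutil -hashfile F:\BACKUP\Critical\ledger.qbw SHA1
[/code]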
Bonus question #2: For my work and home PC (same network), I am using Windows backup to keep files on another NAS. Is Acronis (or whatever other software you’d recommend) that much better than Windows backup? Their 3-license pack is only 30 bucks more than one license, and I have a laptop that could use it if it’s a $30 improvement over Windows Backup.
Bonus question #3: What the hell am I supposed to do about the Mac? Mrs. Devil keeps all data files on the NAS, but has a ton of fonts and several programmes (e.g. Adobe Design Suite) that she uses in our business daily. While the data is safe and we have all the software CDs/licenses, the couple of times we’ve had to restore the entire system were somewhat nightmarish time-wise. She loves her Mac but knows little about OSX. Does the Mac have built-in software to help?
IT Pro for 13 years chiming in.
Acronis is worth the money if you’ve got it to spend.
If not:
I’d suggest looking at Robocopy.
You can script it to run in the dead of night without too much work.
It’s really more oriented around a single backup, rather than two distinct backup states.
It mirrors, it doesn’t really compress.
It is roughly analogous to rsync.
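A bare-bones sketch of what I mean; the share name, destination, and log path are placeholders for your own setup:

[code]
rem Mirror the NAS share to a local or USB drive nightly.
rem /MIR deletes destination files that no longer exist at the source, so aim it carefully.
robocopy \\NAS\share F:\Backup /MIR /R:2 /W:30 /NP /LOG:C:\robocopy.log

rem One-time setup of a nightly 2 AM run through Task Scheduler:
schtasks /create /tn "NightlyNASBackup" /tr "C:\backup.cmd" /sc daily /st 02:00
[/code]

Put the robocopy line in backup.cmd and the scheduled task handles the dead-of-night part.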
Does it do things that Windows Backup can’t do, do them better, or some combination? I was hoping Acronis had a product comparison page that covered their products and others (including Windows Backup), but no dice.
That said, I’ve heard the name mentioned often enough by Dopers whose tech knowledge I respect enough that I can go on faith.
Hmm… I hadn’t spotted that this was for a home network. For backing up the local PCs, have you looked at OSX Lion Server? It will mean a dedicated machine, though.
I’d second Mr. Slant’s Robocopy suggestion. In the past, I’ve had to rely on it for months to back up a legally sensitive domain at a fairly large enterprise (we were suffering chronic tape failures). You can just script it and forget it, but bear in mind that, depending on the dependent systems (and who depends on wearing Depends… but it all depends), you will likely have some failures in there; still, the bulk of the data that isn’t in use at that moment will get Robocopied just fine, and you can catch the stuff that’s in use on the next cycle.
As previously mentioned, continuous backup will cause service degradation; you’re chewing up all kinds of CPU, memory, disk, and bandwidth (and accelerating hardware failure to a degree).
One caveat I’d offer is that, if this is an audit-type of concern, auditors sometimes don’t understand that backups tend to be “best effort.” Perfect backups can be rather elusive in most shops.
I’d second the suggestion of Acronis True Image to back up the Windows 7 PC. I use it, although not for any sort of automated backup; I just rerun backups to multiple external drives every couple of weeks, and also have MozyHome continuously back up my most critical files.
Crap. I just got done with a customer service chat at Acronis. The agent said it won’t back up an NAS device–that I should be looking at their enterprise software.
Our desktops have very little data on them–imaging them to speed recovery if need be is what’s important. It’s the NAS that holds the critical files, and that’s what I’ve been manually backing up.
I am looking at a new NAS (one that has its own backup solution), but there are other issues in making that change and I was hoping to tide us over.
I’m not quite sure I understand the technical details of why Acronis can’t work with a mapped network drive. Are there actual problems (e.g. failure if the drive isn’t connected), or do they just want to steer that type of business to their enterprise line? Maybe I should just download the trial and try anyway.
Robocopy might be the way to go, but I’d probably wimp out and go to the Marketplace or vWorker for the scripting side of it. This isn’t for an audit (just our own peace of mind), but getting it ‘right’ the first time will save a lot of frustration.
Good call on Time Machine. I looked it over and have a spare external drive to plug in. That should take care of getting the Mac up and running in case of failure.
Simplest - from the DOS days…
Use the “AT” command to schedule a job for when everyone has left for the night.
Set up a batch file that will XCOPY everything from the network drive to a USB drive.
In a backup.bat that runs each night…
XCOPY M:\*.* F:\BACKUP\ /S /Y /C etc.; check for the options you need.
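Put together, a minimal sketch might look like this; the drive letters, file names, and schedule are placeholders for whatever fits your setup:

[code]
@echo off
rem backup.bat - nightly copy of the mapped NAS drive (M:) to a USB drive (F:)
rem /S = include subdirectories    /Y = overwrite without prompting
rem /C = keep going on errors      /D = copy only files newer than the destination copy
xcopy M:\*.* F:\BACKUP\ /S /Y /C /D >> C:\backup.log
[/code]

Then schedule it once from a command prompt: AT 02:00 /every:M,T,W,Th,F cmd /c C:\backup.bat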
I have used this on several customers’ systems. It can be scheduled, can handle a variety of job types and timings, and can also be set to ignore unchanged files so later backups run faster.
Since you mentioned compression…
I am not a fan of anything that stores backups of mission-critical files in a format that can’t be read by a normal Windows file system. Bare-metal OS and system backups, fine, but I always want to be able to pick out specific files without wondering whether I can extract them individually from a backup archive that requires support from a company that may or may not still be in business. I don’t want to have to reload a server from last week’s image just to get a QuickBooks company file back.
I used robocopy for years in one place. Never actually had a problem with it. Scripted it and even got an email log nightly.
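The email part was just a couple of lines bolted onto the script. I don’t have the original handy, but on a Windows 7 box the same idea looks roughly like this (the addresses, paths, and SMTP server are placeholders):

[code]
rem Nightly mirror with a log, then mail the log out
robocopy \\NAS\share F:\Backup /MIR /R:2 /W:30 /NP /LOG:C:\Logs\backup.log
powershell -Command "Send-MailMessage -From 'backup@example.com' -To 'you@example.com' -Subject 'Nightly backup log' -Attachments 'C:\Logs\backup.log' -SmtpServer 'smtp.example.com'"
[/code]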
Of course, this was at a place that was so cheap that a failure in another system cost them about $40,000 in data recovery services. THEN they spent six figures on a data infrastructure (then they got sold and the division was closed, but that’s another story). To their credit, no fingers were pointed over the data loss: they had been warned, and they owned up to getting burned by the exact scenario a brilliant IT person told them would happen.
The problem with that, and with what the OP is doing now, is that it copies the data from the server drive up to the workstation and then back down the network connection to the destination drive.
This, at the least, doubles the network traffic if there are two physical servers, and it’s completely unnecessary if both drives live in the same server or are just separate mappings of the same drive.
Og cries if it’s happening on a 10 meg connection: at 10 Mbit/s you’re moving barely a megabyte a second, so round-tripping even 100 GB through a workstation eats the better part of two days.
Much better if he can run the copy from one of the actual servers, preferably the faster one (or the one with more RAM) if they’re not identical.