Linux backup software

We have a Linux machine (Ubuntu) primarily used as a file server (it runs Samba if it makes a difference). Files are created, opened, and saved on it from a couple of Macs and PCs—the files are never directly opened or manipulated on the Linux box.

I installed and ran Sbackup, configured it per several tutorials on the Net, told it to backup now (that went fine), and have been waiting for new/modified files to appear on the remote drive. However, after the initial backup, no new folders have been created on the remote drive.

Before I go in search of a problem with configuration or connection, it occurred to me that perhaps Sbackup will only keep track of new files/modifications that were made under the Linux OS (not sure if that’s the right term, but I mean a file that was created/modified by OpenOffice running on the Linux box), and that files placed there or modified by other OSs and saved to the Linux machine via Samba might not be tracked and backed up.

Should Sbackup be making backups of files created by a PC/Mac and only stored on the Linbox?

If so, where do I start troubleshooting? I’ve double-checked the frequency tab (set to daily), and have created/modified files within directories that were backed up via the “backup now” button (i.e., the locations/file type settings are correct).

If not, can you recommend a graphical backup program that will? While I’d clearly prefer to stay with a GNU license, I’ll look at commercial software if it’s the only option.

Thanks,

Rhythm

If you haven’t already tried it then you may get a faster response from the Linux community at the Ubuntu Forums

Thanks – tried there a few days back; sunk like a stone.

My apologies, RHythmdvl. I suppose I haven’t been keeping up with the flow at UF lately. Threads there sink like a stone, especially if you post to Absolute Beginners and General Help.

I’m editing this post as we speak.

Missed the edit window, heh.

I don’t see why SBackup would discriminate between file edits like that… a changed file is a changed file, after all :slight_smile:

You could try downloadable software, but as always, you should stick with the tried and true, if only because there is less that can go wrong.

Personally, I use a combination of three programs, called “rsync”, “gzip”, and “tar”.

Tar takes a collection of input files and encodes them into a single stream, like a tape drive. When run in conjunction with “gzip”, the entire collection is compressed, like a zip file.
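
For example (the path and archive name here are just placeholders), creating a compressed archive and then listing its contents looks like this:


tar czf /tmp/share-backup.tar.gz /home/samba/share     # c = create, z = gzip-compress, f = write to this file
tar tzf /tmp/share-backup.tar.gz                       # t = list the archive's contents without extracting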

rsync then transfers the files to the backup destination, but unlike other alternatives (or simply dragging and dropping the archive from computer to computer), rsync calculates the differences between the old and new copies and sends only the changes, which speeds up the process a fair bit.
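
As a rough illustration (the host name and paths are made up), mirroring a directory to another machine and only re-sending what changed looks like:


rsync -av /home/samba/share/ backupbox:/backups/share/     # -a preserve attributes, -v list what gets sent; repeat runs only transfer the changes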

All of this is command-line oriented, but there are graphical alternatives available, “gRsync” being one of them. My WAG? Maybe a package called “gTar” exists too.

Now, problem 1 is that rsync is not a linux-to-windows utility… and you didn’t mention whether your “remote drive” was another drive connected to your Linux box, whether it’s an external drive, or whether it’s a Windows computer on the network.

I can help you custom tailor a solution so that the entire backup process will run on its own without you even being in the same room!

Let me know.

I’ve got to agree with Casserole. I do daily backups to an external hard drive using only rsync and cron. It’s really very simple. I can post the scripts I wrote when I get home.

Ok, I’m home now, here’s what I’ve got:

First, my cron file is very simple. All it says is to execute the program backup every day of the year at 9:37 pm:


*     *     *     *     *     command to be executed
-     -     -     -     -
|     |     |     |     |
|     |     |     |     +----- day of week (0 - 6) (Sunday=0)
|     |     |     +----------- month (1 - 12)
|     |     +----------------- day of month (1 - 31)
|     +----------------------- hour (0 - 23)
+----------------------------- minute (0 - 59)

37 21 * * * $HOME/sh/backup
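
For anyone following along: that line lives in my user crontab. Assuming the script really is at $HOME/sh/backup, installing and checking the entry is just:


chmod +x $HOME/sh/backup     # make sure the script is executable
crontab -e                   # opens your crontab in an editor; paste in the 37 21 * * * line
crontab -l                   # lists the installed entries so you can confirm it took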

and then the actual program uses rsync to copy some things to the external hard drive:


#!/bin/bash

STORAGE_DIR=/data
BACKUP_DIR=/mnt/backup
DATA_DIR=$HOME/sh/data

if mount -w "$BACKUP_DIR"
then
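        # -a keep permissions/owners/timestamps, -v list what's copied, -u don't overwrite files that are newer on the backup
        # --delete removes files that no longer exist in the source; --exclude-from reads patterns to skip from a file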
        rsync -avu --delete --exclude-from=$DATA_DIR/backup.dat $STORAGE_DIR/ $BACKUP_DIR/data/
        rsync -avu --delete /etc $BACKUP_DIR/
        rsync -avu --delete /boot/grub/menu.lst $BACKUP_DIR/
        sleep 10
        umount $BACKUP_DIR
else
        echo "Could not mount $BACKUP_DIR.  Already mounted?"
fi

Oh, and I just added one line to my /etc/fstab file to tell the computer where to look for my backup external hard drive:


UUID=800cf942-0b48-4aa0-9d5a-e00cd3b1b784 /mnt/backup  ext3     defaults,noauto,user,ro,noexec,nosuid   0 0
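# noauto = don't mount it at boot, user = an ordinary user may mount it, ro/noexec/nosuid = read-only by default, no executables or setuid binaries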

Another satisfied user of rsync.

I just finished setting up a file server for a small office, using FreeNAS for the server, and backing up via rsync to rsync.net.

Anyone who is using rsync should look into this article on Snapshot-style Backups using rsync. The author describes a fairly simple way of using hard links to maintain a whole series of snapshot backups, with minimal duplication of files.

This is the same technique that Apple uses for their Time Machine technology.
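
For anyone who wants to try that approach without reading the whole article, here’s a minimal sketch using rsync’s --link-dest option (the article may build the links slightly differently, e.g. with cp -al, but the effect is the same); the paths and the snapshot naming scheme are just placeholders:


#!/bin/bash
# Rough sketch only -- layout and names are illustrative
SRC=/data/                            # what to back up
SNAPDIR=/mnt/backup/snapshots         # where the snapshots accumulate
NEW="$SNAPDIR/$(date +%Y-%m-%d_%H%M)" # this run's snapshot directory

mkdir -p "$SNAPDIR"

# Find the most recent existing snapshot, if there is one
PREV=$(ls -1d "$SNAPDIR"/*/ 2>/dev/null | tail -n 1)

if [ -n "$PREV" ]; then
        # Unchanged files become hard links into the previous snapshot, so every
        # snapshot looks like a full copy but only changed files take up new space
        rsync -a --delete --link-dest="$PREV" "$SRC" "$NEW/"
else
        rsync -a --delete "$SRC" "$NEW/"
fi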

With a short useful page provided by the rsync.net guys, I was able to get my FreeNAS server going with hourly, daily, weekly, and then monthly snapshots uploaded to their San Diego servers, with minimal overhead. :cool:

When it rains it pours.

I’m not quite panicking yet, but things have gone a bit off the deep end here. Amidst all this, I started getting $HOME/.dmrc errors which led to /.ICEauthority errors and an inability to log on, which, comically enough, led to restarts and motherboard beep codes (CPU OverVoltage!). Hahaha…really funny, no? OS and hardware issues at the same time! Bwahahaha!

Thankfully the issue seems to be PSU-related, so that’s a (relatively) easy fix.

What, that’s not funny enough? Ok, ok, you got me. I’d been using a Linksys NAS200 as a file server, but due to mundane issues (e.g., speed) I was transitioning to using the Linux box as the file server and the NAS as its backup. I pulled and stored the file-server drives, replaced them with new backup drives, and started this thread about them. But when I started getting the beep codes, I swapped the drives back, and have been (and still am) going through hours of Linksys tech support (phone and chat) because … wait for it … the NAS200 can no longer read the RAID or access the drives.

Now that’s funny.

(oh, no panic because the data exists in four or five places — I pushed the NAS200 files to the (temporarily dormant) Linux machine, which has two drives in RAID 1, so as soon as it’s up and running again the data should be there. As the beep codes started, I was able to access files via Samba (though not log in) and pulled all the essentials locally, and by following recovery instructions I was able to individually mount the drives in a USB enclosure on an old Linux laptop, so I can pull files from there.)

I don’t don’t don’t DON’T want to go through this again!

Anyway…
It sounds like Sbackup should have been looking at date modified and not given a damn as to how the file got there — and since it wasn’t, something was wrong on my end.

Rsync — or for the sissies among us, Grsync — sounds like a good, straightforward way to go.

Since the Linux machine typically runs headless, I’m going to test out BackupPC and the Filesystem Backup module of Webmin. Both of these have Web-based interfaces, so checking/changing the backup configuration will be easier — BackupPC seems to have rsync support and a few other bells and whistles.

But it seems my main question has been answered—regardless of where and how files were created or modified, the “last modified” timestamp gets changed, and hence the incremental aspect of rsync and other backup programs should detect and save those files.

But everything is on hold until the PSU problem gets sorted out, then figure out the .dmrc and .ICEauthority errors, then who knows what else will crop up…
But thanks so far!

Rhythm

I find that using a NAS server introduces a lot of extra variables which could potentially cause the backup process to come to a halt. Hence, KISS! Keep it simple, silly :slight_smile:

Let’s say you had all your data on /home/ftp/data, and your external hard drive is mounted at /media/Backup


tar czf /tmp/backup.tar.gz /home/ftp/data
rsync -avz /tmp/backup.tar.gz /media/Backup/backup.tar.gz

You can even put these two lines into a script, and run it whenever you feel like backing up your data. Furthermore, you could add this script to your “crontab”, and request it to run at a specified interval (such as every two seconds, maybe? :))
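
If you do hand it to cron, note that cron’s finest granularity is one minute (no two-second runs, sadly); a nightly run would look something like this, assuming you saved those two lines as an executable script at, say, $HOME/bin/simple-backup:


0 2 * * * $HOME/bin/simple-backup     # every night at 2:00 am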

Ah, the NAS is in the loop in an attempt at security/safety. We (Mrs. Dvl and I) work from home and, since moving out to the sticks, can go for a while without reason to leave the house. Yeah, we’re hermits. Therefore, storing off-site backups isn’t really in the cards, since we’re not shuttling between locations (home/office) to drop off a spare drive or DVD, and online storage is extraordinarily expensive for the amount of storage we’d need.

So for data integrity, I have two drives working as a RAID 1 device inside the main Linux box. Secure(ish) as that is, I want backups outside the box in case of a major hardware malfunction (I just had a thread about a power supply gone bad). This is where the NAS comes in — it too stores data in RAID 1.

I wired the house with Cat-6 to a few distant locations, one of which is in the basement in a far corner of the house opposite the office. That’s where the NAS is. Not perfect, but if we’re robbed and the office is emptied out, we’re hoping the thieves won’t be looking in the basement next to the cat litter for a small device. Same thing with a fire – we can only hope that an inferno will spare (at least to the point of recovery) either the office machines or the NAS. We’re on a mountain, so flooding probably isn’t a concern, and even then it’s on a high shelf. Raptor attack, well, that’s a whole different story.

So, that’s why you should always put a NAS in the loop (and you always leave a note). But if you’re saying that a NAS can occasionally screw things up, that’s why I’m hoping one of the Web interfaces will work out, so I can check things from my desktop.

Errrrgh! Not sure if this should be its own thread — half is my need to learn Linux, the other half is getting this backup project finished.

It turns out that Sbackup works fine when “backup now” is pressed — the problem is getting it to do automatic, scheduled backups.

I have it set to hourly on the time tab, and in checking Webmin’s “Scheduled Cron Jobs” module, /etc/cron.hourly/sbackup is reported as active.

Hours pass and no new backups appear, but if I press “backup now,” all new files show up.

How would I start diagnosing this problem?

I’d start by perusing the log files around the time the backup was supposed to take place. If any messages were output to stderr, they’d be in /var/log

Try /var/log/syslog… sbackup may have its own backup folder too, take a look!
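
A few concrete commands to start with (sbackup’s exact file names may differ, so treat these as starting points rather than gospel):


grep CRON /var/log/syslog              # shows when cron fired and what it tried to run
ls -l /etc/cron.hourly/sbackup         # the script has to be executable for run-parts to pick it up
run-parts --test /etc/cron.hourly      # lists what run-parts would actually execute; files with a dot in the name get skipped
sudo /etc/cron.hourly/sbackup          # run it by hand and watch the terminal for errors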