System backups - BackupPC


Don Delp

Sep 23, 2008, 9:32:24 AM
to nlug...@googlegroups.com
I used to use a backup script I found online that created .bz2
archives of each of your directories 1 level up from root. I found it
handy because it would create a different archive for each user and
was a nice balance between huge archives and too many archives. I
lost my backup drive and haven't started backups again yet.
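For reference, a minimal sketch of the kind of script described above: one .bz2 archive per directory one level down from the source, so each user's files land in their own archive. This is not the original script; all paths here are throwaway temp dirs standing in for the real source (e.g. /home) and the backup drive.

```shell
#!/bin/sh
# Demo tree standing in for the real source and the backup drive.
SRC=$(mktemp -d); DEST=$(mktemp -d)
mkdir -p "$SRC/alice" "$SRC/bob"
echo "some data" > "$SRC/alice/notes.txt"

# One .bz2 archive per directory one level down from $SRC,
# i.e. one archive per user.
for dir in "$SRC"/*/; do
    name=$(basename "$dir")
    # -C keeps archive paths relative, so restores go where you point them
    tar -cjf "$DEST/$name.tar.bz2" -C "$SRC" "$name"
done

ls "$DEST"    # alice.tar.bz2  bob.tar.bz2
```

The per-user granularity is what gives the "nice balance": losing one archive costs one user's data, and a single user's restore doesn't mean unpacking everything.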

I've been looking at a few different solutions, but I think BackupPC
looks best for multi-pc backups. It keeps single copies of redundant
files (even across different clients). With my last setup I could
only manage to keep a few old backups before I ran out of space.

Does anyone have any experience with BackupPC, or suggestions on a
better way to handle backups? Personally I only have ~4 hosts to
backup, but I'd enjoy hearing about larger backup schemes as well.

Don Delp

Sep 23, 2008, 9:32:54 AM
to nlug...@googlegroups.com

Jonathan Moore

Sep 23, 2008, 9:39:45 AM
to nlug...@googlegroups.com
On Tue, Sep 23, 2008 at 8:32 AM, Don Delp <nesm...@gmail.com> wrote:

> Does anyone have any experience with BackupPC, or suggestions on a
> better way to handle backups? Personally I only have ~4 hosts to
> backup, but I'd enjoy hearing about larger backup schemes as well.

I've used BackupPC before as an alternative to BackupExec on a network
of about 60 Windows XP machines and 10 Debian workstations. It was fast
and pretty easy to use, and the web interface kept things simple.

I never had to really restore anything with it, other than downloading
a file now and then, so I can't say much for that.

-Jon

Andrew Farnsworth

Sep 23, 2008, 9:51:04 AM
to nlug...@googlegroups.com
On Tue, Sep 23, 2008 at 9:32 AM, Don Delp <nesm...@gmail.com> wrote:

I used to use a backup script I found online that created .bz2
archives of each of your directories 1 level up from root.  I found it
handy because it would create a different archive for each user and
was a nice balance between huge archives and too many archives.  I
lost my backup drive and haven't started backups again yet.

<snip>

Don,
  I presume you mean "one level up from the current directory" since, by definition, there is nothing above "root", though I suppose it could be "one level down from root".

As to your question on backups, it really depends on what your goal is.  Do you want to be able to restore the entire system to its current state, or just preserve user files in case of a failure?  It is very important to determine what you want to back up, what your goals are for the backup, and where the data you are trying to back up lives.  For example, if you decide to back up only the home directories in order to preserve user data, don't forget to locate and back up any database files a user may be relying on that are not stored in their home directory.

Another important point is to test your backups by running a restore and validating that the data was backed up and restored correctly.  Permissions and other file attributes may also be very important to preserve.  Don't forget that if you are restoring to a different server (and you didn't do a complete system backup), your users may have different UIDs and GIDs.  We have been bitten by this one before; it is the reason we now use LDAP / AD to manage our users, which also lets us manage users across multiple servers, but it does require another level of backups (i.e. you must back up your LDAP user database).
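As a hypothetical illustration of the UID/GID pitfall (the path and numeric IDs below are made up, not from any real setup): if alice was UID 1001 on the old server but something else on the new one, restored files come back owned by a stale number, and the cleanup ends up looking something like:

```shell
# Run as root after the restore; 1001 is alice's UID on the *old* box.
# find matches files by the stale numeric owner and hands them back:
#   find /srv/restore -uid 1001 -exec chown alice:alice {} +
```

Centralizing users in LDAP / AD avoids the mismatch in the first place, since every server agrees on the numeric IDs.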

If you can give us a bit more detail on what your goals are, we can probably provide a bit more directed advice and it will probably help you get your head around it as well.

Andy

Andrew Farnsworth

Sep 23, 2008, 9:53:22 AM
to nlug...@googlegroups.com

Jon makes a good point here: don't forget to include in your goals what you want to be able to restore.  Some backup schemes are pretty much "snapshot" backups that require restoring everything before you can get to individual files, while others let you easily restore a single file from the backup.

Andy

Don Delp

Sep 23, 2008, 11:13:41 AM
to nlug...@googlegroups.com
Thanks for the replies. Sorry about the double post. Gmail had a
seizure. Guess that's what I get for using a beta email system. :)

On Tue, Sep 23, 2008 at 8:53 AM, Andrew Farnsworth <far...@gmail.com> wrote:
> On Tue, Sep 23, 2008 at 9:39 AM, Jonathan Moore <superm...@gmail.com>
> wrote:
>>
>> On Tue, Sep 23, 2008 at 8:32 AM, Don Delp <nesm...@gmail.com> wrote:
>>
>> > Does anyone have any experience with BackupPC, or suggestions on a
>> > better way to handle backups? Personally I only have ~4 hosts to
>> > backup, but I'd enjoy hearing about larger backup schemes as well.
>>
>> I've used BackupPC before as an alternative to BackupExec on a network
>> of about 60 Windows XP machines and 10 Debian workstations. It was fast
>> and pretty easy to use, and the web interface kept things simple.
>>
>> I never had to really restore anything with it, other than downloading
>> a file now and then, so I can't say much for that.
>>
>> -Jon

BackupPC does seem the most similar to BackupExec, from my limited
experience. In some ways I think BackupPC is a little more
user-friendly. I always find it a PITA to mess with BackupExec's
incremental backups and hunt down a file when the date is at the root
of the file tree and I have to enter the entire file path for each
date I want to check. (This may not be the case in current versions
>= 10, or for users who have read the manual.)

>
> Jon makes a good point here, don't forget to include in your goals what you
> want to be able to restore? Some backup schemes are pretty much a
> "snapshot" type backup and would require restoring everything before you can
> get to individual files where others would let you easily restore a single
> file from the backup.
>
> Andy
>

Since this is just a home network, I hope to stop short of keeping
disk images. I haven't looked into the option of keeping an old image
and just dropping the most recent backup on top of it. That sounds
like something fun to play with.

My biggest concerns are a combination of "ease of file restore" and
disk usage. As long as my data files can be recovered I don't mind
rebuilding the file system. Right now I don't use any databases so I
haven't checked to see if there's an easy solution to handle them. I
think the quick and dirty solution would be to have cron mysqldump
them somewhere that gets backed up.
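That quick-and-dirty cron idea might look like the crontab fragment below. The schedule, credentials, and destination path are all placeholders; the only real requirement is that the dump lands somewhere the backup job already covers.

```shell
# Nightly dump into a directory BackupPC already backs up.
# Note: % must be escaped as \% inside a crontab command field.
# Rotate or prune old dumps separately so they don't pile up.
#
#   0 2 * * * mysqldump --all-databases --single-transaction \
#       -u backup -p'SECRET' | gzip > /home/backup/mysql-$(date +\%F).sql.gz
```

--single-transaction keeps the dump consistent for InnoDB tables without locking everything while it runs.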

I think that usually I will only have to restore a few files at a
time. "Where did our pictures go?" and "xorg.conf was fine yesterday"
are situations that I expect to run into.

My last system was horrible with UIDs. BackupPC can't be any worse. :)

Ken Barber

Sep 23, 2008, 2:24:12 PM
to nlug...@googlegroups.com
The absolute *BEST* solution for backing up systems on a home network
is Apple's Time Capsule. It works similarly to 'rsnapshot' in Linux.
Prob'ly wouldn't take much scripting to get Linux boxen to work with it.
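The rsnapshot-style trick Ken mentions, where every snapshot looks like a full tree but unchanged files are shared via hard links, can be sketched with plain rsync. The paths here are throwaway temp dirs, and this assumes rsync and GNU stat are installed:

```shell
SRC=$(mktemp -d); SNAPS=$(mktemp -d)
echo "v1" > "$SRC/file.txt"

# First snapshot: a full copy.
rsync -a "$SRC/" "$SNAPS/snap.0/"

# Second snapshot: files unchanged since snap.0 become hard links into
# it, so they take no extra disk space; changed files are copied fresh.
rsync -a --link-dest="$SNAPS/snap.0" "$SRC/" "$SNAPS/snap.1/"

# Same inode in both snapshots => the file is stored only once.
stat -c '%i' "$SNAPS"/snap.*/file.txt
```

This is why a month of daily snapshots of a mostly static home directory can cost little more disk than a single full copy.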

Andrew Farnsworth

Sep 23, 2008, 2:37:20 PM
to nlug...@googlegroups.com

Any idea how to get Apple's Time Machine to work with other network drives?

Andy

Ken Barber

Sep 23, 2008, 2:56:34 PM
to nlug...@googlegroups.com
On Sep 23, 2008, at 1:37 PM, Andrew Farnsworth wrote:

> Any idea how to get Apple's Time Machine to work with other network
> drives?

Interesting. I had never noticed before that there doesn't seem to
be any way to get it to include files & directories, only a way to
exclude them.

I wonder what would happen if you mounted a network share directly
into the top level of the filesystem tree (e.g. /some_mount_point)
instead of where Apple usually mounts that stuff?

Steven S. Critchfield

Sep 23, 2008, 3:13:20 PM
to nlug...@googlegroups.com
Maybe you should look at some of the better options out there for
network backup. Amanda is one I know is being used by some people
around here. Here at my work, we use Bacula. Bacula seems plenty
happy to use the three 500 GB drives we have exported via AoE for
its backups.

One of the benefits of the bigger backup apps is incremental backups.
These usually give you smaller backup files and also let you keep a
longer history around.

Most of the big backup apps will give you plenty of tools for recovery.
Many will even back up the output of a script. We use Bacula to dump
our Postgres database to an SQL file that is importable into pretty
much any DB. It might take a little massage work, but it beats trying
to back up the raw data files. You could also easily back up
configuration data and decide not to back up your binaries. For
instance, on Debian you could back up /etc (including
/etc/apt/sources*) along with the output of dpkg --get-selections, so
you could reinstall from scratch quickly.
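A rough sketch of that Debian idea (the destination is a throwaway temp dir here; in practice it would be a path your backup job already covers):

```shell
DEST=$(mktemp -d)   # stand-in for a backed-up directory

# Installed-package list; restorable later with
#   dpkg --set-selections < selections.txt && apt-get dselect-upgrade
command -v dpkg >/dev/null && dpkg --get-selections > "$DEST/selections.txt"

# Configuration, including /etc/apt/sources*. Run as root in real life
# so every file under /etc is readable; unreadable ones are skipped here.
tar -czf "$DEST/etc.tar.gz" -C / etc 2>/dev/null || true
```

Between the package list and /etc, a scratch reinstall gets you back to a configured system without ever archiving the binaries themselves.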

Critch

--
Steven Critchfield cri...@basesys.com

Drew

Sep 23, 2008, 3:20:53 PM
to nlug...@googlegroups.com
Actually, there is a way to make Time Machine on a Mac see and use network drives. I messed with it for a little while, but at that point there were still some bugs that caused backups to be unreliable with Time Capsule. As a result, I simply picked up a 4-bay USB drive holder for all the IDE drives I had lying around here, which solved the problem and gave me a little extra working room. If anyone is *really* interested in making Time Machine work with network drives, I'll dig out the little bit of magic I found and where to put it, and post it up.