Adapt for use on (dv), a few questions

D-Rock

Dec 4, 2009, 8:51:01 AM
to Zipline Backup
Hey Brett - seems like an excellent script. I'm wondering a couple of
things:

- how can this be adapted for use on a MediaTemple (dv) server? Seems
like the path format for the (gs) is hard-coded in several places.

- I have 28 sites totaling about 15 GB. Is it safe to assume this
script will fail in my case? I guess I'd set it up to run on smaller
batches of sites individually, at different times?

- how could I adapt the script to *not* store local backups, and only
use S3 storage? (I'll run out of space very quickly if I keep local
copies)

If you have a sec to think about this, let me know. Thanks.
Derek

Brett

Dec 4, 2009, 2:38:57 PM
to Zipline Backup
Hello D-Rock!

Thanks for the compliment on the script.

To answer your questions:

The path is hard-coded to some extent. I hacked it a bit to work with
cPanel (have not tested, but it should work). Can you give me an
absolute path that Plesk uses? I might be able to give you some
advice to make the script work. I will make some more configurable
options in the next release to allow people to select what system they
are using.

My guess is that it would fail if you do not split it up into multiple
steps. Since it is 15 GB, I would definitely not back everything up in
one run. Just set it up for each site, maybe 1 hour apart. Try 1 site,
run it from the command line, and see how long it takes. Then base
your calculations on that.
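As a rough sketch of that calculation (all numbers here are made up,
and the script name is just a placeholder):

```shell
# First, time one site from the command line, e.g.:
#   time ./zipline-backup.sh        # placeholder name for the script
# Say one 500 MB site took about 4 minutes; extrapolating to 15 GB:
MIN_PER_GB=8        # i.e. 4 minutes per 0.5 GB (made-up trial numbers)
TOTAL_GB=15
echo "Rough total: $((MIN_PER_GB * TOTAL_GB)) minutes"   # → Rough total: 120 minutes
```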

As for not doing local backups: there are some switches under the
advanced options. This is what you need to change.

DAILYBACKUP=yes
WEEKLYBACKUP=yes
MONTHLYBACKUP=yes

Change to

DAILYBACKUP=no
WEEKLYBACKUP=no
MONTHLYBACKUP=no

Let me know if I need to clear anything up. And if you can get the
absolute path for your websites, that would be awesome! e.g.
/home/username/public_html

D-Rock

Dec 4, 2009, 3:51:20 PM
to Zipline Backup
Thanks for the detailed response, Brett.

Sites on the (dv) live in /var/www/vhosts/. Inside that directory,
the structure is 'sitename.com/httpdocs/'.
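In other words, each site's docroot follows that one pattern, so the
site list could even be generated from the layout. A sketch (the demo
builds a throwaway copy of the layout instead of touching
/var/www/vhosts, and the site names are invented):

```shell
# Demo layout: a throwaway stand-in for /var/www/vhosts
VHOSTS=$(mktemp -d)
mkdir -p "$VHOSTS/site1.com/httpdocs" "$VHOSTS/site2.com/httpdocs"

# Collect the site name from each */httpdocs docroot
WEBSITES=""
for docroot in "$VHOSTS"/*/httpdocs; do
    site=$(basename "$(dirname "$docroot")")
    WEBSITES="$WEBSITES${WEBSITES:+ }$site"
done
echo "$WEBSITES"   # → site1.com site2.com
```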

Re: local backups - would changing those advanced config options also
disable the daily/weekly/monthly cycling to Amazon S3? Or does the
remote sync only do the latest backup regardless?

Thanks,
Derek

Brett Wilcox

Dec 7, 2009, 11:22:57 AM
to ziplin...@googlegroups.com
Thanks for the path, Derek! I will see what I can do to implement better path conventions in the next release.

Those options that I gave you only affect the local backup options.

If you want to affect what gets sent to the online storage, then you would want to change these options.

DAILYONLINE=yes
WEEKLYONLINE=yes
MONTHLYONLINE=yes

Let me know if that answers your question.
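Putting the two groups of switches together: in principle, skipping
local copies while keeping only a daily push to S3 would look like
this (a sketch combining the options quoted in this thread):

```shell
# Local rotation off:
DAILYBACKUP=no
WEEKLYBACKUP=no
MONTHLYBACKUP=no

# Online (S3) rotation: daily only:
DAILYONLINE=yes
WEEKLYONLINE=no
MONTHLYONLINE=no
```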

Cheers,

Brett Wilcox
br...@brettwilcox.com

D-Rock

Dec 10, 2009, 8:59:57 AM
to Zipline Backup
Hey Brett - I finally found the time to test this out, and I did get
it working on my (dv)! One problem, though - if I disable daily local
backups (DAILYBACKUP=no), then the daily online backup doesn't run:
the online part of the script doesn't archive the actual sites; it
just tars the entire daily backup directory. Since I disabled local
backups, that directory is empty.

It seems that at the moment, getting daily/weekly/monthly cycled
backups on S3 requires using local backups. Am I wrong?

Thanks,
Derek

D-Rock

Dec 11, 2009, 11:00:46 AM
to Zipline Backup
Another small update here - I got around the issue with having so many
sites and so much data by doing the following:

TODAY=`date +%A`

if [ "$TODAY" = "Monday" ]; then
    WEBSITES="site1.com site2.com site3.com"
elif [ "$TODAY" = "Tuesday" ]; then
    WEBSITES="site4.com site5.com site6.com"
elif [ "$TODAY" = "Wednesday" ]; then
    WEBSITES="site7.com site8.com"
elif [ "$TODAY" = "Thursday" ]; then
    WEBSITES="site10.com site11.com"
elif [ "$TODAY" = "Friday" ]; then
    WEBSITES="site12.com site13.com site14.com"
elif [ "$TODAY" = "Saturday" ]; then
    WEBSITES="site15.com"
else
    WEBSITES="site16.com site17.com"
fi

I'm running this backup daily, which gives me a full backup of each
site once a week.

I also modified the ONLINEBACKUPs section to do the tarring of the
sites directly, since I'm bypassing the local backups altogether. (I
have every backup option set to "no" except DAILYONLINE.)

# Daily
if [ "$DAILYONLINE" = "yes" ]; then
    echo "Daily backup for transfer to Online Storage Backup of $WEBSITE"
    echo
    tar -czf $TEMP/$WEBSITE.$SHORTDATE.daily.tar.gz $DOMAINDIR/$WEBSITE
    echo
    echo "----------------------------------------------------------------------"
    echo
fi

Lastly, I only wanted the database backups to be transferred to S3
once a week, so rather than setting this option to 'yes' or 'no', I
set it to the number of the day I wanted them backed up on, like so:

DATABASEBACKUP="1"

Then changed the conditional like so:

if [ "$DATABASEBACKUP" = "$DNOW" ]; then ...

Now my database backups (which are backed up locally every day using
the automysqlbackup.sh script) will be transferred to S3 every Monday.
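Spelled out in one place, the idea looks like this (a sketch - I'm
assuming DNOW comes from `date +%u`, where 1 = Monday; check what the
script actually sets):

```shell
DNOW=$(date +%u)    # day of week, 1 = Monday ... 7 = Sunday (assumed)
DATABASEBACKUP="1"  # transfer the database dumps only on Mondays

if [ "$DATABASEBACKUP" = "$DNOW" ]; then
    echo "transferring database dumps to S3"
fi
```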

My plan is to log in to S3 once a month and delete all site backups
older than 4 weeks (which is easy, as each site tar is dated).

Maybe this will help someone else on a (dv) server?

Best,
Derek

Gregory

Dec 11, 2009, 6:18:06 PM
to Zipline Backup
Hi Derek,
I have what's probably a simple question for you. Where on your (dv)
are you storing s3sync, the backups, and this script?

Thanks!

Greg

D-Rock

Dec 14, 2009, 9:24:43 AM
to Zipline Backup
Greg - I'm storing it all in /var/backups/ like so:

/var/backups/s3sync
/var/backups/scripts/
/var/backups/mysql/
/var/backups/temp/

I'm not storing local backups, but I would use /var/backups/sites/ if
I were.

One note on a problem I am running into, which I'm asking MediaTemple
about - I'm sometimes exceeding my TCP buffer size when transferring
to S3, which kills the whole operation. It seems erratic and
unpredictable. I'll see if they can offer a fix.

Derek

Gregory Maher

Dec 14, 2009, 2:20:41 PM
to ziplin...@googlegroups.com
Hi Derek,
Thanks for the insight. Did you use sudo to create these directories? I do not have permission to write in the /var/ directory.

Thanks again for the help - and for all of the information you've provided previously on this topic; it's a great resource.

Greg

Brett Wilcox

Dec 14, 2009, 3:58:30 PM
to ziplin...@googlegroups.com
Hello Greg,

Just to let you know, you can put the backups anywhere you want.  There is no reason you have to put them in /var/ as Derek has done.

Can you just chmod the /var/ directory to give whichever user runs the script access to create the backups there?
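For example, something like this (a sketch: the demo uses a temp
directory so it runs without root; on the (dv) you'd use /var/backups,
created via sudo and chowned to whichever user runs the script):

```shell
# Stand-in for /var/backups so the sketch runs without root
BACKUPROOT=$(mktemp -d)
mkdir -p "$BACKUPROOT/s3sync" "$BACKUPROOT/scripts" \
         "$BACKUPROOT/mysql" "$BACKUPROOT/temp"
# Owner gets full access; group gets read/execute; others get nothing
chmod 750 "$BACKUPROOT"
ls "$BACKUPROOT"
```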

Let me know if I can help!

Cheers,

Brett Wilcox
br...@brettwilcox.com

Gregory Maher

Dec 14, 2009, 7:15:35 PM
to ziplin...@googlegroups.com
Hi Brett,
Thanks for the followup and the great script.
I have this working on both Media Temple (gs) and (dv) servers, and so far, so good. I'm using another script to back up the databases, but ZiplineBackup copies that file to S3 as well. It would be cool if MySQL backup were in the same script. Any thoughts on including it?

Thanks again for the great resource!

Greg