Another small update here - I got around the issue with having so many
sites and so much data by doing the following:
TODAY=`date +%A`
if [ "$TODAY" = "Monday" ]; then
    WEBSITES="site1.com site2.com site3.com"
elif [ "$TODAY" = "Tuesday" ]; then
    WEBSITES="site4.com site5.com site6.com"
elif [ "$TODAY" = "Wednesday" ]; then
    WEBSITES="site7.com site8.com"
elif [ "$TODAY" = "Thursday" ]; then
    WEBSITES="site10.com site11.com"
elif [ "$TODAY" = "Friday" ]; then
    WEBSITES="site12.com site13.com site14.com"
elif [ "$TODAY" = "Saturday" ]; then
    WEBSITES="site15.com"
else
    WEBSITES="site16.com site17.com"
fi
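As an aside, the same day-to-sites mapping can be written a bit more compactly with a case statement. A minimal sketch using the same site names; `sites_for_day` is a hypothetical helper, not part of the original script:

```shell
#!/bin/sh
# Same schedule as the if/elif chain above, expressed as a case statement.
# sites_for_day is a hypothetical helper name (an assumption, not from the
# original script); it prints the site list for the given day name.
sites_for_day() {
    case "$1" in
        Monday)    echo "site1.com site2.com site3.com" ;;
        Tuesday)   echo "site4.com site5.com site6.com" ;;
        Wednesday) echo "site7.com site8.com" ;;
        Thursday)  echo "site10.com site11.com" ;;
        Friday)    echo "site12.com site13.com site14.com" ;;
        Saturday)  echo "site15.com" ;;
        *)         echo "site16.com site17.com" ;;   # Sunday and anything else
    esac
}

WEBSITES=$(sites_for_day "$(date +%A)")
```

Either form works the same; the case version just keeps each day on one line.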
I'm running this backup daily, which gives me an entire site backup
once a week for each site.
I also modified the ONLINEBACKUPs section to do the tarring of the
sites directly, since I'm bypassing the local backups altogether. (I
have every backup option set to "no" except DAILYONLINE.)
# Daily
if [ "$DAILYONLINE" = "yes" ]; then
    echo "Daily backup for transfer to Online Storage Backup of $WEBSITE"
    echo
    tar -czf $TEMP/$WEBSITE.$SHORTDATE.daily.tar.gz $DOMAINDIR/$WEBSITE
    echo
    echo ----------------------------------------------------------------------
    echo
fi
Lastly, I only wanted the database backups to be transferred to S3
once a week, so rather than setting this option to 'yes' or 'no', I
set it to the number of the day I wanted it backed up on, like so:
DATABASEBACKUP="1"
Then changed the conditional like so:
if [ "$DATABASEBACKUP" = "$DNOW" ]; then ...
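For context, this works because $DNOW holds the numeric day of the week. A minimal sketch, assuming the script derives it with `date +%u` (1 = Monday ... 7 = Sunday):

```shell
#!/bin/sh
# Assumption: $DNOW comes from `date +%u`, the ISO day-of-week number
# (1 = Monday ... 7 = Sunday). If your script uses `date +%w` instead,
# Sunday is 0 and the numbering shifts, so check which one is in use.
DNOW=$(date +%u)

DATABASEBACKUP="1"   # "1" = transfer the database backups on Mondays

if [ "$DATABASEBACKUP" = "$DNOW" ]; then
    echo "Transferring database backups to S3"
fi
```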
Now my database backups (which are backed up locally every day using
the automysqlbackup.sh script) will be transferred to S3 every Monday.
My plan is, once a month, to log in to S3 and delete all site backups
older than 4 weeks (which is easy, as each site tar is dated).
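That monthly clean-up could even be scripted. A sketch of the "older than 4 weeks" check, under two assumptions not in the original post: the date embedded in the archive name is ISO-formatted (YYYY-MM-DD), so plain string comparison sorts chronologically, and the name has the site.tld.DATE.daily.tar.gz shape shown above; adjust the pattern to whatever $SHORTDATE actually produces:

```shell
#!/bin/bash
# is_expired is a hypothetical helper: succeeds (exit 0) when the date
# embedded in an archive name is more than 28 days old.
# Assumes names like site1.com.2024-01-05.daily.tar.gz with an ISO date.
is_expired() {
    local file_date cutoff
    # Pull out the third dot-separated field (the date).
    file_date=$(echo "$1" | sed 's/^[^.]*\.[^.]*\.\([0-9-]*\)\..*/\1/')
    # GNU date first, BSD date as a fallback.
    cutoff=$(date -d '28 days ago' +%Y-%m-%d 2>/dev/null || date -v-28d +%Y-%m-%d)
    [[ "$file_date" < "$cutoff" ]]
}
```

One could then list the bucket with `s3cmd ls` and run `s3cmd del` on each name for which is_expired succeeds; both commands exist in s3cmd, though the exact listing format is worth checking against your version before wiring this into cron.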
Maybe this will help someone else on a DV server?
Best,
Derek