I was in search of an Internet backup solution (first via VPN, later SSL-secured). Now I have found Bacula, or even better, Bareos. ;-)
I set everything up as it should be, I think. My problem is the following: I want a daily (or nightly) backup of remote data, approximately 30-31 GB in total. The daily change is about 300 MB.
I did some small initial backups of about 10 MB and 3 GB, which ran as they should.
After that I wanted to run a full initial backup of all the data, which led to a timeout error after 10 hours and ~11 GB of data transferred, when my Internet connection was resynced.
My question now is how to deal with these timeouts. Is there a flag to set an increased "hey Bareos, there could be a reconnect soon" time? Then I could set it to maybe 10 or 15 minutes and everything should be fine.
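In case it helps later readers: Bacula (and by inheritance Bareos) has a Heartbeat Interval directive that makes the daemons send periodic keepalive packets, so an otherwise idle connection can survive a short network interruption. The sketch below shows where it would go; the resource names and the 300-second value are illustrative, so please check the current Bareos documentation before relying on it:

```
# bareos-fd.conf -- client side (sketch; verify against the Bareos docs)
FileDaemon {
  Name = client-fd
  Heartbeat Interval = 300   # send a keepalive every 5 minutes
}

# bareos-dir.conf -- matching Client resource on the Director
Client {
  Name = client-fd
  Address = client.example.com   # hypothetical address
  Heartbeat Interval = 300
}
```

Note that a heartbeat keeps an idle TCP session alive; whether it also survives a full ISP resync (new external IP) depends on the NAT/VPN setup in between.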
The second, less important question: is there a higher compression level than GZIP3, which I am using at the moment?
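For reference, the gzip level in Bareos is set per FileSet, and levels GZIP1 through GZIP9 are accepted, with GZIP9 trading more CPU time for a somewhat better ratio. A minimal sketch (the FileSet name and path are made up):

```
FileSet {
  Name = "RemoteData"
  Include {
    Options {
      Signature = MD5
      Compression = GZIP9   # highest gzip level; plain "GZIP" means level 6
    }
    File = /data
  }
}
```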
Thanks for your help and, of course, your time.
Chris
First, thank you for your help.
[I cut the full quotes for a better reading experience.]
I understand your way of thinking, but in my case, with VPNs, there should be no problem, or am I wrong?
I mean, if I connect from backup host one (192.168.1.1/24) to client host two (192.168.2.1/24), there should be no difference if the global external address has changed, since the UTM handles the external net-to-net communication.
Of course there could be a problem with direct (secured) connections without a VPN, e.g. behind masquerading. But the Bareos connectors should actually be able to re-resolve a possibly changed DNS name in case of a connection error.
But I will give GZIP9 compression a try; maybe (and I would hope so) that is already a good enough solution to work around the problem.
Thank you,
Best regards,
Chris
Thanks for your reply.
The main concern is that the whole backup ran until the resync of the client-side connection at 4:40 (last night). Before that it had run well since 15:30, and in that time about 14 GB were transferred. Even the resync of the server-side Internet connection hadn't broken the backup. So I am looking for a way to avoid aborting the whole thing just because of a 2- or 3-minute resync.
Is there a way to set these limits a little higher?
I will run some tests today, GZIP6 vs. GZIP9, with partial backup data, but it seems you are right: GZIP6 gives me a backup rate of 264 kB/s and GZIP9 about 220 kB/s with a 40 MB package. I will raise the amount to a gigabyte now.
Another question: can I do a full backup from a pre-synced other server, and how can I avoid that a full backup transfers the whole data set again from the source, instead of building on the earlier incremental/differential backup states?
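On the second part of that question: Bareos supports a VirtualFull backup level, which consolidates an existing Full plus the subsequent Incrementals/Differentials into a new Full entirely on the storage side, without reading the data from the client again. A hedged sketch (job and pool names are invented; the exact pool wiring should be checked against the Bareos documentation):

```
# bconsole: build a new Full from the existing backup chain,
# without transferring data from the client
run job=RemoteBackup level=VirtualFull

# bareos-dir.conf excerpt -- the source pool needs a Next Pool
# where the consolidated Full volume is written
Pool {
  Name = Full-Pool
  Pool Type = Backup
  Next Pool = Consolidated-Pool
}
```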
Thank you once more.
Have a nice weekend.
Chris