I have set up Bacula to back up several servers and everything is humming along nicely.
My goal is to get Bacula to back up to multiple smaller files instead of one large (in my case ~2GB) file, for offsite storage on Amazon S3. My thinking is that if I can get each day's incremental changes from all the servers to go to a new file, that file can then be picked up by a cron job and sent to S3.
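For reference, the cron side would just be a small sync script. Here is a sketch of what I have in mind (the bucket name is a placeholder, and I've left the command as an echo so it's safe to run as-is), using the AWS CLI:

```shell
#!/bin/sh
# Sketch of the daily S3 upload step -- the bucket name is a placeholder.
# SRC matches the Archive Device in bacula-sd.conf below.
SRC=/mnt/backupdisk/serverbackups
DEST=s3://my-backup-bucket/bacula

# Echo the sync command rather than executing it, so this sketch runs
# without credentials; drop the echo once the bucket is set up.
echo aws s3 sync "$SRC" "$DEST"
```

`aws s3 sync` only uploads files that are new or changed, so re-running it from cron each day should only push the day's new volume.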
The way to do this seems to center on:
Use Volume Once = yes
but right now that just creates a new volume for every backup. I want one volume per day, with roughly 14 days of history (so 14 volumes that rotate, or I can use my cron job to delete volumes as needed). For scale: about 20 jobs write to this storage, and each job nets around 7MB of changed data per day; the initial full backup is roughly 250MB per job. Backup jobs currently run at night, but eventually I want to crank that up to several times a day (all to the same daily volume).
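For what it's worth, the direction I'm currently experimenting with (untested; the duration and retention numbers are my guesses) is dropping Use Volume Once in favor of a time-based cutoff with Volume Use Duration, something like:

```
Pool {
  Name = Serverpool
  Pool Type = Backup
  # Close each volume to new writes after ~23h, so the next
  # day's first job labels a fresh volume
  Volume Use Duration = 23h
  # Keep ~14 days of history, then let recycling reuse volumes
  Volume Retention = 13 days
  Maximum Volumes = 14
  Recycle = yes
  AutoPrune = yes
  LabelFormat = servervol_
}
```

but I'm not sure whether that's the idiomatic way to get one-volume-per-day behavior.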
Relevant sections from my config files:
bacula-dir.conf
Client {
  Name = server-sandbox-fd
  Password = <crazypassword>
  Address = fqdn-to-server.srv.internal.domain.com
  FDPort = 9102
  Catalog = MyCatalog
  File Retention = 30 days
  Job Retention = 6 months
}
Job {
  Name = server_SandboxBackup
  Type = Backup
  Level = Incremental
  Client = server-sandbox-fd
  FileSet = "Data Opt Backup"
  Schedule = "Daily Evening"
  Storage = File
  Pool = Serverpool
  Messages = Standard
}
Pool {
  Name = Serverpool
  Pool Type = Backup
  Volume Retention = 24h
  Recycle = yes
  AutoPrune = yes
  LabelFormat = servervol_
  Maximum Volumes = 12
  Use Volume Once = yes
}
Storage {
  Name = File
  Address = fqdn-to-server.srv.domain.com
  SDPort = 9103
  Password = <crazypassword>
  Device = FileStorage
  Media Type = File
  Maximum Concurrent Jobs = 20
}
bacula-sd.conf
Device {
  Name = FileStorage
  Media Type = File
  Archive Device = /mnt/backupdisk/serverbackups
  LabelMedia = yes
  Random Access = yes
  AutomaticMount = yes
  RemovableMedia = no
  AlwaysOpen = no
}
Thanks so much! Any insight or alternative approaches are appreciated!