I had a similar project a few years ago. I can't post the detailed solution due to legal restrictions, but I can describe the general idea.
The situation was that a piece of software running on a server would create archive files in a given directory. Each of those files had to be archived in a sufficient number of copies and then removed.
So I set up a full backup job whose fileset was created dynamically by a script (sketched below). The script would scan the directory, run a query against the Bareos catalog, and back up only those files which:
1) hadn't yet been backed up the required number of times,
and
2) hadn't been written to the target volume.
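Purely as an illustration, the generator script could look roughly like this. I'm assuming a PostgreSQL catalog named "bareos", the psycopg2 driver, a recent Bareos schema (where the file name lives in File.Name rather than a separate Filename table), and that the target volume name is passed as an argument; how you determine that volume is site-specific, and WATCH_DIR / REQUIRED_COPIES are made-up placeholders:

    #!/usr/bin/env python3
    # Sketch of a dynamic fileset generator (illustrative, not drop-in).
    # Assumptions: PostgreSQL catalog named "bareos", psycopg2 installed,
    # a recent Bareos schema where the file name is in File.Name, and the
    # target volume name passed as the first argument.
    import os
    import sys

    import psycopg2

    WATCH_DIR = "/srv/archives"   # placeholder: where the app drops files
    REQUIRED_COPIES = 3           # placeholder: copies each file needs

    # How many distinct volumes hold a successful copy of the file, and
    # how many of those are the target volume (0 or 1).
    COPY_SQL = """
    SELECT COUNT(DISTINCT m.MediaId),
           COUNT(DISTINCT m.MediaId) FILTER (WHERE m.VolumeName = %(volume)s)
    FROM   File f
    JOIN   Job j       ON j.JobId   = f.JobId
    JOIN   Path p      ON p.PathId  = f.PathId
    JOIN   JobMedia jm ON jm.JobId  = j.JobId
    JOIN   Media m     ON m.MediaId = jm.MediaId
    WHERE  p.Path = %(path)s
      AND  f.Name = %(name)s
      AND  j.JobStatus = 'T'      -- successfully terminated jobs only
    """

    def main() -> None:
        volume = sys.argv[1] if len(sys.argv) > 1 else ""
        conn = psycopg2.connect(dbname="bareos", user="bareos")
        with conn, conn.cursor() as cur:
            for entry in os.scandir(WATCH_DIR):
                if not entry.is_file():
                    continue
                # Bareos stores the directory (with trailing slash) and
                # the file name separately in the catalog.
                cur.execute(COPY_SQL, {"path": WATCH_DIR.rstrip("/") + "/",
                                       "name": entry.name,
                                       "volume": volume})
                copies, on_target = cur.fetchone()
                # Emit the file only if it (1) still needs copies and
                # (2) is not on the target volume yet.
                if copies < REQUIRED_COPIES and on_target == 0:
                    print(entry.path)

    if __name__ == "__main__":
        main()

Such a script can be hooked into the job by piping it in the fileset's Include block, e.g. File = "\\|/usr/local/bin/gen-fileset.py" (path is a placeholder), which runs the program on the Director and treats every line it prints as a file entry.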
Apart from that, a completely asynchronous cron job would scan the directory and remove the files that had already been backed up enough times.
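The cleanup side can be a small companion script counting copies with the same kind of catalog query, under the same hypothetical schema and connection assumptions as above:

    #!/usr/bin/env python3
    # Illustrative cron cleanup: delete files that already have enough
    # verified copies in the catalog. Same hypothetical schema and
    # connection assumptions as the fileset sketch above.
    import os
    import psycopg2

    WATCH_DIR = "/srv/archives"
    REQUIRED_COPIES = 3

    COUNT_SQL = """
    SELECT COUNT(DISTINCT m.MediaId)
    FROM   File f
    JOIN   Job j       ON j.JobId   = f.JobId
    JOIN   Path p      ON p.PathId  = f.PathId
    JOIN   JobMedia jm ON jm.JobId  = j.JobId
    JOIN   Media m     ON m.MediaId = jm.MediaId
    WHERE  p.Path = %(path)s AND f.Name = %(name)s AND j.JobStatus = 'T'
    """

    conn = psycopg2.connect(dbname="bareos", user="bareos")
    with conn, conn.cursor() as cur:
        for entry in os.scandir(WATCH_DIR):
            if not entry.is_file():
                continue
            cur.execute(COUNT_SQL, {"path": WATCH_DIR.rstrip("/") + "/",
                                    "name": entry.name})
            (copies,) = cur.fetchone()
            # Only delete once the catalog confirms enough distinct volumes.
            if copies >= REQUIRED_COPIES:
                os.remove(entry.path)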
This way I made sure that each archive file got backed up onto several different media, and only after that was it removed from the source server.
In my case it was an all-in-one Bareos installation, so I had easy access to both the Director's catalog database and the FD's directory contents. If you have those components separated, you might need to do some access-rights juggling, of course.
Of course you also need to set the retention periods ridiculously high (or disable pruning) so the file records never get pruned from the catalog; the whole scheme relies on the catalog to count copies, so pruned records would make files look under-backed-up and they'd be backed up again instead of removed.
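For example, in the Client resource (the directive names are standard Bareos ones; the resource name and values are placeholders):

    # Placeholder Client resource: retention long enough that file and
    # job records outlive the copy-counting window; AutoPrune = no
    # disables automatic pruning altogether.
    Client {
      Name = archive-host-fd
      Address = archive-host.example.com
      Password = "secret"
      File Retention = 10 years
      Job Retention = 10 years
      AutoPrune = no
    }

The Volume Retention setting in the Pool resource deserves the same treatment.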
Hope this helps