What do you actually want to do?
And do you need a live database, or just read-only snapshots of a concurrently running import?
I had an idea some time ago of driving the batch-inserter API to import a large amount of data concurrently.
Whenever there is a request for a snapshot, the batch-inserter is cleanly shut down, a copy of the database is taken, and the batch-inserter is restarted to continue the import. Incoming messages are fed to a message queue / event-processing system in the meantime, so the short shutdown time shouldn't make a difference.
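To make the idea concrete, here is a minimal single-threaded sketch of that pause-copy-resume cycle. All names are illustrative, not the real Neo4j batch-inserter API: a plain list stands in for the on-disk store the inserter writes, and `snapshot()` models shutting the inserter down, copying the store, and restarting while new records accumulate in the queue.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

/** Illustrative sketch of pause-copy-resume import; NOT the real Neo4j API. */
class SnapshotImporter {
    private final Queue<String> incoming = new ArrayDeque<>(); // message queue buffering records
    private final List<String> store = new ArrayList<>();      // stands in for the batch-inserted database
    private boolean inserterRunning = true;

    /** Producers enqueue records at any time; they never block on the inserter. */
    void receive(String record) {
        incoming.add(record);
    }

    /** Import step: drain queued records into the store while the inserter is up. */
    void drain() {
        if (!inserterRunning) return; // inserter down: records simply queue up
        String r;
        while ((r = incoming.poll()) != null) {
            store.add(r);
        }
    }

    /** Shut down, copy the store, restart, then catch up on queued records. */
    List<String> snapshot() {
        inserterRunning = false;                 // "shut down" the batch-inserter cleanly
        List<String> copy = List.copyOf(store);  // stands in for copying the database files
        inserterRunning = true;                  // "restart" and continue the import
        drain();                                 // process anything that queued up meanwhile
        return copy;
    }
}
```

In a real setup the copy step would duplicate the store directory on disk (or take a filesystem snapshot), and the queue would be an external broker, but the control flow is the same: the snapshot only sees cleanly flushed data, and no incoming message is lost.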
HTH
Michael