Doh! You know how you're sometimes just staring at something and still can't see it?
Well, anyway -- yeah, those numbers are definitely a bit high for the typical usage that I'm generally targeting and testing for. At 62 MB, the JSON serialization/deserialization is most likely the culprit.
So one way to tackle the issue is to partition your Vault. For example, break it up by time: every month (or every week, if your frequency is even greater), just create a new vault, or archive the old one (zip it up, rename it, remove the folder it lives in, and then initialize a new one). Or you can partition by some other dimension that's intrinsic to your data, kinda like sharding a DB. For example, say you're creating SSL certs for customer-specific sites on your service -- you could partition by the site's name (say A-Z, one vault for each starting letter), or if you have a unique customer ID that's all numerical, you could take the first 1 or 2 digits of that ID and partition by that (10 or 100 vaults).
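Just to illustrate the ID-based flavor of this, here's a minimal sketch of picking a partition from a numeric customer ID (the function name and the "vault-NN" naming scheme are just made up for the example -- you'd map the result to whatever vault folder or profile naming you settle on):

```powershell
# Hypothetical helper: map a numeric customer ID to one of 100 vault partitions
# by taking the first two digits of the zero-padded ID.
function Get-VaultPartitionName {
    param(
        [Parameter(Mandatory)]
        [long]$CustomerId
    )

    # Zero-pad so short IDs still yield two leading digits (e.g. 7 -> "07").
    $padded = $CustomerId.ToString().PadLeft(2, '0')
    $prefix = $padded.Substring(0, 2)

    # Partition name like "vault-43"; use this as the vault folder/profile name.
    return "vault-$prefix"
}

# Example: customer 4312987 lands in partition "vault-43".
Get-VaultPartitionName -CustomerId 4312987
```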
These are all kinda stopgaps and a bit clunky, but if you wrap it up in a couple of your own PS scripts, then it will happen automagically and consistently.
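A rough sketch of what the time-based rotation script could look like is below -- the vault path is just an assumed default and the bare Initialize-ACMEVault call is something you'd adapt to however you normally set up your vault:

```powershell
# Rough monthly vault rotation -- adjust $vaultPath to wherever your local
# vault actually lives (the path below is only an assumed default location).
$vaultPath   = "$env:LOCALAPPDATA\ACMESharp\userVault"
$archiveName = "vault-archive-{0:yyyy-MM}.zip" -f (Get-Date)
$archivePath = Join-Path (Split-Path $vaultPath -Parent) $archiveName

if (Test-Path $vaultPath) {
    # Zip up the current vault folder, then remove it.
    Compress-Archive -Path $vaultPath -DestinationPath $archivePath
    Remove-Item -Path $vaultPath -Recurse -Force
}

# Start a fresh vault in its place (add whatever parameters you normally use).
Import-Module ACMESharp
Initialize-ACMEVault
```

Run something like that on a schedule (Task Scheduler or a cron-style job) and the partitioning takes care of itself.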
Now the other thing that would probably work for you is a new development that's begun in the current development version (0.8.2), namely the introduction of alternate Vault providers. The current version of ACMESharp only has a single vault type, namely a local file-based one, but from the start the vault system was based on a provider model, and I've begun working on a couple of different vault providers that are backed by different storage services. One of them is an EF Core-based provider, which would theoretically work with any data store that has an EF Core provider, such as SQLite or SQL Server, or even some NoSQL stores that should become supported in the future.
However, the alternate providers are not ready yet and probably won't be till the next dev release, as those take a while to implement and test properly.