Hello everybody,
We are currently finishing setting up a Dataverse here at the State Archives of Belgium for an upcoming data archive for social sciences.
I'm reaching out to the community today because I'm having a hard time finding a solution to a particular use case. I have identified the following ones, the most complex being the last:
1. Someone wants to deposit a small number of datasets (say, one or a few) into our archive. To do that, they only need to use the GUI: click on Add Data, fill in the metadata fields and add the files.
2. Someone wants a small number of metadata records copied into our Dataverse for increased visibility. For this, the "Import a Dataset" native API endpoint comes in handy.
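For the import case, the call can be sketched roughly as follows. This is only a sketch under assumptions: the base URL, collection alias, token and PID are placeholders, and the exact endpoint shape should be checked against the native API guide of your Dataverse version.

```python
import json
import urllib.request
from urllib.parse import urlencode

BASE = "https://demo.dataverse.org"      # hypothetical target installation
DATAVERSE_ALIAS = "root"                 # hypothetical collection alias
API_TOKEN = "xxxxxxxx"                   # placeholder API token
PID = "doi:10.5072/FK2/EXAMPLE"          # the record's existing persistent identifier

# Minimal dataset JSON; a real record carries the full metadata blocks.
dataset_json = {"datasetVersion": {"metadataBlocks": {}}}

# The import endpoint takes the dataset JSON in the body and the
# pre-existing PID as a query parameter.
params = urlencode({"pid": PID, "release": "no"})
import_url = f"{BASE}/api/dataverses/{DATAVERSE_ALIAS}/datasets/:import?{params}"

req = urllib.request.Request(
    import_url,
    data=json.dumps(dataset_json).encode(),
    headers={"X-Dataverse-key": API_TOKEN, "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
print(import_url)
```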
3. The same thing, but on a much larger scale: tens, hundreds or even thousands of metadata records. In that case, OAI-PMH harvesting is the way to go.
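At the protocol level, an OAI-PMH harvest is a loop over ListRecords responses driven by resumption tokens. A minimal sketch (the endpoint URL and set name are placeholders; the sample response is trimmed to the bare structure):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

OAI_BASE = "https://demo.dataverse.org/oai"  # hypothetical harvesting endpoint
params = urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
request_url = f"{OAI_BASE}?{params}"

# A trimmed sample response, to show how identifiers and the
# resumption token (which drives paging) are extracted:
sample = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header><identifier>oai:example:ds1</identifier></header></record>
    <resumptionToken>token123</resumptionToken>
  </ListRecords>
</OAI-PMH>"""

NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}
root = ET.fromstring(sample)
ids = [h.text for h in root.findall(".//oai:header/oai:identifier", NS)]
token = root.findtext(".//oai:resumptionToken", namespaces=NS)
print(ids, token)
```

A real harvester would re-issue the request with `verb=ListRecords&resumptionToken=...` until no token is returned; Dataverse's built-in harvesting client does this for you once the remote server is registered.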
4. But what if we want to migrate a large quantity of both metadata and data from one Dataverse installation to another? Are there documented ways to do this as automatically, and with as little case-by-case human intervention, as possible? Perhaps by combining OAI-PMH with another protocol?
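One naive pipeline I can imagine, and would welcome corrections on, is: harvest PIDs via OAI-PMH, export each dataset's metadata from the source in Dataverse's own JSON format, pull the files, and re-import on the target. The endpoint paths below are my assumptions about the native and access APIs, not a vetted recipe:

```python
from urllib.parse import urlencode

SOURCE = "https://source.example.org"   # hypothetical source installation
TARGET = "https://target.example.org"   # hypothetical target installation

def export_url(pid: str) -> str:
    # Assumed metadata export endpoint, in Dataverse's native JSON,
    # which the import endpoint on the target should be able to consume.
    q = urlencode({"exporter": "dataverse_json", "persistentId": pid})
    return f"{SOURCE}/api/datasets/export?{q}"

def files_url(pid: str) -> str:
    # Assumed access endpoint for a zipped download of a dataset's files.
    q = urlencode({"persistentId": pid})
    return f"{SOURCE}/api/access/dataset/:persistentId/?{q}"

def import_url(alias: str, pid: str) -> str:
    # Assumed import endpoint on the target, keeping the original PID.
    q = urlencode({"pid": pid, "release": "no"})
    return f"{TARGET}/api/dataverses/{alias}/datasets/:import?{q}"

pid = "doi:10.5072/FK2/EXAMPLE"  # placeholder PID harvested via OAI-PMH
print(export_url(pid))
print(files_url(pid))
print(import_url("archive", pid))
```

Even if the endpoints pan out, file uploads to the target would still be a separate step per dataset, which is exactly the case-by-case work I'm hoping the community has automated already.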