local batch metadata updates


Youn Noh

Aug 12, 2024, 10:30:08 AM
to Dataverse Users Community
I am trying to figure out how to make batch metadata updates to datasets in a local Dataverse installation. Any links to relevant documentation on how to export and import metadata in bulk before and after editing would be much appreciated. We are currently on version 5.13, so features available in that version would be most useful. Thanks!

Amber Leahey

Aug 13, 2024, 10:35:50 AM
to Dataverse Users Community
There are some scripts in pyDataverse (pyDataverse 0.3.1 documentation) for batch upload, including one using DDI, which we've built on and used in the past as part of a migration from Nesstar to Dataverse. There are also a lot of existing scripts for migrating datasets from one Dataverse installation to another, which requires bulk uploads (see ours, for example: GitHub - scholarsportal/dataverse-migration-scripts: Scripts for migrating datasets from one Dataverse installation to another).
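For the export half of the workflow, a minimal stdlib-only sketch of pulling each dataset's metadata through the native API export endpoint might look like the following. The base URL, API token, and DOI list are placeholders you would substitute for your installation; note that the export endpoint only serves published versions, so drafts need the regular dataset GET endpoint instead.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed local installation URL
API_TOKEN = "xxxxxxxx-xxxx"         # placeholder API token

def export_url(pid, fmt="dataverse_json"):
    """Build the native-API metadata export URL for one persistent ID."""
    query = urllib.parse.urlencode({"exporter": fmt, "persistentId": pid})
    return f"{BASE_URL}/api/datasets/export?{query}"

def fetch_metadata(pid):
    """Download one published dataset's metadata as parsed JSON."""
    req = urllib.request.Request(
        export_url(pid), headers={"X-Dataverse-key": API_TOKEN})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example bulk export loop (DOIs are illustrative):
# for pid in ["doi:10.5072/FK2/AAAAAA", "doi:10.5072/FK2/BBBBBB"]:
#     with open(pid.split("/")[-1] + ".json", "w") as fh:
#         json.dump(fetch_metadata(pid), fh, indent=2)
```

The `dataverse_json` export format round-trips best if you plan to edit the files and push changes back via the API; the DDI exporter is read-only.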

The Dataverse API documentation is also pretty good: API Guide — Dataverse.org

In terms of batch metadata updates after upload and publishing, we make these directly in the database using the metadata tables, but we would love to see some kind of database admin tool for batch edits become available. Sometimes we get a request to add a 'Series' field value to over 100 datasets, for example, or to add controlled vocabulary terms after publishing to enhance the datasets' discovery and reuse.
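For the 'Series' example above, one alternative to direct database edits is the native API's edit-metadata endpoint, which can be scripted over a list of DOIs. The sketch below builds the payload and PUTs it per dataset; the base URL, token, and DOI list are placeholders, and the exact child-field structure of the compound 'series' field should be checked against your installation's citation metadata block before running. Be aware that API edits to a published dataset create a new draft version that must be republished, which may matter if you want to avoid version bumps.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed local installation URL
API_TOKEN = "xxxxxxxx-xxxx"         # token for an account with edit rights

def series_payload(series_name):
    """Build an edit-metadata payload setting the 'series' compound field.
    Field and child-field names here follow the standard citation block;
    verify them against your installation's metadata block definitions."""
    return {
        "fields": [{
            "typeName": "series",
            "value": {
                "seriesName": {
                    "typeName": "seriesName",
                    "multiple": False,
                    "typeClass": "primitive",
                    "value": series_name,
                }
            },
        }]
    }

def edit_metadata(pid, payload, replace=True):
    """PUT the payload to the native editMetadata endpoint for one dataset."""
    url = (f"{BASE_URL}/api/datasets/:persistentId/editMetadata"
           f"?persistentId={pid}&replace={'true' if replace else 'false'}")
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"X-Dataverse-key": API_TOKEN,
                 "Content-Type": "application/json"},
        method="PUT")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example batch loop (DOIs are illustrative):
# for pid in ["doi:10.5072/FK2/AAAAAA", "doi:10.5072/FK2/BBBBBB"]:
#     edit_metadata(pid, series_payload("My Survey Series"))
```

With `replace=true` the endpoint overwrites an existing value for single-value fields; for multi-value fields such as subject or keyword, omitting it appends instead.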
