File upload through API


Anders Conrad

Jan 5, 2017, 10:01:48 AM
to Dataverse Users Community
Hi,

we are continuing our test of Dataverse for astrophysics datasets. We have created new datasets through the Native API and are now working on adding files to them. This raises some questions:

1. We could only find an upload option through the SWORD API, rather than the native API. Is that correctly understood, or did we miss something?

2. Our datasets are already structured as BagIt packages, each wrapped in a zip archive. When we upload such a zip to Dataverse, it automatically unpacks the BagIt archive, modifies the dataset metadata, and creates all of the contents as individual files. This is not the behaviour we want: we would like the entire BagIt package to be the file content of the dataset. Is there any way to stop Dataverse from unpacking zip files and modifying the metadata?

Thank you!
Anders

Philip Durbin

Jan 5, 2017, 11:59:11 AM
to dataverse...@googlegroups.com
Hi Anders,

No, you didn't miss anything. At the moment SWORD is the only game in town for uploading files via the API, but the good news is that the next release of Dataverse (4.6.1) is expected to include a new "native" API for uploading (and replacing) files. If you're interested in testing this API before it's released, please let me know! Here's the issue we're using to track the feature: https://github.com/IQSS/dataverse/issues/1612
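
If it helps, here is a rough sketch (in Python, using the requests library) of what a SWORD file upload can look like. The server URL, API token, DOI, and file name below are just placeholders, and the exact data-deposit path can vary between Dataverse versions, so please check the API Guide for your installation:

    import requests

    SERVER = "https://your-dataverse.example.edu"        # placeholder installation
    API_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # your API token
    DOI = "doi:10.5072/FK2/EXAMPLE"                      # persistent id of the dataset

    # SWORD "edit-media" endpoint for adding a file to an existing dataset
    url = SERVER + "/dvn/api/data-deposit/v1.1/swordv2/edit-media/study/" + DOI

    with open("mybag.zip", "rb") as f:
        r = requests.post(
            url,
            auth=(API_TOKEN, ""),   # API token as user name, empty password
            headers={
                "Content-Type": "application/zip",
                "Content-Disposition": "filename=mybag.zip",
                "Packaging": "http://purl.org/net/sword/package/SimpleZip",
            },
            data=f,
        )

    print(r.status_code)
    print(r.text)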

At the moment, you can "double zip" as a workaround for preserving zip files as-is. I hope my fairly recent comment on this issue helps clarify the situation: https://github.com/IQSS/dataverse/issues/3439
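
To illustrate the double-zip workaround: you wrap your BagIt zip inside an outer zip before uploading, so Dataverse unpacks only the outer layer and keeps the inner zip as a single file. A minimal sketch (file names are placeholders):

    import zipfile

    # Wrap the existing BagIt package in an outer archive; Dataverse unpacks
    # the outer zip and should keep mybag.zip itself as one file.
    with zipfile.ZipFile("outer.zip", "w", zipfile.ZIP_DEFLATED) as outer:
        outer.write("mybag.zip")

    # Then upload outer.zip instead of mybag.zip, e.g. via SWORD as above.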

Thanks for using Dataverse!

Phil

Anders Conrad

Jan 10, 2017, 4:41:09 AM
to Dataverse Users Community, philip...@harvard.edu
Hi Phil,

thank you very much for your reply and the links to the related GitHub tickets.

Depending on the expected release date for 4.6.1, I might be interested in testing before the release. I would be ready to work on this in about two weeks' time.

Our use case for keeping zip files zipped is motivated both by the need to maintain structure (as in #2249) and by the wish to preserve the BagIt packages intact, as we expect the data to outlive Dataverse (no offence intended; we are working with a 50-year horizon ;-). We had anticipated that changes to the data would be handled by reloading entire BagIt packages, but I will discuss this with the researchers in light of the open GitHub tickets.

Cheers,
Anders

