--
You received this message because you are subscribed to the Google Groups "Dataverse Users Community" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dataverse-commu...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/dataverse-community/8e6d5ae2-6a5c-456a-84b7-225bec1cbf02%40googlegroups.com.
Hi Jamie, when you say "collection" can you please be a little more specific? Do you mean all of https://dataverse.ucla.edu? Including harvested datasets?
On Thu, Dec 19, 2019 at 7:49 PM Jamie Jamison <jam...@g.ucla.edu> wrote:
Hi, I also need to retrieve an entire collection's metadata. I've been working through the pyDataverse examples but can't find what feature to add so that I can retrieve all the dataverses.

Thank you,
Jamie
On Tuesday, December 3, 2019 at 2:45:33 AM UTC-8, Stefan Kasberger wrote:

Hi,

I can also recommend pyDataverse, a Python API wrapper developed by me. :) You would have to add the feature to retrieve all dataverses. Once you have all the dataverse aliases, you can easily retrieve all the datasets inside.

Cheerz, Stefan
On Sunday, November 24, 2019 at 04:22:10 UTC+1, Yuzhang Han wrote:

Hi there,

I am working on a research project where I need to download the metadata of ALL datasets on a server. I was wondering if it is possible to do that using any API command, or in any other way?

Thanks,
Andrew
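The approach Stefan describes (get every dataverse, then the datasets inside each) can also be done directly against the Native API without adding a pyDataverse feature: the `/api/dataverses/{id}/contents` endpoint lists a dataverse's child dataverses and datasets, so a recursive walk from the root visits everything. A minimal sketch, assuming a standard unauthenticated Native API; the base URL is a placeholder for your installation, and the `fetch` parameter is just an illustrative hook so the traversal can be exercised without a live server:

```python
import json
from urllib.request import urlopen

BASE_URL = "https://demo.dataverse.org"  # placeholder: point at your installation


def _default_fetch(url):
    """GET a Native API URL and return the parsed JSON body."""
    with urlopen(url, timeout=30) as resp:
        return json.load(resp)


def collect_dataset_pids(dataverse_id, fetch=_default_fetch):
    """Recursively walk a dataverse's contents and return all dataset PIDs.

    `dataverse_id` may be an alias (e.g. ":root") or a numeric database id;
    the /contents endpoint accepts either.
    """
    contents = fetch(f"{BASE_URL}/api/dataverses/{dataverse_id}/contents")
    pids = []
    for item in contents["data"]:
        if item["type"] == "dataset":
            # dataset entries carry protocol/authority/identifier pieces
            pids.append(f'{item["protocol"]}:{item["authority"]}/{item["identifier"]}')
        elif item["type"] == "dataverse":
            # child dataverses are listed by numeric id; recurse into each
            pids.extend(collect_dataset_pids(item["id"], fetch))
    return pids
```

Starting at `:root` walks the whole installation, while starting at a sub-dataverse alias limits the walk to that collection, which may help with Jamie's question above.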
Hi everyone,

I'd like to share a collection of Python 3 scripts I've been using for collecting and analyzing the metadata of datasets in Dataverse-based repositories. The scripts are published and documented at https://github.com/jggautier/get-dataverse-metadata, and they use pyDataverse. Please feel free to re-write (and, for you Python gurus out there, improve) any of the scripts.

I used the scripts to download the Dataverse JSON metadata of the 30k+ datasets published in the Harvard Dataverse repository and to write certain metadata to CSV files. The JSON files and CSV files are published in Harvard Dataverse at https://doi.org/10.7910/DVN/DCDKZQ. The metadata is current as of December 12, 2019. Consider downloading the JSON metadata from there instead of using the scripts to re-download the JSON files from Harvard Dataverse (unless you really need the most recent metadata). I also downloaded the metadata of datasets in Scholars Portal Dataverse and DataverseNL and plan to add them to that dataset, too.

I'm told it would be possible for repository installation system admins to get the cached JSON files straight from their servers, instead of using the Native API to download them. So I could imagine each repository publishing its own collection of dataset JSON files (or sending them my way so I could add them to the dataset in Harvard Dataverse).

There are other tools that do things similar to these scripts, listed in the User Guides at http://guides.dataverse.org/en/latest/admin/reporting-tools.html, particularly TDL's reporting tool at https://github.com/TexasDigitalLibrary/dataverse-reports. But those require access to the repository's Postgres database to get the metadata, so they're more for repository admins.
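For anyone adapting the scripts, the per-dataset download step comes down to one Native API call: the metadata export endpoint, which serves the Dataverse JSON of any published dataset. A minimal sketch of that step, assuming the standard `/api/datasets/export` endpoint; the base URL and output directory are placeholders, and `pid_to_filename` is just an illustrative helper for producing safe filenames:

```python
import json
from pathlib import Path
from urllib.parse import quote
from urllib.request import urlopen

BASE_URL = "https://dataverse.harvard.edu"  # placeholder: any Dataverse installation


def pid_to_filename(pid):
    """Turn a persistent id like 'doi:10.7910/DVN/DCDKZQ' into a safe filename."""
    return pid.replace(":", "_").replace("/", "_") + ".json"


def download_dataset_json(pid, out_dir="metadata"):
    """Fetch the dataverse_json export of one published dataset and save it.

    Note: the export endpoint only serves metadata for published datasets.
    """
    url = (f"{BASE_URL}/api/datasets/export"
           f"?exporter=dataverse_json&persistentId={quote(pid, safe=':/')}")
    with urlopen(url, timeout=60) as resp:
        metadata = json.load(resp)
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    (out / pid_to_filename(pid)).write_text(json.dumps(metadata, indent=2))
```

Looping `download_dataset_json` over a list of PIDs reproduces the core of what the scripts do; writing selected fields out to CSV is then a separate, purely local step.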