Sherry,
You can limit the max size per store (see the JSON example at https://guides.dataverse.org/en/latest/installation/config.html?highlight=maxfileuploadsizeinbytes#maxfileuploadsizeinbytes), and then assign stores per collection or per dataset so that different projects/groups can have different limits. Harvard is doing this (with stores pointing to the same bucket; I'm not sure what the max is there). QDR sets a 2 GB limit and a higher max (~20 GB) on a store that uses a cheaper S3 option (Storj).
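If it helps, the per-store form of that setting is roughly like the example below (the store IDs "default" and "bigstore" and the byte values are just placeholders; substitute your own store labels, and adjust the host if your admin API isn't on localhost:8080):

    curl -X PUT -d '{"default":"2147483648","bigstore":"21474836480"}' \
      http://localhost:8080/api/admin/settings/:MaxFileUploadSizeInBytes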
That said, the max file size is currently a somewhat crude way to limit overall dataset size, and Leonid is working on actual quotas right now. With those in place, a better approach would be to set the max file size to something very large, limited only by the max allowed in the store (5 TB for AWS, I think) and/or by reasonable upload times for your users (given their average bandwidth), and to rely on quotas instead; that would also reduce the need for stores with different max sizes.
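Until quotas land, that would just mean putting a single large value in the same setting, e.g. something on the order of that 5 TB S3 object limit (again only a sketch; pick whatever ceiling makes sense for your users' bandwidth):

    curl -X PUT -d 5497558138880 \
      http://localhost:8080/api/admin/settings/:MaxFileUploadSizeInBytes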
-- Jim