Sherry,
You can limit the max size per store (see the JSON example at https://guides.dataverse.org/en/latest/installation/config.html?highlight=maxfileuploadsizeinbytes#maxfileuploadsizeinbytes) and then assign stores per collection or per dataset, which lets different projects/groups have different limits. Harvard is doing this (with stores pointing to the same bucket; I'm not sure what their max is). QDR sets 2GB as the default and a higher max (~20GB) on a store that uses a cheaper S3 option (Storj).
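For reference, here's a minimal sketch of setting per-store limits via the admin settings API, following the JSON form shown in the guide linked above (the store ID "s3cheap", the localhost URL, and the byte values are just placeholders for your own setup):

    import json
    import requests

    # Per-store limits for :MaxFileUploadSizeInBytes; keys are store IDs,
    # "default" covers any store without its own entry.
    limits = {
        "default": str(2 * 1024**3),   # 2 GB default
        "s3cheap": str(20 * 1024**3),  # ~20 GB on the cheaper store
    }

    # The admin settings endpoint takes the raw setting value as the request body.
    resp = requests.put(
        "http://localhost:8080/api/admin/settings/:MaxFileUploadSizeInBytes",
        data=json.dumps(limits),
    )
    print(resp.status_code, resp.text)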
That said, the max file size is currently a somewhat crude way to limit overall dataset size, and Leonid is working on actual quotas right now. With that in place, a better approach would be to set the max file size to something very large, limited only by the max allowed in the store (5 TB for AWS, I think) and/or by reasonable upload times for your users (given their average bandwidth), and to rely on quotas instead. That would also reduce the need for stores with different max sizes.
-- Jim
Hi Jacek,
thanks for the feedback.
Could you share the specific error message you’re encountering?
Python dvuploader defaults to direct uploads to S3 and checks if S3 upload is enabled for a collection. If it’s not enabled, it falls back to the regular upload and prints a log message to the console (see screenshot). Do you see this log message when you use it? This would help ensure there’s no bug and that the code isn’t using the regular upload.
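For context, a typical call looks roughly like this (the file path, URL, DOI, and token are placeholders, and parameter names may differ slightly between dvuploader versions):

    from dvuploader import DVUploader, File

    # Placeholder file, installation URL, dataset DOI, and API token.
    files = [File(filepath="./data/bigfile.bin")]

    uploader = DVUploader(files=files)
    uploader.upload(
        api_token="xxxx-xxxx-xxxx",
        dataverse_url="https://dataverse.example.org",
        persistent_id="doi:10.5072/FK2/EXAMPLE",
    )

If direct upload is not enabled on the target collection, this is where the fallback message should show up in the console output.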
All the best,
Jan

Thanks for the prompt response :)
Okay, the direct upload is definitely triggered.
I've attached the full error message. It points to a 400 response when trying to get the upload URLs. If I change :MaxFileUploadSizeInBytes to a number larger than my file, the upload works fine (but it seems to transfer the whole file at once).
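For what it's worth, the failing request seems to be the one that asks Dataverse for the direct-upload URLs; roughly like this sketch (server, DOI, token, and size are placeholders):

    import requests

    BASE_URL = "https://dataverse.example.org"
    API_TOKEN = "xxxx-xxxx-xxxx"
    PID = "doi:10.5072/FK2/EXAMPLE"
    FILE_SIZE = 5 * 1024**3  # bytes; larger than our :MaxFileUploadSizeInBytes

    # Ask for direct-upload URLs for a file of the given size.
    resp = requests.get(
        f"{BASE_URL}/api/datasets/:persistentId/uploadurls",
        params={"persistentId": PID, "size": FILE_SIZE},
        headers={"X-Dataverse-key": API_TOKEN},
    )
    print(resp.status_code, resp.text)  # 400 when the size exceeds the limit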
This should work fine because the progress bar tracks the upload of all concurrent tasks for a single file. Additionally, since your file size exceeds the minimum part size, the file will be split into the appropriate number of parts. However, I should log the type of upload used, and I’ll create a pull request to add a message indicating this.
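Just to illustrate what "the appropriate number of parts" means (the real part size is configured on the store side; the 1 GB here is only an assumed example):

    import math

    def number_of_parts(file_size_bytes: int, part_size_bytes: int) -> int:
        # Each multipart chunk has part_size_bytes, except possibly the last one.
        return math.ceil(file_size_bytes / part_size_bytes)

    # e.g. a 5 GB file with an assumed 1 GB part size is sent as 5 parts
    print(number_of_parts(5 * 1024**3, 1 * 1024**3))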
I am not sure if the S3 upload can bypass the maximum file upload size. Maybe @qqmyers or @pdurbin could help?
All the best,
Jan
On 26.11.2025 at 12:15, Jacek Chudzik <jacek....@cyfronet.pl> wrote:
Hi,
I do not see this log message.
<Zrzut ekranu z 2025-11-26 12-02-14.png><error_message.txt>
A quick answer: :MaxFileUploadSizeInBytes was intended to limit file size in general (admins may not want larger files). Per-store limits, or different limits for normal and direct uploads, could make sense. A separate API vs. UI limit isn't so clear-cut, since the upload-a-folder capability and the SPA both do API uploads.
-- Jim