Mark,
It's hard to diagnose remotely, but I'd suggest checking for timeouts. When you upload a zip file, the file is sent to the server and unzipped there, and the list of files inside is sent back to the browser. If Payara, Apache httpd, nginx, a load balancer, etc. times out before that round trip completes, the list never gets back to the browser. The browser dev console should show whether you are indeed getting an error on that call, usually a 504. In the response headers you may see the name of the component that sent the error, which would point to where you need to raise the timeout. If you can find that, people running similar setups might be able to point you to the right settings.
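If you want to take the browser out of the equation, you can also reproduce the upload from a script and inspect the status code and response headers directly. Here's a rough sketch using Python's requests library against the native add-file API (double-check the API guide for the exact endpoint; the server URL, API token, and dataset PID below are placeholders you'd swap for your own):

    import requests

    # Placeholders: substitute your own server, API token, dataset PID, and zip file.
    SERVER = "https://dataverse.example.edu"
    API_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
    PID = "doi:10.5072/FK2/EXAMPLE"

    with open("big.zip", "rb") as f:
        r = requests.post(
            f"{SERVER}/api/datasets/:persistentId/add",
            params={"persistentId": PID},
            headers={"X-Dataverse-key": API_TOKEN},
            files={"file": ("big.zip", f, "application/zip")},
            timeout=3600,  # generous client-side timeout so the server side fails first
        )

    print(r.status_code)             # 504 usually means a proxy or load balancer timed out
    print(r.headers.get("Server"))   # often names the component that actually answered
    print(r.headers.get("Via", ""))  # proxies sometimes identify themselves here too

If nginx turns out to be the one answering, proxy_read_timeout / proxy_send_timeout are usually the settings to raise; for Apache httpd it's typically ProxyTimeout.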
It sounds like you've already checked disk space. The only thing I can add there is that, with normal uploads (vs. direct uploads to S3, which don't send files through the Dataverse server), there can temporarily be two copies of the zip plus the unzipped files, so 3x or more of the file size can be needed.
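To put numbers on it: a 5 GB zip could transiently need 15 GB or more of free space (the uploaded zip, a second copy, and the unzipped contents).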
In general, once files start to get large, we'd recommend looking into direct S3 upload (which can also be done over a local file system using MinIO or another S3-compatible service). There are pros and cons to that, but I'd guess most installations handling GB+ files use it (we don't have actual stats that I know of).
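If you do look into it, direct upload is enabled per store in the Dataverse configuration. Going from memory (so check the installation guide for your version), it's the dataverse.files.<id>.upload-redirect=true JVM option on an S3-backed store.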
Hope that helps,
-- Jim