Hello,
We're in the process of testing out the Docker install of Archivematica and wanted to test S3 bucket access. I believe the Storage Service is configured correctly: we did need to add DNS to the SS container, and after doing so we were able to create the AIP Location, with the Browse button correctly showing the folders we had created. Previously, without DNS, the Browse button was unable to locate the S3 folders.
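For reference, the DNS change was roughly the following compose override (a sketch only; the service name and resolver addresses below are placeholders, not our actual values):

```yaml
# docker-compose.override.yml -- sketch; adjust service name and resolvers
services:
  archivematica-storage-service:
    dns:
      - 10.0.0.2        # internal resolver that can reach the S3 endpoint
      - 8.8.8.8         # public fallback
```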
Once we get to the Ingest phase on the dashboard, the "Job: Store the AIP" fails with the following error:
500 Server Error: Internal Server Error for url: http://am-archivematica-storage-service-1:8000/api/v2/file/

Traceback (most recent call last):
  File "/src/src/archivematica/MCPClient/client/job.py", line 171, in JobContext
    yield
  File "/src/src/archivematica/MCPClient/clientScripts/store_aip.py", line 318, in call
    store_aip(
  File "/src/src/archivematica/MCPClient/clientScripts/store_aip.py", line 194, in store_aip
    new_file = _create_file(
  File "/src/src/archivematica/MCPClient/clientScripts/store_aip.py", line 59, in _create_file
    new_file = storage_service.create_file(
  File "/src/src/archivematica/archivematicaCommon/storageService.py", line 406, in create_file
    response.raise_for_status()
  File "/pyenv/data/versions/3.9.25/lib/python3.9/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://am-archivematica-storage-service-1:8000/api/v2/file/
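In case it helps anyone reproduce this, here is a minimal stdlib-only sketch of the POST that `create_file` makes against the Storage Service, so the full 500 response body can be inspected (it often names the underlying S3 problem, e.g. credentials or region). The username, API key, and payload here are placeholders, not real values:

```python
# Sketch: rebuild the POST /api/v2/file/ call the MCPClient makes, so the
# Storage Service's 500 response body can be read directly.
# Username, API key, and payload below are placeholders -- substitute yours.
import json
import urllib.error
import urllib.request

def build_request(base_url, username, api_key, payload):
    """Build the same POST /api/v2/file/ request the MCPClient sends."""
    return urllib.request.Request(
        base_url + "/api/v2/file/",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"ApiKey {username}:{api_key}",
        },
        method="POST",
    )

def send_and_dump(req):
    """Send the request and print the body even when the server returns 5xx."""
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(resp.status, resp.read().decode("utf-8", "replace"))
    except urllib.error.HTTPError as err:
        # the 500 body often names the real cause (credentials, region, ACLs)
        print(err.code, err.read().decode("utf-8", "replace"))

req = build_request(
    "http://am-archivematica-storage-service-1:8000",
    "ss-user", "REPLACE_WITH_API_KEY",
    {"uuid": "00000000-0000-0000-0000-000000000000"},  # placeholder payload
)
# send_and_dump(req)  # uncomment when running inside the docker network
```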
Has anyone gotten the Storage Service communicating with S3 on a Docker install? So far we've been able to work around this issue by mounting the bucket with s3fs into a folder backed by a docker volume, but we would prefer to use the built-in S3 support.
Any info would be very much appreciated!