Hello HCP-Users community,
I am seeking help with a persistent issue I'm facing while downloading the "Retinotopy Task fMRI 2mm/32k FIX-Denoised (Compact)" package using the Aspera client.
The download repeatedly fails with a "No such file or directory" error. The error log specifically names certain subjects, such as 131217 and 550439, as the source of the failure (a screenshot of the error is attached).
Believing it was a standard corrupted session, I followed the recommended procedure: I completely stopped the transfer, deleted all partial .zip and temporary .aspx files, and then restarted the entire package download from the beginning.
Unfortunately, after running for several hours, the download failed again with the exact same type of error.
Since a clean restart of the entire download did not solve the problem, I'm hoping someone here might have more advanced suggestions.
Has anyone encountered a situation where this error persists even after restarting the download?
Could this point to a potential server-side issue with these specific files, or is it more likely a local Aspera configuration problem?
Are there any specific Aspera settings (e.g., transfer policy, datagram size, or rate limits) that are known to improve the stability of these very large batch downloads?
I am also reporting this to the official HCP support team, but I wanted to leverage the community's experience in case this is a known issue with a specific solution.
Thank you for your help,
Prince Isaac Pantino

Hi Prince Isaac,
Did the error message specify the same subjects every time you tried? Both of those subjects should have the Retinotopy Task fMRI 2mm/32k FIX-Denoised (Compact) package. Were you able to download the packages for other subjects? You can use the filters on the Subject Dashboard in ConnectomeDB to select individual subjects, or a subset of subjects with 7T data, and check whether those downloads succeed.
I also can’t download these packages, and it looks like it’s a problem with the availability of the zip archive itself rather than an Aspera setting issue.
We will look into what is going on, but in the meantime I also suggest getting the data from our Amazon S3 bucket; see https://wiki.humanconnectome.org/docs/How%20To%20Connect%20to%20Connectome%20Data%20via%20AWS.html
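For reference, here is a minimal sketch of the S3 route. It assumes the AWS CLI, the s3://hcp-openaccess bucket, and the HCP_1200/<subject>/ prefix layout described on the wiki page above; confirm the exact paths there before use, and note that you must first request HCP-specific AWS credentials through your ConnectomeDB account (then set them up with `aws configure`).

```shell
# Minimal sketch, assuming the s3://hcp-openaccess bucket and the
# HCP_1200/<subject>/ layout from the wiki page linked above; verify the
# exact prefixes there. Requires the AWS CLI plus HCP AWS credentials
# requested via ConnectomeDB.
SUBJECT=131217   # one of the subjects named in the failing download

# The commands are printed rather than executed here, since a real
# transfer needs valid credentials:
echo "aws s3 ls s3://hcp-openaccess/HCP_1200/${SUBJECT}/MNINonLinear/Results/"
echo "aws s3 sync s3://hcp-openaccess/HCP_1200/${SUBJECT}/MNINonLinear/Results/ ./${SUBJECT}/ --exclude \"*\" --include \"*RET*\""
```

Running the `ls` first shows what is actually available for that subject; the `sync` with `--exclude`/`--include` then pulls only directories matching `*RET*` (the 7T retinotopy runs include RET in their names, but check the listing first) instead of the whole packaged zip, which also sidesteps the broken archive entirely.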
Best,
Jenn
________________________________________
Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
el...@wustl.edu
www.humanconnectome.org

--
You received this message because you are subscribed to the Google Groups "HCP-Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hcp-users+...@humanconnectome.org.
To view this discussion visit https://groups.google.com/a/humanconnectome.org/d/msgid/hcp-users/a0a3f26d-7491-4f80-95b3-15b49911e7d1n%40humanconnectome.org.
The materials in this message are private and may contain Protected Healthcare Information or other information of a sensitive nature. If you are not the intended recipient, be advised that any unauthorized use, disclosure, copying or the taking of any action in reliance on the contents of this information is strictly prohibited. If you have received this email in error, please immediately notify the sender via telephone or return mail.
