From: xnat_di...@googlegroups.com [mailto:xnat_di...@googlegroups.com] On Behalf Of Haofei FENG
Sent: Wednesday, January 24, 2018 11:43 PM
To: xnat_discussion <xnat_di...@googlegroups.com>
Subject: [XNAT Discussion] How to use a curl command to upload the tar.gz (DICOM) and extract to prearchive
Hi,
How can I use a curl command to upload a tar.gz file and extract it to the prearchive?
We have 1,200 scans in tar.gz files, and we would like to upload and extract them to the prearchive.
The link has the information below, but that uploads into the scans, not into the prearchive.
I have tried the same thing against /data/prearchive and /data/prearchive/projects/ID, but it comes back with the error below:
curl -v -u username:pwd -X POST "https://xnaturl/data/prearchive/files?extract=true" -F "3dCORT1CLEAR.tar.gz=@3dCORT1CLEAR.tar.gz"
<html>
<head>
<title>Status page</title>
</head>
<body>
<h3>The method specified in the request is not allowed for the resource identified by the request URI</h3><p>You can get technical details <a href="http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.4.6">here</a>.<br>
Please continue your visit at our <a href="/">home page</a>.
</p>
Compressed file upload
The REST API supports the upload of compressed files that can be extracted on the server after successful upload. There are a few caveats:
- The files cannot cross hierarchical concepts, i.e. you cannot include files for both scans and reconstructions in the same zip file.
- The files cannot cross objects, i.e. you cannot include files for scan A and scan B; they must be within a single scan.
- The files cannot cross catalogs. Within each object you can have multiple catalogs (collections) of files; the files in a zip can only go into a single catalog.
To use compressed uploads, you use the same structure as a normal file upload. Anywhere you can upload files via the REST API, you can use the ?extract option.
curl -u USERNAME:PASSWORD -X POST "http://central.xnat.org/data/archive/projects/X/subjects/X/experiments/X/scans/X/files?extract=true" -F "file.zip=@file.zip"
If you are including the zipped file as the body of the message (curl's -d flag), you should also use the ?inbody=true parameter. Otherwise XNAT will expect multi-part form data (curl's -F flag).
The file name must include the proper file extension: '.zip' for zipped data, '.gz' for a gzipped file, '.tar.gz' for a gzipped tar archive. This is used to properly extract the file.
If extract=true is missing, the file will be uploaded and stored as is (not extracted).
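The -F versus -d distinction described above can be sketched as two small shell helpers. This is an illustrative sketch, not tested against a live server: $XNAT, $AUTH, and the arguments are placeholders, and --data-binary is used in place of plain -d because -d strips newlines when reading a file, which would corrupt a compressed archive.

```shell
# Multipart form upload (-F): XNAT reads the archive from the form part.
# $1 = resource path under the XNAT base URL, $2 = archive file name.
upload_multipart() {
    curl -u "$AUTH" -X POST "$XNAT/$1/files?extract=true" -F "$2=@$2"
}

# Raw-body upload: inbody=true tells XNAT to read the request body itself
# instead of expecting multipart form data. --data-binary (rather than -d)
# sends the file bytes unmodified.
upload_inbody() {
    curl -u "$AUTH" -X POST "$XNAT/$1/files?extract=true&inbody=true" \
        --data-binary "@$2"
}
```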
--
You received this message because you are subscribed to the Google Groups "xnat_discussion" group.
To unsubscribe from this group and stop receiving emails from it, send an email to xnat_discussi...@googlegroups.com.
To post to this group, send email to xnat_di...@googlegroups.com.
Visit this group at https://groups.google.com/group/xnat_discussion.
For more options, visit https://groups.google.com/d/optout.
-rwx------. 1 root root 24730292 Jan 24 22:10 3dCORT1CLEAR.tar.gz
-rwx------. 1 root root 920737 Jan 24 22:10 BOSENSE.tar.gz
-rwx------. 1 root root 29768109 Jan 24 22:10 DTI32.tar.gz
-rwx------. 1 root root 16809791 Jan 24 22:10 DUAL_TSE.tar.gz
-rwx------. 1 root root 7662730 Jan 24 22:10 T2FLAIR.tar.gz
I think the prearchive settings are used for when DICOM comes in on the SCP Receiver. I don’t quite follow what you’re trying to do (or why the merge is needed), but if you’re just looking for the API to trigger the archive process from the prearchive, that can be found here: https://wiki.xnat.org/display/XAPI/Image+Session+Archive+Service+API . The “src” field can be found by querying the prearchive, I believe, or as the response body to the call to /data/services/import used to upload data to the prearchive.
Thanks,
Charlie
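The two-step flow Charlie describes (import to the prearchive, then trigger the Archive Service using the import response as "src") could be scripted roughly as below. This is a sketch under assumptions, not a tested pipeline: the endpoint and parameter names (including overwrite=append) are taken from the Import and Archive Service wiki pages linked in this thread and should be verified against your XNAT version; $XNAT, $AUTH, and $PROJECT are placeholders.

```shell
# Upload one tar.gz to the prearchive, then archive it in the same call.
upload_and_archive() {
    f=$1
    # The import call's response body is the prearchive session path,
    # which is the "src" value the archive service expects.
    src=$(curl -s -u "$AUTH" -X POST \
        "$XNAT/data/services/import?dest=/prearchive/projects/$PROJECT" \
        -F "file=@$f") || return 1
    # overwrite=append is intended to merge into an existing session
    # rather than failing with "Session already exists" (per the wiki;
    # overwrite=delete would replace the session instead).
    curl -s -u "$AUTH" -X POST "$XNAT/data/services/archive" \
        --data-urlencode "src=$src" \
        --data-urlencode "overwrite=append"
}

# Batch driver for the ~6,000 files:
# for f in *.tar.gz; do upload_and_archive "$f"; done
```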
From: xnat_di...@googlegroups.com [mailto:xnat_di...@googlegroups.com] On Behalf Of Haofei FENG
Sent: Monday, January 29, 2018 5:26 PM
To: xnat_discussion <xnat_di...@googlegroups.com>
Subject: Re: [XNAT Discussion] How to use a curl command to upload the tar.gz (DICOM) and extract to prearchive
Hi Charlie,
Thanks for the tips. It worked extremely well.
However, when I tried to send a few tar.gz files into the same project in the archive folder (same problem for the prearchive one), it came back with the error below:
"Session already exists, retry with overwrite enabled"
And in the "prearchive" folder, the session shows as "conflict".
So when I click "Archive", it then archives into the project successfully. (Scan 401 is the one.)
But in the project, I have already set up the project "PreArchive Setting" to be
"All image data will be placed into the archive automatically and will overwrite existing files. Data which doesn't match a pre-existing project will be placed in an 'Unassigned' project."
My question is: is there a way from the command line (curl) to force the archiving, so I don't have to archive manually? I have about 1,200 scans (each scan has about 5 different modality tar files), which comes to 6,000+ tar files to upload. It would be extremely difficult for us to manually archive all 6,000 tar files.
Many thanks for your assistance, and I much appreciate your time again!
On Friday, January 26, 2018 at 4:58:06 AM UTC+11, Moore, Charlie wrote:
Yes, that tutorial (and the parent page) are primarily for uploading resource files to various XNAT objects that already exist in the system. What you’re looking for is the import API: https://wiki.xnat.org/display/XAPI/Image+Session+Import+Service+API .
Here’s an example I just used to upload a tar.gz compressed study to the prearchive of project “MYPROJECT” in my local XNAT:
curl -u admin:admin -X POST 'http://10.1.10.17/data/services/import?dest=/prearchive/projects/MYPROJECT' -F "file=@images.tar.gz"
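Charlie's note earlier in the thread that "src" can also be found by querying the prearchive could look roughly like this. The /data/prearchive/projects/{PROJECT} endpoint and the "url" field in its JSON response are assumptions based on the XNAT prearchive REST interface, and the sed-based JSON scraping is for illustration only (a real script should parse the response with jq).

```shell
# List prearchive sessions for a project and print each session's "url"
# field, usable as "src" for /data/services/archive.
list_prearchive_src() {
    curl -s -u "$AUTH" "$XNAT/data/prearchive/projects/$1?format=json" \
        | tr ',' '\n' \
        | sed -n 's/.*"url":"\([^"]*\)".*/\1/p'
}
```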