HCP Pipelines BIDS App


Chris Gorgolewski

Sep 11, 2016, 1:03:32 AM
to Hcp Users, bids-a...@googlegroups.com
Dear HCP community,

Many researchers want to run HCP Pipelines on their own datasets, but run into problems with software dependencies and different organization of data and metadata. To help with this issue I have created an HCP Pipelines BIDS App. Like all other BIDS Apps the HCP Pipelines one has the following features:
  • It's portable (meaning it comes with all of its dependencies at the correct versions - no need to install FSL or FreeSurfer).
  • Runs on Windows, Mac OS X and Linux (as well as HPCs or clusters via Singularity).
  • No need to specify any metadata - all you need as an input is a BIDS dataset.
  • The only software required (across all three platforms) is Docker.
  • The App (which includes all dependencies) is versioned and all historical versions are preserved. This allows you to keep the same software stack intact during a longitudinal study spread over years.
The App was designed to parse the input datasets, figure out which scans are available and run HCP pipelines with optimal parameters. I have tested it with the HCP example subject, but if you have data you would like to provide for testing purposes I would be very happy to take advantage of it.

I hope this will make HCP pipelines accessible to more researchers. Please let me know what you think!

More information how to use the HCP Pipeline App can be found here: https://github.com/BIDS-Apps/HCPPipelines

Best,
Chris Gorgolewski

Marcus, Daniel

Sep 12, 2016, 3:27:27 PM
to Chris Gorgolewski, Hcp Users, bids-a...@googlegroups.com
Hi Chris,

This is great work and I think will be a serious asset to those looking to run HCP pipelines on their data.  Can you explain what steps you’ve taken to adapt the pipelines to work with the BIDS format?  Conversely, have you taken any steps to adapt the existing HCP packages to work with these containers?

-Dan

_______________________________________________
HCP-Users mailing list
HCP-...@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

 



Harms, Michael

Sep 12, 2016, 4:02:58 PM
to Marcus, Dan, Chris Gorgolewski, Hcp Users, bids-a...@googlegroups.com

Hi Chris,
This indeed looks very interesting.  Following up on Dan’s question, it will be important for users to have a transparent way to know if any changes have been made to the HCP Pipeline scripts.  

Along those lines, your github page says:

This BIDS App requires that each subject has at least one T1w and one T2w scan. Lack of fieldmaps, or fMRI scans is handled robustly.

Is the part above that “lack of fieldmaps” is handled “robustly” only in reference to the T1/T2 scans?  Because the HCP Pipelines have always considered fieldmaps to be a mandatory acquisition for the fMRI processing (given their importance in correcting for distortions).

cheers,
-MH

-- 
Michael Harms, Ph.D.
-----------------------------------------------------------
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave. Tel: 314-747-6173
St. Louis, MO  63110 Email: mha...@wustl.edu

Chris Gorgolewski

Sep 12, 2016, 4:32:23 PM
to Harms, Michael, Marcus, Dan, Hcp Users, bids-a...@googlegroups.com
It's time for a Q&A!

Can you explain what steps you’ve taken to adapt the pipelines to work with the BIDS format?  
The pipelines are not modified - the BIDS App calls exactly the same high-level functions as the scripts in https://github.com/Washington-University/Pipelines/tree/master/Examples are calling. All of the work of adapting to BIDS is figuring out which raw files go where and which parameters to set. All of this code is contained in this script: https://github.com/BIDS-Apps/HCPPipelines/blob/master/run.py
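For readers curious what that mapping involves, here is a rough, hypothetical sketch (not the actual run.py code) of the first step - collecting a subject's structural scans from a BIDS tree with nothing but glob:

```python
import os
from glob import glob

def find_structurals(bids_dir, subject):
    """Collect the T1w and T2w images for one subject from a BIDS tree.
    Illustrative only - the real run.py is more thorough (sessions, runs,
    metadata), but the flavor is the same."""
    anat = os.path.join(bids_dir, "sub-" + subject, "anat")
    t1ws = sorted(glob(os.path.join(anat, "*_T1w.nii*")))
    t2ws = sorted(glob(os.path.join(anat, "*_T2w.nii*")))
    if not t1ws or not t2ws:
        raise RuntimeError("need at least one T1w and one T2w scan")
    return t1ws, t2ws
```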
 
Conversely, have you taken any steps to adapt the existing HCP packages to work with these containers?
If by "HCP packages" you mean datasets then yes. As mentioned in the readme: "To convert DICOMs from your HCP-Style (CMRR) acquisitions to BIDS try using heudiconv with this heuristic file." We are also working on a simple HCP2BIDS tool (I'll post it here when it's ready).

This indeed looks very interesting.  Following up on Dan’s question, it will be important for users to have a transparent way to know if any changes have been made to the HCP Pipeline scripts.
I wanted to avoid having to change anything in the original Pipelines. Thus I'm only calling existing top level scripts (like PreFreeSurfer.sh) with command line parameters inferred from the input BIDS dataset.

Is the part above that “lack of fieldmaps” is handled “robustly” only in reference to the T1/T2 scans?  Because the HCP Pipelines have always considered fieldmaps to be a mandatory acquisition for the fMRI processing (given their importance in correcting for distortions).
It applies to both structural as well as functional scans. The word "robustly" is probably not the best in this context. All I mean is that the App will still run (by passing "NONE" to the HCP scripts, which as far as I know is a valid option - please correct me if I am wrong). The results will of course be worse than if fieldmaps were present. I would appreciate it if you could suggest a better wording that would cause less confusion.
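To make that fallback concrete, here is a hypothetical helper (the function itself is made up, though "TOPUP", "SiemensFieldMap" and "NONE" are real values accepted by the PreFreeSurfer --avgrdcmethod option):

```python
def choose_avgrdcmethod(fieldmap_types):
    """Pick a readout-distortion-correction method for PreFreeSurfer
    from the set of fieldmap types found for a subject, falling back
    to "NONE" when the subject has no fieldmaps at all."""
    if "epi" in fieldmap_types:        # opposite phase-encoding EPI pairs
        return "TOPUP"
    if "phasediff" in fieldmap_types:  # gradient-echo (Siemens) fieldmap
        return "SiemensFieldMap"
    return "NONE"                      # run anyway, no distortion correction
```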


Let me know if you have any other questions/suggestions/comments.

Best,
Chris



Satrajit Ghosh

Sep 12, 2016, 5:33:18 PM
to bids-a...@googlegroups.com, Hcp Users


regarding packages, i'll take the software interpretation of the package question. if that is the case, the docker container runs the hcp pipeline scripts as distributed by the project, using the specific versions of dependencies listed in the documentation for the scripts. so they are running the "hcp versions" of the scripts.

just a quick aside on the heudiconv script - this would need to be adapted for each connectome project since the same data are not being acquired at every site, but it gives the general flavor of how you can get bids output from an acquisition done with the CMRR sequences.
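for anyone who has not seen one, a heudiconv heuristic is just a python file exposing infotodict(); a bare-bones sketch (the protocol-name matches below are invented - substitute whatever your site's CMRR series are actually called):

```python
# bare-bones heudiconv heuristic - protocol names are placeholders
def create_key(template, outtype=('nii.gz',), annotation_classes=None):
    return template, outtype, annotation_classes

def infotodict(seqinfo):
    """Map each DICOM series to a BIDS output path template."""
    t1w = create_key('sub-{subject}/anat/sub-{subject}_T1w')
    rest = create_key('sub-{subject}/func/sub-{subject}_task-rest_bold')
    info = {t1w: [], rest: []}
    for s in seqinfo:
        if 'T1w_MPR' in s.protocol_name:
            info[t1w].append(s.series_id)
        elif 'rfMRI_REST' in s.protocol_name:
            info[rest].append(s.series_id)
    return info
```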

cheers,

satra

Glasser, Matthew

Sep 12, 2016, 7:42:08 PM
to Chris Gorgolewski, Harms, Michael, bids-a...@googlegroups.com, Hcp Users
“NONE” is not a supported option for fMRIVolume, but it is supported for PreFreeSurfer.  EPI distortions are much larger than 3D structural readout distortion.

Peace,

Matt.

paola.o...@yale.edu

Jun 15, 2017, 4:00:23 PM
to bids-apps-dev, hcp-...@humanconnectome.org
Dear Chris,

Thank you for all your contributions to reproducible science! I've read your documentation and watched your neurohackweek video, which have been really helpful, but I am still unclear on the specific command used to run the HCP Pipelines BIDS App using Singularity on an HPC. I see instructions for Docker commands here, but not Singularity.

I ran the following command on my local computer to convert the image from Docker to Singularity:

$ docker run --privileged -ti --rm  -v /var/run/docker.sock:/var/run/docker.sock -v /Users/estee/Desktop/singularity_images:/output filo/docker2singularity bids/example:0.0.4

Then I transferred this image to my HPC system (now named: bids_example_0.0.4-2016-08-28-fba9b3e0a751.img) and installed Singularity. Is this the correct image file I should be using?

What is the specific command I should be running to preprocess one subject using Singularity? Should I be running the run.py script within singularity? And how do I specify the subjectID or directory?


Thanks,
Paola

Chris Gorgolewski

Jun 15, 2017, 4:07:53 PM
to Paola Odriozola, bids-apps-dev, Hcp Users
On Thu, Jun 15, 2017 at 1:00 PM, <paola.o...@yale.edu> wrote:
I ran the following commands on my local computer to convert the image from docker to singularity:

$ docker run --privileged -ti --rm  -v /var/run/docker.sock:/var/run/docker.sock -v /Users/estee/Desktop/singularity_images:/output filo/docker2singularity bids/example:0.0.4

Then I transferred this image to my HPC system (now named: bids_example_0.0.4-2016-08-28-fba9b3e0a751.img) and installed Singularity. Is this the correct image file I should be using?
This command will create an image for the example app. To create one for the HCP Pipelines you need to replace "bids/example:0.0.4" with "bids/hcppipelines:v3.17.0-13".

What is the specific command I should be running to preprocess one subject using Singularity? Should I be running the run.py script within singularity? And how do I specify the subjectID or directory?
./name_of_the_image.img <input_bids_directory> <output_directory> participant --participant_label <label>

You need to replace <input_bids_directory> <output_directory> <label> with values corresponding to your dataset.

Please mind that the HCP pipelines only work on datasets that include T2-weighted images and fieldmaps!

Best,
Chris

 



paola.o...@yale.edu

Jun 22, 2017, 4:04:30 PM
to bids-apps-dev, paola.o...@yale.edu, hcp-...@humanconnectome.org
Hi Chris,

Thank you for your response! This worked very well and I hope it will help others in the future.
I am now trying to run the entire HCP pipeline on one subject, labeled "Asample", using the following command:

./bids_hcppipelines_v3.17.0-13-2016-11-02-c9c1c3cc2228.img /tmp/BIDS_HCP_test /tmp/BIDS_HCP_test/HCP_PreProc_output participant --participant_label Asample --license_key CX2frM4fbVCk


I am getting the following error:

Final FOV is:
0.000000 176.000000 0.000000 256.000000 89.000000 150.000000

Image Exception : #22 :: ERROR: Could not open image /opt/HCP-Pipelines/global/templates/MNI152_T1_1.0mm
terminate called after throwing an instance of 'RBD_COMMON::BaseException'
/opt/HCP-Pipelines/PreFreeSurfer/scripts/ACPCAlignment.sh: line 84:  4731 Aborted                 ${FSLDIR}/bin/flirt -interp spline -in "$WD"/robustroi.nii.gz -ref "$Reference" -omat "$WD"/roi2std.mat -out "$WD"/acpc_final.nii.gz -searchrx -30 30 -searchry -30 30 -searchrz -30 30

Traceback (most recent call last):
  File "/run.py", line 340, in <module>
    stage_func()
  File "/run.py", line 69, in run_pre_freesurfer
    run(cmd, cwd=args["path"], env={"OMP_NUM_THREADS": str(args["n_cpus"])})
  File "/run.py", line 28, in run
    raise Exception("Non zero return code: %d"%process.returncode)
Exception: Non zero return code: 134

It seems like /opt/HCP-Pipelines/global/templates/ is not on my HPC but rather in the container, is this correct?
Could you please advise on ways to debug or things to try?
Should I be using Joke's Singularity file at all? I see the HCPPIPEDIR in question on there.

Any advice on how to proceed/troubleshoot would be greatly appreciated!

Thanks,
Paola Odriozola




Chris Gorgolewski

Jun 22, 2017, 4:44:15 PM
to Paola Odriozola, bids-apps-dev, Hcp Users
On Thu, Jun 22, 2017 at 4:04 PM, <paola.o...@yale.edu> wrote:
It seems like /opt/HCP-Pipelines/global/templates/ is not in my HPC but rather in the container, is this correct?
This is a bug - I'm working on a fix. Will get back to you.

Best,
Chris
 

Chris Gorgolewski

Jun 26, 2017, 6:15:34 PM
to Paola Odriozola, bids-apps-dev, Hcp Users
Hi,

I released a new version with a bugfix: v3.17.0-14. Please give it a try.

Best,
Chris

paola.o...@yale.edu

Jun 30, 2017, 5:52:30 PM
to bids-apps-dev
Hi Chris,

Thank you so much! This bug fix seems to have worked very well.

Our script ran all of the T1w and T2w preprocessing but got stuck at rest1. I have attached an image of the error.

If we're reading this error correctly, it seems to be getting stuck at the distortion correction, which led us to go back through the BIDS fieldmap specifications, about which I have a few questions. We acquired two 3D distortion maps for resting state, one in the AP direction and the other in the PA direction. We assume this means that our fieldmaps are consistent with Case 4: "Multiple phase encoded directions (topup)". We converted our DICOMs using heudiconv, which created the JSON files for us (example attached). Should we manually add the following variables to that JSON file?

{
"PhaseEncodingDirection": "j-",
"TotalReadoutTime": 0.095,
"IntendedFor": "func/sub-A200_task-rest_bold.nii.gz"
}

If so, does it matter where in the existing (and very long) JSON file we add these fields?

For PhaseEncodingDirection, what do each of these letters mean: "i", "j", "k", "i-", "j-", "k-" (taken from bids_spec1.0.1)? What do we put for PhaseEncodingDirection if our scan specifications just say "Phase enc. dir. A>>P"?

Thanks again for all your help!

Best,
Paola Odriozola
error.jpg
sub-Asample_acq-rest_dir-PA_run-3_epi.json

Chris Gorgolewski

Jun 30, 2017, 6:17:59 PM
to Paola Odriozola, bids-apps-dev
On Fri, Jun 30, 2017 at 2:52 PM, <paola.o...@yale.edu> wrote:
Thank you so much! This bug fix seems to have worked very well.

Our script ran all of the T1w and T2w preprocessing but got stuck at rest1. I have attached an image of the error.
Yay!
 
If we're reading this error correctly, it seems to be getting stuck at the Distortion correction, which led us to go back through the BIDS fieldmaps specifications, which I have a few questions about. We acquired two 3D distortion maps for resting state, one is in the AP direction, and the other in PA direction. We are assuming this means that our fieldmaps are consistent with Case 4: "Multiple phase encoded directions (topup)". We converted our dicoms using heudiconv which created the JSON files for us (example attached). Should we manually add the following variables to that JSON file?

{
"PhaseEncodingDirection": "j-",
"TotalReadoutTime": 0.095,
"IntendedFor": "func/sub-A200_task-rest_bold.nii.gz"
}
If so, does it matter where in the existing (and very long) JSON file we add these fields?
Yes - this should be in the JSON file describing the fieldmap. The order of fields does not matter, but the fields need to appear at the top level of the JSON dictionary.

You can check if everything works by providing an incorrect path in the "IntendedFor" field and using the validator - it should throw an error. 
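A minimal sketch of the edit in Python, reusing the field values from your message (the sidecar contents below are a stand-in - your heudiconv-generated file will have many more keys, all of which are preserved):

```python
import json

# stand-in for the (much longer) sidecar heudiconv wrote; normally you
# would edit the existing file rather than create one
fname = "sub-Asample_acq-rest_dir-PA_run-3_epi.json"
with open(fname, "w") as f:
    json.dump({"EchoTime": 0.005}, f)

with open(fname) as f:
    sidecar = json.load(f)

# the three fields go at the top level of the JSON dictionary; their
# position relative to the existing keys does not matter
sidecar["PhaseEncodingDirection"] = "j-"  # verify polarity against your data
sidecar["TotalReadoutTime"] = 0.095
sidecar["IntendedFor"] = "func/sub-A200_task-rest_bold.nii.gz"

with open(fname, "w") as f:
    json.dump(sidecar, f, indent=2)
```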

For PhaseEncodingDirection, what do each of these letters mean: "i”, “j”, “k”, “i-”, “j-, “k-” (taken from bids_spec1.0.1)? What do we put for PhaseEncodingDirection if our scan specifications just say "Phase enc. dir. A>>P"?
This is a bit more complex. To be absolutely sure you should open the file in fslview or another viewer that shows you the voxel indices, check which dimension (i, j, k) the A-P axis corresponds to, and check whether the indices increase or decrease along this direction ('j' vs 'j-'). The good thing is that TOPUP does not care about the polarity as long as you provide two opposite ones.

Remember to add the PhaseEncodingDirection to the _bold files as well!
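Where a viewer is not handy, the viewer check can also be sketched numerically from the NIfTI affine (this helper is invented for illustration and is no substitute for eyeballing the data):

```python
import numpy as np

def voxel_axis_for(affine, world_axis):
    """Return which voxel axis (i/j/k) a world axis maps to, plus whether
    increasing voxel index moves in the +world direction (if not, BIDS
    uses the '-' suffix, e.g. 'j-'). world_axis: 0=x (L/R), 1=y (P/A),
    2=z (I/S). Illustration only - verify in a viewer."""
    rot = np.asarray(affine, dtype=float)[:3, :3]
    col = int(np.argmax(np.abs(rot[world_axis, :])))  # dominant voxel axis
    return "ijk"[col], bool(rot[world_axis, col] > 0)

# for a RAS-oriented image (identity affine), the A-P axis is along j:
letter, increases_toward_a = voxel_axis_for(np.eye(4), 1)
```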

I hope this helps!

Best,
Chris