A colleague of mine asked me if I could help him figure out whether the standard HCP DTI processing scripts might successfully run on a short DTI sequence he has. He started collecting data with it because he was told it’d be compatible with the standard, much longer one used in prior HCP projects. While I’m familiar with HCP scripts in general, I’ve moved away from DTI work myself, so I’m hitting this kinda cold. It was pretty easy to figure out what files were needed and how to get the main wrapper script running, but it all crashed on the eddy current processing script. It gave me exit status 1 with the message:
“EDDY::: ECScanManager::GetShellIndicies: Data not shelled”
My first thought was that this might simply reflect the sequence he used… it’s only about 9-10 minutes long and conceivably could have collected all the directions at a single shell value. But here’s where my unfamiliarity with HCP’s normal sequence bumps up against my only middling knowledge of DTI in general.
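For reference, here’s roughly how I’ve been eyeballing his bvals file to see how the b-values cluster (assuming the usual single-row, whitespace-separated FSL bvals format; the path is a placeholder, and the rounding just lumps near-identical nominal b-values into one bin):

# Count how many volumes sit at each (rounded) b-value in an FSL-style bvals file.
# Path is a placeholder; rounding to the nearest 100 groups values like b=5 with b=0
# and b=995 with b=1000.
bvals=/path/to/Diffusion/bvals
tr -s ' \t' '\n' < "$bvals" | awk 'NF { b = int(($1 + 50) / 100) * 100; n[b]++ }
    END { for (b in n) print b, n[b], "volumes" }' | sort -n

If that shows the directions smeared across many distinct b-values rather than a small number of clean shells, I’m guessing that would line up with the “not shelled” complaint.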
If one of y’all has a good working knowledge of HCP DTI, can you point us in the right direction? Primarily, I’m hoping to learn what I need to check so I can go back to my colleague and tell him whether or not he really DOES have an HCP-compatible DTI sequence. Or perhaps learn whether it’s possible to make minor modifications to the standard pipeline code to accommodate whatever he ran. But also, I hate being in the dark on these things, so I’d love to get my hands on any sort of HCP DTI “walk-through” or technical manual that explains how the sequences themselves and the pipelines were put together… if one is available somewhere. It also didn’t look like the HCP DTI pipeline does anything after basic preprocessing… that is, I didn’t see code to generate FA maps, do tractography, or the like, or anything to facilitate DTI data QC, etc. It’d be useful to confirm that.
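For what it’s worth, if the pipeline really does stop at preprocessing, my assumption is that FA maps would just come from running FSL’s dtifit on the preprocessed output afterwards — something like the sketch below (the T1w/Diffusion layout is my guess at where the pipeline leaves its outputs; all paths are placeholders):

# Hypothetical post-processing step, not part of the HCP pipeline itself:
# fit the diffusion tensor with FSL's dtifit to get FA/MD/V1 maps.
DiffDir=/path/to/study/SUBJ01/T1w/Diffusion   # placeholder; adjust to your layout
dtifit --data="${DiffDir}/data" \
       --mask="${DiffDir}/nodif_brain_mask" \
       --bvecs="${DiffDir}/bvecs" \
       --bvals="${DiffDir}/bvals" \
       --out="${DiffDir}/dti"
# Writes dti_FA, dti_MD, dti_V1, etc. next to the preprocessed data.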
Thanks,
Mike
Michael C. Stevens, Ph.D.
Director, Clinical Neuroscience and Development Laboratory
Associate Director, Olin Neuropsychiatry Research Center
Director, Child and Adolescent Research, The Institute of Living
Adjunct Professor of Psychiatry, Yale University School of Medicine
Hi Mike,
I expect this has to do with the compatibility of the acquisition with eddy, and perhaps not the HCP Pipelines per se. It would probably be best to post the error message and bvals/bvecs on the FSL list, where Jesper may be able to take a look.
Matt.
To elaborate slightly:
See
https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/eddy/UsersGuide#A--data_is_shelled
and
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A2=fsl;166e972c.1610
That error has nothing to do with the HCP Pipelines per se.
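If, after reading those, it turns out the acquisition is something eddy can legitimately handle and you just need it to skip that check, eddy’s --data_is_shelled flag can be passed through the pipeline rather than by editing any code. A sketch, assuming your HCPpipelines version has the --extra-eddy-arg option; everything else below is a placeholder for your usual required arguments:

# Pass eddy's --data_is_shelled flag through DiffPreprocPipeline.sh.
# All paths and values are placeholders -- substitute your own study/acquisition
# details, and only add --data_is_shelled if you've confirmed it is appropriate
# for your data.
${HCPPIPEDIR}/DiffusionPreprocessing/DiffPreprocPipeline.sh \
    --path=/path/to/study \
    --subject=SUBJ01 \
    --posData=/path/to/dMRI_PEpos.nii.gz \
    --negData=/path/to/dMRI_PEneg.nii.gz \
    --PEdir=2 \
    --echospacing=0.69 \
    --gdcoeffs=NONE \
    --extra-eddy-arg=--data_is_shelled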
Re dMRI QC: HCPpipelines version 4.1.3 added automatic running of FSL’s QUAD tool:
https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/eddyqc/UsersGuide
And once you have a population of individuals, you can run SQUAD to generate distributions of the QC variables that it creates.
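The group step is then just a matter of pointing eddy_squad at the collection of single-subject QUAD output directories, e.g. (the list-building pattern and output path are placeholders; check eddy_squad --help in your FSL version for the exact options):

# Group-level QC: one QUAD output directory per line in a plain text file.
ls -d /path/to/study/*/Diffusion/eddy/*.qc > quad_dirs.txt   # placeholder pattern
eddy_squad quad_dirs.txt -o /path/to/study/group_QC          # -o = output directory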
Cheers,
-MH
--
Michael Harms, Ph.D.
-----------------------------------------------------------
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave., St. Louis, MO 63110
Tel: 314-747-6173
Email: mha...@wustl.edu
Thanks to you both, Michael and Matt. That helps a lot. I can get the data past the eddy checking step by setting a flag, but I’m still tracking down errors. However, those errors also seem to come from the tools the pipelines call, not the pipelines themselves (e.g., there’s a weird crash in one of the QUAD steps that fails to produce all the QC metrics… probably something specific to our cluster).
I’m working on the last DTI/T1 registration script now, ‘cause it’s got a problem finding files where the HCP pipeline example says they should be. While I’m fairly confident I can run down this and any other remaining glitches with straightforward code-tracing… I do have one related question, if anyone happens to know: the “DiffusionToStructural.sh” script calls the “epi_reg_dof” global script on a file that’s called simply “nodif” in the pipeline script. What is this file supposed to be? I don’t see anything in earlier code that produces a file simply called nodif, and it’s certainly not written to the ../Diffusion/data sub-directory where the example DiffusionToStructural.sh call says it should be. But there is a file called “nodif_brain.nii.gz” in the ../Diffusion/topup subdirectory, and that file sorta looks like it’d be a valid source of registration parameters. Is this the proper input for epi_reg_dof, to produce the files needed for the subsequent applywarp, BBR, and remaining steps that finish up the Diffusion pipeline?
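(In case it matters for the answer: my working assumption is that “nodif” just means a single b=0 reference volume pulled out of the preprocessed series, i.e. something one could make generically like the snippet below, though I don’t know whether that’s literally what DiffusionToStructural.sh intends here. Paths are placeholders.)

# Generic illustration only, not necessarily what the pipeline does internally:
# extract the first b=0 volume of the preprocessed data as a "nodif" reference.
DataDir=/path/to/study/SUBJ01/Diffusion/data   # placeholder
b0_idx=$(tr -s ' \t' '\n' < "${DataDir}/bvals" | awk 'NF' | awk '$1 < 50 { print NR - 1; exit }')
fslroi "${DataDir}/data" "${DataDir}/nodif" "${b0_idx}" 1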
Mike
Hi Mike,
Can you clarify: Is there a bug of some sort in the HCPpipelines code that is leading to these questions? If there is, that’s something we’d certainly like to know about.
If not, I’m going to let you track down the flow of the ‘nodif’ file with “code-tracing”, as you said.
Cheers,
-MH
Ahhh… Again, hugely helpful, guys. I think I see what happened. I thought the glitch I got from the eddy_quad function simply shorted out that call… but that the rest of the script ran. From what Matt pointed out, I’m now seeing that it’s actually killing the whole rest of the eddy_postproc.sh script, and that’s what’s bombing out the structural mapping.
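(That behavior makes sense to me now if the post-eddy script is effectively running with bash’s exit-on-error behavior, which is my assumption; a toy illustration of the pattern:)

# Under "set -e", the first command that exits non-zero aborts the whole script,
# so a failing eddy_quad call would take the rest of the post-processing with it.
set -e
echo "running QC step (simulated failure)..."
false                         # stands in for the failing call
echo "this never runs"        # ...and neither does anything after it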
Michael – I’m pretty sure this isn’t any sort of general HCP pipeline bug. I’d started wondering if it might be something I had to customize, but it doesn’t even seem to be that.
Have a good weekend y’all.
Yes, the intent is that users should be able to simply run the DiffPreprocPipeline.sh script, without needing to “customize” anything.
If you find you need to customize something simply to get the script to run, that would be good for us to know about.