normalization of diffusion images

Kazim Gumus

unread,
Oct 10, 2022, 1:24:58 PM
to HCP-Users
Hello,
I downloaded the diffusion data of 100 unrelated subjects from the WU-Minn HCP 1200 dataset. I would like to normalize the images of all subjects to a standard template so that I can use them as input to a neural network algorithm. I was wondering what would be an ideal way or tool to do this.
I am thinking of normalizing the b0 images of each subject to a template (?), then taking the normalization parameters and applying them to the diffusion-weighted images using SPM. Does that sound reasonable? Any suggestion is appreciated.
Kazim

Glasser, Matt

unread,
Oct 10, 2022, 1:30:48 PM
to hcp-...@humanconnectome.org

This has already been done.  Use ${StudyFolder}/${Subject}/MNINonLinear/xfms/acpc_dc2standard.nii.gz and FSL’s applywarp to achieve this.


Matt.

Kazim Gumus

unread,
Nov 17, 2022, 2:56:02 PM
to HCP-Users, glas...@wustl.edu
Thanks Matt!

I have been working on this. 
I have a few questions. I am using the already processed datasets, so my input to warp for a subject is "data.nii.gz" under ${Subject}/T1w/Diffusion/.
I am not running the whole HCP pipeline; instead, I created a small script using a few lines from the HCP pipeline. The key line is below:

${FSLDIR}/bin/applywarp \
    -i ${StudyFolder}/${Subject}/T1w/Diffusion/data.nii.gz \
    -o ${StudyFolder}/${Subject}/T1w/Diffusion/data_warped.nii.gz \
    -r ${StudyFolder}/${Subject}/MNINonLinear/T1w_restore.nii.gz \
    -w ${StudyFolder}/${Subject}/MNINonLinear/xfms/acpc_dc2standard.nii.gz \
    --interp=spline
    #--premat=${StudyFolder}/${Subject}/T1w/Results/${fMRIName}/Import/fMRI2str.mat \

I am using acpc_dc2standard.nii.gz as you suggested. For the reference image (-r), I used "T1w_restore.nii.gz" under /MNINonLinear/. I could not find any *.mat file.

I ran it, and it seems to have transformed data.nii.gz to MNI space. I compared the results (data_warped) for two subjects, and they look similar. I checked the matrix size of the warped data: it is 260x311x260, the same dimensions as T1w_restore.nii.gz.

I now need to apply skull stripping. I am doing it in MATLAB, using nodif_brain_mask.nii.gz under /${Subject}/T1w/Diffusion/. However, its dimensions are 145x174x145, which is the same as the data before normalization, so I cannot continue due to the mismatched matrix dimensions.

I am thinking that I need to warp the mask to MNI space too. Would that be the same command as given above, except that the input file would be nodif_brain_mask.nii.gz?
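For illustration, a minimal sketch of what that mask-warping call might look like, assuming the same reference image and warp field as above. The output filename is only illustrative, and --interp=nn is an assumption on my part, on the reasoning that nearest-neighbour interpolation keeps the mask binary, whereas spline would produce non-integer values:

# Sketch only (assumptions noted above): warp the diffusion brain mask
# into MNI space with the same warp field used for the data.
${FSLDIR}/bin/applywarp \
    -i ${StudyFolder}/${Subject}/T1w/Diffusion/nodif_brain_mask.nii.gz \
    -o ${StudyFolder}/${Subject}/T1w/Diffusion/nodif_brain_mask_warped.nii.gz \
    -r ${StudyFolder}/${Subject}/MNINonLinear/T1w_restore.nii.gz \
    -w ${StudyFolder}/${Subject}/MNINonLinear/xfms/acpc_dc2standard.nii.gz \
    --interp=nn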

Or should I do skull stripping before the normalization? If I do that, I need to skull strip T1w_restore too before applywarp, right?

Any suggestions? Do you see anything wrong with what I do so far?
Thank you,
Kazim




Glasser, Matt

unread,
Nov 17, 2022, 6:33:24 PM
to hcp-...@humanconnectome.org

Just use one of the _brain or brain mask files in MNINonLinear that is at the right resolution, rather than separately running brain extraction.
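For example, a minimal sketch of that masking step, assuming the warped data from the earlier applywarp command; the specific mask file (brainmask_fs.nii.gz) and the output name are assumptions here, and any MNI-space brain mask at the matching resolution would serve:

# Sketch only: mask the MNI-space diffusion data with an existing
# MNI-space brain mask instead of re-running brain extraction.
${FSLDIR}/bin/fslmaths \
    ${StudyFolder}/${Subject}/T1w/Diffusion/data_warped.nii.gz \
    -mas ${StudyFolder}/${Subject}/MNINonLinear/brainmask_fs.nii.gz \
    ${StudyFolder}/${Subject}/T1w/Diffusion/data_warped_brain.nii.gz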
