Schaefer parcellation


Liz Izakson

Nov 23, 2020, 10:50:55 AM
to hcp-...@humanconnectome.org
Hello all,

I am trying to apply the Schaefer parcellation to HCP resting-state data in Python, but I am having problems converting from dscalar format to dtseries format. Does anyone have experience with this?

Thank you very much,

Liz Izakson, MSc

Neuro-Economics laboratory

Sagol School of Neuroscience

Tel-Aviv University, Tel-Aviv, Israel 69978

Glasser, Matthew

Nov 23, 2020, 10:51:54 AM
to hcp-...@humanconnectome.org

We would need to know more details about what is happening.


Matt.


Ruby Kong

Nov 23, 2020, 11:01:39 AM
to hcp-...@humanconnectome.org
Hi Liz,

May I know why you need to convert the dscalar to dtseries? By the way, you can also use dlabel files.

Liz Izakson

Nov 23, 2020, 11:42:49 AM
to HCP-Users, Glasser, Matthew
Hello Matt,

Thank you for your reply.
Here are more details regarding my question:
I use resting-state fMRI data from 900 subjects from the HCP, and I want to try several different parcellations to validate my model.
I use Python for my analyses. For the Yeo7/Yeo17 and Glasser et al. multi-modal parcellations, I used Nilearn as well as hcp_utils.
However, I could not find an implementation of the Schaefer parcellation in these packages, and the only examples I found of applying this parcellation to the data were in MATLAB. So my question is: does anyone know how to apply a parcellation (specifically Schaefer's, but I guess it could be any parcellation) manually in Python, i.e. without using a package that does it automatically?

Sincerely,
Liz

Liz Izakson

Nov 23, 2020, 11:50:51 AM
to HCP-Users, roo....@gmail.com
Hi,

When I looked at MATLAB code that applies a parcellation manually to the data (averaging the voxels/vertices within each parcel), I saw that it converts the dscalar data to dtseries format. I thought this was the way to solve the problem of the mismatched dimensions between the raw data files, which contain 91,282 grayordinates, and the dscalar parcellation files (specifically, Schaefer's parcellation), which contain 64,984 vertices. The mismatched dimensions are my biggest issue.
How can I use the dlabel file? 

Thank you,
Liz

Glasser, Matthew

Nov 23, 2020, 11:54:40 AM
to hcp-...@humanconnectome.org, roo....@gmail.com

I don’t know much about the extant Python tools; however, you should be able to use wb_command -cifti-parcellate together with a .dlabel.nii file to parcellate a dtseries file.
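
For example, a minimal invocation from Python (a sketch only; the file names are placeholders and wb_command is assumed to be on your PATH):

import subprocess

# Parcellate a dense timeseries along its dense (COLUMN) dimension using a dlabel file.
subprocess.run([
    "wb_command", "-cifti-parcellate",
    "rfMRI_REST1_LR_Atlas_MSMAll.dtseries.nii",            # input dtseries (placeholder name)
    "Schaefer2018_100Parcels_7Networks_order.dlabel.nii",  # parcellation dlabel
    "COLUMN",
    "parcellated.ptseries.nii",                            # output ptseries
], check=True)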

Coalson, Timothy Scott (S&T-Student)

Nov 23, 2020, 3:58:55 PM
to HCP-Users, roo....@gmail.com
Having a different number of brainordinates than the standard 91282 grayordinates isn't solved by changing between dscalar and dtseries.  You need to extract the surface data from the timeseries and dlabel files so you can use the vertex indices, rather than trying to use the brainordinates that don't match between the files.  Additionally, 64.9k brainordinates suggests that the file is surface-only, and also does not exclude the medial wall, unlike the standard 91282 grayordinates (which only contain about 29k vertices from each hemisphere, out of a possible total of 32k).

wb_command -cifti-parcellate can handle these brainordinate differences (though most other commands expect an exact match in terms of brainordinates), but it isn't written in python.

Tim



Liz Izakson

Nov 24, 2020, 3:24:04 AM
to HCP-Users, tsc...@mst.edu, roo....@gmail.com
Thank you all for your help.
So, if I understand correctly, my options are either wb_command or MATLAB code that can handle these brainordinate differences?

Liz   


Coalson, Timothy Scott (S&T-Student)

Nov 24, 2020, 4:55:15 PM
to Liz Izakson, HCP-Users, roo....@gmail.com
Presumably it can be done in python, but you might have to write it yourself.  I don't expect a matlab implementation to produce a proper .ptseries.nii file, I don't know if it would handle the different medial wall issue correctly (or even let you know if it couldn't), and I would guess it may use ft_read_cifti rather than the new cifti-matlab v2.  If you just want to make sure it is parcellated correctly, be told when you have given it the wrong files, and have the option to weight the averaging by vertex areas or volumes, I would recommend wb_command.

Tim



Ruby Kong

Nov 25, 2020, 2:00:10 AM
to Liz Izakson, HCP-Users
Hi Liz,

In HCP data, the first 64984 vertices are surface vertices: the first 32492 are the left hemisphere and the other half are the right hemisphere. The medial wall is denoted as 0 in the Schaefer parcellations, which corresponds to the NaN rows in the HCP fMRI data.

I am not sure how to do it in Python; in MATLAB you can download the toolbox here: https://github.com/Washington-University/cifti-matlab

Check the README of the toolbox above for how to use ft_read_cifti; the example there reads a dlabel file.
 
Let me know if you have any other questions about the Schaefer parcellation.

Best,
Ruby

Liz Izakson

Nov 25, 2020, 5:42:27 AM
to Ruby Kong, hcp-...@humanconnectome.org
Hi Ruby,

Thank you very much for your help. I'll definitely take a look at what you have suggested.
Just a minor clarification: if in the HCP data the first 64984 vertices are surface vertices, then if I exclude the last 26298 vertices, I will match the HCP data to the dlabel/dscalar files of the Schaefer parcellation, no? Then I could manually average the vertices in the HCP data according to the Schaefer parcellation, because I would have the same number of vertices in both files.
Does this way of thinking sound plausible or am I missing something?

Thank you so much,
Liz

Ruby Kong

Nov 25, 2020, 6:15:37 AM
to Liz Izakson, HCP-Users
Hi Liz,

You are right. You can double-check whether the 0s in the Schaefer parcellation correspond to the NaN rows after you exclude the last 26298 vertices. If they match well, then it should be correct.

Glasser, Matthew

Nov 25, 2020, 9:48:23 AM
to hcp-...@humanconnectome.org, Liz Izakson

The HCP doesn’t produce files with NaNs in them.  There was an old, not-recommended CIFTI MATLAB toolbox intended for MEG data only that did that, but we recommend either the simpler ciftiopen/ciftisave functions or, at this point, the new toolbox now available at Ruby’s link.


Matt.

 


Liz Izakson

Nov 25, 2020, 10:32:01 AM
to Glasser, Matthew, hcp-...@humanconnectome.org
Hi Matt,

Thank you for the clarification. However, does the point of excluding the last 26298 vertices from the HCP data, so that the HCP data and the Schaefer parcellation file have the same number of vertices (a total of 64984 in both), still hold?

Thank you,
Liz

Glasser, Matthew

Nov 25, 2020, 11:35:47 AM
to hcp-...@humanconnectome.org

HCP CIFTI files contain 29696 left hemisphere vertices and 29716 right hemisphere vertices, along with 31870 subcortical voxels.  If you were using the HCP’s multi-modal parcellation, it would work to include only the first 59412 grayordinates and exclude the subcortical grayordinates.  It sounds like the Schaefer parcellation might use a different internal organization, though.  I don’t know the details of that and would defer to Ruby.
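
If it helps to verify those counts from Python, here is a small nibabel sketch (the file name is a placeholder, and this assumes a recent nibabel with the cifti2 axes API):

import numpy as np
import nibabel as nib

img = nib.load("rfMRI_REST1_LR_Atlas_MSMAll.dtseries.nii")  # any 91k HCP dtseries
bm = img.header.get_axis(1)  # axis 1 is the brain-model (grayordinate) axis

print((bm.name == "CIFTI_STRUCTURE_CORTEX_LEFT").sum())   # expected: 29696
print((bm.name == "CIFTI_STRUCTURE_CORTEX_RIGHT").sum())  # expected: 29716
print(bm.volume_mask.sum())                               # expected: 31870 subcortical voxels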

Liz Izakson

Nov 25, 2020, 11:54:23 AM
to HCP-Users, Glasser, Matthew
Thank you again for your help.

Ruby Kong

Nov 25, 2020, 11:57:06 AM
to hcp-...@humanconnectome.org
Hi Liz,

I double-checked some numbers on my side. Matt is right; it seems that your fMRI data with 91282 grayordinates already exclude the medial wall vertices.
You can simply ignore the vertices denoted as 0 in the Schaefer parcellation; this will give you 59412 vertices, corresponding to the first 59412 rows in the HCP data you read. You can use either ciftiopen or ft_read_cifti to read the Schaefer parcellation in MATLAB. This will give you a 64984x1 vector, and excluding the 0s will give you 59412.

Liz Izakson

Nov 25, 2020, 11:58:47 AM
to HCP-Users, roo....@gmail.com, HCP-Users, Liz Izakson
Hi Ruby,

I could not find any NaN values in the HCP data (as Matt had pointed out). However, I still excluded the last 26298 vertices from the HCP data to equalize the number of vertices in the HCP data and the Schaefer parcellation file (a total of 64984 in both). From your experience with this parcellation, is this matching correct?

Thank you,
Liz 

Glasser, Matthew

Nov 25, 2020, 12:03:21 PM
to hcp-...@humanconnectome.org, roo....@gmail.com, Liz Izakson

As I said before, that can’t possibly be correct as there are 31870 subcortical voxels in HCP data.


Matt.

 


Ruby Kong

Nov 25, 2020, 12:09:14 PM
to Liz Izakson, HCP-Users
Hi Liz,

Sorry for being unclear. I previously mentioned NaN values because I thought you were reading the fMRI data the way we do. We use ft_read_cifti to read the HCP fMRI data, which gives a 96854 x T matrix instead of 91282 x T. If your data has 91282 grayordinates, then the medial wall is already excluded. So you should take the first 59412 x T of your data and exclude the 0s in the Schaefer parcellation.

In the end, your data should be 59412xT and the Schaefer parcellation will also be 59412x1.

Liz Izakson

Nov 25, 2020, 3:15:49 PM
to Ruby Kong, HCP-Users
Hi Ruby,

Thank you very much. This is very helpful!

Liz

Coalson, Timothy Scott (S&T-Student)

Nov 25, 2020, 4:31:04 PM
to hcp-...@humanconnectome.org
The recommended way to deal with this manually in matlab going forward is not to exclude a "magic" number of indices to get the surface data, but to use the new v2 cifti-matlab to extract the surface data for each hemisphere from each file:

tseriescii = cifti_read(...);
labelscii = cifti_read(...);
leftseries = cifti_struct_dense_extract_surface_data(tseriescii, 'CORTEX_LEFT');
leftlabels = cifti_struct_dense_extract_surface_data(labelscii, 'CORTEX_LEFT');

and repeat the last 2 lines with 'CORTEX_RIGHT'.  This returns a full-surface array, like gifti (it basically functions like wb_command -cifti-separate), for ease of matching by surface vertex indices.  This is only necessary when you have to deal with files that don't use the same conventions for which vertices to include, and/or whether they contain subcortical voxels, or if you need to know the boundaries between structures for processing reasons.  If all your inputs were 91k grayordinates, you could just use the data matrices as-is.
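
For anyone doing the same thing in Python, a rough analogue of that extraction (a sketch only, assuming nibabel's cifti2 axes API -- get_axis and iter_structures -- and the standard 32492-vertex fs_LR meshes; the file name is a placeholder):

import numpy as np
import nibabel as nib

def extract_surface_data(img, structure, n_vertices=32492):
    # Return a full-surface (n_vertices x T) array for one hemisphere,
    # with vertices absent from the CIFTI file (e.g. medial wall) left as 0.
    data = img.get_fdata()            # (T, n_grayordinates) for a dtseries
    bm = img.header.get_axis(1)       # brain-model (grayordinate) axis
    for name, indices, model in bm.iter_structures():
        if name == structure:         # CIFTI structure names, e.g. 'CIFTI_STRUCTURE_CORTEX_LEFT'
            out = np.zeros((n_vertices, data.shape[0]))
            out[model.vertex, :] = data[:, indices].T
            return out
    raise ValueError(structure + " not found in file")

img = nib.load("rfMRI_REST1_LR_Atlas_MSMAll.dtseries.nii")
left_series = extract_surface_data(img, "CIFTI_STRUCTURE_CORTEX_LEFT")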

If I recall, the released versions of the ft_ functions didn't allow writing a proper 91k grayordinates file, as they didn't respect the ROI of excluded vertices, so I would recommend migrating away from that, at least for MRI data.  The v2 cifti-matlab also has wrapper functions that act as a drop-in replacement for ciftiopen and ciftisave (but doesn't have wrappers to act like ft_read_cifti).

As mentioned, wb_command -cifti-parcellate already deals with this issue for you.  wb_command -cifti-create-dense-from-template could also have converted the dlabel file to use the same grayordinates, making it into a 91k file.

Tim



Liz Izakson

Nov 26, 2020, 3:55:21 AM
to HCP-Users, roo....@gmail.com, HCP-Users, Liz Izakson
Hi Ruby,

When I remove all the zeros from the Schaefer parcellation, I am left with a 58609 x 1 shape. Does that make sense?

Thank you,
Liz

Liz Izakson

Nov 26, 2020, 4:09:39 AM
to HCP-Users, tsc...@mst.edu
Hi Tim,

Thank you for your reply. I am trying to deal with it manually in Python, since I think it is a matter of understanding what the vertices in the HCP data and in the Schaefer parcellation mean, so that I can match between them.
However, I completely agree that I should use wb_command (or MATLAB) to validate the parcellated data that I am getting.

Thank you,
Liz

Ruby Kong

Nov 26, 2020, 6:20:29 AM
to Liz Izakson, HCP-Users
No. It should be 59412. Can you point me to the file you are reading? Did you use ciftiopen or ft_read_cifti? 

Liz Izakson

Nov 26, 2020, 7:44:08 AM
to Ruby Kong, HCP-Users
Hi Ruby,

I read the following file: Schaefer2018_100Parcels_7Networks_order.dlabel.nii which I downloaded from https://github.com/ThomasYeoLab/CBIG/tree/master/stable_projects/brain_parcellation/Schaefer2018_LocalGlobal.
Right now, I open the file using Python (nibabel package). Specifically, I use this code:
import nibabel as nib

img = nib.load(path)      # path to the Schaefer .dlabel.nii file above
parc = img.get_fdata()    # one label value per surface vertex

I end up with a parcellation array of shape 1x64984.
Then I removed all the zeros and was left with 1x58609.

I will try this also with MATLAB as you have suggested.

Thank you,
Liz


Ruby Kong

Nov 26, 2020, 8:07:58 AM
to Liz Izakson, HCP-Users
Hi Liz,

I know what is going on. The Schaefer parcellation was originally generated from data in fsaverage6 space and then projected to fs_LR 32k, so the medial wall is also originally defined in fsaverage6. After projection, that medial wall is not the same as HCP's medial wall. I am so sorry for the confusion.

To use the Schaefer parcellation on HCP data, you can "insert" the medial wall back into your 59412xT matrix. In this way both files will have 64984 vertices. You can find the medial wall mask of the HCP data here: https://github.com/ThomasYeoLab/CBIG/blob/master/stable_projects/brain_parcellation/Kong2019_MSHBM/lib/fs_LR_32k_medial_mask.mat
You can use the mask above to convert your 59412xT data to 64984xT.

However, this also means that some vertices assigned a label might be medial wall vertices in the HCP data. When we do our analyses, we simply ignore these vertices.

Sorry for the confusion. 

Liz Izakson

Nov 26, 2020, 9:53:14 AM
to Ruby Kong, HCP-Users
Hi Ruby,

Thank you for the clarification and sorry for continuously asking you questions. 
In practice, do you mean that I should use this mask file to filter the parcellation file (or the HCP data files)?
My understanding is that I can filter the data file with this mask and take only the vertices that are tagged 1 in the mask (there are 59412 of those).

Thank you,
Liz

Ruby Kong

Nov 26, 2020, 10:09:12 AM
to Liz Izakson, HCP-Users
Hi Liz,

Assuming fMRI_data_orig is your 59412xT matrix, you can do something like

fMRI_data = zeros(64984,T);                   % full-surface matrix; medial wall rows stay 0
fMRI_data(medial_mask,:) = fMRI_data_orig;    % medial_mask: the 64984x1 logical mask from the .mat file

fMRI_data will then be 64984xT, and you can then match it with the Schaefer parcellation.
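
For reference, since the rest of your analysis is in Python, an equivalent numpy sketch (the variable name "medial_mask" inside the .mat file is an assumption -- check it with scipy.io.whosmat -- and the dtseries file name is a placeholder):

import numpy as np
import nibabel as nib
from scipy.io import loadmat

# Time series: nibabel returns (T, 91282); transpose to grayordinates x T.
dts = nib.load("rfMRI_REST1_LR_Atlas_MSMAll.dtseries.nii").get_fdata().T
fmri_cortex = dts[:59412, :]                 # cortical grayordinates only

# Medial-wall mask from the CBIG repository (True/1 = cortical vertex kept in the 59412 set).
mask = loadmat("fs_LR_32k_medial_mask.mat")["medial_mask"].ravel().astype(bool)

fmri_full = np.zeros((64984, dts.shape[1]))  # full-surface matrix; medial wall rows stay 0
fmri_full[mask, :] = fmri_cortex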

Coalson, Timothy Scott (S&T-Student)

Nov 30, 2020, 5:49:57 PM
to Liz Izakson, hcp-...@humanconnectome.org
However, you need to be careful: if the cifti version of the Schaefer parcellation has some labeled vertices that are within the HCP medial wall, then naively parcellating will average some 0s into your parcels.  The easy way to deal with this is to also mask the label data (assuming "unlabeled" is key 0):

label_data(~medial_mask, :) = 0;

Technically, the cifti-2 format allows the vertices to be stored in any order, not just the typical ascending order, but in practice, I don't think anyone has done this.
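
Continuing the Python sketch from Ruby's step above (fmri_full and mask as defined there; the dlabel file name is a placeholder, and every parcel is assumed to keep at least one vertex after masking):

import numpy as np
import nibabel as nib

# Schaefer dlabel: nibabel returns shape (1, 64984); flatten to a label vector.
parc = nib.load("Schaefer2018_100Parcels_7Networks_order.dlabel.nii").get_fdata().ravel().astype(int)

# Zero out labels that fall inside the HCP medial wall, mirroring the MATLAB line above.
parc_masked = parc.copy()
parc_masked[~mask] = 0

# Average the time series over the vertices of each parcel (key 0 = unlabeled).
n_parcels = parc_masked.max()
parcel_ts = np.zeros((n_parcels, fmri_full.shape[1]))
for p in range(1, n_parcels + 1):
    parcel_ts[p - 1, :] = fmri_full[parc_masked == p, :].mean(axis=0)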

Tim



Liz Izakson

Dec 1, 2020, 3:02:11 AM
to HCP-Users, tsc...@mst.edu
Hi Tim,

Thank you for your comment.
So, in practice, when I loaded the HCP data I had 91,282 grayordinates, which I understood do not include the medial wall. Thus, as Ruby suggested, I inserted the medial wall into the HCP data (expanding only the first 59412 vertices, which are the cortical vertices) using the medial wall mask (https://github.com/ThomasYeoLab/CBIG/blob/master/stable_projects/brain_parcellation/Kong2019_MSHBM/lib/fs_LR_32k_medial_mask.mat). I set these additional medial-wall vertices to NaN in my data. Then I had the same dimensions in the HCP data and the Schaefer parcellation file (64984 vertices). When I parcellated the data, I averaged while ignoring the NaNs. Is this what you meant?

Thank you,
Liz


Coalson, Timothy Scott (S&T-Student)

Dec 1, 2020, 5:01:29 PM
to Liz Izakson, HCP-Users
That sounds like it should work too.  We generally avoid adding NaNs to any of our data, and use separate binary masks instead when needed.

Tim

