Beginner asking for opinion: automated tissue separation & percentage of positive cells, is this possible?


Marika

Jun 14, 2018, 9:33:02 AM6/14/18
to QuPath users

Hello!

I'm a PhD student with almost zero experience using image analysis software, so I'd like to ask your opinion before spending hours and hours trying to do something that may be impossible. My aim is to compare endometrial tissue samples from 2 patient groups. I have about 140 samples and 4 different H-DAB nuclear stainings (Ki67 & immune cells). As the amount of epithelium (both glandular and surface) varies a lot between samples and therefore affects the results, since stroma and epithelium are very different, I need to separate the epithelium from the rest of the sample. I've done stromal analysis on a smaller sample set earlier with ImageJ, quite manually, but now with scanned WSIs and far more samples that seems an almost impossible, huge task.

What do you think: would I be able to teach QuPath to separate epithelium from stroma, then count the percentage of positive cells in the tissue, and do the same for all samples automatically as a project? I would be really happy if there were a YouTube tutorial for this kind of procedure, or at least a list of the steps needed, but even hearing your opinion on whether this is feasible would be nice. I've added 2 example pictures from the same sample.

With kind regards,
Marika

close-example_WSI_endometrium_Ki67.png
example_WSI_endometrium_Ki67.png

micros...@gmail.com

Jun 14, 2018, 11:05:56 AM6/14/18
to QuPath users
It "might" be possible with a combination of SLICs (superpixels), texture analysis, and cell detection... but it is hard to tell before you try. Much of it will come down to how consistent the staining and tissue sections are relative to each other. Many things are possible with scripting, though.

Pete

Jun 15, 2018, 2:24:13 AM6/15/18
to QuPath users
I agree it might be possible - and, if not, it might become possible if you slightly change the definition of what you want to do.

I'd recommend working through the early sections of the QuPath wiki, which include step-by-step examples for a couple of specific images.  By the time you reach https://github.com/qupath/qupath/wiki/Detecting-objects it should be possible to do the analysis you describe, provided you are willing to manually annotate all the different regions you want to analyze.  That would definitely be laborious if you have 4 x 140 slides, but it is the place to start to get used to the software - and it can potentially generate meaningful results even on a small subset of the data.

The next step would be to look at annotating large regions, detecting all the cells inside, and then classifying them: https://github.com/qupath/qupath/wiki/Classifying-objects
This is the other way around from the process you originally described: rather than teaching QuPath to find the regions containing particular cell types and then detecting the cells, you detect the cells first and then teach QuPath to split them according to cell type.  Then you can get the percentage of positive cells for each cell type.
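For later, once the manual annotation is working, that same order of steps can also be scripted so it runs across a project. A rough sketch in QuPath's Groovy scripting language - the plugin class name matches QuPath v0.1.2's built-in watershed cell detection, but the parameter values below are illustrative assumptions you would first tune in the dialog, not a tested recipe:

```groovy
// Sketch only - parameter values are placeholders to adjust for your images.
// Select the annotations to process, then run cell detection inside them.
selectAnnotations()
runPlugin('qupath.imagej.detect.nuclei.WatershedCellDetection',
    '{"requestedPixelSizeMicrons": 0.5, "threshold": 0.1, "maxBackground": 2.0}')

// A simple positive/negative split based on DAB staining intensity;
// the 0.2 threshold is an assumed value, not a recommendation.
setCellIntensityClassifications('Nucleus: DAB OD mean', 0.2)
```

Saved as a script, this kind of thing can then be run for every image in a project rather than repeated by hand.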

There are other options in QuPath (e.g. texture analysis, or sending regions to ImageJ and turning the analysis into an ImageJ task) - but they will require considerably more effort, and may give worse results rather than better.  If this is a long-term project, then it's possible that features added to QuPath over the next 6-12 months could also help.

Marika

Jun 15, 2018, 10:13:30 AM6/15/18
to QuPath users
Thank you for your answers! My samples vary quite a lot, so they may need to be annotated fairly manually. I could perhaps choose smaller regions from each sample and annotate those by hand. I might get help drawing the annotations, as it seems pretty easy to check and correct them if needed. Now, as I'm learning to use QuPath, some questions arise:

-Isn't there an undo button in QuPath? If you accidentally move an annotation, you can't just click undo, but have to move it back into place yourself?

-Should I be able to draw 1 big annotation around the region I want to analyze and then draw smaller annotations (epithelium) inside it? Simple tissue detection gives me an area with which this works fine, but if I draw a polygon, for example, I can't draw anything inside it. Am I missing something? I managed to annotate the epithelium first and then draw a bigger, wanted area around it, but when I tried to detect positive cells from the annotations, all the epithelium annotations just disappeared without warning.
    -And if I use simple tissue detection first, can I then correct the resulting annotation if I don't want some of the areas it includes? Otherwise it seems a pretty easy and useful function.

-If I annotate the whole area I want to analyze and then mark the epithelium, can I somehow select the wanted area without the epithelium (i.e. the wanted-area annotation minus the epithelium annotations)? I think that could be accurate enough to represent the stroma.

-If I tried a detection classifier and it worked almost perfectly, could I just pick out the wrong detections cell by cell and change their classification myself?

-Can I select all annotations of the same class from the annotation list at once, or do I have to scroll down the list and pick them all separately?

-Is it the case that the summary table only shows each annotation separately, so I can't see the number of positive cells across all annotations of the same class? Of course that can be computed in Excel afterwards, so it is not a big problem.

The QuPath wiki is very nice and quite clear, thank you for that! I've tried to find answers there, but couldn't, so I had to ask.

Pete

Jun 15, 2018, 10:36:22 AM6/15/18
to QuPath users
-Isn't there an undo button in QuPath? If you accidentally move an annotation, you can't just click undo, but have to move it back into place yourself?

I'm afraid not; it's actually really hard to create an 'undo' function that works with such large images and (potentially) large numbers of objects without making performance terrible.  There's some more info at https://github.com/qupath/qupath/issues/75#issuecomment-300294512

However I have worked on it a bit.  It's not part of QuPath v0.1.2, but you can try out a provisional 'undo' function (along with various other improvements) if you follow the instructions at the end of this: https://petebankhead.github.io/qupath/2018/03/19/qupath-updates.html

Eventually it will appear in a future QuPath release, but I'm not sure yet when that will be... I hope before the end of the year.

-Should I be able to draw 1 big annotation around the region I want to analyze and then draw smaller annotations (epithelium) inside it? Simple tissue detection gives me an area with which this works fine, but if I draw a polygon, for example, I can't draw anything inside it. Am I missing something? I managed to annotate the epithelium first and then draw a bigger, wanted area around it, but when I tried to detect positive cells from the annotations, all the epithelium annotations just disappeared without warning.

Yes - if you detect cells inside a region, QuPath will automatically delete any other objects inside that region.  The alternative would cause even more problems... for example, if you ran cell detection on a region twice, you could end up counting every cell twice.  So cell detection will always clear away existing objects.

This does place limitations on the order in which you need to annotate regions & detect cells.

You should be able to draw polygons inside larger annotations, but not necessarily use the brush or wand tool... unless the annotation you are drawing inside is 'locked': https://github.com/qupath/qupath/issues/50#issuecomment-280862813

    -And if I use simple tissue detection first, can I then correct the resulting annotation if I don't want some of the areas it includes? Otherwise it seems a pretty easy and useful function.

Yes, you can edit the region - but you need to unlock it first.  See https://groups.google.com/forum/#!msg/qupath-users/11FqZKeCA2s/xDX9KJghBAAJ

-If I annotate the whole area I want to analyze and then mark the epithelium, can I somehow select the wanted area without the epithelium (i.e. the wanted-area annotation minus the epithelium annotations)? I think that could be accurate enough to represent the stroma.

Yes... maybe not as easily as it should be.
If you select all your epithelial annotations, you can merge them with Objects -> Merge selected annotations.
Then with that now-single epithelial annotation selected, you can choose Objects -> Make inverse annotation to get an annotation for 'everything outside'.
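If you later want to script those two menu steps, something along these lines may work. The method names (mergeSelectedAnnotations, makeInverseAnnotation) are taken from QuPath's scripting interface, but check them against your version, as the scripting API was still changing at this point; 'Epithelium' is just an example class name:

```groovy
// Sketch, assuming your epithelial annotations are classified as 'Epithelium'.
def epithelium = getAnnotationObjects().findAll {
    it.getPathClass() == getPathClass('Epithelium')
}
// Select them and merge into a single annotation...
getCurrentHierarchy().getSelectionModel().setSelectedObjects(epithelium, null)
mergeSelectedAnnotations()
// ...then create the 'everything outside' annotation, i.e. an approximation of stroma
// (equivalent to Objects -> Make inverse annotation).
makeInverseAnnotation(getSelectedObject())
```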

-If I tried a detection classifier and it worked almost perfectly, could I just pick out the wrong detections cell by cell and change their classification myself?

I'm afraid not... at least not directly.  I've been asked about this once or twice, but I'm not sure I like the idea of making this functionality easily accessible.  Part of the point of the detection classifier is to make the analysis reproducible, and to enable logging of the specific steps.  If you can then easily override it, that could cause confusion.

On the other hand, it shouldn't be impossible... and with a script most things are possible.  Here's one that sets the classification of every cell (detection) to match the classification of any parent annotation:

// Assign parent classification to every detection, if not null
def detections = getDetectionObjects()
for (detection in detections) {
    def parentClass = detection.getParent()?.getPathClass()
    if (parentClass != null)
        detection.setPathClass(parentClass)
}
fireHierarchyUpdate()

If you run it after your detection classifier, it potentially gives you a way to make this change, overriding whatever the detection classifier did.
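On the same theme, a minimal variation that reclassifies just the cell currently selected in the viewer ('Positive' is an example class name, and getSelectedObject/setPathClass are standard QuPath scripting calls):

```groovy
// Sketch: manually override the classification of the currently selected detection.
def cell = getSelectedObject()
if (cell != null && cell.isDetection()) {
    cell.setPathClass(getPathClass('Positive'))
    fireHierarchyUpdate()
}
```

Bear in mind the caveat above: overriding a classifier by hand makes the analysis harder to reproduce.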

-Can I select all annotations of the same class from the annotation list at once, or do I have to scroll down the list and pick them all separately?

With a script again... here's one to select all the annotations with the classification 'Tumor':

def pathClass = getPathClass('Tumor')
selectObjects {it.isAnnotation() && it.getPathClass() == pathClass}

-Is it the case that the summary table only shows each annotation separately, so I can't see the number of positive cells across all annotations of the same class? Of course that can be computed in Excel afterwards, so it is not a big problem.

No - you just get the numbers split according to the classifications of the detections themselves, not the annotations they are inside.

But the script above that sets the parent classification may help.  Since this will lose any intensity classification, you might need an extra line to reapply it, e.g.

setCellIntensityClassifications('Nucleus: DAB OD mean', 0.2)

Best wishes,

Pete

micros...@gmail.com

Jun 15, 2018, 10:39:14 AM6/15/18
to QuPath users
Undo is in v0.1.3; you can find how to build and use it here: https://petebankhead.github.io/qupath/2018/03/19/qupath-updates.html
The closest you can get in v0.1.2 is constant use of Ctrl+S and Ctrl+R to save and revert.

Annotations can be drawn within other annotations once you lock the first annotation.  If it is unlocked, drawing inside would be the same as filling in holes in the first annotation.  Tissue detection is the other way around, as its annotation starts locked.

Classifying annotations is one way to determine which annotations are used for certain actions.  There are quite a few scripts on the forums where this is done.  In v0.1.3, you can use the Shift key with the brush tool to fill in up to, but not overlapping, another annotation.

You can manually change cell classifications by script (I think I have one somewhere in my Gists, not sure), but it is probably not a good idea if you intend to publish results.

I would recommend looking through a few of the scripts on the forums for much of what you want to do, or through Pete's Gist or mine.  It sounds like you will want to do quite a bit of scripting.
Specifically, selecting annotations with a class:
selectObjects { p -> p.getPathClass() == getPathClass("Stroma") && p.isAnnotation() }

There is no whole-slide summary table by default.  There are scripts to generate whole-slide results... but then you lose the information about which annotation the results came from (for example, epithelium vs stroma).
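As a rough sketch of what such a script might total up per annotation class (getAnnotationObjects and getPathClass are standard QuPath scripting calls; the 'Positive' name check is an assumption about how your cells end up classified):

```groovy
// Sum cell counts per annotation class in the current image (sketch only).
def counts = [:].withDefault { 0 }
for (annotation in getAnnotationObjects()) {
    def cls = annotation.getPathClass()?.toString() ?: 'Unclassified'
    def cells = annotation.getChildObjects().findAll { it.isDetection() }
    counts[cls + ' total'] += cells.size()
    counts[cls + ' positive'] += cells.count {
        it.getPathClass()?.toString()?.contains('Positive')
    }
}
println(counts)
```

Printed totals like these can then be copied out, or written to a file per image, instead of summing annotation rows in Excel.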

For generating files per project, you can combine:


Marika

Jun 18, 2018, 5:16:33 AM6/18/18
to QuPath users
Thank you all for your answers, I appreciate them very much! There is quite a lot here to think about, try, and learn, but now I have some ideas of how to proceed and what to expect.