Difference in WSI resolution between QuPath/OpenSlide and each vendor's viewer


Naoko

May 16, 2017, 4:05:33 AM
to QuPath users
Hi Pete,

I'm still struggling with that MATLAB thing :( but this time I have another question to ask.

Before I learned about QuPath, I had used the Aperio platform (slide scanner, image viewer, and image analysis software) because of our institutional circumstances, so most of the scanned images I use have .tif/.svs extensions and were scanned at 20x magnification, 0.5 micrometers/pixel.

When I looked at a scanned image using QuPath and Aperio ImageScope, respectively, I noticed a slight difference in the resolution of the image (attached PDF, page 1). Nuclear contours look a bit rough in the left picture compared to the right. I am concerned that this difference may affect object detection, and I also wonder whether it is simply a compatibility issue between the file format and the corresponding viewer, or something like that.

This resolution difference doesn't seem to matter much when looking at large cells or at lower magnification. Following the instructions discussed in the ongoing thread (by Carlos), I tried Positive cell detection and adjusted each parameter value to find an optimal setting, and the result looks good for now (pages 2 and 3). Some cells failed to be detected (page 4), which I think is within an acceptable level in this case. But when I compared the generated image with that from the Aperio nuclear algorithm (pages 5 and 6), the latter seems to detect each cell more accurately, even though some false-positive cells are observed as well. I understand, of course, that there is room for improvement in my QuPath settings and that the two platforms apply different cell detection algorithms; however, doesn't this difference in resolution become an obstacle in the analysis, especially for small cells?

Generally speaking, compared to epithelial cancers such as lung, colon, and breast cancer, lymphoma cells are smaller, have narrower cytoplasm, and are tightly "packed" in histological sections, which makes it difficult to distinguish lymphoma cells from reactive lymphocytes... I'd like to manage this issue with QuPath somehow, since it's one of the few tools that make cell-by-cell analysis possible on WSI.

I extracted a part of the image from the WSI. Please see 15217M.svs.

Best regards,

Naoko


20170516qupath.pdf
15217M.svs

Pete

May 16, 2017, 5:50:59 PM
to QuPath users
Hi Naoko,

Thanks for your comments.  I looked into the resolution issue, and fortunately I think there is a straightforward explanation for it.  It comes down to this line of code:

// Reset interpolation - this roughly halves repaint times
gBuffered.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);

Since your image was acquired at 20x, viewing it at any higher magnification requires interpolation.  QuPath uses the simplest - and fastest - interpolation approach, which basically makes every pixel look like a square of a particular color as you zoom in further.  It appears from the screenshots that ImageScope is using a different interpolation method.  In your attachment, the image was viewed at x80 - a 4x zoom beyond the scanned resolution - so each ‘image pixel’ within QuPath is shown as a 4 x 4 block of 16 screen pixels with the same color and no smoothing; consequently it looks blocky.

I tested this by changing the line in QuPath and viewing the same image with the two other interpolation methods that Java offers: bilinear and bicubic.  This makes the appearance in QuPath much more similar to ImageScope.  I've attached examples.
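
For reference, all three options use the same call and differ only in the hint value (a minimal sketch; gBuffered is the same Graphics2D as in the line above):

// Nearest neighbor (QuPath's current default): fastest, but each image pixel is drawn as a flat square of one color when zoomed in
gBuffered.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);

// Bilinear: smoother appearance, slightly slower repaints
gBuffered.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);

// Bicubic: smoothest of the three, and the slowest
gBuffered.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BICUBIC);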

One reason QuPath uses nearest neighbor interpolation is speed; it results in less lag when browsing the image.  However, another reason is that QuPath is mostly focussed on analysis… and I personally strongly prefer image analysis software to display the image with minimal processing/interpolation, to help better understand its contents.  The boxes visible in QuPath make it clear that the image is zoomed in beyond the resolution actually provided by the scanner.


In any case, the cell detection in QuPath with the default settings is run at a resolution where the pixel size is 0.5 µm… this is 20x in your case, so no further magnification or interpolation is needed or applied.  I suspect that ImageScope is probably doing the same, although I don’t know for sure.  I don’t think this should impact the detection.

For small cells there may be some advantage in acquiring the image at 40x magnification; this not only adds a small amount of information to the image, but also means that any JPEG compression artefacts are included at the higher resolution.  With this in mind, even if you actually run your analysis at 20x anyway, the quality should be better because these artefacts are reduced when scaling down.  But the impact may be too subtle to be worthwhile.
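
As a rough sketch of what 'scaling down' means in practice (an illustration only, not QuPath code; tile40x is assumed to be a java.awt.image.BufferedImage holding a tile from a 40x scan):

// Illustration only: halve the resolution of a 40x tile to obtain a ~20x tile.
// Uses java.awt.Graphics2D, java.awt.RenderingHints and java.awt.image.BufferedImage.
BufferedImage downscaleToHalf(BufferedImage tile40x) {
    int w = tile40x.getWidth() / 2;
    int h = tile40x.getHeight() / 2;
    BufferedImage tile20x = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
    Graphics2D g = tile20x.createGraphics();
    // Bilinear interpolation averages neighbouring pixels while resizing,
    // which tends to soften high-frequency JPEG artefacts from the scan
    g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
    g.drawImage(tile40x, 0, 0, w, h, null);
    g.dispose();
    return tile20x;
}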


With regard to the cell detection, I would suggest reducing the ‘Minimum area’ in QuPath substantially; in your case, it looks like a value in the range 2-5 brings in many of the missed cells.  You might also try reducing the sigma value from 1.5 to 1.  And, if you would like to be able to relate QuPath’s detections more easily with the exact pixels at high magnification, you can turn off ‘Smooth boundaries’ to get a more ‘raw’ result.

In interpreting all this, one aspect of QuPath’s nucleus detection may be useful to know... or at least it may help to know what I was thinking at the time.

Simple detection very often depends upon setting an intensity threshold to detect dark/light pixels in the image.  This can form the basis of nucleus detection and give very intuitive results; however, a problem can emerge in that the exact choice of intensity threshold can have a huge influence on the results.  This is not only in terms of the number of detections, but also the measured size of the detections - because a small change in threshold might switch from detecting only the darkest part of a cell to detecting the cell and also the surrounding blur (described here).  A simple change in intensity threshold might make a cell be measured as being 2 or 3 times larger.
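
As a toy example of this effect (entirely synthetic data, not from any real image): thresholding a blurred 'nucleus' at two different intensity cut-offs gives very different measured areas.

class ThresholdAreaDemo {

    // Count pixels of a synthetic Gaussian-blurred blob (peak intensity 200, sigma 10)
    // that exceed a given intensity threshold
    static int areaAboveThreshold(double threshold) {
        int count = 0;
        int size = 101, cx = 50, cy = 50;
        double peak = 200, sigma = 10;
        for (int y = 0; y < size; y++) {
            for (int x = 0; x < size; x++) {
                double d2 = (x - cx) * (x - cx) + (y - cy) * (y - cy);
                if (peak * Math.exp(-d2 / (2 * sigma * sigma)) > threshold)
                    count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // The lower threshold includes the surrounding blur, and the measured
        // area comes out roughly 3 times larger than with the higher threshold
        System.out.println(areaAboveThreshold(100));   // darkest core only (~435 pixels)
        System.out.println(areaAboveThreshold(25));    // core plus blur (~1300 pixels)
    }
}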

This isn’t good whenever cell measurements are also important for later classifying the cell type; it makes size and intensity more tightly related than they should be.  As a result, when designing the cell detection in QuPath, I tried to reduce the trouble this can cause by making the detected boundaries effectively based on the zero crossings after Laplacian of Gaussian filtering; this has a stronger mathematical justification for matching the cell boundary.  To see this in action, if you zoom in on a suitably dark cell and run the detection with wildly different intensity thresholds, you should find that the boundary of the cell remains (usually) unchanged; the primary difference is whether the cell is detected at all.
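
To sketch the general idea (only an illustration of the technique, not QuPath's actual implementation): smooth with a Gaussian of the chosen sigma, apply a Laplacian, and take the sign changes of the response as the candidate boundary.

// Illustrative sketch of LoG zero crossings (not QuPath's actual code)
class LoGZeroCrossingSketch {

    // Smooth with a Gaussian, apply a Laplacian, then mark pixels where the
    // response changes sign; these sign changes form the candidate boundary.
    static boolean[][] logZeroCrossings(float[][] image, float sigma) {
        float[][] lap = laplacian(gaussianBlur(image, sigma));
        int h = lap.length, w = lap[0].length;
        boolean[][] boundary = new boolean[h][w];
        for (int y = 0; y < h - 1; y++)
            for (int x = 0; x < w - 1; x++)
                // A sign change between neighbouring pixels marks a zero crossing
                if (lap[y][x] * lap[y][x + 1] < 0 || lap[y][x] * lap[y + 1][x] < 0)
                    boundary[y][x] = true;
        return boundary;
    }

    // 4-neighbour Laplacian (borders left at zero)
    static float[][] laplacian(float[][] im) {
        int h = im.length, w = im[0].length;
        float[][] out = new float[h][w];
        for (int y = 1; y < h - 1; y++)
            for (int x = 1; x < w - 1; x++)
                out[y][x] = im[y - 1][x] + im[y + 1][x] + im[y][x - 1] + im[y][x + 1] - 4 * im[y][x];
        return out;
    }

    // Simple separable Gaussian blur with edge clamping
    static float[][] gaussianBlur(float[][] im, float sigma) {
        int h = im.length, w = im[0].length;
        int r = Math.max(1, (int) Math.ceil(3 * sigma));
        float[] k = new float[2 * r + 1];
        float sum = 0;
        for (int i = -r; i <= r; i++) {
            k[i + r] = (float) Math.exp(-(i * i) / (2.0 * sigma * sigma));
            sum += k[i + r];
        }
        for (int i = 0; i < k.length; i++)
            k[i] /= sum;
        float[][] tmp = new float[h][w], out = new float[h][w];
        for (int y = 0; y < h; y++)          // horizontal pass
            for (int x = 0; x < w; x++)
                for (int i = -r; i <= r; i++)
                    tmp[y][x] += k[i + r] * im[y][Math.min(w - 1, Math.max(0, x + i))];
        for (int y = 0; y < h; y++)          // vertical pass
            for (int x = 0; x < w; x++)
                for (int i = -r; i <= r; i++)
                    out[y][x] += k[i + r] * tmp[Math.min(h - 1, Math.max(0, y + i))][x];
        return out;
    }
}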

This benefit of improved boundary consistency comes at the cost of making it a bit harder to relate what is detected directly to the intensities within the image.  Because of the preprocessing involved in computing the boundaries, cells that look obvious might occasionally be missed - especially if they occur very close to cells that are stained more darkly.  On the other hand, the cells that are detected should be detected more consistently.  In fact, the localization of nucleus boundaries depends more on the sigma parameter than on the intensity threshold (although since sigma is defined in spatial units, that does at least make some sense).

Whether this is actually a good or a bad thing is up for debate; I suspect it depends on the image, and it’s very hard to compare algorithms with multiple parameters fairly.  I also don't know anything about how the ImageScope algorithm works or performs.  But I thought that perhaps it would be useful to know a bit more of the rationale behind the method in QuPath.

In any case, I think there is definitely room for improving the cell detection in QuPath, or adding in new detection algorithms for different types of images.  There are already two distinct methods for cell detection; given the ability to add extensions, I hope that more will be made available.

Best wishes,

Pete
Interp_bicubic.jpg
Interp_bilinear.jpg
Interp_nearest_neighbor_default.jpg

Naoko

May 23, 2017, 8:58:33 PM
to QuPath users
Hi Pete,

Thank you so much for your clear and thorough explanation of the resolution issue. Now I understand how interpolation works and how it affects what we see in a slide viewer, and I should read your gitbook more to understand the rationale.
Also, I will try analyzing images scanned at 40x magnification as well, to compare the cell detection results.
Thanks again, and I'm sorry for my late reply to your answer.

Best wishes,
Naoko

micros...@gmail.com

May 24, 2017, 9:11:16 AM
to QuPath users
If you still want to pick up more of the faint detections, another thing to try is setting the Background radius to 0, or dropping the background intensity threshold to something much closer to your detection threshold.  I frequently have to apply other methods afterwards to eliminate false detections, but at least I usually have a pool of detections that includes everything I might want.