Visualization of Whole Slide Histological Images with Automatic Tissue Type Recognition
Published in: Pattern Recognition and Image Analysis, Vol. 32, No. 3, pp. 483-488
Main Authors:
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing, 01-09-2022 (Springer Nature B.V.)
Summary: The use of modern approaches based on convolutional neural networks (CNNs) for the segmentation of whole slide images (WSIs) helps pathologists obtain more stable, quantitative analysis results and improves the objectivity of diagnosis. However, working with WSIs is difficult because of their resolution and size. To address this problem, this paper presents, for the first time, PathScribe, a new universal cross-platform cloud-based tool for conveniently viewing and manipulating large collections of WSIs on almost any device, including tablets and smartphones. The paper also considers the important problem of automatic tissue type recognition on WSIs and presents the WSS1 and WSS2 subsets of the PATH-DT-MSU dataset, a collection of high-quality WSIs of digestive tract tumors with tissue type area annotations. A new CNN-based method for automatic tissue type recognition on WSIs is also proposed; it achieved 0.929 accuracy on the CRC-VAL-HE-7K dataset (9 classes) and 0.97 accuracy on the PATH-DT-MSU WSS1 and WSS2 datasets (5 classes). The developed method can classify areas corresponding to the stomach's own mucous glands in the lamina propria and distinguish the tubular structures of a highly differentiated gastric adenocarcinoma from normal glands.
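The summary notes that WSIs are hard to process directly because of their resolution and size; CNN-based tissue classifiers of the kind described typically tile the slide into fixed-size patches and classify each patch independently. Below is a minimal NumPy-only sketch of that tiling step; the 224-pixel patch size and stride are common CNN-input conventions assumed for illustration, not parameters taken from the paper.

```python
import numpy as np

def tile_slide(slide: np.ndarray, patch: int = 224, stride: int = 224):
    """Yield (row, col, tile) triples covering an H x W x 3 slide array.

    Each tile is a patch x patch crop; a CNN classifier would then be
    applied to each tile, and the per-tile labels assembled into a
    tissue-type map of the whole slide. Patch size and stride here are
    illustrative assumptions, not values from the paper.
    """
    h, w = slide.shape[:2]
    for r in range(0, h - patch + 1, stride):
        for c in range(0, w - patch + 1, stride):
            yield r, c, slide[r:r + patch, c:c + patch]

# Example: a synthetic 1000 x 1000 "slide" yields a 4 x 4 grid of tiles.
slide = np.zeros((1000, 1000, 3), dtype=np.uint8)
tiles = list(tile_slide(slide))
print(len(tiles))  # -> 16 non-overlapping 224 x 224 tiles
```

In practice a real WSI is far too large to hold in memory at once, so production tools read each region lazily from the slide file rather than slicing a preloaded array as this sketch does.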
ISSN: 1054-6618, 1555-6212
DOI: 10.1134/S1054661822030208