Prostate microscopy is one of the areas where the use of pathology image analysis is on the rise and about to become standard diagnostic practice. In this article, you will learn about the recent developments in automated prostate tissue analysis.
Find out how advanced tissue imaging and artificial intelligence methods like semantic segmentation, instance segmentation and image classification assist pathologists in the detection, diagnosis and treatment of prostate cancer. We share proven tips and tricks for the more efficient analysis of prostate tissue.
- Common prostatic disease conditions
- Prostate histology tissue samples
- Computational methods for the histologic assessment of prostate tissue samples
Common prostatic disease conditions
Prostatic disease is associated with a number of conditions. Each of these results in a different atypical gland pattern when prostate biopsies are observed under a microscope. The following conditions need to be mentioned here:
- Prostatic hyperplasia
- Prostatic intraepithelial neoplasia
- Prostatic adenocarcinoma
The most aggressive of these conditions is prostatic adenocarcinoma, which is also the most common type of prostate cancer and develops in the glands. Two variants of adenocarcinoma can be distinguished: acinar adenocarcinoma and ductal adenocarcinoma (Humphrey, 2017). Apart from the typical acinar type, observed in more than 90% of prostatic adenocarcinomas, a spectrum of morphological variants and subtypes exists (see infographic 1). These comprise variants of conventional acinar adenocarcinoma as well as cancers with unusual histological patterns, such as ductal prostate carcinoma or mucinous carcinoma (Mikuz, 2015).
Prostate histology tissue samples
Let us have a look at how prostate tissue image data is gathered in clinical settings.
Prostate specimen collection methods
Prostate biopsy is a minimally invasive procedure and a commonly used diagnostic method for detecting prostate cancer by obtaining specimens of suspicious tissue.
Using a thin needle, a number of tissue samples (biopsy cores) are collected from the prostate glandular tissue. Prostate biopsies are collected using either a transrectal or a transperineal approach. Both approaches typically rely on ultrasound-guided core needle biopsy to gather samples from different areas of the gland.
During a transrectal prostate biopsy, a biopsy gun projects a thin needle into suspicious areas of the gland through the rectum. By doing so, small sections of tissues can be collected for analysis purposes.
In the case of a transperineal biopsy, the needle is inserted through the perineal skin into the prostate (Devetzis et al., 2021; Streicher et al., 2019). If prostate cancer is detected, a prostatectomy might have to be performed, i.e. the partial or complete removal of the prostate, which also yields prostatectomy samples for further analysis. A prostatectomy may also be performed to treat benign prostatic hyperplasia (Martini & Tewari, 2019; Wilt et al., 2012).
Benign prostatic hyperplasia (BPH) is a condition in men in which the prostate gland is enlarged but not cancerous. It is described as a hyperplastic process resulting in the growth of glandular-epithelial and stromal/muscle tissue, mainly in the periurethral area of the prostate. This can cause secondary issues such as urinary problems, which, when severe enough, are usually treated with a transurethral resection of the prostate (TURP). During this surgery an instrument called a resectoscope is inserted through the penis into the urethra. With this instrument excess prostate tissue, which may be blocking urine flow, can be removed (Wilt et al., 2012; Bill-Axelson et al., 2005).
Microscopy techniques in prostate cancer diagnosis
The microscopic analysis of tissue specimens is the gold standard in cancer detection. Making a histopathological diagnosis for prostate cancer requires light microscopy assessment of the prostate tissue samples. In prostate cancer microscopy research, the tissue sections are typically hematoxylin and eosin (H&E)-stained (Humphrey, 2017).
Preparing H&E-stained samples typically involves fixation, processing, and the acquisition of prostate biopsy glass slides with an analog microscope. However, ongoing digitalization and recent developments in artificial intelligence and machine learning allow for real-time, even remote, access to images.
Imaging modalities have changed over the last few years, and thus, pathologists are able to quickly generate images, analyze them and come to an almost real-time diagnosis of prostate cancer (Rocco et al., 2021, Streicher et al., 2019).
The most commonly used microscopy imaging techniques for prostate tissue are:
- Fluorescent confocal microscopy (FCM)
- Atomic force microscopy (AFM)
- Electron microscopy (EM)
- Optical microscopy with molecular selectivity
One of the challenges in prostate cancer diagnosis is ensuring efficient patient management and avoiding unnecessary invasive procedures. For those patients who are diagnosed with cancer, it is important to distinguish between indolent disease and disease with an expected unfavorable outcome.
There is a growing body of research highlighting the importance of immunohistochemical (IHC) methods for prostate cancer prognosis and for the diagnostic confirmation of “borderline” cases, e.g., based on the presence of certain structures like basal cells. Among the most frequently used prostate cancer biomarkers are Ki-67, p53, PTEN, MYC and ERG (Carneiro et al., 2018; Devetzis et al., 2021).
IHC sample preparation can be based on fresh tissue sections, frozen tissue or paraffin-embedded, fixed tissue sections. To detect cancerous or abnormal cells, biopsy slides are stained with IHC dyes. This allows the pathologist to detect abnormalities in the sample tissue (Jakobsen et al., 2016).
However, there is still demand for novel biomarkers to improve the detection of clinically significant cases (e.g., malignancy) in various clinical settings. Furthermore, the phenomenon of therapy resistance, e.g., castration-resistant or hormone-refractory prostate cancer, needs to be addressed in future studies.
Computational methods for the histologic assessment of prostate tissue samples
Different computational microscopy image analysis methods for the detection of prostate cancer have been discussed in existing research. In general, computer aided methods for the histologic assessment of prostate tissue samples contribute to high accuracy, reproducibility and time-efficiency in the image analysis.
We will discuss three techniques for the automated analysis of prostate tissue images in the current article – semantic segmentation, instance segmentation and image classification.
The histologic grading of prostate cancer with automated methods
The detection and grading (e.g., Gleason grading) of prostate cancer require speed and objectivity, which can be achieved by applying digital image analysis methods in prostate tissue analysis.
Tips and tricks
Artificial intelligence-driven methods can help you efficiently assess the pathologic stage in images of prostatic tissue.
If the diagnosis is made manually by pathologists, this is not only a complex but also a time-consuming task. Pathologists use a grading and/or classification system to qualitatively assess the stage of tumor pathology. To reduce the workload for pathologists, an automatic classification system would be of great use.
With the help of advanced computational methods, pathologists are able to conduct fast and accurate examinations of prostatic gland tissue. Such tools give researchers the independence to draft their own automated applications tailored to the specificity of their research questions. IKOSA AI is specifically developed for training AI algorithms for bioimage analysis tasks without requiring coding or data science knowledge.
In the case of prostate cancer, disease severity is assessed based on the Gleason score, which describes different growth patterns of the tumor glands. The score is calculated by adding the two most prominent Gleason growth patterns gathered by means of prostate tissue analysis, yielding an overall score typically between 6 and 10 in clinical practice. Figures 1-4 show different Gleason grades. To date, it is the best indicator of a patient's expected disease outcome.
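The scoring arithmetic described above can be sketched in a few lines of Python. The function name is illustrative; the mapping of pattern pairs to ISUP grade groups is a widely used companion convention in clinical reporting, included here for context rather than taken from any tool discussed in this article:

```python
def gleason_score(primary: int, secondary: int) -> int:
    """Combine the two most prevalent Gleason growth patterns (each 1-5)
    into the overall Gleason score reported by the pathologist."""
    if not (1 <= primary <= 5 and 1 <= secondary <= 5):
        raise ValueError("Gleason patterns must be between 1 and 5")
    return primary + secondary

# ISUP grade groups map the clinically relevant (primary, secondary)
# pattern pairs to prognostic groups 1-5; note that 3+4 and 4+3 both
# give score 7 but carry different prognoses.
GRADE_GROUPS = {
    (3, 3): 1,                          # score 6
    (3, 4): 2,                          # score 7
    (4, 3): 3,                          # score 7
    (4, 4): 4, (3, 5): 4, (5, 3): 4,    # score 8
    (4, 5): 5, (5, 4): 5, (5, 5): 5,    # scores 9-10
}

print(gleason_score(3, 4), GRADE_GROUPS[(3, 4)])  # 7 2
print(gleason_score(4, 3), GRADE_GROUPS[(4, 3)])  # 7 3
```

The asymmetry between 3+4 and 4+3 is why the pair of patterns, not just their sum, matters for prognosis.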
However, it is also necessary to advance technology in order to adapt to clinical needs and improve patient management. Deep learning techniques are able to recognize cell types and diagnose a disease based on extracted information about cell parameters and other morphological features. Consequently, specimens can be classified as cancerous or benign, depending on the morphological characteristics found. In addition, deep learning methods allow for accurate gland detection in which gland boundaries are preserved (Ing et al., 2018). With these techniques the following patterns can be observed under a microscope (depending on the respective Gleason score):
- Histologic pattern
- Glomeruloid pattern
- Stromal pattern
- Atypical gland pattern
- Cribriform pattern
In diagnostic pathology, pathologists make a diagnosis by viewing a set of biological samples (tissue stained with different markers) and evaluating many specific features of the cellular objects (such as size, shape, color, texture, etc.). This process is an important part of clinical pathology and can be improved by providing pathologists with quantitative data gathered from the images using automated deep learning techniques (Vu et al., 2019).
Modern imaging technology captures entire slide images with a scanner and stores them in a digital format. The goal of this process is to apply image analysis technology based on machine learning to determine the presence or absence of disease such as cancer (Komura & Ishikawa, 2018).
The most common tasks in prostate tissue image analysis are the segmentation of morphological structures, such as nuclei and cells, in cancer and non-cancerous regions and the classification of image regions and whole images. These analysis tasks are vital to extracting and interpreting morphological information from digital slide images as e.g., cancer nuclei differ from normal nuclei in many ways. Thus, the quantitative characterization and the extraction of nuclear, cytoplasmic and intraluminal features, e.g., size and number of epithelial nuclei, size and number of lumina, shape of nuclei and lumina, roundness or circularity are key components of the automated analysis (Vu et al., 2019).
The uses of the image classification method in the grading of prostate cancer
One standard technique for the automated analysis of prostate tissue samples is image classification, which assigns a class label to an image or to a region within it. It can be performed with or without prior segmentation.
If a pathologist had to classify images into certain classes manually, it would take up a large amount of time. Therefore, an automated method is indispensable for assigning images of tissue specimens to classes.
One way to do this is to train an image analysis algorithm using a set of labeled sample images (supervised learning). Alternatively, the model can learn patterns from the data by itself (unsupervised learning). The final output of the image classification process assigns histologic images, or parts of them, to a specific label category.
This can be done by using a convolutional neural network (CNN). For the purposes of prostate tissue analysis, a trained deep learning network can assign the nuclei in prostate tissue images to certain classes. In a next step, the algorithm can use this classification to classify regions, e.g., glandular regions according to the Gleason score and/or their malignancy grade (Gunashekar et al., 2022). Such classification support is becoming an important component of modern diagnostic practice in pathology.
Semantic segmentation for the efficient detection of pathological patterns in prostate tissue
Another method for the automated analysis of prostate tissue images is semantic segmentation. This algorithm type is also based on deep learning. Semantic segmentation involves labeling each pixel in the image to find out to which class it belongs. Semantic segmentation performs pixel classification (local) using features of a broader area (Isaksson et al., 2017).
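The final pixel-labeling step can be sketched in plain Python: given per-class probability maps, as a segmentation network's softmax output would provide, each pixel simply receives the label of its highest-scoring class. This is an illustrative sketch of the labeling step only, not a full segmentation network:

```python
def segment(prob_maps):
    """Assign each pixel the class with the highest probability.
    prob_maps is a C x H x W nested list, e.g. softmax output of a
    semantic segmentation network (class 0 = benign, 1 = tumor, ...)."""
    n_classes = len(prob_maps)
    height, width = len(prob_maps[0]), len(prob_maps[0][0])
    return [[max(range(n_classes), key=lambda c: prob_maps[c][y][x])
             for x in range(width)] for y in range(height)]

# Two classes over a 2x2 region: the right column scores higher for tumor.
benign = [[0.9, 0.2], [0.8, 0.1]]
tumor = [[0.1, 0.8], [0.2, 0.9]]
print(segment([benign, tumor]))  # [[0, 1], [0, 1]]
```

The "features of a broader area" mentioned above enter earlier, inside the network that produces the probability maps; the per-pixel decision itself is this simple argmax.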
Those deep learning algorithms allow pathologists to assess the potential invasion into the extracellular space, lumen appearance and spacing as well as the appearance and arrangement of epithelial cells and nuclei. This is essential for determining the presence of prostate cancer and the prediction of the histological grades present in a prostate biopsy (Isaksson et al., 2017). Multiple patterns of images extracted from the digital slides of a prostate biopsy can be classified based on the Gleason grading system (Bhattacharjee et al., 2020).
Semantic segmentation is a common technique used to identify distinct pathological patterns in prostate whole slide images (WSIs). Such patterns are usually associated with a specific Gleason score. This can help researchers objectify cancer detection and distinguish between high-grade tumor regions, low-grade tumor regions and benign regions in prostate tissue images.
What is more, this automated method can help you quantitatively assess morphological features in the tumor microenvironment. Thus, you can obtain valuable information on parameters like the spatial arrangement of epithelial cells, the presence and count of prominent nucleoli, and the extent of tumor invasion (Ing et al., 2018).
Tips and tricks
If you need to delineate and grade tumor regions in prostate WSIs, semantic segmentation is the right way to go.
With the help of IKOSA AI you can develop highly efficient algorithms that can assist you in the study of different adenocarcinoma growth and invasion patterns.
Conquering common prostate tissue analysis challenges
However, there are also some challenges with regard to prostate tissue classification and segmentation in pathology research. Despite the large number of studies on image classification and segmentation, it remains a difficult task to extract and interpret information from digital slide images. Thus, there are a number of challenges and issues that need to be addressed by segmentation and classification algorithms.
First, the development of accurate and efficient algorithms for these purposes is a challenging task, because tissue morphology is complex and tumors are heterogeneous, not just in prostate tissue, but in other tissues as well. A single tissue specimen contains a variety of nuclei and other structures. Cancerous tissue in particular often contains glands and other morphological structures with atypical appearance.
Therefore, algorithms need to take this specificity into consideration and dynamically adapt to such variations (Vu et al., 2019). As a result, an algorithm can do well for one image but may not be successful for another. In such cases, using IKOSA AI to develop specialized applications for the analysis of prostate gland images offers many advantages that can help you avoid these issues:
- designing algorithms closely tailored to your research question
- fully automated algorithm training workflow
- options to fine-tune and retrain your existing algorithms
In the IKOSA Knowledge Base we offer detailed advice on how to improve the outcomes of your algorithm training and avoid the aforesaid issues.
Digital prostate slides have a high resolution and often do not fit into main or GPU memory on most machines. Consequently, image classification might not be possible for the whole image at once. This brings up the need to design an algorithm which is able to work on multiple resolutions or image tiles (Vu et al., 2019).
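A minimal sketch of such a tiling scheme is shown below. The function name and the idea of iterating read regions are illustrative assumptions; a real pipeline would pass each region to a slide reader and run inference tile by tile:

```python
def tile_grid(width, height, tile_size, overlap=0):
    """Yield (x, y, w, h) read regions that together cover a slide too
    large to load at once. Overlapping tiles reduce the risk of cutting
    glands or nuclei at tile borders."""
    step = tile_size - overlap
    for y in range(0, height, step):
        for x in range(0, width, step):
            # Edge tiles are clipped to the slide bounds.
            yield x, y, min(tile_size, width - x), min(tile_size, height - y)

# Cover a 1000 x 600 px region with 512 px tiles and 32 px overlap:
tiles = list(tile_grid(1000, 600, 512, overlap=32))
print(len(tiles))   # 6
print(tiles[0])     # (0, 0, 512, 512)
```

Per-tile predictions are then stitched back together, typically discarding the overlap margins so each pixel keeps the prediction made farthest from a tile border.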
Tips and tricks
Using ROIs as a basis for analyzing images can significantly speed up the analysis process when you are confronted with very large digital slides.
Next, nuclei often touch or overlap with each other. Semantic segmentation is therefore difficult when nuclei are clumped and often does not yield optimal results. This is where a more complex method like instance segmentation comes into play.
By using the instance segmentation method you can significantly boost algorithm performance in such cases. Instance segmentation algorithms involve unified multi-task learning processes. This allows the algorithm to not only yield an accurate probability map of cellular structures on the basis of appearance information, but also to split touching and overlapping histological objects based on contour information cues (Chen et al., 2017).
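The contour-based splitting idea of Chen et al. (2017) can be sketched in plain Python: remove the predicted contour pixels from the foreground mask, then label the remaining connected components, so that two touching nuclei separated by a contour receive distinct instance ids. The toy masks below stand in for the two output branches of a real network:

```python
from collections import deque

def label_instances(foreground, contour):
    """Label 4-connected components of (foreground minus contour) masks.
    Returns a label map (0 = background) and the number of instances."""
    h, w = len(foreground), len(foreground[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for sy in range(h):
        for sx in range(w):
            if foreground[sy][sx] and not contour[sy][sx] and not labels[sy][sx]:
                n += 1  # new instance: flood-fill its component
                labels[sy][sx] = n
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w and foreground[ny][nx]
                                and not contour[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = n
                            queue.append((ny, nx))
    return labels, n

# Two "nuclei" touching across column 2, which the contour branch marks:
fg = [[1] * 5 for _ in range(3)]
ct = [[1 if x == 2 else 0 for x in range(5)] for _ in range(3)]
labels, n = label_instances(fg, ct)
print(n)  # 2
```

In a full pipeline the removed contour pixels are usually reassigned to the nearest instance afterwards so that object boundaries are preserved.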
Tips and tricks
To separate clustered and overlapping objects in prostate gland images, applying a contour-aware segmentation method like instance segmentation is recommended.
Be sure to check our upcoming blogpost on instance segmentation in the context of multichannel fluorescent microscopy images.
Written by Elisa Opriessnig and Fanny Dobrenova
Abishek, P., Singh, S. K., & Khamparia, A. (2021). Detection of prostate cancer using deep learning framework. IOP Conference Series: Materials Science and Engineering, 1022.
Bhattacharjee, S., Prakash, D., Kim, C. H., & Choi, H. K. (2020). Multichannel Convolution Neural Network Classification for the Detection of Histological Pattern in Prostate Biopsy Images. Journal of Korea Multimedia Society, 23(12), 1486-1495.
Bill-Axelson, A., Holmberg, L., Ruutu, M., Häggman, M., Andersson, S. O., Bratell, S., … & Johansson, J. E. (2005). Radical prostatectomy versus watchful waiting in early prostate cancer. N Engl J Med, 352, 1977-1984.
Carneiro, A., Barbosa, Á. R. G., Takemura, L. S., Kayano, P. P., Moran, N. K. S., Chen, C. K., … & Bianco, B. (2018). The role of immunohistochemical analysis as a tool for the diagnosis, prognostic evaluation and treatment of prostate cancer: A systematic review of the literature. Frontiers in oncology, 8, 377.
Chen, H., Qi, X., Yu, L., Dou, Q., Qin, J., & Heng, P. A. (2017). DCAN: Deep contour-aware networks for object instance segmentation from histology images. Medical image analysis, 36, 135-146.
Komura, D., & Ishikawa, S. (2018). Machine learning methods for histopathological image analysis. Computational and structural biotechnology journal, 16, 34-42.
Devetzis, K., Kum, F., & Popert, R. (2021). Recent Advances in Systematic and Targeted Prostate Biopsies. Research and Reports in Urology, 13, 799.
Gleason, D. F. (1966). Classification of prostatic carcinomas. Cancer Chemother. Rep., 50, 125-128.
Gunashekar, D. D., Bielak, L., Hägele, L., Oerther, B., Benndorf, M., Grosu, A. L., … & Bock, M. (2022). Explainable AI for CNN-based prostate tumor segmentation in multi-parametric MRI correlated to whole mount histopathology. Radiation Oncology, 17(1), 1-10.
Humphrey, P. A. (2017). Histopathology of prostate cancer. Cold Spring Harbor Perspectives in Medicine, 7(10), a030411.
Ing, N., Ma, Z., Li, J., Salemi, H., Arnold, C., Knudsen, B. S., & Gertych, A. (2018). Semantic segmentation for prostate cancer grading by convolutional neural networks. Medical Imaging 2018: Digital Pathology, 10581, 343-355.
Iqbal, S., Siddiqui, G. F., Rehman, A., Hussain, L., Saba, T., Tariq, U., & Abbasi, A. A. (2021). Prostate cancer detection using deep learning and traditional techniques. IEEE Access, 9, 27085-27100.
Isaksson, J., Arvidsson, I., Åaström, K., & Heyden, A. (2017). Semantic segmentation of microscopic images of H&E stained prostatic tissue using CNN. 2017 International Joint Conference on Neural Networks (IJCNN), 1252-1256.
Jakobsen, N. A., Hamdy, F. C., & Bryant, R. J. (2016). Novel biomarkers for the detection of prostate cancer. Journal of Clinical Urology, 9(2_suppl), 3-10.
Martini, A., & Tewari, A. K. (2019). Anatomic robotic prostatectomy: current best practice. Therapeutic Advances in Urology, 11, 1756287218813789.
Mikuz, G. (2015). Histologic classification of prostate cancer. Anal Quant Cytopathol Histpathol, 37(1), 39-47.
Rocco, B., Sighinolfi, M. C., Sandri, M., Spandri, V., Cimadamore, A., Volavsek, M., … & Montironi, R. (2021). Digital biopsy with fluorescence confocal microscope for effective real-time diagnosis of prostate cancer: a prospective, comparative study. European Urology Oncology, 4(5), 784-791.
Salman, M. E., Çakar, G. Ç., Azimjonov, J., Kösem, M., & Cedi̇moğlu, İ. H. (2022). Automated prostate cancer grading and diagnosis system using deep learning-based Yolo object detection algorithm. Expert Systems with Applications, 201, 117148.
Streicher, J., Meyerson, B. L., Karivedu, V., & Sidana, A. (2019). A review of optimal prostate biopsy: indications and techniques. Therapeutic advances in urology, 11, 1756287219870074.
Tătaru, O. S., Vartolomei, M. D., Rassweiler, J. J., Virgil, O., Lucarelli, G., Porpiglia, F., … & Ferro, M. (2021). Artificial intelligence and machine learning in prostate cancer patient management—current trends and future perspectives. Diagnostics, 11(2), 354.
Vu, Q. D., Graham, S., Kurc, T., To, M. N. N., Shaban, M., Qaiser, T., … & Farahani, K. (2019). Methods for segmentation and classification of digital microscopy tissue images. Frontiers in Bioengineering and Biotechnology, 7, 53.
Wilt, T. J., Brawer, M. K., Jones, K. M., Barry, M. J., Aronson, W. J., Fox, S., … & Wheeler, T. (2012). Radical prostatectomy versus observation for localized prostate cancer. N Engl J Med, 367, 203-213.
To see IKOSA in full action, launch your trial subscription.