Researchers Found Out That AI Significantly Improves Gleason Grading of Prostate Biopsies by Pathologists
This research summary is just one of many that are distributed weekly on the AI scholar newsletter. To start receiving the weekly newsletter, sign up here.
The Gleason score is the grading system used to determine the aggressiveness of prostate cancer. However, studies have shown that Gleason grading suffers from significant inter- and intra-observer variability. While specialized uropathologists show higher concordance rates, such expertise is not always available.
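To make the grading scheme concrete, here is a minimal sketch (not from the paper) of the standard mapping from the two most prevalent Gleason growth patterns in a biopsy to a Gleason score and the corresponding ISUP grade group used for risk stratification; the function name and interface are illustrative.

```python
def gleason_grade_group(primary: int, secondary: int) -> int:
    """Return the ISUP grade group (1-5) for a primary + secondary Gleason pattern."""
    score = primary + secondary
    if score <= 6:
        return 1  # Gleason <=6 (e.g. 3+3): lowest-risk group
    if score == 7:
        # 3+4 and 4+3 both sum to 7 but carry different risk,
        # so the dominant pattern decides the group
        return 2 if primary == 3 else 3
    if score == 8:
        return 4  # Gleason 8 (e.g. 4+4)
    return 5      # Gleason 9-10: highest-risk group
```

For example, a biopsy graded 3+4 falls in grade group 2, while 4+3 falls in grade group 3, which is part of why consistent pattern assessment between observers matters.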
AI systems based on deep learning have achieved pathologist-level performance in Gleason grading. Recently, researchers investigated whether pathologists supported by a deep learning system can improve their Gleason grading of prostate biopsies. Read on to see what they found.
Improving Gleason Grading of Prostate Biopsies by Pathologists with Artificial Intelligence
Deep learning systems have achieved high performance on diagnostic tasks and can be viewed as a new tool for pathologists to use in their diagnostic process.
Previously, researchers developed a fully automated deep learning system for grading prostate cancer that achieved pathologist-level performance, both in determining the grade group and in stratifying patients into relevant risk categories. During validation, the system was compared against a panel of pathologists and outperformed 10 out of 15 observers in determining the grade group.
In this new paper, a group of researchers presents what they describe as the first investigation of its kind into the value of AI assistance for histological tumor grading. They compare the diagnostic performance of pathologists with and without the assistance of the deep learning system.
The Results: Their work shows that integrating AI feedback into the diagnostic process improves pathologists' performance, and that the synergy between pathologist and AI system achieves the best performance overall.
In the unassisted first read, the AI system outperformed 10 out of 14 pathologists; in the AI-assisted second read, this dropped to only 5 out of 14. Pathologists assisted by the AI system not only improved over their unassisted reads but also achieved a higher median performance than the standalone AI.
Potential Uses and Effects
These results demonstrate an added benefit of pathologists using AI assistance as a supportive tool during diagnosis. In geographic regions where pathologists are scarce, such AI systems could help pathologists achieve higher grading accuracy and consistency.