Academic journal article

Independent real-world application of a clinical-grade automated prostate cancer detection system
Document Type
Report
Source
Journal of Pathology. June 2021, Vol. 254 Issue 2, p147, 12 p.
Subject
Artificial intelligence
Cancer diagnosis
Machine learning
Cancer screening
Detection equipment
Prostate cancer
Histochemistry
Cancer -- Diagnosis
Detectors
Language
English
ISSN
0022-3417
Abstract
Keywords: artificial intelligence; histopathology; diagnosis; screening; prostate cancer; deep learning; machine learning

Abstract: Artificial intelligence (AI)-based systems applied to histopathology whole-slide images have the potential to improve patient care by mitigating the challenges posed by diagnostic variability, histopathology caseload, and the shortage of pathologists. We sought to define the performance of an AI-based automated prostate cancer detection system, Paige Prostate, when applied to independent real-world data. The algorithm was employed to classify slides into two categories: benign (no further review needed) or suspicious (additional histologic and/or immunohistochemical analysis required). We assessed the sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs) of a local pathologist, two central pathologists, and Paige Prostate in the diagnosis of 600 transrectal ultrasound-guided prostate needle core biopsy regions ('part-specimens') from 100 consecutive patients, and sought to ascertain the impact of Paige Prostate on diagnostic accuracy and efficiency. Paige Prostate displayed high sensitivity (0.99; CI 0.96-1.0), NPV (1.0; CI 0.98-1.0), and specificity (0.93; CI 0.90-0.96) at the part-specimen level. At the patient level, Paige Prostate displayed optimal sensitivity (1.0; CI 0.93-1.0) and NPV (1.0; CI 0.91-1.0) at a specificity of 0.78 (CI 0.64-0.89). The 27 part-specimens classified as suspicious by Paige Prostate but given a final benign diagnosis comprised atrophy (n=14), atrophy and apical prostate tissue (n=1), apical/benign prostate tissue (n=9), adenosis (n=2), and post-atrophic hyperplasia (n=1). Paige Prostate resulted in the identification of four additional patients whose diagnoses were upgraded from benign/suspicious to malignant. Additionally, this AI-based test provided an estimated 65.5% reduction of the diagnostic time for the material analyzed. Given its optimal sensitivity and NPV, Paige Prostate has the potential to be employed for the automated identification of patients whose histologic slides could forgo full histopathologic review. In addition to providing incremental improvements in diagnostic accuracy and efficiency, this AI-based system identified patients whose prostate cancers were not initially diagnosed by three experienced histopathologists. © 2021 The Authors. The Journal of Pathology published by John Wiley & Sons, Ltd. on behalf of The Pathological Society of Great Britain and Ireland.

Article Note: Conflict of interest statement: RG, RC, JDK, AC, JV, CK, BR, PR, BD, JS and LG are employed by Paige, Inc and have equity in the company. SC is a consultant for Paige, Novartis, Lilly, Sermonix, and BMS, and has research funds to the institution from Daiichi Sankyo, Sanofi, Lilly, Novartis, and Genentech. GD received financial compensation to complete the statistical analysis. TJF is the Chief Scientific Officer of Paige and a co-founder and equity holder of Paige. JSR-F is a consultant for Goldman Sachs, Paige, REPARE Therapeutics, and Eli Lilly; a member of the scientific advisory board of Volition RX, Paige, and REPARE Therapeutics; a member of the Board of Directors of Grupo Oncoclinicas; an ad hoc member of the scientific advisory board of Ventana Medical Systems/Roche Tissue Diagnostics, Roche, Genentech, Novartis, InVicro, and GRAIL; and an Associate Editor of The Journal of Pathology. MSKCC has financial interests in Paige.AI and intellectual property interests relevant to the work that is the subject of this study. No other conflicts of interest were disclosed.

CAPTION(S):
Supplementary materials and methods
Table S1. REMARK guidelines checklist
Table S2. Performance comparison between the local pathologists, the two independent central pathologists, the consensus of the central pathologists, and Paige Prostate
Table S3. Consensus of the performance of the central pathologists by assessment mode (without and with Paige Prostate)

Byline: Leonard M Silva, Emilio M Pereira, Paulo GO Salles, Ran Godrich, Rodrigo Ceballos, Jeremy D Kunz, Adam Casson, Julian Viret, Sarat Chandarlapaty, Carlos Gil Ferreira, Bruno Ferrari, Brandon Rothrock, Patricia Raciti, Victor Reuter, Belma Dogdas, George DeMuth, Jillian Sue, Christopher Kanan, Leo Grady, Thomas J Fuchs, Jorge S Reis-Filho
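Note on the reported metrics: the sensitivity, specificity, PPV, and NPV figures quoted in the abstract are standard functions of a 2x2 confusion matrix comparing the system's benign/suspicious calls against the final histopathologic diagnosis. The minimal Python sketch below illustrates how such figures are computed; the function name and the example counts are illustrative assumptions for this note, not data reported by the study.

# Minimal illustrative sketch: deriving screening metrics of the kind quoted
# in the abstract from 2x2 confusion-matrix counts of "suspicious" vs.
# "benign" calls against the final diagnosis. The function name and the
# example counts are hypothetical and are not the study's data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard screening metrics from true/false positive/negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # malignant parts flagged as suspicious
        "specificity": tn / (tn + fp),  # benign parts correctly called benign
        "ppv": tp / (tp + fp),          # suspicious calls that are truly malignant
        "npv": tn / (tn + fn),          # benign calls that are truly benign
    }

# Example with made-up counts:
print(diagnostic_metrics(tp=90, fp=10, fn=1, tn=199))

The same formulas apply at either level of analysis in the study, whether the unit counted is a part-specimen (600 in total) or a patient (100 in total).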