Clinical Validation of Artificial Intelligence–Augmented Pathology Diagnosis Demonstrates Significant Gains in Diagnostic Accuracy in Prostate Cancer Detection

Patricia Raciti, MD; Jillian Sue, MS; Juan A. Retamero, MD; Rodrigo Ceballos, MSc; Ran Godrich, MS; Jeremy D. Kunz, MSc; Adam Casson, BS; Dilip Thiagarajan, MS; Zahra Ebrahimzadeh, MSc; Julian Viret, MEng; Donghun Lee, MEng; Peter J. Schüffler, DrSc; George DeMuth, MS; Emre Gulturk, MSc; Christopher Kanan, PhD; Brandon Rothrock, PhD; Jorge Reis-Filho, MD, PhD, FRCPath; David S. Klimstra, MD; Victor Reuter, MD; Thomas J. Fuchs, DrSc

Independent real-world application of a clinical-grade automated prostate cancer detection system

Leonard M da Silva, Emilio M Pereira, Paulo G O Salles, Ran Godrich, Rodrigo Ceballos, Jeremy D Kunz, Adam Casson, Julian Viret, Sarat Chandarlapaty, Carlos Gil Ferreira, Bruno Ferrari, Brandon Rothrock, Patricia Raciti, Victor Reuter, Belma Dogdas, George DeMuth, Jillian Sue, Christopher Kanan, Leo Grady, Thomas J Fuchs, Jorge S Reis-Filho

Keywords: artificial intelligence; deep learning; diagnosis; histopathology; machine learning; prostate cancer; screening

Artificial intelligence (AI)-based systems applied to histopathology whole-slide images have the potential to improve patient care through mitigation of challenges posed by diagnostic variability, histopathology caseload, and shortage of pathologists. We sought to define the performance of an AI-based automated prostate cancer detection system, Paige Prostate, when applied to independent real-world data. The algorithm was employed to classify slides into two categories: benign (no further review needed) or suspicious (additional histologic and/or immunohistochemical analysis required). We assessed the sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs) of a local pathologist, two central pathologists, and Paige Prostate in the diagnosis of 600 transrectal ultrasound-guided prostate needle core biopsy regions (‘part-specimens’) from 100 consecutive patients, and ascertained the impact of Paige Prostate on diagnostic accuracy and efficiency. Paige Prostate displayed high sensitivity (0.99; CI 0.96-1.0), NPV (1.0; CI 0.98-1.0), and specificity (0.93; CI 0.90-0.96) at the part-specimen level. At the patient level, Paige Prostate displayed optimal sensitivity (1.0; CI 0.93-1.0) and NPV (1.0; CI 0.91-1.0) at a specificity of 0.78 (CI 0.64-0.89).
The 27 part-specimens considered by Paige Prostate as suspicious, whose final diagnosis was benign, were found to comprise atrophy (n = 14), atrophy and apical prostate tissue (n = 1), apical/benign prostate tissue (n = 9), adenosis (n = 2), and post-atrophic hyperplasia (n = 1). Paige Prostate resulted in the identification of four additional patients whose diagnoses were upgraded from benign/suspicious to malignant. Additionally, this AI-based test provided an estimated 65.5% reduction of the diagnostic time for the material analyzed. Given its optimal sensitivity and NPV, Paige Prostate has the potential to be employed for the automated identification of patients whose histologic slides could forgo full histopathologic review. In addition to providing incremental improvements in diagnostic accuracy and efficiency, this AI-based system identified patients whose prostate cancers were not initially diagnosed by three experienced histopathologists. © 2021 The Authors. The Journal of Pathology published by John Wiley & Sons, Ltd. on behalf of The Pathological Society of Great Britain and Ireland.
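The sensitivity, specificity, PPV, and NPV figures reported above all derive from standard confusion-matrix arithmetic. A minimal sketch of that calculation follows; the counts used are hypothetical illustrations, not the study's actual data:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute standard diagnostic-accuracy metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true positives detected
        "specificity": tn / (tn + fp),  # fraction of true negatives detected
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration only (not taken from the study):
metrics = diagnostic_metrics(tp=50, fp=11, tn=39, fn=0)
print(metrics["sensitivity"], metrics["specificity"])  # → 1.0 0.78
```

Note that with zero false negatives, sensitivity and NPV are both exactly 1.0, which is why a highly sensitive screen can be used to rule slides out of full review.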

Validation of a digital pathology system including remote review during the COVID-19 pandemic

Modern Pathology

Matthew G. Hanna, Victor E. Reuter, Orly Ardon, David Kim, Sahussapont Joseph Sirintrapun, Peter J. Schüffler, Klaus J. Busam, Jennifer L. Sauter, Edi Brogi, Lee K. Tan, Bin Xu, Tejus Bale, Narasimhan P. Agaram, Laura H. Tang, Lora H. Ellenson, John Philip, Lorraine Corsale, Evangelos Stamelos, Maria A. Friedlander, Peter Ntiamoah, Marc Labasin, Christine England, David S. Klimstra, Meera Hameed

Remote digital pathology allows healthcare systems to maintain pathology operations during public health emergencies. Existing Clinical Laboratory Improvement Amendments (CLIA) regulations require pathologists to electronically verify patient reports from a certified facility. During the COVID-19 pandemic, caused by the SARS-CoV-2 virus, this requirement potentially exposes pathologists, their colleagues, and household members to the risk of becoming infected. Relaxation of government enforcement of this regulation allows pathologists to review and report pathology specimens from a remote, non-CLIA certified facility. The availability of digital pathology systems can facilitate remote microscopic diagnosis, although formal comprehensive (case-based) validation of remote digital diagnosis has not been reported. All glass slides representing routine clinical signout workload in surgical pathology subspecialties at Memorial Sloan Kettering Cancer Center were scanned on an Aperio GT450 at ×40 equivalent resolution (0.26 µm/pixel). Twelve pathologists from nine surgical pathology subspecialties remotely reviewed and reported complete pathology cases using a digital pathology system from a non-CLIA certified facility through a secure connection. Whole slide images were integrated into the laboratory information system and launched within a custom vendor-agnostic whole slide image viewer.
Remote signouts utilized consumer-grade computers and monitors (monitor size, 13.3–42 in.; resolution, 1280 × 800–3840 × 2160 pixels) connecting to an institutional clinical workstation via secure virtual private network. Pathologists subsequently reviewed all corresponding glass slides using a light microscope within the CLIA-certified department. Intraobserver concordance metrics included reporting elements of top-line diagnosis, margin status, lymphovascular and/or perineural invasion, pathology stage, and ancillary testing. The median whole slide image file size was 1.3 GB; scan time per slide averaged 90 s; and scanned tissue area averaged 612 mm2. Signout sessions included a total of 108 cases, comprised of 254 individual parts and 1196 slides. Major diagnostic equivalency was 100% between digital and glass slide diagnoses, and overall concordance was 98.8% (251/254). This study reports validation of primary diagnostic review and reporting of complete pathology cases from a remote site during a public health emergency. Our experience shows high (100%) intraobserver digital to glass slide major diagnostic concordance when reporting from a remote site. This randomized, prospective study successfully validated remote use of a digital pathology system, demonstrated the operational feasibility of remote review and reporting of pathology specimens, and evaluated remote access performance and usability for remote signout.
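As a quick sketch, the overall concordance figure reported above (98.8% from 251 concordant of 254 parts) is simple agreement arithmetic:

```python
def concordance_rate(concordant: int, total: int) -> float:
    """Fraction of cases with digital/glass-slide agreement, as a percentage."""
    return 100.0 * concordant / total

# Counts taken from the reported results: 251 of 254 individual parts concordant.
rate = concordance_rate(251, 254)
print(f"{rate:.1f}%")  # → 98.8%
```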