News

November 2, 2022

Dr. Perincheri of Yale Medicine Shares Live Assessment of Paige Prostate on Core Biopsies

Dr. Sudhir Perincheri, Assistant Professor of Pathology and Director of Digital Pathology at Yale Medicine, spearheaded a study of Paige Prostate on prostate biopsies secured, processed, and independently diagnosed at Yale Medicine. At Pathology Visions 2022, he offered a fascinating presentation on the details of this study, along with additional insights into how bringing artificial intelligence (AI) tools into the clinic could impact pathology practice.

Dr. Perincheri started the session with a dive into the study design. A total of 1,876 prostate core biopsies from Yale were included. Each core was cut into 5 levels; levels 1, 3, and 5 were stained with hematoxylin and eosin (H&E), while levels 2 and 4 were left unstained. Each case was reviewed by a resident or fellow and a specialist genitourinary pathologist to arrive at a final diagnosis, which served as the ground truth. The diagnostic categories represented in the study were carcinoma (typically adenocarcinoma variants), high-grade prostatic intraepithelial neoplasia (HG-PIN) with adjacent small atypical glands (PIN-ATYP), atypical small acinar proliferation (ASAP)/focal glandular atypia (FGA), atypical intraductal proliferation, and benign.

Level 3 of each core biopsy was scanned, stripped of identifiers, and then reviewed with Paige Prostate*. The AI was applied without site-specific tuning or adjustment, a core differentiator between this study and others the Yale team was undertaking at the time. This was critical for assessing how generalizable the algorithm is and helped the team draw conclusions about how it might be implemented in clinical practice. Paige Prostate* read each slide and classified cores as “suspicious” (for carcinoma, PIN-ATYP, or ASAP) or “non-suspicious” (not containing those lesions). The output of the AI was compared to the clinically rendered diagnoses for concordance, and any discordant cases were then subjected to further analysis.
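As a rough illustration of this comparison step, the Python sketch below shows how binary AI calls might be checked against ground-truth diagnoses to separate concordant from discordant cores. The data layout, category strings, and function names are hypothetical; the talk does not describe Paige’s actual software interface.

```python
# Minimal sketch of the concordance check described above.
# Category names and the data layout are illustrative, not Paige's actual API.

SUSPICIOUS_CATEGORIES = {"carcinoma", "PIN-ATYP", "ASAP"}

def to_binary(diagnosis: str) -> str:
    """Collapse a ground-truth diagnostic category into the study's binary labels."""
    return "suspicious" if diagnosis in SUSPICIOUS_CATEGORIES else "non-suspicious"

def split_by_concordance(cores):
    """cores: iterable of dicts with 'ground_truth' (pathologist category)
    and 'ai_call' ('suspicious' or 'non-suspicious')."""
    concordant, discordant = [], []
    for core in cores:
        if to_binary(core["ground_truth"]) == core["ai_call"]:
            concordant.append(core)
        else:
            discordant.append(core)  # discordant cores go on to further analysis
    return concordant, discordant
```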

The aim of the study, Dr. Perincheri explained, was to see how useful computer-aided diagnostic technologies could be in a complex clinical workflow. “When you have, on average, 17 blocks from a patient, split into multiple levels, which equates to about 51 slides, and on a given day we’re getting about 6-8 of these biopsies, we’re looking at 400 slides of prostate core biopsies alone. So, the issue that most surgical pathology practices deal with is that there is a lot of work involved…. At the time when we did this study, we wanted to take the algorithm, apply it to an independent data set, look at its performance metrics, and draw some conclusions about how such tools could be implemented in practice.”

Dr. Perincheri then went on to showcase the study’s results and key takeaways. First, he noted that Paige Prostate* performed well, delivering a concordant diagnosis for the vast majority (1,796 of the 1,876) of the cores. For the “non-suspicious” discrepant cores, in which Paige Prostate* called the core non-suspicious but the diagnosing pathologists called it suspicious, he explained, there were a few interesting qualities to consider:

  • For 4 of the 5 cores where a focus of adenocarcinoma was missed, other core biopsies from the same patient with carcinoma were flagged by Paige Prostate*
  • 2 of the 5 cases had foamy gland characteristics; however, there were other examples in which the algorithm successfully flagged this variant
  • Most of the missed foci were small and, in some cases, better represented on other levels, which were not scanned and provided to Paige Prostate*
  • 2 of the cores were missed on manual reads as well

He noted that, had every level of the cores been made available, Paige Prostate* might well have flagged the missed cores.

He then went on to analyze the “suspicious” discrepant cores, where Paige Prostate* identified a core as suspicious but the pathologists had diagnosed it as non-suspicious. Upon re-review of those cores by Yale pathologists, leveraging the Paige Prostate* crosshairs feature that highlights the focus of interest, the classification was changed for 6 of the cores, making the pathologists’ diagnoses and the AI’s output concordant.

Ultimately, accounting for these re-reviews, the study found that Paige Prostate* demonstrated:

  • Positive predictive value (PPV) = 97.9%
  • Negative predictive value (NPV) = 99.2%
  • Sensitivity = 99.7%
  • Specificity = 99.3%
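For readers unfamiliar with these metrics, they follow from the standard confusion-matrix definitions. The minimal Python sketch below shows the formulas; the counts in the usage line are hypothetical, since the talk did not break out the study’s exact true/false positive and negative totals.

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard screening metrics from confusion-matrix counts.

    tp/fp: true/false positives ("suspicious" calls);
    tn/fn: true/false negatives ("non-suspicious" calls)."""
    return {
        "PPV": tp / (tp + fp),          # of AI-suspicious cores, fraction truly suspicious
        "NPV": tn / (tn + fn),          # of AI-non-suspicious cores, fraction truly non-suspicious
        "sensitivity": tp / (tp + fn),  # fraction of truly suspicious cores the AI flagged
        "specificity": tn / (tn + fp),  # fraction of truly non-suspicious cores the AI cleared
    }

# Illustrative counts only; these are hypothetical, not the study's tallies.
print(diagnostic_metrics(tp=950, fp=20, tn=900, fn=6))
```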

Armed with first-hand experience trialing AI pathology tools, along with the study’s promising results, Dr. Perincheri offered a few conclusions on how AI like Paige Prostate* could transform pathology practice. First, he explained that AI could be used as a prescreening tool; in this study’s case, only about one-third of the biopsies would have required manual review, saving significant amounts of time (a rough sketch of this triage idea appears below). Alternatively, he said, AI could offer a second read: Paige Prostate* was successful at highlighting even very small foci of atypical glands, which could enhance pathologists’ diagnostic confidence and also yield time savings. “Given the factor of high volume of cases, and the factor of experience and expertise varying across pathologists, as well as fatigue, and pressure on time – all of that impacts performance, accuracy, and so forth. So we feel that what the data is showing us is that [with these tools] there are potential savings both in workload, precision, accuracy, and turnaround times,” he said. He also noted that, during the diagnostic process, AI could automate various aspects of reporting, such as Gleason grading or perineural invasion identification, reducing subjectivity and increasing precision.
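As a loose illustration of the prescreening use case (not Paige’s actual workflow), a triage step might route only AI-flagged slides to immediate manual review, with the remainder handled in a routine queue:

```python
# Hypothetical prescreening triage, illustrating the use case described above.
# The 'ai_classify' callable and the queue structure are invented for this sketch.

def triage(slides, ai_classify):
    """Send AI-suspicious slides to a priority manual-review queue;
    the rest go to a routine queue for later review or sign-out."""
    priority_queue, routine_queue = [], []
    for slide in slides:
        if ai_classify(slide) == "suspicious":
            priority_queue.append(slide)
        else:
            routine_queue.append(slide)
    return priority_queue, routine_queue
```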

Dr. Perincheri did remind the audience that quality assurance steps will be critical for the AI to deliver optimal performance, and for pathologists to get the maximum mileage out of these tools. He also pointed out that the nature of the practice implementing these tools – be it academic or private practice – as well as how many pathologists it employs, would impact how beneficial these various use cases could be. For example, he noted that the value may be greatest where a small pathology team means fewer eyes on each case, as these teams stand to gain the most from the extra level of scrutiny the AI can offer. He also highlighted that portability across data sets, which Paige Prostate* exhibited, would be essential to tapping into AI’s potential. Still, he concluded: “I think everyone agrees that these computer-aided diagnostic tools are going to be indispensable in a clinical workflow.”

Read the full Yale Medicine Paige Prostate study and analysis here.

*As of March 1, 2022, Paige Prostate was updated to the Paige Prostate Suite, under which Paige Prostate Detect and Paige Prostate Grade and Quantify became two separate products. Both Paige Prostate Detect and Paige Prostate Grade and Quantify were used in this study.