
154: AI in Pathology: Advances in Prostate, Bladder & Endocrine Cancer

Aug 19, 2025 · 21 min · Ep. 154

Episode description


If our visual scoring is still based on gut feeling, how do we scale precision?
In this week’s DigiPath Digest, I explored four new AI-focused papers that could reshape how we diagnose prostate, bladder, gastroesophageal, and endocrine cancers.

From automated IHC scoring to predicting urethral recurrence post-cystectomy, these studies highlight the growing value—and responsibility—of integrating AI into our pathology workflows.

And yes, I also reveal where to get my histology-inspired earrings 😉

Episode Highlights

[06:00] Muse Vet Platform launch + STP talk
[11:00] Tools I use: Perplexity, RAG, ChatGPT, and AI citation traps
[14:00] AI’s promise—and its pitfalls

Paper 1: IHC Scoring in GEC (Caputo et al.)

Manual PD-L1 and HER2 scoring is subjective. This study shows AI can standardize and improve accuracy using digital tools for GEC.

[20:00] AI reduces visual bias
[23:00] Potential to replace expensive assays

Paper 2: ASAP in Prostate Biopsies

Paige Prostate AI matched final diagnoses 85% of the time, more often than the human reviewers.

[24:00] ASAP = gray zone diagnosis
[27:00] AI matched final calls more often than humans

Paper 3: Recurrence Prediction Post-Cystectomy

A Chinese study developed a recurrence-prediction model using ML on clinical data. AUC: 0.86 (training), 0.77 (test).

[30:00] Risk factors: CIS, bladder neck involvement
[32:00] SHAP explained model insights

Paper 4: Reticulin Framework in Endocrine Pathology

Reticulin stains are cheap but powerful. This paper calls for AI to take notice.

[36:00] Reticulin separates benign from malignant
[40:00] Let’s train AI on these patterns

📚 Resources from this Episode

  • Caputo et al., Pathology – Research and Practice
  • Paige Prostate study on ASAP
  • ML model predicting urethral recurrence
  • Reticulin stains in endocrine tumor grading

AI is already enhancing diagnostic precision—we just need to guide its use responsibly. From special stains to advanced models, this episode covers where we're headed next.

Support the show

Become a Digital Pathology Trailblazer, get the "Digital Pathology 101" FREE e-book, and join us!

Transcript

Aleks:

00:00:00 - 00:01:22
I can actually see where people join, which is fantastic. Whenever you start joining, I see people joining on YouTube. You are early, guys. It's 6:00 a.m. in Pennsylvania and we're doing our DigiPath Digest. If you are there already, give me a comment that you hear me and that you see everything because it's a new system.

00:00:42 - 00:02:37
 Last time, I had the microphone off for two or three minutes. That wasn't great, but somebody told me in the comments. Am I even live? I'll have to cut the beginning off. That's okay. Let's see. If somebody can let me know online if I'm live, that would be fantastic. Meanwhile, I'll try to share the screen and see if everything works. We are sharing. Can I draw? Can I do stuff?

00:01:40 - 00:03:27
 Here's my camera. Whoever is joining, give me a virtual high five. Let me know if you can hear me well and if you can see the screen. I haven't heard from my team yet. Maybe I can check it myself. I'm going to check it and officially welcome you in three minutes. Perfect—my team says we're online. LinkedIn, YouTube, good.

00:02:50 - 00:04:17
 Let's check Facebook. In three minutes, we're officially live. I see one person on YouTube. Can you give me a comment? Team digital pathology, give me a comment too so we can check if chat is working. Still testing. You're not late at all.

00:03:49 - 00:06:20
 I'm waiting for all the systems to work. Chatting with my team to ensure everything is working. Is the camera moving with me? Yes, it is. One thing that’s not great is the screen sharing and unsharing. Maybe that’ll improve. I see some people joining on YouTube. Once you're here, let me know.

00:05:24 - 00:06:57
It is 6 a.m. in Pennsylvania. Welcome back to DigiPath Digest. We finished the seven-part AI series last stream, so now we're back to normal papers. But first, I have a couple of updates. We're using a new live streaming system.

00:06:12 - 00:07:33
 When you join, look at me and see if there’s anything off in my appearance—I'll tell you after a few updates. Facebook isn't working, not sure why. So, Facebook might be out today.

00:06:53 - 00:08:16
 Recently, I went to an online creator conference in Idaho. It wasn’t pathology-related, but it was to learn how to create better content for you online. I also got to run in the mountains—almost every day.

00:07:37 - 00:08:52
 That leads to a vlog coming out soon from USCAP. I ran a five-mile run in Boston. It’ll be part of the vlog. During the summer, before I go to Poland, we’ll do a few histology livestreams to teach tissue basics.

00:08:14 - 00:09:36
 If you’re on my list, you’ll get emails about those. If not, stay tuned. Another conference coming up is STP. Let me just—

00:08:54 - 00:10:24
 I’m not seeing your comments. That’s not good. Let me check LinkedIn for comments. I don't want to break the stream, so I’ll keep giving updates. I hope you're there. I see one person on YouTube.



00:09:45 - 00:11:00
 We're figuring out the streaming software. The next conference I’ll be at is the Society of Toxicologic Pathology annual meeting in Chicago, the week of the 23rd. Not sure how to get chat working—sad I can't talk to you live.

00:10:25 - 00:11:42
 Maybe my team can guess what’s off with me today. Here’s a hint: Muse, a sponsor of Digital Pathology Place, launched a new service for veterinary medicine. They do direct-to-digital tissue imaging, now for veterinary use.

00:11:04 - 00:12:19
 If you're in the veterinary diagnostics space, check them out. I see some comments! Digital pathology trailblazers—cool, I can even show it.

00:11:41 - 00:12:59
 STP—let me know if you’re going. I’ve been using AI for content creation, including this channel and blog. It’s a powerful tool for literature research too.

00:12:21 - 00:13:40
 One tool I use is Perplexity, a RAG-powered search engine. Instead of relying only on training data, it searches the web for content and then generates responses.
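For anyone curious what "retrieve, then generate" means mechanically, here is a toy Python sketch of the RAG pattern. The two-document corpus, the word-overlap scoring, and the fake DOI identifiers are all made-up stand-ins for illustration, not Perplexity's actual pipeline:

```python
# Toy retrieve-then-generate (RAG) sketch. Corpus and DOIs are fabricated
# placeholders; a real system retrieves from the live web.
corpus = {
    "doi:10.1000/fake1": "AI-assisted PD-L1 scoring in gastroesophageal cancer.",
    "doi:10.1000/fake2": "Reticulin staining patterns in endocrine neoplasms.",
}

def retrieve(query, k=1):
    """Rank documents by naive word overlap with the query (stand-in for search)."""
    q = set(query.lower().split())
    ranked = sorted(corpus.items(),
                    key=lambda kv: len(q & set(kv[1].lower().split())),
                    reverse=True)
    return ranked[:k]

def answer(query):
    """'Generate' a response grounded in (and attributed to) the retrieved snippet."""
    doc_id, text = retrieve(query)[0]
    return f"{text} [source: {doc_id}]"

print(answer("PD-L1 scoring in gastroesophageal cancer"))
```

A real RAG system swaps the overlap score for semantic embeddings and the string stitching for an LLM call, but the two-step shape is the same: retrieve first, then generate a response grounded in what was retrieved.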

00:13:00 - 00:14:10
 I’ve been using Perplexity to find references for a publication I'm working on about validating our digital pathology system. But there’s a catch.

00:13:36 - 00:14:54
 Perplexity sometimes invents references. It shows titles, dates, even DOIs—but they don’t exist. When I tried to cite them, my citation manager couldn’t find them.

00:14:15 - 00:15:22
 I tried pasting DOIs and searching manually, and still nothing. So double-check everything you cite, even with RAG-based tools. We are still responsible for AI-generated content.

00:14:48 - 00:16:11
 Figure out a workflow to verify your sources. I'm clicking every link now. Sorry, I'm confused with my screens again. It’s a bit of a challenge today, but we’ll get through it.

00:15:29 - 00:16:32
 I only see YouTube comments. If you're on LinkedIn or Facebook, stay there if you can hear and see me. Now let’s officially dive into today’s publications.

00:16:01 - 00:17:32
 Let me know if you're using AI tools for writing publications. Drop an “AI for publications” in the chat—I’d love to learn your workflow and accuracy checks.

00:16:46 - 00:18:17
Oh, did you figure out what's off with me? I'm wearing two different earrings: one is a multinucleated giant cell, the other is cartilage. I also have a pair with colonic crypts in a special stain (Alcian blue).

00:17:36 - 00:18:47
 I only have 25 of each pattern. If you're interested, let me know in the comments. I’ll list them in the digital pathology store. They’re great conversation starters at conferences.

00:18:12 - 00:19:31
 Let’s dive into the papers now—finally! It’s been 12 minutes of updates. Hope you’re still here. Let’s start with the PDF.

00:18:51 - 00:20:22
 First up: Immunohistochemical biomarker scoring in gastroesophageal cancer. Can computers help us? I have strong opinions on visually scoring IHC—it’s guesswork.

00:19:37 - 00:20:45
 Scoring complexity is increasing, but pathologists are still expected to do it manually. It's burdensome, subjective, and variable. A computer could help a lot.

00:20:10 - 00:21:32
 In this study, they looked at both adenocarcinoma and squamous cell carcinoma. If you know the correct pronunciation of “esophageal,” drop a phonetic comment on YouTube!

00:20:51 - 00:22:14
 There are multiple predictive biomarkers in GEC, but scoring is time-consuming and inconsistent. AI could reduce subjectivity and workload.

00:21:33 - 00:23:18
The study was from Italy, by Alessandro Caputo et al., in Pathology – Research and Practice. They found AI promising for PD-L1 and HER2 but not yet widely applied in GEC.

00:22:26 - 00:23:45
 Several IHC scoring algorithms exist, but few for GEC. AI can alleviate complex scoring burdens and reduce reliance on additional costly assays.


00:23:06 - 00:24:23
 Before our next live stream on Friday (and a histology one on Wednesday), we need to sort out the comment system. Now, let’s talk about AI for atypical small acinar proliferation (ASAP) in prostate core biopsies.

00:23:50 - 00:25:05
 Prostate cancer is the most diagnosed cancer and the second leading cause of cancer death in the US. Tissue from a 12–14 core biopsy is the gold standard. ASAP is an indefinite diagnosis that complicates clinical decisions.

00:24:28 - 00:26:08
This study (in Virchows Archiv) used the FDA-approved AI tool Paige Prostate. They applied it to 107 core images initially diagnosed as ASAP and had two pathologists reclassify them as benign or malignant.

00:25:18 - 00:26:38
Paige Prostate gave results marked as "suspicious" or "not suspicious." It agreed with the final diagnosis in 85% of cases. Pathologists agreed with each other in 77% of cases, and with Paige Prostate in 66–75%.

00:26:37 - 00:28:18
 These agreement rates are fairly typical—pathologist concordance is often only around 70%. AI showed compatibility with human review, though there's still interobserver variability.

00:27:27 - 00:28:52
 The concern is whether they validated results using another method, like immunohistochemistry. AI performed comparably, but it’s unclear whether there was any “ground truth” verification.

00:28:10 - 00:29:17
 Markers for basement membrane disruption could have helped confirm diagnoses. Still, AI’s comparable performance is promising. Let me know your thoughts in the comments.

00:28:43 - 00:30:07
 Let me know where you're tuning in from. I see some folks on YouTube—thank you! Next paper: Developing a machine learning model to predict urethral recurrence after radical cystectomy.

00:29:26 - 00:30:50
 This was a multi-center retrospective study and meta-analysis from China. No image analysis—just clinical data to build a predictive model for urethral recurrence, which indicates aggressive disease and poor outcomes.

00:30:08 - 00:31:41
 They analyzed clinicopathological data from patients who had radical cystectomy between 2010 and 2020. The model was built using machine learning and evaluated using ROC curve, accuracy, sensitivity, F1 score, Brier score, and C-index.

00:30:54 - 00:32:11
 I loved that they aimed for an "easy to use" tool—so important. Yesterday I tried multiple reference management tools, and went back to Perplexity (which invents citations!) because others were too complicated.

00:31:32 - 00:33:05
 They confirmed independent risk factors using univariate and multivariate Cox regression. The dataset was split 80/20 for training and testing. Model performance was measured across several metrics.

00:32:18 - 00:33:52
 They used SHAP (Shapley Additive Explanations) to calculate the contribution of each feature. They included data from PubMed, Embase, and Scopus, covering 473 patients with a recurrence rate of 24%.

00:33:05 - 00:34:49
 Key predictors included: concurrent carcinoma in situ, tumor multifocality, bladder neck involvement, prostate involvement, lymphovascular invasion, diversion type, and chemotherapy status. AUC was 0.86 (training) and 0.77 (testing).
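The headline numbers (AUC 0.86 training, 0.77 test) have a simple rank-based meaning: AUC is the probability that a randomly chosen recurrence case receives a higher risk score than a randomly chosen non-recurrence case. A minimal Python sketch with entirely made-up held-out predictions, not the paper's data:

```python
# Rank-based ROC AUC on a held-out set. Labels and scores below are
# fabricated for illustration (1 = urethral recurrence).
def auc(labels, scores):
    """Probability a random positive outranks a random negative (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_true  = [1, 0, 0, 1, 0, 0, 1, 0]
y_score = [0.9, 0.2, 0.4, 0.7, 0.1, 0.5, 0.45, 0.3]
print(round(auc(y_true, y_score), 2))  # → 0.93
```

An AUC of 0.5 is coin-flip ranking and 1.0 is perfect separation, which is why the drop from 0.86 on training data to 0.77 on the held-out 20% is the number to watch.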

00:33:58 - 00:35:33
 They developed an easy-to-use risk calculator. Clinical data could be used by a machine learning model (or even an LLM with RAG) to predict recurrence risk without manual data input.

00:35:30 - 00:36:49
 They didn’t use LLMs in this paper, but it would be a logical next step. I see some comments—yay! Let me show them. Someone from upstate New York—thank you for being here at this early hour!

00:36:14 - 00:37:43
 Our last paper before you grab coffee! It’s on reticulin framework assessment in neoplastic endocrine pathology. I saw the title and got excited—special stains plus AI image analysis? Yes, please.

00:36:59 - 00:38:16
 The reticulin framework, made mostly of type III collagen, supports tissue structure. Reticulin stains, especially silver-based ones, help visualize disruptions in this framework.

00:37:38 - 00:39:28
 Let me show you what a reticulin stain looks like. The black-stained parts are the reticulin fibers. They’re easy to see, which means they’re also easy for algorithms to detect. (Still struggling with screens, sorry!)

00:38:46 - 00:40:10
Reticulin stain is a quick, low-cost, reliable way to highlight changes in the reticulin framework, helpful in distinguishing neoplastic from non-neoplastic conditions. A distorted framework often signals abnormal cell growth.

00:39:27 - 00:40:43
 This overview covered applications in endocrine and neuroendocrine neoplasms—pituitary, parathyroid, adrenal, and more. In pituitary tumors, loss of reticulin framework can help differentiate normal vs. neoplastic.

00:40:04 - 00:41:23
 In parathyroid and adrenal tumors, varying degrees of framework disruption can be informative. Reticulin integrity plays a role in malignancy assessment using a visual "reticulin algorithm."

00:40:44 - 00:42:10
 That algorithm includes three criteria: necrosis, high mitotic count (>5 per 10 mm²), and venous invasion. These features, visible in reticulin patterns, help classify adrenal disease subtypes.
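To make the rule-based flavor of that visual algorithm concrete, here is a hedged Python sketch. The three criteria and the mitotic threshold come from the paper as described above, but the "any one criterion present" combination rule and the function name are my assumptions, not the published algorithm:

```python
# Illustrative sketch only: encodes the three criteria mentioned
# (necrosis, mitotic count > 5 per 10 mm², venous invasion).
# The combination rule here is an assumption.
def reticulin_algorithm(necrosis: bool, mitoses_per_10mm2: float,
                        venous_invasion: bool) -> str:
    criteria = [necrosis, mitoses_per_10mm2 > 5, venous_invasion]
    return "suspicious for malignancy" if any(criteria) else "favor benign"

print(reticulin_algorithm(False, 2, False))  # → favor benign
```

Rules this simple are exactly what makes the approach attractive for automation: an image-analysis model only has to quantify the reticulin pattern; the classification logic on top is trivial.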

00:41:27 - 00:42:55
 Neuroendocrine neoplasms also show variable reticulin patterns. High-grade carcinomas show more disruption. It all points to how valuable reticulin assessment is in diagnostics.

00:42:11 - 00:43:30
 This is my favorite statement: “Advancements in digital pathology and AI offer promising avenues for automated reticulin framework quantification.” Let’s develop an algorithm for this!

00:43:31 - 00:44:58
 I’d love to see a weakly supervised AI model trained on reticulin-stained images and diagnoses—screening for malignancy, for instance. It’s simple, cost-effective, and data-rich.

00:44:16 - 00:45:49
 And that’s all for today’s publications! If you're interested in the earrings, check the store at digitalpathologyplace.com. They’re conversation starters and based on real histology patterns.

00:45:04 - 00:46:20
 I’ll drop the store link in the chat (if chat works). They're not set up as Father’s Day gifts, but definitely cool for Mother's Day or anyone who appreciates histology-themed jewelry.

00:45:42 - 00:47:01
 More items are coming to the store soon. I’ve seen a lot of DNA helix jewelry, but not much based on actual histology. I thought you might like it.

00:46:22 - 00:47:26
 There’s likely going to be a histology live stream this Wednesday. If you’re interested, comment—even if you’re watching the replay. Especially for those on LinkedIn, let me know where you’d prefer to stream it.

00:46:54 - 00:47:34
 Next Friday, we’ll be reviewing more papers. Thank you so much for joining me at this funky hour. I’m honored, and I’ll talk to you in the next episode. Have a great day!



