
How Apple and Microsoft Built the Seeing-Eye Phone

Dec 03, 2021 · 51 min · Season 1 · Ep. 7

Episode description

Your smartphone can see, hear, and speak—even if you can’t. So it occurred to the engineers at Apple and Microsoft: Can the phone be a talking companion for anyone with low vision, describing what it’s seeing in the world around you? Today, it can. Thanks to some heavy doses of machine learning and augmented reality, these companies’ apps can identify things, scenes, money, colors, text, and even people (“30-year-old man with brown hair, smiling, holding a laptop—probably Stuart”)—and then speak, in words, what’s in front of you, in a photo or in the real world. In this episode, the creators of these astonishing features reveal how they turned the smartphone into a professional personal describer—and why they care so deeply about making it all work.

Guests:
Satya Nadella, CEO, Microsoft
Saqib Shaikh, project lead for Microsoft’s Seeing AI app
Jenny Lay-Flurrie, Chief Accessibility Officer, Microsoft
Ryan Dour, accessibility engineer, Apple
Chris Fleizach, Mobile Accessibility Engineering Lead, Apple
Sarah Herrlinger, Senior Director of Global Accessibility, Apple