Have you ever watched The Price is Right? Is that a thing in the UK? I think it is a thing, but I don't really watch it. But I know what it is. Well, there was an episode, apparently, last week. I saw it on Threads from Justin Ryan, who does really good work sharing things about Vision Pro. I think he has a newsletter about Vision Pro that I'll try to put in the show notes. But Vision Pro was one of the items on The Price is Right where the contestants had to guess how much the Vision Pro cost, and the person who's closest wins it. And it's the Price is Right rules: you can't go over. So if it's a $1,000 product and you guess $1,001, you lose. The guesses from these contestants were not great. The first person guessed $1,000. The second person guessed $750. The third person guessed $1,001.
The fourth person guessed $1,270. Right off the bat, the last person should have guessed $1,002 if she thought it was over the previous highest guess of $1,001. So that's a strategy error in her attempt to play The Price is Right. But beyond that, none of those guesses are even remotely in the ballpark of how much Vision Pro costs.
Because I'm watching the clip, and I see Drew Carey, I see the Vision Pro, like, fall out of the sky. Drew Carey describes it as this mixed reality headset, the Apple Vision Pro. I do think his description was quite funny, too. He was like, you watch movies and take calls on it. I was like, technically true, yeah? Yeah. You can, in fact, do both. But it kind of missed the point, right? But yeah.
So I watch it fall from the sky, and I'm like, okay, I got this. And I'm like, I wonder if he's going to say how much storage it has, because that affects the price. And he says 256 gigabytes. And I'm like, all right, $3,499, put me on The Price is Right. And the fact that none of the people got even close, I think it's just so funny. It shows how much people in the real world know about Vision Pro and how much they care about it.
And it also shows just how big of a discrepancy there is between what Vision Pro currently costs and how much people think a product like that, or technology like that, is worth. Yeah, and it's not a direct proxy for value, but there's definitely something meaningful to the inherent price a human being puts on something when they look at it and say, this is how much this is going to cost, right? Yeah. And these days, you show people a phone, and if it's a higher-end phone, they're going to say around $1,000, right? Because that's just become the new default.
But with VR headsets, the category's quite wide, and I guess the most popular VR headsets are the ones that are $200, $300, so people are kind of primed in that zone. And most people have never seen this product before in their life, right?
So yeah, they don't even know what it is, really. And it wasn't presented in a box; I feel like it was just the headset on the plinth and it came down. So the only reference people are going to have is like, oh, you know that $500 headset I maybe bought for Christmas? Oh, this is the Apple one, let's go with a price like that. And obviously it's wildly out of proportion compared to the truth, which is $3,499. Even the second-gen Vision Pro that doesn't exist yet.
Let's assume that in the best case it's going to be half the price. It's still going to be wildly more expensive than the guesses that the Price is Right contestants gave. So obviously there's a bit of a discrepancy there in terms of what people naturally value these kinds of things at and how much this thing costs. Because details or nuances like, this is the only one with the exclusive micro-OLED display, completely fall away when you're on a primetime TV show where nobody actually cares.
But also, people aren't going to buy it for the same reasons, because a lot of people probably put on the Vision Pro and they're like, yeah, this is nice. They don't value the quality difference. It's like how I'm very picky about TVs or whatever, and other people are like, this is fine. So the $200 65-inch TV from Walmart, that's what most people go for now. And if people can get on with that and don't want to spend a lot of money, there's nothing wrong with it.
The best parts of the Vision Pro are, I guess, quite subtle, right? I mean, they're subtle enough that even I can't justify buying the first generation of Vision Pro. But at least I can appreciate the technology inside it, you know? There's a big hurdle to get that thing down to a more mass consumer market. Somebody did say in the replies on Threads that this isn't the first time Vision Pro has been on The Price is Right. And last time, somebody did guess it exactly right: $3,499.
But they got it from somebody in the crowd who was yelling it to help them win. I think on The Price is Right, if you guess the price exactly, you get like a bonus $500 win or something. So good on that person, and good on the fellow nerd in the crowd for yelling the right answer and helping them out. Yeah, I mean, if you want another example, if they brought down a Mac Studio, no human being in that audience would know how much that cost either. So it's kind of in that ballpark.
Even with a MacBook Pro, the pricing for laptops is way more complicated. And if I think about premium kitchenware, I wouldn't know much at all. It's all different perspectives from different folks. But obviously for us, the Vision Pro comes down from the sky, it's $3,499.
Anyway, actual news this week, because there is a surprising amount of it for mid-May. Apple has previewed its iOS 19 accessibility features. These are the features that will ship later this year, and Apple, of course, doesn't refer to these features as coming with iOS 19 and macOS 16, but we know that they are.
And they announced them early because of Global Accessibility Awareness Day, which is actually today as we're recording this. The first one that Apple really seemed to focus on is accessibility nutrition labels on the App Store. So this is an extension of, or another set of nutrition labels after, the privacy nutrition labels that Apple shipped a couple of years ago. Accessibility nutrition labels will show you whether an app supports things like VoiceOver, Voice Control,
larger text, and those kinds of accessibility features before you download it. I've always wondered how many people actually look at the privacy nutrition labels on the App Store. I don't know how many people care. But I do think
the accessibility one could be a little bit more useful, maybe. I wish they let you filter by it, right? So if you do need these features, when you search, it should let you pick, like, only show me apps that say they support VoiceOver. Do you know what I mean? Otherwise it's a bit stupid that you have to search for an app and then scroll all the way to the bottom to see that little thing.
And then also, I don't know yet how they're going to police it, because you can declare support for VoiceOver, but that can mean very different things. You could have one screen in your app that's been annotated with VoiceOver labels, or you could have the entire app with custom actions and triggers and motor controls and custom gestures all designed for the VoiceOver experience.
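To give a rough idea of that range, here's a minimal SwiftUI sketch (the view and its actions are hypothetical, not taken from any real app): the first two modifiers are the bare-minimum kind of VoiceOver support, while the named custom actions are the sort of extra work a properly designed VoiceOver experience involves.

```swift
import SwiftUI

// Hypothetical row in a hypothetical shopping app, purely for illustration.
struct OrderRow: View {
    let title: String
    let status: String
    var reorder: () -> Void = {}
    var trackShipment: () -> Void = {}

    var body: some View {
        HStack {
            Text(title)
            Spacer()
            Text(status).foregroundStyle(.secondary)
        }
        // Bare minimum: merge the row into one element with a sensible label,
        // so VoiceOver reads "Coffee beans, shipped" instead of two fragments.
        .accessibilityElement(children: .combine)
        .accessibilityLabel("\(title), \(status)")
        // A richer implementation also exposes the row's hidden actions,
        // so VoiceOver users aren't left hunting for swipe buttons.
        .accessibilityAction(named: "Reorder", reorder)
        .accessibilityAction(named: "Track shipment", trackShipment)
    }
}
```

Both versions could truthfully tick a "supports VoiceOver" box on a label, which is exactly the policing problem.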
I mean, with privacy, Apple basically just leaves the onus on the developer to be truthful. So you can declare whatever you want in your privacy nutrition label and you can get away with it for as long as nobody notices that you're doing something else. And then if you are doing something else, Apple threatens to kick you out of the App Store. But there's nothing technical preventing you from lying on those labels. And the same is true for accessibility.
But I get where they're coming from, and it's nice that this information is visible somewhere before you download the app. I just think it would be more practical if you could go into the App Store and have a section which automatically finds all the apps that declare VoiceOver support, or declare good accessibility support, or something.
The one that made me laugh in the press release was they showed the Apple Fitness app. Yeah. And one of the supported accessibility features is dark interface. Yeah. The Fitness app doesn't even have a light interface. Why is a dark interface more accessible than a light one? Surely it depends on the sensitivities of your eyes or whatever.
I guess the thing about the accessibility labels is that if you need a feature like VoiceOver, and you go to the App Store and download an app, and then you open the app and it doesn't support VoiceOver, you're just left out to dry because you have no way whatsoever to navigate the app.
So you're stuck and perhaps confused as to why that app doesn't work with your accessibility features. Now you can know beforehand that, hey, I'm going to download this app and it doesn't support VoiceOver, so I can't use it, and I won't be lost when I download it. I don't think anybody doesn't download an app because of something they see in the privacy nutrition labels, but there will be people who don't download an app because of something that's not in the accessibility label. So I can see where they're coming from, for sure. True, but even so, the labels thing is
a bit obsolete compared to physical products in a shop, which is obviously where the nutrition label idea comes from. With the App Store, everything's free to download these days, right? So little stuff is paid up front. Yeah, I feel like it's almost quicker to just download the app and try to use it, and then you can find out how good the VoiceOver implementation is, you know? And the other thing is the larger text support, which is obviously Dynamic Type.
The implementation of that, even in some of Apple's apps, is so varied. Some apps do it really, really well, where all the labels change size and reflow. In other ones, the fonts get bigger, but the text doesn't wrap onto more lines. So it'll be a single line of text designed for a 16-point font, but then you turn on the accessibility font sizes and it can fit like one word on that single line. Whereas if you're implementing it properly, you would reflow so that when you're using an accessibility font, it goes over multiple lines, not one line. So there's such a range, you know, zero to a hundred, in terms of how well these things are supported that I'm not sure a nutrition label can really be that valuable to people. It's like if you had a nutrition label for app design: you could say it's easy to use and intuitive, but what does that mean, right? You don't really know until you download the app and try it out for yourself. So I was a bit skeptical of these labels.
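For what it's worth, the Dynamic Type reflow described a minute ago is mostly about not fighting the system. A minimal SwiftUI sketch (hypothetical view, just to show the difference between the two behaviors):

```swift
import SwiftUI

struct TipView: View {
    let tip = "Double-press the side button to see your recently used apps."

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            // Good: a text style scales with the user's Dynamic Type setting,
            // and with no line limit it reflows onto extra lines at larger sizes.
            Text(tip)
                .font(.body)
                .fixedSize(horizontal: false, vertical: true) // grow vertically, never clip

            // Bad: a fixed point size ignores Dynamic Type, and lineLimit(1)
            // truncates to a word or two at accessibility sizes instead of wrapping.
            Text(tip)
                .font(.system(size: 16))
                .lineLimit(1)
        }
        .padding()
    }
}
```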
It's probably better that it's listed somewhere. But like you say, ultimately, how many people are really scrolling down that far on the app page anyway to see that stuff? Very, very, very few. I feel like people look at screenshots and that's about it. Next, we have a new Magnifier app for the Mac. This taps into either Continuity Camera or a USB camera attached to your Mac.
And there's a video from Apple, in the post that I'll link in the show notes, that shows how this feature works, because it is something that I think is useful to see and is kind of hard to explain. So you attach the camera to your Mac, and the example Apple shows is you're in a lecture hall and the camera is reading the whiteboard or chalkboard or whatever you call it at the front of the lecture hall.
And the person in the video can be seen both zooming in to see a video feed from the Continuity Camera on their Mac, so a closer-up view of what's on the whiteboard, and also taking the text that's on the whiteboard and manipulating it. That includes stripping out everything and just looking at the text itself. That includes adjusting the perspective of the text. So if you're sitting on the left-hand side of the lecture hall recording the whiteboard, the Magnifier app will take the text and, instead of you looking at it at an angle, it'll adjust the perspective so you're looking at it dead on. Yeah, it can correct the distortion. So if you're looking at it askew, it can, you know,
transform the image to be straightened out, which is useful to anybody, to be honest. And with the video feed itself, you can also adjust things like the brightness, the contrast, the colors. So if you have a vision impairment where for some reason you can't make out the text in the actual lighting conditions or the brightness of the lecture hall, you can adjust that on the video feed you're watching on your Mac, which is super useful for a lot of people.
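Apple hasn't said how Magnifier does the de-skewing, but the general technique is available in the public Vision and Core Image frameworks. A rough sketch, not Apple's implementation, just the same idea: detect the whiteboard's rectangle and remap its corners to a flat view.

```swift
import Vision
import CoreImage

// Given a still frame from the camera, find the dominant rectangle
// (e.g. a whiteboard) and warp it so you view it head-on.
func deskewedWhiteboard(in image: CIImage) throws -> CIImage? {
    let request = VNDetectRectanglesRequest()
    request.minimumConfidence = 0.8

    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try handler.perform([request])

    guard let rect = request.results?.first else { return nil }

    // Vision returns normalized (0–1) corner points; scale them to pixels.
    let size = image.extent.size
    func point(_ p: CGPoint) -> CIVector {
        CIVector(x: p.x * size.width, y: p.y * size.height)
    }

    // CIPerspectiveCorrection remaps the four corners to a flat rectangle.
    return image.applyingFilter("CIPerspectiveCorrection", parameters: [
        "inputTopLeft": point(rect.topLeft),
        "inputTopRight": point(rect.topRight),
        "inputBottomLeft": point(rect.bottomLeft),
        "inputBottomRight": point(rect.bottomRight),
    ])
}
```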
And there's a button where, instead of it just being a live feed, you can capture a screenshot, right? Yep. Even 10 years ago when I was at uni, it was so common for people to just lift their phone up in the air and snap a photo of the current whiteboard. Oh, yeah. Yeah, it was so, so common. And maybe now there are dedicated apps that do it better and stuff, but this seems like a direct continuation of that.
And the Magnifier app does exist on iPhone and iPad, but I don't think it has all the features that they show in this version, like the straightening thing. I don't think that's on the iOS version yet. The downside, of course, to Magnifier on Mac is that you have to have some sort of separate camera, because the Mac doesn't have a rear-facing camera in it, right? Whereas if it was just an iPad, it'd be a bit more elegant. But then you'd have to use an iPad for everything, and that's a problem for some people.
That is fair. You pick your poison, because, I mean, no offence, when I was in the lecture halls at my uni at least, the seats were so small that if I had a laptop and then a mount to put my phone on top of it, I feel like my neighbour would be annoyed at me for obscuring their view, because it gets in the way or whatever. So there are practical concerns. Continuity Camera, as useful as it is, always feels like a bit of a crutch in my opinion.
Can you just find a way to put a camera on the Mac, you know? At least in this case, it's a camera pointing the other direction. Whereas previously, Continuity Camera was like, well, the webcam on your Mac is so bad that we'll let you use your iPhone facing you instead. At least in this case, the camera's facing outwards.
But still, I'm kind of like, could they not make a future laptop that has a slightly thicker lid, more like an iPad, where it's more even in terms of thickness, and you could fit, you know, front and rear cameras, for instance? Nevertheless, this is cool. And a lot of the responses to this feature that I saw were people wondering why the person in the video just didn't sit closer.
And I think maybe what Apple didn't showcase well enough in the video that they released about this feature is that it's doing so much more than just being like a magnifier. And maybe that's a problem with the name of the app. But you're doing so much more than just magnifying text. Like I said, you're adjusting the color and the size of the text.
the contrast and the spacing and all of that stuff that you can't accomplish just by sitting closer. And you can't always sit closer. Yeah, that's not always an option, yeah. There have been plenty of times when I've been in lecture halls and stuff and someone's drawn a diagram and
I can see it, but it'd be easier if I could sit closer. So you just take a photo of it and then you crop it. That just happens. And this is just a built-in feature now. It's been on iPhone and iPad, and now it's on the Mac. And people also pointed out that most professors just make their slides available for download to the class. That's true, but at least when I was in college, which I graduated from, what, six years ago? A lot of teachers did make their slides available, but what they would do is have the slide showing on a projector above a chalkboard or a whiteboard while also writing things and showing examples
on that chalkboard below it. So yeah. And maybe it's changed a bit post-COVID, but you know, I finished uni in 2015 and I did economics, and there was so much maths stuff on whiteboards and blackboards, you know, chalk, so many equations, that... I guess now a lot more universities video the lectures. But yeah, that is true.
I can see this coming up. And it doesn't have to be in lecture halls, either. You could be in office rooms or conference centres and stuff like that. This comes up all over the place. Personally, I feel like I'd be more inclined to use it with an iPad, though, because the social awkwardness of having to clamp a phone above your laptop screen is a bit much. But it's nice that they've added the option.
Accessibility Reader is a new feature that will be system-wide on iPhone, iPad, and Mac. It's a reading mode designed to make text easier to read for users with things like dyslexia or low vision. It basically strips out everything around the text. The example Apple shows is a PDF in the Files app.
You enable Accessibility Reader and it removes all the extra white space, it removes the margins, everything like that, and just presents the text on its own. You can choose how to break it up, you can choose how to adjust the font and the contrast and the colors. It has built-in support for the accessibility Spoken Content feature, so it can read the text to you. This seems like an extension, a more powerful version, of Reader mode in Safari, just taken and put everywhere there's text in
iOS and macOS, basically. Again, it's one of those features that's not just useful to people who need accessibility features, but is useful to everyone. I can see myself using this all the time with a PDF in the Files app. And I presume it OCRs images and stuff as well? I believe so. They mention menus and things, but maybe that's PDF menus where the fonts are embedded versus images.
They already have features like that where you can, you know, copy the text out of an image and stuff, right? You select it with your finger. So I feel like it could all blend into that. And eventually, it's exactly this kind of text content that it's reading off the screen that they were going to use for, you know, on-screen awareness, right? All of those Siri features tie back into this at some point. This is just putting a UI on it: if you've got a load of small text in a PDF, you just press one button and it zooms up to a big font size. Braille Access is a feature Apple describes as having the ability to turn a user's device into a full-featured braille note taker that's integrated throughout the Apple ecosystem.
So there's a built-in app launcher where users can open any app by typing with Braille Screen Input or a connected braille device. You can take quick notes in braille format and perform calculations. It's a cool feature, and there are some screenshots of it. I think you need the screenshots to fully understand the specific interface that Apple is using for this, but you can see those in the post in the show notes.
Live captions on Apple Watch. So this ties into the live listen feature that Apple launched with iOS 12. That's where you put an iPhone like... We'll use the lecture hall as an example. You can put an iPhone up in the front of the lecture hall, and it'll take that audio from the professor speaking, pipe it through your iPhone, and pipe it to your AirPods.
It's a way to hear something that you might not be able to hear otherwise. And with watchOS 12, you'll get Live Listen controls on the Apple Watch, so you can start or stop or rewind a Live Listen session right from your Apple Watch. And there's support for real-time captions, so you can read along with what Live Listen is hearing right on your Apple Watch. This seems useful if you do start a Live Listen session and you have to place your iPhone either, like I said, at the front of a lecture hall or in the middle of a meeting table or something. Previously you would have had to get up and go get your iPhone to start or stop the session. Now you have all of it right on your watch. And if the transcription is good, you can just glance at your wrist
and see what the text is. Live Listen is a feature we've heard about for a while, and I wonder how many people use it. I think it's similar, to a degree, to some of the processing Apple does for the hearing aid stuff on AirPods now. There are also some new accessibility features coming to Apple Vision Pro.
So there are updates to Zoom, so you can magnify everything in your field of view. The example Apple shows in the video is, if you have a recipe book in front of you and you're wearing Vision Pro, you can use the Zoom feature to zoom in on anything in that recipe book. There are also updates to VoiceOver: Live Recognition in visionOS uses on-device machine learning to describe your surroundings, find objects, read documents, and more, all with VoiceOver.
And for accessibility developers, there's going to be a new API where developers can access the main camera on the Vision Pro to provide live, person-to-person assistance for visual interpretation. I'm not entirely sure how that's going to work, but I know it's been a point of contention in the past that
Apple Vision Pro doesn't give developers the ability to access that main camera feed. Yeah, I think the announcement here is simply that you get an entitlement that an accessibility app can then use to access the camera feed, to do whatever they want with it. This also exists for enterprise apps: if you get special exemptions from Apple and you're a special enterprise application, you can get access to the main camera feed on Vision Pro.
If you're just a general day-to-day developer, the camera APIs just don't let you see anything. It's just like a black screen. The only thing they let you see is the simulated camera of your persona.
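For reference, the existing enterprise route is an ARKit data provider that streams main-camera frames once the entitlement is granted; presumably the new accessibility entitlement will gate something similar. A rough sketch of that flow, with the API names as I remember them from the visionOS 2 enterprise docs, so treat the exact signatures as approximate:

```swift
import ARKit

// Requires Apple's main-camera entitlement (enterprise today, and presumably
// the new accessibility one later); without it, no frames ever arrive.
func streamMainCamera() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    try await session.run([provider])

    // Pick a supported format for the main (outward-facing) camera.
    let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // Hand the pixel buffer to whatever does the visual interpretation,
            // e.g. streaming it to a remote human assistant.
            _ = sample.pixelBuffer
        }
    }
}
```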
Oh, yeah. And this came up, obviously, because when Vision Pro was new and everyone was hyping it up, there were so many examples of demos or mock-ups people were doing where it's like, yeah, you can't actually do that, because the app can't see the outside world. So you can't recognize, you know, arbitrary red, green, and blue pixels, because it just doesn't let you access them. And I said at the time, this is probably something they'll slowly add support for. Like maybe you can get a text description of what someone's looking at, or some sort of abstracted version of it, to
avoid the privacy concerns. And maybe over time there'll be a security permission where you can say, let me see what's on the camera. Because, in my opinion, I know what they mean, but obviously it's slightly more personal and intense than an iPhone camera, right? It's literally your entire vision that would be available. But it's the same with the iPhone, right? Most apps don't have camera permission, and then they explicitly ask for camera permission, and then they get it.
I don't think the privacy concerns of an iPhone camera are so far apart from the Vision Pro camera system that it should be completely outlawed forever on Vision Pro. And I kind of assumed that over time they would peel it back a bit, and I think we've kind of seen some of that, because with visionOS 2, I think, it was there for enterprise apps, and now they're going to do the same thing for accessibility apps.
And I don't think it'll be too much longer before they have some sort of elevated system for mainstream consumer apps as well. And I believe Meta have also gone through this exact same thing, where they originally locked it down a lot and then pulled it back on the Oculus side. Another upgrade that I think is worth calling attention to is for Personal Voice. So this launched with iOS 17 in 2023, and it's a way for people to create and save a voice that sounds like them.
So Apple has said it's designed for people who are at risk of losing their ability to speak, such as those with a recent diagnosis of ALS. The initial setup process for personal voice as it existed in iOS 17 and iOS 18 required that users say 150 different phrases to train Apple's machine learning model.
Then once you'd said all of those phrases, the voice was processed overnight so it would be ready for you the next day. With iOS 19, they've knocked it out of the park, because now you only need to record 10 different phrases, and the voice processes in under a minute instead of multiple hours. And with that faster setup process, Apple says the end result is even smoother and more natural sounding than before.
That is such a huge reduction in the setup time, and the strain of that setup, for Personal Voice. It was incredible that the feature existed, and now it's just mind-blowing that it exists with such an easy setup process: only 10 phrases, and then processing in under a minute. Yeah, I mean, I think this reflects the ginormous leaps that we've seen in audio generation and LLM stuff in general, because when they first did this feature,
it was using early, early incarnations of that. You know, if it came out in iOS 17, it would have been in development for a year or so beforehand. Now we see people who upload two minutes of audio from a podcast and they can fully synthesize someone's voice and personality. So I presume this is now just using a much more modern neural network model, which
means you can do it with a lot shorter setup time and a lot shorter processing time. So that's good, and hopefully the result is a bit better, because it wasn't bad, but you could tell it was quite robotic, you know, the first Personal Voice stuff. Because when we talked about this feature in 2023, I remember saying that everybody should set it up, even if they don't think they need it, because you never know when you will need it. When you lose your ability to speak, you might not get a heads-up that that's going to happen. But I could understand why somebody didn't want to dedicate the time to sit down and say 150 different things. Now that you can do it with 10 phrases and under a minute of processing, everybody should do this once iOS 19 is available. Everybody.
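And once a Personal Voice exists, third-party apps can speak with it too, with the user's permission, through the regular speech synthesis API. A small sketch, assuming the iOS 17-era AVFoundation names:

```swift
import AVFoundation

final class PersonalVoiceSpeaker {
    // Keep the synthesizer alive for the duration of speech.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        // The user has to grant each app access to their Personal Voice.
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }

            // Personal Voices appear alongside the system voices, flagged by a trait.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
            self?.synthesizer.speak(utterance)
        }
    }
}
```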
And there's a laundry list of other stuff: improvements to Eye Tracking, Head Tracking, and Switch Control, and some upgrades to Music Haptics, which is the feature that makes your iPhone vibrate and tap as you listen to a song. Vehicle Motion Cues is a feature that came to iPhone last year to help with motion sickness when you're in a car, and now that feature is coming to the Mac, which is pretty cool.
Yeah, this is the thing where it shows those black and white dots that kind of follow the direction of the actual exterior car movement. I guess it kind of tricks your brain into being more okay with how the car moves. I don't get super bad motion sickness, but...
I do feel like I'm on the verge of getting it. Do you know what I mean? Like, yeah, I get hints of it. Especially when I was on the trip to Edinburgh for the wedding, and obviously I was watching an iPad and stuff, there were some turns the driver was doing that were making my stomach go a little bit. I feel like I'm on the precipice of needing this, essentially. So bring it to every single platform you can, because I do think it works.
Yep, I use it anytime I'm not driving in a car, and it works, because I get super, super motion sick, super easily in cars. And I actually have a three-hour drive next week where I will be a passenger, because the friend I'm riding with drives a manual car. And I don't know how to drive a manual car, so I have to be the passenger,
and I'll probably be doing work on that drive because it's during working hours on Friday. Unfortunately, this feature won't be available in time for that drive on the Mac, but it is something that's nice to see, and like you said, bring it to all the platforms.
But yeah, that's the new accessibility stuff. Oh, I guess one thing we should mention: they didn't leak the iOS 19 design. Oh yeah. Yeah, because when they did this, what was it, two years ago, I think, there was a change to the layout in Settings where, previously, most of the cells, so the individual rows, were full width, right? Oh yeah, I remember this. And I think it was iOS 16, I think that's when they first did Background Sounds. When they announced that, maybe two years ago, three years ago, one of the screenshots from that accessibility preview literally had the new design of the cells being inset like that. And then people were like, this isn't how the Settings app looks. And sure enough, you know, they announced it in June, and the Settings app had changed the design. It was quite subtle, nowhere near as dramatic as what we think iOS 19 is going to be. But they kind of front-ran their own announcement by mistake. In this case, there is nothing like that in these screenshots. It could just be iOS 18.6, you know? So they've been very careful this time around.
Happy Hour this week is brought to you by ExpressVPN. Check them out at expressvpn.com slash 9to5mac. Going online without ExpressVPN is like not having a case for your phone. Most of the time, you're probably fine, but all it takes is one bad drop, and then you're wishing you'd paid those extra few dollars for a case. ExpressVPN protects you and your data when you're browsing on any unencrypted network.
That could be at places like cafes, hotels, or airports. Any hacker on the same network can gain access and steal your personal data. ExpressVPN stops hackers from stealing your data by creating a secure, encrypted tunnel between your device and the internet. And it's super easy. You just fire up the app and you click one button to get protected. And it works across all your devices, your phone, your laptop, your iPad and more.
And ExpressVPN is great for travel. When I was on holiday, I just wanted to check something on one of my banking apps, and I was in this random cafe on their free public wifi. Well, no fuss, one button press with ExpressVPN on my phone, and boom, I'm safe in the knowledge that all of my browsing and data is now encrypted through them. And if you use it at home, ExpressVPN is so secure that even your own ISP won't be able to track what websites you're visiting.
So secure your online data today by visiting expressvpn.com slash 9to5mac. That's e-x-p-r-e-s-s-v-p-n dot com slash 9to5mac, to find out how you can get up to four extra months free. expressvpn.com slash 9to5mac. Thanks to ExpressVPN for sponsoring the show. Happy Hour this week is also sponsored by Insta360, a leader in 360-degree action camera tech. Now, they've just launched their brand new flagship model, the Insta360 X5. It's small enough that you can hold it with your fingertips, but it packs a big punch.
You can capture up to 8K resolution, 360 degree footage with enhanced low light performance for high quality, all day, all angle recording. Why would you want to record in 360? Well it gives you so much freedom and flexibility you can be in the moment capturing everything around you without having to check angles or frame shots in advance. It's perfect for action and vlogging, you just hit record.
and then later you can choose the best angles with the help of AI-powered reframing tools in the Insta360 app. The X5 even has a new mode that captures flat and 360 video simultaneously, generating an instant MP4 for quick sharing, plus you have the full 360-degree footage saved for later. And it's more durable, with tougher optical glass and user-replaceable lenses, and it can record for up to 3 hours on a single charge.
Able to capture everything from immersive POVs to unique third-person shots, the X5 is every camera you'll ever need. You can even use the clever invisible selfie stick accessory to capture seemingly impossible third-person angles. The pole just disappears from the 360 footage. So, to get a free 45-inch invisible selfie stick worth $24.99 with your Insta360 X5 purchase, head to store.insta360.com and use the promo code HAPPYHOUR, available for the first 30 purchases only.
For more information, be sure to check out the link in the show notes. Thanks to Insta360 for sponsoring the show. I never thought we'd see this day at this point: next-generation CarPlay. Almost three years, not three months, three years to the day since Apple announced it at WWDC 2022, Apple has finally shipped next-generation CarPlay. And the deadline was the end of 2024, right? Because they said, by the end of 2024. And that came and went.
It was June 2022 when they announced it, and they didn't say much. Then they said at the end of 2023 the first carmakers would announce their cars with next-generation CarPlay, and that was when we found out Porsche and Aston Martin would be the first two. And then those cars were supposed to ship by the end of 2024. That didn't happen. But now, at least for Aston Martin, the day has come. It's now called CarPlay Ultra. It's somewhat different from the next-generation CarPlay design that we first saw at WWDC, but the spirit is there, because it takes over all the screens in your car, not just the center screen. It ties into things like climate control. What else? Like your tire pressure monitoring system. All the gauge clusters are CarPlay. Even driving modes, like sport mode or eco. Driver assistance. Yeah, like the radar cruise control stuff. That's all visualized in there.
The visual interface, if you look at the mockups, or I guess these are real images at this point, of the Aston Martin in Apple's press release: the center screen interface looks largely familiar. On the dashboard screen, that's where you can see your Apple Maps if you have navigation going, an upcoming calendar event, the now playing controls. But then, as you swipe, you can see them go through different screens on the center screen interface, and there's a dedicated screen for widgets, a dedicated screen for climate controls, seat controls, things like that. Then it also extends to the instrument cluster display, and that's where you can see more Apple Maps stuff, more now playing stuff.
All of the design language is consistent, and designed, I think they said, in partnership between Apple's designers and Aston Martin's designers. You pointed this out to me before we started recording, Mayo, but the reason you shouldn't get too upset that this design maybe isn't as ambitious as what Apple showed at WWDC is simply that this Aston Martin car they're showing doesn't have the wide, panoramic, full-width screen, the big dashboard-spanning screen thing. Which is, you know, the mock-up that Apple kept showing for next-gen CarPlay. They never specified which carmaker that was, right? It was just this mock-up of a car. Yeah, but it had full-width screens, and it had, like,
you know, the speed gauges and the instrument cluster with a map underlaid or superimposed on them. And then on the side, you'd have a music widget, you'd have weather, you'd have calendar, and that was all stretching along the width of this massive screen. As far as I can tell, that kind of design is possible to achieve in next-gen CarPlay, but
you need a car that has screens in that configuration to actually show it. In this case, with these Aston Martin cars, you've got an instrument cluster display and you've got a standard rectangular infotainment screen, right? This design, what they're shipping in these Aston Martin cars, is basically what they showed in the Porsche renders from a year or so ago, because the Porsche cars also just had basically two screens. I don't think anyone they've partnered with yet actually has a car that looks like the mock-up that Apple made. But I don't think it's because the CarPlay Ultra system can't do it; I just think there isn't a vehicle yet that has that screen layout. The mock-ups from Apple look a lot like some of the new Lincoln cars. I can't remember the exact one, they just announced the Navigator, maybe, but one of the higher-end Lincoln cars has a very similar panoramic screen like that. And right now it runs its own software for its infotainment system, with CarPlay on part of the middle of the screen, and then Ford supports the instrument cluster part of existing CarPlay. So there are cars with that screen design now, but yeah, nothing with next-generation CarPlay, nothing in an Aston Martin or Porsche. Yeah.
Yeah, the incredibly wide range of manufacturers to choose from, which is actually only Aston Martin, because we don't know when the Porsches are coming out. And the Aston Martin cars are like 200 grand. So, you know, you're talking exclusive, exclusive. The press release did mention that Apple is also working with brands like Hyundai and Kia to bring CarPlay Ultra to their drivers, but it did not have a time frame of any description, so we could be waiting years for that to actually materialise. And the thing with adoption from automakers is, Apple showed a slide at WWDC 2022 of brands that they said were committed to supporting next-generation CarPlay. Those brands were Land Rover, Lincoln, Audi, Volvo, Honda, Nissan, Ford, Porsche, Jaguar, Acura, Polestar, Infiniti, and Renault.
Mercedes-Benz was also on that list, but then, I think it was last year, they dropped their plans to support next-generation CarPlay. And now we have, like you said, Hyundai, Kia, and Genesis. Aston Martin also wasn't on that list from WWDC, but they are signed on to support next-generation CarPlay at some point. So what we have today is CarPlay Ultra coming to Aston Martin cars in the United States and Canada starting today. And they said it will come out as a software update to existing vehicles. So if you're a Happy Hour listener with an Aston Martin, you're going to be very happy. And if you are, find a way to contact me and we can meet up and I can check it out. I'd love to do that. And you'd better be subscribed to Happy Hour Plus. The widget screen, right,
which is available through the infotainment screen, looks very similar to StandBy mode, where you have the big square and rectangular widgets side by side and you can swipe between them. It does raise a question for me, though: why do you need CarPlay Ultra to have rendered widgets? Couldn't they have just put that in normal CarPlay? That's what we've been saying all along. Yeah. All I want is a weather widget, you know? And by the way...
The 2022 render had a split screen. The infotainment screen was so big, so it had the apps, and then it had a split, almost like an iPad split view thing. And it had the weather app on the right-hand side.
That kind of layout isn't shown here in any of these Aston Martin things. And the weather design is like a black and white widget, right? Like you'd get in standby mode. Whereas the thing they showed in that render was like fully rendered with those animated backgrounds and stuff. Like it was actually like the standalone weather app. But that doesn't require integration with the car, right? You know, you could just render the weather app in the CarPlay. Yeah.
CarPlay doesn't have it, and CarPlay Ultra, at least in this current Aston Martin version, doesn't have it either, but it does have this widget screen, which just looks like the StandBy widgets projected into the car. Useful, but it's not like... The thing that CarPlay Ultra really gives you is the integration with the car data, and
they haven't really explained how this all works, but it still seems to be all rendered by the phone. It just seems the phone receives data from the car and then quickly updates its renderings, which then get transmitted back to the screens of the car. Because Top Gear have an exclusive hands-on video with CarPlay Ultra that's out, so you can see it in action. And as the car boots up, you can use the car without the iPhone being connected, right? So it has its own set of gauges and its own instrument cluster, and then CarPlay kicks in and replaces it with the Apple-rendered version. But it's not like Android Automotive, where the operating system of the car is Android. Yep. CarPlay Ultra is not that. It's still...
The car is completely independent. If you bring the phone, it just then projects onto the instrument cluster or any screen that's in the car. And it receives data from the car, like the current speed and the mileage and the range.
And you can control, like, the radio; there's a new radio app in CarPlay that controls the actual radio of the car. And then they've got all the visualizations for the climate control and those sliders. On the infotainment screen, it puts the temperature in, you know, the bar on the left of CarPlay where you have the three recent apps; well, now it puts the climate control for the driver's side there, and then you look on the other side and it's got the climate control for the passenger side.
So there's a lot of integration, and it just means you'd need to leave the CarPlay experience less. And even for stuff that's super advanced, like I think they show on the Top Gear thing, there's a menu for the Bowers & Wilkins sound system. It has special options for, you know, like a focus mode, or specific Bowers & Wilkins settings, where that hasn't been rendered in CarPlay and Apple hasn't made a template for it. So in that case, it just kind of punches through. They basically render the old UI inside of CarPlay, so you get the sidebars, but the middle section is basically what you'd see on the old infotainment system.
Almost like if you imagine a green screen: you've got the sidebars, the middle bit's green, and it just gets replaced by the old car UI. So they've got accommodations for some of the more esoteric car features. But I think the general idea is you can have a nice-looking instrument cluster layout, an integrated map, and the widgets and your music and everything right there in the center.
And you just have to drop out to the rest of the car's UI far less often, because the most common controls that you would use in a car are now also going to be available through the CarPlay experience. And in the case of this Aston Martin example, Aston Martin is not one of the automakers who has abandoned physical controls in favor of putting everything on-screen, so they do have dedicated buttons for quickly adjusting the temperature and circulation and hazard lights and defrost; all those buttons are still dedicated. Yeah, and I think it can have on-screen controls as well. Yeah, I know. What I was saying, though, is it's going to be interesting: if you take a car that has entirely on-screen controls, there are going to be other accommodations that you have to make via next-generation CarPlay, I would assume.
Yeah, take advantage of the UI. Yeah, more UI, yeah. If you look at, I can't remember if it was last year's, I think it was last year's WWDC session, they had the future of CarPlay and showed off more of the templates. There was quite a wide variety of stuff. There was one that almost looked like the Tesla layout, where it's white and it has the big 16:9 screen in the middle, and a split where you have navigation on the left and other controls on the right. So I think they've at least shown quite a lot of different designs in mock-ups.
But obviously you're limited by what the physical features of the car are. And in this Aston Martin, you've got two screens, right? They're basically normal-sized, and then a load of physical controls. What we're really looking forward to is for them to say, hey, bring some of the CarPlay Ultra features that don't require car integration to other cars as much as you can. And then you also want to see cars that fulfill the vision of the 2022 mock-up, right? Where you've got super wide screens and everything looks really pretty and they've got gradients left, right, and center, and it really feels more distinguished.
This is fine, and it obviously is doing stuff that current CarPlay couldn't do in a million years, but it's nowhere near as exciting as what they first demonstrated. I'm just glad we have more details on it. Because when the 2024 deadline came and went, they did put out a statement saying they were still working on it and it was coming with a few automakers soon, so we knew it wasn't completely dead in the water. But to actually have not only an announcement showing off the interface in a specific car, and a video showing off the interface in a specific car, but to have it rolling out starting today? I'm just glad that has happened, because they very well could have made this announcement today and rolled it out later this year. But nope, it's available today. And, you know, they also make this big deal about the templates being somewhat unique to each carmaker.
So in the case of the Aston Martin, the speed gauge and the engine temperature gauge have the Aston Martin logo on them, and it says "hand built in Great Britain" around the dial. So that's how they're trying to infuse a bit of the Aston Martin branding into it. Far more subtle than what I think the Porsche renders had, where it was like tartan colours, right? Do you remember that? It was a lot more loud to the eye. So I feel like, at least on the Aston Martin version, they've been quite restrained, which is nice. But in the Top Gear hands-on,
there were the dials that had the Aston Martin logos on them with the, you know, bezel text and stuff. But then you can just swipe between layouts, and they had other dial designs which were much more abstract and didn't seem to have any Aston Martin influence at all. They have one which is just lines; you can have a progress bar for speed, for instance.
So, let's imagine a world where CarPlay Ultra is on a load of cars and they all have their own custom themes, and you really hate what your carmaker did with theirs. It seems to me Apple's still going to give you options where you can have basically just the Apple instrument cluster. You don't have to use the one matching the car; that'll just be one of the options in the set.
And in that hands-on, they show one instrument cluster design, which isn't in the press release screenshots, that's like a thin strip on the bottom, maybe 20 percent, which has all of the information, and then the top is just a map.
So it's not quite as pretty as the 2022 renders, where the gauges and the map are overlaid on each other. But there is a layout where it has the navigation map full screen and then just a very small bar at the bottom with the speed and the different status indicators. Oh yeah, that looks nicer, I think. Yeah, the kind of headline photo they show in the press release just has the map in a rectangle, right? Yeah,
which, for the record, with current CarPlay, CarPlay 1.0, it does support putting maps in the instrument cluster. It's just very few automakers have taken advantage of it.
BMW has it. I rented a Polestar, a 2 or a 1 or a 3, one of their cars, one of their EVs, and they support it. And I wrote about it on the site at the time, because it's really good: it looks a lot like that example in the Aston Martin video that you just talked about, where it's full width and kind of blends into the gauge cluster nicely. So seeing the initial look in that main image from the Apple press release, I was kind of like, oh, that doesn't look as good as what you can theoretically accomplish with current CarPlay. Yeah, because the way it's laid out there, it kind of looks like they're avoiding the fact that the iPhone is also rendering the instrument cluster and the dials, because it's like, oh, this is the old car UI and we've slapped our bit in the center. But no, the iPhone's rendering all of it. It's just that this particular layout has square widgets in the middle.
So we know that Porsche will presumably be the next automaker to roll out support for CarPlay Ultra. I keep wanting to say next-generation CarPlay, but it has an actual name now: CarPlay Ultra. Do you want to place a bet on when it'll come to a third brand? I think we'll see it in a third brand, hopefully a more reasonably priced brand, before the end of next year. Not before the end of this year, you think? I feel like if it was the end of this year, they might have mentioned it in this press release. I mean, I'm maybe being too pessimistic, but it kind of feels like it might be Aston Martins out now, then Porsche later in the year, and then one of Hyundai or Kia next year. Do you know what I mean? I do have a Hyundai car. It only has an infotainment screen; it doesn't have an instrument cluster screen.
I hope it comes to my Mach-E. It's a 2022 and it has the big portrait-oriented center screen, and CarPlay doesn't do a great job of taking advantage of that big screen. Seems like the perfect candidate for CarPlay Ultra. And Ford is on the list, so I'm kind of hopeful. But if I am looking for a new car, which I'm not doing anytime soon, it does give brands that support CarPlay Ultra a leg up, just like right now having CarPlay gives a brand a leg up. If I needed one factor to choose a car, I would choose the car that has CarPlay over the car that does not have CarPlay. So in a few years' time, I will probably choose a car that has CarPlay Ultra, not just regular CarPlay.
It's like all CarPlay features: they announce it and it takes five years to gradually actually be in cars. Yeah, there's absolutely no reason to ever be optimistic about the rollout of a CarPlay feature, so you're right to be pessimistic about CarPlay Ultra coming to a third brand. But yeah, I'm glad they've announced it, I'm glad it's actually real now, and hopefully some more affordable cars soon.
Happy Hour This Week is also brought to you by Shopify. Check them out at shopify.com slash happy hour. The idea of starting your own business can be daunting. There's so much suddenly on your plate that you have to deal with beyond just having that great product.
When we sold wallpapers through 9to5Mac, even though we had a website, offering a store and checkout system was another ordeal altogether. Doing everything on your own was a non-starter. Instead, we used Shopify and it made everything so easy. Shopify is the commerce platform behind millions of businesses around the world and powers 10% of all e-commerce in the United States.
Get started with your own design studio. There are hundreds of ready-to-use templates, helping you build a beautiful online store to match your brand's style. And Shopify has helpful AI tools to help you write product descriptions, headlines, and even enhance your photography. You can also use Shopify to easily create email and social media marketing campaigns.
Shopify really is your commerce expert. They have world-class expertise in everything from managing inventory to international shipping to processing returns and so much more. So if you're ready to sell, you're ready for Shopify. Turn your big business idea into sales with Shopify on your side. Sign up for your $1 a month trial and start selling today at shopify.com slash happy hour.
Go to shopify.com slash happy hour, all lowercase. shopify.com slash happy hour. Thanks to Shopify for sponsoring the show. Finally this week, Happy Hour is sponsored by Bitwarden, the trusted solution for password, passkey, and secrets management. Bitwarden empowers individuals and businesses to take control of their digital security, with tools that make managing strong, unique credentials simple across all your accounts and devices.
Whether you're logging in on desktop, browser, iOS, Android, or even the Apple Watch and Vision Pro, Bitwarden makes it seamless. With features like enterprise SSO integration, end-to-end encryption, and secure autofill capabilities, Bitwarden helps protect you from phishing attacks and data breaches, no matter where you are.
If you've ever abandoned logging into an account because you forgot the password, you're not alone. Over 55% of people say they've done the same, according to the Bitwarden World Password Day survey. and 60% report that managing passwords is somewhat to extremely stressful. Bitwarden helps solve that, offering options like biometric unlock, secure password sharing, and the ability to create, manage and autofill credentials, identities, and even payment info across devices.
It even supports password creation, passkeys, and secure sharing, all while maintaining zero-knowledge encryption. So check out Bitwarden Password Manager by hitting the link in the show notes or visiting bitwarden.com. That's bitwarden.com. Thanks to Bitwarden for sponsoring the show. So this week, Apple updated the AppKit documentation to tell developers about a change coming to clipboard privacy, starting with macOS later this year. Apple doesn't say it's macOS 16, but we know it's macOS 16.
And some of this is a bit over my head, Mayo, but the gist of it, as I've interpreted it, is that the clipboard privacy features on iOS, where you have to specifically give an app permission to paste content from another app, something like that is coming to macOS this year. Yeah. Yeah, so on iOS, in the old days, any app could read the clipboard, put something on the clipboard, and paste from another app without you having any knowledge of it, right? That was very convenient.
It was easy, and it's how the Mac works right now. If you copy something in any app, any other app can see the clipboard and keep it, and you even have clipboard history manager apps that keep a record of everything you copied. No interruptions, no UI, it just does it.
But then what they realized on iOS was that unscrupulous apps were reading the clipboard surreptitiously and, you know, sending the contents off to data analytics companies or stealing information. Let's say you copied a sensitive phone number and then switched to another app; it wasn't immediately obvious to the user that that app would also have access to the phone number. People just didn't really think about it, and it led to some abuse. I think mostly it was used for fingerprinting and stuff. So they were tracking email addresses and data that you'd copied and didn't really want to paste into that app; you'd just opened that app next. Anyway, it was being abused. So Apple added the system a couple of years ago where,
when you copy something, another app can't immediately paste it. The system pops up this little dialogue, like, are you sure you want to paste? Then you can say don't allow, or you can say allow. It basically stopped any third-party app from immediately getting access to the clipboard content without you knowing about it, right? And when this first shipped in the betas, it was kind of disruptive because
it was too overreaching. So if you manually opened up the little text-editing menu and pressed copy, then went to another app, opened the menu again, and pressed paste yourself, a direct user-intent action, it would still pop up this alert, and you had to press Allow Paste and carry on. They reined that in and updated the heuristics over that beta period, so now you don't really see the paste dialogue
very often. It's only if a third-party app has a custom paste button, or an app is abusing the clipboard, that it pops up, right? And there were some cases where apps would try to be helpful. So the Reddit app, for instance: if you copied a Reddit URL from Safari and then immediately opened the Reddit app, it would just navigate you to that article.
That was quite useful, but those kinds of features got removed, because it doesn't really work when you have an explicit UI every time it wants to paste. Because it wouldn't even know if there was a URL to paste without looking at the clipboard first, right? So those kinds of features just disappeared. That was kind of the casualty of it on iOS. But overall, I think it was generally seen as a privacy improvement, because now apps can't read your clipboard without you knowing about it.
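As a side note, there is a way apps can still offer that kind of link detection without tripping the alert: UIPasteboard's pattern-detection API lets an app ask whether the clipboard probably contains a URL without reading the contents. Here's a minimal sketch; the function name and the surrounding flow are our own illustration, not how any specific app like Reddit actually does it.

```swift
import UIKit

// A minimal sketch of the UIPasteboard pattern-detection API on iOS,
// which lets an app ask "does the clipboard probably contain a URL?"
// without reading the contents and without triggering the paste alert.
// Only an actual read of the string or URL value surfaces the prompt.
func offerToOpenCopiedLinkIfPresent() {
    UIPasteboard.general.detectPatterns(for: [.probableWebURL]) { result in
        switch result {
        case .success(let patterns) where patterns.contains(.probableWebURL):
            // The app still hasn't seen the URL itself at this point.
            // Reading it (e.g. UIPasteboard.general.url) is what would
            // show the system paste prompt, ideally tied to a user tap.
            print("Clipboard probably contains a link; show an 'Open link?' button.")
        default:
            print("No link detected; do nothing.")
        }
    }
}
```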
This basically means that that same system and policy is now going to come to macOS, so every app, by default, won't be able to read your clipboard. If it tries to paste, it will pop up this dialogue. And the reason Apple's telling people in advance is because a lot of Mac apps
do look at the clipboard to try and be helpful. They're not using it to abuse your privacy. They're just trying to be like, oh, you copied a link here? Well, we'll show you this thing here. Or you can paste and go in the URL bar, for instance, right? So there are API changes that have to be made, and app behavior changes that have to be made, so you don't start seeing these pop-ups in every single app every time you switch windows.
So for mainline usage, it will just be an improvement, right? It'll be like iOS. There is a concern, though, that this change may make clipboard history apps or clipboard managers on the Mac obsolete or impossible to use, because the way they work right now is they just look at your clipboard content in the background every time it changes and keep a copy of it, right?
As written in this update on the developer site, I think every time the clipboard manager noticed a clipboard change, it would have to pop up an alert dialogue that says, are you sure you want to allow your clipboard manager to see this? And you'd click allow every single time you copied something, which will probably make people uninstall the clipboard manager. So that's a bit of an issue, right? Because clipboard managers are quite popular on the Mac.
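To see why, here's a rough sketch of how Mac clipboard managers generally work today: macOS has no pasteboard-change notification, so they poll NSPasteboard's changeCount on a timer and read the contents when it ticks up. The class name, polling interval, and storage are illustrative, not taken from any particular app.

```swift
import AppKit

/// A rough sketch of how a Mac clipboard manager typically works today:
/// there's no pasteboard-change notification, so apps poll changeCount
/// on a timer and read the contents whenever it increments. Under the
/// policy described above, that background read is what would trigger
/// a permission prompt. Interval and storage are illustrative only.
final class ClipboardHistoryWatcher {
    private var lastChangeCount = NSPasteboard.general.changeCount
    private var history: [String] = []
    private var timer: Timer?

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self else { return }
            let pasteboard = NSPasteboard.general
            guard pasteboard.changeCount != self.lastChangeCount else { return }
            self.lastChangeCount = pasteboard.changeCount
            // This read is the step the upcoming privacy prompt would gate.
            if let copied = pasteboard.string(forType: .string) {
                self.history.append(copied)
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```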
And even stuff like, if you copy something and then you use one of those third-party Command-Space alternatives, if they see something in your clipboard, they'll immediately suggest that you can go straight there by pressing one button.
All that stuff becomes basically impossible when they can't even read the clipboard without showing this dialogue every time. So I think on the Mac there are just more use cases where it was being used for legitimate reasons, and people are worried that now all of that will be thrown away.
The only thing I might think is maybe Apple will recognise that, and they might add another level of permission in the settings where it's like, always allow, you know? And so that might allow those things to carry on. But by default, I think it's going to happen where you're going to get this pop-up. So it could work sort of like the screen recording permission prompt. Exactly. Where every so often it prompts you, it reminds you that an app has
permission to read your clipboard, theoretically. Yeah. And everyone was mad about the screen recording thing at first, too, and then they had to roll some of that back. And where they landed on that, I think is fine. Like, the gradual system, I think that's fine. Because when you think about things that people talk about when they say they can't use an iPad instead of a Mac, clipboard managers are often one of the first things that are mentioned.
Apps like Pastebot, which I use as my clipboard manager of choice. If Apple were to do something that either completely rendered those apps impossible on the Mac or made them significantly harder to use, I honestly think the blowback to that would make the blowback to the screen recording stuff seem like nothing. Those are just such power, or not even power user apps, really. Just such crucial apps to so many people's workflows. Yeah, they're just more built into how people use the Mac, right?
Alternatively, maybe Apple is adding its own pasteboard, clipboard, whatever, manager to the Mac, and you could use its option but not a third-party one, which is also not a great outcome. I think they'll probably just add a setting where it's like, always allow, and then it only reminds you every...
Four months or something, yeah. They got so much blowback to the screen recording stuff that I feel like they're going to try to be more careful. And they got so much blowback to the initial implementation of the paste feature on iOS, too. And I think any sort of feature like that on the Mac will be even more disruptive than it was on iOS. So, we'll see.
Finally this week, a few iOS 19 rumors, courtesy of Mark Gurman at Bloomberg. Yeah, these are iOS 19 features that Apple didn't officially announce this week. First, in Power On last week, he said iOS 19 will add a new feature that makes it easier to log in to Wi-Fi networks at, like, a hotel or an office building. So, using the hotel as an example.
Usually you connect to the hotel's Wi-Fi, and then it pops up a, what's it called, a captive portal? Captive, yeah, something like that, where you have to enter, typically for a hotel, your last name, maybe your email, and your room number to get onto the Wi-Fi network. Mark says that with iOS 19, you'll have to enter that information one time, and then it'll be saved and synced to your other devices, so you don't have to enter it every time.
How I'm reading this, and I do think there are two ways to interpret what he's saying here, but how I'm reading it is that you connect to the Wi-Fi on your iPhone. You enter your name, you enter your email, you enter your room number. Then you open up your MacBook a little bit later and you won't have to enter that information all over again. Yep.
But it only works for that same Wi-Fi network. It's not permanently saving it for every time you connect to a public Wi-Fi network like this. Yeah, I think that's right. Yeah. It basically just solves that situation you've had, right, where you're at a hotel or something and you get your laptop open, you have to type in the free Wi-Fi login, you put in your room number or something, and then you press done.
And then you open your phone, and you try and connect to the Wi-Fi, and it doesn't have any clue who you are, and you have to type it in again. So I think how it might actually work in practice is that
it might sync the Wi-Fi information to the phone after you type it in on the laptop. And then basically the hotel wouldn't know that you were then connected on the phone. It would just appear to be the laptop, right? So you don't have to type it in again. That's how I kind of assume this feature will work in practice. But either way, even if it's just remembering the form that you filled in and pasting it back in for you, the end result's the same: less typing for you.
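Purely as speculation about the "remember the form" interpretation, one plausible building block is a synchronizable keychain item, which is what already carries passwords between devices via iCloud Keychain. The sketch below is hypothetical; the service label, account scheme, and data format are all made up.

```swift
import Foundation
import Security

// A speculative sketch of the "remember the form" interpretation:
// store the captive-portal details as a synchronizable keychain item
// so they follow you to your other devices via iCloud Keychain.
// The service label, account naming, and payload format are invented.
func saveCaptivePortalDetails(network: String, fields: [String: String]) -> OSStatus {
    guard let payload = try? JSONEncoder().encode(fields) else { return errSecParam }
    var query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "captive-portal-login",   // hypothetical label
        kSecAttrAccount as String: network,                   // e.g. the Wi-Fi SSID
        kSecAttrSynchronizable as String: kCFBooleanTrue!     // sync via iCloud Keychain
    ]
    SecItemDelete(query as CFDictionary)      // replace any earlier entry for this network
    query[kSecValueData as String] = payload
    return SecItemAdd(query as CFDictionary, nil)
}

// Usage: remember what you typed into the hotel's login page for this SSID.
let status = saveCaptivePortalDetails(
    network: "Hotel-Guest-WiFi",
    fields: ["lastName": "Miller", "room": "412"]
)
print(status == errSecSuccess ? "Saved" : "Keychain error \(status)")
```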
The area where this gets a little bit messy is a lot of hotels, or if you think about an airline or something, they might let you have one device connected to the internet for free. But to connect a different device, you have to pay $19.99 or something. So if what you're describing is Apple masking it to the network so it thinks it's the same device, theoretically, that could get you around the paywall. Yeah.
But otherwise, you might still run into that problem where, when you connect on a new device, even if that information is saved, they're going to say, hey, we need your credit card number to connect this device to the network. Yeah, so on the paywall thing, I believe some Android phones, you know Personal Hotspot?
They let you hotspot Wi-Fi. I saw this. Yeah, yeah, yeah. So you connect like on your phone to the Wi-Fi, but then it hotspots the Wi-Fi to your laptop, so you avoid the multiple device problem that way. I saw a video of somebody doing that on a plane last week or something. Yeah.
So I kind of think Apple is going to use that technology, but just frame it as the feature of, you don't have to type it in more than once, though it might have the side effect of circumventing those restrictions. This is one of those features that I never, ever would have thought of writing a feature request about. It's cool that it's something Apple is considering and that will apparently be in macOS 16 and iOS 19.
I mean, I think they've probably seen the popularity of the share Wi-Fi password feature, right? Yep. And this is just a natural extension of that to cover more types of Wi-Fi networks. I also think it's somewhat a reflection of the state of the operating systems, right? They're so mature these days that, apart from a few big initiatives,
updates to individual components are always going to be these small little advancements, because the underlying stuff has already been there for five years at this point. Then another iOS 19 feature will apparently be using Apple Intelligence to improve your iPhone's battery life.
Mark describes this as an AI-powered battery management mode for iOS 19 that will analyze how a person uses their device and make adjustments to conserve energy. It'll be pitched as part of the Apple Intelligence platform. And Apple is using battery data it has collected from users' devices to understand trends and make predictions for when it should lower the power draw of certain applications or features.
Remember a few weeks ago, a couple months ago maybe at this point, we talked about a report from Mark where he said there would be no new consumer-facing Apple Intelligence features in iOS 19, which has obviously changed.
But if you look at the features that have shipped as part of Apple Intelligence so far, taking Apple Intelligence and applying it to improving iPhone battery life might be the most practical, most useful Apple Intelligence feature Apple has shipped to date, right?
People want their iPhone battery life to be better. And if this feature is able to materially improve people's iPhone battery life, that's the thing. So what's it going to do? I mean, I guess there are some customers out there who just live their life on Low Power Mode. Which is madness to me. And it does extend your battery life. But...
It affects ProMotion, because obviously Low Power Mode is tied to the screen refresh rate. Right, that's the big thing. Some of the other things that Low Power Mode does, I think a lot of people just don't notice, right? Like the fact that your Mail app refreshes less frequently or something. So there's potential there to get some easy battery life gains. But if you're a power user of the phone,
I don't think an AI model can somehow optimize your battery to do better than that. It's more for the people that are more casual users, where it can see your device use and be like,
oh, you know, this thing that right now we update once every 10 minutes? Well, you don't actually ever open this app, so we just won't bother updating that, you know? Mail is a good example of that, right? Yeah, yeah, yeah. Somebody might only open the Mail app once or twice a day, once in the morning and once at night, and they don't need it constantly doing things in the background.
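None of the real implementation is public, so purely as a hypothetical toy illustration of the kind of heuristic being described, usage-based refresh throttling could look something like this; every name, threshold, and interval is made up.

```swift
import Foundation

// A hypothetical toy sketch of the heuristic described above: scale how
// often an app gets to refresh in the background by how often the user
// actually opens it. This reflects no real Apple API; the type, thresholds,
// and intervals are invented for illustration.
struct BackgroundRefreshPolicy {
    /// Baseline refresh interval when an app is used heavily (10 minutes).
    let baseInterval: TimeInterval = 10 * 60

    /// Returns a suggested refresh interval given average opens per day.
    func suggestedInterval(opensPerDay: Double) -> TimeInterval {
        switch opensPerDay {
        case ..<0.5:  return .infinity        // effectively never refresh
        case ..<2:    return 12 * 60 * 60     // refresh a couple of times a day
        case ..<10:   return 60 * 60          // refresh hourly
        default:      return baseInterval     // frequent user: keep it fresh
        }
    }
}

// Example: a Mail app the user opens twice a day gets refreshed far less often.
let policy = BackgroundRefreshPolicy()
print(policy.suggestedInterval(opensPerDay: 2))   // 3600 seconds
print(policy.suggestedInterval(opensPerDay: 0.2)) // inf (never refresh)
```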
Or it watches your usage and it notices that when you're in this general area, you never connect to a Wi-Fi network, so you might as well turn the Wi-Fi radios off. Yeah, that's a good one. There's some more subtle behaviors like that, especially relating to low signal strength stuff.
So maybe it knows that on your journey to work you always travel through this area that has no cellular signal, but you never actually use the phone there, so rather than ramp up the radios to make sure you keep a signal, it just won't bother doing that part. There are always going to be trade-offs, because you can't just make free battery, right? So it'll have to be turning some stuff off at times. But I think the idea, at least my reading of it, is that
it will look at your personal usage, or aggregated usage of what they know the average iOS user does, and try and find things to turn off that people aren't going to notice, essentially. There's another line in Mark's report that I think got missed by a lot of people, but he says there will also be a lock screen indicator showing how long it will take to charge your device to 100%. I've been wanting this for so long, and it's something that's been on the Mac for a while now.
Do you remember, for a couple of years, they took it out? Did they take out how long it would take to charge? I'm sure they did. I'm sure that was the story. That was the thing, right? They were like, oh, the battery life's bad, so we will stop showing you how much is left. Right, I think they stopped showing you how much is left, but they kept showing
how long it will take to charge. Right, okay, okay. Which is what Mark is saying is going to come to iOS 19, I think, anyway. Right, so if your phone's on 10%, you plug it in, and it'll say, check back in an hour and a half and you'll be done. Right. Yeah. Which is very useful if you're in a pinch. Another wrinkle associated with this and the 17 Air is that Mark specifically said this battery intelligence feature
was designed with the iPhone 17 Air in mind to prolong its battery life, but it will come to all models. So the battery intelligence will be on all iOS 19 devices, but it was kind of built with the motivation being that the 17 Air wasn't hitting the battery life targets that Apple wanted it to. So they're trying to compensate and mitigate by doing battery intelligence. Which again, if it works, has the potential to be very helpful to people, I think. But that's a big if.
And it makes my confidence in the 17 Air... less. Yeah, the past few weeks have been a real nosedive for your confidence in the 17 Air. It sounded like a great phone. I know, and it sounds less and less great as the compromises become more public.
Another line in Mark's report that is worth mentioning is that Apple is pushing engineers to ensure that this year's software releases are more functional and less glitchy. No idea what this means, and this is the type of thing that you'd think Apple would be doing every year. Like, every year you tell your engineers to make things stable and less glitchy. But it is notable that they're apparently emphasizing this in the year of the biggest visual overhaul to iOS since iOS 7.
That's the thing, right? All engineers want to make software that works. They're not sitting around making buggy software on purpose. There are time constraints, and factors outside their control, like the number of devices multiplied by the different geographies and the different peculiarities of features in different places and different setups. And, like,
Do you only have iCloud or do you have an iCloud family or do you have an iCloud this or do you have other devices? Are the iPhone and the Apple Watch syncing at the same time? Does it affect a certain time zone or a certain language? There are so many factors that obviously Apple has QA processes for.
But at some point, you've got to ship the software. So at some point, you're always making trade-offs. It's like, we've got a bug list a thousand lines long. If we didn't ship the software until we fixed all the bugs, we literally wouldn't ship the software. So every company, at some point, releases software, and then once it's released, the customers find new bugs, right? And then that gets back into the feedback loop and the pile of bugs gets even higher. So every year, like,
you know, I'm sure the managers are like, oh, you know, do this feature and make sure it's bug-free. But actually doing that is basically impossible. And so I don't know where this line from Mark's report came from. Did Federighi send out a missive in January where he's like, make sure the features are good? Or maybe there was a bit more latitude, like, if features are going to be buggy, don't ship them, right? But then obviously the trade-off is fewer features come out.
You know, the Gurman bit was just one line in there, so it's not very clear. But obviously we're anticipating a relatively big visual overhaul, at least, for iOS 19. And when they did iOS 7, iOS 7 was very buggy, right? And it was pretty clear they'd rushed out the design change and hadn't spent as much time doing QA and bug fixing.
So hopefully with iOS 19, for as big as the visual overhaul is going to be, hopefully it won't come with the baggage of the reputation of being unreliable. You know, like, iOS 7.0 was so bad it would, like, SpringBoard crash, you know, and you'd go to the black Apple logo and stuff. That was a thing all the time. All the time, yeah. Yeah, not even occasionally, it was a common thing. And it really kind of plagued that
phone, like the iPhone 5S launch. If you go back to look at the press reviews for the iPhone 5S, people are like, the phone's good, but iOS 7 is buggy. So, you know, that was a bad time. And hopefully they can do better this time around. But you never know until the thing comes out, right? I think iOS 18 is pretty good and bug-free, I would say. I can't think of too many major "-gates" or anything relating to the software over the entire last cycle.
It kind of got bad, I guess, around iOS 11-ish, because then they did the iOS 12 release, where, you know, it wasn't no new features, but there was a big emphasis on performance improvements, like reducing crashes and making apps launch a lot more quickly. I'd say since then they've done a pretty good job of maintaining that. There have been a few off years.
iOS 17, iOS 18, I wouldn't necessarily describe them as buggy releases, you know? I see people complain about iOS 18, but the amount of times I run into a major bug or a show-stopping bug on my iPhone is so slim. Like, almost never. I can't think of it happening in a long time. But I see people complain. I guess everybody's setup is different. Everybody's
phones are different. Everybody has different stuff installed. Personally, I wouldn't call iOS 18 a buggy release. No. And that's one of the things we've heard about Craig Federighi's management of the software group, that he's put an emphasis on making it easier for engineers to enable or disable features depending on how buggy they are and whether something's ready for prime time. And now he's going to fix Siri. Apparently. Just disable all of it. That's the best way to make it better.
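Apple's internal tooling here isn't public, but the general idea of gating unfinished features behind runtime flags is simple enough to sketch; everything below, including the type and flag names, is hypothetical.

```swift
import Foundation

// A generic, hypothetical sketch of runtime feature flags, the kind of
// gating described above. Nothing here reflects Apple's internal tooling;
// the type and flag names are made up for illustration.
enum FeatureFlag: String {
    case redesignedLockScreen
    case newSiriBackend
}

struct FeatureFlags {
    // In practice these values would come from a server or a local config,
    // so a feature that isn't ready can be switched off without a new build.
    private var enabled: Set<FeatureFlag> = [.redesignedLockScreen]

    func isEnabled(_ flag: FeatureFlag) -> Bool {
        enabled.contains(flag)
    }
}

let flags = FeatureFlags()
if flags.isEnabled(.newSiriBackend) {
    print("Show the new experience")
} else {
    print("Fall back to the shipping behavior")
}
```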
All right, I think that does it for this week. You can find us on Apple Podcasts, where you can leave a rating and a review. Find an ad-free version of the show at 9to5mac.com/join, with bonus content each and every week, for $5 a month or $50 a year. Send us feedback at happyhour@9to5mac.com. I am on Threads and elsewhere at ChanceHMiller. And Mayo, what about you? At bzamayo. All right. Thanks, Mayo. Bye-bye.