
Taking control of Blueprints, Voice gets smarter in 2024.7

Jul 03, 2024 · 1 hr 2 min · Ep. 159

Episode description

2024.7, Year of the Voice Chapter 7, and an amazing ESPHome update are just some of the goodies released that Rohan and Phil break down.


Chapters

00:44 ESPHome Update Entities

05:16 Year of the voice chapter 7

18:49 2024.7

20:11 Home Assistant Roadmap

29:05 2024.7 Continued


For complete show notes and more information about the topics discussed in this episode, be sure to check the notes at

https://hasspodcast.io/ha159

Watch this episode on YouTube

Support Rohan and Phil on Patreon


This episode was made possible thanks to our sponsors

Home Assistant Cloud by Nabu Casa

Easily connect to Google and Amazon voice assistants for a small monthly fee that also supports the Home Assistant project. Configuration is via the User Interface so no fiddling with router settings, dynamic DNS or YAML.


Hosts

Phil Hawthorne

Website

Smart Home Products

Twitter: @philhawthorne

Buy Phil a Coffee


Rohan Karamandi

Website

Smart Home Products

Twitter: @rohank9

Buy Rohan a Coffee




Transcript

Hello, and welcome to the Home Assistant Podcast. My name is Phil. Joining me, I've got Rohan. Hey, Rohan, how are you doing? Hey, good. As usual, this episode of the Home Assistant Podcast is sponsored by Home Assistant Cloud by Nabu Casa. Easily access your local Home Assistant instance remotely for a small monthly fee that supports the Home Assistant and the ESPHome projects.

Configuration is done via the user interface, so no fiddling with the router settings, SSL certificates, or any YAML. All right, Rohan, 2024.7 is here. I think this will be a decent episode today because there's a few other things happening as well. So before we get into 2024.7, I want to talk about the ESPHome update that just landed. So ESPHome has just got support for update entities.

Now, on the tin, this doesn't sound too exciting, but when you drill down into it, ESPHome now has the ability for each device to have its own update entity, to the point where a project author can publish a URL and say, I'm going to put my ESPHome firmware at this URL, and when there's an update available, the ESP unit can go out and download that and install it itself. Yeah, exactly, which is great, right? It's that manufacturer-controlled firmware push kind of thing, right?
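For anyone curious what that looks like on the device side, here is a rough ESPHome YAML sketch; the manifest URL is just a placeholder and the exact keys can vary between ESPHome versions:

```yaml
# Sketch only: the manifest URL is a placeholder published by the project author.
http_request:   # lets the device fetch firmware over HTTP(S)

ota:
  - platform: esphome       # keep the normal ESPHome OTA path
  - platform: http_request  # allow OTA installs pulled from a URL

update:
  - platform: http_request
    name: Firmware
    source: https://example.com/firmware/manifest.json
```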

So if there's some new functionality that they put in, because a lot of times people just shove hardware into their product and say, you know, software to follow kind of thing, right? Which is fine, but that also means you need to keep an eye on the specific firmware builds for whatever product you're using. This just kind of makes it so, if they choose to use it, it puts it on them.

And as they release it, they can go push it out and update everybody's everything, right? There's actually another use case that I like with this, which is, let's say, even for yourself, if you have multiple instances of Home Assistant across different places. So, for example, I have Home Assistant with ESP Home running at my house.

Maybe I have it running at, I don't know, my parents' house, or if I have a rental property or summer home or something like that, like a cottage or something like that, I can have it there as well with multiple instances of ESP Home and stuff. So I don't actually need to go into each instance. Maybe I don't have a VPN between those different places, whatever, right? They're just standard internet connection kind of thing.

You can push that update out once and it'll kind of update across the board, right? So as those devices come up. That's nice. That's another nice thing there too, right? I mean, even in my house right now, I have two ESP32s that I use for BLE proxy. I have to update each one individually, right? So if there's an update available, I have to do one, I have to do the other, whatever. Here, if I just had one.

One common config: just take it, push it out to both, and they'll both just get updated together, right? Rather than having to go in and click it a bunch of times. I mean, we've talked to so many folks on the podcast where they've had a ton of these ESP32s or ESP8266s scattered across their house. Sometimes those devices serve the same purpose, right? So like in my case, the BLE proxy. Great. Take it, push it out to all of those.
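The kind of shared config being described here can be done with ESPHome packages; a minimal sketch, with the file name and contents made up for illustration:

```yaml
# In each node's YAML: pull in one shared package (file name is hypothetical)
packages:
  ble_proxy: !include common/ble_proxy.yaml

# common/ble_proxy.yaml: the shared Bluetooth proxy bits
esp32_ble_tracker:
  scan_parameters:
    active: true

bluetooth_proxy:
  active: true
```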

So I kind of really like that aspect of it too. Yeah, it's totally, yeah, very nice. Yeah, I have only a handful of ESP devices. I think the main one I'm using it for is my IKEA desk. So yeah, I'm going to update and try and get some of this new fun stuff happening in it. Yeah, and it's funny that if that's the main one you're using, that's... that's the one that irks me the most with the BLE proxy.

Again, it's a great integration, but it's the way the desk has implemented the BLE stack: it needs to connect first, so it's actually two-step. You've got to connect the thing, and then you've got to connect the desk, and then you've got to actually do the task of raise or lower or whatever. And for some reason, that just irks me so much, that it's just not instant. Instant, right? Yeah, you can have an automation, click it, whatever, right?

Do click this and then do that kind of thing, but I don't know. It needs to be more seamless. All right. Another big thing that kind of happened earlier was Home Assistant's Year of the Voice Chapter 7. So, yeah. Year of the Voice, the year is over, but the Year of the Voice is not over, right? It's still happening. That's right. microWakeWord became a collaboration partner of the Open Home Foundation.

So, what that is, is it's a wake word engine that's going to run on different microcontrollers, right? So, specifically in this case, it's made for ESPHome, but it's available standalone too, if for whatever reason you want to use it somewhere else, you can totally do that. Kevin, who created openWakeWord, joined Nabu Casa, so he'll kind of focus on voice and microWakeWord in ESPHome. It's so good to see people that are dedicated to these things, right?

And then Nabu Casa being able to come in and go, cool, you want to do this? Let's do it full-time, let's just pay you and away you go. It's getting to the point now where I think I need to get on this local voice bandwagon, right? And start experimenting with it, because I've always been holding out for timers. And the fact that they are here is pretty good. Um, I just need that Echo Wall Clock somehow hacked and integrated. Um, then, then I can be away. So, so let me, let me ask you, Phil.

I mean, so you want to, you want to get into the voice. You want to get into the local voice. Do you want to start, um, doing things like integrating in, um, Ollama or ChatGPT or, or, or Google Gemini or one of those as well? Or do you not care for that? Like, do you, how do you use your Echo at home? Like, do you guys say, hey, what's some, I don't know, random... how, how do I tighten a screw? Like whatever, do you ask those kinds of things, right?

We never ask any of those, like, logical, you know, like who is the president this year or whatever questions. All right. Um, our only use case is music. That's a really high use case, right? Um, like playing music. Yeah. Um, especially for little ones, right? Like bedtime. Okay, put some, um, kids' music on for bed. Right. Um, and timers: my daughter, who's three, is just learning how to, um, tell it to put a timer on. Right.

I think today she said, um, sort of half of her name and said, um, set it for 15 minutes and the echo device was smart enough to, you know, recognize that she wanted a timer and filled in the gaps and did set a timer for 15 minutes. Right. Um, so yeah, that, they are the, they're really the only two things you do on the thing, right? We don't use shopping lists. Um, or if we do, it's really just a reminder. Oh, interesting.

Um, and I think the, I think I've read somewhere this month that the Amazon Echo is now restricting, um, shopping lists anyway. Like I think the two apps that we use, they have to rewrite their integration now. So, and one of them is not going to do it. The other one is, yeah, I can't remember which one it was.

So even then the shopping list feature on the Amazon echo is becoming, um, a little bit of a meh, you know, you're not going to like, you've never been able to integrate with home assistant, which was my. It's actually quite frustrating. I just hate that whole... Even to the point, one cool thing I discovered this week, so we just got a new bed for my son because we're setting out his new bedroom, right? And there's no sensors in there at the moment.

I'm deciding, you know, oh, should I get a millimeter wave sensor, PIR sensor, how am I going to automate his room, right? All right, the downlights have got a smart switch on them, but there's no motion sensors, there's lights, you just want to use the switch.

And then I found in the Amazon Echo app, just by looking at the settings, because I needed to change the media play settings on it, that the Amazon Echo device that we have actually has the ability to have occupancy sensors and can then take action. So the Amazon Echo can use... Is that based on a camera? It does like people detection or something? No, no, it's... Because this one doesn't have a camera, it's just like a third...

It's a Gen Echo Dot, so it's the round ball that uses ultrasonic sound. So as soon as it detects someone is in the room, it can then trigger an event. So I have it so when someone enters the room, the lights turn on, right? I'm like, oh, cool, I don't need a motion sensor. What? And it actually works pretty decently, right? Like I walk into the room and the Echo Dot, you know, does its whole thing, right? Of course, it's cloud dependent, not local like home assistant.

The problem is that... The timeout on this thing is 30 minutes. So if you walk into the room, get something, the lights will turn on, no problem. But then you walk out of the room, those lights aren't turning off for 30 minutes. The lights are going to stay on for 30 minutes. Correct. Yeah. Interesting. Yeah. And there's no way to manually clear that presence. And it's actually in the instructions.

The Echo says it will stay on for, you know, the occupancy will be detected for 30 minutes. So... Wow. I know, yeah. That was just a random thing I found. So... Getting back to the voice, for me... I've never noticed that before. Yeah, I didn't know it was in settings, but there it was. But anyway, for me, for voice, yeah, the time is... My daughter is now at the age where she knows what the voice assistant is and she can talk to the voice assistant.

Everyone sort of loves her in the family. My daughter assumes that the assistant is a her, right? Like, oh, why did she... why did she turn off the lights that way? I'm like, oh, she's just being silly. We can't explain to her the whole reasoning of the automations behind the house, right, to a three-year-old. But for us, one of the big features we have in the kitchen is the Echo Wall Clock. And one I love because it's the only...

Well, it's the main way we see the time, right? It changes itself for daylight savings time. I don't have to go up twice a year. Yeah. I don't have to wind the clock hands around. But the big feature is that when we set a timer with that voice assistant, it has the lights around the dial so we can quickly say, all right, there's 10 minutes left on that timer or there's 20 minutes. You know, it's just nice there in the kitchen. Everyone can see it.

I would, yeah, I think that would be a selling point to avoid, not to mention that you have to have... a wall clock paired with an echo device for it to work. Yeah. I mean, those would be some cool features to have. But it's like, I don't know, I feel like I don't know most of those things don't exist. So I don't use like the presence detection, like those kind of things, right? It's... Yeah, and like, let's be honest, right?

Like there is far superior like presence detection than on the Echo Dot, right? Like I'm... We're going to disable that and then I can throw away that Echo Dot if I move away from... And go into local. I think you asked about LLMs and all the smart, you know, that would be nice, but I'm conscious of like processing power, like locally. I don't necessarily want to pay for another subscription for something that may not be worth it.

Like assist can do some pretty cool stuff just in its basic understanding. And I don't need a cloud subscription for that. Yeah. And I don't, I don't know, just more money, right? Either to run it locally with energy or to use tokens for cloud service. Pay somebody else. Yeah. Yeah. And that's kind of the, that's kind of a little bit of where we are right now, right? Is I don't want to pay for the energy. And it's also, it's also, you know, kind of inefficient use of energy, right?

Like, am I going to spend all of this energy, just so I can have the GPT answer that kind of thing, right? I get to me, it's like a, Hey, when I need help with coding or something like that, something actually complex. And I just want the answer. Like, that's great. Like, I don't need to do a Google search for me. Right. Whereas that's a little more, not, not that, you know, Google searches, I don't know how, how much power that consumes, but I assume not as much as, as GPU driven.

Yeah. I think I've read a blog article that someone, someone had compared, you know, how much energy a Google search takes compared to a GPT query or whatever prompt. But yeah, very off topic. Yeah. So, yeah, I mean, there's, there's, there's a few things coming to this to this releases too, right. With, with voice. I mean, the timers we talked about that, right. So you're going to be able to say like, Hey, turn off the light in 10 minutes kind of thing. Right.

So it can actually start combining those kinds of things. It doesn't do alarms yet. It doesn't do alarms yet. But depending on what kind of connectivity it has, so if you give it some LLM connectivity or something like that, you can probably say, hey, turn on a timer for... Actually, I don't even know if you need LLM connectivity for this. You might actually be able to do it natively, but you can say, turn on a timer for 8 a.m., right?

And then it'll say, okay, I'm going to turn on a timer for 10 hours and 43 minutes, whatever, right? And then kind of go off there. So it's a little hacky way of kind of doing it. So it's not truly like that alarm experience. But which I actually don't mind that because it's like... I'll sometimes use alarm and timer kind of interchangeably. So that means it's like, you know, set a timer for 8 a.m. But what I mean is set an alarm for 8 a.m., right?

And I know I'm not the only person that does that. So in 2024.7, it'll also allow LLMs to access certain approved scripts. So that lets you go in and say, okay, like, for example, hey, I want this LLM to only do X, Y, Z in these situations, right? So, which is nice. So now you can be a little more selective, kind of like what I was talking about earlier, right? Do I want some stuff going up there or not? And media control as well.

That's also one of the big things where it's kind of like, hey, you know, we want to say, play the next track, play the previous track, something like that. I think some of those things actually came in a previous release. I think you could do next track, but you like previous track wasn't there yet. So I think this one brings in like previous track and a couple of things like that. Right. So that's part of 2024.7 right now, I think. And so that should be... It's interesting.

I'm very intrigued with voice. I'm very intrigued. I mean, we've had Mike on here a couple of times. Yep. And, you know, it's funny because those are some of our most watched episodes, right? At least on the YouTube stats, which tells me that people are actually interested in the topic as well. So, bring on the voice hardware. I want to say it's Onju Voice, but it might be Onu Voice, I don't know. Onju Voice, on there.

So that's a custom PCB that you wire up to replace the Google Home internals, basically with an ESP and with microWakeWord, et cetera, et cetera, right? So it'd be kind of cool. I'll be very interested to see how that goes because that would look nice in the home, right? And give you the flexibility of having local control. Yeah, well, that's exactly it, right? I don't know if it's there for every form factor.

So you can see kind of behind me, for those of you in YouTube, I've got a little regular Echo, like the cylindrical ones behind me there. So again, it's not like I'm going to be ripping those apart and doing that. At least not yet. Maybe at some point they'll come out with it for it. Right now, I think it's just for the Google Home Mini or the Nest Mini, whatever they call it. Now, maybe they have another form factor.

I haven't looked into it in a couple of months, but yeah, voice, it's exciting. Very nice. All right, let's get to the main event, 2024.7. So the first cab off the rank we have today: if you're using the new sections layout dashboard, which is still marked as experimental, there is now the ability to dynamically resize the cards on those dashboards. And I've got to say, I tried this out. I'm not using sections myself yet, but I did want to see how the UI works for that.

And clearly, like for me, and I am running the 2024.7 beta, so it was a little bit buggy, but I really like the intuitiveness of it, right? Like you could just go in, edit the card, and there would be another tab, I think it's visualization or... Um, however you, whatever the name of it is, click that. And then you can have, you know, like a grid and you can pull the drag the card down, drag it across. Um, and it works, it, it works really nice.

So yeah, kudos to the team for really, you know, um, going forward on that roadmap item and bringing that to us. Yeah. And, and, and the dashboard is one of the focuses this year, right? So, um, I guess something we didn't talk about earlier, uh, Phil, from a, from a headline perspective is, um, the roadmap as well. Right. So, so I'll just pivot to that for a second. Um, so Home Assistant actually has started putting out, uh, roadmaps, right?

So, yeah, I mean, so if you look at the actual, uh, roadmap, they, they're doing a, they did a 2024 mid-year update. Um, and the roadmap actually shows, okay, right now. And, and when they say, now they pretty much mean like this release, right? Like, um, so like assist capabilities out of the box that we just talked about. So it's just like timers, reminders, music controls, while those things are here, right. Um, dashboards, um, make customization easy.

Um, so like sections, better tile cards, drag and drop those kinds of things. Right. So that's kind of here, mostly here. Um, and then from an automation perspective, we'll talk about that a little bit, but a lot of that's here too. Um, but coming up, kind of what their, what their focuses are, are making the default dashboards more useful and more relevant.

Right. So I think the idea is today, even, I mean, we've obviously seen this transition to Lovelace, um, starting to move things to the UI and, and, and there's mixed opinions on that. Right. People, people love it or people hate it. I'm on a camp that I, I, I like it quite a bit. Um, but people say, Hey, you know what? I'm, I'm being, I have control taken away from me. So on and so forth. Um, well, you can, there's still a ton of stuff. You can do in YAML, right.

Um, if you choose, um, just again, bringing those entities in and stuff like that will be in the UI, but. And then we've talked about that ad nauseum on the podcast, right? Just to say, hey, you know what, this is, it's kind of trying to get, make it more accessible for everyone. So now they're doing that from a UI perspective as well, right? So there's, again, there's certain things where it's, you know, it's still built by technology folks for technology folks.

And they've been trying to move away from that for quite a while now, where they're trying to build it for everybody, right? And I look at it as, you know, can my mom just pick it up and use it, right? And like, can we, can she do that kind of thing and say, okay, great, I built a home automation platform, right? She can probably do, not that she's tried, but she can probably do it with the Amazon Echo app or like, or, you know, Google's home app, right?

Like those kinds of things, or even with like the HomeKit kind of app where it's native and it's there and it's easy enough to use, right? It's like, it's. It's kind of a little more wizard driven and those kinds of things. A scan is QR code added here that, and it's there building an automation is, is, you know, if, if this and that kind of style, right? And a lot of that's here today, but it's not, it's still not as intuitive UX wise.

So user experience wise, and even things like making assist easy to start with and making easier, making automations a lot easier to create as well, right? So they're actually going to. They're going to go through the automation engine and look at like, take a step back and say, okay, how does the automation engine work today and how should it work? So the idea is make it simple for everybody, but also give, keep giving it that power that it has today, right?

Because I mean, the automation engine is, you can kind of just do whatever you want today, right? And worst case, if it does, there's not something it does, then great. You create a template behind it and do whatever, right? So there's hard mode and then there's like ultra hard mode, right? Depending on what you want to kind of, how you want to categorize it, right? Some people it's easy. It's, it's not something I love doing, right? Creating templates and stuff like that.

So and then, you know, going down a little further looking they also want to start doing things like, Hey, let's improve privacy, guest access, public access, right? Which I don't know if I, I ever want to public access to my. To my instance, but guest access, sure, right? And I know there's a lot of people that do it natively today, right? Not natively, yeah, well, natively, but they'll say, hey, here's a tablet with this UI, and it doesn't have access to admin functions, right?

You could create a separate user, do all that kind of thing, right? But this is, I think they mean a little more like, again, that's still involved, right? You still have to go in and build a separate dashboard for guests and that kind of thing, right? So maybe if there's some concept of these default dashboards, sections is a great, great first step to that, right? So they say, okay, you know what? These are going to be marked as not, or like admin or user sections.

And then you have other pieces that are going to be marked as guest sections, right? Or both, or I don't know what that might look like or eventually kind of come to. But, you know, it's kind of cool when you think about that, right? So now natively, just like on like an iPad, if you pull up Apple Home, you can just say, hey, here's home. And what do you want to do on, off, blah, blah, blah, right? So even if you have that panel at the front, it's like, okay, this is what you need.

This is what you get. So it's nice to see that there's a roadmap. We can leave a link to that as well if you missed it. We'll put that in the show notes. And yeah. And again, the big theme is they're going for that home approval factor, right? So make sure everybody's able to use it, make sure it's worthwhile, right? And what I like is a lot of the stuff is they have a goal. They're not just building things for the sake of building things, right? There's a, hey, here's our vision.

Here's where we want to be. And let's work towards that. So going to that Open Home Foundation and donating these projects, so ESPHome, Home Assistant itself, and we talked about microWakeWord. So these kinds of projects, when they get donated to that foundation, what that does is it actually prevents...

the actors that are there today, right, so let's say Paulus and Nabu Casa and all of these folks, it actually prevents them from being able to say, hey, we're going to pick it up and sell this to the Samsungs of the world or whoever, right, and say, hey, like, I don't know, like maybe tomorrow Logitech is looking for an automation platform. Hey, you know, it literally stops them from selling it, right, because it's in the foundation's mandate that they're not able to do that.

So, which is kind of cool, right? It really shows me that Paulus and that entire team, that core team, really want this to be a proper open project and, you know, go from there, right? So, yeah, that's a little bit on the roadmap. I mean, I did a lot of talking. Do you have any comments there? I love the roadmap. I think that it would be, I'm ready to see what is the next roadmap. To me, this sort of seems like the standard stuff, right?

Like, I feel as though Sections is just the next Lovelace, right? And I'm ready for the next big thing, which seems to be AI and it does seem to be happening. But yeah, I'm ready for... But like, I don't need another iteration of a dashboard, right? I have, I've got a pretty good dashboard as it is. But I totally get the reasoning behind it, right? We need, for the project to be sustainable, we need, you know, to open up the project to not just technical people.

But yeah, I'm ready for some, yeah, like the next big leap and how we get there, right? So, for me, I thought, as their initial roadmap, this was just business as usual, sort of what Home Assistant is doing. I think the later column was still a bit fluffy, and all later columns in a roadmap will always be a bit vague because we don't know what later is going to be. But it would be good to see some crazy ambition on there. It's a roadmap at the end of the day.

I think there's still some stuff that they're going to... I don't think it's keeping things close to their chest, but I think there's some stuff that they want to flesh out first before saying, hey, we're going to do this, and then they don't, which I get. New features. Data tables. There's been a little bit of work that's happened in the data tables.

That's when you go to integrations, and then you click on one of the integrations that you have, or just that kind of table that shows you, hey, here's all my automations and those kind of things. That's what they call data tables. There's been a bit more work done on those. You've noticed a bit of work there that's happened throughout in the last couple of releases. Now there's the ability to customize those columns as well. You can go in and say, hey, I want to change the order.

I want to see these things, that kind of thing. That's saved in your browser. That means if you use it today on this computer or your phone, and then you go and flip it to another computer, those settings won't be saved. Because again, you might care about different things than whatever somebody else is using. Yeah, that's kind of what's happening there. I would love to see those settings saved against the user profile, not the browser.

There seems to be in Home Assistant land this default of saving things in the browser, but not the user profile, if that makes sense. Home Assistant has a pretty basic, not advanced, but good enough user profile system, right? I can set a user profile for my tablet. I can set a user profile for my wife, for myself. And why can't the user have a default?

So when they log in for the first time, they're given the user's default dashboard, not the default dashboard that they set on a per-device basis. Just little things like that always, I don't know. I just thought, I think it's like a missed opportunity. And I'm sure they've got their reasons behind it. But for me, I would love more user management around these things, right? As opposed to just in the browser. Yeah, I think there's a couple of things where it can get split, right?

So for certain things, I understand. So for example, let's say I'm using it and I have some filters set up and stuff like that. So just only show me my automations that are tagged with doorbell, right? Whatever, right? So I've got that filter down. Now, I don't necessarily need that to come back. Every time, right? So that can be tied to just, you know, get rid of that filter once the session's done or browser's closed, that kind of thing, right?

Whereas certain other things like, again, like data tables, I think I agree with you, right? Like, I want to see these things. And if I don't want to see that, like, at least that's how I use a lot of tables, right? Just give me this specific information. I don't care about this specific information, right? So great, don't show me that. And then that's kind of how I want it. Not every time. So, I mean, which I will have if I use a consistent machine, but...

Yeah, and if we're talking about making things, you know, more accessible and consistent for people, right? Like, if you have someone that does, for whatever reason, have the laptop and have the PC downstairs, and they go onto that screen, and all of a sudden the views are different, they've logged into the same system, but the views are different, and they're not saved across sessions, that just gets, you know, just becomes another pain point, right?

Like, oh, why does it look different on my computer compared to my laptop? And particularly if we're trying to get people that aren't as technically savvy into Home Assistant, it's just going to raise more questions. Like, why does my view not save in my profile, or whatever the reason is? Um, sure. Yeah. I know it's a small thing, but it's just like, yeah, it's just things like that. I think, yeah.

Um, for me, it's just the saving of views to user profiles, right? Like I think that is something that could be improved. I'd love to see that on a roadmap, right? Like a more fleshed out... I think we've talked about, you know, permissions, you know, um, uh, and access roles and all that in Home Assistant, which is a big thing, right? Like, don't get me wrong.

I don't think that's a small thing to do, but I do think eventually, once they've got the basics down with, you know, all this stuff, you know, like dashboards and all that, we do need to seriously think about, you know, how do we make it not just for the CTO of the house in Home Assistant? How do we

encourage other people in the house to use Home Assistant, and also how do we prevent them from shooting themselves in the foot by, you know, allowing the CTO of the house to restrict, you know, kids from controlling their siblings' devices and causing headaches, and, you know, uh, guests, you know, coming into the house, and just everything. Um, yeah. Yeah. Yeah. All right. Um, what do we got next in 2024.7? Uh, we've got blueprint take control.

If you're a user of blueprints, um, you will notice that when you, uh, copy and paste a blueprint URL into Home Assistant, the blueprint author will give you, you know, certain settings and fields that you need to enter. Once you've done that, though, you don't really get the ability to change much else. Um, it's whatever the blueprint author has given you. Uh, sometimes it's, you know, like, oh, if the blueprint could just do it this way, um, I could use it.

Um, there is now the ability for you to take control of a blueprint. And, yeah, basically what will happen is you have got your blueprint, you will select a button, you know, take control. That will then convert the automation or script or whatever it is into its own standalone automation or script. And then it will basically unlock everything. So anything that the blueprint author has put in that was previously locked is then unlocked, and you can just edit it to your heart's desire.
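To picture what gets unlocked, here is a rough sketch of an automation that still points at a blueprint; the blueprint path and input names are made up for illustration. Taking control inlines the blueprint's triggers, conditions and actions into a plain automation you can edit freely.

```yaml
# Before "take control": the automation only exposes the blueprint's inputs
# (path and input names here are hypothetical).
automation:
  - alias: Hallway motion light
    use_blueprint:
      path: example_author/motion_light.yaml
      input:
        motion_entity: binary_sensor.hallway_motion
        light_target:
          entity_id: light.hallway
```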

So that's like something that seems obvious, like, yeah, is now here. Yeah, I feel like blueprints are sometimes way too rigid, right? Yes, exactly. You know, I like the idea that, you know, Phil made this blueprint that does X, Y, Z, but I kind of want to modify it for my specific use case. But you've done 98% of the work for me. Maybe I just want to add another condition or something like that that's not already incorporated.

So, yeah, that's to me, that's kind of cool that that was my exact. It's not kind of cool, but it's kind of obvious that that should have been there for. Yeah. And that was my exact reason for not like not really getting into blueprints was because like, oh, yeah, I'd want to tweak it a little bit and you can't tweak it. Right. Or it's a whole hassle to then have to create your own blueprint to then just be able to tweak it. So, yeah, I think this is a fantastic move.

Yeah. And I mean, I mean, that's that's the one thing. But it's also, you know, as you're as you're going through, it's also like, hey, I want to be I want to take the concept that this person made because it's an amazing concept and make it 100% applicable to my thing. Right. Rather than, you know, sometimes you go through. I found myself going through and looking at the blueprint, like looking at the actual like blueprint.

Right. Like the YAML behind it and being like, OK, let me try and convert that to an automation myself and things like that. So now now you kind of get get that. So there is a little bit of a caveat. So when you take control over that, right, that means now that blueprint is completely decoupled. Right. So that automation is completely decoupled. So it's basically what I just described that I do today. Right. If I went through it and converted that to an automation.

However, now there's no tie back to that blueprint, right? Because people can update blueprints. In this case, you cannot, right? So once that tie is cut, it's cut. Yeah, that's fantastic. Linking template entities. So if you have a template entity, it'd be pretty handy to use the related entities feature to see the devices that are related to that entity. Now you can. So the related entities can show you those template entities that are created by the UI, right? So that's pretty neat.

And I think there's also something there about being able to use it, use that entity after the fact as well, right? So if I have an entity and I'm linking it to a template entity, I can actually take the template entity and use it against it. Just a already created entity, right? Because before that kind of, you couldn't do that. You have to create the entity and then as you're creating the template entity.

I always am using that related devices panel in the more info panel just to see, okay, why did this automation trigger? What's the related entity? Okay, great. It's that automation. Away I go. And I think this is a really useful tool, particularly for like the template entities, right? Like I'm always going in, like I have a template entity that determines if my central thermostat or central heater should be used or if the local heat pump air conditioner should be used.

And there's a whole bunch of conditions that go into that. You know, how many rooms are occupied? Where are people in the house sort of thing, right? And to be able to click that, like right now if I click related, nothing comes up, right? But there's so many different moving parts to that template. It would be cool to be able to just see from a quick list, okay, and jump to that list, go, all right, here you go. This is what caused it.
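For anyone following along in YAML, the kind of template entity Phil is describing looks roughly like this; the entity IDs and thresholds are invented for illustration:

```yaml
# A minimal modern-style template binary sensor (illustrative entity IDs only).
template:
  - binary_sensor:
      - name: "Use central heating"
        state: >
          {{ states('sensor.occupied_rooms') | int(0) >= 2
             and states('sensor.outside_temperature') | float(20) < 15 }}
```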

Okay. The only issue I have with this rollout is, unfortunately, that this feature, where you can link the template entities to the proper entities behind them, is only for template entities created through the UI. Which, I get it, these UI IDs and all that. For me, when I'm coding in Home Assistant, when I'm coding in YAML, when I'm coding in Jinja templates, I'm doing that in VS Code. I'm not doing that in a UI.

And I understand the UI has a text editor that can give you a coding experience, right? Like refill entity IDs and code completion. I get all that. But for me, I'm doing it in YAML because it's code to me, right? And it's just a shame that you can't, for whatever reason, expose those related entities, which I don't think would be hard to do. Right? Like we... We used to have to do that for template entities.

We used to have to specify which entity ID was related to a template for it to be based on what rules to trigger the changes to that entity. So I don't think it's hard or maybe it's not hard, but I don't think it's... It's been previously there or it could be done. It's just a shame that this one's only applicable for those UI ones because I think it would be handy to have. Yeah. Yeah, I agree. All right. Here's another one for you.

If you are playing along and you didn't realize Home Assistant didn't already do that: you can now change the radius of your home zone through the UI. So previously, this was a YAML-only feature. So, the complete opposite of the problem I just mentioned. When you set up Home Assistant, you would put in your address or your coordinates for your home, and Home Assistant would default to a 100 meter radius around your home.

If you wanted to increase or decrease that radius for whatever reason, you had no ability to do it except for in the YAML. That has now been exposed to the UI and you can, you know, drag it ever so nicely. So I would have thought that that had been there for ages, but I was surprised to see that in the release notes. So July 2024, you can finally set it. That's it. That's it.
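For reference, the old YAML-only approach was to override the built-in home zone by declaring a zone named Home; a minimal sketch, with placeholder coordinates and radius:

```yaml
# Pre-2024.7, the radius could only be changed like this (placeholder values).
zone:
  - name: Home
    latitude: -37.81
    longitude: 144.96
    radius: 150  # metres
```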

The Matter integration got a little bit of an update there. It now provides a number of entities that allow you to configure the behavior of the Matter lights, right? So just different aspects of those Matter lights that you can kind of go in and play with. One that I was pleased to see was the Android TV integration has now got support for remote entities, allowing you to remotely control your Android TV. For those playing along at home with an Amazon Fire TV, that is also an Android TV. I have a Fire TV stick.

So I was excited. I'm like, yep, get the beta in. Let's go. I want to see what, you know, remote entity I get. And I was given two shiny buttons, an on and an off button. Yeah. Yeah. Yeah. It's the worst. Nothing too exciting. Ah, okay, I guess I can turn it on and off now. So yeah. Yeah. Yeah. If you've got a fancier one, maybe you've got an NVIDIA Shield or something, maybe it'll give you up, down, and all those fancy controls as well.

But at least on the Fire TV stick I had, it was just on and off. That's what you get. There's now support for integrating the camera from tablets running Fully Kiosk Browser, so there's actually some cool stuff you can do there. And so now you can also go in, there's an entity for taking screenshots as well, and you can send notifications and text-to-speech messages to the tablet, and the media player can now play videos.
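As a sketch of what you could do with that new camera entity, the core camera.snapshot service works against it like any other camera; the entity IDs below are hypothetical:

```yaml
# Grab a snapshot from a Fully Kiosk tablet camera when the alarm triggers
# (entity IDs are made up for the example).
automation:
  - alias: Alarm triggered tablet snapshot
    trigger:
      - platform: state
        entity_id: alarm_control_panel.home_alarm
        to: "triggered"
    action:
      - service: camera.snapshot
        target:
          entity_id: camera.hallway_tablet
        data:
          filename: "/config/www/alarm_{{ now().strftime('%Y%m%d_%H%M%S') }}.jpg"
```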

So now what this means is a little tablet that you have on the wall, right? Whatever tablet that might be. If you have fully kiosk browser on there, you can actually use that almost as a home assistant device, right? So it has access to the camera. You can actually use that as like, maybe, maybe that's set up at your front door, right? So you can say, Hey, you know what? I actually saw somebody come by and take a snapshot of that. Who's here.

And who knows, maybe you got enough compute power, you run machine learning against it and say, hey, it's Rohan or hey, oh, Phil's over, right? Like whatever that might look like and do something cool there. But the neat thing here is it actually lets you use all of those things as an extension of Home Assistant, to me at least, right? That's how I'm kind of looking at it. And you're seeing, again, let me send notification to it.

Let me do all of those things and let me play a video on the media player, right? Right? Yeah. So I think it's pretty neat. I don't use Fully Kiosk Browser, and I don't use tablets at my door or anything like that. I know Phil, you do, I think. Yeah, I do have one in my old house. I've got one. But it's exciting. I've got one randomly in the garage and it has Fully Kiosk running on it. It's an Android Fire tablet, so it's incredibly slow and the app will just randomly crash.

I'll walk up to the tablet and the screen will be just on the menu. But no, I think from memory, there's been a couple of these features, like playing audio and potentially playing video, that were part of the Browser Mod extension or card in HACS, where that extension would expose, you know, media player entities of your Fully Kiosk browser to Home Assistant. So this is just bringing those sorts of entities back into the official integration, which is cool.

Yeah, I think this might actually prompt me to dust off the old Fire tablets. I think I've still got a brand new one in a packet somewhere that I had grand plans for, but didn't realize how slow these things were. But maybe this is enough of a push to roll them out somewhere. Because, yeah, they could become inside security cameras, right? Like, you know, when we go away, turn on all the Fire tablet cameras and, you know, watch inside. I don't know how good.

The quality of the video on those will, I guess, depend on the tablet you use, right? If you use like a higher-end Samsung tablet, maybe it's great, right? Or yeah. But yeah, it's something, right? All right. UniFi Protect now has support for animal detection sensors. So if you've got a UniFi device, pretty cool. I think there's been a few updates to the UniFi integration this release too, so that's always good to see as well. All right.

Reolink gets support for manual recording, software updates, and playback of the autotrack lens. So there you go. And the Bang & Olufsen integration now supports announcements, and support for the Tidal music service has also been added. Yeah. And something cool I learned about the Bang & Olufsen, the B&O integration there: that's actually managed by Bang & Olufsen. They're the ones that actually directly contribute to it and stuff. You're kidding. It's nice seeing a vendor kind of do that.

Yeah, I didn't know that. Frank was talking about it on the creator's call, and I was like, that's actually really cool. Wow. And not just any vendor. That's a pretty big brand. Right? Usually those kind of brands, especially high-end brands, don't care about any of that kind of stuff, right? It's nice to see that. I mean, and I like their stuff, right? I mean, the headphones you usually see me wear, these ones here, are B&O in collaboration with Cisco. So I like them, right?

This isn't a pitch by any means. Buy it, don't buy it. I don't really care. And I like the quality of it. I like that piece, right? It's nice to see that kind of a company kind of stepping up and doing that, right? Especially when we see just like pure comedy come from some of these other companies, right? Absolutely. Like, hey, we're going to stop supporting anything open source. And then their American arm is like, no, no, that's not us. They're Europeans.

They're just speaking for themselves. They're like, I don't know. It's nice to see when a company is actually just like, hey, we're just going to contribute this because I'm guessing B&O probably realizes, hey. This actually helps us in the long run. The more people integrate it into their smart home or whatever, the more people just buy our product because it's there. Again, they're not paying us to say this or anything, but I just thought it was really cool that they did that.

The Nanoleaf integration got a little bit of love as well. You're able to add the event entity. If you now start... Some Nanoleaf panels let you touch them and do stuff. Based on that, that's actually now exposed to Home Assistant. What happens is now maybe you have a sequence that you tap and it can do something. That can actually trigger something in Home Assistant. I think that's pretty cool. I don't know if the ones you have behind you are Nanoleaf, Phil. They are.

I was just about to say. It's all good. To do a Rohan, this is what's behind me right now. That's the Nanoleaf fire elements or whatever they're called. I don't know. My Nanoleaf, I've got a love-hate relationship with them at the moment. I've just been watching what I've been recording and I'm running 2024.7. This Nanoleaf behind me has been going marked unavailable in Home Assistant every two minutes. I would go to press a button on the thing and... It would just be unavailable.

And it's happened for the past few months. I have the Nanoleaf canvas and that just randomly... It's a couple of years old now, but that just randomly disconnects from my Wi-Fi and it starts flashing and gets marked as unavailable. I've got to the point now where you have to literally power cycle that Nanoleaf for it to rejoin the Wi-Fi. So I've ended up putting a Z-Wave smart switch on the thing so that when Home Assistant...

detects that it's unavailable on the Wi-Fi, or it's dropped off Wi-Fi, it kills the power to the Nanoleaf and then reboots it. I think it must have a dodgy Wi-Fi chip or something. It probably needs a... But as I said, it's only a couple of years old and it's just annoying that, you know, it's out of warranty. So if I want it fixed, I'm going to have to buy a whole new control unit, which means giving more money to them. And they're all stuck up on my wall, right?

Like if I pull them down, I'm going to be ripping off paint and all that. Yeah, you're going to rip it down. But yeah, anyway, back on point for 2024.7. Yeah, when I downloaded the beta, I could see straight away in the logs, there was notes saying, oh, I've detected, you know, this gesture on this Nanoleaf. So it's there. I don't know what use case there would be for it because I know you can, you know, swipe up and down on it to control its brightness and stuff.
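In automation terms, reacting to one of those gestures via the new event entity would look roughly like this; the entity ID and the event_type value are assumptions, since they depend on the device:

```yaml
# React to a Nanoleaf touch gesture exposed as an event entity
# (entity ID and event_type value are made up for the example).
automation:
  - alias: Nanoleaf double tap toggles the desk lamp
    trigger:
      - platform: state
        entity_id: event.nanoleaf_panels_touch
    condition:
      - condition: template
        value_template: "{{ trigger.to_state.attributes.event_type == 'double_tap' }}"
    action:
      - service: light.toggle
        target:
          entity_id: light.desk_lamp
```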

So maybe you could, you know, have that as a, you know, flick it to change music somewhere or something. But yeah, I'd be interested to know what people are thinking about using that for. Cool. Yeah. If you have it and you use it, post it in the comments or shoot us an email. That would be great to see what you do with that. And if you have a SwitchBot Pro, good news for you: it's now supported in 2024.7.

Yeah. A few, a few integrations got love with getting platinum level quality. So ESPHome, I thought, I swear ESPHome was already platinum level. It's kind of their own project. It's nice that they don't favor themselves for it. Yeah. It's ESPHome, Pilot, and Teslemetry that get platinum level quality status, which is actually cool to see for Teslemetry because that's literally, I think it's a very recent addition, right? So they've actually moved up pretty quickly.

And it's created by someone in the community. Basically what that means is it's basically the best of the best. Yeah. That's all I need. It's because right now all the other Tesla integrations are kind of hacks, right? Like they're, they're just all like, hey, we scraped this, or we do that, or do something weird like that, right? This actually leverages Tesla's actual official APIs, right?

They use the Fleet API and things like that and say, okay, this is how we're going to manage it, right? So there is a cost to it. It's not free, but it's, I don't know, I think it's kind of neat that somebody in the community stepped up and said, hey, let me build an actual option if people want to do that and they want to pay for it. Great. And it's, I don't know, I think that's kind of cool. But basically, platinum level just means, you know, they are that, right?

They are the platinum level. They're the best of the best integrations. It's completely asynchronous, or it's running the async libraries. So that means it's fast, gives you a great user experience, and it's typically extremely well supported, right? Yeah. Congratulations. All right, Phil, let's talk some new integrations. New integrations. What have we got? Aquacell. So you can now monitor your Aquacell water softeners in Home Assistant. So there you go.

Get a little entity to say how soft your water is. There you go. Ista EcoTrend. The Ista EcoTrend integration lets you bring in your monthly heating, hot water, and just regular water consumption and costs from that Ista EcoTrend service. I'm not sure what that is, but it's in here. Congrats. One I loved to see in here is the Mealie integration. So Mealie is a self-hosted recipe manager, which I have used. I love it. It's got a great UI. I really love the scraping ability, actually.

So just to talk about Mealie for one sec. Yeah. So basically, you go onto your taste.com or your local Woolworths, chuck that in, just paste the URL, put that into Mealie, and it will scrape the recipe, give you a list of ingredients that you need, get a nice picture, and allow you to then meal plan, right? I've got two kids.

And meal planning is just something that we don't do and we really need to do, because otherwise it's 5:30 and, like, crap, what are we doing for dinner tonight? So, yeah, I love Mealie. I think there is... I would love to use it a bit more. Maybe this will be a good kick in the bum to look at it again. I think there might be a Mealie integration into Grocy as well. I know Grocy also has meal plans and recipes.

Yeah, anyway, good to see another open source self-hosted product available in Home Assistant. Yeah, I've been running Mealie for a couple of years. I have exactly three recipes on there. It's one of those things where I was like, hey, let me do this just to collect and kind of standardize the recipes that we like on the internet, right?

Because every recipe website you go to, it starts off with, like, you know, "when I was a child, my grandmother..." and I'm like, listen, I appreciate you, but I don't care about the story of your life. Just give me the ingredients. Just tell me what I need to do. Yeah. Yeah. And I realize I say this as I'm literally talking on a podcast about stuff. And I realize the irony there, but, you know, it's... I will say it's a do-as-I-say, not-as-I-do situation, right?

Where I'm like, I'm not doing that, but I will 100% do that still on my podcast. But, yeah, it is pretty funny. But, yeah, so I use it, but I try to make it a thing in my household. I'm still running it today, right? It's still sitting in my Docker stack there. I was just going to say, gee, I'm interested to know if mine – Yeah, it's just one of those things where – I want to see if mine's still running. Okay. And you know, I don't, I can't even access it at the moment.

Like, so no, I don't know what recipes I have there. But I guess you would have had the same success as me, right? Like try to, try to get it used. And then you just, I don't know. It's hard, right? When you're the only one accessing it. Yeah, yeah, exactly. All right, let us move on. Knocki. So Knocki is a smart device that turns any surface into a remote control, which you can now use to trigger automations.

So let me unpack that a little bit. So when we say any surface into a remote control, basically you mean you can, like, tap it, or tap a table or something like that, and something can happen. Cool, great integration if you have it. Fantastic. They seem kind of pricey for what they are. I don't know. I feel like I'd just rather use a vibration sensor and just bang on the table or whatever. Yeah. And I don't know. I talked about this off camera earlier and, like...

Yeah, I would just want to know, like, how accurate this thing is or how susceptible it is to false positives. Like, a vibration sensor is very, like, you know, it will get any vibration. But if that thing can determine the difference between, you know, someone putting a plate down on a table as compared to, you know, a definitive knock or a command being issued to it, then maybe it's worth it. True. I would want to know what use cases would be there, though.

Like if it's literally a sensor for someone to determine if somebody is knocking on your front door, I think there are other sensors that can do it. Like even just the vibration is good enough there. But if there's some other random, you know, I don't know, can you like punch out SOS on there and it will tell you that someone has done an SOS knock? Maybe then it's worth it. Yeah. I don't know. And for me, I don't see the selling point.

Yeah. Yeah. Yeah. Well, and Phil, we were talking about it before we started recording this episode too. I mean, we've seen this being used all over, right? So Reed on Smart Home Solver, he's got it attached to his media console where he smacks his table twice and it turns on the projector and all of this stuff, right? And it's like, so it's not new and I'm pretty sure he's just using like a $5, like, I don't know, like, Xiaomi sensor or whatever, right? Like, I don't know.

I guess it depends on the use case and maybe they do the one thing really, really well, or I don't know, maybe they can differentiate a plate versus a hand smacking the table. I don't know. I don't see it, but cool. Let's talk some breaking changes, Phil. Well. Sorry, just before we do. So, and you do want to do all that, I'm just looking at the... I just pulled up the Knocki docs. You'd want it to do all that, but it is cloud push as well. So there is a cloud reliance, right?

So I don't know. I mean, a Zigbee vibration sensor will give you that vibration pattern much quicker, locally, than having to go to the cloud for that. And then if it's cloud push, does that mean it's a Wi-Fi device? Actually, that would chew through the batteries. I don't know. I don't have enough data on this, but yeah, look, at least it's here. Someone has obviously bought a Knocki device and they've integrated it. I would love to know why you'd pick that over a vibration sensor, but all right.

Breaking changes, the legacy... And maybe there is, there is a really good range, right? All right. Some breaking changes: the legacy API password feature, which was deprecated in December 2023, has now been removed. So if you're still using the old legacy API password, so just a simple password to access Home Assistant, that's not going to work anymore. You now need to update that before moving forward with 2024.7. OpenWeatherMap: there are some obsolete forecast sensors there.

Those have now been removed and they've been fully replaced by the weather.get_forecasts service. For Shelly devices, the switch entity for controlling a Shelly gas valve has been removed and the valve entity has taken its place. So just removing the old switch, replacing it with the new, proper entity class for it. If you do use the switch entity in your automations or scripts, you must update them to the new valve entity. A little more on weather.

The weather service get_forecast was deprecated in favor of get_forecasts, so singular versus plural, back in Home Assistant 2023.12. So now that's been removed, right? So, from what I said earlier about OpenWeatherMap, that is using weather.get_forecasts, the plural service. And last, certainly not least, if you're a Z-Wave user, make sure you update your Z-Wave JS server; a version bump is required. That's it. 2023... sorry, 2024.7. Yeah, that's right.

A little bit of a long one, but we're good. All right. See you next time. Cheers. If you want to share your Home Assistant journey or come on as a guest, reach out to us at feedback at hasspodcast.io. That's H-A-S-S podcast dot I-O. The Home Assistant Podcast is hosted by Phil Hawthorne and myself, Rohan Karamandi. For links to topics we discussed today, check out our show notes at hasspodcast.io. Thank you.
