
#333 Live From PyCon

Apr 22, 2023 · 23 min · Ep. 333

Episode description

Topics covered in this episode:
See the full show notes for this episode on the website at pythonbytes.fm/333

Transcript

Hey, welcome everybody and welcome. We've got a whole room full of people. We're recording this live. How about we get a live from PyCon? Shout out. There we go. Thank you all. Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 333, half of a beast, recorded April 21st, 2023. And I am Brian Okken. And I'm Michael Kennedy. And we are live at PyCon US 2023.

Yeah, it's excellent to be here. Thanks everyone for coming. It's awesome. It's a real honor. So, shall we kick off the show? Let's kick off the show. I think first I'd just like maybe to get your thoughts real quick on PyCon, the Expo, people. How's it feeling this year? Well, it's the first time I've been back from the pandemic. I didn't show up last year, so I'm pretty excited. It's really good to see people. There's people that I haven't seen in person since 2019.

So, it's pretty awesome. How about you? Same. It's really great to reconnect with a bunch of people and see folks I haven't seen for a long time. Really nice. Yeah, it's been great. Plus, we've got the Python staff this year. Yeah, the Python staff of Python things. So, I want to talk about something that's all the rage. And I do want to put that on the screen for the live stream people as well. And that is more AI chat things. What do you all think about ChatGPT and all these things?

Is this scary or is it awesome? I'm like, oh, we're getting... Yeah, all right. So, honesty. I think that represents how I've felt. I've felt like everyone out there at different stages in this whole journey. Yeah. What we saw is a whole bunch of thumbs up and a lot of sideways. So, we're not quite sure yet what we think of it. We're not quite sure yet. It's one of those things that's kind of here. The cat is out of the bag.

We can either rail against it or find good uses for it to take advantage of it. So, Microsoft has found a way to use their large language models behind OpenAI and the stuff that powers ChatGPT to help security defenders, they say. Like, if I'm on the blue team trying to stop people from breaking into my company, I could use a little bit of help with that. And you can already use ChatGPT for all sorts of crazy programming and security type of things, right?

You can say, hey, dear chat, I would love if you could help me write a phishing email that looks convincing. Or I would like you to help me identify things I might test for misconfigurations in Nginx files and what I might do with that, you know? Those are all bad things. But this project here is called Microsoft Security Copilot. It says, empowering defenders at the speed of AI. And so, basically what this is, is it's ChatGPT.

But instead of using a general purpose language model, it's using a cybersecurity-focused large language model that understands things like, don't let me get hacked, buffer overflows, configuration files, that kind of stuff. So, if you're in the space of cybersecurity, which Python is one of the most popular languages out there for cybersecurity, right? Both sides of it, the good and the bad. But, yeah. So, basically, you give it a prompt.

You ask it a question about a configuration file or some kind of environment. And it'll go and use that large language model. And it doesn't always get it right. This is one of the big challenges. Maybe some of the thumbs down from you all were like, you know, this large language model made up something about the world or whatever. But it was real confident. It was certain it was right. But it wasn't. So, this has a feedback loop.

You can say, no, no, that's actually not misconfigured, Security Copilot. That was okay. And here's why. And so, you can have this loop that you would have with, you know, maybe with like a junior cybersecurity researcher or whatever. And another thing – I don't really know how all these large language models work and all this AI stuff works. Much of it seems to be, we're going to go find a bunch of other people's work and then take that.

We'll have a really cool system with this cool data, right? Like we're going to scan repos and maybe it doesn't matter if it's GPL. If we filter the GPL out through some kind of neural net or, you know, get all the Getty images. And now we can create really cool pictures if you ask for it. But the Getty wasn't on board with that. So, this data story is kind of a little suspicious for these. But with this one, they explicitly say your data does not get shared back. It doesn't go anywhere.

This is like, you can even lock it down about how other people are allowed to access. So, that's kind of cool. And yeah, they're basically trying to help people go through log files and other things on the server where people are trying to hide their tracks, behaving normally but not really, and pull those things out. Now, I have no experience with this. But I know I interviewed some folks on Talk Python who are astronomers looking for exoplanets.

And they were able to take old Kepler data and apply some machine learning and computer vision and discover 50 new exoplanets that people thought they had already analyzed. And guess what? They were hiding. They couldn't be discovered by people. But by computers, they could. I suspect the same type of thing is true here. They're like, there's 10 million lines of log file. And these three are suspicious. But nobody really noticed, you know.

So, anyway, if you're in cybersecurity, definitely give this a look. So, next, I want – I should have thought of this ahead of time. But we've got a bunch of people here that can't see our screens. Which is a good reminder that this is also an audio podcast. It's not just on YouTube, apparently. So, the next topic I'll have to be careful talking about. But it's PEP 695, type parameter syntax. Now, this PEP is for Python version 3.12. It's accepted.

So, I don't know if it's already in some of the alphas or betas or not. Yeah, I don't know either. But it's accepted for 3.12, type parameter syntax. The abstract is: this PEP specifies an improved syntax for specifying type parameters within a generic class, function, or type alias. It also introduces a new statement for declaring type aliases. What does that mean? Well, it has some great examples. So, if we go down to the examples, there's the old way.

Like, let's say I've got – one of the examples is great. So, let's say I've got a function like that takes – it takes something. We don't know what the type is. But it takes something and then it returns the same type. Or it takes something – it takes two of – it has to have two of the same typed things. It doesn't matter what they are. It doesn't matter what they are. So, like two ints or two floats or two lists or two tuples. It doesn't matter what, but it's the same thing.

The old way to do that, which – I still think it's fairly recent. I think this might have been 3.11 for TypeVar. It's pretty new, I think. So, yeah. Yeah. I'm laughing because it's rolling over so quickly, right? Yeah. So, anyway, the old way to do it was from typing import TypeVar. And I didn't even know you could do this. And then you declare a new type using, like, as in this example, _T = TypeVar.

And then in parentheses, _T. And then you can use that as the type of the arguments. And that's really kind of ugly syntax. And the new proposed syntax is to just put brackets, like [T], right after the function name to say, basically, it's a templated function. Like all the other generic statically typed languages, like C and stuff, right? Yeah. So, it definitely reminds me of, like, the type – yeah, the – Templates. Templates. Thank you. In C++ and stuff. So, it's definitely easier.
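For anyone following along in audio, here's a minimal sketch of both styles. The function name `first` is invented for illustration; the old TypeVar form runs on any modern Python, while the PEP 695 form is shown as a comment because it needs 3.12 or newer.

```python
from typing import TypeVar

_T = TypeVar("_T")

def first(x: _T, y: _T) -> _T:
    """Old style: declare _T up front with TypeVar, then reuse it."""
    return x

# New PEP 695 style (Python 3.12+), kept as a comment so this snippet
# still runs on earlier versions:
# def first[T](x: T, y: T) -> T:
#     return x

print(first(1, 2))  # both arguments must share one type
```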

I still – I'm not sure. So, it's approved. So, we'll get this in 3.12. It's definitely better than the old way, but it's still – I think we might be confusing people with this. What do you think? I think types in Python are awesome, but I think it can also go too far. I mean, let's ask – since you all are here, let's ask, like, how many people like typing in Python? Almost uniformly. Yeah. Yeah. Okay. But it can get over the top sometimes, I think. One of the things, though, is cool.

One of the bottom examples in this shows combining types. So, like, maybe a function that takes two of the same type of things, maybe that's a little weird. But it's not too weird if you think of, like, lists of things. If I want to say it can either be a list or a set of a certain type, but only one type, how do you say that without these generics? Yeah. Yeah, I know. Yeah, I think – it is incomplete.
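That list-or-set case can be sketched like this; `total` is a made-up name, and the `__future__` import just keeps the `list[T] | set[T]` annotation parseable on older interpreters.

```python
from __future__ import annotations  # defer annotation evaluation

from typing import TypeVar

T = TypeVar("T")

def total(items: list[T] | set[T]) -> int:
    """Accept a list or a set, but of one element type only."""
    return len(items)

print(total([1, 2, 3]), total({"a", "b"}))
```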

And so it's the question of how far are you going to push the language to get that last couple percent? Anyway, it is looking a lot more like C, isn't it? I'm glad I studied that, but also glad I don't have to write it these days. Anyway. So something to look forward to in Python 3.12 is PEP 695. Yeah, absolutely. While we're riffing on types, I just want to make a quick comment. I got a message from somebody recently on this project. It said, Michael, I discovered a bug in your code.

It doesn't run. I'm like, oh, really? It seemed like it ran last time I touched it. But, okay, what's going on? Well, you used the lowercase-l list bracket type, and only the capital-L List works. Like, no, the bug is you're in Python 3.9, not 3.10. This is a new feature. And I think – I'm joking, kind of – but with all these changes coming so quickly, you've got to be on the right version of Python or this thing won't exist, right? And it's going to be an error.
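A minimal sketch of the two spellings being discussed (the lowercase builtin-generic form came from PEP 585); the function names here are invented, and the `__future__` import lets the lowercase form parse even on older interpreters.

```python
from __future__ import annotations  # annotations become lazy strings

from typing import List

def old_style(xs: List[int]) -> int:  # capital-L List from typing
    return sum(xs)

def new_style(xs: list[int]) -> int:  # lowercase builtin generic (PEP 585)
    return sum(xs)

print(old_style([1, 2]), new_style([3, 4]))
```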

Yeah. It used to be, oh, the last five versions are fine. Now it's like, oh, the last version is fine. We'll see. Yeah. I'm starting to – I'm working with some educators. And one of the tricky things in, like, universities is your curriculum kind of needs to be known ahead of time. And they kind of set that. And so with Python moving so fast, I wonder how educators are dealing with this. If they're teaching 3.8 or 3.11.

All right, we've got some teachers in the audience saying 3.11. Kids, they like new shiny things anyway. Give them that. Give them that. All right. All right. What's next here, Brian? What's my next one? I don't know either. No, I do. It has to do with AI probably. So this one comes to us from Matt Harrison, who's here at the conference, if you want to say hi. Obviously, there's all this GPT stuff going crazy. But one of the challenges is you can ask it a question, and it'll give you an answer.

Right? Like, hey, please write this code for me. And it'll go, boom. Here's – you don't need to hire anybody. Just take this code and trust me. Or whatever, right? You ask it a question. And you can ask it a couple of questions, but it has what's called – was it a token stack or something like that? It only has so much memory of, like, the context of what you're asking it.

And the ability to go and ask it to do one thing and then, based on its response, go do another and then a third after that, it's not quite there yet. So there's this project called AutoGPT. So if you have an OpenAI API key, basically, so if you pay for OpenAI or somehow have access to it, then you can plug it into this thing. And what it does is you give it a mission.

You say, dear AI thing, what I would like you to do is go search Google for this, figure out what you find, and then get the top three most popular ones. Go find their web pages. Take all the information out of that and summarize them for me and then make a prediction about, like, who's going to win the Super Bowl because I'm going to bet big on it. I don't know. So basically that's the idea.

It says it has a couple of benefits over regular ChatGPT, for example, which you can't connect to the Internet. I don't know if you ever played with it, but it'll say things like, I only know up to 2021. Sorry. This one has Internet access. It has long-term memory storage. It'll store in a database so you can, like, have it go on and on for a long time. File storage, all sorts of interesting things. So they have a video that we'll link in the show notes you can check out here.

I'm going to mute it because I don't want to hear this person talk. But it says, fires it up, and it says, all right, we're going to get started. And what I want you to do, your role is an AI designed to teach me about AutoGPT, the thing that is itself, right, very meta, self-referential. Your goals as a list in Python is first search what AutoGPT is and then find the GitHub and figure out what it actually is from its GitHub project.

And then explain what it is and save your explanation to a file called AutoGPT.txt and then stop. And it will, if you run it, you will say, okay, well, now it's gone out to Google and it's done this thing and it's pulled it in. And now it's starting to analyze it. And why is this interesting? This is all Python code, right? So this thing is created in Python. You run it with Python. I'm sure you can extend it in different ways with Python. But, yeah, it's pretty nuts.

You create these little things. You put them on a mission. And you just say, go, you know, go get me tickets for this concert or go do this other thing. And here's the plan I want you to follow. You just set it loose. So, anyway, if you want to combine some Python and some automating of the large language models, there you go. This seems like something could definitely easily be used for evil. No, no way. There's no way. Yeah, I agree. All right. What do you got for the last one?

So we've talked about Ruff before, I think. Well, there's been an announcement that Charlie Marsh has now started his own company and is hiring people. So Charlie Marsh has formed a company called Astral. And he's made a good start. He's starting with $4 million of investment money. That is not a bad deal at all to start a company. But I'm kind of excited about it, actually. Well, one, I'm happy for him. Obviously, well, at least I hope it's a good thing for him.

But I just think it's neat – I guess I just wanted to highlight and say congrats, Charlie, you're doing this. So Ruff, if you're not familiar, is kind of like a Flake8 linter sort of thing. But it's written in Rust, and it's really, really fast. It's so fast you can barely detect it's running. But it works. Yeah. How many of you all have heard of Ruff? R-U-F-F? Pretty much everyone. And this thing's only been out like a year. So that's a big deal.

Yeah. I ran it on the Python Bytes and the Talk Python code, 20,000 lines of Python. And you're like, did it actually run? Did I give it the wrong files? It might not have seen anything. It's instant. It's crazy. One of the things Charlie's noticed is that it's becoming very popular. But he's also getting a lot of requests. So it's a very active project now. And I'm sure it's taking a lot of time. So he's got things like new requests.

Like, let's do more of the extensions of Flake8, which is completely valid. And then also, yeah – well, this was a good idea of taking part of the Python tool chain and rewriting it in Rust. What other stuff could we rewrite in Rust? And I think that's where they're headed: making more Python things more Ruff-like, or Rustifying them. So I'm excited for it and to see what they come up with. And he's promising that a lot of this stuff is going to be open source, available to everybody.

Awesome. Congratulations, Charlie. That's awesome. I would say, you know, when I got into Python nine, ten years ago, there seemed to be this really strong resistance to anything corporate. Anything like people were trying to bring money. It seemed really suspicious. Like, what is your motive here? Are you trying to corrupt our open source environment?

And I think since then, we've kind of found a way where there can be commercial interests that don't undermine the community, but also come in and benefit it. I mean, we saw Samuel Colvin with Pydantic. We're seeing this now. And, you know, a lot of them seem to fall – Textualize. Textualize. Absolutely. Will McGugan. And with Rich. Sorry, Will. And a lot of them seem to fall under this what's called open core business model where, like, the essence of what they're doing, they give away for free.

Like Rich. Like Pydantic. But then, on top of that, there's something that is highly polished and commercial, and that's where they're kind of working. And I, personally, am just really happy for these folks that this is happening. I think it creates more opportunity. It creates more opportunity for people in Python. People who worked on these projects for so long get a lot of, it kind of pays off eventually, right? Like, the PayPal donate button.

There's no way that's a job. It's like a covered-my-dinner-once-a-month sort of thing. I also get that there's a lot of people that can't do this. I mean, there's a lot of people that are happy with their normal job. But they're doing something cool on the side. We still need to figure out how to compensate those people better. Yeah. So we'll figure that out. One of the things I wanted to bring up is I was talking about this announcement with somebody just yesterday.

And they said, oh, Ruff, that's kind of like black, right? I'm like, wait. I don't think it's quite, that's quite right. I think of it more like Flake 8, but I was curious about the overlap. So I went up and looked in the FAQ, and the top question is, is Ruff compatible with black? So yes, it says Ruff is compatible with black out of the box, as long as line length setting is consistent between the two, because black has a weird line length thing. I've had no problem with running them together.

And I was like, also, should I run them together? And right in here, Ruff is, it says Ruff is designed to be used alongside black. And as such, we'll defer implementing stylistic lint rules that are obviated by auto formatting. So what does that mean? It means that there's no point, if they're assuming that you're running black. So if running black will do something, there's no point in Ruff checking it, because they know that you've already done it. Or something. They're going to, you know.
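A minimal sketch of what keeping that line-length setting consistent between the two tools might look like in a pyproject.toml (88 is just black's default, used here as an example):

```toml
[tool.black]
line-length = 88

[tool.ruff]
line-length = 88
```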

Yeah. Don't let them fight. Wrap this line. Unwrap that line. Wrap that line. Unwrap that line. Well, that. And also, like, that's not their highest priority – fixing, checking for lint errors that black would have changed anyway. So. Yeah. Indeed. All right. Well, congrats. That's very cool. I think that might be it for our items, huh? What do you think? Oh, yeah. For our main items. Our main items. You got some extras? I do have one extra. The one extra is, like, Feickert? What's Matthew?

Matthew Feickert. Okay. Yes. He wanted us to bring this up – which, sorry, Matthew, for forgetting your name right away. Former Python Bytes co-host, guest, attendee. Yeah. So, I wanted to announce that the tickets are available. It's now open. You can buy tickets to SciPy 2023. And SciPy 2023 is in Austin, Texas, on July 10th through the 16th. So, that's open. If anybody wants to go, it should be fun. Yeah. Anyone going to Austin for SciPy? I know you've all used up your conference going.

There's a maybe. Some maybes out there. I mean, Austin would be great to visit. SciPy will give you a different flavor of Python. I think it'd be great. But I can't make it. I'm coming home from vacation on the 10th or something like that, which makes it a little tight to get all the way to Austin. Yeah. All right. Do you have any extras? I have one extra, nothing major, kind of a follow-up here. The mobile app – I talked about that. The mobile app is officially out for Talk Python courses.

And I would like people to try it out. If they find a bug, just shoot me an email rather than write a one-star review and trash it. Because we're working really hard on it. It's been two and a half months I've been working on it. It's completely redone from scratch. It's very nice. But it needs a little testing across all the zillions of devices. Android is out. Did you notice, Brian, I did not say the Apple version is out, did I?

No. Oh, no. No, no, no. Because when you submit something to Apple, what they tell you is, rejected. Rejected. Your app does not do X, Y, and Z. And Android is like, yeah, sure. That's good. So we're now adding in-app purchasing because without it, you can't have your app. So I'm going to work on that for the next week. And then we'll have an Apple version y'all can test. And it will be out, but it's just not out yet. What are you going to sell for in-app purchases? Courses.

I actually wrote some of them. You know, I might even sell one of yours. Yeah, the Pi test course. Yes, exactly. Nice. Awesome. Anyway, that's my extra. What's Android, by the way? Yeah, it's. No, just kidding. Let's not go there. This one, I'm going to take a chance. I'm going to take a risk here and turn my screen around. Okay. For everyone, because this joke is very visual. You'll be able to see it over there. And you can see mine. But you know it already.

This is what it's like releasing new code to production. We've got the senior dev and we've got the junior dev. Here we go. Here we go. What is this, Mr. Bean? Yeah. Mr. Bean. It's just people rocking all over. The junior dev is hanging on for life. There's like molten lava here in a second. That's the database. Some of the developers are thrown into the lava. Scrum master. There you go. The scrum master was thrown into the lava, which is the database. Anyway, what do you all think?

You ever felt that way? No, I'd definitely throw the scrum master into the lava. Yeah, definitely. Definitely. But anyway, that's what I brought for our joke. Nice. I like it. And I also took you off the camera. There you go. That's all right. Well, this was fun doing a live episode. It was very fun. And thank you all for being here. This is really awesome. Yeah. Thanks to everybody, and thank you everybody online for watching and showing up. Yeah, absolutely. Have fun. Bye, y'all. Thank you.
