Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 206. Wow. Recorded October 28th, 2020. I am Brian Okken. And I'm Michael Kennedy. Yeah. And we have a special guest today, Steve Dower. Hi. Thanks for having me on. Hey, Steve. Thanks for coming. It's great to have you here. I also want to throw out that this is sponsored by Tech Meme Ride Home podcast. Check them out
at pythonbytes.fm/ride. Steve, I'm sure many listeners know you, but maybe just give us the quick rundown. You do cool stuff at Microsoft, getting Python better on Windows, and you're a core developer. Yeah. So my main day job is for Microsoft, where I work as basically a Python engineer, kind of a wide-ranging resource to the company. So I haven't shipped anything of my own in a while, but I've had my fingers in a lot of Python-related things that have gone out recently. So it's a lot of fun, a lot of bouncing around between different teams, getting to work with a lot of different people. And yeah, as you say, I'm a CPython core developer, one of the Windows experts on the team. I'm responsible for the builds that go up on python.org and just generally keeping Python running well on Windows.
Yeah, that's awesome. And you've done some interesting talks, like Python on Windows is Okay, Actually, and talked about its popularity and how we as a community shouldn't discount it just because we might happen to be on a Mac or use Linux or whatever, right? A lot of people do Python on Windows. Yeah, yeah, the estimates vary, and every time I get new numbers, they seem to come out slightly differently. So it's real hard to get a good fix on how many
Python developers there even are in the world. I did get some numbers recently that I had a few people double check because they were saying there's like 20 million installs of Python on Windows in the entire ecosystem, which sounded like too many to me. So I had them double check. And then I had someone else double check. And they all came back saying, yeah, it's about that. So I'm like, okay, there's a lot of Python on Windows out there. But yeah, it doesn't show up
at conferences, doesn't show up on Twitter that much. And a lot of people just look at the packages that don't work and go, well, I guess it doesn't exist on Windows, because otherwise this package would work. And so, you know, chicken-and-egg problem, right? Yeah, there are a lot of chicken-and-egg problems in the Python space. I mean, it's a beautiful place, but there are some of these weird chicken-and-egg ones. Yeah, it's weird. I've been using Python on Windows since I started Python.
So have I. But one thing I haven't been using very much is enums. So that was an attempt at a transition. Why not, Brian? Tell us more. Actually, I've tried many times to use enums, and to be honest, I don't use them very much, partly because I'm used to enums in C and C++, where they just act like symbols and work pretty well. There is some weirdness with enums in Python, and I'm going to highlight an article called Making Enums (as always, arguably) more Pythonic, by Harry Percival. He starts it off by saying, I hate enums. Harry's a funny guy, and this is a fairly hilarious look at why enums are frustrating sometimes. And then he presents a semi-reasonable workaround, I think, to make them more usable. So what's wrong with enums? Well, he gives an example of a simple enum with string values. If you try to compare one of the enum members to its value, like the matching string, it's not equal; the comparison just doesn't work. You can compare against .value instead, but that's just weird. He also thinks it'd be neat if you could do a random choice over all the enum values, and you can't directly convert them to a list either. Interacting with the enum class itself just has problems. In the documentation, there's a suggestion that instead of strings you can use ints and do an IntEnum, and that works a little better. And if you like that approach but want strings, you can make your own string enum class. I'm not sure why they didn't just build this into the default, or, you know, one of the standard types, but a string enum is not there. There is an example, though, and it fixes a lot of stuff, but not everything; it still doesn't allow those direct comparisons. So the solution that Harry came up with is basically the one the documentation suggests: derive from both Enum and str when you're creating your enum class, but then also define this little snippet of a dunder str method, so that str() works better. At that point, most of the stuff he wants to work does work. It still doesn't do random choice, but apparently he's gotten over that a little bit. So I actually think this is reasonable; it's still just a really small snippet, like two lines of extra code to add to your enum types, to make them a little bit more usable. If you're judging by value add per character, this is awesome, because it makes working with the enumeration so much nicer; you can do the natural things that you would expect, especially around testing and comparison. Just derive from str and add a dunder str method, and you're good. Steve, what do you think about this?
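As a quick illustration for readers following along, here's a minimal sketch of the pattern being described; the enum names are made up for the example, not taken from Harry's article:

```python
import enum
import random

# Hypothetical example of the str-mixin pattern under discussion
# (illustrative names, not Harry's actual code).
class Fruit(str, enum.Enum):
    APPLE = "apple"
    ORANGE = "orange"
    BANANA = "banana"

    def __str__(self) -> str:
        # Without this, str(Fruit.APPLE) comes out as "Fruit.APPLE" on older Pythons.
        return str.__str__(self)

assert Fruit.APPLE == "apple"               # a plain Enum member would not equal its value
assert str(Fruit.ORANGE) == "orange"        # prints as the bare string
assert random.choice(list(Fruit)) in Fruit  # list() and random.choice now behave
```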
Yeah, I'd just add that the gotcha you get by doing this is that now you can have two values from different enums compare as equal, which, as I recall from the original discussions, was the reason this wasn't put in by default. Say you've got a color enumeration and a fruit enumeration: is orange the same between the two? And I think the decision was made for enums to say no, it's not; a fruit orange is different from a color orange. Making this change, or using an IntEnum, is going to make them equal. So you just have to be prepared to deal with that, which, to be quite honest, every time I've reached for enums, I've been much happier with string literals and quite comfortable with them comparing equal. Yeah, but that is just one thing to watch out for. Usually it's about constraining the list of things: I know there are five things, and I want to make sure I don't somehow mix up what those five are. I just want to go, you know, class-dot, and here are the five, and let my editor tell me which one of those it is. It seems like this gives you all that. So what do you think about adding dunder eq and dunder ne, so the comparison would say it has to be this type of enumeration and the str value has to match? Yeah, that would certainly deal with that gotcha. Again, when these were being designed... basically anything that gets designed and added to Python
has a very large group of very smart people working through it. And, you know, as a result, things still get missed. So it's possible that one was just missed. It's also possible that someone did figure out a reason why that was also risky. And risky, when you're developing one of the most popular languages in the world, is just anything that might surprise anyone. Someone who has deliberately designed around using enums everywhere is probably not going to be surprised. Someone who is using code where the developer swapped out a class with static variables for an enum, and now stuff breaks because of the defaults that were chosen for enum: that's the kind of thing you're trying to avoid in a language that has anywhere between, you know, five and 20 million regular users. But as a workaround, I mean, if you know where your enum is going to be used, there's a reason you can derive from str, and it's exactly for stuff like this. Yeah. Thanks, Harry, for putting that out there. That's a neat little bit of advice. And I'm so glad that we have Steve here, because I picked some sort of semi-internal pieces and I'm going to make some statements about them, and then Steve can correct me to make it more accurate. And we also get a core developer's perspective, not that you represent all core developers, but at least a slice. All right. So this next one I want to cover is that Python 3.10 will be 10% faster, and this is 4.5 years in the making. So Yury Selivanov long ago did some work on optimizing, I think, what was it, LOAD_ATTR? Or was it LOAD_METHOD and CALL_METHOD? Some of these load operations down at the CPython ceval level. And then Pablo Galindo, who's also a core developer and the Python 3.10 and 3.11 release manager, picked up this work. And now we have LOAD_METHOD, CALL_METHOD, and LOAD_GLOBAL going faster. So basically there are some optimizations around those opcodes that make this much faster. And this idea apparently first originated in PyPy. So I'm pretty excited to see that some simple internals that don't seem to change a
whole lot of stuff might make this a lot faster. What do you think, Steve? This one is real. Like I was so excited about this when it was first being proposed. The basis of the idea is that Python keeps everything in dictionaries, which means every time you look up, you know, dot name of anything, it takes that name, makes it a string, turns it into a, like gets the hash value, looks it up in a dictionary, finds the value, maybe wraps that up in some extra stuff. Like if it's
going to be a method, it's not stored as a method; you turn it into a method once you know what it's being dotted through to get to, and then return that. That's a whole lot of work. And if you're regularly calling the same method over and over again, why not cache it? That's the heart of this, right? It does that caching around LOAD_ATTR, right?
Yeah, it does that cache. And the insight that Yury had, that he made work (and in fact I think someone else had suggested it earlier and hadn't gotten it to work), was: what happens when things change? Because, again, as I say, we're designing a language for many, many people who do all sorts of weird things. And if you cache a method lookup and then someone tries to monkey patch it, we've now broken their code for the sake of an optimization, which, you know, is a no-no in Python. Correctness beats performance. In every case, that's just the trade-off that the language chooses to make. That's almost always what you want, right? Like, when would you want to be faster and wrong rather than slower and right? I'd be happy with faster and no monkey patching. But yes, yes, sure. Faster with some restricted capabilities might be a really good trade-off,
but faster and wrong is not a good one. Yeah, we did some benchmarking and basically found that there was a way to track all dictionary updates in the entire runtime with a version tag that was not going to instantly overflow and not going to break everything. So it became really easy to say, has this dictionary changed since I last looked at it with one single value comparison? And so it looks
at that value. If it has changed, it's going to do the full lookup again. But 99.999% of the time it hasn't changed, so it can give you the cached value, and you've saved a big dictionary lookup, possibly error handling, the descriptor protocol, all of this extra stuff that just adds so much weight to every operation. Yeah. And that's everywhere. I mean, that's everywhere in the language. Absolutely everywhere.
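To make the idea concrete, here's a toy sketch of that version-tag trick in plain Python. It's an illustration of the concept only, not CPython's actual C implementation, and all the names are made up:

```python
# Toy illustration: pair each cached lookup with a version counter and only
# trust the cache while the version is unchanged (the spirit of PEP 509's
# dict version tag, not the real implementation).
class VersionedNamespace:
    def __init__(self):
        self._data = {}
        self._version = 0                  # bumped on every mutation

    def __setitem__(self, key, value):
        self._data[key] = value
        self._version += 1                 # any change invalidates cached lookups

    def __getitem__(self, key):
        return self._data[key]

def cached_lookup(ns, key, _cache={}):
    entry = _cache.get((id(ns), key))
    if entry is not None and entry[0] == ns._version:
        return entry[1]                    # fast path: one integer comparison
    value = ns[key]                        # slow path: the full dictionary lookup
    _cache[(id(ns), key)] = (ns._version, value)
    return value

ns = VersionedNamespace()
ns["greet"] = "hello"
assert cached_lookup(ns, "greet") == "hello"   # slow path, fills the cache
assert cached_lookup(ns, "greet") == "hello"   # fast path, cache still valid
ns["greet"] = "monkey patched"                 # version bump invalidates the cache
assert cached_lookup(ns, "greet") == "monkey patched"
```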
That's fantastic. One of the things when I was first getting into Python that made me sort of have a sad face was realizing that function calls are pretty expensive, right? Like, having a big call chain, the actual act of calling a function, has a fair amount of overhead. So when I wanted to break my code into a bunch of small functions to make it real readable, if this part needs to go a little faster, maybe that's not what I want to do, you know? And so hopefully this helps with that as well. Yeah. That and vectorcall is another optimization that we got in recently. I think that might've been PEP 590, actually; vectorcall was also designed to make that faster, just removing some of the internal steps for doing a function call. Fantastic. And like Brian said, this is everywhere, so everyone's going to benefit. This is fantastic. Yeah. Well, we would like to thank our sponsor. This episode is brought to you by the Tech Meme Ride Home podcast. This is a great podcast: for more than two years and nearly 700 episodes, the Tech Meme Ride Home has been Silicon Valley's favorite tech news podcast. The Tech Meme Ride Home is a daily podcast, only 15 to 20 minutes long, and out every day by 5 p.m. Eastern. It's all the latest tech news, but it's more than just headlines.
You could get a robot to read your headlines. The Tech Meme Ride Home is all the context around the latest news of the day. It's all the top stories, the top posts and tweets and conversations about those stories as well as behind the scenes analysis. The Tech Meme Ride Home is the TLDR as a service. The folks at Tech Meme are online all day reading everything so that they can catch you up. Search your podcast app now for Ride Home and subscribe to the Tech Meme Ride Home podcast or visit
pythonbytes.fm/ride to subscribe. Yeah. Thanks for sponsoring the show. And Brian, they do this every day. We do this once a week and it's a lot of work; these guys are on it daily. I could totally do this every day. If I didn't have another job, I would not have any problem catching up on Python news daily. It actually sounds quite lovely. I wonder how that podcast is doing now that not so many people are having to, you know, commute to and from work. That sounds like one of those things where you hope you've given people the excuse to tell their employer, on my commute between four and 5 p.m. I can't do it, I've logged off, and go listen to a podcast during that time. I realized I was missing out, so I started listening to podcasts while I'm doing my laundry. I do it when we're doing dishes. I listen when I'm doing yard work, stuff like that. Yeah. I recently broke out some older podcasts just to catch up on stuff during a big mess I had around the home. It's a long story with a new puppy that's not worth going into. Maybe I'll tell you guys after the show. It's pretty outrageous anyway. Yeah. And it's so enjoyable, but what I've actually found is that shows like Python Bytes and Ride Home, that have a bunch of short little segments you can just drop in and out of, match a lot better now that people are not commuting so much. You can do that 10 minutes while you're folding laundry and get a whole segment. And so I think it varies, but actually it's pretty interesting, and I think that they and us here are well positioned. Like, Talk Python has more of a dip than Python Bytes. Have you surveyed your listeners to find out what they're doing while they listen to you? No, not really. Everyone should tweet at these guys on Twitter: what were you doing when you were listening to this podcast? Yeah, that's awesome. I've got a few anecdotes, but nothing like a survey that would give me a proper answer. Steve, I think your next item is super, super interesting. And speaking of Twitter, right? Like, this whole conversation started as a, hey, Twitter message: why did this happen? Well, let's ask Steve. Yeah. As you know, I spend a decent amount
of time on Twitter. My handle there is zooba, Z-O-O-B-A, which you'll probably never find in a search if you're looking for my name, but that's where I am. I actually really like searching for what people are saying about Python on Windows. It's kind of the most honest feedback you get, when they think you're not listening. And so I go and listen. And one of these popped up, which was: oh, I tried to install Python 3.9, the newest release from a little under a month ago, on my Windows 7 machine, and I couldn't install it. And since then I've actually seen a few more posts. Someone managed to bypass the installer completely, get it all the way onto their Windows 7 machine, and then found out it wouldn't run. Oh man. Yeah. And so the question was, why would you do this? Windows 7 is still a fairly big platform, why would you take it out? And, you know, the answer was just a bit too long for a tweet, but someone kindly included Python Bytes in the reply. And so I said, hey, I'll come on and talk about it. So let's do this topic. And the answer is that multiple legal-looking documents all come together and have to be read in parallel to figure out why we dropped it. Yeah, the small business owners know what I'm talking about. So one of those documents is PEP 11, one of the lowest-numbered PEPs we have, and it's titled Removing support for little-used platforms.
The title was not originally about Windows, but there is a section in that PEP that describes Python's policy for supporting Windows: on the release day of 3.x.0, all the versions of Windows covered by Microsoft's support lifecycle will be supported by CPython. And that's as of the 3.x.0 release date. So what that means is you now have to go and look at Microsoft's support lifecycle website and look up all of the different versions of Windows to see which ones are still covered by support on that date. Windows 7 fell out of extended support in January. There was quite a bit of noise about that, because it means no more security patches, except Microsoft did do a couple more security patches because some really bad stuff was found, but that's largely stopped. Essentially, it's the point when the only people who get Windows 7 support are paying for it, and I assume they're paying large amounts of money for it. I don't actually know how much it costs, but that's the point where, you know, you bought it, but you don't get the free support anymore. So CPython follows that, because no one is paying the core team to support Python on all of these platforms. And so it seems like the fairest place to draw that line: at some point we have to say our volunteers can no longer keep a Windows 7 machine running. Even I can't keep a Windows 7 machine running safely, because there are no security updates for it. How am I meant to develop Python on it? How am I meant to test Python on it? The burden there is too high for volunteers to handle. So we just say that's the point where it goes away. So, because of how those two documents line up, Windows 8 actually dropped off a couple of years ago, because the support lifecycle ended early for that to force everyone onto 8.1. Windows 8.1 has about three more years, so I think Python 3.12 will be the last one to support 8.1, and then it's all Windows 10 or whatever comes out in the future. Yeah. Yeah. Windows 10X or
whatever they call it. That's a different one. That's the Xbox. Yeah. So, you know, I think this makes a ton of sense. And two thoughts I had as you were laying out the case here. One is, if you're running Windows 7 and you can't upgrade to even Windows 8, or more recently Windows 10, or one of the server equivalents, right? I'm sure there's a server equivalent, like Windows Server 2003. I don't know how long that's supported, but whenever it goes out, it probably falls under that banner as well, right? Yeah. Windows Server is a bit more interesting. Their lifecycles tend to last longer, but historically CPython has only really tied itself to the client operating systems. Gotcha. Oh, interesting. Okay. So to me, I feel like if you're running code or you're running systems that old, you must be running
it because it's like some super legacy thing. So do you absolutely, necessarily need to have the most cutting-edge Python, or whatever language? It's probably something that's that way because it's calcified, and you probably shouldn't be putting the newest, shiniest things on it, right? Right, that's one thought. What do you think? Yeah. No, I totally agree with that. If that setup you're running is so critical that you can't upgrade the operating system, how can you upgrade a language runtime? It's like, how can you upgrade anything on that? I feel like it's in the category of, please just don't touch it. It's over there. Just don't even walk by it. Leave it alone. We cannot have it break. Just leave it over there. It probably still has a PS/2 keyboard plugged into it. Oh, it might, with a little blue or pink round thing. Yeah, absolutely. The screen has probably got at least 16 colors. Yeah. And the monitor is probably really heavy. Windows 7 is not that old, though. Seriously, some of this stuff is old and you probably don't want to touch it, right? Yeah, that's exactly it. Whatever motivation you have for updating from Python 3.8 to 3.9, and again, we're talking about a version that's only one year old, Python 3.8 is not that old, if you desperately need to upgrade to 3.9, you even more desperately need to upgrade Windows. And there really is no question about that. The same thing applies to early versions of Ubuntu. People running Ubuntu 14 or even 16 at this point need to be facing the same thing. And we have similar discussions around OpenSSL, where occasionally people will say, oh, I need Python 3.9 to run on OpenSSL 0.9, to which our answer is
basically, that's pre-Heartbleed. It's like, okay, I'm going to play the other side. I totally get the reasons, but I also get the questions, because the users and the developers, whoever's wanting to install Python, usually don't get to choose what operating system they're using, but they do get to choose which version of Python they're using. So I do get it in some cases, and in some cases I totally understand where the question is coming from. Yeah. We joke about how old these machines are, and they're really not. People are setting up new machines, probably with Windows 7; they certainly were within the last year. And there are good, legitimate reasons for that. We're making fun of some of the apparent contradictions, but we're definitely not making fun of the people who have often been forced into these positions. But the reality is we can't afford, as a volunteer team, to maintain Python against unmaintained operating systems. And so the advice is: stay on the latest version of Python that works for you. It's not going to break. We're not changing it. For anything that comes up, security fixes will still come out. At some point,
there just has to be a line drawn, and that's the point where we've chosen to draw it. The other thing I want to point out that we changed in this release, which people are more excited about, is that if you go to python.org to download Python for Windows, you get this real big obvious button up front that just says download for Windows, or download now, or something. As of Python 3.9, that's now getting the 64-bit version rather than the 32-bit version. For a long time it's been 32-bit, and the reason for that was compatibility: we knew a 32-bit one would run anywhere. When we put Python in the Windows Store, that was 64-bit only. We kind of wanted to test the waters and see, hey, will people notice that we haven't put a 32-bit version here? Turns out no one did. And so when we got to 3.9 and had that change, we made it 64-bit by default. That has a flow-on effect to the ecosystem. A lot of packages, particularly data science packages, would rather just do 64-bit-only releases, and some of them certainly get the 64-bit builds done first rather than the 32-bit ones. So we expect to see some flow-on impact from that, just broader use of 64-bit Python throughout the Windows ecosystem. Yeah, that's super cool. And just the final thought I had was, you know, Django dropped Python 2 and they're like, we were able to remove so much code and it is easier for new people to contribute
because they don't have to write for two ecosystems. They write for one. NumPy did the same thing. And I feel like this is sort of the same story. Like you guys can just not worry about yet another older, outdated operating system and stay focused on what most people care about. One thing that someone did suggest in one discussion was why not dynamically light stuff up for the newer operating system? And the answer is we do that. And when we drop the older operating system,
we get to delete about a hundred lines of code for each point where we do that. So it is, we get to do a cleanup. We get to say, oh, we don't have to dynamically check for this API, this API, load this, cache this, store that, call that, call this format. We can just condense that into, oh, we know that this API is there so we can use it and just reduce a lot of kind of effectively dead code on newer operating systems. Nice.
Is that a pre-compile, like #ifdef, sort of thing, or is it a runtime thing? Does it make a performance difference? It definitely makes a performance difference, though we try and minimize it. But again, there's always some impact. It tends to be in operating system calls anyway, so you expect a bit of overhead, and it's not going to add a significant percentage overhead compared to whatever
operation you're doing. But it does certainly add a lot of cognitive burden for someone who's reading the code. Sure. One example that we got to clean up recently, not in this release but in a previous version: we had about, I think, 70 or 80 lines of code to concatenate two path segments. And this is before Python's loaded, so we have to do it with the operating system APIs. The API that was available up until Windows 7, I think, so pre-Windows 7, was not secure; it would have, you know, buffer overruns, all sorts of horrible stuff. But it was the best available function there for handling certain cases. So we'd use it, but first we'd dynamically look for the newer, safer one and call that. As soon as we dropped, I think, Vista, we could delete all of that code and just unconditionally call the one safe path-combining function. And that code got a whole lot simpler.
Yeah. Lovely. That's awesome. Yeah. Cool. Brian, would you say it's more robust now? Yes, I think it would be more robust. Actually, I thought I showed up to the bash podcast. Is this the Python podcast? Yeah, this is not Bash Bytes. Okay. I love Python. Of course, I still use bash regularly, and I know a lot of people, like sysops folks and others, who are using bash daily as well. So I wanted to highlight this
cool article. This is an article by David Pashley called Writing Robust Bash Shell Scripts. And even though I've been writing scripts for decades, I learned a whole bunch from it, and I'm going to start changing the way I do things right away. The first two tips are things I'm going to adopt immediately. The first tip is to include set -u. I'd never even heard of this. What it does is make your bash script exit if it encounters an uninitialized variable. So the problem without
this is, let's say you're constructing a path name, a long path, and one of the directories or file names comes from a variable. If that's never set, bash normally just silently drops it; it's just not there. And the script will still keep executing anyway, but it's not going to do what you
want it to do. So yeah, I definitely want to turn this on so I don't use uninitialized variables. Similarly, in scripts and shell work, a non-true return value usually means something bad happened. If you use set -e, your script will exit at any point where one of its statements returns an error value.
So you don't want to just keep rolling along with an error condition. So this is good; I'll cautiously add this to my scripts, because I want to make sure they keep working. Then there's a tip to just expect the unexpected. There will be times when you'll have missing files or missing directories, or the directory you're writing into hasn't been created, so there are ways to make sure it's there before you
write to it. Especially if you're running on Windows, be prepared for spaces in file names. Variable expansion in bash does not really protect spaces, so you have to put quotes around the expansion to make sure it's treated as a single thing. And the next one is using
trap, which I never actually knew how to do before. So if you've got a bash script that's running and something's wrong and it won't exit, you can kill it, or there are other ways to get it to stop. But if the system is in a state that needs some cleanup, there's a way to use the trap command to exit gracefully and clean things up. The last couple of points were: be careful of race conditions, and be atomic.
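To see those tips together in one place, here's a small sketch of a script that uses them. The paths and commands are made up for illustration and aren't taken from David Pashley's article:

```bash
#!/usr/bin/env bash
# Illustrative only: made-up paths, not from the article.
set -u    # error out on use of an unset variable instead of expanding it to ""
set -e    # exit immediately if any command returns a non-zero status

workdir="/tmp/demo build dir"        # deliberately contains a space

cleanup() {
    # trap handler: runs whether we exit normally or because of an error
    rm -rf "$workdir"
}
trap cleanup EXIT

mkdir -p "$workdir"                          # expect the unexpected: create it first
printf 'hello\n' > "$workdir/out file.txt"   # quotes keep the space from splitting
cat "$workdir/out file.txt"
```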
Those are good things to do, but at least a handful of these I'll put to use right away. So it's good. Yeah, this is neat. And a lot of the stuff I didn't really know about. So yeah, like, continue on if something went wrong, just plowing ahead. Yeah, that's cool to know that you can make it stop. Steve, do you ever do any bash? You got a WSL thing going on over there? I've certainly done a lot more bash since I started using WSL for a lot of things. I was aware that
using an uninitialized variable would substitute nothing, but I'm very happy to know that there's a way to turn that off, because that has certainly caught me out many times in the past. And this looks like just a good article that I'm going to have to go read myself now, because it has everything that you learn from doing scripts in, like, Command Prompt or PowerShell now, or even Python to
some extent. I have not personally mapped those to bash equivalents, so it sounds like this would be a good place for me to go through that and up my skills a little bit. My favorite thing was the find command. Once I got that, it felt as powerful as a regex, and I'm kind of like, oh, I don't need to write a whole script now, I can just do one excessively long find command. Nice. Yeah, I use find a lot as well.
All right, this next one I don't want to spend too much time on, because I feel like you could easily go and spend an hour on it. But for time's sake, we don't have a whole lot of time left in the episode because we have a bit of a hard stop, so I'm going to go through this and get your guys' thoughts on it real quick. There was a tweet about a GitHub repository, and a conversation on the Python mailing list, lots of places. So Anthony Shaw tweeted, calling attention to a roadmap by Mark Shannon with ideas for making CPython five times faster. He laid out a roadmap and a funding plan and some interesting ideas. I'm going to go through them quickly, and then, especially Steve, we'll see what your thoughts are here, how reasonable this might be. So the idea is,
there are going to be four different stages, and at each stage, I think, you can get a 50% speed improvement. You do that four times and, you know, with compounding performance interest, you get about five. I think it talks about 3.9 somewhere, but anyway, maybe it has to shift its numbers a little. So, Python 3.10, stage one, would be an adaptive, specializing interpreter that adapts to types and values during execution, exploiting type stability, without the need for runtime code generation. That almost sounds a little bit like what you were talking about with the 10% improvement earlier, Steve. Then 3.11, stage two, would be improved performance for integers of less than one machine word, faster calls and returns through better handling of frames, and better object memory layout. Stage three, Python 3.12, requires runtime code generation: a simple JIT for small regions. And Python 3.13, stage four, extends the JIT to do a little bit more.
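For what it's worth, the compounding arithmetic behind that rough 5x claim, assuming four stages at about 1.5x each, works out like this:

```python
# Four stages at roughly 50% (1.5x) improvement each multiply together.
stage_speedup = 1.5
overall = stage_speedup ** 4
print(overall)   # 5.0625, i.e. roughly the promised 5x
```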
And I'm linking to a long, threaded conversation over on python-dev. There's a whole bunch of stuff going on here, so I encourage people to read through it. But there are a lot of interesting implications, like: how do we pay for this if we pay someone to do it? People like Steve work on CPython and don't get paid, so how is it fair to pay someone else to do it when other people are volunteering their time? There's a lot going on here. Steve, what do you think about this? Have you been following it? I read through the original proposal. I haven't had a chance to chat with Mark directly about it. I will, I guess, start by saying that Mark is a very smart guy, and he has done all of this planning off on his own, in secret, and kind of come out and shared this plan with us, which, you know, is not an ideal workflow, certainly when you're part of a team.
But I have certainly found in the past that when a very smart guy or very smart girl goes off and disappears for a few weeks and comes back and says, I've solved it, there's a good chance they've solved it. So I'm very interested to see where it goes. The part of the discussion that you didn't mention, or that you hinted at, is that this is kind of a proposal for the Python Software Foundation to fund the work, and part of that funding is conditional on delivery, the way he's proposed this would work. And the implication seems to be that Mark will do the work himself and be the one getting paid for it. Yeah, that wasn't entirely clear from his GitHub repo, but if you read the conversation, it was like, look, I'm pretty sure I can do these things; this is how much it would make sense for me to get paid to spend the next couple of years working on it. How do we do a
fundraiser so that I can do this for everyone? Yeah. And, you know, I think under those conditions, if the PSF is able to put the budget towards it... they are in a bit of a tight spot, since PyCon is normally the big fundraiser for the year and that didn't happen. On the other hand, it's also the big expense. But financially, the PSF is not in the normal place they'd be for the year, because PyCon didn't happen in the same way. But I think they'd be prepared to put funding towards this if the community consensus is that this is the most important thing for us to do. And there are certainly potential downsides to doing it. Code complexity is the big one, and I don't actually think there's a way to implement this, or even achieve these performance gains, without making the code much more complex, and hence less accessible to new contributors and people in earlier stages of learning to code, at least on the C side. Yeah. So there are trade-offs. I'm very interested to see what would come about. I assume that because 3.10 is targeted for the first pass, it's already done and he's already got the code. And he's trying to
actually... And he's just trying to get confirmation that he can spend the next few years heavily investing in it instead of having to go find a full-time job and go back to doing this in the evenings. Yeah. Which, you know, I'm fully supportive of. Again, it's really just a big open question of, is this the most important thing for Python to be funding right now, for CPython to be getting in particular? Someone, I forget who, raised the question of what if we put that money towards PyPy
instead? You know, what could they do with it in that amount of time? And ultimately, it's going to come down to someone, probably a small group, presumably the steering council will have some involvement from the technical side. The Python Software Foundation Board will no doubt be involved in just deciding, is this the best use of the money that we have or can go out and get for what benefits it would produce?
Yeah. When I look at the funding side, I see it as fraught with challenges, in terms of community funding and PSF funding. But I know there are so many huge companies out there that make a lot of money and spend an insane amount on compute and infrastructure, and if they could have a 5x speedup on their code, they could probably save that money right away on
infrastructure. So it seems like it could also get funded that way. But we should probably move on, just to make sure we have time for everything else before we run out. I do want to call out that you should go check out that conversation. There's a very funny excerpt from Larry Hastings. Speaking as the Gilectomy guy, when they were talking about borrowed references being a challenge, he says borrowed references are evil.
The definition of a valid lifetime for a borrowed reference doesn't exist, because they are a hack baked into the API that we mostly get away with because of the GIL. If I still had wishes left on my monkey's paw, I'd wish them away. Unfortunately, I used my last wish back in February wishing I could spend more time at home. So bad. All right, Steve, let's get a little bit more insight from you on this last one, huh? Because you were at the core developer sprints, which recently happened.
Yeah. So I don't know exactly what day this is going to go out, but last week as of recording, we had the CPython core developer sprints. This is kind of a get-together, generally an in-person event, that the core development team has done for five years now. I think this is
the fifth year. In the past, we've all gone down to Facebook or Microsoft or last year, we hung out at Bloomberg in London and basically spent a week in a room together coding, discussing, reviewing things, designing things, planning things, and otherwise just getting to actually meet our other contributors because we all work online. We all mostly work over email and kind of bug tracker and GitHub
pull requests throughout the year. And so it's a really good opportunity to get to meet each other, get to see who we're dealing with. It's a lot harder to be angry at someone over email when you've met them. Yes. And so it's been a really good event. This year, because we're obviously not traveling for it, we were hosted by Python Discord, which is at pythondiscord.com. There's a server that is really well managed. It's really well organized. I was impressed. I have not been there before,
but it was great. They set up what felt like thousands of channels for us, far too many, but it gave us plenty of space to mingle with other core devs while we were discussing and working and planning things. We also did a Q&A, so there'll be a link in the notes to the YouTube live stream. We had people submit questions ahead of time, everything from what situations should I use a mangled name in, like a double-underscore-led name, through to, you know, what's your least favorite part of Python? What do you most want to replace? Did you ever expect Python to get so big? And we had a lot more people involved. We normally do a panel for the host company, so we'll get their employees together, and it's part of the perk for funding the venue and typically meals and coffee and everything for the week. This time it was public on YouTube, all over video, so everyone got a bit of a turn
to jump in. So you'll get to see a lot more core developer faces than you've probably ever seen before, and you'll get to hear from a lot more of us than you have before. As for the big ideas that came out of the week, it's kind of hard to say. A lot of us did come out feeling like we didn't get as much forward momentum on stuff as we normally would in person, but at the same time, a lot of things did move forward.
I think there were about seven or eight PEPs passed up to the steering council during the week on various things. One of mine was deprecating distutils, which is an entire podcast on its own, so I might have to call you guys another time to talk about that one. Through to a proposal to change how we represent 3.10, because in a lot of places we put the version numbers back to back with no separator, so you have, you know, 38, 39 with nothing in between, and now we're up to 310. Or is that 3.1.0? Yeah. Okay. How do we fix that? We had a lot of discussions about that. There was obviously a lot of talk about the JIT, about the C API, all the usual things that we talk about. But again, because it was online, it was really good to have such a range of people involved, across different time zones, and people who would not normally get to travel. Yeah, it makes it more accessible. Yeah. That's awesome.
Yeah. We have core developers in countries who can't leave. Like, they literally cannot leave their country, either because the populace is strictly controlled or because they know they would not get back in when they tried to go home. And so they were able to participate, and it was great to, you know, see and meet some of those people. We had a few mentees come along to interact with the rest of the team.
And just overall a good week. Awesome. Yeah. Cool. And yeah, people can check out the YouTube stream. I definitely want to check that out, sounds neat. All right, Brian, we've got two minutes left. Do you think we should do a joke? Yeah, let's do the joke. Oh, let's just cut to the joke so we don't miss it. Right. So you and I spoke about Hacktoberfest going wrong, with random PRs to config files, changing the spelling in config file settings. So there was a guy who posted on Twitter, let me double-check the name, it was Stuart McCroden, and he posted this cool t-shirt that he got. It says Hacktoberfest 2020, any PR is a good PR, and it shows Lua.py with import high open in Vim; then the PR just adds a hash comment: this imports a package. That's awesome. Steve, did you suffer from any of these? I did not. Well, I might've done; my GitHub notifications are a mess.
So yeah, I don't even know yet. Yeah, I don't see pull requests until I actually go look at the repo myself, for the most part. Yeah. I got a bunch. I got a whole bunch. Yeah, me too. Cool. Okay. I wanted to do one more. This should have been a topic: the five most difficult programming languages in the world. This was submitted to us by Troy
Codil, Codil, I think. It's not really a full topic, but I thought it was hilarious. This is an article where the author, Locad, Locate, I guess, took five programming languages, Malbolge, INTERCAL, Brainf..., you know, we all know that one, COW, and Whitespace, and wrote Hello World in each of them. And these are hilarious. My favorite is Whitespace, because the entire language depends on space, tab, and linefeed for writing the program, and any non-whitespace
character is considered a comment. So this is great. That's crazy. I don't know why APL wasn't on there. APL is just fully insane. I'll put an example of APL in the show notes here at the bottom. That thing I put in there, I can't even try to speak it, but it is an entire expression that finds all the prime numbers from one to R, in that one line. I hope you guys see it at the bottom of the notes. That's insane, isn't it? And that's not even intentionally bad, is it?
No, it's meant to be a real programming language. It's as if the Egyptians, who only wrote in hieroglyphics, decided to write a programming language. That's how I feel. It's insane. But it's a legitimate language. People try to use it. They do use it. Anyway. Not for very long, I expect. Only as long as they must, and then they immediately stop. I just like that this INTERCAL example is so polite. It's please, PLEASE DO, please, please, oh, PLEASE GIVE UP.
Yeah. Apparently you have to sprinkle pleases in it, or else it errors because you're not polite enough. But if you do it too much, it also errors because you're overly polite. I like that. We need more passive-aggressive languages like that. Lovely. Cool. Well, thanks a lot, guys. It was fun. Yeah. Yeah. Thanks, Brian, Michael. Yeah. Thanks. And thanks for being here, Steve. It's great to have your perspective.
Thank you for listening to Python Bytes. Follow the show on Twitter at Python Bytes. That's Python Bytes as in B-Y-T-E-S. And get the full show notes at pythonbytes.fm. If you have a news item you want featured, just visit pythonbytes.fm and send it our way. We're always on the lookout for sharing something cool. This is Brian Okken, and on behalf of myself and Michael Kennedy, thank you for listening and sharing this podcast with your friends and colleagues.