[00:00:00] Alex Sarlin: Welcome to season eight of EdTech Insiders, where we speak to educators, founders, investors, thought leaders, and the industry experts who are shaping the global education technology industry. Every week, we bring you the week in EdTech important updates from the EdTech field, including news about core technologies and issues we know will influence the sector, like artificial intelligence, extended reality, education, politics, and more.
We also conduct in-depth interviews with a wide variety of EdTech thought leaders and bring you insights and conversations from EdTech conferences all around the world. Remember to subscribe, follow, and tell your EdTech friends about the podcast, and to check out the EdTech Insider Substack newsletter.
Thanks for being part of the EdTech Insiders community. Enjoy the show.
[00:00:55] Ben Kornell: Hello, EdTech Insider listeners. It is February. We are skiing down the slopes of EdTech Mania. The conference season has started early this year. The news breaks are coming out fast and furious, and we're here to bring it all to you. Uh, alongside me is my trusty partner, Alex Sarlin. Alex, how are you doing?
[00:01:19] Alex Sarlin: Doing good.
A lot going on this week. I feel like we gotta just do rat-a-tat and hit some of these highlights from Big Tech, from higher ed. Uh, before we do that...
[00:01:29] Ben Kornell: What's going on with the pod?
[00:01:31] Alex Sarlin: We have some really cool episodes coming up. We just put out an episode this week with Lydia Logan, who runs sort of workforce development with IBM.
They have publicly committed to training 30 million people, upskilling all over the world, 2 million in AI. We have a great conversation coming up with Nick Hernandez from 360Learning. That's a really big European workforce company that just bought an AI taxonomy company. They're doing really interesting things, and we also talked to Robin Cook from Audi School, which is doing some really interesting AI-based curriculum work.
So, you know, it continues to be sort of AI month in EdTech Insiders land, but definitely check those out.
[00:02:12] Ben Kornell: On the event schedule, we just had the Bay Area EdTech Summit. It was a great event, you know, over 150 people. The kind of signature of the event really is the small group dinners. So we had 150 people out for dinner and just great connections and conversations. And then we've got some really exciting events coming up on March 4th.
We have our South by Southwest EDU happy hour. That's in Austin at the Seven Grand. Make sure you check out our newsletter to sign up. And then we've also got one in Redwood City, California, the Bay Area EdTech Happy Hour, on March 21st at the Blacksmith Bar. All of that information you can find in our Substack.
We're really, really excited to be connecting the EdTech community. So let's jump into the headlines. I think this is a great week to just start with a bang with AI. What's at the top of your list, Alex?
[00:03:12] Alex Sarlin: So the top of my list is something that you are intimately involved in, so I'm gonna probably kick it right back to you.
But I mean, I noticed that in ChatGPT, they are really, really pushing the GPTs, the GPT Store. They just launched a feature that basically allows you to @-mention specific GPTs within a conversation. So you can basically invoke a specialist agent in the middle of a conversation, and they're obviously trying to build a whole ecosystem on top of this third-party content.
But it was also announced at the recent summit that Common Sense Media is working with OpenAI to build AI safety guidelines, to try to make sure that some of these AI tools can be used safely in schools or home settings. Why don't you tell us a little bit about that, because you were the one posting pictures with Sam Altman on LinkedIn this week.
[00:04:03] Ben Kornell: Well, I mean, I think like any good partnership, everyone's like, congrats, congrats. And then you're like, oh crap, this is gonna be a lot of work. And so I think, you know, we're diving in. So just for full disclosure, everyone, I run the kind of for-profit investment arm of Common Sense, and our partnership with OpenAI
is a multi-year partnership, and it starts with the GPT Store and curating the GPTs. And as you can imagine, eventually, in some future state, there will be kind of a safe mode or a kids mode or a teen mode or something like that in GPT. But actually, the GPT Store is a way of almost tuning the GPT ecosystem to give you very specific outputs.
And you know, a game GPT might give you all the rules for board games at a second-grade level, but there are GPTs that are really great for educators, really great for families. And so OpenAI asked Common Sense to help kind of curate those, and then also make sure that there's a rubric or a framework that they're using to approve what gets into the GPT Store.
So you might think, okay, there's a few thousand apps in the GPT Store, that seems like a lot. The reality is there's about a hundred thousand GPTs in the queue, and what you see is the tip of the iceberg, the ones that have been approved to kind of go live in the store. And some of this backlog has clear implications or use cases in education or with kids and families.
And so the Common Sense team is really teaming up with them to curate and filter that. And then on the parent and educator communication side: we've all kind of seen the landing pages from the different tech companies saying here's how a teacher should use it, here's how an educator should use it. It's a single landing page with a couple of, you know, bullet points.
The idea is that OpenAI and Common Sense are teaming up to really create a portal for parents and a portal for educators, where they can actually get really great content about what's good, what's bad, how to use, how not to use, but also get connected with kind of the best, you know, GPTs for kids and families.
And you know, over time, on a longitudinal basis, there's a ton that we can do. What I was really impressed with actually goes back to the fall, when Common Sense Media launched their AI reviews. They got a lot of phone calls from people who weren't particularly happy. Why did you rate this that way? You know, why'd you give us a three?
Why'd you give us a two? Why'd you give us a one? And what was interesting is Sam Altman reached out and said, this is incredible. Like, we haven't had this kind of feedback. We wanna learn more. Our team wants to see the training data you used and the files that you pinged, and we wanna learn from you and make ChatGPT in general safer for kids and families.
And so it was kind of, you know, this alternative universe that we could have had with social media, where they could have leaned in for safety. I think the AI teams are realizing that it's going to be very, very slow on the legislative side to create guardrails for positive use cases, and so they're
actively pursuing partnerships, and OpenAI is taking the lead.
[00:07:33] Alex Sarlin: It's exciting, and I've been sort of continually pleasantly surprised at how much the OpenAI folks... you know, they have a lot of things they could be paying attention to right now. They're sort of in everybody's mind for every different industry, but they really seem to care a lot about the education use case, for a variety of reasons.
And it's pretty neat to see. And I respect that thinking of them sort of trying to get ahead of the inevitable chaos that comes with an app store, and starting to say, let's actually have some meaningful guidelines, some guardrails, all the things that educators and families have been, you know, saying they desperately need to make sure that their kids are safe and everything is working as
planned. So I think it's a really exciting project. A couple of other AI headlines, you know, popped out this week. One was EDSAFE AI, which is a really interesting sort of consortium of a number of different education organizations, including, you know, Erin Mote's InnovateEDU and Digital Promise and others.
They just announced their new fellowships. They have two new fellowships, each with, you know, a few dozen people in them. And I think it's worth checking out and following what they're doing, because they are also sort of working alongside you at Common Sense and OpenAI in trying to build a really meaningful ecosystem of people who understand AI well enough
to be able to actually guide it towards a positive future in the education realm. So that was really cool to see, and we will, you know, follow up on that in future podcasts as we know more about it. And then the last is, of course, the National Educational Technology Plan, which we talked about last week and which just came out, you know, uh, a couple weeks ago.
And Ben, I think you got a chance to speak to one of the folks from the Department of Education about how they've been thinking about that, including ai. I know that's at the end of this episode, but do you wanna give a little bit of insight into what you talked about?
[00:09:23] Ben Kornell: Yeah, I mean, I don't wanna steal their thunder too much, but one thing I would just say is,
you know, we've had national ed tech plans before, and actually our guest was part of the Obama administration. But I think what we're realizing is that for schools and districts, purchasing ed tech has become a much more complex and complicated endeavor. And so these guidelines are not just for the ed tech industry or for policymakers; it's really practical advice on how do you ensure equitable access?
How do you ensure that the kind of use cases and efficacy match up? And then how do you implement in a way that is going to lead to success? So it's a great conversation. And I will just say, we should also note that the federal government is also creating some new administrative bodies for AI.
And so this question of the AI use case in education is actually getting a lot of attention at the federal level. Some people you might wanna be paying attention to: Elizabeth Kelly, who has been the special assistant to the president on the National Economic Council, is now going to lead the US AI Safety Institute, and Elham Tabassi, who is currently the Chief AI Officer at the National Institute of Standards and Technology, will become the Institute's new Chief Technology Officer.
So, as you may remember, we covered the kind of administration's big announcement about AI regulation. They formed the US AI Safety Institute at that time. We now have some leaders there, and it looks like education is actually gonna be one of the big focus areas. So I think this National Educational Technology Plan is a great stepping stone to a holistic view of AI and education.
[00:11:20] Alex Sarlin: I had been pretty dour in the past about some of, especially, the US's sort of slow-moving concept of regulation. You know, we've talked a lot on the podcast about how China was way ahead, about how Europe was ahead, but it feels like there's been a little bit of a sea change. I think that the US, you know, federal government has really wrapped its head around how serious this is going to be for so many different
you know, industries, for so many different people, and how it's an equity issue, how it's an education issue, how it's a safety issue. They've talked about bioweapons. I'm really excited to see them sort of stepping in with more very smart people and serious initiatives. So that's awesome. You know, there were a couple of really big tech headlines this week.
I think, you know, both of them are worth talking about. Even though they're not directly about education, they definitely have implications for education. One, of course, is the launch of Apple's expensive but apparently incredibly powerful Vision Pro XR headset, which is, you know, in the $3,000-plus cost range, but it's sold out even at that, and it clearly can do incredible things.
And I, you know, I've been reading reviews of it from various people, and you know, it's been in social media a lot. But, you know, this is obviously a little bit of a blow to Meta, which has renamed itself Meta to sort of own the metaverse and has actually, you know, been losing money on its Reality Labs. And there's, you know, a brewing brouhaha about who is really gonna be the big dog in the metaverse world.
I'm curious what you think about the Apple and Meta sort of wars here, or just what you think about the Vision Pro in general.
[00:12:57] Ben Kornell: Yeah, so I got a chance to take the, uh, Vision Pro for a test drive. And the reality is, pun intended, that all of this technology is not really ready to take over our lives. And so whenever you're using it, you have to suspend some disbelief to say, okay, where is this headed versus where
are we today? And the Vision Pro feels like a meaningful leap forward. And you know, I have a Quest 2 as well, and there's a Quest 3 coming out, and on that it has more of a gamer feel, where it's animated and there's different levels of realism in the animation, but you really know you're in an alternative universe.
You're not in a virtual copy of our real universe, and it can be clunky to move around and navigate. You've got the different, you know, handheld controllers and so on. And I think for certain use cases that can be really, really great. And in the education space, there have already been some great companies showing that hands-on learning simulations in VR can be really practical for particular use cases.
I'm thinking of Prisms VR as one example, where kids are learning advanced algebra or geometry through spatial learning. I think the Vision Pro really starts blending the real world and the virtual world. Super-high-definition screen, very, very realistic. You get dropped into a music video with Alicia Keys and she's, like, literally right in front of you talking to you, or you're in Africa and you're feeding a rhinoceros right up close, and it doesn't have that animated feel.
It really feels cinematic. And so there are these moments in the Vision Pro where you are totally, you know, enmeshed in this universe. They call it immersive experiences, and I think that's really where it's most powerful. On the, like, practical work side, you know, I wanted to test out: is this how people are going to be doing virtual work in the future?
You know, the idea that the world can be your screen, that you can have different pieces up. What people need to understand is that with the Vision Pro, you can't actually see through it, but it has cameras on the outside that make it look or feel like you can see through it. And so you're actually able to do augmented reality, where you're able to have your screen on top of things that you're looking at, and you can take these immersive videos of the area and share them with others and kind of port them to your place.
But in terms of, like, creating a spreadsheet or creating a PowerPoint deck, it's pretty clunky. And it really just felt like, okay, for these tasks or tools, I'm just gonna use my MacBook, but for these immersive experiences, I would use the Vision Pro. And the last thing I'll just say is I think one element the entire VR industry is trying to figure out is how to make it social.
One way to make it social is: I am in VR, but I'm connecting with people all over the place, which, you know, is what video games can already do today. But the idea that I'm going to be in a room with other people and I'm going to put it on, that immediately makes it a single-player experience.
And so I think they've gotta figure that part out, because it does kind of create this real barrier, and you're not able to do, like, video calls or things like that, because you've got the thing on your face. So figuring out how to make multiplayer and connectivity work, I think that's the key for really any technology today: how you create virality in use and repetitive use, because that's the platform where your social interactions are happening.
And sure the gaming use case is relevant here, but I'm not sure for work or family or life that they've really cracked the nut on that one.
[00:17:01] Alex Sarlin: It's very interesting. A nice, you know, inside view; we've actually been inside it. It's really, really cool. I mean, my quick takes for it, for the education space:
obviously at this price tag, we're not gonna see these in schools, or probably even higher education, yet. We may see them in workforce training situations, especially very specialized workforce training situations like the military or, you know, high-level surgery or things like that. But the social aspect that you're mentioning I think is really interesting, because I think this is a good, uh, segue to our other big tech story of the week, which is that, you know, you could say something pretty similar about cell phones to what you just said about the Vision Pro, right?
I mean, cell phones are inherently, at least I think they're inherently, you know, isolating. They're solipsistic: it's a person looking down at a device screen in their hand. You know, we've all seen two people at a, you know, at a dinner together, both looking at their phones. But at the same time, as, you know, the App Store
and the sort of cell phone world developed, they had more and more and more quote-unquote social apps, right? I mean, from things like Words with Friends to Instagram to TikTok. You know, I think there's gonna be some interesting evolution, whether it's in education or in entertainment, there's gonna be some interesting evolution over time where people try to figure out what are those killer apps and killer use cases of VR.
And I agree with you that they have to have some social element, but what does that mean? I don't think anybody knows, right? It's a new medium, a relatively new medium. People have tried; you know, this is one of many big tech companies that have tried to sort of break through with a mixed reality, augmented reality device.
We all remember the Google Glass. And I'm not saying this is gonna go the way of the Google Glass, but I am saying that I think there's gonna be some interesting evolution. I mean, Apple has put a crazy amount of money into developing this thing, which is why it's so high tech. They're all in on it. Tim Cook this week, you know, is saying, I'm all in on this.
We're gonna see where this is all gonna lead. But you know, the same way that Apple, and then Google, sort of changed our lives by making cell phones a must-have device, there is a potential future where these types of devices become must-have. But I agree with you that they certainly don't feel like it yet.
And the segue is that, you know, you mentioned social media before, and how, you know, in the nineties there was this sort of landmark legal decision where they basically decided that social media platforms were not liable, could not be sued, for content on the site posted by users. And really, you know, you could almost see that moment as the original sin of our modern era.
Because by doing that, it basically allowed all the social media sites to grow unbelievably, to absolutely infiltrate every aspect of our lives, yet to have no real monetary risk in basically anything happening on them. And it's been really crazy. And this last week there was a really intense, and I thought very
poignant, you know, Senate hearing where they hauled in Linda Yaccarino, the new head of X, they hauled in Mark Zuckerberg, they had all the players, the CEO of Snapchat, the CEO of TikTok, in front of the Senate, and basically, you know, grilled them on the fact that social media has had really negative effects on young people and on, you know, a lot of things in our society.
And I don't know how you felt about that one, Ben, but really, some of those exchanges were political grandstanding; they were absurd. Others felt like they were getting to the heart of something that is sort of, like, the core of our modern moment. And that's how I felt. I was like, these are the things that actually matter to our lives, as opposed to some other things that people discuss in politics.
I found it fascinating. What did you make of these child safety hearings? They were really wild.
[00:20:49] Ben Kornell: Yeah, I think there's the hearings themselves, and then there's the reaction after. And so just for a history lesson: many of the issues that were brought up in this hearing are not new and have been known for a long time.
And the evidence of Instagram knowing that they were providing hurtful content to teens, and that the outcome of it was an increase in teen suicide, an increase in teen eating disorders, et cetera, et cetera, that's been well documented, probably going on eight, nine, almost ten years. And you know, Frances Haugen, I think that's how you say her last name, she was the famous whistleblower three or four years ago who really kind of brought the most attention to it, because there were leaked emails that really showed
they knew what was happening and they made active decisions not to make those platforms safer, and specifically Mark Zuckerberg himself. What was really new today, and you know, this last week, was the theater: the fact that Mark Zuckerberg turns around and apologizes directly to the families themselves.
What was also new is that the tech industry seemed to break ranks, with the X CEO and the Snap CEO both saying that they supported the KOSA legislation as is, whereas the other industry leaders said, you know, no, there's provisions that we like and there's provisions that we don't like.
That kind of break signals a little bit of the jump ball that we have right now in the online community. The last wave was really dominated by Meta and Instagram, but definitely had meaningful players like Snap and some other social media platforms. That space feels a little bit jumbled now, and it doesn't help that, you know, Zuck has gone super far down the rabbit hole with VR and the metaverse, when that doesn't seem to be happening as quickly or as fast as he would've predicted.
So you've got this drama, you've got a split in the industry, and then you've got the public reaction. And I would just say, you know, the apathy is palpable in the US around these kinds of issues. It's like, yeah, sure, someone somewhere, this happened to them, but, like, don't get in the way of my Facebook feed or my Instagram.
But this felt like a little bit of a turning point, I agree, where people... it's kind of parallel to the opioid drama that we've seen playing out, maybe parallel to the cigarette industry, where you've got these internal memos where people know things are doing harm and they do it anyway. And the degree to which the federal government is dysfunctional...
I think it's exponentially more dysfunctional now than it was back in the tobacco wars. But this is actually something both sides could really agree on. There are actually conservative arguments for, you know, regulation and crackdowns and settlements, and then there are liberal arguments. And then on top of that, you have the lawsuits from the attorneys general working their way through the courts.
So I think this is really different. That said, I did also see, well, kind of the tech apologists go into full speed. I posted this on LinkedIn, but one of the, you know, New York Times podcast columnists was like, yeah, yeah, you know, the networks, Facebook and the other platforms, should do something about this, but
you can't really blame them, they have so many users, how can they keep track of everyone? And by the way, schools haven't done anything about bullying over the last three decades, and they've known it's a problem. And I'm like, whoa, whoa, hold up. Number one, like, do not blame schools. This is not a school-generated problem.
Not even close. Schools are dealing with a problem that the platforms have created. And by the way, schools are doing a pretty admirable job of weaving in social-emotional learning and digital citizenship and cyberbullying...
[00:25:03] Alex Sarlin: Tons of cyberbullying.
[00:25:05] Ben Kornell: Anti-cyberbullying, rather. So, like, one, that's just
factually incorrect. And then two, there's this idea of how can you hold the platform accountable, there's so much going on. I think people would be shocked to know, algorithmically, how much they do know. I mean, every single user, every single use case, the ability to drill into the individual and then extrapolate to the macro is the competitive advantage that all these platforms have.
So there's a profound misunderstanding, I think, of the tech companies and what they could do and what they're not doing now. And, you know, a blame game of pointing at societal levers, which, you know, points at education, and I think that's misguided. And there are some simple solutions here, like this idea of universal age verification.
Europe is doing it. Facebook happens to be in Europe, by the way; Instagram is in Europe. They can do it. And you know, there's both Google and Apple, who have app stores and do, you know, authentication. They could be the players to do, you know, age verification and identification. There's also this new company that Lindsay Hogan, who's an EdTech veteran, has launched, called Bando, where they basically do third-party age verification.
And what's great about theirs is it's privacy protected, so you never actually share any of the child's data with the platform itself. It's kind of a third-party cryptography use case. That's just the headline. Stepping back, I've kind of gone through the points, but the big picture is: there's a history here, but this moment is different, and the solutions are actually incredibly feasible. And that's where I think we are.
And by the way, on the presidential election front, this could be a winning topic for Biden; this could be a winning topic for Trump. It's a jump ball: who's gonna take the topic? I think that's a really interesting question,
[00:27:06] Alex Sarlin: and it's a subject that I think has gotten enough momentum that people would really care.
Great analysis. The only thing I'd add is that these companies, I mean, they had Discord up there too, you know, these companies have already been very far ahead in their use of AI to identify content, to identify problematic video on YouTube. I mean, they have been way ahead of everyone else on AI for quite a long time, and now we're in this moment where AI is flourishing.
There are clearly solutions that come from that as well. They could be doing so much more to monitor. It's really about policy. I mean, we know that Facebook has hired 40,000-plus people to monitor what's going on on Instagram and Facebook. We know that TikTok does something similar. But it's not just about, you know, these armies of people who then burn out and go crazy because they've seen such horrible things.
I mean, this is exactly, it's one of the best use cases of AI you can imagine. It has eyes on everything that happens on the entire social network. It knows exactly how to identify all sorts of problematic things. It can take them down instantly. I mean, they gave some examples in that hearing where they'd be like, oh, this girl was bullied by her classmates.
Or they posted pictures, deepfake pictures of her, and they complained to Twitter, and they complained, and they complained, and they complained, and Twitter wouldn't take it down. And finally, you know, they had to sue them, and then Twitter would take it down, and it was too late. And you're like,
they know how to do this better. They do. I know they do. It's literally that they don't want to be the ones who draw these lines, because the lines are so morally dubious, they're so complicated. I don't know. I agree with you. I love these metaphors of the tobacco wars and, you know, these moments in American history where we sort of collectively realized that we've all been duped by an industry, and then regulation starts to follow.
We've gone a lot into big tech. Let's do a little round-the-world with some of the ed tech headlines this week. So one that caught my eye, I don't think it's a huge deal, but upGrad in India just announced that they're opening a hundred physical experience and learning centers. And this is a trend we've sort of followed, especially Indian ed tech companies trying to move back into brick and mortar as a way to expand their brand and sort of protect what they're doing.
And I don't know exactly what to make of it. I thought it was interesting that they're still committing to that, 'cause I have not seen a lot of success in that strategy so far. But, you know, upGrad has done really well. And then of course, in Indian EdTech, the big, big story is that Byju's continues to be a complete flaming dumpster fire.
There are investors trying to remove Byju as the head of the company. They've written down the valuation by, you know, somewhere between 90 and 99%. I mean, can you just give us the headline on Byju's? We've been following this since the podcast started, and it just continues to drop into the ground.
It's just amazing.
[00:30:06] Ben Kornell: Yeah, we need a whole other podcast just to cover this one. It's like a Netflix drama unfolding before our eyes. But I think the reality is that, you know, free money, a rapidly growing company during the pandemic, and limited governance controls, which had to do with founder control but also a
lack of financial transparency in India, led to a culmination of factors where it's hard to really say what the actual value of Byju's is. But what they're doing is attempting an incredible down round to wash out the existing investors. And the existing investors have this choice: I don't really know anything that's going on at Byju's, 'cause I can't get a realistic picture of their current financials, their debt, where everything stands across their myriad M&A portfolio.
But if I don't put money into the black box, I'm gonna lose it all. So that's why there's the lawsuit; that's why there's the divorce happening, the breakup. And it's really sad to see, because, one, this will impact EdTech in general and make it harder for generalist investors to make a case for EdTech. And then, two, it will make it really hard for EdTech entrepreneurs, or entrepreneurs in general, in India, because this is a salacious story around the challenges of the Indian financial system and the transparency and legal system there.
So more to come about this, but the TLDR is, you know, we did not need a dumpster fire at our largest privately held EdTech company on top of the body blows we're getting in public markets, on top of the challenges we're seeing in the venture fundraising community. It's this triple whammy, and the message for founders is: if you've got cash, make it last.
[00:32:06] Alex Sarlin: Yeah, I mean, I feel like Byju is sort of rapidly heading towards the, you know, Adam Neumann, Elizabeth Holmes paradigm of the tech founder who has just sort of slipped out of reality. And I don't know where this is gonna go, but yeah, we will cover this more. There's some really positive news in the K-12 space this week, though.
[00:32:25] Ben Kornell: Yeah, you know, I'll jump into the K-12 space, where there is good news: 160,000 K-12 jobs were added to the education workforce. And you know, that still means that our staffing levels are not back at pre-pandemic levels. But given that enrollment is down, this report by The 74 really shows that we've recovered from some of the staffing crisis.
I still think if you talk to the average school leader, this idea of how am I gonna fill all my vacant spots, you know, that's a fear, and it's coming up again because this is the time of year when we're thinking about staffing for next year. But generally good news there. And then on the deal side, Zum raised $140 million.
It's a school bus company, but what they do is turn the school buses into a battery. They have solar panels on top, they have a ginormous battery inside the bus, and they're plugged into the grid during the times when the school bus is not in use. So it's a climate tech investment, it's an ed tech investment, and the company has been around for a while.
It started in 2015. They have a really, really large fleet working in San Francisco, LA, Oakland, Seattle, Chicago, Boston. So this is a legit big business. Yeah. And they've got this physical infrastructure that they're leveraging for climate impact.
[00:33:52] Alex Sarlin: Yeah, and I mean, it raised my eyebrows a little bit this week just to see a $140 million round in ed tech.
It's sort of EdTech slash climate tech, but that's a big round. It's wistful, it's wistful...
[00:34:03] Ben Kornell: Wistful of years past, for sure. Right,
[00:34:05] Alex Sarlin: Exactly. It feels like a throwback kind of round. And, you know, the founder of this company is really interesting. She was the founder of TechX at Stanford, which is a sort of accelerator.
She was an eBay group product manager. She's a, you know, Redwood City, real Silicon Valley doer. And it was pretty cool to see a company doing something that's sort of, you know, about electric buses, about batteries, and also about AI; it uses AI to optimize the fleets of school buses. And people are still, you know, putting that kind of round into EdTech. A couple of other things came through this week. Instructure finally completed its acquisition of Parchment. Those are two big-name companies. This is an $835 million acquisition. Instructure is really leaning into these acquisitions as a way to solidify its standing as the leading, you know, largest LMS in higher ed and in K-12, as far as I know.
And Parchment is a credentialing and credential management platform, started by one of the ex-Blackboard leaders, and I was happy to see this come through. This is one of the largest acquisitions in the last year, just larger than the $800 million DreamBox acquisition by Discovery Education, and it's just neat to see it go all the way through and see these two giants come together.
I'm excited to see where that goes. There's also a cool article. I shared this on LinkedIn, but Greg Toppo, who's a friend of the pod, a longtime education reporter at The 74 and others, put out a really exciting article in Fast Company this week. I found this exciting; this is something I've followed for a long time.
It's about Roblox and Kahoot and some of the gamified education platforms, and how NWEA and some of the formal assessment companies, and PISA and things like that, are starting to look at game-based assessment, also known as embedded assessment or stealth assessment, as a really meaningful way to see what the future of assessment might look like.
It's much more engaging for kids. It can be collaborative. Some of these assessments are actually, you know, run by two students at once, but they can still tell from the data who's doing what. It's a really interesting idea and I think it points to something that I really hope happens in the future of K 12.
What did you make of either those stories?
[00:36:08] Ben Kornell: The M&A theme that we kind of predicted last year, I think, might still be coming this year. I think there's a lot of activity. It's hard to know which deals are gonna close and which aren't. And I think the threshold for doing an M&A deal has gotten higher, because people are looking for positive cash-on-cash M&A.
And so the seller is much less likely to want to sell if they're already sitting on a positive cash flow business; they'd rather wait it out longer until multiples are better and so on. So that one, for me, stands out: some of the M&A is still coming. I think the other thing that hit my inbox was the European EdTech funding report from Brighteye Ventures.
And you know, we cover this on the pod all the time: Europe is kind of chugging along and becoming a bigger and bigger share of the EdTech market and ecosystem. And part of that is that the fragmentation of years past is fading; there have been a number of companies that have figured out how to launch in multiple countries, multiple languages, cultures, and so on.
And part of it is that the maturity of the venture system has, yes, aligned. And so, what was most interesting to me, this is from Brighteye Ventures: the number of deals is definitely down and the amount of money is definitely down. But if you were to draw a line from 2014 forward and just take the trend through 2019, 2020...
It's steady growth, and it really was this bubble in the 2021-22 years that has kind of come back down again. But there's still $1.2 billion invested in EdTech startups in Europe. And then you look at the number of deals that are under a million: the seed and Series A ecosystem is super strong. And we published that article on micro companies.
It does seem like there's a lot of interest in getting companies up and off the ground that then immediately get to cash-flow positive with smaller teams. But you know, steady growth, and you can actually see Europe as a leader in that space.
[00:38:17] Alex Sarlin: Agreed. One other finding from the report that I found interesting was that France actually leapt over Germany in the last year in terms of the total amount of investment, but it still has many fewer investments.
So there were some larger rounds in France, making it the second-largest ed tech country in Europe. But the UK still remains very much on top; it had almost five times as many investments as France, and more money. And I think, you know, part of what you're seeing with this chugging along is some of the very big names. We've talked about Brainly, we've talked about GoStudent.
It's had some trouble in various places, but you're also seeing a steady set of British, and to a lesser extent German and French, edtechs just continuing to raise seed or A rounds. And yeah, that chart you mentioned, it's really exciting actually to see that, you know, if you consider that 2021, this outlier, was two and a half billion dollars in one year, and you sort of remove that and 2022, it's still very steady growth.
You're going up from $950 million in 2020 to $1.2 billion in 2023. That's healthy and normal growth. So that story feels a lot more exciting than when you look at the HolonIQ numbers, you know, over the whole world, where 2023 dropped through the floor globally, especially in the US. So yeah, Europe, you gotta watch them.
They have this engine going. One more story that caught our eye. We haven't mentioned higher ed yet at all. There's been all sorts of stuff with the FAFSA and financial aid issues; the rollout there has been sort of disastrous. But one thing that was exciting in the EdTech community, and we've covered on the podcast, is how Dr. Shaquille O'Neal has moved into EdTech investing after a long track record of very impressive investing. He was an early investor in Google and Apple and Lyft and all sorts of companies, but he's starting to do more impact investing. He had invested in, you know, Edsoma, which is an AI reading app, with CEO Kyle Wallgren.
He made his second ed tech investment this week, into a company called Campus, founded by Tade Oyerinde. I may be pronouncing that incorrectly, but it's basically a low-cost community college that uses a lot of online work and is showing higher graduation rates than traditional community colleges by, you know, a pretty good amount.
So that was really exciting to see. And you know, obviously Shaquille O'Neal is a big-name basketball player and investor, but I was excited to see Campus as a company get some attention too, because that community college has been around for a little while, and they've grown and started to really break through, and I think this is gonna bring them some much-needed attention.
What did you make of that one?
[00:40:54] Ben Kornell: Yeah, same. I mean, I think anytime you've got celebrity investors coming into this space, it just draws more eyeballs to what's going on and what's working in EdTech. It can also bring scrutiny, and so, you know, this investment seems like a really solid one, because it's right at the intersection of how we reimagine higher education to deliver a positive ROI.
So yes, you know, Shaq, like, come slam dunk the ed tech sector. We're excited to have you, and many, many other people who can bring great visibility to this stuff. You know, I think this is a good time for us to transition to our interviews. So we've got some great guests coming right up. Alright everyone, I am excited for our interview today.
We have a longtime friend of the pod, Erin Mote, and then we also have Zac Chase from the US Department of Education joining us to talk about the National Ed Tech Plan. We are super excited to have you both here. Hi Ben. Super excited to be here. Great. Well, first let's start with you, Zac. Can you just tell our listeners a little bit about your journey and how you ended up at the Department of Education in the first place?
[00:42:05] Zac Chase: Sure. Uh, immediately preceding my work as a fellow at the department right now, I worked in a school district in Colorado as the six-through-12 ELA curriculum coordinator, as well as the library and media services coordinator. Before that, I was with the department in the Obama administration, where I led the development of the 2016 NETP. And I started my career as a classroom teacher.
[00:42:29] Ben Kornell: Couldn't leave it alone. Had to come back for more. I love it. And Erin, for those who have been hiding under a rock in EdTech, can you just tell us a little bit about who you are and what you do?
[00:42:41] Erin Mote: Well, hi everybody. I'm Erin Mote. I'm the executive director of Innovate EDU. We're an organization that works at the intersection of data, policy, and technology, all towards equitable student outcomes.
I'm also a recovering school founder, the founder of a school in Brooklyn, New York, Brooklyn Lab Charter School. Shout out to the Labradors out there. And I get to work with really cool folks, like this consortium of SETDA, Learning Forward, Project Tomorrow, and Whiteboard Advisors, alongside Innovate EDU, to help support the US Department of Education on the National Educational Technology Plan.
[00:43:21] Ben Kornell: Wonderful. So let's dive in. What is the National Educational Technology Plan, and can you talk to us a little bit about the three key divides the plan addresses?
[00:43:32] Zac Chase: Sure. The National Ed Tech Plan is the flagship ed tech policy document for the US, and this iteration of the plan takes as its focus three key divides of equity.
The baseline is the digital access divide, and this is the one we've been talking about for decades. This is the traditional one: access to devices and digital content has historically been a divide along socioeconomic lines, with those who are members of historically marginalized groups having less access than their peers. So that is still a key point of this plan. But we add two components to the access divide, and one of those is the component of accessibility.
And so within the access divide in the 2024 plan, we point to accessibility, meaning features that help those with identified, or perhaps unidentified, disabilities gain access to their learning or digital content or any of those pieces they might receive through technology. The other new component of the digital access divide is the idea of digital health, safety, and citizenship.
And the way we think about that is: I knew who my driver's ed teacher was when I went to high school. I knew who was gonna teach me to drive a car, and it wasn't as though the school gave me keys and loaded up a tank full of gas and said, go do it, and then we'll teach you driver's ed later. And we think the same thing needs to happen with digital health, safety, and citizenship: schools, as they're planning for access or revamping or improving their access, need to have a clear-cut plan for how they're gonna help kids learn to be safe, healthy digital citizens in those online spaces.
So that's the digital access divide. In the middle, we have what we're framing as the digital design divide, and this is oftentimes within systems. You can have a teacher who has, oftentimes on their own, sought out training to better understand how to use digital tools to help kids learn. And then you can go right next door and have a teacher who doesn't have those same capacities, who didn't have the time or support they needed to develop them.
And what we're saying is that systems need to really think through professional learning, so that all teachers have the time and support to build their capacities toward using these hundreds of tools available to them in any given moment, and toward designing quality learning experiences for kids. And what we suggest in the plan is that the Universal Design for Learning framework is a key component of that design.
All of this, the access and design, moves toward the third piece, which we first brought forward in the 2016 plan: the digital use divide. And that's the divide between learners who are asked to do really active things with technology, design, analysis, creation, and their peer groups who are not asked to do very active things, who are asked to do much more passive things.
We have some surveys out there that say most kids report that quizzes and tests are the thing they're asked to use technology for in their learning. And there's just so much more possibility beyond completing quizzes and tests. So those are the three big divides.
[00:46:32] Ben Kornell: Yeah. So Erin, you were part of the consortium of EdTech leaders that helped develop this.
What does this mean? What are the implications of defining these areas of digital divides, almost like a framework for what access looks like in the 21st and even, you know, 22nd century?
[00:46:52] Erin Mote: So first of all, this consortium did a lot of listening, and the US Department of Education did a lot of listening out there.
This consortium talked to over a thousand industry groups, people, and organizations in listening sessions, thinking about what's happening in the field right now: what can we build upon, what do we have to build anew, what can we look to and find inspiration from? I also wanna point out to the listeners something super cool in this year's plan, which is a map with an example of this work in all 50 states plus territories.
And it's really amazing, 'cause you can actually see some of this work already happening out there in the field. And so when we think of building the future, it's not that we have to build anew; it's that we have a lot to build upon. And in these listening sessions, we heard amazing examples of this work happening across the field, whether that was districts embedding accessibility requirements in their procurements to start conversations with industry leaders about what it looks like to have accessibility in ed tech, or classrooms changing the scope and sequence of digital media and literacy by highlighting students being journalists, students being producers of content, not just consumers of content, which is a big part of the NETP.
That's what we call the National Educational Technology Plan, and the NETP spells out these amazing examples. They're also in the appendix. And you know, I love a great appendix, and I think the appendix is the sexiest part of this plan, not just because it has those great examples of the work happening in the field, but also because there are these two-page documents built for audiences, whether you're an educator, a district leader, or you work in a state ed tech agency, which help you digest the plan really quickly and talk about what actions you can begin to take to implement it for the future. And so I think the future is bright, but we're all gonna have to make sure we row together: to talk about the plan, to look at what we're doing in our own districts and our own states, to take pause, and then to look at those recommendations, whether that's giving more space and time for professional development, embedding accessibility into procurement guidelines, elevating the role of the ed tech procurement officer to a cabinet-level position, or thinking about how to use your Title II-A funds to support that professional development for educators.
So that's where we hope the future goes, and we hope folks take time to look at the plan, to consider the context they're operating in, and to find the recommendations that are most useful to them.
[00:49:43] Ben Kornell: I love it. These divides name the kind of fundamental, inherent inequity in our system, while elevating the stories of what's happening and what's possible.
It doesn't feel like a bridge too far; the innovation is actually in our midst, and we need to figure out how to scale and connect it. So with this National Education Technology Plan versus the last one, Zac, what were some of the lessons learned in the rollout last time that you're applying this time?
And I just also wanna be really clear: this is a set of guidelines, these aren't overnight requirements, so there's a process of really educating school and district leaders that's gonna have to happen. Can you talk a little bit about that, Zac?
[00:50:31] Zac Chase: Absolutely. One of the pieces we heard as we were engaging in that listening work Erin was speaking about was a call from the field to make sure this was a solutions-oriented document and that the recommendations included were practical ones.
So we have those three key sections across the access, use, and design divides. And one of the simple things we did from a design standpoint is move our recommendations to the beginning, so that as folks start to read through the plan, they see these are the practical steps that are included here.
And then they understand it a little more fully as they go through. We've also tagged those recommendations, saying this is a specific recommendation for somebody who's leading at the state level or the district level or the school level, so that folks can get a better idea of what it is they're really being recommended to do to bring greater equity to these systems and learning experiences.
So I'd say that was a key component, as well as those examples. And it was really important to us that folks could see themselves in the work, that this is really a national educational technology plan, so we wanted to make sure the nation could pick it up and take a look at it. I shouldn't say pick it up, though.
It's a digital document, so look it up, I suppose.
[00:51:35] Ben Kornell: And Erin, for those who are building in EdTech, the entrepreneurs amongst us, how would you recommend they incorporate the National Ed Tech Plan into their product, but also into how they're communicating with the schools and districts they're partnering with?
[00:51:50] Erin Mote: Yeah. Well, I think one of the things we really strove to do in the National Educational Technology Plan was to create shared language, shared language between folks who are in the system, including the supply side who's building these tools, these solution providers. We actually did listening sessions with solution providers and industry folks to hear from them what they needed.
So we hope that there's some clarity there: shared language, opportunities for connection. And then, I'm just getting back from FETC in Orlando, where I've had the chance to talk to a lot of providers and a lot of educators and a lot of superintendents, and one of the things I'm hearing in their reaction to the NETP is: wow, this can really help me start or catalyze a conversation, whether that's about accessibility or about the type of professional development that needs to be invested in for my product to work in a given school district.
I think that's phenomenal. And the other thing, Ben, this isn't gonna surprise you or your listeners when I say this: there's this desire for evidence about what works, for whom, and under what conditions in educational technology. And so "at least have a logic model" is what I always say to ed tech industry folks when they're thinking about catalyzing those conversations, so that there's a clear link between what you're building, or hope to build, and the student outcomes you desire, and that link can be part of a shared language and shared conversation.
We hope that NETP can help catalyze that.
[00:53:26] Ben Kornell: That totally resonates. And you know, Alex and I, when this first came out, were reflecting on how what now seems obvious was kind of fringe not that long ago. Like, UDL being a core part of the National Ed Tech Plan: just a decade ago, you know, UDL was like a cult of educators who were trying to reimagine what learning could be like.
And I think it's really strong that you've brought in that sense of the design element. And then on top of that, highlighting procurement as a part of the system where this intersects is also something that's really refreshing and resonates. And gosh, I've never heard somebody say we should elevate the procurement role to a cabinet level, but in so many ways that is where the rubber meets the road, in terms of districts and who they're partnering with and how they select those programs for their learners. And so, you know, there's a lot of clarity that the document provides, and also a lot of flexibility, it seems. In terms of where people can go to find out more, what would you recommend? Is it going to the website and reading it?
Are you doing like listening sessions or presentation sessions, or how would people kind of lean in more?
[00:54:42] Zac Chase: We are building our roadshow, and so hopefully we'll be able to go to major convenings across the country. But folks can look at tech.ed.gov/netp, and they can find the PDF version. They can find the web-enabled version as well, and the map that Erin spoke about.
So that's all there. The other piece is that our office is ready to talk to anybody around the country who's ready to start this conversation. We also know that it's hard for folks who are advocating for a stronger, more thoughtful use of technology in education to have that conversation with their colleagues. So we are very happy to facilitate a conversation at the state and district level and say, here are the big ideas, and let those folks sit back and be participants so they can help their colleagues better understand what we're working toward.
[00:55:38] Ben Kornell: Well, Erin Mote has many superpowers, but convening people around these issues is definitely one of the top ones on the list. And I'm just imagining Erin in an RV traveling across the country doing, like, an NETP roadshow. And then I would also just say, what Alex and I did is we took the National Ed Tech Plan.
We loaded it into ChatGPT and then could ask questions of the National Ed Tech Plan. So, uh, just putting it out there if you want to do, like, a bot; it was super helpful. Like, what about this, what about that? And it would just kind of answer our questions as we needed, without having to search. So, you know, Alex and I are low-level users of OpenAI, and also we're lazy, so that's a secret hack out there.
[00:56:26] Erin Mote: Yeah, and audience, don't forget those appendices; you can see those in two pages. So maybe start from the back and move to the front. Sexy, sexy appendices.
[00:56:35] Ben Kornell: Yes,
[00:56:36] Erin Mote: sexy appendices.
[00:56:37] Ben Kornell: I like it. Well, thank you so much, Zac Chase and Erin Mote. It's been an honor to have you, and thank you for your service to our collective work with this National Ed Tech Plan. I'm sure we're gonna be hearing a lot more.
[00:56:51] Zac Chase: It's been a pleasure. Thank you very much.
[00:56:53] Erin Mote: Thanks Ben.
[00:56:54] Ben Kornell: Alright everyone, we have Sohan joining us today from Flint. It is an AI in education application. He is gonna tell us all about it. Sohan, welcome to the show. Thanks, Ben. Thanks for having me. Before we dive in too much, can you just tell us a little bit about your journey and how you came to create Flint in the first place?
[00:57:12] Sohan Choudhury:
Yeah, for sure. So my background is, I actually went to Georgia Tech and studied computer science there for around three years, then dropped out to start a video chat company during the pandemic. And that went really well for a couple years, but kind of petered out; as you can imagine, video chat's not as popular as it used to be back in 2020. Last year, I was kind of looking at the AI and education landscape. I knew that my co-founder and I wanted to start something new, and we really felt that there was an absence of folks building great tools specifically using AI for educators. There's a lot of cool study apps out there and a lot of student-facing stuff, but we felt that there was something missing at the school level.
And as we started talking to a lot of teachers and school administrators, we really felt that while they were experts when it came to education, they were kind of at a loss in terms of controlling the development of AI software. So we basically became the software arm for these independent schools embracing this new tech.
[00:58:00] Alex Sarlin: We chatted with you at the AI event last fall, and one of the points you made that I thought was really interesting, and I'd love to hear you expound on it here, is how important it is to co-design; you know, to not be the tech company walking into a school telling them what they should do or could do or can't do, but really working side by side with the educator community.
And I'd love to hear how you do that, and why that philosophy has really been key for you.
[00:58:27] Sohan Choudhury: Yeah, for sure. I can actually start with the why and then I'll move on to the how. Uh, in terms of the why, we actually had no other choice. So I'm 24. The average age of anyone on our team, we're a team of five, is like, I think like 23, 24.
We're a really young team. I, myself and my co-founder, neither of us have been educators before, so it was actually completely impossible for us to go in and say like, we know what we're doing when it comes to working with you as a school, right? Like, we really didn't know what we were doing. And we were always very transparent about that.
We're very upfront about that with every school we work with, but we really saw that as an opportunity, right? Because for us, it wasn't even possible for us to go in and have this opinion of how things should be done. We were completely dependent on every conversation with school leaders and every school that we ended up partnering with to go to co-design with them and figure out, you know, what does AI in schools actually look like?
So I would say initially it was definitely mainly a weakness and something we were a little bit nervous about, definitely not as confident about. But over time we have definitely seen how it's become a strength. You know, even today, the Flint product has dozens and dozens of features. We have around 40 schools and, you know, tens of thousands of teachers and students using it.
But I am hard-pressed to think of a feature that our team has thought of. Uh, the vast majority of the functionality has been thought of by teachers, and at most we're condensing that information and playing it back out to other, you know, members that are using the tool and kind of seeing what resonates.
So for us it was not a choice, but I think it actually turned out to be very lucky in that sense. But in terms of how we do it, it's just lots of conversations. We've visited over a dozen schools in person, and every week we probably talk to 30 to 40 teachers, at the very least just hearing about how they're using the tool.
Can you tell us a little bit more about what the tool does and how it works? Yeah, for sure. So we're, you know, very optimistically trying to build an all-in-one AI platform that will let students get one-on-one tutoring. So right now, what the tool does is it basically lets a teacher go in, provide a specific learning objective and any sort of teaching materials they have on hand, and the end result isn't a worksheet or a rubric or a prompt for students to fill out.
Instead, it's actually a one-on-one conversation. So our AI system will go and then speak to every single student, either in a written conversation or a verbal conversation. It can actually go in and speak with the student in French or Spanish, or even co-code with them in a computer science assignment, to basically help them get tutoring and assess their understanding of any topic.
So we're basically creating a more managed way for teachers to use AI in the classroom. And for students, it's very, very focused on whatever learning objectives the teachers put in, and it's all about that one-on-one tutoring experience.
[01:00:50] Alex Sarlin: Really interesting. And, you know, I imagine that people see you coming in with your team that's all 23-year-olds and say, oh boy.
But also, if anybody knows how to use this AI stuff, this newfangled stuff that's just dropped, maybe it's these folks. I know that one thing that has happened a lot over the last year and a bit is that schools have sort of gone back and forth in their perception of AI as either a really strong learning tool or something that could be an integrity issue or a risk.
I'm curious what kind of conversations you've had and how you've seen that sort of, you know, pendulum swing back and forth over the last year.
[01:01:27] Sohan Choudhury: Yeah, no, it's a great question. I would say like I do a lot of kind of first calls with schools that are somewhat interested in what we're offering, but don't really know if it's what they're looking for.
And I would say, you know, 10% of the time we'll talk to someone that's like, okay, great, this is exactly what I'm looking for. The other 90% of time it can be scattered. It can be a lot of concerns around use of ai. Even if the person I'm talking to, the school administrator's excited, they're dealing with a spectrum of teacher opinions, right?
Where there's a lot of concern around cheating. Our approach to that is basically just, you know, getting on the same page with anyone we're working with, which is: look, students are using AI. There's no way that you as a school can control that, right? I think at this point, that's clear to everyone that works at the schools.
You can't ban ChatGPT, you can't ban it at home; there's just an infinite number of tools. So the approach we take is kind of accepting that reality with whoever we're working with and then moving on to what the next best step is. And I think even the teachers that are most skeptical about students using AI for the wrong things still see how they would prefer to have the technology be on their side and have a little bit more control over it, as opposed to just reacting to how it's rolled out.
So I'd say even those very skeptical teachers, they'll cautiously maybe join our platform. They'll maybe use it for very limited use cases, but then once they start to see the time-saving benefits, once they see that they can actually use AI to prevent cheating in the first place as opposed to just trying to catch cheating after it happens, I think that can help change their opinion.
So yeah, I think a lot of it's just like empathizing with the folks, being like, yeah, look, we're not the solution to cheating, but here's an alternative path that is probably more exciting than just trying to tell your students to stop using AI.
[01:02:58] Ben Kornell: So one thing you brought up is really the intersection between the student and the educator.
How do you think about the kinds of users and buyers in your marketplace? Is the student the user? Is the teacher the user? Is the administrator the user? Who's the buyer? Who's the influencer? I think that's something that everyone struggles with in EdTech in general. But then when you have AI, the tools can do so many things.
How do you narrow in on your target audience or target user?
[01:03:30] Sohan Choudhury: It's a great question. So in terms of like, just tactically as a company, our buyer is typically a school leader. So that can be anyone from the Dean of Academics to the head of school in some cases, especially for smaller schools to like an instructional technology team.
So some larger schools we're working with, I would say those that are K-12 with over a thousand students, will have a whole team dedicated to evaluating and rolling out ed tech tools. So they speak our language, they know exactly what we're offering, and, you know, they also have the expertise to vet what we're doing to see if it'll be useful for their school.
So I'd say anyone that works on technology decisions at schools is who we typically work with from the sales perspective. In terms of our users, this is a little bit broad, like you mentioned. So a lot of our users are teachers, a lot are students, and then there's even administrators using our tool to learn themselves and even to automate some sort of admin tasks.
So the way we measure our usage is actually through the number of tutoring sessions happening on the platform. We define a tutoring session as whenever a user goes in and asks, in a session, at least three to five questions to an AI system. And that can be a student doing an assignment that a teacher's created, or a teacher going in using it for lesson planning.
And from a school's perspective, we just tell them, look, this month you had like 500 tutoring sessions powered by AI happen on Flint. So that's kind of how we communicate the value and measure whether people are actually using the platform. It's just, how often are people going in and getting some value or learning out of this?
I mean, obviously, if that number is high, it probably indicates some value there.
[01:04:52] Alex Sarlin: Yeah, it's interesting to hear that idea that sort of the entire school community potentially are users and, you know, everybody could be using it for their own personalized learning, basically. Tell us a little bit about how you sort of explain the tutoring.
You know, you mentioned that it can code alongside a student, it can speak different languages. There's obviously all these elements of sort of, you know, tailored, bespoke, personalized type education built on top of what teachers are already, you know, looking to teach. How do you explain that? How do you sort of like have that conversation?
Because I think that's not the most intuitive idea, you know, for those who haven't spent a lot of time with AI.
[01:05:29] Sohan Choudhury: Yeah, it's a great question and it's definitely like a very, very different style of teaching, right? Like when we first talk to a teacher that doesn't know anything about Flint, I have like a few slides, I share them and the first slide is literally like, okay, here's a worksheet that you might be familiar with, right?
Basic, short answer, multiple choice questions. You use this for giving students practice, you know, to help deepen their understanding, but also to see how they're doing. Then the first thing I say is like, okay, what I'm about to show you is unlike any assignment you've done before, you're gonna be creating an AI bot that's gonna then go talk to your students.
So it is a very, very novel concept. There's some things we do to make that experience still very easy to use. So the analogy I love to use is that it's like working with a teaching assistant. If you're working with a teaching assistant, right, you do have to give them a little bit of control, but you're gonna give them a lot of guidance and give them materials to follow.
You can give them your worksheets and just say, hey, make this better. Go make this a bit more interactive and more fun for my students. So that's the kind of paradigm we try to mimic in our tool. So again, it is very new, but by likening it to a teaching assistant, I think teachers feel a lot more comfortable with it.
It does, again, feel like an assistant, feel like someone that's there having conversations with your student, not something that's there to automate any of the work that you're actually doing. So again, for us as a company, there's a lot of use cases for AI substituting work from teachers, but we're not interested in that at all.
Honestly. We're really focused on, like, what is this new teaching assistant we can bring to life for teachers?
[01:06:43] Ben Kornell: Incredible. Well, Sohan, so great to talk to you today on EdTech Insiders. If people wanna find out more about Flint, where can they
[01:06:51] Sohan Choudhury: go? Yeah, for sure. So you can search Flint AI on Google, but our website is flintk12.com, and Flint is just spelled F-L-I-N-T.
So it should be an easy name. But obviously other things are called Flint too. So yeah, really appreciate you having us, Alex and Ben.
[01:07:06] Ben Kornell: Great to have you, Sohan, the CEO and founder of Flint. Sounds like you're on fire. So great to have you here, and thanks so much for being a long-time member of our community and supporter.
We're excited to keep following where the journey leads. Yeah, love EdTech Insider and thanks for having me. Appreciate it.
[01:07:26] Alex Sarlin: Alright, bye-bye. Thanks for listening to this episode of EdTech Insiders. If you like the podcast, remember to rate it and share it with others in the EdTech community. For those who want even more EdTech Insider, subscribe to the free EdTech Insiders Newsletter on Substack.