
Special Episode: AI Founders Forum

Nov 13, 2023 · 52 min

Episode description


In this special episode of Edtech Insiders, Sarah Morin interviews seven founders of AI education companies at the AI Founders Forum held last month in San Francisco.

  1. John Danner, Co-Founder and Co-CEO of Project Read
  2. Catalin Voss, Co-Founder and CTO at Ello
  3. Christian Byza, CEO and Co-Founder of Learn.xyz
  4. Courtney Monk, Co-Founder and COO at Schoolytics
  5. Gautam Thapar, Co-Founder and CEO of Enlighten AI
  6. Shweta Gandhi, Co-Founder and CEO of Strived.io
  7. Brandon Grusd, Co-Founder of Ethiqly AI

Transcript

Alexander Sarlin

Welcome to Season Seven of Edtech Insiders, the show where we cover the education technology industry in depth every week and speak to thought leaders, founders, investors, and operators in the edtech field.

Ben Kornell

I'm Alex Sarlin. And I'm Ben Kornell. And we're both edtech leaders with experience ranging from startups all the way to big tech. We're passionate about connecting you with what's happening in edtech around the globe.

Alexander Sarlin

Thanks for listening. And if you liked the podcast, please subscribe and leave us a review.

Ben Kornell

For our newsletter, events, and resources, go to edtechinsiders.org. Here's the show.

Alexander Sarlin

In this special episode of Edtech Insiders, Sarah Morin interviews seven founders of AI education companies at the AI Founders Forum held last month in San Francisco. We have interviews with John Danner, co-founder and co-CEO of Project Read; Catalin Voss, co-founder and CTO at Ello; Christian Byza, CEO and co-founder of Learn.xyz; Courtney Monk, co-founder and COO of Schoolytics; Gautam Thapar, co-founder and CEO of Enlighten AI; Shweta Gandhi, co-founder and CEO of Strived.io; and Brandon Grusd, co-founder of Ethiqly AI. Enjoy the conversations.

Sarah Morin

If you could introduce yourself and tell us a little bit about your company and the founding story, that'd be great.

John Danner

Sure. I'm John Danner, and Project Read is our company. The company builds decodable stories for teachers for free, and then provides feedback to students as they read those stories orally, so when they make a mistake they get a hint or a correction. The founding story really is that I was at Stanford two years ago, or I guess it was last year, actually. My co-founder was at the Stanford Business School, and we were in a class together as GPT-3, and then ChatGPT, came out. And that got me very excited. I'm a career educator: I was a classroom teacher, a second grade teacher, and I started Rocketship Public Schools, which serves low-income communities, about 10,000 students a year, and then I had done a series of edtech companies. But the reason this point in time is so interesting is that before, when you were doing edtech, you were basically creating a digital tool that people could click on, and it was really a little bit more like electronic babysitting than anything else. Whereas when you've got a conversational AI talking to a student, it's a lot more engaging. And so for tough areas like learning how to read, I think it's kind of perfect, because a teacher cannot get the 30 kids in their class and read with them one on one, which is really what it takes for a kid to improve. So having this infinitely patient, non-judgmental tutor that's talking to the student as they read, we think, is going to be different, better.

Sarah Morin

Yeah, that's incredible, and to your point, so true: the abilities that AI is providing are so different from past edtech products. On that train of thought, related to the work you're doing specifically or the broader edtech industry, what are you most optimistic about when it comes to the future of AI and education?

John Danner

Yeah, I think the amazing thing is that before, if you were in a low-income area... Rocketship's schools were 92% free-and-reduced-meal students, really the most challenged neighborhoods we could be in, and those schools historically have not gotten the support, resources, and so on that they needed. One of the kinds of things that you want to do is tutoring, which is very expensive, so you would see wealthier districts have tutors and do a lot more of that, but the lower-income schools didn't get anything; it was just a free-for-all, and the teacher had to manage everything. With AI, now you have the ability to provide this kind of one-on-one support, and it's not quite free, but it's way closer to free than having, you know, a human tutor in every classroom.

Sarah Morin

Right, yes, so true. And on the flip side of this, obviously this access that AI provides is unlike anything we've seen before, to your point, when it comes to things like tutoring. What are some of the concerns that you have about AI becoming so prevalent, or some of the risks that are notable to you?

John Danner

I mean, I think especially with kids and school districts, our natural reaction is going to be to manage the risks. So even though now I would say a lot of teachers are very excited about AI, I would expect that within a year or two people will be really aware of the risks and a lot more conservative about what they're doing. I think the challenge with AI is that it's really a completely new thing. It's not just a continuation of the internet; it's this new capability. And it's kind of like self-driving cars: we hear more about every self-driving car that has an accident than about every 100,000 humans who have accidents. I think the same thing is going to be true in schools. So if AIs are doing a huge amount of good for reading or math, but some AI is prompted in a way that gives a horrible response, people are going to react really strongly to that, right? It's going to make headlines, and so on. So I think the big challenge is, can the industry prove that AI really has its place and is effective before these headlines appear and people say, oh my gosh, this was a terrible idea, let's shut it down? I think that's the tension.

Sarah Morin

Yeah, definitely. Any closing thoughts that you have on AI and the edtech industry in general, or the work that you're doing right now? Big-picture thoughts that you'd like to share?

John Danner

I would just say, I've never been more excited about technology and education. When I started Rocketship, we created this two-hour period called Learning Lab where kids were on computers and with tutors. It was very successful, but not nearly as successful as it will be in five years, because now the technology has kind of caught up to what our expectations were as educators before. So I think that part is just going to be amazing, and amazing for kids.

Sarah Morin

Thank you so much for sharing and for being here.

John Danner

Thank you, Sarah.

Sarah Morin

So would you introduce yourself and tell us a little bit about the company you started and the founding story for that company?

Catalin Voss

Sure. My name is Catalin Voss. I'm an engineer and, I guess, an entrepreneur, with a background in applied artificial intelligence and the goal of trying to build AI tools that solve practical social challenges. I previously built a learning aid for autistic kids, starting a tech company with my former co-founder, Tom Sayer, mostly while at Stanford, where I got my bachelor's and master's and did a PhD, dropping out a couple of times. Most recently, I joined forces with Tom and Dr. Elizabeth Adams to build Ello, which is an AI reading instructor. Ello was inspired originally by Elizabeth's experience trying to figure out how to teach her daughter to read during the pandemic. We were sort of sitting there in 2020 in Y Combinator and worked on all sorts of things, and then schools shut down. Elizabeth was at home with Lily, who was six years old at the time, sitting next to her like every other parent next to every other daughter at the time, learning to read on Zoom, which turned out to be, of course, a miserable experience. All the parents on our platform started messaging Elizabeth, she's a clinical child psychologist, and so everyone's got her number saved, asking her, hey, what should we do, school shut down? What's the stack of apps that I should download onto my tablet? And we looked, and there just wasn't a good answer. And then we realized, holy moly, this is a national and international crisis: in the US, 69% of fourth graders are unable to read at grade level. Those kids are expected to use reading as a tool for learning as soon as they move on from elementary school, and they're currently unequipped to do so. And Ello is there to change that. So we're building an AI reading teacher in the form of an elephant who sits there and acts as a reading tutor: it basically listens to you read out loud, helps you out when you get stuck, motivates you to keep going, and then tries to inspire a love for reading.

Sarah Morin

Oh, thank you, and what an amazing founding story. I'd love to hear a little bit of your take, as you're starting to work in AI, on what makes you most optimistic about this field of AI and education, specifically the impact there. What do you foresee as some of the benefits that are coming our way soon?

Catalin Voss

Yeah, I mean, I guess fundamentally, this dream of building AI tools to scale education is really one of the original motivations of AI, and one of our advisors, Terry Winograd, kind of said this to me when we got started: you know, this isn't a new idea. This is what people were trying to do in the '60s; edtech has just kind of not been able to do it, and we think for various reasons, but one of the big ones is just that the technology hasn't been there. As a result, people have built tools that tried to take the classroom online, or built sort of labor-arbitrage businesses, but none of them have really been able to tackle, in some sense, one of the core problems of our time in education, which is the teacher shortage, right? Student-teacher ratios are just untenable in this country and many others, and we think AI at scale, now with the ability to understand kids properly and respond in a manner that feels humanly natural, has the ability to scale and democratize one-to-one instruction for everyone. I think that's the biggest opportunity.

Sarah Morin

Yeah, well said, and it's intertwining so many of these issues of our time; I think you're totally right. Again, this has been a focus of education for a long time, but the technology hasn't been there, so it's really exciting to see where we're just being able to open doors. On the flip side of this, what are some of the concerns that come to mind for you, in your work and in what you're seeing in the industry right now, around the future of AI in education? Some of the risks, some of the pieces that maybe you're more concerned or nervous about?

Catalin Voss

Yeah, I mean, I think the number one risk is that it kills us all. I'm on the side of people who think that's a little bit less likely; I think there's still a small chance, but quite a small chance. So what I'm worried about, sort of second tier from that, is of course that AI models amplify the biases of our world and multiply them, right. And I actually think that's both a terrific risk and an incredible opportunity, because unlike, you know, sort of anti-bias training, where it's like, okay, well, we've got to do this at the human level, it gives us a targeted lever of action. If we can make this better, if we can make foundational models better, if we can measure the way bias manifests in the pipeline, then we have an opportunity to do something about it, right? I think it's really important to recognize that this happens at every single step of the way. So if we think about Ello, it's like, gosh, it starts with just the speech perception piece, like audio filtering: okay, what was this model trained on? What's the accent background of the dataset, the composition of that? And then eventually, of course, it goes into language models and amplifying, you know, just kind of the sad and shitty realities of the world. But we can measure that, and we can measure it at every single step of the way, and I think it's on us in edtech to do this measurement and then take the correct precautions. It's like, if you're in 2023 and you're not tracking accuracy disparity as a key metric alongside your other metrics, gosh, what are you doing? Right. But then I also think the tools and levers of control are going to get better, right? We're not going to be limited to supervised fine-tuning and reinforcement learning with human feedback. I think people are starting to build tools that are going to help us achieve this.

Sarah Morin

Yeah, thank you so much for your thoughts. One closing question: of the pieces you've mentioned here, both the optimism you're feeling as well as some of these risks, is there anything particular you've learned from your work at Ello right now and the things you're navigating as a company? Is there anything you'd like to share from your firsthand experience?

Catalin Voss

Gosh, I mean, we're trying to do the right thing, obviously. We are positioning ourselves as the folks trying to use the tools at our disposal to tackle a massive reading crisis and actually make things better. We are incorporated as a public benefit corporation, which means that rather than saying this corporation is being formed to maximize the interests of the shareholders, it says this corporation is being formed to maximize the interests of the shareholders and then essentially copies and pastes our mission statement. Which, when push comes to shove, allows us to point at this and say, yeah, in the charter it says that this is part of our objectives. And so we may make a decision that isn't in the best interest of our shareholders in the short term, but it's teaching the world to read, and by the way, we think that's going to make things better in the long term. And so, you know, these choices we face are tough, right? As a social entrepreneur, you kind of believe that you can have your cake and eat it too. We face them every day, both with the way that we're doing the technical work (you know, we're here at Common Sense's lab space today, and we've spent a bunch of time with them on their AI assessment, thinking more carefully through how to set up the right metrics for this stuff and how to track them all throughout the pipeline) but also as part of business decision making. It starts with how you select your customer, right, like not excluding a low-income audience that has, you know, a net lower contribution to the overall profitability of your unit-economic picture. So there's a very clear business decision you could make here, and one choice is going to send you down one product development path and the other is going to send you down another, but these are tough calls, right? Super tough calls. We feel like it's possible; we're very optimistic, right? We feel like we can have our cake and eat it too. And we feel very lucky that we're one of the few privileged companies that gets this sort of lottery ticket to be able to do this.

Sarah Morin

Yeah, thank you so much for sharing. Cool, thank you. Cheers. All right, it would be great if you could introduce yourself and the company you founded, and tell us a little bit about how that got started.

Christian Byza

Cool, thanks for having me. I'm Christian Byza, co-founder and CEO of Learn.xyz. We like to say that we're actually not an edtech; we are a social learning app, and we just use learning as an incentive to make you open the app every day. You can think of us as a mixture of Duolingo and Wikipedia, and we just happen to use AI to power what we do, because you can just come into the app, ask a question, and it creates a little fun learning curiosity about any topic you really like. And we don't like to say lesson or course or anything like that, because it all sounds like work, and most people actually don't like to learn, even though our name is Learn.xyz, which is a little bit funny. But yeah, that's what we do. We just launched three weeks ago, so we're really early on this journey.

Sarah Morin

Amazing. So yeah, talking a little bit about what you're doing at Learn.xyz, what makes you the most optimistic about the future of AI and education? And that can be in general, or specifically for this project that you just started.

Christian Byza

Yeah, sure. So what makes me really optimistic is that without AI, what we do today wouldn't be possible. The founding story is actually that we started to think about education and thought we wanted to verticalize education, so we picked a specific topic and started to write courses manually, right? With a content writer, we built a really nice product around it, and we figured out a good way to learn in a casual way, so that people can do this in the subway, on the way, similar to what Duolingo does with language learning. The idea was, how can you teach different topics in a similar kind of casual way? And our biggest issue was, how do we scale this content production? We thought, oh, we just hire more content writers somewhere in the world, we get them cheap, but obviously that's not a scalable answer. And then we thought, oh, maybe we can build a tool with AI to create maybe ten courses a week rather than two courses, because that's the trajectory we were on last year. We had literally just two courses a week on random things about web3 and crypto and AI and quantum computing, any kind of future topic that we had. So we started building this tool, and the tool was meant to be just this thing to bring us to ten courses a week. When we had the first version of the tool, and I remember this like it was yesterday, it was in January, we typed in Chinese New Year. And the course that came out was so freaking good that we were like, holy, this is the product. We don't have to have this as a tool for our content writers; we have this as the product. And that's actually the founding story that makes me so excited. Because what we do now is create content and courses with the help of AI, and it costs us less than five cents for one of these curiosities, where before it was costing us close to $1,000 with the content writers, the design team to create all the imagery, the review process; all of that you can automate. And it makes me so excited that with the power of AI, you can really build this kind of personal study buddy for, you name it, right? But that is only possible because these language models take the hard work away. And that's how and why we even exist today.

Sarah Morin

Yeah, that's awesome. Would you speak just a little bit more about that? It's a really interesting use case where, if you type in Chinese New Year, like, that's a really niche topic. I know it's still very early, but I'm curious about some of the thoughts you're having on the types of topics that you think are going to be really commonly used learning experiences, or the settings that you expect to be most common.

Christian Byza

Yeah, so as mentioned, the app we released now, the first public product, launched just three weeks ago, and I just checked this morning: more than 8,000 courses have been created. Wow. And we have about 1,081 categories that we divide our content into, and people are already learning something in almost every one of these categories, right. So this is about psychology, about history, about, you know, random things that are happening; the best example is right this week. Obviously, we all know about the very sad news that we hear from Israel and Gaza, and people are just curious, not so much about the conflict itself, but: what is Gaza? What is the West Bank like? Why is there this conflict in the first place? So basically explaining historic context, things that are out there; that's where we thrive, and also where AI can thrive, because there's so much information about this out there that you can explain the general concepts in the best way. Like this morning, I saw someone create a course on what a kibbutz is, and I said, I've read kibbutz so many times in all these different news articles, but I didn't actually know what a kibbutz is. So I took the course, and now I know what it is. And that makes me a little bit brighter every single time I use our app.

Sarah Morin

That is super interesting, yeah, thanks for expanding on that. On the flip side, what are some of the aspects of AI in education that you're more concerned about or have some hesitancy about, again, whether that's generally in the field or specifically with Learn.xyz?

Christian Byza

Yeah, so I think I can speak to both. Just generally, one thing that I'm concerned about with AI is that creativity goes missing in some form, right? If you think about what makes school or education, it's that it fosters creativity, because you basically give people tools into their hands, and then they learn, and then they get creative. And that's the whole point of school: you learn to learn, right? You don't learn because you know a certain date of a war or how a math thing works; you kind of connect your brain cells somehow. And so if I now just need to ask a quick question and I get the perfect article written or a painting painted, and I don't have to do anything myself anymore, that makes me a little bit afraid for the future of my children, I have two of them, and how they actually keep this curiousness and the ability to write things and draw things, and not just ask the computer to do it for them. So that's something I think about. Specifically for our product, there's always this concern of hallucinations, of course, and if you ask Sam Altman and other folks in this space, of course they have the same problem. I think ultimately this will be solved, and we are also solving it in many ways already in our product, where we use different language models that talk to each other to figure out if what the first model said is actually correct, give us a score, or just forbid certain topics in the product. So as much as so many people think about this as the big issue, it's actually not. Yes, there is very up-to-date information that sometimes those language models just don't know yet, but even then you can tell, you can figure out that this is a current event, and you can just be proactive about it in the product. So while there is a concern, personally, I'm not too concerned about it.

Sarah Morin

Yeah, thank you so much for sharing. And thanks so much for joining us.

Christian Byza

Thank you for having me.

Sarah Morin

If you could introduce yourself and tell us a little bit about your company, as well as the founding story and any details you want to share about that?

Courtney Monk

Sure. My name is Courtney Monk, and I am a co-founder and the COO of a company called Schoolytics. I am also an elected school board member in the great town of Los Gatos, on the Los Gatos Union School District board. Schoolytics is a company that builds data infrastructure for school districts, very sexy. But really, how we think of ourselves is that we help educators have the right information at the right time to make great decisions about instructional choices for kids. That means pulling data from multiple systems and putting it all in one place so that they don't have to do it themselves, because it's super painful and, for most districts, very manual. We focus on three main things. The first is automation: we connect data from different places in an automated way. The second is customization: as much as I would like it to be otherwise from a business perspective, every district is actually, in fact, very different. They all have different strategic plans and goals and different populations of students to serve, so we focus on giving them exactly what they need. And the third thing that we like to think about at Schoolytics is democratization of data, so we think about giving data to more people, including, eventually, students and parents.

Sarah Morin

Awesome, thank you so much. So in conjunction with this introduction, what makes you the most optimistic about the future of AI in education, whether that's broadly in the field or specifically the work you're doing, either with this product or with your own work with schools? What comes to mind around the future of AI and what we can look forward to there?

Courtney Monk

I think I'm most optimistic about fulfilling the promise of personalization for our students. Every student has unique talents, unique needs, strengths, and weaknesses, and if we could meet them where they each are individually and provide them with that kind of learning experience, I think we can all agree that would be super valuable. The challenges to doing that are many. For teachers, I think asking them to differentiate really well in the classroom is a huge ask. It's very difficult, and it's even harder now after COVID. Take a fourth grade teacher: they might have kids who can read at a first grade level and at a sixth grade level all in the same class, and everything in between, and we're expecting them to be able to address all of those needs. There are also lots of English language learners in classrooms, and that's a whole population where, again, we're meant to be differentiating to meet their needs, and it's just genuinely a very, very hard thing to do. But I think that technology and software, and in particular AI, can help us do that more easily, at scale, in less time, and that's, I think, what I'm most optimistic about. When I think about personalization, one thing I've realized, even with my own daughter, who went from elementary school last year to middle school this year, is that even the most rudimentary personalization, where she now has six different teachers, she got to choose an elective, and she's in a different math class that kind of suits her learning needs better, even just that has changed her entire outlook on school. And you think, gosh, that's just the bare minimum that we can offer in terms of personalization. So when I think about what is actually possible with really smart tools in the hands of teachers, and this is kind of the Schoolytics angle, it's not just, hey, I have this cool reading app, it's going to know exactly my reading level, it's going to give me the exact things that I need, and it can analyze all of my issues, something really cool like that. It's that the teacher can then learn and see data from all of their students, start to interpret that data, make different instructional decisions and choices, and find different versions of lessons that they can use, all sort of at their fingertips, as opposed to trying to make that all happen through massive amounts of time and brute force.

Sarah Morin

Yeah, that makes perfect sense, and it's so interesting, too; to your point, even these simple examples of customization are almost mind-blowing at this point, imagining what could be possible. On the flip side of this coin, what makes you more concerned about the future of AI and education? What points are maybe more of a hang-up, or something you're feeling nervous about, as opposed to the shiny aspects we're all excited about?

Courtney Monk

Yeah, well, I think clearly privacy and security are big ones. At Schoolytics, for a given customer, a school district of ours, we probably have the most expansive data set of any vendor, so we think a lot about privacy and security. And we know that there are AI tools out there where, if we chose to plug them into our platform, that data would then cease to be ours; it would go into the training models of these other AI engines. So we've been treading very carefully in this space, in which, you know, I would love to be going faster, but we've been trying to be very careful. I don't know that others are thinking in this way. We really need to protect our kids' data, not only because it is legally required of us, but also because it's the right thing to do. So I think if you're in this space and you're not thinking hard about it, that should probably keep you up at night. The other concern I have, and I don't know if concern is the right word, is that I really hope we find a way that we can be comfortable with students using these tools effectively. We don't withhold calculators from kids just because calculators can do arithmetic for them; we still teach loads of math, and we figured out a way to integrate calculators into our curriculum and into classrooms in a way that works, more or less, for kids and teachers. A generative AI platform is much more complex than a calculator, but I think that, you know, my 13-year-old son could use ChatGPT smartly if we helped him understand how to do that. And withholding tools because it makes us feel better maybe is not the right answer. I don't know, I think I'm still thinking through it. My hope is that we find kind of a happy medium, where we're encouraging students to explore these tools and figure out that once they have jobs like you and I do, these tools might be really handy and help them get their work done. We all use calendars, but we don't penalize ourselves for using calendars just because we could be writing it down on paper. I'm also a data scientist by training, so I actually know how these things work, and I think it's important that we teach everyone, our students included, the basic structure and methodology behind these models, even though it doesn't necessarily make them easier to use or more impactful. In fact, it's a little bit more mind-boggling to think about: all these things really do is predict text, so they just take the string of words they're making and ask, well, what's the highest probability of the next word? So they're not sentient in the way that we think of it, I don't think. But I think kids would really love to geek out on that and understand more of what's under the hood. I hope that we can get there.

Sarah Morin

Yeah, thanks so much for sharing. Any final thoughts you'd like to share about this topic of AI and education, or your work specifically? Just closing thoughts, if any.

Courtney Monk

I'm really optimistic about where both AI and software in general will take education. I know that there was kind of a bonanza during COVID, and that makes headlines about what things are being used and what things aren't being used, and do we need to have a great awakening in terms of shedding tools. But I do think there's so much capacity for impact, and our kids really need us to show up and do this for them. People ask me, well, why do you do this, and my general go-to is that I think more kids should have better outcomes. No one's going to come and fix it for us; we're the adults, we have to do this work, and we have to make good decisions. It's on us, essentially, to make this work for kids and to improve their educational outcomes, and they very much deserve it. So that's kind of what keeps me going.

Sarah Morin

Thank you so much for joining us.

Courtney Monk

Thank you, it was a pleasure.

Sarah Morin

All right, if you could introduce yourself and then tell us a little bit about your company and how you came to be?

Gautam Thapar

Sure. Thanks for having me. My name is Gautam Thapar, and I'm the co-founder and CEO of Enlighten AI. I'm also a former high school history teacher, and I founded Invictus Academy of Richmond, which is a small independent charter school over in the East Bay, in Richmond, California.

Sarah Morin

Awesome, thank you so much. Do you want to tell us a little bit more about the founding story of your company? You know, what led you to where you are in this present moment? Some background on that?

Gautam Thapar

Yeah, absolutely. It really dates back to being a teacher. I faced the problem of grading and giving excellent feedback to students, so I'd stay up late weeknights and weekends. And it really, really moved the needle for my students: we were able to nearly 10x the percentage of students who were passing the AP US History exam. The way this got started was that I had a friend who leads a school in Los Angeles, and we started experimenting with a spreadsheet solution to see, could AI learn his feedback style and some of the context of his expertise, in order to shift his work from typing all this feedback repeatedly over to reading, reviewing, editing, doing the cognitive work? And he was floored. So we spun up a simple web app, and the improvement in his students' performance, just from pre to post getting feedback, was really remarkable. There was no turning back at that point.

Sarah Morin

Awesome, thank you so much for sharing. I'd love to hear a little bit of your thoughts on how the work that you're doing right now interrelates with AI and education, and maybe some of the things you're excited about right now in that overlap.

Gautam Thapar

Absolutely. We are an AI grading and feedback app, and what's unique is that we really try to respect and center the teacher's expertise. AI is central to everything that we do: as a teacher grades and gives feedback according to their own rubric, the app starts to learn some of the nuances of what they're looking for in student work, and it tries to match that as it recommends scores and suggestions. So each time the teacher edits or changes or touches up the feedback, if they use specific language in their class, which is very common, it will learn that, in order to help the teacher save time but also give much richer feedback. And the research shows that feedback is a superpower that has twice the impact of a typical educational intervention. So being able to unlock that for teachers, without having them spend 10 or 20 hours to leverage that best practice, just seems hugely valuable to us.

Sarah Morin

Yeah, that's incredible. Maybe on the flip side, could you comment on some of the difficulties of working with new technology like this, or other parts of AI, either on the broad scale of the education sector or particularly in your work, that are more difficult, that you're trying to approach and mitigate, or that you're concerned about in this sphere?

Gautam Thapar

I think in our work, we are really trying to keep the teacher's involvement and centrality in these processes, right? Having them train up and really personalize the AI to the teacher is something we're fine-tuning every day. More broadly, I'm a little nervous that there could be some over-reliance or over-technologization, if that's a word. There is this concept of a desirable difficulty, and part of our job as educators is to encourage students to take on and do difficult things. A lot of the conversation right now is about how to make things easier and save time and reduce work, and all of that has its place. But there is sort of a pendulum swing, and we don't want it to swing too far in that direction, where we just want to make everything about schooling easy and individual and personalized, because it is a social experience, it is interactive, and it is something where struggle is sometimes a good thing. You know, it's maybe a little early to be worried about that, but over the long run, I think that's something we should be mindful of.

Sarah Morin

Definitely. We've already covered a lot of topics, even in a short time, but any closing thoughts, general thoughts on AI in education right now, or anything to do with the work that you're doing that you'd like to add in?

Gautam Thapar

So I think, with my teacher hat on and my school leader hat on, there's a lot of fear and reticence, because there's a lot of change happening for educators right now. But I think in the long run, it's an incredibly exciting thing to imagine a world where teaching can be sustainable. Really great teaching has historically been really unsustainable; there's been this linear relationship between how many hours you put in and the output you can get for students. It was inextricable, it was hard to escape. And for the first time that I've experienced in the last 15 years, there's this opportunity to ask our teachers to produce better outputs but actually take something off their plate. So hopefully some of this friction that folks are feeling at this moment is something that will pass, and I believe it will, and will give way to some tools and techniques that were previously unimaginable, that can be really, really great, not only for students but also for teachers.

Sarah Morin

Wow, thank you so much, and thanks for joining us.

Gautam Thapar

Thank you.

Sarah Morin

Thank you so much for joining us.

Shweta Gandhi

Thank you for having me.

Sarah Morin

Yeah, could you introduce yourself and just tell us a little bit about the company you founded and that founding story?

Shweta Gandhi

Yeah, sure. My name is Shweta Gandhi. I am the CEO and founder of Strived.io. My educational journey started right out of college: I started a nonprofit that was working in the slums of India, focused on kids going into ninth and tenth grade. I did that for five years, then eventually moved out to India, worked and dabbled in the education and nonprofit space for the next five years while there, and then eventually came back here, had my first daughter, and decided I wanted to really understand education at the grassroots level here in the US, even though I had been born and raised here and went to school here. Really understanding why school works and why it doesn't work was something I was really interested in. I worked at a national charter network, Rocketship Public Schools, for five years, and there I found myself really digging into the data and really trying to understand it. There were a number of tools we were using there within the public school system that were digitized, but they sat sort of in this silo in what they call the Learning Lab. And it just bothered me to think that we couldn't pull that data and then actually make different decisions about what's happening in the STEM and ELA blocks based on that information. So yeah, that got me going down this path: I ended up building an internal platform for Rocketship that they then scaled across all 23 schools, and eventually we started selling it externally. One of the things I learned as I went external, selling it to other schools, was that there are schools that use edtech really well, and most schools that don't. That being said, a lot of people are looking at edtech tools as these time fillers, as babysitters; they don't really think about the data as something they can utilize to really shift instruction. But I do believe that if used with fidelity, which is the big if, there is a lot of potential there, and I think there are a lot of other interesting data indicators that can be pulled in. And I think we're at a point right now, and the work that I'm doing with Strived is to take all that data, make sense of all that data, things like assessment tools, demographic information, and even things like attendance and social-emotional indicators, and overlay that onto a student's profile. The way I've been thinking about it lately is, we all have, like, a medical record; could we have an academic record that paints an entire picture of how a student is growing, what they've mastered, what they've accomplished, with a growth mindset, but also helping them pave a pathway towards what's next and what's the next skill they need to gain?

Sarah Morin

Very cool, thank you so much for sharing and elaborating on that. What makes you the most optimistic about the future of AI and education, whether that's specifically this work you're doing with data, or more broadly, the scale of how this is playing out in the education sector?

Shweta Gandhi

Yeah, I have two kids, five and seven. I see them learning and growing every single day, and it's really interesting; they're two polar-opposite children. They both have very different strengths and very different needs, and they learn very differently. And I think that reality in my little home is what we see a lot of times in schools. Kids come from such different backgrounds; even on the first day of kindergarten, we think everyone's at the same baseline, but everyone's already got, you know, five years of experiences under their belt that have kind of helped shape who they are and how they learn and how they grow. And I think the beauty, and what gets me excited about AI, is that we're finally at a point where we have the tools, at a very easy entry level almost, to make sense of all of the information that we have at our fingertips and allow everyone to really understand every child very intimately. And so when we think about data, and we think about data-driven instruction, that idea is really an intimate understanding of every kid and how they learn and where they sit. Now, my hope is, and this is maybe going to your next question, that what I don't want to drive with my work and data is this siloed view of every kid, like, you know, honing in on this whole idea of personalization and personalized learning. It's been such a buzzword in the education space for so long, and we're hearing things like the Sonic platform shutting down, and they're finding that this whole idea of kids being on a computer for 16 hours a day... I'm not even talking about the screen-time effects, but kids don't want to learn in silos. And so what I don't want, especially for our company, as I think about our vision and our growth, is for it to become a tool for siloing kids even more, but really a tool to say, how do I deeply and intimately understand students, take that burden off of the teachers to understand their students, and then be able to teach at a whole-class level or a small group or in whatever way makes sense for those particular students? I often say, when I talk to a grandmother or, you know, someone, to explain what we do: all the hard work that teachers do to try to understand who their kids are and where they currently sit on their academic, social-emotional, and behavioral spectrum, we automate that for them. And so that's the goal, and that's the hope, that we can continue to do that.

Sarah Morin

Yeah, that's amazing. I know you already started touching on this a little, but outside of this optimism, is there anything in particular, besides the concerns you already mentioned with your specific product and the direction you'd hope it doesn't go in, on a broader scale? With AI becoming so intimately intertwined with education these days, are there other risks, concerns, things that are notable to you?

Shweta Gandhi

Yeah, I think that technology, when used really well, is always an amplifier of human behavior, human intention, and what humans do really well. And what I fear is that we could go years before we realize that the potential of AI is really in the amplification of humans and not in their replacement. I don't think we will get there, although there are people who are obviously concerned about that. But my fear would be a world where there actually is an AI teacher for every student, because I actually think that connection, what teachers bring, if you go into any classroom and you see what a teacher brings to every student just by their voice, their empathy, their ability to connect at a social level and at a human level, that should never even be dreamt of being replaced. And I would be worried about a world where that would even be something being considered.

Sarah Morin

Yeah. And besides what we've already covered, which is a lot of topics really quickly, are there any other closing thoughts about AI and the education sector, or the work you're specifically doing right now yourself? Just final thoughts, final musings?

Shweta Gandhi

No, I think it's been a really interesting year. I think Reach Capital just put out a survey where 300 edtech AI tools have come out this year, with one of the big concerns being that adoption hasn't fully picked up. I think that's going to be interesting to wait and watch. And I think there are a lot of really interesting companies doing interesting things, but we're going to have to be really careful about making sure that we're keeping our users at the core. So I'm excited to see how it all unfolds, who stays standing a year from now; even 12 months from now things are going to look very different, even two months from now things are going to look very different. And I'm just excited to continue to watch it all.

Sarah Morin

Amazing, thank you so much.

Shweta Gandhi

Thank you, Sarah.

Sarah Morin

All right, if you could introduce yourself and tell us a little bit about your company and how you got to where you are right now.

Brandon Grusd

Sure, great, thanks. My name is Brandon Grusd. I'm one of the co-founders of a company called Ethiqly AI. I just want to talk about myself a bit, because I think that sets the stage for what our company is about, what the DNA of our company is, and how we came up with our company. It kind of sounds cliché, but my life's purpose, I think, is just to find how I could make the most impact in people's lives, and like most of these founders, it's been a weaving path through different streams to get to where I got. I started off as a lawyer trying to make an impact through the legal world, but just didn't find that personal connection and personal meaning in how I could help people. Then I switched to medicine; I was a doctor, and that definitely provided a lot of personal satisfaction and also let me feel that I was making a difference in people's lives, so that was super meaningful to me. But at the same time, the impact is somewhat narrow, because you're dealing with people on a one-to-one level, which is great, but I wanted to see how I could scale my impact. And then over COVID, with the Zoom school movement, with my kids in particular, I just noticed that I felt like they were suffering, not just in mental health but in terms of their education. And it struck me that I'm really resourced and privileged in that regard, and they were struggling; I could just imagine how many thousands or hundreds of thousands of kids who aren't privileged and don't have the resources that I had, like, how are they doing? Do they even have computers, can they even attend Zoom school, does their school district have the software? I don't know. So our company is like a family company, founded with my two brothers and I. The impetus really was: how can we make an impact on the lives of kids who come from historically underserved populations, how can we help students with variable learning needs, and how can we use AI to effectuate that change and improve their outcomes? So really, what our company Ethiqly is about, and this is kind of in the DNA of our name, is how we can use AI in an ethical way to really improve outcomes, not just for those students that I mentioned but for all students in general, and particularly those with the most need. Our solution is really to empower students who have a diverse voice, whether it's from different cultural backgrounds, different educational backgrounds, or different educational needs: how can we use AI to empower them to tell their story in their writing? Whether that writing is a personal narrative, writing for history class, writing in English class, or even science classes, how can we empower them so that they can really explain their experience through the lens of the curriculum, using AI tools to really help them express themselves, especially when they may not have had that ability before because of difficulty with language or just other difficulties that they have experienced in their educational journey? So that's kind of our company and how we got to where we're at.

Sarah Morin

Perfect. Yeah, thank you for sharing. When it comes to this work you're doing with your company, or even on the broader scale of edtech right now, what makes you the most optimistic about the future of AI and education? And what's coming up for you in terms of your specific experiences?

Brandon Grusd

Yeah, so stepping back, I think there are a lot of needs for students. Whether it's an AI world or not, there have been historical differences in educational outcomes based on socioeconomic status and based on race, and these are kind of historical problems in our education system. So my hope is that AI can help bridge the gap for those who have been historically underrepresented in positive educational outcomes, by creating tools that can help these students achieve and can help reduce bias in teacher evaluation of these students. So I'm very hopeful AI can provide a positive impact on these students' lives.

Sarah Morin

Yeah, thank you. And maybe on the opposite side of the spectrum, what are some components of AI in education that make you a little bit more concerned or hesitant about the progress we're seeing right now?

Brandon Grusd

So I think it goes to our company's core values: we believe in the ethical use of AI because we recognize that there is a potential for misuse, and I'm just talking about education here, not Terminator-style AI, but really just in terms of education. There's a possibility of misuse of AI, whether it's outright cheating by students, or whether the tools are so advanced that they're not really helping students actually understand the material in a fundamental way but are really just replacing their learning experience. So we're trying to come up with ethical solutions that bridge that gap and really empower the student, because I think fundamentally education is about empowering students, and people in general, with understanding the world, how to express their opinion in a cogent way, and just being more aware of the rich tapestry of different experiences that people have, while at the same time being able to express themselves. So I'm hopeful, but on the flip side it can be misused if it doesn't really achieve those goals.

Sarah Morin

Right, yeah. Are there any final thoughts on this, on ethically using AI or the work that you're specifically doing right now, any final ideas that you'd like to share on this topic?

Brandon Grusd

I think overall, AI, at least in my experience in the education world, is definitely a buzzy term, but there are just a lot of unknowns, really, and that's why there's some difficulty in really talking about it. In terms of our company's ethos, we want to empower not just students but teachers. We don't want AI to take away from teacher autonomy; we really want to have it as a tool that allows teachers to engage with their students one on one by offloading some of the mundane tasks that they otherwise are facing. So trying to create a cohesive solution, a friendly system that students and teachers can use, is kind of what we're about.

Sarah Morin

Thank you so much. Thank you.

Alexander Sarlin

Thanks for listening to this episode of Edtech Insiders. If you liked the podcast, remember to rate it and share it with others in the edtech community. For those who want even more Edtech Insiders, subscribe to the free Edtech Insiders newsletter on Substack.
