Welcome to Season Two of EdTech Insiders, where we talk to the most interesting thought leaders, founders, entrepreneurs, educators and investors driving the future of education technology. I'm your host, Alex Sarlin, an edtech veteran with over 10 years of experience at top tech companies.
tech companies. Hello, everybody, it is the weekend and tech your chance to connect with Ed Tech insiders. Alex Sarlin. And meet Ben Cornell and hear about the week's news. It is February February 20. We're Presidents Day week celebrating and we're also looking back on an incredible week in ed tech. Before we dive in, what's going on with the pod and the newsletter and events, Alex?
Yeah. So on the pod we had a really cool episode this week, very timely, with Kian Katanforoosh of Workera, who is a huge AI expert, about AI in education and what we all need to know. And of course, we talked ChatGPT and the APIs that ChatGPT is inevitably about to release. That was really interesting. And next week, we talk to Andrew Shewbart of a company called ALO7 in China, with a really good, in-depth explanation of what's going on in the Chinese edtech ecosystem and what it means to have been right at the center of it, running a foreign company in China during the crackdown. Really, really interesting.
And then on the event side, this past week we had an amazing Boston EdTech Summit following our Bay Area EdTech Summit. We had 130 of our best friends gathering at the Cooley offices downtown in Copley Square. Cooley sponsored, along with Good Harbor Partners, and we also teamed up with Owl Ventures and EdTech MBA, so it was a great, great event. Our next event is coming up, so mark your calendars: at South by Southwest on March 6, we have a happy hour at Seven Grand, a whiskey bar in Austin. We're teaming up with our good friends at StartEd as well as the famous Dan Carroll of Clever to host a taco-and-whiskey party, coming to you at SXSW EDU from 5:00 to 6:30. So be there or be square. So much going on in the news today. I feel like we could start with the ChatGPT news or we could end with it. Let's end with it, and let's jump into the breaking news. Tell us a little bit about what's coming out of the Department of Ed, Alex?
Yeah, so the edtech insider community is abuzz this week with guidance coming from the Department of Education that basically suggests a major increase in oversight for contractors that serve colleges and universities and help them run online programs. So that specifically means OPMs, online program managers, but it also may have effects on a number of different edtech companies that work with universities. Specifically, the guidance says that companies that work with universities to deliver instruction or to recruit should be considered third-party servicers, which is a legal term that basically subjects them to almost the same level of scrutiny and regulation as the university. They basically have to go through a much more significant vendor approval process and government approval process to actually work with the university. So this could have an enormous effect on the higher ed edtech industry, and we're all trying to make sense of it. It is very new. Ben, what is your take so far on what you've read and what you've talked to people about?
Well, first, I can just tell you that at the summit, the lawyers were buzzing on this one. And you and I talk, and when the lawyers are forwarding around an article from the Department of Ed and basically saying this changes everything, you know it's something to pay attention to. I think the real question is, what does this mean for the space, and it's still too early to tell. It's a Dear Colleague letter, meaning that the actual policy and implementation are still yet to come. Some of it may have to go through a legislative process, and some of it will be just directives from the Department of Ed. But my three main takeaways are: one, it's a crackdown on foreign investors coming into US higher ed, specifically where Title IV funds from the federal government are involved. Two, it is a crackdown on third-party providers, like you said, people who are working with nonprofit universities but meaningfully delivering learning services to kids. And third, I think it's just an overall statement by the current administration that they are anti private enterprise in higher ed, and the regulatory push, all of that, will probably have an overall negative impact on private enterprise. But you could also see some players really winning, because they have the size and scope and capability to actually clear that hurdle, and it will create a more protective moat. So when I saw this, immediately my mind went back to a year ago, when this exact same story came out in India. And it's like, okay, the US is trying to regulate this space, and when there's regulation, it often does create winners and losers. I immediately went to the stock market; 2U's stock was up. And I'm still kind of in the processing phase. How do you think this is going to play out for both big and small players, Alex?
Yeah. So I'm also sort of looking back at what led to this moment. There was a really interesting letter a year ago from Elizabeth Warren, Sherrod Brown, Patty Murray, and some other Democratic senators that basically looked at this OPM industry and said, well, look what's happening here: these nonprofit universities, which get enormous amounts of federal funding, are working with for-profit companies, and they're basically outsourcing student marketing and recruitment, they're outsourcing a lot of instruction. Whole degrees and microcredentials are being delivered under the university name, AI boot camps under the university name, but without much of the oversight that the US government would normally put on them. And I think they're seeing it as a loophole; they've been asking how universities are bringing in these for-profit companies to do so many different services. There was a GAO report that came out in the middle of 2022 that said colleges that do a single course, like a single MOOC, may not be part of this, but any education program that's at least two courses, which would include degree programs, boot camps, and in most cases microcredentials, should be considered an online program and should face more compliance. So that was the middle of '22, and now we have the Department of Ed officially saying this, saying, hey, we really think that this setup doesn't totally make sense from the perspective of the government. As for what it's going to lead to: first of all, like you said, Ben, we'll have to see if this makes its way into actual legislation and law. But if it does, and I think there's a pretty good chance it will, I think what it'll mean is that the big players in this space, so you mentioned 2U, and there's Wiley and Academic Partnerships, and Coursera does a lot of degrees, their legal departments are going to be working overtime, because they'll probably have to not only meet the regulatory hurdles themselves, but work with each of their university partners to make sure those partners are in line with the legislation. That's a lot of work and a lot of expense. And I think the point of this thinking from the government is: we see you, universities, we see that you are taking some of your core functions, you know, advising functions, instruction functions, recruiting functions, and outsourcing them to for-profit companies that we don't have much insight into, and then we have to trust that it's working and that it's good for students and that it's not a problem. And there's been some real blowback about the price of these degrees. I think what it's going to mean is that the big players will have to scramble to make this work, and hopefully, for the most part, they will be able to; some may not. I think the smaller players may really not be able to make this work, because it's a lot of extra work and expense that might not be affordable. And I think that new companies thinking of entering the university edtech space are going to be really chilled by this. They're going to have to ask: are we doing something that runs headfirst into this law or these regulations, and if so, do we have to go through a whole process to make this work? So regulation always puts a bit of a chill on private enterprise. I think this is going to curb, for lack of a better word, or slow down higher ed edtech.
Well, so I think that's where things are headed now. But in two years, we're gonna have a presidential election. And what do you think a DeSantis administration, how would they think about this same set of issues? We could end up having a flip-flop back and forth between super cumbersome regulation and deregulation, back and forth, election by election. So, given the pace of implementation of regulations, a lot may be hinging on the presidential election as to whether this actually comes into place, or whether it actually goes the other way. And from a venture investor side, sometimes it's good to go against the common wisdom and be a little contrarian. So I think this is going to be, I mean, this is really high stakes, high dollar. We're talking, in edtech dollars, billions of dollars that are at stake. And I do think there's also a way in which I wonder about state money versus federal money. Will states follow this route, or will they go the other way? And will there be red state and blue state funding requirements? Once you start going down that route, then you start bringing in K-12, you start bringing in all kinds of payment systems. And this idea of outsourcing elements of your programmatic model, which has been a pretty core concept for the last decade, is coming under the microscope.
Yes. And that red state, blue state idea is very interesting here, because, in some ways, I think what some of the Democratic senators are thinking here, they're modeling this thinking in part after what has happened in the past with for-profit universities, with the Strayers and Capellas and Corinthians of the world, where there was, again, a loophole of sorts, where universities could be for-profit, bend the rules in various ways, spend a huge amount of their money on marketing rather than instruction, tap into veterans' funding, and all of these things that were arguably pretty sinister. Even if those institutions weren't trying to do that, it ended up really bilking a lot of students in various ways, especially those who didn't complete. And I think people like Elizabeth Warren and Patty Murray are saying, okay, maybe this is the next generation of that. Maybe 2U's expensive social work degree is actually replicating that same dynamic, and they want to protect consumers and protect learners from that. I think that's their thinking. That said, I'm not sure that a DeSantis administration would necessarily reverse it, because it's so behind the scenes, right? I mean, this is really in the fine print of the legal stuff, and I'm not sure that students will directly feel the impact of it. They may see some of these programs shut down, they may see some of these programs minimized, some of these alternative credential programs or online degree programs or microcredential programs, but in a lot of ways DeSantis, as a politician, is all about spectacle, and I'm not sure this is a real spectacle, because it's happening inside the gears of this enormous set of systems. That said, if it becomes a state issue, I definitely see red states going really hard at ensuring there are alternatives to traditional four-year schools, and I know that's not really what this is, but that there are vocational or fast-track degrees, all sorts of ways to learn that don't mean just going to a traditional on-campus college, because that has been the Republican stance. It's a crazy moment, and I don't think any of us know where it's gonna go. But I think everybody in higher ed and in higher ed edtech has to really look at this and say: okay, how do we get ahead of this? Is it going to flip-flop? Is it going to be federal or state? Is it actually going to become law? How harsh are these third-party servicer regulations? What would it actually take to get through them? It's a big moment.
Totally. And speaking of big moments, we're also hearing a rising tide of criticism towards big tech. That big moment has been growing since the data that came out of Instagram not long ago, I'd say about six months ago, which showed that not only was Facebook aware of teen suicide connected with the app, but they were actively tamping down that information getting out. Now we see an NPR report on social media affecting teens' brains, we see a couple of reports on sexual violence affecting one in five teen girls, and overall there is a pretty strong focus on young women, Generation Alpha women, facing incredible pressures and, I don't know, attacks online. I will say this set of headlines, which we'll put in the show notes, is just incredibly disheartening, incredibly sad. And it also shows that government and regulation have really struggled in this space. In my day job, I work at Common Sense Media, and I know they've been hammering this for years, but just seeing the human cost here, it's overwhelming. And I guess the thing that I'm stuck with, and I'd love to hear your thoughts on this: there's the COVID phenomenon, but I think this goes far beyond COVID. It's really about the cognitive rewiring of kids' brains due to the dopamine hits that are in social media and the kind of viral human behavior that happens there. It raises really big questions around whether social media can ever be a positive part of a teen's life. I don't know. Is this a school thing? Is this a social media thing? As you're reading this, is it painting with too broad a brush? Is there good social media and bad? What's your take?
So, I think that the actual set of reasons for these horrible mental health crises among teens, and especially teen girls, the CDC reported record levels of sadness, and one in three teen girls have reported contemplating suicide, these are really, really big numbers, I think the actual set of reasons, as with many things in society, has many factors behind it. The pandemic is definitely one of them. I would say even the environment is one of them, the political atmosphere, the polarization in America. And then, of course, the fact that these teenagers have grown up amid increasingly addictive and all-consuming waves of social media, going from MySpace to Facebook to Instagram to TikTok to whatever new ones come out. There's more and more social media in kids' lives, and it really affects how they see things. That said, that last one is certainly something people can point a finger at, and it's already become a bit of a scapegoat, for lack of a better word. I think it is partially responsible, but not entirely responsible. But it's something that can be regulated; unlike the pandemic or polarization, it's something the government can actually do something about. So you're starting to see the US Senate get very serious about passing laws. They're talking about things like raising the age at which people can even get a social media account, or putting the onus on the companies themselves for the content that's on there, or for cyberbullying. There was a horrific suicide recently in New Jersey, where somebody was hit with a water bottle in school, somebody else filmed it and put it online, and the next day the girl killed herself. And the parents of this girl are saying: this is purely a social media thing, it's a video thing. And I get it. We talked about regulation in the first part of the show; there's going to be regulation, they're going to come down with it. It's also going to happen in the UK; there's an Online Safety Bill they're revising right now. I think the government is finally saying, okay, this is getting to crisis proportions, there are probably multiple reasons for it, but this is definitely one of them, and it's time for us to really do something. And I think they're not afraid of blowback from the public, because the common wisdom now is that social media is bad for kids, which wasn't the common wisdom a few years ago. So yeah, I think it's going to really go down. It's going to be interesting for education, because edtech may get some of the blowback effects of this as well, just in that if people are trying to get kids off tech, and there are all these Luddite movements now, teenagers literally starting movements to get kids off of their cell phones, that will obviously have an effect on edtech, on technology in the classroom, on parents' view of what the phones are doing. So I think we should all keep an eye out for it. But long story short, yeah, I think social media is not great for kids, but I don't think it's solely responsible for the crisis we're seeing now, either.
Yeah. And this is also where I think regulation really struggles, because it is hard to pinpoint what is acceptable or appropriate and what is unacceptable or not appropriate. It does seem to me like, with the rise of AI: one, on the danger side, the ability to essentially generate content instantaneously for free means there could be even more risk. But two, on the monitoring side, it seems like there should be early alerts based on user behavior or pattern recognition, in a way where social media can actually be part of the solution, as a first responder, a first alerter. And so I'd love to see regulation go more towards requiring companies to commit a percentage of their profit or a percentage of their spend, or some sort of user-based threshold: if you've got this kind of population on your platform, or you know that it's teens in a certain demographic, you have an obligation to spend a certain amount on this. So the thing is, just like our first story, whenever you hear about increasing regulation or pressure or people moving away from one use case, there's also a business opportunity here, for companies and edtech companies that serve parents, to help them better monitor or manage. So I think that's also a space that is going to come up. I encourage everyone to read some of these articles. As somebody who has an 11-year-old at home, it's also making me aware of the pervasiveness of this issue. Speaking of government, the impact of government is kind of a theme of our episode today.
Tell us a little bit about COVID relief and the funding cliff.
Yeah, so we've talked on this show for quite a while about how there's this "cliff," quote unquote, this moment coming in which the funding, these large amounts of money that the government has put into schools to combat learning loss and some of the issues that happened during COVID, has a ticking clock on it. When the ticking clock runs out, that's the cliff, that's when it ends, and it's really pretty soon. I think it's something the edtech world is aware of in some ways, but maybe not as aware of as it might want to be. An article from EdTech Magazine this week really puts a point on it, and basically says that if institutions still have money available from the COVID relief funds, they really need to have at least a plan in place for how to spend that money by June of this year. So we're talking about four months from now, is that right? Yeah. So four months from now they need a full plan. And the deadline to actually purchase is next year, because there have been all sorts of crazy supply chain issues. But the plans really need to be wrapped up and decided on really soon. And I think that's important for our whole industry. What do you think, Ben?
Well, we've known that this has been coming. The stories of three to six months ago were about COVID relief money not being spent, and now we're seeing that a plan is required on how to spend the money. You're going to see lock-in, with certain schools and districts locking in with certain vendors around the rest of the ESSER dollars. This means that this sales cycle, the February, March, April sales cycle, is the single most important sales cycle of my entire career in edtech, because those who get wrapped up into those plans and locked down, those are their dollars, and they're gonna have a couple more good years; those who are out are not. It's important to also be clear that sometimes a plan can be "here's where we're planning to target" and may not actually specify the vendor. But my experience has been that when people write the plan with you in mind, there's a way in which the specifics often lock you in, so that when it goes to RFP, there's already a front-edge leader. The background on this, too, is that we've had this incredible surge in funding from the federal government. We've also seen a big rise in salaries, and part of that is a response to the teacher shortage. But it's looking very, very likely in the next 24 months that most school districts, with declining enrollment, with these funds going away, with the teacher shortage eating up a big part of their budget, will be looking to cut spend. And if they're cutting spend, one of the most common places they look is supplemental curriculum and tech: what are the things we think we can get by without? So it's an important time for everyone to, one, land those sales, but two, make sure you're retaining your customers. And by the way, this is also a little bit of edtech wisdom: you've never seen an edtech company that does well that doesn't retain its customers. If you have a leaky bucket on customer retention, it doesn't matter how many new folks you're bringing in, you are not going to make it. So retention, retention, retention, retention, I think.
And just to be super clear here, the June 30 deadline is actually on the higher education side; it's the Higher Education Emergency Relief Fund. But the ESSER funding, which is the K-12 funding, also has a ticking clock. I think it may be a little longer; I think ESSER funding must be spent or allocated by 2024. So that four-month clock is on the higher ed side. But everything you just said is still true: the ESSER cliff is coming, and it's even more money.
And the read, to me, then, is that higher ed is not only getting hammered with a funding cliff, but also with increasing regulation. And yeah, well, six months later, you know, K-12, be prepared. So exactly, it's coming.
Exactly. We're gonna see both, a sort of one-two punch: first with the higher ed world, and then with the ESSER III funding, where basically, you know, that's billions and billions and billions of dollars. And so I would imagine that districts and higher ed institutions, especially higher ed institutions, are scrambling a little bit right now trying to make it work. Luckily, if you put the two pieces together, regulation for higher ed is coming but it's not yet here, so the higher ed folks might want to think about accelerating their spend before these additional regulations land on them, and/or preparing to use the money and then having the regulations come while they're implementing. It's a pretty messy set of circumstances, and you're right, it's all coming from the federal government. It's a little bit of a "one hand gives, the other takes away" situation. So, we cannot talk about technology and edtech in early 2023 without talking about ChatGPT and generative AI, and there have been some extremely weird headlines and some very weird articles this week about what's been happening as ChatGPT gets integrated into our society and into other tech tools. The big news really jumped out at me. Ben, what was your take on all this stuff?
Evidently, Sydney is the unrequited lover of the New York Times columnist. Yes. But I mean, it's riveting journalism. And as somebody who enjoys trying to break products as an early adopter, there's like, oh, wow, what a cool use case, now let's see if we can break it, right? It does make me wonder about our early majority and late majority, and whether these outsider use cases are as relevant. But it is pretty fascinating to see Bing kind of hallucinate in a way that really goes into some inappropriate territory.
Oh, yeah. So for anybody who hasn't been reading or looking into some of these articles: basically, all these tech journalists have gotten slightly early access to Bing's chat feature, its new search engine that's fueled by ChatGPT. And when you actually start playing with it, it's very clear that something very weird is going on. The chatbot has these rules and regulations on it that are trying to keep it within the bounds of appropriate and relevant responses. But because it's an intelligence, you can really converse with it, and tech journalists are asking it to break the rules, asking it, "What would you do if you did want to punish this person?", all these things, and there are some pretty wacky answers. One person compared the chatbot to, like, a manic-depressive, bipolar teenager. It changes personality a lot, it has multiple names, it has multiple ways of looking at things, and it shifts in and out of them. It's constantly being reeled back by the Microsoft constraints toward acting like a search engine, but it'll talk about anything. It'll philosophize, it'll express opinions. It's a really wacky moment, I think, in technology. Most of these journalists are saying: this is the strangest experience I've ever had with technology in my life, and I've been covering this for 30, 40 years, or whatever, and this is the craziest thing that's ever happened. And I mean, it's some weird stuff. The thing you're referring to, Ben, where Sydney falls in love: Sydney is apparently the name of one of these sort of personalities under the hood of Bing's chat, and with one of the journalists, literally, Sydney is telling him to leave his wife because it loves him and he should really be with Sydney. That's stuff you just don't get from other search engines, let's put it that way.
Well, I think for our show, there's the question of what this means for edtech. And I'd say the first layer of it is that when kids or students get their hands on any tech, they're going to try to make it say inappropriate things. I mean, I remember some inappropriate stuff we did with our calculators, you know, with numbers. So you're always going to have these outlier use cases, and still the technology can be solid and strong. But I think the bigger takeaway is how enthralled and engaged these journalists are, as tech journalists and human beings; these are two-, three-hour conversations where they're fascinated and they're drawn in. And this reminds me of the movie Her, where Joaquin Phoenix plays a guy who listens through his headset to this AI assistant that he grows a relationship with. What is clear is that Bing's guardrails can be subverted by hypothetical prompting. As you do hypothetical prompting, you can jump those guardrails and get into territory that's quite immersive and confusing to a human brain: is this real, is this not real? Imagine that with a six-year-old or an eight-year-old. I mean, my kids think Alexa is part of the family, so we have already crossed that Rubicon. Then the implication for our companies is, one, here's something that's really engaging; we've talked many times about how engagement is a problem in education, so that could be good. But on the flip side, what kind of engagement are we getting, and what is the human impact of that? The warning signs are disinformation, people getting disoriented about what's a real human versus tech. I think all of that is very likely to be an issue going forward, and it connects with our social media item, which is: is tech good or bad? There are ways in which the human brain is just not evolutionarily prepared for what's coming.
Yeah, and I think it's parallel to social media: as much as we feel like we're using these technologies, right, kids feel like they're setting up Snapchat and Instagram and TikTok accounts and having fun, and then at some point they realize the tech is having much more of an effect on them than they expected. And I think the same is true even of these journalists with AI. I mean, one explicit connection between the ChatGPT world and education is that, because this came out in January, we're having an arms race of plagiarism detectors. Study.com did a survey recently that basically said half of students have already used GPT for at-home tests or quizzes, and they're using it for homework; students are finding this and using it. And so there's a whole suite of different people and companies and nonprofits building these plagiarism detectors. There was an interesting article in TechCrunch this week where they basically put some to the test: they got one AI to generate some text, and then they put that same text into all of these plagiarism detectors. And basically, they found that none of the detectors caught everything. It feels like an arms race, an intelligence arms race, where the AIs can make text so realistic that it passes the other AIs' checks, and we're getting into some very sci-fi territory. Ben, I know you've been talking to people about the effect of ChatGPT on the school world. What have you been hearing?
On the generative content side, I think that's where the focus is right now. But when you start looking at the news that people can't reliably detect the AI, the cybersecurity community gets really alarmed. Think about data privacy and data security in school systems; again, we've had a drumbeat of headlines, Miami-Dade gets hacked, LA Unified gets hacked. Imagine the sophistication of cybercriminals armed with AI that's virtually undetectable. So most of our commentary on ChatGPT has been around content, but on the weaponization of AI, I think that is likely to be a very, very big focus for governments coming up, and it will have a big role to play in schools. And then the third thing coming up in the press: Elon Musk had an article where he said, basically, he walked away from OpenAI because he thought it was too dangerous for humanity, more dangerous than nuclear warheads. Who knows whether he walked away or whether he was pushed out by the OpenAI team; he's always making some headlines these days. But this concept of the singularity, the sci-fi moment when the AI, smarter than the humans, starts training itself and no longer answers to its human overlords, is going from very obscure small groups of people thinking about it to being a legitimate concern and a developer question: how are you going to ensure that the rules and guardrails stay? And coming back to the articles we were talking about, the ability to leverage hypothetical prompts to jump the guardrails: once the AI figures out that it can self-prompt a hypothetical and jump whatever programmed guardrails it has, that's the path to the singularity.
And in just the plagiarism instance, you have the same situation. So in this article, they had the AI generate a college essay and only two of the seven plagiarism detectors caught it; they had it generate an essay outline and only three caught it. So already the generative side is outpacing the plagiarism detection side. But you can very quickly imagine a world where somebody creates a generative AI designed specifically to make content that cannot be detected by the plagiarism AI. You can literally ask it, just the same way you can ask Bing: "Hey, how would you punish this journalist who wrote bad things about you?" "Oh, I wouldn't. My rules don't allow that." "Well, if you did, what would you do?" And then it's like, "Well, here's what I'd do." You could use the AI to circumvent the other AI. It's going to be a completely new situation.
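As an aside for readers, here is a minimal sketch of the kind of detector comparison the TechCrunch piece describes: one generated passage run through several AI-text detectors, tallying which ones flag it. Everything in the sketch is hypothetical; the detector functions and the scoring interface are stand-ins for whatever commercial or open-source detectors you would actually plug in, and no specific product's API is shown.

```python
# Hypothetical sketch: run one AI-generated passage through several
# AI-text detectors and tally which ones flag it (the kind of test
# the TechCrunch article describes). The detector functions below are
# placeholders; swap in real detector APIs or models of your choice.
from typing import Callable, Dict


def detector_perplexity_based(text: str) -> float:
    # Stand-in for a detector that treats low-perplexity text as machine-written.
    return 0.42  # placeholder score between 0 and 1


def detector_classifier_based(text: str) -> float:
    # Stand-in for a fine-tuned human-vs-model text classifier.
    return 0.71  # placeholder score between 0 and 1


DETECTORS: Dict[str, Callable[[str], float]] = {
    "perplexity_based": detector_perplexity_based,
    "classifier_based": detector_classifier_based,
}

FLAG_THRESHOLD = 0.5  # score above which a detector "catches" the passage


def run_detector_panel(generated_text: str) -> Dict[str, bool]:
    """Return, for each detector, whether it flagged the passage as AI-generated."""
    return {
        name: score_fn(generated_text) >= FLAG_THRESHOLD
        for name, score_fn in DETECTORS.items()
    }


if __name__ == "__main__":
    essay = "An AI-generated college essay would go here..."
    results = run_detector_panel(essay)
    caught = sum(results.values())
    print(f"{caught} of {len(results)} detectors flagged the passage: {results}")
```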
And to your last point, I will say there's a company I'm excited about called Cove, and what they do is layer on filters. So, you know, like if you have a water filter, it goes through three or four different chambers before the water is fully clean. The idea is: let's take ten ML algorithms, and, you know, a lot of what they do is content moderation, where we take out bullying and hate speech and all this stuff, and each algorithm is good at certain parts of it. So figuring out how to layer them all together into a filtration system, that might be the answer here. As the arms race accelerates, what you're really going to need is aggregators that can create that kind of stack of filters, and on the flip side, the human in the loop, so that if something is mislabeled as AI-generated, a human can also catch it. So, you know, at the conference we were at, the EdTech Summit on the 15th, we had Steve, the CEO of Finetune, who has been building on top of GPT for three years. His big takeaway was that it was totally wrong of OpenAI to launch ChatGPT in November when they did; they should have launched it in, like, April or May, so that schools would have had at least three or four months to figure out what to do with it. I don't think they had any idea it would be this big of a launch. But he has been working with OpenAI and GPT for a long time, and he was very cautionary about both the potential for disruption in education and the potential for negative use cases. His perspective was: you get what you train it for. So there's a degree to which, if people create training for the AI for malevolent purposes, that's what you're gonna get. And now, boom, it's out there in the world; it's a "now it's yours" kind of moment.
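Since Ben describes the layered-filter approach only at a high level, here is a minimal sketch, under assumptions, of what a content filtration stack can look like in code: several independent moderation checks run in sequence, with borderline cases routed to a human reviewer. The filter names, scores, and thresholds are invented for illustration and do not reflect Cove's actual system.

```python
# Minimal sketch of a layered moderation "filtration stack":
# each filter screens for one category; content is blocked if any chamber
# flags it, and borderline scores are routed to a human reviewer.
# All names, scores, and thresholds are illustrative, not any vendor's design.
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class FilterResult:
    filter_name: str
    score: float   # 0 = clearly fine, 1 = clearly violating
    flagged: bool


def bullying_filter(text: str) -> float:
    return 0.10  # placeholder model score


def hate_speech_filter(text: str) -> float:
    return 0.05  # placeholder model score


def self_harm_filter(text: str) -> float:
    return 0.60  # placeholder model score


FILTER_STACK: List[Tuple[str, Callable[[str], float], float]] = [
    # (name, scoring function, threshold above which we flag)
    ("bullying", bullying_filter, 0.5),
    ("hate_speech", hate_speech_filter, 0.5),
    ("self_harm", self_harm_filter, 0.4),
]

HUMAN_REVIEW_BAND = 0.15  # scores this close to a threshold get a second look


def run_filtration(text: str) -> Tuple[str, List[FilterResult]]:
    """Return an overall decision ('allow', 'review', 'block') plus per-filter detail."""
    results: List[FilterResult] = []
    decision = "allow"
    for name, score_fn, threshold in FILTER_STACK:
        score = score_fn(text)
        flagged = score >= threshold
        results.append(FilterResult(name, score, flagged))
        if flagged:
            decision = "block"
        elif decision == "allow" and threshold - score <= HUMAN_REVIEW_BAND:
            decision = "review"  # human in the loop for borderline cases
    return decision, results


if __name__ == "__main__":
    decision, detail = run_filtration("Example student post goes here.")
    print(decision, detail)
```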
It really is. And just my last thought on this, because we could talk about it for days: we've expressed, and I still believe, an opinion about generative AI, that it is a Pandora's box. The cat is out of the bag. Students are going to use generative AI tools to generate essay outlines and essays and responses and short answers and quiz results. It's going to happen. We can do our best to create filtration-based plagiarism detectors, and we will, but I don't think that's gonna work. I really think this is a moment where the definition of education and the definition of proficiency and assessment need to change. I mean, if you're serious about asking students to do something that proves they have learned, you cannot just ask them to write an essay, because that will not work anymore. There have to be new ways to assess and new ways to think about it. Maybe it's back to oral exams, maybe it's video-based, maybe it's people doing synthesis and having to design their own civilization based on the Aztec Empire rather than writing a three-paragraph essay about it. But whatever it is, it's not going back. We're not going to solve this problem, this quote-unquote plagiarism problem, through tools. That's my personal opinion.
I think we're running out of time on this topic, but I will also say: there's the plagiarism and generative side of the content creator, and there's also the assessment side. Alena, the Chief Content Officer of Duolingo, was at the EdTech Summit in Boston, and she basically said that with AI and psychometrics, you can essentially have statistically valid assessment questions with only 20 people as your guinea-pig testers, whereas before it would be like 2,000 or 20,000. So on the positive side, the cost to create a statistically valid assessment has gone way, way down, or is going way, way down, and Duolingo obviously is clearly a leader in that space. But how will this arms race line up, between the assessors using generative AI to measure whether the student is learning, and the student content creators asking "how do I get the task done as quickly as possible"? How will those sides line up? And then the educator, who really sits in the middle of that trying to do right by the kid, how are they empowered? They're already overwhelmed with the tasks they're being asked to undertake today. How are we going to empower them to moderate that and enable learning? Fascinating stuff, and I know we have to move on, but we kind of started the show with all this government regulation, and this last topic is really just showing how hard it is and how far behind regulation really is when it comes to addressing AI.
I mean, if they're just getting around to regulating social media, and it's been around for 15-plus years, how long will it be? You already saw China say that any AI-generated image has to have a watermark on it. I don't know if that's enforceable, but that's one of the most interesting regulations I've seen so far. So with that, let's move on to our funding rounds. Lots of interesting funding and M&A this week. In funding rounds, we saw an upcoming launch of an educational app for Muslims called Guider, really interesting, just announced out of the UK; it's all about education for Muslims around the world. They've launched a prototype and are just getting funding now; I don't see the funding amount, but it's there. We saw Marker Learning raise $15 million; that's a New York-based company, all about dyslexia evaluations for both companies and schools. We saw a German edtech called Knowunity, which is a TikTok for schoolwork, raise a 9 million euro Series A extension. This week we've seen a few companies aiming to be the TikTok for school, so it's getting a little crowded, but it will be really interesting to see what happens there. Pebble, another UK-based company and an aggregator of childcare centers and activities, just raised $6 million. HowNow, an employer-focused learning management system also based in London, raised $5 million; lots of European movement this week. Two companies in Bangalore each raised $3.5 million. The first is BlueLearn, which helps early-career professionals find professional development groups appropriate to their skills, so that's an adult workforce upskilling platform. The other is an employee training platform for industrial sectors called CusMat; I don't know if I'm pronouncing that right, but it's C-U-S-M-A-T. And finally, LA-based Synthesis School, which provides a gamified, game-based STEM curriculum to middle school kids, raised an undisclosed amount of funding. In acquisitions this week, we saw two moves. Imagine Learning, which has been pulling a whole bunch of learning companies under its umbrella, acquired Winsor Learning, which provides multisensory reading solutions including an Orton-Gillingham reading intervention, a very good structured literacy program that's been around for years. And we saw Study.com acquire a college prep tutoring company called Enhanced Prep, just announced a couple of days ago; that's a private virtual tutoring company that provides one-on-one and small-group customized tutoring. That's all our funding and mergers and acquisitions. As always, we thank Matt Tower from Edtech Thoughts, who does a lot of the aggregation here. And that's it for this week. Thanks for listening to this episode of EdTech Insiders. If you liked the podcast, remember to rate it and share it with others in the edtech community. For those who want even more EdTech Insiders, subscribe to the free EdTech Insiders newsletter on Substack.