Welcome to Season Two of Edtech Insiders, where we talk to the most interesting thought leaders, founders, entrepreneurs, educators, and investors driving the future of education technology. I'm your host, Alex Sarlin, an edtech veteran with over 10 years of experience at top tech companies.

Kian Katanforoosh is the CEO and co-founder of Workera. That's Workera.ai, the skills intelligence platform redefining how enterprises understand, develop, and mobilize talent. Workera empowers organizations to stay ahead by unlocking the skills data needed to drive innovation. Enterprises get an objective pulse of their innovation skills through a configurable skills ontology, using AI-powered measurement of data, AI, software, cloud, and cyber skills at an atomic level.

Kian is also a lecturer at Stanford University, where he teaches deep learning in the computer science department with Professor Andrew Ng. Kian has been acknowledged for his teaching excellence by Stanford with the Walter J. Gores Award, Stanford's highest teaching award, as well as the Centennial Award for Excellence in Teaching. Kian is also a founding member of DeepLearning.AI and co-created the Deep Learning Specialization on Coursera with Andrew Ng, which has taught artificial intelligence skills to millions of learners. From 2014 to 2016, Kian co-founded and co-led a French startup developing in-classroom edtech solutions for universities.

Kian Katanforoosh, welcome to Edtech Insiders.
Thank you, Alex, very excited to be here.
I'm really excited to have you here. It is fantastic to connect with you. You've been working for years in one of the hottest fields in education: machine learning, artificial intelligence, and deep learning. Give us a little bit of your background, how you got into AI, and how it led you to become an edtech founder.
Yeah, of course. I studied a lot of theory, math and physics and computer science, back in France, where I grew up; the French system for engineers emphasizes that a lot. When I arrived at Stanford for my graduate studies, I originally focused on crypto, and I got a chance to work with Professor Dan Boneh, and quickly realized that at Stanford, I was probably ahead in terms of the theory, but I was completely behind in terms of the practical skills of building products. It was just obvious.

So I started by taking a few classes to step up my game in software engineering. But I realized that I was always passionate about education and technology, and the applications of crypto are fairly limited in education and talent, not nonexistent, but limited. So I delved deeper into AI, which also aligns very well with my theoretical background. I met Andrew Ng, who taught me a lot through our journey at DeepLearning.AI: launching it, building the Deep Learning Specialization, growing the company. And then Andrew helped me launch Workera, which is currently my main focus. And so that's how I got into it.
Yeah, it's a really great trajectory, because you really had to go deep into this very specific, pretty complex domain, and then go through it at Stanford: engineering, ML, crypto. And then you come out and you have a lot of credibility and confidence in being able to teach anybody in the world about this incredible new field. So let's jump into what Workera is, for any listeners who might not have come across it. What does Workera do? Who are your learners? Who are your paying customers? And what are you teaching?
Yeah, Workera is a skills intelligence platform that redefines how enterprises, large organizations of all sorts, understand talent, develop their talent, and mobilize their talent effectively. So we help companies and organizations unlock what we call the skills data that they need to drive innovation and data-driven talent strategies, such as upskilling, talent acquisition, talent mobility, coaching, and mentoring for employees. The way to see it is that we really provide an objective pulse of the innovation skills in the company, in a configurable way, and with AI-powered measurements. So every employee essentially gets access to that technology, allowing them to measure their skills on a regular basis to track their progress, and then to personalize their work experience: get personalized learning plans, get access to projects, get access to mentors, all based on skills. Our learners are, you know, employees at companies and organizations. We also have a free version for anyone in the world that wants to measure themselves, to understand where they stand and accelerate their career growth. Because, you know, with so much content online, it's so hard to select what to do next and what you need. And so we help them with that level of mentorship. And then paying customers are large Fortune 500 companies and public entities who sponsor Workera licenses for their employees and get a sense of their capabilities at any point in time.

That's a terrific overview of how it all works. You know, I remember businesses always being very concerned about digital transformation, which was their term for everything changing so quickly because of technology, and not having workers with the skills to keep up. You mentioned that you teach innovation skills, or measure and teach innovation skills. What types of skills do you mean when you say innovation skills?
Yeah, that's a good question. What we mean is, today, it's really any skill that is top of mind for a CEO at a large company: AI, ML, analytics, cybersecurity, cloud skills, software skills, of course, DevOps, all of these. And what we're looking at now is a set of behavioral skills, such as management, leadership, and other human skills, but also green technology skills, sustainability skills, 5G skills, IoT skills. Really the skills that are needed in the next five years to build the future. That's our key focus right now.
That's fascinating. And you know, I think of Workera, because of its amazing origin, as such a tech-first company, but being able to move into management skills and sort of humanistic skills, as well as green technology, that's a really natural but cool evolution of what Workera does. You mentioned a little bit about Andrew Ng and DeepLearning.AI and specializations. Can I dig in a little more? Just because I think Workera has a pretty unusual origin story for an edtech company. Can you give us a little more detail about how Workera sort of came out of this machine learning education ecosystem, and Stanford?
Yeah, of course. So, in 2017, Andrew had this idea to launch a company called DeepLearning.AI, and I was one of his graduate students at the time. And so I joined him in starting that effort. We created a class called the Deep Learning Specialization, really to democratize access to AI education; that was the whole goal of the company. And between him and me, we managed to teach AI to over 3 million people in a very short amount of time. So huge, huge volumes, a real portion of the workforce. As I was creating content and talking to our learners and their organizations, I was asking them: what is missing in your career? What is the limiting factor to building the next stage of your career? Is it content? Almost all the time, people would tell me it is not content; we have so much content out there, it is actually so hard to select which content to study, and it takes so long. And, you know, I had a student in Ethiopia who was telling me: from a content perspective, I'm not different from a Stanford student; we actually have access to pretty much the same content. But I do not have the mentorship. I could be better than a Google engineer, and I would not know it. Nobody's measuring me. Nobody's giving me feedback. Nobody's telling me what to study next. And so we really looked at Workera as the opportunity to bring accurate, precise measurements of skills to people, with the goal of accelerating their careers. And that's where we started: we looked at psychometrics, you know, the science of measurement, which is a very old and traditional field. And we thought that with the new techniques in AI, we could reinvent that space, where we can measure a few skills of someone but be able to predict hundreds of skills based on all the measurements that we're running around the globe. And today on Workera, you can be measured for 20 minutes, or about an hour to an hour and a half, and get insights on hundreds of skills, because of all the cross-correlations between skills that we understand. And so that was really the inception of Workera. And it really spun off from DeepLearning.AI in 2020.
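To make that cross-correlation idea concrete, here is a minimal, purely illustrative sketch, not Workera's actual model: it assumes a matrix of historical assessment scores across many learners, estimates pairwise skill correlations from it, and uses a simple weighted heuristic to estimate a learner's unmeasured skills from the few that were measured. The skill names, data, and method are all assumptions for illustration.

```python
import numpy as np

# Hypothetical historical data: rows = past learners, columns = skills,
# values = normalized assessment scores in [0, 1]. Purely synthetic here.
rng = np.random.default_rng(0)
history = rng.random((500, 6))
skills = ["regression", "backprop", "lr_decay", "sql", "docker", "prompting"]

def predict_unmeasured(measured):
    """Estimate unmeasured skills from a few measured ones, using
    skill-to-skill correlations learned from historical assessments.
    A toy linear heuristic, not a real psychometric model."""
    corr = np.corrcoef(history, rowvar=False)   # skill x skill correlation matrix
    means = history.mean(axis=0)
    estimates = {}
    for j, target in enumerate(skills):
        if target in measured:
            continue
        num = den = 0.0
        for name, score in measured.items():
            i = skills.index(name)
            num += corr[i, j] * (score - means[i])   # correlation-weighted deviation
            den += abs(corr[i, j])
        estimates[target] = float(np.clip(means[j] + num / max(den, 1e-9), 0.0, 1.0))
    return estimates

# Two measured skills in, estimates for the other four out.
print(predict_unmeasured({"regression": 0.9, "sql": 0.4}))
```

A production system would rely on proper psychometric or probabilistic models (item response theory, Bayesian networks, and so on) rather than this toy heuristic, but the intuition is the same: a few reliable measurements plus population-level structure let you score far more skills than you tested directly.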
Fantastic description. I think that really makes it clear. And it's so interesting to hear that your learners were saying that mentorship and guidance were sort of the missing piece, definitely not content, as well as assessment. You mentioned that your Ethiopian student said, I might be as good as or better than a Google engineer, but I would never know. And I think that creating assessments that can accurately measure these very new skills is a very rare and valuable thing to be able to do right now. And you know, from my perspective, Workera is fast becoming the go-to certification for anything related to machine learning, AI, high-level analytics; maybe you're also doing cybersecurity in that way, although there are other cybersecurity certificates. But AI is a very diverse field; you mentioned how many different skills you can be measured on. Even within machine learning and artificial intelligence, there are a lot of different job roles. There are a lot of different sub-skills, all of which you've sort of broken down and are assessing separately. Tell our listeners who have been, you know, following the machine learning world: what are the most in-demand skills within the machine learning and AI world? What's rising right now?
Yeah, great question. I'd first say skills is an ill-defined term. So I'm going to talk about skills, and then I will say what sorts of skills or domains are present on Workera today in high demand. Most platforms today would call machine learning a skill. But you know, if I'm telling one of my students, you're good at machine learning, or you're bad at machine learning, it's not really helpful for them; they're gonna ask, what do you mean? And so at Workera, we do all this work, and we have psychometricians and experts who do the work of breaking down machine learning into skills. Machine learning is hundreds of skills. Oftentimes, even software is hundreds of skills. Machine learning is a domain broken down into sub-domains, broken down into topics, into sub-topics, into individual skills. One skill may be implementing a learning rate decay in machine learning; that's a skill that we can measure. And so today, we have roughly 7,000 skills on Workera that we can measure in our ontology.

The skills on the rise, or the domains on the rise, I would say: of course, machine learning ops; all sorts of domains under data engineering and cloud, a lot of people are trying to acquire these cloud skills; and there is a set of domains under an initiative we call AI + X, which is people who are engineers, researchers, or analysts outside of AI and are trying to acquire an AI toolkit. So they may be mechanical engineers, or electrical engineers, or materials scientists, or astronautics or aerospace engineers, and they have the background to learn that very quickly. But they need a specific set of skills to acquire it. That's on the rise, and it's a huge population in the workforce. And then finally, I would say literacy and fluency in AI and data: being able to speak about AI and data, understanding what is a learning algorithm and what is not a learning algorithm, especially with the new GPT stuff, where it's become broad, everybody has heard of it now. Companies are expecting pretty much everyone to have a basic level of literacy and fluency around those topics.
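As a concrete illustration of the kind of atomic skill mentioned above, implementing a learning rate decay, here is a minimal sketch of one common variant, exponential decay, applied to plain gradient descent on a toy objective. The schedule and constants are illustrative assumptions, not anything Workera prescribes or tests verbatim.

```python
def exponential_lr(step, base_lr=0.1, decay_rate=0.96, decay_steps=100):
    """Exponentially decay the learning rate every `decay_steps` steps."""
    return base_lr * decay_rate ** (step / decay_steps)

# Toy objective: minimize f(w) = (w - 3)^2 with decayed gradient descent.
w = 0.0
for step in range(1000):
    grad = 2 * (w - 3)              # derivative of (w - 3)^2
    w -= exponential_lr(step) * grad

print(round(w, 4))  # approaches 3.0 as the decayed steps converge
```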
Yeah, and I know that, you know, there's an AI For Everyone class that Andrew Ng put together that is also meant to sort of bring people up to speed on that basic AI literacy. That's a really interesting rundown. So, you know, machine learning ops, AI + X, all of these different specific sub-domains and other sort of areas of engineering and science. It's really interesting to hear you unpack how it actually works, and that it's not, you know, just one monolith. I completely agree.

That said, I do want to ask you about machine learning and AI as a whole technology, just for a moment. So, you know, on this show we've been talking about web3, we've been talking about, you know, DAOs, virtual reality and extended reality, and of course, especially recently, artificial intelligence and machine learning and how it's affecting edtech. You are a complete master of this machine learning world, and I'd love to ask you, you're clearly using psychometrics and machine learning in Workera, as well as assessing people on it. How do you expect AI and deep learning to change edtech in the near future, and maybe even in the far future?
In short, I believe AI will have a massive impact in education technology. But I'm especially bullish about the use of AI for four purposes: measurement, personalization, delivery methods, and also the creator economy.

On measurement, I think AI will change measurement, because it will be hard for humans to infer someone's skills better than a model trained on millions of measurements can. You know, if I know you can do two times two equals four, and I've done so many measurements around the world, I can infer that you can do two plus two minus two divided by two; I can assign a probability to your ability to do, you know, let's say, the square root of nine. And so this is all going to come with AI and statistics.

On the topic of personalization, I think user interactions on SaaS platforms can be analyzed at large scale, combined with the skills data, to provide better personalization. So this will pick up, because logistically it is difficult for a human mentor to track and handhold someone every 10 minutes, but a SaaS platform can do it. You can track these touchpoints, you can know the last history of actions the user took, and so the personalization will improve all around.

And on the topic of delivery methods, I think AI, and the point I just made, will influence user interfaces in educational products. Specifically, we will see more UIs that are interactive, that are dynamically generated based on data collected from skills measurement and content engagement, and thus it's really AI influencing the delivery methods indirectly. If you look at even what ChatGPT is, when it makes it into user interfaces, it will improve the delivery methods.

And finally, the creator economy is probably the most obvious one, in the sense that it's so easy to create content today. But I don't think humans will become irrelevant by any means. I think AI today is good at connecting the dots. So there is a body of knowledge that we've developed as humanity, and AI can connect the dots within that body of knowledge, but it allows us to focus on really the boundaries of that body of knowledge and expand it, and create data that AI can use to connect the dots for learners around the world.
Amazing. So to recap: measurement, where AI can use big data to measure more accurately than any human; personalization, where UIs can adapt to somebody's knowledge of any domain; delivery methods, being able to have totally different types of interactions with computers. You didn't mention AI chatbots or tutors, but I think those would probably be part of that third category. And then a huge idea of the creator economy. And we've already begun to see the creator economy take off; people are making and selling art and winning contests using art and writing from artificial intelligence tools, and it's becoming a very big area.

So, we have a lot of edtech founders who listen to this podcast. And, you know, given the four areas you just named, which are fantastic, how should edtech founders be thinking about building the skills to actually manifest any of these amazing ideas? As you said, ML is not just one thing. So what aspects of machine learning should they start Googling, or getting some classes on, or hiring people who know them, if they want to develop that type of technology?
Obviously, it will depend on how technical the founder of the edtech company is. But my advice is that they should be constantly following existing edtech founders and CEOs, looking at, you know, some of the investments that all the big investors in tech are making, looking at what's out there and what's missing, having conversations. In terms of learning, I think the barrier to entry to AI has diminished heavily. It is much easier, and it is going to get even easier in the future, to use these technologies. Today, you know, you can call an API and you get an incredible speech system or text-to-speech system to use in your product. Five years ago, ten years ago, you would most likely have had to build it yourself, and it is a pain. But today, it doesn't make sense to build speech-to-text yourself, because there are so many APIs that you can use. So being aware of what's out there is incredibly important. It's more important, I would say, than doing it yourself, for a founder, probably.
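For example, transcribing audio today can be a single call to a hosted speech-to-text API rather than a model you build yourself. Here is a minimal sketch using OpenAI's hosted Whisper endpoint as one possibility; the provider, model name, and file name are assumptions for illustration, not something discussed in the conversation, and any comparable speech API would look much the same.

```python
from openai import OpenAI  # pip install openai; requires OPENAI_API_KEY in the environment

client = OpenAI()

# Send an audio file to a hosted speech-to-text model and print the transcript.
with open("lecture.mp3", "rb") as audio_file:
    result = client.audio.transcriptions.create(
        model="whisper-1",   # hosted Whisper model name at the time of writing
        file=audio_file,
    )

print(result.text)
```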
That's really insightful. So, if I can paraphrase: start to understand what tools are out there, what libraries, what APIs have already been built, so that you can avoid reinventing the wheel and trying to build something that might just be a single API call to a specific, already-built AI library, because there's a lot that's already been done. Is that about right? Yes. Yeah.
That's a really good point. And we're having this conversation a few weeks after the world got a crash course in generative AI, like ChatGPT, which you've just been talking about, and Midjourney, DALL-E, all of these other amazing AI tools that are showing the world what AI can do, even for people who don't know anything about the underlying technology. I'd love to hear you riff, basically, on generative AI. This is, you know, the newest wave, arguably, but you've seen many waves of machine learning. Do you think generative AI, and this sort of idea of people being able to access AI directly, is as transformative as people think? Is it overhyped? Is it actually even more important than people think? How does it work compared to other areas of machine learning?
That's a good question. Let me think. I'd say it's as important as people think; it is a really big deal, in my view. We've seen that at Workera: a week after it came out, I asked all our engineers and scientists to look into it and to implement it in our processes as soon as possible, or leverage it. We're all waiting for the API to come out, essentially, because we have already identified applications for it in psychometrics. So I think that's the case for a lot of companies who are adopting this and will continue to adopt it. But it is also a double-edged sword, and it is scary for many companies, because the fact that AI is accelerating opens up opportunities and creates risks for enterprises, as the barrier to entry to AI is lower than ever. So those who are able to pick up those technologies rapidly will accelerate, and the others will be disrupted. That's why I expect enterprises to increase their spend on education, on upskilling, on retraining their employees to build new AI technology skills. If you look at the main metric, which is the half-life of skills, a measure of how long a skill stays valid or useful, it is the lowest it has ever been. So reinvention is the new norm for employees. And it is even more the case because of ChatGPT, DALL-E, Midjourney; all of these tools that you talked about are just making it a requirement to be a lifelong learner in today's economy.
That's a fantastic point. I love that phrase: reinvention is the new norm. And this idea that people who jump right in and use this and, as you say, ask, how can I use this in my business yesterday, and figure it out and use the APIs, are going to accelerate like never before, and any company that just ignores it and thinks this is a fad might be left behind very quickly.

I actually have a funny anecdote about this, which is, years ago, Adobe, which is, you know, a huge leader in graphics software, started thinking a lot about deep learning. And they were like, we really need to know a lot more about deep learning inside our company. And it wasn't entirely clear why, but they were sort of talking about that a lot. And then when these tools just came out, one of the headlines, which maybe was a little bit lost in the excitement, was that Adobe is making stock imagery through AI, and they're able to create new images of anything you'd like using some of these generative graphics engines. And that, I think, is a really good example of a relatively big tech company not resting on its laurels. Because, you know, other stock photo companies might be a little afraid that people can create images of any kind on their own, but Adobe was sort of on top of it.
Yeah, that's a very good point. I agree.
You mentioned before something that I thought was really interesting, which is that AI is getting more and more into the hands of lay people; the user interfaces are getting simpler, and they're going to get even simpler, as you mentioned. And you know, I keep thinking about this moment in parallel to what they called Web 2.0. This is for us older folks who remember Web 1.0. But, you know, originally the Internet was really sort of a one-way platform: you'd go read things on it, and the only people who knew how to create web pages had specialized knowledge at the time, because you'd have to do it from scratch in HTML. And suddenly, this whole suite of tools came out: MySpace, Friendster, Facebook originally, all these blogging platforms. And they basically said, hey, you can put stuff online. Flickr was a big early one. You can put stuff online without having to know much; you can upload your photos, you can write whatever you want. And it was just explosive; it completely changed how the world interacted with the Internet. I'm curious if you think that is a fair comparison. Do you think that this is the moment when people stop thinking of AI as something that's hyper-specialized, something that only, you know, Googlers do, and instead say, I'm going to start using this, you know, in my work tomorrow?
Yes, I think it is a fair comparison. The way I see it is that today, we've seen what ChatGPT can do, and people can use it on OpenAI's website. But an important limitation of ChatGPT, in my view, is that it is hard to control. You cannot really tell what it's going to say to the prompt you give it. So when you think about embedding a technology like ChatGPT in your user interface, you're kind of scared. You're like, what if I'm trying to mentor a student with this chatbot and suddenly it starts spitting out something that is nonsense, because I cannot control it? And so the future for me, or the next version of ChatGPT, and I believe Sam Altman mentioned that they're looking at it actively, is the verticalization of ChatGPT, where you make it more controllable: you can determine the body of knowledge it has access to, you can determine the type of vocabulary it can use, you can define filters to avoid certain, you know, bad behaviors that you do not want to see in your user interface. And once ChatGPT has been verticalized, I think the comparison to MySpace, Friendster, the social media platforms, is even more true, because now you can control it and you feel more confident using it in your user interface.
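To make the verticalization idea concrete, here is a minimal sketch of one common pattern: constrain a general chat model with a system prompt and a hand-picked body of knowledge, then run a simple filter over the output before it reaches the learner. This illustrates the concept using the OpenAI chat API; it is not a built-in ChatGPT feature, and the prompt, knowledge snippet, blocked terms, and model name are all hypothetical choices.

```python
from openai import OpenAI  # pip install openai; requires OPENAI_API_KEY in the environment

client = OpenAI()

# Hand-curated body of knowledge the assistant is allowed to draw on.
KNOWLEDGE = """
Learning rate decay reduces the optimizer's step size over training.
Common schedules include step decay, exponential decay, and cosine decay.
"""

SYSTEM_PROMPT = (
    "You are a mentor for machine learning students. Answer ONLY from the "
    "reference notes below. If the answer is not in the notes, reply: "
    "'I don't know; let's ask a human mentor.'\n\nReference notes:\n" + KNOWLEDGE
)

BLOCKED_TERMS = {"guaranteed job", "medical advice"}  # hypothetical output filter

def vertical_tutor(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0.2,  # keep answers close to the provided notes
    )
    answer = response.choices[0].message.content
    # Crude filter: never surface answers containing blocked terms.
    if any(term in answer.lower() for term in BLOCKED_TERMS):
        return "Let me route this question to a human mentor."
    return answer

print(vertical_tutor("What is learning rate decay?"))
```

A real deployment would typically add retrieval over a curated corpus and more robust moderation, but the core idea is the same: the general model stays the same, and the product layer narrows what it can see and say.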
That's really interesting, to hear you talk about the verticalization of ChatGPT. I don't think I've heard anybody talk about it in that way, and there are a lot of articles about it, you know, and people talking about it in all sorts of ways. But that idea that, if AI APIs, if machine learning APIs, have really accelerated the field and made it more accessible, then an API around ChatGPT that actually allows you to tailor it to your use case, whether it's a chatbot on your site that sells cars, or a virtual tutor, or a friend to, you know, the elderly so that they have a companion in their nursing home, or any of a million other use cases, the ability to actually, as you say, control or limit or sort of tailor ChatGPT, and the image software as well, it seems like it would open up a lot of channels for people to use it in so many different creative ways. Yeah, I agree. I agree.

All right. So a quick question. I
know this is a loaded one. But, you know, for years people have been talking about the fourth industrial revolution, and how creativity and critical thinking are the skills that would be safe from automation, while repetitive analytical work, like analyzing X-rays or tax documents, was going to start going away. Suddenly, you have this explosion of very creative types of AI. Do you think it's still true that creativity and critical thinking are sort of a buffer against automation?
I believe that jobs will evolve. Now, it is hard for me to predict whether it has to do with critical thinking or with creative skills. But thinking out loud through a few examples: I think robot maintenance is not a creative job per se, and I would bet it will grow over the next few years; you need a lot more people who know how to maintain hardware. Things like content marketing, which is, by the way, you know, up to now a huge job in the market, there are a lot of content marketers and a lot of demand for content marketers, that job will likely change drastically. If you're a content marketer in the era of generative AI and you do not know how to use generative AI, I wonder how you will compete with other content marketers; I don't think it's doable. So I don't know exactly where the jobs will be created or where the jobs will be, you know, suppressed, but there will be both created and suppressed jobs. I do believe that history has shown more jobs will be created by technology than not. And the safest professional trait you can have is a lifelong learning mindset. That's what I believe.
I will put a plus one on that. I have constantly found that as soon as you take any time off and stop paying attention to the changes and the new tools and the new ideas that are out there, it seems like you can fall behind almost instantly; you know, in just a few months, everything changes in a lot of fields. I wanted to ask you, and forgive me if this is a sort of naive question, I'm very much a dilettante in AI and ML. But, you know, we're talking about ChatGPT and DALL-E as tools that make things much easier. But you've mentioned that there are a lot of other tools and APIs that have come out that have made AI easier and easier for different people to use, so they don't have to build their own. I've heard about, you know, scikit-learn and PyTorch and NumPy, I don't know how to pronounce that, but these are Python libraries that make AI easier to implement. Or Google has big tools, IBM has tools, Amazon has tools. And a lot of these are sort of off-the-shelf tools that you pay to use, but they do pretty sophisticated ML. Tell us about your take on this set of tools. How have you seen them change, and maybe even open up, the world of machine learning, even in the pre-ChatGPT days?
Yeah, they all participate in lowering the barrier to entry. I would say there are two levels. There are the tools that you talked about, which make it easier for technical folks to create AI applications. So, you know, PyTorch and TensorFlow are hard for a non-technical person to use; honestly, I don't see that changing in the next few months or years. But they make it easier for a software engineer who is not an AI person to actually build an application. And then there is the second level, which is really the WordPress equivalent or the Wix.com equivalent, where you don't need to be a software engineer to create a Wix website or a WordPress website or a HubSpot website. These are the no-code or low-code tools that are also coming to the market very rapidly. I haven't seen those being picked up as fast as we thought, honestly, but I think it's just a matter of time.

But the way I tend to think about, you know, the barrier to entry and the skills needed for everyone to use AI is in a T shape, where in a T-shaped career you have your durable skills that are the horizontal bar of the T. So, for example, doing two plus two is a durable skill; I believe that 10 years from now, you will still need to be able to add two numbers, it will still be important. But then you have, you know, some semi-durable skills, and I would say Python falls under semi-durable: the Python we used 10 years ago was slightly different from the Python we use today, so we had to update our skill sets, but not that much; it's still relevant. And then you have the perishable skills. It's hard to say which skills are perishable, but those are skills that you will need to update very frequently, and you will not be able to continuously evolve as AI evolves if you are not able to tackle perishable skills and learn them very quickly. And frankly, a skill like, you know, the transformer architecture, which is one of the most popular architectures in AI today, I don't know if we'll need to teach that to people two years from now; I think it's a perishable skill. If you look at the TensorFlow we were using in 2015 compared to today, it is also tremendously different from 2015. You know, good luck using TensorFlow back then if you were not super technical; it was really, really hard. And today it's much easier to use. Who knows what's going to happen in two years? Who knows if it's a perishable skill or a durable skill? But I think you should be well-rounded in your career, whether you are a technical person or a non-technical person, in your ability to learn new perishable skills and be comfortable forgetting them when you're no longer using them in the future.
Amazing answer. And I think your earlier answer, that reinvention is the new norm, combined with this T-shaped career concept, is really valuable. It's like, some of the skills that you have right now are perishable; they're connected to specific tools, they're connected to specific techniques, which will get updated. You have to continually learn them and reinvent them, reinvent yourself in relationship to them. Others, maybe like management or, like you said, math skills, will continue to serve you throughout your career. So I think this would be great advice for young people who are just entering the workforce now: do your best to tell the difference between which skills you're learning are perishable, durable, and semi-durable, because the ones that are perishable are exactly the area where you have to be a lifelong learner. I couldn't agree more with that.

I wanted to ask you a few questions that may be on some of our listeners' minds, and I hope this section isn't too loaded or wacky. But you are such an expert in machine learning and AI, and Workera is really becoming, if it isn't already, the place to measure AI and ML skills around the world for all these different fields. I'm sure you get these questions a lot, but let's talk about some of the things that people are a little worried about in the future with AI. I'll ask you just a couple of them, and I'm so curious, we already talked a little bit about automation, but I'm so curious how you think about these from your perspective at Workera. So, number one: will AI automate away jobs? You talked about this a little bit, but I want to hear a little more. And if it does, is it going to be the white-collar jobs, like managers and graphic designers or content marketers? Or is it going to be the blue-collar jobs, like call center workers or medical assistants? Or is it going to be both?
Yeah, man, this is a very interesting question, but a tough one. I've thought about the topic, and I'd say mainly two things. The first one is, while I don't know which jobs will be automated away or not, I tend to think about automation as a percentage of productivity, and that percentage of productivity will determine how much demand there is for a specific job. So going back to the content marketer, whose job is to create content for a specific purpose: in a world where generative AI is fully present and deployed, there is an impact on the productivity of a single content marketer. So a company may ask themselves, do we actually need 10 content marketers? Or do we need five? Or do we need three? Or do we need one? And this has an impact on the amount of demand there will be for that job. On the other hand, there are new opportunities that will open up. You know, if now we can verticalize AI, we need people to label that data, and we did not have this job before. How many do we need? Do we need two, do we need three, do we need ten, depending on the number of applications? And so I would look at it from a productivity perspective, and I think this will determine the demand and supply for a given job category.

My dream, personally, is that we are able to measure skills at a large scale, you know, as a community, so that we know what skills we have in supply, and what the demand for skills is, through job descriptions, through what companies are looking for. And we can tell someone: if you are going to take a bootcamp for six months in that job category and you're going to upskill yourself, here is what you're going to face when you come out, here is your chance of getting a job. People should know it. You can decide to follow your passion, and, you know, you may end up in a highly competitive market and you will get a job or not, but at least you should know what you're going for. I remember in France, when I was there, there were twice as many physical education students in a four-year degree as there were jobs available out of that degree. And so, without telling people, you have half of them who will not get the job they were looking for. And I can bet you, if you had told them earlier, they would not have gone that route; they would not have taken four years to build a certain skill set that is not useful in the market, because it has an impact on their finances, and on their families, and on a lot of different things in life. And so I sense that, you know, if we can determine productivity job by job and also measure skills at a large scale, we will get to a place where we can answer this question very granularly. My advice is, for anyone in a job, ask yourself how much more productive you get thanks to AI, and this will tell you how much the demand will be for that job category, probably.
Really, really interesting take. And it strikes me that, you know, if skills were being broken down and communicated to people as they were starting their careers or changing careers in a much clearer way than they are now, and they are not now at all, it would also allow people to tell the difference between these perishable and durable skills, and it would allow them to foresee which jobs might be growing and which jobs might be shrinking. So I think the idea of a skills ontology, a skills taxonomy, that is actually open to people, that people can really access, is a mitigation against the automation away of jobs, or against way too many people becoming, you know, business majors and then business majors not being hired, or becoming computer science majors, or becoming, you know, sociology majors and then there's just nothing there. It's a really, really interesting way to look at it, and obviously very connected to what Workera is doing with this skills taxonomy.

You mentioned the idea of verticalized AI, where you can sort of control it. And AI has been criticized in the past as sometimes being biased; it can recreate inequalities by using data or models that aren't equitable, or sometimes even not factual. You mentioned that, you know, sometimes ChatGPT spits out nonsense, it lies, or says things that make no sense. How do you think about bias, or that idea of lying, in AI? You mentioned a little bit about this already, but how are we going to ensure that AI is trusted and doesn't sort of throw curveballs at us?
That's a huge issue. A lot of people were arguing that ChatGPT will kill Google, or search. But if you look at it, it's a very different application: ChatGPT is trained on dialogue; Google is not trained on dialogue, it's trying to find information that is accurate. And so it will require training the AI for certain behaviors in order to get to a level where it is not biased, it is safe to use, it is not lying. In fact, fact-checking of AI output is a large area of research right now, responsible AI. And we actually test responsible AI skills at Workera; you can test your AI + fairness, AI + safety, AI + privacy skills. It's a massive topic in the enterprise.

I'm actually really glad that ChatGPT was released in beta and that it was messaged appropriately, because people knew, you know, don't take it too seriously; you can use it, but don't take it too seriously. If you go on Character.AI, which is another generative AI platform, it says up front in every run of your chatbot discussion: remember, everything the character says is made up. I'm all for testing, and for this messaging around it. Not everyone is technology-savvy, or super technology-savvy, and people can fall into the trap of thinking everything it says is factual or true or unbiased, when it's not. I'm thinking about my own parents, who could easily fall into that trap if they start playing with ChatGPT and trusting it, which is scary. But if you have the right messaging around it, it can work. I also like Elon Musk's idea of open-sourcing certain algorithms and their decision-making processes. I think in the long run that would definitely help; I think practically it's slightly difficult, but I hope we'll get there one day.
I'm glad you're raising that flag and giving the ChatGPT folks and others credit for not saying, this is all done, it's ready, you should trust it. Because, you know, we all know that with medical information or political information, when a data set is itself polarizing or filled with inaccuracies, the AI doesn't know that, and it can recreate them. So I'm glad it's being studied; I'm glad it's being taken very seriously. It is tricky, you know, this whole AI + ethics area, as you say; it feels like that's just going to grow and grow.

I have a personal prediction, and I'm curious if you agree. I'm predicting that in 2023, pretty much every edtech company that is, you know, looking for funding or trying to grow will need to have an AI story; they'll need to have a slide in their deck that talks about how they're leveraging AI, whether that's machine learning or generative AI. I just think it's going to become part of everybody's story. Do you agree or disagree?
That's a very good question, a very good take, actually. And I agree, if they are venture-backed; I do agree that it's hard to get funded as an edtech company without any AI angle. But you know, sometimes it's not necessary; sometimes it's overkill. I have seen bootcamp companies talk about their AI stack, and honestly, I'm a bit dubious; I just think they're doing something great that may not require as much AI as they talk about, and that's perfectly fine. So, for example, if you are a content creator, and you're really on the edge of innovation, you're creating content about a thing that nobody out there knows, your work is extremely valuable. But it's not AI work, and it's perfectly fine. You're literally just spreading your knowledge out there and not using AI, and you may be a very successful company, and you're definitely contributing. But you don't need to talk about AI. Yeah, but I agree with you on venture-backed, yes.
So there will be a slide in the deck, but sometimes it's redundant or unnecessary; it's just something that VCs are looking for, but maybe not something that's actually required in every different edtech company. Yeah, I like that take. I definitely think, I mean, I asked the question, but I definitely think VCs are going to start getting very obsessed with AI and how it plays in edtech.

And then here's the most sort of sci-fi question of the interview, and forgive me if it's just too silly. But do you believe in the concept, popularized by Ray Kurzweil, I think 30 or 40 years ago, of the technological singularity, which is defined as sort of the moment when tech growth becomes uncontrollable, irreversible, and sort of results in unpredictable, unforeseeable changes to humanity? Like, does tech take over humanity? Do you think that's possible, given the trajectory of AI and machine learning right now?
I hope we will control technology. I'm not sure I'll have a great answer for you here. But I look at technology, you know, let's say on a 100-year horizon, which is easier than a 1,000-year horizon; obviously, who knows what will happen in 2,000 years. But you look at a few technologies like AI, like VR and the metaverse, and you look also at bio, and those are three big things I think will happen in the next 100 years. On the AI side, I think AI will change people's lives; it will change how we live. But it is controllable. I believe that the training process of AI is controlled. It is going to be increasingly inspired by the way we learn, but it may actually find ways to learn that are better than the way the human brain works. And so it may be smarter in many ways than humans, but it doesn't have a body; it doesn't have, you know, this chemical body, or bio, that we have as humans. So there's still an unknown of how this would happen; to fully take over human civilization, it would need to understand that.

And that's where bio comes in, where, I think, you know, you see all these health trackers today where you can track pretty much anything, but it will continue to advance, in my opinion, and we'll get to a point, maybe in 100 years, where your body is no longer a mystery, medicine has changed completely, you can live a longer life, maybe forever, who knows. And you can also be more powerful, more resilient, smarter, and you can start controlling some of the chemicals in your brain in order to learn things even faster, or do things that complement AI.

And then there is the part on the metaverse, which is the biggest question mark for me, where I do believe that in this century, every house will have sort of a green room, a room with a nice setup where you go into the metaverse and you spend your day there, and maybe there's something attached to your body that feeds some energy or nutrients into your body. I'm not sure how widespread it will be, but I think it will go in that direction, where you can live a life in a metaverse and do a lot of things that you would not believe we can do today. I think it's just a matter of time before we get there, and the experience has to be immersive enough. You know, it's possible that we end up becoming similar to the AI, because we end up being in a different world, and our intelligence is projected into that world. I'm ranting and thinking out loud, but I think we can control all of that, in a sense.
Yeah, no, it was an appropriately sci-fi answer to a sci-fi question. But I really like that: every house will have a green room, biology will make the body less of a mystery, and AI might get smarter than people, you know, at some point, or at least might learn in ways we've never thought of, which I think is exciting. But underneath it all, you feel like it will still be controllable, and that makes me sleep a little better at night.

So, you know, Workera is incredible. I am really a fan of this company, and I think it's doing really, really necessary work in helping define the skills of the future. I wanted to ask you, you know, I end every podcast with the same two questions. When you look from your perspective at Workera, you're talking to lots of Fortune 500 companies, you're talking to a huge number of learners looking to get the most up-to-date skills: what do you see as the most exciting trend in the edtech landscape? What's something that you're seeing rising and hearing more and more of that you think is going to be, you know, more important in the future?
The trends I am seeing are, first, skills measurement. It's top of mind for every educational provider and every one of their clients. I think companies are getting really serious about measuring skills accurately. This includes tech skills, but also career success skills, sometimes called soft skills or power skills. Data and AI are the main focus on the tech skills side; career success skills are also very much growing.

The second trend would be apprenticeships, which I think are coming to the US. Historically, apprenticeships have been popular in Europe to allow students to build job skills and work experience while studying. They're placed within a company for a temporary period, with almost, let's say, a guaranteed job at the end of their apprenticeship, typically 12 to 18 months. Today, I think it's coming to the US with a special focus on diversity hiring, which is great.

Peer-to-peer coaching platforms somehow are trending: connecting mentors to employees, but also connecting employees to employees. I think the main limitation on that front is actually the skills measurement: when you cannot measure skills accurately, you have a hard time matching the right mentor to the right person, and so the match may not be as meaningful as you think.

Then, of course, cohort-based classes and anything related to the creator economy, that's trending; these are focused mainly on developing some sort of marketplace where experts are able to monetize their knowledge by finding an audience who will pay to learn.

And then the last one would be immersive learning, sometimes called experiential learning. Those platforms are also trending, targeting the enterprise, helping clients increase engagement in upskilling initiatives by creating immersive pathways, really, with all sorts of blended learning, where you can learn through a lecture-style class, you can learn through live coaching, you can learn through projects, you can learn through podcasts, and all of that comes together. Yeah, those are some of the trends I believe in.
That's super interesting. I feel like there's a potential through line. I'm hearing skills measurement getting really serious, which will unlock peer-to-peer coaching, because you can connect people with complementary skills; apprenticeships coming from Europe, where they've been successful for many generations, to the US; cohort-based courses; and immersive and experiential learning. One through line that I sort of see with all of these is that they actually have a lot to do with learning more about what we don't know, and then connecting us to people who do know. That's sort of the through line there. It's like when you mentioned earlier that the thing your learners at Workera said they needed was mentorship; I think these are all sort of parts of mentorship. Cohort-based courses is sort of hiring a mentor for a little while; apprenticeships is getting mentors who are in the field. I think that sort of mentorship, and understanding how to get the skills you need, underlies all these trends, and they're terrific trends.

That's a really good answer. And then our very final question is: what is one resource that you would recommend for somebody who wants to learn more about anything we talked about today? And I'm going to give you a constraint, I never do this, but you're not allowed to say The Batch, the DeepLearning.AI newsletter, even though it's a fantastic resource, just because we've already heard that one and I want you to go deeper. You're so deep in the field. But everybody should check out The Batch.
So, to break into edtech, I'd say, well, I like Alex Sarlin's Edtech Insiders, but, you know, that's a great one. I also like Learning Tech Talks from Christopher Lind. I would also say follow CEOs and founders of edtech companies on LinkedIn; edtech is actually bigger than most people think, and I would even say talent tech in general. And then, you know, follow news from investors in edtech who are active, and look at their investments; it's probably what their thesis is and what they're seeing 10 years out as being a game changer for education.
Great suggestions. I'm going to push you with one more question, which is: if people want to learn more about AI and ML, these edtech founders who need to understand this better, where would you send them?
Actually, I'd say different things; it depends on whether you want technical knowledge or not. But the way I typically do it is, first, actually, you know, go to some AI leaders' accounts on Twitter, Twitter is a big place for AI, by the way, and you look at who they follow, and you follow the same people, because that's how they get their news. You know, if you look at Andrew's Twitter or my Twitter, we actually curated a list of people that you can follow, and then you get news on AI. Then, if you want to go deeper, I'd say Reddit is a great place to go deeper on a research paper. So you find out about the research paper on Twitter, then you search it on Reddit, and that's where people debate the technicalities. And then finally, sometimes I would suggest technical people go to the websites of major AI conferences like NeurIPS, ICLR, ICML, and look at the award-winning papers. They have a rigorous process, and so, you know, most of the time those will be highly curated papers that are well written and probably going to be useful. It's not guaranteed, but at least you can skim through and spend five minutes per paper. If one of them is very relevant to what you want to build, then spend two hours on it. If it's very interesting, then spend two days on it, try to re-implement it. That's how I would go about it.
Classic answers. Twitter for those AI leaders; Reddit to debate research, I have never heard that Reddit is a place to debate research, but it sounds like it is; and AI conference websites, looking at award-winning papers and seeing what the best new research is at any given moment. Amazing recommendations across the board. Kian, it was so terrific to talk to you today. Anybody who hasn't already looked up Workera and what they do, if you didn't already know a lot about it coming into this podcast, should be doing that right now. It is a really innovative company that is moving the needle on skills every day. Thank you so much for being here with us on Edtech Insiders.

Thanks, Alex, it was a pleasure.

Thanks for listening to this episode of Edtech Insiders. If you like the podcast, remember to rate it and share it with others in the edtech community. For those who want even more Edtech Insiders, subscribe to the free Edtech Insiders newsletter on Substack.