
S1:E1: From Data to Impact: Redefining L&D’s Value with Derek Mitchell

Jan 22, 2025 · 57 min · Season 1, Ep. 1

Episode description

New Series! From Training To Transformation: Redefining L&D’s Role.  We're so excited to announce our new multi-episode podcast series on L&D transformation. Our first episode kicks off with Derek Mitchell, Analytics Partner at Novo Nordisk. 

In this episode Derek discusses the critical role of data in L&D. He emphasises using data to proactively identify training needs, personalise communications, and measure the impact of learning initiatives. He also highlights the importance of moving beyond traditional metrics like course completions to focus on performance, skills growth, and behaviour change.  It's a fantastic episode, and definitely sets the scene for this crucial series! 


Transcript

Hello, everybody. Happy 2025. Happy New Year. I'm not sure really how long we can say that into the month of January, but here we are mid January and I'm still saying it. So there you are. That's the rule. I've set it. As you know, my name is Ashleigh St. Clair and I am here joined by a fantastic guest that I'm so excited to chat with. Um, Derek. Hi, I've got Derek Mitchell here. Hey. Hi. Really lovely to have you here.

I'm so excited to dig into this conversation, where we're obviously going to talk a lot about data and L&D, but also how we can actually start to consider driving value in our organization that's beyond measurements like completions and bums on seats, and really talk about how we can drive impact as a learning function. I know you have some really strong views and I'm very excited to dig into them.

First and foremost, I mean, I, I think most people probably know you are, you're quite active on LinkedIn, but those that don't, would you mind maybe just giving a little bit of an introduction into who you are, what you do, and how you found yourself here? Totally. So I'm an analytics partner, a people analytics partner at Novo Nordisk, big pharma company. Um, at heart, I'm an analyst. I'm not an L&D guy.

Okay. I started off working, uh, looking at product performance for a big media company and then moved into customer acquisition, right? Saying what happens if we put out a 5 discount versus a 10 discount versus a free TV, etc. How do we affect customer behavior? That got me interested in, in the behavior of people and how we can use data and behavioral science to, to essentially manipulate them to their benefit and to our benefit. So I moved into banking.

where I ran a number of behavioral experiments. Um, you know, what happens if we're having a web chat with a customer and we stick out our tongue in an emoji or we wink at them with an emoji and what have you? How does that affect their behavior? What does it do? Then I fell into people analytics, having that background and interest in behavioral science. So looking at L&D, lots of companies spank huge amounts of money.

Um, on training people to do stuff, but they don't know what they get back for that money. So I was brought in to companies to try and understand that. And now I'm in a people analytics role. So I'm sitting across all the people data in a huge multinational company, looking at talent attraction, learning and development, employee listening, rewards, all of that stuff. Love data, essentially. Amazing. And you know what, again, most of us end up falling into the L&D industry.

I don't think we wake up or even when we're picking our degrees and things and go, I want to be a learning and development professional. I want to work in people development. I think we mostly don't even know it exists, but I'm saying I kind of fell into this, but it's, it's wonderful to hear more of your kind of, almost even, uh, kind of a marketing background in some ways, like the depth and breadth of using data to influence people's behavior, to my mind, is, is very marketing oriented.

Whether they're customers or they're employees, right? We've all got motivations, we've all got interests, and we can all be nudged towards doing things. So yeah, it's the same. Marketing and L&D are best bed buddies. I agree. I agree. I think, you know, we are really trying to accomplish the same things: get people to know about things they don't know about, and get 'em to do things they maybe don't necessarily wanna do. Totally.

So, okay, so you eat, sleep, breathe, live, love data; live, laugh, love data. So talk to me a little bit about what data means in L&D. Obviously, you've got a broader view, which is people, and I would expect that that maybe eliminates some of the key issues that L&D typically experience with data silos and stuff being housed in other parts of the business. So talk to me a little bit about what you're doing with data, what role data has for your organization and for you.

If we focus in on the L&D aspect, okay, data for me has three uses. In the L&D area, okay? One of them is about identifying proactively what people need. Okay, so traditionally in L&D, what we'll do is we'll have a function sitting there and someone senior will say, I think we need to train people on X, phones up L&D, yeah, we need a course on X, people build a course on X and then that's it, essentially, right? They say, did you enjoy the biscuits? Was the trainer wearing a nice dress?

All that sort of stuff. But actually, we've got huge amounts of data on our people. Okay. We know what their jobs are. So we know the job descriptions. So we know what skills they should have. Okay. We know what positions are open in the business. So we know what, you know, skills people need. Okay, we can use all of that data to say, okay, what do we need to train people in regardless of the highest paid person's opinion?

Okay, what are people saying they need training on or what can we infer that they need training on? We've got people's skills, right? We ask people, what are you good at? What are you not good at? Okay, we've got a huge database of skills, so all of that stuff can tell us what people need. And again, just the performance.

Okay, if we see Team A is performing poorer in whatever functional area than Team B, L&D should be using that data to say, actually, we can jump in and we can help Team A or Team B, whoever needs it. Okay, rather than waiting for the business. So proactively identifying training needs is one use of data. Then the other use, of course, is once the training stuff has been built, we need to get people to do it.

Because frankly, when, when people come into their jobs, of a morning, right? Some, if they're in, you know, salaried roles for big companies, they might have their eye on their bonus. And that's not going to necessarily, um, they're not going to be motivated by stopping what they're doing, stopping being productive to go and look at a course. Okay. So, so we can use data in marketing, okay. Getting the messages to the people who need it.

It's one thing knowing that people need a thing and then designing a thing to meet that need, but we need to talk to them about it. And we need to personalise those communications, right? So we use data in marketing there, and also personalization within the course, right? You know, it's one thing opening a course and it says, dear colleague, you know, whatever. Right. How about, hi, Derek. Right. The simplest data point: use people's names. So we do that.

And then of course, there's the third area we use data is in measurement. So we've, we've identified the need. We've built a thing to meet that need, and we've told people about it. What, what did it do? Right? Did it, did it change the behavior? Has it changed the culture? Has it changed the performance? Have their skills grown? Have their human networks expanded? These are all things that we look at.

Um, so yeah, all the way through the life cycle, data is there to identify the need and it's there to measure the impact. Make sure we didn't waste our time essentially. Yeah, absolutely. And I love that it's interwoven with every aspect of L&D and it isn't just a case of something we do for measurement purposes at the end. It's actually, it helps inform everything that we do and that, you know, those insights are crucial.

I'd like to unpick each of those steps with you, if that's okay, because the first piece, when you were talking about, you know, we've got all this people data: we know what jobs they do, and we know what skills they therefore need to have. Was this data present when you joined Novo Nordisk?

Because I, I would expect a lot of people listening will go, I don't have that data, so they'd have to undertake a very gargantuan process around skills mapping, capturing that data, aligning that data. Is this just a task that has to happen? So, so this is one of those things, it's as difficult as you want to make it.

Okay, if we just start off with people's jobs and job descriptions, okay, that data exists, that exists in all companies, because you know what, when we hired people, we put up job descriptions. Okay. To say, hey, we've got a vacancy. So we've got that data. We know what people are supposed to be able to do. Okay. And then we can measure them against what they're supposed to do. One of the data points we use quite heavily is skills stuff.

Now we use Degreed to capture people's skills, and they self-rate themselves on their skills. But it could be as simple as just a survey that goes out to everyone saying: list five things that you are good at, five things you want to get better at. Okay. And then just pull it back in, right?

Look at those results and say, okay, do we have groups of people overindexing? Are women overindexing, for instance, in terms of, you know, senior exposure or whatnot? Are men overindexing in being selfish guys, whatever, you know? That's a really simple data point to get. I mean, there's more advanced data points we can get, you know, if we start looking at performance data. If we think about a huge business, right, people who work in finance might have 20 different metrics, right?

And those metrics are going to be different to people who work in manufacturing, who are going to be different to people who work in marketing, who are going to be different to people who work in analytics, etc. So, you know, we have to be quite cute about what metrics we choose to chase; we can't get everything all at once. But it does all exist in the business, right? Yeah, absolutely. And I think it's understanding where to get it from too and starting small.
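As an aside for readers who want to try this: the simple skills survey Derek describes (five things you're good at, five you want to get better at) can be aggregated in a few lines of Python. Everything below (the response format, the group labels, the skills) is a hypothetical sketch for illustration, not Novo Nordisk's actual setup.

```python
from collections import Counter

# Hypothetical self-reported survey: each person lists skills
# they want to get better at, alongside a demographic group.
responses = [
    {"group": "women", "wants_to_improve": ["negotiation", "public speaking"]},
    {"group": "men",   "wants_to_improve": ["negotiation", "excel"]},
    {"group": "women", "wants_to_improve": ["public speaking", "excel"]},
]

def improvement_index(responses, group):
    """Share of a group's 'want to improve' mentions per skill,
    so we can see whether one group over-indexes on a skill."""
    mentions = Counter()
    for r in responses:
        if r["group"] == group:
            mentions.update(r["wants_to_improve"])
    total = sum(mentions.values())
    return {skill: round(n / total, 2) for skill, n in mentions.items()}

print(improvement_index(responses, "women"))
# {'negotiation': 0.25, 'public speaking': 0.5, 'excel': 0.25}
```

With a result like this you can compare groups side by side and spot, say, a skill that one group flags twice as often as another.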

I think sometimes people... I personally see a very extreme reticence in our industry in terms of gathering and utilizing data to inform what we do, to build better learning strategies, to improve our marketing, but also, you know, enhance our learning products, to ensure that what we're offering, our solutions, are actually aligned with the business and people. Do you have a view on why we don't use data?

Is it just because it's damn well hard to do, or, like you said, is it because we're making it bigger than it needs to be, and maybe we should start small? We should start small, right? And it's really easy to start small. And actually, it's diminishing returns, okay? If you get the right data set to start with, you can do huge amounts. But ultimately, there's three reasons I think we don't use data very well in L&D, okay?

The first reason is that people in L&D are quite creative people. Like you said at the start, right? People don't wake up in the morning or start their careers and say, I want to be in L&D, right? They tend to fall into it, having worked in other functions in a business, and they've got there because they're creative and they like working with PowerPoints or doing animations or seeing how people think, okay?

For people who are really creative, data can seem a bit boring if they don't understand it, right? Ultimately, you know, that's a big turnoff for people in L&D. But if you think about data as not being something that's on a system, right, just ones and zeros and numbers, okay, ultimately, data is people. Okay. Right. Every, every data point we see is generated by a person doing something. Okay. So, so data is how people behave and, and the digital exhaust of what they do.

So it's actually super interesting. And I think if more people in L&D thought about data as people, they might get a bit more engaged about it. Right. I was at INBOUND in Boston a couple of months ago, and it was just so refreshing to see these people who work with tiny little data sets and do so much with it. Okay? So, you know, talking with people who maybe take the footprint of a customer who's visited a website to buy a pair of shoes, right?

And they, they've got that one interaction and they buy the pair of shoes. So from that interaction, we, we know the shoes they looked at, we know all the other shoes they looked at. We know the color of those shoes, the size of those shoes, right? We know were they formal, were they casual? We know all of this stuff. They were free at that time of the day to go out and buy things. All of this data. And of course, we've got their email address as well from the sale.

Okay, so people in marketing are using tiny little glimpses of human behavior to do incredible things. Let's sell that person, or position in front of them, a complementary sweater, right? They bought some blue sneakers; let's put a blue sweatshirt in front of them. Like, let's see if that hits the mark. And all these people, tiny amounts of data, they're doing so much with it. And it did strike me that within L&D functions, right?

Marketeers would kill for what we've got. We've got so much data on our people. It's not a five-minute interaction on our website. Okay, we've got people sitting on our tools for eight hours a day, five days a week, talking with people, doing things, moving about, working on social channels. We've got so much information. L&D just don't do a lot with it.

Okay, so if they apply the creativity to using those data in funky ways, then, you know, we're well ahead of most marketing teams in terms of understanding what people want. So data being boring is one thing. Then the next thing that maybe stops people using data is that nobody in L&D... I've worked in a few companies in L&D, and, you know, senior leaders who listen to your podcast might hate this, but let's be honest, no one's accountable for business impact.

Okay, L&D functions have spent decades saying to leadership teams, you know what, measuring learning is really hard. If we put people through a soft skills course, how are we supposed to see that, right? Businesses have now stopped asking L&D functions to justify themselves. You know, what L&D senior leader is going up at their annual review and saying, for my objectives next year, I want to be personally accountable for demonstrably driving sales by 1%? And that's my function.

And I own that. Nobody, nobody's doing that. Okay. It shows a lack of confidence, right? If we're delivering fantastic things, if we're really delivering business benefits, we should be saying, right, we want to own cost reduction. We want to own, you know, whatever; give us a metric. But no one's accountable for that. That's on no one's objectives. The third part, um, and this might be even more controversial than the last one.

For the most part, for the programs and activities that I look at, most of them don't work. Okay, most of L&D activity is just a really expensive way of taking people away from their jobs to do something that isn't productive. Okay, now, as an industry, that sounds really negative. There's lots of great activity that adds value to the business. L&D functions typically hope that the net result is net positive, okay? We bring more value than we take out of the business.

But if we don't measure it, then we don't have to worry about it, right? So, in many cases, ignorance is bliss, right? We just think we're doing good stuff. No one's asking us to justify ourselves. Data's a bit boring. So let's do some fun courses. That's where I see it, as a non-L&D person looking into L&D functions. And then, you know, I'm not talking about my current company. This is across companies, right?

Yeah. Oh, we, we, we see very, very similar things where it's, I think L&D doesn't really know, like when you were speaking, I was just thinking, you know, what is the purpose of a learning function? What is our role? Like when we've done discovery work with clients and, you know, we're looking at building out marketing strategies and learning campaigns.

A part of that is understanding the learning function, what their value prop is, or their brand, who they are, and their identity within the organization. We'll often ask them, what's your ultimate goal as a learning function? And the blank faces never fail to astonish me.

It isn't as rudimentary as "we want to help people learn", or something really saccharine like "I want to enable people's lives and empower them". There's got to be something around this business link that is just deeply missing from most learning functions. We don't know how to build value into the business, like you said, because people aren't asking for it, and because we don't really know how to capture and leverage the data that's available to us.

It really does create an interesting question around: is L&D necessary? Is that something that we need? You know, for example, I've just been reading through the World Economic Forum's Future of Jobs Report, which just came out this month. I don't know if you've read it. It's very, very long.

I only read the bits that I thought pertained to L&D, but according to that, reskilling and upskilling are crucial to business transformation, and skills gaps are a major blocker for it. Their survey of over a thousand employers said that investments in L&D are increasing. That's not what we're seeing. We're seeing what you've just described, where businesses don't understand the value of L&D. They've stopped asking L&D to prove its value.

And so what's happening? Well, restructuring, budget losses, loss of headcount. It's an interesting dynamic, isn't it? So if we're not really capturing this information and taking the initiative to actually prove our value through things like data and people information, we're losing the foothold that we have, which is small anyways. Totally. And I think this is a realization that's starting to work its way through with the people in L&D I speak with.

You know, if we look two years ago when we saw ChatGPT come in, it was really clear, you know, within a couple of weeks how disruptive that could be for L&D. And then, you know, you sit and have lunch with L&D people and they're like, yeah, yeah, but it'll never, you know, be able to do what humans do. It will never be able to create a course. It will never be able to coach people, right? And I'm sitting there thinking, no, that's exactly what it can do.

That's exactly what it will do. Now, my predictions were it will do that in about six months. It took about nine months for us to get there. I'm always optimistic in terms of how fast things are going to change. And then, of course, it meets the reality of the big corporate world, and there's what's possible and then there's what actually drips through, right?

But I think this year coming is one where people are actually going to really start thinking, you know, how do I justify myself in the business now? There's going to be some existential thinking going on. Absolutely. It's an identity crisis that needs to be addressed for sure. So let's go back to your three stages. So we've talked about the data that we have available and understanding our people more through data.

That's one of the key ways that we can use data, but later on I'll pick up with you maybe on some of the data points that you use or things that are particularly useful, but you have touched on some of those. So let's move on to that second piece, which is actually marketing of the stuff. So you've used your data points to actually build learning products that make sense, that people want, that they need, that the business is aligned with. They need to know it exists.

They need to know where to go. They need to understand the benefits. So how are you using data in the kind of marketing and communications aspect? Totally. So we run A/B tests when we send out comms. And just for anyone listening who doesn't know what that is:

Okay, so traditionally, when I started at Novo Nordisk, if we had a sales course to go to salespeople, we'd build a sales course and then email 20,000 salespeople to say, hey, dear colleague, here's a course on sales for you to do, right? And they'll get that thing; most people ignore it. But having spent 30 seconds on it, you know, 20,000 people, that's time, and time's money, etc. So what we wanted to do was start using segments, okay?

Um, to talk to people personally. So we use their skills, okay? If we've got a sales course that's going out, we will say, okay, who's said that they are interested in sales? Okay, of the people who said they're interested in sales, who has said that they're not very good at sales? Okay. This is aimed at managers. So people who are interested in sales, who are not very good at sales, who are sales managers.
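The narrowing-down Derek describes is, at heart, just a filter over employee records. A minimal sketch in Python; the field names and records are invented for illustration, not Novo Nordisk's actual data model.

```python
# Hypothetical employee records with self-reported skills data.
employees = [
    {"name": "A", "role": "sales manager", "interested_in": ["sales"], "strong_in": []},
    {"name": "B", "role": "sales manager", "interested_in": ["sales"], "strong_in": ["sales"]},
    {"name": "C", "role": "engineer",      "interested_in": ["sales"], "strong_in": []},
]

def sales_course_segment(employees):
    """Interested in sales, not yet strong at it, and a sales manager."""
    return [
        e["name"] for e in employees
        if "sales" in e["interested_in"]
        and "sales" not in e["strong_in"]
        and e["role"] == "sales manager"
    ]

print(sales_course_segment(employees))  # ['A']
```

The same pattern, three boolean conditions ANDed together, is what takes an audience of 20,000 down to the handful of people the message is actually for.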

Okay. Instead of emailing 20,000 people, we're going to email these 14 people or, you know, these 200 people or whatever, with that message. Now it's not that all those 200 people get the same message, right? Because it will hit with some; for others, it might be like, yeah, you know, I don't like the subject line of that email, I'm just going to delete it, whatever. So when we send the emails now, we send out multiple versions. Okay, this is our A/B testing.

So if we were sending out, for instance, a sales course for female managers, okay, or, you know, it's aimed at getting more females through the leadership pipeline within our business. Okay, let's send out an email where one of them's got a picture in it, right, of a really strong professional-looking woman carrying a laptop. Right. And let's get another version where it's a guy with highly polished shoes carrying a briefcase. Right?

And we'll send out those two emails to a subset of our population, and whichever email or whichever format gets the most click-throughs to the course, gets the most engagement, everyone else then gets that version of the email. Right? So that's an A/B test. It's where we put different subject lines, we put different content in those mails, to see what's most engaging for people. Now, doing that, you know, using the data: it used to be, we'd send out lots of emails and disturb lots of people.

They weren't interested, and sign-up rates were really low. When we started personalizing it, right, actual people signing up increased by six times. So from emailing fewer people, we were getting six times more people signing up for things. That's because we were talking to them like people, right?

It wasn't a "dear colleague". It was "hi Derek" if we know you like informal conversation, or "dear Derek" if we know you like formal conversation. And there's ways to identify that across people. You know, changing the content in it because you've been in the business for X amount of time doing Y role. It's just, you know, it's basically a mail merge of different texts for different people. Personalizing it made a huge difference.
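The A/B mechanic Derek describes, send variants to a pilot slice and then send the winner to everyone else, can be sketched like this. The `send` and `measure` callables are stand-ins for a real email platform and its click tracking, and a production version would randomise the pilot assignment rather than taking the list in order.

```python
def ab_test(recipients, variants, measure, send, pilot_fraction=0.2):
    """Send each variant to a slice of a pilot group, pick the variant
    with the most engagement, then send the winner to everyone else.
    send(person, variant) and measure(group, variant) are stand-ins
    for a real email platform and its click-through tracking."""
    pilot_size = max(len(variants), int(len(recipients) * pilot_fraction))
    pilot, rest = recipients[:pilot_size], recipients[pilot_size:]
    engagement = {}
    for i, variant in enumerate(variants):
        group = pilot[i::len(variants)]  # round-robin split of the pilot
        for person in group:
            send(person, variant)
        engagement[variant] = measure(group, variant)
    winner = max(engagement, key=engagement.get)
    for person in rest:
        send(person, winner)
    return winner
```

Note that this changes one whole variant at a time; as the hosts point out later, each variant pair should differ in only one element if you want to know which element did the work.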

So that's one way we've used data in there to drive our marketing campaigns. And I'm not surprised to hear those staggering results because I think sometimes, and many marketing functions are just as guilty of doing this as L&D are, where it's, you know, I'm just going to send everybody it. And then, you know, throwing shit at the wall and seeing what sticks essentially is the strategy there, right? And yeah, you'll get some conversions, but it's not gonna, it's not gonna be really valuable.

And I think that personalization piece, that segmentation piece is super interesting because Um, you know, obviously we focus a lot on personas, which is, is more around that kind of brand messaging piece. You know, how do we, how do we pull out the pain points of people? How do we, how do we get them started with learning? And, and that is where personas are really crucial in understanding those pain points and those behaviors and the benefits, the ROI of investing time and learning.

What you're doing is almost more for people who are past the awareness stage; they're kind of more in the middle of the funnel, and therefore you can have these more transactional, personalized communications. You said you're interested in this, so how about this? You know, kind of what you were describing with the INBOUND event as well. So I think that's, you know, what an incredible way to use simple data points.

A/B testing, of course, very important that when you are doing your testing, you only change one thing at a time. So obviously, if you want to be able to determine which picture worked, you need to keep those two versions otherwise identical, so that you can compare and contrast. Sometimes people forget that and change multiple things, and then it's obviously very difficult to tell what worked. Yeah, indeed. Yeah. And multivariate testing.

You need good tech to help you with that really, or someone who just really likes a lot of data, maybe. So the only thing I was curious about is: how are you tracking click-through on your emails? Like, do you have an email sending platform, or are you using Outlook and just tracking links, or how are you doing that? So we are using an external platform: Dotdigital. On the platform we can see, you know, when an email was sent, did the person open it?

How many times did they open it? If there was a link in there, did they click the link? And then on our own in-house systems: these links will typically come through to our LMS or LXP, so we can then see, okay, the person clicked the link, did they actually do the course? Right? So we can look at dropout all the way through: from people who opened the email and done nothing, to people who opened the email, clicked through and then done nothing, to people who opened the email, clicked through and then done the course.
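That open, click, complete drop-off is just a funnel count. A minimal sketch with made-up event data; a real version would read these events from the email platform and the LMS rather than a hand-written dictionary.

```python
def funnel(events):
    """Count how many people reach each stage in order: opened the
    email, clicked the link, completed the course. `events` maps a
    person to the set of actions tracked for them."""
    stages = ["opened", "clicked", "completed"]
    counts = {}
    reached = set(events)
    for stage in stages:
        # Only people who passed every earlier stage can reach this one.
        reached = {p for p in reached if stage in events[p]}
        counts[stage] = len(reached)
    return counts

# Hypothetical tracking data for four recipients.
events = {
    "ann":  {"opened", "clicked", "completed"},
    "ben":  {"opened", "clicked"},
    "cara": {"opened"},
    "dev":  set(),
}
print(funnel(events))  # {'opened': 3, 'clicked': 2, 'completed': 1}
```

Comparing counts between adjacent stages tells you exactly where people fall out: subject line (not opened), email body (opened but not clicked), or the course itself (clicked but not completed).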

So, so we track it at all those points. Oh man, that is a superpower that L&D often don't get, which is having an email tool. And I think Dotdigital is even a bit broader now; it used to be dotmailer back in the day, and I think they've expanded their prowess a little bit. But was it difficult to get buy-in, to get something like that in the L&D function? Did you get kickback from internal comms? What was the process for that? Fortunately not.

So there was a demand in the business anyway to track emails, right? We've got a comms team that sit across multiple countries. They send out mails in Outlook, and, you know, they don't know who's engaging with that content. Right. So actually it was fairly easy to say, well, we've got all this data about our people, and, you know, it relates to learning activity. Okay, we want to talk to those people.

If we were to use a tool that allows us to not bother a lot of people, right? And one of the main benefits is just not bothering people, not sending emails to people. If we have a tool, would you guys be supportive? So yes, of course we would. Essentially, we then brought in the tool for L&D, but it's now used by lots of different functions in the business, right? Our finance function are using it. We've got all these different accounts set up. So that's how we do things.

You know, I personally think that that's transformational, like genuinely being able to so quickly generate a list of people who opened an email about sales. Or, like, you know, we've got HubSpot.

So we can, like, if we're sending out promotional emails, we can then build a journey that resends to people who haven't opened, with a different subject line. You know, there's so much power in that data and the automation aspects that come with it as well.

And it's the fact it makes it easy for non-data people. You know, as I was saying earlier, because we've got all the data within the back end that identifies different attributes for people, if we wanted to send a mail to men who have been in the business for less than six months, who work in marketing, who are interested in presentation skills, who have, you know, the name John, we can select that criteria and send it to those two people, right, really easily. And anyone in our business who has access to the platform can do the same, right?

It's just a really easy way of narrowing down, so we're only talking to the people we want to talk to. That's just, oh, the dream, the dream. I, you know, I, I, I, I just, That makes so much sense as to why you're getting so much impact because that tool or any tool of that ilk just makes things so much more quick and just enables you to do marketing in a way that we can't when we just are doing things like Outlook or you can still do some stuff, but it makes it so much harder.

It's a little bit clunky. But I was going to say, right, I've got the luxury of being in a big company where we've got lots of toys to play with, essentially. But in smaller companies, there's nothing stopping people doing this, right? From saying, okay, let's look at the skills data. Okay, who do we want to send this mail to? Who says they were interested in X, Y, or Z, right? Or who's talking on Yammer about X, Y, Z, whatever. Right. We pull out that data.

Here's 150 people. Okay. Let's send 75 of those people this, 75 of those people that, in Outlook, right? And we know who we sent it to. And then we know who signed up. So even though we don't see them clicking on buttons, we know what proportion of the people we sent it to signed up. Yeah, exactly. And it takes a little bit of time, but it's nothing that any company can't do.

Yeah, and it's worthwhile because you're refining and improving everything that you do all the time, so you're learning. You know, I always say, especially in marketing, like when you were talking about A/B testing: I used to always predict which one was going to win, and I was always wrong. And I think that's the beauty of it: we can't predict our audience's behavior. We need data to inform how we build, how we create, and how we iterate and evolve all of our effort, which is so much more fun than sitting at a table, right?

You've got two people with an opinion saying, I think the email should look like this; I think the email should look like that. Okay, we'll pick one and we'll send it out. That's how most businesses do things. No, send out both. Make a bet, right? Whoever wins gets a coffee. Fantastic. It's great fun. Yeah, I'm with you. I love data too.

And because it is a superpower, it helps us so much in everything we do. So we've talked a bit about really using people data to understand our learning function, our people function; understand what it is that we're doing, what we should be creating; making sure that there's that relevancy piece, that business impact focus. We then talked about the fact that we need to use data to raise better awareness of these products, because people don't know what they don't know. They're busy.

We need to get their attention. We need to interrupt. How can we best do that? Things like segmentation, personalization. Let's look at that final piece of the pie that you talked about in terms of measuring impact or efficacy of learning. Yeah. The really scary bit that we don't like to ever touch. What should we, as learning professionals, like, what are the metrics that matter? What should we be capturing? Where should we be benchmarking? Totally. Right.

It depends how sophisticated we want to go, right? But as a fundamental, you know, we want to be capturing the horrible, the horrible metrics of who's completed what and how happy were they? Not that these things matter, right? But these are data points that we can then attach other metrics to. We then look at the performance of these people, right? So we have to know that stuff. But ultimately the metrics that matter are people's interests and their existing skill level, right?

Pre and post training, right? That's essentially a key thing. Their behaviors, and I'll come back to behaviors in a minute because that's a slightly more advanced thing. You know, the performance. So if someone's working in sales, what were the sales before, what were the sales afterwards? Obviously, we need to know that, right?

But when I mentioned behaviors there, okay, and we're now in 2025, good heavens, gosh, already, yeah, Trump's about to come in and all that good news, right? Yeah, um, you know, we've got super rich data in orgs now, right? If an organization is using Microsoft, for instance, we can look at all that behavioral data, okay? So that's really interesting data from a metrics-that-matter point of view, okay?

If we think about performance data, that can be hard to get sometimes, or hard to identify the specific metric we want. However, it always exists in the business, right? If we want to get better at sales, we'll look at sales. If we want to reduce errors in our process, we look at the reduction of errors in that process, whatever. That always exists.

The thing I hear from L&D is that, you know, if we're putting people through soft skills courses or communication courses or how to coach people, all that soft skill stuff, that's really hard to measure. It's not anymore, right? We can see when people interact with Microsoft now. We can see, you know, how long does it take them to read every email? How long does it take them to write an email? When they're talking to people,

Who are they talking to? Is it people more senior than them? Is it people more junior than them? Is it people outside their business area? All of that data is there. And again, you know, if we think about the changes we expect to see as a result of learning: one, skills growth, you know, performance change, cultural change, whether people talk about things differently on social channels, etc. A big one is going to be behaviors.

And we're going to want to see, are people actually behaving differently after that course than they were before that course? More importantly, are they behaving more differently than people who didn't go through the course, right? Having that controlled trial element to measuring stuff.

So in terms of metrics that matter, you know, it's skills before and after, it's culture before and after, it's behavior before and after, it's performance before and after, um, and human networks, of course, have they grown. These are the things. This data is mostly available in most businesses, and if it's not, you know, skills data is a survey away.

I think it's interesting earlier on when you said, you know, when you go from completions to what happens after the completion. We don't tend to go beyond that, though. That's usually where our measurement ends: they did the thing, and that's the outcome, you know. I don't really follow the Kirkpatrick model or anything like that, but that, you did the thing, that's like bottom level, isn't it?

Like we're saying people did the thing. But again, it kind of goes back to what I was saying earlier. What's the purpose of L&D? We don't want people to just do things, we want to impact performance, right? We want that performance to be related to business value. And so there has to be an element of perceived value from the employee, but also a value add to the business, whether that is

reduction in turnover, reduction in time to competency for X, Y and Z, increase in sales and productivity, whatever it might be. And it's so important, if we think back to the marketing element, okay, of when we send out communications, we'll send out two different versions and we'll run an A/B test, okay? What we're not doing enough of in L&D is doing that with content as well, okay? So we might build a course and then we put everyone through that course.

Um, and typically that course then isn't measured, right? We just say high five, everyone. Well done. I'm sure it was magnificent. Okay. But even if we did put everyone through that course and we measured it, right? And we saw, you know, sales went up by 3%, right? We want to know firstly, okay, for everyone who didn't go through that course, did sales also go up by 3%? In which case the course offered no value.

But we also want to know what in that course worked and what didn't, right? So if we run A/B testing within our program, so let's say, you know, we've got a half-day workshop and we've spanked 20,000 on building a video to get a celebrity in to say, hey, this is really important or whatever, as part of that half-day workshop. Okay. Let's run that for half the participants with that video and half the participants without that video. Right.

And then look at measurement at the back end, right? Let's deploy different versions of learning. But as more AI comes in, okay, learning is going to become more and more tailored to individuals. One course that goes to everyone just isn't going to cut it anymore, right? People's expectations are going to be well beyond that. Okay, so why aren't we doing that just now?

And when I talk to people in L&D and say, okay, can we run an experiment with this course to see, you know, what would happen if we'd done X, Y, and Z? What happens if we change it a little bit and run two side by side? The answer is always, yeah, but if we do that, a group of people are going to be getting a worse version of the course. Some people will be getting a good version, some people will be getting a worse version.

It's like, yeah, so is your solution then just to give everyone the worst version? But we don't know it's the worst version. Yeah, we don't know it's the better version either. You're just putting something out and we've no idea if it's good or not. Right. So this idea of, you know, we can't build multiple things, or, you know, we don't want to learn from what we do, is for the birds as far as I'm concerned.

Yeah, well, again, I think burying our head in the sand doesn't help us understand what we're doing. And the way I see L&D being, I guess, framed in a lot of organizations is we've got X amount of budget. Let's spend it. We've got a learning platform.

Let's chuck a load of, um, on-demand and off-the-shelf learning solutions, LinkedIn Learning, Udemy, Coursera, chuck all those in, and then we'll supplement it with some leadership programs and management programs, obviously onboarding. Cool. We're all set. And the reality is, you know, again, it comes down to, like, who's that for? Like, how is that all linked to anything from the employee's perspective?

Like, here is all of the things in the world that you could ever want to know. Oh, cool. You know, I spend 20 minutes scrolling on Netflix at times trying to determine what it is I want to watch next. You know, I don't think that any employee in the world is going to do that. Like, I was literally speaking to a vendor not too long ago and they were talking about their platform, and they were saying learners can come on here and browse, and I was like, they don't do that.

Like they don't come on and browse the LMS. Oh, I just fancy learning something. Maybe like 2 percent of the organization, who are those very eager learners. So we need that data to really understand, you know, what is important to people too, you know, because to my mind, we should be identifying skills that are universal across the organization, skills that everybody who works here needs to have.

Building out effective programs that support and enable that so that learning is easy, relevant, accessible, they understand the why of all of that. And then there is role-specific education that is really important too, to help people be proficient in the role that they're in, but also maybe help them accelerate their career or move where they want to in the business, whether that's horizontal or upward.

Unless we're gathering data on what's happening, both at that stage one and stage three, we're just doing all this stuff blindly. Madness. Madness. Yeah. You know, and it comes back to understanding why we're doing it, you know, and if we take a marketing mindset, you know, if you run a campaign or you produce something and you throw it out to your audience and it doesn't work, right, marketeers celebrate that. Yes. We know what doesn't work.

And they do something different next time. L&D will throw something out. Did it work? We don't know. Let's do it again. Okay. Brilliant. And yeah, people don't have time for that. Okay. We're all becoming really, really, really busy. Yeah. No one goes to an LXP and says, right, I'm going to do a course on Excel. Fantastic. There's 50,000 Excel courses. Let's look through all of these. No, they want to be presented with the one that they need.

Okay, and we know they need it because we've seen them in Excel struggling, or they've put up on the social channel, on Yammer or whatever it's called now, Engage, saying, hey, can anyone help me with a lookup formula in Excel? Oh, someone said, can you help? Here's a course on lookup formulas in Excel. You know, that data exists in our business. Someone has typed in something and told us what they need. Are most functions doing anything with it?
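A crude version of that "they told us what they need" signal is just keyword matching over social posts. The post text, keywords, and course names below are all made up for illustration; a real pipeline would pull posts via the platform's API and use something smarter than exact word matching.

```python
# Map keywords to a hypothetical course catalogue.
COURSE_KEYWORDS = {
    "lookup": "Lookup formulas in Excel",
    "vlookup": "Lookup formulas in Excel",
    "pivot": "Pivot tables in Excel",
}

def recommend_courses(post: str) -> set[str]:
    """Return courses whose keywords appear in a social post."""
    words = post.lower().split()
    return {course for kw, course in COURSE_KEYWORDS.items() if kw in words}

post = "Hey, can anyone help me with a lookup formula in Excel?"
print(recommend_courses(post))  # {'Lookup formulas in Excel'}
```

Even this toy version makes the point: the demand signal is sitting in text people have already typed, waiting to be matched against what L&D already offers.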

No, you know, they will just wait for someone to come in, a senior leader and say, can we have a course on X, Y and Z? And we just need to be much more tailored and much more agile and proactive. Yeah, and much more comfortable with pushback.

But unless we have this data and we have a strategy that's effectively aligned to the things we've already discussed, it's very difficult to push back to the senior stakeholder that's saying, I need this course on X, Y, and Z, because if you have all those things, you can say, well, hold on, I've got a strategy that's been approved at board level. And what you're asking me for doesn't align with that.

And therefore we can't commit resource and time to that. But we don't have the ability to say no, because we don't have the data to say, that course is going to be redundant. No one's asking for it. You just think we need it. Interesting question that, um, that I've been mulling over. So obviously you used the Excel example, right? So that to me is just-in-time learning. I do that stuff all the time across the web. I need to know how to do something right this second.

I'm going to Google it or use ChatGPT or whatever, and then it's going to help me do the thing. I don't ever need to retain that information because I know I can go and find it when I need it. So that's kind of like just-in-time learning that people don't need to necessarily commit to memory. Then there's obviously skills development and reskilling, which is a kind of a different thing.

When we're talking about measurement in learning, do you think we need to measure everything? No, we need to measure things we're spending lots of money on. Okay. So if someone watches a, you know, short-format video, in your example, to say, how do I do this thing that I need to do just now, it doesn't matter, right? It matters for the individual that they can do what they do. If we're spanking, you know, a million dollars on a program, yeah, that's worth understanding, right?

No, I mean, fundamentally it comes down to how much. So when we measure stuff, okay, there's different levels of effort we can put into these things. Okay, so some things we don't want to measure at all, because frankly, it doesn't matter. It's one person doing one thing for one task. Fine. If they're happy, they're happy. If we're doing a program or a course, right? How much effort do we want to put into this?

And that depends on how important it is. If it's not very important, but we do want to measure it somehow, anecdotal feedback is fine, right? Just, you know, did you find this useful? What was good about it? All that stuff, right? Anecdotal evidence is cool. If it's slightly more important and we want to justify a bit more spend, perhaps then doing a trend analysis: saying, okay, what was the performance of this group of people before?

What was the performance of this group of people afterwards? Did it increase? Yes, it did. Was that down to training? Probably, maybe. We can see it's gone up a little bit, right? That's a bit more involved to do than anecdotal evidence, but it's still not a tricky problem to solve, right? It's just looking at a metric for whatever the course was supposed to affect.

But if it's super important and we're going to say, right, are we going to invest huge amounts of money in this area, or do this for a bigger population, all that stuff? Then we want to do a controlled trial, right? Which is to say, let's give it to some people who need it, and let's not give it to some other people who need it. What was their performance before? What was their performance afterwards?

Controlling for other factors that might affect it, making sure people in those groups are the same. Can we say that any improvement was down to training? Yes, we can. Right. We can see with 95 percent confidence that it is a result of the training and not a result of the weather changing or time elapsing. So we don't have to measure everything, and everything that we do measure doesn't have to be a controlled trial. Yeah. But we do have to think about what we're doing.
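One way to put a number behind that "95 percent confidence" claim is a simple permutation test on the two groups' post-training uplift. The figures below are invented for illustration, and a real analysis would also control for baseline differences and other confounders, as Derek notes.

```python
import random
import statistics

def permutation_test(treated, control, n_iter=10_000, seed=0):
    """One-sided permutation test: estimate the probability that the
    treated group's mean uplift could have arisen by chance."""
    observed = statistics.mean(treated) - statistics.mean(control)
    pooled = list(treated) + list(control)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # randomly relabel who was "trained"
        diff = (statistics.mean(pooled[:len(treated)])
                - statistics.mean(pooled[len(treated):]))
        if diff >= observed:
            hits += 1
    return hits / n_iter

# Invented sales uplift (% change, pre vs post) per person.
trained   = [3.1, 2.8, 4.0, 3.5, 2.9, 3.7, 3.3, 4.2]
untrained = [0.4, 1.1, 0.2, 0.9, 0.7, 1.3, 0.5, 0.8]

p = permutation_test(trained, untrained)
# p < 0.05 corresponds to claiming the effect at the 95% confidence level.
print(p < 0.05)  # True
```

The appeal of a permutation test here is that it needs no distributional assumptions: you just ask how often a random relabelling of "trained" and "untrained" produces a gap as big as the one you actually saw.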

And ultimately, if we don't know what we're doing, if we don't know what we're trying to affect, if we don't know whether we're trying to improve skills, or what skills specifically, or we don't know what business metric we're trying to improve, or we don't really know what behaviour we're trying to change, we probably shouldn't be doing it, right? We should just be saying no. Any money we spend here is likely just to be money down the pan. Why would we do that?

Saying no is the best value people in L&D can bring the business, a lot of the time. Yeah. And even things like, you know, training needs analysis, that type of thing, I find that really problematic because it's kind of like the issue I have with the word learner. It's quite loaded. It implies that

there is a need for learning, that learning is the solution to the problem, rather than, I suppose, taking a step back and trying to understand, okay, if this is a business problem, how could learning support that? Is learning needed to support that? I think unless we're really understanding what it is the business is trying to accomplish, we will always end up in this kind of aimless, meandering role where we're just

serving learning opportunities blindly, because we don't really know what it is we're trying to attain. And furthermore, even when we are building out training programs, blended programs, leadership programs, whatever they might be, we're not benchmarking. So we're not saying what was happening before this happened. You haven't drawn your line in the sand. You don't know how far you traveled, if you traveled at all, if you went bloody backwards.

You know, it puts us at immense risk because, of course, we are blind. And even if, you know, leadership aren't asking you to evidence it anymore, we're seeing that L&D is losing value perception because it is losing its seat at the table, it's losing its stronghold, what little it had in businesses.

Yeah, and to prove that value, we have to get right up front, before any work takes place, to understand there is a problem and how we are going to solve that problem, right? And we use data to do that. I mean, the biggest bugbear of mine, and we built a bot now to deal with this stuff, is people taking a demand from the business, going off building something, delivering it, and then coming to me afterwards and saying, can you tell us if it worked? Right.

Well, what was it? It was a course. Okay. What was it supposed to change? What business metric? We don't know. Can you tell us? No. Right. Can we just look at everything and see if something changed? No, we can't do that. You know, it's just that lack of understanding, the clarity up front, of why are we doing what we're doing, right? What are the needs of the people? If we ignore learning platforms and we ignore all that, this is a people business.

Okay, we've got people who need something. What do they need? Well, let's use data to identify that need, right? And let's talk to them to understand how they want that need delivered. Or if we can't talk to them, right, let's use the data points to see what channels do they interact with, right? Historically, are they more into videos or podcasts or whatever, all of that stuff, right? And working from there, understanding the problem you're solving, work it through.

And then at the end, you can see, did that thing change? Right. There's no use coming in at the end and saying, ah, we've done all the fun stuff, can we get some data? Because the answer will be no, there's no point. Absolutely. And you run the risk of not having the data to measure anyway, right? You know, I always use Google Analytics as a great example. If you don't have that installed on your site or your platform or anything, you're not collecting data.

So collect the data, and when you need to use it, or you want it, at least you have it, rather than trying to find it once you've decided it's something important. So this has been such an interesting conversation. Um, I love your view on how important data is, really, in order to ensure the longevity and success of any people function. I think, you know, this is kind of the message. One final question for you. I'm just conscious of time here.

What's your, obviously I know you have a broader view than just L&D, but most of the people who listen to us are, you know, well, they're in people functions as well. I expect it's predominantly L&D. We should probably get some data on it. Um, you know, it's mostly an L&D audience, but likely people in HR and such like. What's your prediction for the next year or two in the L&D space in particular?

I, you know, I know when we spoke previously, you had quite a strong opinion on where you think this industry is going. It'd be interesting to hear what you think.

Yeah, so I'm really bad at this, right, because we have what's possible today and then, as I mentioned earlier, we have the pace of large organizations adapting to that, okay, and also the pace of technology is now moving super fast. But ultimately, I think the second half of this year, okay, and this will look completely foolish in the second half of this year, okay.

I think that people working in L&D are going to start realizing the existential threat from AI. Okay, I still don't think it's kicked in properly yet. Okay, and in the second half of this year, AI coaches are going to be the thing. So some companies are already developing, as third-party suppliers, AI coaches. Okay, these can be developed in-house now really cheaply and easily. I woke up one morning just before Christmas.

I was getting loads of phone calls from people saying, how do we measure stuff, right? They're becoming more accountable now, you know, they have to prove their value a bit more in our business. So, you know, am I designing a course on how to do it? I was getting too many calls. Right, within 10 minutes I'd built a bot in Copilot Studio that now sits in Teams, and when someone wants to talk to me about that, up it pops and it starts chatting with them, saying, have you considered this?

Do you have baseline metrics? No? Okay, maybe push back on it. Okay, you do have baseline metrics. Here's some metrics we recommend. Tell me more about it. It has that conversation. It'll come up with, you know, a plan of how to run a controlled trial or gather anecdotal evidence, whatever. So AI coaches are going to be completely normalized and everyone's going to be using them because they're so easy to build now. And moving into 2026, we're probably going to see more of a marketing mindset.

Okay, and I would expect AI to start identifying audiences. So at the minute we use data to identify audiences, and our L&D professionals will go in and say, you know, show me the people who are weak in this skill, who have been in the business this long, etc. And they're manually picking these segments to send comms to and design things for. AI will be doing that for them. Um, ultimately, we'd be looking for that next year to be building content, right?
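That manual segment-picking, "weak in this skill, in the business this long", is, at heart, just a filter over people data. A toy version, with invented records and a made-up self-rated skill scale, might look like this:

```python
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    tenure_years: float
    skills: dict  # skill name -> self-rated level, 1..5 (hypothetical scale)

def pick_segment(people, skill, max_level=2, min_tenure=1.0):
    """People weak in a given skill who have been in the business
    long enough to be worth targeting with comms."""
    return [p for p in people
            if p.skills.get(skill, 0) <= max_level
            and p.tenure_years >= min_tenure]

# Invented records for illustration.
people = [
    Employee("Ana", 3.0, {"excel": 1}),
    Employee("Ben", 0.5, {"excel": 2}),   # too new
    Employee("Caro", 4.0, {"excel": 5}),  # already strong
]
segment = pick_segment(people, "excel")
print([p.name for p in segment])  # ['Ana']
```

Derek's prediction is essentially that an AI layer writes and runs queries like this one itself, rather than an analyst choosing the thresholds by hand.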

And serving up the content to those who need it, right? And sending out different versions of the content, right? To see which one actually got the highest engagement. Just like marketers do, you know, with adverts on websites, right? You throw it out there, see what sticks. I think that's going to be coming into L&D a lot more. But the automation of the creation of content, um, which, you know, is a risk to people who aren't proving their value, frankly.

Um, we'll also see non-text agents. So at the minute, it's very easy to build that chatbot, you know, that will coach you through. I've got a problem. Explain your problem to me, it might say, and then you explain your problem and it says, have you thought about this? Right? We're going to be talking to avatars on the screen, obviously, and it's going to be conversations like this, right? In real time. I say something, it responds, right? So we're going to move away from text-based stuff.

And then 2027 is probably, um, complete content autonomy. Um, L&D functions, yeah, I would then say at that point, L&D functions will struggle to show that they're not just really expensive groups of people in a business delivering not very much. Um, again, if I say it's two years out, it might be five years out, right? It's a big industry and it's fairly slow to change, but yeah, AI, AI, AI.

Yeah, well, you know, I mean, the pace of change with it is just astonishing. And I think, you know, for me, AI is not the enemy, but we need to understand how we use it as a tool and how we bring it into our workflows, rather than looking for it as a solution to replace everything. Where, I guess, for me, where does the human brain add value, and where can AI

do all the grunt work, you know? Where can all the heavy lifting be done by it? You know, I can put a data set into AI and it can understand it in two seconds. I'll land at similar outcomes, but it will take me a lot longer because my brain processes more slowly. Um, so yeah.

Okay. Well, that is a really interesting parting thought, you know. We are at risk, and that's abundantly clear based on where we're stood, and it's interesting to hear you say the same thing. You're not the only one of the folks I've spoken to saying similar things. So, interesting thoughts. Thank you so much for coming on the podcast. I have followed you on LinkedIn for years and I've been like, Derek, come on the podcast, come on the podcast, Derek, come on the podcast.

I've been following what mass marketing have been doing, and what you bring to the industry is massively needed. It's that whole, right, let's use data to do stuff, right? Let's think about things like marketeers. You're beating the right drum and you're well ahead of the curve. Thank you. Yeah. And I think that is part of the problem. Sometimes actually people just aren't there.

They're not ready to understand that we just can't keep on doing what we're doing. We will not survive if we continue the way that we are, and that is the truth of it. We will go. So thank you so much for your time. No doubt people listening have found this incredibly interesting because you just have a really unique view of things. You're so data-led, and that is uncommon. And so thank you so much for coming and sharing your perspective. I really appreciate your time. Awesome.

Thanks very much. Thanks so much. Have a great day. See you later. You too. Bye.
