Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources.
Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization. This podcast is sponsored by Salary.com, your source for data, technology, and consulting for compensation and beyond. Now, here are your hosts, David Turetsky and Dwight Brown.
Hello and welcome to the HR Data Labs podcast. I'm your host, David Turetsky, and as always, we try and find brilliant people inside and outside the world of HR to bring you the latest on what's actually happening. Today, we have with us Chris Taylor from Actionable. Chris, how are you? I'm fabulous, David. Better for being here. How are you? I'm good. I would never say fabulous because I kind of, I don't want to be, and I'm not trying to put this on you, but
I actually don't even know the concept of fabulous in terms of how I am anymore. I mean, I've been to the Rangers winning the Stanley Cup. That was fabulous. I've been there when my kids were born. That was fabulous. No offense, dude, but this doesn't rise to that level yet, but we'll see. Maybe at the end of the podcast, I will be. Maybe. Something to aspire to. Yeah, yeah. It's good to have aspirations, as they say. Chris,
Chris, tell us a little bit about yourself and about Actionable. For sure. So I'm fabulous because I'm just back from Lisbon, Portugal, where I did very little but eat incredible pastries. So I'm still basking in the glow of Portuguese baked goods. When I'm not in Portugal, I run a company called Actionable. We help shine a light on the impact of corporate training programs.
We support learners in putting ideas into practice and bring the reporting data back to the stakeholders so they can actually see which needles have been moved by that program. You know, we're aspiring towards that holy grail, the ROI of corporate training. Which is what every CFO asks for every time we ask them for another dollar. Exactly right. That's it.
That's really good. Well, it's really an important thing. So we should be talking a lot about that measurement today. But first, before we do, what's one fun thing that no one knows about Chris Taylor? So I'd actually forgotten about this. And then my wife reminded me, I am a direct descendant of Sir Francis Drake, who was a famous pirate. Yeah. Yeah. So, wow. Right. Take that. Exactly. It goes against my Canadian sensibilities, but I'll take it. Yeah.
Yeah, well, that's actually pretty amazing. My mom was really into genealogy when I was a kid, and going back far enough she discovered that there was a great, great, great, great, great, great, great uncle. So do you have any ideas? Is there a good bead on where he kept the treasure, or is that all gone? No treasure. But you know what? Until about two generations ago, we actually had a piece
of the Golden Hind. Is it the Golden Hind? Is that his? Or is that Magellan's? Anyway, I think the Golden Hind. His ship that he was on, we had a piece of that. Almost petrified wood. I know, right? And then someone lost it in a basement flood, as happens. So, yeah. Wow.
Yeah, that kind of sucks. I don't know how you transition out of that, David, but that's the story. Well, we'll do our best. But it's actually kind of funny that pirates have become such a big thing in the movies and on TV that that's a badge of honor. I am honored to be in your presence. So I'm already more fantastic than I was at the beginning. Perfect. We're just inching it north. That's great. Hey, listen, an inch at a time will get us there.
It's all progress, Chris. It's all progress. But now let's talk about our topic, which is near and dear to a lot of people who listen to this podcast, which is not just trying to figure out how to make learning work, but actually figuring out how to measure behavior changes and how to create what you call learning interventions.
So, Chris, you focus on supporting and measuring behavior change in talent management and talent development programs. What is behavior change, and why does it matter? Behavior change, for me, is the first possible moment that we can actually start to measure something as far as efficacy is concerned.
I think as an industry, we've gotten real good at the smile sheets, the eval forms on what happened in the room. And I think that the best we can capture on there is the intention that a participant might feel towards putting ideas into practice. And that's all well and good, but it's not actually going to create impact until people start doing something with it.
So I've been really obsessed for the last 16 years on that first step. Once people leave the room, literally as they're going home from the session or after they've clicked out of Zoom or Teams,
Then what? What's that first piece of movement that breaks the inertia of the status quo? I'm fascinated by it, and I think it works in the favor of both the participant and the organization. So how do you judge a baseline? Because for measurement, we always want to know what's the beginning and how did it move. Is it the efficacy of the skill, or whatever was being trained, prior to? Is it an assessment that needs to happen at the beginning, or is it something that you have to observe? Well, okay. I'm so glad we're going here, because I get that this is the foundation of the industry, right? We need a baseline. We need to show change compared to something else, ideally external measurements, right?
Behavior change, if we go down to its true nuclear core, is a deeply internal, deeply personal thing, right? In many cases we're shifting mindset, or even just awareness, before we're actually shifting action. And so, without taking away from the value of external validation, what we focus on in behavior change specifically is the individual's self-assessment of where they're starting from and how they're progressing.
We're not trying to get to an absolute measurement on this. What we're trying to do is just bring to the surface something that was historically invisible. So the way that we go about this in a really simple fashion is at the end of the session, participants will commit to a behavior change, a habit that they want to establish, something that they want to do differently moving forward. And then they rate themselves one to 10. How do you feel you're doing with this currently?
The number doesn't matter. What matters is that it creates a mental placeholder for them to say, I'm at a four and I want to improve. Then as they self-reflect over the coming days, weeks, month, we'll typically on average collect about, well, 8.2 data points per person over the following month to be able to say the individual believes that they are making this sort of progress. Is that empirical by itself? Of course not, right? And on an individual basis, the numbers don't actually mean anything.
But directionally, being able to see the self-reported trend allows us to shine a light to say, okay, now, a month later, two months later, if we start looking for observable change, asking the people around them to review their progress on that specific area, can we see a correlation between the two? And in our experience, the overwhelming answer is yes, we absolutely can see that. When the individual feels they've improved in an area, a month or two later, those around them will be able to see it as well.
Four to six months later, they'll start to see the ramifications in externally measured KPIs. I guess the question I'd go to is: there are a ton of training programs
that all try and give different types of learning. Some of it might be required. Some of it might be licensure. Some of it might be health and safety. Some of it might be, you know, job duties. It also could be culture change. You know, the company's going through massive change. They do change management on something. And, you know, that's also in there.
Is there a sweet spot where this is more useful than others, or is this applicable across the board for learning? Yeah, okay. So I'm going to take that through two lenses. One is the content and the second is the audience. From a content standpoint: anywhere that new information is being communicated to individuals, and where there would be benefit to the individual and the organization in people making regular changes to their daily practice, if we can get it down to a daily practice, then yes, there's absolutely value in pursuing this, and this we can get into. From an audience standpoint, the same thing applies, but there needs to be desire and intent.
I've yet to find a piece of tech on the planet that will take someone who is obstinately refusing to change and suddenly, magically, have them taking action. Right. Sure. The role of the facilitator, of the intervention itself, is, in my view, to transfer the information and to do it in a way that helps the learner find a "so what" for them, so they want to change. Right.
Assuming those two things are true, then yeah. Last year there were about 3,000 programs that we were involved in globally, everything from health and safety to culture change to soft skill development. I think where it doesn't work super well is where the goal is not behavioral change. So if we're doing a technical training on something we might need to utilize in case of emergency once a year, then no, we're not trying to change any behavior. But for something that, even, you know,
Sorry, I'm rambling a bit here, David, but take the way I think about diversity, equity, inclusion, which was obviously a big focus several years ago. It didn't work (the training, not the movement, although there could be comments on that) because we were focused on exception-based training. Right. As an industry, we were focused on: when you see this bad thing, or when you experience this bad thing, do this.
We did find that the DEI programs that shifted that focus from exception-based activity to something that I can notice, plan for, or be deliberate about on a daily basis absolutely worked. So that's the whole trick: how do we get it to a place of daily practice? I think when you bring up the DEI world, the thing that really sticks out for me in terms of training is when you have to train for inclusion.
And you have to make sure that we're dealing with inclusive language, which is a very big behavior change. It means stop using these words, and in many circumstances, stop calling people or things these types of pronouns. And we can get into the politicalization or the politicization, whatever, however it's called, of those things. You know, I can't say it. But that's not the point. The point is, when you're in a work situation, show respect, and this is how to do it. That's the one piece or the caveat I would take away and say that's definitely behavior change, and it could show a measurable difference in how people feel and what their engagement scores are. Yep, that's exactly it. And I'm glad you brought up engagement scores, because that's
probably the softest of the... now I'm doing it; it's catching. As long as you're not getting my COVID, then you're going to be okay. Thank goodness for virtual interviews. Yeah. But employee engagement scores, that sense of psychological safety (do I trust my team? do I trust my leader?), that's a great example where there may be specific scores in that 76-question juggernaut that the organization really wants to move the needle on. Yeah.
If we can back out of that, or extrapolate out of that, which behaviors, if shifted, should move the needle on those specific questions, and then we target training around new ideas and create the space for people to explore why engaging in that behavior change would make the most sense, then, time and again... I've lost count of how many culture change programs we've been involved in where that's been the goal, right? We want to move the needle on this employee engagement metric. Like what you hear so far? Make sure you never miss a show by clicking subscribe. This podcast is made possible by Salary.com. Now back to the show.
And for everybody in HR, when we're asked how do I improve productivity, how do I improve engagement, one of the first things we do is ask the second question we'll need to focus on, which is: how do I really measure it? Because those engagement surveys, no offense, but they suck. I mean, to your point, it's not just the 76 questions. It's that a lot of times people think they're being tricked into answering the same damn question five different ways.
To be able to do what? I mean, give me a five-question survey: Do I like my company? Do I like my manager? Do I like the situation I'm in? Do I like how I'm paid? Am I going to leave tomorrow? There you go. There are five questions you can ask me to tell whether I'm engaged or not. But still, we're trying to measure something differently, right? I mean, isn't that kind of where all this is starting from? It's trying to do some pseudoscience or psych bullshit, pardon my French, on people.
Well, I think that's exactly it, David. You know, I watch the conversations on LinkedIn, which is my professional social nesting ground, and it's fascinating, because 80, 90% (I'm making up numbers now, which is not great for a data guy), the majority of the conversations just get so convoluted around how to measure this stuff. And it's 18 layers removed from reality, right? Yeah, I mean, this is a major, major challenge. So,
I think you can approach this from one of two directions, but not in the messy middle.
So direction one is top down, let's call it, where we start from the strategic priorities. And, you know, I've got a four circle visualization on this, but there's strategic priorities for the organization. If we assume that those strategic priorities involve achieving something that we've never achieved before, then it's going to require some new stuff. Some of that might be process. Some of that is going to be competency at an individual level.
Right. And then the organization can choose to buy those competencies, hire for it, or build those competencies, train and develop for it. Right. Like, again, I appreciate this is overly simplistic language, but that's kind of the goal. Right. Right. So if we can identify what we need people to be different at, competency-wise or culturally, in order to achieve that strategic outcome: what are the behaviors we need to make those competencies happen? What are the interventions we need to make those behaviors happen?
We call it the impact value chain. If we string those together (and it can get complicated quickly, but it doesn't have to), then we can at least show the through line of why we're doing the trainings that we're doing. We can say intelligently why the behaviors we're focused on are the ones we're focused on, and we can start to measure that way. The strategic priorities have X number of metrics that they're already tracking, right? And so that becomes our big lag indicator. The lead indicator becomes the behavior shifting. And then we organize the people data. We say, yes, all these people went through training, but guess what, HR and L&D folk: most executives don't care about the cohorts or the modules or the sessions or the experiences that people were in.
What they want to see is the data sliced by function, geography, seniority, all the stuff they measure everything else by. They want to see the leading indicator data of, can we see which behaviors are shifting for which clusters within that? And then can we map that to any movement in the KPIs? Do you want me to make this real or is this still too... This is pretty theoretical, right? To me, this is perfect because I live and breathe everything you just said. But to the listeners, I mean...
All those things make sense because that's what the leaders want. But when it comes down to HR actually executing against that. Yep. How do you do it? Yeah. So here's an example. We worked with a casino just before the pandemic, and I keep using it even though it's five years old now because it's such a clean example. So the casino...
had very high voluntary turnover. The industry has high turnover, but they were like an order of magnitude beyond the industry norm. They'd figured that it was costing them about seven million bucks a year in replacement fees on their frontline staff at the casino. So they were investing in a frontline leadership program. The core metric that they were looking for was retention on this piece. Right.
You're shaking your head, David. I don't know if people are listening. Tell me. Well, that's one way of going at it, but okay. Because I'm an analytics guy and retention, yeah, management and leadership training is great.
But is that really the issue? Right. Yeah, no, absolutely. And I think this is where we can get tangled, right? Because of course it's not in isolation; there are so many external factors. What we wanted to do, though, was to see, not even what percentage, but does working on leadership and management actually move the needle on the retention side? And so what they did was they took 176 frontline leaders that were going through this program, and
They were split (this is why I love it, it's so clean): they were either food and beverage or gaming, and they were the AM shift, PM shift, or graveyard shift. Oh, that's perfect. Right. So every manager neatly clusters into one of those six boxes. They'd done the impact value chain and said: what are the capabilities we want to develop? What are the behaviors related to that? What's the content that's going to drive those behaviors? Then they start running them through the program.
Each month, not based on when the sessions took place, but each month being able to say cluster A, the AM food and Bev group disproportionately self-assessed progress in this particular behavioral area. Are you seeing any correlation on voluntary turnover?
And of course, the answer is no, because it just happened. Right. And month two, no. Month three, no. Month four is where they started to see, in their words, enough correlation that they could say, interestingly, we're seeing an improvement in retention in the food and Bev shift. Let's look back to see what behaviors we were working on at the beginning four months ago.
And can we replicate that now across the other groups? So they used it not as empirical causality, like this is what caused this, but to ask: is there a potential role that this is playing, yes or no? And I think this is where the ROI of learning falls down: we're trying to get to a black-and-white, this equals this. When instead, if we can shine a light (not simply, but shine a light) and say, here's some stuff that happened, and as we moved through time some positive things happened, potentially as ramifications of it: is there enough correlation that we can double down on this? It makes our program smarter when we can at least look at which behaviors were shifting earlier. And for those of you who are reading my mind: as Chris was talking, one of the things I was going to ask him was, for that group A, was it causation or correlation? Exactly. And so...
There are lots of factors beyond just the training: did they change how they hire? Did they change who they hire? Did they get a better job description? Did they improve pay at all?
You can't just say everything else held equal because the world does not work like that. But I guess the question I wanted to ask you, in all of what you were saying, did they measure for those things as well or at least hold them as a control to be able to make those determinations along the way?
I certainly hope so, David. I didn't see it. I mean, it's interesting because we work mostly with consulting firms that work with their clients. So we're a couple of layers removed. And then this is where you get, you know, the internal culture. Like we were working with the L&D team. Are they actually exposed to some of these other pieces? I don't know. Right. I think.
The way I see it is, to come back to our earlier point about progress: can we incrementally increase the level of awareness and intelligence that's going into the design of learning programs? Sure. Well, but it's important to at least be able to have something, because the CFO is going to ask, how do I know it's working? You know, give me something. And to your point,
this gives them something, right? You've got to be able to say "everything else held equal," at least for them, and be able to make some sort of hypothesis: I'm making some progress because I'm trying to change these skills, these behaviors, and I'm seeing something, at least in these types of groups. So in the groups where we didn't see it, we're going to do something else. But I'm not going to change the fact that I invested, or will invest, in this for those groups that it did work for, to at least see that those are carrying through. I mean, to me, that just makes HR better, smarter, and a better partner with the CFO's office, because you give a damn whether it's actually working or not. Yeah.
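For listeners who want to see what the cluster-and-lag analysis Chris describes could look like in practice, here is a minimal, hypothetical sketch in Python with pandas. Everything in it is illustrative: the cluster names echo the casino example, the numbers are synthetic, and the four-month lag is just the figure mentioned above. This is not Actionable's actual methodology or data.

```python
# Hypothetical sketch: six manager clusters (department x shift), monthly
# self-assessed behavior ratings, and voluntary turnover a few months later.
import pandas as pd

clusters = [f"{dept}-{shift}"
            for dept in ("food_bev", "gaming")
            for shift in ("am", "pm", "graveyard")]

LAG = 4  # months between behavior shift and observable retention movement

rows = []
for ci, cluster in enumerate(clusters):
    for month in range(12):
        # Synthetic, deterministic data: ratings trend upward, and turnover
        # responds to the rating from LAG months earlier (illustration only).
        rating = 4.0 + 0.3 * month + 0.1 * ci
        earlier = 4.0 + 0.3 * max(month - LAG, 0) + 0.1 * ci
        turnover = 0.30 - 0.02 * (earlier - 4.0)
        rows.append({"cluster": cluster, "month": month,
                     "self_rating": rating, "turnover": turnover})

df = pd.DataFrame(rows)

# For each cluster, correlate this month's self-rating with turnover
# observed LAG months later (shift(-LAG) aligns the future value).
corrs = {}
for name, group in df.groupby("cluster"):
    g = group.sort_values("month")
    corrs[name] = g["self_rating"].corr(g["turnover"].shift(-LAG))

for name, r in sorted(corrs.items()):
    print(f"{name}: lagged correlation = {r:.2f}")
```

With real data you would see noisy, partial correlations rather than the clean trend built into this toy dataset; the point is only the shape of the check: a lead indicator (self-rating) per cluster, compared against a lag indicator (turnover) a few months later, as correlation rather than causation.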
Yeah, absolutely. There are two pieces I want to build off there, if I can. Sure. The first: I think it's so critical that when HR folks have this data for the first time, sometimes overexuberance can lead to statements of causality, right? And it's really important, for your own credibility and for the industry's credibility, that we're not stating causality. This is a correlation piece, David, as you were saying. Absolutely. The second thing, sort of building off your point about now we have something:
I mentioned there's sort of two ways to go about this. The ideal is to start at the strategic priority and work backwards.
And there's a whole bunch of L&D folks who are asked to go out and find a leadership program. Why? Well, because we need a leadership program. Okay. Or because we have budget we need to spend. It's like, oh, okay, that would be a great problem to have. Even in that case, if for some reason you're stonewalled from understanding how the program connects to the strategic priorities of the organization, right?
Being able to start from the content and show the behavior changes it's driving invites the conversation with the leadership team, because you're bringing data that's different from what you've shown in the past around completion rates and how much people liked the sandwiches. We're now able to say, self-reported and then ideally third-party validated: here are the behaviors that are shifting in the month following the delivery of this session.
What I find interesting with that approach (again, this is if you can't get strategic alignment early) is that when you bring that data forward, people will start asking questions that actually show what some of the strategic drivers in the organization are. Yeah. And then they invite different new training that gets at similar behaviors, and people start going, wow, this shit works. Turns out. Yeah. Yay. Our investments paid off. Perfect, job secured for another year. Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about this?
Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast, a free half hour call with me about any of the topics we cover on the podcast or whatever is on your mind. Go to salary.com forward slash HRDL consulting to schedule your free 30 minute call today.
This gets to our third question, which is how do you make sure that this stuff sticks? Because you can't just do training once and expect it to be forever. People change. Jobs change.
The world changes. Requirements change and we hire new people. People leave. So how do you make sure that these things continue? Do you have to just keep doing these trainings over and over again to the same people or to the different cohorts now? Or how do you make it stick?
Yeah, so I think there's sort of the micro and the macro on this, because the first piece that I look at is how do we make sure that people leaving the room actually put anything they took from the room into practice right afterwards, right? And then the second, I think, from sort of a quote-unquote cultural level, whether that's on a team level or department or organization, how do we make sure that this becomes more of a sort of a new normal moving forward?
So one of the things that I love is that because Actionable is involved in so many of these programs (saying we're "at the center" of them is a little bit of hubris, but we are involved in a lot of them), we're sitting now on about four million data points around what works and what doesn't in helping people actually drive behavior change following a session.
And that includes a number of the factors that exist after the session, but also what happens in the room itself. So one of the things that we stumbled upon last year was this three to one ratio. And again, I appreciate this sounds trite, but I also want to give some practical stuff that people can apply. What we have found is that when the content of the session...
is complemented by three times the amount of airspace for contextualization, i.e., not new content, but giving people space to turn it over and figure out, do I give a shit about this? You fairly dramatically increase the likelihood that they'll actually follow through afterwards, because you're helping them find the why that will break the inertia of the status quo, right? So does that mean use cases and practice and...
Like, so I get taught something and then I do three activities right there. Is that what you mean? Yeah. At least three, three times the amount of time. So if I've just taught 10 minutes of content, giving them half an hour to do, you know, sort of current state, future state consideration, how would this impact me? How does this impact us? How does this impact?
them? Those kinds of exercises. Most facilitators have their bag of tricks, exercises to contextualize, and that's what we're talking about. Even just journaling, as low-tech as that sounds: taking time to figure out, why would I care about this? How does this make my life better? Anyway,
So that's a key piece. Then there are a couple more things (there's a bunch, actually), and if anybody wants them, we do a free annual insights report that's got all of this stuff in it. David, I can make sure you have it. Yeah, that'd be lovely. If you can give me a link, I'll put it in the show notes. Yeah, for sure. Post-session, it's the social reinforcement. Can we have the team normalize the fact that we're going to be engaging in behavior change, and that it's going to be awkward for a bit?
Can we have that conversation in the room, bridging out of the room? Can we bring a specific accountability partner in? Remember those self-reported ratings from beginning to end? I should have these numbers in front of me, but it's more than a 40% improvement between the first and second ratings if...
I have an accountability partner who's actively engaged in the process. That's not your boss? That's not your boss. This is someone you chose to invite into your commitment to actually see and support you on that. Sounds like a 12-step process.
It's probably more than that. I'm being serious because that person's there to support you throughout making sure that you're living up to the commitments you've made. So I was not necessarily being tongue-in-cheek. I was being serious. Yeah, that's really good. I'm Canadian, David. My default reaction is to laugh at everything. It's like...
My grandmother died? I'm sorry... oh, wait, I'm actually sorry to hear that. That's terrible. I'm a hockey player, so I appreciate that. Kindred spirits. The other piece that's been interesting, and this is a mental shift for a number of HR groups: the facilitator continuing their relationship with the participants post-session is actually, percentage-wise, even more impactful than an accountability buddy that I chose.
What that looks like is the facilitator having visibility into what each participant is working towards and being able to provide commentary: celebrating or encouraging their progress, probing to understand (like digital coaching), or just reminding them of the thing that mattered to them, why they chose that commitment in the first place.
But does that mean that person needs to be internal to the organization? Or is there an added expense to have that person, who may be an external resource, continue the relationship? Yeah, so all things being equal, it would be an added expense. And there's a question to ask of the program: is the purpose of this training to deliver content, or is it to drive change? Right. If it's to drive change, maybe we should put some emphasis on what happens after people leave the room.
Having said that, it doesn't need to be much. We typically see a 10 to 15% premium over what the event costs, and usually that premium can be dug out of the extra pastries or the VR experience that we added in, or whatever. Hey, leave my donuts alone. Well, I'll liken it to this: I need to get my car painted. Am I going to get acrylic or enamel paint? Because the acrylic will wash off with the first rain. But hey, listen, I got my car painted.
But if you really want it to, sorry, this is where I was going with that. If you really want it to stick and to look good and to actually change. But no, seriously, you're right. This is not even just about ROI anymore. It's about, are you just trying to check a box of saying, yeah, we did the training? Or are you really trying to make sure that there's stickiness in this behavior change? And if there's that much change in the effectiveness of the training, gosh, you know, I'd write that check every day.
Well, I mean, this is the thing that blows my mind, right? Including the actual hourly rate of the participants in the room, the cost of that session is quite high, and then to do nothing to support it afterwards? And pushing content at people is not supporting the learning, right? That's where we get, well, we have a sustainment strategy: we're just going to blast them with 12 emails. Nobody needs more emails. Oh, let's send them a freaking book. Right. Yeah. That'll get read.
100%. Not, not 100%. Let's just test the measurement on that one. Hey, did anybody open the package? What package? The thing we sent you last week. Oh, I was supposed to read that? Yeah. And supposed to, too. I think this is the whole thing. So if we can shift the conversation in the room, allow more breathing room, less content, more context, we increase the likelihood of people wanting to change.
Then, if we support them through a couple of people around them checking in with them and seeing how they're doing... and there are efficiency tools for this too, right? The very, very not-so-subtle plug for Actionable is that part of what we can do is make it really efficient for facilitators to do it. We're not the only ones; there's other stuff. But just think about that primary question: is the purpose the experience, or the impact of the experience?
And invite everyone to explicitly state their response to it. One of two things happens: either we start to think about how to solve for impact, or we go, yeah, no, the point of this was the experience, so now we can just ignore everything afterwards. And, you know, David, I used to think, well, that's ridiculous, what a waste of money. I'm coming around. There's a time and a place where just being in the space with their colleagues is the point, and that's okay. Right. But most of the time it's about impact.
And for everybody in HR who's tried to prove to their bosses that these trainings, these behavior changes, are important, and sometimes required by law, you know, like sexual harassment training...
But there are some things that we're trying to do. Like right now, it's really popular to do pay transparency training and have managers and employees understand what it all means, why are we doing it, and how does it impact them?
Those things are critical, because if we don't train them and there's no behavior change, then we're going to either get sued or get fined, and we're also going to lose our people. So it really matters. We're actually on the front lines of this, Chris, and we're telling people: you need to do really comprehensive training for managers and employees so they understand what all this stuff means.
It can't be in one ear and out the other, because they're going to lose people, and it's going to be the wrong person, and it's going to lead to, you know, really bad things. So yeah, and expensive things. We're with you. Power to the hockey players. Exactly. And the Canadians.
Chris, thank you very much. This has been really cool. I actually would probably like to bring you back to talk a little bit more about measurement of programs, because one of the things that our listeners love is to actually hear actionable advice on how things can work better. And so I love discussing the ROI of programs. So we'll have you back again, if you don't mind. I'll bring some case studies. We can make it a lunch. And I'll bring Wilson. Perfect.
Please don't do that. I mean, no offense to your sponsors. I'm kidding. No. And we're sponsored by Molson Breweries. No, we're not. We're not. We're sponsored by Salary.com. Again, Chris, thank you very much. You're awesome, and those were phenomenal insights. I really learned a lot today. Thank you, David. I appreciate the conversation. My pleasure. Take care, and everybody stay safe.
That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week and stay tuned for our next episode. Stay safe.