AI is a shield and AI can be a sword. And we've got to be the best, strongest, fastest shield there is. Our processes were built for case resolution pre-AI, not post-AI. No shortcuts in innovation. It's that notion of forgetting everything you know, acknowledging you know nothing, and starting from the beginning. So humbling, my goodness. You're eliminating handoffs. You're reducing friction. We're using our AI right now to reduce our time to resolve by over 40%. It's the only tool that can actually help you keep up with the pace of innovation.
It's not about replacing people with AI. It's about elevating the role of people and increasing their impact in a world that does not stand still in cyber. Welcome back to Experts of Experience. I'm your host, Lacey Peace. And joining me as always is our producer, Rose. Hey, Rose. Hello. Hello.
We just got off the mic with Katie Bianchi, the Chief Customer Officer at Palo Alto Networks. Now, this one was a fun one. Katie actually gave us, and I'm not joking here, she gave us a step-by-step process for AI implementation. So for all of you guys out there that are kind of not sure about what to do next or are unclear on how to implement this in your organization, Katie has given us basically a mastermind in this. I found it so fascinating.
It was so fascinating to hear her talk about like high quality data versus not. Before this interview, I would have thought, well, I'm sure all data, I mean, all data in a company is probably pretty important, right? There's an insane amount of data that you have the potential to collect and then deciding what data to collect, what's useful, right?
And how to store it. Like, is it kept in a place where we can actually use it? Because there's no point in collecting it if it's so messy that I can't even use it. What's super interesting to me is how she's bridging the gaps between departments by making it so their data collection means are consistent. Over here in sales, we're tracking this one thing. In marketing, we're tracking this other thing. Instead of everyone being in silos, she's taking those data points across the board and connecting them all together. She mentioned an insane stat
about how quickly their customer case times are improving versus what they were like two and a half years ago. I mean, it's just bonkers. Yeah. I feel like this is a great episode for anybody that's like,
struggling to get buy-in, the leaders that are like struggling to get everybody in the C-suite on the same page or anybody that has a team right now that's maybe a little apprehensive, maybe a little bit scared of what AI agents really mean for them. What she mentioned that I found very profound was how her leadership basically came in and they were like, we know nothing. Like this thing is so new, this beast, this AI beast is so new. We know nothing.
And we're going to turn to the people on the ground, the people doing the work every day, and ask them to help us, to help guide us through this AI process. And she talks a lot about how her leadership has empowered their employees to test, iterate, experiment, to figure out what can work within their organization to actually augment their team so they can effectively stay ahead of others.
bad actors. She talks about how to implement AI step by step. She shares her biggest mistakes, their biggest wins, the things that they're planning for and looking forward to in the future, and ultimately why this is something that gets to be integrated in every step of the organization to improve not only the customer experience, but also the employee experience. So with that,
Let's get into it. Katie, welcome to Experts of Experience. Thanks so much, Lacey. I'm thrilled to be here today. Yeah. So you lead customer experience at one of the most important cybersecurity companies, Palo Alto Networks.
And I'm wondering when you're at like a dinner party or in the elevator and someone asks, what do you do? How do you answer them? Very simply, I say I'm responsible for making sure that customers stay ahead of threats with our AI powered platform that secures users, networks and clouds at scale. Awesome. And before you got into this work that you're doing now, what was your background like and your experience like before this? What brought you to this role?
Yeah, I've always had, I've actually always really had a passion around customers and service. So as I started earlier in my career, I was very focused on driving a process and driving automation that improved customers' ability to get value out of whatever solution it was that they were purchasing from us.
And I really built on that passion over an almost 25-year career where I've done different roles in leading strategy and execution for large businesses focused on what I like to believe are great missions. So I was, during my time at GE, really focused on hardware, software, and services development for driving business technology that inspects critical industrial assets. So ensuring that airplanes are safe to fly and power equipment is safe to run as sort of the digital industrial era was starting to take hold. This notion of making customers successful with technology,
software and analytics became something I really, really gravitated to. And that really started my journey into what I call the full tech space experience over the past year.
That's awesome. And what I like about what you shared is that you're choosing really complex companies to work within and really complex customer experiences to solve. Have you always been someone drawn to these like big problems?
Yes, absolutely. I think the application of technology to solve some of the hardest problems in industry is something that's always really, really compelled me. The harder the problem and the harder the solution, the more I gravitate to it, because really, at the core of it, what I know is that that problem that's being solved for a customer has such an impact on the mission that they're actually serving, right? Whether it is healthcare, financial services, social services, energy, or transportation, those are all really compelling things to help, you know, build, move, and empower the world. That's so cool. Yeah.
So I pointed this out in our prep call and I want to reiterate it for our audience. You've been at Palo Alto Networks for two and a half years. So you came right in at the beginning of when generative AI started to take the stage and right after COVID. So what was that like for you? You kind of came into this company and it's probably just a lot to take in, not only being at a new company, but with all these big changes and big trends happening at once.
Yeah, yeah. Well, look, I came in at an incredibly exciting time for the company. I think what it does so well is just continually disrupt itself and innovate everything in the security space. So for me to have the opportunity to come to a company that's trying to disrupt itself in the network security space, build a new category in cloud security for customers, and also transform the way that SOCs are run, and the power that a post-sales organization or a customer experience organization has in helping customers execute on that mission, was too good for me to pass up.
And, you know, I did end up joining right, I think it was in December of 2022. So right at the time that generative AI was hitting the scene, and honestly, it felt like jumping onto a rocket. It's exciting. It's intense, but it's also impossible to ignore. What was great about the timing and the way our leadership team approached it was it actually forced urgency and clarity.
So we knew we couldn't sit back and observe, that we had to lead. And I think to do that well, we had to return to the start in many ways, especially from a customer experience perspective. So-
We had to rethink everything we knew about how we did things through the lens of this incredibly powerful technology that was hitting the scene. And it really forced us to go back to the start and learn about the way things were done so we could actually learn about how to do them better with AI. So I want to get into this implementation process and all the lessons you guys have learned in doing this.
But can you just set the table here and tell our audience kind of where you are in this AI journey? How long has it been? So it sounds like it's been two and a half years since you started, since you guys started implementing this. Where are your teams at in this process? Just as a quick, quick overview, you know, at the end of the day, my role is all about how to get our customers deployed fully and reliably, how to get them to value and continue to deliver product
value and how to ensure that whatever we've deployed for them stays technically healthy as they grow, change, and scale with us. So as you think about that landscape of work, we're really in the execution and scaling phase. We continue to experiment because the technology changes so quickly that you have to continue to experiment at the rate the technology is changing to figure out how to apply it.
But AI is already embedded in how we resolve every single technical support case.
And we focused first on technical support, given what we knew to be a really massive opportunity to completely change and delight our customers and respond to cases much more quickly, even the more complex ones. So we've also started the journey of how we embed AI in the way that we deploy our products and drive value for our customers. Something you mentioned to me also was that you guys chose the most complex topics to begin with, not the easiest things to automate.
Could you talk to me a little bit more about that mindset? Because that is, it's wild. I mean, I've heard the exact opposite from many other companies. Yeah. You know, it's interesting. We did. We focused first on areas where scale and complexity were slowing customers down. And for us, that was support and deployment. And if I think about the one, two, three of implementation, you know, the first part is fixing the foundation and making sure that your data and your processes, and how people access both and how you feed those into AI, sort of has to be central. But we didn't just focus on the quick wins. We focused on what it would take to solve our hardest and most complex problems as well. The work we do is very technically complex and it's high stakes. So we tried to push AI into the deep end early, knowing that if it could help our teams handle the toughest cases, we would unlock real transformation.
But what it required us to do is really understand how every single case got resolved, whether it could be resolved simply through improving our documentation or improving our process, so the AI could essentially learn from every interaction.
But then, by focusing on how complex cases get resolved as well, it also forced us to accelerate some of the decision points that we made around collecting better product telemetry, collecting more information from customers that may not fully resolve the case because they're very complicated, but would take the time to accurately diagnose that case from sometimes weeks down to days and hours. And by focusing on that full system and really understanding in detail how the cases are resolved, with which data, with which expertise, it really helped us focus on everything to have a bigger impact at the start. So talk to me more about this implementation process, because I want to get into some of the mistakes, the lessons, the great wins that you guys have had in this process. And then I want to hear how it's changed and how soon
you guys saw benefit from these new AI agents? Like, was it immediate or was there this like six-month process of just getting the data in order before you could even start to see benefit? So like, what did it really look like as you guys rolled this out? Absolutely. So, you know, we think about it in three steps, but I'll answer the last part of your question first. Like we committed to fixing the foundation, right?
Focusing on simple to complex issue resolution and how we built that infrastructure in alignment with our IT organization and with our product organization. And going back to first principles means that you have to be patient. There's no shortcuts in innovation. And so that process for us around fixing the foundation was...
all about understanding what data resolved our problems and how do we assemble teams to assess and quickly clean it up and load it and test again to make sure that the data that we loaded resolved the issue the second time.
And then we had to completely rebuild our processes. Our processes were built for case resolution pre-AI, not post-AI. So when you take the lens that you want to learn from every single interaction, you have to redo your processes. And so that was sort of the second step forward.
And we knew if we didn't get that right, the AI wouldn't scale. So that's actually where I would say, you know, on probably a two-plus year journey, we spent the first six months learning, building and cleaning, and having our product organization decide on what the right architecture was for the solution that we were going to build.
They started to build their own internal co-pilot. And then with that, figuring out what are the metrics of success that tell us whether or not we're getting better over time as we teach it to resolve our cases. There was something you mentioned earlier
like two answers ago about experimentation. And I kind of want to get into that because you just talked about it again, how there was like this phase of trying to figure out what you're going to do and how you can implement these new technologies faster within the organization. So could you just talk to me a little bit about like how you're encouraging your teams to experiment? Like this is a brand new thing. They've never done it before. I'm rewriting the entire script and
How can I do that confidently? And like, I'm sure there's failures in that, right? There's places where we take a misstep and we need to backstep and figure out how we actually need to move forward. So is there, I don't know, any tips or any learnings you can share about how to encourage that type of experimentation and that mindset in other companies? Yeah, absolutely. I think there's two things. Cordon off your best and brightest resources, the people who are most curious and able to help you experiment and solve problems you've never had to solve before. Yeah.
That's step one. And we did that as we were building the support co-pilot. We basically went into our organization and said, hey, we need the 30 people who are best at resolving cases, analyzing cases, and helping us build this engine and make it better and helping us to do the experimentation along the way. So that was sort of one team we built up.
And it really is unheard of. Like you go to a support organization and say, we're going to take 30 of your best people off the front line.
to build a solution for the future. That's incredibly different. It's a different way of thinking. And then alongside that, we built a team of very smart and capable strategists, IT resources, and product people who came together to help us actually figure out how to build the frameworks of re-architecting process, re-architecting data pipelines, building the solution, and building structured experiments that would help us learn, pivot, and change course in a way that we needed to. So that ecosystem of people that we really cordoned off and decided that we were going to have innovate for us was one of the ways in which we were able to accelerate execution.
And the second component is, and it kind of goes back to that thread of, you know, get your smartest, most curious people to teach you. We're trying to up the ability for our own teams to access all the tools that are out there today and come and apply it to the processes that they struggle with every single day and come and tell us how they apply something in a way that we can actually figure out how to take and scale. Yeah.
The best people who are going to tell you how to fix what's going on aren't necessarily the leaders and the managers who are so abstracted sometimes from the day to day. They are the folks you have in your organization that can actually understand the biggest problems that arise and look at the new technology that's out there, create their own experiments, and then come to us and work with us to teach us
So then we can work together to figure out how to drive a scaled adoption of that motion throughout the entire team. This is such a cool culture. So was it like this before, even before generative AI, Palo Alto Networks was like, we're going to make sure that our smartest people feel empowered to tell us what's going on? You know, there are green shoots of that as part of the culture. But I think we have to look to AI actually as a generational technology transformation that
actually accelerated this, to be honest. And our CEO is, he is a very involved visionary. And I think he recognized very early on that we had to do more learning than doing. And the way to do that learning was to really step back, assume you don't know anything, and bring in all of the right people to actually help
teach. And he created, for the first 12 months, what we called an AI summit. And every month, 100 leaders across the company would get together, they would learn from what was happening outside. And then they would show and tell what they were doing in their specific space across their specific use case. And it was honestly, in my career, the first time I've ever seen
innovation get executed in a way that actually sticks and is incredibly compelling, because we all learn together, but different teams went down different paths to solve different problems. And he let that happen and wanted that divergence to happen, so we could learn as we went.
And then because we were doing these every single month, over time, there was a very big convergence of how we were going to execute this at a different degree of scale. Common architecture, common metrics, common framework, and common solutioning, which then we slowed down for a while, but all of that investment helped us speed up.
It was incredible. It was incredible. So it sounds like you kind of broke apart the typical bureaucratic structure you see in large companies and let people operate autonomously, go discover what you need to discover. And by letting them have time to do that and then slowing down, ingesting that information and finding a way to make it more cohesive across the organization, it really paid off. Basically, it let you act faster almost, even though it might sound like there were multiple steps here and a little bit of slowing down, you guys were actually able to execute faster than I've seen a lot of other organizations execute. Absolutely. And there was incredible intentionality in how that process of innovation was going to go. We didn't know where we were going to end up, but we knew what we were going to do, I would say, as a company under his leadership to get to the place we wanted to get to and to do it in a way that...
would deliver the right outcomes for our customers and our people, no matter what use case we were exploring as part of the initiative. And there were many. I was just talking to my CEO a little bit about this, how if your vision is...
so small that you know exactly how to get to the end, then it's not really a vision. It's just like a micro goal. And I think a lot of organizations operate from that like goal standpoint of, you know, I want to cover my butt. I want to know exactly that we're going to have this thing that in six months I can report to whoever.
And in this instance, your CEO held that vision and was like, we have this big thing up here that I want to get to. Don't really know how, but I'm gonna let the smartest people in the organization figure out what we need to do to get there. So it's just really cool that you guys got to kind of come to work and play a little bit. Yeah. Yeah. And I think we knew, you know, we were driven to align on what outcomes we wanted to get to. But to your point, the path of...
So humbling. My goodness.
Well, I want to get a little bit back into sort of the technical aspects of this a bit more. What are your AI agents currently capable of and how are you currently using them? And then I do want to talk a little bit about roadmap and like where you guys are headed, but just give us the numbers. You shared some really cool stats with me and I would love to have the audience hear those as well. Yeah. Yeah. So just quickly in terms of what they do, what they do today, we have a co-pilot
that handles our technical support use case. And that co-pilot can handle everything from simple and repetitive tasks all the way to providing some decision and diagnostic support for our most complex problems, which is really awesome 'cause it lets our team focus on the most complex problems
And so from that perspective, we're using our AI right now to reduce our time to resolve by over 40%, whether it's the simple use cases where we're actually getting a higher degree of traction of having AI fully resolve those cases, all the way up to the complex cases.
And it's been really interesting because doing this for the simple cases delights our customers, because we're on a path where they're going to be able to fully self-resolve. And for the complex cases, we do it, you know, not by replacing humans, but by, like, surfacing the right data to them at the right time so they can work smarter and they can work faster. And then what were some of those lessons? Because you mentioned, you know, in this two and a half year process, there have been lots of learnings. What are some of the key lessons that you think other CX leaders or just business leaders in general should understand about this process? One of them, I think, to me is like I would have started earlier on the data side and
I think it's the most underestimated part of any AI effort and probably the least glamorous.
In certain ways, like I think people tend to get excited about the shiny layers of co-pilots and automation and agentic AI. But if the underlying data isn't clean, or you don't know what data is required to solve the issue and you don't figure out a way to turn that on and get it flowing across your data pipeline, the value just won't be there and your solution won't scale. So look, I think looking back, we probably could have moved a little bit faster if we had started earlier in the journey around data unification and sort of capturing some of the telemetry much more quickly. So I think that's one thing. And I think one of the biggest misconceptions, especially for, I would say, a customer experience organization, as you go on this AI journey, I think the biggest misconception is that you can do it alone. What do you mean by that? Well,
I don't think AI transformation is something that a post-sales team can or should own in isolation. You can't bolt it on to existing workflows and you don't want to. You need to drive net new process re-architecture.
And that takes a lot of partnership with your strategy or operations organization, as well as your IT organization. And when you think about fundamentally what you're doing, you're solving problems with the product. And our product organization is actually the one that is technically doing the engineering behind the co-pilot build process.
And I think that that has been a huge accelerant for us. Now, we're doing that in the highest degree of partnership where we are cleaning our data, validating our data, and then
helping the product organization improve both efficacy and accuracy, but we would not be where we are right now if they weren't owning the engineering around that, right? Number one, because they've got the engineering expertise, but number two, because they have the product expertise. Say goodbye to chatbots and say hello to the first AI agent. AgentForce for service makes self-service an
actual joy with its conversational language anytime on any channel. To learn more, visit salesforce.com slash agentforce. Do you see companies doing this a lot where they stay stuck in their silo and don't disperse the work or don't work with
other teams that could be supporting them? Yes. And I think for us, what's been really interesting is that AI and this sort of revolution has made tight collaboration between go-to-market, CX organizations, product, and IT an absolute non-negotiable. One example I would give you is really how we're working to unify data capture and automated workflows across both pre- and post-sales.
So if you think about a historic motion between pre- and post-sales: pre-sales captured all the technical requirements and all the outcomes and all the use cases. They stored it in a system, or they didn't. We didn't have access to it, and the deal closed. Post-sales would pick it up and start all over again and frustrate the customer by asking the same questions again.
They'd have to start from scratch in terms of building a technical strategy, which takes a ton of time, redoing work, missing context. Now, as we're building this interlock between pre- and post-sales with AI-driven flows that are going to carry that data from pre-sales to post-sales and use it to recommend deployment strategies that are tailored to each customer,
you're eliminating handoffs, you're reducing friction, you're accelerating time to value. And for our customers, it's going to mean a more seamless, much more personalized experience from like the first conversation that they have with us in the presale to the time that we get them to value through a full deployment. And honestly, like I do not think that that level of integration would have ever happened without AI forcing us to really think about
the end-to-end process and the data capture that's required and what we want to build towards in terms of the experience that we want to provide for our customers. How are you using this data that you've collected and leveraging it to...
improve every single customer interaction moving forward? I'll give two examples. I think one, if I'm specifically looking at the support use case, the way that we've changed the process is that number one, based on all the inputs we're getting through the case lifecycle, we're driving certain data inputs at case closure. And we're summarizing all of that
into knowledge that feeds our co-pilot so that that question may have been asked nine times in the past and a human may have had to engage to answer it. But on the 10th time or the third time now that that question gets asked, we want to have a solution for it.
And we're building our processes to ensure that we have enough data that as AI and AI reasoning gets smarter and better, that that capability is built in and it's actually something we can leverage from the start. The second example would be as you think about this whole customer journey and customer lifecycle from pre-sales to deployment to ongoing value realization and support,
You actually want a fully closed system. And the way that I think about it through the persona of a customer or a partner who's deploying our product is you no longer want to hand them a manual runbook for how to deploy a product. You want to hand them a copilot.
And that co-pilot will have all of the information around every interaction that we've had specific to selling, architecting, deploying, and supporting a product that looks like what they are going to deploy.
And if you imagine your deployment leader, whether it's a third-party partner you want to do the work or your own team going in, as part of the process they're getting a technical strategy automatically created for them from all of that learning and all of the net new inputs specific to that customer that have happened in the pre-sale.
So all they need to do is pick up what I would call a dynamically generated playbook and execute to that. That's what we're building towards.
I love that dynamically generated playbook. That's the future right there. Yes. And what I love, too, about what you mentioned is that you guys are planning for more AI development, right? Because, you know, it's not done. Like new tools are being developed every single day. The way that AI is able to execute is improving every single day. And so how are you guys thinking about that with like data collection? You're kind of collecting for the future almost. Yes.
It's not necessarily collect everything. It's collect what adds value. And honestly, you have to go through a journey to learn what adds value. I was just going to ask you, how do you even determine that? Well, it's like, you actually can't, like... you know, we did it for support. We actually went through some painstaking level of analysis on every case to say, what is every data point? What is every interaction that actually gets this case through, and how do we collect it? Like how do we turn on Zoom transcripts? How do we turn on audio recording? How do we pull in from JIRA? How do we pull in from the network and telemetry? And as we've gotten smarter, we figured out what information is high quality information
And what information is not high quality information. And it's allowed us to sort of tune and lock that in a little bit better. You know, we're in the earlier stages of this with our deployment journey. And we're actually still starting in the same place. It's like, what's everything we need? Because you actually have to uniquely learn it for each experience. Oh, that's so good. Tell us about Project Zero Fault and how that's redesigning customer experience there at Palo Alto Networks.
Getting into then the digital employees, these AI agents that we've been talking about, obviously it's changing the workplace. So we've talked a lot about how humans are thinking differently about how we can set up projects, how we can help our customers by using these tools. But it's really changing, you know, where they can put their focus and their time.
And truthfully, how many people you need on your team as well. So could you talk to me a little bit about how you guys are thinking about this new digital employee and the digital workplace? And yeah, I guess I'll just start with that and see where we go. Yeah. You know, I think about it in the lens of like, how does work change? What is the future of work? And what we're seeing right now is like our teams are spending...
much less time grinding through tickets and more time focused on real complex customer challenges. And it's the kind of work only humans can do. I think one of the amazing parts about being part of a company that has to move at the pace of cybersecurity and continue to innovate, to stay in front of adversaries,
is that it means that we are going to continue to innovate. We're going to continue to develop new products and we've got to figure out how to stay ahead of that. And a big piece of staying ahead of that is to make sure that
AI and some of the tools and technology that we have available to us can take on the repeatable, the predictable, the what's already been solved layer. So our sales engineers, our PS teams, our support engineers, and our architects are freed up to engage more deeply with our customers on how to apply new technology to improve their security outcomes.
So I think it's changed the way that we are all working in a really energizing way. Our teams have to react less to the noise and can focus more on, I think, solving the problems that matter. So for me, it's not about replacing people with AI. It's about elevating the role of people and increasing their impact in a world that does not stand still in cyber.
Yeah. Yeah. That is the thing about cybersecurity I've always been fascinated about. We do another podcast called Big Ideas Lab, which talks to Lawrence Livermore National Laboratory. I'm sure you're familiar. And staying ahead of the bad guys, people that might have, you know, not so great ideas in mind for what they want to do to you.
It's just this constant marathon that you guys are running. Like you don't get a break. You have to keep going. And as AI gets better, you have to keep your AI tools outpacing the bad guys' quote-unquote AI tools. So yeah, I just, I think that's a very, very cool space that you guys are playing in. And I love that you get so much energy from it. And it's not just this like, oh my God, we're constantly racing. You guys are definitely just like,
How can we play here? How can we have a little bit of fun? But also like this is super serious and we need to take it as seriously as we can. Yeah, absolutely. You said it really well. I think it's like AI is a shield and AI can be a sword. And we've got to be the best, strongest, fastest shield there is, whether it's what our product can do or whether it's how our teams are working every single day to bring that value to our customers more quickly with less complexity than ever before.
Yeah. Oh, I love that analogy. AI is a shield. Okay. I might steal that one here in a future episode.
Okay. You talked a little bit about metrics kind of towards the beginning of our conversation. I did want to touch on that because some people are having a hard time trying to figure out like, okay, what are the metrics that still matter that, you know, we've used in customer success for a long time? But what are the new metrics I should be looking at to measure if my, you know, AI implementation is working well? How can I sort of see or get the pulse of if, like, my digital employees and my human employees are playing well together? Yeah. Talk to me about how you guys are thinking about that. Yeah.
Yeah. Look, I think honestly, we have to continue to measure success by the outcomes we deliver for our customers. It's eliminating issues before they happen. It's resolving issues as quickly as we can when they do happen. It's not just a shorter time to deploy, but a faster time to get them to value and
with a deployment that ensures that as they grow and scale, they'll continue to stay technically healthy and operational. And most importantly, with that, it becomes the fastest time to value and ongoing value, right? So it's not just about solving those problems, but it's got to be about helping customers get the full benefit of what they bought sooner. So
You know, I think that's got to be the backdrop. We can't lose sight of what we do every single day. But at the core of it with AI, there are a different set of things that you need to ensure are working well in order to deliver those outcomes. And it really comes down to: is your data right? Is your data accurate? Are the solutions you're providing to customers accurate and
effective? Are they helping resolve the issue? Are they helping customers do the work or get the answers they need in a faster, more reliable way? So you do have to spend the time making sure that the product that you're delivering to the customer, which is now this AI solution where it used to potentially be a document or a person,
Yep.
And that product has to provide a wow experience to our customers from the first time that they touch it. And it means that we've got to think differently about the solution design that we get to. And it also means we need to measure things a little bit differently in terms of how good that solution is for the problems we're trying to solve for those specific customers. Were there any customer stories that you wanted to share? I think one customer story that stands out is we've had...
A couple of customers who we've been able to help prevent major incidents in their infrastructure because we flagged emerging issues using telemetry before they even saw symptoms. So that early intervention helped them to take action quickly, helped them stay ahead of some of that risk.
And it was a clear moment for us of sort of our ability as an organization to move us from being more reactive about the problems to actually being much more proactive. And with all the intelligence we have, you know, we have other customers or customer situations where sometimes, you know, you have customers who are frustrated, who are at risk, who aren't getting to
value where they might have struggled with the way that deployments had been done in the past or frequent escalations on specific issues.
But because we've been able to get a lot smarter in the way that we're handling cases, looking at issues before they go red, or using the data that we have to have a more proactive deployment engagement, you see that really change the way that customers lean in and engage with us. And we can't forget, just like you said a little bit earlier, our customers have the hardest jobs in the world. They're solving the hardest problems in some of the most mission-critical industries in the world. And our job is to make the complex simple. And so when you're able to do that by embracing AI, by embracing automation and really rethinking it, you see them lean in
completely different ways. And it's this notion and this mission and this philosophy we have about using that to actually build customers for life. Oh, I love that. So what's next in this? You've talked a lot about the implementation process to this point, but there's already more in the works. You guys are still constantly experimenting. So next six months, next year, if I talk to you five years from now, what do you think you're going to be working on, working towards? Yeah.
Yeah. I mentioned this a little bit earlier. I am actually most excited about using AI as the operating system for how we work.
Where it's not just a tool that we use occasionally on the side to enhance an old process that we've used, but it really becomes the singular engine that guides our teams in real time. So I like to imagine a world where AI helps every single team member know what to do, how to do it really well, and when to take action. So we're managing by exception, right?
And that's going to allow us to accelerate time to value, solve the hardest issues much more quickly. So I think this type of, like, intelligent in-flow guidance is really only something that AI can unlock. And it really, it changes the way we operate, right? Like it makes this notion of expertise in a fast-paced world much more scalable, right? It drives consistency and it allows us to deliver better outcomes at speed and to a degree of quality that honestly, like I don't think we could have imagined a few years ago.
So because it's not just an operating system, it's also like this knowledge base. So like, as an employee, even if I've been an employee for 10 years, now there's this brand new thing that, like, basically makes my knowledge completely defunct. Like, how do I stay on top of this? So I think that's
That is what's interesting to me is how you've got this artificial intelligence for your company that's not just going to be able to functionally do things and help you operate, but it's going to teach you things that you can learn from. And then using our human brains to be able to take that little cool key learning and turn it into this whole new product line or new process or new system, it's just going to democratize information and knowledge in such a profound way.
And I love that you guys are thinking about doing that inside the organization. Yeah, absolutely. I think it's the only tool that can actually help you keep up with the pace of innovation, if that makes sense, right? So if you look at some of the tools that are out there now, like NotebookLM, right? This gives every single person on your team an AI assistant trained on all of the internal and external knowledge so they can get more real-time information.
And as our product organizations continue to launch new products, new features, new functionality that are complicated and complex, they're using AI to generate perfect documentation that feeds that system. So no longer do you have to write a document, do a PowerPoint,
do a webinar, beg people to come to the webinar, think that they retain the information. You have a full system that keeps pace with innovation that people can access when and where they need it. Yeah. Yeah. I think about this where I'm like, okay, in 10 years, I'm not even going to be interviewing people. I'm just going to be interviewing the AI version of themselves. So I'll just be talking to the AI Katie. It's true. It's true. Yeah.
An AI agent that your customers actually enjoy talking to? Salesforce has you covered. Meet AgentForce for service. The AI agent that can resolve cases in conversational language anytime on any channel.
To learn more, visit salesforce.com slash agentforce. Okay, I want to get into some closing questions. So Rose, before I do that, what do you have? What are you curious about? What have you been withholding from us over there silently? Thank you. Yes, I actually have a couple questions. My first one is not...
AI related. During this whole AI revolution, do you feel like there's any customer experience, like basic fundamentals, that might be getting overshadowed by all of the discourse and all of the excitement and controversy of AI? I'll speak from my experience, but I don't know that I would say that they're getting overlooked or overshadowed by the controversy. Look, I think the importance of having safe and trusted AI, especially in a security space, is incredibly important. But I do think there's a perspective that this is a pretty powerful tool that, when applied in the right way, safely, with the right data, can have incredible outcomes. I think one of the things that I have struggled with a little bit as a leader is, like, we did make that choice to solve the hard problems first.
And that really became, we're going to use AI to solve every single technical problem that we have.
And now, as we're sort of on that mission and have that scaled, we know that we still have a lot more work to do to actually apply AI, because we won't go down to zero, to make the processes for how we resolve those cases much more effective, much more efficient. And so it really depends on what your strategy is and what problem you're trying to solve for second and third.
Okay. And my second question is a little silly, but I just have to ask it. I've always wanted to ask somebody in cybersecurity this question.
Have you ever seen the show Mr. Robot? My wife has. I have not. Okay. Just curious. The show's all about, you know, cybersecurity and he's like a hacker and things like that. I've always wondered, like, is this realistic? Is this really what goes on? Just curious. I'm sure it is. I thought you were going to ask me for my one piece of advice, which is don't click on any link ever. That's also great. Okay.
That's a great one. That's a great clip. That's especially helpful when your dad calls you and asks you what this email means. At least he called you and didn't click it yet. My husband falls for this stuff. He grew up in Alaska, like I told you, and he does this thing all the time where he just believes people. And I'm like, why do you just believe people? Because I guess...
you know, in jolly old Alaska. Everyone's so nice and sweet, but now he's getting these like USPS notifications that are not real or like signing up for random things. It's just ridiculous. So I'm going to send him that clip. The level of sophistication and how...
how AI is actually enabling those things to convince people is part of what we deal with every day, right? Like two to three years ago, everything would be spelled wrong. You could tell right away. And now you've got two things happening. One, they're much more sophisticated. They're much more savvy. They're much more convincing. And then there's a lot of, I think, websites that are spinning up
out there that are convincing you to buy certain things or buy certain tools that may not be real, that may be AI generated. So I think it's just interesting. They're using actual logos too. I've seen those websites spun up and it's like, actually, that's a real logo. That's a real company. But this is fake. This isn't real. It's fascinating. Yeah. I mean, it is that constant march of like, they're getting smarter. We have to stay ahead of that. So there's this...
fun closing segment I like to do. It's called relevant or irrelevant. And I'm going to ask you a question and you need to tell me if it's relevant or irrelevant, or like what you think would be relevant or not relevant. So it'll make a little bit more sense as I ask these questions. Okay. Having an AI center of excellence within the company, is that needed or not needed? Not needed.
Not needed. Why? Why is that not needed? Because if AI isn't part of the fabric of how every person and every function operates every single day, a COE becomes a crutch for that and will never get you to scale. I love this answer. That's so good. Okay. What common metric do you think is no longer relevant or should be replaced by something else? CSAT, NPS.
Yeah. Okay. Tell me about that. We should know enough about every interaction and have an opinion about what great looks like, where we can measure ourselves against good, versus having to have customers tell us where we're great or where we fall short. That's putting the load on them versus us taking ownership of it. This kind of echoes Rose's question a little bit, but we'll try it.
What's a foundational CX principle you think hasn't changed and won't change? Well, I can give you my favorite principle. And my favorite principle is always, it may not be our fault, but it's our problem. And I don't think that should ever change. And last question, pickleball, relevant or irrelevant? Irrelevant. I'm a tennis player and it's loud. Love that.
Everyone here in Austin plays pickleball. I'm sure. Everyone. No, it is fun. It is fun. Don't get me wrong. I haven't even done it yet. Rose is over here. She's our pickleball player of the team. So I haven't committed. It's super fun. But as a tennis player, I like to give it some garbage. I like to give it some. See, I didn't know you were a tennis player. I wouldn't have asked you that question. Yeah.
That's good. I've never played tennis. I think – well, no, I take it back. I think I played like one round in high school PE and found it to be so exhausting. I was like, I give up. Can't do it. Yeah.
That's how you get all the anger out from like setting up cybersecurity stuff. You're like, oh, OK, I'm going to go. I will say, too, I always know on the pickleball court when I'm playing a tennis player, they are so good. Their backhand is always so crazy. Their serves are... I can't even, if you're serving at me, like, I can't even return it. And they take like five steps back to serve because it's so powerful. So it's terrifying. I respect it.
If I was a tennis player, I probably also would think pickleball is pretty irrelevant. And it is very loud. I'm lucky enough to not live close to any courts, but I feel bad for neighborhoods that are like right next to a court. It's definitely caused some drama in the suburban landscape. Suburban tech bros. Totally. Totally.
Okay. All right. Last question before we close out. What's a recent experience you've had as a customer of another company that you want to shout out? You know, recently...
Starbucks surprised me in the best way. And I say that because it's like my favorite, you know, CX expression: it may not be our fault, but it's our problem. So a few weeks ago, I accidentally placed a mobile order at the wrong location in my town. And when I showed up at a different store, they didn't make it my problem.
Like the barista just said to me, no worries, we'll make it for you. What was it? And it was a very small moment, but
But it showed empowerment at the front line and a real customer-first mindset, which then turned the mistake that I made into just increased loyalty. Yeah. We've heard Starbucks mentioned a few times on this podcast. And so, like, just a huge shout out to them. They definitely do a great job empowering their employees to, you know, put the customer first every time. Right.
Fantastic. All right. Well, thank you so much for joining us, Katie. I really, really enjoyed this conversation and I hope we can have you on again sometime. I would love to. This was a highlight of my day, Lacey. So thank you so much for the invite and look forward to trading some more stories with you as things progress in the world of CX. Awesome. Awesome. Thanks, Katie.