
Moving Fast and Breaking Things

2025/6/3

War on the Rocks

People
Justin Fanelli
Topics
Justin Fanelli: I don't think commercial technology adoption should be treated as a simple balancing act; it should be treated as a portfolio to be optimized for the impact you want to have. We need to focus on technologies that can iterate and adapt quickly in response to fast-changing requirements. Procurement evaluation should focus on outcomes and the speed to those outcomes, not merely on how old or how large a company is. We should measure success by outcomes, not just inputs. The goal is not simply to meet with companies; it is to find the best impact and transition it to the warfighter as quickly as possible. Sometimes you have to change your approach to reach the goal faster. Vendors should be asked to demonstrate the difference they can make, rather than being judged only by traditional evaluation methods. We want to show how much we value sailors by saving them time. One outcome-driven metric we use is user time saved. We want commercial companies to quantify how much time their technology saves sailors. The key to enterprise services is finding the truly excellent solutions and scaling them, while divesting the ones that don't deserve to grow.


Chapters
The Navy uses a portfolio approach, balancing commercial off-the-shelf solutions with custom needs. They prioritize speed and impact, measuring success by outcomes rather than inputs like munitions used. This involves a shift in prioritizing and meeting thousands of companies to find the best solutions.
  • Portfolio approach to technology adoption (commercial vs. custom)
  • Prioritization based on speed and impact
  • Outcome-driven metrics instead of input metrics

Transcript

You are listening to the newest podcast from War on the Rocks, called Cogs of War, which is focused on defense technology and the defense industrial base. Cogs of War is supported by Booz Allen Hamilton. We're really excited to partner with Booz Allen on this one. You'll be hearing a lot more about that. And the first guest for this first episode is Justin Fanelli, the Chief Technology Officer of the Department of the Navy. We recorded this one at a live event in front of a live audience. And it's really about moving fast and breaking things to innovate in the Navy.

Enjoy the show. This one goes out on the flagship War on the Rocks feed. Please subscribe to Cogs of War by searching for it on your podcast app of choice or clicking on it in the show notes. How long have you been in this job now? A little over two years for the CTO job and then three years for the tech director job within Acquisition. For those out here who are less familiar with your career trajectory, how did you arrive at working in the defense department? A mediocre soccer player in high school.

Went to Air Force ROTC on a full scholarship, ended up getting medically disqualified, and wanted to serve. I just wanted to be around people who were making a difference. For that reason, I found a civilian job. It was a really good time in the job market, so like 20 job offers came and went before the government delivered theirs. And then I took it and started working for the Navy, and then the intelligence community, kind of zigzagging a lot

across federal tech, defense, and the IC at all points, but launched a couple agencies and have worked in the acquisition community post-DARPA for the last four years now. And you've been in this particular position how long? Again, remind us. Two years for the CTO job, three years for tech director for PEO Digital. That's right. And you've emphasized, I mean, this is talked about a lot in the defense department, but you've really put your name out there in terms of emphasizing the commercial-first approach to

accelerate capability delivery. How is the Navy balancing the adoption of commercial off-the-shelf solutions with the need for mission-specific requirements and security considerations? Because this especially gets complicated when you're talking about certain things that are at the bleeding edge, certain things that are highly sensitive missions, certain things that need to go on ships.

I think sometimes people see it as a seesaw and really it's a portfolio, right? And so what's better, bonds or blue chip stocks? You need what you need, right? And so this needs to be thought out. I don't know that balance is always even the right term, but it needs to be optimized for the impact that you want to have. And so one of the things when it comes to commercial technology is cycle time. And so...

We will always need exquisite technology. In the 2010s, we finished the design for the Columbia-class submarine. And the life cycle of that runs until the 2080s, right? This is something that takes a long time and lasts a long time. And there's a strategic advantage there. And the board there doesn't change very often. There are a lot of places where the board changes much more often.

We've seen requirements processes and contracting processes take longer than the life cycle of the product that we're procuring. That is not good for impact. That diminishes what our warfighters want to do, whether that's front office, back office, tip of the spear. And so looking at all of this in the lens of what is the outcome and what is the speed to outcome, that's the way that we want to measure. And it's been at times kind of convoluted about, hey, is this a old company or a new company? We want to talk about impact, right?

We want to measure this in terms of outcomes. When you talk about metrics, and I'm going to make fun of some other people in DOD, and you don't need to comment, but I was a little dismayed when I read an article about our campaign against the Houthis in the Red Sea: the White House asked for metrics of success, and CENTCOM sent back the number of munitions used.

So metrics of success are important, but it's important that we're using the right metrics. At FCOS, you mentioned obviously the importance of metrics in assessing IT procurement success. What are the right metrics when you're talking about that kind of challenge? Within acquisition in general, the ones that we're most familiar with as a whole, if you've ever done any sort of buying either side, private sector or public sector, is we've talked

about cost, schedule, and performance. And these are mostly input metrics. The idea of how many munitions are used might be an output metric, but it's definitely not an outcome metric. And so the idea of KPIs saying, here's exactly what we understand we need, and here is how we do it, is a fairly linear process. There are still places for that, just like there are places for explicit systems, but

in the space where it's dynamic. And now we have more offerings, we have more companies, and we have more products than we knew before. These are all very good things, but it ultimately creates a queuing problem. And we didn't have a straightforward way of determining where there was advantage; what if something is way better than we thought it would be? And so we basically said, how many companies can we possibly meet within a year? The answer was, I don't know, 300 companies, and that's if you do less diligence than you should.

And that's not the goal. And so we said, how do we meet with thousands of companies? How do we jump an order of magnitude to make sure that as more private capital comes into the market, we can process these? And so ultimately you're shifting to where the leverage is. The goal isn't to meet with companies. The goal is to find the very best impacts

and then transition them to the warfighters as quickly as possible. And so we needed a very different way to prioritize that. And there's this thing called the region-beta paradox. It took me an hour to get here. The roads look clearer now, so it'd probably take like 15 minutes. At some point, if I had just gotten out of the car and walked, I would have gotten there faster.
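The region-beta paradox he invokes is easy to see numerically: once the mode of travel switches past some threshold, total time stops being monotonic in distance. A minimal sketch, with every speed and threshold a made-up illustrative number rather than anything from the episode:

```python
# Region-beta paradox: total time is not monotonic in distance when the
# travel mode switches past a threshold. All numbers are illustrative.

WALK_MPH = 3.0           # assumed walking speed
DRIVE_MPH = 30.0         # assumed driving speed
DRIVE_OVERHEAD_H = 0.3   # assumed fixed cost of driving (car, parking, traffic)
SWITCH_MILES = 2.0       # assumed distance beyond which you bother to drive

def travel_hours(miles: float) -> float:
    """Door-to-door time: walk short trips, drive longer ones."""
    if miles <= SWITCH_MILES:
        return miles / WALK_MPH
    return DRIVE_OVERHEAD_H + miles / DRIVE_MPH

# The "region beta": a 1.8-mile trip (walked, 0.6 h) takes longer than a
# 6-mile trip (driven, 0.5 h). Sticking with the current mode is the trap.
assert travel_hours(1.8) > travel_hours(6.0)
```

The acquisition analogue is the point of the story: past a certain backlog, optimizing the old process (meeting 300 companies a little faster) loses to changing the mode entirely.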

Sometimes you have to change the mode of transportation. And I think there are a lot of places where some of the old metrics aren't serving us well. Prioritization has always been hard in the Department of Defense. Now, where we put the onus on our partners and our vendors to show the difference they're making, maybe they're better than someone who's been counting.

There's been a big focus on tying enterprise services to enhancing the daily operations of sailors. There are a lot of sailors, a lot of naval officers who listen to this show. For them especially, and for anyone: anyone have a Navy background in this room? Raise your hand. We got a couple. We got Artem over here, of course.

How would you describe to them what you're trying to accomplish and what kind of impact you want to have on their lives when it comes to delivering these enterprise services and changing the way they're delivered and also just what they are? We want to show them how much we value their time and their commitment and their sacrifice. And we want to show that per dollar. One of the outcome driven metrics that we live by is user time lost or saved.

We are valuing this and we are asking commercial companies to come in and say, hey, I have a great technology. No one cares about your great technology for technology's sake. How much time can you save a lieutenant on a ship? How much time can you save the CSO who hasn't slept in a while?
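The "user time saved" metric described above can be made concrete as hours saved per dollar. This is a hypothetical sketch; the function, the normalization, and every number are invented for illustration, not the Navy's actual model:

```python
# Sketch of an outcome-driven metric: hours saved per dollar spent.
# All names and numbers here are hypothetical illustrations.

def hours_saved_per_dollar(minutes_saved_per_user_per_week: float,
                           users: int,
                           weeks: int,
                           cost_dollars: float) -> float:
    """Aggregate user time saved by a capability, normalized by its cost."""
    total_hours = minutes_saved_per_user_per_week * users * weeks / 60.0
    return total_hours / cost_dollars

# A vendor claiming 30 minutes/week saved for 5,000 sailors over a year,
# at a $1M price tag: 30 * 5000 * 52 / 60 = 130,000 hours saved,
# i.e. 0.13 hours per dollar.
roi = hours_saved_per_dollar(30, 5_000, 52, 1_000_000)
assert abs(roi - 0.13) < 1e-9
```

Putting dissimilar offerings on one outcome axis like this is what lets thousands of proposals be compared in a single queue instead of evaluated ad hoc.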

So we want to do that translation. So one of the enterprise services that we have, the chief systems officer on a carrier was coming back from the Red Sea and there were some complications. And he video chatted with some of the sailors on board with their families back at home using one of these enterprise services.

It lowered the stress of the crew. It lowered the stress of the family at home. It was just great for quality of life, and then quality of work. And so this is a case where we want to improve quality of service by getting rid of distractions. And we've started kind of low in the stack and worked up to, "Hey, which function should I use? There are 17. What if there was just one good one with one backup? We don't need 17." So enterprise services is just figuring out, when there are a thousand flowers blooming because everyone's proud of what they can create, what's actually grade A? Let's scale that, and let's leave everyone else in the competition tank or cut the flowers that don't deserve to bloom.

Something that we've talked a lot about, and it's being talked about a lot right now, the new administration is encouraging, actually mandating that the Defense Department exercise authorities that have already existed. So OTAs most prominently, other transaction authorities.

We've talked a lot about in private how you've been able to motivate members of your team, other members of the Navy team to use these authorities because they've been around for a while without this sort of SECDEF sign-off memo. And one of the things I've been really interested in is how you motivate members of the Navy team to accept more risk and move faster. And I'd love to hear some of your thoughts and experiences on that. I wish I did.

It's actually more of a matter of like finding the motivated ones. So there are a ton of motivated people. There are people who run hard against all odds, against all kind of guidance at times. And so we've just labeled them and we say, hey, come work with us. We will unleash you. We will let you do your thing at the pace that you want to. They enjoy it more. They make a bigger difference. And so the locating piece is a big one.

The other piece is when you have all of this energy and you can kind of channel it to say, hey, we've done a lot of reps at this. We see that you are someone who wants to make a bigger difference. We see that you're someone who is going to run at a problem. You don't have to be a Roomba this time. We can just show you a straighter path. And so for that reason, we said, hey, innovation adoption is a contact sport.

But we don't send warfighters into theater without a kit. So let's not send adopters into this ecosystem without a kit. So we created the innovation adoption kit to make this easier. That includes the outcome-driven metrics. That includes the funnel, the pipeline, to say, hey, we'd like to divest something. What's behind that? Can we have three things behind that? Are we giving a clear enough signal to our market partners and mission partners to make sure that something is up next?

This includes how to do OTAs, how to do CSOs, and who your partners to lean on are. I was at In-Q-Tel's CEO Summit last week with some SEALs. They move really quick and they're tweaking things overnight. And then we went to an operational exercise and they were tweaking things overnight. This is a case where, hey, if we can find something that is broken, we don't have to mod a contract and then wait six months. We can go in there. There's a proximity

between the doers and the user-doers. These are player-coaches in contact relationships. The feedback cycles are getting way shorter, and you're seeing the return on investment from that proximity. Tell us about Operation Cattle Drive: what that meant in the last administration, and, I know you don't speak for the Navy, which is important to say, these are Justin's opinions, but what that means in this administration.

Cattle Drive, we wrote a memo maybe nine months ago, structured divestments. So we have a series of ACDCs, accelerating change design concepts. How do you go faster? If you're one of those runners, come hang with us. Structured divestments is to say, hey, we all have heard of the valley of death. If you don't divest, unless there's a plus up, you don't have any runway for three years.

And so we looked at that and we said, wow, we can get people in the door. We're better at getting people in the door with CSOs and OTAs, but the money is still finite unless you can turn something off. And so we've now worked with our private sector partners to say, hey,

you have a capability. Turns out we don't need a 73rd cyber capability. Is there anything that you can turn off? Oh, you can turn off four things? That's great because we can then look at how to apply that funding to yours and then not have a valley of death and then get those capabilities out. So there's the impact side of it from a national security perspective and there's the economic side of it from a, we just need these things to go faster and we need the best companies to win. So,

The answer to that cattle drive question is let's scoreboard those divestments so that we get that muscle. Anyone can start something. Who can finish something? Who can divest something? You've heard the Reagan quote about the closest thing to everlasting life on this planet is being a government program. We are going to work the muscle of actually turning things off. And so what does that look like in this administration? Well, I was...

told last year that we should not use return on investment, because that is not a realistic concept within the government; there's no actual return on investment. And it hurt my soul. This was a senior person, and, well, they're not here anymore. I wrestled with that, and I think I called a couple of sailors to think through: is there something we're missing here? We have a

value-oriented, quantitative, decisive secretary right now. And if there is something that is in the way of impact, we will divest. So the answer to your question is: we are speeding up and turning up

Cattle Drive. We just need to be really thoughtful, from a private sector perspective, about: hey, are you able to deliver this capability? Is it worth the change? So tightening up those business cases, those impact cases, is an area where we can keep getting better. I think there the sky's the limit.

I know you get asked about this all the time, but one of the limiting factors that gets brought up a lot by industry is the limited bandwidth on ships when it comes to working with the Navy and delivering solutions. What is the Navy doing to address this? We've talked about Mavericks a little bit, right? We've talked about who are these unleashed people and how do we find them? Like Artem right here.

in the front row. He's a good guy. I was on base at Camp Pendleton with a Marine, Ryan Ratliff, last week with Artem. It was a naval car, right? The PM who first fielded Sting, which increased the bandwidth on ships by 100x, was a Maverick.

He used commercial where we were only using MILSATCOM before. And so this is to say that you can create nonlinear, exponential improvements when you do these trade-offs well and you push. And he had an old story that he told me, maybe in private,

where he said the seniors at the time said: make this happen, do what you need to do, we've got your top cover. I think it was a little bit more colorful than that. And then he did it. He ran through walls and glass and everything else. And so now we have him as an advisor, and he is recruiting and kind of fanning the flames for those people. So on the bandwidth piece: where we have Flank Speed Wireless fielded, ships are seeing a 100x improvement.

Ultimately, how we make the most of that and what we get out of that, I think that comes down to more Mavericks. And this daisy-chains into a story where we had a captain who said, hey, I want to try different things. I want to have operational control, with SD-WANs, software-defined network controls, on the ship.

And so he set up that wide area network to allow his sailors to do more training, because normally you have to come home to do training. They had time online; they could do that on their ships. Home time could be home time, and then a number of other things. And so this is like the dominoes of Mavericks.

We'd like to go from the 10 or 20% who are willing to essentially incur more work for themselves for impact to a lot more than that. Let's get to 30, 40%. Who else wants to sign up? Who else is going to download the innovation adoption kit today? Who else is going to say, hey, I think there is a commercial offering, or seven, that does this better?

And again, it's not everything, but if you are measuring it, then you have a fighter's chance. And we just want a puncher's chance to move faster on things that are stuck. We obviously talk a lot about AI and autonomy, for good reasons, and also additive manufacturing. But what are some other critical technology areas that you think are going to become increasingly important for the Navy as we move forward, maybe on a five- or ten-year horizon? I mean, a couple of things. We have a lot of different technologies.

Collaborative autonomy is really important. It's really important to do well. We talked about enterprise services like... So our enterprise services right now are modeling and simulation and identity systems. I have a dream that we have enterprise services for autonomy and they're collaborative and they're joint and the Army and the Navy who are...

Right now, doing exercises together are making sure that those are well integrated with mission partners. This doesn't need to be a competition at exercises or even in market research. We can just get a lot smarter on that. The AI piece, because we use it so generically, is a little bit tricky.

I want to talk about it for a second, specifically within generative AI, because it's so hard to execute. This is predominantly a horizontal capability, which makes it the hardest possible thing to buy. Out of 18 PEOs, which PEO would buy generative AI? All of them, right? And so what does that look like

from an enterprise service perspective, but in general, right? So what does that potentially do for us? Well, we had one PEO recently that did beta trials of, hey, if we augment the acquisition workforce with generative AI, how much faster can they go? How much leaner are those teams? Is it more effective? And so we're going to scale that.

And then the sequence of these activities for, hey, where is something that wasn't adding value? Let's learn those lessons much, much faster. The other one is integration. And so we've talked to the people who are working with Windsurf and a couple other companies, and they're saying, hey,

We're 10 times more effective. Can we be 10 times more effective with reasoning models on integrations? Because pretty much everything we do is integrations, right? And so if you were coming to us and saying, we do integration well, that's, I don't know, moderately interesting. If you say we can deliver this much faster, with this much more impact, we have a way to validate that.

We can look at which models we can consolidate. We can look at what to get rid of completely because the integration drives most timelines. Let's open it up for questions, and depending on how crazy they are, we'll see if they end up in the show. As the force becomes more software-enabled and technology-enabled, does professional military education need to catch up because reality is outpacing training and professional military education? It's a great question.

The board is moving really fast, right? And so there's a scenario, and we're living in it, where the technology changes faster than the rest of DOTMLPF-P. So what do you do about that? What do you do about the training? What do you do about all of it? Do you let the tail wag the dog? The answer is no. And so we've worked with a number of rapid adaptation cells: one with the SEAL teams, one with In-Q-Tel, one embedded in a couple of task forces.

One with special ops folks. And they perform at a much higher level. So we had been working on something called adaptive roadmaps. Hey, the Army does Transforming in Contact; what is the technology, and then how does that change the use of the technology? So tech-informed concept of employment informs tech-informed concept of operations.

We're already on some of that. And so how we scale that is a really good question. Simplicity scales. And so part of that is making these operational exercises part of the market research, and then scaling what's in there. There was a time when operational exercises were for lessons learned, and market research within the acquisition community was for buying stuff. Why would those be separate? Why would they be divorced?

And so if you can document based on what's working, a lesson isn't learned until it's applied. And so if this is applied on a small scale and then applied in a large scale, we can turn that OODA loop a lot tighter, and we need to.

I'll also just add on that is it's a great question. But if you look at there's all these concepts and requirements that are already set that are unfulfilled. So, for example, the latest NDAA has all these requirements for everyone in DOD, not just service members, but civil servants to have access to courses on AI on lots of different things.

That hasn't been actioned. There are all these requirements for competency-based learning. The Army has the Army Learning Concept, which hasn't really been resourced in a serious way. There's the Advanced Distributed Learning Initiative and all the pathfinding that they've done. So the vision's there. It just hasn't been resourced and actioned by leadership.

And until we change the way we envision what education and training look like, and how we make it fit into the lives of professionals as they live it, whether they're civil servants or in uniform, we're not going to get ahead of what you're talking about. And I do think, to this point, there's an opportunity to do more human-machine teaming. It was the same region-beta paradox that we talked about before: just doing more stuff isn't going to help. We historically always cut training when we were over budget.

Right. And what can we do right now, where costs have dropped and we can show, again on our world-class alignment metrics, our WAMs: what is the impact of a squad that is training versus a squad that isn't, right? These are business cases, or mission cases, that are much clearer. We could also potentially, with the infusion of private capital and some of that happening to the left side,

redirect some of that. So in these divestments: hey, we don't need to do this anymore at all; we can divest this function. Like Hunted Labs, these cyber researchers who found something before we had to, right? This is potentially a change in our expenditures versus the value that we get. Now, what can we reallocate our funding for? Or what does that look like? Thinking about these in terms of portfolios and impact, I think, really does allow us to change the game. The other thing there is that there's an impediment to AI adoption, an emotional one in some cases. We just need to work through that. And so if anyone needs support or a push or tools, like,

Literally, like, let's work on that right now. This is a learning by doing situation. It's going to humble us all until we get really good at it, just like everything was, just like, I don't know, basic training was. So just pushing into that piece. And you don't need to be like a STEM lifer. We have these scholars from Noble Reach. They're awesome. They know the ins and outs. I'm an electrical engineer. Like, there are some things. You don't need that to be an expert at AI in terms of the usage piece.

And so we can all push ourselves in that direction. And I think that starts to move the needle as well. This is great. We're going to end the episode here. And thanks so much for joining us.

Thank you for being among our first listeners to our newest podcast, Cogs of War. I hope you join us again and help us spread the word. Tweet about it, or rather post about it on X. Share it on Blue Sky. Share it on LinkedIn. Post about it on Facebook. Everywhere you can. Email about it to your friends. Give us five stars or more if they let you on your podcast app of choice. Trust me, it makes a difference. Thank you so much for listening.

And thank you so much to our partners, Booz Allen Hamilton, for supporting this critical conversation. Thank you.