
A Military Contract Tests Google's Open Culture

2021/3/16

Land of the Giants

People

Chewy Shaw
Fei-Fei Li
Jeff Dean
Kate Conger
Kent Walker
Lieutenant General Jack Shanahan
Liz Fong-Jones
Meredith Whitaker
Robert Work
Sergey Brin
Host
Topics
Host: This episode explores the conflict between Google's open culture and Project Maven, the secret contract it signed with the Department of Defense, along with the employee protests and internal ethics debate that followed. The project aimed to use AI to analyze military image and video data, raising employees' fears that the technology could be used to harm people and calling Google's "don't be evil" principle into question.

Chewy Shaw: Google's corporate culture has long been known for openness and transparency; employees could freely access information and take part in company decisions. The secrecy around Maven clashed with that culture and stirred employee discontent.

Meredith Whitaker: As an ethics advocate inside Google, Whitaker organized employee opposition after learning about Maven and drafted a petition urging the company to stay out of the business of war. She argued Google should be held to a higher moral standard.

Liz Fong-Jones: Fong-Jones also raised concerns about Maven, arguing it conflicted with Google's "don't be evil" principle and weighed especially heavily on Middle Eastern and Muslim employees.

Lieutenant General Jack Shanahan: As the DoD lead on Maven, Shanahan explained that the project was meant to make analysis of military image and video data more efficient and stressed that AI would not make decisions to kill. He acknowledged, though, that the project's lack of transparency let rumors spread.

Fei-Fei Li: Li's emails showed Google executives more focused on avoiding bad press than on the ethics of Maven.

Kent Walker: Walker sought to explain why Google ended its Maven work while stressing that Google remained willing to partner with the Department of Defense on national security.

Jeff Dean: As Google's AI lead, Dean laid out the company's AI Principles, emphasizing that it would not build weapons or technologies whose primary purpose is to harm people.

Sergey Brin: Brin argued that Google engaging with the world's militaries could help make the world more peaceful.

Robert Work: Work felt communication between Google employees and Maven's leaders was too indirect, and noted that AI can save lives as well as cause harm, which poses a moral dilemma.

Kate Conger: As the reporter who broke the news of Google's involvement in Maven, Conger observed that Googlers' initial silence about the project reflected their trust in, and expectations of, the company.

Host: Google's open culture faces serious challenges as the company grows and the political climate becomes more polarized. In recent years, more and more Googlers have felt shut out of major company decisions, at odds with the openness Google has long promised.


Chapters
The episode explores the historical culture of openness at Google, highlighting the TGIF meetings and various internal communication tools that fostered a collaborative environment. This culture is contrasted with the current climate where employees feel increasingly excluded from major decisions, leading to a significant loss of trust.

Transcript


Silicon Valley Bank is still the SVB you know and trust. The SVB that delivers human-focused, specialized lending and financial solutions to their clients. The SVB that can help take you from startup to scale-up. The SVB that can help your runways lead to liftoff. The only difference? Silicon Valley Bank is now backed by the strength and stability of First Citizens Bank. Yes, SVB. Learn more at svb.com.

When you're running a small business, sometimes there are no words. But if you're looking to protect your small business, then there are definitely words that can help. Like a good neighbor, State Farm is there. And just like that, a State Farm agent can be there to help you choose the coverage that fits your needs. Whether your small business is growing or brand new, your State Farm agent is there to help. On the phone or in person. Like a good neighbor, State Farm is there.

Remember that list we saw posted at Google's Mountain View campus in our first episode? All that good stuff canceled because of COVID? Massage rooms, game rooms, gyms, pools, shower rooms... These are some of the lifestyle perks Google's known for. They're easy to envy, and they're easy to make fun of. But in my experience, when you ask Googlers what really makes their company's culture stand out, they start ticking through a very different list.

And until recently, at least, near the top would usually be something called TGIF. TGIF is what Google calls its all-hands meeting. Google used to host TGIFs every Friday. They started back in 1999, when all employees could fit in a conference room. Larry and Sergey were usually there. And for years, you could raise practically any question. From what's wrong with the bike navigation on maps, to what the company was doing to eliminate unconscious bias from its hiring practices.

To Chewy Shaw, who joined Google as a summer intern when he was still in college, TGIF felt like the perfect encapsulation of what made working at Google so special. Beyond, you know, the fact that you got to work on products that reach billions of people and you could tackle the world's biggest problems. There's a strong principle that we each are brought here as a large group of diverse, intelligent people.

because these are hard problems and we're moving as quickly as we can, but the only way that we can solve these problems correctly is by getting more voices to the table. And so whenever somebody has something that they think is going in the wrong direction, Larry and Sergey explicitly encouraged that. And not just theoretically: Google built actual tools to encourage the spirit of debate. Google wasn't a need-to-know culture, it was a want-to-know culture.

Take Google Drive, which of course Googlers use internally. By default, if you were at Google, you could just pop in and see your colleagues' work. Their memos, slides, code, spreadsheets, whatever. Unless it was a competitive trade secret, you could check it out. And that goes for teams you didn't even collaborate with directly, even executives.

Google also has tons of email lists. They're easy to make, easy to join, and tend to be really active. There's one for politics, one for expectant parents, and one simply called MISC for whatever. And then there are Google's internal websites.

And then there's the crowd favorite, MemeGen. MemeGen is an internal Google page where anyone can post a meme.

You can playfully dunk on management or riff on the latest tech controversy. MemeGen was actually just kind of a volunteer project by one worker who was like, I like making memes. We should have a place where we can make internal memes because, you know, there are some subjects that we can't talk about externally, but I want to make memes.

All these open channels, from TGIF to MemeGen, they're remarkable when you consider Google's scale. We're talking about a $1 trillion company. And still, Google's known for letting employees dig into the details of practically everything it does.

Contrast that with Apple, which has a notoriously secretive company culture, to the extent that different product teams often can't even talk to each other about what they're working on. For years, this want-to-know culture worked incredibly well for Google. Employees were able to collaborate and get up to speed extremely quickly. But as the company grew and filled with employees who had different opinions in an increasingly polarized political climate, Google's idealistic culture slammed into some hard realities.

Because it's one thing to host a radically transparent TGIF when Google employed a couple hundred or couple thousand people. It's another thing when Google is over 100,000 strong and news breaks of a secretive project that for some Googlers called into question the company's core mission.

This is Land of the Giants. I'm Alex Kantrowitz. And I'm Shireen Ghaffari. Like any one of the giants, Google is facing tough questions these days about its role spreading disinformation or claims that it neutralizes competition in ways that might hurt consumers. But Google faces another fundamental question that the other giants don't to the same degree.

Who's in charge here? Because for years, Google seemed to send a strong message to its employees that they were in charge. Or at least, they had eyes on the company's process and a seat at the table. But that's changed. Especially in the last couple years, more and more Googlers have felt cut out of the biggest decisions Google, the company, makes. And since Google's promise of openness was so core to its business, the potential loss of it feels like a seismic shift with existential consequences.

Over the next two episodes, we're going to tell the story of the limits of Google's once beloved, idealistic, open culture. And fittingly, we're going to start with one of Google's secrets. A decision Google covertly made to get into business with the Department of Defense.

Meredith Whitaker started working at Google in 2006. By 2017, she was a program manager in the cloud computing department. But beyond that role, she'd also founded the company's Open Research Group, a team with a mission to tackle thorny problems like data privacy and net neutrality. I was known for both being fairly critical and not toeing the company line on ethical issues already.

Whitaker was someone who wasn't afraid to challenge Google and the tech industry writ large to examine its impact on society. So she became a kind of informal activist to her colleagues. A lot of times people would come to me and say like, hey, something weird is happening on my team or, you know, I don't, I don't, I'm not comfortable with this decision. Can you advise me? One day in the fall of 2017, a colleague pinged her. Hey, you should know about this, she remembered the note said.

This is really concerning. Whitaker's colleague was referring to an already signed contract that only a select few inside Google even knew about, Project Maven. It was an explicit partnership with the Department of Defense. So right up front, what you need to know about Maven is it was part of a series of initiatives by the DoD to keep up with cutting-edge technology and get ahead in the AI arms race with countries like China. The DoD contracted Google to work on it. I have a very specific problem set.

And it's this, it's an avalanche of data that we are not capable of fully exploiting. Lieutenant General Jack Shanahan ran Maven on the DoD side. He explained the project's purpose in a keynote for the GPU Technology Conference in Washington in November 2017. Shanahan clicked to a slide of a truck being overtaken by a barreling mass of snow. So more people is not the solution to the problem. Better tools, tradecraft, and algorithms

are the solution to the problem. So that's what we're facing today. And that's why we generated Project Maven. Here's the vision. Here's our objective. Here's what I'm tasked to do: to deliver AI-based algorithms to tactical unmanned aerial systems by the end of 2018. In other words, the DoD's goal for Maven was to use AI to analyze massive amounts of image and video data from aerial vehicles operated by remote pilots way more efficiently than human beings could. Using that analysis, the military could better identify what was happening on the ground.

Shanahan was open about Maven's intentions, but he never mentioned Google. Two months earlier, Google had quietly won the contract, but there was no announcement. The company made it clear to the DoD that it was not interested in doing any press. So Project Maven was a public project, but Google's involvement with it was not, even to the majority of its own employees.

But toward the end of 2017, a whisper network had started to form among a small group of employees. And they were concerned: could this project be used to harm people? Google would later tell employees that the technology it contributed to Maven would only be used for "non-offensive purposes" — so, to gather intelligence on the ground, not to target humans. But especially since the project was still so secretive, Googlers in the whisper network had their doubts about that.

Liz Fong-Jones was a site reliability engineer in the Google Cloud department at the time, which is the same department that was doing work for Maven. So Google's involvement was personal to her. One of the things that drew me to Google was the "don't be evil" policy. That it was something that I thought, you know, everyone that I talked to believed in. And I wanted to believe in it too.

Like Whitaker, Fong-Jones was known for her advocacy work inside Google. She had a track record of mediating between executives and employees to make products more inclusive; for example, she'd pushed for anonymity to be an option on Google+ to protect vulnerable communities. For Fong-Jones and many other employees, there was an unofficial line in the sand that Google could be crossing. People who were working on Google infrastructure projects in Google Cloud

expected their efforts to be used for commercial purposes, for things like helping e-commerce sites scale up, for helping Snapchat, for helping, right? Like, it's a very different thing to ask someone to work on weapons systems compared to working on civilian applications. To be clear, not all Googlers disagreed with the idea of working on technology that would be used in warfare.

Plus, it's actually not new for Silicon Valley to work with the U.S. military. Cold War-era research grants from DARPA, which is part of the U.S. Department of Defense, basically funded the internet. And you might remember, Larry would even go to DARPA events to recruit talent. But particularly in the shadow of the faltering war on terror in the Middle East and controversial use of drone technology to support that, some engineers didn't want anything to do with the U.S. military or warfare, period.

The idea among them was: as long as you steered clear of major defense contractors like Lockheed Martin or Raytheon, your talents wouldn't be used for that. Which is why Maven was so unsettling to some of those Googlers who found out about it early on. Fong-Jones, Whitaker, and a handful of other in-the-know employees began raising concerns. So did a group of infrastructure engineers dubbed the Group of Nine. They were tasked with building a critical security clearance feature for the project.

But even as these various groups raised their concerns, most employees still didn't know Google had won the Maven contract. And you can't figure out whether something is evil if you don't know about it. So I think Liz posted something on the

internal G+. Okay, this was huge. In February 2018, Liz Fong-Jones posted a note on her internal Google+ page — yes, that stayed alive for employees — and that note changed everything. She explained what she knew about Project Maven, and more importantly, asked questions about what she didn't know. The note was visible to all of Google's over 100,000 employees.

I need to give you a quick disclaimer here. By all accounts, Fong-Jones is a major part of this story. But she can't talk about the specifics of her note because she signed an NDA when she left Google. So it's based on outside reporting and our interviews with other Googlers that we know Fong-Jones' note on Google+ was when momentum started to build among a larger swath of employees questioning Maven.

Enter Diane Greene, who at the time was the head of Google's cloud computing department. So remember, the department that was working on Maven. And, this is important to know, also a department struggling in a much larger competition among the giants. Google Cloud's biggest competitors were Amazon Web Services and Microsoft Azure. And Google was a distant third. Like, really distant. But anyway, after Fong-Jones' Google+ note,

Greene posted her own, assuring employees that Maven was just a $9 million contract. Pocket change. This is according to reporting from Wired. For Whitaker, though, that piddly price tag raised even more questions. And that's where I got really, really worried. Because by late February 2018, Whitaker said she'd reached out to people she knew in the defense world in D.C.,

And she found out about a DoD tech contract that was orders of magnitude bigger than Maven that would be up for grabs. A contract called JEDI worth up to $10 billion.

JEDI was a winner-take-all project that would give a single vendor the job of building nearly all of the Pentagon's cloud computing infrastructure. It's good business to score if you're a distant third in the industry. And indeed, Maven was very clearly, from the folks I was talking to, a kind of try-before-you-buy contract because Google was not a tried-and-tested military contractor at this point.

To be clear, there's no evidence that the Maven contract explicitly put Google in the running for JEDI. But for those opposed to Google's work on Maven, it was a real fear. As more and more employees found out about Maven inside the company… I had gotten what I would consider a very bad tip.

All the person told me was that I should look into something called Maven. Kate Conger is a technology reporter for The New York Times. Back in early 2018, she was covering cybersecurity for Gizmodo when she received this mysterious tip. She started asking her sources within Google about it, and something really telling happened.

A lot of my sources I've known for a very long time, and some of them can be very chatty and comfortable. And I was having a really hard time making headway on this story. And so I was like, well, it must be a really big deal if no one will tell me about it. This makes a lot of sense to me because I've experienced the same thing on big stories about Google. The reticence Conger hit up against says so much about the faith Googlers still had in the company then. There were a lot of employees who kind of wanted to give the benefit of the doubt to the company and wait for an explanation

before making those things public because I think there was a feeling of like,

Surely we don't have the full picture here or you must be confused or mistaken or like someone in leadership is going to come along and explain what's happening. This was that weird googly mix of naivete and self-regard. Like there's got to be an explanation, a way to keep all of this in the family and fix it. This is just sort of antithetical with what we know about the company and our vision of the company. And someone is going to come along at some point who can kind of explain that cognitive dissonance.

But management's explanations weren't enough to stop people from worrying. So back inside Google, Whitaker upped the stakes around February 2018. You know, we can't keep asking, right? Like, we can't keep asking nicely. So I took the lead on writing a petition, and then the petition sort of, you know, got the facts out there into the Google bloodstream a little more.

Along with a few other Googlers, Whitaker drafted a petition against Maven using, of course, Google Drive, which you'll remember was open. So any employee with a link could dip into this doc while it was still being written. I remember when we were finishing up the Maven letter, seeing Kent Walker appear in the doc and my very quickly closing all the comments to be like, fuck, get it out of here because Kent just found out about it.

Kent Walker was Google's top lawyer. So here was this super senior guy who backed the project. And because of Google's open culture, he could drop in as his more junior colleagues mounted the case against it. Whitaker knew they better get that petition out fast. When they did finish the petition, they addressed it to Sundar Pichai, CEO. It read, Dear Sundar, we believe that Google should not be in the business of war.

The letter went on to make the case that Maven could harm Google's brand and its ability to find talent, that it would put the company in the league of defense contractors like Palantir and Raytheon, and that Google stood out from other major tech companies because of its, quote, "unique history," its motto "Don't be evil," and its direct reach into the lives of billions of users, unquote.

Basically, the letter held Google to a higher moral standard than its competitors like Microsoft or Amazon. And it quickly gathered hundreds and then ultimately 4,000 signatures. Which for Whitaker, on the one hand, was great. But on the other hand, terrifying.

And I remember being out with friends and just breaking down crying because I had just realized that what I was doing was like, I just stepped on the bee's nest. And so it's the military and Google, right? Like the heart of empire. You know, there were moments of just sheer terror. And, you know, it was also clear that like, you can't punch soft, right? Like if you've already punched, like you have to keep going.

As things were escalating, Kent Walker and other managers sat down with Whitaker and a few of her colleagues who were against Maven to hear them out. And so every meeting, like this is where the word thoughtful got completely run into the ground and destroyed. So, you know, we're being very thoughtful about this complex set of issues. Y'all just wait and we can't rush them because rushing ethics means we're not being thoughtful. And it's classic, right? Like this is too complicated. Only the wise men

in executive positions are prepared to sort of, you know, survey the whole field. So we've heard your concerns, but you know, implicitly you are not in a position to actually understand everything. We are, so we're going to take your concerns back. We're going to act on them. We'll have the benefit of being able to say we heard you without the obligation to, you know, have acted on what we heard.

I want to point out, Google's leadership didn't have to host any of these discussions. It does speak to the company's commitment to its open culture that it even held these meetings with Maven's detractors. This feels like Google acknowledging that the fate of this project was about a lot more than Maven. It would set a precedent for the way that Google works with the military for years to come.

At the time, management had several distinct points in its defense of Maven that it brought up in TGIFs, listening sessions, and internal notes. One big one: this technology wouldn't be used to help drones target actual human beings.

They said that you couldn't go down to the person level and do sort of surveillance on people. This was really just sort of, you know, basic recon, nothing to worry about here. But then, Whitaker says, inquisitive Google employees went into Maven's code base and saw evidence that seemed to contradict what executives were saying. Like Whitaker recalled, there was indeed a class of code for, quote, persons. Employees discussed their findings on mailing lists.

So this was also a time where, you know, the code was much more open. That was how Google worked. Another important defense for management: they said Google wasn't doing anything tailor-made for the U.S. military. It was just giving them access to code that was already out there.

The execs were sort of trying to get ahead of it by making claims like, you know, it's actually not a custom model. We're just selling them off-the-shelf software. So then, you know, there's someone who writes in the AI team who writes a doc on, you know, this is why the term custom model can't be trusted. It was kind of a game of whack-a-mole. But if you ask former Deputy Secretary of Defense Robert Work, the Pentagon official who issued the directive for Project Maven in the first place, it was also kind of a game of telephone.

I cannot find any evidence that the employees asked members of Project Maven to come and talk with them directly. This was Robert Work on a site called Government Matters. It was an indirect conversation between the employees and the leaders of the company. And I've got to say, you know, I think their position presents a moral dilemma because AI has the possibility of saving as many lives as it possibly can harm.

Because, in theory, AI could make drones more precise and cause less collateral damage. Sergey himself made a related argument at a TGIF meeting as the Maven controversy unfolded.

That's according to a couple of Google employees cited in the New York Times. Reportedly, Sergey said that if Google was at the table with the world's militaries, it would make that world a more peaceful place. And according to Work, the Department of Defense has said that AI would never make a decision to kill someone, that a human would always step in to make that call. But for many of the Googlers against Maven, anything involving drone technology was incredibly controversial.

Because AI-powered or not, drone warfare causes casualties and puts a sense of distance between those who do the killing and the people they kill.

Fong-Jones said she felt an obligation to speak up for her colleagues. I know that a number of Middle Eastern and Muslim employees were absolutely aghast at hearing about Project Maven. We know from public reporting that the unmanned aerial vehicles surveilling the Middle East were surveilling people, you know, who are civilians, who are just walking around doing their daily business, driving cars, and that drones were tracking them.

And that drones were even firing missiles at weddings. I think that's kind of the place where a lot of Googlers drew the line and said, "I don't want to be a part of this, and I'm upset that you made me a part of this whether I want to or not." And soon, this debate about drone warfare and Google wouldn't just be internal anymore. After the break, what happens when the press gets a view straight into Google's controversy?

Support for Land of the Giants comes from Quince. The summer is not quite over yet, but shifting your wardrobe to the colder months could start now, little by little. You can update your closet without breaking the bank with Quince. They offer a variety of timeless, high-quality items. Quince has cashmere sweaters from $50, pants for every occasion, and washable silk tops.

And it's not just clothes. They have premium luggage options and high-quality bedding, too. Quince's luxury essentials are all priced 50% to 80% less than similar brands. I've checked out Quince for myself, picking up a hand-woven Italian leather clutch for my mom. As soon as she saw it, she commented on how soft and pretty the leather was and told me it was the perfect size for everything she needs to carry out on a quick shopping trip.

Make switching seasons a breeze with Quince's high-quality closet essentials. Go to quince.com slash giants for free shipping on your order and 365-day returns. That's Q-U-I-N-C-E dot com slash giants to get free shipping and 365-day returns. quince.com slash giants

On September 28th, the Global Citizen Festival will gather thousands of people who took action to end extreme poverty. Watch Post Malone, Doja Cat, Lisa, Jelly Roll, and Raul Alejandro as they take the stage with world leaders and activists to defeat poverty, defend the planet, and demand equity. Download the Global Citizen app to watch live. Learn more at globalcitizen.org.

In March 2018, Kate Conger and her Gizmodo colleague Del Cameron broke the news about Google's involvement with Maven. Now the world knew of the fight happening inside Google. This got kind of hotter and hotter. Meredith Whitaker remembers Googlers were raging on Memegen, bringing up hostile questions about the project at TGIFs, signing on to the petition. So in April 2018, roughly six months after Whitaker found out about the project...

Google management announced an internal global town hall. It would be all about Maven. And like 36 hours before we're supposed to go on stage, I got an email from Diane Greene saying, "We want to participate in this." Diane Greene, one of the top executives leading Maven. Google also invited Vint Cerf, a VP and renowned computer scientist who's considered one of the founders of the internet. Cerf had serious street cred. Whitaker said it ended up being four people defending Maven versus her and one other colleague opposing the project.

Whitaker prepared the case against Maven practically overnight. It was a frenzy. I arrive in California, like, nighttime, sleep in a weird hotel, wake up at, like, 6 a.m. Yeah, it was kind of face-off with the execs, and, like, I just...

They continued talking over me, so I remember having to just do this almost like filibuster technique where I would just keep talking. She talked about the issues with training data, testing, how hard it would be to guarantee the technology was being used ethically since it would all be classified. Issues with tying our bottom line to one country's military objectives when we were a multinational, issues with the drone war being illegal and the number of civilian casualties.

issues with going down this road to provide this kind of infrastructure. It was just, like, kind of an onslaught of arguments against this. And then what was coming back were sort of arguments like, well, a hammer is also a tool. It can be used for good and it can be used for ill. And I was like, wait, what? Like that's where we're at. I mean, like on the merits, we won.

By Whitaker's account, the town hall was a turning point where internal opposition to Maven really grew. Which I don't take sole credit for. I think there was a lot of work that went into that. And it was also just lucky for us that I think at that point Google thought what we were doing was kind of plucky and quaint, right? They didn't get the sort of seriousness with which we were taking this or the goal to build real worker power.

This town hall may have felt to Whitaker like a sign of Google's carelessness about opposition to Maven, but in another way, this was Google's management staying true to its ethos of debate. Google invited Whitaker on stage to air out her arguments to the entire company. This kind of event would probably never happen at Microsoft or Amazon. But if the idea was to cool down the tension over Maven by opening space for argument, pretty much the opposite happened.

A little over a month after the town hall, hundreds of technology researchers outside Google signed a petition against Maven. The list included some of the most esteemed academics in the field, including Larry's old Stanford advisor, Terry Winograd.

But then came another PR crisis in the ongoing maelstrom for Google. In late May 2018, The New York Times published internal emails from Google's upper management discussing Maven. We could read some of them if you want. Kate Conger reported on these emails too. Here she's reading one from September 2017 written by Fei-Fei Li, who was then the chief scientist for AI inside Google Cloud.

She wrote, "It's so exciting that we are close to getting Maven. That would be a great win. I think we should do a good PR on the story of DoD collaborating with GCP from a vanilla cloud technology angle, storage, network security, etc. But avoid at all costs any mention or implication of AI."

Google is already battling with privacy issues when it comes to AI and data. I don't know what would happen if the media starts picking up a theme that Google is secretly building AI weapons or AI technologies to enable weapons for the defense industry. The emails showed Google's leadership appearing to strategize about how to avoid criticism. But the question many were asking was: were they sufficiently considering the ethics of Google potentially making technology applied for warfare?

I don't have access to every email that's sent or received within Google. But from the correspondence that I saw, I didn't see a lot of debate over the ethics of the technology, whether or not it would be good to be building weapons for the Defense Department or not. The debate was really about public perception and the fact that the public

would be very concerned to hear about this. And so it should be kept quiet. But I didn't see any kind of deeper delving into why the public would feel concerned. Li did not respond to our request for comment. Back in 2018, she gave a statement to the New York Times that said, it is deeply against my principles to work on any project that I think is to weaponize AI. On June 1st, 2018, two days after the New York Times published the emails...

I was at my desk, and there was a Cloud all-hands. I wasn't actually watching, right? I was doing some other work. And then, like, my DMs and my Signal and everything lit up and was like, they're dropping Maven. And then suddenly like I checked Twitter and it's already on Twitter. And I'm like, whoa.

It's not exactly clear what the final straw was, but after months of pressure, Google announced it would not renew its contract with the Department of Defense on Project Maven. It's one of those interesting moments where I'm like, I was having, I was like, yes, this is a big pivotal moment. But like it was happening through all the mundane channels that everything else happens through. So like I like got up and walked around. I was like, holy shit, they did it. The contract wasn't fully canceled. Google saw out its obligation to the DoD for another year.

But even with that caveat, this was a historic moment. A relatively small number of tech employees had derailed what might have been the beginning of a major partnership between the U.S. military and one of the most powerful tech companies in the world. It's one of the first times I feel like a lot of tech workers started to have this conversation about technology.

Whether what they were doing at work was a net good for society, right? You know, people who are building cloud products that feel very generic and feel like they don't have societal implications suddenly being like, wait a minute, like, what am I building? And what are the broader implications of that?

Yes, if you ask us to violate our code of conduct as engineers, then we're going to refuse to work on the project. And if you want to fire us, you can fire us. For Fong-Jones, this was a moment when tech employees became aware of their own power. Skilled engineers had leverage. They could walk away. There is actually a significant risk to Google's business if they hadn't canceled the contract.

That group of nine critical infrastructure employees we talked about earlier? At some point, they refused to work on the project, according to reporting from Bloomberg. And ultimately, about a dozen Google employees resigned in protest of Maven. In particular, the set of employees who resigned or threatened to resign over this were primarily employees with very long tenures and very large amounts of operational expertise.

Executives seemed in the end to listen to their employees and their concerns. And less than a week after making its big Maven announcement, Google issued a document that established a new course for the company going forward, a set of guidelines on the way it would ethically work on artificial intelligence projects called the AI Principles. One of the Googlers who shaped these guidelines was a top engineer named Jeff Dean.

You're going to be hearing a lot more about Dean as our Google story unfolds, but for now, you just need to know he's Google's head of AI. And in Google's employee level system, which goes from 1 to 10, Dean is actually ranked an 11. So the guy's words carry weight.

During the peak of the Maven controversy, he actually spoke against using machine learning for autonomous weapons at a Google conference. When we interviewed Dean, he talked through the AI principles with us. We have four areas, including development of autonomous weapons as one of them, that we definitely will not work on as a company.

The principles said Google wouldn't help build weapons or other technologies whose "principal purpose or implementation is to cause or directly facilitate injury to people." I think that's a really, really important thing for us to sort of communicate to the world as other people are also starting to think about using machine learning in more and more places.

they can at least look at what we've come up with as we've been thinking about this and say, okay, yeah, that makes sense. Let's not understate the significance of Google releasing these AI principles. It was an ethics statement on how Google would handle AI development moving forward. And it wasn't without risk. For one, this clear statement put Google at a competitive disadvantage when it came to going after major military contracts. Second, pulling away from Maven opened Google up to the accusation that it wasn't patriotic.

It pissed off some politicians who thought Google should give unwavering support to its own country. And as you are facing potential regulation, that's not the group you want to cross.

Diane Greene declined to comment for this episode. Through Google, Kent Walker also declined an interview. And Google declined to comment for this podcast on claims or criticisms of its handling of Project Maven. It did provide a general statement about the project's ending, which says, In 2018, we announced we would not be renewing our image recognition contract with the U.S. Department of Defense connected with Project Maven.

The contract expired in March 2019, and we have worked closely with the DoD to make the transition in a way that is consistent with our AI principles and contractual commitments. The company's cloud division continues to provide its technology to support the U.S. military on projects such as training Air Force pilots and using drones to inspect Navy vessels. These are the types of tech government partnerships Eric Schmidt, Google's former CEO, has long advocated for. I'm joined with two close friends of mine.

And I'm probably the only person who can say this in the entire world. I work with and for both of them. This was Schmidt moderating a talk at the National Security Commission on AI conference in November 2019. Schmidt had left Google by then, but when the Maven deal was signed, he was still executive chairman of Alphabet, Google's parent company.

The two close friends Schmidt was interviewing here were Google's General Counsel Kent Walker and Lieutenant General Jack Shanahan, who was the director on Project Maven. This talk happened about a year and a half after Google announced it would not renew its Maven contract. But it was still all about Maven. Now Schmidt was on stage seemingly trying to repair the damage that Maven had done to Google's standing in the defense industry. Or rather, letting Kent Walker try to repair that damage. And man, did Walker try.

The topic of today's panel, private-public partnerships, is extraordinarily important to me. I grew up in this community. My father was in the service for 24 years. I was born and spent the first years of my life on U.S. military bases. It's clear Walker knew his audience. He was in front of a bunch of military people. This was not a Google TGIF. With regard to the more general question of national security and our engagement in the Maven project,

It is an area where it's right that we decided to press the reset button until we had an opportunity to develop our own set of AI principles, our own work with regard to internal standards and review processes. But that was a decision focused on a discrete contract, not a broader statement about our willingness or our history of working with the Department of Defense and the National Security Administration.

We continue to do that. We are committed to doing that. And that work builds on a long tradition of work throughout the Valley on national security generally. We are a proud American company. We are committed to the cause of national defense for the United States of America, for our allies, and for peace and safety and security in the world. We are looking forward to working more closely together in the future.

Walker was here to set the record straight. Google was still open for business with the US military. Schmidt also opened the floor for Shanahan to reflect on Maven. And Shanahan was a little bit more candid about what happened from his point of view. We got tremendous support for Project Maven from Google. What we found though, and this is really the critique on both sides, is we lost the narrative very quickly.

Shanahan retired from the Department of Defense last year, but agreed to speak to us for this episode about what happened with Maven from the very beginning of its contract with Google. We were honestly stunned that Google was...

demonstrating a willingness to work with us on this project, because we knew the highest percentage of AI talent in the world resided in the company at the time. We want the world's best talent working on these problem sets. We need it because other countries are doing it. Shanahan was optimistic about what Project Maven could achieve with Google's engineering support.

So he was particularly disappointed when it exploded in controversy. There was just not enough transparency. And when there's a vacuum, rumors fill the vacuum. And I think the rumors in this case became malignant. And people assumed that there was much more to the project than there really was.

Like employees concerned that Google's work was being used for lethal purposes. Now, there's a gray area here. Shanahan reiterated that at the time, Google's work was not being used to guide weapons. But he did acknowledge that it would be hard to ensure intelligence gathered by the project would never be used to cause harm. You could use the term of what's the chain of custody with that information? Where does it end up? Maybe it could be used six months down the road. But that was the ongoing discussion with Google.

Shanahan says he actually suggested he could talk to Maven's detractors to better explain the project, but Google told him that was a bad idea. When he got word that Google wasn't renewing its contract: I was extremely disappointed by the decision.

But it had spiraled to a place that it was going to be helpful to nobody to keep it going. So I don't know if this is the right term to use, but they needed breathing room in the company. They needed to talk about it internally to the company. On the one hand, Shanahan felt Google should have been transparent with its employees about Maven from day one. On the other hand, he wonders if at some point Google should have exerted more control over all that talk.

Part of me says, look, who's running the company here? Is it 100 employees or is the CEO going to make a decision about what the company's doing? That's a big decision and

They got convoluted. And that's why they ended up pulling out. They could have just said, we're going to do this. If you don't like it, you're free to leave the company or you don't have to work on Project Maven. So our feeling, so I'm giving you my Department of Defense sense of the world, is you're kidding me. Who's running the company right now? And so that was just the way we looked at it. Just say it for what it is.

But Google, of course, has always liked having a different sense of the world, a sense where debate is supposed to be worth it.

In the months after Maven, employee activism at Google would surge. A growing group of impassioned workers would speak out against other internal controversies, from sexual misconduct in its upper ranks to contracts with U.S. immigration agencies to a secret project with the Chinese government. And management would start limiting the free-for-all environment of employee communication that it touts as a key to its success.

Greene and Li would leave Google in the months after Maven. Whitaker and Fong-Jones would also leave the company in the next year and a half, but not before organizing on other issues. Chewy Shaw, the longtime Googler you heard from at the top, he would go on to become one of the leaders of the new Alphabet Workers Union. There would be more fights to come.

In the past, Google's biggest challenges had come from its competitors, especially from other tech giants like Microsoft, Apple, and Amazon. But Maven helped usher in a new era, one in which Google seemed to be at war with itself. Next week, we'll tell the stories of a few key Googlers who left the company under circumstances they did not expect, including Dr. Timnit Gebru, who was the co-lead of Google's ethical AI team.

Google says she resigned from the company in December 2020. Gebru says she was fired. I don't know how to describe how I felt. I don't remember. I was just in the, you know, I just felt I had to act. Grouped together, these former employees' stories chart a timeline of the company's gradual clampdown on its open culture. I know it's like terrible, but the truth is, like, the one silver lining of it is that every time they fire someone like this, it just adds...

more proof to the column that says, well, is this isolated or is this systemic? Special thanks to Kate Conger, who directed us to that talk with Eric Schmidt. And thanks to Peter W. Singer for helping us better understand the defense context of the story.

Thank you.

Art Chung is our showrunner. Nishat Kurwa is our executive producer. I'm Alex Kantrowitz. You can check out my weekly interview series, Big Technology Podcast, on your favorite podcast app. And I'm Shireen Ghaffari. If you like this episode, leave us a rating and review on Apple Podcasts and tell a friend. And subscribe to hear our next episode when it drops.