‘We Have To, and Are Proud To’: Silicon Valley Embraces the U.S. Military


WASHINGTON, DC - JANUARY 21: OpenAI CEO Sam Altman, accompanied by U.S. President Donald Trump, Oracle co-founder, CTO and Executive Chairman Larry Ellison (R), and SoftBank CEO Masayoshi Son (2nd-R), speaks during a news conference in the Roosevelt Room of the White House on January 21, 2025 in Washington, DC. Trump announced an investment in artificial intelligence (AI) infrastructure and took questions on a range of topics including his presidential pardons of Jan. 6 defendants, the war in Ukraine, cryptocurrencies and other topics. (Photo by Andrew Harnik/Getty Images)

A decade ago, most major tech companies swore off working with the U.S. military. Google, Meta and OpenAI even once had policies banning the use of AI in weapons.

But times have changed, and now Silicon Valley is fully embracing contracts and collaborations with the military. Sheera Frenkel, tech reporter with the New York Times, explains how and why this shift occurred.

Some members of the KQED podcast team are represented by the Screen Actors Guild-American Federation of Television and Radio Artists, San Francisco-Northern California Local.


This is a computer-generated transcript. While our team has reviewed it, there may be errors.

Jessica Kariisa [00:00:00] I’m Jessica Kariisa, and welcome to The Bay, local news to keep you rooted.

Ambi [00:00:05] Ladies and gentlemen, welcome to the Army Jacket Ceremony and the Commissioning Ceremony for Detachment 201.

Jessica Kariisa [00:00:13] In June of this year, four current and former executives from Meta, OpenAI, and Palantir took center stage at a ceremony at Joint Base Myer-Henderson Hall in Arlington, Virginia. Wearing combat gear and boots, the executives were there for their swearing-in ceremony as lieutenant colonels in Detachment 201, a new unit created to advise the Army on new technology for use in combat.

Ambi [00:00:45] In an era defined by information warfare, automation, and digital disruption, the army needs skilled technologists in its ranks now more than ever.

Jessica Kariisa [00:00:58] Big tech has embraced the U.S. Military. It’s a dramatic shift from just a decade ago when most of Silicon Valley was firmly against helping the government wage war. These days, tech executives are singing a different tune.

Sheera Frenkel [00:01:14] You’re seeing a lot of posting about how great America is and how proud they are to be Americans doing business in America. That’s a shift and it’s really noticeable among the top executives.

Jessica Kariisa [00:01:30] Today, Sheera Frenkel from The New York Times talks with The Bay’s host, Ericka Cruz Guevarra, about how Silicon Valley changed its mind on working with the military.

Ericka Cruz Guevarra [00:01:51] Sheera, I guess how might you describe how tight Silicon Valley and the U.S. government, and the U.S. military in particular, are these days?

Sheera Frenkel [00:02:01] We are in a moment of exceptional closeness between the U.S. government and Silicon Valley, and that is really unusual. Silicon Valley had its origins with funding from the U.S. government. But until now, there has not been this kind of widespread, across-the-board move of Silicon Valley, you know, big companies, executives working closely with the U.S. military and having the kind of technology that's actually useful for them. This is a region that saw itself as liberal, progressive, independent, connecting the world. That was a big motto. This idea that it was really international and it was about the good of all humankind, and not something that was specifically wedded to kind of an American patriotism. There've been figures, there've been characters, there've been companies that have been public about their want and their need to work with the U.S. government, but as recently as a decade ago, there were widespread protests across Silicon Valley by the employee base at the idea of working closely with the government.

Ericka Cruz Guevarra [00:03:12] Yeah, don’t be evil, right, as Google used to say. And I’m thinking, you mentioned the protests, I’m thinking back to 2018 and Google when there were these mass protests by employees there around Google’s involvement in a Pentagon program, right? Can you just remind me of that era of Google, of this like don’t-be-evil sort of motto?

Sheera Frenkel [00:03:38] That was an era where people came to work at Google after graduating from the top universities in the United States. And as people in their early 20s, they saw it as this just really sort of do-good, do-positive-things-for-the-world kind of company. And executives fed into it, this idea of a bottom-up kind of culture: we listen to every employee, and if you guys protest, we want to hear about it.

News clip [00:04:03] A letter to Google CEO Sundar Pichai is signed by more than 3,000 Google workers. Here's what it says, quote, we believe Google should not be in the business of war, therefore we ask that Project Maven be canceled and that Google draft, publicize…

Sheera Frenkel [00:04:18] And so when Google employees came out en masse and said they did not want executives to pursue a contract with the U.S. government, with the Pentagon, executives listened and they backed down. And you saw employees at smaller companies across Silicon Valley taking note.

Ericka Cruz Guevarra [00:04:31] And I remember the protests not just being effective in stopping the collaboration with this program, but it literally became policy at Google to not pursue contracts with the U.S. military, right?

Sheera Frenkel [00:04:50] Three of the biggest companies, Meta, OpenAI, and Google, all changed their terms of service so that they would not work with the U.S. government and that, specifically, their AI technology wouldn't be used to help build defense systems. It was literally, we're going to create policy so that our systems can't be used for defense or for military purposes. That's how strongly these executives doubled down on what their employees were asking for.

Ericka Cruz Guevarra [00:05:16] Around this time, Sheera, is it fair to say that everyone in tech was pretty much against military contracts?

Sheera Frenkel [00:05:24] I wouldn't say everyone, because you had outliers. You had companies like Palantir, who were very outspoken about their work with the U.S. government. They, in fact, sued the Army to get a contract because they were so keen on being a tech company that was very out, very public, very aggressive about working with the U.S. military.

Alex Karp [00:05:47] And while there, you had the idea for Palantir? Yeah, well, you know, post 9-11, I think the idea, again, it was Silicon Valley ought to be involved in fighting terrorism and protecting our civil liberties.

Sheera Frenkel [00:05:59] Alex Karp, the CEO of Palantir, talks about the importance of working with the government all the time.

Alex Karp [00:06:05] We are kind of the greatest democracy in the world, and we tend to win wars where the people believe in what they’re doing. Where the people think that there’s a trade-off between civil liberties and fighting cyber terrorists, it’s going to be very hard to win.

Sheera Frenkel [00:06:17] I just remember how clear it was that they were outliers at that time to what the rest of kind of the Silicon Valley companies were feeling and doing and saying.

Ericka Cruz Guevarra [00:06:31] And for folks who maybe aren’t as familiar with Palantir, what do they do?

Sheera Frenkel [00:06:35] Palantir is a funny company in that they had a certain mysterious aura around them for a long time, and I think they encouraged that by not saying much about what they did. They build systems. They build data systems that can analyze data, that can process it, that can draw conclusions. For instance, they work across the U.S. federal government, and they'll come into a place and say, right, here is all the data you sit on. We are not just going to organize it for you, we're going to make it easy for you to visualize it, to analyze it, our AI will draw conclusions. So for a long time, they were used by police departments, for instance, or they were used by different intelligence services to help look at their own data and sort of be able to understand it, even if you were not necessarily a technically minded person.

Ericka Cruz Guevarra [00:07:23] I guess we're talking now because, as you were just saying, Palantir was sort of this outlier among tech companies, really one of the only ones working closely with the U.S. military, but increasingly they're a company other tech companies are becoming more and more jealous of these days, it seems like.

Sheera Frenkel [00:07:44] Yeah, it's really interesting. It's come full circle. All these tech companies that, you know, stepped away from the U.S. government are now looking at Palantir's incredibly lucrative contracts across the U.S. government. Each one of these contracts can be worth hundreds of millions of dollars. And once you are working with the U.S. government, they're pretty faithful as clients. So you're looking at these contracts that are going to give you amazing revenue year after year. And they want to work with American companies. They seek out American companies. And so I've heard some pretty senior executives at Meta and at Google say quite plainly, like, we're jealous. We wish we were in there sooner.

Ericka Cruz Guevarra [00:08:26] What exactly has changed here? Like, how did a company like Google go from "don't be evil" to now attempting, it looks like, to pursue contracts with the U.S. military? Like, what is this change?

Sheera Frenkel [00:08:39] I think an executive at Google would say, well, we've rethought what it looks like to be evil. A couple things have happened in the last five years or so that have shifted their view. I think primarily the war in Ukraine. Seeing the way that Russia and Ukraine have been fighting that war has really mobilized a lot of American executives into thinking that the U.S. Army is not ready to fight the kind of wars that get fought now. Tanks and fighter jets and all that are always going to be part of the U.S. military. But the way that drone warfare has shifted things, the way AI systems have shifted both the way militaries collect intelligence and choose targets and select how to act, all of that is not possible without the kind of technology companies and expertise you have in Silicon Valley. And so there's this sense of like, oh, well, if America goes to war and we're not there helping, we may not win. We've also seen a really radically shifting political climate in Silicon Valley. More and more executives have openly expressed support of Donald Trump and his administration. You hear a lot of people out here being like, well, I may not agree with everything that Trump does, but he's good for business and he's good for this. And you hear that kind of thing more and more. And so you have a certain willingness of executives to kind of come out and say, I want to work with Trump. I think it's positive for me and my company to work with him.

Ericka Cruz Guevarra [00:10:25] I also have to imagine that money plays a big role here. You mentioned how many of these military contracts have a pretty big price tag on them. I mean, what role do you think that plays? And I know the president too has pledged to spend a lot more on the military.

Sheera Frenkel [00:10:45] Trump wants to put into place budgets that are going to see a lot of money flowing to the kind of new technology that Silicon Valley can produce. And so if you're an executive out here, and not to name names, but you've decided to rename your company Meta because you think the metaverse is the future. And then people are kind of like, well, I don't know if I want to live in the metaverse. I'm not sure that I want AR and VR goggles. And then the U.S. military comes around and they're like, well, we'll buy half a billion dollars' worth of VR goggles because we want to train our soldiers on how to fight in war by putting them through battle scenarios. And suddenly there's a reason to name your company Meta. Suddenly there's an actual client that wants to buy all that. And so it makes a lot of business sense for these companies to lean in this way and find military applications for the technology they've been working on.

Ericka Cruz Guevarra [00:11:37] Yeah, you just mentioned Meta and these AR/VR goggles. I mean, what are some examples, I guess, of this shift that is happening in Silicon Valley? And I guess what, specifically, are tech executives saying?

Sheera Frenkel [00:11:53] You hear a lot of pride among tech executives that they're working this closely with the U.S. government. I like to look at their Instagram or their Threads or their X pages, because you can tell a lot by what they post. And if you look at them over the last, I'd say, year or so, you're seeing a lot of, like, American flags flying in the background of posts. You're seeing a lot of posting about how great America is and how proud they are to be Americans doing business in America.

Sam Altman [00:12:22] Of course, we have to and are proud to and really want to engage in national security areas.

Sheera Frenkel [00:12:29] Sam Altman, the CEO of OpenAI, has started talking about the importance of working with the U.S. Government just in the last year.

Sam Altman [00:12:36] Part of AI to benefit all of humanity very clearly involves supporting the US and our allies to uphold democratic values around the world and to keep us safe. And this is like an integral part of our mission. This is not some side quest that maybe we think about at some point.

Sheera Frenkel [00:12:55] That's a shift, and it's really noticeable among the top executives. That's something you're really seeing at the top, and I think there is a gulf here between what executives are saying and posting and feeling about all this, and what the workforce is feeling about the direction that their companies are taking. You've also seen a lot of contracts signed. You've seen companies like OpenAI partnering with Anduril to use their AI technology to create weapons of the future. The question now isn't whether the U.S. is going to have autonomous weapons. It's when will the U.S. have autonomous weapons, and how quickly will companies like Google, or OpenAI, or Microsoft be able to use and pivot their AI technology to create these weapons.

Ericka Cruz Guevarra [00:13:46] I mean, this is making me think about Google back in 2018, as we were talking about earlier, and the role that the employees at these companies played in pushing back against this working with the US military. Are we seeing that same kind of pushback by tech employees in Silicon Valley now?

Sheera Frenkel [00:14:08] We are not seeing the kind of loud public pushback that we saw a little less than a decade ago. I spoke to quite a few engineers and employees at tech companies that are working with the U.S. government who are worried. They're sitting there and going, well, I joined this company because I believed in the ethos of connecting the world or do no evil. And now, I don't know, I might be building an AI system that helps choose bombing targets faster for some future war in which we're, you know, launching aerial strikes. I just think there's this interesting moment where a lot of these people are asking themselves, do I feel good about the work I'm doing? But they're doing it quietly, to be clear, because the last few years have seen a lot of layoffs across the big companies. And a lot of these people are worried for their jobs.

Ericka Cruz Guevarra [00:14:57] And we’ve seen that over the issue of Israel and Palestine, for example, at some of these tech companies, right? That there is real pushback happening now from the top.

Sheera Frenkel [00:15:08] Very much so. And a couple of the employees I spoke to looked specifically at Gaza as an example of a very AI-driven war. I've written a lot about the systems that Israel built to be able to choose more targets to strike, to analyze intelligence quickly, you know, the facial recognition software that they're deploying across Gaza. All of these are the kinds of systems that America is thinking about building. And if you're an employee, you're looking at that and you're saying, is that the future of war?

Ericka Cruz Guevarra [00:15:40] I mean, Sheera, there's obviously this moral opposition here. But are there any other reasons why this collaboration between Silicon Valley and the U.S. military is a maybe concerning trend? I mean, I'm thinking about this technology and its use for surveillance in the U.S., potentially, even. What are the other concerns around this?

Sheera Frenkel [00:16:08] I think the concerns are that you can't put the genie back in the bottle. Technology can introduce different levels of surveillance that the U.S. government can then choose to use as it wants to, right? And so there's the question of how much more of a surveillance state the U.S. becomes. There are questions of, again, autonomous weapons. And every soldier I've met has talked about how the introduction of autonomous weapons removes one layer of humanity in war, and that when it is robots firing at robots, it's a very different war. And so there are people out there asking these questions of, do we want all these autonomous systems? What does that mean? Are we just making killing easier in the next conflict? And so, yes, anytime a technology is introduced, I think there's a rush to kind of embrace that new technology. And then often, a beat later, some would say a moment too late, there's the question of, is this good?


Ericka Cruz Guevarra [00:17:14] Well, Sheera, thank you so much for sharing your reporting with us.
