
Google famously used the slogan, “Don’t be evil,” to guide its business practices. 

However, many Google employees were upset when they learned that the company had partnered with the U.S. Department of Defense on Project Maven, whose goal was to produce AI that could track people and vehicles. 

Is it immoral for a tech company to partner with the military to create war technology? Or is it immoral not to?

ANNA BUTRICO: Let's just say Doomsday arrived. The worst of the worst came knocking on the door. How would we think about the tech company's role in partnering with the military? Is it immoral for a tech company to partner up with the military to create war tech? But in the same breath, we can ask ourselves a very connected question: Is it immoral not to? 

The United States has been, and will continue to be, in an AI arms race. It truly is a race to an undetermined finish: who can provide the best technologies at the right time to achieve the best objectives. The United States has to consider whether its own narratives about these technologies, how we use them, how we promote them, are putting us at a disadvantage against an enemy that's asking similar questions.

My name is Anna Butrico. I recently co-authored a book with General Stanley McChrystal, entitled "Risk: A User's Guide." And I work at McChrystal Group. 

Google's "Don't Be Evil" value came from a discussion, really at the beginning, in the inception of the company, about what Google's values were. Paul Buchheit, who was the 23rd employee of Google, recommended, in kind of a joking manner, that "Don't Be Evil" should be one of the company's values. Other companies were trying to take advantage of their users in putting advertisements in their searches. When doing so, they'd often make money off of those searches and the advertisements connected to them. Google didn't wanna do that. 

They did not want to be sneaky with their users and take advantage of them in order to profit. Google employees believed that they wanted to do good: that with their life-saving and enabling technologies, they were intentionally not doing harm but rather helping the world. The word evil in the "Don't Be Evil" phrase is a bit problematic, because what is 'evil'? For Google, initially, being evil was sneaking in advertisements. But later, evil became a question of how Google was using its technologies in the world and in national security. Project Maven was a project from the Department of Defense that aimed to use Google's technology for its own purposes. 

Specifically, they wanted to use Google's deep learning and artificial intelligence to be able to track vehicles and track people. Google employees originally did not really know about Project Maven. Parts of the business definitely did, but it wasn't made hugely public. So initially, when people heard about Project Maven, they investigated, they put their own feelers out to read code and see what was going on. Google employees felt that they had a culture of 'sanctioned activism.' 

So it was Google-y to speak up and protest and speak your mind if you felt something was inappropriate, or that it contradicted Google's "Don't Be Evil" value. When they did eventually find out what was happening, there was great unrest among the employees of Google. There were petitions, there were protests. They wrote a letter to CEO Sundar Pichai, asking him to reconsider Project Maven. And when he didn't initially, many were upset and resigned from Google as a result. They were very frustrated that the narrative that they so believed in at Google was not being honored in this project. 

TV REPORTER: A giant Army B-24 bomber arrives for inspection by 78-year-old Henry Ford. 

The relationship between private industry and the military was pretty straightforward in World War II. Ford, for example, would provide vehicles, machinery, etc., to the military in a very easy, expected, straightforward way. These days, it gets a bit more complicated, particularly with tech companies like Google. They provide a wide array of services that range from simple messaging to the more complex gathering and passing of data and information at great scale. But with that increased capability, in an organization as big and complex as Google, come increased risks when we're lending those services to the military. 

RONALD REAGAN: To ignore the facts of history and the aggressive impulses of an evil empire. 

ANNA BUTRICO: Evil is in the eye of the beholder. GEORGE W. BUSH: To defeat this evil. ANNA BUTRICO: How do you define your terms? NAJI SABRI: Those who are fond of exporting evil. ANNA BUTRICO: It's hard for me to say whether or not Google's partnership with the Department of Defense was, in itself, evil. It's up for your own interpretation or debate. At the end of the day, it all comes down to narrative. When the huge narrative shifts in a time of great crisis, then the terms of what is evil, what is not, what is accepted, and what we can't tolerate will change. 

Think about the September 11th attacks: That was a horrendous day for the United States, and it immediately changed the narrative of how we perceive ourselves, how we perceive our efforts, and what we would do for the sake of protecting our country.
