Resignations and protests at Google over a contract to supply its AI to a military project

Illustration: Jim Cooke (Photo: Getty)

It has been almost three months since Google employees learned of the company’s decision to develop artificial intelligence for a controversial military pilot program known as Project Maven. About a dozen employees have now resigned in protest.

The resigning employees’ reasons range from specific ethical concerns about the use of artificial intelligence for warfare to broader worries about Google’s policy decisions and the erosion of user trust that could result from them.

Many of the employees have written accounts of their decision to leave the company, and their stories have been compiled into an internal document, the contents of which have been described to Gizmodo by several sources. Several of them say that the company’s executives have become less transparent with employees about controversial business decisions, and seem less interested than before in listening to workers’ objections.

Google’s goal in Maven is to accelerate the analysis of drone footage by automatically classifying images of objects and people. Google is helping the Department of Defense apply machine learning to classify the images collected by drones. However, some employees believe that humans, not algorithms, should be responsible for that sensitive and potentially lethal work, and that Google should not be involved in military projects of any kind.

Historically, Google has promoted an open culture that encourages employees to question and debate company decisions, but some employees feel that leadership has become less attentive to those concerns.

“Over the past few months, I have been increasingly disappointed with the response and the way staff concerns are addressed and heard,” explains one of the employees who resigned.

There is precedent for employee pushback producing a policy change: in 2015, employees and users successfully challenged Google’s ban on sexually explicit content on Blogger. However, this is believed to be the first known mass resignation at Google in protest of a company business decision.

In addition to the resignations, nearly 4,000 Google employees have voiced their opposition to Project Maven in an internal petition that calls for the contract to be canceled immediately and for a policy against similar work in the future. Mounting employee pressure appears to have done little to change Google’s course. The company has defended its work on Maven and is believed to be a leading contender for another major Pentagon cloud computing contract: the Joint Enterprise Defense Infrastructure, known as JEDI, which is currently up for bid.

Employee demands that Google terminate its Pentagon contract are further complicated by the company’s claim that it is only providing open source software to Project Maven, meaning the military could continue using the technology even if Google stopped accepting payment or offering technical assistance. Even so, the employees who quit believe that Google’s work on Maven contradicts the company’s stated principles.

“It’s not like Google is this little machine learning startup that’s trying to find clients in different industries,” says one of the employees who resigned. “Google should stay out of these projects as a simple matter of reputation.” Many Google employees first learned that the company was working on Maven when news of the controversial project began to spread internally in late February. At the time, a Google spokesperson told Gizmodo that the company was in the process of drafting “safeguard policies” around its use of machine learning. That policy document has yet to materialize.

One employee explained that staff were promised an update on the ethics policy within a few weeks, but that progress appears to have stalled. Ethical concerns “should have been addressed before we entered this contract,” the employee adds. Google has emphasized that its artificial intelligence is not being used to kill, but its application in the Pentagon’s drone program still poses complex ethical and moral problems, both for tech workers and for academics who study machine learning.

In addition to the petition circulating inside Google, the Tech Workers Coalition launched its own petition in April demanding that Google abandon its work on Maven and that other major technology companies, including IBM and Amazon, refuse to work with the U.S. Department of Defense. “We can no longer ignore our industry’s and our technologies’ harmful biases, large-scale breaches of trust, and lack of ethical safeguards,” the petition reads. “These are life and death stakes.”

More than 90 experts in artificial intelligence, ethics, and computer science published an open letter today urging Google to end its work on Project Maven and to support an international treaty prohibiting autonomous weapons systems.

Peter Asaro and Lucy Suchman, two of the letter’s authors, have testified before the United Nations about autonomous weapons. A third author, Lilly Irani, is a professor and former Google employee. The company’s contributions to Project Maven could accelerate the development of fully autonomous weapons, Suchman tells Gizmodo. The letter says:

Although Google is headquartered in the United States, it has an obligation to protect its global user base, and that duty outweighs its alignment with any single nation’s military. If ethics on the part of technology companies requires considering who may benefit from a technology and who may be harmed, then no topic deserves more serious reflection, and no technology has higher stakes, than algorithms designed to kill at a distance and without public accountability.

Google has moved into military work without subjecting the decision to public debate or deliberation, either domestically or internationally. While Google regularly makes decisions about the future of technology without democratic public engagement, its entry into military technologies casts a spotlight on the problems of private control of the information infrastructure.

Google executives have made efforts to defend Project Maven to employees. At a meeting shortly after the project became public, Google Cloud CEO Diane Greene spoke in support of Project Maven. More recently, Greene and other employees have organized several sessions to discuss the project. These sessions featured speakers both for and against Maven, and highlighted the difficulty of drafting policies on the ethical use of machine learning.

Employees who decided to leave Google also cite other reputational concerns. The company’s recent political missteps, such as its sponsorship of the Conservative Political Action Conference and its struggles to address concerns about internal diversity, have also played a role.

“At some point, I realized I couldn’t recommend that anyone join Google, knowing what I knew. And I realized that if I can’t recommend that people join, why am I still here?” says a Google employee who resigned.

“I tried to remind myself that Google’s decisions are not my decisions, and that I’m not personally responsible for everything the company does. But I do feel responsible when I see something I should intervene in,” adds another.

A Google spokesperson said the following about Project Maven in April:

An important part of our culture is having employees actively participate in the work we do. We know that there are many open questions involved in the use of new technologies, so these conversations – with employees and external experts – are very important and beneficial.

Maven technology is used to flag images for human review, and is intended to save lives and spare people from extremely tedious work. Any military use of machine learning naturally raises valid concerns. As we move forward with our policies around the development and use of our machine learning technologies, we are actively engaged across the company and with outside experts in a thorough discussion of this important topic.

Google has not yet responded to our request for comment on the resignations. Meanwhile, employees want to see results, whether in the form of a new ethics policy, a canceled contract, or both.

“Actions speak louder than words, and that’s a standard I hold myself to as well,” says one of the employees who resigned. “I wasn’t satisfied just voicing my concerns internally. I had to do something, and the most consistent thing to do was leave.”
