Google workers want to end work on Defense Department drone project, cite 'Don't Be Evil' motto
Source: Mike Snider
Google CEO Sundar Pichai in Davos, Switzerland, on Jan. 24, 2018.
Markus Schreiber, AP
More than 3,000 Google employees have signed a letter asking management to end the company's involvement in Project Maven, a Defense Department drone surveillance project.
The employees, in a letter addressed to company CEO Sundar Pichai, say Google's assistance in developing the artificial intelligence-powered system to detect vehicles and other objects in video captured by military drones betrays the company's motto of "Don't Be Evil."
Google countered the employees' arguments, saying in a statement that the company's involvement is for "non-offensive purposes" and is used "to flag images for human review and is intended to save lives and save people from having to do highly tedious work."
Employees had previously voiced concerns about the project, which the Defense Department began in April 2017 to explore artificial intelligence, machine learning and big data enhancements, they say in the letter, which was first obtained by The New York Times.
Even though company management said Google-supplied technology would "not 'operate or fly drones' and 'will not be used to launch weapons,' " the employees' letter says "the technology is being built for the military, and once it’s delivered it could easily be used to assist in these tasks."
In the Project Maven letter, employees ask Google to cancel the project immediately and to "enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology."
Google is at the forefront among the big tech companies in developing artificial intelligence and machine learning, cutting-edge technology that makes use of powerful computers and huge data sets to match photos, simulate human conversation, predict user behavior and run self-driving cars. Some leaders, notably Tesla's Elon Musk and the late physicist Stephen Hawking, have warned that artificial intelligence without strong ethical guidelines could have catastrophic consequences.
The company said "any military use of machine learning naturally raises valid concerns" and said it's "actively engaged across the country in a comprehensive discussion of this important topic and also with outside experts" to develop policies around machine learning technologies.
This is just the latest issue to foment division between the company and some of its 78,000 employees. Recently, efforts to increase the diversity of its staff by recruiting and advancing more women and people of color have exposed fault lines in a workplace culture that encourages staff to "bring your whole self to work."
An internal culture war ignited when ex-Google engineer James Damore wrote a memo in August 2017 saying that women in technology were unlikely to succeed because, in general, they are more interested in people than ideas. He was fired, and in a subsequent suit against Google, Damore alleged the company discriminates against white conservative men.
Some employees who volunteer as internal diversity advocates have complained about online attacks after personal information and comments from internal company forums were included in Damore's suit and also leaked to far-right websites.