NEW YORK, February 5. /TASS/. Google has published a revised list of principles for the development and application of artificial intelligence (AI) technologies, dropping its earlier pledge not to use AI for weapons development or human surveillance, according to the updated document posted on the official Google AI website.
Google first outlined its principles for the use of AI technologies in 2018, in a post by CEO Sundar Pichai on the company's official blog. Among other things, the post stated that AI would not be used for "weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people." The company also pledged not to use AI to develop technologies that "gather or use information for surveillance violating internationally accepted norms" or "whose purpose contravenes widely accepted principles of international law and human rights."
The revised list contains no explicit language about weapons or surveillance technologies; the company promises only to respect "widely accepted principles of international law and human rights." In a statement posted on Google's official blog, two company executives explained the update by citing a rapidly changing "geopolitical landscape" and intensifying competition in AI technology, without addressing weapons or surveillance.
Several Google employees expressed concern about the policy revision. "It's deeply concerning to see Google drop its commitment to the ethical use of AI technology without input from its employees or the broader public, despite long-standing employee sentiment that the company should not be in the business of war," Parul Koul, a Google software engineer, told Wired magazine.
The magazine attributes the policy change in part to the inauguration of US President Donald Trump. However, Google spokesperson Alex Krasov told Wired that the revision had been prepared well before that.