It’s time to break up big tech, warns Artificial Intelligence pioneer
A pioneer in artificial intelligence says the time has come to break up what he calls “Big Tech,” embodied by three principal AI leaders — Google, Facebook, and Apple — because they are becoming a “threat to democracy.”
Source: JD Heyes
As reported by Axios, the remarks by Yoshua Bengio — a leader in the field, a professor at the University of Montreal, and a member of the “Canadian Mafia” that broke new ground in machine learning, the foundation of modern AI — are noteworthy because of his influence throughout the industry.
But they are also significant because he and many of his peers lead or consult for Big Tech’s AI programs.
“Concentration of wealth leads to concentration of power,” he said, as Axios reports. “That’s one reason why monopoly is dangerous. It’s dangerous for democracy.”
Bengio is a consultant for tech giant IBM. His colleagues Geoffrey Hinton and Yann LeCun also consult for Big Tech — Google and Facebook, respectively. And a protege of Hinton, Ruslan Salakhutdinov, runs Apple’s AI research program.
Since there are several companies competing in AI research, it’s not likely any one of them would currently run afoul of U.S. antitrust law’s prohibition on monopolies. But Bengio is correct in stating that the AI industry is becoming more concentrated.
In fact, Bengio said that the concentration of resources, knowledge, and talent among a few giant technology corporations is growing, and that only government action can prevent further consolidation. “We need to create a more level playing field for people and companies,” Bengio told Axios at an AI conference in Toronto recently.
The site noted further:
In recent years, Apple, Facebook, Google and Microsoft have amassed a towering lead in AI research. But now, they are subject to growing scrutiny because of their outsized influence on society, politics and the economy. I asked Bengio if the companies should be broken up. He harrumphed and responded that anti-trust laws should be enforced. “Governments have become so meek in front of companies,” he said.
“AI is a technology that naturally lends itself to a winner take all,” Bengio added. “The country and company that dominates the technology will gain more power with time. More data and a larger customer base gives you an advantage that is hard to dislodge. Scientists want to go to the best places. The company with the best research labs will attract the best talent. It becomes a concentration of wealth and power.”
Bengio is only the latest to warn about the dangers of AI in general, let alone the concentration of AI knowledge and research. (Related: Artificial Intelligence ‘more dangerous than nukes,’ warns technology pioneer Elon Musk.)
But tech experts aren’t the only ones warning about the rise of AI. Russian President Vladimir Putin recently predicted that whoever managed to perfect the technology could end up ruling the world. As reported by Robotics.news:
Addressing students last week, Vladimir Putin said that there are legitimate concerns about AI and that its development will produce “colossal opportunities and threats that are difficult to predict now.”
Going further, Putin warned that “the one who becomes the leader in this sphere will be the ruler of the world.”
He added: “Artificial intelligence is the future, not only for Russia but for all of humankind.”
Perhaps sensing the kind of power a company — or even a few companies — could amass if they became the first to fully develop AI, Putin warned that such technology should not become “monopolized” (sound familiar?), hinting that Russia would share the technology if it develops AI first.
“If we become leaders in this area, we will share this know-how with the entire world, the same way we share our nuclear technologies now,” he told students around the country, via satellite link-up, as he spoke to them from the Yaroslavl region.
Frankly, I wouldn’t buy that, but the points are well made: AI may become too powerful for one company — or three or four — to contain.