Google’s woke AI wasn’t a mistake, former engineers say – Washington Times


In a recent unveiling of Google's latest AI tool, Gemini, users and observers were left stunned by the images it produced.

When prompted to provide images for terms like "Nazis," "knights" and "popes," the responses deviated sharply from historical accuracy, showing a clear ideological slant. Google's AI displayed predominantly Black Nazis, female Asian knights and women as popes.

Further inquiries into sensitive topics received refusals or noncommittal replies from Gemini, prompting concerns over the AI’s objectivity.

The reaction from tech circles and the general public ranged from disbelief to bemusement at Gemini's extreme programming. However, Shaun Maguire, a former partner at Google Ventures from 2016 to 2019, expressed no surprise.

According to Mr. Maguire, these peculiarities are emblematic of a broader cultural shift in Google that favors a particular set of values.

“I was not shocked at all,” he told The Free Press. “When the first Google Gemini photos popped up on my X feed, I thought to myself: Here we go again. And: Of course. Because I know Google well. Google Gemini’s failures revealed how broken Google’s culture is in such a visually obvious way to the world. But what happened was not a one-off incident. It was a symptom of a larger cultural phenomenon that has been taking over the company for years.”

Former employees corroborated this sentiment, describing a corporate atmosphere intensely focused on diversity, equity and inclusion (DEI). They described an environment where DEI objectives superseded meritocracy and sound business strategy: engineers had to evaluate even minor software changes for their DEI impact, and hiring managers were encouraged to favor demographic diversity over expertise.

“The model is just a reflection of the people who trained it,” another former AI researcher at Google Brain, who asked not to be named, told The Free Press. “It’s just a series of decisions that humans have made.”


