Google’s Gemini AI chatbot won’t answer some questions about elections – Quartz

Google announced new safeguards against misinformation during this year’s elections around the world.
Image: Hannah McKay (Reuters)

Google announced Tuesday that it has restricted the types of election-related questions the company’s Gemini chatbot will answer for users in the U.S. and India.

The restrictions are part of a number of steps the company has taken to safeguard its services from misinformation as millions of Indian citizens are set to vote in a general election this spring.

“Out of an abundance of caution on such an important topic, we have begun to roll out restrictions on the types of election-related queries for which Gemini will return responses,” the tech giant wrote in a blog post. “We take our responsibility for providing high-quality information for these types of queries seriously, and are continuously working to improve our protections.”

Google did not immediately respond to a request for comment from Quartz.

CNBC reports that the restrictions were also rolled out in the U.S., where voters are currently participating in presidential primary elections.

“As we shared last December, in preparation for the many elections happening around the world in 2024 and out of an abundance of caution, we’re restricting the types of election-related queries for which Gemini will return responses,” a company spokesperson told CNBC.

This is not Gemini’s first restriction

The news comes just weeks after Google was forced to pause Gemini’s ability to generate images of people, after users found the model was producing historically inaccurate and sometimes offensive images.

“This wasn’t what we intended,” Google said in a blog post in February. “So we turned the image generation of people off and will work to improve it significantly before turning it back on.”

Even Google cofounder Sergey Brin said that Google “definitely messed up on the image generation,” and that “it was mostly due to not thorough testing.”
