Google tested AI Overviews for months before rolling them out nationwide last week, but clearly, that wasn’t enough time. The AI is hallucinating answers to several user queries, creating a less-than-trustworthy experience across Google’s flagship product. In the last week, Gizmodo received AI Overviews from Google that referenced glue-topped pizza and suggested Barack Obama was Muslim.

The hallucinations are concerning, but not entirely surprising. As we’ve seen before with AI chatbots, this technology seems to confuse satire with journalism – several of the incorrect AI Overviews we found appear to reference The Onion. The problem is that these AI answers read as authoritative to the millions of people every day who are just trying to look something up. Google injected its AI into a product we use multiple times a day, and now we’re seeing the consequences.

In my experience, AI Overviews are more often right than wrong. However, every wrong answer I get makes me question my entire experience on Google Search even more – I have to assess each answer carefully. Google notes that the AI is “experimental,” but it has opted everyone into this worldwide experiment by default.

“The thing with Search — we handle billions of queries,” Google CEO Sundar Pichai told The Verge on Monday when asked about the AI overview rollout. “You can absolutely find a query and hand it to me and say, ‘Could we have done better on that query?’ Yes, for sure. But in many cases, part of what is making people respond positively to AI Overviews is that the summary we are providing clearly adds value and helps them look at things they may not have otherwise thought about.”

Strangely, Google Search occasionally responds to a query with “An AI overview is not available for this search,” while other times, Google says nothing at all and simply shows traditional search results. I got the former response when I searched “what ethnicity are most US presidents” and when I searched “what fruits end in me.” We asked Google why this happens, and we’ll update this article when the company responds.

Notably, Google had to pause Gemini’s answers and image generation around racial topics for months after they upset large swaths of the country. It’s unclear whether those changes will affect AI Overviews as well.

It’s clear that Google felt pressured to put its money where its mouth is, and that means putting AI into Search. People are increasingly choosing ChatGPT, Perplexity, or other AI offerings as their main way to find information on the internet. Google views this race as existential to Search, but it may have just jeopardized the Search experience by trying to catch up.

This week, Google Search has told people a lot of strange things through AI Overviews. Here are some of the weirdest ones Gizmodo has found.
