Google reportedly scales back AI Overviews to fix mistakes

Nico Grant of the New York Times reports today that Google has sharply scaled back the presence of AI Overviews in its search engine after a disastrous rollout in which the AI generated wrong, inappropriate, and even unsafe information.

AI Overviews now reportedly appear only for a limited set of searches.

The AI Overviews glitch follows Google’s equally troubled rollout of its image generator, which produced historically inaccurate images that critics derided as “woke.”

These mistakes by Google might be small bumps in the road. But they do raise a serious question about how much internal testing Google performs on its AI models before making them publicly available.

Liz Reid, VP of Google Search, published a blog post addressing the problem:

In addition to designing AI Overviews to optimize for accuracy, we tested the feature extensively before launch. This included robust red-teaming efforts, evaluations with samples of typical user queries and tests on a proportion of search traffic to see how it performed. But there’s nothing quite like having millions of people using the feature with many novel searches. We’ve also seen nonsensical new searches, seemingly aimed at producing erroneous results.

Liz Reid, VP of Google Search

One problem area involved search queries for which little content exists on the Web.

For example, the query “How many rocks should I eat?” had a “‘data void’ or ‘information gap,’ where there’s a limited amount of high quality content about a topic. However, in this case, there is satirical content on this topic … that also happened to be republished on a geological software provider’s website. So when someone put that question into Search, an AI Overview appeared that faithfully linked to one of the only websites that tackled the question.”

Another problem area involved “AI Overviews that featured sarcastic or troll-y content from discussion forums. Forums are often a great source of authentic, first-hand information, but in some cases can lead to less-than-helpful advice, like using glue to get cheese to stick to pizza.”

Google said it has made improvements to address these problems.
