Google Search will improve snippets to avoid misinformation

The team behind Google Search is tweaking its snippets – the text boxes that sometimes spread false information while trying to be helpful. The company announced an update meant to make these answers more accurate and to handle false premises and questions for which no definitive answer makes sense. It's paired with an expansion of Google's "About this result" option, warnings for low-quality "data voids," and a new partnership on information literacy lesson plans for middle and high school students.

Snippets show up in many searches, but because they appear to answer questions directly by quoting from pages, they can backfire in ways that standard search results don't. In a presentation to journalists, Google gave some examples of these problems and how it is trying to solve them. When someone looked up the time it takes for light to travel from the Sun to Earth, for example, Google at one point offered a snippet that highlighted Pluto's distance instead.

The solution, according to research vice president Pandu Nayak, lies in finding consensus: facts that multiple sources agree on. On a call with reporters, Nayak clarified that this consensus check draws on pages Google has already designated as high quality, which the company hopes can avoid a snippet equivalent of Google bombing. "It doesn't establish that something is trustworthy, it just looks at the best results," says Nayak. But by looking across several pages Google already trusts and then trying to find commonalities, it hopes to avoid highlighting the wrong details.
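Google has not published how its consensus check works; as a purely illustrative sketch, the idea of only surfacing a fact when enough trusted sources agree on it might look something like this (the function name, threshold, and normalization are all assumptions, not Google's actual system):

```python
from collections import Counter

def consensus_answer(candidate_answers, min_agreement=0.6):
    """Toy consensus check: return an answer only when a large enough
    fraction of trusted sources agree on the same normalized value."""
    if not candidate_answers:
        return None
    # Normalize so trivially different phrasings compare equal.
    normalized = [a.strip().lower() for a in candidate_answers]
    value, count = Counter(normalized).most_common(1)[0]
    # Suppress the snippet entirely when agreement is too weak.
    if count / len(normalized) < min_agreement:
        return None
    return value

# Answers extracted from pages already ranked as high quality.
answers = ["about 8 minutes", "About 8 minutes", "5.5 hours"]
print(consensus_answer(answers))  # -> "about 8 minutes"
```

The key design point matches Nayak's description: agreement is only measured among already-trusted pages, and when sources conflict, the safer behavior is to show no snippet at all rather than an unverified one.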

A warning on a Google search for “how to contact the Illuminati”.

A separate issue is the "false premises" problem, a phenomenon where Google tries to be a bit too helpful with snippets. For years, if you typed in a leading question about something that never happened, Google would frequently offer snippets that seemed to confirm it, pulling out-of-context text from a tangentially related page. The team's example is "when did Snoopy murder Abraham Lincoln," for which Google at one point offered the date of Lincoln's death in a snippet. Google calls these cases "not very common," but it says it has trained its systems to better detect them and not offer any snippet at all, and it says it has reduced the incidence of these inappropriate snippets by 40 percent.

This doesn't necessarily fix all the issues with snippets. Nayak acknowledged that neither system would help solve a problem identified last year, when Google offered the opposite of good advice on dealing with seizures, listing a series of "don'ts" as if they were guidance on what to do. "That kind of thing is really about making sure our underlying algorithms are correctly extracting enough context," says Nayak, who says Google is continuing to make improvements that could prevent similar issues.

But the goal is to make snippets go haywire less often and increase trust in search results, a goal underscored by Google's other changes. For about a year, Google has placed warnings above unreliable search results that can surface in fast-moving news situations. It is now extending those warnings to more general situations where it determines there aren't high-quality results for a search, adding a notice before letting people scroll down to see the results. This doesn't prevent anyone from seeing the content, but it ideally helps manage expectations about the reliability of the information.

Google is also expanding "About this page," which lets you see details about the website a given result came from. The option was previously available in search, but it's now launching in Google's English-language iOS app – you can swipe up while browsing any page in the app to learn more about it, theoretically helping you assess its reliability. The feature will launch on Android later this year and in other languages over the coming months.

