Google I/O 2022: multisearch gains local results, powerful scene exploration feature teased


During its I/O 2022 consumer keynote on Wednesday, Google announced that it is updating its search engine with improvements to visual search. The company revealed that it will expand its multisearch functionality to let users see local results. Google also said it is working on a new scene exploration feature that lets users pan their camera across multiple objects and see information about them overlaid on the screen. Scene exploration will build on multisearch, and the company has yet to reveal when it will be available to users.

During Google I/O 2022, the company revealed that it will further expand multisearch, its Google Lens feature that lets users search with images and text at the same time. Users will be able to add “near me” to their queries to find options at a local retailer or restaurant, based on the photo they took and their search term. Google says local results in multisearch will be available to all users worldwide in English later this year, with support for other languages to be added in the future.

Google says local results in multisearch will be available to all users worldwide later this year in English
Photo credit: Google

Introduced last month, multisearch is a feature that Google has touted as one of its “most significant search engine upgrades” in several years. Multisearch lets users take a photo of an object or product, such as a dress or a piece of home decor, then swipe up to add text for a “combined” search query. Users can photograph an orange dress and add the query “green” or “blue” to find similar products in another color, or snap a picture of a houseplant and add the query “care instructions”.
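
Google has not published how multisearch combines a photo with a text refinement behind the scenes, but the general idea of a joint image-and-text query can be illustrated with open-source tools. The sketch below is an assumption-laden stand-in, not anything Google has confirmed: it uses the publicly available CLIP model via the Hugging Face transformers library to blend an image embedding and a text embedding into a single query vector that could then be matched against a catalog of product images.

```python
# Illustrative sketch only: Google has not disclosed multisearch's internals.
# The open-source CLIP model stands in for a joint image+text query here.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def combined_query_embedding(image_path: str, refinement_text: str) -> torch.Tensor:
    """Embed a photo and a text refinement, then blend them into one query vector."""
    image = Image.open(image_path).convert("RGB")
    inputs = processor(text=[refinement_text], images=image,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        image_emb = model.get_image_features(pixel_values=inputs["pixel_values"])
        text_emb = model.get_text_features(input_ids=inputs["input_ids"],
                                           attention_mask=inputs["attention_mask"])
    # Normalize each modality, average the two, and renormalize the result.
    image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)
    text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
    query = (image_emb + text_emb) / 2
    return (query / query.norm(dim=-1, keepdim=True)).squeeze(0)

# Hypothetical usage: rank precomputed catalog image embeddings against the query.
# scores = catalog_embs @ combined_query_embedding("orange_dress.jpg", "green")
```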

To find local results with multisearch, the company says it analyzes millions of images and reviews posted on web pages and by the Maps contributor community to surface nearby results. The feature, which relies on machine learning, can be used to find out where a particular dish is served at a restaurant near you, or to track down a product at a local retailer, according to Google.
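
Again, Google has not described its actual pipeline, but a “near me” lookup of this kind can be pictured as an ordinary embedding search restricted by distance. The snippet below is a hypothetical sketch: the listings, field names, and radius are invented for illustration, and each listing is assumed to carry a precomputed, unit-length photo embedding comparable to the query vector above.

```python
# Hypothetical sketch of "near me" ranking: filter indexed listings by distance,
# then order them by how well their photos match the combined query embedding.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    radius = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

def local_results(query_emb, listings, user_lat, user_lon, radius_km=10.0, top_k=5):
    """listings: dicts with 'name', 'lat', 'lon', and a unit-length 'photo_emb' list."""
    nearby = [l for l in listings
              if haversine_km(user_lat, user_lon, l["lat"], l["lon"]) <= radius_km]
    scored = [(sum(q * p for q, p in zip(query_emb, l["photo_emb"])), l["name"])
              for l in nearby]
    return sorted(scored, reverse=True)[:top_k]
```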

The company is also working to expand multisearch in Google Lens with a new capability called “scene exploration.” Google says users will be able to pan their camera to see information about “multiple objects in a larger scene.” While shopping, for example, the feature could let users scan an entire shelf of products and see information overlaid on their screen.

The company teased its in-development Scene Exploration feature at Google I/O
Photo credit: Screenshot/Google

Google says it plans to bring scene exploration to multisearch in the future, but didn’t reveal which regions will have access to the feature or which languages will be supported. “Scene exploration is a major breakthrough in our devices’ ability to understand the world the way we do – so you can easily find what you’re looking for,” said Prabhakar Raghavan, senior vice president at Google.
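
Google has not said how scene exploration recognizes “multiple objects in a larger scene,” but an off-the-shelf object detector gives a rough sense of the basic building block: labelling everything visible in a single camera frame. The sketch below uses the open-source DETR model via the Hugging Face transformers library purely as an illustrative stand-in, with a hypothetical image file name; it is not Google’s scene exploration technology.

```python
# Illustrative sketch only: detect and label multiple objects in one frame.
# DETR is an open-source detector, not Google's scene exploration system.
import torch
from PIL import Image
from transformers import DetrForObjectDetection, DetrImageProcessor

processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50")
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

image = Image.open("shelf.jpg").convert("RGB")  # hypothetical photo of a store shelf
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw model outputs into labelled bounding boxes, keeping confident hits.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.8)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 2), box.tolist())
```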

In another announcement related to its search product, Google said on Wednesday that it will add the ability to request the removal of phone numbers, home addresses, and email addresses through the Google app in the coming months. The company announced last month that it was expanding its removal policies for personal information in search results to all users.

