Google Search is getting smarter with the deployment of its proprietary Multitask Unified Model (MUM). At the Search On event on Wednesday, the Mountain View, California-based company announced a series of updates coming to Google Search that will leverage MUM to enhance the user experience. Google is also bringing a redesigned Search page that will use artificial intelligence and MUM to offer deeper results on various topics. Video search is getting a new experience as well, with the search engine surfacing videos related to the one a user is watching. Additionally, Google announced Address Maker, a tool that uses the open-source Plus Codes system to provide functioning addresses at scale. The Google app on iOS is getting an update with Google Lens integration, and Google is also updating Chrome with Google Lens support.

One of the biggest changes MUM brings to Google Search is the ability to search with visuals and text simultaneously. At the Search On event, Google showed how MUM enables Google Lens to let users search visually and refine the query with text. For instance, you will be able to find out how to fix your bike by capturing an image of the broken part with Google Lens and typing your question alongside it.

Google is using MUM to let users combine visual searches with text queries
Photo Credit: Google

Similarly, you can look for something that is difficult to describe accurately in words by taking a picture of it through Google Lens. In such cases, you will just need to tap the Lens icon to find results.

Google demonstrated this update by taking a picture of a shirt and asking the search engine to find the same pattern but on socks.

“By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways,” the company said.

These new search capabilities are currently experimental, though Google said users will be able to try them in the coming months.

Google demonstrated how the enhanced search would work using visual and text queries
Photo Credit: Google

Google also announced a redesigned Search page that will use AI and MUM advancements to provide more natural results. The redesign brings a section called Things to know that offers deeper results on new topics, with links to content that would not typically appear in regular search results.

The redesigned Google Search page will also carry features to let users refine and broaden their searches. For instance, if a user is looking for acrylic painting, the refine option will surface specific techniques such as puddle pouring, or art classes you can take to learn the skill. Similarly, the broaden option will let you widen your search query to related topics such as other painting methods and famous painters.

Google didn’t provide a timeline for the release of these features, though it said users will start getting them in the coming months. MUM was first announced at Google I/O earlier this year.

Users on Google Search will also be able to see visually rich pages where articles, images, and videos are gathered on a single page. This new page is already live, and you can try it out by searching for “Halloween decorating ideas” or “indoor vertical garden ideas”.

Google is also advancing video searches on its site by bringing a related videos section. The company said the new experience will identify related topics in a video, with links that let users easily dig deeper into a specific query, without having to run multiple searches.

Rather than simply taking cues from the title and metadata of video results, Google said it will use MUM to show related topics in video results even if those topics aren’t explicitly mentioned in the video. This will start rolling out in the coming weeks in English, and more visual enhancements will reach users in the next few months.

The existing About This Result panel on Google Search is also receiving an update, with insights such as information about the source, what others have said about it, and more about the topic. These changes will be available in English in the US in the coming weeks.

Google is also updating its native iOS app with a Lens mode, similar to the Lens integration it already provides in its Android app. This will let you search shoppable images and other visuals. Initially, the Google Lens integration will be limited to users in the US.

In addition to the Google app for iOS, Google is bringing its Lens integration to the Chrome browser on desktop. It will let you select images, videos, and text on a website to quickly get additional information in the same tab, without leaving the page you’re browsing. This update will be available to users across the globe in the next few months.

Google is additionally bringing a more shoppable search experience by letting users see a visual feed of products such as apparel and home decor items, alongside information like local shops, style guides, and videos. It is currently limited to the US and is powered by Google’s Shopping Graph, which the company describes as a real-time dataset of products, inventory, and merchants with over 24 billion listings.

Google is making it easier for users to shop for new products using its enhanced search 
Photo Credit: Google

For its users in the US and select markets including Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden, and Switzerland, Google is also bringing an ‘in stock’ filter that will help users find nearby stores carrying specific items.

Alongside the Google Search updates, Google Maps is getting a Wildfire Layer to surface wildfire information. It is based on satellite data and will include emergency websites, phone numbers, and evacuation details from local governments. The Wildfire Layer will be available to Google Maps users worldwide on Android, iOS, and desktop starting this October.

Google is also bringing Tree Canopy Insights to over 100 cities around the globe, including Guadalajara, London, Sydney, and Toronto, during the first half of 2022. The Tree Canopy tool was first piloted in Los Angeles last year. It uses aerial imagery and AI to identify places at the greatest risk of rapidly rising temperatures. The tool gives local governments free access to insights about where to plant trees to increase shade and reduce heat over time, the company said.

Additionally, Google is utilising Plus Codes to help provide addresses to people and businesses through its new tool called Address Maker. The company said the tool can bring addresses to under-addressed communities in a matter of weeks.

Google’s Address Maker uses Plus Codes to help assign addresses at scale
Photo Credit: Google

Google has made an Address Maker app for local governments and NGOs to help them create addresses using Plus Codes. The app is available for download through Google Play to approved organisations, and it allows them to create work areas to be addressed; assign new work areas; add roads, streets, alleys, and paths; and generate and validate Plus Code addresses for properties.
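Plus Codes are built on Google’s open-source Open Location Code scheme, which converts a latitude and longitude into a short, address-like code. As a minimal sketch of that encoding, the snippet below uses Google’s reference Python library (openlocationcode); the coordinates are illustrative and not taken from Google’s announcement, and Address Maker layers its validation and work-area tooling on top of this basic scheme.

```python
# A minimal sketch of Plus Code encoding using Google's open-source
# reference library (pip install openlocationcode). The coordinates
# here are illustrative only.
from openlocationcode import openlocationcode as olc

# Encode a latitude/longitude into a full Plus Code.
lat, lng = 22.5726, 88.3639  # roughly central Kolkata, for illustration
full_code = olc.encode(lat, lng)
print("Full code:", full_code)  # e.g. '7MJCH9F7+2H'

# A full code can be shortened relative to a nearby reference point,
# which is how readable addresses like 'H9F7+2H, Kolkata' are formed.
short_code = olc.shorten(full_code, lat, lng)
print("Short code:", short_code)

# A code decodes back to a small bounding box rather than a single
# point; the box center serves as the address location.
assert olc.isValid(full_code)
area = olc.decode(full_code)
print("Center:", area.latitudeCenter, area.longitudeCenter)

# A shortened code can be recovered to the full code given a nearby
# reference location.
recovered = olc.recoverNearest(short_code, lat, lng)
assert recovered == full_code
```

Because a Plus Code is derived purely from coordinates, no street network or postal infrastructure is needed, which is what allows the scheme to generate functioning addresses at scale in under-addressed areas.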

Google originally developed Address Maker to bring addresses to underserved communities in Kolkata, India. It has since also been used by governments and NGOs in The Gambia, South Africa, Kenya, and the US, the company said.

