Google unveils wide-ranging upgrade of search at its “Search On 2021” event
Google search will soon combine multiple searches into a single query and you’ll be able to search text, images and video at the same time.
It’s part of the functionality offered by MUM, Google’s “Multitask Unified Model” artificial intelligence tool, which it detailed further at its second “Search On” event, held online this week. Google announced a wide range of search product updates at the event.
MUM extends the capability of the existing Google Lens, where Google software interprets an image seen through a phone camera and returns information, such as a description of an object or details of businesses and shops found in the view.
However, Google says MUM goes far beyond that capability.
“MUM can simultaneously understand information across a wide range of formats, like text, images and video. It can also draw insights from and identify connections between concepts, topics, and ideas about the world around us,” says Google.
“We’ve been experimenting with using MUM’s powerful capabilities to make our products more helpful and enable entirely new ways to search.”
It’s probably easier to understand MUM and more powerful searches through examples.
Google gives the example of seeing a shirt you like, but wanting to have the same pattern on socks. You open Google Lens, point your camera at the shirt, and in the text box beneath it, ask to see socks with this pattern. Google then searches that pattern online and replies with matching socks.
It says this capability will launch in coming months, starting in English.
In a blog post earlier this year, Google explained some of MUM’s other capabilities. You might be a mountaineer who has previously climbed mountain A. You now want to climb mountain B and want to know how to prepare differently.
That would normally take several searches, but Google says it aims to offer the results from one.
“If you were talking to a hiking expert, you could ask one question — ‘what should I do differently to prepare?’,” says the blog post. “You’d get a thoughtful answer that takes into account the nuances of your task at hand and guides you through the many things to consider.”
“This example is not unique — many of us tackle all sorts of tasks that require multiple steps with Google every day,” says Google. “In fact, we find that people issue eight queries on average for complex tasks like this one.
“Today’s search engines aren’t quite sophisticated enough to answer the way an expert would. But with a new technology called Multitask Unified Model, or MUM, we’re getting closer to helping you with these types of complex needs. So in the future, you’ll need fewer searches to get things done.”
Google also announced it is introducing a MUM-based experience in videos that identifies related topics, even if a topic isn’t explicitly mentioned, making it easier to dig deeper and learn more. The ability of intelligent search to analyse video will bring huge amounts of extra information within easier reach of users. Google says the feature is launching in coming weeks in English.
Google also promises more natural, intuitive search. “When you search for a topic, like acrylic painting, you can see all the different dimensions people typically search for, and find the path that’s right for you,” it says.
“In the future, MUM will unlock deeper insights you might not have known to search for — like ‘how to make acrylic paintings with household items’ — and connect you with content on the web that you wouldn’t have otherwise found.” That feature will also launch in the coming months.
Google says you will soon be able to visually browse for search results using MUM.
It says that starting soon, Apple iOS users will see a new button in the Google app to make all the images on a page searchable through Google Lens, and Google Lens will come to Chrome on the desktop. You will be able to select images, video and text content on a website with Lens to see search results in the same tab, without leaving the page you’re on. Again, it will be available in the coming months.
Google is obviously keen to apply these innovations to shopping. “When you search for ‘cropped jackets’, we’ll show you a visual feed of jackets in various colours and styles alongside other helpful information like local shops, style guides and videos.”
A new “in stock” filter will let you see if nearby stores have specific items on their shelves. If you are looking for a child’s bike helmet, the “in stock” filter will find stores near you that have a helmet, even a specific brand or type, on their shelves, says Google.
This is launching now in countries including Australia. Of course, its success will depend on shops having their inventory and stock levels connected online so that Google knows what is in stock. Whether Google crawls online store websites as it does news sites, or uses other methods, will be interesting to see.
In other developments, Google has announced an address maker that uses an open-source system called Plus Codes, which creates unique, functioning digital addresses for businesses and NGOs that don’t have physical ones. It divides the world into grids so that with just a few characters you can locate any place on the planet.
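Plus Codes are built on Google’s open-source Open Location Code scheme (github.com/google/open-location-code), which encodes latitude and longitude as base-20 digits of a successively refined grid. As a rough illustration of how a few characters can pin down any place, here is a minimal Python sketch of the standard 10-digit encoding; the function name and sample coordinates are illustrative only, and production code would use Google’s official open-source libraries, which work in integer arithmetic to avoid floating-point drift.

```python
# Minimal sketch of Open Location Code ("Plus Codes") encoding, following the
# published spec at github.com/google/open-location-code. Illustrative only.

# The 20-character base alphabet; easily confused characters are omitted.
OLC_ALPHABET = "23456789CFGHJMPQRVWX"

def encode_plus_code(lat: float, lng: float) -> str:
    """Encode a latitude/longitude into a standard 10-digit plus code."""
    # Shift into positive ranges: latitude 0..180, longitude 0..360.
    lat = min(max(lat + 90.0, 0.0), 180.0 - 1e-12)
    lng = (lng + 180.0) % 360.0
    digits = []
    resolution = 20.0  # degrees covered by one digit in the first pair
    for pair in range(5):
        lat_digit = int(lat / resolution)
        lng_digit = int(lng / resolution)
        digits.append(OLC_ALPHABET[lat_digit])  # each pair is lat, then lng
        digits.append(OLC_ALPHABET[lng_digit])
        lat -= lat_digit * resolution
        lng -= lng_digit * resolution
        resolution /= 20.0  # each pair refines the grid 20x in both axes
        if pair == 3:
            digits.append("+")  # the '+' separator follows the 8th digit
    return "".join(digits)

# Ten digits pin a location to a cell roughly 14 m across at the equator.
print(encode_plus_code(-33.8568, 151.2153))  # a point near Sydney Harbour
```

Each successive pair of characters narrows the cell by a factor of 20 in both axes, which is why a code of just ten digits is enough to serve as a street-level address.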
It’s not a new idea: what3words, which makes any place on the planet addressable as a three-word address, has been around for years and integrates with some car navigation systems. You type the three words into your sat nav, and your driving instructions materialise.
There were two other key announcements.
Google says it is extending its bushfire (aka wildfire) monitoring, adding fire locations and their approximate sizes as a layer on Google Maps. The information will be gleaned from satellite imagery, including thermal and infra-red images. Google hopes to refresh the information every 15 minutes.
“The (fire) layer will include emergency websites, phone numbers, and evacuation information from local governments if they’ve been provided,” it says.
“When available, you can also see details about the fire, such as its containment, how many acres have burned, and when all this information was last reported.”
It says this will be launched globally on Android, iOS and desktop this October.
It is also expanding its tree canopy data tool to more than 100 cities around the world; Sydney is one of them. “With tree canopy data, local governments have free access to insights about where to plant trees in order to increase shade and reduce heat,” it says.