Google Multisearch is the official name of a new feature introduced in Google Lens.
With Multisearch, users can add text filters on top of Lens’s classic image search. The feature is rolling out gradually on mobile, only through the official beta of the Google app, in English and in the United States.
An example makes it clearer. Say I photograph a dress that interests me. Lens shows me results for that model or similar ones, including the stores where it is available for purchase. But the original color doesn’t suit me and I want it in green. Using the Add to search button at the top, I can type keywords (“green”, in this case) to specify more precisely what I’m looking for.
Google offers a couple more cases where Multisearch can come in handy:

- Searching for complementary items. For example, photograph a dining table and add the words “coffee table” to find one in the same style.
- Searching for additional information about the object photographed, such as care instructions for a plant.
All of this is powered by MUM (Multitask Unified Model), an AI model Google introduced in recent months. Google says it will continue to expand and enhance Multisearch in the months to come.