How To Use Google’s New AI-Powered Visual Search

These new AI-powered Google features will make searching for specific things a lot more intuitive.


Google is getting a lot more creative with how it lets users search. At an AI search event this week, the company debuted its latest search features, which use visual cues to refine and expand search queries.

The updates include an improved multisearch function as well as a new “search screen” feature. Google began rolling out multisearch in April 2022 as a beta feature, in English only and limited to users in the U.S., but the capability is now available globally and in all languages.

Google is clearly all-in on AI: these search features arrive just days after the company announced Bard, its AI-powered chatbot meant to compete with ChatGPT. Most people are probably already familiar with Google’s reverse image search, but Google is looking to take visual searching to the next level with its latest AI tech. As Google’s demos show, the new features make searching for things even more intuitive.


Google Lens will now be able to search your screen to help you identify anything you come across while browsing online or on social media. If you spot something you want to identify while watching a video or scrolling through a feed, long-press the power or home button on your Android device. When the Google Assistant prompt comes up, you’ll see a “search screen” option. Tap it, and Google Lens takes over and tries to identify whatever you’re looking at on screen.

The feature will be available in the coming months globally, but only on Android devices.

The “Search screen” function pops up with Google Assistant.



Google’s multisearch function isn’t new, but it’s now available on mobile globally in all languages and countries where Google Lens is available. The multisearch function lets you combine images and text when you search for something, which can be particularly helpful in certain cases.

If you come across a dress you like but not in the color you want, you can search with the image of the dress plus whatever color you type in to find something that better suits your style. You can also take a picture of your houseplant and type in “care instructions,” or, if you find a picture of a round coffee table you like, see it in a different style by adding “square” or “rectangle.”

Google’s multisearch makes it way easier to find the exact style of dress you’re looking for.



Google also expanded multisearch with an option to search locally. Take a picture of something you’re looking for and add “near me,” and Google will surface nearby places that sell it — handy if you need something in a pinch, or if you’d rather support local businesses. The local search option is only available in English and in the U.S. at the moment, but Google says it will expand globally in the coming months.

