News

Google’s Visual Search Can Now Answer Even More Complex Questions

By News Room · 3 October 2024 · 4 Mins Read

When Google Lens was introduced in 2017, the search feature accomplished a feat that not too long ago would have seemed like the stuff of science fiction: Point your phone’s camera at an object and Google Lens can identify it, show some context, maybe even let you buy it. It was a new way of searching, one that didn’t involve awkwardly typing out descriptions of things you were seeing in front of you.

Lens also demonstrated how Google planned to use its machine learning and AI tools to ensure its search engine shows up on every possible surface. As Google increasingly uses its foundational generative AI models to generate summaries of information in response to text searches, Google Lens’ visual search has been evolving, too. And now the company says Lens, which powers around 20 billion searches per month, is going to support even more ways to search, including video and multimodal searches.

Another tweak to Lens means even more context for shopping will show up in results. Shopping is, unsurprisingly, one of the key use cases for Lens; Amazon and Pinterest also have visual search tools designed to fuel more buying. Search for your friend’s sneakers in the old Google Lens, and you might have been shown a carousel of similar items. In the updated version of Lens, Google says it will show more direct links for purchasing, customer reviews, publisher reviews, and comparative shopping tools.

Lens search is now multimodal, a hot word in AI these days, which means people can now search with a combination of video, images, and voice inputs. Instead of pointing their smartphone camera at an object, tapping the focus point on the screen, and waiting for the Lens app to drum up results, users can point the lens and ask a question aloud at the same time, for example, “What kind of clouds are those?” or “What brand of sneakers are those and where can I buy them?”
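
To make “multimodal” concrete, here is a minimal sketch of what an image-plus-question query looks like in practice, using Google’s publicly available Gemini API via the google-generativeai Python SDK. This is an illustration only: the article does not say Lens is built on this particular API, and the model name and file path below are assumptions.

    import google.generativeai as genai
    from PIL import Image

    # Assumption: you have an API key for Google's Gemini API.
    genai.configure(api_key="YOUR_API_KEY")

    # A multimodal model that accepts an image and text in a single request
    # (the model name is an assumption; any currently available vision-capable
    # model would work the same way).
    model = genai.GenerativeModel("gemini-1.5-flash")

    # The photo stands in for the camera frame; the string stands in for the
    # transcribed voice question.
    frame = Image.open("clouds.jpg")  # hypothetical local image
    response = model.generate_content([frame, "What kind of clouds are those?"])

    print(response.text)

The point of the sketch is simply that image and text arrive as parts of one request, rather than as two separate searches stitched together.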

Lens will also start working over real-time video capture, taking the tool a step beyond identifying objects in still images. If you have a broken record player or see a flashing light on a malfunctioning appliance at home, you could snap a quick video through Lens and, through a generative AI overview, see tips on how to repair the item.

First announced at I/O, this feature is considered experimental and is available only to people who have opted into Google’s Search Labs, says Rajan Patel, an 18-year Googler and a cofounder of Lens. The other Google Lens features, voice mode and expanded shopping, are rolling out more broadly.

The “video understanding” feature, as Google calls it, is intriguing for a few reasons. While it currently works with video captured in real time, if or when Google expands it to previously captured videos, entire repositories of videos—whether in a person’s own camera roll or in a gargantuan database like Google’s—could potentially become taggable and overwhelmingly shoppable.

The second consideration is that this Lens feature shares some characteristics with Google’s Project Astra, which is expected to be available later this year. Astra, like Lens, uses multimodal inputs to interpret the world around you through your phone. As part of an Astra demo this spring, the company showed off a pair of prototype smart glasses.

Separately, Meta just made a splash with its long-term vision for our augmented reality future, which involves mere mortals wearing dorky glasses that can smartly interpret the world around them and show them holographic interfaces. Google, of course, already tried to realize this future with Google Glass (which uses fundamentally different technology than that of Meta’s latest pitch). Are Lens’ new features, coupled with Astra, a natural segue to a new kind of smart glasses?
