One of the features I was most excited to try on the new Pixel 2 smartphone was Google Lens, a visual search engine that comes preloaded on the phone.
Unfortunately, it didn’t live up to the hype.
First off, it's nearly impossible to find. The feature lives inside the Google Photos app, not the camera app. This means it only works after you've already taken a photo, not as you're taking one, as you might expect.
And therein lies the first major problem with the feature: If the intent is for Google Lens to identify objects you see in the real world, it's a hassle to take a photo and then open the Google Photos app to find Lens. I was fooled, and I suspect many others were as well, into thinking Lens worked in real time.
More importantly, however, Lens just doesn’t work that well yet. Google says that for now, Lens is just a “preview,” which may be Google-speak for a beta version. But I expected it to be, well, smarter.
Here's what I mean:
All Lens could glean from this photo was that it's a still life.
Now, Lens isn't wrong here. This is a still life photo, and Lens showed me images that were visually similar. But this wasn't exactly what I was expecting to see when I used the feature. I thought it would recognize that this was a seashell, and maybe even tell me what type of seashell I was looking at.
But Lens can accurately read labels, at least.
All Lens did here was identify that this was a jar of Jif peanut butter and pull up the company's Wikipedia page. Sure, that's accurate, but is it helpful? Not really. Lens accomplished nothing here that my own eyes couldn't already do.
Lens is great at identifying addresses.
I quickly realized that Lens is exceptionally good at reading typed letters and numbers. Here, Lens read the address (as well as the serial number below it) in a matter of seconds and even surfaced some additional information about it. Lens could tell this was the address of a corporation rather than a private home, and that it was in Canada, not the US.
The coolest part, and I think the best use case for Lens, is that it offered to pull up Google Maps and give me directions to the address.
It's not so great at reading handwriting.
When it comes to handwritten numbers, Lens falls flat. The feature didn't work at all, even though I wrote the numbers fairly neatly. I had hoped it would recognize the phone number and offer to dial it so I wouldn't have to type it in manually, but no such luck.
It can identify brands — sometimes.
Selfishly, I was hoping Lens might solve one of my more common problems: seeing something I like out in the world but not knowing who makes it or where I can get it.
Here, Lens worked ... fine. It couldn't identify my Vans sneakers, even though Vans has one of the more easily recognizable logos. But it worked great with my Daniel Wellington watch, since it's so good at reading printed words. I didn't find the results Lens served me particularly helpful, but at least they were accurate.
So far, Lens doesn't work quite as well as I'd hoped.
Google isn't billing Lens as 100% ready for primetime, so I have to cut it some slack for now. But my expectations were high precisely because it's Google: Lens has the power of Search behind it, along with Google's excellent smart assistant and a knack for making products that work together seamlessly.
Plus, since Lens is a feature inside Google Photos rather than the camera app, it's frustrating that Google Photos users on other devices don't have access to it.
But there's good news for Lens: As people buy the Pixel 2 and start using the feature, it's only going to get better from here.