
Google Is Actually Pretty Good at Identifying What People Are Wearing

We tested out Google Lens, which lets you shop via image recognition.

A woman walks down the street, with a green dot superimposed on her back.
Google Lens prepares to ID a T-shirt featured in Business of Fashion’s May issue.
Photo: Eliza Brooke

Racked is no longer publishing. Thank you to everyone who read our work over the years. The archives will remain available here; for new stories, head over to Vox.com, where our staff is covering consumer culture for The Goods by Vox. You can also see what we’re up to by signing up here.

In May, Google introduced a visual recognition tool called Style Match, which uses Google Lens to let you take pictures of clothing and then offers up similar, shoppable items from around the internet. Plenty of tech companies have put out features like this in the past, but Google’s entry into the fray was a big moment in the push to make visual ID for clothing items a widespread reality.

With Style Match now rolling out to Google Pixel and other Android phones (iOS users will be able to access it through Google Photos down the line), we decided to give it a test drive.

Google Lens can identify landmarks, celebrities, and cans of LaCroix sitting on your desk. When you train your camera on a person’s outfit, colorful dots pop up over the pieces it can match, like Racked reporter Chavie Lieber’s linen shirt and jeans.

A screen cap of Google Lens at work. A yellow and blue dot hover over Chavie’s shirt and pants.
Google Lens analyzes Chavie’s outfit.
Photo: Eliza Brooke
A screen cap of Google Lens’s matches for Chavie’s shirt.
It surfaced a number of similar white shirts.
Photo: Eliza Brooke

As promised, Google Lens is good at identifying a product’s key characteristics and offering similar options, which are gathered through Google Shopping. A photo of Selena Gomez wearing a yellow satin Coach dress with a low neckline brought up a plunging velvet dress of the same hue. A co-worker’s black dress with short lace sleeves generated a variety of options, all with a similar sleeve detail. When I took a picture of a leopard-print coat, it knew exactly what fabric I wanted to see more of.
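Google hasn't published how Style Match works under the hood, but the behavior described above is consistent with a standard similarity-search setup: each image is reduced to a feature vector, and candidates are ranked by how close their vectors sit to the query's. The sketch below illustrates that idea with hand-made toy vectors and item names (real systems would derive the vectors from a neural image encoder over a Google Shopping-scale catalog).

```python
import math

# Toy catalog: each item maps to a made-up feature vector. In a real system,
# these vectors would come from a neural network trained on product images.
CATALOG = {
    "yellow satin dress": [0.9, 0.8, 0.1],
    "yellow velvet dress": [0.85, 0.75, 0.2],
    "black lace dress": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(query, catalog):
    """Rank catalog item names by cosine similarity to the query vector."""
    return sorted(catalog, key=lambda name: cosine(query, catalog[name]), reverse=True)

# A query photo whose (toy) embedding lands near the yellow dresses:
query = [0.88, 0.79, 0.15]
print(most_similar(query, CATALOG))
```

This is why the tool finds a plunging velvet dress for a satin one in the same hue: nearby vectors capture shared traits like color and silhouette, not the exact product.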

It can’t ID everything, though. When I tried Chavie’s gold hoop earrings, Google Lens came up short. (They’re from Forever 21.)

A screen cap of Google Lens saying “Sorry, not sure how to help.”
No dice.
Photo: Eliza Brooke

Google Lens surfaces aesthetically similar items, but it didn't provide an exact match for any of the items I snapped photos of, so if you want that jacket, you're better off asking the person wearing it where they bought it. When I focused the camera on my black high-top Converse (standard-issue), for instance, it first served me a bunch of black low-tops. Photographed from a different angle, it gave me some high-top Converse, though they were all special edition models, like a pair printed with Andy Warhol's soup can art.

There are also some things the human eye understands that Google Lens doesn't quite pick up. In a recent Stuart Weitzman ad, Kate Moss wears a sheer, dark blouse with her arms raised over her head, which pulls the sides of the shirt up at an angle. I saw this as a standard button-down that was warped by Moss's pose, but Google Lens interpreted it as a top with an asymmetrical cut. So it showed a number of dark jackets that are longer in the front than they are on the sides, including ones by Rick Owens, Elizabeth and James, and Walmart.

A screen cap of Google Lens’s matches for Kate Moss’s shirt.
It’s worth noting that Google Lens immediately identified Kate Moss by name.
Photo: Eliza Brooke

This brings us to the subversive beauty of Style Match: It shows you a $2,551 Rick Owens coat next to a $17.63 zip-up from Walmart. It’s pretty rare to see clothing items with such disparate price tags in the same space, in part because luxury brands are notoriously touchy about having their products displayed next to widely accessible items, for fear of losing brand equity. Unless you’re searching an aggregator like Lyst or browsing a department store like Nordstrom, shopping experiences tend to be broken down by price bracket, and comparing similar items across price points means opening a lot of tabs.

By focusing specifically on visual likeness, Style Match throws together all kinds of brands, which has a weirdly democratizing effect on the browsing experience — sometimes to a hilarious and (probably not intentionally) shady extent. The feature picked up a pair of black heeled Mary Janes in a recent Dior ad, and the first “similar product” it tossed out was by the sensible shoe brand Hush Puppies. Not a flattering comparison for the ultimate in French chic.

A screen cap of Google Lens in action.
A recent Dior ad.
Photo: Eliza Brooke
Google Lens’s suggestions for Dior’s shoes.
Those aren’t Hush Puppies.
Photo: Eliza Brooke

Google Lens’s clothing ID feature is most helpful when it’s hard to sum up an item’s defining characteristics. While flipping through Business of Fashion’s May issue, I saw a picture of a woman wearing a long skirt. If I wanted to find it myself, I’d either trawl around on a bunch of e-commerce sites ticking off filter boxes, which is a lot of work, or I’d Google: “tea-length white-and-orange pleated skirt with diagonal stripes.” I did that, and I received a bunch of skirts that were pleated, striped, and the right length, but none that closely resembled the original.

Google Lens found more compelling options much faster than I was able to.

Google Lens serves up similar options to a pleated, striped skirt.
In this case, the machine did better than the fashion writer.
Photo: Eliza Brooke

Is Google Lens going to change how you shop tomorrow? Probably not, though if it does, please tell us all about it. But even in its early days, the clothing recognition tool is pretty promising.