Have you ever wanted to know where a stranger bought her (insert any miscellaneous fashion item here)? Tech companies have been trying to master this kind of image recognition technology for a while now, without much success. This week, Google launched Style Match, a tool that uses Google Lens to identify pictures of home goods and clothing.
Take a picture of a well-dressed stranger’s shoes, and the Lens feature is supposed to instantly populate details about the style. It goes a step further by also offering similar styles based on your learned preferences.
“Sometimes your question is not, ‘what is that exact thing?’ but instead, ‘what are things like it?’ Now, with style match, if an outfit or home decor item catch your eye, you can open Lens and not only get info on that specific item—like reviews—but see things in a similar style that fit the look you like,” writes Rajan Patel, Director of Google Lens.
Unfortunately for Apple’s iPhone users, the shopping recognition technology isn’t available yet. Those using certain Android phones, however, can start testing the new feature in a few weeks. Manufacturers whose phones will support Google’s Style Match include LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and, of course, Google’s own Pixel.
The app certainly solves one shopping woe: you will no longer have to aimlessly search for that stranger’s perfect pair of mules that you can’t quite describe. Simply point, shoot, and voilà.
However, as Racked’s Eliza Brooke notes, the technology could bring about what people are calling the “Great Flattening Effect,” whereby we will all eventually be dressing the same.
The theory has some merit: if we are all dressed by an algorithm that helps us copy others’ styles, individual style could very well fizzle and fade with the rise of AI-driven dressing. And given how heavily people already use Instagram and other social media to police one another’s looks, it is only a matter of time before this point-and-shoot technology takes off.