You see it. We sell it.
Visual search refers to a computer’s ability to analyze an image or video and return contextual information based on what it sees. This holds true whether the algorithm is analyzing a live shot or one that has been previously captured. Last year, Google introduced the public to Lens, bringing one of the first mass visual search engines to the forefront.
Since then, Lens has improved and spread across more native Android and iOS devices. However, as with text-based search, individual retailers will also need to deploy their own version of the service to keep up with this behavioral shift. Gartner predicts that by 2021, early adopter brands that redesign their websites to support visual (and voice) search will increase digital commerce revenue by 30 percent.
The tech is quickly making its way into social platforms as well. In late September, Snap and Amazon agreed to a partnership allowing Snapchat users to take a photo of a product or barcode and seamlessly load the corresponding Amazon product page. The ability to go from discovery to purchase that quickly is what makes this tech so appealing to retailers.
A recent survey from eMarketer found that 62 percent of millennials named visual search as the new technology they would be most comfortable using as part of their digital shopping journey. For marketers, this means it's incredibly important to ensure your product images are optimized for visual search. The easiest way to do this is to pair your images with descriptive, keyword-rich alt text, and to optimize their size and file type.
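In practice, the alt-text advice above might look something like this on a product page (a hypothetical snippet; the file names, attribute values, and product copy are illustrative, not taken from any real store):

```html
<!-- Vague alt text gives search engines little context to index -->
<img src="img_0421.jpg" alt="shirt">

<!-- Descriptive, keyword-rich alt text, with a compressed web-friendly
     file and explicit dimensions to keep the page fast -->
<img src="mens-slim-fit-oxford-shirt-light-blue.jpg"
     alt="Men's slim-fit light blue Oxford button-down shirt"
     width="800" height="1000" loading="lazy">
```

A descriptive file name reinforces the alt text, and serving an appropriately sized, compressed image helps both page speed and image indexing.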
As the friction involved in accessing visual search tools continues to decrease, we'll see consumers fully realize the utility this search option provides. Questions that were previously hard to answer, such as "which pants does this shirt go best with?", suddenly become table stakes, and a new area for marketers to compete in.