25/11/2019 – See, search, buy

Revolutionising shopping experiences in the fashion trade

A picture is worth a thousand words. The saying aptly describes Visual Search.


A study has shown that one in eight respondents ordering products online do so using Apple’s Siri, Amazon Echo or Google Home. A quarter of participants expressed an intention to shop mainly using voice control. © Dragon Images/Shutterstock


Image searches are particularly helpful in e-commerce, where text-based product searches are often problematic. After all, everyone describes a product differently – from its name to details of colour and shape.

Voice search can do little to change this. According to a study, one in eight people ordering products online do so using Apple’s Siri, Amazon Echo or Google Home. A quarter of respondents expressed an intention to shop mainly by means of voice control in future, and by 2020, 50 percent of Google search requests are expected to be triggered by voice search. And yet the challenges posed by the product description remain. The solution: Visual Search. Although the technology was developed around ten years ago, the necessary databases and the required computing power have only recently become available.

Consumers want Visual Search

Users photograph a product or upload an image – and the system comes up with similar products. A Visual Search study has shown that 59 percent of 20 to 69-year-olds think that images are more suitable than descriptions when searching for fashion and accessories. They consider Visual Search practical, helpful, useful, quick and uncomplicated. Critics claim, however, that the technology still has teething problems.

Visual Search stimulates impulse buys

Visual Search removes the barrier between online and offline shopping. Customers see a product in a shop window or a fashion magazine, photograph it, search for it online – and buy it, paving the way for genuine impulse buys. Because it is so easy to use, Visual Search looks set to revolutionise the search for fashion.

First Visual Search providers

Users can take advantage of Visual Search functionality through large platforms and retailers’ apps. The Lens solutions from Google and Pinterest are leading the way in terms of performance and capabilities: they allow customers to take and upload pictures and to search for similar products. Google displays corresponding suppliers, including smaller fashion sellers. Image search can also be integrated into the apps of retail companies through specialised engines, as Amazon and H&M have done: users can photograph products or upload them from their own photo gallery. In an evaluation of the search results, Google proved better than Amazon, while the H&M app delivers good results for the company’s own products. Even so, Visual Search remains relatively unknown: 75 percent (H&M) and 63 percent (Amazon) of respondents said they had heard of the provider but knew nothing about its Visual Search options.

Enhancing images

Many neglect their images when engaging in search engine optimisation – a careless mistake. For Visual Search to work properly, images have to be optimised. After keyword research, images should be described precisely and extensively in the metadata. In addition, keywords should be incorporated into file names, image titles and image descriptions. The description “shift dress in old-rose brocade with floral pattern” leads to more accurate hits than a simple description such as “dress”: the more precise the description, the better the hits. Keywords also form the foundation for alt texts. Originally developed to describe image content to people with visual impairments, alt texts are also read by search engines. It is advisable, too, to provide several high-quality images of a product taken from different angles; users are then more likely to opt for the product. A consistent visual language and aesthetic helps search engines understand the relationship between images.
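The naming conventions described above can be sketched in a few lines of Python. This is purely illustrative – the `seo_image_names` helper and its attribute fields are hypothetical, not part of any real SEO tool:

```python
# Illustrative sketch: derive a keyword-rich file name and alt text
# from structured product attributes (all field names are hypothetical).

def seo_image_names(product):
    """Build a descriptive file name and alt text for a product image."""
    # Alt text reads as a natural phrase, e.g.
    # "shift dress in old-rose brocade with floral pattern".
    parts = [product["type"], "in", product["colour"],
             product["material"], "with", product["pattern"]]
    alt_text = " ".join(parts)
    # File name: lowercase keywords, hyphen-separated, no filler words.
    keywords = [product["type"], product["colour"],
                product["material"], product["pattern"]]
    file_name = "-".join(w.replace(" ", "-") for w in keywords) + ".jpg"
    return file_name, alt_text

file_name, alt_text = seo_image_names({
    "type": "shift dress",
    "colour": "old-rose",
    "material": "brocade",
    "pattern": "floral pattern",
})
print(file_name)  # shift-dress-old-rose-brocade-floral-pattern.jpg
print(alt_text)   # shift dress in old-rose brocade with floral pattern
```

A file named like this carries the same keywords as the metadata and alt text, so all three signals reinforce each other.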

AI interprets metadata

Based on AI processes and neural networks, Visual Search systems identify objects in images. When users focus on an object by cropping or highlighting it, the AI system interprets the metadata and delivers relevant results about shape, colour and material. One example is the Complete-the-Look function from Pinterest: based on an uploaded image, the system provides stylistically matching and thus highly personalised product recommendations. A further innovative push will come with the integration of chatbots into Visual Search technology.
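At its core, such a system compares a numeric embedding of the query image against embeddings of the catalogue images and returns the closest match. A minimal sketch of that matching step, using cosine similarity over made-up four-dimensional vectors standing in for a real neural network’s output:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query, catalogue):
    """Return the catalogue item whose embedding is closest to the query."""
    return max(catalogue,
               key=lambda item: cosine_similarity(query, item["embedding"]))

# Toy embeddings; real image models produce vectors with
# hundreds of dimensions, but the matching logic is the same.
catalogue = [
    {"name": "red shift dress", "embedding": [0.9, 0.1, 0.0, 0.2]},
    {"name": "blue jeans", "embedding": [0.1, 0.8, 0.3, 0.0]},
    {"name": "rose brocade dress", "embedding": [0.8, 0.2, 0.1, 0.3]},
]
query = [0.75, 0.25, 0.1, 0.3]  # embedding of the user's photo
print(most_similar(query, catalogue)["name"])  # rose brocade dress
```

Production systems replace the linear scan with an approximate nearest-neighbour index so that millions of catalogue images can be searched in milliseconds.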

Thinking of tomorrow today!

Even if the technology still has a few teething problems and is little known, fashion retailers should be paying attention to it. Arvato Systems provides industry expertise, technical understanding and a focus on the needs of customers wanting to tap into this technology.

Steffen Groba, Director Business Development SAP Customer Experience at Arvato Systems