
Argos iOS App - Visual Search

Visual search is one of the latest breakthroughs that is truly making a mark in eCommerce. The retail industry has changed drastically in recent years, and young, tech-savvy shoppers are demanding a smarter shopping experience where the journey from the discovery of a product to check-out is as short as possible. Argos decided to capitalise on this, hoping to improve collection rates for 1-Click Reservations and AOV (average order value).

Role

UX/UI Lead

Year

2018-2019

Team

Anthony Eamens | Matt Grabham | Ollie Smyth

PRODUCT CONCEPT & WHY

It is well known that an image is more effective than words, because our minds are built to process visual information better. Images get people’s attention, evoke an emotional reaction, and improve comprehension and retention of information.

 



Coupled with that, product discovery is increasingly happening online and on social networks, particularly in the fashion and home decor categories, where the visual appeal of a product is the primary factor.



40% OF CONSUMERS ARE TAKING PICTURES OF PRODUCTS THEY FIND IN STORES TO LATER CHECK PRICES ONLINE.

Our feature has three distinct components that can lead to search results:
 

  • Take a Photo – this allows a Customer to use the phone’s camera to take a photo of their surroundings and have the App search that image for similar or matching products. We can see this being used when a Customer is in a friend’s house and likes their style or a specific item, or when they are in a department store and like what is on display.
     

  • Use an image from Camera Roll – choosing this opens the Customer’s camera album, where they can choose a previously taken photo, or an image they have saved from a web search or somewhere like Pinterest. Our hypothesis here is that Customers often save ‘mood boards’ of style images or products that they like and store them for later use. Here they can simply select a saved photo and a search will be fired.
     

  • Use one of ours – lastly, a Customer can choose to use one of our own images, which takes them to a gallery of Instagram images from the @ArgosHome Instagram account as well as Customer Generated Content tagged ‘ArgosHome’. This gives Customers a wealth of inspiration, not only for individual products but also for how a whole room could come together.

CUSTOMER PROBLEMS IDENTIFIED

As a customer, I don't know exactly what I like about a product I found online or in a store. Sitting down to describe it is too much of a cognitive load. Furthermore, there is a gap between how I might describe a product and how a "textual" search engine will. ("I'll know what I like when I see it.")

As a customer, I want to find similar-looking items in different price ranges, so I can buy a look-alike product at a lower price.

DESIGN GOALS & CONCEPT

  • Accessibility is always front of mind for us, so one of the main design objectives was using a limited palette that would meet the AA accessibility standard set by WCAG 2.0, while still being bright and having enough contrast to work in a variety of lighting conditions.

     

  • We also wanted the UI to express our unique brand identity through smart font, colour, and image decisions. We made sure we provided enough branding to give people context within our app, but not so much that it becomes a distraction. We didn’t want branding to add any friction to the use of Visual Search.
     

  • Visual Search is an ambiguous term for those who are not tech (or text!) savvy, so throughout the experience we wanted to use familiar, understandable words and phrases and to keep the interface text clear and concise.
     

  • Success for the feature will be measured both by growth in monthly searches using the feature in the first six months - expected to be at least a 20% month-on-month growth rate on iOS - and by engagement with the search results, expected to be no less than 65%.
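The AA contrast goal above is a mechanical check that can be automated against a palette. The sketch below implements the WCAG 2.0 relative-luminance and contrast-ratio formulas; the hex values used in the example are illustrative placeholders, not actual Argos brand colours.

```python
# WCAG 2.0 contrast check for vetting a palette against the AA standard.
# Hex colours below are hypothetical examples, not real brand values.

def _linearise(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.0 formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a '#RRGGBB' colour, 0.0 (black) to 1.0 (white)."""
    h = hex_colour.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearise(r) + 0.7152 * _linearise(g) + 0.0722 * _linearise(b)

def contrast_ratio(colour_a: str, colour_b: str) -> float:
    """Contrast ratio between two colours, from 1:1 up to 21:1."""
    la, lb = relative_luminance(colour_a), relative_luminance(colour_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(foreground: str, background: str, large_text: bool = False) -> bool:
    """AA requires at least 4.5:1 for normal text and 3:1 for large text."""
    return contrast_ratio(foreground, background) >= (3.0 if large_text else 4.5)

# Example: a hypothetical green CTA colour on a white background.
print(round(contrast_ratio("#007A3D", "#FFFFFF"), 2), passes_aa("#007A3D", "#FFFFFF"))
```

Running every foreground/background pair in a proposed palette through `passes_aa` makes it easy to catch combinations that only fail in edge cases, such as mid-greys on white.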

[Images: Visual Search wireframes and presentation slides]
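The 20% month-on-month target above compounds quickly, which is easy to miss when reading it as a flat rate. A small sketch of the projection, using a made-up baseline of 10,000 monthly searches purely for illustration:

```python
# Projection of the success metric: searches growing 20% month-on-month.
# The 10,000 baseline is a hypothetical figure, not a real Argos number.

def project_searches(baseline: int, monthly_growth: float, months: int) -> list[int]:
    """Monthly search volumes assuming compound month-on-month growth."""
    volumes = []
    current = float(baseline)
    for _ in range(months):
        volumes.append(round(current))
        current *= 1 + monthly_growth
    return volumes

print(project_searches(10_000, 0.20, 6))
# → [10000, 12000, 14400, 17280, 20736, 24883]
```

At 20% compound growth, month six is roughly 2.5x month one, so hitting the target for the full six-month window is a far more ambitious goal than "20%" suggests at first glance.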

OBSTACLES & FULFILLING THE CONCEPT

  • Colours: our core brand colours are essentially pure RGB primaries, which makes it hard to design interfaces that don’t look childish. Colour is an ongoing issue for us, especially as steering clear of colour is against our brand principles, yet it makes the design look quite wireframe-like, with no visual hierarchy between elements. We decided our primary interaction colour for this feature would be green, as it’s the colour of our main CTAs.
     

  • There were two inherent obstacles in this feature:
     

    • Argos does not have a huge depth of range in the categories traditionally associated with Visual Search: home decor and clothing. A consumer who finds inspiration online, on a social network, or in a store and wants to find matching pieces on Argos might feel disappointed when we can't match their intent.
       

    • From a technology perspective, there is a risk that the technology is still not good enough to match consumer expectations. If it cannot detect even the simplest visual features of a product, consumers will inevitably feel short-changed and see the feature as a gimmick rather than something of value to them.

ENABLING SUCCESS FOR MY TEAM

Enabling success and constantly checking team health is a major hygiene factor when embarking on a shiny new product feature. Here are a couple of examples of how I handled that for my team:

 

  • To ensure the team performs effectively, I get involved in the in-depth work of the team when required. When our Product Owner went on holiday for a week, I immediately jumped in to manage the team, running our stand-ups and cleaning up our JIRA board.
     

  • I took action to obtain the partners needed to deliver the best experience possible. Once we sized the API development as too big and complex to handle in-house, I got involved with the engineering team in selecting an API partner that ticked all the boxes.
     

  • I always invited input from team members on decisions that affected them, holding twice-weekly feedback sessions where the whole team would get together, analyse the progress we were making, and have a constructive conversation about not only the UX and visual design but the experience as a whole. Involving the whole team in these sessions is a great way to get different types of feedback, but I was also careful not to let them turn into design by committee by keeping a clear focus on the customer problem.

WHAT I WOULD DO DIFFERENTLY

When you ship a feature, as a designer you look back and think about the things that are clearer in hindsight. They’re not kidding when they say it’s 20/20! Here are some of the things I have ruminated on, and what I would do if I got a “do-over”:


  • I would improve the visual search results, working with our API partner Syte to make proper optimisations, as the results lack expected features (sort/filter) and guidance. This could be seen as a poor reflection on the design thinking in the app and might be a barrier to uptake and use of the feature.

     

  • I would work on a robust on-boarding flow for Visual Search, as test users did not understand its use right away and were slightly confused about how to use it, although once they did, they found it added a lot of value. It’s also possible people don’t realise this feature exists, because it’s easy to miss.

     

  • Not only testing happy paths! I definitely think we should have tested more edge cases with customers, such as understanding of the camera roll, full navigation, saving images, and active/inactive states. But due to limitations with the prototype and the fact that the test device wasn’t the users’ own, we did not do this.
