When I was a kid, I usually went to our local department store when I wanted to buy something. The store attendant would ask me what I wanted and then guide me, question-by-question, to the product that best suited my needs. Depending on what I needed, implicit information like my estimated age, height, or shoe size was automatically considered.
Virtual assistants bring in-store personalization online
Fast-forward to today: I enter an e-commerce website and am bombarded with special offers while trying to locate the search box to enter what I’m looking for – or what I “think” I’m looking for. I try a couple of searches and even use filters to narrow the vast list of results, but finally give up, frustrated at not having found what I was looking for.
Now… what would the perfect online shopping experience look like? Why not start with a chatbot-like or virtual assistant interface? Just a “What would you like to buy?” prompt, a plain text box, and a submit button with both text and voice input options. That’s it.
What’s next? I imagine entering a simple term like “tablet” – using the keyboard or voice. Obviously, this query is broad and ambiguous, but with a search engine in the background, that’s fine. The next step is submitting “tablet” for search and analyzing the facets returned with the search results. The most prominent facet would be “department,” and I would be asked which type of “tablet” I’m after – a computer, a kitchen accessory, or some medicine (with the most probable option first).
Facets list different values of search result characteristics, such as shoe colors, manufacturers, or sizes.
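The facet-driven dialogue step above can be sketched in a few lines. This is a minimal illustration, assuming a simplified search response shape (facet names mapped to value counts) rather than any particular engine’s API:

```python
# Minimal sketch of a facet-driven dialogue step. The facet counts below
# are a hand-made stand-in for whatever search engine (Solr,
# Elasticsearch, ...) runs in the background.

def next_question(facets):
    """Pick the most prominent facet and turn it into a question,
    listing the most probable value first."""
    # Rank facets by how many results they cover in total.
    name, values = max(facets.items(), key=lambda f: sum(f[1].values()))
    ordered = sorted(values, key=values.get, reverse=True)
    return f"Which {name} are you after: {', '.join(ordered)}?"

# Facet counts as they might come back for the broad query "tablet".
facets = {
    "department": {"computers": 812, "kitchen": 45, "pharmacy": 12},
    "color": {"black": 400, "silver": 300},
}
print(next_question(facets))
# → Which department are you after: computers, kitchen, pharmacy?
```

Each answer becomes a filter on the next search, and the loop repeats with the remaining facets.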
I’m neither sick nor much into cooking. So after selecting “computer,” the chatbot would ask if I would prefer an iPad. How come? There may be a couple of reasons. Probably because I’m using an Apple computer right now. Alternatively, the business logic may prefer selling Apple products because of the larger margin.
Actually, I’m looking for something for my kids, so Apple is not an option. After I respond “no,” another search is triggered: a tablet in the computer department, not made by Apple. Again, the search results come with facets, and the most prominent one is the screen size.
After answering this and another set of questions, I’m sure I want the Bansai PlusTab M. I add it to the basket, go to check-out, and wait for the postman to ring with a parcel in his bag.
By now, you should have gotten the idea: powering a conversation with a search engine mimics the “old school” shopping experience. The assistant identifies the customer’s intent by cleverly refining the choices using the most prominent facets. Implicit information, such as profile data (for logged-in users), operating system, or browser type, can help refine the result set further. But make sure to be transparent about using this information with phrases like “I noticed that...,” “You’re using browser X...,” and so on.
But wait, there’s more! If I had entered “red Chucks,” the color facet would be the most prominent one in the search result. But why should the search engine return the obvious? Having already entered “red,” I don’t want to be asked which color I prefer.
The virtual assistant needs to analyze the input. It’s a process called query understanding, enabled by a set of different technologies. The input “red Chucks” would trigger this search: anything with the color “red” (using category snapping), in the “shoes” department, made by “Converse,” model “All Stars” (learned from log analytics and query chaining in this case). If I were logged in as a returning user, the search would also filter for men’s products only (from my profile) and sizes 9 and 9.5 (based on my order history).
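A toy version of this query-understanding step might look as follows. The lookup tables are hand-made stand-ins for mappings that a real system would learn from log analytics and query chaining:

```python
# Hedged sketch of query understanding: map raw tokens onto structured
# filters using lookup tables. The vocabulary and synonym tables are
# illustrative, not a real product catalog.

COLORS = {"red", "blue", "black"}
# Colloquial names learned from logs: "chucks" -> brand, model, department.
SYNONYMS = {
    "chucks": {"department": "shoes", "brand": "Converse", "model": "All Stars"},
}

def understand(query, profile=None):
    """Turn a free-text query into structured search filters."""
    filters = {}
    for token in query.lower().split():
        if token in COLORS:
            filters["color"] = token          # category snapping
        elif token in SYNONYMS:
            filters.update(SYNONYMS[token])   # learned from query logs
    if profile:
        filters.update(profile)               # implicit, profile-based filters
    return filters

print(understand("red Chucks", profile={"gender": "men", "sizes": [9, 9.5]}))
```

The resulting filter set is what actually gets sent to the search engine, in place of the raw text.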
Living in Germany adds another aspect to input analysis: Finnic and Germanic languages use a lot of compound words like Plastikbecher (plastic cup). For these languages, the analysis must first separate the input (plastic and cup) and then decide which information can be used as a filter – plastic as material in this case.
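A naive compound splitter can illustrate the first half of that analysis. This sketch assumes a tiny known vocabulary; production systems use full dictionaries plus rules for linking elements (the “s” in Arbeitsplatz, for example):

```python
# Naive greedy compound splitter for German-style compounds. The
# vocabulary is a hand-made stand-in for a real dictionary.

VOCAB = {"plastik", "becher", "kaffee", "tasse"}

def split_compound(word):
    """Greedily split a compound into known vocabulary words.
    Returns [] if the word cannot be fully decomposed."""
    word = word.lower()
    for i in range(1, len(word)):
        head, tail = word[:i], word[i:]
        if head in VOCAB and (tail in VOCAB or split_compound(tail)):
            return [head] + ([tail] if tail in VOCAB else split_compound(tail))
    return []

print(split_compound("Plastikbecher"))  # → ['plastik', 'becher']
```

The second half – deciding that “plastik” acts as a material filter – would then reuse the same token-to-filter mapping as ordinary query understanding.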
Not knowing what to search for? Image search to the rescue
What if I don’t know what to look for?
You all know the situation: you’re walking along somewhere and see something you like, but it’s not for sale, the shop is closed, you don’t know the product’s manufacturer, or there’s some other reason you can’t buy the item. Words fail to describe it, so I’ll just take a photo and give it to the chatbot. In addition to searching for similar images, the chatbot runs the picture through image detection and triggers a text search. Again, the chatbot lets me know what is happening and reveals the text description used for the search – which, by the way, is treated the same way as if I had entered a search query using text or voice.
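The photo-to-search flow described above reduces to a short pipeline. Note that `detect_labels` here is a hypothetical stand-in for an image-recognition service, returning canned labels for the demo:

```python
# Sketch of the photo-to-search flow: detect labels in the image, build
# a text query from them, and reuse the normal text search path.

def detect_labels(image_bytes):
    # Placeholder: a real system would call an image-recognition model.
    return ["sneaker", "red", "canvas"]

def photo_search(image_bytes, text_search):
    labels = detect_labels(image_bytes)
    query = " ".join(labels)
    # Be transparent: reveal the text description used for the search.
    print(f'I see: "{query}" - searching for that now.')
    return text_search(query)

results = photo_search(b"...", lambda q: [f"hit for {q}"])
```

Because the detected labels go through the same text-search path as typed or spoken input, all of the query understanding described earlier applies to photos for free.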
According to Smarter with Gartner*, "by 2021, early adopter brands that redesign their websites to support visual and voice search will increase digital commerce revenue by 30%."
Power the “no search” e-commerce experience
What if I needed advice? Maybe I want to know what I should buy and why. The virtual assistant should know which products have scored best in tests and tell me about their advantages and shortcomings. It might also consider which products need to be cleared out of stock when making the suggestions. In this case, the advice can be somewhat biased (the e-commerce site wants to make money, after all). Again, search to the rescue! However, this time, the magic is not so much on the search side, but more on the indexing side, i.e. content processing.
This process relies heavily on natural language processing (NLP) and understanding (NLU) techniques which are used not only to index text but also to extract meaning. Information on products is widely available on the web, e.g. magazines or forums. Using techniques such as document splitting (identifying top 10 list entries), entity extraction (product features, advantages, and shortcomings), or sentiment analysis (quality or popularity), the virtual assistant can suggest a product and explain why the personalized suggestion was made.
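At index time, the pros-and-cons extraction can be sketched with a toy sentiment word list. Real pipelines use proper NLP/NLU models for entity extraction and sentiment analysis; this only illustrates the shape of the processing:

```python
# Minimal index-time sketch: split review text into sentences and tag
# each as an advantage or shortcoming via a tiny sentiment word list.

POSITIVE = {"great", "excellent", "long"}
NEGATIVE = {"poor", "dim", "short"}

def extract_pros_cons(review):
    """Bucket review sentences into advantages and shortcomings."""
    pros, cons = [], []
    for sentence in review.split("."):
        words = set(sentence.lower().split())
        if words & POSITIVE:
            pros.append(sentence.strip())
        elif words & NEGATIVE:
            cons.append(sentence.strip())
    return {"advantages": pros, "shortcomings": cons}

review = "The battery life is excellent. The screen is dim outdoors."
print(extract_pros_cons(review))
```

Once extracted and indexed alongside the product, these snippets are what let the assistant explain *why* it suggests a product, not just *what* it suggests.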
Of course, I wouldn’t want to miss any of the existing features. I would like to be identified as a returning customer and asked whether I need to restock my coffee capsules (it’s the end of my usual buying interval), how I like my new Bansai PlusTab M (a recent purchase), or whether I’d fancy a Bluetooth keyboard for my new tablet (others have bought one with their new tablet).
One last thing I want to mention from the search engine perspective: Make sure to process all input properly. Harmonize data during indexing (e.g. mapping color names like “Nightfire” and “Burgundy” to “red”) and try to understand the user (e.g. mapping colloquial to product names). Also, remember that search is not static. Analyze your log files regularly and introduce a continuous improvement process like Engine Scoring. Watch our 21-minute webinar to learn about the tips and best practices for quickly identifying and improving common e-commerce search functionalities.
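The index-time harmonization step can be made concrete with a small mapping table. The table below is illustrative, not a real product feed:

```python
# Index-time harmonization sketch: map vendor color names onto canonical
# colors so that a search for "red" also matches "Nightfire".

CANONICAL_COLORS = {
    "nightfire": "red",
    "burgundy": "red",
    "navy": "blue",
}

def harmonize(product):
    """Add a canonical color field to a product document at index time."""
    raw = product.get("color", "").lower()
    product["color_canonical"] = CANONICAL_COLORS.get(raw, raw)
    return product

doc = harmonize({"name": "Chucks", "color": "Nightfire"})
print(doc["color_canonical"])  # → red
```

The same pattern works in the other direction at query time, mapping colloquial user terms onto catalog names.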
Wrapping things up, I’m looking forward to this revolution in the e-commerce search experience. It’s not so much about having a chatbot instead of a search box – rather, it’s the backend content processing and intent understanding powering the chatbot that will make the difference. I would enjoy being advised on the best product to buy, just like when I was a kid, without having to do all the research myself and risking buying the wrong product.
*"Gartner Top Strategic Predictions for 2018 and Beyond" authored by Kasey Panetta was published on October 3, 2017. To read more, visit: https://www.gartner.com/smarterwithgartner/gartner-top-strategic-predictions-for-2018-and-beyond/