© unsplash/@dadaben_
Retail, E-commerce & Marketplaces

Semantic product search

Context

E-commerce customers who do not immediately find the product they are looking for often turn to the website's search bar. However, on-site search engines frequently fail to return the desired results. A study by the Baymard Institute found that 70% of e-commerce sites are unable to return relevant results when product-type synonyms are used (e.g. pullover vs. jumper), and 34% cannot cope with single-character typos (e.g. sneakers vs. snaekers). Such failures lead to a poor user experience and lost sales.

Challenges

Configuring a search engine tailored to a specific e-commerce site requires expertise in building NLP applications. Moreover, on-site search should take similarities between articles into account, so that related articles matching the user's expectations can also be displayed.

To ensure the best possible user experience, the availability of an article should be validated before it is displayed, so that in-stock products are prioritized over out-of-stock products; a simple re-ranking step for this is sketched below. Additionally, the product catalogue needs to be updated continuously to reflect buying patterns and trends, increasing the probability that a user converts into a buyer.
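
As an illustration, such a prioritization could be a simple re-ranking step that combines the search relevance score with an availability flag. The field names used here (score, in_stock) are hypothetical and would depend on the shop's product catalogue.

```python
# Minimal sketch of availability-aware re-ranking (hypothetical field names).
# In-stock products are ranked strictly before out-of-stock products;
# within each group, the ordering follows the search relevance score.

def rerank(results):
    """results: list of dicts like {"title": ..., "score": float, "in_stock": bool}"""
    return sorted(results, key=lambda r: (not r["in_stock"], -r["score"]))

hits = [
    {"title": "Blue pullover", "score": 0.91, "in_stock": False},
    {"title": "Navy jumper",   "score": 0.87, "in_stock": True},
    {"title": "Grey sweater",  "score": 0.63, "in_stock": True},
]

for hit in rerank(hits):
    print(hit["title"], hit["score"], "in stock" if hit["in_stock"] else "out of stock")
```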

Potential solution approaches

Semantic search algorithms attempt to capture the meaning of the search string, reflecting the user's intent and context. State-of-the-art language models such as BERT (developed by Google) and GPT-2/3 (developed by OpenAI) are based on the transformer deep learning architecture and provide contextual word embeddings for a wide variety of NLP applications.
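
As a rough sketch of this idea (not the exact setup described above), product texts and the search query can be mapped to dense vectors with a pre-trained transformer and ranked by cosine similarity. The example below assumes the open-source sentence-transformers library; the model name and product texts are only placeholders.

```python
# Minimal sketch of semantic product search with a pre-trained transformer
# (assumes the sentence-transformers library; model name is only an example).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

products = [
    "Men's wool jumper, navy blue",
    "Women's running sneakers, white",
    "Leather wallet with coin pocket",
]

# Encode the catalogue once and the query at search time.
product_embeddings = model.encode(products, convert_to_tensor=True)
query_embedding = model.encode("pullover for men", convert_to_tensor=True)

# Rank products by cosine similarity to the query; a synonym such as
# "pullover" can still retrieve the "jumper", unlike pure keyword matching.
scores = util.cos_sim(query_embedding, product_embeddings)[0]
for idx in scores.argsort(descending=True):
    i = int(idx)
    print(products[i], float(scores[i]))
```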

The biggest advantage of transformers over previous deep learning models is their attention mechanism, which considers a word's bidirectional context rather than processing text only sequentially, and thereby sets each word into context. BERT has been used in Google Search since December 2019. Additionally, fine-tuning on a text corpus of search queries specific to the e-commerce site could improve performance compared to relying only on the general-purpose corpora used to pre-train BERT and other language models.
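
A domain adaptation step of this kind could, for instance, look like the following sketch, which fine-tunes a sentence-transformers model on (search query, matching product) pairs. The pairs shown are made up and the training setup is deliberately minimal.

```python
# Minimal sketch of fine-tuning on shop-specific query/product pairs
# (assumes sentence-transformers; the example pairs are purely illustrative).
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("all-MiniLM-L6-v2")

# Pairs of (user search query, product the user clicked or bought).
train_examples = [
    InputExample(texts=["pullover men", "Men's wool jumper, navy blue"]),
    InputExample(texts=["snaekers", "Women's running sneakers, white"]),
    InputExample(texts=["wallet coins", "Leather wallet with coin pocket"]),
]

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)
# MultipleNegativesRankingLoss treats the other products in a batch as negatives.
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)
model.save("ecommerce-semantic-search-model")
```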

Related webinars

Semantic search and understanding of natural text with neural networks: BERT

In this webinar, you will get an introduction to the application of BERT for semantic search using a real case study: every year, millions of citizens interact with public authorities and are regularly overwhelmed by the technical language used there. We have successfully used BERT to deliver the right answers from government documents in response to colloquial queries, without requiring technical terms in the queries.

Konrad Schultka

Machine Learning Scientist

Jona Welsch

Machine Learning Scientist

Automated answering of questions with neural networks: BERT

In this webinar we will present a method based on the BERT model for automated answering of questions.

Mattes Mollenhauer

Machine Learning Scientist