© unsplash/@dadaben_


Semantic product search

Use Case
Commerce

Context

E-commerce customers who do not immediately find their desired product might turn to the website's search bar. However, on-site search engines often fail to return the desired results. A study by the Baymard Institute revealed that 70% of all e-commerce sites are unable to return relevant results when product-type synonyms (e.g. pullover vs. jumper) are used, and 34% cannot cope with single-character typos (e.g. sneakers vs. snaekers). These shortcomings lead to a poor user experience and lost sales.
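The typo problem in particular can be illustrated with a minimal sketch: a fuzzy lookup over catalogue terms using Python's standard-library `difflib`. The mini-catalogue and the cutoff value are invented for illustration; a production system would use a proper fuzzy index.

```python
from difflib import get_close_matches

# Hypothetical mini-catalogue of product-type terms (illustrative only).
catalogue_terms = ["sneakers", "pullover", "jumper", "jeans", "jacket"]

def tolerant_lookup(query, terms):
    """Return catalogue terms that approximately match the (possibly
    misspelled) query, based on difflib's similarity ratio."""
    return get_close_matches(query.lower(), terms, n=3, cutoff=0.7)

print(tolerant_lookup("snaekers", catalogue_terms))  # ['sneakers']
```

A plain keyword match would return nothing for "snaekers"; the similarity-based lookup still resolves it to "sneakers". Synonyms such as pullover/jumper, however, are not caught this way, which is where semantic approaches come in.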

Challenges

Configuring a search engine specific to an e-commerce site requires expertise in building NLP applications. Moreover, the on-site search should take into account similarities between articles, so that similar articles matching the user's expectations can be displayed.

To ensure the best possible user experience, the availability of an article should be validated before it is displayed, so that in-stock products are prioritized over out-of-stock ones. Additionally, the product catalogue needs to be updated continuously to reflect buying patterns and trends, increasing the probability that a user converts into a buyer.
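One simple way to realize such prioritization is to combine the search engine's relevance score with an availability boost at ranking time. The following sketch assumes a precomputed relevance score per product; the products, scores, and the boost factor are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    relevance: float  # similarity score from the search engine (0..1)
    in_stock: bool

# Hypothetical result set (illustrative values).
results = [
    Product("red pullover", 0.92, in_stock=False),
    Product("crimson jumper", 0.88, in_stock=True),
    Product("red hoodie", 0.75, in_stock=True),
]

def rank(products, stock_boost=0.2):
    """Sort by relevance, boosting in-stock items by a fixed bonus so that
    available products surface first when scores are close."""
    return sorted(
        products,
        key=lambda p: p.relevance + (stock_boost if p.in_stock else 0.0),
        reverse=True,
    )

for p in rank(results):
    print(p.name, p.in_stock)
```

With these values, the slightly less relevant but available "crimson jumper" is shown before the out-of-stock "red pullover". The boost factor controls how strongly availability outweighs pure relevance.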

Potential solution approaches

Semantic search algorithms attempt to capture the meaning of the search string, reflecting the user's intent and context. State-of-the-art models for a variety of NLP applications are BERT (developed by Google) and GPT-2/3 (developed by OpenAI), which are based on the transformer deep learning architecture and provide contextual word embeddings.

The biggest advantage over previous deep learning models is the attention mechanism, which takes bidirectional, and not only sequential, context into account and thereby sets words into context. BERT has been used in Google Search since December 2019. Additionally, fine-tuning on a text corpus containing search strings specific to the e-commerce site could improve performance over the general-purpose corpora on which BERT and other pre-trained language models were trained.
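At retrieval time, the core operation is simple regardless of which model produced the embeddings: rank catalogue items by cosine similarity between the query embedding and each product embedding. The sketch below uses invented 3-dimensional toy vectors standing in for real BERT embeddings (which would have hundreds of dimensions); "pullover" and "jumper" are deliberately placed close together to mimic how a semantic model encodes synonyms.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy vectors standing in for real sentence embeddings (illustrative values).
product_embeddings = {
    "pullover": [0.90, 0.10, 0.00],
    "jumper":   [0.85, 0.15, 0.05],
    "sneakers": [0.10, 0.90, 0.20],
}

def semantic_search(query_vec, catalogue):
    """Rank catalogue items by embedding similarity to the query vector."""
    return sorted(
        catalogue,
        key=lambda name: cosine(query_vec, catalogue[name]),
        reverse=True,
    )

# A query embedded near "pullover" also retrieves the synonym "jumper"
# ahead of unrelated items, which plain keyword matching would miss.
print(semantic_search([0.88, 0.12, 0.02], product_embeddings))
```

In practice the query would be embedded on the fly by the language model, while product embeddings are precomputed and stored in a vector index for fast nearest-neighbour lookup.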
