
IMPROVING PRECISION OF E-COMMERCE SEARCH RESULTS - HAYSTACK Europe 2019, Berlin



  1. IMPROVING PRECISION OF E-COMMERCE SEARCH RESULTS. HAYSTACK Europe 2019 - Berlin, 06.11.2019

  2. ABOUT US: Jens Kürsten, Tech Lead & Developer Search @otto.de; Arne Vogt, Business Designer Search @otto.de

  3. About OTTO and otto.de: founded in 1949; on average 1.6 million visits on otto.de per day; 4,900 employees; up to 10 orders per second; revenue of 3.2 billion Euro in 2018/19; more than 3 million items on otto.de; more than 400 OTTO market partners; approx. 6,800 brands on otto.de; expansion of the business model towards becoming a marketplace. OTTO's headquarters are in Hamburg.

  4. Key Figures for Product Search @otto.de in 2018: ~0.9 million search queries per day on average; ~3 million search queries per day at peak; ~320 million search queries in 2018; ~40 million unique search terms in 2018.

  5. Our Key Requirement for Search Relevance @otto.de: Search relevance @otto.de is determined by our user queries and product data (quality) on the USER side, and by different performance indicators of our products and different business goals for different categories on the BUSINESS side. Finding the balance between the user's intent and the business's perspective is our key requirement for search relevance @otto.de.

  6. WHAT IS THE PROBLEM?

  7. One Challenge w.r.t. Search Relevance @otto.de: Understanding the User's Intent. Query results for category searches are often too fuzzy: recall is good, but precision can be quite bad.

  8. One Challenge w.r.t. Search Relevance @otto.de: Understanding the User's Intent. Fuzzy search results lead to difficulties in ranking.

  9. One Challenge w.r.t. Search Relevance @otto.de: Understanding the User's Intent. Results via navigation deliver much higher precision for the same category.

  10. Topical Relevance vs. Business Value: chart for the query "tie", showing relevance and business value (impact) over rank positions 0 to 70.

  11. HOW IS IMPROVING THE PRECISION GOING TO AFFECT THE USER?

  12. First Business Objective: Search Effectiveness. We regard an order in a search session as a sign of success: sessions that end in an order count as successful search sessions, sessions without an order as unsuccessful ones.

  13. Second Business Objective: Search Efficiency. We regard a search session with fewer search interactions per order as more efficient: 1 search order after 5 search interactions is a ratio of 5:1, 1 search order after 2 search interactions is a ratio of 2:1.

  14. Hypotheses for improving the precision: How will an improvement in precision influence our users? Hypothesis 1: Search Effectiveness. We assume that some of our users have a low involvement in the search task or the online shop. They are easily frustrated by the current lack of precision and leave the shop before they find what they are looking for. → An improvement in precision will therefore lead to a higher search conversion rate. Hypothesis 2: Search Efficiency. We assume that some of our users have a high involvement in the search task. They will tolerate the lack of precision and still find what they are looking for; it just costs them more effort (time, clicks, thoughts). → An improvement in precision will therefore lead to a lower ratio of search interactions to orders.

  15. OUR APPROACH

  16. Our basic discovery approach: in our discoveries we loosely follow the design thinking process: understanding the problem, finding the solution, testing the solution.

  17. Our basic discovery approach: in our discoveries we loosely follow the design thinking process: understanding the problem, finding the solution, testing the solution.

  18. Our Idea for a Solution of the Problem: Automatic Filter Selection. Use the data our customers leave behind.

  19. Our Idea for a Solution of the Problem: Automatic Filter Selection. Use the data our customers leave behind: search term plus product clicks & orders → performance of filter attribute values → filtered, more relevant search results.
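
To make the idea more concrete, here is a minimal sketch, not OTTO's actual pipeline, of how interaction data could be aggregated into a performance score per search term and filter attribute value. The field names (query, product_type, clicks, orders) and the click/order weighting are assumptions for illustration.

```python
from collections import defaultdict

def aggregate_attribute_performance(interactions, order_weight=5.0):
    """Aggregate interaction data into a score per (search term, attribute value).

    `interactions` is an iterable of dicts with the assumed keys
    'query', 'product_type', 'clicks' and 'orders'.
    Returns {query: {attribute_value: score}}.
    """
    scores = defaultdict(lambda: defaultdict(float))
    for row in interactions:
        # Orders are weighted higher than clicks (assumed weighting).
        score = row["clicks"] + order_weight * row["orders"]
        scores[row["query"]][row["product_type"]] += score
    return scores

# Tiny example: for the query "fernseher" the TV product types dominate.
sample = [
    {"query": "fernseher", "product_type": "LED-Fernseher", "clicks": 100, "orders": 8},
    {"query": "fernseher", "product_type": "4k Fernseher", "clicks": 80, "orders": 6},
    {"query": "fernseher", "product_type": "Soundbar", "clicks": 5, "orders": 0},
]
print(aggregate_attribute_performance(sample)["fernseher"])
```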

  20. It took us four iterations to define the prototype. Iteration 1: scope: shaping the prototype; insight: definition of data fields and metrics. Iteration 2: scope: brand searches; insight: potential too low. Iteration 3: scope: category searches; insight: potential ok, but there might be more. Iteration 4: scope: all searches; insight: higher potential, but also higher risk; decision for a cut-off.

  21. Our basic discovery approach: in our discoveries we loosely follow the design thinking process: understanding the problem, finding the solution, testing the solution.

  22. Offline Evaluation of Search Relevance Improvements: OFFLINE, judgements are derived from query and click logs and used for relevance assessment of different configurations; the new configuration then goes ONLINE for on-site testing.

  23. Offline Evaluation Architecture: web shop tracking data (number of queries and clicks per product, optionally sampled, in time slices) is turned into query/judgement score pairs; the configurations under test are run against the queries, and the resulting hits are evaluated with the metrics.
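
As an illustration of that architecture, here is a hedged sketch, with assumed names and data shapes, of how click-through data per query/product pair can serve as graded judgements and how one configuration's hit lists would be scored against them.

```python
def judgements_from_tracking(tracking_rows):
    """Turn tracking rows (query, product_id, impressions, clicks) into graded
    judgements: the click-through rate per query/product pair."""
    return {
        (r["query"], r["product_id"]): r["clicks"] / max(r["impressions"], 1)
        for r in tracking_rows
    }

def score_configuration(run_hits, judgements, k=10):
    """Average graded relevance of the top-k hits over all queries.

    `run_hits` maps each query to the ranked product ids returned by one
    search configuration under test."""
    per_query = []
    for query, hits in run_hits.items():
        grades = [judgements.get((query, p), 0.0) for p in hits[:k]]
        per_query.append(sum(grades) / k)
    return sum(per_query) / len(per_query) if per_query else 0.0
```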

  24. Metrics in the Making (OFFLINE). Topical relevance metrics: Precision@n, NDCG, Average Precision, ERR. Addressing temporal changes in frequency and significance: traffic weight as a metric factor at query level. Addressing significance as a business performance predictor: traffic weight * business importance at query level.
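
Two of these ideas can be combined in a few lines: precision@n per query, aggregated with a traffic weight at query level so that frequent queries count more. A minimal sketch with assumed data shapes:

```python
def precision_at_n(ranked_product_ids, relevant_ids, n=10):
    """Fraction of the top-n results that are judged relevant."""
    top = ranked_product_ids[:n]
    return sum(1 for p in top if p in relevant_ids) / n

def traffic_weighted_metric(per_query_scores, query_traffic):
    """Weight each query's metric value by its share of search traffic."""
    total = sum(query_traffic.values())
    return sum(
        score * query_traffic[query] / total
        for query, score in per_query_scores.items()
    )

# Example: "fernseher" is searched far more often than "curved tv",
# so its precision dominates the aggregate metric.
scores = {"fernseher": 0.8, "curved tv": 0.3}
traffic = {"fernseher": 9000, "curved tv": 1000}
print(traffic_weighted_metric(scores, traffic))  # 0.75
```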

  25. Offline Evaluation Setup for Automatic Filter Selection (OFFLINE). Product data as filter fields: assortment category, product type. Interaction data: clicks, add-to-baskets. Filter value selection based on: x% of interactions. Evaluated metrics: precision@k, average precision@k. We evaluated 12 configurations based on different product data, interaction data and filter/attribute value selection, on a query set with 100,000 entries.
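
One way to read the 12 configurations is as a grid over the three dimensions named above; the concrete coverage cut-off values below are assumptions used only to show the enumeration.

```python
import itertools

filter_fields = ["assortment_category", "product_type"]   # product data (from the slide)
interaction_sources = ["clicks", "add_to_basket"]          # interaction data (from the slide)
coverage_cutoffs = [0.80, 0.90, 0.95]                      # x% of interactions (assumed values)

configurations = list(itertools.product(filter_fields, interaction_sources, coverage_cutoffs))
print(len(configurations))  # 2 * 2 * 3 = 12 configurations
```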

  26. Filter Attribute Value Selection Strategy (OFFLINE). Example for the attribute Produkttyp (product type):
      Produkttyp value   Clicks   Cumulated sum   Coverage
      LED-Fernseher      100      100             50%
      4k Fernseher       80       180             90%
      Curved TV          10       190             95%
      Smart TV           5        195             97.5%
      ...                ...      ...             ...
      ...                ...      200             100%
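
The strategy in the table boils down to a cumulative cut-off: sort the attribute values by clicks and keep them until a chosen share of all clicks is covered. A minimal sketch; the 90% threshold and the small "other" remainder are assumptions.

```python
def select_filter_values(click_counts, coverage_threshold=0.90):
    """Pick the most-clicked attribute values until they cover
    `coverage_threshold` of all clicks for the query."""
    total = sum(click_counts.values())
    selected, covered = [], 0
    for value, clicks in sorted(click_counts.items(), key=lambda kv: -kv[1]):
        selected.append(value)
        covered += clicks
        if covered / total >= coverage_threshold:
            break
    return selected

# Numbers from the table above (remainder lumped into "other"):
clicks = {"LED-Fernseher": 100, "4k Fernseher": 80, "Curved TV": 10, "Smart TV": 5, "other": 5}
print(select_filter_values(clicks))  # ['LED-Fernseher', '4k Fernseher'] at 90% coverage
```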

  27. Offline Evaluation Results for Automatic Filter Selection (OFFLINE).

  28. Offline Evaluation Results for Automatic Filter Selection (OFFLINE): every configuration leads to increased precision.

  29. Offline Evaluation Results for Automatic Filter Selection (OFFLINE): higher attribute granularity leads to higher precision.

  30. Offline Evaluation Results for Automatic Filter Selection (OFFLINE): using click events performs better than using add-to-basket events.

  31. Our basic discovery approach: in our discoveries we loosely follow the design thinking process: understanding the problem, finding the solution, testing the solution.

  32. Technical Integration: business rules are applied in a query preprocessor based on querqy*. For example, the query "krawatte" (German for "tie") is rewritten to add the filter class:krawatten. *https://github.com/renekrie/querqy
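
For illustration, such a rewrite can be written as a querqy common-rules entry roughly like the one below. The rule mirrors the slide; treat it as a sketch rather than OTTO's production rule set, and note that the raw-query form (the leading *) is an assumption about how the filter on the class field is attached.

```
# Sketch of a querqy common rules entry: when the user searches for
# "krawatte" (tie), add a filter on the (assumed) class field.
krawatte =>
  FILTER: * class:krawatten
```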

  33. Query Selection for Auto Filtering: starting from 230k queries, two groups of checks reduce the set to 40k filter rules. 1. No nonsense: identical hit count, 0-hits, unclear judgements. 2. Business rules: no brands, positive metric change, hit set >30.
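
A sketch of how these checks could be applied per candidate query; the direction of each criterion is one plausible reading of the slide, and all field names are assumptions.

```python
def keep_query_for_auto_filtering(candidate):
    """Apply the 'no nonsense' and business-rule checks to one candidate
    query (a dict with assumed keys); return True to create a filter rule."""
    # No nonsense
    if candidate["filtered_hit_count"] == candidate["unfiltered_hit_count"]:
        return False  # identical hit count: the filter changes nothing
    if candidate["filtered_hit_count"] == 0:
        return False  # 0 hits after filtering
    if candidate["judgement_quality"] == "unclear":
        return False
    # Business rules
    if candidate["is_brand_query"]:
        return False  # no brand searches
    if candidate["metric_change"] <= 0:
        return False  # require a positive offline metric change
    if candidate["filtered_hit_count"] <= 30:
        return False  # require a hit set larger than 30
    return True

example = {
    "filtered_hit_count": 120, "unfiltered_hit_count": 4500,
    "judgement_quality": "clear", "is_brand_query": False, "metric_change": 0.04,
}
print(keep_query_for_auto_filtering(example))  # True
```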

  34. User Interaction Challenge

  35. Data Update Challenges: filtering data removes existing interaction patterns; missing "trending" attribute selections may lead to missing products; frequency of interaction data updates.

  36. On-Site Test Results*. Hypothesis 1 (search effectiveness): an improvement in precision will lead to a higher search conversion rate. KPI: search conversion rate. Test result: -0.49%. Hypothesis 2 (search efficiency): an improvement in precision will lead to a lower ratio of search interactions to orders. KPI: ratio of search interactions to search orders. Test result: -0.73% (the lower the better). *Only one week of data, not significant (yet).

  37. We generate data with the A/B test ... and use the insights for the next iteration.
