Future of Personalized Recommendation Systems
Xing Xie, Microsoft Research Asia
Recommendation Everywhere
Personalized News Feed
Online Advertising
History
• 1990s (Tapestry, GroupLens): collaborative filtering (CF) and content-based (CB) filtering
• 2006 (Netflix Prize): factorization-based models, e.g., SVD++
• 2010 (various data competitions): hybrid models with machine learning (ML), e.g., LR, FM, GBDT, and pair-wise ranking
• 2015 (deep learning): flourishing of neural models (DL), e.g., PNN, Wide&Deep, DeepFM, xDeepFM
• Ongoing directions: explainable recommendation, knowledge enhanced recommendation, reinforcement learning, transfer learning
Our Research
• Deep learning based user modeling (user side)
• Deep learning based recommendation (item side)
• Explainable recommendation
• Knowledge enhanced recommendation
Microsoft Recommenders • Helping researchers and developers to quickly select, prototype, demonstrate, and productionize a recommender system • Accelerating enterprise-grade development and deployment of a recommender system into production • https://github.com/microsoft/recommenders
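Below is a minimal usage sketch following the SAR quick-start in that repository; it assumes the current `recommenders` package layout and the default MovieLens column names, which may differ across package versions.

```python
# Minimal sketch: train a simple SAR baseline with Microsoft Recommenders.
from recommenders.datasets import movielens
from recommenders.datasets.python_splitters import python_random_split
from recommenders.models.sar import SAR

# Load the MovieLens 100k ratings as a pandas DataFrame and split it.
data = movielens.load_pandas_df(size="100k")
train, test = python_random_split(data, ratio=0.75)

# Neighborhood-based model as a quick, interpretable baseline.
model = SAR(
    col_user="userID",
    col_item="itemID",
    col_rating="rating",
    col_timestamp="timestamp",
    similarity_type="jaccard",
)
model.fit(train)

# Top-10 recommendations per user, excluding items already seen in training.
top_k = model.recommend_k_items(test, top_k=10, remove_seen=True)
print(top_k.head())
```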
User Behavioral Data
Explicit User Representation
• Demographic: age, gender, life stage, marital status, residence, education, vocation
• Personality: openness, conscientiousness, extraversion, agreeableness, neuroticism, impulsivity, novelty-seeking, indecisiveness
• Interests: food, book, movie, music, sport, restaurant
• Status: emotion, event, health, wealth, device, vacation
• Social: friend, coworker, spouse, children, other relatives, tie strength
• Schedule: task, driving route, metro/bus line, appointment
Explicit vs Implicit User Representation
• Explicit representation: feature engineering over IDs, texts, images, and networks, fed into classification/regression models
  • Pros: easy to understand; can be directly bid on by advertisers
  • Cons: training data are hard to obtain; difficult to satisfy complex and global needs
• Implicit representation: deep models (ID embedding, text embedding, image embedding, network embedding) producing user and item embeddings
  • Pros: unified and heterogeneous user representation; end-to-end learning
  • Cons: difficult to explain; needs fine-tuning for each task
Query Log based User Modeling
• Example queries: gifts for classmates, groom to bride gifts, cool math games, tie clips, mickey mouse cartoon, philips shaver, shower chair for elderly, lipstick color chart, presbyopic glasses, womans ana blouse, costco hearing aids, Dior Makeup
Chuhan Wu, Fangzhao Wu, Junxin Liu, Shaojian He, Yongfeng Huang, Xing Xie, Neural Demographic Prediction using Search Query, WSDM 2019
Query Log based User Modeling
• Different words may have different importance
• The same word may have different importance in different contexts
• Different records have different informativeness
• Neighboring records may be related, while distant ones usually are not
• Example queries: birthday gift for grandson, central garden street, google my health plan, medicaid new York, medicaid for elderly in new York, alcohol treatment, amazon.com, documentary grandson youtube
Query Log based User Modeling
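To make the observations above concrete (word-level importance and record-level informativeness), here is a simplified hierarchical attention sketch in PyTorch. Class names, layer sizes, and the six-way age classifier are placeholders, not the exact architecture from the WSDM 2019 paper.

```python
import torch
import torch.nn as nn


class Attention(nn.Module):
    """Additive attention that pools a sequence of vectors into one vector."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.query = nn.Linear(dim, 1, bias=False)

    def forward(self, x):                                  # x: (batch, seq, dim)
        scores = self.query(torch.tanh(self.proj(x)))      # (batch, seq, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * x).sum(dim=1)                    # (batch, dim)


class HierarchicalQueryUserModel(nn.Module):
    """Word-level attention inside each query, query-level attention across queries."""
    def __init__(self, vocab_size, emb_dim=128, num_classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_attn = Attention(emb_dim)    # different words get different importance
        self.query_attn = Attention(emb_dim)   # different queries get different informativeness
        self.classifier = nn.Linear(emb_dim, num_classes)

    def forward(self, query_word_ids):                     # (batch, n_queries, n_words)
        b, q, w = query_word_ids.shape
        words = self.embed(query_word_ids.view(b * q, w))  # (b*q, w, dim)
        query_vecs = self.word_attn(words).view(b, q, -1)  # (b, q, dim)
        user_vec = self.query_attn(query_vecs)             # (b, dim)
        return self.classifier(user_vec)                   # e.g., logits over age categories
```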
Experiments
• Dataset: 15,346,617 users in total with age category labels; randomly sampled 10,000 users for experiments
• Search queries posted from October 1, 2017 to March 31, 2018
• Figures: mapping between age category and age range; distributions of query number per user, query length, and age category
Experiments
• Compared methods: linear models with discrete features, linear models with continuous features, flat DNN models, and the hierarchical LSTM model
User Age Inference
• Figure: example queries from a young user vs. queries from an elderly user
Car / Pet Segment
Universal User Representation
• Existing user representation learning methods are task-specific
  • Difficult to generalize to other tasks
  • Highly reliant on labeled data
  • Costly to exploit heterogeneous unlabeled user behavior data
• Learn universal user representations from heterogeneous, multi-source user data
  • Capture global patterns of online users
  • Easily applied to different tasks as additional user features
  • Do not rely on manually labeled data
Deep Learning Based Recommender System
• Learning latent representations
• Learning feature interactions
Motivations • We try to design a new neural structure that • Automatically learns explicit high-order interactions • Vector-wise interaction, rather than bit-wise • Different types of feature interactions can be combined easily • Goals • Higher accuracy • Reducing manual feature engineering work Jianxun Lian, Xiaohuan Zhou, Fuzheng Zhang, Zhongxia Chen, Xing Xie, Guangzhong Sun, xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems, KDD 2018
Compressed Interaction Network (CIN)
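As a compact illustration of what one CIN layer computes: vector-wise (rather than bit-wise) interactions between the previous layer's feature maps and the raw field embeddings, compressed by learned feature maps. The shapes and the 1x1-convolution implementation trick below are illustrative, not the paper's exact code.

```python
import torch
import torch.nn as nn


class CINLayer(nn.Module):
    """Computes feature maps X^k from X^{k-1} and the raw field embeddings X^0."""
    def __init__(self, num_fields_0, num_fields_prev, num_feature_maps):
        super().__init__()
        # One weight matrix W^{k,h} of shape (num_fields_prev, num_fields_0) per
        # feature map h, implemented as a 1x1 convolution over pairwise products.
        self.conv = nn.Conv1d(num_fields_prev * num_fields_0, num_feature_maps, kernel_size=1)

    def forward(self, x_prev, x0):
        # x_prev: (batch, H_{k-1}, D), x0: (batch, m, D)
        # Hadamard products of every pair of rows, kept vector-wise along D.
        z = torch.einsum("bhd,bmd->bhmd", x_prev, x0)   # (batch, H_{k-1}, m, D)
        z = z.reshape(z.size(0), -1, z.size(-1))        # (batch, H_{k-1}*m, D)
        return torch.relu(self.conv(z))                 # (batch, H_k, D)
```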
Relation with CNN
• Figure: analogy with an image CNN (feature maps 1…H, embedding dimension D, direction of filter sliding)
Extreme Deep Factorization Machine (xDeepFM) • Combines explicit and implicit feature interaction networks • Integrates both memorization and generalization
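A rough sketch of how the three parts can be combined, reusing the CINLayer sketch above. For simplicity the linear part here takes the flattened field embeddings rather than raw sparse features, and all sizes are illustrative rather than the paper's configuration.

```python
import torch
import torch.nn as nn


class XDeepFMSketch(nn.Module):
    """Linear part (memorization) + CIN (explicit interactions) + DNN (implicit interactions)."""
    def __init__(self, num_fields, emb_dim, feature_maps=(16, 16), dnn_hidden=(128, 64)):
        super().__init__()
        self.cin_layers = nn.ModuleList()
        prev = num_fields
        for h in feature_maps:
            self.cin_layers.append(CINLayer(num_fields, prev, h))  # from the sketch above
            prev = h
        dnn, in_dim = [], num_fields * emb_dim
        for h in dnn_hidden:
            dnn += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        self.dnn = nn.Sequential(*dnn)
        self.linear = nn.Linear(num_fields * emb_dim, 1)
        self.out = nn.Linear(1 + sum(feature_maps) + dnn_hidden[-1], 1)

    def forward(self, field_emb):                        # field_emb: (batch, m, D)
        x0, xk, cin_pooled = field_emb, field_emb, []
        for layer in self.cin_layers:
            xk = layer(xk, x0)
            cin_pooled.append(xk.sum(dim=2))             # sum-pool each feature map over D
        flat = field_emb.flatten(1)
        parts = [self.linear(flat)] + cin_pooled + [self.dnn(flat)]
        return torch.sigmoid(self.out(torch.cat(parts, dim=1)))
```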
Data • Criteo: ads click-through-rate prediction • Dianping: restaurant recommendation • Bing News: news recommendation
Experiments
Experiments
Knowledge Graph • A kind of semantic network in which nodes represent entities or concepts and edges represent the semantic relations between them
Knowledge Enhanced Recommendation
• Precision: more semantic content about items; deeper user interest
• Diversity: different types of relations in the knowledge graph; extend user interests along different paths
• Explainability: connect user interests with recommendation results; improve user satisfaction and boost user trust
Knowledge Graph Embedding
• Learns a low-dimensional vector for each entity and relation in the KG, preserving its structural and semantic knowledge
• Distance-based models
  ❑ Apply a distance-based score function to estimate the probability of a triple
  ❑ TransE, TransH, TransR, etc.
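As an illustration of the distance-based family, here is a minimal sketch of the TransE scoring idea with toy vectors; it is not a training procedure, only the score function.

```python
import numpy as np


def transe_score(head, relation, tail, norm=1):
    """Distance-based score: plausible triples have head + relation close to tail."""
    return -np.linalg.norm(head + relation - tail, ord=norm)


# A plausible triple scores higher (less negative) than a corrupted one.
h, r, t = np.array([0.2, 0.1]), np.array([0.3, -0.1]), np.array([0.5, 0.0])
t_corrupted = np.array([-0.4, 0.9])
assert transe_score(h, r, t) > transe_score(h, r, t_corrupted)
```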
Knowledge Graph Embedding
• Matching-based models
  ❑ Apply a similarity-based score function to estimate the probability of a triple
  ❑ SME, NTN, MLP, NAM, etc.
Knowledge Graph Embedding
• Joint training: KGE learning and RS learning are trained together, producing entity/relation vectors and user/item vectors in one model
• Successive training: entity/relation vectors are learned from the KG first, then fed into the RS task to learn user/item vectors
• Alternate training: KGE learning and RS learning alternate, exchanging entity/relation vectors and user/item vectors
Deep Knowledge-aware Network Hongwei Wang, Fuzheng Zhang, Xing Xie, Minyi Guo, DKN: Deep Knowledge-Aware Network for News Recommendation, WWW 2018
Deep Knowledge-aware Network
Extract Knowledge Representations • Additionally use contextual entity embeddings to capture structural information • The context of an entity is its set of one-step neighbors in the knowledge graph
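A small sketch of the contextual entity embedding described above: the context of an entity is its one-step neighbors in the KG, and its context embedding is the average of their embeddings. Entity names and vectors below are made-up examples.

```python
import numpy as np


def context_embedding(entity, neighbors, entity_emb):
    """Average the embeddings of an entity's one-step neighbors in the KG."""
    ctx = neighbors.get(entity, [])
    if not ctx:
        return entity_emb[entity]       # fall back to the entity's own embedding
    return np.mean([entity_emb[e] for e in ctx], axis=0)


entity_emb = {"EntityA": np.array([0.1, 0.3]),
              "EntityB": np.array([0.4, 0.2]),
              "EntityC": np.array([0.0, 0.5])}
neighbors = {"EntityA": ["EntityB", "EntityC"]}
print(context_embedding("EntityA", neighbors, entity_emb))   # -> [0.2, 0.35]
```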
Deep Knowledge-aware Network
Experiments
Examples
Ripple Network • The user's interests are taken as seed entities and propagated through the graph step by step • Preference decays during propagation Hongwei Wang, et al., Ripple Network: Propagating User Preferences on the Knowledge Graph for Recommender Systems, CIKM 2018
Ripple Network
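An illustrative sketch of the propagation-with-decay intuition described above: seed the user's clicked entities, spread preference to multi-hop KG neighbors, and attenuate it at each hop. The hop count and decay factor are made-up parameters, not the paper's learned attention weights.

```python
from collections import defaultdict


def propagate_preference(seed_entities, neighbors, hops=2, decay=0.5):
    """Return a preference weight per entity after multi-hop propagation with decay."""
    preference = defaultdict(float)
    frontier = {e: 1.0 for e in seed_entities}          # hop 0: the user's clicked entities
    for _ in range(hops + 1):
        for entity, weight in frontier.items():
            preference[entity] += weight
        next_frontier = defaultdict(float)
        for entity, weight in frontier.items():
            for nbr in neighbors.get(entity, []):
                next_frontier[nbr] += weight * decay    # preference decays at each hop
        frontier = next_frontier
    return dict(preference)


kg = {"MovieA": ["DirectorX", "ActorY"], "DirectorX": ["MovieB"]}
print(propagate_preference(["MovieA"], kg, hops=2))
# {'MovieA': 1.0, 'DirectorX': 0.5, 'ActorY': 0.5, 'MovieB': 0.25}
```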
Experiments
Example
Explainable Recommendation Systems
• Two dimensions: model explainability and presentation quality
• Goals: effectiveness, transparency, persuasiveness, trust, readability
Explainable Recommendation Systems
• Example explanations:
  • "1-800-FLOWERS.COM – Elegant Flowers for Lovers"
  • Fog Harbor Fish House: "Their tan tan noodles are made of magic. The chili oil is really appetizing. However, prices are on the high side."
• Two dimensions: model explainability and presentation quality; goals: effectiveness, transparency, persuasiveness, trust, readability
Problem Definition
• Input
  • User set 𝑉; 𝑣 ∈ 𝑉 is a user with a user ID and user attributes
  • Item set 𝑊; 𝑤 ∈ 𝑊 is an item with an item ID
  • 𝑚_𝑘: the 𝑘-th interpretable component
  • A recommendation model to be explained, 𝑔(𝑣, 𝑤)
• Output
  • Explanation 𝑨 = expgen(·), generated from the selected interpretable components
  • A binary indicator 𝑧_𝑘 per component: 𝑧_𝑘 = 1 if the 𝑘-th interpretable component is selected, 𝑧_𝑘 = 0 otherwise
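An illustrative toy sketch of this definition: an explanation generator scores the interpretable components for a user-item pair, selects a subset (the binary vector 𝑧), and builds the explanation 𝑨 from the selected ones. The scoring function and the text template below are hypothetical.

```python
def exp_gen(user, item, components, score_fn, top_n=2):
    """Pick the top-n components by relevance; z marks which components are selected."""
    ranked = sorted(components, key=lambda m: score_fn(user, item, m), reverse=True)
    z = {m: m in ranked[:top_n] for m in components}   # z_k = 1 means component k is selected
    explanation = "Recommended because of " + ", ".join(m for m in components if z[m])
    return z, explanation


# Toy usage with a made-up scoring function.
score = lambda v, w, m: {"price": 0.2, "taste": 0.9, "service": 0.7}[m]
z, text = exp_gen("user_1", "restaurant_42", ["price", "taste", "service"], score)
print(text)   # Recommended because of taste, service
```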
Outline
• Pipeline: users 𝑉 and items 𝑊 → recommendation model 𝑔(𝑣, 𝑤) → recommended items → explanation method → explanation 𝑨
• Can we enhance persuasiveness (presentation quality) in a data-driven way?
  • Feedback-Aware Generative Model; shipped to Bing Ads, revenue increased by 0.5%
• Can we build an explainable deep model (enhance model explainability)?
  • Explainable Recommendation Through Attentive Multi-View Learning, AAAI 2019
• Can we design a pipeline that better balances presentation quality and model explainability?
  • A Reinforcement Learning Framework for Explainable Recommendation, ICDM 2018
Explainable Recommendation for Ads
• Search ads; native ads on MSN and Outlook.com
• Stakeholders: advertiser and platform