Adaptive Hierarchical Translation-based Sequential Recommendation
Yin Zhang, Yun He, Jianling Wang, James Caverlee
Department of Computer Science and Engineering, Texas A&M University, USA
How to model user sequences? Each user's sequence (e.g., User 1's sequence, User 2's sequence) is highly personalized and follows chronological order.
How to model user sequences? Item relations [1]:
• Complementary: items that go well with each other, e.g., Mac Pro and Mac Pro charger.
• Substitute: items that are interchangeable, e.g., Mac Pro and ThinkPad.
Although user sequences are highly personalized and chronologically ordered, they still exhibit general sequence patterns and item-relation patterns: users have personally preferred items, their behavior changes over time, and the items they interact with are related to each other.
[1] McAuley, Julian, et al. "Image-based recommendations on styles and substitutes." SIGIR, 2015.
Our Goal: improving sequential recommendation with item multi-relations (e.g., complementary and substitute) inside users' dynamic sequences.
Our goal, illustrated. Input: user sequences (User 1's sequence, User 2's sequence, ...), which change dynamically, together with item complementary and substitute relations. Output: each user's next interacted item.
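To make the task setup concrete, here is a minimal sketch of the input/output format implied by this slide; the concrete representation (dicts, string ids, relation labels) is an illustrative assumption, not the authors' data format.

```python
# Input 1: each user's chronologically ordered interaction sequence.
user_sequences = {
    "user_1": ["item_a", "item_b", "item_c"],
    "user_2": ["item_d", "item_a", "item_e"],
}

# Input 2: item multi-relations given as labeled pairs.
item_relations = [
    ("item_a", "item_b", "complementary"),
    ("item_a", "item_d", "substitute"),
]

# Output: for each user, the predicted next interacted item, e.g.
# predictions = {"user_1": "item_e", "user_2": "item_b"}
```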
How to consider item relations in user sequences? Challenge 1: item relation data is extremely sparse; how can it be generalized across time for sequential recommendation? Item relation data is sparse but stable, while user sequences are highly personalized and dynamic, with items updating over time. Example: at the item level, the 2019 ThinkPad complements the 2019 ThinkPad charger and the 2020 ThinkPad complements the 2020 ThinkPad charger; at the category level, Computer complements Computer Charger, which remains stable as items update.
Challenge 2: how to effectively capture items' multiple relations inside users' dynamic sequences? Category-level item relations are static pairs, while user sequences change dynamically and carry order.
As user sequences change, the relevant item relations (complementary, substitute) can also change dynamically.
Challenge 3: how to model the dynamic joint influence of users' personal preferences and item relations? A user's sequence reflects personally preferred items, behavior that changes over time, and relations among the interacted items.
Sequential Recommendation Methods
• Sequence-pattern methods: focus on the diverse patterns of item orders in user sequences, e.g., Markov Chains, RNNs, CNNs, SASRec [1].
• Translation-based methods, e.g., TransRec [2]: based on TransE, a user connects items to model 'higher-order' interactions: item embedding + user translation vector r_u ≈ next item embedding. Since r_u stays the same across the user's sequence, these models assume a user's translation behavior is the same across time.
• Hierarchical Translation-based Recommendation (HierTrans): the translation vector can adaptively change according to both the user's preference and the relations of her recently interacted items, making it more flexible to capture complex dynamic user preferences over time.
[1] Kang et al. "Self-attentive sequential recommendation." ICDM, 2018. [2] He et al. "Translation-based recommendation." RecSys, 2017.
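To ground the comparison above, here is a minimal sketch of the TransRec-style scoring idea with a fixed per-user translation vector; the embeddings, dimensions, and names are illustrative assumptions, not the papers' implementations.

```python
import numpy as np

# Translation-based scoring sketch: prev_item + r_u should land near next_item.
rng = np.random.default_rng(0)
d, n_items, n_users = 16, 100, 10
item_emb = rng.normal(size=(n_items, d))    # (assumed) learned item embeddings
user_trans = rng.normal(size=(n_users, d))  # one fixed translation vector per user

def transrec_score(u, prev_item, next_item):
    """Higher score = prev_item + r_u is closer to next_item in embedding space."""
    return -np.linalg.norm(item_emb[prev_item] + user_trans[u] - item_emb[next_item])

# Rank all items as candidates for user 0's next interaction after item 5.
scores = np.array([transrec_score(0, 5, j) for j in range(n_items)])
top_k = np.argsort(-scores)[:10]
```

Because `user_trans[u]` never changes along the sequence, this baseline cannot adapt to the relations among recently interacted items, which is the gap HierTrans targets.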
Proposed Method: HierTrans. 1. Building the hierarchical temporal graph. 2. Recommendation with HierTrans.
Building the Hierarchical Temporal Graph
Item multi-relations graph G_C (nodes: categories; edges: item relations): we extend item-level relations to the category level. G_C captures category-level semantic relation information, i.e., if item i complements/substitutes item j, then the category of item i is close to the category of item j under the complement/substitute relation.
• Construction: if item i complements/substitutes item j, connect the category of item i to the category of item j with a complement/substitute edge (details in the paper and [1]).
• G_C is dense and generalizes across dynamic changes.
Dynamic user interactions graph G_I (nodes: items; edges: per-user sequence transitions): items are connected based on users' dynamic sequences, so the node connections change dynamically as user sequences change.
• Construction: in user u's sequence, if item j directly follows item i, connect i to j with an edge labeled by user u.
• The graph structure benefits learning of different patterns in the user sequences.
[1] Wang, Zihan, et al. "A path-constrained framework for discriminating substitutable and complementary products in e-commerce." WSDM, 2018.
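A minimal sketch of the two graph constructions described above, using plain dictionaries; the data format, item ids, and variable names are assumptions for illustration, not the authors' code.

```python
from collections import defaultdict

# Item-level relation pairs: (item_i, item_j, relation), relation in {"comp", "sub"}.
item_relations = [("thinkpad_2019", "thinkpad_charger_2019", "comp"),
                  ("thinkpad_2020", "thinkpad_charger_2020", "comp"),
                  ("macbook_pro", "thinkpad_2020", "sub")]
item2cat = {"thinkpad_2019": "computer", "thinkpad_2020": "computer",
            "macbook_pro": "computer",
            "thinkpad_charger_2019": "computer_charger",
            "thinkpad_charger_2020": "computer_charger"}

# G_C: category-level multi-relation graph (dense, stable over time).
G_C = defaultdict(set)
for i, j, rel in item_relations:
    G_C[(item2cat[i], rel)].add(item2cat[j])
    G_C[(item2cat[j], rel)].add(item2cat[i])

# G_I: dynamic user-interaction graph; edge (i -> j) labeled by user u
# whenever j directly follows i in u's sequence.
user_sequences = {"u1": ["thinkpad_2020", "thinkpad_charger_2020", "macbook_pro"]}
G_I = defaultdict(set)
for u, seq in user_sequences.items():
    for i, j in zip(seq, seq[1:]):
        G_I[(i, j)].add(u)
```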
Building the Hierarchical Temporal Graph (continued). r_b: a "belongs-to" relation that connects the two graphs, aggregating the influence from both item multi-relations and user sequences.
• Construction: if item node i in G_I belongs to category node c in G_C, connect node i to node c by r_b.
• Given a user sequence, by following r_b we can easily capture each item's category-level relations inside the sequence.
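A minimal sketch of the "belongs-to" (r_b) linkage between the item graph G_I and the category graph G_C; the data and names below are illustrative assumptions, not the paper's implementation.

```python
# r_b edges: each item maps to its category node in G_C.
item2cat = {"thinkpad_2020": "computer",
            "thinkpad_charger_2020": "computer_charger"}
# Category-level relations (a toy slice of G_C).
G_C = {("computer", "comp"): {"computer_charger"},
       ("computer_charger", "comp"): {"computer"}}

def category_relations_in_sequence(sequence, relation="comp"):
    """For each item in a user's sequence, follow r_b to its category and
    read off that category's neighbors under the given relation in G_C."""
    return {item: G_C.get((item2cat[item], relation), set()) for item in sequence}

print(category_relations_in_sequence(["thinkpad_2020", "thinkpad_charger_2020"]))
```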
Recommendation with HierTrans: attention mechanisms over user u's previous T interacted items capture both the user's dynamic personal preference patterns and the item relations among those recent T items. As a result, the user translation vector can adaptively change based on both G_I and G_C, i.e., on the user's previously interacted T items and their different relations.
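Here is a minimal sketch of how an adaptive, attention-weighted translation vector could be formed from the last T items; the attention form, dimensions, and names are assumptions chosen to illustrate the idea, not HierTrans's exact equations.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_items, T = 16, 100, 5
item_emb = rng.normal(size=(n_items, d))   # (assumed) item embeddings
user_pref = rng.normal(size=d)             # user preference vector
W_att = rng.normal(size=(d, d))            # attention projection

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def adaptive_translation(last_T_items):
    """Attention over the recent T items decides how much each contributes to
    the user's current translation vector, so it changes as the sequence changes."""
    H = item_emb[last_T_items]               # (T, d)
    att = softmax(H @ W_att @ user_pref)     # (T,) attention weights
    return user_pref + att @ H               # adaptive translation vector

def score_next(last_T_items, candidate):
    r_u = adaptive_translation(last_T_items)
    return -np.linalg.norm(item_emb[last_T_items[-1]] + r_u - item_emb[candidate])

recent = [3, 17, 42, 8, 55]
scores = np.array([score_next(recent, j) for j in range(n_items)])
print(np.argsort(-scores)[:10])  # top-10 next-item candidates
```

In the full model, the attended inputs would also include the category-level relation information reached through r_b, which is what makes the translation hierarchical.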
Experiments
• How well does HierTrans perform for sequential recommendation?
• What impact do HierTrans's design choices have, e.g., category-level relations and the adaptive translation vector?
Datasets*: three Amazon categories: Electronics (Elec), Cell Phones & Accessories (C&A), and Home & Kitchen (H&K).
* McAuley, Julian, et al. "Image-based recommendations on styles and substitutes." SIGIR, 2015.
Experimental Setup: Baselines
• BPR: Bayesian personalized ranking;
• TransE, TransFM, TransRec: translation-based methods;
• GRU4Rec: RNN-based;
• Caser: CNN-based;
• SASRec: self-attention-based.
Metrics: following previous sequential recommendation work, we use Recall@K and NDCG@K.
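For reference, a minimal sketch of these metrics under the common next-item setup, where each user has a single held-out ground-truth item and a ranked candidate list; this mirrors the standard Recall@K / NDCG@K definitions rather than the paper's specific evaluation code.

```python
import numpy as np

def recall_at_k(ranked_items, target, k):
    """1 if the held-out item appears in the top-k ranked candidates, else 0."""
    return 1.0 if target in ranked_items[:k] else 0.0

def ndcg_at_k(ranked_items, target, k):
    """Single-relevant-item NDCG: discounted by the log of the hit position."""
    if target in ranked_items[:k]:
        rank = ranked_items.index(target)   # 0-based position
        return 1.0 / np.log2(rank + 2)
    return 0.0

ranked = [42, 7, 19, 3, 55]
print(recall_at_k(ranked, 19, k=3), ndcg_at_k(ranked, 19, k=3))
```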
Experiments: Recommendation Effectiveness
• HierTrans consistently outperforms state-of-the-art methods in Recall@K and NDCG@K.
Experiments: Impact of Influence Factors. Ablations compare full HierTrans against a variant without the adaptive translation and multiple heads over previous items, and a variant that uses item-level relations only instead of category-level relations.
• HierTrans consistently outperforms all its variants in Recall@K and NDCG@K;
• the adaptive translation and category-level relations both help improve sequential recommendation.