

SLIDE 1

Next-item Recommendation with Sequential Hypergraphs

Jianling Wang, Kaize Ding*, Liangjie Hong**, Huan Liu* and James Caverlee

Texas A&M University; Arizona State University*; LinkedIn Inc.**

SLIDE 2

Next-item Recommendation

The goal is to infer dynamic user preferences from sequential user interactions.

[Figure: Timeline of historic user-item interactions (2017–2019) for Users A–C, with items such as a sofa, wall decoration, bouquet, Nintendo Switch, iPhone 8 and iPhone XR.]

SLIDE 3

Next-item Recommendation

[Same figure as Slide 2, with the next item to be predicted highlighted.]

SLIDE 4

How are items treated?

SLIDE 5

Items emerge and disappear

  • From a long-term perspective, the relationships between items are unstable. ==> Short-term relationships are critical for item modeling.

More than 50% of the items become inactive shortly.

SLIDE 6

The relationships change

  • The relationships between items change over time.
  • The larger the time gap, the larger the variation.

We capture item co-occurrence with word2vec; an item's neighboring items change over time.
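The drift in an item's neighborhood can be made concrete with a simple overlap measure. The sketch below is hypothetical (the slide's analysis uses word2vec similarity; raw co-occurrence pairs are a simplified stand-in) and compares an item's co-purchase neighbors across two time periods:

```python
def neighbor_jaccard(pairs_a, pairs_b, item):
    """Jaccard overlap between an item's co-occurrence neighbors in two
    time periods; a low value means the neighborhood drifted.

    `pairs_a` / `pairs_b` are sets of co-purchased item pairs from each
    period (raw co-occurrence as a stand-in for word2vec similarity).
    """
    def neighbors(pairs):
        return ({j for i, j in pairs if i == item}
                | {i for i, j in pairs if j == item})
    na, nb = neighbors(pairs_a), neighbors(pairs_b)
    union = na | nb
    return len(na & nb) / len(union) if union else 1.0

# Item 0's neighbors drift from {1, 2} in one year to {2, 3} in a later year.
print(neighbor_jaccard({(0, 1), (0, 2)}, {(0, 2), (0, 3)}, item=0))  # 0.3333333333333333
```

A value near 0 over a long gap and near 1 over a short gap would match the slide's observation that variation grows with the time gap.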

SLIDE 7

For a certain time period, the meaning of an item can be revealed by the correlations defined by user interactions in the short term.

[Figure: In September 2017, when the iPhone 8 came out, User C purchased it together with a Nintendo Switch.]

SLIDE 8

[Figure: By September 2019, the iPhone 8 had become a budget choice; Users D and E purchase it together with a Nintendo Switch Lite and AirPods Gen 1.]

SLIDE 9

Challenge 1

  • High-order correlations
  • Multi-hop connections

A user may purchase multiple items within a certain time period.

[Figure: Users B, D and E purchase overlapping sets of items: iPhone 8, Nintendo Switch Lite, AirPods Gen 1, Apple Lightning Cable, AirPods Case.]

SLIDE 10

Challenge 1


Items connected by a multi-hop path are related.


SLIDE 11

Challenge 2

The semantics of an item can change across users and over time.

The same flower bouquet is linked to different purposes

[Figure: The same bouquet appears in User A's home-decoration purchases (sofa, wall decoration) and in User B's wedding purchases (wedding veil, bridal gown), while Users C, D and E interact with Apple and Nintendo products in September 2017 and September 2019.]

SLIDE 12

Challenge 2

The semantic meaning of the iPhone 8 changes over time.

[Same figure as Slide 11.]

SLIDE 13

Our proposal: HyperRec

A novel end-to-end framework with sequential hypergraphs to enhance next-item recommendation.

[Architecture diagram: a series of short-term hypergraphs at t1, t2, …, tn, each processed by L layers of HGCN and connected by residual gating, produces dynamic item embeddings; these are fused with static item embeddings in a fusion layer, fed to self-attention for dynamic user modeling, and combined into a predicted score.]

SLIDE 14

Hypergraph

A hyperedge in a hypergraph can connect multiple nodes at once, s.t.,

  • Each node denotes an item.
  • Each hyperedge connects the set of items a user interacts with during a certain short time period.

[Figure: a simple graph vs. a hypergraph over items 1–4; each hyperedge groups the items of one user.]
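To make the construction concrete, here is a minimal sketch (not code from the paper; `build_incidence` and the integer item/user indexing are hypothetical) of turning one period's (user, item) interactions into a hypergraph incidence matrix:

```python
from collections import defaultdict

import numpy as np

def build_incidence(interactions, num_items):
    """Build the incidence matrix H of one short-term hypergraph.

    `interactions` is a list of (user, item) pairs observed inside one
    time period; the set of items a user touched forms one hyperedge.
    H[i, e] = 1 iff item i belongs to hyperedge e.
    """
    edges = defaultdict(set)
    for user, item in interactions:
        edges[user].add(item)
    users = sorted(edges)  # one hyperedge per user active in this period
    H = np.zeros((num_items, len(users)))
    for e, user in enumerate(users):
        for item in edges[user]:
            H[item, e] = 1.0
    return H

# Toy period: user 0 bought items {0, 1}; user 1 bought items {1, 2, 3}.
H = build_incidence([(0, 0), (0, 1), (1, 1), (1, 2), (1, 3)], num_items=4)
print(H.shape)  # (4, 2)
```

Unlike an ordinary adjacency matrix, a column of H can hold any number of items, which is what lets one hyperedge capture a whole basket.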

SLIDE 15

Hypergraph Convolutional Layers (HGCN)

Information propagates in two stages: Nodes ==> Hyperedges, then Hyperedges ==> Nodes.

[Figure: items 1–4 connected by hyperedges ϵ1 and ϵ2; node embeddings are aggregated into hyperedge embeddings and then propagated back to the nodes.]
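As a rough illustration of this two-stage message passing, the sketch below uses plain mean aggregation; the actual HGCN layer additionally applies degree normalization, learnable weight matrices and a nonlinearity, which are omitted here:

```python
import numpy as np

def hgcn_layer(X, H):
    """One simplified hypergraph-convolution step.

    Stage 1 (nodes ==> hyperedges): a hyperedge embedding is the mean
    of its member item embeddings.
    Stage 2 (hyperedges ==> nodes): an item embedding is the mean of
    the embeddings of the hyperedges it belongs to.
    X: (num_items, d) item embeddings; H: (num_items, num_edges) incidence.
    """
    edge_deg = H.sum(axis=0, keepdims=True)        # items per hyperedge
    E = (H / np.maximum(edge_deg, 1)).T @ X        # (num_edges, d) hyperedge embeddings
    node_deg = H.sum(axis=1, keepdims=True)        # hyperedges per item
    return (H / np.maximum(node_deg, 1)) @ E       # (num_items, d) updated items

# ϵ1 = {items 0, 1}, ϵ2 = {items 1, 2}; item 1 mixes information from both.
H = np.array([[1., 0.], [1., 1.], [0., 1.]])
X = np.array([[1.], [3.], [5.]])
print(hgcn_layer(X, H))  # item 0 -> 2.0, item 1 -> 3.0, item 2 -> 4.0
```

Stacking L such layers lets information flow along multi-hop paths, which is how the high-order correlations from Challenge 1 enter the item embeddings.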

SLIDE 16

Sequential Hypergraphs

Split the user-item interactions based on their timestamps and construct a series of short-term hypergraphs, one per time period.
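A minimal sketch of this split (`split_by_period` and the boundary convention are hypothetical, not the paper's code): each interaction is bucketed by how many period boundaries its timestamp has passed, and each bucket would then feed one short-term hypergraph.

```python
from collections import defaultdict

def split_by_period(interactions, boundaries):
    """Assign (user, item, timestamp) triples to consecutive periods.

    `boundaries` are increasing cut timestamps t1 < t2 < ...; an
    interaction with timestamp ts lands in period k, where k counts
    the boundaries that ts has already passed.
    """
    periods = defaultdict(list)
    for user, item, ts in interactions:
        k = sum(ts >= b for b in boundaries)  # index of the period
        periods[k].append((user, item))
    return periods

p = split_by_period([(0, 1, 5), (0, 2, 15), (1, 3, 25)], boundaries=[10, 20])
print(dict(p))  # {0: [(0, 1)], 1: [(0, 2)], 2: [(1, 3)]}
```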

[Figure: short-term hypergraphs at t1, t2, …, tn, each encoded by L layers of HGCN.]

SLIDE 17

Sequential Hypergraphs

[Figure: consecutive short-term hypergraphs (t1, t2, …, tn) connected by residual gating; each HGCN stack outputs dynamic item embeddings, with the static item embedding initializing the first period.]

Residual Gating: models the residual information between consecutive timestamps, connecting each hypergraph to the next.

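One plausible sketch of such a gate (the exact gate form, `sigmoid` parameterization and weight names are illustrative assumptions, not the paper's equations): a learned sigmoid gate interpolates between the previous period's dynamic item embeddings and the static embeddings before the next hypergraph is encoded.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def residual_gate(prev_dynamic, static, W_prev, W_static):
    """Gate between the previous period's dynamic item embeddings and
    the static embeddings to initialize the next period's hypergraph.

    W_prev and W_static stand in for trainable weight matrices; a gate
    value near 1 carries more residual (previous-period) information.
    """
    g = sigmoid(prev_dynamic @ W_prev + static @ W_static)  # (num_items, d)
    return g * prev_dynamic + (1.0 - g) * static
```

With zero weights the gate sits at 0.5 and the result is a plain average of the two embeddings; training would learn how much residual information to carry across periods.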

SLIDE 18

Dynamic User Modeling

Short-term User Intent: combine the items a user interacted with during the short-term period. ==> the embeddings of the hyperedges

[Figure: hyperedges ϵ1 and ϵ2 over items 1–4, each representing one user's short-term intent.]

SLIDE 19

Dynamic User Modeling

Fusion Layer: generates the representation of a user-item interaction at timestamp t.

[Diagram: the fusion layer combines the static item embedding, the dynamic item embedding and the short-term user intent embedding for each (user, item, timestamp) triple.]
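A minimal sketch of such a fusion (the single-matrix linear-plus-tanh form is a hypothetical simplification; `fuse` and `W` are not names from the paper): concatenate the three signals and project them into one interaction representation.

```python
import numpy as np

def fuse(static_item, dynamic_item, short_term_intent, W):
    """Fuse the three signals describing one (user, item, timestamp)
    interaction into a single vector via one linear layer over their
    concatenation. W: (d_out, 3 * d) stands in for learned weights."""
    z = np.concatenate([static_item, dynamic_item, short_term_intent])
    return np.tanh(W @ z)
```

The resulting per-interaction vectors form the sequence that the self-attention stage on the next slide consumes.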

SLIDE 20

Dynamic User Modeling

Self-attention: generates the dynamic user embedding from the sequence of fused interaction representations.

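A bare-bones sketch of causal self-attention over the fused sequence (projection matrices, multi-head structure and positional information are omitted; this is an illustration of the mechanism, not the paper's exact layer):

```python
import numpy as np

def self_attention(X):
    """Causal scaled dot-product self-attention over a user's sequence
    of fused interaction representations; the last output row can
    serve as the dynamic user embedding.

    X: (seq_len, d) -> (seq_len, d)
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)  # j > i masked
    scores = np.where(mask, -1e9, scores)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))  # row-wise softmax
    w /= w.sum(axis=1, keepdims=True)
    return w @ X
```

The causal mask keeps each position from peeking at later interactions, so the embedding at step i summarizes the user's preferences up to step i.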

SLIDE 21

HyperRec

[Full architecture diagram, as on Slide 13.]

SLIDE 22

Experiments: Data

Dataset     #Users   #Items   #User-Item Interactions   Density   Cutting Timestamp
Amazon      74,823   64,602   1,475,092                 0.0305%   Jan 1, 2018
Etsy        15,357   56,969   489,189                   0.0559%   Jan 1, 2018
Goodreads   16,884   20,828   1,730,711                 0.4922%   Jan 1, 2017

Three datasets: two e-commerce platforms (Amazon and Etsy) and one information sharing platform (Goodreads).

SLIDE 23

Experiments: Metric

Leave-one-out Setting

  • HIT@K: Hit Rate
  • NDCG@K: Normalized Discounted Cumulative Gain
  • MRR: Mean Reciprocal Rank
  • K=1, 5
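The three metrics for one user can be sketched as follows (`rank_metrics` is an illustrative helper, not code from the paper; it assumes one relevant held-out item per user, as in the leave-one-out setting):

```python
import numpy as np

def rank_metrics(scores, target, k=5):
    """Leave-one-out metrics for a single user.

    `scores` holds the model's scores over all candidate items and
    `target` is the index of the held-out next item. Returns HIT@K,
    NDCG@K and the reciprocal rank (averaging the last value over
    all users gives MRR).
    """
    rank = int((scores > scores[target]).sum()) + 1  # 1-based rank of the target
    hit = 1.0 if rank <= k else 0.0
    ndcg = 1.0 / np.log2(rank + 1) if rank <= k else 0.0
    return hit, ndcg, 1.0 / rank
```

With a single relevant item, HIT@1 and NDCG@1 coincide, which is why the later slides report them as one column.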
SLIDE 24

Experiments: Baselines

Compare with next-item recommendation frameworks:

  • PopRec: Most Popular
  • TransRec: Translation-based Recommendation (RecSys 2017)
  • GRU4Rec+: Recurrent Neural Networks with Top-K Gains (CIKM 2018)
  • TCN: Convolutional Generative Network for Next Item Recommendation (WSDM 2019)

SLIDE 25

Experiments: Baselines

Compare with attention-based recommendation frameworks:

  • HPMN: Lifelong Sequential Modeling with Personalized Memorization (SIGIR 2019)
  • HGN: Hierarchical Gating Networks for Sequential Recommendation (KDD 2019)
  • SASRec: Self-Attentive Sequential Recommendation (ICDM 2018)
  • BERT4Rec: Bidirectional Encoder Representations from Transformer for Sequential Recommendation (CIKM 2019)

SLIDE 26

HyperRec vs Baselines

  • HyperRec achieves the best performance on all evaluation metrics in the experiments.
  • HyperRec outperforms all baselines by 20.03%, 7.90% and 17.62% in NDCG@1/HIT@1 on Amazon, Etsy and Goodreads, respectively.
  • HyperRec's strong performance on both e-commerce and information sharing platforms demonstrates that it generalizes to various online platforms.

SLIDE 27

Impact of each component?

We conduct ablation tests to examine the effectiveness of each component.

Architecture                      Amazon   Etsy     Goodreads
(1) HyperRec                      0.1215   0.4712   0.2809
(2) Static Item Embedding         0.1051   0.4477   0.2643
(3) Replace Hypergraph            0.0978   0.4588   0.2576
(4) (-) Residual                  0.1169   0.4591   0.2626
(5) (-) Dynamic Item Embedding    0.1131   0.4646   0.2789
(6) (-) Short-term User Intent    0.1147   0.4616   0.2709
(7) (-) Dynamic in Prediction     0.1151   0.4703   0.2746

Table 3: Results of the ablation test under HIT@1/NDCG@1.

SLIDE 28

Impact of each component?

It is essential to have dynamic item embeddings that reveal the change in items' semantic meanings through the sequential hypergraphs.

(See Table 3 on Slide 27.)

SLIDE 29

Impact of each component?

Modeling the residual information helps generate more informative item embeddings, leading to better performance.

(See Table 3 on Slide 27.)

SLIDE 30

Impact of each component?

The design of our fusion layer helps elicit dynamic user preferences.

(See Table 3 on Slide 27.)

SLIDE 31

Conclusion

  • We explore the dynamic meaning of items in real-world scenarios for next-item recommendation.
  • We propose a novel recommendation framework empowered by sequential hypergraphs to incorporate short-term correlations.
  • The proposed HyperRec model provides more accurate next-item recommendations for both e-commerce and information sharing platforms.
  • The next step: can we transfer the dynamic patterns across platforms or even across domains?

SLIDE 32

Please check our paper or contact jlwang@tamu.edu for more details.