
Item Cold-start Recommendations: Learning Local Collective Embeddings - PowerPoint PPT Presentation



  1. Item Cold-start Recommendations: Learning Local Collective Embeddings. Martin Saveski (MIT Media Lab), Amin Mantrach (Yahoo Labs Barcelona). ACM Conference on Recommender Systems, 7 Oct 2014

  2. Cold-Start When new user/item enters the system No past information → No effective recommendations

  3. Cold-Start When new user/item enters the system No past information → No effective recommendations User Cold-start • Visits from users who are not logged in • Content-based/Collaborative-filtering not applicable Item cold-start • No previous feedback available • Collaborative filtering is not an option

  4. Motivation: Cold-start. Hundreds/thousands of new items every day • Yahoo News: ~100 new articles / day • eBay or Amazon: >1000 items / day. Jump-start collaborative filtering systems • Make new items “popular” • Enough feedback to achieve the expected performance

  5-8. News Recommendation (Yahoo News) [figure: bar chart of Ranking Accuracy, 0.00-0.40, built up over four slides to compare Content Based, BPR + kNN, and Local Collective Embeddings]

  9. Local Collective Embeddings: 2 main ideas. 1) Combine content and past collaborative data • Link item properties and users • Topics and Communities. 2) Exploit data locality • Data may lie on a manifold • Graph regularization

  10-13. Data in Matrix Form: the content matrix X_A (#items × #attributes) and the collaborative matrix X_U (#items × #users), which share the item dimension.
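
A minimal sketch (not from the slides) of how these two matrices could be assembled, assuming toy item texts and an implicit-feedback click log; all names and data here are illustrative:

```python
import numpy as np
from scipy.sparse import csr_matrix
from sklearn.feature_extraction.text import CountVectorizer

# Content side: one text per item -> X_A with shape (#items, #attributes)
item_texts = [
    "election results economy vote",
    "football match final score",
    "stock market economy trading",
]
vectorizer = CountVectorizer()
X_A = vectorizer.fit_transform(item_texts)

# Collaborative side: (item, user) click pairs -> X_U with shape (#items, #users)
clicks = [(0, 0), (0, 2), (1, 1), (2, 0), (2, 3)]
n_items, n_users = len(item_texts), 4
rows, cols = zip(*clicks)
X_U = csr_matrix((np.ones(len(clicks)), (rows, cols)), shape=(n_items, n_users))

print(X_A.shape, X_U.shape)  # both matrices index items along the rows
```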

  14-16. Content Embeddings: X_A ≈ W_A H_A. Each item (Item 1 … Item N) becomes a row of W_A over factors 1 … k, and each row of H_A is a topic over the attributes (e.g. SPORT, POLITICS, ECONOMY).
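
As a rough sketch of this factorization step on its own, continuing the toy X_A and vectorizer from the sketch above and using scikit-learn's NMF in place of the authors' solver:

```python
from sklearn.decomposition import NMF

k = 2                                   # number of latent factors / topics
nmf_a = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
W_A = nmf_a.fit_transform(X_A)          # (#items, k): item embeddings
H_A = nmf_a.components_                 # (k, #attributes): topics over words

# Rough topic labels: the highest-weight words in each row of H_A
words = vectorizer.get_feature_names_out()
for t in range(k):
    top = H_A[t].argsort()[::-1][:3]
    print(f"topic {t}:", [words[i] for i in top])
```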

  17-19. Collaborative Embeddings: X_U ≈ W_U H_U. Each item again becomes a row of W_U over factors 1 … k, and each row of H_U is a community of users (Community 1 … Community k).
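
The collaborative side can be sketched the same way (again with scikit-learn's NMF, reusing the toy X_U from above); here the rows of the right factor group users into communities rather than words into topics:

```python
from sklearn.decomposition import NMF

nmf_u = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W_U = nmf_u.fit_transform(X_U)   # (#items, k): item embeddings from feedback alone
H_U = nmf_u.components_          # (k, #users): communities of users
```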

  20. Collective Embeddings: factorize both matrices with one common item embedding W. X_A ≈ W H_A (#items/documents × #words, rows of H_A are topics) and X_U ≈ W H_U (#items × #users, rows of H_U are communities).
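
To make the shared-embedding idea concrete, here is a sketch of the joint reconstruction objective for dense arrays; the weight name alpha is illustrative, and the graph term introduced on the later slides is omitted:

```python
import numpy as np

def collective_loss(X_A, X_U, W, H_A, H_U, alpha=0.5):
    """alpha * ||X_A - W H_A||_F^2 + (1 - alpha) * ||X_U - W H_U||_F^2,
    with the same item embedding W appearing in both reconstruction terms."""
    err_a = np.linalg.norm(X_A - W @ H_A, "fro") ** 2   # content reconstruction
    err_u = np.linalg.norm(X_U - W @ H_U, "fro") ** 2   # collaborative reconstruction
    return alpha * err_a + (1 - alpha) * err_u
```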

  21-24. Collective Embeddings, Inference: a new item arrives with only its content vector q_A (over #words). Project it into the shared space by solving q_A ≈ ŵ H_A for ŵ, then predict its feedback from every user as q̂_U = ŵ H_U (over #users).
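
A sketch of this fold-in step, assuming H_A and H_U are the dense factors of a trained model; nonnegative least squares stands in here for the projection, and the function name is illustrative:

```python
import numpy as np
from scipy.optimize import nnls

def predict_new_item(q_A, H_A, H_U):
    """Embed a cold-start item from its dense content vector q_A (shape (#words,))
    and score it against every user."""
    # Solve q_A ≈ w_hat @ H_A with w_hat >= 0 (nonnegative least squares).
    w_hat, _ = nnls(H_A.T, q_A)      # H_A.T has shape (#words, k)
    # Predicted feedback over all users: q_U_hat = w_hat @ H_U, shape (#users,)
    return w_hat @ H_U

# Usage: rank users by predicted interest in the new item
# scores = predict_new_item(q_A, H_A, H_U)
# top_users = scores.argsort()[::-1][:10]
```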

  25. Exploiting Locality • So far: a linear approximation of the data • Data may lie in a small subspace

  26. Graph Regularization: nearest neighbors → similar embeddings • Manifold approximation using a kNN graph over the items • Weighting by the Laplacian matrix L = D - A [figure: a point x and its 1st, 2nd, …, k-th nearest neighbors NN(x)]
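
A sketch of how such a graph penalty could be built, using scikit-learn's kNN graph; the symmetrization and the trace form of the penalty are standard choices rather than necessarily the paper's exact ones, and the toy X_A from earlier stands in for real item content:

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def knn_laplacian(X, n_neighbors=5):
    """Symmetric kNN adjacency A over the rows of X and its Laplacian L = D - A."""
    A = kneighbors_graph(X, n_neighbors=n_neighbors, mode="connectivity",
                         include_self=False)
    A = ((A + A.T) > 0).astype(float).toarray()   # make the graph undirected
    D = np.diag(A.sum(axis=1))
    return D - A, A, D

# The term added to the objective is lam * trace(W.T @ L @ W): items that are
# neighbors in the graph are pushed toward similar embedding rows in W.
L, A_knn, D = knn_laplacian(X_A, n_neighbors=2)
```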

  27. Local Collective Embeddings, Learning: Non-convex optimization problem • Hard to find the global minimum • Convex when all but one variable are fixed. Multiplicative update rules • Simple and easy to implement • The objective function is non-increasing under the updates
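
A simplified sketch of what such alternating multiplicative updates can look like for the joint objective with a graph penalty lam * trace(W' L W); this follows the generic graph-regularized NMF update pattern rather than the paper's exact derived rules, and assumes dense nonnegative arrays (eps avoids division by zero):

```python
import numpy as np

def lce_fit(X_A, X_U, A, k=50, alpha=0.5, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Alternating multiplicative updates for
    alpha*||X_A - W H_A||^2 + (1-alpha)*||X_U - W H_U||^2 + lam*tr(W' L W),
    where L = D - A is the Laplacian of the item kNN graph."""
    D = np.diag(A.sum(axis=1))
    rng = np.random.default_rng(seed)
    W = rng.random((X_A.shape[0], k))
    H_A = rng.random((k, X_A.shape[1]))
    H_U = rng.random((k, X_U.shape[1]))
    for _ in range(n_iter):
        num = alpha * X_A @ H_A.T + (1 - alpha) * X_U @ H_U.T + lam * (A @ W)
        den = (alpha * W @ (H_A @ H_A.T) + (1 - alpha) * W @ (H_U @ H_U.T)
               + lam * (D @ W) + eps)
        W *= num / den                                # update shared item embeddings
        H_A *= (W.T @ X_A) / (W.T @ W @ H_A + eps)    # update topics
        H_U *= (W.T @ X_U) / (W.T @ W @ H_U + eps)    # update communities
    return W, H_A, H_U
```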

  28. Experimental Evaluation News recommendation • Yahoo News: 40 days • 41k articles, 650k users (random sample) • Implicit feedback Email Recipient Recommendation • Enron: 10 mailboxes • 36k emails, 5k users • Explicit feedback

  29-30. Experimental Evaluation, Baselines: 1. Content Based Recommender (CB) 2. Content Topic Based Recommender 3. Latent Semantic Indexing on user profiles [Soboroff'99] 4. Author Topic Model [Rosen-Zvi'04] 5. Bayesian Personalized Ranking + kNN (BPR-kNN) [Gantner'10] 6. fLDA [Agarwal'10]

  31. Experimental Results, Email Recipient Recommendation [figure: bar chart of MicroF1, MacroF1, MAP, and NDCG (performance 0.00-0.50) for CB, BPR-kNN, LCE (No Graph Regularization), and LCE]

  32. Experimental Results, News Recommendation [figure: bar chart of Ranking Accuracy (0.00-0.40) at RA@3, RA@5, RA@7, and RA@10 for CB, BPR-kNN, LCE (No Graph Regularization), and LCE]

  33. Conclusion • New hybrid recommender for item cold-start • Linking content and collaborative information helps • Graph regularization is useful in some cases

  34. Thank you!

  35. Item Cold-start Recommendations: Learning Local Collective Embeddings. Martin Saveski (MIT Media Lab), Amin Mantrach (Yahoo Labs Barcelona). ACM Conference on Recommender Systems, 7 Oct 2014
