

  1. Fundamental Limits of Caching
     Urs Niesen, jointly with Mohammad Maddah-Ali (Bell Labs, Alcatel-Lucent)

  2. Video on Demand
     Video on demand is getting increasingly popular: Netflix streaming service, Amazon Instant Video, Hulu, Verizon / Comcast On Demand, . . .
     ⇒ Places significant stress on service providers' networks
     ⇒ Caching (prefetching) can be used to mitigate this stress

  3. Caching (Prefetching)
     [Figure: normalized demand (0 to 100) versus time of day (0 to 24 h)]
     High temporal traffic variability
     Caching can help smooth traffic

  4. The Role of Caching
     Conventional beliefs about caching:
     - Caches useful to deliver content locally
     - Local cache size matters
     - Statistically identical users ⇒ identical cache content
     Insights from this work:
     - The main gain in caching is global
     - Global cache size matters
     - Statistically identical users ⇒ different cache content

  5. Problem Setting
     A server holding N files is connected through a shared link to K users, each with a cache of size M (assume N ≥ K for simplicity).
     Placement phase: each cache stores an arbitrary function of the files (linear, nonlinear, . . . ).
     Delivery phase: the user requests are revealed to the server, and the server sends an arbitrary function of the files over the shared link.
     Question: what is the smallest worst-case rate R(M) needed in the delivery phase?
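
A minimal sketch of this two-phase setting, assuming the delivery message can be represented as a list of packets; the names worst_case_rate, placement, delivery, and packets_per_file are illustrative, not from the talk:

    # Sketch of the placement/delivery abstraction (illustrative names).
    # Placement fixes the caches before the demands are known; delivery
    # broadcasts one message once the demands are revealed.
    from itertools import product

    def worst_case_rate(placement, delivery, N, K, packets_per_file=1):
        # Largest broadcast load (in files) over all possible demand vectors.
        caches = [placement(user) for user in range(K)]     # placement phase
        worst = 0.0
        for demands in product(range(N), repeat=K):         # requests revealed to server
            packets = delivery(list(demands), caches)       # delivery phase: one broadcast
            worst = max(worst, len(packets) / packets_per_file)
        return worst

Taking the maximum over all demand vectors reflects the worst-case criterion in the question above.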

  6. Conventional Caching Scheme
     N files, K users, cache size M.
     Performance of the conventional scheme: R(M) = K · (1 − M/N)
     Caches provide content locally ⇒ local cache size matters
     Identical cache content at users
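
A quick numerical check of this formula (a sketch, not code from the talk): each user caches a fraction M/N of every file, and the remaining fraction 1 − M/N must still be delivered to each of the K users.

    def conventional_rate(N, K, M):
        # Local caching gain only: each of the K users still needs 1 - M/N of a file.
        return K * (1 - M / N)

    print(conventional_rate(2, 2, 1))        # N = K = 2,   M = 1   -> 1.0
    print(conventional_rate(512, 512, 128))  # N = K = 512, M = 128 -> 384.0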

  7. Conventional Caching Scheme: rate versus cache size
     [Figure: delivery rate R versus cache size M for the conventional scheme, plotted for N = K = 2, 4, 8, 16, 32, 64, 128, 256, 512]

  8. Proposed Caching Scheme
     N files, K users, cache size M.
     Design guidelines advocated in this work:
     - The main gain in caching is global
     - Global cache size matters
     - Different cache content at users
     Performance of the proposed scheme: R(M) = K · (1 − M/N) · 1 / (1 + KM/N)
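
A short sketch comparing the two formulas (illustrative code, not from the talk); the extra factor 1 / (1 + KM/N) is the global caching gain, which grows with the aggregate cache size KM:

    def proposed_rate(N, K, M):
        # Local gain (1 - M/N) times global gain 1 / (1 + K*M/N).
        return K * (1 - M / N) / (1 + K * M / N)

    # Compare against the conventional rate K * (1 - M/N) for N = K = 512.
    for M in (0, 128, 256, 512):
        conventional = 512 * (1 - M / 512)
        print(f"M={M:3d}  conventional={conventional:6.1f}  proposed={proposed_rate(512, 512, M):6.2f}")

At M = 128 (each user caching a quarter of the library) the conventional scheme still needs rate 384, while the proposed scheme needs rate about 3.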

  9. Proposed Caching Scheme: rate versus cache size
     [Figure: delivery rate R versus cache size M, conventional scheme and proposed scheme, plotted for N = K = 2, 4, 8, 16, 32, 64, 128, 256, 512]

  10. Recall: Conventional Scheme
      N = 2 files (A and B), K = 2 users, cache size M = 1.
      Split each file into two halves: A = (A1, A2), B = (B1, B2).
      Placement: both caches store A1, B1.
      Delivery when both users request A: the server sends A2, and each user combines it with its cached A1 to recover A.
      ⇒ Identical cache content at users
      ⇒ Gain from delivering content locally
      Delivery when user 1 requests A and user 2 requests B: the server must send both A2 and B2, a full file's worth of data.
      ⇒ Multicast only possible for users with the same demand
      [Figure: at M = 1 the conventional scheme achieves worst-case rate R = 1]
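
A self-contained sketch of this N = 2, K = 2, M = 1 example (the packet labels follow the slides; the code itself is illustrative), checking that the worst-case load of the conventional scheme is R(1) = 1:

    from itertools import product

    # Conventional placement: both caches hold the same halves A1 and B1.
    cache = {0: {"A1", "B1"}, 1: {"A1", "B1"}}

    worst = 0.0
    for demands in product("AB", repeat=2):              # every pair of requests
        packets = set()
        for user, f in enumerate(demands):
            packets |= {f + "1", f + "2"} - cache[user]  # halves this user still needs
        worst = max(worst, len(packets) / 2)             # each half is 1/2 of a file
    print(worst)                                         # -> 1.0 (reached at requests A, B)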

  11. Proposed Scheme
      N = 2 files (A and B), K = 2 users, cache size M = 1.
      Split each file into two halves: A = (A1, A2), B = (B1, B2).
      Placement: cache 1 stores A1, B1; cache 2 stores A2, B2 (different cache content at the two users).
      Delivery when user 1 requests A and user 2 requests B: user 1 is missing A2 and user 2 is missing B1, so the server broadcasts the single coded packet A2 ⊕ B1; user 1 cancels its cached B1 to recover A2, and user 2 cancels its cached A2 to recover B1.
      ⇒ One half-file transmission serves both users: worst-case rate R(1) = 1/2, half that of the conventional scheme.
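
The same sketch for the proposed placement and coded delivery (the XOR packet follows the coded transmission shown above; the code itself is illustrative), confirming the worst-case load R(1) = 1/2:

    from itertools import product

    # Proposed placement: different cache content at the two users.
    cache = {0: {"A1", "B1"}, 1: {"A2", "B2"}}

    worst = 0.0
    for f0, f1 in product("AB", repeat=2):
        # Single coded packet: (half 2 of user 1's file) XOR (half 1 of user 2's file).
        # User 1 caches half 1 of every file, so it cancels f1's half 1 and keeps
        # f0's half 2; user 2 does the symmetric step.  One packet serves both users.
        packets = [f0 + "2 xor " + f1 + "1"]
        worst = max(worst, len(packets) / 2)             # one half-file per demand pair
    print(worst)                                         # -> 0.5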
