Finding Nash Equilibria in Certain Classes of 2-Player Games
Adrian Vetta, McGill University

Introduction
Finding a Nash equilibrium (NE) is hard in multiplayer games.


  1. A Geometric Interpretation of MSNE

     Alice's payoff matrix A (rows r1-r6, columns c1-c6):

         3  7  3  9  0  2
         9  1  1  3  4  5
         7  4  6  2  8  0
         0  4  2  3  3  9
         6  6  5  5  1  1
         1  2  3  7  0  8

     What if Bob plays a mixed strategy on columns 2 and 3?
     Geometrically, Alice's options are now points in 2-D: each row r becomes the point (payoff of r against c2, payoff of r against c3).
     [Figure: the six rows plotted as points in 2-D; r1 and r3 are labelled.]
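A minimal numpy sketch of this mapping (the matrix is the one on the slide; the variable names and 0-based column indices are my own):

    import numpy as np

    # Alice's payoff matrix A from the slide (rows r1..r6, columns c1..c6).
    A = np.array([
        [3, 7, 3, 9, 0, 2],
        [9, 1, 1, 3, 4, 5],
        [7, 4, 6, 2, 8, 0],
        [0, 4, 2, 3, 3, 9],
        [6, 6, 5, 5, 1, 1],
        [1, 2, 3, 7, 0, 8],
    ])

    # If Bob mixes only over columns 2 and 3 (0-indexed 1 and 2), each of
    # Alice's rows is summarised by a point in 2-D:
    # (payoff against c2, payoff against c3).
    points = A[:, [1, 2]]
    print(points)   # r1 -> (7, 3), r3 -> (4, 6), r5 -> (6, 5), ...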

  2. Best Responses and Extreme Points

     Extreme points still correspond to best responses: any extreme point on the anti-dominant of the convex hull P_{2,3} is a best response to some probability distribution (q, 1-q) on columns 2 and 3.
     For example, r1 is the best response to (1, 0) and r5 is the best response to (1/2, 1/2).
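Continuing the sketch above, sweeping the mixture (q, 1-q) over columns 2 and 3 shows that only rows whose points are extreme on the anti-dominant ever win (a small illustration, not part of the slides; the (c2, c3) coordinates repeat the rows of the matrix A above):

    import numpy as np

    # Rows r1..r6 as 2-D points (payoff vs c2, payoff vs c3).
    points = np.array([[7, 3], [1, 1], [4, 6], [4, 2], [6, 5], [2, 3]])

    # Alice's expected payoff against the mixture (q, 1-q) is linear in her point,
    # so it is maximised at an extreme point of the convex hull.
    for q in (1.0, 2 / 3, 1 / 2, 0.0):
        payoffs = q * points[:, 0] + (1 - q) * points[:, 1]
        best = (np.flatnonzero(np.isclose(payoffs, payoffs.max())) + 1).tolist()
        print(f"q = {q:.2f}: best-response rows {best}")
    # Prints rows [1], [1, 5], [5], [3]: exactly the extreme rows r1, r5, r3.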

  3. Best Responses and Facets

     But then faces can also correspond to best responses: against the mixture (2/3, 1/3) on columns 2 and 3, both r1 and r5 are best responses (the face joining them).

     Theorem. (r1, r5) and (c2, c3) form a NE if and only if (r1, r5) is a facet of P_{2,3} and (c2, c3) is a facet of P_{1,5}.
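In symbols, my paraphrase of what the facet condition buys (A is Alice's payoff matrix; Bob's matrix, which defines P_{1,5}, is not shown on these slides): (r1, r5) is a facet of the anti-dominant of P_{2,3} exactly when there is some q in (0, 1) for which the maximum of q·A(i,2) + (1-q)·A(i,3) over rows i is attained precisely at r1 and r5, i.e. both rows are best responses to the mixture (q, 1-q) on (c2, c3). Symmetrically, (c2, c3) being a facet of P_{1,5} gives a mixture (p, 1-p) on (r1, r5) to which both c2 and c3 are best responses. A pair of mutually best-responding mixed strategies is exactly a Nash equilibrium with support {r1, r5} x {c2, c3}.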

  4. Random Games

     In random games the matrix entries are drawn independently from a distribution, e.g. U[0,1] or N(0,1).
     So the #NE relates to the #facets in randomly generated polytopes.
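A small sketch of how such a game would be generated (the distributions are those named on the slide; the size and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20
    # Independent payoff matrices for the two players, entries i.i.d. U[0,1] ...
    A_uniform, B_uniform = rng.random((n, n)), rng.random((n, n))
    # ... or, alternatively, i.i.d. standard normal N(0,1).
    A_normal, B_normal = rng.standard_normal((n, n)), rng.standard_normal((n, n))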

  5. Random Polytopes

     The points are in general position.

     All NE have supports of the same size.
     Proof. We won't have d+1 points on a (d-1)-dimensional facet.

     #extreme points ≤ #facets.
     Proof. Each facet contains exactly d points, and each extreme point lies on at least d facets.
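A quick numerical illustration of #extreme points ≤ #facets for random points in general position (a sketch assuming scipy is available; the dimension and sample size are arbitrary):

    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(1)
    d, n = 3, 500
    pts = rng.random((n, d))                 # n i.i.d. points from U[0,1]^d
    hull = ConvexHull(pts)
    # In general position every facet is a simplex on d extreme points, and every
    # extreme point lies on at least d facets, so #extreme points <= #facets.
    print(len(hull.vertices), "extreme points,", len(hull.simplices), "facets")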

  6. The # of Nash Equilibria

     Theorem. E(# d×d NE) ≥ [E(#extreme points)]².
     Proof. A set R of d rows is a best response to a set C of d columns with probability #facets / (n choose d), and vice versa.
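Making the counting explicit (my reading of the proof sketch; it uses that the two players' payoff matrices are independent and that the entry distribution is exchangeable over size-d row and column subsets):

    \[
    \mathbb{E}(\#\, d\times d\ \text{NE})
      \;=\; \sum_{R,C} \Pr(R \text{ is a BR to } C)\,\Pr(C \text{ is a BR to } R)
      \;=\; \binom{n}{d}^{2}\left(\frac{\mathbb{E}(\#\text{facets})}{\binom{n}{d}}\right)^{\!2}
      \;=\; \mathbb{E}(\#\text{facets})^{2}
      \;\ge\; \mathbb{E}(\#\text{extreme points})^{2}.
    \]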

  7. The # of Extreme Points

     Theorem. For the uniform distribution, E(#extreme points) ≳ log^{d-1} n.
     Proof.
         E(#extreme points) = n ∫_{x ∈ [0,1]^d} Pr(x is extreme) f(x) dx.
     For each x, consider the hyperplane
         H_x = { y : Σ_{i=1}^{d} (1 - y_i)/(1 - x_i) = d }.
     Then
         E(#extreme points) ≥ n ∫_{x ∈ [0,1]^d} Pr(H_x separates x) f(x) dx ≥ ... ≳ log^{d-1} n.
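One way to unpack the separation step (my reconstruction; the slides only sketch the volume bookkeeping):

    \[
    \mathbb{E}(\#\text{extreme points})
      \;=\; n \int_{x\in[0,1]^d} \Pr(x \text{ is extreme})\, f(x)\, dx
      \;\ge\; n \int_{x\in[0,1]^d} \Pr\big(H_x \text{ separates } x \text{ from the other } n-1 \text{ points}\big)\, f(x)\, dx .
    \]

The hyperplane H_x passes through x, and the separation event asks that none of the other n-1 points fall in the cap it cuts off the cube towards the corner (1, ..., 1); this happens with probability (1 - vol(cap))^{n-1}, and integrating that bound over x yields the log^{d-1} n estimate.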

  8. The # of Nash Equilibria

     Theorem. For the uniform distribution, E(# d×d NE) ≳ log^{2(d-1)} n.
     We expect lots of NE, even lots with 2×2 support.
     But this isn't enough: we need concentration bounds.
     Can we show that Pr(# d×d NE = 0) is small?

  9. Cap Coverings

     The expected fraction of the points lying on the convex hull K is E(vol(K̄)) = 1 - E(vol(K)), where K̄ is the part of the cube outside K.
     A cap is the intersection of the cube and a halfspace.
     Cap Covering Theorem (Bar89). K̄ can be closely covered by a small number of low-volume caps that don't intersect much.
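A rough Monte Carlo check of this identity (a sketch assuming scipy is available; it glosses over the n versus n-1 distinction in the exact Efron-type statement and uses small, arbitrary parameters):

    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(2)
    d, n, trials = 2, 200, 300
    frac_on_hull, vol_outside = [], []
    for _ in range(trials):
        pts = rng.random((n, d))                     # uniform points in the unit square
        hull = ConvexHull(pts)
        frac_on_hull.append(len(hull.vertices) / n)  # fraction of the points on K
        vol_outside.append(1.0 - hull.volume)        # vol(K-bar) = 1 - vol(K)
    print(np.mean(frac_on_hull), np.mean(vol_outside))   # the two averages roughly agree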

  10. Concentration Bounds

     Cap coverings give concentration bounds on the #extreme points and the #faces.
     Combinatorially: for NE we examine the probability that a set S of rows forms a facet given that
     (i) a set T of rows forms a face, and
     (ii) we resample some of the coordinates.

  11. A Dumb Algorithm

     Algorithm. Exhaustively search for d×d NE, for d = 1, 2, ...
     Theorem. The algorithm finds a NE in polynomial time w.h.p.
     Proof. There is a 2×2 NE w.h.p.
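A sketch of the exhaustive support search (not the authors' implementation: for each candidate support it solves the standard indifference equations and then checks that no pure deviation is profitable, ignoring degenerate singular systems):

    import itertools
    import numpy as np

    def indifferent_mixture(M, face_rows, mix_cols):
        """Weights w over mix_cols (w >= 0, sum 1) that give every row in
        face_rows of M the same expected payoff, or None if impossible."""
        sub = M[np.ix_(face_rows, mix_cols)]                 # d x d submatrix
        d = len(mix_cols)
        eqs = np.vstack([sub[0] - sub[1:], np.ones((1, d))]) # equalities + normalisation
        rhs = np.zeros(d)
        rhs[-1] = 1.0
        try:
            w = np.linalg.solve(eqs, rhs)
        except np.linalg.LinAlgError:                        # degenerate support
            return None
        return w if np.all(w >= -1e-9) else None

    def support_ne(A, B, R, C):
        """A Nash equilibrium with supports R (rows) and C (columns), or None.
        A is the row player's payoff matrix, B the column player's."""
        y = indifferent_mixture(A, R, C)    # column mixture making rows in R indifferent
        x = indifferent_mixture(B.T, C, R)  # row mixture making columns in C indifferent
        if x is None or y is None:
            return None
        xs = np.zeros(A.shape[0])
        ys = np.zeros(A.shape[1])
        xs[list(R)] = x
        ys[list(C)] = y
        row_pay, col_pay = A @ ys, xs @ B
        # No pure deviation may beat the payoff obtained on the support.
        if np.all(row_pay <= row_pay[list(R)].max() + 1e-9) and \
           np.all(col_pay <= col_pay[list(C)].max() + 1e-9):
            return xs, ys
        return None

    def dumb_algorithm(A, B):
        """Exhaustively search for a d x d NE, d = 1, 2, ..."""
        n, m = A.shape
        for d in range(1, min(n, m) + 1):
            for R in itertools.combinations(range(n), d):
                for C in itertools.combinations(range(m), d):
                    ne = support_ne(A, B, R, C)
                    if ne is not None:
                        return d, ne
        return None

    rng = np.random.default_rng(3)
    A, B = rng.random((15, 15)), rng.random((15, 15))
    d, (x, y) = dumb_algorithm(A, B)
    print("found a NE with support size", d)   # w.h.p. d is 1 or 2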

  12. Win-Lose Games

     In a win-lose game the payoff matrices are 0-1.
