Mixture Selection, Mechanism Design, and Signaling

Ho Yee Cheung, Shaddin Dughmi, Yu Cheng, Ehsan Emamjomeh-Zadeh, Li Han, Shang-Hua Teng
University of Southern California
Mixture Selection

Optimization over distributions shows up everywhere in AGT: mixed strategies, lotteries, beliefs.

Definition (Mixture Selection)
- Parameter: a function g : [0,1]^n → [0,1].
- Input: a matrix A ∈ [0,1]^{n×m}.
- Goal: max_{x ∈ Δ_m} g(Ax).
Mixture Selection

max_{x ∈ Δ_m} g(Ax)
Mixture Selection: An Example

Single buyer (with a Bayesian prior over types), unit-demand pricing problem: design a single lottery over the items to maximize revenue.

Buyer values (rows: types, columns: items):

           item 1   item 2   item 3
  type 1:    $1      $1/2     $1/3
  type 2:   $1/3      $1      $1/2
  type 3:   $1/2     $1/3      $1

- A_{ij}: type i's value for item j.
- x: the lottery to design.
- g(Ax): expected revenue of x under the optimal posted price (see the code sketch below).

Examples:
- x = (1, 0, 0)  ⟹  g(Ax) = 1/3, with optimal price p ∈ {$1, $1/2, $1/3}.
- x = (1/3, 1/3, 1/3)  ⟹  g(Ax) = p = ($1 + $1/2 + $1/3)/3 = $11/18.
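To make g concrete, here is a minimal Python sketch of this example. Two assumptions not stated on the slide: the prior over the three types is uniform, and an optimal posted price can w.l.o.g. be taken among the buyers' values for the lottery. It reproduces both revenue figures above.

```python
from fractions import Fraction as F

# Value matrix from the slide: A[i][j] = type i's value for item j.
A = [[F(1),    F(1, 2), F(1, 3)],
     [F(1, 3), F(1),    F(1, 2)],
     [F(1, 2), F(1, 3), F(1)]]

def g(v, prior):
    # Expected revenue under the best posted price. An optimal price can
    # be taken among the entries of v = Ax: lowering the price strictly
    # between two consecutive values never gains revenue.
    return max(p * sum(q for vi, q in zip(v, prior) if vi >= p) for p in v)

def lottery_revenue(x, prior):
    v = [sum(a * xj for a, xj in zip(row, x)) for row in A]  # v = Ax
    return g(v, prior)

prior = [F(1, 3)] * 3  # assumed uniform prior over the three types
print(lottery_revenue([F(1), F(0), F(0)], prior))           # 1/3
print(lottery_revenue([F(1, 3), F(1, 3), F(1, 3)], prior))  # 11/18
```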
Motivation

max_{x ∈ Δ_m} g(Ax) is a building block in a number of game-theoretic applications: Mixture Selection problems naturally arise in mechanism design and signaling.

Information revelation (signaling): design information-sharing policies so that the players arrive at "good" equilibria. The beliefs of the agents are distributions.
Our Results: Framework

Two "smoothness" parameters tightly control the complexity of Mixture Selection; there is a polynomial-time approximation scheme (PTAS) when both parameters are constants:
- O(1)-Lipschitz in the L∞ norm: |g(v_1) − g(v_2)| ≤ O(1) · ‖v_1 − v_2‖_∞;
- O(1)-noise stable: controls the degree to which low-probability (possibly correlated) errors in the inputs of g can impact its output.
Our Results: Noise Stability

Definition (β-Noise Stable)
A function g is β-noise stable if, whenever a random process corrupts its input so that each entry is corrupted with probability at most α, the output of g decreases by no more than αβ in expectation.

This must hold for all inputs, even when the corruptions are arbitrarily correlated (a formula version follows below).
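Spelled out as a formula (a paraphrase of the definition above; the paper's exact quantifiers and phrasing may differ):

```latex
% beta-noise stability, paraphrased from the slide:
\forall v \in [0,1]^n,\ \forall \alpha \in [0,1],\
\forall\, \tilde{v} \ \text{with}\ \Pr[\tilde{v}_i \neq v_i] \le \alpha
\ \text{for each } i:\qquad
\mathbb{E}\bigl[g(\tilde{v})\bigr] \;\ge\; g(v) - \alpha\beta .
```

Note that nothing is assumed about where a corrupted entry lands inside [0,1], and the corruption events {ṽ_i ≠ v_i} may be arbitrarily correlated.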
Our Results: Applications

Game-theoretic problems in mechanism design and signaling:

  Problem                                                                      | Algorithm      | Hardness
  Unit-Demand Lottery Design [Dughmi, Han, Nisan '14]                          | PTAS           | No FPTAS
  Signaling in Bayesian Auctions [Emek et al. '12; Miltersen and Sheffet '12]  | PTAS           | No FPTAS
  Signaling to Persuade Voters [Alonso and Câmara '14]                         | PTAS (1)       | No FPTAS
  Signaling in Normal-Form Games [Dughmi '14]                                  | Quasi-PTAS (2) | No FPTAS (3)

(1) Bi-criteria.
(2) n^{O(log n)} for every fixed ε; bi-criteria.
(3) Assuming hardness of planted clique. Recently, [Bhaskar, Cheng, Ko, Swamy '16] ruled out a PTAS.
Simple Algorithm for Mixture Selection

Inspired by the ε-Nash algorithm of [Lipton, Markakis, Mehta '03].

Support enumeration:
- Enumerate all s-uniform mixtures x̃, for s = O(log(n)/ε²).
- Check the values of g(Ax̃) and return the best one (a code sketch follows below).

Proof sketch:
- Take the optimal solution x*. Draw s samples from x* and let x̃ be the empirical distribution.
- Tail bound + union bound: Pr[‖Ax* − Ax̃‖_∞ < ε] > 0.
- Probabilistic method: there exists an s-uniform x̃ such that ‖Ax* − Ax̃‖_∞ < ε.
- If g is O(1)-Lipschitz in L∞, then g(Ax̃) ≥ g(Ax*) − O(ε).
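A minimal Python sketch of the enumeration (assuming g is given as a black-box evaluator on vectors in [0,1]^n; the function name is illustrative):

```python
from itertools import combinations_with_replacement

def mixture_selection(A, g, s):
    # Support enumeration: evaluate g(Ax) for every s-uniform mixture x
    # (every coordinate an integer multiple of 1/s) and return the best.
    n, m = len(A), len(A[0])
    best_x, best_val = None, None
    # Each multiset of s columns is one s-uniform mixture;
    # there are at most m**s of them.
    for support in combinations_with_replacement(range(m), s):
        x = [0] * m
        for j in support:
            x[j] += 1
        x = [xj / s for xj in x]
        Ax = [sum(A[i][j] * x[j] for j in range(m)) for i in range(n)]
        val = g(Ax)
        if best_val is None or val > best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

For instance, plugging in the lottery example from earlier: mixture_selection(A, lambda v: g(v, prior), s=3) already finds the uniform lottery with revenue 11/18.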
Simple Algorithm for Mixture Selection

Running time: evaluate g(·) on m^s inputs. This gives a quasi-PTAS for Mixture Selection when g is O(1)-Lipschitz in L∞.

Bypassing the union bound:
- The log n in s = O(log n/ε²) came only from the union bound and can be dropped (the theorem below uses s = O(c² log(β/ε)/ε²)).
- Each entry (Ax̃)_i then changes by at most ε, with probability at least 1 − ε.
- This works if g is noise stable.

Summary (see the sketch after this list):
- High-probability "small errors": handled by Lipschitz continuity.
- Low-probability "large errors": handled by noise stability.
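Putting the two regimes together (a sketch of the intended accounting, with constants suppressed; the precise bound is in the theorem on the next slide): sampling turns Ax* into a corrupted copy Ax̃ in which each entry is within ε except with probability at most ε, so c-Lipschitzness charges the small errors and β-noise stability charges the large ones:

```latex
\mathbb{E}\bigl[g(A\tilde{x})\bigr]
  \;\ge\; g(Ax^*)
  \;-\; \underbrace{c\,\epsilon}_{\text{small errors}}
  \;-\; \underbrace{\beta\,\epsilon}_{\text{large errors}}
  \;=\; \mathrm{OPT} \;-\; O\bigl((c+\beta)\,\epsilon\bigr).
```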
Our Results: Main Theorem

Theorem (Approximate Mixture Selection)
If g is β-noise stable and c-Lipschitz, there is an algorithm with:
- Runtime: m^{O(c² log(β/ε)/ε²)} · T_g (where T_g is the time to evaluate g);
- Approximation: OPT − ε.

When β, c = O(1), this gives a PTAS.