  1. Preparing for the Worst but Hoping for the Best: Robust (Bayesian) Persuasion. Piotr Dworczak and Alessandro Pavan. February 2020.

  2. Motivation
  Bayesian persuasion / information design:
  - the designer knows the agents' sources of information
  - she trusts her ability to coordinate Receivers on the actions most favorable to her
  - the optimal information structure is sensitive to fine details of the agents' beliefs
  In many problems of interest, the agents' sources of information (both before and after receiving the Sender's information) are unknown, and the Sender may not trust her ability to coordinate the Receivers.
  Hence the quest for robustness.

  3. This Paper
  A novel solution concept that accounts for such uncertainty/ambiguity: a lexicographic approach to the problem.
  Step 1 ("Preparing for the worst"): the designer seeks to protect herself against the possibility that Nature provides information and coordinates the agents on the actions most adversarial to the designer.
  Step 2 ("Hoping for the best"): the designer maximizes over all worst-case optimal policies, assuming Nature and the Receivers play favorably.
  Robust solutions: best-case optimal among the worst-case optimal ones (max-max over max-min).
  Robust solutions also maximize $\lambda \inf_{\pi \in \Pi} \underline{v}(q,\pi) + (1-\lambda)\,\overline{v}(q,\emptyset)$ for $\lambda$ sufficiently large.
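  As a compact restatement (a sketch using the notation $Q$, $\underline{v}$, $\overline{v}$ introduced in the Model and Robust Solutions slides below), the lexicographic procedure is the nested program
  \[
  W \;=\; \arg\max_{q \in Q}\; \inf_{\pi}\, \underline{v}(q,\pi),
  \qquad
  q^{RS} \;\in\; \arg\max_{q \in W}\; \overline{v}(q,\emptyset),
  \]
  where ∅ denotes Nature providing no additional information.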

  4. Results
  - Separation theorem
  - Properties of robust solutions
  - Implications for various persuasion models
  - Conditionally-independent signals (online supplement)

  5. Literature
  Bayesian persuasion: ...Calzolari and Pavan (2006), Brocas and Carrillo (2007), Rayo and Segal (2010), Kamenica and Gentzkow (2011), Ely (2017), Dworczak and Martini (2019)...
  Surveys: Bergemann and Morris (2019), Kamenica (2019)
  Information design with adversarial coordination: Inostroza and Pavan (2018), Mathevet, Perego, and Taneva (2019), Morris et al. (2019), Ziegler (2019)
  Persuasion with unknown beliefs: Kolotilin et al. (2017), Laclau and Renou (2017), Guo and Shmaya (2018), Hu and Weng (2019), Kosterina (2019)
  Max-max over max-min design: Borgers (2017)

  6. Plan
  1 Introduction
  2 Model
  3 Robust Solutions
  4 Separation Theorem
  5 Corollaries
  6 Applications
  7 Conditionally-independent Robust Solutions (another day)

  7. Model

  8. Model: Environment
  Payoff-relevant state: ω ∈ Ω (finite). Prior: µ0 ∈ ∆Ω.
  Sender's "signal": q : Ω → ∆S, with S the set of signal realizations.
  (Reduced-form description of) Sender's payoff, given induced posterior µ ∈ ∆Ω:
  $\overline{V}(\mu)$: highest payoff; $\underline{V}(\mu)$: lowest payoff.
  Difference between $\overline{V}$ and $\underline{V}$: strategy selection (multiple Receivers) or tie-breaking (single Receiver).
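  As a concrete illustration, consider a hypothetical binary-state example in the spirit of the Kamenica-Gentzkow prosecutor example discussed later (the specific payoffs and the tie-breaking rule are assumptions made here for illustration): Ω = {innocent, guilty}, the judge convicts iff µ(guilty) ≥ 1/2, and the Sender's payoff is 1 if the judge convicts and 0 otherwise. If the judge breaks the tie at µ(guilty) = 1/2 in the Sender's favor, then
  \[
  \overline{V}(\mu) = \mathbf{1}\{\mu(\text{guilty}) \ge \tfrac{1}{2}\}, \qquad
  \underline{V}(\mu) = \mathbf{1}\{\mu(\text{guilty}) > \tfrac{1}{2}\},
  \]
  so $\overline{V}$ and $\underline{V}$ differ only at the indifference belief $\mu(\text{guilty}) = \tfrac{1}{2}$.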

  9. Model: Sender's uncertainty
  Nature designs an information structure π : Ω × S → ∆R, with R the set of Nature's signal realizations.
  Multiple Receivers: discriminatory disclosures are embedded into the derivation of $\underline{V}(\mu)$. Given a common posterior µ, Nature provides (possibly private) signals to the agents and coordinates them on the course of action most adversarial to the Sender (among those consistent with the assumed solution concept), e.g., a Bayes-correlated equilibrium given µ.
  Nature's ability to condition on the Sender's signal captures: information acquisition (after hearing from the Sender), correlated noise, and maximal concern for robustness.
  Online Appendix: conditionally independent signals.

  10. Plan
  1 Introduction
  2 Model
  3 Robust Solutions
  4 Separation Theorem
  5 Corollaries
  6 Applications
  7 Conditionally-independent Robust Solutions

  11. Robust Solutions

  12. Robust Solutions
  Sender's expected payoffs when the Sender selects signal q and Nature selects signal π:
  \[
  \overline{v}(q,\pi) \equiv \int_\Omega \int_S \int_R \overline{V}(\mu_0^{s,r})\, d\pi(r \mid \omega, s)\, dq(s \mid \omega)\, d\mu_0(\omega),
  \qquad
  \underline{v}(q,\pi) \equiv \int_\Omega \int_S \int_R \underline{V}(\mu_0^{s,r})\, d\pi(r \mid \omega, s)\, dq(s \mid \omega)\, d\mu_0(\omega),
  \]
  where $\mu_0^{s,r}$ is the common posterior obtained from $(q,\pi)$ at realizations $(s,r)$, starting from the prior $\mu_0$.

  13. Worst-case optimality
  Definition 1. Signal q is worst-case optimal if, for all signals q′,
  $\inf_{\pi} \underline{v}(q,\pi) \;\ge\; \inf_{\pi} \underline{v}(q',\pi)$
  (maximal payoff guarantee).

  14. Worst-case optimality
  Given any posterior µ ∈ ∆Ω, the Sender's (lowest) payoff if, starting from µ, the state is fully revealed is
  $V^{full}(\mu) \equiv \sum_{\omega \in \Omega} \underline{V}(\delta_\omega)\, \mu(\omega)$,
  where $\delta_\omega$ is the Dirac measure assigning probability 1 to ω.
  Remark 1. Since both Nature and the Sender can reveal the state, signal q is worst-case optimal iff $\inf_{\pi} \underline{v}(q,\pi) = V^{full}(\mu_0)$.
  W: set of worst-case optimal signals; non-empty (full disclosure is worst-case optimal).
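  In the hypothetical prosecutor illustration above, with an assumed prior $\mu_0(\text{guilty}) = 0.3$ (the prior used by Kamenica and Gentzkow),
  \[
  V^{full}(\mu_0) = \mu_0(\text{guilty})\,\underline{V}(\delta_{\text{guilty}}) + \mu_0(\text{innocent})\,\underline{V}(\delta_{\text{innocent}}) = 0.3 \cdot 1 + 0.7 \cdot 0 = 0.3,
  \]
  so any worst-case optimal signal must guarantee the Sender an expected payoff of at least 0.3, the full-disclosure payoff.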

  15. Robust Solutions
  Definition 2. Signal $q^{RS}$ is a robust solution if it maximizes $\overline{v}(q,\emptyset)$ over W.
  Lexicographic preferences: max-max over max-min policies.
  Step 1: max-min (worst-case optimal policies). Step 2: max-max (highest payoff if Nature and the Receivers play favorably).
  Clearly, $q^{RS}$ also maximizes $\sup_{\pi} \overline{v}(q,\pi)$ over W. However, the Sender prefers to provide information herself rather than counting on Nature to do it.

  16. Robust Solutions
  Lemma 1. Signal $q^{RS}$ is a robust solution iff the distribution over posterior beliefs $\rho^{RS} \in \Delta\Delta\Omega$ induced by $q^{RS}$ maximizes $\int \overline{V}(\mu)\, d\rho(\mu)$ over the set $W \subset \Delta\Delta\Omega$ of distributions over posterior beliefs satisfying
  (a) Bayes plausibility: $\int \mu\, d\rho(\mu) = \mu_0$, and
  (b) "worst-case optimality" (WCO): $\int \mathrm{lco}(\underline{V})(\mu)\, d\rho(\mu) = V^{full}(\mu_0)$.

  17. Robust vs Bayesian Solutions
  Bayesian solutions: $q^{BP}$ maximizes $\overline{v}(q,\emptyset)$ over Q (the set of feasible signals); the induced distribution over posterior beliefs $\rho^{BP} \in \Delta\Delta\Omega$ maximizes $\int \overline{V}(\mu)\, d\rho(\mu)$ over all distributions $\rho \in \Delta\Delta\Omega$ satisfying Bayes plausibility, $\int \mu\, d\rho(\mu) = \mu_0$.
  Robust solutions: $q^{RS}$ maximizes $\overline{v}(q,\emptyset)$ over $W \subset Q$ (the set of worst-case optimal signals); the induced distribution over posterior beliefs $\rho^{RS} \in \Delta\Delta\Omega$ maximizes $\int \overline{V}(\mu)\, d\rho(\mu)$ over all distributions $\rho \in \Delta\Delta\Omega$ satisfying, in addition to Bayes plausibility $\int \mu\, d\rho(\mu) = \mu_0$, the WCO constraint $\int \mathrm{lco}(\underline{V})(\mu)\, d\rho(\mu) = V^{full}(\mu_0)$.

  18. Plan
  1 Introduction
  2 Model
  3 Robust Solutions
  4 Separation Theorem
  5 Corollaries
  6 Applications
  7 Conditionally-independent Robust Solutions

  19. Separation Theorem

  20. Separation Theorem
  Theorem 1. Let $\mathcal{F} \equiv \{ B \subseteq \Omega : \underline{V}(\mu) \ge V^{full}(\mu) \text{ for all } \mu \in \Delta B \}$. Then
  \[
  W = \{ \rho \in \Delta\Delta\Omega : \rho \text{ satisfies BP and } \mathrm{supp}(\mu) \in \mathcal{F} \text{ for all } \mu \in \mathrm{supp}(\rho) \}.
  \]
  Therefore, $\rho^{RS} \in \Delta\Delta\Omega$ is a robust solution iff $\rho^{RS}$ maximizes $\int \overline{V}(\mu)\, d\rho(\mu)$ over all distributions over posterior beliefs $\rho \in \Delta\Delta\Omega$ satisfying BP and such that $\mathrm{supp}(\mu) \in \mathcal{F}$ for any $\mu \in \mathrm{supp}(\rho)$.
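  Applied to the hypothetical prosecutor illustration (payoffs, tie-breaking, and prior as assumed above), the separation theorem pins down the robust solution directly: for any µ with µ(guilty) ∈ (0, 1/2],
  \[
  \underline{V}(\mu) = 0 < \mu(\text{guilty}) = V^{full}(\mu),
  \]
  so the two-element set {innocent, guilty} is not in $\mathcal{F}$; only the singletons are. Every posterior in a worst-case optimal distribution must therefore be degenerate, and full disclosure is the unique robust solution, whereas the Bayesian solution pools some innocent defendants with the guilty ones to induce the posterior µ(guilty) = 1/2.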

  21. Separation Theorem
  Idea: suppose the Sender induces a posterior µ with supp(µ) = B for which there exists $\eta \in \Delta B$ such that $\underline{V}(\eta) < V^{full}(\eta)$.
  Starting from µ, Nature can induce η with strictly positive probability, and can therefore bring the Sender's payoff strictly below $V^{full}(\mu)$.
  Because Nature can respond to any other posterior $\mu' \in \mathrm{supp}(\rho)$ by fully disclosing the state, $\int \mathrm{lco}(\underline{V})(\tilde{\mu})\, d\rho(\tilde{\mu}) < V^{full}(\mu_0)$, so the policy ρ is not worst-case optimal.

  22. KG's prosecutor example
  [Figure: Prosecutor example. Sender's expected payoff before Nature's disclosure and after Nature's disclosure, plotted over beliefs on [0, 1].]

  23. Plan
  1 Introduction
  2 Model
  3 Robust Solutions
  4 Separation Theorem
  5 Corollaries
  6 Applications
  7 Conditionally-independent Robust Solutions

  24. Corollaries

  25. Existence
  Corollary 1. A robust solution always exists.
  Existence is guaranteed by the possibility for Nature to condition on the realization of the Sender's signal.

  26. State separation
  Corollary 2. Suppose there exist $\omega, \omega' \in \Omega$ and $\lambda \in (0,1)$ such that
  \[
  \underline{V}(\lambda \delta_\omega + (1-\lambda)\delta_{\omega'}) < \lambda \underline{V}(\delta_\omega) + (1-\lambda)\underline{V}(\delta_{\omega'}).
  \]
  Then any robust solution must separate ω and ω′.
  Assumption: there exists some belief supported on {ω, ω′} under which the Sender's payoff is below the full-disclosure payoff.
  Conclusion: ALL posterior beliefs in the support of a robust solution must separate ω and ω′.
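  In the hypothetical prosecutor illustration, the condition of Corollary 2 holds with λ = 1/2 (under the assumed adversarial tie-breaking at the indifference belief):
  \[
  \underline{V}\!\left(\tfrac{1}{2}\delta_{\text{innocent}} + \tfrac{1}{2}\delta_{\text{guilty}}\right) = 0 \;<\; \tfrac{1}{2}\,\underline{V}(\delta_{\text{innocent}}) + \tfrac{1}{2}\,\underline{V}(\delta_{\text{guilty}}) = \tfrac{1}{2},
  \]
  consistent with full disclosure being the unique robust solution in that illustration.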

  27. Robustness of Bayesian Solutions
  Corollary 3. A Bayesian solution $\rho^{BP}$ is robust iff, for any $\mu \in \mathrm{supp}(\rho^{BP})$ and any $\eta \in \Delta\Omega$ with $\mathrm{supp}(\eta) \subset \mathrm{supp}(\mu)$, $\underline{V}(\eta) \ge V^{full}(\eta)$.
  Binary state: any robust solution is either full disclosure or a Bayesian solution.

  28. Worst-case optimality preserved under more disclosure
  Corollary 4. W is closed under Blackwell dominance: if $\rho' \in W$ and ρ Blackwell dominates ρ′, then $\rho \in W$.
  This result does not hold in the case of conditionally independent signals.

  29. Informativeness of Robust vs Bayesian solutions
  Corollary 5. Given any Bayesian solution $\rho^{BP}$, there exists a robust solution $\rho^{RS}$ such that either $\rho^{RS}$ and $\rho^{BP}$ are not comparable in the Blackwell order, or $\rho^{RS}$ Blackwell dominates $\rho^{BP}$.
  If a Bayesian solution $\rho^{BP}$ is Blackwell more informative than a robust solution $\rho^{RS}$, then $\rho^{BP}$ is also robust.
  The reason why robustness calls for more disclosure has little to do with indifference on the Sender's part: concealing information gives Nature more room for adversarial design.
  If a Bayesian solution $\rho^{BP}$ is not robust and is strictly Blackwell dominated by a robust solution $\rho^{RS}$, then $\rho^{RS}$ separates states that are not separated under $\rho^{BP}$: robustness never calls for a mean-preserving spread with the same supports.

  30. Concavification
  Let $v_{low} := \min_{\omega \in \Omega} \underline{V}(\delta_\omega) - 1$ and define the auxiliary function
  \[
  V_{\mathcal{F}}(\mu) =
  \begin{cases}
  \overline{V}(\mu) & \text{if } \mathrm{supp}(\mu) \in \mathcal{F} \text{ and } \overline{V}(\mu) \ge v_{low} \\
  v_{low} & \text{otherwise.}
  \end{cases}
  \]
  Corollary 6. A feasible distribution $\rho \in \Delta\Delta\Omega$ is robust iff $\int V_{\mathcal{F}}(\mu)\, d\rho(\mu) = \mathrm{co}(V_{\mathcal{F}})(\mu_0)$, where $\mathrm{co}(V_{\mathcal{F}})$ denotes the concave closure (concavification) of $V_{\mathcal{F}}$. Furthermore, there always exists a robust solution ρ with $|\mathrm{supp}(\rho)| \le |\Omega|$.
  $V_{\mathcal{F}}$ is upper semi-continuous, so the results follow from arguments similar to those in the Bayesian persuasion literature.
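  A minimal numerical sketch of the concavification characterization, written for the hypothetical binary-state prosecutor illustration used above. The grid, prior, payoff functions, and tie-breaking rules are assumptions for illustration, not objects taken from the paper; the sketch evaluates the auxiliary function on a belief grid and compares the robust value co(V_F)(µ0) with the Bayesian value co(V̄)(µ0).

# Numerical sketch of Corollary 6 for a hypothetical binary-state example.
# Beliefs are identified with mu = Pr(guilty) in [0, 1].
# All payoffs and the prior below are illustrative assumptions.
import numpy as np

GRID = np.linspace(0.0, 1.0, 101)   # belief grid
MU0 = 0.3                           # assumed prior Pr(guilty)

def V_bar(mu):
    # Sender's payoff under favorable tie-breaking: the judge convicts iff mu >= 1/2.
    return 1.0 if mu >= 0.5 else 0.0

def V_under(mu):
    # Sender's payoff under adversarial tie-breaking: the judge convicts iff mu > 1/2.
    return 1.0 if mu > 0.5 else 0.0

def V_full(mu):
    # Payoff from fully revealing the state, starting from belief mu.
    return mu * V_under(1.0) + (1.0 - mu) * V_under(0.0)

def support_in_F(mu, tol=1e-9):
    # With two states, a degenerate belief is always admissible; a belief with full
    # support is admissible only if V_under(eta) >= V_full(eta) for every interior eta.
    if mu in (0.0, 1.0):
        return True
    return all(V_under(eta) >= V_full(eta) - tol for eta in GRID[1:-1])

v_low = min(V_under(0.0), V_under(1.0)) - 1.0

def V_F(mu):
    # Auxiliary function from the concavification slide.
    return V_bar(mu) if (support_in_F(mu) and V_bar(mu) >= v_low) else v_low

def concave_closure_at(f, x, grid):
    # Concave closure of f at x: best value from splitting x into two grid posteriors
    # a <= x <= b (Bayes plausibility in one dimension).
    best = f(x)
    for a in grid[grid <= x]:
        for b in grid[grid >= x]:
            if b > a:
                w = (b - x) / (b - a)               # weight on posterior a
                best = max(best, w * f(a) + (1.0 - w) * f(b))
    return best

robust_value = concave_closure_at(V_F, MU0, GRID)      # value of the robust solution
bayesian_value = concave_closure_at(V_bar, MU0, GRID)  # value of the Bayesian solution
print(f"robust value   = {robust_value:.3f}  (full-disclosure payoff: {V_full(MU0):.3f})")
print(f"Bayesian value = {bayesian_value:.3f}")

  Under these assumptions the sketch reports a robust value of 0.3 (attained by full disclosure) and a Bayesian value of 0.6 (attained by posteriors 0 and 1/2), matching the earlier discussion of the illustration.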
