
Quantum Causal Structures
Christian Majenz, University of Copenhagen
Joint work with Rafael Chaves and David Gross, University of Freiburg (arXiv:1407.3800)
04.12.2014

Motivation: Can we rule out certain causal relationships?


Entropy and Bayesian networks
◮ Consider the entropy cone of random variables arising from a Bayesian network
◮ Conditional independence: Z ⊥ X | Y ⇔ I(X : Z | Y) = 0
◮ Conditional mutual information: I(X : Z | Y) = H(XY) + H(YZ) − H(XYZ) − H(Y)
◮ This is a linear equation in the entropies
◮ Its intersection with the entropy cone is again a convex cone
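A minimal numerical sketch (not part of the slides, assuming Python/NumPy): it evaluates I(X : Z | Y) from a joint distribution via the formula above and checks that it vanishes for a Markov chain X → Y → Z, where Z ⊥ X | Y holds by construction.

    import numpy as np

    def shannon_entropy(p):
        # Shannon entropy (in bits) of a probability array; zero entries are ignored.
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def cond_mutual_info(p_xyz):
        # I(X:Z|Y) = H(XY) + H(YZ) - H(XYZ) - H(Y) for a joint pmf p_xyz[x, y, z].
        p_xy = p_xyz.sum(axis=2)
        p_yz = p_xyz.sum(axis=0)
        p_y = p_xyz.sum(axis=(0, 2))
        return (shannon_entropy(p_xy.ravel()) + shannon_entropy(p_yz.ravel())
                - shannon_entropy(p_xyz.ravel()) - shannon_entropy(p_y))

    # Random Markov chain X -> Y -> Z, so Z is independent of X given Y.
    rng = np.random.default_rng(0)
    p_x = rng.dirichlet(np.ones(2))
    p_y_given_x = rng.dirichlet(np.ones(2), size=2)   # rows indexed by x
    p_z_given_y = rng.dirichlet(np.ones(2), size=2)   # rows indexed by y
    p_xyz = np.einsum('x,xy,yz->xyz', p_x, p_y_given_x, p_z_given_y)
    print(cond_mutual_info(p_xyz))                    # ~0, up to floating-point error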

Marginal scenario I
◮ Only some RVs are observed
◮ Example: [figure] → marginal scenario: {A, B, C}
◮ Plan: remove the unobserved variables from the inequality description of the entropy cone

Marginal scenario II
Definition (Marginal scenario). Let n ∈ ℕ. A subset M ⊆ 2^{{1,...,n}} such that I ∈ M and J ⊆ I imply J ∈ M is called a marginal scenario.
Projection of the entropy cone onto a marginal scenario ⇒ observable entropy cone.
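A small Python sketch (an illustration, not from the slides; the helper names are made up) of the downward-closure condition in the definition:

    from itertools import chain, combinations

    def powerset(s):
        # All subsets of s, as frozensets (including the empty set).
        s = list(s)
        return [frozenset(c) for c in
                chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

    def is_marginal_scenario(M):
        # Downward closure: I in M and J a subset of I imply J in M.
        M = {frozenset(I) for I in M}
        return all(J in M for I in M for J in powerset(I))

    # The scenario from the example above: all subsets of the observed nodes {A, B, C}.
    print(is_marginal_scenario(powerset({'A', 'B', 'C'})))     # True
    print(is_marginal_scenario([{'A', 'B'}, {'A'}]))           # False: {'B'} (and {}) are missing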

Quantum

Quantum Causal Structures I
◮ DAG G = (V, E)
◮ Sink nodes s ∈ V: Hilbert space H_s
◮ Nodes v ∈ V with children: Hilbert space H_v = ⊗_{w ∈ V : (v,w) ∈ E} H_{v,w}
◮ Total Hilbert space H = ⊗_{v ∈ V} H_v
◮ Parent Hilbert space H_{pa(v)} = ⊗_{w ∈ V : (w,v) ∈ E} H_{w,v}
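A bookkeeping sketch in Python (my illustration, using the three-node example that appears below) of which tensor factors make up H_v and H_{pa(v)} for each node:

    # Edges of the example DAG below: 1 -> 2 and 1 -> 3; nodes 2 and 3 are sinks.
    edges = [(1, 2), (1, 3)]
    nodes = [1, 2, 3]

    # H_v: outgoing edge factors for nodes with children, a single factor H_s for sinks.
    H_v = {v: ([f"H_{v},{w}" for (a, w) in edges if a == v] or [f"H_{v}"]) for v in nodes}
    # H_pa(v): incoming edge factors.
    H_pa = {v: [f"H_{a},{v}" for (a, b) in edges if b == v] for v in nodes}

    print(H_v)    # {1: ['H_1,2', 'H_1,3'], 2: ['H_2'], 3: ['H_3']}
    print(H_pa)   # {1: [], 2: ['H_1,2'], 3: ['H_1,3']}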

Quantum Causal Structures II
◮ Initial product state ρ_0 = ⊗_q ρ_q on the Hilbert spaces of the source nodes q
◮ CPTP map Φ_v for each non-source node: Φ_v : L(H_{pa(v)}) → L(H_v)
◮ ρ_v = Φ_v(ρ_{pa(v)})
[!] no global state
◮ Want classical nodes: pick the right Φ_v (see the sketch below)
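A minimal sketch of ρ_v = Φ_v(ρ_{pa(v)}) in Python, using a Kraus representation of Φ_v (my choice here, not specified on the slides). A completely dephasing Φ_v also illustrates the last bullet: it makes the output of node v effectively classical.

    import numpy as np

    def apply_channel(kraus_ops, rho):
        # CPTP map in Kraus form: rho -> sum_k K_k rho K_k^dagger.
        return sum(K @ rho @ K.conj().T for K in kraus_ops)

    # Example choice of Phi_v: complete dephasing of a single qubit in the computational basis.
    dephasing = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

    rho_pa = np.array([[0.5, 0.5], [0.5, 0.5]])      # |+><+| arriving from the parent edge
    rho_v = apply_channel(dephasing, rho_pa)         # maximally mixed: a uniform classical bit
    print(rho_v)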

Example
[figure: node 1 (unobserved) with edges to nodes 2 and 3 (observed)]
◮ H = H_{1,2} ⊗ H_{1,3} ⊗ H_2 ⊗ H_3
◮ States on the coexisting subsets of systems (a numeric instance follows below):
  ρ_{(1,2),(1,3)} = ρ_0
  ρ_{(1,3),2} = (Φ_2 ⊗ 1) ρ_{(1,2),(1,3)}
  ρ_{(1,2),3} = (1 ⊗ Φ_3) ρ_{(1,2),(1,3)}
  ρ_{2,3} = (Φ_2 ⊗ Φ_3) ρ_{(1,2),(1,3)}
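A numeric instance of this example, under assumptions not made on the slides: the hidden node 1 distributes a Bell state, each edge space is a qubit, and Φ_2, Φ_3 are complete dephasing (so nodes 2 and 3 are classical). Since all spaces have equal dimension, each factor is kept in place.

    import numpy as np

    I2 = np.eye(2)
    dephasing = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]   # Kraus operators (assumed choice)

    def channel_on(kraus_ops, rho, factor):
        # Apply a single-qubit channel to factor 0 or 1 of a two-qubit state.
        lift = (lambda K: np.kron(K, I2)) if factor == 0 else (lambda K: np.kron(I2, K))
        return sum(lift(K) @ rho @ lift(K).conj().T for K in kraus_ops)

    # rho_0 on H_{1,2} (x) H_{1,3}: a Bell state distributed by the hidden node 1.
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    rho_0 = np.outer(bell, bell)

    rho_13_2 = channel_on(dephasing, rho_0, factor=0)     # (Phi_2 (x) 1) rho_0
    rho_12_3 = channel_on(dephasing, rho_0, factor=1)     # (1 (x) Phi_3) rho_0
    rho_23 = channel_on(dephasing, rho_13_2, factor=1)    # (Phi_2 (x) Phi_3) rho_0
    print(np.diag(rho_23))    # [0.5, 0, 0, 0.5]: two perfectly correlated classical bits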

The entropy cone
◮ Look at the entropy cone of states constructed in this way
◮ H is a tensor product of n = |E| + |V_s| Hilbert spaces, where V_s is the set of sinks
◮ Formally: entropy vector v ∈ R^{2^n}
◮ v_I = S(ρ_I) if the state ρ_I exists
◮ v_I arbitrary if ρ_I does not exist
◮ For each subset J ⊆ E ∪ V_s of coexisting systems: the constraints of the quantum entropy cone
◮ Extra monotonicities for classical systems
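Spelling out the last two bullets (these are the standard von Neumann entropy inequalities; the precise list used in arXiv:1407.3800 may differ): for subsystems A, B, C within one coexisting set,

    S(AB) + S(BC) ≥ S(ABC) + S(B)    (strong subadditivity)
    S(AB) + S(BC) ≥ S(A) + S(C)      (weak monotonicity)

and, if X is a classical system, additionally

    S(XB) ≥ S(B)    (monotonicity, i.e. S(X|B) ≥ 0, which can fail for entangled quantum systems).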

Data processing inequality
◮ How do we replace the conditional independence relations?
→ Data processing inequality: for a CPTP map Φ : L(H_A) → L(H_B),
  I(A : C | D) ≥ I(B : C | D)
  for any other systems C and D.
◮ Relates entropies of non-coexisting systems
◮ Replaces conditional independence
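A quick numerical check of the data processing inequality (a sketch, taking D trivial): A is maximally entangled with C, B = Φ(A) for a depolarizing channel Φ, and indeed I(A : C) ≥ I(B : C).

    import numpy as np

    def vn_entropy(rho):
        # von Neumann entropy in bits.
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-np.sum(ev * np.log2(ev)))

    def mutual_info(rho_2q):
        # I(first : second) = S(first) + S(second) - S(both) for a two-qubit state.
        r = rho_2q.reshape(2, 2, 2, 2)
        rho_first = np.einsum('icjc->ij', r)    # trace out the second factor
        rho_second = np.einsum('aiaj->ij', r)   # trace out the first factor
        return vn_entropy(rho_first) + vn_entropy(rho_second) - vn_entropy(rho_2q)

    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    rho_ac = np.outer(bell, bell)               # I(A:C) = 2 bits

    # B = Phi(A) for a depolarizing channel with parameter p (Kraus form).
    p = 0.5
    X = np.array([[0, 1], [1, 0]]); Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1.0, -1.0])
    kraus = [np.sqrt(1 - 3 * p / 4) * np.eye(2)] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]
    rho_bc = sum(np.kron(K, np.eye(2)) @ rho_ac @ np.kron(K, np.eye(2)).conj().T for K in kraus)

    print(mutual_info(rho_ac), mutual_info(rho_bc))   # 2.0 and a smaller value: I(A:C) >= I(B:C)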

Marginal Scenario
◮ Again, only some systems are observed
◮ Example: [figure]
◮ Most interesting marginal scenario: {A, B, C}, and assume these nodes are classical

Application

Information Causality
The IC principle is defined by a game:
[figure: Alice receives bits X_1, X_2 and shares the state ρ_AB with Bob; she sends a message M, and Bob, given an index i, must guess X_i]
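For reference (the remaining slides are not included here), the standard quantitative statement of the IC principle, in the notation of the game above, where β denotes Bob's guess and M is an m-bit message:

    I(X_1 : β | i = 1) + I(X_2 : β | i = 2) ≤ H(M) ≤ m,

i.e. Bob's total information gain about Alice's data is bounded by the classical communication, however much entanglement (or other no-signalling resource) ρ_AB provides (Pawłowski et al., Nature 461, 1101 (2009)).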
