Entropy and Bayesian networks
◮ Consider the entropy cone of RVs from a Bayesian network
◮ Conditional independence: Z ⊥ X | Y ⇔ I(X : Z | Y) = 0
◮ Conditional mutual information: I(X : Z | Y) = H(XY) + H(YZ) − H(XYZ) − H(Y) (numerical sketch below)
◮ Linear equation
◮ Intersection with the entropy cone is a convex cone
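To make the definition concrete, here is a minimal sketch (the function names and the toy distribution are hypothetical, not from the talk) that evaluates I(X : Z | Y) from a joint probability table via the four-entropy formula above:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array (zeros skipped)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def cmi(p_xyz):
    """I(X:Z|Y) = H(XY) + H(YZ) - H(XYZ) - H(Y), with p_xyz indexed as p[x, y, z]."""
    return (H(p_xyz.sum(axis=2).ravel()) + H(p_xyz.sum(axis=0).ravel())
            - H(p_xyz.ravel()) - H(p_xyz.sum(axis=(0, 2)).ravel()))

# Toy check: a Markov chain X -> Y -> Z with Z = Y = X,
# so Z is conditionally independent of X given Y and I(X:Z|Y) = 0.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5
print(cmi(p))  # 0.0 up to floating point
```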
Marginal scenario I
◮ Only some RVs are observed
◮ Example (figure of the DAG omitted in extraction): → Marginal scenario: {A, B, C}
◮ Plan: remove the unobserved variables from the inequality description of the entropy cone
Marginal scenario II
Definition (Marginal Scenario). Let n ∈ ℕ. A subset M ⊂ 2^{1,...,n} such that for I ∈ M and J ⊂ I also J ∈ M is called a marginal scenario.
Projection of the entropy cone onto a marginal scenario ⇒ observable entropy cone (the elimination step is sketched below).
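The projection onto the observed coordinates can be carried out by Fourier–Motzkin elimination of the unobserved entropy coordinates. A minimal sketch, where representing each inequality as a tuple of coefficients is a choice made here for illustration:

```python
def fm_eliminate(ineqs, j):
    """One Fourier-Motzkin step: project the cone {x : a . x >= 0 for a in ineqs}
    onto the coordinates other than j. Each inequality is a coefficient tuple;
    the j-th slot of every output tuple is 0."""
    pos = [a for a in ineqs if a[j] > 0]
    neg = [a for a in ineqs if a[j] < 0]
    out = [a for a in ineqs if a[j] == 0]
    for p in pos:
        for q in neg:
            # positive combination of p and q that cancels the j-th coefficient
            out.append(tuple(-q[j] * p[k] + p[j] * q[k] for k in range(len(p))))
    return out

# Example: eliminate x0 from {x0 >= 0, x1 - x0 >= 0}  ->  {x1 >= 0}
print(fm_eliminate([(1, 0), (-1, 1)], 0))  # [(0, 1)]
```

Eliminating every coordinate outside the marginal scenario, one at a time, yields an inequality description of the observable entropy cone.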
Quantum
Quantum Causal Structures I
◮ DAG G = (V, E)
◮ Sink nodes s ∈ V: Hilbert space H_s
◮ Nodes v ∈ V with children: Hilbert space H_v = ⊗_{w ∈ V : (v,w) ∈ E} H_{v,w}
◮ H = ⊗_{v ∈ V} H_v
◮ Parent Hilbert space H_{pa(v)} = ⊗_{w ∈ V : (w,v) ∈ E} H_{w,v}
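In code, this amounts to bookkeeping with one tensor factor per edge and one per sink; a small sketch (the list-based representation is hypothetical):

```python
# One tensor factor per edge (v, w) and one per sink s.
edges = [(1, 2), (1, 3)]   # DAG of the example below: 1 -> 2, 1 -> 3
sinks = [2, 3]

def H_factors(v):
    """Tensor factors of H_v: outgoing-edge spaces, or H_s itself for a sink."""
    out = [(a, w) for (a, w) in edges if a == v]
    return out if out else [v]

def H_pa_factors(v):
    """Tensor factors of the parent space H_pa(v): incoming-edge spaces."""
    return [(w, b) for (w, b) in edges if b == v]

print(H_factors(1), H_factors(2), H_pa_factors(2))
# [(1, 2), (1, 3)] [2] [(1, 2)]
```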
Quantum Causal Structures II
◮ Initial product state ρ_0 = ⊗_q ρ_q on the Hilbert spaces of the source nodes q
◮ CPTP maps Φ_v for each non-source node: Φ_v : L(H_{pa(v)}) → L(H_v)
◮ ρ_v = Φ_v(ρ_{pa(v)})
[!] no global state
◮ Want classical nodes: pick the right Φ_v (see the sketch below)
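A minimal numerical sketch of such a map, using the Kraus representation (the completely dephasing channel is one standard way to make a node classical; this particular code is illustrative only): the output is diagonal in a fixed basis, i.e. a classical random variable.

```python
import numpy as np

def apply_channel(kraus_ops, rho):
    """Apply a CPTP map given by Kraus operators: rho -> sum_k K rho K^dag."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

# Measure-and-prepare map on a qubit node (completely dephasing channel).
P0 = np.diag([1.0, 0.0]).astype(complex)
P1 = np.diag([0.0, 1.0]).astype(complex)
dephase = [P0, P1]  # Kraus operators; the output is always diagonal

plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
rho_v = apply_channel(dephase, plus)
print(np.round(rho_v, 3))  # diag(0.5, 0.5): a classical bit
```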
Example
[figure: node 1 (unobserved) with arrows to nodes 2 and 3 (both observed)]
◮ H = H_{1,2} ⊗ H_{1,3} ⊗ H_2 ⊗ H_3
◮ States on the coexisting subsets of systems:
  ρ_{(1,2),(1,3)} = ρ_0
  ρ_{(1,3),2} = (Φ_2 ⊗ 1) ρ_{(1,2),(1,3)}
  ρ_{(1,2),3} = (1 ⊗ Φ_3) ρ_{(1,2),(1,3)}
  ρ_{2,3} = (Φ_2 ⊗ Φ_3) ρ_{(1,2),(1,3)}
The entropy cone
◮ Look at the entropy cone of states constructed like this
◮ H is a tensor product of n = |E| + |V_s| Hilbert spaces, V_s: set of sinks
◮ Formally: entropy vector v ∈ ℝ^{2^n}
◮ v_I = S(ρ_I) if the state ρ_I exists (numerical sketch below)
◮ v_I arbitrary if ρ_I doesn't exist
◮ For each subset J ⊂ E ∪ V_s of coexisting systems: constrained by the quantum entropy cone
◮ Extra monotonicity inequalities for classical systems
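A minimal sketch of computing entries v_I = S(ρ_I) numerically; the partial-trace helper and the Bell-pair example are illustrative choices, not taken from the talk:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]  # drop numerical zeros
    return float(-(ev * np.log2(ev)).sum())

def reduced_state(rho, dims, keep):
    """Partial trace keeping the subsystems listed in `keep` (indices into dims)."""
    n = len(dims)
    keep = sorted(keep)
    out = [k for k in range(n) if k not in keep]
    t = rho.reshape(dims + dims)  # axes: row indices 0..n-1, column indices n..2n-1
    t = np.transpose(t, keep + [k + n for k in keep] + out + [k + n for k in out])
    dk = int(np.prod([dims[k] for k in keep], dtype=int))
    do = int(np.prod([dims[k] for k in out], dtype=int))
    return np.trace(t.reshape(dk, dk, do, do), axis1=2, axis2=3)

# Bell pair on two qubit systems, e.g. the edge spaces (1,2) and (1,3) above
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
dims = [2, 2]
for I in [[0], [1], [0, 1]]:
    print(I, round(von_neumann_entropy(reduced_state(rho, dims, I)), 3))
# marginals: 1.0 bit each; the joint pure state: 0.0
```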
Data processing inequality
◮ Replace conditional independence relations?
→ Data processing inequality: for a CPTP map Φ : L(H_A) → L(H_B),
  I(A : C | D) ≥ I(B : C | D)
for any other systems C and D.
◮ Relates entropies of noncoexisting systems
◮ Replaces conditional independence
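A quick numerical sanity check of the special case with trivial D, where the claim reduces to I(A : C) ≥ I(B : C); the amplitude-damping channel is a hypothetical choice of Φ:

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def ptrace(rho, keep_first):
    """Partial trace of a two-qubit state: keep the first or the second qubit."""
    r = rho.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3) if keep_first else np.trace(r, axis1=0, axis2=2)

def mutual_info(rho):
    return S(ptrace(rho, True)) + S(ptrace(rho, False)) - S(rho)

# Bell pair on systems A and C
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AC = np.outer(psi, psi.conj())

# Amplitude-damping channel acting on A, giving B = Phi(A)
g = 0.4
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)
I2 = np.eye(2)
rho_BC = sum(np.kron(K, I2) @ rho_AC @ np.kron(K, I2).conj().T for K in (K0, K1))

print(mutual_info(rho_AC), ">=", mutual_info(rho_BC))  # DPI: 2.0 >= smaller value
```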
Marginal Scenario
◮ Again, only some systems are observed
◮ Example: (figure omitted in extraction)
◮ Most interesting marginal scenario: {A, B, C}, and assume these nodes are classical
Application
Information Causality
The IC principle is defined by a game:
[figure: Alice receives bits X_1, X_2 and shares a state ρ_AB with Bob; she sends him a message M, and Bob, given an index i, must guess X_i]
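Information causality bounds how well Bob can do with an m-bit message; a PR box breaks this bound. A minimal simulation of van Dam's protocol (function names are hypothetical), in which one PR-box use plus a single message bit lets Bob recover either of Alice's bits perfectly, exactly what IC forbids:

```python
import random

def pr_box(x, y):
    """Noiseless PR box: outputs a, b with a uniformly random and a XOR b = x AND y."""
    a = random.randint(0, 1)
    return a, a ^ (x & y)

def ic_game_round():
    x1, x2 = random.randint(0, 1), random.randint(0, 1)  # Alice's bits
    i = random.randint(0, 1)                             # Bob wants X_{i+1}
    a, b = pr_box(x1 ^ x2, i)                            # one PR-box use
    m = x1 ^ a                                           # the single message bit
    guess = m ^ b   # = x1 ^ ((x1 ^ x2) & i): yields x1 if i = 0, x2 if i = 1
    return guess == (x1, x2)[i]

print(all(ic_game_round() for _ in range(10_000)))  # True: perfect success
```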