Spin Glasses and Information Processing
Pavithran S Iyer
Guide: Prof. V.V. Sreedhar
Chennai Mathematical Institute
April 25, 2011
Outline
1 Overview
2 Information Theory
   Communication problem
   Error correcting codes
   Shannon-Hartley theorem
3 Disordered spin systems
   Introduction
   Reason for correspondence
   Spin glass physics
4 Implications of the correspondence
   SK model
   REM
   Convolutional codes
5 Questions
6 Bibliography
Outline
Work described: papers by N. Sourlas and the book by H. Nishimori.
Correspondences:
Error correcting code ⇔ Spin Hamiltonian
Signal-to-noise ratio ⇔ J_0^2 / J^2
Maximum-likelihood decoding ⇔ Finding a ground state
Error probability per bit ⇔ Ground-state magnetization
Sequence of most probable symbols ⇔ Magnetization at T = 1
Convolutional codes ⇔ One-dimensional spin glasses
Viterbi decoding ⇔ T = 0 transfer-matrix algorithm
BCJR decoding ⇔ T = 1 transfer-matrix algorithm
Gallager LDPC codes ⇔ Diluted p-spin ferromagnets
Turbo codes ⇔ Coupled spin chains
Zero-error threshold ⇔ Phase transition point
Belief propagation algorithm ⇔ Iterative solution of TAP equations
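The first few correspondences can be made concrete with the Sourlas construction. Below is a minimal sketch of my own (the tuple size p = 3, the noise level and the system size are illustrative assumptions, not values from the talk): message bits are encoded as products of p spins, the channel adds Gaussian noise to these couplings, and maximum-likelihood decoding becomes a ground-state search of the resulting spin Hamiltonian.

```python
# Illustrative sketch of the Sourlas code <-> spin Hamiltonian correspondence.
# Parameters (N, M, noise) are assumptions for the example, not from the talk.
import itertools
import random

def sourlas_encode(xi, tuples):
    """Encode the message xi (a list of +/-1 spins): each transmitted
    symbol is the product of the p = 3 message bits in one tuple."""
    return [xi[i] * xi[j] * xi[k] for (i, j, k) in tuples]

def energy(sigma, J, tuples):
    """Spin-glass Hamiltonian H(sigma) = - sum_t J_t sigma_i sigma_j sigma_k;
    low energy means sigma agrees with the noisy received couplings."""
    return -sum(Jt * sigma[i] * sigma[j] * sigma[k]
                for Jt, (i, j, k) in zip(J, tuples))

random.seed(0)
N, M, noise = 8, 24, 0.8                 # message bits, couplings, channel noise (assumed)
xi = [random.choice([-1, 1]) for _ in range(N)]                 # Alice's message
tuples = [tuple(random.sample(range(N), 3)) for _ in range(M)]  # random p = 3 tuples
J0 = sourlas_encode(xi, tuples)                                 # noiseless couplings
J = [j + random.gauss(0.0, noise) for j in J0]                  # Gaussian channel output

# Maximum-likelihood decoding = ground-state search (brute force; small N only).
# For weak enough noise the ground state should reproduce the message xi.
decoded = min(itertools.product([-1, 1], repeat=N),
              key=lambda s: energy(s, J, tuples))
print("message:", xi)
print("decoded:", list(decoded))
```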
Communication Problem
Usual formulation: a message is sent from Alice to Bob. Alice transmits an encoded input, a (Gaussian) channel inflicts errors, and Bob tries to recover from them.
Statistical formulation (Bob's perspective): given the channel output, Bob maximizes the probability that his guess of the input is correct. The quantity being maximized, P(J^in | J^out), is called the posterior probability.
Maximum A Posteriori (MAP) decoding: compute the conditional probabilities using Bayes' theorem and assign J_i^in = +1 if P(J_i^in = +1 | J^out) > P(J_i^in = -1 | J^out), and J_i^in = -1 otherwise.
The maximum information about Alice's input that can be transmitted across the channel to Bob is the channel capacity C. With input (signal) power S and noise power N, the important quantities are the signal-to-noise ratio S/N and, for a Gaussian channel, the channel capacity
C = (1/2) log_2 (1 + S/N).
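As an illustration of the bitwise decision rule above (my own sketch, not part of the talk; the codebook and noise level are assumed), one can decode over a Gaussian channel by weighting each codeword with its likelihood, which by Bayes' theorem with a uniform prior is proportional to its posterior, and then comparing the two marginal posteriors for each bit.

```python
# Bitwise posterior decoding for a Gaussian channel (toy example, assumed parameters).
import math

def decode_bitwise(received, codebook, sigma):
    """Decide each bit by the sign of P(J_i = +1 | out) - P(J_i = -1 | out),
    with a uniform prior over the codewords in `codebook`."""
    n = len(received)
    marginals = [0.0] * n
    for word in codebook:
        # Gaussian likelihood P(received | word), up to a constant factor
        # that cancels when the two marginals are compared.
        w = math.exp(-sum((r - c) ** 2 for r, c in zip(received, word))
                     / (2 * sigma ** 2))
        for i, c in enumerate(word):
            marginals[i] += c * w      # adds w when c = +1, subtracts when c = -1
    return [1 if m > 0 else -1 for m in marginals]

# Toy codebook: the two codewords of a length-3 repetition code.
codebook = [(1, 1, 1), (-1, -1, -1)]
received = [0.7, -0.2, 1.3]            # (+1, +1, +1) was sent; the middle symbol flipped sign
print(decode_bitwise(received, codebook, sigma=1.0))   # -> [1, 1, 1]
```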
Error Correcting Codes
Not all encodings allow recovery from errors; only certain ones, called error correcting codes, do.
Crux: add redundant bits to the input message, so that as long as a majority of the bits are unaffected by the error, the original message can be retrieved.
Redundancy is undesirable: it slows the rate of information transmission.
Rate of transmission:
Rate of information = (# bits needed to encode the message, ignoring errors) / (# bits actually used to encode it against errors)
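For instance (a hedged example of my own, not from the talk), the length-3 repetition code sends every message bit three times, so its rate is 1/3; majority-vote decoding then reduces the bit-error probability on a binary symmetric channel from p to about 3p^2 - 2p^3, as the quick Monte Carlo check below confirms.

```python
# Rate and error probability of the length-3 repetition code (illustrative only).
import random

def transmit(bit, p):
    """Binary symmetric channel: flip the bit with probability p."""
    return bit ^ (random.random() < p)

def repetition_decode(received):
    """Majority vote over the three received copies."""
    return 1 if sum(received) >= 2 else 0

random.seed(1)
p, trials = 0.1, 100_000
errors = 0
for _ in range(trials):
    bit = random.randint(0, 1)
    received = [transmit(bit, p) for _ in range(3)]   # rate = 1/3: three channel uses per message bit
    errors += (repetition_decode(received) != bit)

print("uncoded bit-error rate:", p)
print("coded bit-error rate  :", errors / trials)     # roughly 3*p**2 - 2*p**3 = 0.028
```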