The Magic of Correlation Measurements


  1. The Magic of Correlation Measurements
   29th Ann. Symp & Mini Show, Hanover Manor, NJ, Oct. 2, 2014, MTT-S & AP-S
   Enrico Rubiola, FEMTO-ST Institute, Besancon, France
   Contents • Statistics • Spectral measure and estimation • Theory of the cross spectrum • Applications
   Home page: http://rubiola.org

  2. Correlation measurements
   [Block diagram] The DUT noise c(t) feeds two separate instruments: instrument A adds its noise a(t), giving x = a + c; instrument B adds its noise b(t), giving y = b + c. A dual-channel FFT analyzer measures both channels. a(t), b(t) –> instrument noise; c(t) –> DUT noise. Only the DUT noise is common to the two channels.
   [Plot of Sφ(f) vs frequency, three cases]
   • Normal use (single-channel noise measurement): a, b ≪ c, so the measurement gives the DUT noise.
   • Ideal case, no DUT (a, b ≫ c = 0): the instrument background is rejected as 1/√m.
   • Real case, zero DUT noise (a, b ≫ c ≠ 0): c is the correlated background of the instrument, which correlation cannot reject.

  3. Statistics: boring but necessary exercises

  4. Vocabulary of statistics
   • A random process x(t) is defined through a random experiment e that associates a function x_e(t) with each outcome e.
   • The set of all the possible x_e(t) is called the ensemble.
   • The function x_e(t) is called a realization or sample function.
   • The ensemble average is called the mathematical expectation E{ }.
   • A random process is said to be stationary if its statistical properties are independent of time. Often we restrict the attention to some statistical properties. In physics, this is the concept of repeatability.
   • A random process x(t) is said to be ergodic if a realization observed in time has the statistical properties of the ensemble. Ergodicity makes sense only for stationary processes. Often we restrict the attention to some statistical properties. In physics, this is the concept of reproducibility.
   Example: thermal noise of a resistor of value R
   • The experiment e is the random choice of a resistor e.
   • The realization x_e(t) is the noise waveform measured across the resistor e.
   • We always measure ⟨x²⟩ = 4kTRB, so the process is stationary.
   • After measuring many resistors, we conclude that ⟨x²⟩ = 4kTRB always holds. The process is ergodic (see the sketch below).
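A minimal numerical sketch of the resistor example, assuming ideal white Gaussian samples with variance 4kTRB (the values of T, R, B are illustrative): the time average of x² along one realization should match the ensemble average across many simulated "resistors".

```python
import numpy as np

# Ergodicity check for simulated resistor thermal noise: compare the
# ensemble average of x^2 with the time average of a single realization.
rng = np.random.default_rng(0)
k, T, R, B = 1.380649e-23, 290.0, 50.0, 1e6   # Boltzmann const, temperature, resistance, bandwidth
var_theory = 4 * k * T * R * B                # <x^2> = 4kTRB

# 1000 "resistors" (realizations), 10000 time samples each
ensemble = rng.normal(0.0, np.sqrt(var_theory), size=(1000, 10000))

ensemble_avg = np.mean(ensemble**2)           # average over the whole ensemble
time_avg_one = np.mean(ensemble[0]**2)        # time average of one realization

print(f"theory         : {var_theory:.3e} V^2")
print(f"ensemble <x^2> : {ensemble_avg:.3e} V^2")
print(f"time     <x^2> : {time_avg_one:.3e} V^2")  # close to the ensemble value
```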

  5. A relevant property of random noise
   A theorem states that there is no a-priori relation between the PDF(1) and the spectral measure. For example, white noise can originate from:
   • a Poisson process (emission of a particle at a random time)
   • a random telegraph (random switching between two levels)
   • thermal noise (Gaussian)
   (1) PDF = Probability Density Function
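A quick way to see this property is a sketch using i.i.d. approximations of the three processes (parameters are arbitrary): the periodograms come out equally flat despite the very different PDFs.

```python
import numpy as np

# Three processes with very different amplitude distributions, all white.
rng = np.random.default_rng(1)
N = 1 << 16

gaussian  = rng.normal(size=N)                 # Gaussian PDF
telegraph = rng.choice([-1.0, 1.0], size=N)    # two-level random telegraph (i.i.d. switching)
poisson   = rng.poisson(lam=0.05, size=N).astype(float)
poisson  -= poisson.mean()                     # remove the DC component

for name, x in [("gaussian", gaussian), ("telegraph", telegraph), ("poisson", poisson)]:
    psd = np.abs(np.fft.rfft(x))**2 / N        # raw periodogram
    lo, hi = psd[1:N//4].mean(), psd[N//4:N//2].mean()  # low vs high frequency bands
    print(f"{name:9s}  low-band {lo:7.3f}  high-band {hi:7.3f}  -> flat (white)")
```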

  6. Why Gaussian white noise?
   • Whenever randomness occurs at the microscopic level, noise tends to be Gaussian (central-limit theorem).
   • Most environmental effects are not "noise" in the strict sense (often, they are more disturbing than noise).
   • Colored noise types (1/f, 1/f², etc.) can be whitened, analyzed, and un-whitened (a sketch follows below).
   • Of course, Gaussian white noise is easy to understand.
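A minimal sketch of the whiten / analyze / un-whiten idea for 1/f² (random-walk) noise. The first-difference filter used here is an illustrative choice, not necessarily the speaker's method.

```python
import numpy as np

# A first difference turns a random walk (1/f^2 noise) back into its white
# Gaussian increments; a cumulative sum undoes the whitening.
rng = np.random.default_rng(2)
steps = rng.normal(size=100000)
walk = np.cumsum(steps)              # 1/f^2 process

whitened = np.diff(walk)             # whitening filter: 1 - z^-1
print("whitened var:", whitened.var())   # ~1: white Gaussian increments

recovered = np.cumsum(whitened)      # un-whitening: running sum
print("max error   :", np.max(np.abs(recovered - (walk[1:] - walk[0]))))  # ~0
```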

  7. Properties of Gaussian white noise with zero mean
   x(t) <=> X(ıf) = X'(ıf) + ıX"(ıf)
   1. x(t) and X(ıf) are Gaussian.
   2. X(ıf1) and X(ıf2), f1 ≠ f2: they are statistically independent, and var{X(ıf1)} = var{X(ıf2)}.
   3. Real and imaginary parts at the same frequency: X' and X" are statistically independent, and var{X'} = var{X"} = var{X}/2 (verified numerically below).
   4. Y = X1 + X2: Y is Gaussian, and var{Y} = var{X1} + var{X2} (2N degrees of freedom).
   5. Y = X1 × X2: var{Y} = var{X1} var{X2} (the single product is Bessel-K0 distributed, see slide 13).
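Properties 2 and 3 are easy to verify numerically. A sketch, assuming unit-variance white Gaussian samples and an FFT normalized so that var{X} = 1 per bin:

```python
import numpy as np

# Check: Re/Im parts of each FFT bin are independent with var = var{X}/2,
# and distinct bins are independent.
rng = np.random.default_rng(3)
trials, N = 4000, 256
x = rng.normal(size=(trials, N))
X = np.fft.fft(x, axis=1) / np.sqrt(N)       # normalize so var{X(f)} = 1 per bin

k1, k2 = 10, 40                              # two interior bins, f1 != f2
Xp, Xq = X[:, k1], X[:, k2]
print("var{X'}     :", Xp.real.var())        # ~0.5 = var{X}/2
print('var{X"}     :', Xp.imag.var())        # ~0.5
print('corr(X\',X") :', np.corrcoef(Xp.real, Xp.imag)[0, 1])  # ~0: independent
print("corr(f1,f2) :", np.corrcoef(Xp.real, Xq.real)[0, 1])   # ~0: independent bins
```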

  8. Properties of parametric noise
   x(t) <=> X(ıf) = X'(ıf) + ıX"(ıf)
   1. Pair x(t) <=> X(ıf): there is no a-priori relation between the distribution of x(t) and that of X(ıf) (theorem); by the central limit theorem, x(t) and X(ıf) end up being Gaussian.
   2. X(ıf1) and X(ıf2): generally correlated, and var{X(ıf1)} ≠ var{X(ıf2)} in general.
   3. Real and imaginary parts at the same frequency: X' and X" can be correlated, and var{X'} ≠ var{X"} ≠ var{X}/2.
   4. Y = X1 + X2, zero-mean independent Gaussian r.v.: var{Y} = var{X1} + var{X2}; the process has N … 2N degrees of freedom, depending on the correlation between X' and X".
   5. If X1 and X2 are zero-mean independent Gaussian r.v.: Y = X1 × X2 is zero-mean (Bessel-K0 distributed, see slide 13), and var{Y} = var{X1} var{X2}.

  9. Children of the Gaussian distribution (sampled below)
   • Chi-square: χ² = Σᵢ xᵢ²
   • Bessel K0: x = x1·x2
   • Rayleigh: x = √(x1² + x2²)
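A sampling sketch of the three children from unit-variance Gaussians; the moment values quoted in the comments follow from the definitions.

```python
import numpy as np

# Chi-square as a sum of squares, Bessel K0 as a product, Rayleigh as the
# root of a sum of squares, all built from two unit-variance Gaussians.
rng = np.random.default_rng(4)
x1, x2 = rng.normal(size=(2, 1_000_000))

chi2 = x1**2 + x2**2                 # chi-square, 2 degrees of freedom
prod = x1 * x2                       # Bessel K0
rayl = np.sqrt(x1**2 + x2**2)        # Rayleigh

print("chi2  mean/var:", chi2.mean(), chi2.var())   # ~2, ~4
print("K0    mean/var:", prod.mean(), prod.var())   # ~0, ~1 = sigma1^2*sigma2^2
print("Rayl  mean    :", rayl.mean())               # ~sqrt(pi/2) = 1.2533
```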

  10. Spectral measure(1) and estimation
   (1) Engineers call it Power Spectral Density (PSD)

  11. The spectral measure for a stationary random process x(t)
   Autocovariance, with µ = E{x} (improperly referred to as the correlation and denoted R_xx(τ)):
   $$C(\tau) = \mathbb{E}\big\{[x(t)-\mu]\,[x(t-\tau)-\mu]^*\big\}$$
   Spectral measure (two-sided):
   $$S(\omega) = \mathcal{F}\{C(\tau)\} = \int_{-\infty}^{\infty} C(\tau)\, e^{-i\omega\tau}\, d\tau$$
   For an ergodic process, interchange ensemble and time average (process x(t) –> realization x(t)):
   $$C(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} [x(t)-\mu]\,[x(t-\tau)-\mu]^*\, dt$$
   Wiener-Khinchin theorem, for stationary ergodic processes (a numerical check follows below):
   $$S(\omega) = \lim_{T\to\infty} \frac{1}{T}\, X_T(\omega)\, X_T^*(\omega) = \lim_{T\to\infty} \frac{1}{T}\, |X_T(\omega)|^2$$
   In experiments we use the single-sided PSD: S^I(f) = 2 S^II(ω/2π), f > 0.
   Reminder: the Fourier transform is F{ξ} = ∫ ξ(t) e^{−iωt} dt, and the (normalized) autocorrelation function is R_xx(τ) = (1/σ²) E{[x(t)−µ][x(t−τ)−µ]}.
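A numerical sanity check of the Wiener-Khinchin relation, using an MA(1) process whose autocovariance is known in closed form (the 2-tap filter is an illustrative choice): the averaged periodogram (1/T)|X_T|² should match the Fourier transform of C(τ).

```python
import numpy as np

# White noise through a 2-tap moving average: C(0) = 0.5, C(+-1) = 0.25,
# so S(f) = C(0) + 2*C(1)*cos(2*pi*f) = 0.5 + 0.5*cos(2*pi*f).
rng = np.random.default_rng(5)
trials, N = 2000, 512
w = rng.normal(size=(trials, N + 1))
x = 0.5 * (w[:, 1:] + w[:, :-1])              # MA(1) realizations

periodogram = (np.abs(np.fft.fft(x, axis=1))**2 / N).mean(axis=0)

f = np.arange(N) / N                          # FFT frequency grid
S_theory = 0.5 + 0.5 * np.cos(2 * np.pi * f)  # F{C(tau)}

# Small residual (a few percent): finite-sample scatter and edge effects.
print("max |estimate - theory|:", np.max(np.abs(periodogram - S_theory)))
```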

  12. Sum of random variables
   1. The sum of Gaussian-distributed random variables has a Gaussian PDF.
   2. The central limit theorem states that, for large m, the PDF of the sum of m statistically independent processes tends to a Gaussian distribution. Let X = X1 + X2 + … + Xm be the sum of m processes of mean µ1, µ2, … µm and variance σ1², σ2², … σm². The process X has a Gaussian PDF, expectation E{X} = µ1 + µ2 + … + µm, and variance σ² = σ1² + σ2² + … + σm².
   3. Similarly, the average ⟨X⟩m = (X1 + X2 + … + Xm)/m has a Gaussian PDF, E{⟨X⟩m} = (µ1 + µ2 + … + µm)/m, and σ² = (σ1² + σ2² + … + σm²)/m² (see the numerical check below).
   4. Since white noise and flicker noise arise from the sum of a large number of small-scale phenomena, they are Gaussian distributed.
   PDF = Probability Density Function
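As referenced in point 3 above, a quick numerical check of the 1/m law for the variance of the average (unit-variance Gaussian processes; m = 16 is an arbitrary choice):

```python
import numpy as np

# Averaging m independent variables divides the summed variance by m^2,
# i.e. the variance of the mean is sigma^2/m when all variances are equal.
rng = np.random.default_rng(6)
m, trials = 16, 200000
X = rng.normal(0.0, 1.0, size=(trials, m))    # m processes, unit variance each

avg = X.mean(axis=1)                          # <X>_m
print("var of average:", avg.var())           # ~1/m = 0.0625
print("expected      :", 1.0 / m)
```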

  13. Product of independent zero-mean Gaussian-distributed random variables (quick)
   x1 and x2 are normally distributed with zero mean and variances σ1², σ2². The product x = x1·x2 has the Bessel-K0 distribution
   $$f(x) = \frac{1}{\pi\sigma}\, K_0\!\left(\frac{|x|}{\sigma}\right), \qquad \sigma = \sigma_1\sigma_2$$
   with E{x} = 0 and variance E{|x − E{x}|²} = σ² = σ1²σ2².
   Thanks to the central limit theorem, the average ⟨X⟩m = (X1 + X2 + … + Xm)/m of m products has (checked below)
   • a Gaussian PDF,
   • average E{⟨X⟩m} = 0,
   • variance V{⟨X⟩m} = σ²/m.
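A sketch of these statements (σ1, σ2, and m are arbitrary choices): single products should show variance σ1²σ2² and heavy tails (the K0 kurtosis is 9), while the m-average approaches a Gaussian with variance σ²/m.

```python
import numpy as np

# Product of zero-mean Gaussians: K0-distributed singles, Gaussian averages.
rng = np.random.default_rng(7)
s1, s2, m, trials = 1.5, 0.8, 64, 100000
x1 = rng.normal(0.0, s1, size=(trials, m))
x2 = rng.normal(0.0, s2, size=(trials, m))

prod = x1 * x2
print("product var      :", prod.var(), " theory:", (s1 * s2)**2)

avg = prod.mean(axis=1)                       # <X>_m
print("average var      :", avg.var(), " theory:", (s1 * s2)**2 / m)
# Sample kurtosis: ~9 for single products, ~3 (Gaussian-like) after averaging.
print("average kurtosis :", ((avg - avg.mean())**4).mean() / avg.var()**2)
```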

  14. Spectral measure Sxx(f) (Power Spectral Density)
   X is white Gaussian noise. Take one frequency, S(f) –> S; the same applies to all frequencies.
   Spectrum:
   $$\langle S_{xx}\rangle_m = \frac{1}{T}\,\langle X X^*\rangle_m = \frac{1}{T}\,\big\langle (X' + iX'')(X' - iX'')\big\rangle_m = \frac{1}{T}\,\big\langle (X')^2 + (X'')^2\big\rangle_m$$
   X' and X" are white, Gaussian, avg = 0, var = 1/2; hence ⟨Sxx⟩m is white, χ² distributed with 2m degrees of freedom, avg = 1, var = 1/m (simulated below).
   The Sxx track on the FFT-SA shrinks as dev/avg = 1/√m.
   Normalization: in 1 Hz bandwidth, var{X} = 1 and var{X'} = var{X"} = 1/2.
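A direct simulation of these statistics, assuming the slide's normalization var{X'} = var{X"} = 1/2 per bin and taking T = 1:

```python
import numpy as np

# <Sxx>_m = <(X')^2 + (X")^2>_m per bin: chi-square with 2m degrees of
# freedom, normalized to avg = 1 and var = 1/m.
rng = np.random.default_rng(8)
m, bins = 32, 100000
Xp = rng.normal(0.0, np.sqrt(0.5), size=(bins, m))   # X', var 1/2
Xq = rng.normal(0.0, np.sqrt(0.5), size=(bins, m))   # X", var 1/2

Sxx_m = (Xp**2 + Xq**2).mean(axis=1)                 # <Sxx>_m, one value per bin
print("avg:", Sxx_m.mean(), " (expect 1)")
print("var:", Sxx_m.var(),  " (expect 1/m =", 1.0 / m, ")")
```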

  15. Estimation of |Sxx(f)|
   [Figure: twelve log-scale plots of |Sxx| vs frequency for m = 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024; the scatter of the estimate shrinks as m grows.]
   Running the measurement, m increases and the scatter of Sxx shrinks => better confidence level.

  16. Cross spectrum theory: getting close to the real game
   [Block diagram, as in slide 2: the DUT noise c(t) enters both instrument A (instrument noise a(t), output x = a + c) and instrument B (instrument noise b(t), output y = b + c); a dual-channel FFT analyzer measures both channels.]

  17. Syx with the correlated term (1)
   A, B = instrument background, C = DUT noise. Channel 1: X = A + C; channel 2: Y = B + C.
   A, B, C are independent Gaussian noises; their Re{ } and Im{ } parts are independent Gaussian noises.
   Normalization: in 1 Hz bandwidth, var{A} = var{B} = 1 and var{C} = κ²; hence var{A'} = var{A"} = var{B'} = var{B"} = 1/2 and var{C'} = var{C"} = κ²/2.
   Cross-spectrum:
   $$\langle S_{yx}\rangle_m = \frac{1}{T}\,\langle Y X^*\rangle_m = \frac{1}{T}\,\big\langle (Y' + iY'')(X' - iX'')\big\rangle_m$$
   Expand using X = (A' + iA") + (C' + iC") and Y = (B' + iB") + (C' + iC"), and split Syx into three sets:
   $$\langle S_{yx}\rangle_m = \langle S_{yx}\rangle_m^{\mathrm{instr}} + \langle S_{yx}\rangle_m^{\mathrm{mixed}} + \langle S_{yx}\rangle_m^{\mathrm{DUT}}$$
   (instrument background only; background and DUT noise; DUT noise only) … and work it out!!! (A numerical sketch follows below.)
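Working it out numerically, a sketch with the slide's normalization (κ² and the number of bins are illustrative choices): the instrument and mixed terms of ⟨Syx⟩m average toward zero roughly as 1/√m, while the DUT term converges to κ².

```python
import numpy as np

# Cross spectrum with a correlated DUT term: X = A + C, Y = B + C, with
# var{A} = var{B} = 1 and var{C} = kappa^2 per bin (T = 1).
rng = np.random.default_rng(9)
kappa2, bins = 0.01, 100                       # weak DUT, 100 frequency bins

def gauss_spectrum(var, size, rng):
    """Complex Gaussian FFT bins: independent Re/Im, each with var/2."""
    return (rng.normal(0, np.sqrt(var / 2), size)
            + 1j * rng.normal(0, np.sqrt(var / 2), size))

for m in (1, 100, 10000):
    A = gauss_spectrum(1.0,    (bins, m), rng)  # background, channel 1
    B = gauss_spectrum(1.0,    (bins, m), rng)  # background, channel 2
    C = gauss_spectrum(kappa2, (bins, m), rng)  # DUT noise, common
    X, Y = A + C, B + C
    Syx_m = (Y * np.conj(X)).mean(axis=1)       # <Syx>_m, one value per bin
    print(f"m={m:6d}  mean|<Syx>_m| = {np.abs(Syx_m).mean():.5f}"
          f"   (DUT level kappa^2 = {kappa2})")
```

With m = 1 the background dominates; as m grows, the reading drops toward the DUT level κ², which is the rejection mechanism the talk builds on.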
