

  1. Capacity and Beyond Erozan M. Kurtas

  2. Acknowledgement: M. Fatih Erden, Sami Iren, Dieter Arnold, Raman Venkataramani, Inci Ozgunes.

  3. Conventional System. A bit is stored by applying an external field H > H_c (magnetize up) or H < -H_c (magnetize down). The hysteresis loop is characterized by the remanent magnetization $M_r$, the coercivity $H_c$, and the loop slope $\mathrm{Slope} = M_r / \big((1 - S)\,H_c\big)$. One bit occupies an area of $a\,[\mathrm{inch}] \times w\,[\mathrm{inch}]$, so $\mathrm{Areal\ Density}\,[\mathrm{bits/inch^2}] = 1/(a\,w)$.
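
A quick numerical check of the areal-density relation above (a minimal sketch in Python; the bit-cell dimensions are made-up example values, not figures from the talk):

```python
# Areal density from the bit-cell dimensions; a and w are illustrative values.
a_inch = 1.0e-6   # bit length along the track [inch] (assumed)
w_inch = 5.0e-6   # track width [inch] (assumed)
areal_density = 1.0 / (a_inch * w_inch)    # [bits / inch^2]
print(f"{areal_density:.2e} bits/inch^2")  # 2.00e+11, i.e. 200 Gbit/inch^2
```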

  4. In reality there are tiny grains, each with finite volume V and anisotropy coefficient $K_u$. The minimum grain number per bit is fixed to preserve SNR, and the maximum attainable $K_u$ is limited by the maximum writer field. Thermal stability requires $K_u V / (k_B T) \geq$ a large number, like 60, where $k_B$ is the Boltzmann constant and T the temperature. This is the superparamagnetic limit: areal density is limited.
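
A minimal sketch of the thermal-stability criterion, assuming a spherical grain; all material and geometry numbers below are illustrative, not values from the talk:

```python
# Check the superparamagnetic-limit criterion K_u * V / (k_B * T) >= ~60.
import math

k_B = 1.380649e-23            # Boltzmann constant [J/K]
T   = 350.0                   # operating temperature [K] (assumed)
K_u = 3.0e5                   # anisotropy energy density [J/m^3] (assumed)
d   = 9.0e-9                  # grain diameter [m] (assumed)
V   = (math.pi / 6.0) * d**3  # spherical grain volume [m^3]

stability = K_u * V / (k_B * T)
print(f"K_u V / (k_B T) = {stability:.1f}  (want >= ~60)")
```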

  5. Dibit Responses Compared. Figure: dibit responses of the longitudinal model and the perpendicular model at normalized densities ND = 1.5, 2, 2.5, 3 (amplitude vs. t/T).

  6. Frequency Responses of the Dibit Responses. Figure: normalized amplitude vs. normalized frequency for the longitudinal and perpendicular models at ND = 1.5, 2, 2.5.
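
To make the dibit plots concrete, here is a hedged sketch that assumes the common analytic forms: a Lorentzian transition response for the longitudinal model and a tanh transition response for the perpendicular model. The slides do not state which forms were actually used, so the models and parameter values are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

T = 1.0                                   # bit interval
t = np.linspace(-10, 10, 4001) * T

def lorentzian(t, pw50):                  # longitudinal transition response (assumed model)
    return 1.0 / (1.0 + (2.0 * t / pw50) ** 2)

def tanh_step(t, pw50):                   # perpendicular transition response (assumed model)
    return np.tanh(np.log(3.0) * t / pw50)

def dibit(transition, nd):
    """Dibit response: +transition at t = 0 followed by -transition at t = T."""
    pw50 = nd * T                         # normalized density ND = PW50 / T
    return transition(t, pw50) - transition(t - T, pw50)

for nd in (1.5, 2.0, 2.5, 3.0):
    plt.plot(t / T, dibit(lorentzian, nd), label=f"longitudinal, ND={nd}")
    plt.plot(t / T, dibit(tanh_step, nd), "--", label=f"perpendicular, ND={nd}")
plt.xlabel("t/T"); plt.ylabel("amplitude"); plt.legend(); plt.show()
# The frequency responses on slide 6 would follow from np.abs(np.fft.rfft(...)) of each dibit.
```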

  7. Reality is Quite Different. The signal suffers from nonlinearities: NLTS, MR asymmetry, baseline wander, TAs (thermal asperities), and other distortions. The noise is non-Gaussian and signal dependent.

  8. Noise-Free Real Signals with Nonlinearities. Figure: measured waveform (9851 PRBS, ND = 2.1).

  9. Extracted Dibit Response. The extracted dibit provides information about the system's linear response and is a convenient means for identifying nonlinearities present in the system, which show up as echoes around the main pulse.

  10. Volterra Model of a Readback Signal. The readback is the sum of a linear response (first-order kernel), second-order nonlinear responses, third-order nonlinear responses, and so on:
      $$ y(t) = \sum_k a_k\, C^{(1)}(t - kT) + \sum_k a_k a_{k-1}\, C^{(2)}_{1}(t - kT) + \sum_k a_k a_{k-2}\, C^{(2)}_{2}(t - kT) + \cdots + \sum_k a_k a_{k-1} a_{k-2}\, C^{(3)}_{1,2}(t - kT) + \sum_k a_k a_{k-1} a_{k-3}\, C^{(3)}_{1,3}(t - kT) + \cdots $$
      For memory length L, the VM can be characterized by $2^{L-1}$ kernels $C^{(l)}(t)$ of orders $l = 1, \ldots, L$. For magnetic recording, only a few of the kernels are significant. A rule of thumb: L ~ the extent of the dipulse (in units of the bit interval T).

  11. VM kernels can be conveniently identified from measured PRBS signals. All patterns of length L = 9 and their signal chips (the all -1 pattern does not occur in the PRBS and has no signal chip):

      Pattern   a-4  a-3  a-2  a-1  a0   a1   a2   a3   a4
        1         1    1    1    1    1    1    1    1    1
        2         1    1    1    1    1    1    1    1   -1
        3         1    1    1    1    1    1    1   -1    1
        4         1    1    1    1    1    1   -1    1   -1
        5         1    1    1    1    1   -1    1   -1    1
        6         1    1    1    1   -1    1   -1    1   -1
        7         1    1    1   -1    1   -1    1   -1   -1
       ...
       506       -1    1    1   -1    1   -1    1    1    1
       507        1    1   -1    1   -1    1    1    1    1
       508        1   -1    1   -1    1    1    1    1    1
       509       -1    1   -1    1    1    1    1    1    1
       510        1   -1    1    1    1    1    1    1    1
       511       -1    1    1    1    1    1    1    1    1
       512       -1   -1   -1   -1   -1   -1   -1   -1   -1   (no signal chip)

      Figure: signal chips, amplitude (volts) vs. sample number.
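
One common way to identify the kernels from a measured PRBS response is to exploit the near-orthogonality of a +-1 PRBS and its delayed bit products: each kernel sample can then be estimated by correlating the bit-synchronous readback with the corresponding bit product. The sketch below assumes one readback sample per bit and ideal PRBS properties; the function and its arguments are mine, not the procedure actually used in the talk.

```python
import numpy as np

def estimate_kernel(y, a, lags, span):
    """Estimate one Volterra kernel by correlating readback y with a bit product.

    y    : readback samples, one per bit (length N, treated as cyclic)
    a    : PRBS bits in {-1, +1} (length N)
    lags : () for the linear kernel C1, (1,) for C2_1, (2,) for C2_2, (1, 2) for C3_{1,2}, ...
    span : number of kernel samples returned on each side of the origin
    """
    N = len(a)
    probe = a.astype(float).copy()
    for lag in lags:
        probe *= np.roll(a, lag)                         # a_k * a_{k-lag} * ...
    return np.array([np.dot(np.roll(probe, s), y) / N    # ~ C(sT)
                     for s in range(-span, span + 1)])

# e.g. for a period-511 PRBS: c1 = estimate_kernel(y, a, (), 10); c21 = estimate_kernel(y, a, (1,), 10)
```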

  12. Identification of Kernels. Figure: estimated kernel amplitudes (volts, x10^-3 scale) vs. sample number for the candidates $C^{(2)}_1$, $C^{(2)}_2$, $C^{(2)}_3$, $C^{(3)}_{1,2}$, $C^{(3)}_{1,3}$, $C^{(3)}_{2,3}$, with a threshold line. There are only three kernels above the threshold; we declare them significant and include them in the Reduced Complexity Volterra Model.

  13. Volterra Model Block Diagram. The data bits $a_k \in \{-1, +1\}$ feed a linear part, which convolves them with $C^{(1)}(t)$, and, through delay elements T, a nonlinear-distortion part in which the bit products $a_k a_{k-1}$, $a_k a_{k-2}$, and $a_k a_{k-1} a_{k-2}$ are filtered by $C^{(2)}_1(t)$, $C^{(2)}_2(t)$, and $C^{(3)}_{1,2}(t)$ to form $y^{(2)}(t)$ and $y^{(3)}(t)$. The readback $y(t)$ is the sum of the linear response and the nonlinear distortion $y_{nl}(t)$.
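
As a rough illustration of the block diagram (and of the reduced-complexity idea from slide 12), the following sketch synthesizes a readback waveform from a small set of kernels. The function name, the kernel dictionary layout, and the example kernels are placeholders of my own, not measured kernels.

```python
import numpy as np

def volterra_readback(bits, kernels, samples_per_bit):
    """Reduced-complexity Volterra synthesis.

    bits    : array of +-1 data bits
    kernels : dict mapping a lag tuple to a sampled kernel, e.g.
              () -> C1(t), (1,) -> C2_1(t), (1, 2) -> C3_{1,2}(t)
    """
    n = len(bits) * samples_per_bit
    y = np.zeros(n + max(len(c) for c in kernels.values()))
    for lags, c in kernels.items():
        drive = bits.astype(float).copy()
        for lag in lags:
            drive *= np.roll(bits, lag)               # a_k * a_{k-lag} * ...
        impulses = np.zeros(n)
        impulses[::samples_per_bit] = drive           # one weighted impulse per bit cell
        y[:n + len(c) - 1] += np.convolve(impulses, c)
    return y[:n]

# usage with made-up kernels:
# rng = np.random.default_rng(0); bits = rng.choice([-1, 1], 200)
# t = np.arange(-20, 21) / 8.0
# c1 = 1 / (1 + (2 * t / 2.5) ** 2)                   # stand-in linear kernel
# y = volterra_readback(bits, {(): c1, (1,): 0.05 * np.gradient(c1)}, samples_per_bit=8)
```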

  14. How Good is VM?

  15. Longitudinal Media Noise and Signal. Figure: synchronous noise samples and the noise-free signal, volts vs. sample number (1 Gsample/s).

  16. Longitudinal: Media Noise Voltage. Figure: histogram of the synchronous noise voltage (occurrences vs. noise voltage) and the corresponding normal probability plot.

  17. Data-Dependent AR Model. White Gaussian noise $w_k \sim N(0, 1)$ is scaled by $\sigma(\beta)$ and passed through an AR noise filter with coefficients $b_1(\beta), \ldots, b_L(\beta)$, where both the scale and the coefficients are looked up from the local data pattern $\beta$: $n_k = \sigma(\beta)\, w_k + \sum_{l=1}^{L} b_l(\beta)\, n_{k-l}$. The noisy readback is $z_k = y(\alpha) + n_k$, where the signal sample $y(\alpha)$ is likewise looked up from the data pattern $\alpha$.
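
A minimal sketch of the data-dependent AR noise generator in the diagram. The lookup tables and the choice of pattern window are illustrative assumptions (the real model's window length and fitted coefficients are not given here).

```python
import numpy as np

def dd_ar_noise(bits, sigma_lut, b_lut, L, seed=0):
    """Generate noise n_k = sigma(beta) w_k + sum_l b_l(beta) n_{k-l}, with w_k ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    n = np.zeros(len(bits))
    for k in range(len(bits)):
        beta = tuple(bits[max(0, k - L + 1): k + 1])   # local data pattern (assumed window)
        b = b_lut.get(beta, [0.0] * L)                 # AR coefficients b_1 ... b_L
        ar = sum(b[i] * n[k - 1 - i] for i in range(L) if k - 1 - i >= 0)
        n[k] = sigma_lut.get(beta, 0.01) * rng.standard_normal() + ar
    return n

# The noisy readback would then be z_k = y(alpha_k) + n_k, with the signal sample
# y(alpha_k) looked up from the data pattern as in the diagram.
```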

  18. Perpendicular Signal and Media Noise.

  19. Media Noise, 400 MHz.

  20. As ND Increases: Noise Variance vs. Pattern.

  21. AR noise generation: not a great fit.

  22. Perpendicular: Media Noise Voltage. Figure: histogram of the synchronous noise voltage and the corresponding normal probability plot. The noise distribution does not fit a Gaussian; it looks more like a Laplacian.

  23. Generic Storage Channel. k data bits enter the encoder, which adds redundancy (code rate R = k/n) to make the signal robust; the n coded bits pass through the modulator and write head onto the medium. On readback, the read-head output goes through the demodulator and equalizer to the decoder, which uses the redundancy to recover the data bits. The capacity of this channel depends on the SNR and on the normalized density PW50/T.

  24. Brief Survey of the Literature. Channel model: an FIR (ISI) channel with binary inputs $X_k \in \{-1, +1\}$, output $Y_k$, and additive noise $Z_k \sim \mathcal{N}(0, N_0/2)$.
      – Bounds: Shamai et al. 1991, McLaughlin and Neuhoff 1993, and many others.
      – Direct computation: Hirt 1988.
      – Markov-chain Monte-Carlo method: Arnold and Loeliger, Pfister et al., Sharma and Singh, Vontobel, all in 2001.
      – Shamai-Laroia conjecture: Shamai and Laroia 1996, Dholakia et al. 2000, Arnold and Eleftheriou 2002.
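
The Markov-chain Monte-Carlo approach cited above (Arnold and Loeliger 2001 and the related 2001 papers) estimates the information rate of an ISI channel by running one long simulated output sequence through the forward sum-product recursion on the channel trellis. Below is a minimal sketch of that idea for an FIR channel with AWGN and i.u.d. binary inputs; it is my own paraphrase of the method, and the channel taps, noise level, and sequence length in the example are illustrative.

```python
import numpy as np
from itertools import product

def info_rate_iud(h, sigma, n=100_000, seed=1):
    """Estimate the i.u.d. information rate (bits/channel use) of
    y_k = sum_i h[i] x_{k-i} + N(0, sigma^2), with x_k uniform on {-1, +1}."""
    rng = np.random.default_rng(seed)
    m = len(h) - 1                                     # channel memory (>= 1 assumed)
    states = list(product((-1, 1), repeat=m))          # state = (x_{k-1}, ..., x_{k-m})
    index = {s: i for i, s in enumerate(states)}
    x = rng.choice([-1, 1], size=n + m)
    y = np.convolve(x, h)[m:n + m] + sigma * rng.standard_normal(n)

    alpha = np.full(len(states), 1.0 / len(states))    # forward state metrics
    log_py = 0.0                                       # accumulates log p(y_1 ... y_n)
    for k in range(n):
        new = np.zeros(len(states))
        for s, a in zip(states, alpha):
            for xk in (-1, 1):
                mean = h[0] * xk + float(np.dot(h[1:], s))
                lik = np.exp(-(y[k] - mean) ** 2 / (2 * sigma ** 2))
                new[index[(xk,) + s[:-1]]] += a * 0.5 * lik
        scale = new.sum()
        log_py += np.log(scale) - 0.5 * np.log(2 * np.pi * sigma ** 2)
        alpha = new / scale
    h_y = -log_py / (n * np.log(2))                    # entropy-rate estimate of Y [bits]
    h_y_given_x = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)
    return h_y - h_y_given_x

# illustrative call: normalized dicode channel (1 - D)/sqrt(2), noise std 0.5
# print(info_rate_iud(np.array([1.0, -1.0]) / np.sqrt(2), sigma=0.5, n=50_000))
```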

  25. Brief Survey of the Literature (cont.). Kavcic 2001; Zhang, Duman, and Kurtas 2002: modeling signal-dependent noise.

  26. Set-up for Mismatch Lower Bounds. The input $X_k$ is drawn from a source law $Q(\cdot)$, the true channel $W(\cdot \mid \cdot)$ produces the output $Y_k$, and $M(\cdot \mid \cdot)$ is an auxiliary (mismatched) channel law. Mismatch lower bound: $I(Q, M) \equiv E_{QW}[\log M(Y \mid X)] - E_{QW}[\log (QM)(Y)]$, and $I(Q, W) \geq I(Q, M)$. A. Ganti, A. Lapidoth, and I. E. Telatar, "Mismatched Decoding Revisited: General Alphabets, Channels with Memory, and the Wide-Band Limit," IEEE Trans. on Inform. Theory, pp. 2315-2328, Nov. 2000.

  27. How to model the noise? Figure: histogram of the synchronous noise voltage and the corresponding normal probability plot.

  28. Just use histograms. The auxiliary channel law $M(\cdot \mid \cdot)$ is built from measured histograms of the output $Y_k$ (noise $Z_k$), collected separately per branch, e.g. (0, 0) and (0, 1).

  29. Validation on the Ideal (1-D)-Channel. The input $X_k$ passes through the (1-D) channel to give $V_k$, and AWGN $N_k$ is added to give $Y_k$; memory L = 1, b = 1, 3, 5.
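
To make slides 28-30 concrete, here is a small sketch (my own, with illustrative parameters) that simulates the (1-D) channel with AWGN and collects a quantized, normalized histogram of the output for each trellis branch (x_{k-1}, x_k). Per-branch histograms of this kind are exactly the empirical law that can be plugged in as the auxiliary channel M.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, n_bins = 200_000, 0.3, 64                        # illustrative sizes
x = rng.choice([-1, 1], size=n)
y = (x[1:] - x[:-1]) + sigma * rng.standard_normal(n - 1)  # (1-D) output plus AWGN

edges = np.linspace(y.min(), y.max(), n_bins + 1)
branch_pmf = {}
for prev, cur in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    mask = (x[:-1] == prev) & (x[1:] == cur)
    counts, _ = np.histogram(y[mask], bins=edges)
    branch_pmf[(prev, cur)] = (counts + 1) / (counts.sum() + n_bins)  # smoothed pmf per branch
```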

  30. (1-D)-Channel Histograms (dotted for low SNR, solid for high). Figure: per-branch histograms for L = 1, b = 3, for the +1 and -1 branches at low and high SNR.

  31. Model the channel as a GPR or FSM. 1. Capture the influence of the neighboring bits by means of a state $s_k \equiv (x_{k-m}, x_{k-(m-1)}, \ldots, x_k, x_{k+1}, \ldots, x_{k+m})$, with $2m + 1 = L$. 2. Sort over time by means of a trellis of size $|S| = 2^{2m}$.
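
A minimal sketch of the state/branch bookkeeping described above, assuming the state is the 2m-bit overlap between consecutive (2m+1)-bit windows; this is my reading of the slide, so the exact state convention in the original may differ.

```python
from itertools import product

def build_trellis(m):
    """Enumerate states and branches for a window of L = 2m + 1 bits per branch."""
    L = 2 * m + 1
    states = list(product((-1, 1), repeat=2 * m))        # |S| = 2^(2m)
    # a branch carries the full L-bit pattern; its end states are the two 2m-bit overlaps
    branches = [(pattern[:-1], pattern[1:], pattern)
                for pattern in product((-1, 1), repeat=L)]
    return states, branches

states, branches = build_trellis(m=1)
print(len(states), len(branches))    # 4 states and 8 branches for m = 1 (L = 3)
```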

  32. Use Quantized Histograms per Branch for the Noise. 3. Represent the noise pdfs per branch: each trellis branch carries a quantized histogram (Pr vs. amplitude) describing the noise samples $z_{k-1}, z_k, z_{k+1}, \ldots$
