

1. Random Eigenvalue Problems in Structural Dynamics
Sondipon Adhikari
Department of Aerospace Engineering, University of Bristol, Bristol, U.K.
Email: S.Adhikari@bristol.ac.uk
URL: http://www.aer.bris.ac.uk/contact/academic/adhikari/home.html

2. Outline of the talk
- Random eigenvalue problem
- Perturbation methods: mean-centered perturbation method, α-centered perturbation method
- Asymptotic analysis
- Numerical example
- Conclusions & open problems

3. Random eigenvalue problem
The random eigenvalue problem of undamped or proportionally damped linear systems:
K(x) φ_j = λ_j M(x) φ_j.  (1)
Here λ_j are the eigenvalues and φ_j the eigenvectors; M(x) ∈ R^{N×N} is the mass matrix and K(x) ∈ R^{N×N} the stiffness matrix. x ∈ R^m is the random parameter vector with pdf
p(x) = (2π)^{−m/2} e^{−x^T x / 2}.  (2)
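To make Eq. (1) concrete, the sketch below draws one realisation of x ~ N(0, I) and solves the generalized eigenproblem numerically. The two-DOF system from the numerical example later in the talk is used as a stand-in, and its spring topology (k_1 and k_2 to ground, k_3 coupling the masses) is an assumption, since the slide figure is not reproduced here.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Stand-in system: the two-DOF example from the numerical-example slide.
# The topology (k1, k2 to ground, k3 between the masses) is assumed.
m1, m2 = 1.0, 1.5
k1_bar, k2_bar, k3 = 1000.0, 1100.0, 100.0
eps1 = eps2 = 0.25
M = np.diag([m1, m2])

def K(x):
    """Random stiffness matrix K(x); x is the standard Gaussian parameter vector."""
    k1 = k1_bar * (1.0 + eps1 * x[0])
    k2 = k2_bar * (1.0 + eps2 * x[1])
    return np.array([[k1 + k3, -k3],
                     [-k3,     k2 + k3]])

x = rng.standard_normal(2)      # one realisation of x ~ N(0, I_m), Eq. (2)
lam, Phi = eigh(K(x), M)        # solves K(x) phi_j = lambda_j M(x) phi_j, Eq. (1)
print("eigenvalues:", lam)      # lambda_1 <= lambda_2
print("eigenvectors (columns):\n", Phi)
```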

4. The fundamental aim
To obtain the joint probability density function of the eigenvalues and the eigenvectors. If the matrix M^{−1}K belongs to the GUE (Gaussian unitary ensemble) or the GOE (Gaussian orthogonal ensemble), an exact closed-form expression can be obtained for the joint pdf of the eigenvalues. In general, however, the system matrices of real structures are not GUE or GOE.

5. Mean-centered perturbation method
Assume that M(0) = M_0 and K(0) = K_0 are the 'deterministic parts' (in general different from the mean matrices). The deterministic eigenvalue problem is K_0 φ_j0 = λ_j0 M_0 φ_j0. The eigenvalues λ_j(x): R^m → R are non-linear functions of x. Here λ_j(x) is replaced by its Taylor series about the point x = 0:
λ_j(x) ≈ λ_j(0) + d_λj(0)^T x + (1/2) x^T D_λj(0) x.  (3)
d_λj(0) ∈ R^m and D_λj(0) ∈ R^{m×m} are respectively the gradient vector and the Hessian matrix of λ_j(x) evaluated at x = 0.
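A minimal sketch of the mean-centered expansion (3), assuming the gradient and Hessian are approximated by central finite differences rather than the analytical derivatives used in the talk; the later two-DOF example (with an assumed spring topology) serves as λ_j(x).

```python
import numpy as np
from scipy.linalg import eigh

# Two-DOF stand-in for lambda_j(x) (assumed topology, parameters from slide 14).
M = np.diag([1.0, 1.5])
def lam(x, j=0):
    k1 = 1000.0 * (1.0 + 0.25 * x[0])
    k2 = 1100.0 * (1.0 + 0.25 * x[1])
    K = np.array([[k1 + 100.0, -100.0], [-100.0, k2 + 100.0]])
    return eigh(K, M, eigvals_only=True)[j]

def grad_hess(f, x0, h=1e-3):
    """Central finite-difference gradient d and Hessian D of f at x0."""
    m = len(x0)
    d = np.zeros(m)
    D = np.zeros((m, m))
    for k in range(m):
        ek = np.zeros(m); ek[k] = h
        d[k] = (f(x0 + ek) - f(x0 - ek)) / (2 * h)
        for l in range(m):
            el = np.zeros(m); el[l] = h
            D[k, l] = (f(x0 + ek + el) - f(x0 + ek - el)
                       - f(x0 - ek + el) + f(x0 - ek - el)) / (4 * h * h)
    return d, D

x0 = np.zeros(2)
lam0 = lam(x0)
d0, D0 = grad_hess(lam, x0)

def lam_taylor(x):
    """Mean-centered second-order approximation, Eq. (3)."""
    return lam0 + d0 @ x + 0.5 * x @ D0 @ x

x = np.array([0.3, -0.2])
print(lam(x), lam_taylor(x))   # exact eigenvalue vs. second-order perturbation
```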

6. α-centered perturbation method
We are looking for a point x = α in the x-space such that the Taylor series expansion of λ_j(x) about this point,
λ_j(x) ≈ λ_j(α) + d_λj(α)^T (x − α) + (1/2)(x − α)^T D_λj(α)(x − α),  (4)
is optimal in some sense. The optimal point α is selected such that the mean, or first moment, of each eigenvalue is calculated most accurately.

7. α-centered perturbation method
The mean of λ_j(x) can be obtained as
λ̄_j = ∫_{R^m} λ_j(x) p(x) dx = (2π)^{−m/2} ∫_{R^m} e^{−h(x)} dx  (5)
where h(x) = x^T x / 2 − ln λ_j(x).  (6)
Expand the function h(x) in a Taylor series about the point where h(x) attains its global minimum; by doing so the error in evaluating the integral (5) is minimized. Therefore the optimal point satisfies
∂h(x)/∂x_k = 0, or x_k = (1/λ_j(x)) ∂λ_j(x)/∂x_k, ∀k.  (7)

8. α-centered perturbation method
Combining (7) for all k we have d_λj(α) = λ_j(α) α. Rearranging,
α = d_λj(α) / λ_j(α).  (8)
This equation immediately gives a recipe for an iterative algorithm to obtain α (see the sketch below). Substituting d_λj(α) in Eq. (4),
λ_j(x) ≈ λ_j(α)(1 − |α|^2) + (1/2) α^T D_λj(α) α + α^T [λ_j(α) I − D_λj(α)] x + (1/2) x^T D_λj(α) x.  (9)
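Eq. (8) translates directly into a fixed-point iteration. The sketch below assumes a finite-difference gradient and reuses the two-DOF test system; neither detail comes from the slides.

```python
import numpy as np
from scipy.linalg import eigh

# Two-DOF stand-in for lambda_j(x) (assumed topology, parameters from slide 14).
M = np.diag([1.0, 1.5])
def lam(x, j=0):
    k1 = 1000.0 * (1.0 + 0.25 * x[0])
    k2 = 1100.0 * (1.0 + 0.25 * x[1])
    K = np.array([[k1 + 100.0, -100.0], [-100.0, k2 + 100.0]])
    return eigh(K, M, eigvals_only=True)[j]

def grad(f, x, h=1e-6):
    """Central finite-difference gradient of f at x."""
    g = np.zeros(len(x))
    for k in range(len(x)):
        e = np.zeros(len(x)); e[k] = h
        g[k] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def optimal_alpha(j=0, tol=1e-10, max_iter=100):
    """Fixed-point iteration alpha <- d_lambda_j(alpha) / lambda_j(alpha), Eq. (8)."""
    f = lambda x: lam(x, j)
    alpha = np.zeros(2)                  # start from the mean point x = 0
    for _ in range(max_iter):
        alpha_new = grad(f, alpha) / f(alpha)
        if np.linalg.norm(alpha_new - alpha) < tol:
            break
        alpha = alpha_new
    return alpha_new

print(optimal_alpha(j=0))   # optimal expansion point for the first eigenvalue
```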

9. Eigenvalue statistics using the theory of quadratic forms
Both approximations yield a quadratic form in Gaussian random variables:
λ_j(x) ≈ c_j + a_j^T x + (1/2) x^T A_j x.
The moment generating function:
M_λj(s) = E[e^{s λ_j(x)}] ≈ e^{s c_j + (s^2/2) a_j^T [I − s A_j]^{−1} a_j} |I − s A_j|^{−1/2}.  (10)
Cumulants:
κ_r = c_j + (1/2) Trace(A_j)  if r = 1,
κ_r = (r!/2) a_j^T A_j^{r−2} a_j + ((r−1)!/2) Trace(A_j^r)  if r ≥ 2.  (11)
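Eq. (11) can be evaluated with a few lines of linear algebra. In the sketch below the values of c_j, a_j and A_j are placeholders only; in practice they would come from the mean-centered expansion (3) or the α-centered expansion (9).

```python
import numpy as np
from math import factorial

def cumulants(c, a, A, r_max=4):
    """Cumulants kappa_r of lambda ~ c + a^T x + (1/2) x^T A x, x ~ N(0, I), Eq. (11)."""
    kappa = []
    for r in range(1, r_max + 1):
        if r == 1:
            kappa.append(c + 0.5 * np.trace(A))
        else:
            A_rm2 = np.linalg.matrix_power(A, r - 2)
            A_r = np.linalg.matrix_power(A, r)
            kappa.append(factorial(r) / 2 * a @ A_rm2 @ a
                         + factorial(r - 1) / 2 * np.trace(A_r))
    return np.array(kappa)

# Illustrative values (placeholders, not taken from the talk):
c = 700.0
a = np.array([250.0, 40.0])
A = np.array([[5.0, -2.0], [-2.0, 8.0]])
print(cumulants(c, a, A))   # kappa_1 ... kappa_4
```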

10. Asymptotic analysis
We want to evaluate an integral of the following form:
J = ∫_{R^m} f(x) p(x) dx = (2π)^{−m/2} ∫_{R^m} e^{ĥ(x)} dx  (12)
where ĥ(x) = ln f(x) − x^T x / 2.  (13)
Assume f(x): R^m → R is smooth and at least twice differentiable, and that ĥ(x) reaches its global maximum at a unique point θ ∈ R^m. Therefore, at x = θ,
∂ĥ(x)/∂x_k = 0, or x_k = ∂ ln f(x)/∂x_k, ∀k, or θ = ∂ ln f(θ)/∂x.  (14)

11. Asymptotic analysis
Further assume that ĥ(θ) is so large that
|(1/ĥ(θ)) D^(j)(ĥ(θ))| → 0 for j > 2,  (15)
where D^(j)(ĥ(θ)) is the jth-order derivative of ĥ(x) evaluated at x = θ. Under such assumptions, using a second-order Taylor series of ĥ(x), the integral (12) can be evaluated as
J ≈ e^{ĥ(θ)} ‖−H(θ)‖^{−1/2} = f(θ) e^{−θ^T θ / 2} ‖−H(θ)‖^{−1/2},  (16)
where H(θ) is the Hessian matrix of ĥ(x) at x = θ.
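A sketch of the Laplace-type approximation (16) for a generic positive f, assuming θ is located numerically with scipy.optimize and the Hessian is approximated by finite differences (the talk presumably uses analytical derivatives). For f(x) = e^{a^T x} the exact value E[e^{a^T x}] = e^{|a|^2/2} is recovered, since ĥ is then exactly quadratic.

```python
import numpy as np
from scipy.optimize import minimize

def laplace_integral(f, m):
    """Asymptotic approximation of J = int f(x) p(x) dx over R^m, Eq. (16).

    f must be positive and smooth; p is the standard Gaussian density.
    """
    h_hat = lambda x: np.log(f(x)) - 0.5 * x @ x          # Eq. (13)
    theta = minimize(lambda x: -h_hat(x), np.zeros(m)).x  # theta maximises h_hat

    # Finite-difference Hessian of h_hat at theta (assumed unavailable analytically)
    eps = 1e-4
    H = np.zeros((m, m))
    for k in range(m):
        for l in range(m):
            ek = np.zeros(m); ek[k] = eps
            el = np.zeros(m); el[l] = eps
            H[k, l] = (h_hat(theta + ek + el) - h_hat(theta + ek - el)
                       - h_hat(theta - ek + el) + h_hat(theta - ek - el)) / (4 * eps**2)

    return f(theta) * np.exp(-0.5 * theta @ theta) / np.sqrt(np.linalg.det(-H))

# Sanity check against a known answer: E[exp(a^T x)] = exp(|a|^2 / 2).
a = np.array([0.3, -0.5])
print(laplace_integral(lambda x: np.exp(a @ x), 2), np.exp(0.5 * a @ a))
```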

12. Asymptotic analysis
An arbitrary rth-order raw moment of the eigenvalues:
µ'_r = ∫_{R^m} λ_j^r(x) p(x) dx,  r = 1, 2, 3, …  (17)
Comparing this with Eq. (12) it is clear that
f(x) = λ_j^r(x) and ĥ(x) = r ln λ_j(x) − x^T x / 2.  (18)
The optimal point θ can be obtained from (14) as
θ = r d_λj(θ) / λ_j(θ).  (19)

13. Asymptotic analysis
Using the asymptotic approximation, the rth raw moment:
µ'_r ≈ λ_j^r(θ) e^{−|θ|^2 / 2} ‖ I + (1/r) θθ^T − (r/λ_j(θ)) D_λj(θ) ‖^{−1/2}.  (20)
The mean of the eigenvalues (by substituting r = 1):
λ̄_j ≈ λ_j(θ) e^{−|θ|^2 / 2} ‖ I + θθ^T − D_λj(θ)/λ_j(θ) ‖^{−1/2}.  (21)
Central moments: E[(λ_j − λ̄_j)^r] = Σ_{k=0}^{r} (r choose k) (−1)^{r−k} µ'_k λ̄_j^{r−k}.
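Putting Eqs. (19)–(21) together: the sketch below iterates Eq. (19) to the optimal point θ, evaluates the asymptotic raw moments (20) with finite-difference derivatives, and converts them to central moments with the binomial identity above. The two-DOF system (with an assumed topology) again stands in for λ_j(x); the plain fixed-point iteration is an assumption and is not guaranteed to converge for every problem.

```python
import numpy as np
from math import comb
from scipy.linalg import eigh

M = np.diag([1.0, 1.5])
def lam(x, j=0):
    k1 = 1000.0 * (1.0 + 0.25 * x[0])
    k2 = 1100.0 * (1.0 + 0.25 * x[1])
    K = np.array([[k1 + 100.0, -100.0], [-100.0, k2 + 100.0]])
    return eigh(K, M, eigvals_only=True)[j]

def grad_hess(f, x0, h=1e-3):
    """Central finite-difference gradient and Hessian of f at x0."""
    m = len(x0)
    d = np.zeros(m); D = np.zeros((m, m))
    for k in range(m):
        ek = np.zeros(m); ek[k] = h
        d[k] = (f(x0 + ek) - f(x0 - ek)) / (2 * h)
        for l in range(m):
            el = np.zeros(m); el[l] = h
            D[k, l] = (f(x0 + ek + el) - f(x0 + ek - el)
                       - f(x0 - ek + el) + f(x0 - ek - el)) / (4 * h * h)
    return d, D

def raw_moment(r, j=0, n_iter=50):
    """Asymptotic r-th raw moment of lambda_j, Eqs. (19)-(20)."""
    f = lambda x: lam(x, j)
    theta = np.zeros(2)
    for _ in range(n_iter):                 # fixed point of Eq. (19)
        d, _ = grad_hess(f, theta)
        theta = r * d / f(theta)
    d, D = grad_hess(f, theta)
    B = np.eye(2) + np.outer(theta, theta) / r - r * D / f(theta)
    return f(theta) ** r * np.exp(-0.5 * theta @ theta) / np.sqrt(np.linalg.det(B))

mu = [1.0] + [raw_moment(r) for r in range(1, 5)]   # mu[0] = 1 by convention
mean = mu[1]
central = [sum(comb(r, k) * (-1) ** (r - k) * mu[k] * mean ** (r - k)
               for k in range(r + 1)) for r in range(2, 5)]
print("mean:", mean)
print("central moments (variance, 3rd, 4th):", central)
```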

14. Numerical example
Undamped two degree-of-freedom system: m_1 = 1 kg, m_2 = 1.5 kg, k̄_1 = 1000 N/m, k̄_2 = 1100 N/m and k_3 = 100 N/m.
[Figure: two degree-of-freedom spring–mass system with masses m_1, m_2 and springs k_1, k_2, k_3.]
Only the stiffness parameters k_1 and k_2 are uncertain: k_i = k̄_i (1 + ε_i x_i), i = 1, 2, with x = {x_1, x_2}^T ∈ R^2 and the 'strength parameters' ε_1 = ε_2 = 0.25.
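A sketch of the Monte Carlo benchmark for this example, assuming the usual configuration in which k_1 and k_2 tie the masses to ground and k_3 couples them (the figure is not reproduced here, so the topology is an assumption).

```python
import numpy as np
from scipy.linalg import eigh

# Two-DOF example (slide 14); spring topology assumed as described above.
m1, m2 = 1.0, 1.5
k1_bar, k2_bar, k3 = 1000.0, 1100.0, 100.0
eps1 = eps2 = 0.25
M = np.diag([m1, m2])

def eigvals(x):
    """Both eigenvalues of K(x) phi = lambda M phi for one parameter sample x."""
    k1 = k1_bar * (1.0 + eps1 * x[0])
    k2 = k2_bar * (1.0 + eps2 * x[1])
    K = np.array([[k1 + k3, -k3], [-k3, k2 + k3]])
    return eigh(K, M, eigvals_only=True)

rng = np.random.default_rng(1)
samples = np.array([eigvals(x) for x in rng.standard_normal((10_000, 2))])

# Raw moments of each eigenvalue from Monte Carlo -- the benchmark of slide 15.
for r in range(1, 5):
    print(f"E[lambda^{r}] =", (samples ** r).mean(axis=0))
```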

15. Numerical example
The following six methods are compared:
1. Mean-centered first-order perturbation
2. Mean-centered second-order perturbation
3. α-centered first-order perturbation
4. α-centered second-order perturbation
5. Asymptotic method
6. Monte Carlo simulation (10,000 samples), which can be considered the benchmark.
The percentage error:
Error_{ith method} = ({µ'_k}_{ith method} − {µ'_k}_{MCS}) / {µ'_k}_{MCS} × 100.
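The error measure on this slide is a one-line helper; the moment estimates fed to it would come from the sketches above, and the numbers in the example call are illustrative only, not the values behind the plots in the talk.

```python
import numpy as np

def percentage_error(mu_method, mu_mcs):
    """Percentage error of raw-moment estimates with respect to the MCS benchmark."""
    return (np.asarray(mu_method) - np.asarray(mu_mcs)) / np.asarray(mu_mcs) * 100.0

# Illustrative placeholder values:
print(percentage_error([705.0, 5.1e5], [712.0, 5.2e5]))
```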

16. Numerical example
[Figure: percentage error (with respect to MCS) for the first four raw moments E[λ_1^k] of the first eigenvalue, comparing the mean-centered 1st/2nd-order, α-centered 1st/2nd-order and asymptotic methods.]

17. Numerical example
[Figure: percentage error (with respect to MCS) for the first four raw moments E[λ_2^k] of the second eigenvalue, comparing the mean-centered 1st/2nd-order, α-centered 1st/2nd-order and asymptotic methods.]

18. Numerical example
[Figure: probability density function p_{λ_1}(u) of the first eigenvalue obtained by the mean-centered 1st/2nd-order, α-centered 1st/2nd-order and asymptotic methods.]

19. Numerical example
[Figure: probability density function p_{λ_2}(u) of the second eigenvalue obtained by the mean-centered 1st/2nd-order, α-centered 1st/2nd-order and asymptotic methods.]

20. Conclusions
Two methods, namely (a) the optimal point expansion method and (b) the asymptotic moment method, are proposed. The optimal point is obtained so that the mean of each eigenvalue is estimated most accurately. The asymptotic method assumes that the eigenvalues are large compared to their third- and higher-order derivatives. The pdf of the eigenvalues is obtained in terms of central and non-central χ^2 densities.

21. Open problems
- Joint statistics (moments/pdf/cumulants) of the eigenvalues with non-Gaussian system parameters.
- Statistics of the difference and ratio of the eigenvalues.
- Statistics of a single eigenvector (for GUE/GOE and general matrices).
- Joint statistics of the eigenvectors.
- Joint statistics of the eigenvalues and eigenvectors.
