Lecture Eigen Vector



  • 1

    Lecture 2: Eigenvalue-Eigenvector and Orthogonal Matrix (Part B, Chapter 7 in “Advanced Eng. Math.” by E. Kreyszig)

    Definition: For a given square matrix A, if the equation AX = λX holds with λ a number and X a nonzero vector, then λ is an eigenvalue and X an eigenvector of A. Both λ and X are derived from A; they represent certain characteristics of A. The definition shows that multiplying square matrix A by its eigenvector only changes the eigenvector's magnitude, by a proportionality factor of λ (the eigenvalue). Calculation:

    Two steps:

    (1) Find λ from |A − λI| = 0. From AX = λX ⇒ AX = λIX ⇒ (A − λI)X = O. If this homogeneous LES is to have a nontrivial solution X, Cramer's rule requires |A − λI| = 0.

    Example:

        A = [ -2   2  -3 ]             | -2−λ    2    -3 |
            [  2   1  -6 ],  |A − λI| = |  2    1−λ   -6 | = 0
            [ -1  -2   0 ]             | -1    -2    −λ |

    Expanding the determinant gives λ³ + λ² − 21λ − 45 = 0. Try λ = 5: 125 + 25 − 105 − 45 = 0, so (λ − 5) is a factor and

        λ³ + λ² − 21λ − 45 = (λ − 5)(λ + 3)² = 0  ⇒  λ1 = 5, λ2 = λ3 = −3

    Roots of a polynomial: the number of roots equals the polynomial order, and the product of all roots equals the polynomial constant (up to sign; here 5·(−3)·(−3) = 45).

    (2) Find eigenvector X from the homogeneous LES (A − λI)X = O with the calculated λ, using Gauss Elimination (GE).

    a) For λ = 5 (after a row exchange):

        A − 5I = [ -1  -2  -5 ]    GE     [ 1  2  5 ]
                 [  2  -4  -6 ]  ======>  [ 0  1  2 ]
                 [ -7   2  -3 ]           [ 0  0  0 ]

    so x2 + 2x3 = 0 and x1 + 2x2 + 5x3 = 0. Take x1 = c1; then x2 = 2c1 and x3 = −c1:

        X1 = c1 (1, 2, −1)ᵀ

    Notice: there are 2 equations involving the 3 unknown components of the eigenvector (x1, x2, x3), so 1 independent component x1 can take any value, c1, and the other components, x2 and x3, take the values corresponding to c1. In general, for a LES with n unknowns, after Gauss Elimination we have r = rank(A − λI) as the number of independent equations in the LES; then (n − r) equals the number of independent unknowns. The eigenvector needs to be represented as a linear combination of (n − r) independent vectors. b) For λ = −3 (the double root):

        A − (−3)I = A + 3I = [  1   2  -3 ]    GE     [ 1  2  -3 ]
                             [  2   4  -6 ]  ======>  [ 0  0   0 ]
                             [ -1  -2   3 ]           [ 0  0   0 ]

    so the single remaining equation is x1 + 2x2 − 3x3 = 0, and n − r = 3 − 1 = 2. If we take x2, x3 as the independent components in the eigenvector, then x1 = −2x2 + 3x3. Usually take (x2, x3) = (1, 0) and (0, 1):

        X2 = (−2, 1, 0)ᵀ and X3 = (3, 0, 1)ᵀ,  so  X = k2 (−2, 1, 0)ᵀ + k3 (3, 0, 1)ᵀ

    Note: you can use any two independent vectors as (x2, x3), then calculate x1 and obtain the eigenvector X.
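    The eigenvalues and eigenvectors computed by hand above can be verified numerically. This is a quick sketch (not part of the lecture), assuming NumPy is available:

```python
import numpy as np

# Matrix from the lecture example.
A = np.array([[-2.0,  2.0, -3.0],
              [ 2.0,  1.0, -6.0],
              [-1.0, -2.0,  0.0]])

# Characteristic polynomial |A - lambda*I| = 0 has roots 5, -3, -3.
eigvals = np.linalg.eigvals(A)
print(np.sort(eigvals.real))            # eigenvalues -3 (double) and 5

# Check the hand-computed eigenvectors: A X = lambda X.
X1 = np.array([ 1.0, 2.0, -1.0])        # for lambda = 5
X2 = np.array([-2.0, 1.0,  0.0])        # for lambda = -3
X3 = np.array([ 3.0, 0.0,  1.0])        # for lambda = -3
print(np.allclose(A @ X1, 5 * X1))      # True
print(np.allclose(A @ X2, -3 * X2))     # True
print(np.allclose(A @ X3, -3 * X3))     # True
```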

  • 2

    Rules: 1. The values of λ equal the roots of the characteristic polynomial derived from |A − λI| = 0. If the order of the polynomial is k, you have k real and/or complex roots; they may be distinct or multiple. A square matrix An×n can have at most n distinct roots (eigenvalues). 2. All eigenvectors corresponding to distinct eigenvalues are independent.

    Example for a 2×2 matrix:

        A = [ 5  3 ],  |A − λI| = (5 − λ)² − 9 = 0  ⇒  5 − λ = ±3  ⇒  λ1 = 8, λ2 = 2
            [ 3  5 ]

    For λ1 = 8:

        [ 5−8   3  ] X = [ -3   3 ] X = O  ⇒  x1 = x2  ⇒  X1 = (1, 1)ᵀ
        [  3   5−8 ]     [  3  -3 ]

    For λ2 = 2:

        [ 5−2   3  ] X = [ 3  3 ] X = O  ⇒  x1 = −x2  ⇒  X2 = (1, −1)ᵀ
        [  3   5−2 ]     [ 3  3 ]

    X1 and X2 are independent.

    3. Aᵀ has the same eigenvalues as A. (Example eigenvalues and eigenvectors for typical 2-D square matrices are collected in the Illustration below.) Applications
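    A quick numerical check of the 2×2 example and of Rule 3 (a sketch added here, assuming NumPy is available):

```python
import numpy as np

# Matrix from the 2x2 example above.
A = np.array([[5.0, 3.0],
              [3.0, 5.0]])

# Roots of (5 - lambda)^2 - 9 = 0 are 8 and 2.
lam = np.sort(np.linalg.eigvals(A).real)
print(lam)                                               # approximately [2. 8.]

# Hand-computed eigenvectors X1 = (1, 1) and X2 = (1, -1).
print(np.allclose(A @ [1.0, 1.0], 8 * np.array([1.0, 1.0])))     # True
print(np.allclose(A @ [1.0, -1.0], 2 * np.array([1.0, -1.0])))   # True

# Rule 3: A^T has the same eigenvalues as A.
print(np.allclose(lam, np.sort(np.linalg.eigvals(A.T).real)))    # True
```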

    1. Principal Directions in vector transformation (by multiplying a matrix). A circular elastic membrane in the x1-x2 plane with boundary equation x1² + x2² = 1 is stretched so that any point P(x1, x2) on the boundary goes to point Q(y1, y2) by

        Y = AX:  [ y1 ] = [ 5  3 ] [ x1 ]
                 [ y2 ]   [ 3  5 ] [ x2 ]

    In a Principal Direction, point P moves to Q in the same direction, i.e. AX = λX:

        [ 5  3 ] X = λX  ⇒  (5 − λ)² − 9 = 0  ⇒  λ1 = 8, λ2 = 2
        [ 3  5 ]

    Eigenvectors:

        λ1 = 8: [ 5−8   3  ] X = O  ⇒  −3x1 + 3x2 = 0, 1 independent  ⇒  x1 = x2 = c1  ⇒  X1 = c1 (1, 1)ᵀ
                [  3   5−8 ]

        λ2 = 2: [ 5−2   3  ] X = O  ⇒  3x1 + 3x2 = 0, 1 independent  ⇒  x1 = −x2 = d1  ⇒  X2 = d1 (1, −1)ᵀ
                [  3   5−2 ]

    Illustration — eigenvalues and eigenvectors of typical 2×2 matrices:

    • Shear [1 0; 1 1]: λ² − 2λ + 1 = 0, λ1 = λ2 = 1; eigenvector (0, 1)ᵀ
    • Scalar [k 0; 0 k]: λ² − 2kλ + k² = 0, λ1 = λ2 = k; eigenvectors (1, 0)ᵀ and (0, 1)ᵀ
    • Diagonal [k1 0; 0 k2]: (λ − k1)(λ − k2) = 0, λ1 = k1, λ2 = k2; eigenvectors (1, 0)ᵀ and (0, 1)ᵀ
    • Rotation [cosθ −sinθ; sinθ cosθ]: λ² − 2λ·cosθ + 1 = 0, λ1,2 = cosθ ± i·sinθ; eigenvectors (1, −i)ᵀ and (1, i)ᵀ
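    Two of the table's entries, the defective shear matrix and the rotation matrix with its complex eigenpairs, can be spot-checked numerically (a sketch assuming NumPy; the angle value is my choice):

```python
import numpy as np

# Shear [1 0; 1 1]: double eigenvalue 1, single eigenvector (0, 1).
S = np.array([[1.0, 0.0],
              [1.0, 1.0]])
print(np.linalg.eigvals(S))                  # both eigenvalues equal 1
print(np.allclose(S @ [0.0, 1.0], [0.0, 1.0]))   # True

# Rotation: eigenvalues cos(theta) +- i*sin(theta), eigenvectors (1, -i), (1, i).
theta = 0.3                                  # arbitrary test angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
lam = np.linalg.eigvals(R)
expected = np.array([np.cos(theta) + 1j * np.sin(theta),
                     np.cos(theta) - 1j * np.sin(theta)])
print(np.allclose(np.sort_complex(lam), np.sort_complex(expected)))  # True
v = np.array([1.0, -1j])
print(np.allclose(R @ v, (np.cos(theta) + 1j * np.sin(theta)) * v))  # True
```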

  • 3

    2. Discrete-Time Markov Chain. A random process with state vector Xn×1, characterized as memoryless: the next state x(t+1) depends only on the current state x(t), not on the sequence of events that preceded it:

        P(x(t+1) | x(t), x(t−1), …, x(0)) = P(x(t+1) | x(t)),  x(t+1) = A x(t)

    with, for example, the transition matrix

        A = [ 0.8  0.1  0.1 ]
            [ 0.1  0.7  0.2 ]
            [ 0    0.1  0.9 ]

    For x(t) = (1, 0, 1)ᵀ:

        x(t+1) = A x(t) = (0.9, 0.3, 0.9)ᵀ

    where a_jk is the probability of transferring from x_k(t) to x_j(t+1). For instance x1(t+1) = 0.9 collects 0.8 from x1(t), 0.1·0 = 0 from x2(t), and 0.1 from x3(t).
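    One step of this Markov chain is just a matrix-vector product; a minimal sketch assuming NumPy:

```python
import numpy as np

# Transition matrix from the lecture example.
A = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.7, 0.2],
              [0.0, 0.1, 0.9]])

x_t = np.array([1.0, 0.0, 1.0])   # current state vector x(t)
x_next = A @ x_t                  # next state: x(t+1) = A x(t)
print(x_next)                     # approximately [0.9 0.3 0.9]

# First component: 0.8*1 + 0.1*0 + 0.1*1 = 0.9, as in the text.
```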

    3. Vibration analysis

    Two masses, m1 and m2, hang on two springs (k1 and k2) as shown. After a small disturbance, the system will vibrate around its force-balanced positions y1 = 0 and y2 = 0. Calculate the displacements of m1 and m2: y1(t) and y2(t).

    From Newton's 2nd law and Hooke's law:

        m1 d²y1/dt² = −k1 y1 + k2 (y2 − y1) = −(k1 + k2) y1 + k2 y2
        m2 d²y2/dt² = −k2 (y2 − y1) = k2 y1 − k2 y2

    With m1 = m2 = 1, k1 = 3, k2 = 2 (so k1 + k2 = 5):

        d²/dt² [ y1 ] = [ -5   2 ] [ y1 ]
               [ y2 ]   [  2  -2 ] [ y2 ]

    If y1 = x1 e^(ωt) and y2 = x2 e^(ωt), then d²y1/dt² = ω² x1 e^(ωt) and d²y2/dt² = ω² x2 e^(ωt), and the system becomes

        ω² [ x1 ] e^(ωt) = [ -5   2 ] [ x1 ] e^(ωt)   ⇒   A X = ω² X
           [ x2 ]          [  2  -2 ] [ x2 ]

    i.e. an eigenvalue problem with eigenvalue λ = ω² and eigenvector X = (x1, x2)ᵀ.

    [Figure: masses m1 = 1 and m2 = 1 hanging on springs k1 = 3 and k2 = 2, with displacements y1 and y2 measured from the equilibrium positions y1 = 0 and y2 = 0.]

  • 4

        A = [ -5   2 ],  |A − λI| = (−5 − λ)(−2 − λ) − 4 = λ² + 7λ + 6 = (λ + 1)(λ + 6) = 0  ⇒  λ1 = −1, λ2 = −6
            [  2  -2 ]

    For λ1 = −1:

        (A + I) X = [ -4   2 ] X = O  ⇒  x2 = 2x1  ⇒  X1 = (1, 2)ᵀ
                    [  2  -1 ]

        ω² = −1  ⇒  ω = ±i  ⇒  y1 = x1 e^(±it) = c1 e^(±it),  y2 = x2 e^(±it) = 2c1 e^(±it)

    For λ2 = −6:

        (A + 6I) X = [ 1  2 ] X = O  ⇒  x1 = −2x2  ⇒  X2 = (2, −1)ᵀ
                     [ 2  4 ]

        ω² = −6  ⇒  ω = ±i√6  ⇒  y = c2 (2, −1)ᵀ e^(±i√6 t)
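    The mode computation reduces to the eigenproblem of the coefficient matrix, which can be checked numerically (a sketch assuming NumPy):

```python
import numpy as np

# Coefficient matrix of y'' = A y for m1 = m2 = 1, k1 = 3, k2 = 2.
A = np.array([[-5.0,  2.0],
              [ 2.0, -2.0]])

lam, X = np.linalg.eig(A)
print(np.sort(lam))               # approximately [-6. -1.]

# omega^2 = lambda, so the angular frequencies are sqrt(-lambda) = sqrt(6) and 1.
print(np.sqrt(-np.sort(lam)))     # approximately [2.449 1.]

# Hand-computed mode shapes: (1, 2) for lambda = -1, (2, -1) for lambda = -6.
print(np.allclose(A @ [1.0, 2.0], -1 * np.array([1.0, 2.0])))    # True
print(np.allclose(A @ [2.0, -1.0], -6 * np.array([2.0, -1.0])))  # True
```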

    The system can vibrate in different modes! 2. Diagonalization of a square matrix An×n with n independent eigenvectors (distinct eigenvalues). Diagonalization means transforming a non-diagonal matrix into an equivalent diagonal matrix, which is simple to operate on. If matrix An×n has distinct eigenvalues, it must have independent eigenvectors. Form a new matrix P whose column vectors are these independent eigenvectors (P is called the modal matrix of A). It can be shown that P⁻¹ exists (|P| ≠ 0) and the matrix product P⁻¹AP = D. For this diagonal matrix D the diagonal elements are the eigenvalues. The order of the eigenvalues in D is the same as the order of the column vectors in P:

        P⁻¹ A P = D = [ λ1  0   ...  0  ]
                      [ 0   λ2  ...  0  ]
                      [ ...             ]
                      [ 0   0   ...  λn ]

    Example: Given

        A = [ 2  3 ]   with λ1 = 5, X1 = (1, 1)ᵀ and λ2 = −1, X2 = (1, −1)ᵀ:
            [ 3  2 ]

        P = (X1 X2) = [ 1   1 ],  P⁻¹ = (1/2) [ 1   1 ]
                      [ 1  -1 ]               [ 1  -1 ]

        D = P⁻¹AP = (1/2) [ 1   1 ] [ 2  3 ] [ 1   1 ] = [ 5   0 ]
                          [ 1  -1 ] [ 3  2 ] [ 1  -1 ]   [ 0  -1 ]

    Change the eigenvector order in P:

        P = (X2 X1) = [  1  1 ]  ⇒  D = P⁻¹AP = [ -1  0 ]
                      [ -1  1 ]                 [  0  5 ]

    Example: Given

        A = [ 5  4 ]   with λ1 = 6, X1 = (4, 1)ᵀ and λ2 = 1, X2 = (1, −1)ᵀ:
            [ 1  2 ]

        P = [ 4   1 ],  P⁻¹ = (1/5) [ 1   1 ],  D = P⁻¹AP = [ 6  0 ]
            [ 1  -1 ]               [ 1  -4 ]               [ 0  1 ]

    This works because A P = A (X1 X2) = (λ1X1  λ2X2) = P D, so P⁻¹AP = D.
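    Diagonalization through the modal matrix, including the column-order effect, is easy to verify (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[5.0, 4.0],
              [1.0, 2.0]])
P = np.array([[4.0,  1.0],    # columns are the eigenvectors (4, 1) and (1, -1)
              [1.0, -1.0]])

D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag([6.0, 1.0])))    # True

# Swapping the eigenvector columns swaps the order of the eigenvalues in D.
P2 = P[:, ::-1]
D2 = np.linalg.inv(P2) @ A @ P2
print(np.allclose(D2, np.diag([1.0, 6.0])))   # True
```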

    We can retrieve A from D and P: P D P⁻¹ = P (P⁻¹AP) P⁻¹ = (P P⁻¹) A (P P⁻¹) = I A I = A, i.e. A = P D P⁻¹. 3. Matrix Power calculation using A = P D P⁻¹:

        A² = (P D P⁻¹)(P D P⁻¹) = P D (P⁻¹P) D P⁻¹ = P D² P⁻¹,  and in general  Aᵏ = P Dᵏ P⁻¹.

    Example: Given

        A = [ 2  3 ],  P = [ 1   1 ],  P⁻¹ = (1/2) [ 1   1 ],  D = [ 5   0 ]
            [ 3  2 ]       [ 1  -1 ]               [ 1  -1 ]      [ 0  -1 ]

    then

        A³ = P D³ P⁻¹ = [ 1   1 ] [ 125   0 ] (1/2) [ 1   1 ] = [ 62  63 ]
                        [ 1  -1 ] [  0   -1 ]       [ 1  -1 ]   [ 63  62 ]
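    The power shortcut only has to cube the diagonal entries; a sketch assuming NumPy, compared against direct matrix multiplication:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 2.0]])
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])
D = np.diag([5.0, -1.0])

# A^3 = P D^3 P^-1 -- only the diagonal entries need cubing.
A3 = P @ np.diag(np.diag(D) ** 3) @ np.linalg.inv(P)
print(A3)                                              # approximately [[62 63], [63 62]]
print(np.allclose(A3, np.linalg.matrix_power(A, 3)))   # True
```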

  • 5

    4. Similar Matrix. Matrix C is formed from two given matrices A and B by C = B⁻¹AB; then C and A are similar matrices and the process is called a Similarity Transformation of A. Features of similar matrices: C and A share the same eigenvalues {λi}, and their eigenvectors have the relation X_C = B⁻¹ X_A.

    1) Given matrix A; test the similarity transformation. Let

        A = [ 5  4 ],  B = [ 1   2 ],  B⁻¹ = (1/5) [ 1   2 ]
            [ 1  2 ]       [ 2  -1 ]               [ 2  -1 ]

    Eigenvalues and eigenvectors of A:

        |A − λI| = (5 − λ)(2 − λ) − 4 = λ² − 7λ + 6 = (λ − 6)(λ − 1) = 0  ⇒  λ1 = 6, λ2 = 1

        λ1 = 6: [ 5−6   4  ] X = [ -1   4 ] X = O  ⇒  x1 = 4x2  ⇒  X1 = (4, 1)ᵀ
                [  1   2−6 ]     [  1  -4 ]

        λ2 = 1: [ 4  4 ] X = O  ⇒  x1 = −x2  ⇒  X2 = (1, −1)ᵀ
                [ 1  1 ]

    Similarity transformation:

        C = B⁻¹AB = (1/5) [ 1   2 ] [ 5  4 ] [ 1   2 ] = (1/5) [ 23   6 ]
                          [ 2  -1 ] [ 1  2 ] [ 2  -1 ]         [ 21  12 ]

    Eigenvalues of C:

        |C − λI| = (23/5 − λ)(12/5 − λ) − (6/5)(21/5) = λ² − 7λ + 6 = 0  ⇒  λ1 = 6, λ2 = 1

    the same as for A. Eigenvector of C for λ1 = 6:

        (C − 6I) X = (1/5) [ -7    6  ] X = O  ⇒  7x1 = 6x2  ⇒  X_C = (6, 7)ᵀ
                           [ 21   -18 ]

    which agrees with the relation X_C = B⁻¹ X_A:

        B⁻¹ X_A = (1/5) [ 1   2 ] [ 4 ] = (1/5) [ 6 ]  ∝  (6, 7)ᵀ.  Similarity!
                        [ 2  -1 ] [ 1 ]         [ 7 ]
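    The two similarity features, shared eigenvalues and the eigenvector relation X_C = B⁻¹X_A, can be confirmed numerically (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[5.0, 4.0],
              [1.0, 2.0]])
B = np.array([[1.0,  2.0],
              [2.0, -1.0]])

C = np.linalg.inv(B) @ A @ B
print(np.allclose(5 * C, [[23.0, 6.0], [21.0, 12.0]]))   # True: C = (1/5)[[23,6],[21,12]]

# Similar matrices share eigenvalues:
print(np.allclose(np.sort(np.linalg.eigvals(A).real),
                  np.sort(np.linalg.eigvals(C).real)))   # True (both {1, 6})

# ...and the eigenvectors are related by X_C = B^-1 X_A:
X_A = np.array([4.0, 1.0])            # eigenvector of A for lambda = 6
X_C = np.linalg.inv(B) @ X_A          # = (1/5)(6, 7)
print(np.allclose(C @ X_C, 6 * X_C))  # True
```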

    5. Orthogonal Matrix. Orthogonal vectors: two vectors A1×n and B1×n whose scalar (dot) product is zero: A·B = A1×n (B1×n)ᵀ = 0. Orthogonal vectors must be independent, but independent vectors may not be orthogonal. "Perpendicular" is limited to 3-D; "orthogonal" applies to any dimension. Orthogonal matrix: a square matrix whose column/row vectors {ai} are all orthogonal: aj·ak = 0 for j ≠ k. Orthonormal matrix (normalize the column/row vectors of the orthogonal matrix): a real square matrix is an orthonormal matrix iff its column/row vectors (a1, …, an) have:

  • 6

        aj · ak = 0 if j ≠ k  (different column/row vectors are orthogonal to each other)
        aj · ak = 1 if j = k  (all column/row vectors have length 1)

    Features of an orthonormal matrix: (1) Aᵀ = A⁻¹; (2) |A| = ±1; (3) eigenvalues satisfy |λ| = 1; and (4) the column/row vectors form a unit perpendicular coordinate system. Check a given orthonormal matrix A:

        AᵀA = AAᵀ = I

    for example (entry magnitudes 1/3 and 2/3 as in the original; the sign pattern shown is one that makes the rows orthonormal, the signs being illegible in the transcript):

        A = (1/3) [  2  1   2 ]
                  [ -2  2   1 ]
                  [  1  2  -2 ]

    Each row/column has length 1 (e.g. (4 + 1 + 4)/9 = 1) and distinct rows/columns have zero dot product, so AᵀA = AAᵀ = I.
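    The orthonormality test and the listed features can be checked in a few lines (a sketch assuming NumPy; the sign pattern of the matrix is a reconstruction, as noted above):

```python
import numpy as np

# 3x3 matrix with entries of magnitude 1/3 and 2/3; this sign pattern
# makes the rows orthonormal (the original signs are illegible).
A = (1.0 / 3.0) * np.array([[ 2.0, 1.0,  2.0],
                            [-2.0, 2.0,  1.0],
                            [ 1.0, 2.0, -2.0]])

print(np.allclose(A.T @ A, np.eye(3)))                 # True (A^T = A^-1)
print(np.allclose(A @ A.T, np.eye(3)))                 # True
print(np.isclose(abs(np.linalg.det(A)), 1.0))          # True (|A| = +-1)
print(np.allclose(np.abs(np.linalg.eigvals(A)), 1.0))  # True (|lambda| = 1)
```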

    Rule: A symmetric matrix A can construct an orthonormal matrix X using its normalized eigenvectors. Two steps: (1) calculate the eigenvectors; (2) normalize the eigenvectors.

    Example:

        A = [ 1  2 ],  |A − λI| = (1 − λ)(4 − λ) − 4 = λ² − 5λ = 0  ⇒  λ1 = 0, λ2 = 5
            [ 2  4 ]

        λ1 = 0: [ 1  2 ] X = O  ⇒  x1 + 2x2 = 0  ⇒  X1 = (2, −1)ᵀ,  |X1| = √5
                [ 2  4 ]

        λ2 = 5: [ 1−5   2  ] X = [ -4   2 ] X = O  ⇒  x2 = 2x1  ⇒  X2 = (1, 2)ᵀ,  |X2| = √5
                [  2   4−5 ]     [  2  -1 ]

    Tests: X1 · X2 = 2 − 2 = 0 (vector dot product 0: orthogonal). Normalizing,

        X = ( X1/|X1|   X2/|X2| ) = (1/√5) [  2  1 ]
                                           [ -1  2 ]

        XᵀX = (1/5) [ 2  -1 ] [  2  1 ] = [ 1  0 ] = I  — an orthonormal (orthogonal) matrix.
                    [ 1   2 ] [ -1  2 ]   [ 0  1 ]
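    The two-step rule can be reproduced with a symmetric eigensolver, which returns already-normalized eigenvectors (a sketch assuming NumPy):

```python
import numpy as np

# Symmetric matrix from the example above.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Steps 1 and 2: eigh returns normalized eigenvectors for a symmetric matrix.
lam, X = np.linalg.eigh(A)
print(lam)                                # eigenvalues 0 and 5 (up to rounding)

# X's columns are (2,-1)/sqrt(5) and (1,2)/sqrt(5) up to sign; X is orthonormal:
print(np.allclose(X.T @ X, np.eye(2)))    # True
print(np.allclose(X @ X.T, np.eye(2)))    # True
```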

    Specific matrices which can be transformed into an orthogonal matrix: 1) A square matrix An×n with n distinct eigenvalues can construct a matrix from its independent eigenvectors; the column or row vectors of this matrix form a basis for Rⁿ. 2) A symmetric matrix A can derive an orthonormal system from its eigenvectors and form an orthogonal matrix (by normalization) whose column/row vectors form a basis of Rⁿ (a Cartesian coordinate system).