Eigenvalue

In [[linear algebra]] an '''eigenvalue''' of a (square) [[matrix]] <math>A</math> is a number <math>\lambda</math> that satisfies the eigenvalue equation,
:<math>\text{det}(A-\lambda I)=0\ ,</math>
where <math>I</math> is the [[identity matrix]] of the same [[dimension]] as <math>A</math> and <math>\lambda</math> in general can be [[complex number|complex]].  The origin of this equation is the eigenvalue problem, which is to find the eigenvalues and associated [[eigenvector]]s of <math>A</math>.
That is, to find a number <math>\lambda</math> and a vector <math>\vec{v}</math> that together satisfy
:<math>A\vec{v}=\lambda\vec{v}\ .</math>
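The eigenvalue problem can be sketched numerically (a minimal sketch assuming Python with NumPy, neither of which appears in the article; the matrix <math>A</math> below is a hypothetical example). For a 2&times;2 matrix the determinant condition above reduces to the quadratic characteristic polynomial <math>\lambda^2-\text{tr}(A)\lambda+\text{det}(A)=0</math>, whose roots should agree with the eigenvalues found by a general-purpose routine:

```python
import numpy as np

# Illustrative 2x2 matrix (an assumption, not taken from the article).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix, det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A),
# so the eigenvalues are the roots of this characteristic polynomial.
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(char_poly))

# Cross-check against NumPy's general eigenvalue routine.
eigvals = np.sort(np.linalg.eigvals(A))

print(roots)    # both approaches give the eigenvalues 1 and 3
print(eigvals)
```

Both computations return the same pair of eigenvalues, as the determinant equation predicts.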
This equation says that even though <math>A</math> is a matrix, its action on <math>\vec{v}</math> is the same as multiplying <math>\vec{v}</math> by the number <math>\lambda</math>.  Note that for an arbitrary vector this will generally ''not'' be true.  This is most easily seen with a quick example.  Suppose
:<math>A=\begin{pmatrix}a_{11} & a_{12} \\ a_{21} & a_{22}\end{pmatrix}</math> and <math>\vec{v}=\begin{pmatrix} v_1 \\ v_2 \end{pmatrix}\ .</math>
Then their [[matrix multiplication|matrix product]] is
:<math>A\vec{v}=\begin{pmatrix}a_{11} & a_{12} \\ a_{21} & a_{22}\end{pmatrix}\begin{pmatrix} v_1 \\ v_2 \end{pmatrix}
=\begin{pmatrix}a_{11}v_1+a_{12}v_2 \\ a_{21}v_1+a_{22}v_2 \end{pmatrix}</math>
whereas
:<math>\lambda\vec{v}=\begin{pmatrix} \lambda v_1 \\ \lambda v_2 \end{pmatrix}\ .</math>
These two results are equal only when <math>a_{11}v_1+a_{12}v_2=\lambda v_1</math> and <math>a_{21}v_1+a_{22}v_2=\lambda v_2</math> hold simultaneously, which happens only for special choices of <math>\vec{v}</math>, namely the eigenvectors.
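The contrast above can be checked directly (again a sketch assuming Python with NumPy; the matrix and vectors are hypothetical examples, not from the article): applying <math>A</math> to a generic vector changes its direction, while applying it to an eigenvector only rescales it.

```python
import numpy as np

# Same illustrative 2x2 matrix as an assumption; any square matrix works.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# A generic vector: A @ v is NOT a scalar multiple of v.
v = np.array([1.0, 0.0])
print(A @ v)    # [2. 1.], which is not proportional to [1. 0.]

# An eigenvector of A (for eigenvalue 3): here A @ w equals 3 * w.
w = np.array([1.0, 1.0])
print(A @ w)    # [3. 3.]
print(3.0 * w)  # [3. 3.]
```

Only for the eigenvector <math>\vec{w}</math> does the matrix act like multiplication by a single number.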

Revision as of 16:41, 3 October 2007
