In linear algebra an eigenvalue of a (square) matrix $A$ is a number $\lambda$ that satisfies the eigenvalue equation,

$$\det(A-\lambda I)=0\ ,$$

where $I$ is the identity matrix of the same dimension as $A$, and in general $\lambda$ can be complex.
The origin of this equation is the eigenvalue problem, which is to find the eigenvalues and associated eigenvectors of $A$. That is, to find a number $\lambda$ and a vector $\vec{v}$ that together satisfy

$$A\vec{v}=\lambda\vec{v}\ .$$
What this equation says is that even though $A$ is a matrix, its action on $\vec{v}$ is the same as multiplying it by the number $\lambda$. This means that the vector $A\vec{v}$ and the vector $\vec{v}$ are parallel (or anti-parallel if $\lambda$ is negative).
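As a quick numerical check, here is a minimal sketch using NumPy (the matrix below is an arbitrary illustrative choice, not one fixed by this article): `numpy.linalg.eig` returns the eigenvalues and eigenvectors, and for each eigenpair $A\vec{v}$ really does equal $\lambda\vec{v}$.

```python
import numpy as np

# An arbitrary 2x2 matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # A v and lam * v should coincide, i.e. the two vectors are parallel.
    print(lam, np.allclose(A @ v, lam * v))  # prints each eigenvalue and True
```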
Note that generally this will not be true. This is most easily seen with a quick example. Suppose

$$A=\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix}\qquad\text{and}\qquad\vec{v}=\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}\ .$$

Then their matrix product is

$$A\vec{v}=\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix}\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}=\begin{pmatrix}a_{11}v_{1}+a_{12}v_{2}\\a_{21}v_{1}+a_{22}v_{2}\end{pmatrix}$$

whereas the scalar product is

$$\lambda\vec{v}=\begin{pmatrix}\lambda v_{1}\\\lambda v_{2}\end{pmatrix}\ .$$
Obviously then $A\vec{v}\neq\lambda\vec{v}$ unless $a_{11}v_{1}+a_{12}v_{2}=\lambda v_{1}$ and simultaneously $a_{21}v_{1}+a_{22}v_{2}=\lambda v_{2}$, and it is easy to pick numbers for the entries of $A$ and $\vec{v}$ such that this cannot happen for any value of $\lambda$.
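To make this concrete (again a sketch with arbitrarily chosen numbers), compare the action of the same illustrative matrix on a generic vector, where no $\lambda$ works, with its action on one of its eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w = np.array([1.0, 0.0])  # a generic vector, not an eigenvector of A
v = np.array([1.0, 1.0])  # an eigenvector of A with eigenvalue 3

print(A @ w)  # [2. 1.] -- not a multiple of [1. 0.], so no lambda gives A w = lambda w
print(A @ v)  # [3. 3.] -- exactly 3 * v, so A v = 3 v
```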
## The eigenvalue equation
So where did the eigenvalue equation $\det(A-\lambda I)=0$ come from? Well, we assume that we know the matrix $A$ and want to find a number $\lambda$ and a non-zero vector $\vec{v}$ so that $A\vec{v}=\lambda\vec{v}$. (Note that if $\vec{v}=\vec{0}$ then the equation is always true, and therefore uninteresting.) So now we have $A\vec{v}-\lambda\vec{v}=\vec{0}$. It doesn't make sense to subtract a number from a matrix, but we can factor out the vector if we first multiply the right-hand term by the identity, giving us

$$(A-\lambda I)\vec{v}=\vec{0}\ .$$
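We can verify this numerically for the illustrative matrix used above (a sketch, with the eigenpair taken from the earlier example): subtracting $\lambda I$ from $A$ produces a matrix that sends the eigenvector to the zero vector.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                 # an eigenvalue of A
v = np.array([1.0, 1.0])  # the corresponding eigenvector

# (A - lambda*I) applied to v gives the zero vector.
print((A - lam * np.eye(2)) @ v)  # [0. 0.]
```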
Now we have to remember the fact that $A-\lambda I$ is a square matrix, and so it might be invertible. If it were invertible then we could simply multiply on the left by its inverse to get

$$\vec{v}=(A-\lambda I)^{-1}\vec{0}=\vec{0}\ ,$$

but we have already said that $\vec{v}$ can't be the zero vector! The only way around this is if $A-\lambda I$ is in fact non-invertible. It can be shown that a square matrix is non-invertible if and only if its determinant is zero. That is, we require

$$\det(A-\lambda I)=0\ ,$$

which is the eigenvalue equation stated above.
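For a $2\times 2$ matrix, $\det(A-\lambda I)=0$ is just a quadratic in $\lambda$ (the characteristic polynomial $\lambda^{2}-\operatorname{tr}(A)\,\lambda+\det(A)=0$), so it can be solved directly. Here is a sketch, using the same illustrative matrix as before, comparing the roots of that quadratic with NumPy's eigenvalue routine:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

print(np.roots(coeffs))      # roots of the characteristic polynomial: [3. 1.]
print(np.linalg.eigvals(A))  # the same eigenvalues, computed directly
```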