# Eigenvectors and Eigenvalues

The eigenvectors of a matrix $\mathbf{A}$ are those special vectors $\mathbf{x}$ for which $\mathbf{A}\mathbf{x} = \lambda\mathbf{x}$, where $\lambda$ is an associated constant (possibly complex) called the eigenvalue. Let us rearrange the eigenvalue equation to the form $\mathbf{A}\mathbf{x} - \lambda\mathbf{x} = \mathbf{0}$, where $\mathbf{0}$ represents a vector of all zeroes (the zero vector). We may rewrite this expression using the identity matrix $\mathbf{I}$ to yield $(\mathbf{A} - \lambda\mathbf{I})\mathbf{x} = \mathbf{0}$, which will be more convenient for the next step. Now to solve for $\mathbf{x}$, multiply the left and right sides of the equation by $(\mathbf{A} - \lambda\mathbf{I})^{-1}$, if it exists. This yields $\mathbf{x} = (\mathbf{A} - \lambda\mathbf{I})^{-1}\mathbf{0} = \mathbf{0}$.

This last equation presents a challenge. Anything multiplied by the zero vector yields the zero vector, but clearly $\mathbf{x} = \mathbf{0}$ is a trivial solution that we aren't interested in. Thus, our assumption that $(\mathbf{A} - \lambda\mathbf{I})^{-1}$ exists must be wrong. The eigenvalue equation therefore has non-trivial solutions only when $(\mathbf{A} - \lambda\mathbf{I})^{-1}$ does not exist. In our previous discussion of determinants, we noted that a matrix does not have an inverse if its determinant is zero. Thus, we can satisfy the eigenvalue equation for those special values of $\lambda$ such that $|\mathbf{A} - \lambda\mathbf{I}| = 0$. This is called the secular determinant, and expanding the determinant gives an $n$-th degree polynomial in $\lambda$ called the secular equation or the characteristic equation. Once the roots of this equation are determined to give eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, these eigenvalues may be inserted into the eigenvalue equation, one at a time, to yield $n$ eigenvectors.
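This logic is easy to check numerically. The following sketch (using NumPy; the $2 \times 2$ matrix is a hypothetical example, not one from these notes) confirms that $|\mathbf{A} - \lambda\mathbf{I}|$ vanishes at the eigenvalues but not at other values of $\lambda$:

```python
import numpy as np

# A small example matrix (hypothetical, chosen only for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Its eigenvalues are the roots of the characteristic equation
# det(A - lambda*I) = 0; for this matrix they are 1 and 3.
for lam in (1.0, 3.0):
    # ~0: (A - lambda*I) is singular, so non-trivial solutions exist.
    print(np.linalg.det(A - lam * np.eye(2)))

# At a value of lambda that is NOT an eigenvalue, the determinant is
# nonzero, the inverse exists, and only the trivial solution x = 0 survives.
print(np.linalg.det(A - 2.0 * np.eye(2)))  # -1, so lambda = 2 is not an eigenvalue
```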

As an example, let us find the eigenvalues and eigenvectors for the matrix

$$
\mathbf{A} = \begin{bmatrix} 2 & 0 & 1 \\ 0 & 4 & 0 \\ 1 & 0 & 2 \end{bmatrix} \tag{25}
$$

We begin with the secular determinant $|\mathbf{A} - \lambda\mathbf{I}| = 0$, which in this case becomes
$$
\begin{vmatrix} 2-\lambda & 0 & 1 \\ 0 & 4-\lambda & 0 \\ 1 & 0 & 2-\lambda \end{vmatrix} = 0 \tag{26}
$$

Expanding out this determinant using the rules given above for the determinants of $3 \times 3$ matrices, we obtain the following characteristic equation:
$$
(4-\lambda)\left[(2-\lambda)^2 - 1\right] = -(\lambda - 1)(\lambda - 3)(\lambda - 4) = 0 \tag{27}
$$

which has solutions $\lambda = 1, 3, 4$. These are the three eigenvalues of $\mathbf{A}$. What are the corresponding eigenvectors? We substitute each of these eigenvalues, one at a time, into the eigenvalue equation, and solve the system of equations that results.
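As a quick numerical cross-check (a NumPy sketch, not part of the original derivation; the matrix entries are assumed to be those quoted in Eq. (25)), `np.linalg.eigvals` returns the roots of the characteristic polynomial directly:

```python
import numpy as np

# The matrix of the worked example (entries assumed from Eq. (25)).
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 4.0, 0.0],
              [1.0, 0.0, 2.0]])

# np.linalg.eigvals returns the roots of det(A - lambda*I) = 0.
eigenvalues = np.sort(np.linalg.eigvals(A).real)
print(eigenvalues)  # the three eigenvalues: 1, 3, and 4
```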

Let us begin with the eigenvalue $\lambda_1 = 4$. Substituting this into $(\mathbf{A} - \lambda\mathbf{I})\mathbf{x} = \mathbf{0}$, we obtain

$$
\begin{bmatrix} -2 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & -2 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \tag{28}
$$

This is three equations in three unknowns, which we may rewrite as
$$-2x_1 + x_3 = 0 \tag{29}$$
$$0 = 0 \tag{30}$$
$$x_1 - 2x_3 = 0 \tag{31}$$

The middle equation is, of course, not particularly useful. However, the first and third equations can be satisfied simultaneously only if the components $x_1$ and $x_3$ of the eigenvector corresponding to $\lambda_1 = 4$ are both zero, and there is no equation governing the choice of $x_2$. We are therefore free to choose any value for $x_2$, and a valid eigenvector will result. Note that any eigenvector times a constant will yield another valid eigenvector. Most frequently, we choose normalized eigenvectors by convention (such that $x_1^2 + x_2^2 + x_3^2 = 1$), so in this case we will choose $x_2 = 1$. This gives the final eigenvector
$$
\mathbf{x}^{(1)} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \tag{32}
$$

We can verify that this is indeed an eigenvector corresponding to the eigenvalue $\lambda_1 = 4$ by multiplying this eigenvector by the original matrix $\mathbf{A}$:
$$
\mathbf{A}\mathbf{x}^{(1)} =
\begin{bmatrix} 2 & 0 & 1 \\ 0 & 4 & 0 \\ 1 & 0 & 2 \end{bmatrix}
\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} =
\begin{bmatrix} 0 \\ 4 \\ 0 \end{bmatrix} =
4 \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \tag{33}
$$

that is, multiplication of $\mathbf{A}$ times the eigenvector yields the eigenvector again times a constant (the eigenvalue, $\lambda_1 = 4$).

By a similar procedure, one can obtain the other two eigenvectors, which, when normalized, are

$$
\mathbf{x}^{(2)} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix} \quad (\lambda_2 = 1),
\qquad
\mathbf{x}^{(3)} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} \quad (\lambda_3 = 3) \tag{34}
$$
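The verification of Eq. (33) extends to all three eigenpairs, and can be sketched numerically (a NumPy check, not part of the original notes; the matrix and vectors are assumed from Eqs. (25), (32), and (34)):

```python
import numpy as np

# The example matrix (entries assumed from Eq. (25)).
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 4.0, 0.0],
              [1.0, 0.0, 2.0]])

# Candidate (eigenvalue, normalized eigenvector) pairs, assumed from
# Eqs. (32) and (34).
pairs = [
    (4.0, np.array([0.0, 1.0, 0.0])),
    (1.0, np.array([1.0, 0.0, -1.0]) / np.sqrt(2)),
    (3.0, np.array([1.0, 0.0, 1.0]) / np.sqrt(2)),
]

for lam, x in pairs:
    # A @ x should reproduce x scaled by its eigenvalue.
    print(np.allclose(A @ x, lam * x))  # -> True

# For a symmetric matrix, eigenvectors of distinct eigenvalues are
# mutually orthogonal, so the normalized set is orthonormal.
X = np.column_stack([x for _, x in pairs])
print(np.allclose(X.T @ X, np.eye(3)))  # -> True
```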

Although an $n \times n$ matrix has $n$ eigenvalues, they are not necessarily distinct. That is, one or more of the roots of the characteristic equation may be identical. In this case, we say that those eigenvalues are degenerate. Determination of eigenvectors is somewhat more complicated in such a case, because there will be additional flexibility in selecting them. Consider the matrix

$$
\mathbf{B} = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix} \tag{35}
$$

which has the characteristic equation
$$
-\lambda^3 + \lambda^2 + \lambda - 1 = -(\lambda - 1)^2 (\lambda + 1) = 0 \tag{36}
$$

with solutions $\lambda = -1, 1, 1$. Although there is no difficulty in finding the eigenvector corresponding to $\lambda = -1$, the doubly-degenerate eigenvalue $\lambda = 1$ presents an additional complication. Upon substituting this value into the eigenvalue equation, we obtain
$$
\begin{bmatrix} -1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & -1 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \tag{37}
$$

or
$$
-x_1 + x_3 = 0, \qquad 0 = 0, \qquad x_1 - x_3 = 0 \tag{38}
$$

In this case, the first and third equations are equivalent (one equation is just the minus of the other), and so they are not independent. Additionally, the second equation gives no information. We therefore have only one equation to determine the three coefficients of the eigenvector. Recall that only two valid equations resulted above for our previous non-degenerate case, because any multiple of an eigenvector still yields a valid eigenvector. Here, a double degeneracy has lost us one of our equations. Hence, we only know that $x_1 = x_3$ and $x_2$ is arbitrary. Any eigenvector satisfying these rules will be satisfactory, and clearly there are an infinite number of ways to choose them, even if we require them to be normalized. Let us, somewhat arbitrarily, pick the normalized vector
$$
\mathbf{v}^{(1)} = \frac{1}{\sqrt{3}} \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \tag{39}
$$

Then, it is traditional to try to pick the second eigenvector for $\lambda = 1$ as orthogonal to the first (there are reasons for doing this, most commonly because we might wish to use these vectors as a new orthonormal basis). In that case, the following (normalized) vector will be suitable:
$$
\mathbf{v}^{(2)} = \frac{1}{\sqrt{6}} \begin{bmatrix} 1 \\ -2 \\ 1 \end{bmatrix} \tag{40}
$$

You can verify that both of these vectors are (a) orthonormal, and (b) satisfy the eigenvalue equation for $\lambda = 1$.
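That verification can be sketched numerically (again a NumPy check, not part of the original notes; the matrix and vectors are assumed from Eqs. (35), (39), and (40)):

```python
import numpy as np

# Matrix with a doubly degenerate eigenvalue (entries assumed from Eq. (35)).
B = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

# One (arbitrary) orthonormal pair spanning the lambda = 1 eigenspace,
# assumed from Eqs. (39) and (40).
v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v2 = np.array([1.0, -2.0, 1.0]) / np.sqrt(6)

print(np.allclose(B @ v1, v1))  # v1 is an eigenvector with eigenvalue 1
print(np.allclose(B @ v2, v2))  # so is v2
print(abs(v1 @ v2) < 1e-12)     # and the two are orthogonal

# Any normalized linear combination of v1 and v2 is an equally valid
# choice of eigenvector -- that is the flexibility degeneracy brings.
w = (v1 + v2) / np.linalg.norm(v1 + v2)
print(np.allclose(B @ w, w))
```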