1.4 Eigenvalues and Eigenvectors of matrices

In general, the transformation \(A\) applied to a vector \(x\) leads to

 \begin{equation*} y = A x \end{equation*}(1.20)

where \(y\) is rotated relative to \(x\) and the two vectors differ in length.
Now we are looking for special vectors \(v\) with

 \begin{equation*} A v = \lambda v = \lambda E v \label{Eigeneq} \end{equation*}(1.21)

These "Eigenvectors" only change their length when the operator \(A\) is applied to them.
Eq. (1.21) can be rewritten as

 \begin{equation*} (A - \lambda E) v = 0 \end{equation*}(1.22)

This equation has nontrivial solutions only if

 \begin{equation*} \det(A - \lambda E) = 0 \end{equation*}(1.23)

The zeros of this characteristic polynomial are the Eigenvalues \(\lambda_i\) of the operator \(A\).
For each Eigenvalue \(\lambda_i\) we can calculate the Eigenvectors \(v_i\) by solving the equation

 \begin{equation*} (A - \lambda_i E) v_i = 0 \end{equation*}(1.24)

Example:
The matrix \(\left(\begin{array}{cc}3&-1\\ -1 &3\end{array}\right)\) has the characteristic polynomial \(\lambda^2 - 6 \lambda + 8 = 0\) with the solutions \(\lambda_1 = 2\) and \( \lambda_2 = 4\).
The corresponding Eigenvectors are \(\vec{v}_1 = \frac{1}{\sqrt{2}} \left(\begin{array}{c}1 \\ 1\end{array}\right)\) for \(\lambda_1 = 2\) and \(\vec{v}_2 = \frac{1}{\sqrt{2}} \left(\begin{array}{c}1 \\ -1\end{array}\right)\) for \(\lambda_2 = 4\).
The corresponding unitary transformation is therefore \(U = \frac{1}{\sqrt{2}} \left(\begin{array}{cc}1&1\\ 1 &-1\end{array}\right)\).
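This result can be checked numerically. The following is a minimal sketch (not part of the script) using NumPy's numpy.linalg.eigh, the standard solver for symmetric and Hermitian matrices; note that the numerically computed Eigenvectors are determined only up to sign.

\begin{verbatim}
import numpy as np

A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])

# eigh returns the Eigenvalues in ascending order and orthonormal
# Eigenvectors as the columns of U (each column determined up to sign).
eigenvalues, U = np.linalg.eigh(A)
print(eigenvalues)   # [2. 4.]
print(U)             # columns ~ (1, 1)/sqrt(2) and (1, -1)/sqrt(2)

# U diagonalizes A: U^T A U = diag(2, 4) up to rounding errors.
print(U.T @ A @ U)
\end{verbatim}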
An illustrative example of an Eigenvalue problem can be found in the math script.
Eigenvalues of linear Hermitian Operators are always real
Let \(A\) be a Hermitian operator with the Eigenfunction \(f_1\) and the Eigenvalue \(a_1\); it follows that

 \begin{equation*} a_1\langle f_1|f_1\rangle = \langle f_1|A f_1\rangle = \langle A f_1|f_1\rangle = a_1^*\langle f_1|f_1\rangle \end{equation*}(1.25)

thus \(a_1\) is real.
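A minimal numerical illustration of this statement (our own sketch, not from the script): build a Hermitian matrix as \(H = B + B^\dagger\) and check that its Eigenvalues are real up to floating-point rounding.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = B + B.conj().T                       # H = H^dagger by construction

eigenvalues = np.linalg.eigvals(H)       # general solver, Hermiticity not assumed
print(np.max(np.abs(eigenvalues.imag)))  # ~1e-15: imaginary parts vanish
\end{verbatim}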

Eigenvectors belonging to different Eigenvalues are orthogonal
Let \(A\) be a Hermitian operator with Eigenfunctions \(f_1\) and \(f_2\). Let \(a_1 \neq a_2\) be the corresponding Eigenvalues. We find

 \begin{equation*} a_2\langle f_1|f_2\rangle = \langle f_1|A f_2\rangle = \langle A f_1|f_2\rangle = a_1^*\langle f_1|f_2\rangle \end{equation*}(1.26)

Consequently

 \begin{equation*} (a_2 - a_1^*) \langle f_1|f_2\rangle = 0 \qquad ; \end{equation*}(1.27)

since the Eigenvalues of a Hermitian operator are real, \(a_1^* = a_1 \neq a_2\), so \((a_2 - a_1^*) \neq 0\); therefore \(\langle f_1|f_2\rangle = 0\), i.e. \(f_1\) and \(f_2\) are orthogonal.
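As a numerical illustration (our own example), consider the Hermitian matrix \(\left(\begin{array}{cc}2&i\\ -i &2\end{array}\right)\) with the distinct Eigenvalues 1 and 3; its Eigenvectors are orthogonal with respect to the complex scalar product.

\begin{verbatim}
import numpy as np

H = np.array([[2.0, 1j],
              [-1j, 2.0]])      # Hermitian: H = H^dagger
eigenvalues, V = np.linalg.eigh(H)
f1, f2 = V[:, 0], V[:, 1]       # Eigenvectors for lambda = 1 and 3
print(eigenvalues)              # [1. 3.]
print(np.vdot(f1, f2))          # ~0, i.e. <f1|f2> = 0 (vdot conjugates f1)
\end{verbatim}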
Degenerate Eigenvalues
Let \(A\) be a Hermitian operator with the Eigenfunctions \(f_1\) and \(f_2\). Let \(a_1 = a_2 = a\) be the corresponding Eigenvalue, i.e.

 \begin{equation*} \begin{split} A |f_1\rangle & = a |f_1\rangle \\ A |f_2\rangle & = a |f_2\rangle \end{split} \end{equation*}(1.28)
We find:

 \begin{equation*} A\left(x|f_1\rangle + y|f_2\rangle \right) = xa |f_1\rangle + ya |f_2\rangle = a \left(x|f_1\rangle + y|f_2\rangle \right) \qquad . \end{equation*}(1.29)

Thus degenerate Eigenfunctions form a vector subspace. This subspace is orthogonal to all Eigenvectors belonging to other Eigenvalues. The Gram-Schmidt orthonormalization procedure allows one to find a set of orthogonal vectors of length 1 in this subspace.
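The following sketch implements the Gram-Schmidt procedure (the function name gram_schmidt is our own choice): each vector is stripped of its projections onto the already accepted basis vectors and then normalized.

\begin{verbatim}
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of `vectors`."""
    basis = []
    for v in vectors:
        # Subtract the projections <e|v> e onto the accepted basis vectors.
        for e in basis:
            v = v - np.vdot(e, v) * e
        norm = np.linalg.norm(v)
        if norm > 1e-12:         # skip (numerically) linearly dependent vectors
            basis.append(v / norm)
    return basis

# Two non-orthogonal vectors spanning a two-dimensional subspace:
e1, e2 = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                       np.array([1.0, 0.0, 0.0])])
print(np.vdot(e1, e2))                          # ~0
print(np.linalg.norm(e1), np.linalg.norm(e2))   # 1.0 1.0
\end{verbatim}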
Orthonormal basis of vectors
The system of Eigenvectors of every Hermitian matrix can thus be transformed into a set of orthonormal vectors \(e_i\), which represent a basis of the vector space:

 \begin{equation*} \left\langle e_i| e_j \right\rangle = \delta_{i,j} \qquad . \end{equation*}(1.30)

Each vector can be written as

 \begin{equation*} \langle a| = \sum_i \langle a|e_i\rangle \langle e_i| \qquad . \end{equation*}(1.31)

This expansion follows from the closure relation \(\sum_i |e_i\rangle \langle e_i| = E\). The \(a_i = \langle a|e_i\rangle \) are called the components of the vector with respect to the basis \(e_i\). In component representation the vector is written as \(\langle a| = (a_1, a_2, a_3, \ldots, a_n)\).
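As a short numerical sketch (our own illustration), the closure relation can be verified with the orthonormal Eigenbasis of the example matrix from above; for real vectors \(\langle a|e_i\rangle = \langle e_i|a\rangle\), so the components can be computed either way.

\begin{verbatim}
import numpy as np

A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])
_, V = np.linalg.eigh(A)             # columns of V: orthonormal basis e_i

a = np.array([2.0, 5.0])
components = [np.vdot(V[:, i], a) for i in range(2)]        # a_i = <e_i|a>
print(sum(c * V[:, i] for i, c in enumerate(components)))   # recovers a

# Closure relation: sum_i |e_i><e_i| is the unit matrix E.
closure = sum(np.outer(V[:, i], V[:, i].conj()) for i in range(2))
print(closure)                       # ~ [[1, 0], [0, 1]]
\end{verbatim}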
REMARK: Remember the Bra and Ket representations of vectors; for complex vectors the components have to be complex conjugated when changing from the Bra to the Ket representation or vice versa.


