2.16 Hermitian and unitary matrices with complex components

As is typical, e.g., in quantum mechanics, we choose a somewhat different notation for vectors:

\[\vec{a}=\left|a\right\rangle =\vect{a_1 \\ a_2 \\ \vdots \\ a_N}\]
\[\vec{a}^+ = \left\langle a\right| = \left( a_1^* \quad a_2^* \quad \cdots \quad a_N^* \right)\;,\]

so, as already pointed out in section 2.7.1, vectors are special matrices, and the scalar product can now be interpreted as a standard matrix multiplication:

\[\langle a\left|b\right\rangle = \left( a_1^* \quad a_2^* \quad \cdots \quad a_N^* \right)\vect{b_1 \\ b_2 \\ \vdots \\ b_N} =\sum_{i=1}^N a_i^*b_i\]
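As a purely illustrative numerical aside (not part of the derivation), the following numpy sketch with made-up example vectors evaluates this scalar product; note that np.vdot conjugates its first argument, exactly as required for \(\langle a|b\rangle\).

\begin{verbatim}
import numpy as np

# Hypothetical example vectors, purely for illustration.
a = np.array([1.0 + 2.0j, 0.5 - 1.0j, 3.0 + 0.0j])
b = np.array([2.0 - 1.0j, 1.0 + 1.0j, 0.0 + 4.0j])

# np.vdot conjugates its first argument: <a|b> = sum_i a_i^* b_i.
braket_ab = np.vdot(a, b)

# The same result as an explicit matrix product of the row vector a^+ with b.
assert np.isclose(a.conj() @ b, braket_ab)

# The "nearly" commutative rule: <a|b> = <b|a>^*.
assert np.isclose(braket_ab, np.vdot(b, a).conjugate())
\end{verbatim}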

For complex vectors we strictly have to distinguish between multiplying from the left (with the complex conjugate vector) and multiplying from the right. Formally this has already been introduced in the first definition of the scalar product (14); especially the "nearly" commutative rule \(\langle a|b\rangle = \langle b|a\rangle^*\) reflects this asymmetry.
For matrices \(\tilde{A}\) we generally have

\[\langle a|\tilde{A} b\rangle = \langle \tilde{A}^+ a|b\rangle .\]
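A quick numerical illustration of this adjoint relation (a sketch assuming numpy, with a random complex matrix chosen purely for illustration):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# A generic (non-Hermitian) complex matrix and two complex vectors,
# chosen at random purely for illustration.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
a = rng.normal(size=4) + 1j * rng.normal(size=4)
b = rng.normal(size=4) + 1j * rng.normal(size=4)

lhs = np.vdot(a, A @ b)            # <a|A b>
rhs = np.vdot(A.conj().T @ a, b)   # <A^+ a|b>, A^+ = conjugate transpose

assert np.isclose(lhs, rhs)
\end{verbatim}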

With this more general concept for vectors we will redefine Hermitian matrices \(\tilde{A}\) (see section IV): each matrix \(\tilde{A}\) with

\[\langle a|\tilde{A} b\rangle = \langle \tilde{A} a|b\rangle = \langle a|\tilde{A}|b\rangle \]

for all vectors is called a Hermitian matrix. The third form of the scalar product acting on a Hermitian matrix reflects this symmetry best and is widely used, e.g., in quantum mechanics. Writing it down in components one easily gets \(a_{ij} = a_{ji}^*\), which was the former definition.
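The componentwise condition \(a_{ij} = a_{ji}^*\) simply means \(\tilde{A} = \tilde{A}^+\); a small numpy sketch (with an arbitrarily constructed example matrix) shows one standard way to build such a matrix:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

# Any matrix of the form M + M^+ is Hermitian; M is a random example here.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = M + M.conj().T

# Componentwise condition a_ij = a_ji^*  <=>  A equals its adjoint A^+.
assert np.allclose(A, A.conj().T)
\end{verbatim}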
The eigenvalues of Hermitian matrices are always real, since for every eigenvector \(v\) of \(\tilde{A}\) with eigenvalue \(a\) we find:

\[a \langle v|v\rangle = \langle v|\tilde{A}v\rangle = \langle \tilde{A}v|v\rangle = a^* \langle v|v\rangle \;;\]
since \(\langle v|v\rangle > 0\) for any eigenvector, this implies \(a = a^*\), q.e.d.
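Numerically this can be checked as follows (a sketch assuming numpy; the test matrix is random and purely illustrative):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)

# Hermitian test matrix, constructed as M + M^+.
M = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
A = M + M.conj().T

# The general eigenvalue routine returns complex numbers, but for a
# Hermitian matrix their imaginary parts vanish (up to rounding errors).
eigenvalues = np.linalg.eigvals(A)
assert np.allclose(eigenvalues.imag, 0.0)
\end{verbatim}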

Eigenvectors of Hermitian matrices with different eigenvalues are always perpendicular to each other, since

\[a_j \langle v_i|v_j\rangle = \langle v_i|\tilde{A}v_j\rangle = \langle \tilde{A}v_i|v_j\rangle = a_i \langle v_i|v_j\rangle \mbox{, i.e.\ } (a_i-a_j) \langle v_i|v_j\rangle = 0\;.\]

If \(a_i \neq a_j\), then \(\langle v_i|v_j\rangle = 0\), q.e.d.
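The same orthogonality can be observed numerically (a sketch assuming numpy; np.linalg.eigh is the eigensolver for Hermitian matrices and returns the eigenvectors as columns, already normalized):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = M + M.conj().T                  # Hermitian example matrix

# eigh returns the (real) eigenvalues and the eigenvectors as columns of V.
eigenvalues, V = np.linalg.eigh(A)

# <v_i|v_j> = delta_ij, i.e. V^+ V is the identity matrix.
assert np.allclose(V.conj().T @ V, np.eye(4))
\end{verbatim}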
So for Hermitian matrices the eigenvector system is always orthogonal (for degenerate eigenvalues one can use the procedure discussed in section 3.10 to get orthogonal vectors). Dividing by the lengths of the eigenvectors, the eigenvector system of a Hermitian matrix can always be chosen orthonormal, i.e. \(\langle v_i|v_j\rangle = \delta_{ij}\). We now define the standard basis vectors

\[\left|i\right\rangle =\vect{0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ 0} \quad \mbox{(with the 1 at the } i\mbox{-th position)}\]

and a (unitary) matrix \(\tilde{U}\) with the orthonormal eigenvectors as columns

\[\tilde{U}=\left( \begin{array}{cccc} \uparrow & \uparrow & & \uparrow \\ v_1 & v_2 & \cdots & v_N \\ \downarrow & \downarrow & & \downarrow \\ \end{array}\right) \mbox{ , i.e.\ } \tilde{U}^+=\vect{\leftarrow v_1^* \rightarrow \\ \leftarrow v_2^* \rightarrow \\ \vdots \\ \leftarrow v_N^* \rightarrow}\]

We get

\[ |v_i\rangle = \tilde{U} |i\rangle \;.\]
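In numpy notation this is easy to see (a sketch; here \(\tilde{U}\) is taken from np.linalg.eigh, which returns the orthonormal eigenvectors as columns): multiplying \(\tilde{U}\) by the basis vector \(|i\rangle\) picks out the \(i\)-th eigenvector.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(4)

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = M + M.conj().T
eigenvalues, U = np.linalg.eigh(A)   # eigenvectors as columns -> U

i = 2
e_i = np.zeros(4)
e_i[i] = 1.0                         # basis vector |i> (1 at the i-th position; 0-based here)

# U|i> is the i-th column of U, i.e. the eigenvector v_i, ...
v_i = U @ e_i
assert np.allclose(v_i, U[:, i])
# ... and it really is an eigenvector of A with eigenvalue a_i.
assert np.allclose(A @ v_i, eigenvalues[i] * v_i)
\end{verbatim}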

Quite obviously \(\tilde{U}^+\tilde{U} = \tilde{I}\), since \((\tilde{U}^+\tilde{U})_{ij} = \langle v_i|v_j\rangle = \delta_{ij}\); so \(\tilde{U}\) is indeed a unitary transformation, and we find

\[\langle j|\tilde{U}^+\tilde{A}\tilde{U}|i\rangle = \langle v_j|\tilde{A}|v_i\rangle = a_i \langle v_j|v_i\rangle = a_i \delta_{ij}\;,\]

i.e. \(\tilde{U}^+\tilde{A}\tilde{U}\) is a diagonal matrix with the eigenvalues as components.
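This diagonalization is easy to verify numerically (a sketch assuming numpy and a random Hermitian test matrix):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(5)

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = M + M.conj().T
eigenvalues, U = np.linalg.eigh(A)

# U is unitary ...
assert np.allclose(U.conj().T @ U, np.eye(4))
# ... and U^+ A U is the diagonal matrix of the eigenvalues.
assert np.allclose(U.conj().T @ A @ U, np.diag(eigenvalues))
\end{verbatim}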
Since \(\tilde{U}\) is unitary, \(\det(\tilde{U}^+)\det(\tilde{U}) = |\det(\tilde{U})|^2 = 1\), and we get \(\det(\tilde{A})=\det(\tilde{U}^+\tilde{A}\tilde{U}) = \prod_{i=1}^N a_i\). Thus we find, as in the 3D case (see section 2.15), that the determinant calculates the volume spanned by the row vectors (column vectors) of the matrix \(\tilde{A}\). If the determinant is zero, i.e. the vectors are linearly dependent, then no full \(N\)-dimensional volume is formed. (Hint: as e.g. in 3D, a 2D surface has volume 0.)
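The determinant relation can again be checked numerically (sketch with numpy and a random Hermitian example):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(6)

M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = M + M.conj().T
eigenvalues, U = np.linalg.eigh(A)

# |det(U)| = 1 for a unitary matrix ...
assert np.isclose(abs(np.linalg.det(U)), 1.0)
# ... and det(A) equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(eigenvalues))
\end{verbatim}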
We have learned here some very general results: the eigenvalues of a Hermitian matrix are real, its eigenvectors can be chosen orthonormal, the unitary matrix built from these eigenvectors diagonalizes it, and its determinant is the product of its eigenvalues.

