
4.3: State Basis and Matrix Representation


While some operations in quantum mechanics may be carried out in the general bra-ket formalism outlined above, many calculations are performed for quantum systems that feature a full and orthonormal set $\{u\} \equiv \{u_1, u_2, \ldots, u_j, \ldots\}$ of states $u_j$, frequently called a basis. The first of these terms means that any possible state vector of the system (i.e. of its Hilbert space) may be represented as a unique sum of the type (6) or (10) over its basis vectors: $$|\alpha\rangle = \sum_j \alpha_j |u_j\rangle, \qquad \langle\alpha| = \sum_j \alpha_j^* \langle u_j|, \tag{37}$$ so that, in particular, if $\alpha$ is one of the basis states, say $u_{j'}$, then $\alpha_j = \delta_{jj'}$. The second term means that $$\langle u_j | u_{j'} \rangle = \delta_{jj'}. \tag{38}$$ For the systems that may be described by wave mechanics, examples of full orthonormal bases are given by any full and orthonormal set of eigenfunctions calculated in the previous three chapters of this course - for the simplest example, see Eq. (1.87).
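
For readers who like numerical checks, here is a minimal numpy sketch (added to this section, not part of the original text): it builds an arbitrary 3-dimensional orthonormal basis $\{u\}$ as the columns of a random unitary matrix and verifies the orthonormality condition (38). The size N = 3 and the random seed are arbitrary choices.

    import numpy as np

    # Added sketch: an arbitrary orthonormal basis {|u_j>} (columns of a unitary matrix)
    # and a check of the orthonormality condition <u_j|u_j'> = delta_{jj'}, Eq. (38).
    rng = np.random.default_rng(0)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]          # the kets |u_j> as column vectors

    gram = np.array([[np.vdot(uj, ujp) for ujp in basis] for uj in basis])
    print(np.allclose(gram, np.eye(N)))          # True: the basis is orthonormal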

Due to the uniqueness of the expansion (37), the full set of the coefficients $\alpha_j$ involved in the expansion of a state $\alpha$ in a certain basis $\{u\}$ gives its complete description - just as the Cartesian components $A_x$, $A_y$, and $A_z$ of a usual geometric 3D vector $\mathbf{A}$ in a certain reference frame give its complete description. Still, let me emphasize some differences between such representations of the quantum-mechanical state vectors and 3D geometric vectors:

(i) a quantum state basis may have a large or even infinite number of states $u_j$, and

(ii) the expansion coefficients $\alpha_j$ may be complex.

With these reservations in mind, the analogy with geometric vectors may be pushed further on. Let us inner-multiply both parts of the first of Eqs. (37) by a bra-vector $\langle u_{j'}|$, and then transform the resulting relation using the linearity rules discussed in the previous section, and Eq. (38): $$\langle u_{j'}|\alpha\rangle = \langle u_{j'}|\sum_j \alpha_j |u_j\rangle = \sum_j \alpha_j \langle u_{j'}|u_j\rangle = \alpha_{j'}. \tag{39}$$ Together with Eq. (14), this means that any of the expansion coefficients in Eq. (37) may be represented as an inner product: $$\alpha_j = \langle u_j|\alpha\rangle, \qquad \alpha_j^* = \langle\alpha|u_j\rangle; \tag{40}$$ these important relations are analogs of the equalities $A_j = \mathbf{n}_j \cdot \mathbf{A}$ of the usual vector algebra, and will be used on numerous occasions in this course. With them, the expansions (37) may be rewritten as $$|\alpha\rangle = \sum_j |u_j\rangle\langle u_j|\alpha\rangle \equiv \sum_j \hat{\Lambda}_j|\alpha\rangle, \qquad \langle\alpha| = \sum_j \langle\alpha|u_j\rangle\langle u_j| \equiv \sum_j \langle\alpha|\hat{\Lambda}_j, \tag{41}$$ where $$\hat{\Lambda}_j \equiv |u_j\rangle\langle u_j|. \tag{42}$$ Eqs. (41) show that $\hat{\Lambda}_j$ so defined is a legitimate linear operator. This operator, acting on any state vector of the type (37), singles out just one of its components, for example, $$\hat{\Lambda}_j|\alpha\rangle = |u_j\rangle\langle u_j|\alpha\rangle = \alpha_j|u_j\rangle, \tag{43}$$ i.e. "kills" all components of the linear superposition but one. In the geometric analogy, such an operator "projects" the state vector onto the $j$-th "direction", hence its name - the projection operator. Probably the most important property of the projection operators, called the closure (or "completeness") relation, immediately follows from Eq. (41): their sum over the full basis is equivalent to the identity operator: $$\sum_j |u_j\rangle\langle u_j| = \hat{I}. \tag{44}$$ This means in particular that we may insert the left-hand side of Eq. (44), for any basis, into any bra-ket relation, at any place - a trick that we will use again and again.
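
The projection operators and the closure relation are also easy to verify numerically. The sketch below (an addition to this text; the 3-dimensional basis and the state are arbitrary random examples) builds each $\hat{\Lambda}_j$ as the matrix $|u_j\rangle\langle u_j|$ and checks Eqs. (43) and (44).

    import numpy as np

    # Added sketch: projection operators Lambda_j = |u_j><u_j| (Eq. 42), the closure
    # relation sum_j Lambda_j = I (Eq. 44), and the component-singling action (Eq. 43).
    rng = np.random.default_rng(1)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]

    projectors = [np.outer(uj, uj.conj()) for uj in basis]          # |u_j><u_j|
    print(np.allclose(sum(projectors), np.eye(N)))                  # True: closure, Eq. (44)

    alpha = rng.normal(size=N) + 1j * rng.normal(size=N)            # an arbitrary |alpha>
    j = 1
    alpha_j = np.vdot(basis[j], alpha)                              # alpha_j = <u_j|alpha>, Eq. (40)
    print(np.allclose(projectors[j] @ alpha, alpha_j * basis[j]))   # True: Eq. (43)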

Now let us see how the expansions (37) transform the key notions introduced in the last section, starting from the short bracket (11), i.e. the inner product of two state vectors: $$\langle\beta|\alpha\rangle = \sum_{j,j'} \beta_j^* \alpha_{j'} \langle u_j|u_{j'}\rangle = \sum_{j,j'} \beta_j^* \alpha_{j'} \delta_{jj'} = \sum_j \beta_j^* \alpha_j. \tag{45}$$ Besides the complex conjugation, this expression is similar to the scalar product of the usual, geometric vectors. Now, let us explore the long bracket (23): $$\langle\beta|\hat{A}|\alpha\rangle = \sum_{j,j'} \beta_j^* \langle u_j|\hat{A}|u_{j'}\rangle \alpha_{j'} \equiv \sum_{j,j'} \beta_j^* A_{jj'} \alpha_{j'}. \tag{46}$$ Here, the last step uses the very important notion of the matrix elements of the operator, defined as $$A_{jj'} \equiv \langle u_j|\hat{A}|u_{j'}\rangle. \tag{47}$$ As evident from Eq. (46), the full set of the matrix elements completely characterizes the operator, just as the full set of the expansion coefficients (40) fully characterizes a quantum state. The term "matrix" means, first of all, that it is convenient to represent the full set of $A_{jj'}$ as a square table (matrix), with the linear dimension equal to the number of basis states $u_j$ of the system under consideration. By the way, this number (which may be infinite) is called the dimensionality of its Hilbert space.
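
Since Eqs. (45)-(47) underlie all matrix calculations below, a short numerical sketch may be useful here (added; the basis, the operator, and the two states are arbitrary random examples): it confirms that the long bracket computed from the matrix elements and expansion coefficients coincides with the direct calculation.

    import numpy as np

    # Added sketch: <beta|A|alpha> = sum_{j,j'} beta_j* A_{jj'} alpha_j' (Eq. 46),
    # with A_{jj'} = <u_j|A|u_j'> (Eq. 47) and the coefficients taken per Eq. (40).
    rng = np.random.default_rng(2)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]
    A_op = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))   # an arbitrary operator
    alpha = rng.normal(size=N) + 1j * rng.normal(size=N)
    beta = rng.normal(size=N) + 1j * rng.normal(size=N)

    A = np.array([[np.vdot(uj, A_op @ ujp) for ujp in basis] for uj in basis])   # matrix elements
    a = np.array([np.vdot(uj, alpha) for uj in basis])                           # alpha_j
    b = np.array([np.vdot(uj, beta) for uj in basis])                            # beta_j

    print(np.allclose(np.vdot(beta, A_op @ alpha), np.vdot(b, A @ a)))           # True: Eq. (46)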

As the two simplest examples, all matrix elements of the null-operator, defined by Eqs. (35), are evidently equal to zero (in any basis), and hence it may be represented as a matrix of zeros (called the null-matrix): $$0 = \begin{pmatrix} 0 & 0 & \cdots \\ 0 & 0 & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix},$$

while for the identity operator $\hat{I}$, defined by Eqs. (36), we readily get $$I_{jj'} = \langle u_j|\hat{I}|u_{j'}\rangle = \langle u_j|u_{j'}\rangle = \delta_{jj'},$$ i.e. its matrix (naturally called the identity matrix) is diagonal - also in any basis: $$\mathrm{I} = \begin{pmatrix} 1 & 0 & \cdots \\ 0 & 1 & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}.$$ The convenience of the matrix language extends well beyond the representation of particular operators. For example, let us use the definition (47) to calculate the matrix elements of a product of two operators: $$(AB)_{jj'} = \langle u_j|\hat{A}\hat{B}|u_{j'}\rangle.$$ Here we may use Eq. (44) for the first (but not the last!) time, inserting the identity operator between the two operators, and then expressing it via a sum of projection operators:

$$(AB)_{jj'} = \langle u_j|\hat{A}\hat{B}|u_{j'}\rangle = \langle u_j|\hat{A}\hat{I}\hat{B}|u_{j'}\rangle = \sum_{j''} \langle u_j|\hat{A}|u_{j''}\rangle\langle u_{j''}|\hat{B}|u_{j'}\rangle = \sum_{j''} A_{jj''} B_{j''j'}.$$

This result corresponds to the standard "row by column" rule of calculation of an arbitrary element of the matrix product $$AB = \begin{pmatrix} A_{11} & A_{12} & \cdots \\ A_{21} & A_{22} & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}\begin{pmatrix} B_{11} & B_{12} & \cdots \\ B_{21} & B_{22} & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}.$$ Hence a product of operators may be represented (in a fixed basis!) by the product of their matrices (in the same basis).
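
The "row by column" rule is easy to see at work numerically. In the sketch below (added; both operators and the basis are arbitrary random 3-dimensional examples), the matrix of the operator product is compared with the product of the individual matrices.

    import numpy as np

    # Added sketch: the matrix of an operator product equals the product of the matrices,
    # (AB)_{jj'} = sum_{j''} A_{jj''} B_{j''j'}.
    rng = np.random.default_rng(3)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]
    A_op = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    B_op = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))

    def matrix_of(op):
        # Matrix elements op_{jj'} = <u_j|op|u_j'>, per Eq. (47)
        return np.array([[np.vdot(uj, op @ ujp) for ujp in basis] for uj in basis])

    print(np.allclose(matrix_of(A_op @ B_op), matrix_of(A_op) @ matrix_of(B_op)))   # True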

This is so convenient that the same language is often used to represent not only long brackets, $$\langle\beta|\hat{A}|\alpha\rangle = \sum_{j,j'} \beta_j^* A_{jj'} \alpha_{j'} = \begin{pmatrix} \beta_1^* & \beta_2^* & \cdots \end{pmatrix}\begin{pmatrix} A_{11} & A_{12} & \cdots \\ A_{21} & A_{22} & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}\begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \cdots \end{pmatrix},$$ but even short brackets: $$\langle\beta|\alpha\rangle = \sum_j \beta_j^* \alpha_j = \begin{pmatrix} \beta_1^* & \beta_2^* & \cdots \end{pmatrix}\begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \cdots \end{pmatrix},$$ although these equalities require the use of non-square matrices: rows of (complex-conjugated!) expansion coefficients for the representation of bra-vectors, and columns of these coefficients for the representation of ket-vectors. With that, the mapping of quantum states and operators on matrices becomes completely general.
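
In this row/column convention, the brackets above reduce to plain matrix products; a two-dimensional numpy sketch (added; the coefficients are arbitrary numbers) is given below.

    import numpy as np

    # Added sketch: bra-vectors as rows of complex-conjugated coefficients,
    # ket-vectors as columns; the brackets are then ordinary matrix products.
    alpha = np.array([1.0 + 2.0j, 0.5 - 1.0j])      # alpha_j in some basis
    beta = np.array([0.3 - 0.4j, 2.0 + 1.0j])       # beta_j in the same basis
    A = np.array([[1.0, 2.0j], [-2.0j, 3.0]])       # A_{jj'} in the same basis

    print(beta.conj() @ alpha)                      # short bracket <beta|alpha>, Eq. (45)
    print(beta.conj() @ A @ alpha)                  # long bracket <beta|A|alpha>, Eq. (46)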

Now let us have a look at the outer product operator (26). Its matrix elements are just $$(|\alpha\rangle\langle\beta|)_{jj'} = \langle u_j|\alpha\rangle\langle\beta|u_{j'}\rangle = \alpha_j \beta_{j'}^*.$$ These are the elements of a very special square matrix, whose filling requires the knowledge of just $2N$ scalars (where $N$ is the basis size), rather than $N^2$ scalars as for an arbitrary operator. However, a simple generalization of such an outer product may represent an arbitrary operator. Indeed, let us insert two identity operators (44), with different summation indices, on both sides of an arbitrary operator: $$\hat{A} = \hat{I}\hat{A}\hat{I} = \left(\sum_j |u_j\rangle\langle u_j|\right)\hat{A}\left(\sum_{j'} |u_{j'}\rangle\langle u_{j'}|\right),$$ and then use the associative axiom to rewrite this expression as $$\hat{A} = \sum_{j,j'} |u_j\rangle\left(\langle u_j|\hat{A}|u_{j'}\rangle\right)\langle u_{j'}|.$$ But the expression in the middle long bracket is just the matrix element (47), so that we may write $$\hat{A} = \sum_{j,j'} |u_j\rangle A_{jj'}\langle u_{j'}|. \tag{59}$$ The reader has to agree that this formula, which is a natural generalization of Eq. (44), is extremely elegant.
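
Eq. (59) also lends itself to a direct numerical check. In the sketch below (added; the basis and the operator are arbitrary random examples), the operator is rebuilt from the outer products of the basis vectors, weighted by its matrix elements.

    import numpy as np

    # Added sketch: rebuild an arbitrary operator from outer products,
    # A = sum_{j,j'} |u_j> A_{jj'} <u_j'|, Eq. (59).
    rng = np.random.default_rng(5)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]
    A_op = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))

    A = np.array([[np.vdot(uj, A_op @ ujp) for ujp in basis] for uj in basis])   # A_{jj'}, Eq. (47)
    rebuilt = sum(A[j, jp] * np.outer(basis[j], basis[jp].conj())
                  for j in range(N) for jp in range(N))
    print(np.allclose(rebuilt, A_op))       # True: the outer-product sum reproduces the operator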

The matrix representation is so convenient that it makes sense to extend it one level lower - from state vector products to the "bare" state vectors resulting from the operator's action upon a given state. For example, let us use Eq. (59) to represent the ket-vector (18) as $$|\alpha'\rangle \equiv \hat{A}|\alpha\rangle = \left(\sum_{j,j'} |u_j\rangle A_{jj'}\langle u_{j'}|\right)|\alpha\rangle = \sum_{j,j'} |u_j\rangle A_{jj'}\langle u_{j'}|\alpha\rangle. \tag{60}$$ According to Eq. (40), the last short bracket is just $\alpha_{j'}$, so that $$|\alpha'\rangle = \sum_{j,j'} |u_j\rangle A_{jj'}\alpha_{j'} = \sum_j \left(\sum_{j'} A_{jj'}\alpha_{j'}\right)|u_j\rangle. \tag{61}$$ But the expression in the parentheses is just the coefficient $\alpha'_j$ of the expansion (37) of the resulting ket-vector (60) in the same basis, so that $$\alpha'_j = \sum_{j'} A_{jj'}\alpha_{j'}. \tag{62}$$ This result corresponds to the usual rule of multiplication of a matrix by a column, so that we may represent any ket-vector by its column matrix, with the operator's action looking like $$\begin{pmatrix} \alpha'_1 \\ \alpha'_2 \\ \cdots \end{pmatrix} = \begin{pmatrix} A_{11} & A_{12} & \cdots \\ A_{21} & A_{22} & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}\begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \cdots \end{pmatrix}. \tag{63}$$ Absolutely similarly, the operator action on the bra-vector (21), represented by its row-matrix, is $$\begin{pmatrix} \alpha'^*_1 & \alpha'^*_2 & \cdots \end{pmatrix} = \begin{pmatrix} \alpha^*_1 & \alpha^*_2 & \cdots \end{pmatrix}\begin{pmatrix} (A^\dagger)_{11} & (A^\dagger)_{12} & \cdots \\ (A^\dagger)_{21} & (A^\dagger)_{22} & \cdots \\ \cdots & \cdots & \cdots \end{pmatrix}. \tag{64}$$ By the way, Eq. (64) naturally raises the following question: what are the elements of the matrix on its right-hand side, or more exactly, what is the relation between the matrix elements of an operator and those of its Hermitian conjugate? The simplest way to answer it is to use Eq. (25) with two arbitrary states (say, $u_j$ and $u_{j'}$) of the same basis in the role of $\alpha$ and $\beta$. Together with the orthonormality relation (38), this immediately gives$^{13}$ $$(\hat{A}^\dagger)_{jj'} = (A_{j'j})^*. \tag{65}$$ Thus, the matrix of the Hermitian-conjugate operator is the complex-conjugated and transposed matrix of the initial operator. This result exposes very clearly the difference between the Hermitian and the complex conjugation. It also shows that for Hermitian operators, defined by Eq. (22), $$A_{jj'} = A_{j'j}^*, \tag{66}$$ i.e. any pair of their matrix elements symmetric with respect to the main diagonal should be complex conjugates of each other. As a corollary, their main-diagonal elements have to be real: $$A_{jj} = A_{jj}^*, \quad \text{i.e.} \quad \operatorname{Im} A_{jj} = 0. \tag{67}$$ In order to fully appreciate the special role played by Hermitian operators in quantum theory, let us introduce the key notions of the eigenstates $a_j$ (described by their eigenvectors $\langle a_j|$ and $|a_j\rangle$) and eigenvalues (c-numbers) $A_j$ of an operator $\hat{A}$, both defined by the equation they have to satisfy:$^{14}$ $$\hat{A}|a_j\rangle = A_j|a_j\rangle. \tag{68}$$ Let us prove that the eigenvalues of any Hermitian operator are real,$^{15}$ $$A_j = A_j^*, \quad \text{for } j = 1, 2, \ldots, N, \tag{69}$$ while the eigenstates corresponding to different eigenvalues are orthogonal: $$\langle a_j|a_{j'}\rangle = 0, \quad \text{if } A_j \neq A_{j'}. \tag{70}$$
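
Before turning to the proof, here is a quick numerical check of Eq. (65) (an added sketch; the operator and the 3-dimensional basis are arbitrary random examples): the matrix of the Hermitian-conjugate operator is compared with the complex-conjugated and transposed matrix of the original one.

    import numpy as np

    # Added sketch: (A^dagger)_{jj'} = (A_{j'j})*, Eq. (65), for an arbitrary operator
    # represented in an arbitrary orthonormal basis.
    rng = np.random.default_rng(6)
    N = 3
    U, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))
    basis = [U[:, j] for j in range(N)]
    A_op = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))

    def matrix_of(op):
        # Matrix elements op_{jj'} = <u_j|op|u_j'>, per Eq. (47)
        return np.array([[np.vdot(uj, op @ ujp) for ujp in basis] for uj in basis])

    A = matrix_of(A_op)
    A_dag = matrix_of(A_op.conj().T)            # matrix of the Hermitian-conjugate operator
    print(np.allclose(A_dag, A.conj().T))       # True: Eq. (65)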

The proof of both statements is surprisingly simple. Let us inner-multiply both sides of Eq. (68) by the bra-vector $\langle a_{j'}|$. On the right-hand side of the result, the eigenvalue $A_j$, as a c-number, may be taken out of the bracket, giving $$\langle a_{j'}|\hat{A}|a_j\rangle = A_j\langle a_{j'}|a_j\rangle. \tag{71}$$ This equality has to hold for any pair of eigenstates, so that we may swap the indices in Eq. (71), and write the complex conjugate of the result: $$\langle a_j|\hat{A}|a_{j'}\rangle^* = A_{j'}^*\langle a_j|a_{j'}\rangle^*. \tag{72}$$ Now using Eqs. (14) and (25), together with the Hermitian operator's definition (22), we may transform Eq. (72) into the following form: $$\langle a_{j'}|\hat{A}|a_j\rangle = A_{j'}^*\langle a_{j'}|a_j\rangle. \tag{73}$$ Subtracting this equation from Eq. (71), we get $$0 = \left(A_j - A_{j'}^*\right)\langle a_{j'}|a_j\rangle. \tag{74}$$ There are two possibilities to satisfy this relation. If the indices $j$ and $j'$ are equal (i.e. denote the same eigenstate), then the bracket is the state's norm squared, and cannot be equal to zero; in this case the parentheses (with $j = j'$) have to be zero, proving Eq. (69). On the other hand, if $j$ and $j'$ correspond to different eigenvalues of $A$, the parentheses cannot equal zero (we have just proved that all $A_j$ are real!), and hence the state vectors indexed by $j$ and $j'$ have to be orthogonal, i.e. Eq. (70) is valid.
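
These two properties are also easy to see numerically. The sketch below (added; the 4x4 Hermitian matrix is an arbitrary random example) uses numpy's eigensolver for Hermitian matrices and confirms that the eigenvalues are real and the eigenvectors are mutually orthogonal.

    import numpy as np

    # Added sketch: for a Hermitian matrix, the eigenvalues are real (Eq. 69) and
    # the eigenvectors form an orthonormal set, consistent with Eq. (70).
    rng = np.random.default_rng(7)
    N = 4
    M = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    H = (M + M.conj().T) / 2                             # a Hermitian matrix: H = H^dagger

    A_j, vecs = np.linalg.eigh(H)                        # eigenvalues A_j, eigenvectors (columns)
    print(np.allclose(A_j.imag, 0))                      # True: real eigenvalues, Eq. (69)
    print(np.allclose(vecs.conj().T @ vecs, np.eye(N)))  # True: orthonormal eigenvectors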

As will be discussed below, these properties make Hermitian operators suitable, in particular, for the description of physical observables.


$^{13}$ For the sake of formula compactness, below I will use the shorthand notation in which the operands of this equality are just $A^\dagger_{jj'}$ and $A^*_{j'j}$. I believe that it leaves little chance for confusion, because the Hermitian conjugation sign $\dagger$ may pertain only to an operator (or its matrix), while the complex conjugation sign *, only to a scalar - say, a matrix element.

$^{14}$ This equation should look familiar to the reader - see the stationary Schrödinger equation (1.60), which was the focus of our studies in the first three chapters. We will see soon that that equation is just a particular (coordinate) representation of Eq. (68) for the Hamiltonian as the operator of energy.

$^{15}$ The reciprocal statement is also true: if all eigenvalues of an operator are real, it is Hermitian (in any basis). This statement may be readily proved by applying Eq. (93) below to the case when $A_{kk'} = A_k\delta_{kk'}$, with $A_k = A_k^*$.


This page titled 4.3: State Basis and Matrix Representation is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Konstantin K. Likharev via source content that was edited to the style and standards of the LibreTexts platform.
