# 1.5: The Trace and Determinant of an Operator

There are two special functions of operators that play a key role in the theory of linear vector spaces. They are the trace and the determinant of an operator, denoted by $$\operatorname{Tr}(A)$$ and $$\operatorname{det}(A)$$, respectively. While the trace and determinant are most conveniently evaluated in matrix representation, they are independent of the chosen basis.

When we defined the norm of an operator, we introduced the trace. It is evaluated by adding the diagonal elements of the matrix representation of the operator:

$\operatorname{Tr}(A)=\sum_{j}\left\langle\phi_{j}|A| \phi_{j}\right\rangle,\tag{1.53}$

where $$\left\{\left|\phi_{j}\right\rangle\right\}_{j}$$ is any orthonormal basis: the sum is independent of the basis chosen, so the trace is an invariant property of the operator. Moreover, the trace has the following important properties:

1. If $$A=A^{\dagger}$$, then $$\operatorname{Tr}(A)$$ is real,
2. $$\operatorname{Tr}(a A)=a \operatorname{Tr}(A)$$,
3. $$\operatorname{Tr}(A+B)=\operatorname{Tr}(A)+\operatorname{Tr}(B)$$,
4. $$\operatorname{Tr}(A B)=\operatorname{Tr}(B A) \text { (the "cyclic property"). }$$
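These properties are easy to check numerically. Below is a minimal NumPy sketch (the matrices are arbitrary random test data, not taken from the text) verifying all four properties, plus the basis independence of the trace under a unitary change of basis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two arbitrary 3x3 complex matrices used as test data.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# 1. The trace of a Hermitian matrix is real.
H = A + A.conj().T
assert abs(np.trace(H).imag) < 1e-12

# 2.-3. Linearity of the trace.
a = 2.5 - 1.0j
assert np.isclose(np.trace(a * A), a * np.trace(A))
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))

# 4. The cyclic property.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# Basis independence: conjugating by a unitary Q leaves the trace unchanged.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
assert np.isclose(np.trace(Q.conj().T @ A @ Q), np.trace(A))
```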

The first property follows immediately when we evaluate the trace in the diagonal basis, where it becomes a sum over real eigenvalues. The second and third properties convey the linearity of the trace. The fourth property is extremely useful, and can be shown as follows:

\begin{aligned} \operatorname{Tr}(A B) &=\sum_{j}\left\langle\phi_{j}|A B| \phi_{j}\right\rangle=\sum_{j k}\left\langle\phi_{j}|A| \psi_{k}\right\rangle\left\langle\psi_{k}|B| \phi_{j}\right\rangle \\ &=\sum_{j k}\left\langle\psi_{k}|B| \phi_{j}\right\rangle\left\langle\phi_{j}|A| \psi_{k}\right\rangle=\sum_{k}\left\langle\psi_{k}|B A| \psi_{k}\right\rangle \\ &=\operatorname{Tr}(B A) \end{aligned}\tag{1.54}

This derivation also demonstrates the usefulness of inserting a resolution of the identity in strategic places. In the cyclic property, the operators $$A$$ and $$B$$ may themselves be products of operators, which then leads to

$\operatorname{Tr}(A B C)=\operatorname{Tr}(B C A)=\operatorname{Tr}(C A B)\tag{1.55}$

Any cyclic permutation of the operators under a trace gives rise to the same value of the trace as the original operator ordering.
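A short numerical check of Eq. (1.55) with arbitrary random test matrices: the three cyclic orderings agree, while a non-cyclic reordering generically does not.

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((4, 4)) for _ in range(3))

t1 = np.trace(A @ B @ C)
t2 = np.trace(B @ C @ A)
t3 = np.trace(C @ A @ B)
assert np.isclose(t1, t2) and np.isclose(t1, t3)

# A non-cyclic swap (here B and C) changes the trace for generic matrices.
assert not np.isclose(t1, np.trace(A @ C @ B))
```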

Finally, we construct the partial trace of an operator that lives on a tensor product space. Suppose that $$A \otimes B$$ is an operator in the Hilbert space $$\mathscr{H}_{1} \otimes \mathscr{H}_{2}$$. We can trace out Hilbert space $$\mathscr{H}_{1}$$, denoted by $$\operatorname{Tr}_{1}(\cdot)$$:

$\operatorname{Tr}_{1}(A \otimes B)=\operatorname{Tr}(A) B, \quad \text { or equivalently } \quad \operatorname{Tr}_{1}\left(A_{1} B_{2}\right)=\operatorname{Tr}\left(A_{1}\right) B_{2}\tag{1.56}$

Taking the partial trace removes the entire Hilbert space $$\mathscr{H}_{1}$$ from the description, reducing the dimension of the total vector space. The partial trace always carries an index, which indicates the space that is traced over.
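The identity $$\operatorname{Tr}_{1}(A \otimes B)=\operatorname{Tr}(A)\, B$$ can be verified directly for small matrices. The sketch below (my own illustrative implementation, not from the text) builds the Kronecker product, reshapes it into a four-index tensor, and contracts the $$\mathscr{H}_{1}$$ indices:

```python
import numpy as np

# Two small test operators: A acts on H_1, B acts on H_2 (both dim 2).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

M = np.kron(A, B)  # the operator A ⊗ B on H_1 ⊗ H_2

# Reshape into a 4-index tensor T[i, k, j, l] = A[i, j] * B[k, l],
# then sum over the H_1 indices (i = j) to trace out H_1.
d1, d2 = 2, 2
T = M.reshape(d1, d2, d1, d2)
partial = np.einsum('ikil->kl', T)

assert np.allclose(partial, np.trace(A) * B)  # Tr_1(A ⊗ B) = Tr(A) B
```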

The determinant of a 2×2 matrix is given by

$\operatorname{det}(A)=\operatorname{det}\left(\begin{array}{ll} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array}\right)=A_{11} A_{22}-A_{12} A_{21}\tag{1.57}$

The determinant of higher-dimensional matrices can be defined recursively as follows: each element in the left column of an $$n \times n$$ matrix defines an $$(n-1) \times(n-1)$$ submatrix, obtained by removing the left column and the row containing that element. The determinant of the $$n \times n$$ matrix is then the alternating sum of the left-column elements, each multiplied by the determinant of its submatrix: the top-left element times the determinant of its submatrix, minus the second element down times the determinant of its submatrix, plus the third element times the determinant of its submatrix, and so on.
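The recursive definition above can be written out directly. The following sketch (a straightforward, unoptimized implementation) expands along the left column and compares the result against NumPy's determinant for a random test matrix:

```python
import numpy as np

def det_recursive(M):
    """Determinant by recursive expansion along the left column,
    mirroring the definition in the text."""
    M = np.asarray(M)
    n = M.shape[0]
    if n == 1:
        return M[0, 0]
    total = 0.0
    for i in range(n):
        # Submatrix: remove row i and the left column.
        minor = np.delete(M[:, 1:], i, axis=0)
        # Alternating sign: +, -, +, ... down the left column.
        total += (-1) ** i * M[i, 0] * det_recursive(minor)
    return total

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
assert np.isclose(det_recursive(M), np.linalg.det(M))
```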

The determinant of a product of matrices is equal to the product of their determinants:

$\operatorname{det}(A B)=\operatorname{det}(A) \operatorname{det}(B)\tag{1.58}$

Moreover, if $$A$$ is an invertible matrix, then we have

$\operatorname{det}\left(A^{-1}\right)=\operatorname{det}(A)^{-1}\tag{1.59}$
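Both rules, together with the similarity invariance derived next, can be checked numerically on arbitrary random test matrices (which are invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 3))

# det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# det(A^{-1}) = det(A)^{-1}
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))

# Similar matrices have equal determinants: det(X^{-1} B X) = det(B)
assert np.isclose(np.linalg.det(np.linalg.inv(X) @ B @ X), np.linalg.det(B))
```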

This leads to an important relation between similar matrices $$A=X^{-1} B X$$:

\begin{aligned} \operatorname{det}(A) &=\operatorname{det}\left(X^{-1} B X\right)=\operatorname{det}\left(X^{-1}\right) \operatorname{det}(B) \operatorname{det}(X) \\ &=\operatorname{det}(X)^{-1} \operatorname{det}(B) \operatorname{det}(X)=\operatorname{det}(B) \end{aligned}\tag{1.60}

In particular, this means that the determinant is independent of the basis in which the matrix is written, which means that it is an intrinsic property of the operator associated with that matrix.

Finally, here’s a fun relation between the trace and the determinant of an operator:

$\operatorname{det}[\exp (A)]=\exp [\operatorname{Tr}(A)]\tag{1.61}$
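This relation follows from evaluating both sides in the eigenbasis of $$A$$, and it can be checked numerically. The sketch below computes the matrix exponential by diagonalization, which is valid for the diagonalizable test matrix chosen here:

```python
import numpy as np

# Test matrix with eigenvalues 1 and 2 (so it is diagonalizable).
A = np.array([[0.0, 1.0], [-2.0, 3.0]])

# Matrix exponential via diagonalization: exp(A) = V exp(D) V^{-1}.
w, V = np.linalg.eig(A)
expA = V @ np.diag(np.exp(w)) @ np.linalg.inv(V)

# det[exp(A)] = exp[Tr(A)]
assert np.isclose(np.linalg.det(expA), np.exp(np.trace(A)))
```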

## Exercises

1. Vectors and matrices:
1. Are the following three vectors linearly dependent or independent: $$a=(2,3,-1), b=(0,1,2)$$, and $$c=(0,0,-5)$$?
2. Consider the vectors $$|\psi\rangle=3 i\left|\phi_{1}\right\rangle-7 i\left|\phi_{2}\right\rangle$$ and $$|\chi\rangle=\left|\phi_{1}\right\rangle+2\left|\phi_{2}\right\rangle$$, with $$\left\{\left|\phi_{i}\right\rangle\right\}$$ an orthonormal basis. Calculate the inner product between $$|\psi\rangle$$ and $$|\chi\rangle$$, and show that they satisfy the Cauchy-Schwarz inequality.
3. Consider the two matrices

$A=\left(\begin{array}{ccc} 0 & i & 2 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{array}\right) \quad \text { and } \quad B=\left(\begin{array}{ccc} 2 & i & 0 \\ 3 & 1 & 5 \\ 0 & -i & -2 \end{array}\right)\tag{1.62}$

Calculate $$A^{-1}$$ and $$B A^{-1}$$. Are they equal?

4. Calculate $$A \otimes B$$ and $$B \otimes A$$, where $$A=\left(\begin{array}{ll}0 & 1 \\ 1 & 0 \end{array}\right)$$ and $$B=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array}\right)$$.
2. Operators:
1. Which of these operators are Hermitian: $$A+A^{\dagger}, i\left(A+A^{\dagger}\right), i\left(A-A^{\dagger}\right)$$, and $$A^{\dagger} A$$?
2. Prove that a shared eigenbasis for two operators $$A$$ and $$B$$ implies that $$[A, B]=0$$.
3. Let $$U$$ be a transformation matrix that maps one complete orthonormal basis to another. Show that $$U$$ is unitary.
4. How many real parameters completely determine a $$d \times d$$ unitary matrix?
3. Properties of the trace and the determinant:
1. Calculate the trace and the determinant of the matrices $$A$$ and $$B$$ in exercise 1c.
2. Show that the expectation value of $$A$$ can be written as $$\operatorname{Tr}(|\psi\rangle\langle\psi| A)$$.
3. Prove that the trace is independent of the basis.
4. Commutator identities.
1. Let $$F(t)=e^{A t} e^{B t}$$. Calculate $$d F / d t$$ and use $$\left[e^{A t}, B\right]=\left(e^{A t} B e^{-A t}-B\right) e^{A t}$$ to simplify your result.
2. Let $$G(t)=e^{A t+B t+f(t) H}$$. Show by calculating $$d G / d t$$, and setting $$d F / d t=d G / d t$$ at $$t=1$$, that the following operator identity

$e^{A} e^{B}=e^{A+B+\frac{1}{2}[A, B]},\tag{1.63}$

holds if $$A$$ and $$B$$ both commute with $$[A, B]$$. Hint: use the Hadamard lemma

$e^{A t} B e^{-A t}=B+\frac{t}{1 !}[A, B]+\frac{t^{2}}{2 !}[A,[A, B]]+\ldots\tag{1.64}$

3. Show that the commutator of two Hermitian operators is anti-Hermitian $$\left(A^{\dagger}=-A\right)$$.
4. Prove the commutator analog of the Jacobi identity

$[A,[B, C]]+[B,[C, A]]+[C,[A, B]]=0\tag{1.65}$

This page titled 1.5: The Trace and Determinant of an Operator is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Pieter Kok via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.
