$$\require{cancel}$$

# 1.3: Mean, Variance, and Standard Deviation

What is meant by the mean or average of a quantity? Suppose that we wish to calculate the average age of undergraduates at the University of Texas at Austin. We could go to the central administration building and find out how many eighteen year-olds, nineteen year-olds, et cetera, were currently enrolled. We would then write something like

${\rm Average~Age} \simeq \frac{N_{18}\times 18 + N_{19}\times 19 +N_{20} \times 20+\cdots} {N_{18}+N_{19}+N_{20}+\cdots},$

where $$N_{18}$$ is the number of enrolled eighteen year-olds, et cetera. The probability that a randomly picked student is eighteen is

$P_{18} \simeq \frac{N_{18}}{N_{\rm students}},$

where $$N_{\rm students}=N_{18}+N_{19}+N_{20}+\cdots$$ is the total number of enrolled students. (Actually, this definition is only accurate in the limit that $$N_{\rm students}$$ is very large.) We can now see that the average age takes the form

${\rm Average~Age} \simeq P_{18}\times 18 + P_{19}\times 19 + P_{20}\times 20 +\cdots.$

Finally, because there is nothing unique about the age distribution of students at UT Austin, for a general variable $$u$$, which can take on any one of $$M$$ possible values $$u_1$$, $$u_2, \cdots, u_M$$, with corresponding probabilities $$P(u_1)$$, $$P(u_2),\cdots, P(u_M)$$, the mean or average value of $$u$$, which is denoted $$\langle u\rangle$$, is defined

$\label{dmean} \langle u\rangle \equiv \sum_{i=1,M} P(u_i)\, u_i.$
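The probability-weighted mean can be sketched numerically. The ages and enrollment counts below are made-up numbers, chosen only to illustrate how $$\langle u\rangle = \sum_i P(u_i)\,u_i$$ reproduces the count-weighted average:

```python
# Hypothetical possible values u_i (ages) and enrollment counts N_i.
ages = [18, 19, 20, 21, 22]
counts = [5000, 4800, 4500, 4200, 1500]

# P(u_i) = N_i / N_students.
n_students = sum(counts)
probs = [n / n_students for n in counts]

# <u> = sum_i P(u_i) * u_i.
mean_age = sum(p * u for p, u in zip(probs, ages))
print(mean_age)  # approximately 19.62 for these made-up counts
```

The same number results from dividing the count-weighted sum of ages by the total enrollment, which is exactly the rearrangement made in the text.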

Suppose that $$f(u)$$ is some function of $$u$$. Thus, for each of the $$M$$ possible values of $$u$$, there is a corresponding value of $$f(u)$$ that occurs with the same probability. That is, $$f(u_1)$$ corresponds to $$u_1$$, and occurs with the probability $$P(u_1)$$, and so on. It follows from our previous definition that the mean value of $$f(u)$$ is given by

$\langle f(u)\rangle \equiv \sum_{i=1,M} P(u_i)\, f(u_i).$

Suppose that $$f(u)$$ and $$g(u)$$ are two general functions of $$u$$. It follows that

$\langle f(u)+g(u)\rangle = \sum_{i=1,M}P(u_i)\,[f(u_i)+g(u_i)] = \sum_{i=1,M}P(u_i)\,f(u_i)+ \sum_{i=1,M} P(u_i)\,g(u_i),$

so

$\langle f(u)+g(u)\rangle= \langle f(u)\rangle+\langle g(u)\rangle.$

Finally, if $$c$$ is a general constant then it is clear that

$\langle c \,f(u)\rangle = c\,\langle f(u)\rangle.$
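These two linearity properties can be checked numerically. The values, probabilities, and functions below are arbitrary choices for illustration:

```python
# Hypothetical values u_i and probabilities P(u_i) (summing to 1).
u_values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

def mean(h):
    """<h(u)> = sum_i P(u_i) h(u_i)."""
    return sum(p * h(u) for p, u in zip(probs, u_values))

# Two arbitrary functions of u, and an arbitrary constant c.
f = lambda u: u ** 2
g = lambda u: 3 * u + 1
c = 5.0

# <f(u)+g(u)> versus <f(u)> + <g(u)>.
lhs_sum = mean(lambda u: f(u) + g(u))
rhs_sum = mean(f) + mean(g)

# <c f(u)> versus c <f(u)>.
lhs_scale = mean(lambda u: c * f(u))
rhs_scale = c * mean(f)
```

Both pairs agree to floating-point precision, as the term-by-term rearrangement of the sums guarantees.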

We now know how to define the mean value of the general variable, $$u$$. Let us consider how we might characterize the scatter around the mean value. We could investigate the deviation of $$u$$ from its mean value, $$\langle u\rangle$$, which is denoted

${\mit\Delta} u \equiv u- \langle u\rangle.$

In fact, this is not a particularly interesting quantity because its average is obviously zero: that is,

$\langle {\mit\Delta} u\rangle = \left\langle(u-\langle u\rangle)\right\rangle = \langle u\rangle-\langle u\rangle = 0.$

This is another way of saying that the average deviation from the mean vanishes. A more interesting quantity is the square of the deviation. The average value of this quantity,

$\label{dvar} \left\langle ({\mit\Delta} u)^2\right\rangle = \sum_{i=1,M} P(u_i)\,(u_i - \langle u\rangle)^2,$

is usually called the variance. The variance is a positive real number, unless there is no scatter at all in the distribution, so that all possible values of $$u$$ correspond to the mean value, $$\langle u\rangle$$, in which case it takes the value zero. Note that

$\left \langle (u-\langle u\rangle )^2\right\rangle = \left\langle( u^{\,2}-2\,u\,\langle u\rangle+\langle u\rangle^{\,2})\right\rangle= \left\langle u^{\,2}\right\rangle-2\,\langle u\rangle\,\langle u\rangle+\langle u\rangle^{\,2},$

which yields the following useful relationship:

$\left\langle({\mit\Delta} u)^2\right\rangle= \left\langle u^{\,2}\right\rangle-\langle u\rangle^{\,2}.$

The variance of $$u$$ is proportional to the square of the scatter of $$u$$ around its mean value. A more useful measure of the scatter is given by the square root of the variance,

$\sigma_u = \left[\,\left\langle({\mit\Delta} u)^2\right\rangle\,\right]^{1/2},$

which is usually called the standard deviation of $$u$$. The standard deviation is essentially the width of the range over which $$u$$ is distributed around its mean value, $$\langle u\rangle$$.
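The relationship $$\left\langle({\mit\Delta} u)^2\right\rangle = \left\langle u^{\,2}\right\rangle-\langle u\rangle^{\,2}$$ can be verified for any discrete distribution. The values and probabilities below are hypothetical:

```python
# Hypothetical values u_i and probabilities P(u_i) (summing to 1).
u_values = [18, 19, 20, 21, 22]
probs = [0.25, 0.24, 0.225, 0.21, 0.075]

# <u> and <u^2>.
mean_u = sum(p * u for p, u in zip(probs, u_values))
mean_u2 = sum(p * u ** 2 for p, u in zip(probs, u_values))

# Variance from the definition <(Δu)^2> = sum_i P(u_i) (u_i - <u>)^2 ...
var_def = sum(p * (u - mean_u) ** 2 for p, u in zip(probs, u_values))

# ... and from the shortcut <u^2> - <u>^2.
var_short = mean_u2 - mean_u ** 2

# Standard deviation sigma_u: the square root of the variance.
std_u = var_short ** 0.5
```

The two routes to the variance agree to floating-point precision; in practice the definition is often more numerically robust, while the shortcut requires only the first two moments $$\langle u\rangle$$ and $$\left\langle u^{\,2}\right\rangle$$.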

## Contributors and Attributions

• Richard Fitzpatrick (Professor of Physics, The University of Texas at Austin)
