
12.5 Apparent Brightness

A nearby flashlight may appear to be brighter than a distant streetlight, but in absolute terms (if you compared them side by side) the flashlight is much dimmer. This statement contains the essence of the problem of determining stellar brightness. A casual glance at a star does not reveal whether it is a nearby glowing ember or a distant great beacon. Astronomers must distinguish between how bright a star looks and how bright the star really is. Apparent brightness is the brightness perceived by an observer on Earth and absolute brightness is the brightness that would be perceived if all stars were magically placed at the same standard distance. There can be a great difference between the total amount of radiation a star emits and the amount of radiation measured at the Earth's surface.

Apparent brightness can be defined as the number of photons per second collected at the Earth from an astronomical source. Apparent brightness depends on the light-collecting aperture of the viewing device and on the distance to the source. A star appears much brighter as seen through a telescope than it does viewed by eye — the larger aperture of the telescope can collect many more photons per second. Similarly, the closer a star is, the brighter it appears — the number of photons intercepted by our light-gathering device falls off as the inverse square of the distance. Now you can understand why astronomers build larger and larger telescopes. A larger collecting area compensates for the diminishing amount of light that reaches us from more and more distant sources. Larger telescopes can detect more distant sources, and they can capture more photons from nearby sources.
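Both dependences can be summarized in a single proportionality. As a sketch, let \(L\) stand for the source's photon output per second, \(A\) for the collecting area of the eye or telescope, and \(d\) for the distance (symbols introduced here only for illustration):

\[ \text{photons collected per second} \;\propto\; \frac{L\,A}{4\pi d^{2}} \]

Doubling the aperture diameter quadruples \(A\) and therefore the photon rate, while doubling the distance cuts the rate by a factor of four.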

The most direct way to talk about apparent brightness is in units of photons per second. However, the most convenient way to measure apparent brightness is to express it as a ratio to the apparent brightness of the Sun or some other prominent star. This ratio allows us to compare how much brighter or dimmer a star is than our Sun or another familiar star. Apparent brightness defined in this way requires no units since the ratio of two numbers that have the same units is a pure number. Astronomers traditionally relate the apparent brightness of astronomical objects to the bright star Vega.

The list below shows the ratio of the apparent brightness of different objects to the bright star Vega:


• Sun: 4 × 10¹⁰
• Full Moon: 100,000
• 100-Watt light bulb at 100 meters: 27,700
• Venus (at brightest): 58
• Mars (at brightest): 12
• Jupiter (at brightest): 3.6
• Sirius (the brightest star): 3.6
• Canopus (second brightest star): 1.9
• Vega: 1.0
• Spica: 0.4
• Naked-eye limit in urban areas: 0.025
• Uranus: 0.0063
• Bright asteroid: 0.0040
• Naked-eye limit in rural areas: 0.0025
• Neptune: 0.0008
• Limit for typical binoculars: 0.0001
• 3C 273 (brightest quasar): 8 × 10⁻⁶
• Limit for 15-cm (6-inch) telescope: 6 × 10⁻⁶
• Pluto: 1 × 10⁻⁶
• Limit by eye with largest telescopes: 2 × 10⁻⁸
• Limit for CCDs with largest telescopes: 6 × 10⁻¹²
• Limit for the Hubble Space Telescope: 3 × 10⁻¹²

The Sun is by far the brightest object in the sky. For example, the Sun is about 11 billion times brighter than Sirius, the brightest star; the actual ratio is 4 × 10¹⁰ / 3.6 ≈ 11 × 10⁹. The best telescopes in space can detect objects about 800 million times fainter than the eye can see! Again, the actual ratio is 0.0025 / (3 × 10⁻¹²) ≈ 8 × 10⁸. Three of the planets appear brighter than any star when they are closest to the Earth. However, each of them intercepts only a tiny fraction of the Sun's rays and imperfectly reflects them back to the Earth. Jupiter, for example, is only about one ten-billionth (3.6 / 4 × 10¹⁰ ≈ 10⁻¹⁰) as bright as the Sun. You can also see the effect of the inverse square law in the relative brightness of the different planets. Planets farther from the Sun appear fainter (although the different sizes of the planets play a role, too). Uranus is only visible to the naked eye if you are far from city lights, and Neptune and Pluto can only be seen with binoculars or a telescope. Pluto is nearly 60 million times fainter than Venus at its brightest (58 / 10⁻⁶ ≈ 6 × 10⁷).
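As a quick check of the ratios quoted above, here is a minimal sketch in Python that recomputes them directly from the values in the list (the list values are rounded, so the results are only order-of-magnitude):

```python
# Vega-relative brightness values taken from the list above (approximate).
sun, sirius, jupiter, venus, pluto = 4e10, 3.6, 3.6, 58, 1e-6
eye_rural, hubble = 0.0025, 3e-12

print(sun / sirius)        # ~1.1e10: the Sun is about 11 billion times brighter than Sirius
print(eye_rural / hubble)  # ~8e8: space telescopes reach ~800 million times fainter than the eye
print(jupiter / sun)       # ~9e-11: Jupiter is roughly one ten-billionth as bright as the Sun
print(venus / pluto)       # ~6e7: Venus at its brightest outshines Pluto by nearly 60 million times
```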

Notice in the list that the limit for naked-eye observing is ten times better (or deeper, or fainter) in a rural area than in an urban area. The same factor-of-ten reduction applies between a night sky with no Moon and one with a full Moon. That's because there's another issue that affects depth of viewing apart from the apparent brightness of an astronomical source: the brightness of the night sky. The night sky is never truly black, even with no Moon and when you are far from city lights. There's always some light from cities and industrial activity diffusing over large distances, and the upper atmosphere itself glows faintly as its molecules are excited by radiation from space, a phenomenon called airglow. So the same little patch of sky that contains starlight also contains some light from the sky itself; the star is "competing" against this background, and that limits the depth of viewing. For this reason, astronomers go to great lengths (and expense) to locate their major telescopes far from city lights, such as in Chile's Atacama Desert. Notice also in the list that the apparent brightness limit for the Hubble Space Telescope is two times better than for much larger ground-based telescopes. How can that be? It's the same reason: sky brightness. In space there is no airglow, and the sky approaches true black. This lets the Hubble Space Telescope make images ten billion times deeper than most people can see with their eyes!

Can the relative brightness of objects be used to estimate the distance to the nearest stars? Yes, if we make the bold assumption that the brightest stars are just like the Sun (this would be like assuming that the flashlight and the streetlight are intrinsically the same brightness). This is equivalent to assuming that stars emit the same number of photons per second as the Sun, so that the difference in apparent brightness is a measure of how the photons thin out with increasing distance. This assumption also implies that the stars with the highest apparent brightness are the nearest. By the inverse square law, the brightest few stars must be about √(10¹⁰) ≈ 100,000 times farther away than the Sun. This is a distance of 1.5 × 10⁸ km × 10⁵ ≈ 1.5 × 10¹³ kilometers, or about half a parsec.
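Written out, under the stated (and bold) assumption that the brightest stars are intrinsically identical to the Sun, the inverse square law gives

\[
\frac{b_\odot}{b_*} = \left(\frac{d_*}{d_\odot}\right)^{2}
\quad\Rightarrow\quad
d_* = d_\odot\,\sqrt{\frac{b_\odot}{b_*}}
\approx (1.5\times10^{8}\ \mathrm{km})\times\sqrt{1.1\times10^{10}}
\approx 1.6\times10^{13}\ \mathrm{km},
\]

where \(b_\odot/b_* \approx 4\times10^{10}/3.6\) is the ratio of the Sun's apparent brightness to that of Sirius and \(d_\odot\) is the Earth–Sun distance.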

The 6000 stars visible with the naked eye span a brightness range of 3.6 / 0.0025, or a factor of about 1440. If we assume that they are also like the Sun, then we infer a distance range spanning a factor of √1440 ≈ 40 for the stars you can see in the night sky. We can also relate star brightness to a more familiar terrestrial object — a light bulb. From the list of ratios, we can calculate that the Sun is like a 100-Watt light bulb seen at a distance of √(27,700 / 4 × 10¹⁰) × 100 ≈ 0.08 meters, or about 3 inches. (Don't try this; it will hurt your eyes just as staring at the Sun would!) On the other hand, the brightest star is like a 100-Watt light bulb seen at a distance of √(27,700 / 3.6) × 100 ≈ 8770 meters. This is like looking at a reading light in a house over five miles away. This gives a sense of the enormous range in brightness between our star and all the others.
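The light-bulb comparison is just another application of the inverse square law. Here is a minimal sketch in Python of the same arithmetic, scaling the 100-Watt-bulb-at-100-meters benchmark from the list until it matches a chosen brightness:

```python
import math

# The bulb at 100 m is 27,700 times brighter than Vega (from the list).
# By the inverse square law, matching a target Vega-relative brightness b
# means moving the bulb to distance d = 100 m * sqrt(27_700 / b).
def bulb_distance_m(ratio_to_vega):
    return 100 * math.sqrt(27_700 / ratio_to_vega)

print(bulb_distance_m(4e10))  # ~0.08 m: the Sun looks like the bulb about 3 inches away
print(bulb_distance_m(3.6))   # ~8770 m: Sirius looks like the bulb over 5 miles away
```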

It is easy to understand relative brightness using a linear scale. However, astronomers use a relative brightness scale based on logarithms. Why do astronomers use a different system? They are victims of history. When Hipparchus cataloged 1200 stars in about 130 B.C., he ranked their apparent brightness on a magnitude scale of 1 to 6, with 1st-magnitude stars the brightest and 6th-magnitude stars the faintest visible to the naked eye. Viewed with the naked eye, stars could only be classified into six gradations of brightness. A difference of one magnitude corresponds to a factor of roughly 2.5 in apparent brightness, and a difference of five magnitudes corresponds to a factor of 100. The scale is reversed: higher numbers represent fainter objects. Vega defines the zero point of the magnitude scale. Hipparchus invented this nonlinear scale to match the response of the eye. The eye is a logarithmic detector, which means it can accommodate a huge range of brightness. Evolution equipped us with sensors that let us see in bright sunlight but also let us hunt at night. (A more familiar example is the loudness of a sound, which is quoted in units of decibels, also a logarithmic scale. The decibel unit must be tied to a known sound intensity measured at a fixed distance, since loudness depends on how far you are from a sound.) The 2100-year-old magnitude system is so ingrained that astronomers continue to use it.
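In modern practice the magnitude scale is pinned down by making each magnitude step a factor of \(100^{1/5} \approx 2.512\) in brightness, with Vega at magnitude 0. A minimal sketch in Python, using the Vega-relative ratios from the list above:

```python
import math

# Each magnitude step is a factor of 100**(1/5) ~ 2.512 in brightness;
# Vega defines magnitude 0, and fainter objects get larger magnitudes.
def apparent_magnitude(ratio_to_vega):
    return -2.5 * math.log10(ratio_to_vega)

print(apparent_magnitude(3.6))     # ~ -1.4: Sirius
print(apparent_magnitude(4e10))    # ~ -26.5: the Sun
print(apparent_magnitude(0.0025))  # ~ +6.5: rural naked-eye limit, close to Hipparchus's 6th magnitude
```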

To be meaningful, apparent brightness must be specified at a particular wavelength. Stars have different colors, which means that the apparent brightness depends on the wavelength. Also, light detectors (the eye, photographic films, and electronic CCDs) have different sensitivities to particular colors or wavelengths. For this reason, astronomers specify exactly what wavelength any set of measurements refers to. Standards have been derived to express apparent brightness measured in blue light, red light, infrared light, radio waves, X-rays, and so on. Here we will usually be referring to a system having the same color sensitivity as the human eye, which is sometimes called visual apparent brightness (corresponding to a range of wavelengths centered on the green part of the visible spectrum).