Astronomy Jargon 101: Absolute Magnitude

In this series we are exploring the weird and wonderful world of astronomy jargon! You’ll surely measure the awesomeness of today’s topic: absolute magnitude!

Stars in the sky have all sorts of brightnesses. But some stars appear brighter because they’re closer, while some stars appear brighter because they’re…actually brighter. So astronomers invented a system to standardize the description of the brightness of any particular star, using something called absolute magnitude.

To calculate a star’s absolute magnitude, you pretend that you are measuring its brightness (which traces its luminosity, the total radiation output of the star) from a standard distance of 10 parsecs away, with no dust, interstellar gas, or other astronomical gremlins in the way.
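For the numerically inclined, here’s a minimal Python sketch of the standard distance-modulus relation, M = m − 5·log10(d / 10 pc), which turns an apparent magnitude and a distance into an absolute magnitude. The function name and the example values are just for illustration:

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude via the distance modulus:
    M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# Example: the Sun has apparent magnitude -26.74 at 1 AU (about 4.848e-6 pc),
# which works out to an absolute magnitude of roughly +4.8.
print(absolute_magnitude(-26.74, 4.848e-6))
```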

Why 10 parsecs? Well…why not? It had to be something, and “10” seemed like a reasonable number.

So that’s pretty straightforward, but nothing in astronomy is left straightforward for long. First, the absolute magnitude is usually specified in reference to a particular band of wavelengths. That’s because astronomers typically don’t observe and record all the electromagnetic radiation emitted by a star. They only observe through certain filters, or bands of wavelengths, so the absolute magnitude has to be quoted for that band.

Second, absolute magnitudes are measured on a logarithmic scale. A difference of five magnitudes corresponds to a factor of 100 in brightness, so each single magnitude step is a factor of about 2.5.
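To see how quickly that logarithmic scale adds up, here’s a tiny Python snippet (the function name is just for illustration) that converts a magnitude difference into a brightness ratio:

```python
def brightness_ratio(delta_mag):
    """Flux ratio for a given magnitude difference: five magnitudes is a
    factor of 100, so one magnitude is a factor of 100**(1/5), about 2.512."""
    return 100 ** (delta_mag / 5.0)

print(brightness_ratio(1))   # ~2.512
print(brightness_ratio(5))   # 100.0
print(brightness_ratio(10))  # 10000.0
```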

Lastly, these magnitudes are written such that smaller values mean brighter stars. Why? Because some ancient Greek astronomer (Hipparchus, for the record) did it this way and then it stuck. It also means that negative values of magnitude represent some of the brightest stars in the galaxy.

See, I told you astronomers never keep things straightforward.