Brightness is the attribute of visual perception by which a source appears to radiate or reflect light. In physics, brightness is usually quantified by luminosity, the total power an object emits; in astronomy, however, the traditional measurement tool is magnitude.
Magnitude has been used in astronomy for over 2,000 years. Hipparchus was the first astronomer to use the term, classifying stars by how bright they appeared: the brightest were magnitude 1, and the faintest he could see were magnitude 6. Every object has both an apparent magnitude and an absolute magnitude.
The apparent magnitude (m) of a celestial body is a measure of its brightness as seen by an observer on Earth, adjusted to the value it would have in the absence of the atmosphere. The brighter the object appears, the lower its apparent magnitude.
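The scale is logarithmic: in the modern definition, a difference of five magnitudes corresponds to a brightness ratio of exactly 100, so two objects with observed brightnesses (fluxes) F1 and F2 differ in apparent magnitude by

m1 − m2 = −2.5 × log10(F1 / F2)

In other words, a magnitude 1 star is about 2.512 times brighter than a magnitude 2 star.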
Absolute magnitude measures a celestial object’s intrinsic brightness. It is calculated from the object’s observed apparent magnitude by correcting for its distance from the observer: the absolute magnitude is the apparent magnitude the object would have at a standard distance (10 parsecs for stars, or 1 astronomical unit for Solar System bodies), in the absence of astronomical extinction. This allows the true brightnesses of objects to be compared without regard to distance.
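For stars, that correction for distance can be written as the distance modulus, where M is the absolute magnitude, m the apparent magnitude, and d the distance in parsecs:

M = m − 5 × log10(d / 10 pc)

For example, a star 100 parsecs away appears 100 times fainter than it would at the standard 10 parsecs, so its absolute magnitude is 5 magnitudes brighter (numerically smaller) than its apparent magnitude.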
With the invention of photometers, astronomers could measure apparent and absolute magnitudes far more precisely, classifying the brightness of celestial objects without relying on the human eye, which is easily fooled by its greater sensitivity to certain parts of the light spectrum.
We’ve done many episodes of Astronomy Cast about stars. Listen here, Episode 12: Where Do Baby Stars Come From?