The brightness of a star depends on two factors:
1- The star's distance from Earth.
2- The amount of energy the star emits.
A glance at the night sky above Earth shows that some stars are much brighter than others. How bright a star appears, however, depends both on how much energy it radiates and on how far it is from the planet.
Astronomers define star brightness in terms of apparent magnitude (how bright the star appears from Earth) and absolute magnitude (how bright it would appear from a standard distance of 32.6 light-years, or 10 parsecs). A light-year is the distance light travels in one year, about 6 trillion miles (10 trillion kilometers). Astronomers also measure luminosity, the total amount of energy (light) that a star emits from its surface.
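Defining absolute magnitude as brightness at a standard 10 parsecs leads to the textbook distance-modulus relation, M = m - 5 log10(d / 10), with d in parsecs. The formula and the example values below are not from the article; they are a minimal Python sketch of that standard conversion.

```python
import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """Absolute magnitude from apparent magnitude and distance,
    using the standard distance-modulus relation:
    M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

# Example (illustrative values): the Sun has an apparent magnitude of
# about -26.7 and sits about 1/206,265 of a parsec from Earth, which
# gives an absolute magnitude of roughly +4.8.
print(absolute_magnitude(-26.7, 1 / 206265))
```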
Measuring star brightness is an ancient idea, but today astronomers use far more precise tools to make the measurement.
From Greek to modern times
More than 2,000 years ago, the Greek astronomer Hipparchus was the first to catalog stars according to their brightness, according to Dave Rothstein, an astronomer who answered questions for Cornell University's "Ask An Astronomer" website in 2003.
"Basically, he looked at the stars in the sky and classified them by how bright they appear — the brightest stars were 'magnitude 1,' the next brightest were 'magnitude 2,' etc., down to 'magnitude 6,' which were the faintest stars he could see," Rothstein wrote.
Human eyes, however, are not very discerning. Large differences in brightness actually appear much smaller on this scale, Rothstein said. Light-sensitive charge-coupled devices (CCDs) inside digital cameras measure the amount of light coming from stars and can measure brightness far more precisely.
On the modern version of this scale, astronomers define a difference of five magnitudes as a brightness ratio of exactly 100. Vega was used as the reference star for the scale. Initially it had a magnitude of 0, but more precise instrumentation has since shifted that value to about 0.03.
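Because five magnitudes correspond to a factor of 100 in brightness, each single magnitude step works out to a factor of 100^(1/5), or about 2.512. The helper below is not from the article, just a small Python sketch of that consequence of the modern definition.

```python
def brightness_ratio(mag_faint, mag_bright):
    """Brightness ratio implied by a magnitude difference, using the
    modern convention that 5 magnitudes equal a factor of 100
    (so 1 magnitude is a factor of 100 ** 0.2, about 2.512)."""
    return 100 ** ((mag_faint - mag_bright) / 5)

# Hipparchus's faintest naked-eye stars (magnitude 6) versus his
# brightest (magnitude 1): a factor of 100 in brightness.
print(brightness_ratio(6, 1))

# One magnitude of difference: roughly a factor of 2.512.
print(brightness_ratio(2, 1))
```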