Star light, star bright
First star I see tonight...
Most of us remember this old nursery rhyme, but there is a bit of science behind it. The brightest stars in the sky will always be the first stars we see in the gloaming of twilight. And the measure of a star's brightness is expressed scientifically by referring to its magnitude.
In the 2nd Century BC, the Greek astronomer Hipparchus described the brightest stars in the sky as being of the first magnitude, the next brightest group were referred to as being of the second magnitude, and so on until he reached stars of the sixth magnitude, which were the faintest visible to the naked eye. Hipparchus appears to have had an ulterior motive for doing all this work. He had discovered a 'new star' in the constellation Scorpius, but couldn't be completely sure if it was a new discovery, since there was no standardised way of describing stars. So not only did he devise a method of describing brightness, he also came up with a system of latitude and longitude to map stellar positions.
In the mid-1800s, astronomers determined mathematically what old Hipparchus did visually, giving his scale a definable basis. The English astronomer NR Pogson noticed that an average first magnitude star was in fact 100 times as bright as an average sixth magnitude star. Further measurements and calculations showed that each increase of 1 in magnitude corresponds to a 2.51-fold (more precisely, the fifth root of 100) increase in the apparent brightness of a star. Therefore, the equation for apparent magnitude m is:
m = −2.5 log f + constant
where f is the flux from the star.
In plain English, this means that a star of the third magnitude would appear to be 6.31 times as bright as a star of the fifth magnitude, because the difference in apparent magnitude is two and 2.51 × 2.51 ≈ 6.31.
Bright, Really Bright, and Very, Very, Very Bright
| Difference in Magnitude | Factor in Brightness |
| --- | --- |
| 1 mag | 2.51 times |
| 2 mag | 6.31 times |
| 3 mag | 15.85 times |
| 4 mag | 39.81 times |
| 5 mag | 100 times |
| 6 mag | 251 times |
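The factors in the table above all follow from the fifth root of 100. A short sketch of the calculation:

```python
# Each magnitude step is a brightness ratio equal to the fifth root
# of 100 (about 2.512), so that five steps give exactly 100 times.
ratio = 100 ** (1 / 5)

for diff in range(1, 7):
    factor = ratio ** diff
    print(f"{diff} mag -> {factor:.2f} times brighter")
```

This reproduces the table values (2.51, 6.31, 15.85, 39.81, 100 and roughly 251) to two decimal places.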
Incredibly, Hugely, Extremely Bright
Some stars are so bright that they must be assigned negative magnitude values in order for the magnitude six stars to remain as the faintest visible to the naked eye. An example of this is Sirius, which is the brightest star in the night sky, shining at mag −1.4. The Moon and some planets are brighter than this. At its brightest, Venus can shine at mag −4.4. The full moon is mag −12.3 and the Sun is mag −26.8.
The problem with apparent magnitudes is that there is no way to differentiate between a bright star that is a long way away and a dimmer star that is nearer. So astronomers use absolute magnitude M to compare the intrinsic brightness of stars, as suggested by the Danish astronomer E Hertzsprung.
The absolute magnitude of a star is defined as being the magnitude a star would have if it were 10 parsecs away from the Sun.
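Given a star's apparent magnitude and its distance, the absolute magnitude follows from the standard distance-modulus relation, M = m − 5 log₁₀(d/10), with d in parsecs. A minimal sketch, using the text's mag −1.4 for Sirius and an assumed distance of about 2.6 parsecs for illustration:

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    # Magnitude the star would have at the standard distance of 10 parsecs:
    # M = m - 5 * log10(d / 10)
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude -1.4 (from the text); the distance of
# roughly 2.6 parsecs is an assumed figure for this example.
print(round(absolute_magnitude(-1.4, 2.6), 1))  # roughly +1.5
```

Note that a star exactly 10 parsecs away has an absolute magnitude equal to its apparent magnitude, since the logarithm term vanishes.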
Putting it All to Use
For all you backyard astronomers, here is a list of the 20 brightest stars in the sky. Star names in parentheses denote that the star is not visible from mid-latitudes in the northern hemisphere.
Another method astronomers use to classify stars, the Spectral Classification System, involves the electromagnetic spectrum.