
Stellar Magnitudes

The Nature Of The Magnitude Scale



In 1856, the British scientist N. R. Pogson noticed that Hipparchus' 6th class stars were roughly 100 times fainter than his 1st class stars. Pogson did the sensible thing: he redefined the magnitude scale so that a difference of five magnitudes was exactly a factor of 100 in brightness. This meant that a star with V = 1.00 appeared precisely 100 times brighter than a star with V = 6.00. One magnitude is then a factor of about 2.512 in brightness. Try it: enter 1 on a calculator and multiply it by 2.512 five times; you've just gone from first magnitude to sixth (or eighth to thirteenth, or minus seventh to minus second, or any other combination).
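
The same arithmetic can be written as a single relation: a magnitude difference of delta_mag corresponds to a brightness factor of 100 raised to the power (delta_mag / 5). Here is a minimal sketch in Python (the function name brightness_ratio is illustrative, not from the article):

    def brightness_ratio(delta_mag):
        """Brightness factor corresponding to a magnitude difference delta_mag."""
        # Pogson's definition: five magnitudes is exactly a factor of 100.
        return 100 ** (delta_mag / 5)

    print(brightness_ratio(1))   # about 2.512 -- one magnitude
    print(brightness_ratio(5))   # exactly 100.0 -- e.g. first magnitude to sixth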



Looking back to the numbers above, we see that the Sun (V = -26.75) has an apparent visual brightness about 25 magnitudes greater than Sirius. That's five factors of 100, one for each step of five magnitudes, or 100 × 100 × 100 × 100 × 100: ten billion! And the difference in apparent brightness between the Sun and the faintest object humans have ever seen (using the Hubble Space Telescope) is more than 56 magnitudes, a factor of more than ten billion trillion.
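
Using the same sketch as above (again, an illustrative helper, with the magnitude differences quoted in the text), these two comparisons can be checked directly:

    def brightness_ratio(delta_mag):
        # Pogson's definition: five magnitudes is exactly a factor of 100.
        return 100 ** (delta_mag / 5)

    # Sun vs. Sirius: about 25 magnitudes
    print(brightness_ratio(25))  # 1e10 -- ten billion

    # Sun vs. the faintest object seen with Hubble: more than 56 magnitudes
    print(brightness_ratio(56))  # about 2.5e22 -- more than ten billion trillion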

