One of the most obvious things about looking at stars in the sky is that they're not all the same brightness. A handful are so bright that you can easily see them even in a big city's washed-out sky, while others are so faint that they're invisible unless you're stargazing on a moonless night from an essentially light-pollution-free locale (if you can find one). This varying visibility of stars is so obvious you may not have given it much thought.

Astronomers, however, think about it a lot. And astronomers, being scientists, decided they had to quantify it; in other words, throw math at it. The first person we know to have done this was the Greek polymath Hipparchus, who created a star map noting the brightness of various stars more than two millennia ago. A few centuries later, another Greek astronomer, Ptolemy, classified stars using a six-tier scale, assigning the brightest stars to the first tier and the faintest ones to the sixth. This was the true origin of the magnitude scale, which astronomers still use today.

Outside of astronomy, however, the magnitude scale sees little use, perhaps because it's confusingly nonlinear! Although this logarithmic scaling (in which each tier is a multiplicative factor fainter than the one preceding it) may be counterintuitive, it is actually quite convenient: a linear scale encompassing the enormous range of stellar brightness would require far too many tiers to be useful. The magnitude scale's multiplicative factor is about 2.512, the fifth root of 100. So a magnitude 1.0 star is about 2.512 times brighter than a magnitude 2.0 star, and so on. In other words, a magnitude 1.0 star is not six times brighter than a magnitude 6.0 star but rather brighter by a factor of 100 (five tiers of 2.512 each, and 2.512^5 ≈ 100).
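If you'd like to play with the math yourself, here's a minimal Python sketch using the standard relation described above, ratio = 100^(Δm/5). The `brightness_ratio` helper is just my own name for it, not anything from the article:

```python
# Brightness ratio between two stars, from their magnitudes.
# Each magnitude step is a factor of 100**(1/5) ~= 2.512, so a
# difference of 5 magnitudes is exactly a factor of 100.

def brightness_ratio(mag_faint: float, mag_bright: float) -> float:
    """Return how many times brighter the brighter star is."""
    delta_m = mag_faint - mag_bright
    return 100 ** (delta_m / 5)

print(brightness_ratio(2.0, 1.0))  # ~2.512: one magnitude step
print(brightness_ratio(6.0, 1.0))  # 100.0: five magnitude steps
```

Note how the faintest naked-eye tier (6.0) comes out exactly 100 times fainter than the brightest (1.0), which is why the nonlinearity trips people up.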