Astronomical magnitude is an indicator of the brightness of a star (or other astronomical object), originally based on distinguishing brightness by eye, with magnitude 1 for a very bright star, 2 for somewhat fainter, and so forth. In the 19th century it was formalized as a logarithmic scale, essentially -2.5 times the base-10 logarithm of the flux density. Some magnitudes (as viewed from Earth):

Sun: about -26.7
full Moon: about -12.7
Sirius: -1.46
Vega: about 0
faintest stars visible to the naked eye: about +6
Relative magnitude of two objects:
m1 - m2 = -2.5 log10(I1/I2)
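A minimal sketch of this arithmetic in Python (the function names are illustrative, not from any standard library):

```python
import math

def magnitude_difference(i1, i2):
    """Magnitude difference m1 - m2 implied by two measured fluxes I1, I2."""
    return -2.5 * math.log10(i1 / i2)

def flux_ratio(m1, m2):
    """Flux ratio I1/I2 implied by two magnitudes (inverting the formula)."""
    return 10 ** (-0.4 * (m1 - m2))

# A difference of 5 magnitudes corresponds to a factor of exactly 100 in flux:
print(flux_ratio(1.0, 6.0))                # 100.0
print(magnitude_difference(100.0, 1.0))    # -5.0
```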
Traditionally, the (logarithmic) magnitudes have been scaled so that Vega's magnitude is zero (the Vega system), but current scale specifications define a specific flux density as zero, one reason being that Vega, like many stars, is slightly variable. Zero is now taken either as an average of Vega's magnitude over time, or as a fixed brightness, generally that defined by the AB system.

The above examples are apparent magnitudes, the magnitude as viewed from Earth, which is the typical meaning of magnitude when the term is unqualified. Absolute magnitude is defined as the apparent magnitude a star would have if it were 10 parsecs from Earth (the Sun's is +4.83). Bolometric magnitude is specifically a magnitude including all EMR wavelengths, i.e., including radio, infrared, etc. Magnitudes of stars are often cited for specific passbands, or as differences between passbands (color indices). With no such qualification or context, it is best to assume magnitude means the apparent visual magnitude, an estimate of what the eye would see, using the V band.
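The absolute-magnitude definition amounts to the distance modulus relation m - M = 5 log10(d / 10 pc). A minimal sketch, assuming the distance is given in parsecs (the function name is illustrative):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs,
    via the distance modulus m - M = 5 log10(d / 10 pc)."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

# The Sun: apparent magnitude -26.74 at 1 AU, i.e., 1/206265 parsec.
print(round(absolute_magnitude(-26.74, 1.0 / 206265.0), 2))   # ~4.83
```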
Magnitudes are cited for stars and other point sources, and also for extended sources, in which case they can refer to the radiative flux from the entire source (as cited above for the Sun: all the light from the object reaching us). A surface brightness (of the entire source or of some portion of it) can be cited as a magnitude per unit solid angle, commonly magnitudes per square arcsecond, which is often used in describing galaxies.
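A sketch of the mean-surface-brightness arithmetic, assuming a source of uniform brightness and an area expressed in square arcseconds (the function name is illustrative): dividing the total flux over the area adds 2.5 log10(area) to the integrated magnitude.

```python
import math

def mean_surface_brightness(total_mag, area_arcsec2):
    """Mean surface brightness (mag per square arcsecond) of a source with
    the given total (integrated) magnitude spread over area_arcsec2."""
    return total_mag + 2.5 * math.log10(area_arcsec2)

# A galaxy of integrated magnitude 10 covering 1000 square arcseconds:
print(round(mean_surface_brightness(10.0, 1000.0), 2))   # 17.5 mag/arcsec^2
```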