Stellar magnitude

The apparent magnitude (m) of a star, planet or other celestial body is a measure of its apparent brightness, that is, the amount of light received from the object. Being a hundred times less bright (for example, the same object ten times as far away) corresponds to an apparent magnitude that is five greater; being 2.512 times less bright (the same object about 1.585 times as far away) corresponds to an apparent magnitude that is one greater.


Because the amount of light actually received depends on the thickness of the atmosphere along the line of sight to the object, apparent magnitudes are normalized to the values they would have outside the atmosphere. The dimmer an object appears, the higher its apparent magnitude. Note that apparent brightness is not the same as intrinsic brightness: an extremely luminous object may appear quite dim if it is far away. The rate at which apparent brightness falls off as the distance to an object increases is given by the inverse-square law (at cosmological distance scales this is no longer exactly true, because of the curvature of space). The absolute magnitude, M, of a star or galaxy is the apparent magnitude it would have if it were 10 parsecs away; that of a planet (or other solar-system body) is the apparent magnitude it would have if it were 1 astronomical unit away from both the Sun and the Earth.
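To make the 10-parsec convention concrete, here is a minimal Python sketch (an illustration added here, not part of the original article) that converts an apparent magnitude and a distance into an absolute magnitude; the Sirius figures used are approximate:

    import math

    def absolute_magnitude(apparent_mag, distance_pc):
        # Absolute magnitude: the apparent magnitude the star would have at 10 parsecs,
        # assuming brightness falls off with the inverse-square law.
        return apparent_mag - 5 * math.log10(distance_pc / 10)

    # Sirius: apparent magnitude about -1.46, distance roughly 2.64 parsecs
    print(absolute_magnitude(-1.46, 2.64))  # roughly +1.4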

Scale of apparent magnitudes

App. mag.   Celestial object
−26.8       Sun
−12.6       Full Moon
−4.4        Maximum brightness of Venus
−2.8        Maximum brightness of Mars
−1.5        Brightest star: Sirius
−0.7        Second-brightest star: Canopus
0           The zero point by definition: Vega
+3.0        Faintest stars visible in an urban neighborhood
+6.0        Faintest stars observable with the naked eye
+12.6       Brightest quasar
+30         Faintest objects observable with the Hubble Space Telescope

(See also the list of brightest stars.)


The scale upon which magnitude is measured has its origin in the Hellenistic practice of dividing those stars visible to the naked eye into six magnitudes. The brightest stars were said to be of first magnitude (m = +1), while the faintest were of sixth magnitude (m = +6), the limit of human visual perception (without the aid of a telescope). This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to have originated with Hipparchus. This original system did not measure the magnitude of the Sun. Because the response of the eye to light is logarithmic, the resulting scale is also logarithmic.


In 1856, Pogson formalized the system by defining a typical first-magnitude star as one that is 100 times brighter than a typical sixth-magnitude star; thus a first-magnitude star is about 2.512 times brighter than a second-magnitude star. The fifth root of 100, an irrational number approximately equal to 2.512, is known as Pogson's ratio. Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they switched to Vega as the standard reference star. Nowadays the Sun is used as the reference (its absolute magnitude is +4.83 in the V band and +5.48 in the B band).


The modern system is no longer limited to six magnitudes. Very bright objects have negative magnitudes. For example, Sirius, the brightest star in the night sky, has an apparent magnitude of −1.44 to −1.46. The modern scale also includes the Moon and the Sun: the full Moon has an apparent magnitude of −12.6 and the Sun has an apparent magnitude of −26.8. The Hubble and Keck telescopes have detected stars with magnitudes of about +30.


The apparent magnitude m_x in the band x can be defined as

    m_x = −2.5 log10(F_x) + C_x

where F_x is the observed flux in the band x, and C_x is a constant that depends on the units of the flux and the band.
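As an illustration of this definition, here is a minimal Python sketch; the zero-point constant C_X below is a placeholder, since its actual value depends on the band and on the flux units:

    import math

    C_X = 0.0  # placeholder zero-point constant for band x (depends on band and flux units)

    def apparent_magnitude(flux_x):
        # Apparent magnitude in band x from the observed flux F_x in that band.
        return -2.5 * math.log10(flux_x) + C_X

    # A source delivering 100 times less flux comes out 5 magnitudes fainter:
    print(apparent_magnitude(1.0) - apparent_magnitude(100.0))  # 5.0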


Notice that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... The logarithmic nature of the scale reflects the fact that the human eye itself has a roughly logarithmic response; see the Weber–Fechner law.
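The conversion from a magnitude difference to a brightness ratio can be sketched in a few lines of Python (again, an illustration rather than part of the original text):

    def brightness_ratio(delta_m):
        # Ratio of brightnesses for two objects whose magnitudes differ by delta_m.
        # Equivalent to raising Pogson's ratio (100 ** 0.2) to the power delta_m.
        return 100 ** (delta_m / 5)

    print(brightness_ratio(3.2))  # about 19.05, the example above
    print(brightness_ratio(5.0))  # 100.0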


Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range). The V band was chosen so that it gives magnitudes closely corresponding to those seen by the human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, also called visual magnitude.


Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and ultraviolet regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars would have a UBV magnitude of well over 100, since they emit extremely little visible light but are strongest in the infrared.


Magnitude is a minefield, and it is extremely important to measure like with like. On photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant Betelgeuse are reversed compared with what our eyes perceive, because film is more sensitive to blue light than to red light.


For an object with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object.
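This rule can be checked with a short Python sketch (an illustration under the same inverse-square assumption as above):

    import math

    def apparent_from_absolute(absolute_mag, distance_pc):
        # Apparent magnitude of an object with the given absolute magnitude,
        # seen from distance_pc parsecs.
        return absolute_mag + 5 * math.log10(distance_pc / 10)

    for d in (10, 100, 1000):
        print(d, apparent_from_absolute(1.4, d))  # 1.4 at 10 pc, 6.4 at 100 pc, 11.4 at 1000 pc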

