You're welcome, Celestron.
But it turns out the moon is pretty dim, too, when considered from another perspective (no, not the dark side). So why does the moon shine in the first place? Like anything with a temperature, it emits thermal radiation, but the vast majority of that radiation is not in the visible range. Instead, of course, the moon borrows its light from the sun, reflecting it back toward us.
Naively, then, you might expect the moon would be roughly the same brightness as the sun. And when you look at a full moon hovering imperiously in the night, washing out all the stars in the sky, it does seem darn bright. However, our eyes (and the rest of our senses) are pretty terrible at discerning objective levels of radiant power. The moon is bright only relative to the sky and the stars. In astronomical terms, the sun is much, much more luminous than the moon.
Measured with fancy equipment, the apparent magnitude of the sun is about -27, while the apparent magnitude of the moon is roughly -13. If you remember from my nerdrage over Star Wars, larger magnitudes are dimmer, the visible stars are around magnitudes 1-6, and the scale is not linear. From this we can tell that the sun is way brighter than the moon, the moon is way brighter than the stars, and astronomers use a needlessly cumbersome system for quantifying brightness.
If you do the math, 10^(14/2.5), a magnitude difference of 14 is about a factor of 400,000 in brightness. Yes, objectively, the sun is 400,000 times brighter than the moon (as seen from Earth). So when the moon shines its paltry reflected sunlight back at you, what happens to the other 99.99975% of the light? How do we go from a sun's worth of light to one moon unit (a Zappa)?
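If you want to check that arithmetic yourself, here's the one-liner. The only assumption is the standard definition of the magnitude scale, where 5 steps equal a factor of 100:

```python
# Brightness ratio implied by a magnitude difference of 14.
# The magnitude scale is defined so 5 steps = a factor of 100,
# i.e. one step = 100**(1/5) = 10**(1/2.5).
ratio = 10 ** (14 / 2.5)
print(f"{ratio:,.0f}")  # ~398,107, i.e. roughly 400,000
```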
It's at this point you may recall that different objects reflect and absorb different amounts of light. That's why color exists, after all. You can also measure an overall amount of reflectivity, which gets called albedo. The Bond albedo of an object is just the percentage of light that is reflected rather than absorbed. Freshly fallen snow has an albedo as high as 0.9, whereas asphalt can be as low as 0.04. The moon's average albedo is 0.12, which means 88% of the sun's light is absorbed. But 88% is not 99.99975%. From albedo considerations alone, the moon is still too bright by a factor of 48,000. How does the moon get rid of the rest of its sunlight?
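A quick sanity check on that leftover factor:

```python
# Start from the ~400,000x sun/moon brightness ratio and apply
# the moon's average albedo of 0.12.
sun_over_moon = 400_000
albedo = 0.12
print(sun_over_moon * albedo)  # 48000.0: the factor still unaccounted for
```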
The problem is that we're thinking of the moon as a giant, flat mirror directly reflecting the sun's light toward us. But the moon is not a mirror. You can tell this because it doesn't look like the sun. A mirror exhibits specular reflection, which means incoming light bounces off cleanly at a particular angle. If it comes in 30° one way, it bounces off 30° the other way. And since all the light bounces in the same way, mirrors reproduce an image of what’s reflecting off of them.
Ignore everything about this picture that is ridiculous.
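For the record, the mirror-bounce rule is easy to write down. Here's a minimal sketch in Python; there's nothing moon-specific in it, just the standard vector formula for a specular reflection:

```python
import numpy as np

def reflect(direction, normal):
    """Specular reflection: the outgoing ray leaves at the same angle
    it came in, mirrored across the surface normal."""
    normal = normal / np.linalg.norm(normal)
    return direction - 2 * np.dot(direction, normal) * normal

# A ray coming in 30 degrees off vertical bounces out 30 degrees the other way.
incoming = np.array([np.sin(np.radians(30)), -np.cos(np.radians(30)), 0.0])
print(reflect(incoming, np.array([0.0, 1.0, 0.0])))  # x unchanged, y flipped
```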
I admit I struggled with this problem for a bit before finding a suitable answer. Here's what I did to solve it. How do you account for a factor like 48,000? Well, let's compare some relevant numbers. The moon is 384,400 km away from us on average. Its radius is 1,737 km. The Earth's radius is 6,400 km. The distance from the sun to us is 150,000,000 km. Hmm, I can’t think of anything else that might be important.
The distance from the sun can't matter, because we're dealing with the apparent brightness of the sun, which is how bright it looks to us from here on Earth. Distance already factors into the 400,000 figure. The Earth's radius can't matter, because we're talking about how bright the moon is to our eyes. If the Earth were the size of a pin (and we were still the same distance from the moon), it wouldn't affect the light that hits our eyes. So the only two numbers that can matter are the moon's radius and its distance from us.
Well, what's 384,400/1,737? About 221. That number doesn't look like much on its own, but if we square it, we get about 49,000. That's very close, within a few percent, of our factor of 48,000 (which is a heavily rounded figure). Okay, but why does squaring matter?*
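In Python, with the numbers from above:

```python
distance_km = 384_400  # average Earth-moon distance
radius_km = 1_737      # the moon's radius
ratio = distance_km / radius_km
print(round(ratio), round(ratio ** 2))  # 221 and 48,974: right next to 48,000
```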
In the illustration above, we're thinking that the moon intercepts the sun's light and shines this perfect sun laser back at us. If that's the case, then we are hit by a circle of light with the area of the moon's disc. The area of a circle is πr². (I told you π was involved.) If the above relation is valid, then we are really being hit by a circle of light whose radius is the distance between the Earth and the moon. How could that be? Imagine that instead of the moonlight bouncing straight back at us, it spreads out in a cone, with the angle between the edge of the cone and the line connecting the Earth and the moon being 45°.
Jobs I won't get upon completion of my degree include: NASA artist
Why would the light reflect back that way? It probably doesn't, exactly. The process by which the moon reflects light is complicated and is modeled with something called a bidirectional reflectance distribution function. But the opposition effect means a full moon tends to reflect light directly back, so everything coming back at an angle of 45° or less seems reasonable. We're also ignoring for the moment that the moon is not a point source, so that right circular cone probably looks different at other latitudes. On average, though, it works out to produce the above picture.
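To see how the 45° cone gets you the factor we need, here's the geometry in a few lines. Keep in mind the 45° half-angle is my guess from the picture above, not a measured property of the moon:

```python
import math

# Hypothetical 45-degree cone of reflected moonlight: by the time it
# reaches Earth, the light covers a disc whose radius equals the
# Earth-moon distance rather than the moon's radius.
distance_km = 384_400
moon_radius_km = 1_737
spot_radius_km = distance_km * math.tan(math.radians(45))  # = distance_km
dilution = (spot_radius_km / moon_radius_km) ** 2           # ratio of disc areas
print(f"{dilution:,.0f}")  # ~48,974, within shouting distance of 48,000
```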
Anyway, that's probably enough MS Paint illustration from me for one blog post. Also, this is a reasonable length, so I better stop now before things get out of hand.
*Update: My solution to the problem posed in this post is almost certainly wrong. I believe I was right about the square relation between the moon's radius and distance from us, but wrong about why that relationship is important. That's the tricky thing about proportionality arguments: without constants, you can fool yourself about what you're talking about. Anyway, I think I've figured out the real answer.
So one of the issues that bothered me about my solution is that it relies on the moon being this weird, hard-to-study surface, yet gives you an answer with a simple and neat geometric interpretation. That seemed unlikely, but the math worked, so I accepted the answer anyway. But it turns out that the moon's surface is both harder and easier to analyze than I realized. Before I get to that, however, there's another important issue.
When I first considered this problem, I assumed the answer was that the inverse square law causes the light reflected from the moon to diminish so that it is less luminous than the sun. But after some thought, that didn't seem plausible. You see, when the sun's light travels to us, it loses some intensity because of the inverse square law, just like gravity gets weaker with distance.
For a full moon, however, that light travels the extra distance from the Earth to the moon and back again. But the distance to the sun is 150,000,000 km, and the distance to the moon is 384,400 km, which means the additional distance traveled is only about 0.5% more, and that only loses you about 1% of your intensity from the inverse square law, not the factor of 48,000 we needed. So I figured that couldn't be the answer.
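Here's that naive check as code:

```python
# How much intensity does the extra round trip to the moon cost,
# if the inverse square law were the whole story?
sun_earth_km = 150_000_000
earth_moon_km = 384_400
extra = 2 * earth_moon_km / sun_earth_km   # fractional extra distance
loss = 1 - (1 + extra) ** -2               # inverse square falloff
print(f"{extra:.2%} more distance, {loss:.2%} less intensity")
# ~0.51% more distance, ~1.02% less intensity: nowhere near 48,000x
```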
What I was failing to consider, however, was that light reflecting off of the moon changes the applicability of the inverse square law. The inverse square law isn't mysterious. Rather, it's a consequence of geometry in a 3-dimensional world. If an object emits light radially from a point source, then at any given distance from the source, the light will be spread out on a spherical shell around the source. As the distance grows, the light falls off with the square of the distance, because the surface area of a sphere is 4πr².
But any real emitter is not actually a point source. The sun radiates the light we see from its surface, which is (almost perfectly) spherical. All this means, however, is that there is some defined power at the surface, and we can imagine that power increasing to infinity as we dip below that surface toward a point. But here's the key: the power radiated per unit area has some value at 1 radius out (the surface), and that power drops to 1/4 its original value at 2 radii, 1/9 its original value at 3 radii, and so on. Note that this exactly mirrors (ha) my original answer. At 221 moon radii (384,400/1,737), the power has been reduced by a factor of 221² ≈ 49,000.
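As a sketch, measuring distance in units of the emitter's own radius:

```python
# Power per unit area for a radial (spherical) emitter, relative to
# its value at the emitting surface; r is measured in surface radii.
def relative_intensity(r_in_radii: float) -> float:
    return 1 / r_in_radii ** 2

print(relative_intensity(2))                 # 0.25, i.e. 1/4 at two radii out
print(relative_intensity(384_400 / 1_737))   # ~1/49,000 at Earth's distance
```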
For this answer to apply, however, the light reflected from the moon has to be emitted radially (from the half that is facing the sun, anyway), which seemed implausible to me in the beginning given how complicated the moon's regolith is supposed to be. But it turns out that if you assume the moon is an ideal diffuse reflecting surface, then radial emission is what happens.
For a specular reflecting surface, the incident angle of the light exactly determines the angle of reflection. But for an ideal diffuse surface (a Lambertian reflector), the incident angle is not important at all, and the light reflects in a random direction. If the light reflects entirely randomly, then on average the angle of reflection will be exactly perpendicular to the surface, because any angle away from perpendicular will be balanced out. So on average, a diffuse reflector looks like a radial emitter and follows the inverse square law.
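You can watch the cancellation happen with a quick simulation. I'm sampling directions uniformly over the hemisphere, which is the "entirely random" picture above; a true Lambertian surface weights directions by a cosine factor, but the sideways parts cancel either way:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Directions sampled uniformly over the hemisphere above a surface
# whose normal points along +z.
phi = rng.uniform(0, 2 * np.pi, n)
cos_theta = rng.uniform(0, 1, n)  # uniform in cos(theta) -> uniform on hemisphere
sin_theta = np.sqrt(1 - cos_theta ** 2)
dirs = np.column_stack([sin_theta * np.cos(phi),
                        sin_theta * np.sin(phi),
                        cos_theta])
print(dirs.mean(axis=0))  # ~[0, 0, 0.5]: tangential parts cancel, the normal survives
```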
The complicated surface of the moon, with its opposition effect, means that the "on average" part up there is not strictly speaking true, but it apparently doesn't have enough of an effect to eliminate the approximately true inverse square relation that shows up. As for why radial emission from the moon drops off more quickly than radial emission from the sun: a radial emitter with the sun's apparent brightness at 1 lunar radius is actually a weaker source than one with the same apparent brightness at 1 solar radius. If you expand the lunar emitter to the size of the solar emitter, then your power per area is reduced accordingly and you have a dimmer surface, so of course its power will fall off more quickly than the solar emitter's.
Well, anyway, so much for this post being a reasonable length.