Again, it’s been a while since my last post. Stuffing linear algebra into a 5-week course (with a not terribly awesome professor) turned out to be hard. But I said I would have a post about the class, and last night provided the nth example of what I wanted to write about.
You see, linear algebra is no ordinary math class, despite the deceptively benign name. Our professor told us we’d actually have to think in his class because linear algebra is the first math course where you have to prove things. That’s true enough, but it’s not really what sets linear algebra apart from other math.
The reason is that linear algebra is the first math course seemingly dedicated to the task of teaching you what you’re really doing in math. We kind of take it for granted that our teachers lie to us. They don’t (usually) do so maliciously; they do so because the “true” answer is significantly more complex than the “false” answer. But it happens pretty frequently that the lies teachers tell us are good enough.
Regardless, there comes a point at which you’re deemed capable of handling the truth. Now, there are plenty of times earlier in math when this happens. I think the most basic example is subtraction. You learn it shortly after you learn about addition, and you’re told that unlike addition, subtraction is not commutative: 10 - 5 is not the same as 5 - 10. But it turns out that whenever you’re subtracting, what you’re “really” doing is adding a negative. 10 - 5 becomes 10 + (-5). And then the operation is commutative after all, because 10 + (-5) is exactly the same as (-5) + 10.
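The subtraction trick is easy enough to check mechanically; a minimal sketch in Python (the variable names are just for illustration):

```python
# Rewriting subtraction as adding a negative: a - b is "really" a + (-b).
# Once everything is addition, order no longer matters.
a, b = 10, 5
assert a - b == a + (-b)       # 10 - 5 becomes 10 + (-5)
assert a + (-b) == (-b) + a    # addition is commutative: both equal 5
```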
But linear algebra extends this drawing back of the curtain to many more ideas. In linear algebra, you learn that the vector dot product is really just a special case of a more general operation called the inner product. You learn that vectors themselves are really just objects that obey a particular set of rules (the vector space axioms) and don’t necessarily have anything to do with directed line segments. You learn that functions are really just mappings from one set to another.
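To make the dot-product point concrete, here’s a sketch (the weighted inner product below is my own illustrative example, not something from the course described):

```python
# The ordinary dot product on R^n...
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# ...is the special case of a weighted inner product
# <u, v> = sum(w_i * u_i * v_i) with every weight w_i = 1.
# Any strictly positive weights give another valid inner product.
def weighted_inner(u, v, w):
    return sum(wi * x * y for wi, x, y in zip(w, u, v))

u, v = [1, 2, 3], [4, 5, 6]
print(dot(u, v))                        # 32
print(weighted_inner(u, v, [1, 1, 1]))  # 32: the dot product recovered
print(weighted_inner(u, v, [2, 1, 1]))  # 36: a different inner product
```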
At this point you might be wondering what “really” really means. Is there some true math underlying the universe that we simple humans are merely discovering, or are we just peeling back the layers of logic upon which our peculiar brand of mathematics is built? Since the 1600s, when calculus was discovered/invented, mathematics has proven enormously successful at describing the real world.
Many have seen this as distinct evidence that math is something real and not just a human construct. Math that was once thought to be purely theoretical has turned out to have a physical basis. So when a mathematician writes down a law no one has written down before, has that mathematician discovered the law or invented it?
Others do not hold math in such reverence, choosing to believe instead that we have simply made up math to serve our purposes. They point to the fact that mathematicians often make choices and that those choices don’t necessarily reflect anything deeper. We choose for 0! to equal 1. We choose for division by zero to give you a calculator error. At one point we chose for there to be no imaginary numbers or even negative ones. Did someone discover the complex numbers, or did we merely decide to define the square root of -1?
I try not to make definitive statements about the nature of the universe. I err on the side of caution as far as that’s concerned. So I’m not willing to say that math is a fundamental part of the universe. But I also don’t believe new branches of math are merely choices or inventions. Instead I believe that new math resembles the emergent phenomena you see in something like the Game of Life. Simple rules can lead to complex behavior. DNA is truly the perfect example. There is far more data contained in even an infant human brain than in the human genome, yet somehow the brain comes from DNA, from following relatively simple rules over and over again.
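The Game of Life rules the analogy leans on really are that short; a minimal sketch of one generation (the coordinate convention and glider pattern are standard, but this particular implementation is just illustrative):

```python
from collections import Counter
from itertools import product

def step(live):
    """One generation of Conway's Game of Life on an unbounded grid."""
    # Count live neighbors of every cell adjacent to some live cell.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx, dy in product((-1, 0, 1), repeat=2)
                     if (dx, dy) != (0, 0))
    # The entire rule set: a dead cell with exactly 3 live neighbors is
    # born; a live cell with 2 or 3 live neighbors survives.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A glider: after 4 generations the pattern reproduces itself,
# shifted one cell diagonally -- complex behavior from two rules.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
print(cells == {(x + 1, y + 1) for x, y in glider})  # True
```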
Math, then, is many complex rules that emerge from a few simple ones. Historically, mathematics began by matching sets. We used numbers to keep track of livestock in a pen or bushels of grain in a granary. We matched the number we observed with the number of notches in clay or wood. And it turns out that today—although this wasn’t always true—we still base mathematics on the idea of matching and counting sets.
So when a mathematician writes down a new law, that mathematician is discovering a novel application of the rules we invented. This doesn’t explain why math is so effective, but it does bridge a gap between discovery and invention.
The next question, I suppose, is why we invented the rules we did. I think that arises from the fact that, on large-ish scales, the universe is discrete. I see 1 tasty lamb; I’m being chased by 3 hungry wolves; I am saved by 5 human friends.
(On middling scales, though, the universe looks continuous. We don’t have 3 water; we have enough water to fill that bucket. But we now know that water is made up of molecules, so on a small enough scale you can count your water, too.)
Maybe that’s why it took us so long to get to calculus. Maybe our brains are designed to see things in a discrete, delineated fashion. We know our brains are excellent at edge detection, at filling in missing lines, at assigning meaning to individual objects. So perhaps our universe exists discretely, and we evolved discretely, and we came up with discrete math, too. No wonder the math we invented, and then discovered, is so effective.
(No, I don't believe I've solved the philosophy of mathematics in a 1000-word post.)