As a freshman, I made the conscious decision not to live where I knew a lot of other math majors would be (Random). I figured I would have the rest of my life to meet other math people, and I really wanted to use college as an opportunity to expand my horizons and meet other kinds of people. I've definitely succeeded in that goal, both at Senior Haus (where I lived as a freshman) and at Theta Xi (where I live now). As a side effect, though, I never really became involved with the math community at MIT, and I don't often talk to other MIT students about the stuff I'm studying.
A lot of people, even at MIT, don't really like math. The story I hear too often is that they loved math up to a certain point, then got a terrible math teacher, then it stopped making sense to them and they hated it after that. It's a sad story. Math very much builds on itself, and if you miss a vital piece of foundation, then your math is going to be fragile and prone to collapse. You can probably take literature classes in college without taking literature classes in high school, but good luck trying to take math classes in college without taking math classes in high school.
The saddest part, though, is that most people never get to the good stuff! Most of what gets taught in grade school doesn't really deserve to be called "math." It's really closer to what mathematician John Allen Paulos calls numeracy. It's important to distinguish mathematics from numeracy in the same way that it's important to distinguish literature from literacy. Literature is art; literacy is a basic skill. And even the stuff that isn't numeracy – trigonometry, for example – is absurdly old. Haven't you ever wondered what mathematicians have been up to since then?
I think the good stuff is beautiful – some of the most beautiful stuff in human history – and I want more people to at least know what it looks like. So I'd like to give some non-technical descriptions of the courses I took last fall at the University of Cambridge through CME. (I'd describe the courses I'm taking now, but most of them are graduation requirements.) It's a little harder to do this for math classes, especially purer math classes, than for other classes, because I can't just give short, easily understandable descriptions like
2.665: Build robots.
2.666: Build robots that shoot lasers.
I have to explain at least roughly what some abstract concept is, and also why studying it is interesting. I haven't really tried to do this before, but I might as well start now, right? So let's see how I do. Feel free to ask in the comments for clarification!
Galois Theory
The quadratic formula tells you how to find the roots of a quadratic polynomial in terms of square roots. There's also a cubic formula for finding the roots of a cubic polynomial in terms of square roots and cube roots, although it's huge and impractical to use. There's even a quartic formula, which is even huger and more impractical. You might expect, based on this pattern, that there's a quintic formula that takes pages and pages to write down.
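For concreteness, here's the quadratic formula as code (my own sketch, not course material): given ax² + bx + c, the roots are (-b ± √(b² - 4ac)) / 2a.

```python
import cmath  # complex square roots, so negative discriminants work too

def quadratic_roots(a, b, c):
    """Roots of a*x**2 + b*x + c via the quadratic formula."""
    d = cmath.sqrt(b * b - 4 * a * c)  # square root of the discriminant
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

print(quadratic_roots(1, -3, 2))  # x^2 - 3x + 2 = (x-1)(x-2): roots 2 and 1
```

The cubic and quartic analogues can be written in the same spirit, just with vastly more terms.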
But something much more interesting is true: there is no quintic formula! I bet you're wondering why. The modern explanation, in terms of Galois theory, goes something like this: the roots of a polynomial are not as different from each other as they seem. In fact, in certain situations you can swap around some of them, and it doesn't really matter. In other words, the roots have certain symmetries, which are mathematically described using the notion of a group. Rather than try to explain what this means, I'll give some examples: think of the rotational and reflectional symmetries of a regular polygon, or of a Platonic solid.
Galois discovered an amazing relationship between these symmetries and writing down generalizations of the quadratic formula. It turns out that our ability to write down generalizations of the quadratic formula for a given polynomial depends on how complicated the symmetries of its roots are. For quadratic polynomials, the only interesting case is where you can swap the two roots, which is a very simple symmetry. For cubic polynomials, you can either cyclically permute the three roots, or you can in addition swap two of them: think of the rotational (then reflectional) symmetries of a triangle. For quartic polynomials, there are a few more possibilities: think of the rotational (then reflectional) symmetries of a square, then of a rectangle, then of a tetrahedron. It turns out that none of these are particularly complicated in the sense above, which is why we can write down quadratic, cubic, and quartic formulas.
For quintic polynomials, it can happen that the symmetries are too complicated: they can look like the rotational and reflectional symmetries of an icosahedron! And this turns out to be too complicated to allow for a quintic formula to exist.
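The precise meaning of "too complicated" here is that the group of symmetries fails to be solvable, and that's something a computer can check directly. A sketch of mine (not course material), assuming SymPy is installed:

```python
from sympy.combinatorics.named_groups import SymmetricGroup

# In the generic case, the symmetries of the n roots of a degree-n
# polynomial form the symmetric group S_n.  A formula in radicals
# exists exactly when that group is solvable.
for n in range(2, 6):
    print(n, SymmetricGroup(n).is_solvable)
# n = 2, 3, 4 print True (quadratic, cubic, quartic formulas exist);
# n = 5 prints False (no quintic formula).
```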
Galois theory is related at least by analogy to a wide swath of modern mathematics, and in particular complicated descendants of Galois theory were fundamental to Wiles' proof of Fermat's Last Theorem.
Rough MIT equivalent: Studied in 18.702.
Graph Theory
Put six people into a room. Then either three of them will all be friends with each other or three of them will all be strangers. A sociologist once observed this and thought he might have made some deep sociological discovery, but he consulted some mathematicians first and learned that what he had observed instead was pure mathematical fact: what I just said is true regardless of which people are friends with which other people!
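This is the Ramsey-theoretic statement that R(3,3) ≤ 6, and the numbers are small enough to verify by exhaustive search. A quick brute-force sketch of mine (not from the course):

```python
from itertools import combinations

def has_mono_triangle(coloring, n):
    """True if some 3 of n people are pairwise friends or pairwise strangers.

    coloring maps each frozenset pair {i, j} to True (friends) or False.
    """
    return any(
        coloring[frozenset((a, b))]
        == coloring[frozenset((a, c))]
        == coloring[frozenset((b, c))]
        for a, b, c in combinations(range(n), 3)
    )

def all_colorings_have_triangle(n):
    """Check every possible friendship pattern among n people."""
    pairs = list(combinations(range(n), 2))
    return all(
        has_mono_triangle(
            {frozenset(p): bool(mask >> i & 1) for i, p in enumerate(pairs)}, n
        )
        for mask in range(1 << len(pairs))
    )

print(all_colorings_have_triangle(6))  # True: 6 people always suffice
print(all_colorings_have_triangle(5))  # False: arrange friendships in a pentagon
```

The check over 5 people shows 6 is the smallest number that works.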
The relevant structure here is that of a graph, a collection of nodes connected by edges. Above, the nodes are the six people and the edges indicate who is friends with whom. Another example of significant practical importance is the graph whose nodes are all websites on the internet and where an edge between two nodes means one links to the other. A surprising number of questions in mathematics can be phrased as questions about graphs, and there are all sorts of interesting questions you can ask about them that turn out to have interesting answers. Algorithms that deal with graphs are also extremely important in computer science and have many applications, both practical and theoretical.
Rough MIT equivalent: Studied in 18.304 and 6.042.
Linear Analysis
More commonly known as functional analysis, linear analysis is roughly speaking the study of infinite-dimensional vectors and matrices. The study of many interesting differential equations can be phrased as the study of properties of certain infinite-dimensional matrices, and differential equations are a powerful tool in both pure and applied mathematics, so functional analysis finds applications everywhere.
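As a finite-dimensional illustration (mine, not the course's), assuming NumPy is available: truncating the differential operator -d²/dx² on [0, π] with zero boundary values to an N×N second-difference matrix approximately recovers its true eigenvalues 1, 4, 9, 16, … (eigenfunctions sin(nx)).

```python
import numpy as np

# A differential operator behaves like an "infinite matrix".  Truncate
# -d^2/dx^2 on [0, pi] with zero boundary values to an N x N
# second-difference matrix on a grid with spacing h.
N = 200
h = np.pi / (N + 1)
A = (2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)) / h**2

eigenvalues = np.sort(np.linalg.eigvalsh(A))
print(eigenvalues[:4])  # approximately [1, 4, 9, 16]
```

Making sense of what happens as N goes to infinity is one of the things functional analysis is for.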
Rough MIT equivalent: Somewhere between 18.100B and 18.102.
Logic and Set Theory
It's difficult to explain what the point of this class is without explaining something called the foundational crisis in mathematics. Here is a very rough summary of what happened: mathematicians discovered that certain naive ways of constructing mathematical objects led to logical contradictions. To explain the kind of problem that mathematicians ran into, let me use the Grelling-Nelson paradox, which goes like this: some adjectives have the funny property that they don't describe themselves. For example, "monosyllabic" doesn't describe itself because it is polysyllabic. Let's say that such words are heterological.
Is "heterological" a heterological word?
If it is, then it doesn't describe itself, so it isn't. But if it isn't, then it describes itself, so it is! The mathematical version of this, called Russell's paradox, allows you to write down a mathematical object which both does and doesn't have a certain property. This is a contradiction, which is bad news; if you allow yourself a contradiction, you can prove anything. I had fun doing this in middle school by writing down proofs of statements like "Mr. Black [my teacher] is a carrot" starting from a "proof" that 0 = 1.
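In symbols, Russell's construction is the set of all sets that are not members of themselves, and asking whether it is a member of itself gives the contradiction:

```latex
R = \{\, x \mid x \notin x \,\}, \qquad R \in R \iff R \notin R .
```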
Anyway, this and other developments convinced mathematicians that they were being too naive about how they constructed mathematical objects, so they tried to write down rules that would allow them to construct all the objects they wanted without leading to contradictions. These rules are, roughly speaking, the subject of set theory. The study of how rules like the rules of set theory work is the subject of logic.
Rough MIT equivalent: 18.510.
Probability and Measure
Flip a coin a bunch of times. Approximately what proportion of them will be heads? About half, okay, but how does the deviation from exactly half behave? As it turns out, the deviation from exactly half looks like a bell curve: mathematicians say it is (approximately) normally distributed, and gets more so the more coins you flip.
But this isn't just a fact about coins; analogous statements are true if you replace coins by dice or by even more complicated random phenomena. This fundamental result, known as the central limit theorem, explains at least heuristically why bell curves appear in nature: many (but by no means all!) natural phenomena occur due to the accumulation of a large number of independent, essentially random phenomena, such as properties of organisms controlled by a large number of genes.
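A quick simulation makes the theorem visible (a sketch of mine, using nothing beyond the standard library): for n fair coins, the deviation from n/2 heads has standard deviation √n/2.

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible
n, trials = 100, 5000

# For each trial, flip n fair coins and record (heads - n/2).
deviations = [sum(random.randint(0, 1) for _ in range(n)) - n / 2
              for _ in range(trials)]

# The central limit theorem predicts an approximately normal
# distribution with standard deviation sqrt(n)/2 = 5 here.
print(statistics.stdev(deviations))  # close to 5
```

A histogram of `deviations` would show the bell shape directly.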
The central limit theorem, as a mathematical fact, also has applications in mathematics; however, in mathematics, the random phenomena that need to be considered are very general. To handle them, mathematicians invented measure theory, the general study of "measures" (such as volumes, but also such as probabilities). This is a somewhat technical subject, but a very valuable tool: it gives us, among other things, a very flexible notion of integration.
Rough MIT equivalent: 18.125, with some material from 18.440/18.175.