This is an expanded version of an answer I gave to a question that came up while I was assisting the 2014-2015 WOOT class. It struck me as an unusually good way to motivate higher math using stuff that people notice in high school but for some reason decide to not think about.

In high school precalculus, you’ll often be asked to find the roots of some polynomial with integer coefficients. For instance,

$$ x^3 - x^2 - 2x + 2 = (x - 1)(x - \sqrt{2})(x + \sqrt{2}) $$

has roots $1$, $\sqrt{2}$, $-\sqrt{2}$. Or as another example,

$$ x^3 - 3x^2 + x - 3 = (x - 3)(x - i)(x + i) $$

has roots $3$, $i$, $-i$. You’ll notice that the “weird” roots, like $\pm\sqrt{2}$ and $\pm i$, are coming up in pairs. In fact, I think precalculus explicitly tells you that the imaginary roots come in conjugate pairs. More generally, it seems like all the roots of the form $a + b\sqrt{c}$ come in “conjugate pairs”. And you can see why: the quadratic formula hands you both roots at once, differing only in the sign of the $\sqrt{\phantom{c}}$ term.
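As a quick sanity check, the pairing is easy to see numerically. Here is a small sketch (assuming NumPy is available) using two cubics of this flavor, $x^3 - x^2 - 2x + 2$ and $x^3 - 3x^2 + x - 3$:

```python
# Numerically find the roots of two cubics with integer coefficients and
# observe that the "weird" roots come in conjugate pairs.
# numpy.roots takes the coefficient list, highest degree first.
import numpy as np

# x^3 - x^2 - 2x + 2 = (x - 1)(x - sqrt(2))(x + sqrt(2))
r1 = np.roots([1, -1, -2, 2])

# x^3 - 3x^2 + x - 3 = (x - 3)(x - i)(x + i)
r2 = np.roots([1, -3, 1, -3])

print(r1)  # contains the irrational pair +sqrt(2), -sqrt(2)
print(r2)  # contains the imaginary pair +i, -i
```

The irrational pair $\pm\sqrt{2}$ and the imaginary pair $\pm i$ each show up together, as the text describes.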

But a polynomial like

$$ x^3 - 3x + 1 $$

has no rational roots. (The roots of this are approximately $-1.879$, $0.347$, $1.532$.) Or even simpler,

$$ x^3 - 2 $$

has only one real root, $\sqrt[3]{2}$. These roots, even though they are irrational, have no “conjugate” pairs. Or do they?
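Here is the same numerical check for these two cubics (again assuming NumPy), which shows three irrational real roots in the first case and a single real root in the second:

```python
# Roots of x^3 - 3x + 1: three real roots, none of them rational.
import numpy as np

r = np.roots([1, 0, -3, 1])
print(sorted(r.real))  # approximately -1.879, 0.347, 1.532

# Roots of x^3 - 2: only one of the three roots is real, namely 2**(1/3).
s = np.roots([1, 0, 0, -2])
real_roots = [z for z in s if abs(complex(z).imag) < 1e-9]
print(real_roots)  # a single real root, approximately 1.2599
```

Unlike $\pm\sqrt{2}$ and $\pm i$, nothing here pairs up in an obvious way, which is exactly the puzzle.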

Let’s try and figure out exactly what’s happening. Let $\alpha$ be any complex number. We define the **minimal polynomial** of $\alpha$ to be the monic polynomial $P(x)$ such that

- $P(x)$ has rational coefficients, and leading coefficient $1$,
- $P(\alpha) = 0$, and
- the degree of $P$ is as small as possible.

For example, $\sqrt{2}$ has minimal polynomial $x^2 - 2$. Note that $2x^2 - 4$ is also a polynomial of the same degree which has $\sqrt{2}$ as a root; that’s why we want to require the polynomial to be monic. That’s also why we choose to work in the rational numbers; that way, we can divide by leading coefficients without worrying if we get non-integers.
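If you have SymPy available, it can compute minimal polynomials directly, so we can check the example above:

```python
# SymPy computes the minimal polynomial of an algebraic number over Q.
from sympy import sqrt, minimal_polynomial
from sympy.abc import x

p = minimal_polynomial(sqrt(2), x)
print(p)  # x**2 - 2

# Note: 2*x**2 - 4 also vanishes at sqrt(2) and has the same degree,
# but it is not monic, so it is not the minimal polynomial.
```

The result is monic with rational coefficients and has the smallest possible degree, matching the three conditions in the definition.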

Why do we care? The point is as follows: suppose we have another polynomial $A(x)$ with rational coefficients such that $A(\alpha) = 0$. Then we claim that $P(x)$ actually divides $A(x)$! That means that all the other roots of $P$ will also be roots of $A$.

The proof is by contradiction: if not, by polynomial long division, we can find a quotient $Q(x)$ and a nonzero remainder $R(x)$, such that

$$ A(x) = Q(x) P(x) + R(x) $$

and $\deg R < \deg P$. Notice that by plugging in $x = \alpha$, we find that $R(\alpha) = 0$. But $\deg R < \deg P$, and $P$ was supposed to be the minimal polynomial. That’s impossible! (If $R$ isn’t monic, just divide through by its leading coefficient — this is exactly where working over the rationals pays off.)

Let’s look at a more concrete example. Consider $A(x) = x^3 - x^2 - 2x + 2$ from the beginning. The minimal polynomial of $\sqrt{2}$ is $P(x) = x^2 - 2$ (why?). Now we know that since $\sqrt{2}$ is a root of $A$, the polynomial $A(x)$ is divisible by $x^2 - 2$. And that’s how we know that if $\sqrt{2}$ is a root of $A$, so must be $-\sqrt{2}$.
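We can watch the long-division argument play out concretely (assuming SymPy; the cubic $x^3 - x^2 - 2x + 2$ and the minimal polynomial $x^2 - 2$ are the example values used here):

```python
# Polynomial long division: since A(sqrt(2)) = 0 and x^2 - 2 is the
# minimal polynomial of sqrt(2), the remainder must be zero.
from sympy import div
from sympy.abc import x

A = x**3 - x**2 - 2 * x + 2
P = x**2 - 2

Q, R = div(A, P, x)
print(Q, R)  # quotient x - 1, remainder 0
```

A nonzero remainder here would be a lower-degree rational polynomial vanishing at $\sqrt{2}$, contradicting minimality, and indeed the remainder comes out to zero.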

As another example, the minimal polynomial of $\sqrt[3]{2}$ is $x^3 - 2$. So $\sqrt[3]{2}$ actually has **two** conjugates, namely, $\omega\sqrt[3]{2}$ and $\omega^2\sqrt[3]{2}$, where $\omega = e^{2\pi i/3}$ is a primitive cube root of unity. Thus any polynomial which vanishes at $\sqrt[3]{2}$ also has $\omega\sqrt[3]{2}$ and $\omega^2\sqrt[3]{2}$ as roots!
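These conjugate relations are easy to verify numerically with nothing more than Python’s built-in `cmath`:

```python
# Check that the two conjugates of 2^(1/3), namely omega * 2^(1/3) and
# omega^2 * 2^(1/3) with omega = e^(2*pi*i/3), are also roots of x^3 - 2.
import cmath

cbrt2 = 2 ** (1 / 3)
omega = cmath.exp(2j * cmath.pi / 3)

for root in (cbrt2, omega * cbrt2, omega**2 * cbrt2):
    value = root**3 - 2
    print(abs(value))  # each is ~0, up to floating-point error
```

All three values vanish (up to rounding), so all three numbers really are roots of $x^3 - 2$.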

You can generalize this by replacing $\mathbb{Q}$ with any field and all of this still works. One central idea of Galois theory is that these “conjugates” all “look the same” as far as $\mathbb{Q}$ can tell.

As another aside: does the minimal polynomial exist for every $\alpha$? It turns out the answer is no, and the numbers $\alpha$ for which there is no minimal polynomial are called the transcendental numbers. (Famous examples are $\pi$ and $e$.)