Epistemic status: highly dubious. I found almost no literature doing anything quite like what follows, which unsettles me because it makes it likely that I’m overcomplicating things significantly.

## 1. Synopsis

Recently I was working on an elegant problem that was the original Problem 6 for the 2015 International Mathematical Olympiad. It reads as follows:

**Problem**

[IMO Shortlist 2015 Problem C6] Let $S$ be a nonempty set of positive integers. We say that a positive integer $n$ is *clean* if it has a unique representation as a sum of an odd number of distinct elements from $S$. Prove that there exist infinitely many positive integers that are not clean.

Proceeding by contradiction, one can prove (try it!) that in fact all sufficiently large integers have exactly one representation as a sum of an *even* subset of $S$. Then, the problem reduces to the following:

**Problem**

Show that if $s_1 < s_2 < s_3 < \cdots$ is an increasing sequence of positive integers and $P(x)$ is a nonzero polynomial then we cannot have

$$\prod_{j=1}^{\infty} \left( 1 - x^{s_j} \right) = P(x)$$

as formal series.

To see this, note that the coefficient of $x^n$ in the product counts representations by even subsets minus representations by odd subsets, hence equals $1 - 1 = 0$ for all sufficiently large $n$. Now, the intuitive idea is obvious: the root $1$ appears with finite multiplicity in $P$, so we can put $P(x) = (1-x)^k Q(x)$ where $Q(1) \neq 0$, and then we get that $1-x$ divides the product side too many times, right?

Well, there are some obvious issues with this “proof”: for example, consider the equality

$$1 = (1-x)\left(1 + x + x^2 + x^3 + \cdots\right).$$

The right-hand side is “divisible” by $1-x$, but the left-hand side is not (as a polynomial).
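To make the formal-series viewpoint concrete, here is a small Python sketch (the helper name `mul` is my own, purely for illustration) checking that multiplying the truncation of $1 + x + x^2 + \cdots$ by $1 - x$ gives $1$ up to any fixed degree:

```python
# Formal series as coefficient lists: verify that the truncation of
# (1 - x)(1 + x + x^2 + ...) equals 1 up to degree N.

def mul(F, G, N):
    """Coefficients of F * G, truncated to degree N."""
    return [
        sum(F[j] * G[n - j]
            for j in range(n + 1)
            if j < len(F) and n - j < len(G))
        for n in range(N + 1)
    ]

N = 20
one_minus_x = [1, -1]          # 1 - x
geometric = [1] * (N + 1)      # 1 + x + x^2 + ... (truncated)
print(mul(one_minus_x, geometric, N) == [1] + [0] * N)  # True
```

No value of $x$ is ever substituted here; the cancellation happens purely at the level of coefficients.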

But we still want to use the idea of plugging in $x = 1$, so what is the right thing to do? It turns out that this is a complete minefield, and there are a lot of very subtle distinctions that seem to not be explicitly mentioned in many places. I think I have a complete answer now, but it’s long enough to warrant this entire blog post.

Here’s the short version: there are actually **two distinct notions** of “generating function”, namely a “formal series” and a “functional series”. They use exactly the same notation but are two different types of objects, and this ends up being the source of lots of errors, because “formal series” do not allow substituting $x = 1$, while “functional series” do.

Spoiler: we’ll need the asymptotic for the partition function $p(n)$.

## 2. Formal Series ≠ Functional Series

I’m assuming you’ve all heard the definition of $\sum_n c_n x^n$. It turns out unfortunately that this isn’t everything: there are actually two types of objects at play here. They are usually called *formal power series* and *power series*, but for this post I will use the more descriptive names **formal series** and **functional series**. I’ll do everything over $\mathbb{C}$, but one can of course use $\mathbb{R}$ instead.

The notion of a formal series is the easier one to describe:

**Definition 1**

A **formal series** $F$ is an infinite sequence $(a_0, a_1, a_2, \dots)$ of complex numbers. We often denote it by $\sum_{n \ge 0} a_n x^n$. The set of formal series is denoted $\mathbb{C}[[x]]$.

This is the “algebraic” viewpoint: it’s a sequence of coefficients. Note that there is no worry about convergence issues or “plugging in $x = 1$”.

On the other hand, a functional series is more involved, because it has to support substitution of values of $x$ and worry about convergence issues. So here are the necessary pieces of data:

**Definition 2**

A **functional series** $f$ (centered at zero) is a function $f \colon U \to \mathbb{C}$, where $U$ is an open disk centered at $0$ or $U = \mathbb{C}$. We require that there exists an infinite sequence $(a_0, a_1, a_2, \dots)$ of complex numbers satisfying

$$\forall z \in U \colon \qquad f(z) = \lim_{N \to \infty} \left( \sum_{n=0}^{N} a_n z^n \right).$$

(The limit is taken in the usual metric of $\mathbb{C}$.) In that case, the $a_n$ are unique and called the **coefficients** of $f$.

This is often written as $f(x) = \sum_{n \ge 0} a_n x^n$, with the open set $U$ suppressed.

**Remark 3**

Some remarks on the definition of functional series:

- This is enough to imply that $f$ is holomorphic (and thus analytic) on $U$.
- For experts: note that I’m including the domain $U$ as part of the data required to specify $f$, which makes the presentation cleaner. Most sources do something with “radius of convergence”; I will blissfully ignore this, leaving this data implicitly captured by $U$.
- For experts: Perhaps non-standard, $U \neq \varnothing$. Otherwise I can’t take derivatives, etc.

Thus formal and functional series, despite having the same notation, have different *types*: a formal series is a sequence, while a functional series is a function that happens to be expressible as an infinite sum within its domain.

Of course, from every functional series $f$ we can extract its coefficients and make them into a formal series $F$. So, for lack of better notation:

**Definition 4**

If $F$ is a formal series, and $f \colon U \to \mathbb{C}$ is a functional series whose coefficients equal those of $F$, then we write $F \simeq f$.

## 3. Finite operations

Now that we have formal and functional series, we can define sums. Since these are different types of objects, we will have to run definitions in parallel and then ideally check that they respect $\simeq$.

For formal series:

**Definition 5**

Let $F = (a_0, a_1, a_2, \dots)$ and $G = (b_0, b_1, b_2, \dots)$ be formal series. Then we set

$$F + G = (a_0 + b_0, \; a_1 + b_1, \; \dots) \qquad\text{and}\qquad F \cdot G = (c_0, c_1, \dots) \quad \text{where } c_n = \sum_{j+k=n} a_j b_k.$$

This makes $\mathbb{C}[[x]]$ into a ring, with additive identity $(0, 0, 0, \dots)$ and multiplicative identity $(1, 0, 0, \dots)$.

We also define the derivative by $F' = (a_1, \; 2a_2, \; 3a_3, \; \dots)$.

It’s probably more intuitive to write these definitions as

$$F + G = \sum_n (a_n + b_n) x^n, \qquad F \cdot G = \sum_n \left( \sum_{j+k=n} a_j b_k \right) x^n, \qquad F' = \sum_n n a_n x^{n-1},$$

and in what follows I’ll start to use the $x$ notation more. But officially, all definitions for formal series are in terms of the sequence of coefficients alone; the presence of $x$ serves as motivation only.
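These coefficient-only definitions transcribe directly into code. A minimal Python sketch (the function names `f_add`, `f_mul`, `f_deriv` are mine):

```python
# Sum, product, and derivative of formal series, defined purely in
# terms of coefficient lists -- no value of x is ever plugged in.

def f_add(F, G):
    n = max(len(F), len(G))
    F = F + [0] * (n - len(F))
    G = G + [0] * (n - len(G))
    return [a + b for a, b in zip(F, G)]

def f_mul(F, G):
    out = [0] * (len(F) + len(G) - 1)
    for j, a in enumerate(F):
        for k, b in enumerate(G):
            out[j + k] += a * b       # contributes to coefficient of x^(j+k)
    return out

def f_deriv(F):
    # (a_0, a_1, a_2, ...) maps to (a_1, 2 a_2, 3 a_3, ...)
    return [n * a for n, a in enumerate(F)][1:]

print(f_mul([1, 1], [1, 1]))  # [1, 2, 1], i.e. (1 + x)^2
print(f_deriv([1, 2, 1]))     # [2, 2],    i.e. 2 + 2x
```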

**Exercise 6**

Show that if $F = \sum_n a_n x^n$ is a formal series, then it has a multiplicative inverse if and only if $a_0 \neq 0$.
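One half of this exercise can be seen computationally: when the constant term $a_0$ is nonzero, comparing coefficients in $F \cdot F^{-1} = 1$ forces a recurrence for the inverse. A sketch (the helper `f_inverse` is mine):

```python
# Inverse of a formal series, via the recurrence obtained by comparing
# coefficients in F * (1/F) = 1:
#   b_0 = 1/a_0,   b_n = -(1/a_0) * sum_{j >= 1} a_j * b_{n-j}.

from fractions import Fraction

def f_inverse(F, N):
    """First N+1 coefficients of 1/F, as exact fractions."""
    a0 = Fraction(F[0])
    if a0 == 0:
        raise ValueError("not invertible: constant term is zero")
    B = [1 / a0]
    for n in range(1, N + 1):
        s = sum(Fraction(F[j]) * B[n - j]
                for j in range(1, min(n, len(F) - 1) + 1))
        B.append(-s / a0)
    return B

# 1/(1 - x) = 1 + x + x^2 + ...
print(f_inverse([1, -1], 5) == [1] * 6)  # True
```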

On the other hand, with functional series, the above operations are even simpler:

**Definition 7**

Let $f \colon U \to \mathbb{C}$ and $g \colon U \to \mathbb{C}$ be functional series with the same domain $U$. Then $f + g$ and $f \cdot g$ are defined pointwise.

If $f$ is a functional series (hence holomorphic), then $f'$ is defined pointwise.

If $f$ is nonvanishing on $U$, then $1/f$ is defined pointwise (and otherwise $1/f$ is not defined).

Now, for these finite operations, everything works as you expect:

**Theorem 8** **(Compatibility of finite operations)**

Suppose $F$ and $G$ are formal series, and $f$ and $g$ are functional series on a common domain $U$. Assume $F \simeq f$, $G \simeq g$.

- $F + G \simeq f + g$, $F \cdot G \simeq f \cdot g$.
- $F' \simeq f'$.
- If $1/f$ is defined, then $F$ is invertible and $1/F \simeq 1/f$.

So far so good: everything works as you expect, as long as we’re doing finite operations. But once we step beyond that, things begin to go haywire.

## 4. Limits

We need to start considering limits of formal series $F_1, F_2, \dots$ and functional series $f_1, f_2, \dots$, since we are trying to make progress towards infinite sums and products. Once we do this, things start to burn.

**Definition 9**

Let $F = \sum_n a_n x^n$ and $G = \sum_n b_n x^n$ be formal series, and define the distance between them by

$$d(F, G) = \begin{cases} 2^{-n} & \text{where } n \text{ is the smallest index with } a_n \neq b_n, \\ 0 & \text{if } F = G. \end{cases}$$

This function $d$ makes $\mathbb{C}[[x]]$ into a metric space, so we can discuss limits in this space. Actually, it is a normed vector space, with the norm $\lVert F \rVert = d(F, 0)$ obtained from the metric above.

Thus, $\lim_{i \to \infty} F_i = F$ if each coefficient of $F_i$ eventually stabilizes as $i \to \infty$. For example, as formal series we have that $1+x$, $1+x+x^2$, $1+x+x^2+x^3$, …, converges to $\sum_{n \ge 0} x^n$, which we write as

$$\lim_{N \to \infty} \left( 1 + x + \dots + x^N \right) = \sum_{n \ge 0} x^n.$$
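As a sketch, here is the metric in code, under the convention $d(F, G) = 2^{-n}$ with $n$ the first index of disagreement (other bases than $2$ would work equally well):

```python
# The metric on C[[x]]: distance 2^(-n), where n is the first index at
# which the coefficient sequences differ (and 0 if they agree entirely).

def d(F, G):
    n = 0
    while n <= max(len(F), len(G)):
        a = F[n] if n < len(F) else 0
        b = G[n] if n < len(G) else 0
        if a != b:
            return 2.0 ** (-n)
        n += 1
    return 0.0

# Partial sums of 1 + x + x^2 + ...: each extra term shrinks the distance.
print(d([1, 1], [1, 1, 1, 1]))  # 0.25 (first disagreement at n = 2)
print(d([1, 2, 3], [1, 2, 3]))  # 0.0
```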

As for functional series, since they are functions on the same open set , we can use **pointwise convergence** or the stronger **uniform convergence**; we’ll say explicitly which one we’re doing.

**Example 10** **(Limits don’t work at all)**

In what follows, $F_i \simeq f_i$ for every $i$.

- Here is an example showing that even if $\lim_i F_i = F$ exists, the functions $f_i$ may not converge even pointwise. Indeed, just take $F_i = 1 + x + \dots + x^i$ as before, and let $f_i$ be the corresponding polynomial function on $U = \{ z \colon |z| < 2 \}$; then $f_i(z)$ diverges whenever $1 \le |z| < 2$.
- Here is an example showing that even if $f_i \to f$ uniformly, $\lim_i F_i$ may not exist. Take $f_i \equiv \frac{1}{i}$ as constant functions. Then $f_i \to 0$ uniformly, but $\lim_i F_i$ doesn’t exist because the constant term $\frac{1}{i}$ never stabilizes (recall that convergence in $\mathbb{C}[[x]]$ requires each coefficient to be eventually constant).
- The following example from this math.SE answer by Robert Israel shows that it’s possible that $F = \lim_i F_i$ exists, and $f_i \to f$ pointwise, and still $F \not\simeq f$. Let $U$ be the open unit disk, and set
$$A_i = \left\{ z \colon \tfrac{1}{i} \le |z| \le 1 - \tfrac{1}{i}, \; \arg z \notin \left( \tfrac{1}{2i}, \tfrac{1}{i} \right) \right\}$$
for $i \ge 2$; the complement of $A_i$ is connected and $z^{1-i}$ is holomorphic near $A_i$, so by Runge's theorem there’s a polynomial $p_i(z)$ such that
$$\left\lvert p_i(z) - z^{1-i} \right\rvert < \frac{1}{i} \qquad \text{for all } z \in A_i.$$
Then
$$f_i(z) = z^i \cdot p_i(z)$$
is the desired counterexample (with $F_i$ being the sequence of coefficients from $f_i$). Indeed by construction $\lim_i F_i = 0$, since $x^i$ divides $F_i$ for each $i$. Alas, $\left\lvert f_i(z) - z \right\rvert = |z|^i \left\lvert p_i(z) - z^{1-i} \right\rvert < \frac{1}{i}$ for $z \in A_i$, and every $z \in U$ lies in $A_i$ for all large $i$ (with $f_i(0) = 0$ handling the origin), so $f_i$ converges pointwise to the identity function, whose formal series is $x \neq 0$.

To be fair, we do have the following saving grace:

**Theorem 11** **(Uniform convergence and both limits exist is sufficient)**

Suppose that $f_1, f_2, \dots$ converges *uniformly* to $f$. Then if $F_i \simeq f_i$ for every $i$, and $\lim_i F_i = F$, then $F \simeq f$.

*Proof:* Here is a proof, copied from this math.SE answer by Joey Zhou. WLOG $F = 0$, and let $a_n$ denote the $n$th coefficient of $f$. It suffices to show that $a_n = 0$ for all $n$. Choose any $r > 0$ with $\{ z \colon |z| \le r \} \subseteq U$. By Cauchy’s integral formula, we have

$$a_n = \frac{1}{2\pi i} \oint_{|z|=r} \frac{f(z)}{z^{n+1}} \, dz = \lim_{i \to \infty} \frac{1}{2\pi i} \oint_{|z|=r} \frac{f_i(z)}{z^{n+1}} \, dz$$

since $f_i$ converges uniformly to $f$ on $\{ |z| = r \}$. Hence, $a_n = \lim_{i \to \infty} [x^n] F_i$. Since $[x^n] F_i = 0$ for $i$ sufficiently large, the result follows.
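The coefficient-extraction step in this proof is pleasant to check numerically: on the circle $|z| = r$, Cauchy’s formula becomes a plain average over equally spaced sample points. A sketch (the function name `coefficient` is mine):

```python
# Cauchy's integral formula, discretized: the n-th coefficient of f is
# (1/2*pi*i) * integral of f(z)/z^(n+1) dz over |z| = r, which after
# parametrizing z = r*e^(i*theta) is the average of f(z)/z^n.

import cmath

def coefficient(f, n, r=0.5, M=4096):
    """Approximate the n-th power series coefficient of f."""
    total = 0j
    for k in range(M):
        z = r * cmath.exp(2j * cmath.pi * k / M)
        total += f(z) / z ** n
    return total / M

# f(z) = 1/(1 - z) has coefficients 1, 1, 1, ...
approx = coefficient(lambda z: 1 / (1 - z), 3)
print(abs(approx - 1) < 1e-9)  # True
```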

The take-away from this section is that limits are relatively poorly behaved.

## 5. Infinite sums and products

Naturally, infinite sums and products are defined by taking the limit of partial sums and partial products. The following example (from math.SE again) shows the nuances of this behavior.

**Example 12** **(On $e^{1+x}$)**

The expression

$$\sum_{n \ge 0} \frac{(1+x)^n}{n!}$$

does *not* make sense as a formal series: we observe that for every $N$ the constant term of the $N$th partial sum changes.

But this *does* converge (uniformly, even) to a functional series on any open disk $U$ of finite radius, namely to $e^{1+x}$.
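A quick numerical sanity check, assuming the series in question is $\sum_{n \ge 0} (1+x)^n / n!$ (the helper name below is mine):

```python
# The constant terms of the partial sums never stabilize, so there is no
# formal limit; yet at any fixed z the partial sums converge to e^(1+z).

import math

def partial_sum(z, N):
    total, term = 0.0, 1.0
    for n in range(N + 1):
        total += term               # term == (1 + z)^n / n!
        term *= (1 + z) / (n + 1)
    return total

# Constant terms 1, 2, 2.5, 2.666..., ...: changing at every step.
print(partial_sum(0.0, 5) != partial_sum(0.0, 6))         # True

# But the values converge to e^(1+z):
print(abs(partial_sum(0.3, 60) - math.exp(1.3)) < 1e-12)  # True
```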

**Exercise 13**

Let be formal series.

- Show that an infinite sum $F_1 + F_2 + \cdots$ converges as formal series exactly when $\lim_i F_i = 0$.
- Assume for convenience that the constant term of $F_i$ is $1$ for each $i$. Show that an infinite product $F_1 \cdot F_2 \cdots$ converges as formal series exactly when $\lim_i F_i = 1$.

Now the upshot is that one example of a convergent formal sum is the expression $\sum_{n \ge 0} a_n x^n$ itself, viewed as the limit of its partial sums! This means we can use standard “radius of convergence” arguments to transfer a formal series into a functional one.

**Theorem 14** **(Constructing $f$ from $F$)**

Let $F = \sum_{n \ge 0} a_n x^n$ be a formal series and let

$$r = \frac{1}{\limsup_{n \to \infty} \sqrt[n]{|a_n|}}.$$

If $r > 0$ then there exists a functional series $f$ on $U = \{ z \colon |z| < r \}$ such that $F \simeq f$.

*Proof:* Let $F_N$ and $f_N$ be the corresponding partial sums of the terms $a_0 x^0$ to $a_N x^N$. Then by the Cauchy–Hadamard theorem, we have $f_N \to f$ uniformly on (compact subsets of) $U$. Also, $\lim_N F_N = F$ by construction, so $F \simeq f$ by Theorem 11.

This works less well with products: for example we have

$$(1-x)(1+x)(1+x^2)(1+x^4)(1+x^8)\cdots = 1$$

as formal series, but we can’t “plug in $x = 1$”, for example, even though every factor is defined there: the partial products at $x = 1$ are $0 \cdot 2 \cdot 2 \cdots = 0$, which does not converge to $1$.
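This failure mode is easy to verify by machine, assuming the identity under discussion is $(1-x)(1+x)(1+x^2)(1+x^4)\cdots = 1$ (the helper `mul_trunc` is mine):

```python
# As a formal series, the truncated product (1 - x)(1 + x)(1 + x^2)...
# collapses to 1: every coefficient beyond x^0 vanishes up to degree N.

def mul_trunc(F, G, N):
    out = [0] * (N + 1)
    for j, a in enumerate(F[: N + 1]):
        for k, b in enumerate(G[: N + 1]):
            if j + k <= N:
                out[j + k] += a * b
    return out

N = 31
prod = [1, -1] + [0] * (N - 1)       # start with 1 - x
j = 0
while 2 ** j <= N:
    factor = [0] * (N + 1)
    factor[0] = factor[2 ** j] = 1   # 1 + x^(2^j)
    prod = mul_trunc(prod, factor, N)
    j += 1
print(prod == [1] + [0] * N)          # True

# Yet at x = 1 the partial products are 0 * 2 * 2 * ... = 0, never 1.
```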

## 6. Finishing the original problem

We finally return to the original problem: we wish to show that the equality

$$\prod_{j=1}^{\infty} \left( 1 - x^{s_j} \right) = P(x)$$

cannot hold as formal series. We know that tacitly, this just means

$$\lim_{N \to \infty} \prod_{j=1}^{N} \left( 1 - x^{s_j} \right) = P(x)$$

as formal series.

Here is a solution obtained by only considering coefficients, presented by Qiaochu Yuan from this MathOverflow question.

Both sides have constant coefficient $1$, so we may invert them; thus it suffices to show we cannot have

$$\frac{1}{P(x)} = \prod_{j=1}^{\infty} \frac{1}{1 - x^{s_j}}$$

as formal power series.

The coefficients on the LHS have asymptotic growth at most a polynomial times an exponential: expanding $\frac{1}{P(x)}$ by partial fractions writes each coefficient as a finite sum of terms of the form $c \cdot n^d \alpha^n$.

On the other hand, the coefficients of the RHS can be shown to have growth both strictly larger than any polynomial (by truncating the product) and strictly smaller than any exponential (by comparing to the growth rate in the case where $s_j = j$, which gives the partition function $p(n)$ mentioned before). So the two rates of growth can’t match.
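To ground the growth claim, here is a sketch computing $p(n)$ in the special case $s_j = j$ (the dynamic program below is the standard one, not from the post):

```python
# The coefficients of prod_{j >= 1} 1/(1 - x^j) are the partition
# numbers p(n), computed by counting partitions with parts up to `part`.

def partitions(N):
    p = [1] + [0] * N
    for part in range(1, N + 1):          # allow parts of size `part`
        for n in range(part, N + 1):
            p[n] += p[n - part]
    return p

p = partitions(100)
print(p[10], p[100])   # 42 190569292
```

The values visibly outrun every fixed polynomial while the ratios $p(n+1)/p(n)$ tend to $1$, which is exactly the squeeze between polynomial and exponential growth used in the argument above.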