Epistemic status: highly dubious. I found almost no literature doing anything quite like what follows, which unsettles me because it makes it likely that I’m overcomplicating things significantly.
Recently I was working on an elegant problem, the original Problem 6 for the 2015 International Mathematical Olympiad, which reads as follows:
Proceeding by contradiction, one can prove (try it!) that in fact all sufficiently large integers have exactly one representation as a sum of an even subset of . Then, the problem reduces to the following:
To see this, note that all sufficiently large have coefficient . Now, the intuitive idea is obvious: the root appears with finite multiplicity in so we can put where , and then we get that on the RHS divides too many times, right?
Well, there are some obvious issues with this “proof”: for example, consider the equality
The right-hand side is “divisible” by , but the left-hand side is not (as a polynomial).
But we still want to use the idea of plugging in , so what is the right thing to do? It turns out that this is a complete minefield, and there are a lot of very subtle distinctions that do not seem to be explicitly mentioned in many places. I think I have a complete answer now, but it’s long enough to warrant this entire blog post.
Here’s the short version: there are actually two distinct notions of “generating function”, namely a “formal series” and a “functional series”. They use exactly the same notation but are two different types of objects, and this ends up being the source of lots of errors, because “formal series” do not allow substituting values of $x$, while “functional series” do.
Spoiler: we’ll need the asymptotic for the partition function .
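That asymptotic is easy to sanity-check by computer. Here is a small sketch (function names are mine) that counts partitions with the standard dynamic program and compares against the leading-order Hardy–Ramanujan formula $p(n) \sim \frac{1}{4n\sqrt{3}} e^{\pi\sqrt{2n/3}}$:

```python
from math import exp, pi, sqrt

def partitions(N):
    """Count partitions: p[n] = number of ways to write n as an unordered
    sum of positive integers, via the standard coin-change-style DP."""
    p = [1] + [0] * N
    for k in range(1, N + 1):        # allow parts of size k
        for n in range(k, N + 1):
            p[n] += p[n - k]
    return p

def hardy_ramanujan(n):
    """Leading-order Hardy-Ramanujan asymptotic for p(n)."""
    return exp(pi * sqrt(2 * n / 3)) / (4 * n * sqrt(3))

p = partitions(200)
print(p[10], p[100])                  # 42 190569292
print(p[200] / hardy_ramanujan(200))  # ratio slowly approaching 1
```

The ratio tends to $1$ only slowly; for this post all we will actually need is the cruder fact that $p(n)$ grows superpolynomially but subexponentially.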
2. Formal Series vs. Functional Series
I’m assuming you’ve all heard the definition of a power series. It turns out unfortunately that this isn’t everything: there are actually two types of objects at play here. They are usually called formal power series and power series, but for this post I will use the more descriptive names formal series and functional series. I’ll do everything over $\mathbb{C}$, but one can of course use $\mathbb{R}$ instead.
The formal series is easier to describe:
This is the “algebraic” viewpoint: it’s a sequence of coefficients. Note that there is no worry about convergence issues or “plugging in $x$”.
On the other hand, a functional series is more involved, because it has to support substitution of values of and worry about convergence issues. So here are the necessary pieces of data:
Thus formal and functional series, despite having the same notation, have different types: a formal series is a sequence, while a functional series is a function that happens to be expressible as an infinite sum within its domain.
Of course, from every functional series we can extract its coefficients and make them into a formal series . So, for lack of better notation:
3. Finite operations
Now that we have formal and functional series, we can define sums. Since these are different types of objects, we will have to run the definitions in parallel and then ideally check that they respect the correspondence described above.
For formal series:
It’s probably more intuitive to write these definitions as
and in what follows I’ll start to use this notation more. But officially, all definitions for formal series are in terms of the coefficients alone; the presence of $x$ serves as motivation only.
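To make the coefficients-only viewpoint concrete, here is a minimal sketch (all names are mine) of the finite operations on formal series represented as truncated coefficient lists:

```python
def add(F, G):
    """Coefficient-wise sum of formal series given as coefficient lists."""
    n = max(len(F), len(G))
    F, G = F + [0] * (n - len(F)), G + [0] * (n - len(G))
    return [a + b for a, b in zip(F, G)]

def mul(F, G, N):
    """Cauchy product through degree N: the x^n coefficient is the
    sum of F[i] * G[j] over i + j = n."""
    H = [0] * (N + 1)
    for i, a in enumerate(F):
        for j, b in enumerate(G):
            if i + j <= N:
                H[i + j] += a * b
    return H

# (1 + x) * (1 - x + x^2 - x^3 + ...) = 1, checked through degree 5:
geo = [(-1) ** n for n in range(6)]
print(add([1, 2], [3]))        # [4, 2]
print(mul([1, 1], geo, 5))     # [1, 0, 0, 0, 0, 0]
```

Nothing here ever evaluates anything at a point: the “multiplication” is a purely finite computation on coefficients, which is exactly why no convergence question arises.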
On the other hand, with functional series, the above operations are even simpler:
Now, for these finite operations, everything works as you expect:
So far so good: as long as we’re doing finite operations. But once we step beyond that, things begin to go haywire.
4. Limits

We need to start considering limits of formal and functional series, since we are trying to make progress towards infinite sums and products. Once we do this, things start to burn.
Thus, a sequence $F_1, F_2, \ldots$ of formal series converges to $F$ if each coefficient of $F_n$ eventually stabilizes as $n \to \infty$. For example, as formal series, the partial sums $1$, $1+x$, $1+x+x^2$, \ldots converge to $\sum_{k \ge 0} x^k$, which we write as $\lim_{n \to \infty} \sum_{k=0}^{n} x^k = \sum_{k \ge 0} x^k$.
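As an illustration of coefficient stabilization (the example and code are mine), the partial products of $\prod_{k \ge 0} (1 + x^{2^k})$ converge as formal series to $1 + x + x^2 + \cdots$, because every nonnegative integer has a unique binary representation; each coefficient becomes $1$ after finitely many steps and then never changes again:

```python
def mul(F, G, N):
    """Truncated Cauchy product of coefficient lists, through degree N."""
    H = [0] * (N + 1)
    for i, a in enumerate(F):
        for j, b in enumerate(G):
            if i + j <= N:
                H[i + j] += a * b
    return H

N = 15
P = [1] + [0] * N                 # running partial product, as coefficients
for k in range(4):                # multiply in (1 + x^(2^k)), k = 0..3
    factor = [0] * (2 ** k + 1)
    factor[0] = factor[2 ** k] = 1
    P = mul(P, factor, N)
    print(P)                      # low-degree coefficients lock in at 1
```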
As for functional series, since they are functions on a common open set, we can use pointwise convergence or the stronger uniform convergence; we’ll say explicitly which one we’re using.
To be fair, we do have the following saving grace:
Proof: Here is a proof, copied from this math.SE answer by Joey Zhou. WLOG , and let . It suffices to show that for all . Choose any . By Cauchy’s integral formula, we have
since converges uniformly to on . Hence, . Since for , the result follows.
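Cauchy’s integral formula here is also pleasant to check numerically: sampling the contour approximates the $n$-th coefficient. A sketch (the function name, radius, and sample count are my choices):

```python
import cmath

def coeff(f, n, r=0.5, M=2048):
    """Approximate the n-th Taylor coefficient of f at 0 using Cauchy's
    integral formula, by sampling the circle |z| = r at M points."""
    total = 0j
    for k in range(M):
        z = r * cmath.exp(2j * cmath.pi * k / M)
        total += f(z) / z ** n
    return (total / M).real

f = lambda z: 1 / (1 - z)   # Taylor coefficients are all 1
print([round(coeff(f, n), 6) for n in range(5)])   # [1.0, 1.0, 1.0, 1.0, 1.0]
```

This is the mechanism behind the lemma: uniform convergence of the functions forces convergence of these contour integrals, hence of each coefficient.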
The take-away from this section is that limits are relatively poorly behaved.
5. Infinite sums and products
Naturally, infinite sums and products are defined by taking the limit of partial sums and partial products. The following example (from math.SE again) shows the nuances of this behavior.
Now the upshot is that one example of a convergent formal sum is the defining expression itself! This means we can use standard “radius of convergence” arguments to transfer a formal series into a functional one.
Proof: Let and be the corresponding partial sums of to . Then by the Cauchy–Hadamard theorem, we have uniformly on (compact subsets of) . Also, by construction.
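A quick numerical illustration of this transfer (the constants are mine): inside the radius of convergence, the partial sums of the geometric series really do converge to the function $\frac{1}{1-x}$.

```python
# Partial sums of the geometric series at a point inside the radius of
# convergence (R = 1 here) converge to the closed form 1/(1 - x).
x = 0.3
partial = sum(x ** n for n in range(50))
print(partial, 1 / (1 - x))   # both approximately 1.4285714285714286
```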
This works less well with products: for example we have
as formal series, but we can’t “plug in ”, for example,
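The same kind of failure is already easy to see concretely for sums. As a generic illustration (my own example, not the post’s): formally $(1-x)\sum_{n \ge 0} x^n = 1$, yet at $x = 2$ the partial sums diverge even though $\frac{1}{1-2} = -1$ is finite.

```python
# The geometric partial sums S_N = 1 + 2 + ... + 2^N grow without bound,
# so substituting x = 2 into the formal identity is meaningless even
# though 1/(1 - 2) = -1 is a perfectly finite number.
x = 2
sums = [sum(x ** n for n in range(N + 1)) for N in (5, 10, 20)]
print(sums)   # [63, 2047, 2097151]
```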
6. Finishing the original problem
We finally return to the original problem: we wish to show that the equality
cannot hold as formal series. Tacitly, this just means
as formal series.
Here is a solution obtained by considering only the coefficients, presented by Qiaochu Yuan in this MathOverflow question.
Both sides have nonzero constant coefficient, so we may invert them; thus it suffices to show we cannot have
as formal power series.
The coefficients on the LHS grow asymptotically like a polynomial times an exponential.
On the other hand, the coefficients of the RHS can be shown to have growth both strictly larger than any polynomial (by truncating the product) and strictly smaller than any exponential (by comparing to the growth rate in the case where , which gives the partition function mentioned before). So the two rates of growth can’t match.
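That sandwich can be checked numerically (a rough sketch; the comparison points $n^5$ and $1.2^n$ are arbitrary choices of mine, standing in for “any polynomial” and “any exponential”):

```python
def partitions(N):
    """p(0..N) via the standard partition-counting dynamic program."""
    p = [1] + [0] * N
    for k in range(1, N + 1):
        for n in range(k, N + 1):
            p[n] += p[n - k]
    return p

p = partitions(300)
print(p[300] > 300 ** 5)    # True: already beats n^5 here
print(p[300] < 1.2 ** 300)  # True: still below a modest exponential
```

A polynomial-times-exponential rate can never be squeezed between these two regimes, which is exactly the contradiction.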