How do you add infinitely many numbers together?
Exploring the surprising world of infinite series and their rearrangements
Adding two numbers together is one of the very first things we learn in school.
We practice addition and multiplication tables so much that it becomes second nature. Addition feels boring.
However, once we add infinitely many terms, things get really crazy. For instance, look at what happens when we sum the alternating sequence of 1s and (-1)s.
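Written out, the sum in question is

$$1 - 1 + 1 - 1 + 1 - 1 + \dots$$

Group the terms one way and every pair cancels; shift the grouping by one and a lone 1 survives:

$$(1 - 1) + (1 - 1) + \dots = 0, \qquad 1 + (-1 + 1) + (-1 + 1) + \dots = 1.$$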
Is it zero? Is it one?
It is neither. I told you: infinite sums are weird. In this post, I'll also show you that they are beautiful and useful.
Let’s go back to square one: the sum of infinitely many terms is called an infinite series. (Or series in short.)
Infinite series form the foundations of mathematics.
Even though in practice we don’t often see them explicitly, they are everywhere. For instance, the famous exponential function is defined by an infinite series.
Do infinite series make sense? Sure. Take a look at the geometric series: the positive powers of 1/2 add up to one. Here is a visual proof to convince you.
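In symbols:

$$\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \dots = 1.$$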
(If you are interested in why, check out my recent post about the details.)
Not all sums are finite. Here is the so-called harmonic series, that is, the sum of the reciprocals of positive integers.
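That is,

$$\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \dots = \infty.$$

It diverges: group the terms into blocks of length 1, 2, 4, 8, and so on, and each block contributes more than 1/2, so the partial sums grow without bound:

$$1 + \frac{1}{2} + \underbrace{\frac{1}{3} + \frac{1}{4}}_{>\,1/2} + \underbrace{\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}}_{>\,1/2} + \dots$$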
Some don’t even make sense, like summing the alternating sequence of 1s and (-1)s. We’ve seen that depending on how we group it, it can be zero or one. Or any other integer, in fact.
(For simplicity, we’ll call this the alternating series. There are many alternating series, but this is the special one.)
So, what makes an infinite series well-defined?
Making sense by convergence
Even if we can’t (yet) sum infinitely many terms, we can sum finitely many. As you have seen, this is denoted by the capital Greek sigma (Σ), short for sum.
The value of an infinite series is straightforward to define: we sum the first N terms, obtaining the so-called partial sums, and take the limit as N grows.
If the limit exists, we say that the series is convergent. A non-convergent series is called divergent.
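In symbols, the N-th partial sum and the value of the series are

$$S_N = \sum_{n=1}^{N} a_n, \qquad \sum_{n=1}^{\infty} a_n := \lim_{N \to \infty} S_N.$$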
For instance, here is the geometric series, one of the most famous infinite series.
Want a divergent example? We have seen one already: the alternating series. It can seemingly evaluate to zero or one because the “usual” properties of addition (like associativity and commutativity) break down for divergent series.
Of course, things are not that simple. Infinite series can behave in completely counterintuitive ways. For instance, can you add two infinite series term by term? Yes, but only if both of them are convergent.
Otherwise, remarkably weird things happen. Again, check what happens for the alternating series.
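Call the alternating series S and add it termwise to a copy of itself shifted by one position. Almost everything cancels:

$$2S = (1 + 0) + (-1 + 1) + (1 - 1) + \dots = 1.$$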
This implies that the value of the alternating series is 1/2. That is, you add integers together, and you obtain a fraction.
Of course, this is not the case: as the alternating series is divergent, termwise addition is an invalid operation.
Here is the biggest surprise: there are mathematically precise ways to extend the concept of convergence for infinite series, and the value of the alternating series will indeed be 1/2!
Turning up the weirdness
Even among convergent series, crazy stuff can happen. Recall the harmonic series? It’s divergent, but it has a close relative that is convergent.
Meet the alternating harmonic series.
Surprisingly, the alternating sum of reciprocals adds up to log 2.
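You can check this numerically. Here is a quick sketch in Python (the function name is mine):

```python
import math

def alt_harmonic_partial(n):
    """Partial sum of the first n terms of 1 - 1/2 + 1/3 - 1/4 + ..."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

print(alt_harmonic_partial(1_000_000))  # ≈ 0.693147, close to log 2
print(math.log(2))
```

The convergence is slow (the error after n terms is roughly 1/(2n)), but the partial sums do creep toward log 2 ≈ 0.6931.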
What happens when we rearrange the terms of the series? Common sense dictates that the sum should remain the same.
That is not the case. Take a look at the following rearrangement.
The change is seemingly small: instead of alternating between one odd and one even term, we do one odd and two even. (Odd and even with respect to their indices.)
Check out below what happens.
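Here is a sketch of that computation in Python (names are mine): each block takes one odd-indexed term followed by the next two even-indexed ones.

```python
import math

def rearranged_partial(blocks):
    """Partial sum of 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...:
    each block pairs one odd-indexed term with the next two even-indexed ones."""
    total = 0.0
    for k in range(1, blocks + 1):
        total += 1 / (2 * k - 1) - 1 / (2 * (2 * k - 1)) - 1 / (4 * k)
    return total

print(rearranged_partial(500_000))  # ≈ 0.34657, close to (log 2) / 2
```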
By changing the order of terms, we managed to cut the value in half. (Moreover, here is a rearrangement that’ll make the sum divergent.) Is this an accident?
No. One does not simply rearrange the terms of a convergent series. To distinguish the series that we can safely rearrange, we have to refine our concept of convergence.
We say that a series is absolutely convergent if the series formed by the absolute values of its terms is convergent as well. A series that is convergent but not absolutely convergent is called conditionally convergent.
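In symbols:

$$\sum_{n=1}^{\infty} a_n \text{ is absolutely convergent} \quad \Longleftrightarrow \quad \sum_{n=1}^{\infty} |a_n| < \infty.$$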
We have already seen examples of both: the geometric series is absolutely convergent, while the alternating harmonic series is only conditionally convergent.
For absolutely convergent series, we can rearrange the terms as we wish. This is how we expect a “proper” infinite series to behave.
However, conditionally convergent series go bonkers. What we’ve seen before is just the tip of the iceberg. As it turns out, given any conditionally convergent series and any real number c, there is a rearrangement that’ll sum to c.
This is called the Riemann rearrangement theorem.
When I first saw this, I couldn’t believe it. Any conditionally convergent series, any real number.
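To see the mechanism, here is a sketch of the standard greedy construction in Python (names and the target are my choices): while the running sum is below the target, append the next unused positive term; while above it, append the next unused negative term.

```python
import math

def riemann_rearrange(target, n_terms=100_000):
    """Greedy rearrangement of 1 - 1/2 + 1/3 - 1/4 + ...
    whose partial sums approach `target`."""
    total = 0.0
    next_pos, next_neg = 1, 2  # next unused odd / even denominator
    for _ in range(n_terms):
        if total <= target:
            total += 1 / next_pos  # below target: spend a positive term
            next_pos += 2
        else:
            total -= 1 / next_neg  # above target: spend a negative term
            next_neg += 2
    return total

print(riemann_rearrange(math.pi))  # ≈ 3.14159...
```

The argument works because the positive terms alone sum to infinity, the negative terms alone sum to minus infinity, and the terms shrink to zero, so the overshoot past the target keeps getting smaller.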
We have arrived at our conclusion. To sum up: infinite series form a pillar of mathematics. They play a crucial role in
defining the exponential function (via the so-called Taylor series),
expressing functions in terms of trigonometric functions (a.k.a. the Fourier series),
and many more.
Without infinite series, the face of science would be completely different.
We often use them without knowing that there is a series behind the scenes, and upon a closer look, there are many surprises in store. Just recall our very first example, showing how the familiar rules of addition break down. Associativity. Commutativity. Rules that are second nature to us.
The real lesson is this: intuition is very dangerous in mathematics. If weird things start popping up, we have to clear up our axioms and definitions first.
After all, axioms and definitions are the pillars of this massive structure we call mathematics.