Convergent Series
We have seen that a series is the sum of the terms of a sequence.
Just like sequences, series can also converge or diverge. A series converges if its sequence of partial sums converges. In other words, from the original sequence we calculate a new sequence, where each term is the sum of all terms up to that point.
\[\begin{align*} \text{Sequence} & : (a_1, a_2, a_3, \ldots, a_n) \\ \text{Series} & : S_n = \sum^{n}_{k=1}{a_k} = a_1 + a_2 + a_3 + \ldots + a_n \\ \text{Sequence of partial sums} & : S_1, S_2, S_3, \ldots, S_n \end{align*} \]If the sequence of partial sums converges then the series converges. The limit of the sequence of partial sums is called the sum or value of the series. If the sequence of partial sums diverges then the series diverges.
For a series to converge the underlying sequence must be a null sequence, in other words the limit of the original sequence must be zero. This is a necessary but not sufficient condition. There are series that diverge even though the sequence of terms converges to zero.
\[\sum_{n=1}^{\infty} a_n \text{ converges} \implies \lim_{n \to \infty} a_n = 0 \]There is an intuitive interpretation behind this condition. Imagine you’re summing up the terms of a series. For the series to converge, the partial sums need to settle on a finite value as you keep adding more and more terms. If the terms of the sequence do not approach zero, it becomes impossible for the partial sums to settle, and the series will diverge. In short, if the terms are not getting smaller and smaller, the series will keep getting larger and larger and will not converge.
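To make this concrete, here is a small numerical sketch in Python (purely illustrative; the helper name is ad hoc). Both term sequences below are null sequences, yet only the geometric partial sums settle on a finite value:

```python
# Terms going to zero is necessary but not sufficient for convergence.
def partial_sum(terms, n):
    """Sum the first n terms a_1, ..., a_n of the given term function."""
    return sum(terms(k) for k in range(1, n + 1))

harmonic = lambda k: 1 / k          # terms -> 0, but the series diverges
geometric = lambda k: 0.5 ** k      # terms -> 0, and the series converges

for n in [10, 1000, 100000]:
    print(n, partial_sum(harmonic, n), partial_sum(geometric, n))
```

The harmonic sums keep climbing without bound (slowly), while the geometric sums settle near 1; both series are treated in detail below.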
Geometric Series
Shouldn’t this be ar^k instead of just r^k?
Let’s look at some examples of series and analyze their convergence. The geometric series is a good example to start with. We define the geometric series for a value \(q \in \mathbb{C}\) where \(|q| < 1\) as:
\[\sum^{\infty}_{k=0}{q^k} \]For the series to converge we need to check if the sequence of partial sums converges. The sequence of partial sums is:
\[\begin{align*} S_n &= \sum^{n}_{k=0}{q^k} = 1 + q + q^2 + \ldots + q^n \\ q \cdot S_n &= q + q^2 + q^3 + \ldots + q^{n+1} \\ S_n - q \cdot S_n &= 1 - q^{n+1} \\ (1-q) \cdot S_n &= 1 - q^{n+1} \\ S_n &= \frac{1 - q^{n+1}}{1-q} \end{align*} \]Now we have a closed form for the sequence of partial sums and can take its limit to see if the series converges. As \(n \to \infty\) the term \(q^{n+1}\) goes to zero because \(|q| < 1\). So we expect the limit to exist and to be \(\frac{1}{1-q}\). Let’s prove it is indeed the limit:
\[\lim_{n \to \infty}\left|\frac{1 - q^{n+1}}{1-q} - \frac{1}{1-q}\right| = \lim_{n \to \infty}\left|\frac{1 - q^{n+1} - 1}{1-q}\right| = \lim_{n \to \infty}\left|\frac{- q^{n+1}}{1-q}\right| = \lim_{n \to \infty}\left|\frac{q^{n+1}}{1-q}\right| = 0 \]So the sequence of partial sums converges to \(\frac{1}{1-q}\) for \(|q| < 1\), which means the geometric series converges to \(\frac{1}{1-q}\).
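As a quick sanity check (a Python sketch, not part of the proof), we can compare the closed form \(S_n = \frac{1-q^{n+1}}{1-q}\) against direct term-by-term summation, and against the limit \(\frac{1}{1-q}\), for a real and a complex \(q\) with \(|q| < 1\):

```python
# Check the closed form S_n = (1 - q^(n+1)) / (1 - q) numerically.
def partial_sum_direct(q, n):
    """S_n = 1 + q + q^2 + ... + q^n, summed term by term."""
    return sum(q ** k for k in range(n + 1))

def partial_sum_closed(q, n):
    return (1 - q ** (n + 1)) / (1 - q)

for q in [0.5, -0.9, 0.3 + 0.4j]:
    # Closed form agrees with direct summation ...
    assert abs(partial_sum_direct(q, 50) - partial_sum_closed(q, 50)) < 1e-9
    # ... and for large n the partial sums approach the limit 1 / (1 - q).
    assert abs(partial_sum_closed(q, 2000) - 1 / (1 - q)) < 1e-9
print("closed form and limit check out")
```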
Let’s look at the geometric series for \(q = \frac{1}{2}\):
\[\begin{align*} \sum^{\infty}_{k=0}{\left(\frac{1}{2}\right)^k} &= 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \ldots \\ &= \frac{1}{1-\frac{1}{2}} = \frac{1}{\frac{1}{2}} = 2 \end{align*} \]But what if the index starts at 1 rather than 0? Let’s revisit the closed form for the sequence of partial sums:
\[\begin{align*} \sum^{\infty}_{k=1}{q^k} &= q + q^2 + q^3 + \ldots \\ &= \sum^{\infty}_{k=0}{q^k} - q^0 = \frac{1}{1-q} - 1 = \frac{q}{1-q} \end{align*} \]This series still converges, which shows that the first few terms of a series have no effect on whether it converges; they only affect its value. So the geometric series for \(q = \frac{1}{2}\) starting at 1 converges to \(\frac{\frac{1}{2}}{1-\frac{1}{2}} = 1\) instead of 2.
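Numerically (again just a sketch in Python) the effect of the starting index is easy to see for \(q = \frac{1}{2}\):

```python
# Starting the geometric series at k = 1 drops the first term q^0 = 1:
# the value changes from 1/(1-q) = 2 to q/(1-q) = 1, but convergence is unaffected.
q = 0.5
from_zero = sum(q ** k for k in range(0, 200))  # partial sum starting at k = 0
from_one = sum(q ** k for k in range(1, 200))   # partial sum starting at k = 1
print(from_zero, from_one)
```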
Harmonic Series
Add the proof for the harmonic series. Probably with Cauchy criterion.
The harmonic series \(\sum^{\infty}_{n=1}{\frac{1}{n}}\) diverges. Despite the sequence of terms converging to zero, the series diverges. This is a good example showing that the terms of the sequence converging to zero is a necessary but not sufficient condition for the series to converge.
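One standard way to prove this divergence uses the Cauchy criterion: we bound a whole block of terms from below. For any \(n\), the block of terms from \(n+1\) to \(2n\) contains \(n\) terms, each of which is at least \(\frac{1}{2n}\):
\[|S_{2n} - S_n| = \sum^{2n}_{k=n+1}{\frac{1}{k}} \geq n \cdot \frac{1}{2n} = \frac{1}{2} \]So no matter how far out we go, there are always partial sums that differ by at least \(\frac{1}{2}\). For \(\epsilon = \frac{1}{2}\) the Cauchy criterion can therefore never be satisfied, and the harmonic series diverges.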
As an aside: for the geometric series there is a nice visualization showing that its partial sums stay bounded.

Telescope Series
Add the proof for the telescope series, show visually why it is called a telescope series.
The telescoping series \(\sum^{\infty}_{n=1}{\frac{1}{n(n+1)}}\) converges to 1.
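A short proof sketch: using the partial fraction decomposition \(\frac{1}{k(k+1)} = \frac{1}{k} - \frac{1}{k+1}\), consecutive terms cancel each other and the partial sums collapse like a telescope, which is also where the name comes from:
\[\begin{align*} S_n = \sum^{n}_{k=1}{\frac{1}{k(k+1)}} &= \sum^{n}_{k=1}{\left(\frac{1}{k} - \frac{1}{k+1}\right)} \\ &= \left(1 - \frac{1}{2}\right) + \left(\frac{1}{2} - \frac{1}{3}\right) + \ldots + \left(\frac{1}{n} - \frac{1}{n+1}\right) = 1 - \frac{1}{n+1} \end{align*} \]As \(n \to \infty\) the term \(\frac{1}{n+1}\) goes to zero, so the sequence of partial sums, and with it the series, converges to 1.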
Beware of Infinite Sums
An important note: for series we cannot simply apply our usual rules for the sum operator. The reason is that the sums run to infinity, so the normal rules of arithmetic do not automatically carry over.
So if we have two series \(\sum^{\infty}_{k=1}{a_k}\) and \(\sum^{\infty}_{j=1}{b_j}\) and both converge, then the following rules apply:
- \(\sum^{\infty}_{k=1}{c \cdot a_k}=c \cdot \sum^{\infty}_{k=1}{a_k}\) for \(c \in \mathbb{C}\)
- \(\sum^{\infty}_{k=1}{(a_k\pm b_k)}=\sum^{\infty}_{k=1}{a_k}\pm \sum^{\infty}_{k=1}{b_k}\)
To see why some care is needed, compare two examples. We could try to split \(\sum^{\infty}_{n=1}{\frac{1}{n(n+1)}}\) into \(\sum^{\infty}_{n=1}{\frac{1}{n}} - \sum^{\infty}_{n=1}{\frac{1}{n+1}}\). We already know the left-hand side is the telescoping series that converges to 1, but the two series on the right both diverge, so the rule above does not apply and the split only produces the meaningless expression infinity minus infinity.
The same trap appears with the series \(\sum^{\infty}_{n=1}{0}\), which clearly converges to 0. If we write each term as \(1 - 1\) and split the sum into \(\sum^{\infty}_{n=1}{1} - \sum^{\infty}_{n=1}{1}\), both parts diverge and we are again left with infinity minus infinity instead of 0.
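The difference is easy to observe numerically (a Python sketch): summed as one series, the terms \(\frac{1}{n} - \frac{1}{n+1}\) telescope toward 1, while the two split sums each grow without bound:

```python
# Splitting a convergent series into two divergent ones is not allowed.
N = 100000
combined = sum(1 / n - 1 / (n + 1) for n in range(1, N + 1))  # one series
left = sum(1 / n for n in range(1, N + 1))                     # diverges
right = sum(1 / (n + 1) for n in range(1, N + 1))              # diverges
print(combined)     # close to 1
print(left, right)  # both keep growing as N grows
```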
Cauchy Criterion
Add the proof for the Cauchy criterion. Show that it also works for the sequence of partial sums.
We have seen that for sequences we can use the Cauchy criterion to check if a sequence converges. Specifically we can say that a sequence converges if the following holds:
\[\forall \epsilon > 0 \quad \exists N_{\epsilon} \in \mathbb{N} : |a_n - a_m| < \epsilon \text{ for all } n, m \geq N_{\epsilon} \]We can also apply this to series. The Cauchy criterion for series states that a series converges if the following holds:
\[\forall \epsilon > 0 \exists N_{\epsilon} \in \mathbb{N} : |\sum^{m}_{k=n} a_k| < \epsilon \text{ for all } m \geq n \text{ and } n \geq N_{\epsilon} \]So after some point \(N_{\epsilon}\) the sum of all terms from \(n\) to all \(m\) is less than \(\epsilon\). So in other words after some point the sum of all terms does not change the value of the series anymore. This is a very powerful test to check if a series converges.
The proof is rather simple. We know a series converges if the sequence of partial sums converges. So we can just apply the Cauchy criterion for sequences to the sequence of partial sums. So if the partial sums are Cauchy, then the series converges. For the partial sums to be Cauchy, we need to show that the following holds where \(S_n = \sum^{n}_{k=1}{a_k}\):
\[\forall \epsilon > 0 \quad \exists N_{\epsilon} \in \mathbb{N} : |S_m - S_{n-1}| < \epsilon \text{ for all } m \geq n \geq N_{\epsilon} \]From this we can derive the following, which gives us the Cauchy criterion for series:
\[|S_m - S_{n-1}| = \left|\sum^{m}_{k=1}{a_k} - \sum^{n-1}_{k=1}{a_k}\right| = \left|\sum^{m}_{k=n}{a_k}\right| < \epsilon \text{ for all } m \geq n \geq N_{\epsilon} \]
Direct Comparison Test
The direct comparison test is another method, like the Cauchy criterion, to check whether a series converges. The idea comes from the monotone convergence theorem and is, in my opinion, a lot more intuitive than the Cauchy criterion.
First let’s just focus on series where after some point \(k\) the terms are always positive. So more formally we focus on series where the following holds:
\[\text{Let } \sum^{\infty}_{n=1}{a_n} \text{ be a series with } a_n \geq 0 \text{ for all } n \geq k \text{ where } k \geq 1 \]We can then use the monotone convergence theorem to show that the series converges if the sequence of partial sums is bounded. For the theorem to apply we need two things: the sequence of partial sums must be bounded by some value \(M\), and it must be monotone increasing. The latter is intuitively clear, since the terms are eventually all non-negative:
\[S_{n+1} - S_n = \sum^{n+1}_{k=1}{a_k} - \sum^{n}_{k=1}{a_k} = a_{n+1} \geq 0 \text{ for all } n \geq 1 \]So if we have a series that fulfills the conditions above, we only need to show that the sequence of partial sums is bounded to conclude that the series converges.
Add an example of using this and showing that the sequence of partial sums is bounded. Such as maybe the telescope series.
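As a small numerical sketch of this idea (using \(\sum^{\infty}_{n=1}{\frac{1}{n^2}}\), which also appears below): its partial sums are monotone increasing and stay below 2, because \(\frac{1}{k^2} \leq \frac{1}{k(k-1)}\) for \(k \geq 2\) and the latter telescopes.

```python
# Partial sums of sum 1/n^2: monotone increasing and bounded above by 2.
prev, s = 0.0, 0.0
for k in range(1, 100001):
    s += 1 / k ** 2
    assert s >= prev  # monotone increasing, since every term is positive
    assert s < 2      # bounded above
    prev = s
print(s)  # approaches pi^2 / 6
```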
The direct comparison test follows from the idea shown above, but in a more general way that is somewhat reminiscent of the squeeze theorem. The idea is that we can compare two series and conclude the convergence of one from the convergence of the other. The formal definition of the direct comparison test is as follows. If we have two series \(\sum^{\infty}_{n=1}{a_n}\) and \(\sum^{\infty}_{n=1}{b_n}\) and we know that the following holds for all \(k \geq 1\):
\[0 \leq a_k \leq b_k \]So the terms of both series are always positive and the terms of the first series are always less than or equal to the terms of the second series. Then we can say that if the second series converges, then the first series also converges.
\[\sum^{\infty}_{n=1}{b_n} \text{ converges} \implies \sum^{\infty}_{n=1}{a_n} \text{ converges} \]The intuition behind this is rather simple. If \(\sum b_n\) converges, then the sum of all its terms is finite. Since each \(a_n\) is less than or equal to the corresponding \(b_n\), the sum of all the \(a_n\) must also be finite. Because we only allow non-negative terms, we also do not have to worry about divergence to negative infinity. In a similar way we can show that if the first series diverges, then the second series also diverges.
\[\sum^{\infty}_{n=1}{a_n} \text{ diverges} \implies \sum^{\infty}_{n=1}{b_n} \text{ diverges} \]The idea is again the same: if \(\sum a_n\) diverges, then the sum of its terms is infinite. Since each \(b_n\) is greater than or equal to the corresponding \(a_n\), the sum of the \(b_n\) must also be infinite, and therefore that series diverges as well.
Let’s look at the series \(\sum^{\infty}_{n=1}{\frac{1}{n^2}}\) and compare it to the telescoping series \(\sum^{\infty}_{n=2}{\frac{1}{n(n-1)}}\), which we know converges to 1. We want to show that the first series converges as well. We can do this by showing that its terms are always positive and, from some point on, less than or equal to the terms of the second series. The telescoping term is not defined for \(n=1\), but starting at \(n=2\) the terms of the first series are always less than or equal to those of the second:
| \(k\) | \(\frac{1}{k^2}\) | \(\frac{1}{k(k-1)}\) |
| --- | --- | --- |
| 1 | 1 | undefined |
| 2 | \(\frac{1}{4}\) | \(\frac{1}{2}\) |
| 3 | \(\frac{1}{9}\) | \(\frac{1}{6}\) |
| 4 | \(\frac{1}{16}\) | \(\frac{1}{12}\) |
| 5 | \(\frac{1}{25}\) | \(\frac{1}{20}\) |
| 6 | \(\frac{1}{36}\) | \(\frac{1}{30}\) |
Therefore by the direct comparison test we can say that the series \(\sum^{\infty}_{n=1}{\frac{1}{n^2}}\) converges as well.
We can actually make this test even more powerful. Above we required that the terms \(b_k\) are always greater than or equal to the terms \(a_k\) and that the terms \(a_k\) are always non-negative. It turns out that the test also works if these conditions only hold for all \(k \geq m\) for some \(m \in \mathbb{N}\); in other words, a finite number of terms may violate them. The intuition behind this is that the first \(m\) terms of a series have no influence on its convergence, only on its value. So the test works as long as the following holds:
\[0 \leq a_k \leq b_k \text{ for all } k \geq m \text{ where } m \geq 1 \]Some example where the condition does not hold for the first few terms.
Absolute Convergence
We have seen that series can converge to some limit or diverge. But there is also a stronger notion called absolute convergence. A series is said to be absolutely convergent if the series of the absolute values of its terms converges. In other words, if we take the absolute value of each term and the resulting series converges, then the original series is absolutely convergent. More formally we define absolute convergence as:
\[\sum^{\infty}_{n=1}{|a_n|} \text{ converges} \]If a series is absolutely convergent, then it is also convergent. However, the converse is not true. A series can be convergent without being absolutely convergent.
\[\sum^{\infty}_{n=1}{|a_n|} \text{ converges} \implies \sum^{\infty}_{n=1}{a_n} \text{ converges} \]This is easily proven with the Cauchy criterion for series. If the series is absolutely convergent, the tail sums \(\sum^{m}_{k=n}{|a_k|}\) become arbitrarily small. By the triangle inequality, the absolute value of a sum is at most the sum of the absolute values, since terms may cancel each other out. So the tails of the original series become arbitrarily small as well, and the series converges:
\[\left|\sum^{m}_{k=n} a_k\right| \leq \sum^{m}_{k=n} |a_k| < \epsilon \text{ for all } m \geq n \text{ and } n \geq N_{\epsilon} \]If we say that \(S_n = \sum^{n}_{k=1}{a_k}\) and \(T_n = \sum^{n}_{k=1}{|a_k|}\), then the same triangle inequality also gives us:
\[\left|\sum^{\infty}_{k=1}{a_k}\right| = \left|\lim_{n \to \infty} S_n\right| = \lim_{n \to \infty} |S_n| \leq \lim_{n \to \infty} T_n = \sum^{\infty}_{k=1}{|a_k|} \]Let’s look at the series \(\sum^{\infty}_{n=1}{(-1)^{n+1} \frac{1}{n}}\). The sequence of terms converges to zero, so the necessary condition is met. If we analyze the partial sums, we notice that the series converges to \(\ln(2)\). So the series converges.
But if we take the absolute values of the terms, we get the harmonic series \(\sum^{\infty}_{n=1}{\frac{1}{n}}\), which diverges. So the original series is convergent but not absolutely convergent. This is a good example showing that convergence does not imply absolute convergence.
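Numerically (a Python sketch) the contrast is clear: the signed partial sums settle near \(\ln(2)\), while the partial sums of the absolute values keep growing:

```python
import math

# Alternating harmonic series vs. its absolute-value version.
N = 100000
signed = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
absolute = sum(1 / n for n in range(1, N + 1))
print(signed, math.log(2))  # the two values are very close
print(absolute)             # keeps growing as N increases
```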
Alternating Series Test
Also known as Leibniz test.
Alternating Series Estimation Theorem
Riemann Rearrangement Theorem
A rearrangement of a series is a new series formed by rearranging the terms of the original series. More formally, the series \(\sum^{\infty}_{n=1}{x_n}\) is a rearrangement of the series \(\sum^{\infty}_{n=1}{a_n}\) if there exists a bijective function of the form:
\[f: \mathbb{N} \to \mathbb{N} \text{ such that } x_n = a_{f(n)} \text{ for all } n \in \mathbb{N} \]Riemann showed that the series \(\sum^{\infty}_{n=1}{(-1)^{n+1} \frac{1}{n}}\) could be rearranged to converge to any real number.
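Riemann’s construction can be sketched in a few lines of Python (illustrative only; the function name and step count are arbitrary choices): greedily take unused positive terms \(1, \frac{1}{3}, \frac{1}{5}, \ldots\) while the running sum is below the target, and unused negative terms \(-\frac{1}{2}, -\frac{1}{4}, \ldots\) while it is above. Because the positive and the negative terms each sum to infinity on their own while the terms themselves go to zero, the rearranged partial sums oscillate ever closer to any chosen target:

```python
# Greedy rearrangement of sum (-1)^(n+1)/n toward an arbitrary target.
def rearranged_partial_sum(target, steps):
    pos, neg = 1, 2  # next odd / even denominator not yet used
    s = 0.0
    for _ in range(steps):
        if s <= target:
            s += 1 / pos  # take the next positive term +1/1, +1/3, +1/5, ...
            pos += 2
        else:
            s -= 1 / neg  # take the next negative term -1/2, -1/4, -1/6, ...
            neg += 2
    return s

print(rearranged_partial_sum(0.0, 100000))
print(rearranged_partial_sum(3.14, 100000))
```

Every term of the original series is used exactly once in the limit, so this really is a rearrangement in the sense of the bijection above.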
Dirichlet’s Rearrangement Theorem
Dirichlet showed that if a series converges absolutely, then every rearrangement of the series converges, and even converges to the same value.
Ratio Test
\[\limsup_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| < 1 \implies \sum^{\infty}_{n=1}{a_n} \text{ converges} \] \[\liminf_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| > 1 \implies \sum^{\infty}_{n=1}{a_n} \text{ diverges} \]
Root Test
A stronger version of the ratio test: it can do everything the ratio test can do and more.
\[\limsup_{n \to \infty} \sqrt[n]{|a_n|} < 1 \implies \sum^{\infty}_{n=1}{a_n} \text{ converges} \] \[\limsup_{n \to \infty} \sqrt[n]{|a_n|} > 1 \implies \sum^{\infty}_{n=1}{a_n} \text{ and } \sum^{\infty}_{n=1}{|a_n|} \text{ diverges} \]
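To close, a small Python sketch of both tests on \(a_n = \frac{n}{2^n}\) (an illustrative example, not from the text above): both the ratio \(\left|\frac{a_{n+1}}{a_n}\right|\) and the \(n\)-th root \(\sqrt[n]{|a_n|}\) approach \(\frac{1}{2} < 1\), so \(\sum^{\infty}_{n=1}{\frac{n}{2^n}}\) converges:

```python
# Ratio and root test, evaluated numerically for a_n = n / 2^n.
def a(n):
    return n / 2 ** n

for n in [10, 100, 500]:
    ratio = a(n + 1) / a(n)  # tends to 1/2
    root = a(n) ** (1 / n)   # also tends to 1/2
    print(n, ratio, root)
```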