Summing a series using the Central Limit Theorem

🪔 🪔 🪔  Happy Deepavali!  🪔 🪔 🪔

I wanted to share an unexpected connection that I discovered this morning.
Usually, calculus and analysis are used to prove results about random variables. In this post, we’ll turn that idea on its head — we’ll use the Central Limit Theorem to evaluate a purely analytical limit!

So I was browsing through Crux Mathematicorum’s latest issue and came across this interesting limit problem:

Crux Problem OC750 (p. 377).

Find

\[ \lim_{n \to \infty} \sum_{k = n}^{2n} \binom{k - 1}{\,n - 1\,} 2^{-k}. \]

Crux usually does not publish undergraduate-level solutions, so I will present one here.

It also happens that I am teaching a probability and statistics course right now. One of the fundamental theorems that connects the theory of probability to the practice of statistics is the Central Limit Theorem. Let me state a simple version of it:

Theorem (Central Limit Theorem). Let \( X_1, X_2, \dots, X_n \) be independent and identically distributed random variables with mean \(\mu\) and finite variance \(\sigma^2 > 0\). If \[ S_n := \sum_{i=1}^n X_i, \quad Y_n := \dfrac{S_n - n\mu}{\sqrt{n\sigma^2}}, \] then for every \(x \in \mathbb{R}\), \[ \lim_{n\to \infty} F_{Y_n}(x) = \int_{-\infty}^{x} \dfrac{e^{-t^2/2}}{\sqrt{2\pi}} \, dt, \] where \(F_X\) denotes the CDF of \(X\). We denote this statement by \(Y_n \xrightarrow{d} N(0,1)\).
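To see the theorem in action before we use it, here is a small numerical sketch (my own illustration, not part of the Crux solution): for fair Bernoulli\((1/2)\) summands, \(F_{Y_n}(x)\) can be computed exactly with binomial coefficients and compared with the normal CDF.

```python
import math

def normal_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def standardized_bernoulli_cdf(n, x):
    """F_{Y_n}(x) for Y_n = (S_n - n*mu) / sqrt(n*sigma^2), where S_n is a
    sum of n fair Bernoulli(1/2) variables (mu = 1/2, sigma^2 = 1/4)."""
    threshold = x * math.sqrt(n * 0.25) + n * 0.5   # event {S_n <= threshold}
    kmax = math.floor(threshold)
    return sum(math.comb(n, k) for k in range(kmax + 1)) / 2**n

for n in (10, 100, 1000):
    print(n, standardized_bernoulli_cdf(n, 1.0), normal_cdf(1.0))
```

As \(n\) grows, the exact binomial probability drifts toward \(\Phi(1) \approx 0.8413\).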

We will solve the Crux problem by identifying the summand as a probability from a classical setup.

Definition. Let \(S_n\) denote the number of tosses required to obtain the \(n\)-th head in a sequence of independent fair coin tosses. This distribution is called the negative binomial distribution with parameters \((n, 1/2)\).

Lemma. For \(k \geq n\), \[ \Pr(S_n = k) = \binom{k-1}{\,n-1\,}2^{-k}. \]

Proof.
The event \(\{S_n = k\}\) occurs exactly when the \(k\)-th toss is a head and the first \(k-1\) tosses contain exactly \(n-1\) heads.
There are \(\binom{k-1}{n-1}\) such sequences, and each one, having \(n\) heads and \(k-n\) tails, occurs with probability \(\left(\tfrac12\right)^n\left(\tfrac12\right)^{k-n}\). Hence \[ \Pr(S_n = k) = \binom{k-1}{\,n-1\,} \left(\frac{1}{2}\right)^{n} \left(\frac{1}{2}\right)^{k-n} = \binom{k-1}{\,n-1\,}2^{-k}. \]
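As a sanity check (not part of the proof), we can verify the formula by brute-force enumeration of all \(2^k\) coin sequences for small \(n\) and \(k\):

```python
import itertools
import math

def negbin_pmf(n, k):
    """Pr(S_n = k) from the formula above."""
    return math.comb(k - 1, n - 1) / 2**k

def negbin_pmf_bruteforce(n, k):
    """Enumerate all 2^k coin sequences (1 = head) and count those whose
    n-th head occurs exactly at toss k."""
    hits = sum(
        1
        for seq in itertools.product((0, 1), repeat=k)
        if seq[-1] == 1 and sum(seq) == n
    )
    return hits / 2**k

for n, k in [(1, 1), (2, 3), (3, 5), (4, 8)]:
    print(n, k, negbin_pmf(n, k), negbin_pmf_bruteforce(n, k))
```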

Hence our Crux sum can be written as \[ A_n = \sum_{k=n}^{2n} \binom{k-1}{\,n-1\,} 2^{-k} = \Pr(S_n \le 2n), \] that is, the probability that the \(n\)-th head appears by time \(2n\).
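A quick numerical sketch of \(A_n\) (an exploration, not part of the argument) hints at how the sum behaves as \(n\) grows:

```python
import math

def A(n):
    """A_n = sum_{k=n}^{2n} C(k-1, n-1) * 2^{-k} = Pr(S_n <= 2n)."""
    return sum(math.comb(k - 1, n - 1) / 2**k for k in range(n, 2 * n + 1))

for n in (1, 10, 100, 1000):
    print(n, A(n))   # the values decrease slowly toward a limit
```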

Now that we have converted the sum to a probability, we can compute the limit using the Central Limit Theorem.
To apply it, we need to express \(S_n\) as a sum of i.i.d. random variables.
Luckily, the negative binomial distribution is exactly such a sum — of independent geometric random variables.


Let’s recall geometric random variables

Definition (Geometric random variable). A random variable \(X\) has the geometric distribution with parameter \(p\) if \[ \Pr(X=k) = (1-p)^{k-1}p, \qquad k = 1,2,\dots \] and it represents the number of trials required for the first success.

It has mean \(E[X] = 1/p\) and variance \(\mathrm{Var}(X) = (1-p)/p^2.\) If we define \(X_1, X_2, \dots, X_n\) as i.i.d. geometric\((1/2)\) random variables, then the total number of trials needed to get \(n\) successes is \[ S_n = X_1 + X_2 + \cdots + X_n. \] Note that \(S_n\) is precisely the number of coin tosses required to see exactly \(n\) heads, since \(X_i\) is the interarrival time between the \((i-1)\)th head and the \(i\)th head.
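We can check this identification numerically (a sketch with a truncated geometric tail, not a proof) by convolving geometric\((1/2)\) pmfs and comparing against the negative binomial formula:

```python
import math

def convolve(p, q):
    """Convolution of two pmfs represented as {value: probability} dicts."""
    out = {}
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] = out.get(a + b, 0.0) + pa * qb
    return out

# geometric(1/2) pmf, truncated at 40 (the missing tail mass is 2^-40)
geom = {k: 0.5**k for k in range(1, 41)}

n = 5
dist = geom
for _ in range(n - 1):      # convolve n copies in total
    dist = convolve(dist, geom)

for k in range(n, n + 4):
    print(k, dist[k], math.comb(k - 1, n - 1) / 2**k)
```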

Linearity of expectation, together with the additivity of variance for independent variables, immediately gives \[ E[S_n] = 2n, \qquad \mathrm{Var}(S_n) = 2n. \]
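These values can again be sanity-checked directly from the pmf, truncating the series far into the tail, where the remaining mass is negligible:

```python
import math

def negbin_pmf(n, k):
    """Pr(S_n = k) = C(k-1, n-1) * 2^{-k}."""
    return math.comb(k - 1, n - 1) / 2**k

def negbin_moments(n, kmax=500):
    """Mean and variance of S_n, summing the pmf up to kmax."""
    mean = sum(k * negbin_pmf(n, k) for k in range(n, kmax + 1))
    second = sum(k * k * negbin_pmf(n, k) for k in range(n, kmax + 1))
    return mean, second - mean**2

print(negbin_moments(5))    # both values come out close to 2n = 10
```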


Since the random variable \(S_n\) is a sum of i.i.d. geometric\((1/2)\) variables, the Central Limit Theorem implies \[ \frac{S_n - 2n}{\sqrt{2n}} \;\xrightarrow{d}\; N(0,1). \]

Thus, writing the event in standardized form, for \(Z \sim N(0,1)\), \[ \lim_{n \to \infty}\Pr(S_n \le 2n) = \lim_{n \to \infty}\Pr\!\left(\frac{S_n - 2n}{\sqrt{2n}} \le 0\right) = \Pr(Z \le 0) = \tfrac{1}{2}, \] since \(0\) is a continuity point of the normal CDF.

We finally have

Conclusion.

\[ \lim_{n \to \infty} \sum_{k=n}^{2n} \binom{k-1}{\,n-1\,}2^{-k} = \tfrac{1}{2}. \]

So the apparently analytic limit is, in fact, the limiting probability that the \(n\)-th head appears no later than its expected time.


As a professor, I would be remiss if I didn't give my readers a few exercises:

Exercises.

1. Evaluate the following limit:

\[ \lim_{n \to \infty} \int_{0}^{n} \frac{x^{n-1} e^{-x}}{(n-1)!}\, dx. \]

[Hint: A sum of i.i.d. exponential random variables has a Gamma distribution.]

2. Evaluate the following limit:

\[ \lim_{n \to \infty} \sum_{k = 0}^{n} e^{-n}\frac{n^k}{k!}. \]

[Hint: A sum of i.i.d. Poisson random variables is Poisson.]

3. Let \(\Gamma(x)\) denote the Gamma function. Evaluate the following limit:

\[ \lim_{n \to \infty} \int_{0}^{n} \frac{1}{2^{n/2}\Gamma\!\left(\tfrac{n}{2}\right)} x^{\frac{n}{2}-1} e^{-x/2}\, dx. \]

[Hint: A sum of squares of i.i.d. standard normals is chi-square.]

4. For a fixed \(p\in(0,1)\), evaluate the limit:

\[ \lim_{n\to\infty} \sum_{k=0}^{\lfloor n p\rfloor} \binom{n}{k}\, p^{k}(1-p)^{n - k}. \]

[Hint: A sum of i.i.d. Bernoulli variables is binomial.]
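If you would like to check your answers for Exercises 2 and 4 numerically, here is a small script (a sketch using log-space sums via `math.lgamma` to avoid overflow; the integrals in Exercises 1 and 3 are left to you). Be warned: running it gives away the answers, so try the exercises first.

```python
import math

def poisson_cdf_at_mean(n):
    """Exercise 2: e^{-n} * sum_{k=0}^{n} n^k / k!, each term in log space."""
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(n + 1))

def binomial_cdf_at_mean(n, p):
    """Exercise 4: sum_{k=0}^{floor(np)} C(n,k) p^k (1-p)^(n-k)."""
    log_terms = (
        math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        + k * math.log(p) + (n - k) * math.log(1 - p)
        for k in range(math.floor(n * p) + 1)
    )
    return sum(math.exp(t) for t in log_terms)

for n in (10, 100, 1000):
    print(n, poisson_cdf_at_mean(n), binomial_cdf_at_mean(n, 0.3))
```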



