Martingale Property¶
The martingale property is one of the most important structural features of the symmetric random walk. It is the foundation for the optional stopping theorem and the exact ruin probability formulas in the exercises.
Filtration and Adaptedness¶
Let \(\mathcal{F}_n = \sigma(\xi_1, \ldots, \xi_n)\) be the natural filtration — the \(\sigma\)-algebra generated by the first \(n\) steps. Informally, \(\mathcal{F}_n\) represents all information available at time \(n\).
The process \(\{S_n\}_{n \geq 0}\) is adapted to \(\{\mathcal{F}_n\}\): each \(S_n = \sum_{i=1}^n \xi_i\) is measurable with respect to \(\mathcal{F}_n\), since it depends only on \(\xi_1, \ldots, \xi_n\). Adaptedness is a prerequisite for the martingale definition, and it matters later for optional stopping.
The Martingale Property¶
Proposition 1.1.3 (Martingale Property)
If \(p = 1/2\), then \(\{S_n, \mathcal{F}_n\}_{n \geq 0}\) is a martingale:
- Integrability: \(\mathbb{E}|S_n| \leq \sqrt{\mathbb{E}[S_n^2]} = \sqrt{n} < \infty\) for all \(n\), by Jensen's inequality (here \(\mathbb{E}[S_n^2] = \sum_{i=1}^n \mathbb{E}[\xi_i^2] = n\), since the steps are independent with mean zero and \(\xi_i^2 = 1\)).
- Adaptedness: \(S_n\) is \(\mathcal{F}_n\)-measurable.
- Martingale condition: \(\mathbb{E}[S_{n+1} \mid \mathcal{F}_n] = S_n\).
Proof of the martingale condition.
The key steps are: (i) \(S_n\) is \(\mathcal{F}_n\)-measurable, so it pulls out of the conditional expectation; (ii) \(\xi_{n+1}\) is independent of \(\mathcal{F}_n = \sigma(\xi_1,\ldots,\xi_n)\), so the conditional expectation reduces to the unconditional mean \(\mathbb{E}[\xi_{n+1}] = 0\). In one line:

$$\mathbb{E}[S_{n+1} \mid \mathcal{F}_n] = \mathbb{E}[S_n + \xi_{n+1} \mid \mathcal{F}_n] = S_n + \mathbb{E}[\xi_{n+1}] = S_n. \qquad \square$$
Why \(p = 1/2\) is required
For \(p \neq 1/2\) we have \(\mathbb{E}[\xi_{n+1}] = 2p-1 \neq 0\), so \(\mathbb{E}[S_{n+1} \mid \mathcal{F}_n] = S_n + (2p-1) \neq S_n\). The walk is a submartingale if \(p > 1/2\) and a supermartingale if \(p < 1/2\).
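This trichotomy is easy to check empirically. The following sketch (stdlib only; the helper name and parameter values are illustrative, not from the text) estimates \(\mathbb{E}[S_n]\) by Monte Carlo: near zero for \(p = 1/2\), near the drift \(n(2p-1)\) otherwise.

```python
# Monte Carlo check of the martingale / drift trichotomy.
# For p = 1/2 the mean position stays near 0; for p != 1/2 the walk
# drifts at rate 2p - 1 per step.
import random

random.seed(1)

def mean_position(p, n, trials=20_000):
    """Estimate E[S_n] for a walk with up-probability p."""
    total = 0
    for _ in range(trials):
        total += sum(1 if random.random() < p else -1 for _ in range(n))
    return total / trials

n = 50
assert abs(mean_position(0.5, n)) < 0.5                       # E[S_n] = 0
assert abs(mean_position(0.7, n) - n * (2 * 0.7 - 1)) < 1.5   # drift n(2p-1) = 20
```

The tolerances are loose relative to the standard error of the estimates, so the checks pass reliably despite the randomness.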
The Quadratic Martingale¶
Proposition 1.1.4 (Quadratic Martingale)
The process \(\{M_n\}_{n \geq 0}\) defined by

$$M_n = S_n^2 - n$$

is a martingale with respect to \(\{\mathcal{F}_n\}\).
Proof.
Expanding the square:

$$S_{n+1}^2 = (S_n + \xi_{n+1})^2 = S_n^2 + 2 S_n \xi_{n+1} + \xi_{n+1}^2.$$
Using measurability of \(S_n\), independence of \(\xi_{n+1}\), and \(\mathbb{E}[\xi_{n+1}] = 0\), \(\mathbb{E}[\xi_{n+1}^2] = 1\):

$$\mathbb{E}[S_{n+1}^2 \mid \mathcal{F}_n] = S_n^2 + 2 S_n \mathbb{E}[\xi_{n+1}] + \mathbb{E}[\xi_{n+1}^2] = S_n^2 + 1.$$
Therefore:

$$\mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = \mathbb{E}[S_{n+1}^2 \mid \mathcal{F}_n] - (n+1) = S_n^2 + 1 - (n+1) = S_n^2 - n = M_n. \qquad \square$$
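A quick numerical sanity check of Proposition 1.1.4 (a sketch, stdlib only): since \(\mathbb{E}[M_n] = \mathbb{E}[M_0] = 0\) for a martingale, the sample mean of \(S_n^2 - n\) should be near zero.

```python
# Monte Carlo check that M_n = S_n^2 - n has mean zero for the
# symmetric walk (a consequence of the martingale property).
import random

random.seed(2)

def mean_M(n, trials=20_000):
    """Estimate E[S_n^2 - n] over independent symmetric walks."""
    acc = 0.0
    for _ in range(trials):
        s = sum(random.choice((-1, 1)) for _ in range(n))
        acc += s * s - n
    return acc / trials

assert abs(mean_M(30)) < 2.0  # E[M_30] = E[M_0] = 0
```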
Quadratic Variation¶
Definition 1.1.4 (Discrete Quadratic Variation)
The quadratic variation of the random walk over \(\{0, 1, \ldots, n\}\) is

$$[S]_n = \sum_{i=1}^n (S_i - S_{i-1})^2 = \sum_{i=1}^n \xi_i^2.$$
Proposition 1.1.5 (Deterministic Quadratic Variation)
For any random walk (symmetric or asymmetric), \([S]_n = n\) almost surely.
Proof. Since \(\xi_i \in \{-1,+1\}\), we have \(\xi_i^2 = 1\) for every \(i\), so \([S]_n = \sum_{i=1}^n 1 = n\). \(\square\)
The quadratic variation is deterministic — it equals \(n\) for every realisation of the walk, regardless of which steps were \(+1\) or \(-1\). This contrasts sharply with the position \(S_n\) itself, which is random.
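This is easy to verify path by path (a minimal sketch, stdlib only): for every simulated trajectory, symmetric or not, the sum of squared increments is exactly \(n\).

```python
# Proposition 1.1.5 path by path: [S]_n = sum of xi_i^2 = n exactly,
# for every realisation and for any step probability p.
import random

random.seed(0)

def quadratic_variation(n, p):
    """Simulate one n-step walk and return its quadratic variation [S]_n."""
    steps = [1 if random.random() < p else -1 for _ in range(n)]
    return sum(x * x for x in steps)

# Exact equality, not approximate: every path gives [S]_100 = 100.
assert all(quadratic_variation(100, p) == 100 for p in (0.3, 0.5, 0.9))
```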
Connection to \(S_n^2 - n\). Observe that

$$S_n^2 = \Big(\sum_{i=1}^n \xi_i\Big)^2 = \sum_{i=1}^n \xi_i^2 + 2\sum_{1 \leq i < j \leq n} \xi_i \xi_j = [S]_n + 2\sum_{1 \leq i < j \leq n} \xi_i \xi_j.$$
The quadratic variation \([S]_n = n\) accounts for the "diagonal" part of \(S_n^2\). The martingale \(M_n = S_n^2 - n = S_n^2 - [S]_n\) is precisely the "off-diagonal" remainder. This decomposition \(S_n^2 = [S]_n + M_n\) is the discrete analogue of the Itô decomposition \(W_t^2 = t + 2\int_0^t W_s\,dW_s\).
Why quadratic variation matters. For a smooth function \(f\) with bounded derivative, the quadratic variation over any partition \(0 = t_0 < t_1 < \cdots < t_m = T\) vanishes as the mesh shrinks:

$$\sum_{i=0}^{m-1} \big(f(t_{i+1}) - f(t_i)\big)^2 \leq \|f'\|_\infty^2 \sum_{i=0}^{m-1} (t_{i+1} - t_i)^2 \leq \|f'\|_\infty^2 \, T \max_i (t_{i+1} - t_i) \to 0.$$
For the random walk, \([S]_n = n\) regardless of how finely we partition. This non-vanishing quadratic variation is what forces the correction term in Itô's formula: \((dW_t)^2 = dt\).
Consequences for Optional Stopping¶
The two martingales \(\{S_n\}\) and \(\{S_n^2 - n\}\) are the workhorses of stopping-time calculations. By the Optional Stopping Theorem, if \(\tau\) is a bounded stopping time then:

$$\mathbb{E}[S_\tau] = \mathbb{E}[S_0] = 0, \qquad \mathbb{E}[S_\tau^2 - \tau] = \mathbb{E}[S_0^2 - 0] = 0.$$
These two identities give \(\mathbb{E}[S_\tau] = 0\) and \(\mathbb{E}[\tau] = \mathbb{E}[S_\tau^2]\). See Exercises (Exercise 2) for the Gambler's Ruin application.
Optional Stopping Theorem
The Optional Stopping Theorem (to be proved in the chapter on filtrations and martingales) states that if \(\{M_n\}\) is a martingale and \(\tau\) is a stopping time satisfying appropriate integrability conditions, then \(\mathbb{E}[M_\tau] = \mathbb{E}[M_0]\).
References¶
- Durrett, R. (2019). Probability: Theory and Examples, 5th ed. Cambridge University Press.
- Williams, D. (1991). Probability with Martingales. Cambridge University Press.
- Lawler, G. F., & Limic, V. (2010). Random Walk: A Modern Introduction. Cambridge University Press.
Exercises¶
Exercise 1. For the asymmetric random walk with \(p \neq 1/2\), show that the process \(M_n = S_n - n(2p-1)\) is a martingale. Then show that \(M_n^2 - 4np(1-p)\) is also a martingale. What are the analogues of Propositions 1.1.3 and 1.1.4 for the centred walk?
Solution to Exercise 1
For the asymmetric walk, \(\mathbb{E}[\xi_{n+1}] = 2p - 1 \neq 0\). Define \(M_n = S_n - n(2p-1)\). Then:

$$\mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = S_n + \mathbb{E}[\xi_{n+1}] - (n+1)(2p-1) = S_n + (2p-1) - (n+1)(2p-1) = S_n - n(2p-1) = M_n.$$
So \(\{M_n\}\) is a martingale. For the quadratic martingale, let \(\sigma^2 = \text{Var}(\xi_i) = 4p(1-p)\). Consider \(M_n^2 - n\sigma^2\). Write

$$M_{n+1} = M_n + \eta_{n+1},$$

where \(\eta_{n+1} = \xi_{n+1} - (2p-1)\) has mean 0 and variance \(\sigma^2 = 4p(1-p)\). Expanding:

$$\mathbb{E}[M_{n+1}^2 \mid \mathcal{F}_n] = M_n^2 + 2 M_n \mathbb{E}[\eta_{n+1}] + \mathbb{E}[\eta_{n+1}^2] = M_n^2 + \sigma^2.$$
Therefore \(\mathbb{E}[M_{n+1}^2 - (n+1)\sigma^2 \mid \mathcal{F}_n] = M_n^2 + \sigma^2 - (n+1)\sigma^2 = M_n^2 - n\sigma^2\), confirming \(M_n^2 - 4np(1-p)\) is a martingale.
These are the analogues of Propositions 1.1.3 and 1.1.4: the centred walk \(M_n = S_n - n\mu\) replaces \(S_n\), and the compensator \(n\sigma^2 = 4np(1-p)\) replaces \(n\).
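Both centred martingales can be spot-checked by Monte Carlo (a sketch with an assumed value \(p = 0.7\), stdlib only): the sample means of \(M_n\) and \(M_n^2 - n\sigma^2\) should both be near zero.

```python
# Check E[M_n] = 0 and E[M_n^2 - n*sigma^2] = 0 for the asymmetric walk.
import random

random.seed(5)

p, n, trials = 0.7, 40, 20_000
mu, sigma2 = 2 * p - 1, 4 * p * (1 - p)
m1 = m2 = 0.0
for _ in range(trials):
    s = sum(1 if random.random() < p else -1 for _ in range(n))
    m = s - n * mu              # centred walk M_n
    m1 += m
    m2 += m * m - n * sigma2    # quadratic martingale at time n
assert abs(m1 / trials) < 0.2   # E[M_n] = 0
assert abs(m2 / trials) < 2.0   # E[M_n^2 - n*sigma^2] = 0
```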
Exercise 2. Let \(\{S_n\}\) be a symmetric random walk. Use the Optional Stopping Theorem applied to the martingale \(M_n = S_n^2 - n\) and the stopping time \(\tau = \min\{n : S_n = -a \text{ or } S_n = b\}\) (with \(a, b > 0\)) to show that \(\mathbb{E}[\tau] = ab\).
Solution to Exercise 2
The stopping time \(\tau = \min\{n : S_n = -a \text{ or } S_n = b\}\) is finite a.s. by recurrence. Although \(\tau\) itself is not bounded, the stopped walk satisfies \(|S_{n \wedge \tau}| \leq \max(a, b)\), so the Optional Stopping Theorem applies to \(M_n = S_n^2 - n\) (apply it to the bounded stopping times \(\tau \wedge n\) and let \(n \to \infty\)):

$$\mathbb{E}[S_\tau^2 - \tau] = \mathbb{E}[S_0^2 - 0] = 0.$$
Therefore \(\mathbb{E}[S_\tau^2] = \mathbb{E}[\tau]\). At the stopping time, \(S_\tau \in \{-a, b\}\), so:

$$\mathbb{E}[S_\tau^2] = a^2\,\mathbb{P}(S_\tau = -a) + b^2\,\mathbb{P}(S_\tau = b).$$
From the first martingale \(\mathbb{E}[S_\tau] = 0\):

$$-a\,\mathbb{P}(S_\tau = -a) + b\,\mathbb{P}(S_\tau = b) = 0.$$
With \(\mathbb{P}(S_\tau = -a) + \mathbb{P}(S_\tau = b) = 1\), solving gives \(\mathbb{P}(S_\tau = b) = a/(a+b)\) and \(\mathbb{P}(S_\tau = -a) = b/(a+b)\). Therefore:

$$\mathbb{E}[\tau] = a^2 \cdot \frac{b}{a+b} + b^2 \cdot \frac{a}{a+b} = \frac{ab(a+b)}{a+b} = ab.$$
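Both conclusions can be verified by simulation (a sketch with assumed barriers \(a = 3\), \(b = 5\), stdlib only).

```python
# Gambler's ruin check: E[tau] = ab and P(S_tau = b) = a/(a+b).
import random

random.seed(3)

def ruin_trial(a, b):
    """Run one symmetric walk until it hits -a or b; return (tau, S_tau)."""
    s = t = 0
    while -a < s < b:
        s += random.choice((-1, 1))
        t += 1
    return t, s

a, b, trials = 3, 5, 20_000
total_tau = hits_b = 0
for _ in range(trials):
    t, s = ruin_trial(a, b)
    total_tau += t
    hits_b += (s == b)

assert abs(total_tau / trials - a * b) < 1.0        # E[tau] = ab = 15
assert abs(hits_b / trials - a / (a + b)) < 0.02    # P(S_tau = b) = 3/8
```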
Exercise 3. Prove that the exponential process \(Z_n = \left(\frac{1-p}{p}\right)^{S_n}\) is a martingale for any \(p \in (0,1)\). For \(p = 1/2\), what does this process simplify to, and why is it trivial?
Solution to Exercise 3
Let \(Z_n = \left(\frac{1-p}{p}\right)^{S_n}\) and \(r = \frac{1-p}{p}\). Then:

$$Z_{n+1} = r^{S_n + \xi_{n+1}} = Z_n \, r^{\xi_{n+1}}.$$

Computing the expectation, using independence of \(\xi_{n+1}\) from \(\mathcal{F}_n\):

$$\mathbb{E}[r^{\xi_{n+1}}] = p \cdot r + (1-p) \cdot r^{-1} = p \cdot \frac{1-p}{p} + (1-p) \cdot \frac{p}{1-p} = (1-p) + p = 1.$$
Therefore \(\mathbb{E}[Z_{n+1} \mid \mathcal{F}_n] = r^{S_n} = Z_n\), so \(\{Z_n\}\) is a martingale.
For \(p = 1/2\): \(r = (1-1/2)/(1/2) = 1\), so \(Z_n = 1^{S_n} = 1\) for all \(n\). The process is the constant martingale \(Z_n = 1\), which is trivial (it carries no information about \(S_n\)).
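The one-step identity at the heart of this proof, \(pr + (1-p)r^{-1} = 1\) with \(r = (1-p)/p\), can be checked exactly (a minimal sketch):

```python
# Exact check of E[r^{xi}] = p*r + (1-p)/r = 1 with r = (1-p)/p.
def one_step_mean(p):
    r = (1 - p) / p
    return p * r + (1 - p) / r

for p in (0.1, 0.3, 0.5, 0.8):
    assert abs(one_step_mean(p) - 1.0) < 1e-12
```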
Exercise 4. Show that the discrete quadratic variation \([S]_n = n\) implies that the process \(S_n^2 - [S]_n = S_n^2 - n\) is a martingale. In other words, derive Proposition 1.1.4 from Proposition 1.1.5 without directly computing \(\mathbb{E}[S_{n+1}^2 \mid \mathcal{F}_n]\). (Hint: write \(S_{n+1}^2 - S_n^2 = 2S_n \xi_{n+1} + \xi_{n+1}^2\) and use \(\xi_{n+1}^2 = [S]_{n+1} - [S]_n = 1\).)
Solution to Exercise 4
We have \(S_{n+1}^2 - S_n^2 = (S_n + \xi_{n+1})^2 - S_n^2 = 2S_n\xi_{n+1} + \xi_{n+1}^2\). Since \(\xi_{n+1}^2 = [S]_{n+1} - [S]_n = 1\):

$$S_{n+1}^2 - S_n^2 = 2 S_n \xi_{n+1} + 1.$$
Taking conditional expectations:

$$\mathbb{E}[S_{n+1}^2 - S_n^2 \mid \mathcal{F}_n] = 2 S_n\, \mathbb{E}[\xi_{n+1} \mid \mathcal{F}_n] + 1.$$
Since \(\xi_{n+1}\) is independent of \(\mathcal{F}_n\) and \(\mathbb{E}[\xi_{n+1}] = 0\):

$$\mathbb{E}[S_{n+1}^2 \mid \mathcal{F}_n] = S_n^2 + 1.$$
Since \([S]_n = n\) a.s., this gives \(\mathbb{E}[S_{n+1}^2 - (n+1) \mid \mathcal{F}_n] = S_n^2 - n\), which is exactly the statement that \(\{S_n^2 - n\}\) is a martingale (Proposition 1.1.4), derived without directly computing \(\mathbb{E}[S_{n+1}^2 \mid \mathcal{F}_n]\).
Exercise 5. Let \(\lambda \in \mathbb{R}\) and define \(E_n = e^{\lambda S_n} / (\cosh \lambda)^n\). Prove that \(\{E_n\}\) is a martingale. Use this to derive the MGF of the hitting time \(\tau_a = \min\{n \geq 0 : S_n = a\}\) for \(a > 0\): show that \(\mathbb{E}[(\cosh \lambda)^{-\tau_a}] = e^{-\lambda a}\) for \(\lambda > 0\) (under appropriate stopping conditions).
Solution to Exercise 5
Define \(E_n = e^{\lambda S_n}/(\cosh \lambda)^n\). Since \(\mathbb{E}[e^{\lambda \xi_{n+1}}] = \tfrac{1}{2}(e^{\lambda} + e^{-\lambda}) = \cosh\lambda\):

$$\mathbb{E}[E_{n+1} \mid \mathcal{F}_n] = \frac{e^{\lambda S_n}}{(\cosh\lambda)^{n+1}}\, \mathbb{E}[e^{\lambda \xi_{n+1}}] = \frac{e^{\lambda S_n}}{(\cosh\lambda)^{n+1}} \cosh\lambda = E_n.$$
So \(\{E_n\}\) is a martingale. Now apply the Optional Stopping Theorem to \(\tau_a = \min\{n : S_n = a\}\). At the stopping time, \(S_{\tau_a} = a\), so:

$$\mathbb{E}[E_{\tau_a}] = \mathbb{E}\!\left[e^{\lambda a} (\cosh\lambda)^{-\tau_a}\right] = \mathbb{E}[E_0] = 1.$$
Therefore:

$$\mathbb{E}\!\left[(\cosh\lambda)^{-\tau_a}\right] = e^{-\lambda a}.$$
This holds for \(\lambda > 0\) under appropriate conditions ensuring the Optional Stopping Theorem applies (e.g., \(\tau_a < \infty\) a.s. by recurrence, and the stopped process is uniformly integrable, which holds since \(\cosh\lambda > 1\) for \(\lambda > 0\) makes \(E_{n \wedge \tau_a}\) bounded by \(e^{\lambda a}\)).
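The hitting-time identity can be spot-checked by Monte Carlo (a sketch with assumed values \(\lambda = 0.5\), \(a = 2\); very long excursions are truncated, which is harmless because \((\cosh\lambda)^{-\tau}\) is then negligible).

```python
# Monte Carlo check of E[(cosh l)^{-tau_a}] = e^{-l*a} for l = 0.5, a = 2.
import math
import random

random.seed(4)

def hit_time(a, max_steps=10_000):
    """First hitting time of level a > 0, truncated at max_steps."""
    s = t = 0
    while s < a:
        s += random.choice((-1, 1))
        t += 1
        if t >= max_steps:
            return None  # rare long excursion; contributes ~0 below
    return t

lam, a, trials = 0.5, 2, 5_000
acc = 0.0
for _ in range(trials):
    t = hit_time(a)
    if t is not None:
        acc += math.cosh(lam) ** (-t)  # truncated paths add (cosh l)^{-t} ~ 0

assert abs(acc / trials - math.exp(-lam * a)) < 0.02  # target: e^{-1}
```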
Exercise 6. A process \(\{X_n\}\) is called a submartingale if \(\mathbb{E}[X_{n+1} \mid \mathcal{F}_n] \geq X_n\). Show that \(|S_n|\) is a submartingale for the symmetric random walk. (Hint: use Jensen's inequality with the convex function \(f(x) = |x|\).)
Solution to Exercise 6
Since \(\{S_n\}\) is a martingale (for \(p = 1/2\)) and \(f(x) = |x|\) is a convex function, Jensen's inequality for conditional expectations gives:

$$\mathbb{E}[|S_{n+1}| \mid \mathcal{F}_n] \geq \big|\mathbb{E}[S_{n+1} \mid \mathcal{F}_n]\big| = |S_n|.$$
We also need to verify the other conditions: \(|S_n|\) is adapted (since \(S_n\) is adapted), and \(\mathbb{E}[|S_n|] < \infty\) (since \(\mathbb{E}[|S_n|] \leq \sqrt{\mathbb{E}[S_n^2]} = \sqrt{n} < \infty\)).
Therefore \(\{|S_n|\}\) satisfies \(\mathbb{E}[|S_{n+1}| \mid \mathcal{F}_n] \geq |S_n|\), which is the submartingale condition.
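As a sanity check of the submartingale property (a sketch using exact binomial probabilities rather than simulation): a submartingale has nondecreasing expectations, so \(\mathbb{E}|S_n|\) should be nondecreasing in \(n\).

```python
# E|S_n| computed exactly: with k up-steps, S_n = 2k - n and
# P(k up-steps) = C(n, k) / 2^n.
from math import comb

def mean_abs_S(n):
    """Exact E|S_n| for the symmetric random walk."""
    return sum(comb(n, k) * 0.5**n * abs(2 * k - n) for k in range(n + 1))

vals = [mean_abs_S(n) for n in range(21)]
# Submartingale => the expectations E|S_n| are nondecreasing in n.
assert all(vals[i] <= vals[i + 1] + 1e-12 for i in range(20))
```

Note the sequence is nondecreasing but not strictly increasing: for example \(\mathbb{E}|S_1| = \mathbb{E}|S_2| = 1\).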