Adapted Processes¶
Overview¶
In the study of stochastic processes, adaptedness and related measurability concepts formalize the intuition that a process should not "see into the future." These concepts are essential for defining meaningful operations like stochastic integration and for ensuring that trading strategies and other decision processes use only available information.
Prerequisite: This document assumes familiarity with filtered probability spaces \((\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \ge 0}, \mathbb{P})\) as developed in the companion document Filtration. We assume the filtration satisfies the usual conditions (right-continuous and complete) unless otherwise stated.
Measurability Concepts for Stochastic Processes¶
Let \(X = (X_t)_{t \ge 0}\) be a stochastic process with values in \(\mathbb{R}^d\) defined on a filtered probability space \((\Omega, \mathcal{F}, (\mathcal{F}_t), \mathbb{P})\). Several measurability notions capture different aspects of the relationship between \(X\) and the filtration, forming a hierarchy from weakest to strongest regularity.
1. Adaptedness¶
\(X\) is adapted to \((\mathcal{F}_t)\) if, for each \(t \ge 0\),
\[
X_t \text{ is } \mathcal{F}_t\text{-measurable}.
\]
Interpretation: The value \(X_t\) is determined by information available at time \(t\). One cannot "see into the future."
Example: If \((W_t)\) is Brownian motion and \(\mathcal{F}_t = \sigma(W_s : 0 \le s \le t)\), then \(W_t\) is adapted by construction. However, \(W_{t+1}\) is not \(\mathcal{F}_t\)-measurable.
Non-example: The process \(X_t = W_1\) (constant in \(t\), equal to the value of Brownian motion at time 1) is not adapted to \((\mathcal{F}_t^W)\) for \(t < 1\), because \(W_1\) is not \(\mathcal{F}_t^W\)-measurable when \(t < 1\).
2. Progressive Measurability¶
\(X\) is progressively measurable if, for each \(t \ge 0\), the map
\[
(s, \omega) \mapsto X_s(\omega)
\]
restricted to \([0, t] \times \Omega\) is measurable with respect to \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\).
Interpretation: Progressive measurability is joint measurability in time and outcome, with the constraint that at time \(t\), we only use information available up to \(t\). This is stronger than adaptedness because it requires the process to be "well-behaved" as a function of both variables simultaneously.
Why it matters: Progressive measurability ensures that stopped processes \(X_\tau\) are well-defined and measurable for stopping times \(\tau\). It is the natural measurability class for integrands in stochastic integration.
Key fact: Every adapted process with càdlàg (right-continuous with left limits) or càglàd (left-continuous with right limits) paths is progressively measurable. This follows because such processes can be approximated by simple processes, and the approximation preserves measurability. See Karatzas & Shreve, Brownian Motion and Stochastic Calculus, Proposition 1.1.13.
Example: Brownian motion \((W_t)\) has continuous paths, hence is progressively measurable with respect to its natural filtration.
Non-example (adapted but not progressively measurable): If sample paths are allowed to be non-measurable in \(t\), counterexamples are cheap: take a non-Borel set \(A \subseteq [0, 1]\) and set \(X_t(\omega) = \mathbf{1}_A(t)\). Each \(X_t\) is constant in \(\omega\), hence \(\mathcal{F}_t\)-measurable, so \(X\) is adapted; but the restriction of \((t, \omega) \mapsto X_t(\omega)\) to \([0, 1] \times \Omega\) is not \(\mathcal{B}([0,1]) \otimes \mathcal{F}_1\)-measurable, because every \(\omega\)-section of a product-measurable set must be Borel, while the section of \(\{X = 1\}\) is the non-Borel set \(A\). The genuinely subtle case is an adapted process that is jointly measurable yet not progressive; there, a classical result of Chung and Doob guarantees at least a progressively measurable modification. In practice, all "naturally occurring" adapted processes—those with càdlàg or càglàd paths—are progressively measurable.
3. Optional Processes¶
A process \(X\) is optional if it is measurable with respect to the optional σ-algebra \(\mathcal{O}\) on \(\Omega \times [0, \infty)\).
Formal definition: The optional σ-algebra \(\mathcal{O}\) is defined as the σ-algebra on \(\Omega \times [0, \infty)\) generated by all càdlàg adapted processes. Equivalently, \(\mathcal{O}\) is generated by the stochastic intervals:
\[
[\![ S, T [\![ \;=\; \{(\omega, t) : S(\omega) \le t < T(\omega)\}
\]
for all stopping times \(S \le T\). In particular, \(\mathcal{O}\) contains \([\![0, T]\!] = \{(\omega, t) : 0 \le t \le T(\omega)\}\) for every stopping time \(T\).
Interpretation: Optional processes are those that can be "observed" at stopping times. The optional σ-algebra captures events that are determined by stopping at random times.
Example: Any càdlàg adapted process is optional (by definition of \(\mathcal{O}\)).
Example (optional but not predictable): Let \((N_t)\) be a Poisson process and let \(\tau_1\) be its first jump time. Define \(X_t = \mathbf{1}_{\{t \ge \tau_1\}}\). This process:

- is càdlàg (right-continuous with a single jump);
- is adapted (at time \(t\), we know whether the jump has occurred);
- is optional (càdlàg adapted);
- is not predictable — we cannot know "just before \(\tau_1\)" that the jump is about to occur, because \(\tau_1\) is a totally inaccessible stopping time.
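The "no warning" nature of \(\tau_1\) comes from the memoryless property of its \(\text{Exp}(\lambda)\) law, which is easy to sanity-check numerically. The sketch below (seed, intensity, and sample size are arbitrary choices, not from the text) estimates the survival probability over a window of length 0.5, both unconditionally and conditionally on having already survived to time 0.5; memorylessness says the two agree, so watching the process gives no advance notice of the jump.

```python
import math
import random

random.seed(0)
lam = 2.0  # Poisson intensity; tau_1 ~ Exp(lam)
taus = [random.expovariate(lam) for _ in range(100_000)]

# Unconditional survival over a window of length 0.5: P(tau_1 > 0.5)
p_uncond = sum(t > 0.5 for t in taus) / len(taus)

# Conditional survival over the same window length: P(tau_1 > 1.0 | tau_1 > 0.5)
survivors = [t for t in taus if t > 0.5]
p_cond = sum(t > 1.0 for t in survivors) / len(survivors)

# Both should be close to exp(-lam * 0.5) ~ 0.368: having observed the
# process up to time 0.5 gives no hint that the jump is about to occur.
print(p_uncond, p_cond, math.exp(-lam * 0.5))
```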
4. Predictability¶
A process \(X\) is predictable if it is measurable with respect to the predictable σ-algebra \(\mathcal{P}\) on \(\Omega \times [0, \infty)\).
Formal definition: The predictable σ-algebra \(\mathcal{P}\) is defined as the σ-algebra generated by all càglàd (left-continuous) adapted processes. Equivalently, \(\mathcal{P}\) is generated by sets of the form:
\[
(s, t] \times F_s \quad (F_s \in \mathcal{F}_s, \; 0 \le s < t) \qquad \text{and} \qquad \{0\} \times F_0 \quad (F_0 \in \mathcal{F}_0).
\]
Why these generators? The sets \((s, t] \times F_s\) represent "during the interval \((s,t]\), the event \(F_s\) holds." Crucially, \(F_s \in \mathcal{F}_s\) means the event is determined by information available at time \(s\), strictly before the interval \((s,t]\) begins. This captures the "no peeking" property.
Interpretation: A predictable process at time \(t\) is determined by information strictly before \(t\). This is the strongest measurability condition in our hierarchy.
Example: Any càglàd (left-continuous) adapted process is predictable. In particular, any continuous adapted process is predictable.
Example: In discrete time, a process \((H_n)_{n \ge 1}\) is predictable if and only if \(H_n\) is \(\mathcal{F}_{n-1}\)-measurable for all \(n \ge 1\). The value at time \(n\) is determined by information at time \(n-1\).
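On a finite coin-flip space this characterization can be checked by brute force: \(H_n\) is \(\mathcal{F}_{n-1}\)-measurable if and only if it is constant on every atom of \(\sigma(X_1, \ldots, X_{n-1})\), i.e., on every fixed prefix of flips. A minimal sketch (the helper `determined_by_first` and the toy processes are my own illustrative names):

```python
from itertools import product

def determined_by_first(rv, k, N=3):
    """True iff rv is constant on each atom of sigma(X_1, ..., X_k),
    where omega ranges over all length-N sequences of +-1 flips."""
    seen = {}
    for omega in product([-1, 1], repeat=N):
        prefix, v = omega[:k], rv(omega)
        if seen.setdefault(prefix, v) != v:
            return False  # rv varies within an atom: not measurable
    return True

# H_n = S_{n-1}: decided from the first n-1 flips -> predictable.
H2 = lambda omega: omega[0]        # S_1, a candidate value of H_2
# Y_n = X_n: the current flip itself -> adapted but not predictable.
Y2 = lambda omega: omega[1]        # X_2

print(determined_by_first(H2, 1))  # True:  H_2 is F_1-measurable
print(determined_by_first(Y2, 1))  # False: Y_2 is not F_1-measurable
print(determined_by_first(Y2, 2))  # True:  Y_2 is F_2-measurable (adapted)
```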
Hierarchy of Measurability¶
The four measurability classes form a strict hierarchy:
\[
\{\text{predictable}\} \subsetneq \{\text{optional}\} \subsetneq \{\text{progressively measurable}\} \subsetneq \{\text{adapted}\}.
\]
Reading the hierarchy: Smaller classes impose stricter conditions. A predictable process satisfies the most stringent requirements; an adapted process satisfies the weakest.
```
Hierarchy (smaller = more regular):
Predictable ⊂ Optional ⊂ Prog. meas. ⊂ Adapted
↑ ↑ ↑ ↑
Left-cont. Right-cont. Joint meas. Pointwise
adapted adapted in (t,ω) meas.
```
Intuition for inclusions:
- Predictable ⊂ Optional: A generator \((s,t] \times F_s\) of \(\mathcal{P}\) (with \(F_s \in \mathcal{F}_s\)) is the set \(\{(\omega, u) : s < u \le t, \omega \in F_s\}\). Its indicator \(\mathbf{1}_{(s,t]}(u)\,\mathbf{1}_{F_s}(\omega)\) is left-continuous in \(u\), not right-continuous, but it is a pointwise limit of the càdlàg adapted indicators \(\mathbf{1}_{[s+1/n,\, t+1/n)}(u)\,\mathbf{1}_{F_s}(\omega)\), so the generator lies in \(\mathcal{O}\).
- Optional ⊂ Progressively measurable: If \(X\) is optional, then for each \(t\), the restriction to \([0,t] \times \Omega\) is measurable w.r.t. \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\). This requires proof: it suffices to verify it for the generating càdlàg adapted processes, which are progressively measurable by the Key fact above, and progressive measurability passes to the σ-algebra they generate.
- Progressively measurable ⊂ Adapted: If \((s,\omega) \mapsto X_s(\omega)\) is jointly measurable on \([0,t] \times \Omega\) w.r.t. \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\), then evaluating at \(s = t\) shows \(X_t\) is \(\mathcal{F}_t\)-measurable.
When do the classes coincide? For processes with continuous paths:
\[
\{\text{predictable}\} = \{\text{optional}\} = \{\text{progressively measurable}\} = \{\text{adapted}\}.
\]
Hence for continuous processes, all four classes coincide. The distinctions become essential for jump processes (Poisson processes, Lévy processes, solutions to SDEs with jumps).
Properties of Adapted Processes¶
Algebraic Closure¶
Adapted processes are closed under standard operations:
- If \(X_t\) and \(Y_t\) are adapted, then \(X_t + Y_t\), \(X_t \cdot Y_t\), and \(f(X_t)\) (for Borel measurable \(f\)) are adapted.
- Pointwise limits of adapted processes are adapted: if \(X_t^n \to X_t\) for each \(t\) and \(\omega\), and each \(X^n\) is adapted, then \(X\) is adapted. (This follows because pointwise limits of measurable functions are measurable.)
Integral Processes¶
If \(X_t\) is adapted, measurable in \(t\), and satisfies \(\int_0^t |X_s| \, ds < \infty\) a.s., then the integral process
\[
Y_t = \int_0^t X_s \, ds
\]
is also adapted. This follows because \(Y_t\) depends only on \((X_s)_{0 \le s \le t}\).
Stronger result: If \(X\) is progressively measurable, then \(Y_t = \int_0^t X_s \, ds\) is also progressively measurable.
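The claim that \(Y_t\) depends only on \((X_s)_{s \le t}\) can be illustrated on a sampled path: a left-endpoint Riemann sum for \(Y_t\) reads only samples at times \(\le t\), so tampering with the path strictly after \(t\) cannot change it. A sketch with a simulated Brownian-like path (grid size, seed, and helper names are arbitrary choices of mine):

```python
import random

random.seed(1)
N, T = 1_000, 1.0
dt = T / N

# sample a Brownian-like path by summing independent Gaussian increments
W = [0.0]
for _ in range(N):
    W.append(W[-1] + random.gauss(0.0, dt ** 0.5))

def integral_to(path, k):
    """Left-endpoint Riemann sum of the path over [0, k*dt]."""
    return sum(path[j] * dt for j in range(k))

k = N // 2                      # t = 0.5
y_full = integral_to(W, k)

# tamper with the path strictly after time t ...
W_future = W[:k + 1] + [w + 100.0 for w in W[k + 1:]]
y_tampered = integral_to(W_future, k)

# ... and Y_t is unchanged: the integral uses only information up to t
print(y_full == y_tampered)  # True
```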
Connection to Filtrations¶
The concept of adaptedness is inherently tied to the choice of filtration:
- A process may be adapted to one filtration but not another.
- The natural filtration \(\mathcal{F}_t^X = \sigma(X_s : 0 \le s \le t)\) is the smallest filtration to which \(X\) is adapted.
- Working with a larger filtration (e.g., one that includes additional market information) may change which processes are adapted.
Example: Let \((W_t)\) and \((B_t)\) be independent Brownian motions. The process \(W_t\) is adapted to \(\mathcal{F}_t^W\) but also to the larger filtration \(\mathcal{G}_t = \sigma(W_s, B_s : s \le t)\). However, \(B_t\) is adapted to \(\mathcal{G}_t\) but not to \(\mathcal{F}_t^W\).
Applications¶
Mathematical Finance¶
In no-arbitrage theory, a trading strategy \((H_t)\) representing portfolio holdings must be predictable:
- The number of shares held at time \(t\) must be decided based on information available strictly before \(t\).
- This prevents "anticipating" price jumps or using future information.
- If \(H\) were merely adapted (not predictable), a trader could potentially adjust holdings at the exact moment of a price jump, which is unrealistic and leads to arbitrage.
Stochastic Integration¶
The Itô integral \(\int_0^t H_s \, dW_s\) requires:
- \(H\) to be at least progressively measurable (for the integral to be well-defined via approximation by simple processes).
- \(H\) to be predictable (for optimal theoretical properties, including the Itô isometry and martingale property of the integral).
For Brownian motion (continuous paths), progressive measurability and predictability coincide for adapted processes, so the distinction is often glossed over. For integration with respect to jump processes (e.g., Poisson or Lévy), predictability becomes essential.
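The importance of where the integrand is evaluated shows up already in simulation. Approximating \(\int_0^1 W_s \, dW_s\) by sums \(\sum_k H_{t_k}(W_{t_{k+1}} - W_{t_k})\) with the non-anticipating left endpoint \(H_{t_k} = W_{t_k}\) gives a mean-zero (martingale) result, while "peeking" at the right endpoint shifts the mean by the quadratic variation \(\approx 1\). A Monte Carlo sketch (path count, step count, and seed are arbitrary choices):

```python
import random
from statistics import fmean

random.seed(2)
n_paths, n_steps = 10_000, 200
dt = 1.0 / n_steps

left_sums, right_sums = [], []
for _ in range(n_paths):
    w = left = right = 0.0
    for _ in range(n_steps):
        dw = random.gauss(0.0, dt ** 0.5)
        left += w * dw          # evaluate BEFORE the increment (non-anticipating)
        right += (w + dw) * dw  # evaluate AFTER the increment (anticipating)
        w += dw
    left_sums.append(left)
    right_sums.append(right)

# The non-anticipating sums average ~ 0 (the Ito integral is a martingale);
# the anticipating sums exceed them by the quadratic variation of W on [0,1],
# which is ~ 1.
print(fmean(left_sums), fmean(right_sums))
```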
Stopping and Localization¶
Progressive measurability ensures that:
- Stopped processes \(X_{t \wedge \tau}\) are measurable for stopping times \(\tau\).
- Localization arguments (using stopping times to control integrability) work correctly.
Optional and predictable projections: For any bounded measurable process \(Y\):

- There exists a unique optional process \({}^o Y\) (the optional projection) such that \({}^o Y_\tau = \mathbb{E}[Y_\tau \mid \mathcal{F}_\tau]\) for all stopping times \(\tau\).
- There exists a unique predictable process \({}^p Y\) (the predictable projection) such that \({}^p Y_\tau = \mathbb{E}[Y_\tau \mid \mathcal{F}_{\tau-}]\) for all predictable stopping times \(\tau\).
Summary¶
| Concept | Definition | Interpretation | Key Example |
|---|---|---|---|
| Adapted | \(X_t\) is \(\mathcal{F}_t\)-measurable for each \(t\) | Value at \(t\) uses only info up to \(t\) | Brownian motion |
| Prog. meas. | Joint measurability on \([0,t] \times \Omega\) | Needed for stopping and integration | Càdlàg adapted |
| Optional | Measurable w.r.t. \(\mathcal{O}\) | Observable at stopping times | \(\mathbf{1}_{\{t \ge \tau\}}\) |
| Predictable | Measurable w.r.t. \(\mathcal{P}\) | Determined by strictly past info | Continuous adapted |
Hierarchy (strict inclusions, smaller = more regular):
\[
\{\text{predictable}\} \subsetneq \{\text{optional}\} \subsetneq \{\text{progressively measurable}\} \subsetneq \{\text{adapted}\}.
\]
For continuous processes, all classes coincide. For jump processes, the distinctions are crucial.
Exercises¶
Exercise 1: Adapted Processes — Basic Properties¶
(a) Prove that if \(X_t\) and \(Y_t\) are adapted to \((\mathcal{F}_t)\), then so are \(X_t + Y_t\) and \(X_t \cdot Y_t\).
(b) Let \((W_t)\) be Brownian motion with natural filtration \((\mathcal{F}_t^W)\). Show that \(W_{t+1}\) is not adapted to \((\mathcal{F}_t^W)\).
(c) Let \(X_t = \int_0^t W_s \, ds\). Prove that \(X_t\) is adapted to \((\mathcal{F}_t^W)\).
Hint: Use the fact that \(\mathbb{E}[\int_0^t |W_s| \, ds] < \infty\) and that \(X_t\) depends only on \((W_s)_{s \le t}\).
Solution to Exercise 1
(a) If \(X_t\) and \(Y_t\) are adapted to \((\mathcal{F}_t)\), then \(X_t\) is \(\mathcal{F}_t\)-measurable and \(Y_t\) is \(\mathcal{F}_t\)-measurable for each \(t\).
-
\(X_t + Y_t\): The sum of two \(\mathcal{F}_t\)-measurable functions is \(\mathcal{F}_t\)-measurable (since addition is a continuous, hence Borel measurable, function from \(\mathbb{R}^2 \to \mathbb{R}\), and \((X_t, Y_t)\) is \(\mathcal{F}_t\)-measurable). Therefore \(X_t + Y_t\) is adapted.
-
\(X_t \cdot Y_t\): Similarly, multiplication is a continuous function from \(\mathbb{R}^2 \to \mathbb{R}\), so the product of two \(\mathcal{F}_t\)-measurable functions is \(\mathcal{F}_t\)-measurable. Therefore \(X_t \cdot Y_t\) is adapted. \(\square\)
(b) For \(t \ge 0\), \(W_{t+1}\) is \(\mathcal{F}_{t+1}^W\)-measurable but we need to check \(\mathcal{F}_t^W\)-measurability. Since \(W_{t+1} = W_t + (W_{t+1} - W_t)\) and \(W_{t+1} - W_t\) is independent of \(\mathcal{F}_t^W\) with distribution \(N(0, 1)\), the random variable \(W_{t+1}\) is not determined by information up to time \(t\).
Formally: if \(W_{t+1}\) were \(\mathcal{F}_t^W\)-measurable, then \(W_{t+1} - W_t\) would also be \(\mathcal{F}_t^W\)-measurable (since \(W_t\) is). But \(W_{t+1} - W_t \sim N(0,1)\) is independent of \(\mathcal{F}_t^W\) and non-constant, which is impossible for a random variable that is simultaneously measurable with respect to a \(\sigma\)-algebra and independent of it. \(\square\)
(c) The process \(X_t = \int_0^t W_s \, ds\) depends only on the values \((W_s)_{0 \le s \le t}\). For each \(t\), \(X_t\) is a functional of \((W_s)_{s \le t}\). More precisely, \(X_t\) can be approximated by Riemann sums \(\sum_{k} W_{t_k}(t_{k+1} - t_k)\), where each \(W_{t_k}\) is \(\mathcal{F}_{t_k}^W \subseteq \mathcal{F}_t^W\)-measurable. The limit of \(\mathcal{F}_t^W\)-measurable random variables is \(\mathcal{F}_t^W\)-measurable (measurability is preserved under pointwise limits). Since \(\mathbb{E}[\int_0^t |W_s| \, ds] \le \int_0^t \mathbb{E}[|W_s|] \, ds = \int_0^t \sqrt{2s/\pi} \, ds < \infty\), the integral is well-defined and \(X_t\) is \(\mathcal{F}_t^W\)-measurable for each \(t\). \(\square\)
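The moment computation at the end uses \(\mathbb{E}|W_s| = \sqrt{2s/\pi}\), the mean of a half-normal distribution; this is easy to sanity-check by Monte Carlo (seed and sample size are arbitrary choices):

```python
import math
import random

random.seed(3)
s = 1.0
# W_s ~ N(0, s), so |W_s| is half-normal with mean sqrt(2s/pi)
samples = [abs(random.gauss(0.0, math.sqrt(s))) for _ in range(100_000)]
estimate = sum(samples) / len(samples)

print(estimate, math.sqrt(2 * s / math.pi))  # both ~ 0.798
```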
Exercise 2: Progressive Measurability¶
(a) Let \(X\) be an adapted process with left-continuous paths. Prove that \(X\) is progressively measurable.
Hint: For fixed \(t\), define \(X^n_s(\omega) = X_{(k-1)/2^n}(\omega)\) for \(s \in ((k-1)/2^n, k/2^n] \cap [0,t]\). Show each \(X^n\) is \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\)-measurable (it's a step function in \(s\) with \(\mathcal{F}_t\)-measurable values), and \(X^n \to X\) pointwise by left-continuity.
(b) Give an example of an adapted process that is not progressively measurable.
Hint: Allow sample paths that are non-measurable in \(t\): take a non-Borel set \(A \subseteq [0,1]\) and consider the deterministic process \(X_t(\omega) = \mathbf{1}_A(t)\). Each \(X_t\) is constant, hence \(\mathcal{F}_t\)-measurable, so \(X\) is adapted; joint measurability fails because every \(\omega\)-section of a set in \(\mathcal{B}([0,1]) \otimes \mathcal{F}_1\) must be Borel.
(c) Let \(X\) be progressively measurable and let \(\tau\) be a stopping time. Show that for each \(t \ge 0\), the random variable \(X_\tau \mathbf{1}_{\{\tau \le t\}}\) is \(\mathcal{F}_t\)-measurable.
Hint: On the event \(\{\tau \le t\}\), the map \(\omega \mapsto (\tau(\omega), \omega)\) takes values in \([0,t] \times \Omega\). Use the composition \(X_\tau(\omega) = X(\tau(\omega), \omega)\) and the joint measurability of \(X\) on \([0,t] \times \Omega\).
Solution to Exercise 2
(a) Fix \(t > 0\). For each \(n\), define the step-function approximation:
\[
X^n_s(\omega) = X_{(k-1)/2^n}(\omega) \quad \text{for } s \in \left( \tfrac{k-1}{2^n}, \tfrac{k}{2^n} \right] \cap [0, t], \quad k \ge 1,
\]
with \(X^n_0(\omega) = X_0(\omega)\).
Each \(X^n\) is \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\)-measurable: on the set \((s, \omega) \in ((k-1)/2^n, k/2^n] \times \Omega\), the value is \(X_{(k-1)/2^n}(\omega)\), which is \(\mathcal{F}_{(k-1)/2^n}\)-measurable (by adaptedness) and hence \(\mathcal{F}_t\)-measurable. The indicator of \((k-1)/2^n < s \le k/2^n\) is \(\mathcal{B}([0,t])\)-measurable. So each piece is jointly measurable, and \(X^n\) is a finite sum of such pieces.
By left-continuity of paths, \(X^n_s(\omega) \to X_s(\omega)\) as \(n \to \infty\) for each \((s, \omega)\): when \(s > 0\), the points \((k-1)/2^n\) approach \(s\) from below, and \(X_{(k-1)/2^n} \to X_{s-} = X_s\) by left-continuity. The pointwise limit of jointly measurable functions is jointly measurable, so \(X\) restricted to \([0, t] \times \Omega\) is \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\)-measurable. \(\square\)
(b) Take a non-Borel set \(A \subseteq [0, 1]\) and define the deterministic process \(X_t(\omega) = \mathbf{1}_A(t)\). For each fixed \(t\), \(X_t\) is a constant random variable, hence \(\mathcal{F}_t\)-measurable, so \(X\) is adapted. However, if the restriction of \((s, \omega) \mapsto X_s(\omega)\) to \([0, 1] \times \Omega\) were \(\mathcal{B}([0,1]) \otimes \mathcal{F}_1\)-measurable, then every \(\omega\)-section of \(\{(s, \omega) : X_s(\omega) = 1\} = A \times \Omega\) would be Borel; but that section is \(A\), which is not Borel. Hence \(X\) is not progressively measurable. The genuinely subtle case is an adapted process that is jointly measurable but not progressive: by a theorem of Chung and Doob, such a process always admits a progressively measurable modification, which is why no simple jointly measurable counterexample arises in practice.
(c) Define the map \(\phi: \Omega \to [0, t] \times \Omega\) by \(\phi(\omega) = (\tau(\omega), \omega)\). On the event \(\{\tau \le t\}\), \(\phi\) maps into \([0, t] \times \Omega\).
Since \(X\) is progressively measurable, the restriction of \((s, \omega) \mapsto X_s(\omega)\) to \([0, t] \times \Omega\) is \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\)-measurable.
The map \(\omega \mapsto \tau(\omega)\) is \(\mathcal{F}_t\)-measurable on \(\{\tau \le t\}\) (since \(\tau\) is a stopping time), and \(\omega \mapsto \omega\) is trivially \(\mathcal{F}_t\)-measurable. Thus the composition \(X_\tau(\omega) \cdot \mathbf{1}_{\{\tau \le t\}}(\omega) = X(\tau(\omega), \omega) \cdot \mathbf{1}_{\{\tau \le t\}}(\omega)\) is \(\mathcal{F}_t\)-measurable, being the composition of a \(\mathcal{B}([0,t]) \otimes \mathcal{F}_t\)-measurable function with an \(\mathcal{F}_t\)-measurable map. \(\square\)
Exercise 3: The Hierarchy of Measurability¶
(a) Show that \(\mathcal{P} \subseteq \mathcal{O}\) by verifying that the generators of \(\mathcal{P}\) lie in \(\mathcal{O}\).
Hint: Show that \((s,t] \times F_s\) with \(F_s \in \mathcal{F}_s\) can be written as a limit of sets defined by càdlàg adapted indicator processes (note that \(\mathbf{1}_{(s,t]}(u)\) is left-continuous in \(u\), so the direct indicator will not do).
(b) Let \((N_t)\) be a Poisson process with intensity \(\lambda > 0\), and let \(\tau_1 = \inf\{t : N_t \ge 1\}\) be the first jump time. Define \(X_t = \mathbf{1}_{\{t \ge \tau_1\}}\).
- Verify that \(X\) is adapted to the natural filtration of \(N\).
- Verify that \(X\) is càdlàg, hence optional.
- Argue that \(X\) is not predictable.
Hint for non-predictability: \(\tau_1\) is a totally inaccessible stopping time (it cannot be announced by an increasing sequence of stopping times). For a predictable process, the "jump" would need to be predictable.
(c) Prove: If \(X\) is a continuous adapted process, then \(X\) is predictable.
Hint: A continuous function is in particular left-continuous (càglàd). By definition, the predictable σ-algebra \(\mathcal{P}\) is generated by all càglàd adapted processes. Therefore, any càglàd adapted process is \(\mathcal{P}\)-measurable, i.e., predictable. Since continuous implies càglàd, we're done.
Solution to Exercise 3
(a) The predictable \(\sigma\)-algebra \(\mathcal{P}\) is generated by sets \((s, t] \times F_s\) with \(F_s \in \mathcal{F}_s\) and \(\{0\} \times F_0\) with \(F_0 \in \mathcal{F}_0\).
For \((s, t] \times F_s\): the indicator \(Y_u(\omega) = \mathbf{1}_{(s, t]}(u) \cdot \mathbf{1}_{F_s}(\omega)\) is left-continuous in \(u\) (it jumps up just after \(s\)), so it is not itself càdlàg. Instead, consider the shifted processes \(Y^n_u(\omega) = \mathbf{1}_{[s + 1/n,\, t + 1/n)}(u) \cdot \mathbf{1}_{F_s}(\omega)\). Each \(Y^n\) is right-continuous in \(u\) and adapted (for \(u < s + 1/n\), \(Y^n_u = 0\); for \(u \ge s + 1/n > s\), \(\mathbf{1}_{F_s}\) is \(\mathcal{F}_s \subseteq \mathcal{F}_u\)-measurable), hence optional. Since \((s, t] = \bigcup_{N} \bigcap_{n \ge N} [s + 1/n, t + 1/n)\), we obtain \((s, t] \times F_s = \bigcup_{N} \bigcap_{n \ge N} \{Y^n = 1\} \in \mathcal{O}\).
Since the generators of \(\mathcal{P}\) lie in \(\mathcal{O}\), and \(\mathcal{P}\) is the smallest \(\sigma\)-algebra containing them, \(\mathcal{P} \subseteq \mathcal{O}\). \(\square\)
(b) Let \((N_t)\) be a Poisson process with intensity \(\lambda\), and \(\tau_1\) its first jump time.
-
Adapted: At time \(t\), \(X_t = \mathbf{1}_{\{t \ge \tau_1\}} = \mathbf{1}_{\{N_t \ge 1\}}\), which is \(\sigma(N_s : s \le t)\)-measurable. So \(X\) is adapted.
-
Càdlàg and optional: \(X_t\) jumps from 0 to 1 at \(t = \tau_1\) and is right-continuous (constant on \([0, \tau_1)\) and \([\tau_1, \infty)\)). So \(X\) is càdlàg adapted, hence optional.
-
Not predictable: The stopping time \(\tau_1\) is totally inaccessible: there is no increasing sequence of stopping times \(\sigma_n < \tau_1\) with \(\sigma_n \uparrow \tau_1\) a.s. Intuitively, by the memoryless property, given \(\{\tau_1 > t\}\) the remaining waiting time is again \(\text{Exp}(\lambda)\), so no sequence of observable times can "announce" the jump. A predictable process cannot jump at a totally inaccessible time — its jumps must occur at predictable stopping times. Since \(X\) has a jump at the totally inaccessible time \(\tau_1\), \(X\) is not predictable.
(c) A continuous adapted process \(X\) is in particular left-continuous (càglàd). The predictable \(\sigma\)-algebra \(\mathcal{P}\) is generated by all càglàd adapted processes. Therefore every càglàd adapted process is \(\mathcal{P}\)-measurable (i.e., predictable). Since continuous implies càglàd, any continuous adapted process is predictable. \(\square\)
Exercise 4: Predictability in Finance¶
(a) In discrete time with filtration \((\mathcal{F}_n)_{n \ge 0}\), prove that a process \((H_n)_{n \ge 1}\) is predictable if and only if \(H_n\) is \(\mathcal{F}_{n-1}\)-measurable for all \(n \ge 1\).
Hint: The predictable σ-algebra in discrete time is generated by sets of the form \(\{n\} \times F_{n-1}\) for \(F_{n-1} \in \mathcal{F}_{n-1}\).
(b) Explain why trading strategies must be predictable rather than merely adapted. Construct a concrete "arbitrage" example using an adapted but non-predictable strategy.
Hint: Consider a stock priced at \(S_t\) that jumps from \(\$100\) to \(\$150\) at a totally inaccessible stopping time \(\tau\). An adapted (but non-predictable) strategy could be: "hold 1 share during the instant of the jump," i.e., \(H_t = \mathbf{1}_{\{t = \tau\}}\). This is adapted (at each time \(t\), we know if \(t = \tau\)) but not predictable (we can't know just before \(\tau\) that we should buy). The "profit" is \(H_\tau \cdot \Delta S_\tau = 1 \cdot 50 = \$50\), achieved with zero investment before the jump. This is arbitrage. In reality, one cannot implement such a strategy because buying takes time—you must decide to hold before observing the jump, which is exactly what predictability enforces.
(c) In the Black-Scholes model (continuous prices), explain why the distinction between adapted and predictable is less critical.
Solution to Exercise 4
(a) In discrete time, the predictable \(\sigma\)-algebra is generated by sets \(\{n\} \times F_{n-1}\) where \(F_{n-1} \in \mathcal{F}_{n-1}\) and \(n \ge 1\).
(\(\Rightarrow\)) Suppose \((H_n)_{n \ge 1}\) is predictable, i.e., \(\{(n, \omega) : H_n(\omega) \in B\} \in \mathcal{P}\) for every Borel set \(B\). The collection of sets in \(\mathcal{P}\) whose time-\(n\) section lies in \(\mathcal{F}_{n-1}\) is a \(\sigma\)-algebra containing every generator \(\{m\} \times F_{m-1}\) (the section is \(F_{n-1}\) if \(m = n\) and \(\emptyset\) otherwise), hence equals \(\mathcal{P}\). Taking the section at time \(n\) shows \(\{H_n \in B\} \in \mathcal{F}_{n-1}\), so \(H_n\) is \(\mathcal{F}_{n-1}\)-measurable.

(\(\Leftarrow\)) If \(H_n\) is \(\mathcal{F}_{n-1}\)-measurable for each \(n\), then for any Borel set \(B\), \(\{(n, \omega) : H_n(\omega) \in B\} = \bigcup_{n \ge 1} \{n\} \times \{H_n \in B\}\), and each \(\{H_n \in B\} \in \mathcal{F}_{n-1}\), so each piece is a generator of \(\mathcal{P}\). Hence \(H\) is \(\mathcal{P}\)-measurable. \(\square\)
(b) Consider a stock \(S_t\) that jumps from $100 to $150 at a totally inaccessible stopping time \(\tau\). An adapted but non-predictable strategy: \(H_t = \mathbf{1}_{\{t = \tau\}}\) (hold 1 share only at the exact moment of the jump).
At time \(t\), we know whether \(t = \tau\), so \(H_t\) is adapted. But it is not predictable because we cannot know just before \(\tau\) that the jump is about to occur (\(\tau\) is totally inaccessible).
The "profit" is \(H_\tau \cdot \Delta S_\tau = 1 \cdot 50 = \$50\), achieved with zero investment before the jump. This constitutes an arbitrage — a riskless profit from nothing.
The predictability requirement prevents this: a predictable strategy requires deciding holdings before observing the price change, so one cannot selectively hold shares only at the instant of a surprise jump.
(c) In the Black-Scholes model, asset prices follow geometric Brownian motion with continuous paths. For continuous processes, the hierarchy collapses: every continuous adapted process is automatically predictable, optional, and progressively measurable. Since both the price process and any continuous trading strategy are predictable, the distinction between "adapted" and "predictable" is moot. The subtle issues of predictability only arise when prices or strategies can jump.
Exercise 5: Connecting to the Filtration Document¶
Using the random walk setting from the companion document Filtration:
Let \((X_n)_{n \ge 1}\) be i.i.d. coin flips with \(X_n \in \{+1, -1\}\), \(S_n = X_1 + \cdots + X_n\), \(S_0 = 0\), and \(\mathcal{F}_n = \sigma(X_1, \ldots, X_n)\) with \(\mathcal{F}_0 = \{\emptyset, \Omega\}\).
(a) Show that \((S_n)_{n \ge 0}\) is adapted to \((\mathcal{F}_n)\).
(b) Define \(H_n = S_{n-1}\) for \(n \ge 1\) (with \(H_0 = 0\)). Is \((H_n)\) predictable? Justify your answer.
(c) Is the process \(Y_n = X_n = S_n - S_{n-1}\) (for \(n \ge 1\)) predictable? Justify.
(d) Define \(\tau = \min\{n \ge 1 : S_n = 2\}\) (first hitting time of level 2). Classify the process \(Z_n = \mathbf{1}_{\{n \ge \tau\}}\):

- Is it adapted?
- Is it predictable?
- Justify each answer.
Solution to Exercise 5
(a) We have \(S_n = X_1 + \cdots + X_n\) and \(\mathcal{F}_n = \sigma(X_1, \ldots, X_n)\). Since \(S_n\) is a function of \((X_1, \ldots, X_n)\), it is \(\sigma(X_1, \ldots, X_n) = \mathcal{F}_n\)-measurable for each \(n\). Also \(S_0 = 0\) is \(\mathcal{F}_0 = \{\emptyset, \Omega\}\)-measurable (it's a constant). Hence \((S_n)_{n \ge 0}\) is adapted. \(\square\)
(b) \(H_n = S_{n-1} = X_1 + \cdots + X_{n-1}\) for \(n \ge 1\). This is a function of \((X_1, \ldots, X_{n-1})\), hence \(\mathcal{F}_{n-1}\)-measurable.
By the characterization in Exercise 4(a), \((H_n)\) is predictable. \(\square\)
Intuitively: the strategy "hold \(S_{n-1}\) shares at time \(n\)" uses only information available at time \(n - 1\).
(c) \(Y_n = X_n\) for \(n \ge 1\). For predictability, we need \(X_n\) to be \(\mathcal{F}_{n-1}\)-measurable. But \(X_n\) is independent of \(\mathcal{F}_{n-1} = \sigma(X_1, \ldots, X_{n-1})\) and non-constant (\(\mathbb{P}(X_n = +1) = \mathbb{P}(X_n = -1) = 1/2\)). A non-constant random variable cannot be measurable with respect to a \(\sigma\)-algebra from which it is independent. Therefore \((Y_n)\) is not predictable.
(d) The process \(Z_n = \mathbf{1}_{\{n \ge \tau\}}\).
-
Adapted: At time \(n\), we know whether \(\tau \le n\) (since \(\tau = \min\{k \ge 1 : S_k = 2\}\) and we have observed \(S_1, \ldots, S_n\)). So \(Z_n = \mathbf{1}_{\{\tau \le n\}}\) is \(\mathcal{F}_n\)-measurable, and \(Z\) is adapted. Yes, adapted.
-
Predictable: We need \(Z_n = \mathbf{1}_{\{\tau \le n\}}\) to be \(\mathcal{F}_{n-1}\)-measurable. Write \(\{\tau \le n\} = \{\tau \le n - 1\} \cup \{\tau = n\}\). The first event lies in \(\mathcal{F}_{n-1}\), so the question reduces to whether \(\{\tau = n\} \in \mathcal{F}_{n-1}\). But \(\{\tau = n\} = \{S_1 < 2, \ldots, S_{n-1} < 2, S_n = 2\}\), and the condition \(S_n = 2\) depends on \(X_n\), which is independent of \(\mathcal{F}_{n-1}\). Hence \(\{\tau = n\} \notin \mathcal{F}_{n-1}\) in general, \(Z_n\) is not \(\mathcal{F}_{n-1}\)-measurable, and \(Z\) is not predictable.
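These classifications can be double-checked by exhaustive enumeration on short flip sequences: \(Z_2 = \mathbf{1}_{\{\tau \le 2\}}\) should be constant on the atoms of \(\mathcal{F}_2\) but not on those of \(\mathcal{F}_1\). A sketch (the helpers `hitting_time` and `constant_on_atoms` are my own illustrative names):

```python
from itertools import product

def hitting_time(omega, level=2):
    """First n >= 1 with S_n == level, or None if the level is never hit."""
    s = 0
    for n, x in enumerate(omega, start=1):
        s += x
        if s == level:
            return n
    return None

def constant_on_atoms(rv, k, N=3):
    """True iff rv is determined by the first k flips, i.e. constant on
    each atom of sigma(X_1, ..., X_k)."""
    seen = {}
    for omega in product([-1, 1], repeat=N):
        prefix, v = omega[:k], rv(omega)
        if seen.setdefault(prefix, v) != v:
            return False
    return True

def Z(omega, n):
    """Z_n = 1{tau <= n} for the first hitting time of level 2."""
    tau = hitting_time(omega)
    return 1 if tau is not None and tau <= n else 0

Z2 = lambda omega: Z(omega, 2)

print(constant_on_atoms(Z2, 2))  # True:  Z_2 is F_2-measurable (adapted)
print(constant_on_atoms(Z2, 1))  # False: Z_2 is not F_1-measurable (not predictable)
```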