
or equivalently, if $Q \ll P$ and $P \ll Q$. In that case, there exists a strictly positive, $\mathcal{F}$-measurable random variable $L$ such that $Q(A) = E_P(L\,\mathbf{1}_A)$. Note that $\frac{dP}{dQ} = L^{-1}$ and $P(A) = E_Q(L^{-1}\mathbf{1}_A)$.

Conversely, if $L$ is a strictly positive $\mathcal{F}$-measurable r.v. with expectation 1 under $P$, then $Q = L \cdot P$ defines a probability measure on $\mathcal{F}$, equivalent to $P$. From the definition of equivalence, if a property holds almost surely (a.s.) with respect to $P$, it also holds a.s. for any probability $Q$ equivalent to $P$.
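For instance, if $P$ is the standard Gaussian law on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$ and $L(\omega) = \exp(\lambda\omega - \lambda^{2}/2)$ for a fixed real number $\lambda$, then $L$ is strictly positive with $E_P(L) = 1$, and $Q = L \cdot P$ is the Gaussian law with mean $\lambda$ and unit variance; in particular, $Q$ is equivalent to $P$.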
Two probabilities $P$ and $Q$ on the same filtered probability space $(\Omega, \mathcal{F})$ are said to be locally equivalent$^{1}$ if they have the same negligible sets on $\mathcal{F}_t$, for every $t \ge 0$, i.e., if $Q|_{\mathcal{F}_t} \sim P|_{\mathcal{F}_t}$. In that case, there exists a strictly positive $\mathcal{F}$-adapted process $(L_t,\, t \ge 0)$ such that $Q|_{\mathcal{F}_t} = L_t\, P|_{\mathcal{F}_t}$. (See Subsection 1.7.1 for more information.) Furthermore, if $\tau$ is a stopping time (see Subsection 1.2.3), then
$$Q|_{\mathcal{F}_\tau \cap \{\tau<\infty\}} = L_\tau \cdot P|_{\mathcal{F}_\tau \cap \{\tau<\infty\}}\,.$$
This will be important when dealing with Girsanov’s theorem and explosion
times (See Proposition 1.7.5.3).
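A prototypical example, anticipating Section 1.7: the law of a Brownian motion with drift $\nu$ and the Wiener measure are locally equivalent on the canonical space, with density process $L_t = \exp(\nu B_t - \nu^{2}t/2)$, although the two laws are mutually singular on $\mathcal{F}_\infty$, hence not equivalent.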
Warning 1.1.7.1 If P ∼ Q and X is a P-integrable random variable, it is
not necessarily Q-integrable.
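For instance, take $P$ the standard Gaussian law on $\mathbb{R}$ and $Q$ the centered Gaussian law with variance 2; these two laws are equivalent, since both admit strictly positive densities with respect to the Lebesgue measure. However, the random variable $X(\omega) = e^{\omega^{2}/2}(1+\omega^{2})^{-1}$ satisfies $E_P(X) < \infty$, whereas $E_Q(X) = \infty$.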
1.1.8 Construction of Simple Probability Spaces
In order to construct a random variable with a given law, say a Gaussian law, the canonical approach is to take $\Omega = \mathbb{R}$, $X : \Omega \to \mathbb{R}$, $X(\omega) = \omega$ the identity map, and $P$ the law on $\Omega = \mathbb{R}$ with the Gaussian density with respect to the Lebesgue measure, i.e.,
$$P(d\omega) = \frac{1}{\sqrt{2\pi}}\, \exp\!\left(-\frac{\omega^{2}}{2}\right) d\omega$$
(recall that here $\omega$ is a real number). Then the cumulative distribution function of the random variable $X$ is
$$F_X(x) = P(X \le x) = \int_{\Omega} \mathbf{1}_{\{\omega \le x\}}\, P(d\omega) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, \exp\!\left(-\frac{\omega^{2}}{2}\right) d\omega\,.$$
Hence, the map $X$ is a Gaussian random variable. The construction of a real-valued r.v. with any given law can be carried out using the same idea; for example, if one needs to construct a random variable with an exponential law, then, similarly, one may choose $\Omega = \mathbb{R}$ and the density $e^{-\omega}\,\mathbf{1}_{\{\omega \ge 0\}}$.
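With this choice, the same computation gives, for $x \ge 0$,
$$F_X(x) = P(X \le x) = \int_{0}^{x} e^{-\omega}\, d\omega = 1 - e^{-x},$$
so that $X$ has the standard exponential law.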
For two independent variables, we choose $\Omega = \Omega_1 \times \Omega_2$ where $\Omega_i$, $i = 1, 2$ are two copies of $\mathbb{R}$. On each $\Omega_i$, one constructs a random variable as above,
$^{1}$ This commonly used terminology often refers to a sequence $(T_n)$ of stopping times, with $T_n \uparrow \infty$ a.s.; here, it is preferable to restrict ourselves to the deterministic case $T_n = n$.