
28 Chapter 1
a property extending the theorem of total probability (6.14). If, moreover, $A$ is independent of $\Im_1$, that is to say, if for all $B$ belonging to $\Im_1$,
$$P(A \cap B) = P(A)P(B), \qquad (6.37)$$
then we see from relation (6.34) that
$$P^{\Im_1}(A)(\omega) = P(A), \quad \omega \in \Omega. \qquad (6.38)$$
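As a concrete illustration of the implication (6.37) $\Rightarrow$ (6.38), here is a minimal sketch on a finite space; the fair die, the event $A$ and the generating partition below are our own illustrative choices, not taken from the text.

```python
from fractions import Fraction

# Omega = {1,...,6} with the uniform (fair die) probability
omega = range(1, 7)
P = {w: Fraction(1, 6) for w in omega}

A = {1, 2}                          # the event A
partition = [{2, 4, 6}, {1, 3, 5}]  # atoms generating the sub-sigma-algebra

def prob(event):
    return sum(P[w] for w in event)

# The independence condition (6.37) holds on each generating atom B ...
for B in partition:
    assert prob(A & B) == prob(A) * prob(B)

# ... so the conditional probability of A given the partition, which is
# constant on each atom and equal to P(A & B)/P(B), coincides with P(A)
# everywhere, as relation (6.38) states.
for B in partition:
    assert prob(A & B) / prob(B) == prob(A)
```

Working with `Fraction` keeps the check exact rather than subject to floating-point rounding.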
Similarly, if the r.v. $Y$ is independent of $\Im_1$, that is to say, if for each event $B$ belonging to $\Im_1$ and each set $A$ belonging to the $\sigma$-algebra generated by the inverse images of $Y$, denoted by $\sigma(Y)$, the relation (6.37) is true, then from relation (6.27), we have
$$E^{\Im_1}(Y) = E(Y). \qquad (6.39)$$
Indeed, from relation (6.37), we can write, for every $B \in \Im_1$,
$$\begin{aligned}
\int_B E^{\Im_1}(Y)(\omega)\,dP(\omega) &= \int_B Y(\omega)\,dP(\omega), \quad B \in \Im_1, \\
&= E(Y)P(B) \\
&= \int_B E(Y)\,dP,
\end{aligned} \qquad (6.40)$$
and so, since $E^{\Im_1}(Y)$ and the constant $E(Y)$ are both $\Im_1$-measurable and have the same integral over every $B \in \Im_1$, relation (6.39) is proved. In particular, if $\Im_1$ is generated by the r.v. $X_1,\ldots,X_n$, then the independence between $Y$ and $\Im_1$ implies that
$$E^{X_1,\ldots,X_n}(Y) = E(Y). \qquad (6.41)$$
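Relation (6.41) can be checked numerically; the following Monte Carlo sketch uses arbitrary illustrative distributions (a standard Gaussian $Y$ independent of two uniform variables $X_1, X_2$) and an arbitrary conditioning event from the $\sigma$-algebra generated by $X_1, X_2$.

```python
import random

random.seed(0)

# Y independent of (X1, X2): Y standard Gaussian, X1, X2 uniform on (0,1)
n = 200_000
samples = [(random.gauss(0, 1), random.random(), random.random())
           for _ in range(n)]

# Plain expectation E(Y), here close to 0
e_y = sum(y for y, _, _ in samples) / n

# Empirical expectation of Y restricted to the event {X1 < 1/2, X2 < 1/2},
# an event of the sigma-algebra generated by X1, X2
cond = [y for y, x1, x2 in samples if x1 < 0.5 and x2 < 0.5]
e_y_cond = sum(cond) / len(cond)

# By (6.41) the two averages should agree up to sampling noise
assert abs(e_y - e_y_cond) < 0.05
```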
Relations (6.39) and (6.41) give us a better understanding of the intuitive meaning of conditioning. Under independence assumptions, conditioning has no impact at all, for example, on the expectation or on the probability; on the contrary, under dependence, the results with and without conditioning differ. This means that conditioning can be interpreted as supplying additional information, useful for obtaining more precise results in the case of dependence, of course.
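This contrast can be made concrete with a small simulation; the distributions below (a Gaussian $X$, an independent Gaussian $Y$, and a dependent $Y = X + \text{noise}$) and the conditioning event $\{X > 1\}$ are illustrative assumptions of ours, not taken from the text.

```python
import random

random.seed(1)

n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
y_indep = [random.gauss(0, 1) for _ in range(n)]  # independent of X
y_dep = [x + random.gauss(0, 1) for x in xs]      # depends on X

def mean(values):
    return sum(values) / len(values)

# Condition on the event {X > 1}, which belongs to the sigma-algebra
# generated by X
sel = [i for i, x in enumerate(xs) if x > 1]

e_indep, e_indep_cond = mean(y_indep), mean([y_indep[i] for i in sel])
e_dep, e_dep_cond = mean(y_dep), mean([y_dep[i] for i in sel])

# Independence: conditioning brings no information, the mean is unchanged
assert abs(e_indep_cond - e_indep) < 0.05

# Dependence: knowing {X > 1} shifts the conditional mean well above E(Y)
assert e_dep_cond - e_dep > 1.0
```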
The properties of the expectation, quoted in section 4, are also properties of the conditional expectation, now holding a.s., but there are supplementary properties which are very important in stochastic modelling. They are given in the next proposition.
Proposition 6.1 (Supplementary properties of conditional expectation)
On the probability space $(\Omega, \Im, P)$, we have the following properties:
(i) If the r.v. $X$ is $\Im_1$-measurable, then
$$E^{\Im_1}(X) = X, \ \text{a.s.} \qquad (6.42)$$
(ii) Let $X$ be a r.v. and let $Y$ be $\Im_1$-measurable; then