86    CHAPTER 3. PROBABILITY. BLACK-SCHOLES FORMULA.

Lemma 3.5. If X is a random variable, then

var(X) = E[X^2] - (E[X])^2.
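The identity of Lemma 3.5 holds exactly for the empirical distribution of any finite sample, so it can be spot-checked numerically. The following Python sketch (an illustration, not from the text; the sample size and seed are arbitrary) compares the two sides on simulated data:

```python
import random

# Check var(X) = E[X^2] - (E[X])^2 on the empirical distribution of a sample.
random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(10_000)]

def mean(v):
    return sum(v) / len(v)

m = mean(x)
var_def = mean([(xi - m) ** 2 for xi in x])    # E[(X - E[X])^2]
var_alt = mean([xi * xi for xi in x]) - m * m  # E[X^2] - (E[X])^2

assert abs(var_def - var_alt) < 1e-9
```

The two expressions agree to floating-point accuracy because the lemma is an algebraic identity, not an approximation.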
Definition 3.4. Let X and Y be two random variables over the same probability space. The covariance cov(X, Y) of X and Y is defined as

cov(X, Y) = E[(X - E[X])(Y - E[Y])].   (3.17)

The correlation corr(X, Y) between X and Y is equal to the covariance of X and Y normalized with respect to the standard deviations of X and Y, i.e.,

corr(X, Y) = cov(X, Y) / (σ(X) σ(Y)),   (3.18)

where σ(X) and σ(Y) are the standard deviations of X and Y, respectively.
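Definitions (3.17) and (3.18) translate directly into code when the expectations are taken over a sample's empirical distribution. A short sketch (again an illustration; the linear relation Y = 2X + noise is an arbitrary choice to produce positively correlated variables):

```python
import math
import random

random.seed(1)
n = 10_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [2.0 * xi + random.gauss(0.0, 1.0) for xi in x]  # Y positively related to X

def mean(v):
    return sum(v) / len(v)

mx, my = mean(x), mean(y)
cov_xy = mean([(xi - mx) * (yi - my) for xi, yi in zip(x, y)])  # (3.17)
sx = math.sqrt(mean([(xi - mx) ** 2 for xi in x]))              # σ(X)
sy = math.sqrt(mean([(yi - my) ** 2 for yi in y]))              # σ(Y)
corr_xy = cov_xy / (sx * sy)                                    # (3.18)

assert cov_xy > 0
assert 0.8 < corr_xy < 1.0
```

For this construction the theoretical correlation is 2/sqrt(5) ≈ 0.894, and the sample estimate lands close to it.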
Lemma 3.6. Let X and Y be two random variables over the same probability space. Then

cov(X, Y) = E[XY] - E[X]E[Y].   (3.19)

Proof. Using (3.10) and (3.11) repeatedly, we find that

cov(X, Y) = E[(X - E[X])(Y - E[Y])]
          = E[XY - X E[Y] - Y E[X] + E[X]E[Y]]
          = E[XY] - E[X E[Y]] - E[Y E[X]] + E[X]E[Y].   (3.20)

Since E[X] and E[Y] are constants, we conclude from (3.11) that E[X E[Y]] = E[Y] E[X] and E[Y E[X]] = E[X] E[Y]. Therefore, formula (3.20) becomes

cov(X, Y) = E[XY] - 2E[X]E[Y] + E[X]E[Y] = E[XY] - E[X]E[Y].  □
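Like Lemma 3.5, formula (3.19) is an exact algebraic identity, so the definitional form (3.17) and the shortcut (3.19) agree to rounding error on any sample. A quick numerical sketch (sample parameters are arbitrary):

```python
import random

random.seed(2)
n = 5_000
x = [random.gauss(1.0, 2.0) for _ in range(n)]
y = [0.5 * xi + random.gauss(-1.0, 3.0) for xi in x]

def mean(v):
    return sum(v) / len(v)

mx, my = mean(x), mean(y)
by_definition = mean([(xi - mx) * (yi - my) for xi, yi in zip(x, y)])  # (3.17)
shortcut = mean([xi * yi for xi, yi in zip(x, y)]) - mx * my           # (3.19)

assert abs(by_definition - shortcut) < 1e-9
```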
Let X, Y, and U be random variables over the same probability space. The following properties are easy to establish and show that the covariance of two random variables is a symmetric bilinear operator:

cov(X, Y) = cov(Y, X);
cov(X + U, Y) = cov(X, Y) + cov(U, Y);
cov(X, Y + U) = cov(X, Y) + cov(X, U);   (3.21)
cov(cX, Y) = c cov(X, Y) = cov(X, cY),  ∀ c ∈ ℝ;
cov(c1X, c2Y) = c1c2 cov(X, Y),  ∀ c1, c2 ∈ ℝ.   (3.22)
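Since the empirical covariance of a finite sample is itself a symmetric bilinear operator, the identities (3.21) and (3.22) can be verified exactly (up to rounding) on simulated data. A sketch, with arbitrary sample size and constants:

```python
import random

random.seed(3)
n = 4_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [random.gauss(0.0, 1.0) for _ in range(n)]
u = [random.gauss(0.0, 1.0) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return mean([(ai - ma) * (bi - mb) for ai, bi in zip(a, b)])

# Symmetry: cov(X, Y) = cov(Y, X)
assert abs(cov(x, y) - cov(y, x)) < 1e-12

# Additivity in the first argument: cov(X + U, Y) = cov(X, Y) + cov(U, Y)
x_plus_u = [xi + ui for xi, ui in zip(x, u)]
assert abs(cov(x_plus_u, y) - (cov(x, y) + cov(u, y))) < 1e-9

# Scaling, as in (3.22): cov(c1*X, c2*Y) = c1*c2*cov(X, Y)
c1, c2 = 2.5, -3.0
lhs = cov([c1 * xi for xi in x], [c2 * yi for yi in y])
assert abs(lhs - c1 * c2 * cov(x, y)) < 1e-9
```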
Lemma 3.7. Let X and Y be two random variables over the same probability space. Then

var(X + Y) = var(X) + 2 cov(X, Y) + var(Y),   (3.23)

or, equivalently,

var(X + Y) = σ^2(X) + 2 σ(X) σ(Y) corr(X, Y) + σ^2(Y),   (3.24)

where σ(X) and σ(Y) are the standard deviations of X and Y, respectively.
Proof. Formula (3.23) is derived from definitions (3.13) and (3.17), by using the additivity of the expected value (3.10), as follows:

var(X + Y) = E[((X + Y) - E[X + Y])^2]
           = E[((X - E[X]) + (Y - E[Y]))^2]
           = E[(X - E[X])^2] + 2 E[(X - E[X])(Y - E[Y])] + E[(Y - E[Y])^2]
           = var(X) + 2 cov(X, Y) + var(Y).

Formula (3.24) is a direct consequence of (3.18) and (3.23).  □
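Formula (3.23) also holds exactly for sample moments, which gives a one-line numerical check (a sketch; the distributions chosen here are arbitrary):

```python
import random

random.seed(4)
n = 5_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [random.gauss(0.0, 2.0) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return mean([(ai - ma) * (bi - mb) for ai, bi in zip(a, b)])

def var(a):
    return cov(a, a)

# var(X + Y) = var(X) + 2 cov(X, Y) + var(Y), as in (3.23)
s = [xi + yi for xi, yi in zip(x, y)]
assert abs(var(s) - (var(x) + 2.0 * cov(x, y) + var(y))) < 1e-9
```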
The relevant information contained by the covariance of two random variables is related to whether they are positively or negatively correlated, i.e., whether cov(X, Y) > 0 or cov(X, Y) < 0. The correlation of the two variables contains the same information as the covariance, in terms of its sign, but its size is also relevant, since it is scaled to adjust for multiplication by constants, i.e.,

corr(c1X, c2Y) = sgn(c1c2) corr(X, Y),  ∀ c1, c2 ∈ ℝ.   (3.25)
Here, the sign function sgn(c) is equal to 1, if c > 0, and to -1, if c < 0.
To see this, recall from (3.16) that σ(c1X) = |c1| σ(X) and σ(c2Y) = |c2| σ(Y). Then, from (3.18) and using (3.22), it follows that

corr(c1X, c2Y) = cov(c1X, c2Y) / (σ(c1X) σ(c2Y))
               = c1c2 cov(X, Y) / (|c1| |c2| σ(X) σ(Y))
               = sgn(c1c2) corr(X, Y).
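Property (3.25) is again exact for sample moments, so it can be confirmed numerically. In the sketch below (constants and distributions are arbitrary choices), c1 and c2 have opposite signs, so the correlation should flip sign but keep its magnitude:

```python
import math
import random

random.seed(5)
n = 5_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [0.7 * xi + random.gauss(0.0, 1.0) for xi in x]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return mean([(ai - ma) * (bi - mb) for ai, bi in zip(a, b)])

def corr(a, b):
    return cov(a, b) / math.sqrt(cov(a, a) * cov(b, b))

c1, c2 = 3.0, -2.0  # sgn(c1*c2) = -1, so the correlation flips sign
scaled = corr([c1 * xi for xi in x], [c2 * yi for yi in y])
assert abs(scaled + corr(x, y)) < 1e-9
```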
Lemma 3.8. Let X and Y be two random variables over the same probability space. Then

-1 ≤ corr(X, Y) ≤ 1.   (3.26)
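The bound (3.26) also holds for sample correlations, by the Cauchy-Schwarz inequality applied to the empirical distribution. A spot check over many randomly generated pairs of samples (sample sizes, slopes, and seed are arbitrary):

```python
import math
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return mean([(ai - ma) * (bi - mb) for ai, bi in zip(a, b)])

def corr(a, b):
    return cov(a, b) / math.sqrt(cov(a, a) * cov(b, b))

random.seed(6)
for _ in range(20):
    slope = random.uniform(-2.0, 2.0)
    x = [random.gauss(0.0, 1.0) for _ in range(500)]
    y = [slope * xi + random.gauss(0.0, 1.0) for xi in x]
    r = corr(x, y)
    assert -1.0 - 1e-12 <= r <= 1.0 + 1e-12
```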