
INTRODUCTION
Probability and statistics concepts have very useful applications in science and engineering. This chapter briefly introduces some of the important concepts of probability and statistics.

In probabilistic modeling one often refers to a random experiment, which represents an experiment that can be repeated time and again under similar circumstances but yields unpredictable results at each trial. For example, tossing a die is a random experiment where the result is a number from the sample space {1, 2, 3, 4, 5, 6}. A random experiment may consist of observing or measuring elements taken from a set that is known as a population. A real number associated with the result of a random experiment is called a random variable or variate. A variate may be discrete or continuous.
Discrete Probability Distribution
A discrete variate takes its value from a finite or denumerable set {x_1, x_2, ..., x_n} described by its probability function, p_k, defined as the probability of obtaining x_k as a result of one trial. p_k has the following properties:

0 ≤ p_k ≤ 1    and    Σ_{all k} p_k = 1

A discrete variate can be described by its cumulative probability function as

P(x) = Σ_{x_k ≤ x} p_k
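To make the definitions concrete, here is a minimal Python sketch (an illustration of ours, not part of the handbook) that builds the probability function of a fair die, checks the two properties above, and evaluates the cumulative probability function:

    # Probability function of a fair die: p_k = 1/6 for each outcome x_k.
    p = {x: 1.0 / 6.0 for x in (1, 2, 3, 4, 5, 6)}

    # The properties 0 <= p_k <= 1 and sum over all k of p_k = 1 hold:
    assert all(0.0 <= pk <= 1.0 for pk in p.values())
    assert abs(sum(p.values()) - 1.0) < 1e-12

    # Cumulative probability function P(x): sum of p_k over all x_k <= x.
    def cumulative(x):
        return sum(pk for xk, pk in p.items() if xk <= x)

    print(cumulative(3))   # 0.5, the probability of rolling 3 or less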
A discrete variate of multiple dimensions is described by its joint distribution. For example, if (x_j, y_k) are the coordinates of a point in the (x, y) plane, then p(x_j, y_k) is the joint probability distribution such that x = x_j and y = y_k. In addition,

p_1(x_j) = Σ_{all k} p(x_j, y_k)

and

p_2(y_k) = Σ_{all j} p(x_j, y_k)
are, respectively, the marginal probabilities that x = x_j independent of y and that y = y_k independent of x.
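For example, the joint distribution of two fair dice assigns p(x_j, y_k) = 1/36 to every pair of faces; the sketch below (the names joint, p1, and p2 are illustrative) recovers the marginals by summing over the other index:

    # Joint distribution of two fair dice: p(x_j, y_k) = 1/36 for every pair.
    faces = (1, 2, 3, 4, 5, 6)
    joint = {(xj, yk): 1.0 / 36.0 for xj in faces for yk in faces}

    # Marginal probability p_1(x_j): sum p(x_j, y_k) over all k.
    def p1(xj):
        return sum(joint[(xj, yk)] for yk in faces)

    # Marginal probability p_2(y_k): sum p(x_j, y_k) over all j.
    def p2(yk):
        return sum(joint[(xj, yk)] for xj in faces)

    print(p1(4), p2(4))   # 0.1666..., 0.1666... (each die alone is uniform)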
Also,

p(x_j | y_k) = p(x_j, y_k) / p_2(y_k),    p_2(y_k) > 0

and

p(y_k | x_j) = p(x_j, y_k) / p_1(x_j),    p_1(x_j) > 0
are, respectively, the conditional probabilities that x = x_j given that y = y_k has already occurred, and that y = y_k given that x = x_j has already occurred.
Finally, x_j and y_k are said to be independent if

p(x_j, y_k) = p_1(x_j) p_2(y_k)

for all x_j and y_k.
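Continuing the two-dice illustration, the conditional probabilities and the independence test follow directly from these formulas; the sketch below is again ours, with illustrative names:

    # Two fair dice again: joint distribution and marginals.
    faces = (1, 2, 3, 4, 5, 6)
    joint = {(xj, yk): 1.0 / 36.0 for xj in faces for yk in faces}

    def p1(xj):
        return sum(joint[(xj, yk)] for yk in faces)

    def p2(yk):
        return sum(joint[(xj, yk)] for xj in faces)

    # Conditional probability p(x_j | y_k) = p(x_j, y_k) / p_2(y_k), p_2(y_k) > 0.
    def cond_x_given_y(xj, yk):
        return joint[(xj, yk)] / p2(yk)

    # Independence: p(x_j, y_k) = p_1(x_j) p_2(y_k) for all x_j and y_k.
    independent = all(
        abs(joint[(xj, yk)] - p1(xj) * p2(yk)) < 1e-12
        for xj in faces for yk in faces
    )
    print(cond_x_given_y(2, 5), independent)   # 0.1666..., True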
Continuous Probability Distribution
A continuous variate takes its value from a nondenumerable set described by its probability density function, p(x). The probability that one trial of the experiment gives a result between x and x + dx is p(x)dx. The cumulative distribution function, P(x), is defined as the probability that the continuous variate is less than or equal to x. It can be represented mathematically as

P(x) = ∫_{−∞}^{x} p(s) ds
Note that the following properties hold:

P(x_1) ≥ P(x_2)    if x_1 ≥ x_2

P(−∞) = 0

P(+∞) = ∫_{−∞}^{+∞} p(s) ds = 1

P(x) ≥ 0

p(x) = dP/dx
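As a numerical sketch, assume an exponential density p(x) = e^(−x) for x ≥ 0 (chosen here only for illustration); the cumulative distribution function can then be approximated with the trapezoidal rule and the listed properties spot-checked:

    import math

    # Assumed probability density function: exponential with rate 1.
    def p(x):
        return math.exp(-x) if x >= 0.0 else 0.0

    # P(x) = integral of p(s) ds from -infinity to x, approximated with
    # the trapezoidal rule (the density is zero below s = 0).
    def P(x, n=10000):
        if x <= 0.0:
            return 0.0
        ds = x / n
        samples = [p(i * ds) for i in range(n + 1)]
        return ds * (sum(samples) - 0.5 * (samples[0] + samples[-1]))

    print(P(1.0))            # approx 1 - exp(-1) = 0.6321...
    print(P(2.0) >= P(1.0))  # True: P(x_1) >= P(x_2) when x_1 >= x_2
    print(P(50.0))           # approx 1: P(+infinity) = 1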
A continuous variate of multiple dimensions, or multivariate, is described by its joint distribution. For example, if (x, y) are the coordinates of a point in a plane, then p(x, y)dxdy is the probability that the multivariate has its x-coordinate between x and x + dx and its y-coordinate between y and y + dy.
In addition,

p_1(x) = ∫_{−∞}^{+∞} p(x, y) dy

and

p_2(y) = ∫_{−∞}^{+∞} p(x, y) dx

are the marginal probability density functions, which means that p_1(x)dx is the probability that the variate x lies between x and x + dx independent of y, and p_2(y)dy is the probability that the variate y lies between y and y + dy independent of x.
Also,

p(x | y) = p(x, y) / p_2(y),    p_2(y) > 0
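As a final sketch, assume the joint density p(x, y) = 4xy on the unit square (an arbitrary choice for illustration); the marginal and conditional densities then follow by numerical integration:

    # Assumed joint density on the unit square: p(x, y) = 4xy,
    # which integrates to 1 over 0 <= x <= 1, 0 <= y <= 1.
    def p(x, y):
        return 4.0 * x * y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

    # Marginal density p_1(x): integrate p(x, y) over y (midpoint rule).
    def p1(x, n=10000):
        dy = 1.0 / n
        return dy * sum(p(x, (k + 0.5) * dy) for k in range(n))

    # Marginal density p_2(y): integrate p(x, y) over x (midpoint rule).
    def p2(y, n=10000):
        dx = 1.0 / n
        return dx * sum(p((k + 0.5) * dx, y) for k in range(n))

    # Conditional density p(x | y) = p(x, y) / p_2(y), with p_2(y) > 0.
    def cond_x_given_y(x, y):
        return p(x, y) / p2(y)

    print(p1(0.5))                    # approx 2 * 0.5 = 1.0
    print(cond_x_given_y(0.5, 0.25))  # approx 1.0 as well, since p(x, y) factors here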