Springer, 2007. 218 pages.
Introductory Probability is a pleasure to read and provides a fine answer to the question: How do you construct Brownian motion from scratch, given that you are a competent analyst?
There are at least two ways to develop probability theory. The more familiar path is to treat it as its own discipline and work from intuitive examples such as coin flips and conundrums such as the Monty Hall problem. An alternative is to first develop measure theory and analysis, and then add the probabilistic interpretation. Bhattacharya and Waymire take the second path. To illustrate the authors' frame of reference, consider the two definitions they give of conditional expectation. The first presents it as an orthogonal projection in L2. The authors rely on the reader's familiarity with Hilbert space operators, and at a glance the connection to probability may not be apparent. Subsequently, there is a discussion of Bayes's rule and other relevant probabilistic concepts that leads to a definition of conditional expectation as an adjustment of random outcomes from a finer to a coarser information set.
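For readers who want a concrete anchor, here is a minimal sketch of the projection characterization in standard notation (not quoted from the book): for a random variable $X \in L^2(\Omega, \mathcal{F}, P)$ and a sub-$\sigma$-field $\mathcal{G} \subseteq \mathcal{F}$, the conditional expectation $E[X \mid \mathcal{G}]$ is the orthogonal projection of $X$ onto the closed subspace $L^2(\Omega, \mathcal{G}, P)$, i.e.
\[
E[X \mid \mathcal{G}] \;=\; \operatorname*{arg\,min}_{Y \in L^2(\Omega,\mathcal{G},P)} E\bigl[(X - Y)^2\bigr],
\]
equivalently the unique (up to null sets) $\mathcal{G}$-measurable $Y$ satisfying $\int_A Y \, dP = \int_A X \, dP$ for all $A \in \mathcal{G}$. The second relation is what survives when one passes from $L^2$ to $L^1$, and it is the formal version of the "finer to coarser information set" adjustment described above.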