Addison-Wesley Publishing Company
September 1989
730 pages
Probability & Statistics was written for a one- or two-semester probability and statistics course offered primarily at four-year institutions and taken mostly by sophomore- and junior-level students majoring in mathematics or statistics. Calculus is a prerequisite, and familiarity with the concepts and elementary properties of vectors and matrices is a plus. The revision of this well-respected text presents a balanced approach to classical and Bayesian methods and now includes a new chapter on simulation (including Markov chain Monte Carlo and the bootstrap), expanded coverage of residual analysis in linear models, and more examples using real data.
Table of Contents
1. Introduction to Probability.
The History of Probability.
Interpretations of Probability.
Experiments and Events.
Set Theory.
The Definition of Probability.
Finite Sample Spaces.
Counting Methods.
Combinatorial Methods.
Multinomial Coefficients.
The Probability of a Union of Events.
Statistical Swindles.
Supplementary Exercises.
2. Conditional Probability.
The Definition of Conditional Probability.
Independent Events.
Bayes' Theorem.
Markov Chains.
The Gambler's Ruin Problem.
Supplementary Exercises.
3. Random Variables and Distributions.
Random Variables and Discrete Distributions.
Continuous Distributions.
The Distribution Function.
Bivariate Distributions.
Marginal Distributions.
Conditional Distributions.
Multivariate Distributions.
Functions of a Random Variable.
Functions of Two or More Random Variables.
Supplementary Exercises.
4. Expectation.
The Expectation of a Random Variable.
Properties of Expectations.
Variance.
Moments.
The Mean and the Median.
Covariance and Correlation.
Conditional Expectation.
The Sample Mean.
Utility.
Supplementary Exercises.
5. Special Distributions.
Introduction.
The Bernoulli and Binomial Distributions.
The Hypergeometric Distribution.
The Poisson Distribution.
The Negative Binomial Distribution.
The Normal Distribution.
The Central Limit Theorem.
The Correction for Continuity.
The Gamma Distribution.
The Beta Distribution.
The Multinomial Distribution.
The Bivariate Normal Distribution.
Supplementary Exercises.
6. Estimation.
Statistical Inference.
Prior and Posterior Distributions.
Conjugate Prior Distributions.
Bayes Estimators.
Maximum Likelihood Estimators.
Properties of Maximum Likelihood Estimators.
Sufficient Statistics.
Jointly Sufficient Statistics.
Improving an Estimator.
Supplementary Exercises.
7. Sampling Distributions of Estimators.
The Sampling Distribution of a Statistic.
The Chi-Square Distribution.
Joint Distribution of the Sample Mean and Sample Variance.
The t Distribution.
Confidence Intervals.
Bayesian Analysis of Samples from a Normal Distribution.
Unbiased Estimators.
Fisher Information.
Supplementary Exercises.
8. Testing Hypotheses.
Problems of Testing Hypotheses.
Testing Simple Hypotheses.
Uniformly Most Powerful Tests.
Two-Sided Alternatives.
The t Test.
Comparing the Means of Two Normal Distributions.
The F Distribution.
Bayes Test Procedures.
Foundational Issues.
Supplementary Exercises.
9. Categorical Data and Nonparametric Methods.
Tests of Goodness-of-Fit.
Goodness-of-Fit for Composite Hypotheses.
Contingency Tables.
Tests of Homogeneity.
Simpson's Paradox.
Kolmogorov-Smirnov Test.
Robust Estimation.
Sign and Rank Tests.
Supplementary Exercises.
10. Linear Statistical Models.
The Method of Least Squares.
Regression.
Statistical Inference in Simple Linear Regression.
Bayesian Inference in Simple Linear Regression.
The General Linear Model and Multiple Regression.
Analysis of Variance.
The Two-Way Layout.
The Two-Way Layout with Replications.
Supplementary Exercises.
11. Simulation.
Why is Simulation Useful?
Simulating Specific Distributions.
Importance Sampling.
Markov Chain Monte Carlo.
The Bootstrap.
Supplementary Exercises.