tions of microscopic entities (positions and moments of individual molecules) were replaced with their statistical averages. These averages, calculated under certain reasonable assumptions, were shown to represent relevant macroscopic entities such as temperature and pressure. A new field of physics, statistical mechanics, was an outcome of this research.
Statistical methods, developed originally for studying motions of gas molecules in a closed space, have found utility in other areas as well. In engineering, they have played a major role in the design of large-scale telephone networks, in dealing with problems of engineering reliability, and in numerous other areas. In business, they have been essential for dealing with problems of marketing, insurance, investment, and the like. In general, they have been found applicable to problems that involve large-scale systems whose components behave in a highly random way. The larger the system and the higher the randomness, the better these methods perform.
When statistical mechanics was accepted, by and large, by the scientific community as a legitimate area of science at the beginning of the 20th century, the negative attitude toward uncertainty was revised for the first time. Uncertainty became recognized as useful, or even essential, in certain scientific inquiries. However, it was taken for granted that uncertainty, whenever unavoidable in science, can adequately be dealt with by probability theory. It took more than half a century to recognize that the concept of uncertainty is too broad to be captured by probability theory alone, and to begin to study its various other (nonprobabilistic) manifestations.
Analytic methods based upon the calculus, which had dominated science prior to the emergence of statistical mechanics, are applicable only to problems that involve systems with a very small number of components that are related to each other in a predictable way. The applicability of statistical methods based upon probability theory is exactly the opposite: they require systems with a very large number of components and a very high degree of randomness. These two classes of methods are thus complementary: where methods in one class excel, methods in the other fail completely. Despite this complementarity, the two classes of methods can deal only with problems clustered around the two extremes of the complexity and randomness scales.
In his classic paper “Science and Complexity” [1948], Warren Weaver refers to them as problems of organized simplicity and disorganized complexity, respectively. He argues that these classes of problems cover only a tiny fraction of all conceivable problems. Most problems are located somewhere between the two extremes of complexity and randomness, as illustrated by the shaded area in Figure 1.1. Weaver calls them problems of organized complexity, for reasons that are well described in the following quote from his paper:
The new method of dealing with disorganized complexity, so powerful an
advance over the earlier two-variable methods, leaves a great field untouched.
One is tempted to oversimplify, and say that scientific methodology went from
one extreme to the other—from two variables to an astronomical number—and