
156 Fractals and Multifractals in Ecology and Aquatic Science
5.5.2 Zipf's Law, Information, and Entropy
Information theory (Shannon 1948, 1951; Shannon and Weaver 1949), although originally illustrated using statistically significant samples of human language, generally provides quantitative tools to assess and compare communication systems across species. Specifically, this theory has been applied to a wide range of animal communicative processes or sequential behaviors (for example, MacKay 1972; Slater 1973; Bradbury and Vehrencamp 1998). These include aggressive displays of hermit crabs (Hazlett and Bossert 1965), aggressive communication in shrimp (Dingle 1969), intermale grasshopper communication (Steinberg and Conant 1974), dragonfly larvae communication (Rowe and Harvey 1985), social communication of macaques (Altmann 1965), the waggle dance of honeybees (Haldane and Spurway 1954), chemical paths of fire ants (Wilson 1962), the structure of songs in cardinals and wood pewees (Chatfield and Lemon 1970), vocal recognition in Mexican free-tailed bats (Beecher 1989), and bottlenose dolphin whistles (McCowan et al. 1999). In contrast,
only a few investigations have assessed animal behavior using Zipf's law (Hailman et al. 1985, 1987; Hailman and Ficken 1986; Ficken et al. 1994; Hailman 1994). Zipf's law and Shannon entropy* are conceptually and mathematically related but nevertheless differ subtly. Zipf's law measures the potential capacity for information transfer at the repertoire level by examining the "optimal" amount of diversity and redundancy necessary for communication transfer across a "noisy" channel (that is, all complex audio signals will require some redundancy). In comparison, Shannon entropies were originally developed to measure channel capacity, and the first-order entropy differs from Zipf's statistic in that Zipf's law does not specifically recognize language as a noisy channel.
Both the similarity and the difference between Zipf's law and Shannon entropy prompted Mandelbrot (1953) to analyze how the value of the Zipf exponent a relates to the Shannon entropy† H0 (Equation 5.20; see also Box 5.1) for an information source following a Zipf distribution.
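Mandelbrot's question can be explored numerically. The sketch below (an illustration of ours, not the book's derivation; the function name is hypothetical) computes the first-order entropy H0 of a Zipf source with p(r) proportional to r**(-a), truncated at a finite maximum rank N:

```python
# Sketch: first-order entropy H0 of a truncated Zipf source,
# p(r) = r**(-a) / Z for ranks r = 1..N. Names are hypothetical.
from math import log2

def zipf_entropy(a, n_max):
    """H0 of the normalized, rank-truncated Zipf distribution."""
    weights = [r ** (-a) for r in range(1, n_max + 1)]
    z = sum(weights)
    return -sum((w / z) * log2(w / z) for w in weights)

# For a = 1, H0 keeps growing as the maximum rank N increases,
# which is why a finite repertoire size matters in practice.
for n in (10, 100, 1000):
    print(n, round(zipf_entropy(1.0, n), 3))
```

With a = 0 the distribution is uniform and H0 reduces to log2(N), the maximum-entropy case.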
The main problem here is that the maximum rank is intrinsically controlled by the length of the data set (that is, the vocabulary or repertoire size, or the number of species). In theory, the maximum rank can grow to infinity. In practice, however, the maximum rank is limited to a finite value
* Here, Shannon entropy specifically refers to the first-order Shannon entropy; Shannon higher-order entropies provide a more complex examination of communicative repertoires and are discussed and illustrated in Section 5.5.6.
† Note that entropy is defined here as a measure of the informational degree of organization and is not directly related to the thermodynamic property used in physics; see also Box 5.1.
Box 5.1 Thermodynamic Entropy
In scientific fields such as information theory, mathematics, and physics, entropy is generally considered a measure of the disorder of a system. More specifically, in thermodynamics (the branch of physics dealing with the conversion of heat energy into different forms of energy, for example, mechanical or chemical), entropy, S, is a measure of the amount of energy in a system that cannot be used to do work. Entropy can also be seen as a measure of the uniformity of the distribution of energy. Central to the concept of entropy is the second law of thermodynamics, which states that "the entropy of an isolated system which is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium."
Ice melting, illustrated in Figure 5.B1.1, is the archetypal example of entropy increase through time. First, consider the glass containing ice blocks as our system. As time elapses, heat energy from the surrounding room is continuously transferred to the system. The ice then continuously melts until it reaches the liquid state, and the liquid keeps receiving heat energy until it reaches thermal equilibrium with the room. Through this process, energy becomes more dispersed and spread out in the system at equilibrium than when the glass contained only ice.