
(mortality) or developing the disease (morbidity), are all calculated. It is assumed that an individual not completing the follow-up period is exposed for half this period, thus enabling the data for those ‘leaving’ and those ‘staying’ to be combined into an appropriate denominator for the estimation of the percentage dying from or developing the disease. The advantage of this approach is that all patients, not only those who have been involved for an extended period, can be included in the estimation process. See also actuarial estimator. [SMR Chapter 13.]
Lifting scheme: A method for constructing new wavelets with prescribed properties for use in wavelet analysis. [SIAM Journal of Mathematical Analysis, 1998, 29, 511–546.]
Likelihood: The probability of a set of observations given the value of some parameter or set of parameters. For example, the likelihood of a random sample of n observations, $x_1, x_2, \ldots, x_n$, with probability distribution $f(x; \theta)$ is given by

$$L = \prod_{i=1}^{n} f(x_i; \theta)$$
This function is the basis of maximum likelihood estimation. In many applications the likelihood involves several parameters, only a few of which are of interest to the investigator. The remaining nuisance parameters are necessary in order that the model make sense physically, but their values are largely irrelevant to the investigation and the conclusions to be drawn. Since there are difficulties in dealing with likelihoods that depend on a large number of incidental parameters (for example, maximizing the likelihood will be more difficult), some form of modified likelihood is sought which contains as few of the uninteresting parameters as possible. A number of possibilities are available. For example, the marginal likelihood eliminates the nuisance parameters by integrating them out of the likelihood. The profile likelihood with respect to the parameters of interest is the original likelihood, partially maximized with respect to the nuisance parameters. See also quasi-likelihood, pseudo-likelihood, partial likelihood, hierarchical likelihood, conditional likelihood, law of likelihood and likelihood ratio. [KA2 Chapter 17.]
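As a minimal sketch of the definition above (not from the dictionary): assuming an i.i.d. normal model with known standard deviation, the log of the likelihood product can be computed directly, and the sample mean is the value of the mean parameter that maximizes it. The sample values here are hypothetical.

```python
import math

def normal_log_likelihood(data, mu, sigma=1.0):
    """Log-likelihood of an i.i.d. normal sample: sum of log f(x_i; mu, sigma)."""
    return sum(
        -0.5 * math.log(2 * math.pi * sigma ** 2)
        - (x - mu) ** 2 / (2 * sigma ** 2)
        for x in data
    )

# Hypothetical sample; with sigma fixed, the sample mean is the
# maximum likelihood estimate of mu for the normal model.
data = [4.8, 5.1, 4.9, 5.3, 5.0]
mu_hat = sum(data) / len(data)

print(normal_log_likelihood(data, mu_hat))   # largest attainable over mu
print(normal_log_likelihood(data, 0.0))      # far smaller for a poor choice of mu
```

Working on the log scale turns the product in the definition into a sum, which is numerically far more stable for samples of any realistic size.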
Likelihood distance test: A procedure for the detection of outliers that uses the difference between the log-likelihood of the complete data set and the log-likelihood when a particular observation is removed. If the difference is large then the observation involved is considered an outlier. [Statistical Inference Based on the Likelihood, 1996, A. Azzalini, CRC/Chapman and Hall, London.]
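The idea can be sketched as follows (an illustrative assumption, not the dictionary's own example): under a normal model with known standard deviation, re-estimate the mean with one observation deleted and measure twice the resulting drop in the full-data log-likelihood; a planted outlier produces a much larger distance than the other observations.

```python
import math

def normal_loglik(data, mu, sigma=1.0):
    # Log-likelihood of an i.i.d. normal sample with known sigma.
    return sum(
        -0.5 * math.log(2 * math.pi * sigma ** 2)
        - (x - mu) ** 2 / (2 * sigma ** 2)
        for x in data
    )

def likelihood_distance(data, i):
    """Twice the drop in full-data log-likelihood when mu is re-estimated without x_i."""
    mu_full = sum(data) / len(data)            # MLE from all observations
    reduced = data[:i] + data[i + 1:]
    mu_drop = sum(reduced) / len(reduced)      # MLE with observation i removed
    return 2 * (normal_loglik(data, mu_full) - normal_loglik(data, mu_drop))

# Hypothetical sample with a suspiciously large value at index 4
data = [5.0, 5.1, 4.9, 5.2, 12.0]
distances = [likelihood_distance(data, i) for i in range(len(data))]
print(distances)  # the distance for index 4 dominates the others
```

Because the full-data MLE maximizes the full-data log-likelihood, every distance is non-negative, and only the deletion that shifts the estimate substantially yields a large value.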
Likelihood principle: Within the framework of a statistical model, all the information which the data
provide concerning the relative merits of two hypotheses is contained in the
likelihood ratio
of these hypotheses on the data. [Likelihood, 1992, A. W. F. Edwards, Cambridge University
Press, Cambridge.]
Likelihood ratio test: The ratio of the likelihoods of the data under two hypotheses, $H_0$ and $H_1$, can be used to assess $H_0$ against $H_1$ since under $H_0$, the statistic, λ, given by

$$\lambda = -2\ln\frac{L_{H_0}}{L_{H_1}}$$

has approximately a chi-squared distribution with degrees of freedom equal to the difference in the number of parameters in the two hypotheses. See also Wilks’ theorem, G², deviance, goodness-of-fit and Bartlett’s adjustment factor. [KA2 Chapter 23.]
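A worked sketch of the statistic above, under an assumed normal model with known standard deviation (the data and hypotheses are hypothetical, not from the dictionary): testing a fixed mean against an unrestricted one, the two hypotheses differ by one free parameter, so λ is referred to a chi-squared distribution with one degree of freedom.

```python
import math

def normal_loglik(data, mu, sigma=1.0):
    # Log-likelihood of an i.i.d. normal sample with known sigma.
    return sum(
        -0.5 * math.log(2 * math.pi * sigma ** 2)
        - (x - mu) ** 2 / (2 * sigma ** 2)
        for x in data
    )

# Hypothetical data; H0: mu = 0 versus H1: mu unrestricted (MLE = sample mean)
data = [0.8, 1.1, 0.4, 1.3, 0.9]
mu_hat = sum(data) / len(data)

# lambda = -2 ln(L_H0 / L_H1) = 2 * (loglik under H1 - loglik under H0)
lam = 2 * (normal_loglik(data, mu_hat) - normal_loglik(data, 0.0))

# H1 has one more free parameter than H0, so compare lam with the
# chi-squared (1 df) critical value, 3.84 at the 5% level.
print(lam, lam > 3.84)
```

Since the unrestricted likelihood is at least as large as the restricted one, λ is always non-negative, and large values indicate that the data are poorly explained by $H_0$.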
Likert, Rensis (1903–1981): Likert was born in Cheyenne, Wyoming and studied civil engineering and sociology at the University of Michigan. In 1932 he obtained a Ph.D. at Columbia