
For a finite interval [a, b], the general least squares problem can now be stated. Given f ∈ C[a, b], does there exist a polynomial r_n(x) of degree ≤ n that minimizes

$$\int_a^b w(x)\,[f(x) - r(x)]^2\,dx \qquad (4.3.10)$$

among all polynomials r(x) of degree ≤ n?
The function w(x) allows different degrees of importance to be given to the error at different points in the interval [a, b]. This will prove useful in developing near minimax approximations.
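For intuition, the weighted objective (4.3.10) can be evaluated numerically for any candidate polynomial. A minimal sketch (plain Python; the function names, the choice w(x) = 1 on [0, 1], and the example f(x) = e^x are mine, for illustration only):

```python
import math

def simpson(g, a, b, m=1000):
    """Composite Simpson's rule with m (even) subintervals."""
    h = (b - a) / m
    s = g(a) + g(b)
    s += 4 * sum(g(a + (2 * k - 1) * h) for k in range(1, m // 2 + 1))
    s += 2 * sum(g(a + 2 * k * h) for k in range(1, m // 2))
    return s * h / 3

def lsq_error(f, coeffs, w=lambda x: 1.0, a=0.0, b=1.0):
    """The integral (4.3.10) with r(x) = sum_j coeffs[j] * x**j."""
    r = lambda x: sum(c * x**j for j, c in enumerate(coeffs))
    return simpson(lambda x: w(x) * (f(x) - r(x))**2, a, b)

# f(x) = e^x on [0, 1], comparing two straight-line approximations
print(lsq_error(math.exp, [1.0, 1.0]))        # Taylor line 1 + x
print(lsq_error(math.exp, [0.8731, 1.6903]))  # least squares line (approx.)
```

The least squares line produces a noticeably smaller value of (4.3.10) than the Taylor line 1 + x, which is exactly the minimization the text now sets up.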
To compute (4.3.10) for an arbitrary polynomial r(x) = a_0 + a_1 x + ... + a_n x^n of degree ≤ n, define

$$F(a_0, \dots, a_n) = \int_a^b w(x)\Bigl[f(x) - \sum_{j=0}^{n} a_j x^j\Bigr]^2\,dx \qquad (4.3.11)$$
We want to minimize F as the coefficients {a_i} range over all real numbers. A necessary condition for a point (a_0, ..., a_n) to be a minimizing point is

$$\frac{\partial F}{\partial a_i} = 0, \qquad i = 0, 1, \dots, n \qquad (4.3.12)$$
By differentiating through the integral in (4.3.11) and using (4.3.12), we obtain the linear system

$$\sum_{j=0}^{n} a_j \int_a^b w(x)\,x^{i+j}\,dx = \int_a^b w(x)\,f(x)\,x^i\,dx, \qquad i = 0, 1, \dots, n \qquad (4.3.13)$$
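The differentiation step can be made explicit. Writing r(x) as the sum of the a_j x^j, so that F in (4.3.11) is the integral of w(x)[f(x) − r(x)]^2, carrying ∂/∂a_i under the integral sign gives

```latex
\frac{\partial F}{\partial a_i}
  = -2 \int_a^b w(x) \Bigl[ f(x) - \sum_{j=0}^{n} a_j x^j \Bigr] x^i \, dx = 0,
\qquad i = 0, 1, \dots, n,
```

and separating the terms in the a_j from the term in f(x) yields the linear system (4.3.13).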
To see why this solution of the least squares problem is unsatisfactory, consider the special case w(x) = 1, [a, b] = [0, 1]. Then the linear system becomes

$$\sum_{j=0}^{n} \frac{a_j}{i+j+1} = \int_0^1 f(x)\,x^i\,dx, \qquad i = 0, 1, \dots, n \qquad (4.3.14)$$
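As a concrete check on (4.3.14), the sketch below (plain Python with exact rational arithmetic; the helper names are my own, not from the text) assembles and solves the system for f(x) = x^2 with n = 2, where the right-hand side is ∫₀¹ x²·xⁱ dx = 1/(i + 3). Since f is itself a polynomial of degree ≤ n, the least squares solution should reproduce it exactly.

```python
from fractions import Fraction

def hilbert(n):
    """Coefficient matrix of (4.3.14): entry (i, j) is 1/(i + j + 1), order n + 1."""
    return [[Fraction(1, i + j + 1) for j in range(n + 1)] for i in range(n + 1)]

def solve(A, b):
    """Gauss-Jordan elimination in exact rational arithmetic."""
    m = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(m):
        piv = next(r for r in range(col, m) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        for r in range(m):
            if r != col and M[r][col] != 0:
                factor = M[r][col] / M[col][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [M[i][m] / M[i][i] for i in range(m)]

# f(x) = x^2, n = 2: right-hand side entries are 1/(i + 3)
n = 2
b = [Fraction(1, i + 3) for i in range(n + 1)]
coeffs = solve(hilbert(n), b)
print(coeffs)  # → [Fraction(0, 1), Fraction(0, 1), Fraction(1, 1)], i.e. r(x) = x^2
```

In exact arithmetic the system behaves perfectly; the trouble appears only when its entries are rounded to machine precision.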
The matrix of coefficients is the Hilbert matrix of order n + 1, which was introduced in (1.6.9). The solution of the linear system (4.3.14) is extremely sensitive to small changes in the coefficients or right-hand constants. Thus this is not a good way to approach the least squares problem. In single precision arithmetic on an IBM 3033 computer, the cases n ≥ 4 will be completely unsatisfactory.
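The ill-conditioning is easy to quantify. The sketch below (plain Python; the helper names are my own) computes the condition number ‖H‖∞ · ‖H⁻¹‖∞ of the Hilbert matrix exactly, using rational arithmetic so that no rounding error contaminates the measurement.

```python
from fractions import Fraction

def hilbert(order):
    """Hilbert matrix of the given order: entry (i, j) is 1/(i + j + 1)."""
    return [[Fraction(1, i + j + 1) for j in range(order)] for i in range(order)]

def inverse(A):
    """Gauss-Jordan inversion in exact rational arithmetic."""
    m = len(A)
    M = [row[:] + [Fraction(int(i == j)) for j in range(m)]
         for i, row in enumerate(A)]
    for col in range(m):
        piv = next(r for r in range(col, m) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(m):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[m:] for row in M]

def cond_inf(A):
    """Condition number in the infinity (max row sum) norm, computed exactly."""
    norm = lambda B: max(sum(abs(x) for x in row) for row in B)
    return norm(A) * norm(inverse(A))

for order in (3, 5, 8, 10):
    print(order, float(cond_inf(hilbert(order))))
```

The growth is dramatic, roughly a constant factor per unit increase in the order, so a condition number exceeding the reciprocal of single precision rounding error (about seven decimal digits) is reached already at the orders named above.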
4.4 Orthogonal Polynomials
As is evident from graphs of x^n on [0, 1], n ≥ 0, these monomials are very nearly linearly dependent, looking much alike, and this results in the instability of the linear system (4.3.14). To avoid this problem, we consider an alternative basis for