Both y in line 20, the function fitted with linreg, and fcurve in line 22, the function fitted with leasqr, give the same results. They are exactly superimposed in Fig. 8.1, so that only one line is visible, although both functions are plotted with the plot command in line 25.
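The same superposition can be reproduced outside Octave. The following is a hedged Python sketch (not the book's fit1.m): numpy.polyfit stands in for the closed-form regression of linreg, and scipy.optimize.curve_fit stands in for the general least-squares fit of leasqr.

```python
# Sketch: fit the same noisy straight line with two different methods
# and confirm that the fitted lines coincide (so their plots superimpose).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)

# Closed-form linear regression (analogue of linreg)
slope, intercept = np.polyfit(x, y, 1)

# General nonlinear least-squares fit of the same model (analogue of leasqr)
popt, _ = curve_fit(lambda x, m, c: m * x + c, x, y)

# Both methods yield the same fitted line
print(np.allclose([slope, intercept], popt))  # True
```

Because both methods minimize the same sum of squared residuals for the same model, they must agree; plotting both fitted lines would show only one curve, as in Fig. 8.1.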
Note the parameters returned by leasqr. They are shown as a, b, c, and d in Fig. 8.1. Each time the sample program is executed (type fit1 in the Octave terminal window), a different data set is generated and new parameters are fitted. The resulting parameters a, b, c, d differ considerably for each execution of fit1. Try it out! The erratic results for a, b, c, d are in contrast with the reasonable fluctuations of pa(1) (= Slope) and pa(2) (= Intercept), returned by linreg.
Even though the results for a and b cannot be predicted, their quotient a/b is the same as pa(1), the slope returned by linreg. Likewise, the difference c − d is the same as pa(2), the intercept returned by linreg. All this is not surprising. Exactly two parameters (slope and intercept) define a straight line, and one cannot obtain more than two parameters from a fit to such a line. But let us pretend that we do not know this, and that we have performed numerous fits with the program fit1.m, always with the same data set. When we then plot the resulting parameter a as a function of b, and the parameter c as a function of d, we find that all these data pairs lie on straight lines:
a = f(b) = b · pa(1)   (8.3)

c = f(d) = d + pa(2)   (8.4)
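This redundancy can be demonstrated directly. The following hedged Python sketch (an analogue of the situation in fit1.m, not the program itself) fits the overparameterized model y = (a/b)·x + (c − d) from several different starting guesses: the individual parameters come out differently each time, but the combinations a/b and c − d always equal the slope and intercept.

```python
# Sketch: an overparameterized straight-line model. Only the combinations
# a/b (slope) and c - d (intercept) are determined by the data; the
# individual values of a, b, c, d depend on the starting guess.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(x.size)
slope, intercept = np.polyfit(x, y, 1)   # reference values

def model(x, a, b, c, d):
    return (a / b) * x + (c - d)

for _ in range(3):
    p0 = rng.uniform(0.5, 5.0, size=4)        # a different starting guess each run
    (a, b, c, d), _ = curve_fit(model, x, y, p0=p0)
    # a, b, c, d vary from run to run, but the combinations are stable:
    print(a / b, c - d)   # always close to slope and intercept
```

Each run lands at a different point of the solution manifold a/b = slope, c − d = intercept, which is exactly why the (a, b) and (c, d) pairs line up as in (8.3) and (8.4).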
In the general case of multi-parameter fits to unknown functions, the parameters which give the same (optimal) fit will not lie on straight lines, as in (8.3) and (8.4), but will show a certain distribution. In the ideal case, this distribution is narrow, so that a variation of one parameter x cannot be compensated by the variation of another parameter y while still giving a fit of the same quality. The linear dependence of two parameters can be described with the help of Pearson's correlation coefficient [2–6]. This coefficient can take values between −1 and 1. If it is 1, the parameters are correlated, so that an increase in x can be completely compensated by an increase in y. If it is −1, an increase in x can be compensated by a decrease in y. If it is zero, the parameters are not correlated, and an increase in x cannot be compensated by a variation of y. For a and b of (8.2) and (8.3), an increase in a can be compensated by an increase in b. Their correlation coefficient is 1, the same as the correlation coefficient of c and d in (8.4). In the ideal case of significant parameters, a variation of x cannot be compensated by a variation of y, and the correlation coefficient should be near zero.
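Pearson's coefficient for the (a, b) pairs from the redundant fit can be checked numerically. A hedged Python sketch (using the same overparameterized model as above, which is only an analogue of fit1.m) collects the fitted (a, b) pairs from many starting guesses and evaluates their correlation with numpy.corrcoef:

```python
# Sketch: Pearson's correlation coefficient of the fitted pair (a, b)
# from repeated fits of the redundant model y = (a/b)*x + (c - d).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(x.size)

def model(x, a, b, c, d):
    return (a / b) * x + (c - d)

pairs = []
for _ in range(20):
    p0 = rng.uniform(0.5, 5.0, size=4)
    (a, b, c, d), _ = curve_fit(model, x, y, p0=p0)
    pairs.append((a, b))
pairs = np.array(pairs)

# All (a, b) pairs lie on the line a = b * slope, so r is (numerically) 1
r = np.corrcoef(pairs[:, 0], pairs[:, 1])[0, 1]
print(r)
```

A coefficient this close to 1 is the numerical signature of a redundant parameter pair: one of the two carries no independent information.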
In MATLAB, the function lsqcurvefit returns no statistical information such as corp or covp. For this, one would have to use the function nlinfit from the MATLAB Statistics Toolbox. The MATLAB program fit1M.m differs from the Octave program fit1.m only in lines 22 and 23.
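The relation between covp and corp can be illustrated outside Octave as well. In the hedged Python sketch below, scipy.optimize.curve_fit returns a parameter covariance matrix (the analogue of covp); normalizing it by the parameter standard deviations yields a correlation matrix (the analogue of corp).

```python
# Sketch: turn a parameter covariance matrix (covp analogue) into a
# parameter correlation matrix (corp analogue).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)

# A well-posed two-parameter fit returns a usable covariance matrix
popt, pcov = curve_fit(lambda x, m, c: m * x + c, x, y)

# Normalize: corr[i, j] = cov[i, j] / (sd[i] * sd[j])
sd = np.sqrt(np.diag(pcov))
corp = pcov / np.outer(sd, sd)

print(corp.shape)   # (2, 2); the diagonal entries are 1
```

The off-diagonal entry of corp is the Pearson correlation coefficient between slope and intercept discussed above.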