
Appendix B: Technical Reference
Regression Formulas
This section describes how the statistical regressions are calculated.
Least-Squares Algorithm
Most of the regressions use non-linear recursive least-squares techniques to optimize
the following cost function, which is the sum of the squares of the residual errors:

J = \sum_{i=1}^{N} [\text{residualExpression}]^2

where:
residualExpression is in terms of x_i and y_i
x_i is the independent variable list
y_i is the dependent variable list
N is the dimension of the lists
This technique attempts to recursively estimate the constants in the model expression to
make J as small as possible.
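As a rough illustration of this idea (not the calculator's own routine, which is not documented here), the following Python sketch computes J for a candidate set of constants and refines them step by step with a simple numerical gradient method. The names cost_J, refine_constants, and the caller-supplied residual function are hypothetical.

# Sketch only: illustrates iteratively refining the constants to shrink J;
# the TI-89's actual recursive algorithm may differ.
import numpy as np

def cost_J(constants, residual, x, y):
    # J = sum over i of residualExpression(constants, x_i, y_i)^2
    return np.sum(residual(constants, x, y) ** 2)

def refine_constants(constants, residual, x, y, step=1e-3, iters=5000, h=1e-6):
    # Crude finite-difference gradient descent on J; "residual" is a
    # caller-supplied function returning the residual at each data point.
    c = np.asarray(constants, dtype=float)
    for _ in range(iters):
        grad = np.empty_like(c)
        for k in range(c.size):
            dc = np.zeros_like(c)
            dc[k] = h
            grad[k] = (cost_J(c + dc, residual, x, y) -
                       cost_J(c - dc, residual, x, y)) / (2 * h)
        c -= step * grad
    return c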
For example, y = a·sin(bx + c) + d is the model equation for SinReg. So its residual expression is:

a \sin(bx_i + c) + d - y_i

For SinReg, therefore, the least-squares algorithm finds the constants a, b, c, and d that minimize the function:

J = \sum_{i=1}^{N} [a \sin(bx_i + c) + d - y_i]^2
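A minimal sketch of minimizing this SinReg cost function, assuming SciPy's general-purpose solver scipy.optimize.least_squares as a stand-in for the calculator's own iteration; the sample data and starting guesses below are made up.

# Minimal sketch, not the TI-89 firmware: minimize J for y = a*sin(b*x + c) + d.
import numpy as np
from scipy.optimize import least_squares

def sinreg_residuals(constants, x, y):
    a, b, c, d = constants
    return a * np.sin(b * x + c) + d - y   # residualExpression for SinReg

x = np.linspace(0, 12, 25)
y = 3.0 * np.sin(0.5 * x + 1.0) + 2.0      # synthetic data following the model

fit = least_squares(sinreg_residuals, x0=[2.0, 0.6, 0.5, 1.5], args=(x, y))
a, b, c, d = fit.x                          # estimated constants
J = np.sum(fit.fun ** 2)                    # minimized cost; near zero here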
Regressions

CubicReg
Uses the least-squares algorithm to fit the third-order polynomial:
y = ax^3 + bx^2 + cx + d
For four data points, the equation is a polynomial fit; for five or more, it is a polynomial regression. At least four data points are required.

ExpReg
Uses the least-squares algorithm and transformed values x and ln(y) to fit the model equation (see the sketch after this table):
y = a·b^x

LinReg
Uses the least-squares algorithm to fit the model equation:
y = ax + b
where a is the slope and b is the y-intercept.
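As referenced in the ExpReg row, here is a short sketch of the transformed-values idea, assuming an ordinary linear least-squares fit of (x, ln(y)) via numpy.polyfit; the data are made up, and the back-transform to y = a·b^x follows from ln(y) = ln(a) + x·ln(b).

# Sketch, not the TI firmware: fit ln(y) = ln(a) + x*ln(b) linearly,
# then convert back to the exponential model y = a*b^x.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.2, 5.0, 8.1, 12.8])        # made-up, roughly 2 * 1.6^x

slope, intercept = np.polyfit(x, np.log(y), 1)  # linear fit in (x, ln y)
a = np.exp(intercept)                           # a comes from the intercept
b = np.exp(slope)                               # b comes from the slope
print(f"y = {a:.3f} * {b:.3f}^x")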