HP 15c User Manual
Page 126
Section 4: Using Matrix Operations
An orthogonal change of variables x = Qz, which is equivalent to rotating the coordinate axes, changes the equation of a family of quadratic surfaces (xᵀAx = constant) into the form

    zᵀ(QᵀAQ)z = Σⱼ λⱼzⱼ² = constant.
With the equation in this form, you can recognize what kind of surfaces these are (ellipsoids,
hyperboloids, paraboloids, cones, cylinders, planes) because the surface's semi-axes lie along
the new coordinate axes.
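As a numerical illustration (not part of the HP-15C program itself), the change of variables can be checked with NumPy; the 2 × 2 matrix A and the point z below are made-up examples, and NumPy's `eigh` supplies an orthogonal Q:

```python
import numpy as np

# Hypothetical 2x2 symmetric matrix defining the quadric x^T A x = constant.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns the eigenvalues lam and an orthogonal Q whose columns are
# eigenvectors, so Q^T A Q = diag(lam).
lam, Q = np.linalg.eigh(A)

# Under the change of variables x = Q z, the quadratic form becomes
# sum_j lam_j * z_j**2.
z = np.array([0.3, -0.7])
x = Q @ z
print(x @ A @ x, np.sum(lam * z * z))  # the two values agree
```

Because the semi-axes of the surface lie along the z-axes, the eigenvalues λⱼ immediately tell the surface type.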
The program below starts with a given matrix A that is assumed to be symmetric (if it isn't, it is replaced by (A + Aᵀ)/2, which is symmetric).
Given a symmetric matrix A, the program constructs a skew-symmetric matrix B (that is, one for which B = −Bᵀ) using the formula

    bᵢⱼ = 0                                  if i = j or aᵢⱼ = 0
    bᵢⱼ = tan(¼ tan⁻¹(2aᵢⱼ/(aᵢᵢ − aⱼⱼ)))    if i ≠ j and aᵢⱼ ≠ 0.
Then Q = 2(I + B)⁻¹ − I must be an orthogonal matrix whose columns approximate the eigenvectors of A; the smaller the elements of B, the better the approximation. Therefore QᵀAQ must be more nearly diagonal than A but with the same eigenvalues. If QᵀAQ is not close enough to diagonal, it is used in place of A above for a repetition of the process.
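One pass of this construction can be sketched in Python with NumPy, assuming the bᵢⱼ formula above; the helper name `one_sweep` and the test matrix are invented for illustration, and B is filled one triangle at a time so that it stays exactly skew-symmetric (when aᵢᵢ = aⱼⱼ the arctangent is taken as ±π/2):

```python
import numpy as np

def one_sweep(A):
    """Build the skew-symmetric B from a symmetric A, form the orthogonal
    Q = 2(I + B)^-1 - I, and return Q and the more nearly diagonal Q^T A Q."""
    n = A.shape[0]
    B = np.zeros_like(A)
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] != 0.0:
                d = A[i, i] - A[j, j]
                # b_ij = tan((1/4) * arctan(2 a_ij / (a_ii - a_jj)))
                ang = (np.arctan(2.0 * A[i, j] / d) if d != 0.0
                       else np.sign(A[i, j]) * np.pi / 2.0)
                B[i, j] = np.tan(0.25 * ang)
                B[j, i] = -B[i, j]        # skew symmetry: B = -B^T
    I = np.eye(n)
    Q = 2.0 * np.linalg.inv(I + B) - I    # orthogonal because B is skew
    return Q, Q.T @ A @ Q

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
Q, A1 = one_sweep(A)
print(np.round(A1, 10))   # for a 2x2 matrix, one sweep already diagonalizes
```

For a 2 × 2 matrix this single rotation zeros the off-diagonal element exactly, which is why the program needs only one iteration in that case.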
In this way, successive orthogonal transformations Q₁, Q₂, Q₃, ... are applied to A to produce a sequence A₁, A₂, A₃, ..., where

    Aⱼ = (Q₁Q₂ … Qⱼ)ᵀ A Q₁Q₂ … Qⱼ

with each successive Aⱼ more nearly diagonal than the one before.
This process normally leads to skew-symmetric matrices B whose elements are all small and to Aⱼ rapidly converging to a diagonal matrix. However, if some of the eigenvalues of A are very close to each other but far from the others, convergence is slow; fortunately, this situation is rare.
The program stops after each iteration to display

    ½ ‖off-diagonal elements of Aⱼ‖ / ‖Aⱼ‖F

which measures how nearly diagonal Aⱼ is. If this measure is not negligible, you can press ¦ to calculate Aⱼ₊₁; if it is negligible, then the diagonal elements of Aⱼ approximate the eigenvalues of A. The program needs only one iteration for 1 × 1 and 2 × 2 matrices, and rarely more than six for 3 × 3 matrices. For 4 × 4 matrices the program takes slightly longer and uses all available memory; usually 6 or 7 iterations are sufficient, but if some eigenvalues are very close to each other and relatively far from the rest, then 10 to 16 iterations may be needed.
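The whole iteration, including the displayed stopping measure, can be sketched as follows. This is a NumPy illustration under the same assumptions as before, not the HP-15C keystroke program; `eigen_iterate`, the tolerance, and the 3 × 3 test matrix are invented for the example:

```python
import numpy as np

def off_measure(A):
    """(1/2) * ||off-diagonal elements of A|| / ||A||_F, as displayed."""
    off = A - np.diag(np.diag(A))
    return 0.5 * np.linalg.norm(off) / np.linalg.norm(A)

def eigen_iterate(A, tol=1e-10, max_iter=25):
    """Repeat the transformation A_j+1 = Q^T A_j Q until A_j is nearly
    diagonal; return the approximate eigenvalues and the product Q1 Q2 ... Qj."""
    A = 0.5 * (A + A.T)               # symmetrize, as the program does
    n = A.shape[0]
    I = np.eye(n)
    V = I.copy()                      # running product Q1 Q2 ... Qj
    for _ in range(max_iter):
        if off_measure(A) < tol:      # stop when the measure is negligible
            break
        B = np.zeros_like(A)
        for i in range(n):
            for j in range(i + 1, n):
                if A[i, j] != 0.0:
                    d = A[i, i] - A[j, j]
                    ang = (np.arctan(2.0 * A[i, j] / d) if d != 0.0
                           else np.sign(A[i, j]) * np.pi / 2.0)
                    B[i, j] = np.tan(0.25 * ang)
                    B[j, i] = -B[i, j]
        Q = 2.0 * np.linalg.inv(I + B) - I
        A = Q.T @ A @ Q
        V = V @ Q
    return np.diag(A), V

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
vals, V = eigen_iterate(A)
print(np.sort(vals))      # approximate eigenvalues of A
```

The diagonal of the final Aⱼ holds the eigenvalue approximations, and the accumulated product V approximates the matrix of eigenvectors.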