Abstract

Many applications require the solution of a sequence of modified least squares problems in which either a new observation is added (updating) or an old observation is deleted (downdating). Stable downdating algorithms can be constructed when the complete QR factorization of the data matrix is available. Algorithms that downdate only R, without storing Q, require fewer operations, but they are less accurate and may not recover accuracy after an ill-conditioned problem has been encountered. The authors describe a new algorithm for accurate downdating of least squares solutions and compare it with existing algorithms. Numerical results are also presented for the sliding window method, in which updatings and downdatings are performed repeatedly.
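
For readers unfamiliar with Q-less downdating, the sketch below shows the classical hyperbolic-rotation scheme that modifies R alone when an observation row z is deleted. It illustrates the class of existing algorithms the abstract contrasts with, not the new algorithm proposed in the paper; the function name downdate_r and the NumPy setting are choices made only for this illustration.

```python
import numpy as np

def downdate_r(R, z):
    """Remove observation row z from a least squares problem with triangular
    factor R, returning R1 with R1^T R1 = R^T R - z z^T (up to rounding).
    Q-less downdating by hyperbolic rotations; shown only to illustrate the
    kind of algorithm that works on R without storing Q."""
    R = np.array(R, dtype=float, copy=True)
    z = np.array(z, dtype=float, copy=True)
    n = R.shape[0]
    for k in range(n):
        t = z[k] / R[k, k]
        if abs(t) >= 1.0:                  # downdated matrix would be indefinite
            raise ValueError("cannot downdate: R^T R - z z^T not positive definite")
        c = 1.0 / np.sqrt(1.0 - t * t)     # hyperbolic rotation with c^2 - s^2 = 1
        s = t * c
        rk = R[k, k:].copy()
        zk = z[k:].copy()
        R[k, k:] = c * rk - s * zk         # rotate row k of R against z
        z[k:] = -s * rk + c * zk           # annihilates z[k]
    return R

# Example: delete the first row of A from its triangular factor.
A = np.arange(20.0).reshape(5, 4) + np.eye(5, 4)
R = np.linalg.qr(A, mode="r")
R1 = downdate_r(R, A[0])
# np.allclose(R1.T @ R1, A[1:].T @ A[1:]) should hold.
```

Because only R is touched, each such downdate costs O(n^2) operations; this is the efficiency advantage that the abstract weighs against the loss of accuracy when Q is not stored.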

MSC codes

  1. 65F05
  2. 65C20
  3. 15A04
  4. 65M12

Keywords

  1. downdating
  2. iterative refinement
  3. least squares
  4. seminormal equations

Published In

SIAM Journal on Matrix Analysis and Applications
Pages: 549 - 568
ISSN (online): 1095-7162

History

Submitted: 7 April 1992
Accepted: 10 September 1992
Published online: 31 July 2006
