The Perceptron algorithm, despite its simplicity, often performs well in online classification tasks. The Perceptron becomes especially effective when it is used in conjunction with kernel functions. However, a common difficulty encountered when implementing kernel-based online algorithms is the amount of memory required to store the online hypothesis, which may grow unboundedly as the algorithm progresses. Moreover, the running time of each online round grows linearly with the amount of memory used to store the hypothesis. In this paper, we present the Forgetron family of kernel-based online classification algorithms, which overcome this problem by restricting themselves to a predefined memory budget. We obtain different members of this family by modifying the kernel-based Perceptron in various ways. We also prove a unified mistake bound for all of the Forgetron algorithms. To our knowledge, this is the first online kernel-based learning paradigm which, on one hand, maintains a strict limit on the amount of memory it uses and, on the other hand, entertains a relative mistake bound. We conclude with experiments using real datasets, which underscore the merits of our approach.
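To make the memory issue concrete, here is a minimal sketch of a kernel-based Perceptron whose support set is capped at a fixed budget B by discarding the oldest stored example on overflow. This is an illustrative simplification, not the Forgetron itself: the Forgetron algorithms in the paper also scale ("shrink") the coefficients of stored examples before removal, which is what makes the mistake-bound analysis go through. The class and function names below are hypothetical.

```python
import math
from collections import deque

def gaussian_kernel(x, y, gamma=1.0):
    """RBF kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

class BudgetKernelPerceptron:
    """Kernel Perceptron that never stores more than `budget` examples.

    Illustrative only: on overflow it simply evicts the oldest support
    vector, whereas the Forgetron also shrinks coefficients before
    removal to bound the damage caused by forgetting.
    """

    def __init__(self, budget, kernel=gaussian_kernel):
        self.budget = budget
        self.kernel = kernel
        self.support = deque()  # (example, label) pairs, oldest first

    def score(self, x):
        # The hypothesis is a kernel expansion over the stored examples;
        # evaluating it costs time linear in the support-set size.
        return sum(y * self.kernel(xs, x) for xs, y in self.support)

    def predict(self, x):
        return 1 if self.score(x) >= 0 else -1

    def update(self, x, y):
        """One online round: predict, and on a mistake store (x, y),
        evicting the oldest stored example if the budget is exceeded."""
        mistake = self.predict(x) != y
        if mistake:
            self.support.append((x, y))
            if len(self.support) > self.budget:
                self.support.popleft()  # forget the oldest example
        return mistake
```

Without the eviction step, every mistake adds a support vector, so both memory and per-round prediction time grow without bound over a long run of online rounds; the budget caps both at O(B).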

MSC codes

  1. 68T05
  2. 68Q32


Keywords

  1. online classification
  2. kernel methods
  3. the Perceptron algorithm
  4. learning theory






Published In

SIAM Journal on Computing
Pages: 1342–1372
ISSN (online): 1095-7111


Submitted: 7 August 2006
Accepted: 5 June 2007
Published online: 16 January 2008



