Abstract

In this paper we prove a new complexity bound for a variant of the accelerated coordinate descent method [Yu. Nesterov, SIAM J. Optim., 22 (2012), pp. 341--362]. We show that this method often outperforms the standard fast gradient methods (FGM [Yu. Nesterov, Dokl. Akad. Nauk SSSR, 269 (1983), pp. 543--547; Math. Program. (A), 103 (2005), pp. 127--152]) on optimization problems with dense data. In many important situations, the computational expenses of the oracle and of the method itself at each iteration of our scheme are perfectly balanced: both depend linearly on the dimension of the problem. As application examples, we consider unconstrained convex quadratic minimization and problems arising in the smoothing technique [Yu. Nesterov, Math. Program. (A), 103 (2005), pp. 127--152]. On some special problem instances, the provable acceleration factor with respect to FGM can reach the square root of the number of variables. Our theoretical conclusions are confirmed by numerical experiments.
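
To make the cost balance above concrete, here is a minimal Python sketch of one standard accelerated coordinate descent scheme (the single-coordinate APPROX iteration of Fercoq and Richtárik with uniform sampling), applied to the unconstrained quadratic f(x) = (1/2) x^T A x - b^T x. It is not the exact variant analyzed in this paper, and the function name and parameters are illustrative. Each iteration reads a single row of A, so the cost of the oracle and the bookkeeping of the method itself are both O(n).

    import numpy as np

    def accel_coordinate_descent(A, b, iters=20000, seed=0):
        # Sketch: minimize f(x) = 0.5*x^T A x - b^T x for a dense symmetric A
        # with positive diagonal, via the single-coordinate APPROX iteration
        # (Fercoq & Richtarik) with uniform sampling. The coordinatewise
        # Lipschitz constants are L_i = A_ii.
        n = A.shape[0]
        L = np.diag(A)
        rng = np.random.default_rng(seed)
        x = np.zeros(n)
        z = x.copy()
        theta = 1.0 / n
        for _ in range(iters):
            y = (1.0 - theta) * x + theta * z    # O(n)
            i = rng.integers(n)                  # uniform random coordinate
            g_i = A[i] @ y - b[i]                # partial gradient: one row of A, O(n)
            step = -g_i / (n * theta * L[i])
            z[i] += step                         # coordinate step on z
            x = y                                # x_{k+1} = y_k + n*theta_k*(z_{k+1} - z_k)
            x[i] += n * theta * step
            theta = 0.5 * (np.sqrt(theta**4 + 4.0 * theta**2) - theta**2)
        return x

A quick sanity check against a direct solve, with illustrative sizes:

    rng = np.random.default_rng(1)
    M = rng.standard_normal((200, 200))
    A = M @ M.T + 200.0 * np.eye(200)            # well-conditioned SPD matrix
    b = rng.standard_normal(200)
    x_hat = accel_coordinate_descent(A, b)
    print(np.linalg.norm(x_hat - np.linalg.solve(A, b)))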

Keywords

  1. convex optimization
  2. structural optimization
  3. fast gradient methods
  4. coordinate descent methods
  5. complexity bounds

MSC codes

  1. 90C06
  2. 90C25
  3. 90C47
  4. 68Q25

References

1.
Z. Allen-Zhu, Z. Qu, P. Richtárik, and Y. Yuan, Even faster accelerated coordinate descent using non-uniform sampling, in Proceedings of the 33rd International Conference on Machine Learning, 2016, pp. 1110--1119.
2.
Y. T. Lee and A. Sidford, Efficient accelerated coordinate descent methods and faster algorithms for solving linear systems, in Proceedings of the 54th Annual IEEE Symposium on Foundations of Computer Science (FOCS), 2013, https://doi.org/10.1109/FOCS.2013.24.
3.
Yu. Nesterov, A method for unconstrained convex minimization problem with the rate of convergence $O({1\over k^2})$, Dokl. Akad. Nauk SSSR (translated as Soviet Math. Dokl.), 269 (1983), pp. 543--547.
4.
Yu. Nesterov, Smooth minimization of non-smooth functions, Math. Program. (A), 103 (2005), pp. 127--152, https://doi.org/10.1007/s10107-004-0552-5.
5.
Yu. Nesterov, Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim., 22 (2012), pp. 341--362, https://doi.org/10.1137/100802001.
6.
Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer, Boston, 2004.
7.
Yu. Nesterov, Universal gradient methods for convex optimization problems, Math. Program. (A), 152 (2015), pp. 381--404, https://doi.org/10.1007/s10107-014-0790-0.
8.
P. Richtárik and M. Takáč, Parallel coordinate descent methods for big data optimization, Math. Program., 156 (2016), pp. 433--484, https://doi.org/10.1007/s10107-015-0901-6.
9.
P. Richtárik and M. Takáč, Distributed coordinate descent method for learning with big data, J. Machine Learning Res., 17 (2016), 75.

Information & Authors

Published In

SIAM Journal on Optimization
Volume 27, Issue 1
Pages: 110--123
ISSN (online): 1095-7189

History

Submitted: 5 February 2016
Accepted: 12 September 2016
Published online: 26 January 2017

Authors

Yu. Nesterov
S. U. Stich

Funding Information

Fédération Wallonie-Bruxelles (http://dx.doi.org/10.13039/501100002910): ARC 04/09-315
Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung (http://dx.doi.org/10.13039/501100001711)
