We consider the problem of sparse coding, where each sample consists of a sparse linear combination of a set of dictionary atoms, and the task is to learn both the dictionary elements and the mixing coefficients. Alternating minimization is a popular heuristic for sparse coding, in which the dictionary and the coefficients are estimated in alternating steps, each with the other held fixed. Typically, the coefficients are estimated via $\ell_1$ minimization with the dictionary fixed, and the dictionary is estimated through least squares with the coefficients fixed. In this paper, we establish local linear convergence for this variant of alternating minimization and show that the basin of attraction for the global optimum (corresponding to the true dictionary and coefficients) is $\mathcal{O}(1/s^{2})$, where $s$ is the sparsity level in each sample, provided the dictionary satisfies the restricted isometry property. Combined with recent results on approximate dictionary estimation, this yields provable guarantees for exact recovery of both the dictionary elements and the coefficients, when the dictionary elements are incoherent.
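The alternating scheme described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact algorithm: the $\ell_1$ step is solved here with plain ISTA (soft-thresholded gradient descent), the dictionary step is an ordinary least-squares fit, and all parameter choices (regularization weight, iteration counts) are illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam, n_iters=200):
    """Solve min_x 0.5*||y - A x||^2 + lam*||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = x - A.T @ (A @ x - y) / L      # gradient step on the quadratic term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

def alternating_minimization(Y, A0, lam, n_rounds=10):
    """Alternate l1 coefficient estimation and least-squares dictionary update.

    Y  : d x n matrix of samples (one sample per column)
    A0 : d x r initial dictionary (e.g., from an approximate estimation method)
    """
    A = A0.copy()
    for _ in range(n_rounds):
        # Sparse coding step: lasso for each sample, dictionary held fixed.
        X = np.column_stack([ista(A, y, lam) for y in Y.T])
        # Dictionary step: least squares A = Y X^+, coefficients held fixed.
        A = Y @ np.linalg.pinv(X)
        A /= np.maximum(np.linalg.norm(A, axis=0), 1e-12)  # renormalize atoms
    return A, X
```

Consistent with the basin-of-attraction result, the sketch assumes the initial dictionary `A0` is already close to the true one; started far from it, alternating minimization may stall at a spurious solution.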


  1. dictionary learning
  2. sparse coding
  3. alternating minimization
  4. RIP
  5. incoherence
  6. lasso

MSC codes

  1. 90C26
  2. 68T10

Published In

SIAM Journal on Optimization
Pages: 2775 - 2799
ISSN (online): 1095-7189


Submitted: 29 July 2014
Accepted: 12 September 2016
Published online: 8 December 2016



Funding Information

NSF http://dx.doi.org/10.13039/100000001 : CCF-1254106
Microsoft Faculty Fellowship
Google Faculty Award
ONR : N00014-14-1-0665
Air Force Office of Scientific Research https://doi.org/10.13039/100000181 : FA9550-15-1-0221
