In this paper we propose a tool for high-dimensional approximation based on trigonometric polynomials in which we allow only low-dimensional interactions of variables. In a general high-dimensional setting it is already possible to deal with special sampling sets such as sparse grids or rank-1 lattices, but this requires black-box access to the function, i.e., the ability to evaluate it at any point. Here, we focus on scattered data points and frequency index sets grouped along the dimensions. From there we propose a fast matrix-vector multiplication, the grouped Fourier transform, for high-dimensional grouped index sets. These transforms can be applied within the previously introduced method for approximating functions with low superposition dimension based on the analysis of variance (ANOVA) decomposition, where the ANOVA terms correspond one-to-one to our proposed groups. The method dynamically detects important sets of ANOVA terms during the approximation. In this paper, we consider the involved least-squares problem and add two forms of regularization: classical Tikhonov regularization, i.e., regularized least squares, and the group lasso, which promotes sparsity in the groups. Since the latter admits no explicit solution formula, we apply the fast iterative shrinkage-thresholding algorithm (FISTA) to compute the minimizer. Moreover, we discuss the possibility of incorporating smoothness information into the least-squares problem. Numerical experiments in underdetermined, overdetermined, and noisy settings indicate the applicability of our algorithms. While we consider periodic functions, the idea generalizes directly to nonperiodic functions as well.
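To illustrate the group-lasso regularization mentioned above, the following is a minimal, self-contained sketch of FISTA with group soft-thresholding applied to a least-squares problem with a group-lasso penalty. It is not the paper's implementation: the matrix, the group structure, the regularization parameter, and all function names are hypothetical, and a plain dense matrix stands in for the grouped Fourier transform.

```python
# Hedged sketch: FISTA for  min_c 0.5*||A c - y||_2^2 + lam * sum_G ||c_G||_2,
# where the groups G partition the coefficient indices. A generic dense matrix
# A stands in for the grouped Fourier transform; all names are illustrative.
import numpy as np

def group_soft_threshold(c, groups, tau):
    """Shrink each coefficient group toward zero; zero it if its norm <= tau."""
    out = c.copy()
    for g in groups:
        norm = np.linalg.norm(c[g])
        out[g] = c[g] * max(0.0, 1.0 - tau / norm) if norm > 0 else 0.0
    return out

def fista_group_lasso(A, y, groups, lam, iters=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(A.shape[1]); z = c.copy(); t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ z - y)           # gradient of the smooth data term
        c_new = group_soft_threshold(z - grad / L, groups, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = c_new + (t - 1.0) / t_new * (c_new - c)   # FISTA momentum step
        c, t = c_new, t_new
    return c

# Synthetic example: only the first of four groups is active.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
groups = [np.arange(i, i + 5) for i in range(0, 20, 5)]
c_true = np.zeros(20); c_true[:5] = rng.standard_normal(5)
y = A @ c_true
c_hat = fista_group_lasso(A, y, groups, lam=1.0)
```

The group penalty drives entire inactive groups toward zero, which is the mechanism the paper exploits to detect important sets of ANOVA terms.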


Keywords

  1. analysis of variance (ANOVA)
  2. explainable approximation
  3. fast iterative shrinkage-thresholding algorithm (FISTA)
  4. group lasso
  5. high-dimensional approximation
  6. multivariate trigonometric polynomials
  7. nonequispaced fast Fourier transform (NFFT)
  8. LSQR

MSC codes

  1. 65T
  2. 42B05



Published In

SIAM Journal on Scientific Computing
Pages: A1606 - A1631
ISSN (online): 1095-7197


Submitted: 20 October 2020
Accepted: 3 November 2021
Published online: 23 June 2022




Funding Information

Bundesministerium für Bildung und Forschung https://doi.org/10.13039/501100002347 : 01-S20053A
Deutsche Forschungsgemeinschaft https://doi.org/10.13039/501100001659 : 416228727-SFB 1410
European Science Foundation https://doi.org/10.13039/501100000782 : 100367298
