Abstract

In this paper, we propose new linesearch-based methods for nonsmooth constrained optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence toward stationary points, using the Clarke--Jahn directional derivative. In the second part, we consider inequality constrained optimization problems where both the objective function and the constraints can possibly be nonsmooth. In this case, we first split the constraints into two subsets: difficult general nonlinear constraints and simple bound constraints on the variables. Then, we use an exact penalty function to tackle the difficult constraints, and we prove that the original problem can be reformulated as the bound-constrained minimization of the proposed exact penalty function. Finally, we use the framework developed for the bound-constrained case to solve the penalized problem. Moreover, we prove that, under standard assumptions on the search directions, every accumulation point of the generated sequence of iterates is a stationary point of the original constrained problem. In the last part of the paper, we report extensive numerical results on both bound-constrained and nonlinearly constrained problems, showing that our approach is promising when compared to some state-of-the-art codes from the literature.
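The overall scheme described above — replace the difficult constraints with an exact penalty term and minimize the resulting bound-constrained function by a derivative-free linesearch — can be sketched as follows. This is an illustrative toy implementation, not the paper's algorithm: all names are invented, the ℓ∞-type penalty and the sufficient-decrease/expansion rules are simplified stand-ins, and the small fixed direction set passed in the example below only mimics the asymptotically dense direction sequences that the convergence theory actually requires.

```python
import numpy as np

def exact_penalty(f, gs, eps):
    """l_infty-type exact penalty (illustrative): f(x) + (1/eps) * max(0, g_1(x), ..., g_m(x)).
    For exactness, eps must be small enough that 1/eps exceeds the relevant multipliers."""
    def P(x):
        viol = max([0.0] + [g(x) for g in gs])
        return f(x) + viol / eps
    return P

def df_linesearch(P, x0, lb, ub, extra_dirs=(), tol=1e-8, gamma=1e-6, max_evals=50000):
    """Derivative-free linesearch over a set of directions, with a sufficient-decrease
    test, step expansion, and projection onto the box [lb, ub]. Simplified sketch."""
    x = np.clip(np.asarray(x0, dtype=float), lb, ub)
    n = x.size
    # Direction set: +/- coordinate directions, plus any user-supplied extra directions.
    D = [s * e for e in np.eye(n) for s in (1.0, -1.0)]
    D += [np.asarray(d, dtype=float) for d in extra_dirs]
    alpha = np.full(len(D), 1.0)   # per-direction tentative step sizes
    fx, evals = P(x), 1
    while alpha.max() > tol and evals < max_evals:
        for k, d in enumerate(D):
            a = alpha[k]
            y = np.clip(x + a * d, lb, ub)
            fy = P(y)
            evals += 1
            if fy <= fx - gamma * a * a:
                # Sufficient decrease found: expand the step while it keeps holding.
                while evals < max_evals:
                    y2 = np.clip(x + 2.0 * a * d, lb, ub)
                    f2 = P(y2)
                    evals += 1
                    if f2 <= fx - gamma * (2.0 * a) ** 2 and not np.allclose(y2, y):
                        a, y, fy = 2.0 * a, y2, f2
                    else:
                        break
                x, fx, alpha[k] = y, fy, a
            else:
                alpha[k] *= 0.5    # no decrease along d: shrink its tentative step
    return x, fx
```

For instance, minimizing f(x) = (x0-1)^2 + (x1-2)^2 on [0,3]^2 subject to x0 + x1 <= 2 (whose solution is (0.5, 1.5) with multiplier 1) can be attempted by passing the single constraint to `exact_penalty` with, say, eps = 1e-2, and supplying the normalized diagonal directions as `extra_dirs`; coordinate directions alone can stall at a nonstationary kink of the nonsmooth penalty, which is precisely why richer (dense) direction sets are needed.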

Keywords

  1. derivative-free optimization
  2. Lipschitz optimization
  3. exact penalty functions
  4. inequality constrained optimization
  5. stationarity conditions

MSC codes

  1. 90C30
  2. 90C56
  3. 65K05
  4. 49J52


Published In

SIAM Journal on Optimization
Pages: 959 - 992
ISSN (online): 1095-7189

History

Submitted: 7 October 2013
Accepted: 23 April 2014
Published online: 10 July 2014
