
Nonsmooth Equations in Optimization



Nonconvex Optimization and Its Applications
Volume 60
Managing Editor:
Panos Pardalos
Advisory Board:
J.R. Birge
Northwestern University, U.S.A.
Ding-Zhu Du
University of Minnesota, U.S.A.
C. A. Floudas
Princeton University, U.S.A.
J. Mockus
Lithuanian Academy of Sciences, Lithuania
H. D. Sherali
Virginia Polytechnic Institute and State University, U.S.A.
G. Stavroulakis
Technical University Braunschweig, Germany

The titles published in this series are listed at the end of this volume.




Nonsmooth Equations
in Optimization
Regularity, Calculus, Methods and Applications

by
Diethard Klatte
Institute for Operations Research and Mathematical Methods of Economics,
University of Zurich, Switzerland

and

Bernd Kummer
Institute of Mathematics, Faculty of Mathematics and Natural Sciences II,
Humboldt University Berlin, Germany

KLUWER ACADEMIC PUBLISHERS
NEW YORK, BOSTON, DORDRECHT, LONDON, MOSCOW



eBook ISBN: 0-306-47616-9
Print ISBN: 1-4020-0550-4

©2002 Kluwer Academic Publishers
New York, Boston, Dordrecht, London, Moscow
Print ©2002 Kluwer Academic Publishers

Dordrecht
All rights reserved.
No part of this eBook may be reproduced or transmitted in any form or by any means, electronic, mechanical, recording, or otherwise, without written consent from the Publisher.
Created in the United States of America

Contents

Introduction  xi
List of Results  xix
Basic Notation  xxv

1 Basic Concepts  1
1.1 Formal Settings  1
1.2 Multifunctions and Derivatives  2
1.3 Particular Locally Lipschitz Functions and Related Definitions  4
    Generalized Jacobians of Locally Lipschitz Functions  4
    Pseudo-Smoothness and D°f  4
    Piecewise … Functions  5
    NCP Functions  5
1.4 Definitions of Regularity  6
    Definitions of Lipschitz Properties  6
    Regularity Definitions  7
    Functions and Multifunctions  9
1.5 Related Definitions  10
    Types of Semicontinuity  10
    Metric, Pseudo-, Upper Regularity; Openness with Linear Rate  12
    Calmness and Upper Regularity at a Set  13
1.6 First Motivations  14
    Parametric Global Minimizers  15
    Parametric Local Minimizers  16
    Epi-Convergence  17

2 Regularity and Consequences  19
2.1 Upper Regularity at Points and Sets  19
    Characterization by Increasing Functions  19
    Optimality Conditions  25
    Linear Inequality Systems with Variable Matrix  28
    Application to Lagrange Multipliers  30
    Upper Regularity and Newton’s Method  31


2.2 Pseudo-Regularity  32
2.2.1 The Family of Inverse Functions  34
2.2.2 Ekeland Points and Uniform Lower Semicontinuity  37
2.2.3 Special Multifunctions  43
    Level Sets of L.s.c. Functions  43
    Cone Constraints  44
    Lipschitz Operators with Images in Hilbert Spaces  46
    Necessary Optimality Conditions  47
2.2.4 Intersection Maps and Extension of MFCQ  49
    Intersection with a Quasi-Lipschitz Multifunction  49
    Special Cases  54
    Intersections with Hyperfaces  58

3 Characterizations of Regularity by Derivatives  61
3.1 Strong Regularity and Thibault’s Limit Sets  61
3.2 Upper Regularity and Contingent Derivatives  63
3.3 Pseudo-Regularity and Generalized Derivatives  63
    Contingent Derivatives  64
    Proper Mappings  64
    Closed Mappings  64
    Coderivatives  66
    Vertical Normals  67

4 Nonlinear Variations and Implicit Functions  71
4.1 Successive Approximation and Persistence of Pseudo-Regularity  72
4.2 Persistence of Upper Regularity  77
    Persistence Based on Kakutani’s Fixed Point Theorem  77
    Persistence Based on Growth Conditions  79
4.3 Implicit Functions  82

5 Closed Mappings in Finite Dimension  89
5.1 Closed Multifunctions in Finite Dimension  89
5.1.1 Summary of Regularity Conditions via Derivatives  89
5.1.2 Regularity of the Convex Subdifferential  92
5.2 Continuous and Locally Lipschitz Functions  93
5.2.1 Pseudo-Regularity and Exact Penalization  94
5.2.2 Special Statements for …  96
5.2.3 Continuous Selections of Pseudo-Lipschitz Maps  99
5.3 Implicit Lipschitz Functions on …  100

6 Analysis of Generalized Derivatives  105
6.1 General Properties for Abstract and Polyhedral Mappings  105
6.2 Derivatives for Lipschitz Functions in Finite Dimension  110
6.3 Relations between … and …  113
6.4 Chain Rules of Equation Type  115
6.4.1 Chain Rules for … and … with …  115


6.4.2 Newton Maps and Semismoothness  121
6.5 Mean Value Theorems, Taylor Expansion and Quadratic Growth  131
6.6 Contingent Derivatives of Implicit (Multi-)Functions and Stationary Points  136
6.6.1 Contingent Derivative of an Implicit (Multi-)Function  137
6.6.2 Contingent Derivative of a General Stationary Point Map  141


7 Critical Points and Generalized Kojima–Functions  149
7.1 Motivation and Definition  149
    KKT Points and Critical Points in Kojima’s Sense  150
    Generalized Kojima–Functions – Definition  151
7.2 Examples and Canonical Parametrizations  154
    The Subdifferential of a Convex Maximum Function  154
    Complementarity Problems  156
    Generalized Equations  157
    Nash Equilibria  159
    Piecewise Affine Bijections  160
7.3 Derivatives and Regularity of Generalized Kojima–Functions  160
    Properties of N  160
    Formulas for Generalized Derivatives  164
    Regularity Characterizations by Stability Systems  167
    Geometrical Interpretation  168
7.4 Discussion of Particular Cases  170
7.4.1 The Case of Smooth Data  170
7.4.2 Strong Regularity of Complementarity Problems  175
7.4.3 Reversed Inequalities  177
7.5 Pseudo–Regularity versus Strong Regularity  178

8 Parametric Optimization Problems  183
8.1 The Basic Model  185
8.2 Critical Points under Perturbations  187
8.2.1 Strong Regularity  187
    Geometrical Interpretation  189
    Direct Perturbations for the Quadratic Approximation  190
    Strong Regularity of Local Minimizers under LICQ  191
8.2.2 Local Upper Lipschitz Continuity  193
    Reformulation of the C-Stability System  194
    Geometrical Interpretation  196
    Direct Perturbations for the Quadratic Approximation  197
8.3 Stationary and Optimal Solutions under Perturbations  198
8.3.1 Contingent Derivative of the Stationary Point Map  199
    The Case of Locally Lipschitzian F  200
    The Smooth Case  202
8.3.2 Local Upper Lipschitz Continuity  203
    Injectivity and Second-Order Conditions  205


    Conditions via Quadratic Approximation  208
    Linearly Constrained Programs  209
8.3.3 Upper Regularity  210
    Upper Regularity of Isolated Minimizers  211
    Second–Order Optimality Conditions for … Programs  215
8.3.4 Strongly Regular and Pseudo-Lipschitz Stationary Points  217
    Strong Regularity  217
    Pseudo-Lipschitz Property  220
8.4 Taylor Expansion of Critical Values  221
8.4.1 Marginal Map under Canonical Perturbations  222
8.4.2 Marginal Map under Nonlinear Perturbations  225
    Formulas under Upper Regularity of Stationary Points  225
    Formulas under Strong Regularity  227
    Formulas in Terms of the Critical Value Function Given under Canonical Perturbations  229

9 Derivatives and Regularity of Further Nonsmooth Maps  231
9.1 Generalized Derivatives for Positively Homogeneous Functions  231
9.2 NCP Functions  236
    Case (i): Descent Methods  237
    Case (ii): Newton Methods  238
9.3 The C-Derivative of the Max-Function Subdifferential  241
    Contingent Limits  243
    Characterization of … for Max-Functions: Special Structure  244
    Characterization of … for Max-Functions: General Structure  251
    Application 1  253
    Application 2  254

10 Newton’s Method for Lipschitz Equations  257
10.1 Linear Auxiliary Problems  257
10.1.1 Dense Subsets and Approximations of M  260
10.1.2 Particular Settings  261
10.1.3 Realizations for … and NCP Functions  262
10.2 The Usual Newton Method for … Functions  265
10.3 Nonlinear Auxiliary Problems  265
10.3.1 Convergence  267
10.3.2 Necessity of the Conditions  270

11 Particular Newton Realizations and Solution Methods  275
11.1 Perturbed Kojima Systems  276
    Quadratic Penalties  276
    Logarithmic Barriers  276
11.2 Particular Newton Realizations and SQP-Models  278


12 Basic Examples and Exercises  287
12.1 Basic Examples  287
12.2 Exercises  296

Appendix  303
    Ekeland’s Variational Principle  303
    Approximation by Directional Derivatives  304
    Proof of TF = T(NM) = NTM + TNM  306
    Constraint Qualifications  307

Bibliography  311

Index  325



Introduction
Many questions dealing with solvability, stability and solution methods for variational inequalities or equilibrium, optimization and complementarity problems lead to the analysis of certain (perturbed) equations. This often requires a reformulation of the initial model under consideration. Due to the specifics of the original problem, the resulting equation is usually either not differentiable (even if the data of the original model are smooth), or it does not satisfy the assumptions of the classical implicit function theorem.
This phenomenon is the main reason why a considerable analytical apparatus dealing with generalized equations (i.e., with finding zeros of multivalued mappings) and nonsmooth equations (i.e., the defining functions are not continuously differentiable) has been developed during the last 20 years, under very different viewpoints and assumptions.
In this theory, the classical hypotheses of convex analysis, in particular monotonicity and convexity, have been weakened or dropped, and the scope of possible applications seems to be quite large. Briefly, this discipline is often called nonsmooth analysis, sometimes also variational analysis. Our book fits into this discipline; however, our main intention is to develop the analytical theory in close connection with the needs of applications in optimization and related subjects.
Main Topics of the Book

1. Extended analysis of Lipschitz functions and their generalized derivatives, including "Newton maps" and regularity of multivalued mappings.
2. Principle of successive approximation under metric regularity and its application to implicit functions.
3. Characterization of metric regularity for intersection maps in general spaces.
4. Unified theory of Lipschitzian critical and stationary points in optimization, in variational inequalities and in complementarity problems via a particular nonsmooth equation.
5. Relations between this equation and reformulations by penalty, barrier and so-called NCP functions.
6. Analysis of Newton methods for Lipschitz equations based on linear and nonlinear approximations, in particular, for functions having a dense set of … points.
7. Consistent interpretation of hypotheses and methods in terms of original data and quadratic approximations.
8. Collection of basic examples and exercises.
Motivations and Intentions


For sufficiently smooth functions, it is clear that many questions discussed in
this field become trivial or have classical answers. Even the way of dealing with
equations defined by nonsmooth functions seems to be evident:
(1) Define a reasonable derivative and prove a related inverse function theorem. Then, as in the Fréchet-differentiable case,
(2) derive statements about implicit functions, successive approximation and
Newton’s method,
(3) and develop conditions for characterizing critical points in extremal problems.
Of course, this calls for a deeper discussion.
First of all, one has to specify the notion of a derivative. This should be a sufficiently nice function L that approximates the original one, say locally at least of first order, like the usual linearization at some argument in the Fréchet concept. However, there are many problems when going into the details.
Example 0.1 (piecewise linear …). Consider the real function
    … if …,    … if …
For …, the function … satisfies … Our "linearization" … of … at the origin is nonlinear, but still has a simple piecewise linear structure. Taking …, the linearization of … should be the usual one, namely …
Evidently, in this example, we found a … of … near … which is simpler than the original function. But in view of differentiation and inverse mappings, three new problems already arise:
What about inverse maps of piecewise linear functions?
What about continuity of derivatives or of "linearizations" in terms of …?
Which kind of singularity (critical point) appears at the origin?

Example 0.2 (no piecewise linear …). For the Euclidean norm on …, one cannot find any piecewise linear … at the origin.

The functions … and … of these examples are not only of academic interest, because they are typical optimal value functions in parametric optimization:
    … with respect to …,    … with respect to …  (for –1 …)
They may occur, e.g., as objectives or as constraint functions in other optimization problems.
Next, it may happen that one needs different derivatives for different purposes. To illustrate this, we note that there exists a real, strictly monotone, directionally differentiable Lipschitz function … such that … on …, where … is a countable set.
(i) … is …
(ii) The inverse … is well-defined and globally Lipschitz.
(iii) Newton's method (to determine a zero of …) with start at any point … generates an alternating sequence and uses only points in X. Notice that X has full Lebesgue measure.
Concerning the construction and further properties of such a function we refer to Chapter 12, Basic Example BE.1.
So, the existence of a Lipschitzian inverse on the one hand and local convergence of Newton’s method on the other hand are different things. Indeed,
we have to expect and to accept that there are generalized derivatives which
allow (for certain nonsmooth functions) the construction of Newton-type solution methods without saying anything about uniqueness and Lipschitz behavior
of the inverse, whereas other "derivatives", which characterize the inverse function, are rather inappropriate for Newton-type solution methods.
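To make the algorithmic theme concrete, here is a minimal sketch of such a Newton-type iteration in which the classical derivative is replaced by one element of a generalized derivative. The equation, the function names and the chosen derivative element are our illustrative assumptions, not taken from the book:

```python
def semismooth_newton(f, g, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k) / g(x_k), where g(x) returns
    one element of a generalized derivative of f at x (for smooth f, g = f')."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= tol:
            break
        x -= fx / g(x)
    return x

# Illustrative piecewise linear equation: f(x) = 2x + |x| - 1 = 0, zero at x = 1/3.
f = lambda x: 2.0 * x + abs(x) - 1.0
g = lambda x: 2.0 + (1.0 if x >= 0 else -1.0)   # one generalized-derivative element

root = semismooth_newton(f, g, x0=-1.0)
```

For a piecewise linear equation like this one, the iteration terminates after finitely many steps; for certain Lipschitz functions, however, it may instead alternate forever (see Basic Example BE.1 in Chapter 12), which is exactly why the choice of "derivative" matters.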
Moreover, the power of the classical differential calculus lies in the possibility of computing derivatives for the functions of interest. The latter is based on several chain rules. Related rules for composed generalized derivatives of functions or multifunctions are often not true or hold only in a weaker form. Even for rather simple mappings in finite dimensional spaces, it may be quite difficult to determine the limits appearing in an actual derivative definition. This means an increase in the technical effort.
In addition, everybody has an idea of what tangency is or what a normal cone is. This had the effect that various more or less useful notions of generalized derivatives have been introduced in the literature, and many relations have been shown between them. Each of these derivatives has its own history and own motivation, by geometrical aspects or by some statement, say by an application. However, these applications and motivations often play a secondary (or no) role in subsequent publications, which are devoted to technical refinements of the calculus, generalization and unification. So, the reader may easily gain the impression that "nonsmooth analysis" is a graph whose vertices are definitions of generalized derivatives and whose edges are interrelations between them. It is hard to see that the graph is indeed something like a network of electric power, because the lamps that can be switched on are hidden.
In the present book, as far as general concepts are concerned, we motivate why one or another concept is appropriate (or not) for answering a concrete question, we develop a related theory, and we indicate possible applications in the context of optimization. We also try to use as few notions of generalized derivatives as possible (only those mentioned below), and we describe necessary assumptions mainly in terms of well-known upper and lower semicontinuity properties.
In this way, we hope that every reader who is familiar with basic topological and analytical notions and who is interested in the parametric behavior of solutions to equations and optimization problems (smooth or nonsmooth), or in the theory and methods of nonsmooth analysis itself, will easily understand our statements and constructions.
As a basic general instrument, we apply Ekeland’s variational principle.
A second tool consists in a slight generalization of successive approximation, which opens up the same applications (by the same arguments) as successive approximation in the usual (single-valued) case, namely implicit function theorems and Newton-type solution methods.
Further, as a specific topic of our monograph, we use so-called Kojima functions (which have a nice, simple structure for analytical investigations) in order to characterize crucial points in variational models. For several reasons, but mainly in order to establish tools for studying variational problems with nonsmooth data and, closely related, stationary points in optimization, we summarize and extend the calculus of generalized derivatives for locally Lipschitz functions.
Finally, we connect generalized Newton-type methods with the continuity
of (generalized) differentiability, as in the classical differentiable case; see the
concept of Newton maps. Via perturbed Kojima systems, we establish relations
to other standard optimization techniques, in particular, to penalty and barrier
type methods.
However, the most important tool for understanding nonsmooth analysis, with its various definitions and constructions, is the knowledge of several concrete functions and examples which show the difficulties and "abnormalities" in comparison with smooth settings. Such examples will be included in all parts of this monograph. The most important ones, as well as answers to various exercises, are collected in the last chapter.
We envision that our book is useful for researchers, graduate students and
practitioners in various fields of applied mathematics, engineering, OR and
economics, but we think that it is also of interest for university teachers and
advanced students who wish to get insights into problems, potentials and recent



developments of this rich and thriving area of nonlinear analysis and optimization.
Structure of the Book


In Chapter 1, we start with some basic notation, in particular, with the presentation of certain desirable stability properties: pseudo- (or metric) regularity,
strong and upper regularity. We try to find intrinsic conditions, equivalent or
sufficient, which (as we hope) make the properties in question more transparent
and indicate the relations to other types of stability.
In Chapter 2, we present various conditions for certain Lipschitz properties of multivalued maps and the related types of regularity, we investigate interrelations between them, and we discuss classical applications such as (necessary) optimality conditions and "stability" in parametric optimization.
A great part of this chapter is devoted to pseudo-regularity of multifunctions in Banach spaces, where the use of generalized derivatives is avoided. This approach is based on the observation that the concepts of generalized derivatives which are usually applied for describing this important regularity type (contingent derivatives as well as Mordukhovich's coderivatives) lead to conditions that are not necessary even for level set maps of monotone Lipschitzian functionals in separable Hilbert spaces, cf. Example BE.2. Therefore, we present characterizations which directly use Ekeland's variational principle as well as the family of assigned inverse functions. They allow characterizations of pseudo-regularity for the intersection of multifunctions and permit weaker assumptions concerning the image and pre-image space as well.
In particular, we reduce the question of pseudo-regularity to the two basic classical problems:
(i) Show the existence of solutions to an equation after small constant perturbations, i.e., provided that y ∈ F(x) and d(y', y) is small, verify that some x' satisfying y' ∈ F(x') exists.
(ii) Estimate the distance d(x', x) for some solution in a Lipschitzian way, i.e., show that there is x' with y' ∈ F(x') such that d(x', x) ≤ L d(y', y).
Pseudo-regularity requires that, for certain neighborhoods U and V of x̄ and ȳ, respectively, one finds a constant L such that both requirements can be satisfied whenever x ∈ U, y ∈ F(x) and y' ∈ V.
We will demonstrate that, under weak hypotheses, it is enough to satisfy (i) and (ii) for all x ∈ U and for y' satisfying d(y', y) < ε, where ε is some constant depending on U and V.
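For orientation, pseudo- (metric) regularity of a multifunction F at a point (x̄, ȳ) of its graph is commonly written as a single distance estimate (our notation here; the book's formal definitions appear in Chapter 1):

```latex
% F is pseudo-regular (metrically regular) at (\bar x,\bar y)\in\operatorname{gph}F:
% there exist neighborhoods U\ni\bar x, V\ni\bar y and a rank L>0 such that
\operatorname{dist}\bigl(x,\,F^{-1}(y)\bigr)\;\le\;L\,\operatorname{dist}\bigl(y,\,F(x)\bigr)
\qquad\text{for all }x\in U,\ y\in V.
```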
Chapter 3 is devoted to characterizations of regularity with the help of (generalized) derivatives and may be seen as a justification of the derivatives investigated in the current book.
We also intend to motivate why the regularity concepts introduced in the



first two chapters are really important. In particular, this will be done in Chapter 4 by showing persistence of regularity with respect to small Lipschitzian perturbations, which has several interesting consequences (e.g. in Section 11.1). Note that we do not aim at presenting a complete perturbation theory for optimization problems and nonsmooth equations; our selection of results is subjective and essentially motivated by the applications mentioned above.
Many general regularity statements can be considerably sharpened for closed multifunctions in finite dimension and for continuous or locally Lipschitz functions sending ℝⁿ into itself. So Chapter 5 is devoted to specific properties of such mappings and functions, where we pay attention to statements that are mainly of topological nature and independent of usual derivative concepts.
As an essential tool for locally Lipschitz functions, we apply here Thibault's limit sets. In contrast to Clarke's generalized Jacobians, the latter provide us with sufficient and necessary conditions for strong regularity and, even more important, they satisfy intrinsic chain rules for inverse and implicit functions.
Basic tools for dealing with several generalized derivatives will be developed in Chapter 6. Our calculus of generalized derivatives includes chain rules and mean-value statements for contingent derivatives and Thibault's limit sets under hypotheses that are appropriate for the critical point theory of optimization problems, where the involved problem functions are not necessarily smooth. Here, we write coderivatives by means of contingent derivatives (which will be computed in Chapter 7), and we also introduce a derivative-like notion called Newton function. It represents linear operators that are of interest in relation to Newton-type solution methods for Lipschitzian equations and describes, in a certain sense, continuous differentiability for non-differentiable functions. Derivatives for so-called semismooth functions are included in this approach.
Chapter 7 is devoted to the stable solution behavior of generalized Kojima functions. By this approach, we cover in a unified way Karush-Kuhn-Tucker points and stationary points in parametric optimization, persistence and stability of local minimizers, and related questions in the context of generalized equations, of complementarity problems and of equilibria in games as well. The name Kojima function has its roots in Kojima's representation of Karush-Kuhn-Tucker points as zeros of a particular nonsmooth function. We will see that basic generalized derivatives of such functions can be determined by means of usual chain rules. The properties of these derivatives determine, in a clear analytical way (based on results of Chapter 6), the stable behavior of critical points. In contrast to descriptions of critical points by generalized equations, our approach via Lipschitz equations has three advantages:
(i) The invariance of domain theorem and Rademacher's theorem may be used as additional powerful tools (not valid for multifunctions).
(ii) The classical approach via generalized equations is mainly restricted to systems of the type …, where … varies in … and the multifunction … is fixed. This means for optimization problems: the involved functions are …, and the perturbed problem has to have the same structure as the original one. By our approach, for instance, stationary points of the original problem and of assigned penalty or barrier functions may be studied and estimated as zeros of the same perturbed Kojima function (even for involved …).
(iii) The necessary approximations of the multifunction - when speaking about generalized equations - are now directly determined by the type of derivative we are applying to the assigned Lipschitz equation.
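For orientation, Kojima's construction for a smooth program min f(x) s.t. g(x) ≤ 0 can be sketched as follows. This is one common form of the function; the book's generalized Kojima functions extend it, and all names in the sketch are our own illustrative choices:

```python
import numpy as np

def kojima_F(x, y, df, jac_g, g):
    """One common form of Kojima's function for min f(x) s.t. g(x) <= 0:
    F(x, y) = ( df(x) + jac_g(x)^T y_plus ,  g(x) - y_minus ),
    with y_plus = max(y, 0) and y_minus = min(y, 0) componentwise.
    Zeros (x, y) of F correspond to KKT points with multipliers y_plus."""
    y_plus, y_minus = np.maximum(y, 0.0), np.minimum(y, 0.0)
    return np.concatenate([df(x) + jac_g(x).T @ y_plus, g(x) - y_minus])

# Toy program: min x^2 s.t. 1 - x <= 0; the KKT point is x = 1 with multiplier 2.
df    = lambda x: 2.0 * x                      # gradient of f(x) = x^2
jac_g = lambda x: np.array([[-1.0]])           # Jacobian of g(x) = 1 - x
g     = lambda x: np.array([1.0]) - x

val = kojima_F(np.array([1.0]), np.array([2.0]), df, jac_g, g)  # -> [0. 0.]
```

Since max(y, 0) · min(y, 0) = 0 holds identically, feasibility, nonnegativity of the multipliers and complementarity are all encoded in the single variable y, which is what makes the single nonsmooth equation F(x, y) = 0 work.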
In Chapter 8, the regularity characterizations for zeros of generalized Kojima functions are applied to critical points, stationary solutions and local minimizers of parametric nonlinear programs in finite dimension; the specializations to the case of smooth data, which is well-studied in the optimization literature, are explicitly discussed. In particular, we present second-order characterizations of strong regularity, pseudo-regularity and upper Lipschitz stability in this context, give geometrical interpretations, and derive representations of derivatives of (stationary) solution maps. Finally, Taylor expansion formulas for critical value functions are obtained.
In Chapter 9, we regard generalized derivatives of other mappings that are important for the analysis of optimization models, namely of
(i) positively homogeneous functions and
(ii) Clarke subdifferentials … for the maximum of finitely many … functions.
In particular, … may be a directional derivative or a so-called NCP function, used for rewriting complementarity problems in the form of nonsmooth equations. We study the latter more extensively in order to show how the properties of … determine the behavior of first and second order methods for solving the assigned equations and how related iteration steps can be interpreted in an intrinsic way. The simple derivative …, defined below, plays an essential role in this context. In view of (ii), it turns out that … (which may be seen as a proto-derivative, too) depends on the first and second derivatives of the functions … at the reference point only. We will determine the concrete form of … in a direct way and establish the relations to C-derivatives of generalized Kojima-functions.
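As one concrete illustration of the NCP idea, consider the Fischer-Burmeister function, a standard choice in the complementarity literature (whether it is among the functions treated in Chapter 9 is left to the text there). It vanishes exactly on the complementarity pairs, so a complementarity system becomes an equation when it is applied componentwise:

```python
import math

def phi_fb(a, b):
    """Fischer-Burmeister NCP function: phi(a, b) = sqrt(a*a + b*b) - a - b.
    phi(a, b) = 0  <=>  a >= 0, b >= 0 and a*b = 0."""
    return math.hypot(a, b) - a - b

# Complementarity pairs are mapped to zero, other pairs are not:
vals = [phi_fb(0.0, 3.0), phi_fb(2.0, 0.0), phi_fb(1.0, 1.0), phi_fb(-1.0, 0.0)]
```

Note that phi_fb is smooth everywhere except at the origin, where it is still Lipschitz; this is precisely the kind of mild nonsmoothness for which the generalized derivatives of this chapter are designed.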
Solution methods for general Lipschitzian equations are the subject of Chapter 10. Here, we summarize crucial conditions for superlinear convergence, based on linear and nonlinear auxiliary problems, and present typical examples. In this chapter, our subsequent definitions of Newton maps, of the derivative D° and of locally … functions will be justified from the algorithmic point of view. Moreover, the relations between the regularity conditions needed for Newton's method, as well as upper, pseudo- and strong regularity, shall be clarified.


In Chapter 11, (generalized) Newton methods will be applied in order to determine Karush-Kuhn-Tucker points of … problems. Depending on the reformulation as a (nonsmooth) equation (via NCP- or Kojima-functions) and on the used generalized derivative "DF" as well, we formulate the related Newton steps in terms of assigned SQP models and of (quadratic) penalty and (logarithmic) barrier settings. The connection of these different solution approaches becomes possible by considering the already mentioned perturbed Kojima functions and by studying the properties of their zeros. Taking the results of Chapter 4 into account, one obtains Lipschitz estimates for solutions assigned to different methods of the mentioned type. From Chapter 10 it is obvious that the … are only important for the interpretations in terms of quadratic problems, not for solving … according to Chapter 10.
Chapter 12 contains Basic Examples which are used at several places throughout the book, while all numbered Exercises occurring in the first 11 chapters are compiled once more, now accompanied by the answers. In the Appendix, we prove some known basic tools for the convenience of the reader.

Acknowledgements

We would like to thank several colleagues for stimulating discussions and valuable hints during the work on this project, in particular Asen Dontchev, Helmut
Gfrerer, Peter Kall, Janos Mayer, Jiri Outrata, Stefan Scholtes and Alexander
Shapiro, but especially Peter Fusek, Bert Jongen and Alexander Kruger, who
in addition read parts of the manuscript.
Our thanks go to Jutta Kerger for her linguistic support and to Ramona
Klaass who typed a great part of the manuscript.
We are grateful to Panos Pardalos for welcoming our work into the series
Nonconvex Optimization and Its Applications and to John Martindale from
Kluwer for his editorial help.



List of Results
Introduction
Example 0.1 (piecewise linear …)
Example 0.2 (no piecewise linear …)


1. Basic Concepts
Remark 1.1 (derivatives of the inverse)
Lemma 1.2 (composed maps)
Example 1.3 (regularity for … functions)
Example 1.4 (pseudo-regular, but not strongly regular)
Example 1.5 (strong regularity for continuous functions)
Example 1.6 (pseudo-regularity for linear operators)
Example 1.7 (Graves-Lyusternik theorem)
Example 1.8 (subdifferential of the Euclidean norm)
Example 1.9 (… is u.s.c., but not l.s.c.)
Lemma 1.12 (metrically regular = pseudo-regular)
Example 1.13 (pseudo-Lipschitz, but not locally upper Lipschitz)
Example 1.14 (the inverse of Dirichlet’s function)
Theorem 1.15 (Berge/Hogan stability)
Theorem 1.16 (stability of CLM sets)

2. Regularity and Consequences
Lemma 2.1 (upper Lipschitz and describing Lipschitz functionals)
Theorem 2.4 (the max-form for intersections)
Theorem 2.5 (calm intersections)
Theorem 2.6 (free local minima and upper Lipschitz constraints)
Lemma 2.7 (Hoffman’s lemma)
Lemma 2.8 (Lipschitz u.s.c. linear systems)
Corollary 2.9 (Lipschitz u.s.c. multipliers)
Theorem 2.10 (selection maps and optimality condition)
Remark 2.11 (inverse families and pseudo-regularity)
Theorem 2.12 (Ekeland’s variational principle)


Lemma 2.13 (proper multifunctions)
Lemma 2.14 (pseudo-regularity for proper mappings)
Example 2.15 (F is not pseudo-regular)
Theorem 2.16 (pseudo-regularity of proper mappings with closed ranges)
Theorem 2.17 (basic equivalences, proper mappings)
Lemma 2.18 (pseudo-singular level sets of l.s.c. functionals)
Lemma 2.19 (Ekeland-points of norm-functionals in a real Hilbert space)
Lemma 2.20 (pseudo-singular cone constraints)
Lemma 2.21 (pseudo-singular equations)
Theorem 2.22 (intersection theorem)
Remark 2.23 (estimates)
Corollary 2.24 (intersection with level set)
Corollary 2.25 (finite sets of directions)
Theorem 2.26 (intersection with hyperfaces)
Corollary 2.27 (Lipschitz equations)

3. Characterizations of Regularity by Derivatives
Lemma 3.1 (strong regularity for multifunctions)
Lemma 3.2 (upper regularity)
Lemma 1.10 (pseudo-regularity at isolated zeros)
Remark 1.11 (pseudo-regularity and Lipschitz continuity)
Corollary 3.3 (pseudo-regularity if CF is linearly surjective 1)
Theorem 3.4 (basic equivalences, closed mappings)
Theorem 3.5 (pseudo-regularity if CF is linearly surjective 2)
Theorem 3.7 (injectivity of co-derivatives and pseudo-regularity)
Theorem 3.11 (vertical normals and regularity)

4. Nonlinear Variations and Implicit Functions
Theorem 4.1 (persistence under … variations)
Theorem 4.2 (successive approximation)
Theorem 4.3 (estimates for variations in …)
Corollary 4.4 (pseudo- and strong regularity w.r. to …)
Theorem 4.5 (persistence of upper regularity)
Lemma 4.6 (l.s.c. and isolated optimal solutions)
Corollary 4.7 (pseudo-Lipschitz and isolated optimal solutions)
Theorem 4.8 (growth and upper regularity of minimizers)
Theorem 4.9 (estimate of solutions)
Theorem 4.11 (the classical parametric form)



5. Closed Mappings in Finite Dimension
Theorem 5.1 (regularity of multifunctions, summary)
Theorem 5.2 (CF and D*F)
Theorem 5.3 (conv CF)
Theorem 5.4 (regularity of the convex subdifferential)
Theorem 5.6 (pseudo-regularity and exact penalization)
Theorem 5.7 (pseudo-regular & upper regular)
Lemma 5.8 (continuous selections of the inverse map)
Lemma 5.9 (convex pre-images)
Theorem 5.10 (equivalence of pseudo- and strong regularity, bifurcation)
Theorem 5.12 (isolated zeros of Lipschitz functions, …)
Theorem 5.13 (inverse functions and …)
Theorem 5.14 (inverse functions and …)
Theorem 5.15 (implicit Lipschitz functions)

6. Analysis of Generalized Derivatives
Theorem 6.4 (polyhedral mappings)
Theorem 6.5 (… and …)
Theorem 6.6 (… and generalized Jacobians)
Theorem 6.8 (partial derivatives for …)
Theorem 6.11 (partial derivatives for …)
Theorem 6.14 (existence and chain rule for Newton functions)
Theorem 6.15 (semismoothness; Mifflin)
Lemma 6.16 (selections of …)
Lemma 6.17 (special locally … functions)
Theorem 6.18 (Newton maps of …)
Theorem 6.20 (… expansion)
Corollary 6.21 (quadratic growth on a neighborhood)
Theorem 6.23 (quadratic growth at a point)
Theorem 6.26 (C-derivative of the implicit function)
Theorem 6.27 (the case of pseudo-Lipschitz S)
Theorem 6.28 (C-derivatives of stationary points, general case)

7. Critical Points and Generalized Kojima–Functions
Lemma 7.1 (necessity of LICQ for pseudo-regularity)
Lemma 7.3 (TN, CN)
Lemma 7.4 (N simple, and further properties)
Theorem 7.5 (TF, CF; product rules)
Theorem 7.6 (TF, CF; explicit formulas)
Lemma 7.7 (subspace property of …)
Theorem 7.8 (… strong regularity and local u.L. behavior)
Corollary 7.13 (difference between TF and CF)
Corollary 7.14 (Newton’s method under strong regularity)
Lemma 7.15 (strong regularity of an NCP)
Lemma 7.16 (transformed Newton solutions)
Lemma 7.17 (invariance when reversing constraints)
Lemma 7.18 (deleting constraints with zero LM, pseudo-regular)
Lemma 7.19 (deleting constraints with zero LM, not strongly regular)
Lemma 7.20 (reduction for … data)
Theorem 7.21 (… pseudo-regular = strongly regular)
Corollary 7.22

8. Parametric Optimization Problems
Theorem 8.2 (strongly regular critical points)
Remark 8.3 (necessity of LICQ, variation of …)
Corollary 8.4 (nonlinear variations, strongly regular)
Remark 8.5 (strong stability in Kojima’s sense)
Corollary 8.6 (geometrical interpretation, strongly regular)
Theorem 8.10 (strongly regular local minimizers)
Theorem 8.11 (locally u.L. …)
Corollary 8.13 (nonlinear variations, u.L.)
Lemma 8.15 (auxiliary problems)
Corollary 8.16 (geometrical interpretation, u.L.)
Theorem 8.19 (CX under MFCQ)
Theorem 8.24 (locally u.L. stationary points)
Corollary 8.25 (second–order sufficient condition)
Theorem 8.27 (quadratic approximations)
Lemma 8.31 (upper regularity implies MFCQ)
Lemma 8.32 (u.s.c. of stationary and optimal solutions)
Theorem 8.33 (upper regular minimizers, …)
Theorem 8.36 (upper regular minimizers, …)
Corollary 8.37 (necessary condition for strong regularity, … case)
Theorem 8.38 (local minimizer and quadratic growth, … case)
Theorem 8.39 (TX under MFCQ)
Lemma 8.41 (TF-injectivity w.r. to …)
Theorem 8.42 (TF and pseudo-regularity of X)
Theorem 8.43 (… derivatives of marginal maps)
Theorem 8.45 (… for nonlinear perturbations I)
Theorem 8.47 (… for nonlinear perturbations II)



9. Derivatives and Regularity of Further Nonsmooth Maps
Lemma 9.1 (… and … for positively homogeneous functions)
Lemma 9.2 (NCP: minimizers and stationary points)
Lemma 9.3 (limits of … for pNCP)
Theorem 9.4 (particular structure of … for max-functions)
Theorem 9.7 (general structure of …)
Corollary 9.9 (reformulation 1)
Corollary 9.10 (reformulation 2)

10. Newton’s Method for Lipschitz Equations
Lemma 10.1 (convergence of Newton’s method - I)
Theorem 10.5 (regularity condition (10.4) for NCP)
Theorem 10.6 (uniform regularity and monotonicity)
Theorem 10.7 (convergence of Newton’s method - II)
Theorem 10.8 (the condition (CA))
Theorem 10.9 (the condition (CI))

11. Particular Newton Realizations and Solution Methods
Theorem 11.1 (perturbed Kojima–systems)
Lemma 11.3 (Newton steps with perturbed F)
Lemma 11.4 (Newton steps with pNCP)
Lemma 11.5 (Newton steps with perturbed …)
Lemma 11.6 (condition (CI) in Wilson’s method)

12. Basic Examples and Exercises
Example BE.0 (pathological real Lipschitz map: lightning function)
Example BE.1 (alternating Newton sequences for real, Lipschitz …)
Example BE.2 (level sets in a Hilbert space: pseudo-regularity holds, but the sufficient conditions in terms of contingent derivatives and coderivatives fail)
Example BE.3 (piecewise linear bijection of … with …)
Example BE.4 (piecewise quadratic function … having non-unique pseudo-Lipschitz stationary points)
Example BE.5 (Lipschitz function …: directional derivatives exist nowhere, and contingent derivatives are empty)
Example BE.6 (convex …, non-differentiable on a dense set)



Appendix
Theorem A.1 (Ekeland’s variational principle: proof)
Lemma A.2 (approximation by directional derivatives 1)
Lemma A.3 (approximation by directional derivatives 2)
Lemma A.5 (descent directions)
Lemma A.7 (Gauvin’s theorem and Kyparisis’ theorem)

