Classical and Modern Optimization

by Guillaume Carlier

Paperback

$68.00 

Overview

The quest for the optimal is ubiquitous in nature and human behavior. The field of mathematical optimization has a long history and remains active today, particularly in the development of machine learning.

Classical and Modern Optimization presents a self-contained overview of classical and modern ideas and methods for approaching optimization problems. The approach is rich and flexible enough to address smooth and non-smooth, convex and non-convex, finite- or infinite-dimensional, static or dynamic situations. The first chapters of the book are devoted to the classical toolbox: topology and functional analysis, differential calculus, convex analysis, and necessary conditions for differentiable constrained optimization. The remaining chapters are dedicated to more specialized topics and applications.

Valuable to a wide audience, including students in mathematics, engineers, data scientists, and economists, Classical and Modern Optimization contains more than 200 exercises to assist with self-study or for anyone teaching a third- or fourth-year optimization class.

Product Details

ISBN-13: 9781800610866
Publisher: World Scientific Publishing Europe Ltd
Publication date: 04/07/2022
Series: Advanced Textbooks in Mathematics
Pages: 388
Product dimensions: 6.00(w) x 9.00(h) x 0.80(d)

Table of Contents

Preface v

About the Author vii

1 Topological and Functional Analytic Preliminaries 1

1.1 Metric spaces 1

1.1.1 Completeness and compactness 3

1.1.2 Continuity, semicontinuity 8

1.1.3 Baire and Ekeland's theorems 13

1.2 Normed vector spaces 15

1.2.1 Finite dimensions 16

1.2.2 Linear and bilinear maps 17

1.3 Banach spaces 23

1.3.1 Definitions and properties 23

1.3.2 Examples 24

1.3.3 Ascoli's theorem 27

1.3.4 Linear maps in Banach spaces 29

1.4 Hilbert spaces 30

1.4.1 Generalities 30

1.4.2 Projection onto a closed convex set 33

1.4.3 The dual of a Hilbert space 36

1.5 Weak convergence 37

1.6 On the existence and generic uniqueness of minimizers 39

1.6.1 Existence of minimizers: Variations on a theme 39

1.6.2 A generic uniqueness result 41

1.7 Exercises 42

2 Differential Calculus 47

2.1 First-order differential calculus 47

2.1.1 Several notions of differentiability 47

2.1.2 Calculus rules 52

2.1.3 Mean value inequalities 56

2.1.4 Partial derivatives 60

2.1.5 The finite-dimensional case, the Jacobian matrix 62

2.2 Second-order differential calculus 65

2.2.1 Definitions 65

2.2.2 Schwarz's symmetry theorem 66

2.2.3 Second-order partial derivatives 68

2.2.4 Taylor formula 70

2.3 The inverse function and implicit function theorems 73

2.3.1 The inverse function theorem 73

2.3.2 The implicit function theorem 75

2.3.3 A local surjection theorem via Ekeland's variational principle 77

2.4 Smooth functions on R^d, regularization, integration by parts 78

2.4.1 Test-functions, mollification 80

2.4.2 The divergence theorem and other integration by parts formulas 83

2.5 Exercises 88

3 Convexity 93

3.1 Hahn-Banach theorems 93

3.1.1 The analytic form of the Hahn-Banach theorem 93

3.1.2 Separation of convex sets 94

3.2 Convex sets 97

3.2.1 Basic properties 97

3.2.2 Linear inequalities 98

3.2.3 Extreme points 101

3.3 Convex functions 105

3.3.1 Continuity properties 107

3.3.2 Differentiable characterizations 109

3.4 The Legendre transform 113

3.4.1 Basic properties 113

3.4.2 The biconjugate 118

3.4.3 Subdifferentiability 120

3.5 Exercises 126

4 Optimality Conditions for Differentiable Optimization 133

4.1 Unconstrained optimization 134

4.2 Equality constraints 136

4.2.1 Algebraic and topological preliminaries 137

4.2.2 Lagrange rule in the case of affine constraints 140

4.2.3 Lagrange rule in the finite-dimensional case 141

4.2.4 Lagrange rule in Banach spaces 144

4.2.5 Second-order conditions 145

4.3 Equality and inequality constraints 147

4.3.1 Karush-Kuhn-Tucker conditions for affine constraints 147

4.3.2 Karush-Kuhn-Tucker conditions in the general case 150

4.4 Exercises 153

5 Problems Depending on a Parameter 161

5.1 Setting and examples 161

5.2 Continuous dependence 164

5.2.1 Notions of continuity for set-valued maps 164

5.2.2 Semicontinuity of values 167

5.3 Parameter-independent constraints, envelope theorems 169

5.3.1 Differentiability under local uniqueness 170

5.3.2 Non-smooth cases 172

5.3.3 The envelope theorem for suprema of convex functions 173

5.4 Parameter-dependent constraints 176

5.4.1 Smoothness of Lagrange points 176

5.4.2 Multipliers and the marginal price of constraints 181

5.5 Discrete-time dynamic programming 182

5.5.1 Finite horizon 182

5.5.2 Infinite horizon 183

5.6 Exercises 187

6 Convex Duality and Applications 195

6.1 Generalities 195

6.2 Convex duality with respect to a perturbation 196

6.2.1 Setting 196

6.2.2 A general duality result 198

6.3 Applications 200

6.3.1 The Fenchel-Rockafellar theorem 200

6.3.2 Linear programming 203

6.3.3 Semidefinite programming 205

6.3.4 Link with KKT and Lagrangian duality 209

6.4 On the optimal transport problem 212

6.4.1 Kantorovich duality 212

6.4.2 Characterization of optimal plans 217

6.4.3 Monge solutions 220

6.4.4 The discrete case 226

6.5 Exercises 234

7 Iterative Methods for Convex Minimization 243

7.1 On Newton's method 243

7.2 The gradient method 246

7.2.1 Convergence of iterates 248

7.2.2 Convergence of values 249

7.2.3 Nesterov acceleration 253

7.3 The proximal point method 255

7.4 Splitting methods 262

7.4.1 Forward-Backward splitting 262

7.4.2 Douglas-Rachford method 263

7.4.3 Link with augmented Lagrangian methods 268

7.5 Exercises 270

8 When Optimization and Data Meet 279

8.1 Principal component analysis 279

8.1.1 Singular value decomposition 280

8.1.2 Principal component analysis 282

8.2 Minimization for linear systems 287

8.2.1 Matrix operator norms and condition numbers 287

8.2.2 Least squares and linear regressions 290

8.2.3 Tikhonov regularization, the Moore-Penrose inverse 293

8.2.4 l1-penalization and sparse solutions 295

8.3 Classification 300

8.3.1 Logistic regression 301

8.3.2 Support-vector machines 303

8.4 Exercises 305

9 An Invitation to the Calculus of Variations 311

9.1 Preliminaries 311

9.1.1 On weak derivatives 313

9.1.2 Sobolev functions in dimension 1 315

9.1.3 Sobolev functions in higher dimensions 317

9.2 On integral functionals 319

9.2.1 Continuity, semicontinuity 319

9.2.2 The importance of being convex 320

9.2.3 Differentiability 323

9.3 The direct method 325

9.3.1 Obstructions to existence 326

9.3.2 Existence in the separable case 328

9.3.3 Relaxation 330

9.4 Euler-Lagrange equations and other necessary conditions 336

9.4.1 Euler-Lagrange equations 336

9.4.2 On existence of minimizers again 340

9.4.3 Examples 344

9.5 A focus on the case d = 1 348

9.5.1 Hamiltonian systems 348

9.5.2 Dynamic programming, Hamilton-Jacobi equations 350

9.5.3 A verification theorem 354

9.6 Exercises 355

Bibliography 363

Index 369
