Optimal Control Theory: An Introduction

by Donald E. Kirk

Paperback

$35.00 


Overview

Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization.
Chapters 1 and 2 focus on describing systems and evaluating their performances. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of chapters 4 and 5, and chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.
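As a flavor of the dynamic-programming approach the book covers in chapter 3, here is a minimal sketch (not taken from the text; all parameter values are illustrative assumptions) that solves a discrete-time scalar linear-quadratic regulator problem by the backward cost-to-go recursion:

```python
# Backward dynamic-programming recursion for a scalar LQR problem:
#   x[k+1] = a*x[k] + b*u[k]
#   J = qf*x[N]^2 + sum_{k=0}^{N-1} (q*x[k]^2 + r*u[k]^2)
# The parameters a, b, q, r, qf, N below are illustrative, not from the book.

def lqr_backward(a, b, q, r, qf, N):
    """Return feedback gains K[0..N-1] via the scalar Riccati recursion."""
    P = qf                       # cost-to-go coefficient at the final stage
    gains = [0.0] * N
    for k in range(N - 1, -1, -1):
        K = a * b * P / (r + b * b * P)                    # minimizing gain at stage k
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
        gains[k] = K
    return gains

def simulate(x0, a, b, gains):
    """Apply the optimal policy u[k] = -K[k]*x[k]; return the state trajectory."""
    xs = [x0]
    for K in gains:
        u = -K * xs[-1]
        xs.append(a * xs[-1] + b * u)
    return xs

if __name__ == "__main__":
    gains = lqr_backward(a=1.1, b=1.0, q=1.0, r=1.0, qf=1.0, N=20)
    traj = simulate(x0=5.0, a=1.1, b=1.0, gains=gains)
    print(traj[-1])  # the unstable open-loop state is regulated toward zero
```

The backward sweep is the discrete analogue of the principle of optimality: the optimal cost-to-go at stage k is computed from the already-optimized cost-to-go at stage k+1.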


Product Details

ISBN-13: 9780486434841
Publisher: Dover Publications
Publication date: 04/30/2004
Series: Dover Books on Electrical Engineering
Pages: 480
Product dimensions: 5.50(w) x 8.50(h) x (d)

Table of Contents

I. Describing the System and Evaluating Its Performance.
1. Introduction.
2. The Performance Measure.
II. Dynamic Programming.
3. Dynamic Programming.
III. The Calculus of Variations and Pontryagin's Minimum Principle.
4. The Calculus of Variations.
5. The Variational Approach to Optimal Control Problems.
IV. Iterative Numerical Techniques for Finding Optimal Controls and Trajectories.
6. Numerical Determination of Optimal Trajectories.
V. Conclusion.
7. Summation.
Appendices.
Index.