Mon Jun 5 16:42:55 EDT 2006


"This book will be very useful for mathematics and engineering students
interested in a modern and rigorous systems course, as well as for experts
in control theory and applications." --- Mathematical Reviews

"An excellent book... gives a thorough and mathematically rigorous
treatment of control and system theory" --- Zentralblatt für Mathematik

"The style is mathematically precise... fills an important niche... serves as
an excellent bridge (to topics treated in traditional engineering courses).
The book succeeds in conveying the important basic ideas of mathematical
control theory, at an appropriate level and in an appropriate style"
                                  --- IEEE Transactions on Automatic Control

Chapter and Section Headings:

Introduction
 What Is Mathematical Control Theory?
 Proportional-Derivative Control
 Digital Control
 Feedback Versus Precomputed Control
 State-Space and Spectrum Assignment
 Outputs and Dynamic Feedback
 Dealing with Nonlinearity
 A Brief Historical Background
 Some Topics Not Covered
Systems
 Basic Definitions
 I/O Behaviors
 Discrete-Time
 Linear Discrete-Time Systems
 Smooth Discrete-Time Systems
 Continuous-Time
 Linear Continuous-Time Systems
 Linearizations Compute Differentials
 More on Differentiability
 Sampling
 Volterra Expansions
 Notes and Comments
Reachability and Controllability
 Basic Reachability Notions
 Time-Invariant Systems
 Controllable Pairs of Matrices
 Controllability Under Sampling
 More on Linear Controllability
 Bounded Controls
 First-Order Local Controllability
 Controllability of Recurrent Nets
 Piecewise Constant Controls
 Notes and Comments
Nonlinear Controllability
 Lie Brackets
 Lie Algebras and Flows
 Accessibility Rank Condition
 Ad, Distributions, and Frobenius' Theorem
 Necessity of Accessibility Rank Condition
 Additional Problems
 Notes and Comments
Feedback and Stabilization
 Constant Linear Feedback
 Feedback Equivalence
 Feedback Linearization
 Disturbance Rejection and Invariance
 Stability and Other Asymptotic Notions
 Unstable and Stable Modes
 Lyapunov and Control-Lyapunov Functions
 Linearization Principle for Stability
 Introduction to Nonlinear Stabilization
 Notes and Comments
Outputs
 Basic Observability Notions
 Time-Invariant Systems
 Continuous-Time Linear Systems
 Linearization Principle for Observability
 Realization Theory for Linear Systems
 Recursion and Partial Realization
 Rationality and Realizability
 Abstract Realization Theory
 Notes and Comments
Observers and Dynamic Feedback
 Observers and Detectability
 Dynamic Feedback
 External Stability for Linear Systems
 Frequency-Domain Considerations
 Parametrization of Stabilizers
 Notes and Comments
Optimality: Value Function
 Dynamic Programming
 Linear Systems with Quadratic Cost
 Tracking and Kalman Filtering
 Infinite-Time (Steady-State) Problem
 Nonlinear Stabilizing Optimal Controls
 Notes and Comments
Optimality: Multipliers
 Review of Smooth Dependence
 Unconstrained Controls
 Excursion into the Calculus of Variations
 Gradient-Based Numerical Methods
 Constrained Controls: Minimum Principle
 Notes and Comments
Optimality: Minimum-Time for Linear Systems
 Existence Results
 Maximum Principle for Time-Optimality
 Applications of the Maximum Principle
 Remarks on the Maximum Principle
 Additional Exercises
 Notes and Comments
Appendix: Linear Algebra
 Operator Norms
 Singular Values
 Jordan Forms and Matrix Functions
 Continuity of Eigenvalues
Appendix: Differentials
 Finite Dimensional Mappings
 Maps Between Normed Spaces
Appendix: Ordinary Differential Equations
 Review of Lebesgue Measure Theory
 Initial-Value Problems
 Existence and Uniqueness Theorem
 Linear Differential Equations
 Stability of Linear Equations
Bibliography
List of Symbols


More information about the Connectionists mailing list