This book introduces the core concepts and results of systems and control theory. Unique in its emphasis on foundational aspects, it takes a hybrid approach in which results are developed for time scales and state variables in both discrete and continuous form. It is suited primarily to advanced undergraduate students in mathematics or to graduate students, and is also appropriate for a second course in control engineering that goes well beyond texts covering the classical frequency-domain and state-space material. The choice of topics, together with detailed end-of-chapter bibliographic notes, makes this book an excellent reference for research.
The book is in English and is made available as a PDF by the author, Eduardo D. Sontag, on his website. The download link is below.
Eduardo D. Sontag, Mathematical Control Theory: Deterministic Finite Dimensional Systems.
Second Edition, Springer, New York, 1998. (site do autor: http://www.math.rutgers.edu/~sontag/mct.html)
Table of contents:
Introduction
What Is Mathematical Control Theory?
Proportional-Derivative Control
Digital Control
Feedback Versus Precomputed Control
State-Space and Spectrum Assignment
Outputs and Dynamic Feedback
Dealing with Nonlinearity
A Brief Historical Background
Some Topics Not Covered
Systems
Basic Definitions
I/O Behaviors
Discrete-Time
Linear Discrete-Time Systems
Smooth Discrete-Time Systems
Continuous-Time
Linear Continuous-Time Systems
Linearizations Compute Differentials
More on Differentiability
Sampling
Volterra Expansions
Notes and Comments
Reachability and Controllability
Basic Reachability Notions
Time-Invariant Systems
Controllable Pairs of Matrices
Controllability Under Sampling
More on Linear Controllability
Bounded Controls
First-Order Local Controllability
Controllability of Recurrent Nets
Piecewise Constant Controls
Notes and Comments
Nonlinear Controllability
Lie Brackets
Lie Algebras and Flows
Accessibility Rank Condition
Ad, Distributions, and Frobenius' Theorem
Necessity of Accessibility Rank Condition
Additional Problems
Notes and Comments
Feedback and Stabilization
Constant Linear Feedback
Feedback Equivalence
Feedback Linearization
Disturbance Rejection and Invariance
Stability and Other Asymptotic Notions
Unstable and Stable Modes
Lyapunov and Control-Lyapunov Functions
Linearization Principle for Stability
Introduction to Nonlinear Stabilization
Notes and Comments
Outputs
Basic Observability Notions
Time-Invariant Systems
Continuous-Time Linear Systems
Linearization Principle for Observability
Realization Theory for Linear Systems
Recursion and Partial Realization
Rationality and Realizability
Abstract Realization Theory
Notes and Comments
Observers and Dynamic Feedback
Observers and Detectability
Dynamic Feedback
External Stability for Linear Systems
Frequency-Domain Considerations
Parametrization of Stabilizers
Notes and Comments
Optimality: Value Function
Dynamic Programming
Linear Systems with Quadratic Cost
Tracking and Kalman Filtering
Infinite-Time (Steady-State) Problem
Nonlinear Stabilizing Optimal Controls
Notes and Comments
Optimality: Multipliers
Review of Smooth Dependence
Unconstrained Controls
Excursion into the Calculus of Variations
Gradient-Based Numerical Methods
Constrained Controls: Minimum Principle
Notes and Comments
Optimality: Minimum-Time for Linear Systems
Existence Results
Maximum Principle for Time-Optimality
Applications of the Maximum Principle
Remarks on the Maximum Principle
Additional Exercises
Notes and Comments
Appendix: Linear Algebra
Operator Norms
Singular Values
Jordan Forms and Matrix Functions
Continuity of Eigenvalues
Appendix: Differentials
Finite Dimensional Mappings
Maps Between Normed Spaces
Appendix: Ordinary Differential Equations
Review of Lebesgue Measure Theory
Initial-Value Problems
Existence and Uniqueness Theorem
Linear Differential Equations
Stability of Linear Equations
Bibliography
List of Symbols
Index