Introduction to Variational Methods in Control Engineering focuses on the design of automatic controls. The monograph first discusses the application of the classical calculus of variations, including a generalization of the Euler-Lagrange equations, the limitations of classical variational calculus, and the solution of the control problem. The book also describes dynamic programming. Topics include the limitations of dynamic programming; the general formulation of dynamic programming; and its application to linear multivariable digital control systems. The text also covers the continuous form of dynamic programming; Pontryagin's principle; and the two-point boundary-value problem. The book then turns to inaccessible state variables. Topics include the optimum realizable control law; observed data and vector spaces; design of the optimum estimator; and extension to continuous systems. The book also presents a summary of potential applications, including complex control systems and on-line computer control. The text is recommended to readers and students wanting to explore the design of automatic controls.
"synopsis" may belong to another edition of this title.
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 132 pages. 8.50x5.50x0.30 inches. In Stock. Seller Inventory # zk0080135846
Quantity: 1 available