Module 1: State Space Representation and Feedback Control

Module 2: Nonlinear System Analysis

Module 3: Stability Analysis Using Lyapunov Methods

Module 4: Optimal Control Theory

Fundamental theorem and boundary conditions
Constrained Minimization techniques
Dynamic Programming
Bellman's Principle of Optimality
Hamilton-Jacobi-Bellman Equation
Pontryagin’s Minimum Principle
Formulation of optimal control problems

Module 5: Adaptive Control

Calculus of Variations

Calculus of Variations and Optimal Control:

Basic Concepts

1. Function and Functional

2. Increment of a Function and a Functional

3. Differential of a Function

4. Variation of a Functional

5. Optimum Function and Functional

Function:

A variable x is a function of a variable quantity t (written as x(t) = f(t)) if to every value of t over a certain range of t there corresponds a value of x; that is, the correspondence assigns to each number t a number x.

Note: Here t need not be always time but any independent variable.

Example 1: x(t) = 2t^2 + 1; x(t) = 2t; x(t_1, t_2) = t_1^3 + t_2^2
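The example functions above can be sketched in code — a minimal illustration (the Python names x1, x2, x3 are my own, not from the notes), showing that a function simply maps each value of the independent variable(s) to a number:

```python
def x1(t):
    # x(t) = 2t^2 + 1
    return 2 * t**2 + 1

def x2(t):
    # x(t) = 2t
    return 2 * t

def x3(t1, t2):
    # x(t1, t2) = t1^3 + t2^2, a function of two independent variables
    return t1**3 + t2**2

print(x1(2))     # 2*(2^2) + 1 = 9
print(x2(3))     # 2*3 = 6
print(x3(2, 3))  # 2^3 + 3^2 = 17
```

Note that t here is any independent variable, not necessarily time, exactly as remarked above.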
