Linear quadratic control

In control engineering and systems and control theory, linear quadratic control or LQ control refers to controller design for a deterministic linear plant based on the minimization of a quadratic cost functional. The method rests on the state-space formalism and is a fundamental concept in linear systems and control theory.
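For reference, a minimal sketch of the plant model in the state-space formalism (standard notation, not taken verbatim from this article; x denotes the state, u the control input, and A, B the system and input matrices):

```latex
% Continuous-time linear plant in state-space form
\dot{x}(t) = A\,x(t) + B\,u(t)

% Discrete-time counterpart
x_{k+1} = A\,x_k + B\,u_k
```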
There are two main versions of the method, depending on the setting of the control problem; the corresponding cost functionals are sketched after this list:
- Linear quadratic control in discrete time
- Linear quadratic control in continuous time
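As an illustration of the two settings, the quadratic cost functionals typically take the following finite-horizon forms (a sketch assuming standard weighting matrices Q ⪰ 0 and R ≻ 0 and terminal weights Q_N, Q_T; the exact symbols are not fixed by this article):

```latex
% Discrete-time quadratic cost over a horizon of N steps
J = \sum_{k=0}^{N-1} \left( x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k \right) + x_N^{\top} Q_N\, x_N

% Continuous-time quadratic cost over the interval [0, T]
J = \int_{0}^{T} \left( x(t)^{\top} Q\, x(t) + u(t)^{\top} R\, u(t) \right) dt + x(T)^{\top} Q_T\, x(T)
```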
The objective of LQ control is to find a control signal that minimizes a prescribed quadratic cost functional. In the so-called regulation problem, this functional can be viewed as an abstraction of the "energy" of the overall control system, and minimization of the functional corresponds to minimization of that energy.
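As a minimal numerical sketch of the infinite-horizon continuous-time case (the plant matrices and weights below are illustrative assumptions, not values from this article), the minimizing control signal can be obtained by solving the algebraic Riccati equation, for example with SciPy:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative plant x_dot = A x + B u (a double integrator) and quadratic weights;
# these particular values are assumptions for the example, not from the article.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting (positive semidefinite)
R = np.array([[1.0]])  # input weighting (positive definite)

# Solve the continuous-time algebraic Riccati equation A'P + P A - P B R^{-1} B'P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# The minimizing control signal is the state feedback u = -K x with gain K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

print("Riccati solution P:\n", P)
print("LQ regulator gain K:\n", K)
```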