An optimization written in standard form can be transformed via Generalized Lagrangian Function.

For example, for an original minimization problem,

\min_{x} f(x) \quad s.t.\quad g(x) = 0, \quad k(x) \leq 0,

we can write the generalized Lagrangian function of the original problem as follows,

L(x, \lambda, \mu) = f(x) + \lambda g(x) + \mu k(x) \quad s.t.\quad \mu \geq 0,

where \lambda is unrestricted in sign (it multiplies the equality constraint) and \mu \geq 0 corresponds to the inequality constraint.
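As a concrete sketch, the Lagrangian can be evaluated directly for a toy problem. The problem below (f, g, k and the chosen point are all hypothetical, not from the text) minimizes x_1^2 + x_2^2 subject to x_1 + x_2 = 1 and -x_1 \leq 0:

```python
# Hypothetical toy problem: minimize f(x) = x1^2 + x2^2
# subject to g(x) = x1 + x2 - 1 = 0 and k(x) = -x1 <= 0.
def f(x): return x[0]**2 + x[1]**2
def g(x): return x[0] + x[1] - 1.0
def k(x): return -x[0]

def lagrangian(x, lam, mu):
    # L(x, lam, mu) = f(x) + lam*g(x) + mu*k(x), with mu >= 0
    return f(x) + lam * g(x) + mu * k(x)

# At a feasible point g(x) = 0, so with mu = 0 the Lagrangian reduces to f(x).
print(lagrangian((0.5, 0.5), lam=1.0, mu=0.0))  # -> 0.5
```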

The Lagrangian Transformation of the Original Problem

The original problem can be rewritten as the ‘min-max problem of the generalized Lagrangian function’: we first maximize L over the Lagrange multipliers, then minimize the result over x, as follows,

p^{*} = \min_{x} \max_{\lambda,\, \mu \geq 0} L(x, \lambda, \mu).

The above problem has exactly the same solution as the original one: if x violates either constraint, the inner maximization can drive L to +\infty, while for any feasible x the inner maximum is simply f(x).
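The reasoning above can be sketched in a few lines. The function below (a hypothetical helper, not from the text) computes the inner sup over the multipliers in closed form: unbounded when a constraint is violated, f(x) when feasible:

```python
# Sketch: the inner maximization over (lam, mu >= 0) recovers the constraints.
# If g(x) != 0, lam*g(x) is unbounded above (lam has free sign);
# if k(x) > 0, mu*k(x) is unbounded above as mu grows.
# At a feasible x, the best the multipliers can do is lam*g(x) = 0 and
# mu*k(x) = 0, so the sup equals f(x).
def sup_over_multipliers(f_x, g_x, k_x, tol=1e-12):
    if abs(g_x) > tol or k_x > tol:
        return float("inf")   # infeasible: multipliers blow L up without bound
    return f_x                # feasible: the sup is just f(x)

print(sup_over_multipliers(f_x=0.5, g_x=0.0, k_x=-0.5))  # -> 0.5 (feasible)
print(sup_over_multipliers(f_x=0.5, g_x=0.3, k_x=-0.5))  # -> inf (infeasible)
```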

The Dual Form

From the original problem, we can also write out the dual problem, in the form of the ‘max-min problem of the generalized Lagrangian function’, which first minimizes L over x and then maximizes the result over the multipliers,

d^{*} = \max_{\lambda,\, \mu \geq 0} \min_{x} L(x, \lambda, \mu).

However, this problem does not necessarily have the same solution as the original one. In general we have weak duality,

d^{*} = \max_{\lambda,\, \mu \geq 0} \min_{x} L(x, \lambda, \mu) \leq \min_{x} \max_{\lambda,\, \mu \geq 0} L(x, \lambda, \mu) = p^{*}.

From the inequality we see that the solution of the dual problem always provides a lower bound on the solution to the original problem, with a nonnegative duality gap p^{*} - d^{*} \geq 0. Strong duality, i.e. a zero duality gap p^{*} - d^{*} = 0, holds when the original problem is convex and satisfies Slater’s condition.
