Proximal gradient method


Convex optimization is a subfield of optimization whose problems can be solved reliably and exactly. Many signal processing problems can be formulated as convex optimization problems of the form

\min_{x \in \mathbb{R}^N} \; f_1(x) + f_2(x) + \cdots + f_{n-1}(x) + f_n(x)

where f_1, f_2, …, f_n are convex functions from ℝ^N to ]−∞, +∞], some of which may be non-differentiable. This rules out conventional smooth optimization techniques such as the steepest descent method and the conjugate gradient method, but there is a specific class of algorithms that can solve the above optimization problem. These methods proceed by splitting, in that the functions f_1, …, f_n are used individually so as to yield an easily implementable algorithm. They are called proximal because each nonsmooth function among f_1, …, f_n is involved via its proximity operator. The iterative shrinkage-thresholding algorithm, projected Landweber, projected gradient, alternating projections, the alternating-direction method of multipliers, and alternating split Bregman are special instances of proximal algorithms. Details of proximal methods are discussed in Combettes et al.[1] For the theory of proximal gradient methods from the perspective of and with applications to statistical learning theory, see proximal gradient methods for learning.
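As an illustration of the splitting idea, the sketch below (an illustrative toy example, not from the original text) applies the iterative shrinkage-thresholding algorithm (ISTA) to a small lasso problem: the smooth least-squares term enters through a gradient step, and the nonsmooth ℓ1 term through its proximity operator, which is soft thresholding. The matrix A, data b, and weight lam are assumptions made for the example.

```python
# A minimal sketch (not from the original text) of a proximal splitting
# iteration: ISTA applied to the lasso problem
#   min_x 0.5*||A x - b||_2^2 + lam*||x||_1.
# The smooth term enters through a gradient step and the nonsmooth l1 term
# through its proximity operator (soft thresholding).
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1: elementwise soft thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)       # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = A @ np.array([3.0, 0.0])           # data generated by a sparse signal
x_hat = ista(A, b, lam=0.1)            # close to (2.95, 0)
```

Each iteration alternates a gradient step on the differentiable term with the proximity operator of the non-differentiable term, which is exactly the splitting pattern described above.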

Notations and Terminology

Let ℝ^N, the N-dimensional Euclidean space, be the domain of the function f : ℝ^N → ]−∞, +∞]. Suppose C is a non-empty convex subset of ℝ^N. Then the indicator function of C is defined as

i_C : x \mapsto \begin{cases} 0 & \text{if } x \in C \\ +\infty & \text{if } x \notin C \end{cases}
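In code, the indicator of a simple convex set, here the interval C = [lo, hi] (an illustrative choice, not from the original text), is a direct transcription of the definition:

```python
# Indicator function of the convex set C = [lo, hi]; the interval is an
# illustrative example, not part of the original text.
def indicator_interval(x, lo, hi):
    """Return 0 inside C and +infinity outside, as in the definition."""
    return 0.0 if lo <= x <= hi else float("inf")

inside = indicator_interval(0.5, 0.0, 1.0)   # -> 0.0
outside = indicator_interval(2.0, 0.0, 1.0)  # -> inf
```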

The p-norm ‖·‖_p is defined as

\|x\|_p = \left( |x_1|^p + |x_2|^p + \cdots + |x_N|^p \right)^{1/p}
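A quick numeric check of the formula (the vector and the choice p = 2 are illustrative assumptions) against numpy's built-in norm:

```python
# Check the p-norm formula for p = 2 on an illustrative vector.
import numpy as np

x = np.array([3.0, -4.0])
norm2 = (abs(x[0]) ** 2 + abs(x[1]) ** 2) ** (1 / 2)   # the formula, p = 2
# agrees with the built-in Euclidean norm; both equal 5.0
assert np.isclose(norm2, np.linalg.norm(x, ord=2))
```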

The distance from x ∈ ℝ^N to C is defined as

D_C(x) = \min_{y \in C} \|x - y\|

If C is closed and convex, the projection of x ∈ ℝ^N onto C is the unique point P_C x ∈ C such that D_C(x) = \|x - P_C x\|_2.
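For a concrete case, the projection onto a Euclidean ball has a closed form; the sketch below (center, radius, and test point are illustrative assumptions) also checks that the distance D_C(x) equals ‖x − P_C x‖_2:

```python
# Projection onto a closed convex ball; center, radius, and test point are
# illustrative choices, not from the original text.
import numpy as np

def project_ball(x, c, r):
    """Projection of x onto the closed ball C = {y : ||y - c||_2 <= r}."""
    d = x - c
    n = np.linalg.norm(d)
    if n <= r:
        return x.copy()            # x already lies in C
    return c + (r / n) * d         # unique closest point, on the boundary

x = np.array([3.0, 4.0])
p = project_ball(x, c=np.zeros(2), r=1.0)   # -> [0.6, 0.8]
dist = np.linalg.norm(x - p)                # D_C(x) = ||x - P_C x||_2 = 4.0
```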

The subdifferential of f is given by

\partial f : \mathbb{R}^N \to 2^{\mathbb{R}^N} : x \mapsto \left\{ u \in \mathbb{R}^N \;\middle|\; \forall y \in \mathbb{R}^N, \; (y - x)^{\mathsf{T}} u + f(x) \le f(y) \right\}
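For example, for the absolute-value function on ℝ (a standard one-dimensional illustration), the subdifferential coincides with the usual derivative away from zero and is an interval at the kink:

```latex
\partial |x| =
\begin{cases}
\{-1\}  & \text{if } x < 0,\\
[-1, 1] & \text{if } x = 0,\\
\{+1\}  & \text{if } x > 0.
\end{cases}
```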

Projection onto Convex Sets (POCS)

One of the most widely used convex optimization algorithms is POCS (projection onto convex sets). This algorithm is employed to recover or synthesize a signal that simultaneously satisfies several convex constraints. Let f_i be the indicator function of the non-empty closed convex set C_i modeling a constraint. This reduces the problem to a convex feasibility problem, which requires us to find a solution lying in the intersection of all the convex sets C_i. In the POCS method each set C_i is incorporated via its projection operator P_{C_i}, so in each iteration x is updated as

x_{k+1} = P_{C_1} P_{C_2} \cdots P_{C_n} x_k
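A minimal POCS sketch of the update above, using two closed convex sets, a coordinate box and a hyperplane; both sets and the starting point are illustrative assumptions:

```python
# POCS with two closed convex sets: the box [0,1]^2 and the hyperplane
# {y : a^T y = b}.  Both sets and the starting point are illustrative.
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box {y : lo <= y <= hi} (componentwise clipping)."""
    return np.clip(x, lo, hi)

def project_hyperplane(x, a, b):
    """Projection onto the hyperplane {y : a^T y = b}."""
    return x - (a @ x - b) / (a @ a) * a

a = np.array([1.0, 1.0])
b = 1.0
x = np.array([5.0, -3.0])              # arbitrary starting point
for _ in range(100):                   # x_{k+1} = P_C1 (P_C2 x_k)
    x = project_box(project_hyperplane(x, a, b), 0.0, 1.0)
# x now lies in the intersection: inside the box and on the hyperplane
```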

However, beyond such feasibility problems projection operators are no longer appropriate, and more general operators are required to tackle them. Among the various generalizations of the notion of a convex projection operator that exist, proximity operators are best suited for other purposes.

Definition

The proximity operator of a function f is defined as follows. For every x ∈ ℝ^N, the minimization problem

\min_{y \in \mathbb{R}^N} \; f(y) + \frac{1}{2} \|x - y\|_2^2

admits a unique solution, which is denoted by prox_f(x). This defines the operator

\operatorname{prox}_f : \mathbb{R}^N \to \mathbb{R}^N

The proximity operator of f is characterized by the inclusion

p = \operatorname{prox}_f(x) \;\Leftrightarrow\; x - p \in \partial f(p) \qquad (\forall (x, p) \in \mathbb{R}^N \times \mathbb{R}^N)

If f is differentiable, the above inclusion reduces to

p = \operatorname{prox}_f(x) \;\Leftrightarrow\; x - p = \nabla f(p) \qquad (\forall (x, p) \in \mathbb{R}^N \times \mathbb{R}^N)
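Both characterizations can be checked on simple scalar examples (the functions and evaluation points below are illustrative choices, not from the original text): for the smooth f(y) = ½y², the condition x − p = f′(p) = p forces prox_f(x) = x/2, while for the nonsmooth f(y) = λ|y| the minimization yields soft thresholding and x − p lands in λ∂|p|:

```python
# Scalar illustrations of the two characterizations above.  For the smooth
# f(y) = 0.5*y**2, x - p = f'(p) = p forces p = x/2.  For the nonsmooth
# f(y) = lam*|y|, minimizing lam*|y| + 0.5*(x - y)**2 gives soft
# thresholding, and x - p lies in the subdifferential lam*d|p|.
import numpy as np

def prox_half_square(x):
    """prox of f(y) = 0.5*y^2: smooth case, x - p = f'(p) gives p = x/2."""
    return x / 2.0

def prox_abs(x, lam):
    """prox of f(y) = lam*|y|: soft thresholding."""
    return np.sign(x) * max(abs(x) - lam, 0.0)

p1 = prox_half_square(4.0)     # -> 2.0, and x - p1 = f'(p1) = 2.0
p2 = prox_abs(3.0, lam=1.0)    # -> 2.0, and x - p2 = 1.0 = lam*sign(p2)
p3 = prox_abs(0.5, lam=1.0)    # -> 0.0, and x - p3 = 0.5 lies in [-lam, lam]
```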

Examples

Special instances of proximal gradient methods are

  • projected Landweber
  • projected gradient
  • iterative shrinkage-thresholding algorithm (ISTA)
  • alternating projections
  • alternating-direction method of multipliers
  • alternating split Bregman

See also

References

  • Combettes, Patrick L.; Pesquet, Jean-Christophe (2011). "Proximal splitting methods in signal processing". Fixed-Point Algorithms for Inverse Problems in Science and Engineering. Springer. pp. 185–212.

Notes

External links