Convex optimization

Convex optimization is a subfield of mathematical optimization. Given a real vector space $X$ together with a convex, real-valued function $f: \mathcal{X} \to \mathbb{R}$

defined on a convex subset $\mathcal{X}$ of $X$, the problem is to find a point $x^*$ in $\mathcal{X}$ at which $f$ attains its smallest value, i.e., a point $x^*$ such that $f(x^*) \le f(x)$ for all $x \in \mathcal{X}$.

The convexity of $\mathcal{X}$ and $f$ makes the powerful tools of convex analysis applicable: the Hahn–Banach theorem and the theory of subgradients lead to a particularly satisfying and complete theory of necessary and sufficient conditions for optimality, a duality theory comparable in perfection to that for linear programming, and effective computational methods. Convex optimization has applications in a wide range of disciplines, such as automatic control systems, estimation and signal processing, communications and networks, electronic circuit design, data analysis and modeling, statistics, and finance. Modern computing power has improved the tractability of convex optimization problems to a level roughly equal to that of linear programming.

Theory

The following statements hold for a convex optimization problem:

• if a local minimum exists, then it is a global minimum
• the set of all (global) minima is convex
• if the function is strictly convex, then there exists at most one minimum

The theoretical framework for convex optimization uses these facts in conjunction with notions from convex analysis such as the Hilbert projection theorem, the separating hyperplane theorem, and Farkas's lemma.
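The first property above can be illustrated numerically: for a strictly convex function, any descent procedure ends up at the same global minimizer regardless of where it starts. The following is a minimal sketch in pure Python, using plain gradient descent on the illustrative function $f(x) = (x - 2)^2$ (the function and step size are assumptions made for the example, not taken from the article).

```python
# For a strictly convex function such as f(x) = (x - 2)**2, gradient descent
# converges to the same (unique, global) minimizer from any starting point,
# illustrating "a local minimum is a global minimum".

def gradient_descent(grad, x0, step=0.1, iters=200):
    """Plain gradient descent with a fixed step size."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

grad_f = lambda x: 2.0 * (x - 2.0)   # gradient of f(x) = (x - 2)**2

x_from_left = gradient_descent(grad_f, x0=-10.0)
x_from_right = gradient_descent(grad_f, x0=15.0)
# Both runs land at the unique minimizer x* = 2.
```

Starting points on opposite sides of the minimizer make the point: there is no other local minimum for the iteration to get stuck in.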

Standard form

Standard form is the usual and most intuitive form of describing a convex optimization problem. It consists of the following three parts:

• A convex function $f_0(x): \mathbb{R}^n \to \mathbb{R}$ to be minimized over the variable $x$
• Inequality constraints of the form $f_i(x) \leq 0$, where the functions $f_i$ are convex
• Equality constraints of the form $h_i(x) = 0$, where the functions $h_i$ are affine. In practice, the terms "linear" and "affine" are used interchangeably, and such constraints are most often expressed in the form $Ax = b$, where $A$ is a matrix and $b$ is a vector.

A convex optimization problem is thus written as

minimize $f_0(x)$
subject to $f_i(x) \leq 0, \quad i = 1, \dots, m$
$h_i(x) = 0, \quad i = 1, \dots, p$
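The three parts of the standard form can be represented directly as callables, which makes feasibility of a candidate point easy to check. The sketch below uses an assumed toy problem (a convex quadratic objective, one convex inequality, one affine equality) purely for illustration.

```python
# Standard-form ingredients as plain Python callables (toy problem, assumed):
#   minimize f0(x)  subject to  f_1(x) <= 0  and  h_1(x) = 0.
f0 = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2   # convex objective
ineqs = [lambda x: x[0] + x[1] - 4.0]                  # f_1(x) <= 0 (convex)
eqs = [lambda x: x[0] - x[1]]                          # h_1(x) = 0 (affine)

def is_feasible(x, ineqs, eqs, tol=1e-9):
    """A point is feasible if every f_i(x) <= 0 and every h_i(x) = 0."""
    return (all(g(x) <= tol for g in ineqs)
            and all(abs(h(x)) <= tol for h in eqs))

ok = is_feasible([1.5, 1.5], ineqs, eqs)      # satisfies both constraints
bad = is_feasible([3.0, 3.0], ineqs, eqs)     # violates x0 + x1 <= 4
```

The feasible region here, being an intersection of a convex sublevel set and an affine set, is itself convex, which is what makes the standard form tractable.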

Examples

The following problems are all convex optimization problems, or can be transformed into convex optimization problems via a change of variables:

• Least squares
• Linear programming
• Conic optimization
• Geometric programming
• Second-order cone programming
• Semidefinite programming
• Quadratically constrained quadratic programming
• Entropy maximization
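Least squares is the simplest of these: minimizing $\|Ax - b\|^2$ is a convex quadratic problem with a closed-form solution. As a minimal sketch, the one-variable line fit $y \approx c_0 + c_1 t$ reduces to a 2×2 system of normal equations, solved here by hand (the data are an assumed example lying exactly on $y = 1 + 2t$).

```python
# Least squares via the normal equations for a best-fit line c0 + c1 * t.
# Minimizing sum((c0 + c1*t - y)**2) is convex in (c0, c1), so setting the
# gradient to zero gives the unique global minimizer.

def fit_line(ts, ys):
    """Solve the 2x2 normal equations for the least-squares line."""
    n = len(ts)
    st, sy = sum(ts), sum(ys)
    stt = sum(t * t for t in ts)
    sty = sum(t * y for t, y in zip(ts, ys))
    det = n * stt - st * st
    c1 = (n * sty - st * sy) / det
    c0 = (sy - c1 * st) / n
    return c0, c1

# Assumed example data lying exactly on y = 1 + 2t.
c0, c1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Because the objective is strictly convex whenever the $t$ values are not all equal, the recovered coefficients are the unique minimizer.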

In a significant example, the set $\mathcal{X}$ is defined by a system of inequalities involving $m$ convex functions $g_1, g_2, \ldots, g_m$ defined on $X$, like this:

$\mathcal{X} = \left\lbrace x \in X \,\vert\, g_1(x) \le 0, \ldots, g_m(x) \le 0 \right\rbrace.$

The Lagrange function for the problem is

$L(x, \lambda_0, \ldots, \lambda_m) = \lambda_0 f(x) + \lambda_1 g_1(x) + \cdots + \lambda_m g_m(x).$

For each point $x$ in $\mathcal{X}$ that minimizes $f$ over $\mathcal{X}$, there exist real numbers $\lambda_0, \ldots, \lambda_m$, called Lagrange multipliers, that satisfy these conditions simultaneously:

1. $x$ minimizes $L(y, \lambda_0, \lambda_1, \ldots, \lambda_m)$ over all $y$ in $X$,
2. $\lambda_0 \ge 0, \lambda_1 \ge 0, \ldots, \lambda_m \ge 0$, with at least one $\lambda_k > 0$,
3. $\lambda_1 g_1(x) = 0, \ldots, \lambda_m g_m(x) = 0$

If there exists a "strictly feasible point", i.e., a point $z$ satisfying

$g_1(z) < 0, \ldots, g_m(z) < 0,$

then the statement above can be strengthened to assert that $\lambda_0 = 1$.

Conversely, if some $x$ in $\mathcal{X}$ satisfies conditions 1)–3) for scalars $\lambda_0, \ldots, \lambda_m$ with $\lambda_0 = 1$, then $x$ is certain to minimize $f$ over $\mathcal{X}$.
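Conditions 1)–3) can be checked numerically for a concrete problem. The sketch below uses the assumed example: minimize $f(x) = x^2$ subject to $g_1(x) = 1 - x \le 0$ (i.e. $x \ge 1$). Since $z = 2$ is strictly feasible, we may take $\lambda_0 = 1$; the minimizer is $x^* = 1$ with multiplier $\lambda_1 = 2$ (these concrete values are worked out for the example, not taken from the article).

```python
# Verify conditions 1)-3) with lambda_0 = 1 for the assumed problem
#   minimize x**2  subject to  1 - x <= 0.
f = lambda x: x ** 2
g1 = lambda x: 1.0 - x
lam0, lam1 = 1.0, 2.0              # multipliers for the minimizer x* = 1

L = lambda y: lam0 * f(y) + lam1 * g1(y)   # Lagrange function; here (y-1)**2 + 1

x_star = 1.0

# 1) x* minimizes L over a grid of candidate points y.
grid = [x_star + 0.01 * k for k in range(-200, 201)]
cond1 = all(L(x_star) <= L(y) + 1e-12 for y in grid)

# 2) Nonnegative multipliers, not all zero.
cond2 = lam0 >= 0 and lam1 >= 0 and (lam0 > 0 or lam1 > 0)

# 3) Complementary slackness: lambda_1 * g_1(x*) = 0 (the constraint is active).
cond3 = abs(lam1 * g1(x_star)) < 1e-12

# Strict feasibility of z = 2 is what licenses lambda_0 = 1 here.
slater = g1(2.0) < 0.0
```

Algebraically, $L(y) = y^2 + 2(1 - y) = (y - 1)^2 + 1$, so condition 1) holds exactly at $y = 1$, matching the grid check.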

Methods

The following methods are used in solving convex optimization problems:

• The ellipsoid method
• Subgradient methods
• Cutting-plane methods
• Interior-point methods (also referred to as barrier methods)
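Of these, the subgradient method is the simplest to sketch: it applies even when the objective is convex but not differentiable, replacing the gradient with any subgradient and using a diminishing step size. The function $f(x) = |x - 1|$ and the step rule $a_k = 1/(k+1)$ below are assumptions made for the example; because the method is not a descent method, the best iterate seen so far is tracked.

```python
# Subgradient method for the nondifferentiable convex function f(x) = |x - 1|,
# with diminishing steps a_k = 1/(k+1) and tracking of the best iterate.

f = lambda x: abs(x - 1.0)

def subgradient(x):
    """A valid subgradient of f(x) = |x - 1| at x."""
    if x > 1.0:
        return 1.0
    if x < 1.0:
        return -1.0
    return 0.0   # at the kink, any value in [-1, 1] is a subgradient

x = 3.0          # assumed starting point
best = x
for k in range(2000):
    x -= subgradient(x) / (k + 1.0)   # diminishing step size
    if f(x) < f(best):
        best = x
# The best iterate approaches the minimizer x* = 1.
```

The iterates oscillate around the kink at $x = 1$ with shrinking amplitude, which is why tracking the best point found, rather than the last one, is the standard convention for this method.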

Software

Although most general-purpose nonlinear optimization solvers, such as LSSOL, LOQO, MINOS, and Lancelot, work well, many software packages dealing exclusively with convex optimization problems are also available.
