**Convex optimization** is a subfield of mathematical optimization. Given a real vector space *X* together with a convex, real-valued function

*f* : *C* → **R**

defined on a convex subset *C* of *X*, the problem is to find a point *x*^{*} in *C* for which *f*(*x*^{*}) is smallest, i.e., a point *x*^{*} such that *f*(*x*^{*}) ≤ *f*(*x*) for all *x* in *C*.
The convexity of *C* and *f* makes the powerful tools of convex analysis applicable: the Hahn-Banach theorem and the theory of subgradients lead to a particularly satisfying and complete theory of necessary and sufficient conditions for optimality, a duality theory comparable in perfection to that for linear programming, and effective computational methods. Convex optimization has applications in a wide range of disciplines, such as automatic control systems, estimation and signal processing, communications and networks, electronic circuit design, data analysis and modeling, statistics, and finance. Modern computing power has improved the tractability of convex optimization problems to a level roughly equal to that of linear programming.
## Theory

The following statements are true about the convex optimization problem:

- if a local minimum exists, then it is a global minimum;
- the set of all (global) minima is convex;
- if the function is strictly convex, then there exists at most one minimum.
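These properties can be checked numerically on a small strictly convex function: a purely local search, started from very different points, always reaches the same global minimizer. The sketch below uses SciPy's general-purpose local optimizer; the objective and starting points are illustrative choices, not from the source.

```python
import numpy as np
from scipy.optimize import minimize

# A strictly convex function: f(x, y) = (x - 1)^2 + 2*(y + 2)^2.
# Its unique global minimum is at (1, -2).
def f(v):
    x, y = v
    return (x - 1.0) ** 2 + 2.0 * (y + 2.0) ** 2

# Run a local optimizer from several different starting points.
starts = [np.array([10.0, 10.0]), np.array([-5.0, 7.0]), np.array([0.0, 0.0])]
solutions = [minimize(f, x0).x for x0 in starts]

# Convexity guarantees every local minimum is global, so all runs agree:
for sol in solutions:
    print(np.round(sol, 4))  # each is approximately [1, -2]
```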
The theoretical framework for convex optimization uses these facts in conjunction with notions from convex analysis such as the Hilbert projection theorem, the separating hyperplane theorem, and Farkas's lemma.
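The Hilbert projection theorem can be made concrete: for a closed convex set such as a box, the nearest point to any given point exists, is unique, and here has a simple closed form (componentwise clipping). A minimal sketch; the set and test point are illustrative.

```python
import numpy as np

def project_onto_box(p, lo, hi):
    """Nearest point (in Euclidean norm) of the box {x : lo <= x <= hi} to p.
    For a box, the unique projection is componentwise clipping."""
    return np.clip(p, lo, hi)

p = np.array([2.0, -0.5, 0.3])
proj = project_onto_box(p, 0.0, 1.0)   # proj == [1.0, 0.0, 0.3]

# Sanity check: the projection is no farther from p than any sampled
# point of the box, consistent with it being the minimizer of ||x - p||.
rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=(1000, 3))
dists = np.linalg.norm(samples - p, axis=1)
assert np.linalg.norm(proj - p) <= dists.min() + 1e-12
```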
## Standard form

*Standard form* is the usual and most intuitive form of describing a convex optimization problem. It consists of the following three parts:

- A **convex function** *f*(*x*) to be minimized over the variable *x*;
- **Inequality constraints** of the form *g*_{i}(*x*) ≤ 0, where the functions *g*_{i} are convex;
- **Equality constraints** of the form *h*_{i}(*x*) = 0, where the functions *h*_{i} are affine. In practice, the terms "linear" and "affine" are generally used interchangeably, and such constraints are most often expressed in the form *Ax* = *b*, where *A* is a matrix and *b* is a vector.

A convex optimization problem is thus written as

minimize *f*(*x*)
subject to *g*_{i}(*x*) ≤ 0, *i* = 1, ..., *m*
*h*_{i}(*x*) = 0, *i* = 1, ..., *p*.

## Examples
*Hierarchical representation of common convex optimization problems.*

The following problems are all convex optimization problems, or can be transformed into convex optimization problems via a change of variables:

- least squares
- linear programming
- conic optimization
- geometric programming
- second-order cone programming
- semidefinite programming
- quadratically constrained quadratic programming
- entropy maximization

In a significant example, the constraint set *A* is defined by a system of inequalities involving *m* convex functions *g*_{1}, *g*_{2}, ..., *g*_{m} defined on *X*, like this:

*A* = { *x* in *X* : *g*_{1}(*x*) ≤ 0, ..., *g*_{m}(*x*) ≤ 0 }.
The Lagrange function for the problem is

*L*(*x*, *λ*_{0}, ..., *λ*_{m}) = *λ*_{0}*f*(*x*) + *λ*_{1}*g*_{1}(*x*) + ... + *λ*_{m}*g*_{m}(*x*).

For each point *x* in *A* that minimizes *f* over *A*, there exist real numbers *λ*_{0}, ..., *λ*_{m}, called Lagrange multipliers, that satisfy these conditions simultaneously:

1. *x* minimizes *L*(*y*, *λ*_{0}, *λ*_{1}, ..., *λ*_{m}) over all *y* in *X*;
2. *λ*_{0} ≥ 0, *λ*_{1} ≥ 0, ..., *λ*_{m} ≥ 0, with at least one *λ*_{k} > 0;
3. *λ*_{1}*g*_{1}(*x*) = 0, ..., *λ*_{m}*g*_{m}(*x*) = 0 (complementary slackness).

If there exists a "strictly feasible point", i.e., a point *z* satisfying *g*_{1}(*z*) < 0, ..., *g*_{m}(*z*) < 0, then the statement above can be strengthened to assert that *λ*_{0} = 1. Conversely, if some *x* in *A* satisfies conditions 1)-3) for scalars *λ*_{0}, ..., *λ*_{m} with *λ*_{0} = 1, then *x* is certain to minimize *f* over *A*.
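As a small worked instance of conditions 1)-3), take *f*(*x*) = *x*² minimized over *A* = {*x* : 1 − *x* ≤ 0}. The point *z* = 2 is strictly feasible, so *λ*_{0} = 1; the minimizer is *x* = 1 with multiplier *λ*_{1} = 2, since *L*(*y*, 1, 2) = *y*² + 2(1 − *y*) = (*y* − 1)² + 1 is minimized at *y* = 1. The sketch below verifies each condition numerically; the example itself is illustrative, not from the source.

```python
import numpy as np

f  = lambda x: x ** 2        # convex objective
g1 = lambda x: 1.0 - x       # convex inequality constraint g1(x) <= 0

x_star, lam0, lam1 = 1.0, 1.0, 2.0   # claimed minimizer and multipliers

# Lagrange function L(y) = lam0*f(y) + lam1*g1(y) = (y - 1)^2 + 1
L = lambda y: lam0 * f(y) + lam1 * g1(y)

# 1) x_star minimizes L over all y (checked on a fine grid):
ys = np.linspace(-10.0, 10.0, 200001)
assert L(x_star) <= L(ys).min() + 1e-9

# 2) multipliers are nonnegative, with at least one positive:
assert lam0 >= 0 and lam1 >= 0 and (lam0 > 0 or lam1 > 0)

# 3) complementary slackness: lam1 * g1(x_star) = 0
assert lam1 * g1(x_star) == 0.0

print("all three Lagrange conditions hold at x* = 1")
```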
## Methods

The following methods are used in solving convex optimization problems:

- the ellipsoid method;
- subgradient methods;
- cutting-plane methods;
- interior point methods (also referred to as barrier methods).
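Of the methods above, the subgradient method is the simplest to sketch: at each step, move against any subgradient of the objective, with diminishing step sizes; unlike gradient descent, it handles nondifferentiable convex objectives. A minimal illustration on *f*(*x*) = |*x* − 3|; the objective, starting point, and step-size rule are illustrative choices.

```python
def subgradient_method(subgrad, x0, n_iters=5000):
    """Minimize a convex function given an oracle that returns one
    subgradient at any point, using the classic diminishing step
    sizes a_k = 1 / (k + 1)."""
    x = x0
    for k in range(n_iters):
        x = x - (1.0 / (k + 1)) * subgrad(x)
    return x

# f(x) = |x - 3| is convex but not differentiable at its minimizer x = 3.
# One valid subgradient: sign(x - 3) away from 3, and 0 at x = 3.
def subgrad_abs(x):
    if x > 3.0:
        return 1.0
    if x < 3.0:
        return -1.0
    return 0.0

x_min = subgradient_method(subgrad_abs, x0=0.0)
print(round(x_min, 2))  # close to 3.0
```

With this step-size rule the iterates eventually oscillate around the minimizer with shrinking amplitude, which is why a diminishing (rather than constant) step size is used.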
## Software

Although most general-purpose nonlinear optimization solvers, such as LSSOL, LOQO, MINOS, and Lancelot, work well, many software packages dealing exclusively with convex optimization problems are also available:
### Convex programming languages

### Convex optimization solvers

## References

- Rockafellar, R. T. (1970). *Convex Analysis*. Princeton: Princeton University Press.
- Luenberger, David (1984). *Linear and Nonlinear Programming*. Addison-Wesley.
- Luenberger, David (1969). *Optimization by Vector Space Methods*. Wiley & Sons.
- Bertsekas, Dimitri (2003). *Convex Analysis and Optimization*. Athena Scientific.
- Hiriart-Urruty, Jean-Baptiste; Lemaréchal, Claude (2004). *Fundamentals of Convex Analysis*. Berlin: Springer.
- Borwein, Jonathan; Lewis, Adrian (2000). *Convex Analysis and Nonlinear Optimization*. Springer.
- Berkovitz, Leonard (2001). *Convexity and Optimization in R^n*. John Wiley & Sons.
- Ruszczynski, Andrzej (2006). *Nonlinear Optimization*. Princeton University Press.
- Zukhovitsky, S. I.; Avdeeva, L. I. (1966). *Linear and Convex Programming*. Saunders.