
Caratheodory Theorem
Let $S$ be an arbitrary set in $\mathbb{R}^n$. If $x \in Co\left ( S \right )$, then there exist points $x_1,x_2,....,x_{n+1} \in S$ such that $x \in Co\left ( x_1,x_2,....,x_{n+1} \right )$, i.e., $x$ can be written as a convex combination of at most $n+1$ points of $S$.
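For instance, take $n=2$ and the four corners of the unit square, $x_1=\left ( 0,0 \right ), x_2=\left ( 1,0 \right ), x_3=\left ( 1,1 \right ), x_4=\left ( 0,1 \right )$ (an illustrative choice, not from the original text). The centre $x=\left ( \frac{1}{2},\frac{1}{2} \right )$ is the convex combination $\frac{1}{4}x_1+\frac{1}{4}x_2+\frac{1}{4}x_3+\frac{1}{4}x_4$ of four points, but it can be rewritten as $x=\frac{1}{2}x_1+\frac{1}{2}x_3$, a convex combination of at most $n+1=3$ of them, exactly as the theorem guarantees.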
Proof
Since $x \in Co\left ( S\right )$, $x$ can be represented as a convex combination of a finite number of points in $S$, i.e.,
$x=\displaystyle\sum\limits_{j=1}^k \lambda_jx_j,\displaystyle\sum\limits_{j=1}^k \lambda_j=1, \lambda_j \geq 0$ and $x_j \in S, \forall j \in \left \{ 1,2,....,k \right \}$
If $k \leq n+1$, the result holds trivially.
If $k > n+1$, then $\left ( x_2-x_1 \right ),\left ( x_3-x_1 \right ),....., \left ( x_k-x_1 \right )$ are $k-1>n$ vectors in $\mathbb{R}^n$ and are therefore linearly dependent.
$\Rightarrow \exists \mu _j \in \mathbb{R}, 2\leq j\leq k$ (not all zero) such that $\displaystyle\sum\limits_{j=2}^k \mu _j\left ( x_j-x_1 \right )=0$
Define $\mu_1=-\displaystyle\sum\limits_{j=2}^k \mu _j$, then $\displaystyle\sum\limits_{j=1}^k \mu_j x_j=0, \displaystyle\sum\limits_{j=1}^k \mu_j=0$
where not all the $\mu_j$'s are equal to zero. Since the $\mu_j$'s are not all zero and $\displaystyle\sum\limits_{j=1}^k \mu_j=0$, at least one $\mu_j > 0, 1 \leq j \leq k$.
Then, for any $\alpha \geq 0$,
$x=\displaystyle\sum\limits_{j=1}^k \lambda_j x_j+0$
$x=\displaystyle\sum\limits_{j=1}^k \lambda_j x_j- \alpha \displaystyle\sum\limits_{j=1}^k \mu_j x_j$ (using $\displaystyle\sum\limits_{j=1}^k \mu_j x_j=0$)
$x=\displaystyle\sum\limits_{j=1}^k\left ( \lambda_j- \alpha\mu_j \right )x_j$
Choose $\alpha$ such that $\alpha=\min\left \{ \frac{\lambda_j}{\mu_j}: \mu_j > 0 \right \}=\frac{\lambda_i}{\mu _i}$ for some $i \in \left \{ 1,2,...,k \right \}$
If $\mu_j\leq 0$, then $\lambda_j-\alpha \mu_j\geq \lambda_j\geq 0$, since $\alpha\geq 0$.
If $\mu_j> 0$, then $\frac{\lambda _j}{\mu_j}\geq \frac{\lambda_i}{\mu _i}=\alpha \Rightarrow \lambda_j-\alpha \mu_j\geq 0$. Hence $\lambda_j-\alpha \mu_j\geq 0$ for all $j=1,2,...,k$.
In particular, $\lambda_i-\alpha \mu_i=0$, by definition of $\alpha$
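As a quick numerical check of this choice (the weights below are an assumed example, not taken from the original text): suppose $k=3$, $\lambda=\left ( 0.5,0.3,0.2 \right )$ and $\mu=\left ( -1,0.5,0.5 \right )$, so $\displaystyle\sum\limits_{j=1}^3 \mu_j=0$. Then $\alpha=\min\left \{ \frac{0.3}{0.5},\frac{0.2}{0.5} \right \}=0.4$, and the new weights $\lambda_j-\alpha\mu_j$ are $\left ( 0.9,0.1,0 \right )$: still nonnegative, still summing to $1$, with one weight driven to zero.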
Hence $x=\displaystyle\sum\limits_{j=1}^k \left ( \lambda_j- \alpha\mu_j\right )x_j$, where
$\lambda_j- \alpha\mu_j\geq 0$, $\displaystyle\sum\limits_{j=1}^k\left ( \lambda_j- \alpha\mu_j\right )=\displaystyle\sum\limits_{j=1}^k \lambda_j-\alpha\displaystyle\sum\limits_{j=1}^k \mu_j=1-0=1$ and $\lambda_i- \alpha\mu_i=0$
Thus, $x$ can be represented as a convex combination of at most $\left ( k-1 \right )$ points of $S$.
This reduction process can be repeated until $x$ is represented as a convex combination of at most $\left ( n+1 \right )$ points of $S$, which proves the theorem.
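The proof above is constructive, so the reduction can also be carried out numerically. Below is a minimal sketch in Python, assuming NumPy is available; the function name caratheodory_reduce, the SVD-based null-space computation and the tolerance handling are illustrative choices, not part of the original text.

```python
import numpy as np

def caratheodory_reduce(points, weights, tol=1e-12):
    """Rewrite x = sum_j weights[j] * points[j] as a convex combination
    of at most n + 1 of the points, following the reduction in the proof.

    points  : (k, n) array of points x_1, ..., x_k in R^n
    weights : (k,) array with weights >= 0 and sum(weights) == 1
    """
    P = np.asarray(points, dtype=float)
    lam = np.asarray(weights, dtype=float)
    n = P.shape[1]
    while P.shape[0] > n + 1:
        k = P.shape[0]
        # k > n + 1, so the system  sum_j mu_j x_j = 0,  sum_j mu_j = 0
        # has a nonzero solution: take a null-space vector of A.
        A = np.vstack([P.T, np.ones(k)])            # (n + 1) x k matrix
        mu = np.linalg.svd(A)[2][-1]                # last right singular vector
        if mu.max() <= tol:                         # make sure some mu_j > 0
            mu = -mu
        pos = mu > tol
        alpha = np.min(lam[pos] / mu[pos])          # alpha = min lambda_j / mu_j over mu_j > 0
        lam = np.clip(lam - alpha * mu, 0.0, None)  # new weights lambda_j - alpha * mu_j >= 0
        keep = lam > tol                            # at least one weight vanishes
        P, lam = P[keep], lam[keep]
        lam = lam / lam.sum()                       # renormalize against round-off
    return P, lam

# Example: the centre of the unit square, written with all four corners.
corners = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
w = np.array([0.25, 0.25, 0.25, 0.25])
pts, lam = caratheodory_reduce(corners, w)
print(pts, lam)  # at most n + 1 = 3 points reproducing the same centre
```

On the square example this returns the centre as a convex combination of just two of the corners, matching the illustration given after the theorem statement.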