# OUTER LINEARIZATION – CUTTING PLANE METHODS


## OUTER LINEARIZATION – CUTTING PLANE METHODS

Throughout this section, we consider the problem of minimizing a convex function $f: \Re^n \mapsto \Re$ over a closed convex set $X$. In the simplest cutting plane method, we start with a point $x_0 \in X$ and a subgradient $g_0 \in \partial f\left(x_0\right)$. At the typical iteration we solve the approximate problem
\begin{aligned} & \text { minimize } \quad F_k(x) \\ & \text { subject to } x \in X, \end{aligned}
where $f$ is replaced by a polyhedral approximation $F_k$, constructed using the points $x_0, \ldots, x_k$ generated so far, and associated subgradients $g_0, \ldots, g_k$, with $g_i \in \partial f\left(x_i\right)$ for all $i \leq k$. In particular, for $k=0,1, \ldots$, we define
$$F_k(x)=\max \left\{f\left(x_0\right)+\left(x-x_0\right)^{\prime} g_0, \ldots, f\left(x_k\right)+\left(x-x_k\right)^{\prime} g_k\right\},$$
and compute $x_{k+1}$ that minimizes $F_k(x)$ over $x \in X$,
$$x_{k+1} \in \arg \min _{x \in X} F_k(x);$$
see Fig. 4.1.1. We assume that the minimum of $F_k(x)$ above is attained for all $k$. For those $k$ for which this is not guaranteed (as may happen in the early iterations if $X$ is unbounded), artificial bounds may be placed on the components of $x$, so that the minimization will be carried out over a compact set and consequently the minimum will be attained by Weierstrass’ Theorem.
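As a concrete illustration (not from the text), the iteration can be sketched for the one-dimensional polyhedral instance $f(x)=|x-1|$ over $X=[-3,3]$, where the subproblem $\min_{x \in X} F_k(x)$ becomes a small linear program in the variables $(x, t)$: minimize $t$ subject to $f(x_i)+g_i(x-x_i) \le t$ for each cut. The instance, bounds, and tolerance below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def f(x):
    return abs(x - 1.0)

def subgrad(x):
    return np.sign(x - 1.0)  # a subgradient of |x - 1| (0 at the kink)

def cutting_plane(x0, lo=-3.0, hi=3.0, tol=1e-9, max_iter=50):
    """Minimize f over [lo, hi] by the cutting plane method.

    Each subproblem min_x F_k(x) is recast as the LP
        minimize t  subject to  f(x_i) + g_i (x - x_i) <= t,  lo <= x <= hi,
    in the variables (x, t)."""
    pts, grads = [x0], [subgrad(x0)]
    xk = x0
    for _ in range(max_iter):
        # One row per cut, rearranged as: g_i * x - t <= g_i * x_i - f(x_i)
        A = [[g, -1.0] for g in grads]
        b = [g * xi - f(xi) for xi, g in zip(pts, grads)]
        res = linprog(c=[0.0, 1.0], A_ub=A, b_ub=b,
                      bounds=[(lo, hi), (None, None)])
        xk, Fk = res.x
        if f(xk) - Fk <= tol:   # model agrees with f at xk: xk is optimal
            break
        pts.append(xk)
        grads.append(subgrad(xk))
    return xk

print(cutting_plane(-3.0))  # converges to the minimizer x = 1
```

Since $f$ here is polyhedral, the method terminates finitely: once the cuts generated reproduce $f$ at the current iterate, that iterate minimizes $f$ over $X$.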

## Partial Cutting Plane Methods

In some cases the cost function has the form
$$f(x)+c(x)$$
where $f: X \mapsto \Re$ and $c: X \mapsto \Re$ are convex functions, but one of them, say $c$, is convenient for optimization, e.g., is quadratic. It may then be preferable to use a piecewise linear approximation of $f$ only, while leaving $c$ unchanged. This leads to a partial cutting plane algorithm, involving solution of the problems
\begin{aligned} & \operatorname{minimize} \quad F_k(x)+c(x) \\ & \text { subject to } x \in X, \end{aligned}
where as before
$$F_k(x)=\max \left\{f\left(x_0\right)+\left(x-x_0\right)^{\prime} g_0, \ldots, f\left(x_k\right)+\left(x-x_k\right)^{\prime} g_k\right\}$$
with $g_j \in \partial f\left(x_j\right)$ for all $j$, and $x_{k+1}$ minimizes $F_k(x)$ over $x \in X$,
$$x_{k+1} \in \arg \min _{x \in X}\left\{F_k(x)+c(x)\right\} .$$
The convergence properties of this algorithm are similar to the ones shown earlier. In particular, if $f$ is polyhedral, the method terminates finitely, cf. Prop. 4.1.2. The idea of partial piecewise approximation can be generalized to the case of more than two cost function components and arises also in a few other contexts to be discussed later in Sections 4.4-4.6.
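The partial variant can be sketched on the same illustrative instance, now with the polyhedral term $f(x)=|x-1|$ replaced by cuts while a quadratic term $c(x)=x^2$ is kept exact in every subproblem. Each subproblem minimizes $t + c(x)$ subject to the cut constraints; here it is solved with SciPy's SLSQP solver. The instance and numerical settings are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):        # polyhedral part: replaced by cutting planes
    return abs(x - 1.0)

def subgrad(x):
    return np.sign(x - 1.0)

def c(x):        # "convenient" part: kept exact in every subproblem
    return x ** 2

def partial_cutting_plane(x0, lo=-3.0, hi=3.0, tol=1e-8, max_iter=50):
    """Minimize f(x) + c(x) over [lo, hi], linearizing only f.

    Subproblem in (x, t): minimize t + c(x) subject to the cuts
    f(x_i) + g_i (x - x_i) <= t."""
    pts, grads = [x0], [subgrad(x0)]
    xk = x0
    for _ in range(max_iter):
        cons = [{'type': 'ineq',
                 'fun': lambda z, xi=xi, gi=gi:
                     z[1] - (f(xi) + gi * (z[0] - xi))}
                for xi, gi in zip(pts, grads)]
        res = minimize(lambda z: z[1] + c(z[0]), x0=[xk, f(xk)],
                       bounds=[(lo, hi), (None, None)],
                       constraints=cons, method='SLSQP')
        xk, tk = res.x
        if f(xk) - tk <= tol:   # cuts reproduce f at xk: terminate
            break
        pts.append(xk)
        grads.append(subgrad(xk))
    return xk

print(partial_cutting_plane(-3.0))  # minimizer of |x-1| + x^2 is x = 0.5
```

Because only $f$ is approximated, the quadratic curvature of $c$ enters each subproblem exactly, and on this polyhedral $f$ the method again terminates finitely, consistent with Prop. 4.1.2.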
