Convex Optimization | OUTER LINEARIZATION – CUTTING PLANE METHODS

Interest in convex optimization has grown because of its wide use in large-scale resource allocation, signal processing, and machine learning. This book aims at an up-to-date and accessible development of algorithms for the solution of convex optimization problems.

Unconstrained convex optimization can be solved readily by gradient descent (a special case of steepest descent) or by Newton's method, combined with a line search for an appropriate step size; both can be shown to converge quickly, especially the latter. [22] Convex problems with linear equality constraints can also be solved by KKT matrix techniques when the objective is quadratic (this generalizes to a variation of Newton's method that works even if the initial point does not satisfy the constraints), and they can often be handled as well by eliminating the equality constraints via linear algebra or by solving the dual problem.
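As a concrete illustration of the KKT-matrix technique just mentioned, the sketch below solves a small equality-constrained convex quadratic program by assembling and solving the KKT system directly; the data $P, q, A, b$ are made up for this example.

```python
import numpy as np

# Equality-constrained convex QP:
#   minimize (1/2) x' P x + q' x   subject to   A x = b.
# The optimality conditions form the linear KKT system
#   [ P  A' ] [ x  ]   [ -q ]
#   [ A  0  ] [ nu ] = [  b ],
# which we solve directly (illustrative data, not from the text).
P = np.array([[2.0, 0.0],
              [0.0, 2.0]])        # positive definite Hessian
q = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])        # one equality constraint: x1 + x2 = 1
b = np.array([1.0])

n, m = P.shape[0], A.shape[0]
K = np.block([[P, A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([-q, b])
sol = np.linalg.solve(K, rhs)
x_star, nu_star = sol[:n], sol[n:]
print(x_star)                     # -> [0. 1.], the constrained minimizer
```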

OUTER LINEARIZATION – CUTTING PLANE METHODS

Throughout this section, we consider the problem of minimizing a convex function $f: \Re^n \mapsto \Re$ over a closed convex set $X$. In the simplest cutting plane method, we start with a point $x_0 \in X$ and a subgradient $g_0 \in \partial f\left(x_0\right)$. At the typical iteration we solve the approximate problem
$$
\begin{aligned}
& \text { minimize } F_k(x) \\
& \text { subject to } x \in X,
\end{aligned}
$$
where $f$ is replaced by a polyhedral approximation $F_k$, constructed using the points $x_0, \ldots, x_k$ generated so far, and associated subgradients $g_0, \ldots, g_k$, with $g_i \in \partial f\left(x_i\right)$ for all $i \leq k$. In particular, for $k=0,1, \ldots$, we define
$$
F_k(x)=\max \left\{f\left(x_0\right)+\left(x-x_0\right)^{\prime} g_0, \ldots, f\left(x_k\right)+\left(x-x_k\right)^{\prime} g_k\right\},
$$
and compute $x_{k+1}$ that minimizes $F_k(x)$ over $x \in X$,
$$
x_{k+1} \in \arg \min _{x \in X} F_k(x)
$$
see Fig. 4.1.1. We assume that the minimum of $F_k(x)$ above is attained for all $k$. For those $k$ for which this is not guaranteed (as may happen in the early iterations if $X$ is unbounded), artificial bounds may be placed on the components of $x$, so that the minimization will be carried out over a compact set and consequently the minimum will be attained by Weierstrass’ Theorem.
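Since $F_k$ is a pointwise maximum of affine functions, each subproblem can be solved as a linear program in the variables $(x, t)$: minimize $t$ subject to $f\left(x_i\right)+\left(x-x_i\right)^{\prime} g_i \leq t$ for $i=0, \ldots, k$ and $x \in X$. Below is a minimal Python sketch of the method under this reformulation, assuming $X$ is a box (which doubles as the artificial bound discussed above) and using scipy.optimize.linprog for the subproblems; all function and parameter names are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def cutting_plane(f, subgrad, bounds, x0, max_iter=100, tol=1e-6):
    """Cutting plane method for minimizing a convex f over a box X.

    f       : callable, returns f(x)
    subgrad : callable, returns some g in the subdifferential of f at x
    bounds  : list of (lo, hi) pairs describing the box X
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    cuts = []                 # stores (x_i, f(x_i), g_i)
    best_upper = np.inf
    for _ in range(max_iter):
        fx = f(x)
        g = np.asarray(subgrad(x), dtype=float)
        best_upper = min(best_upper, fx)
        cuts.append((x.copy(), fx, g))

        # Epigraph LP in the variables (x, t):
        #   minimize t  s.t.  g_i' x - t <= g_i' x_i - f(x_i),  x in X.
        c = np.zeros(n + 1)
        c[-1] = 1.0
        A_ub = np.array([np.append(g_i, -1.0) for (_, _, g_i) in cuts])
        b_ub = np.array([g_i @ x_i - f_i for (x_i, f_i, g_i) in cuts])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=list(bounds) + [(None, None)])
        x, lower = res.x[:n], res.x[n]

        # min_x F_k(x) is a lower bound on the optimal value;
        # stop when it meets the best value of f found so far.
        if best_upper - lower <= tol:
            break
    return x, best_upper
```

For instance, calling `cutting_plane(lambda x: abs(x[0]) + abs(x[1]), lambda x: np.sign(x), [(-2, 2), (-2, 2)], [1.5, -1.0])` drives the iterates to the minimizer at the origin, and since this $f$ is polyhedral the method terminates after finitely many cuts.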

Partial Cutting Plane Methods

In some cases the cost function has the form
$$
f(x)+c(x)
$$
where $f: X \mapsto \Re$ and $c: X \mapsto \Re$ are convex functions, but one of them, say $c$, is convenient for optimization, e.g., is quadratic. It may then be preferable to use a piecewise linear approximation of $f$ only, while leaving $c$ unchanged. This leads to a partial cutting plane algorithm, involving solution of the problems
$$
\begin{aligned}
& \operatorname{minimize} \quad F_k(x)+c(x) \\
& \text { subject to } x \in X,
\end{aligned}
$$
where as before
$$
F_k(x)=\max \left\{f\left(x_0\right)+\left(x-x_0\right)^{\prime} g_0, \ldots, f\left(x_k\right)+\left(x-x_k\right)^{\prime} g_k\right\}
$$
with $g_j \in \partial f\left(x_j\right)$ for all $j$, and $x_{k+1}$ minimizes $F_k(x)+c(x)$ over $x \in X$,
$$
x_{k+1} \in \arg \min _{x \in X}\left\{F_k(x)+c(x)\right\}.
$$
The convergence properties of this algorithm are similar to the ones shown earlier. In particular, if $f$ is polyhedral, the method terminates finitely, cf. Prop. 4.1.2. The idea of partial piecewise approximation can be generalized to the case of more than two cost function components and arises also in a few other contexts to be discussed later in Sections 4.4-4.6.
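A sketch of a single subproblem of this partial scheme is given below, assuming $c$ is a convex quadratic $c(x)=\frac{1}{2} x^{\prime} Q x+q^{\prime} x$ with $Q$ positive semidefinite, $X$ is a box, and the modeling package cvxpy is available; the function and variable names are illustrative, not from the text.

```python
import numpy as np
import cvxpy as cp

def partial_cutting_plane_step(cuts, Q, q, lo, hi):
    """Solve  minimize F_k(x) + c(x)  over the box lo <= x <= hi,
    where c(x) = (1/2) x'Qx + q'x is kept exactly (Q assumed PSD) and
    F_k is the piecewise linear model of f built from `cuts`,
    a list of (x_i, f(x_i), g_i) triples collected so far."""
    n = len(q)
    x = cp.Variable(n)
    terms = [f_i + g_i @ (x - x_i) for (x_i, f_i, g_i) in cuts]
    F_k = terms[0] if len(terms) == 1 else cp.maximum(*terms)
    c_x = 0.5 * cp.quad_form(x, Q) + q @ x   # the "convenient" part, unchanged
    prob = cp.Problem(cp.Minimize(F_k + c_x), [x >= lo, x <= hi])
    prob.solve()
    return x.value                           # this is x_{k+1}
```

The outer loop is unchanged from the basic method: evaluate $f$ and a subgradient at the returned point, append the new cut, and repeat.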
