This article belongs to the area of optimization; its subject is Newton's method (Newton Method). Newton's method is, in fact, very well known in optimization, and most optimization textbooks introduce it very early on. It enjoys this treatment because it is a second-order algorithm: typically a handful of iterations suffice for full convergence. However, it requires second-order information (i.e., the Hessian matrix), and moreover one must compute with the Hessian ma...
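To make the description above concrete, here is a minimal sketch of the classical Newton iteration, in which each step solves a linear system with the Hessian. The test function and all names are illustrative assumptions, not from the excerpt.

```python
import numpy as np

def newton_step(grad, hess, x):
    """One pure Newton step: x_new = x - hess(x)^{-1} grad(x),
    computed by solving the linear system hess(x) d = grad(x)."""
    g = grad(x)
    H = hess(x)
    return x - np.linalg.solve(H, g)

# Hypothetical test function: f(x) = x0^4 + x1^2
grad = lambda x: np.array([4.0 * x[0]**3, 2.0 * x[1]])
hess = lambda x: np.array([[12.0 * x[0]**2, 0.0],
                           [0.0, 2.0]])

x = np.array([1.0, 1.0])
for _ in range(20):
    x = newton_step(grad, hess, x)
```

On the quadratic coordinate the iteration lands on the minimizer in one step; on the quartic coordinate it contracts geometrically, illustrating why only a few iterations are usually needed near a solution.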
In particular, we show that the iterates given by $x_{k+1} = x_k - \left(\nabla^2 f(x_k) + \sqrt{H\,\|\nabla f(x_k)\|}\, I\right)^{-1} \nabla f(x_k)$, where $H > 0$ is a constant, converge globally with a $O(1/k^2)$ rate. Our method is the first variant of Newton'...
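A direct transcription of this gradient-regularized update is short; the sketch below applies it to a toy convex quadratic (the test function and iteration count are assumptions for illustration only).

```python
import numpy as np

def reg_newton_step(grad, hess, x, H=1.0):
    """Gradient-regularized Newton step:
    x+ = x - (hess(x) + sqrt(H * ||grad(x)||) * I)^{-1} grad(x)."""
    g = grad(x)
    A = hess(x) + np.sqrt(H * np.linalg.norm(g)) * np.eye(x.size)
    return x - np.linalg.solve(A, g)

# Toy convex example (assumption): f(x) = 0.5 * ||x||^2
grad = lambda x: x
hess = lambda x: np.eye(x.size)

x = np.array([3.0, 4.0])
for _ in range(30):
    x = reg_newton_step(grad, hess, x)
```

Note that the regularization $\sqrt{H\|\nabla f(x_k)\|}$ vanishes as the gradient goes to zero, so the iteration approaches the pure Newton step near the solution.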
In this paper, we propose a regularized Newton method without line search. The proposed method controls a regularization parameter instead of a step size in order to guarantee global convergence. We show that the proposed algorithm has the following convergence properties. (a) The proposed ...
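One way to picture "controlling a regularization parameter instead of a step size" is a Levenberg-style accept/reject loop on the regularizer. The sketch below is an illustrative assumption, not the paper's exact update rules: if the trial step fails to decrease the objective, the regularization parameter mu is enlarged and the system re-solved; on success, mu is shrunk.

```python
import numpy as np

def reg_newton_adaptive(f, grad, hess, x, mu=1.0, iters=30):
    """Illustrative sketch (assumption, not the paper's exact rules):
    adjust the regularization parameter mu in place of a line search."""
    for _ in range(iters):
        g = grad(x)
        while True:
            step = np.linalg.solve(hess(x) + mu * np.eye(x.size), g)
            if f(x - step) <= f(x):   # simple acceptance test (assumption)
                x = x - step
                mu = max(0.5 * mu, 1e-12)
                break
            mu *= 2.0
    return x

# Toy strongly convex quadratic (assumption)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
hess = lambda x: A

x_star = reg_newton_adaptive(f, grad, hess, np.array([1.0, 1.0]))
```

Large mu makes the step short and gradient-like (safe, globally convergent); small mu recovers a near-Newton step (fast, locally quadratic), which is the same trade-off a step size normally mediates.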
Keywords: factorized quasi-Newton method, nonlinear least squares, global convergence. In this paper, we propose a regularized factorized quasi-Newton method with a new Armijo-type line search and prove its global convergence for nonlinear least squares problems. This convergence result is extended to the regularized ...
The idea of regularizing the Gauss–Newton algorithm iteratively proved to be extremely effective. Method (1.1) was successfully applied to a number of nonlinear ill-posed problems [3], [4], [5], [6]. One of the remarkable features of this scheme is the lack of any requirement on d0 to be...
Scheinberg, K., Tang, X.: Practical inexact proximal quasi-Newton method with global complexity analysis. Math. Program. 160(1–2), 495–529 (2016). Schmidt, M., Roux, N., Bach, F.: Convergence rates of inexact proximal-gradient methods for convex...
In this work, we propose a Parallel Coordinate Descent Newton algorithm using multidimensional approximate Newton steps (PCDN), in which the off-diagonal elements of the Hessian are set to zero to enable parallelization. It randomly partitions the feature set into b bundles/subsets of size P...
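Zeroing the off-diagonal Hessian entries decouples the coordinate updates, since each reduces to delta_i = -g_i / H_ii. The sketch below shows only this diagonal-approximation idea on a toy quadratic; the random bundle partitioning and the rest of PCDN are omitted, and the test problem is an assumption.

```python
import numpy as np

def diag_newton_step(grad, hess_diag, x):
    """Coordinate-wise Newton step with off-diagonal Hessian entries
    dropped: delta_i = -g_i / H_ii is independent across coordinates,
    so coordinates (or bundles of them) can be updated in parallel."""
    return x - grad(x) / hess_diag(x)

# Toy quadratic (assumption): f(x) = 0.5 * x^T A x
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
grad = lambda x: A @ x
hess_diag = lambda x: np.diag(A)  # diagonal of the true Hessian

x = np.array([1.0, 1.0])
for _ in range(40):
    x = diag_newton_step(grad, hess_diag, x)
```

On this example the iteration is a Jacobi-type scheme: it converges because the diagonal dominates, but more slowly than a full Newton step would, which is the price paid for parallelism.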
In this paper, we propose a Perry-type derivative-free algorithm for solving systems of nonlinear equations. The algorithm is based on the well-known BFGS quasi-Newton method with a modified Perry parameter. The global convergence of the algorithm is established without any assumption on the regular...
Our algorithm first relies on the KKT error to estimate the active and free variables, and then smoothly combines the proximal gradient iteration and the Newton iteration to efficiently pursue the convergence of the active and free variables, respectively. We show the global convergence without the ...
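This excerpt combines proximal gradient and Newton iterations; below is a minimal sketch of the proximal-gradient component alone, assuming the nonsmooth term is the l1 norm (a typical choice). The KKT-based variable estimation and the Newton phase are omitted, and the toy problem is an assumption.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_step(grad_f, x, lam, step):
    """One proximal gradient step for min_x f(x) + lam * ||x||_1."""
    return soft_threshold(x - step * grad_f(x), step * lam)

# Toy problem (assumption): f(x) = 0.5 * ||x - b||^2, lam = 1
b = np.array([3.0, 0.2])
grad_f = lambda x: x - b

x = np.zeros(2)
for _ in range(10):
    x = prox_grad_step(grad_f, x, lam=1.0, step=1.0)
```

The soft-thresholding zeroes out small coordinates, which is exactly what makes a prox step useful for identifying the active (zero) variables before handing the free ones to a Newton phase.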
It has also been exploited to study the convergence of second-order algorithms such as Newton’s method (Noll and Rondepierre, 2013; Frankel et al., 2015) and cubic regularization method (Zhou et al., 2018). 2 Problem formulation and preliminaries In this paper, we consider the following ...