Tran-Dinh, Q., Necoara, I., Diehl, M.: Fast inexact decomposition algorithms for large-scale separable convex optimization. Optimization (to appear), pp. 1-33 (2015). DOI 10.1080/02331934.2015.1044898
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity. Keywords: structured nonsmooth convex optimization; first-order black-box oracle; estimation sequence; strong convexity; optimal complexity. Abstract: We introduce four accelerated (sub)gradient algorithms (ASGA) for solving ...
Large-Scale Nonlinear Optimization reviews and discusses recent advances in the development of methods and algorithms for nonlinear optimization and its applications, focusing on the large-dimensional case, the current forefront of much research. The chapters of the book, authored by some of the most...
Mirror Descent and Convex Optimization Problems with Non-smooth Inequality Constraints (Anastasia Bayandina, Pavel Dvurechensky, Alexander Gasnikov, Fedor Stonyakin, Alexander Titov), pages 181-213; Frank-Wolfe Style Algorithms for Large Scale Optimization (Lijun Ding, Madeleine Udell), pages 215-245; ...
In this paper, motivated by recently developed stochastic block coordinate algorithms, we propose a highly scalable randomized block coordinate Frank-Wolfe algorithm for convex optimization with general compact convex constraints, which has diverse applications in analyzing biomedical data for better ...
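As a minimal sketch of the generic randomized block-coordinate Frank-Wolfe idea the snippet above refers to (the function names, the product-of-simplices feasible set, and the step-size schedule here are illustrative assumptions, not the paper's actual algorithm): at each iteration one block is sampled uniformly, a linear minimization oracle is solved over that block's constraint set, and a convex-combination step preserves feasibility.

```python
import numpy as np

def bcfw_simplex(grad, x, n_blocks, block_size, n_iters, seed=None):
    """Hypothetical randomized block-coordinate Frank-Wolfe sketch,
    minimizing a smooth convex f over a product of unit simplices.
    grad(x) returns the full gradient of f at x."""
    rng = np.random.default_rng(seed)
    for k in range(n_iters):
        i = rng.integers(n_blocks)                      # sample one block uniformly
        sl = slice(i * block_size, (i + 1) * block_size)
        g = grad(x)[sl]                                 # gradient restricted to the block
        s = np.zeros(block_size)
        s[np.argmin(g)] = 1.0                           # simplex LMO: best vertex
        gamma = 2.0 * n_blocks / (k + 2.0 * n_blocks)   # standard diminishing BCFW step
        x[sl] = (1.0 - gamma) * x[sl] + gamma * s       # convex step keeps feasibility
    return x
```

Because each update is a convex combination of the current block iterate and a simplex vertex, every block remains a probability vector throughout, so no projection is ever needed; that is the main appeal over projected-gradient schemes on large structured constraint sets.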
Zhejiang University. Large-Scale Distributed Optimization via ADMM: Algorithms and Applications. Lecture forum, 2016-06-20, 15:00, Lecture Hall 117, Information and Electronic Engineering Building. Lecture abstract: Multi-agent distributed optimization has drawn significant attention in recent years due to the need for large-scale signal processing and ...
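To make the ADMM mechanism in the lecture title above concrete, here is a minimal sketch on a standard textbook instance, the lasso (the problem choice, variable names, and parameter values are my illustrative assumptions, not the lecture's content): ADMM splits the smooth least-squares term and the l1 term across two variables coupled by the constraint x = z, then alternates an x-update (a ridge-like linear solve), a z-update (soft-thresholding, the prox of the l1 norm), and a scaled dual update.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_admm(A, b, lam, rho=1.0, n_iters=500):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z."""
    n = A.shape[1]
    AtA_rhoI = A.T @ A + rho * np.eye(n)   # system matrix, fixed across iterations
    Atb = A.T @ b
    z = np.zeros(n)
    u = np.zeros(n)                        # scaled dual variable
    for _ in range(n_iters):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))  # x-update: linear solve
        z = soft_threshold(x + u, lam / rho)                # z-update: l1 prox
        u = u + x - z                                       # dual ascent on x = z
    return z, x
```

In the distributed, multi-agent setting the lecture advertises, the same alternating structure is what makes ADMM attractive: the x-update decomposes across agents holding local data, while the z- and dual updates act as a lightweight coordination step.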
envisioned to enable the paradigm shift from "connected things" to "connected intelligence", characterized by ultra-high density, large scale, dynamic heterogeneity, diversified functional requirements, and machine-learning capabilities, which leads to a growing need for highly efficient intelligent algorithms. ...
In general, SNOPT requires less matrix computation than NPSOL and fewer evaluations of the functions than the nonlinear algorithms in MINOS [19, 1]. It is suitable for nonlinear problems with thousands of constraints and variables, and is efficient if many constraints and bounds are active at a...
In this chapter we review recent developments in the research of Bregman methods, with particular focus on their potential use for large-scale applications. We give an overview on several families of Bregman algorithms and discuss modifications such as a
Two online learning algorithms, the original IBLS and the proposed BPOSBLS, are used to model the above time series on the training set; the resulting models are then used for prediction on the testing set to evaluate and compare their modeling performance. The parameters of...