In this paper a robust second-order method is developed for the solution of strongly convex \(\ell_1\)-regularized problems. The main aim is to make the proposed method as inexpensive as possible, while still solving even difficult problems efficiently. The proposed approach is a primal-dual Newton Conj...
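As context for this problem class, here is a minimal sketch of a standard proximal-gradient (ISTA) baseline for a strongly convex \(\ell_1\)-regularized problem; it is not the primal-dual Newton method the abstract describes, and the matrix, vector, and regularization weight are illustrative assumptions.

```python
import numpy as np

# Baseline sketch (NOT the paper's method): proximal gradient / ISTA for
#   minimize f(x) + lam * ||x||_1,   f strongly convex,
# with f(x) = 0.5 * x^T A x - b^T x and A symmetric positive definite.

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (component-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_l1(A, b, lam, iters=500):
    L = np.linalg.eigvalsh(A)[-1]        # Lipschitz constant of grad f
    x = np.zeros(len(b))
    for _ in range(iters):
        grad = A @ x - b                 # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

A = np.array([[3.0, 0.5], [0.5, 2.0]])   # SPD, so f is strongly convex
b = np.array([1.0, -0.2])
x = prox_grad_l1(A, b, lam=0.5)          # sparse minimizer: second entry is 0
```

Strong convexity makes the iteration a contraction, so a few hundred steps suffice here; second-order methods like the one in the abstract aim to reach high accuracy in far fewer iterations.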
We show that the exact worst-case performance of fixed-step first-order methods for smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of...
Aharouch, L., Akdim, Y.: Strongly nonlinear elliptic unilateral problems without sign condition and L1 data. J. Convex Anal. 13 (2006), no. 1, 135–149.
Suppose that \(n \ge 3\) and \(H(p) \in C^{1,1}(\mathbb{R}^n)\) is a locally strongly convex Hamiltonian. We obtain the everywhere differentiability of all absolute minimizers for H in any domain of \(\mathbb{R}^n\). doi:10.1016/j.jfa.2020.108829. Fa Peng, Qianyun Miao, Yuan Zhou.
Holomorphic invariant strongly pseudoconvex complex Finsler metrics on the irreducible classical domains. Host: Associate Professor Shao Guokuan. Speaker: Professor Zhong Chunping. Time: 2022-09-24, 14:30–15:30. Venue: Tencent Meeting 919-470-572. Affiliation: Xiamen University. Abstract: In thi...
Stochastic approximation (SA) is a classical approach to stochastic convex optimization. Previous studies have demonstrated that the convergence rate of SA can be improved by introducing either a smoothness or a strong convexity condition. In this paper, we make use of smoothness and strong convexity simul...
Computer Science – Learning; Computer Science – Artificial Intelligence. With a weighting scheme proportional to t, a traditional stochastic gradient descent (SGD) algorithm achieves a high-probability convergence rate of \(O(\kappa/T)\) for strongly convex functions, instead of \(O(\kappa \ln(T)/T)\).
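The weighting idea above can be sketched as follows: run plain SGD with step size 1/(μt) on a strongly convex objective, but output the average of the iterates weighted proportionally to t rather than the last iterate. The one-dimensional objective, noise scale, and horizon below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Sketch of t-weighted iterate averaging for SGD on the strongly convex
# function f(x) = mu/2 * (x - 1)^2, with additive Gaussian gradient noise.
rng = np.random.default_rng(0)
mu = 1.0                                 # strong convexity parameter

def noisy_grad(x):
    """Unbiased stochastic gradient of f at x (illustrative noise model)."""
    return mu * (x - 1.0) + rng.normal(scale=0.1)

T = 2000
x = 0.0
weighted_sum, weight_total = 0.0, 0.0
for t in range(1, T + 1):
    x -= (1.0 / (mu * t)) * noisy_grad(x)   # classical 1/(mu*t) step size
    weighted_sum += t * x                    # weight proportional to t
    weight_total += t
x_avg = weighted_sum / weight_total          # weighted-average output
```

The weighted average down-weights the noisy early iterates, which is what removes the ln(T) factor relative to uniform averaging.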
Keywords: convex functions, starlike functions, subordination, superordination. In this paper, we prove a general result that gives a sufficient condition for f to belong to some subclass of strongly starlike functions. The related sandwich-type results are also given.
Then, we give some applications of these results to iterative methods, convex and pseudo-convex minimization (the proximal point algorithm), fixed point theory, and equilibrium problems. The results extend several recent results in the literature, and some of them seem new even in Hilbert spaces.
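For the proximal point algorithm mentioned above, each step minimizes the objective plus a quadratic penalty on the distance to the current iterate. A minimal one-dimensional sketch, with an illustrative objective f(x) = 0.5·(x − 3)² and an assumed step parameter λ, where the subproblem has a closed form:

```python
# Proximal point algorithm (PPA) sketch on f(x) = 0.5 * (x - 3)^2:
#   x_{k+1} = argmin_x  f(x) + (1/(2*lam)) * (x - x_k)^2.
# Setting the derivative to zero gives x_{k+1} = (3*lam + x_k) / (lam + 1).
lam = 0.5       # illustrative proximal parameter
x = 0.0         # starting point
for _ in range(50):
    x = (3.0 * lam + x) / (lam + 1.0)   # closed-form proximal step
```

Each step contracts the error by a factor 1/(λ + 1), so the iterates converge linearly to the minimizer x = 3; in the general convex case the subproblem is solved approximately rather than in closed form.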