We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case complexity analysis for our method, where in particular we prove high-probability bounds on the number of iterations ...
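For concreteness, a minimal sketch of the generic random-subspace iteration this framework describes: draw a low-dimensional sketch of the decision space, then run a derivative-free inner solve restricted to it. The sketch dimension p, the Gaussian sketching matrix, and the Nelder-Mead inner solver are illustrative assumptions here, not the paper's actual model-based construction.

```python
import numpy as np
from scipy.optimize import minimize

def random_subspace_dfo(f, x0, p=5, n_outer=50, seed=0):
    """Toy random-subspace DFO loop: each outer iteration draws a random
    p-dimensional subspace of R^n and approximately minimizes f restricted
    to it with a derivative-free inner solver."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(n_outer):
        P = rng.standard_normal((n, p)) / np.sqrt(p)   # random sketch matrix
        res = minimize(lambda y: f(x + P @ y), np.zeros(p),
                       method="Nelder-Mead", options={"maxfev": 100 * p})
        if res.fun < f(x):          # accept only improving subspace steps
            x = x + P @ res.x
    return x

# Usage: a 100-dimensional Rosenbrock function as a stand-in objective.
rosen = lambda v: np.sum(100 * (v[1:] - v[:-1]**2)**2 + (1 - v[:-1])**2)
x = random_subspace_dfo(rosen, np.zeros(100))
print(rosen(np.zeros(100)), "->", rosen(x))
```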
The Nelder-Mead (NM) method is a popular derivative-free optimization algorithm owing to its fast convergence and robustness. However, it is known that the method often fails to converge, or converges very slowly, on large-scale optimization problems. In the present study, the NM method has been ...
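As a point of reference, SciPy's NM implementation illustrates both behaviors the abstract mentions: rapid progress in low dimensions and stagnation as the dimension grows (the Rosenbrock test function and budgets below are arbitrary choices for illustration):

```python
import numpy as np
from scipy.optimize import minimize

rosen = lambda v: np.sum(100 * (v[1:] - v[:-1]**2)**2 + (1 - v[:-1])**2)

# In 5 dimensions NM makes fast progress within a modest budget...
res_lo = minimize(rosen, np.zeros(5), method="Nelder-Mead",
                  options={"maxfev": 10_000})
# ...but in 50 dimensions the same budget typically leaves it far from
# the minimizer, which is what motivates large-scale NM variants.
res_hi = minimize(rosen, np.zeros(50), method="Nelder-Mead",
                  options={"maxfev": 10_000})
print(res_lo.fun, res_hi.fun)
```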
We re-introduce a derivative-free subspace optimization framework originating from Chapter 5 of the Ph.D. thesis [Z. Zhang, On Derivative-Free Optimization...
Taking the derivative of the objective with respect to $C$ and setting it to zero leads to the following closed-form solution:
$$C = (X^\top X + I)^{-1}\Big(X^\top\big(X - E + \tfrac{Q_1}{\mu}\big) + Z - \operatorname{diag}(Z) - \tfrac{Q_2}{\mu}\Big). \tag{16}$$
E-subproblem: The associated optimization problem ...
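A direct NumPy transcription of the update in Eq. (16); the shapes (X, E, Q1 of size d x n; C, Z, Q2 of size n x n) are inferred from the formula rather than stated in the excerpt:

```python
import numpy as np

def update_C(X, E, Z, Q1, Q2, mu):
    """C-subproblem solution from Eq. (16): solve the n x n linear system
    (X^T X + I) C = X^T (X - E + Q1/mu) + Z - diag(Z) - Q2/mu."""
    n = X.shape[1]
    rhs = X.T @ (X - E + Q1 / mu) + Z - np.diag(np.diag(Z)) - Q2 / mu
    return np.linalg.solve(X.T @ X + np.eye(n), rhs)
```

Solving the linear system is preferable to forming the inverse explicitly; `np.diag(np.diag(Z))` implements the diagonal matrix denoted diag(Z).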
Derivative-free optimization: Recently, increasing attention has been paid to procedures for unconstrained optimization problems in which first-derivative information is unavailable or unreliable. In this paper, we consider a heuristic iterated-subspace minimization method with pattern search...
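One plausible reading of "iterated-subspace minimization with pattern search" is a loop that cycles over small coordinate blocks and runs a compass-style pattern search on each; the sketch below is that heuristic reading, not the paper's exact algorithm (block size and step schedule are arbitrary):

```python
import numpy as np

def pattern_search_block(f, x, idx, step=1.0, tol=1e-6):
    """Compass search restricted to the coordinates in idx: poll +/- step
    along each coordinate of the block; halve the step when no poll improves."""
    x = x.copy()
    while step > tol:
        improved = False
        for i in idx:
            for s in (step, -step):
                trial = x.copy()
                trial[i] += s
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5
    return x

def iterated_subspace_pattern_search(f, x0, block=5, sweeps=20, seed=0):
    """Repeatedly minimize over random coordinate subspaces by pattern search."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        idx = rng.choice(x.size, size=block, replace=False)
        x = pattern_search_block(f, x, idx)
    return x
```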
Taking the derivative of the objective \(J\) with respect to \((\mathbf{W}_p)_i \ (1 \le i \le u)\), we have
$$\frac{\partial J}{\partial (\mathbf{W}_p)_i} = 2\mathbf{P}\mathbf{P}^T(\mathbf{W}_p)_i - 2\mathbf{P}(\mathbf{A}_p)...$$
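The truncated gradient is consistent with a least-squares term of the form \(J = \sum_i \Vert \mathbf{P}^T(\mathbf{W}_p)_i - (\mathbf{A}_p)_i \Vert^2\); under that assumed reading (and with all dimensions invented for the check), a finite-difference test confirms the formula:

```python
import numpy as np

rng = np.random.default_rng(0)
d, u, m = 6, 4, 5                          # assumed, not from the paper
P = rng.standard_normal((d, m))
Wp = rng.standard_normal((d, u))
Ap = rng.standard_normal((m, u))

# Candidate objective consistent with the truncated derivative:
J = lambda W: np.sum((P.T @ W - Ap) ** 2)

grad = 2 * P @ P.T @ Wp - 2 * P @ Ap       # the formula above, in matrix form
eps = 1e-6
fd = np.zeros_like(Wp)
for j in range(d):
    for k in range(u):
        W2 = Wp.copy(); W2[j, k] += eps
        fd[j, k] = (J(W2) - J(Wp)) / eps
print(np.max(np.abs(grad - fd)))           # agrees to ~1e-5
```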
Model-based methods are an important class of derivative-free optimization (DFO) methods, but they are known to struggle with high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their ...
The cosine measure provides a way of quantifying the positive spanning property of a set of vectors, which is important in the area of derivative-free optimization. This paper proves some of the properties of the cosine measure for a nonempty finite set of nonzero vectors. It also introduces ...
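For a finite set of nonzero vectors \(D\), the cosine measure is \(\operatorname{cm}(D) = \min_{\Vert u \Vert = 1} \max_{d \in D} u^T d / \Vert d \Vert\), and it is positive exactly when \(D\) positively spans the space. A crude sampling estimate (an over-estimate, since the minimum runs over sampled unit vectors only; sample counts are arbitrary):

```python
import numpy as np

def cosine_measure_estimate(D, n_samples=200_000, seed=0):
    """Monte-Carlo estimate of cm(D) for the columns of D."""
    rng = np.random.default_rng(seed)
    Dn = D / np.linalg.norm(D, axis=0)             # normalize each direction
    U = rng.standard_normal((n_samples, D.shape[0]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)  # random unit vectors
    return (U @ Dn).max(axis=1).min()              # min over u, max over d

# The coordinate directions +/- e_i positively span R^3 with cm = 1/sqrt(3).
D = np.hstack([np.eye(3), -np.eye(3)])
print(cosine_measure_estimate(D), 1 / np.sqrt(3))
```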
In recent years, there have been many studies of subspace conjugate gradient methods for solving unconstrained optimization problems. Based on these methods and the projection technique, in this paper a subspace derivative-free projection method is proposed for solving large-scale nonlinear equations ...
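The projection technique referred to here typically works as follows (a Solodov-Svaiter-style sketch for monotone equations, with the simple residual direction d = -F(x) standing in for the paper's subspace direction):

```python
import numpy as np

def df_projection_solve(F, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=500):
    """Derivative-free hyperplane-projection iteration for monotone F(x)=0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx
        t = 1.0                     # backtracking line search for z = x + t*d
        while True:
            z = x + t * d
            Fz = F(z)
            if -Fz @ d >= sigma * t * (d @ d):
                break
            t *= beta
        if np.linalg.norm(Fz) < tol:
            return z
        # Project x onto the separating hyperplane {y : F(z)^T (y - z) = 0}.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x

# Example: F(x) = x + sin(x) is monotone with unique zero at the origin.
x = df_projection_solve(lambda v: v + np.sin(v), np.ones(100))
print(np.linalg.norm(x))
```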
Our method provides a novel extension of the well-known Gaussian smoothing technique to descent in subspaces of dimension greater than one, opening the door to new analyses of Gaussian smoothing when more than one directional derivative is used at each iteration. We also provide a finite-...
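The underlying one-dimensional estimator is classical: with a Gaussian direction u, the quantity (f(x + sigma u) - f(x)) u / sigma estimates the smoothed gradient. A sketch of the multi-directional variant the abstract alludes to, restricting sampled directions to a random p-dimensional subspace (all parameter choices here are illustrative, not the paper's):

```python
import numpy as np

def subspace_smoothing_grad(f, x, sigma=1e-3, p=5, n_samples=64, seed=0):
    """Average forward-difference Gaussian-smoothing estimators whose
    directions are confined to a random p-dimensional subspace; for a
    quadratic f its expectation is the projection of the true gradient
    onto that subspace."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((x.size, p)))  # orthonormal basis
    fx, g = f(x), np.zeros_like(x)
    for _ in range(n_samples):
        u = Q @ rng.standard_normal(p)      # Gaussian direction in span(Q)
        g += (f(x + sigma * u) - fx) / sigma * u
    return g / n_samples

# Sanity check on f(v) = 0.5 ||v||^2, whose gradient is v itself.
x = np.arange(10.0)
print(subspace_smoothing_grad(lambda v: 0.5 * v @ v, x, p=10, n_samples=20_000))
```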