Sparse large-scale multi-objective optimization problems (LSMOPs) are widespread in real-world applications. They involve a large number of decision variables and have sparse Pareto-optimal solutions, i.e., most decision variables of these solutions are zero. In recent years, ...
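For a concrete (made-up) illustration of this sparsity property, a Pareto-optimal solution of such a problem typically has the form

\[
\mathbf{x}^{*} = (0,\, 0,\, 1.3,\, 0,\, \dots,\, 0,\, -0.7,\, 0) \in \mathbb{R}^{D},
\]

where only a handful of the \(D\) decision variables are nonzero.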
Paper title: "An Evolutionary Algorithm for Large-Scale Sparse Multiobjective Optimization Problems". Paper authors: Ye Tian, Xingyi Zhang, Senior Member, IEEE, Chao Wang, and Yaochu Jin, Fellow, IEEE …
Sparse multi-objective optimization problems (SMOPs) arise frequently in a variety of disciplines such as machine learning, economics, and signal processing. Evolutionary algorithms have demonstrated their proficiency in optimizing complex problems in recent years, although their performance often deteriorates...
Because most real-world SMOPs are based on large data sets, they are also large-scale sparse multi-objective optimization problems (LSMOPs) [2]. Many other applications, including key node detection [3], constrained and combinatorial optimization problems [4], pattern mining [5], neural networks...
Due to the curse of dimensionality of the search space, it is extremely difficult for evolutionary algorithms to approximate the optimal solutions of large-scale multiobjective optimization problems (LMOPs) with a limited budget of evaluations. If the Pareto-optimal subspace is approximated during the...
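As a hedged sketch of this idea, the snippet below represents candidates by a few coefficients in an assumed low-dimensional subspace and maps them back to the full decision space only for evaluation; the linear subspace model, the toy objectives, and all names are illustrative assumptions, not the method from the excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

D, d = 1000, 10                    # full and reduced dimensionality (illustrative)
W = rng.standard_normal((D, d))    # assumed basis of the approximated Pareto-optimal subspace

def objectives(x):
    """Two toy conflicting objectives defined on the full decision vector."""
    return np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])

# A candidate is the d-dimensional coefficient vector z; the full solution is
# W @ z, so the search operates in d dimensions instead of D.
z = rng.standard_normal(d) * 0.1
x = W @ z
print(objectives(x))
```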
Thus, we could rewrite problem (SPO) as the following mixed-integer problem (MIP). In order to move to a continuous optimization problem, we discard the binary constraints on y. We need to retain the constraint, because otherwise the objective function of (MIP) does not admit a minimum. This leads...
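The excerpt does not show the actual formulation of (SPO) or of its reformulation, so the following is only a hedged illustration with generic symbols: a cardinality-constrained problem \(\min_x f(x)\) subject to \(\|x\|_0 \le k\) can be rewritten with binary indicator variables \(y\) and a big-\(M\) coupling constraint as

\[
\text{(MIP)}\qquad
\min_{x \in \mathbb{R}^{n},\, y \in \{0,1\}^{n}} f(x)
\quad\text{s.t.}\quad
|x_i| \le M y_i \;\; (i = 1,\dots,n),
\qquad
\sum_{i=1}^{n} y_i \le k .
\]

Discarding the binary constraints then amounts to relaxing \(y_i \in \{0,1\}\) to \(0 \le y_i \le 1\) while keeping the coupling constraints, which is one natural reading of the constraint that must be retained for the relaxed objective to admit a minimum.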
Black-Box Sparse Adversarial Attack via Multi-Objective Optimisation Summary:
- (1): This paper studies black-box adversarial attacks with sparsity, which can fool deep neural networks (DNNs) into recognizing images in ways that deviate from human preferences.
- (2): Previous sparse attack methods often require a large number of queries and can typically change pixels without restriction, which makes them easy to detect. This paper proposes a multi-objective optimization approach to...
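A minimal sketch of how such a bi-objective fitness could look, assuming only black-box (probability) access to the model; the function names, image shapes, and the two objectives below are illustrative assumptions, not the formulation of the summarized paper.

```python
import numpy as np

def attack_objectives(x_adv, x_orig, query_model, true_label):
    """Bi-objective fitness for a sparse black-box attack (illustrative only).

    query_model(x) is assumed to return a 1-D array of class probabilities,
    i.e., the attacker only has black-box query access to the DNN.
    """
    probs = query_model(x_adv)
    f1 = probs[true_label]                                   # confidence of the true class (minimize)
    f2 = np.count_nonzero(np.any(x_adv != x_orig, axis=-1))  # number of perturbed pixels (minimize)
    return f1, f2

# Toy usage with a dummy "model" that returns uniform probabilities:
rng = np.random.default_rng(0)
dummy_model = lambda x: np.full(10, 0.1)
x0 = rng.random((32, 32, 3))
x1 = x0.copy()
x1[0, 0] += 0.2                                              # perturb a single pixel
print(attack_objectives(x1, x0, dummy_model, true_label=3))  # (confidence, perturbed-pixel count)
```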
The inference computation offers a further optimization opportunity. In particular, many neural networks employ the rectified linear unit (ReLU) function as a non-linear operator, which clamps all negative activation values to zero. The activations are the output values of an individual layer that...
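A small numpy sketch of this effect, under the assumption of a plain fully connected layer; it shows ReLU producing zero activations and how the zeros can, in principle, be skipped in the following layer's matrix-vector product (real inference engines exploit this with specialized sparse kernels).

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(1024)        # pre-activation outputs of one layer
relu_out = np.maximum(a, 0.0)        # ReLU clamps all negative values to zero

W = rng.standard_normal((256, 1024)) # weights of the next fully connected layer
dense = W @ relu_out                 # dense computation touches every activation

# Sparsity-aware variant: zero activations contribute nothing, so only the
# columns of W matching nonzero activations need to be multiplied.
nz = np.nonzero(relu_out)[0]
sparse = W[:, nz] @ relu_out[nz]

assert np.allclose(dense, sparse)
print(f"{1.0 - nz.size / relu_out.size:.0%} of the activations are zero")
```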
Other approaches require the decomposition of the multiclass problem into several binary problems, or the definition of multiclass objective functions. This is the case, for example, for one-vs.-one SVM, one-vs.-rest SVM, or multiclass SVM.
Sparse PLS-DA
We introduce a sparse version of the PLS...
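The decomposition strategies mentioned above can be illustrated with scikit-learn's built-in meta-estimators; note this shows only the one-vs.-one / one-vs.-rest decomposition with linear SVMs, not the sparse PLS-DA method introduced in the excerpt.

```python
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# One-vs.-rest: one binary SVM per class, that class against all the others.
ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)

# One-vs.-one: one binary SVM for every pair of classes.
ovo = OneVsOneClassifier(SVC(kernel="linear")).fit(X, y)

print(len(ovr.estimators_), len(ovo.estimators_))  # 3 binary problems each for the 3 iris classes
```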
-regularizer in the class of parametric sparse optimization problems studied in this paper. At this point, it would be useful to mention that, while there are other approximations of the \(\ell_0\)-function, such as the minimax concave penalty (MCP) function [45] and the smoothly clipped...
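For reference, one standard parametrization of the MCP penalty (with regularization parameter \(\lambda > 0\) and concavity parameter \(\gamma > 1\); the exact form used in [45] may differ slightly) is

\[
\rho_{\lambda,\gamma}(t) \;=\;
\begin{cases}
\lambda |t| - \dfrac{t^{2}}{2\gamma}, & |t| \le \gamma\lambda,\\[4pt]
\dfrac{\gamma\lambda^{2}}{2}, & |t| > \gamma\lambda,
\end{cases}
\]

which behaves like the \(\ell_1\) penalty near the origin and flattens out to a constant for large \(|t|\), thereby approximating the \(\ell_0\)-function more closely than the \(\ell_1\) norm.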