Krabbenhøft, K., Damkilde, L.: A general nonlinear optimization algorithm for lower bound limit analysis. Int. J. Numer. Methods Eng. 56, 165–184 (2003)
Starting from these technical results, we obtain the global convergence of: (i) the variable metric proximal methods presented by Bonnans, Gilbert, Lemaréchal, and Sagastizábal, (ii) some algorithms proposed by Correa and Lemaréchal, and (iii) the proximal point algorithm given by Rockafellar. ...
Keywords: iterative algorithm; equilibrium problem; constrained convex minimization; variational inequality. The gradient-projection algorithm (GPA) plays an important role in solving constrained convex minimization problems. Based on Marino and Xu's method [G. Marino and H.-K. Xu, A general method for nonexpansive ...
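A minimal sketch of the projected-gradient iteration the snippet refers to; the quadratic objective, box constraint, and step size below are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def gradient_projection(grad, project, x0, step=0.1, iters=500):
    """Gradient-projection iteration: x_{k+1} = P_C(x_k - step * grad f(x_k))."""
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Example: minimize ||x - b||^2 over the box [0, 1]^n.
b = np.array([1.5, -0.3, 0.7])
grad = lambda x: 2 * (x - b)           # gradient of the objective
project = lambda x: np.clip(x, 0, 1)   # projection onto the feasible box
x_star = gradient_projection(grad, project, np.zeros(3))
# converges to the projection of b onto the box: [1.0, 0.0, 0.7]
```

For a fixed step size, convergence of the GPA requires the step to be small relative to the Lipschitz constant of the gradient (here 2, so any step below 1 contracts).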
The aim is to draw general guidelines on choosing the most suitable techniques for a given optimization process. It is stressed that an optimization process should not be a one-shot application of a single algorithm. It must generally be composed of different steps, if not all, ...
Among the numerous methods, the elimination method is time-consuming and complex, but it has the highest precision and can obtain all solutions for a given terminal posture. This paper optimizes the elimination method to improve its solving speed and make the algorithm more complete. On the ...
A study of the literature indicates that the OCL problem has been solved by both conventional and heuristic optimization methods, namely the Lagrangian method (LM) [1], equal load rate (ELR) [1], the branch-and-bound (B&B) method [2], the genetic algorithm (GA) [3], the evolution strategy (ES) [4]...
Two optimization methods, in our case the differential evolution (DE) algorithm and the Nelder-Mead simplex method, are used for the reconstruction at low frequencies. The Nelder-Mead simplex method is then used to obtain the solutions at higher frequencies, where the initial guess is obtained ...
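The two-stage strategy described above, a global differential-evolution search followed by Nelder-Mead refinement warm-started from the DE result, can be sketched with SciPy; the Rosenbrock objective and the bounds are illustrative assumptions, not the reconstruction problem from the snippet:

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize, rosen

bounds = [(-2, 2), (-2, 2)]

# Global stage: differential evolution explores the whole search box.
de_result = differential_evolution(rosen, bounds, seed=0, maxiter=200)

# Local stage: Nelder-Mead simplex refines the solution,
# using the DE result as the initial guess.
nm_result = minimize(rosen, x0=de_result.x, method="Nelder-Mead",
                     options={"xatol": 1e-8, "fatol": 1e-8})
# nm_result.x is close to the Rosenbrock minimizer (1, 1)
```

The same warm-start pattern extends to the frequency sweep in the snippet: the solution at one frequency becomes `x0` for the next.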
The Gumbel-softmax optimization (GSO) method builds a hybrid algorithm that combines the batched version of the GSO algorithm with evolutionary computation methods. The key idea is to treat the batched optimization variables (the parameters) as a population, so that evolutionary operators, e.g., substitution, ...
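A schematic of the population view described above, using plain elitist substitution with Gaussian mutation rather than the actual Gumbel-softmax machinery; the population size, mutation scale, and sphere objective are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def batched_evolve(objective, pop, iters=200, mut=0.1, frac=0.25):
    """Treat a batch of candidate solutions as a population: each step,
    substitute the worst fraction with mutated copies of the best."""
    k = max(1, int(frac * len(pop)))
    for _ in range(iters):
        fitness = objective(pop)                  # vectorised batch evaluation
        pop = pop[np.argsort(fitness)]            # sort ascending: best first
        # substitution: worst k candidates replaced by perturbed elites
        pop[-k:] = pop[:k] + mut * rng.standard_normal((k, pop.shape[1]))
    fitness = objective(pop)
    return pop[np.argmin(fitness)], fitness.min()

# Example: minimize the sphere function over a batch of 32 candidates.
objective = lambda pop: np.sum(pop**2, axis=1)
pop0 = rng.uniform(-3, 3, size=(32, 5))
best, fbest = batched_evolve(objective, pop0)
```

Because the best candidates are always retained, the incumbent fitness is monotonically non-increasing, which is the property the batched-population view buys over a single restart.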
Alternatively, instead of using a single algorithm, an ensemble method that combines the results of several algorithms may also be used. Table 1. Strengths and weaknesses of some machine learning algorithms (after Zhong et al., 2020a). Algorithm | Strengths | Weaknesses: LR | 1. Robust to noise | 1. ...
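A minimal sketch of such an ensemble, combining the label predictions of several hypothetical classifiers by majority vote (the models and data are placeholders):

```python
import numpy as np

def majority_vote(predictions):
    """Combine label predictions from several algorithms by majority vote.
    `predictions` has shape (n_models, n_samples) with non-negative int labels."""
    predictions = np.asarray(predictions)
    # For each sample (column), pick the most frequent label across models.
    return np.array([np.bincount(col).argmax() for col in predictions.T])

# Three hypothetical models' predictions on five samples:
preds = [[1, 0, 1, 1, 0],
         [1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1]]
ensemble = majority_vote(preds)   # -> [1, 0, 1, 1, 0]
```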
An algorithm for solving the general nonlinear least-squares problem is developed. An estimate for the Hessian matrix is constructed as the sum of two matrices. The...
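The "sum of two matrices" presumably pairs the Gauss-Newton term J^T J with a correction for nonzero residuals; the sketch below uses only the first term, and the exponential model and data are illustrative assumptions, not from the cited work:

```python
import numpy as np

def gauss_newton(residual, jac, x0, iters=50):
    """Gauss-Newton for min 0.5*||r(x)||^2.
    Approximates the Hessian H = J^T J + sum_i r_i * Hess(r_i)
    by its first term only; large-residual methods estimate the second."""
    x = x0
    for _ in range(iters):
        r, J = residual(x), jac(x)
        x = x - np.linalg.solve(J.T @ J, J.T @ r)   # Gauss-Newton step
    return x

# Example: fit y = a * exp(b * t) to noise-free data generated with a=2, b=-1.
t = np.linspace(0, 2, 20)
y = 2.0 * np.exp(-1.0 * t)
residual = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),            # dr/da
                                 p[0] * t * np.exp(p[1] * t)])  # dr/db
p = gauss_newton(residual, jac, np.array([1.0, 0.0]))
# p converges to [2.0, -1.0]
```

On zero-residual problems like this one, the second Hessian term vanishes at the solution, so pure Gauss-Newton converges quadratically; the correction matrix matters precisely in the nonzero-residual case the snippet's keywords mention.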