ND-Adam is designed to preserve the direction of the gradient for each weight vector and to produce the regularization effect of L2 weight decay in a more precise and principled way. We further introduce regularized softmax, which limits the magnitude of the softmax logits to provide better learning ...
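One common way to limit logit magnitudes is to add an L2 penalty on the logits themselves to the cross-entropy loss. The sketch below is only an illustration of that idea, not the paper's exact formulation; the penalty weight `lam` is a hypothetical knob.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def regularized_softmax_loss(logits, target, lam=0.01):
    # cross-entropy plus an L2 penalty on the logit magnitudes,
    # one simple way to discourage unbounded logits (assumption)
    p = softmax(logits)
    ce = -np.log(p[target])
    return ce + lam * np.sum(logits ** 2)
```

The penalty term grows with the squared norm of the logits, so the optimizer is pushed toward confident but bounded predictions rather than ever-larger logit scales.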
Thus, we introduce a joint normalized virtual softmax loss (NV-softmax loss) and triplet loss to guide better feature-distance learning. In practical applications of vehicle ReID, compared with other softmax-based methods, ours returns more accurate query results, thus ...
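The triplet component of such a joint loss is standard: it pulls an anchor embedding toward a positive (same identity) and pushes it away from a negative (different identity) by at least a margin. A minimal sketch, with `margin` as an illustrative hyperparameter:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.3):
    # squared Euclidean distances between embeddings
    d_ap = np.sum((anchor - positive) ** 2)
    d_an = np.sum((anchor - negative) ** 2)
    # hinge: zero once the negative is at least `margin` farther than the positive
    return max(0.0, d_ap - d_an + margin)
```

In a joint formulation this term would simply be added to the softmax-based classification loss, so the network learns both identity-discriminative logits and a metric-friendly embedding space.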
The experiments were performed on a system with a single Nvidia GeForce RTX 3090, and the code was implemented in the PyTorch framework, version 1.13. The overall setup follows that of the Anomaly Transformer [12]. A non-overlappi...
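Splitting a series into non-overlapping windows, as this setup appears to describe, can be sketched as follows; dropping the tail remainder that does not fill a full window is an assumption, not something the excerpt specifies.

```python
def non_overlapping_windows(series, window):
    # slice the sequence into consecutive, non-overlapping windows;
    # any trailing remainder shorter than `window` is discarded (assumption)
    n = len(series) // window
    return [series[i * window:(i + 1) * window] for i in range(n)]
```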
In the proposed CNN model, the convolution filters are fixed at 3 × 3, and max-pooling uses a 2 × 2 window. The rectified linear unit (ReLU) is used as the activation function, and the output is obtained with a softmax function in the last layer. The software used in...
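The spatial sizes produced by stacking such conv/pool blocks can be traced with simple arithmetic. The sketch below assumes stride-1, unpadded 3 × 3 convolutions and stride-2, 2 × 2 pooling; the excerpt does not state the padding or stride, so these are illustrative assumptions.

```python
def conv3x3_out(size):
    # 3x3 convolution, stride 1, no padding (assumption): shrinks each side by 2
    return size - 2

def maxpool2x2_out(size):
    # 2x2 max-pooling with stride 2: halves each side (floor division)
    return size // 2

def feature_map_sizes(size, blocks):
    # spatial size after each conv+pool block, starting from the input size
    sizes = [size]
    for _ in range(blocks):
        size = maxpool2x2_out(conv3x3_out(size))
        sizes.append(size)
    return sizes
```

For a 28 × 28 input, two such blocks yield 13 × 13 and then 5 × 5 feature maps before the final softmax layer.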