Specifically, we propose a novel channel-wise feature attention mechanism, which is integrated into the pipeline of a well-known convolutional neural network (CNN)-based visual tracking algorithm. Representing the object robustly is crucial: with more representative features, the tracking performance is ...
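A channel-wise attention mechanism of this kind can be sketched as a squeeze-and-excite style block: spatially pool each channel, pass the pooled vector through a small bottleneck, and rescale every channel by a sigmoid gate. This is a minimal numpy illustration of the general idea, not the authors' actual module; the weights `w1`, `w2` and shapes are assumptions for the example.

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Channel-wise attention (sketch): squeeze spatial dims,
    excite through a small bottleneck, rescale each channel."""
    # x: feature map of shape (C, H, W)
    squeezed = x.mean(axis=(1, 2))                   # (C,) global average pool
    hidden = np.maximum(w1 @ squeezed, 0.0)          # ReLU bottleneck
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # sigmoid gates in (0, 1)
    return x * weights[:, None, None]                # reweight channels

# toy example: 4 channels, bottleneck of size 2 (illustrative sizes)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w1 = rng.standard_normal((2, 4))
w2 = rng.standard_normal((4, 2))
y = channel_attention(x, w1, w2)
```

Each channel is multiplied by a single learned scalar in (0, 1), which is what lets the mechanism emphasize representative channels and suppress uninformative ones.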
We aim to propose a novel attention mechanism by taking two factors into account. For the first factor, inspired by CapsuleNet [11], where grouped sub-features can represent the instantiation parameters of a specific type of entity, we propose a group-...
In STGU, to learn point-wise topology features, a new gate-based feature interaction mechanism is introduced that activates features point-to-point via an attention map generated from the input sample. Based on STGU, we propose the first MLP-based model, SiT-MLP, for skeleton-...
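The gate-based point-to-point activation described here can be illustrated as element-wise (Hadamard) gating, where the attention map is produced from the input features themselves. The following is a hedged sketch of that pattern, not SiT-MLP's actual STGU; the projection `w_gate` and the joint/channel sizes are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_interaction(x, w_gate):
    """Gate-based feature interaction (sketch): an attention map is
    generated from the input sample itself and activates features
    point-to-point via element-wise gating."""
    # x: (N, C) features, e.g. N skeleton joints with C channels each
    attn = sigmoid(x @ w_gate)   # (N, C) attention map from the input
    return x * attn              # point-to-point activation

rng = np.random.default_rng(1)
x = rng.standard_normal((25, 16))       # e.g. 25 joints, 16 channels
w_gate = rng.standard_normal((16, 16))
out = gated_interaction(x, w_gate)
```

Because each gate lies in (0, 1), every output entry is an attenuated copy of the corresponding input entry, so the gate decides per point which features pass through.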
Furthermore, an element-wise feature fusion module (EFFM) based on the additive attention mechanism was developed for our decoder to fuse multi-level features and enhance detection. CRediT authorship contribution statement. Feng He: Conceptualization, Data analysis, Methodology, Writing – ...
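An additive-attention element-wise fusion of multi-level features can be sketched as follows: project both feature levels, add them, squash with tanh, score the result, and blend the two levels with the resulting per-pixel gate. This is a generic illustration of additive attention fusion under assumed shapes and weights (`w_l`, `w_h`, `v`), not the paper's EFFM.

```python
import numpy as np

def additive_fusion(low, high, w_l, w_h, v):
    """Element-wise fusion via additive attention (sketch): project both
    levels, add, squash with tanh, score with v, then blend with a
    sigmoid gate per spatial position."""
    # low, high: (H, W, C) feature maps from two decoder levels
    energy = np.tanh(low @ w_l + high @ w_h)         # (H, W, C)
    gate = 1.0 / (1.0 + np.exp(-(energy @ v)))       # (H, W) per-pixel weight
    g = gate[..., None]
    return g * low + (1.0 - g) * high                # element-wise fusion

rng = np.random.default_rng(2)
low = rng.standard_normal((8, 8, 4))
high = rng.standard_normal((8, 8, 4))
w_l = rng.standard_normal((4, 4))
w_h = rng.standard_normal((4, 4))
v = rng.standard_normal(4)
fused = additive_fusion(low, high, w_l, w_h, v)
```

The output is a per-position convex combination of the two levels, so the fused map never leaves the range spanned by its inputs at any position.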
To segment new categories given only a few examples, we incorporate a self-gating mechanism into the relation network to exploit global context information for adjusting per-channel modulation weights of local relation features. Extensive experiments on benchmark texture datasets and real scenarios demonstrate the...
Attention mechanisms have become a widely researched means of improving the performance of convolutional neural networks (CNNs). Most research focuses on designing channel-wise and spatial-wise attention modules but neglects the importance of the unique information in each feature, which is critical ...
Secondly, to weaken the influence of the background, a novel channel-wise attention mechanism is introduced to highlight informative channels while suppressing confusing ones. Finally, an autoencoder-based deep feature prediction module is applied to capture temporal information and ...
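An autoencoder-based feature prediction module of the kind described can be sketched as an encode-then-decode mapping over deep features, where the reconstruction/prediction error carries temporal information. The weights and dimensions below are illustrative assumptions, not the paper's trained module.

```python
import numpy as np

def autoencoder_predict(feat, w_enc, w_dec):
    """Autoencoder-style feature prediction (sketch): encode the current
    deep feature to a low-dimensional code, then decode to predict the
    feature; a large prediction error can signal temporal change."""
    code = np.maximum(feat @ w_enc, 0.0)   # ReLU encoder to a small code
    return code @ w_dec                    # linear decoder / predictor

rng = np.random.default_rng(3)
feat = rng.standard_normal(32)             # a deep feature vector
w_enc = rng.standard_normal((32, 8))
w_dec = rng.standard_normal((8, 32))
pred = autoencoder_predict(feat, w_enc, w_dec)
err = np.linalg.norm(pred - feat)          # prediction error as a signal
```

The bottleneck forces the module to keep only regularities of the feature stream, so frames that break those regularities produce a larger error.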
In addition to the standard CFE method mentioned above, this study also designs a lightweight CFE-tiny method, which adopts a split-attention mechanism; its computational cost is much smaller than that of the ADL method. doi:10.1049/iet-ipr.2020.0640 Yanzhu Hu...