We incorporate an axial attention mechanism module (AAMM) into the denoising encoder to establish global long-range dependencies, overcoming the limitations of convolutions and improving the model's interpretability. To exploit texture and contextual information from multi-scale features, we propose a multi-scale...
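As a rough illustration of the axial idea (not the paper's exact AAMM), the sketch below factorizes 2-D self-attention into a pass along each row and a pass along each column using PyTorch's `nn.MultiheadAttention`; the class name and hyperparameters are assumptions.

```python
import torch
import torch.nn as nn

class AxialAttention(nn.Module):
    """Minimal axial attention sketch: self-attention within each row,
    then within each column, so every position attends to its full row
    and column at O(H*W*(H+W)) cost instead of O((H*W)^2)."""
    def __init__(self, dim, heads=4):          # dim must divide by heads
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                      # x: (B, C, H, W)
        b, c, h, w = x.shape
        # Attend along the width axis: one length-W sequence per row.
        rows = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, c)
        # Attend along the height axis: one length-H sequence per column.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, c)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, w, h, c).permute(0, 3, 2, 1)   # (B, C, H, W)
```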
On the basis of U-Net, we combined the ideas of a multi-scale convolution module and an attention mechanism to design several innovative structures that improve its performance. At the same time, we mixed the binary cross-entropy loss function and the Dice loss function in order to alleviate ...
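A minimal sketch of such a mixed loss, assuming binary segmentation logits; the blend weight `alpha` is a hypothetical hyperparameter, not a value from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BCEDiceLoss(nn.Module):
    """Mixed loss sketch: BCE drives per-pixel accuracy while the Dice
    term counters foreground/background class imbalance."""
    def __init__(self, alpha=0.5, eps=1e-6):
        super().__init__()
        self.alpha, self.eps = alpha, eps

    def forward(self, logits, target):         # both (B, 1, H, W), target in {0,1}
        bce = F.binary_cross_entropy_with_logits(logits, target)
        prob = torch.sigmoid(logits)
        inter = (prob * target).sum(dim=(1, 2, 3))
        union = prob.sum(dim=(1, 2, 3)) + target.sum(dim=(1, 2, 3))
        dice = 1 - (2 * inter + self.eps) / (union + self.eps)
        return self.alpha * bce + (1 - self.alpha) * dice.mean()
```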
We apply a multi-scale attention method to each layer of the U-Net backbone so that the network extracts features that focus on the crowds instead of the background in the images. The attention mechanism and the skip connections can adjust the weights of feature maps while ...
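In the spirit of attention-gated skip connections (the names and the equal-resolution assumption below are illustrative, not this paper's exact design), a decoder signal can reweight an encoder feature map before it is passed along the skip.

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Sketch of a gated skip connection: the decoder signal `g`
    produces a spatial mask that reweights the encoder feature `x`,
    suppressing background responses before concatenation."""
    def __init__(self, x_ch, g_ch, inter_ch):
        super().__init__()
        self.wx = nn.Conv2d(x_ch, inter_ch, 1)
        self.wg = nn.Conv2d(g_ch, inter_ch, 1)
        self.psi = nn.Conv2d(inter_ch, 1, 1)

    def forward(self, x, g):                   # assumed same spatial size
        a = torch.sigmoid(self.psi(torch.relu(self.wx(x) + self.wg(g))))
        return x * a                           # reweighted skip feature
```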
building on the baseline that makes use of the MSF. ECA [62], SE [63], EffectiveSE [64], and CBAM [65] were tested on the NEU-DET dataset along with other attention mechanisms. The effect of each attention mechanism is listed in Table 6. Note that each of these approaches helped to reduce...
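As one concrete example of the mechanisms compared, here is a minimal ECA-style block; the kernel size and naming are assumptions rather than the exact configuration tested.

```python
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Sketch of Efficient Channel Attention: channel descriptors from
    global average pooling are mixed by a cheap 1-D convolution (no
    dimensionality reduction), then used to rescale the channels."""
    def __init__(self, k=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):                      # x: (B, C, H, W)
        y = x.mean(dim=(2, 3))                 # (B, C) channel squeeze
        y = self.conv(y.unsqueeze(1)).squeeze(1)   # local cross-channel mixing
        return x * torch.sigmoid(y)[:, :, None, None]
```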
(1) We propose a hierarchical attention mechanism by which the network learns to predict a relative weighting between adjacent scales. In our method, because of its hierarchical nature, we only need to augment the training pipeline with one extra scale, whereas other methods such as [1] require ...
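A rough sketch of pairwise relative weighting between two adjacent scales; module and parameter names are assumptions, and the same head can be chained pairwise across more scales at inference, which is why training needs only one extra scale.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleAttention(nn.Module):
    """Sketch of hierarchical scale attention: predict a per-pixel
    relative weight `alpha` between two adjacent scales and blend
    their predictions."""
    def __init__(self, feat_ch):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, 1, 1), nn.Sigmoid())

    def forward(self, feat_hi, pred_hi, pred_lo):
        # pred_lo comes from the coarser scale; upsample it to match.
        pred_lo = F.interpolate(pred_lo, size=pred_hi.shape[2:],
                                mode='bilinear', align_corners=False)
        alpha = self.head(feat_hi)             # relative weight in [0, 1]
        return alpha * pred_hi + (1 - alpha) * pred_lo
```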
(MFANet) based on deep learning, which effectively integrates a pyramid module and a channel attention mechanism. The pyramid module is designed for feature fusion in the channel and spatial dimensions. The channel attention mechanism obtains feature maps with different receptive fields, dividing each feature map ...
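A minimal pyramid-pooling sketch of this kind of multi-receptive-field fusion; the bin sizes and channel counts are assumptions, not MFANet's published configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidModule(nn.Module):
    """Sketch of pyramid feature fusion: pool the input to several grid
    sizes, project with 1x1 convolutions, upsample back, and
    concatenate, yielding features from different receptive fields."""
    def __init__(self, in_ch, out_ch, bins=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([nn.Conv2d(in_ch, out_ch, 1) for _ in bins])
        self.bins = bins

    def forward(self, x):                      # x: (B, C, H, W)
        h, w = x.shape[2:]
        feats = [x]
        for bin_size, proj in zip(self.bins, self.branches):
            y = F.adaptive_avg_pool2d(x, bin_size)
            y = F.interpolate(proj(y), size=(h, w), mode='bilinear',
                              align_corners=False)
            feats.append(y)
        return torch.cat(feats, dim=1)         # (B, C + len(bins)*out_ch, H, W)
```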
Additionally, attention mechanisms have been widely used in various computer vision tasks [16], [17]. They allow networks to focus on more valuable features, thereby enhancing their representational capability. Motivated by [16], Zhang et al. [9] designed the channel attention mechanism in RCAN that ...
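RCAN's channel attention follows the squeeze-and-excitation pattern; below is a minimal sketch, with the reduction ratio as an assumed hyperparameter.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Sketch of RCAN-style channel attention: global average pooling
    summarizes each channel, a two-layer 1x1-conv bottleneck scores the
    channels, and a sigmoid gate rescales the input."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())

    def forward(self, x):                      # x: (B, C, H, W)
        return x * self.gate(x)                # channel-wise rescaling
```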
First, an attention mechanism block is introduced to construct a new type of residual block combination. Second, a multi-scale structure is constructed by choosing appropriate convolution kernel sizes. Finally, the overall framework of MSA-ResNet is constructed for efficient training and failure ...
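Putting these steps together loosely, here is a sketch of one multi-scale residual block with an attention gate; all names, kernel sizes, and the SE-style gate are assumptions about the MSA-ResNet design, not its published definition.

```python
import torch
import torch.nn as nn

class MultiScaleAttnBlock(nn.Module):
    """Sketch: parallel convolutions with different kernel sizes
    capture multiple scales, a channel-attention gate reweights the
    fused output, and a residual connection preserves the input."""
    def __init__(self, ch, kernels=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(ch, ch, k, padding=k // 2) for k in kernels])
        self.fuse = nn.Conv2d(ch * len(kernels), ch, 1)
        self.gate = nn.Sequential(             # SE-style channel attention
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // 4, 1), nn.ReLU(),
            nn.Conv2d(ch // 4, ch, 1), nn.Sigmoid())

    def forward(self, x):
        y = torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)
        y = self.fuse(y)
        return x + y * self.gate(y)            # attended residual update
```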
SHI Hao, XING Yuhang, CHEN Lian. Facial expression recognition based on multi-scale feature fusion and attention mechanism [J]. Microelectronics & Computer, 2022, 39(3): 34-40. DOI: 10.19304/J.ISSN1000-7180.2021.0799
Both MSCAN-AFF and MSCAN-STN achieve state-of-the-art performance on challenging benchmark datasets, demonstrating the effectiveness of the attention mechanism in both face alignment and recognition. The major contributions of this work are as follows: The rest of the paper is organized as ...