The details of QS-Attn are given in the following two subsections. Attention for query selection. Randomly choosing the anchor q, the positive k+ and the negatives k- to compute the contrastive loss in Eq. (2), as CUT does, is inefficient, since their corresponding patches may not come from the domain-relevant regions, e.g., the horse body in the horse→zebra task. Some features do not reflect the domain characteristics, yet they tend to be kept intact during translation; hence Lcon on them matters little for G. Our aim is to select...
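For readability, here is a reconstruction of the referenced contrastive loss, under the assumption that Eq. (2) is the standard patch-wise InfoNCE objective used in CUT, with anchor q, positive k+, N negatives k-_n and temperature τ:

\ell(q, k^{+}, k^{-}) = -\log \frac{\exp(q \cdot k^{+}/\tau)}{\exp(q \cdot k^{+}/\tau) + \sum_{n=1}^{N} \exp(q \cdot k^{-}_{n}/\tau)}

If the anchor patch lies in a domain-irrelevant region, this term provides little useful signal to G, which is the inefficiency the query-selection step is meant to address.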
QS-Attn: Query-Selected Attention for Contrastive Learning in I2I Translation
Xueqi Hu¹, Xinyue Zhou¹, Qiusheng Huang¹, Zhengyi Shi¹, Li Sun¹,²,*, Qingli Li¹
¹Shanghai Key Laboratory of Multidimensional Information Processing, ²Key Laboratory of Advanced Theory and Application in...
In a previous study, a query-selected attention (QS-Attn) module, which employed an attention matrix with a probability distribution, was used to maximize the mutual information between the source and translated images. This module selected significant queries using an entropy criterion...
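To make the selection step concrete, below is a minimal PyTorch sketch (not the authors' released implementation) of entropy-based query selection: a global attention matrix is built from one feature map, the entropy of each query's attention distribution is measured, and the queries with the smallest entropy are kept. The function name select_queries_by_entropy, the tensor shapes, and num_selected=256 are illustrative assumptions.

import torch

def select_queries_by_entropy(feat, num_selected=256):
    # feat: (B, C, H, W) encoder feature map; shapes are illustrative.
    B, C, H, W = feat.shape
    q = feat.flatten(2).permute(0, 2, 1)            # (B, HW, C) queries
    k = feat.flatten(2)                             # (B, C, HW) keys
    attn = torch.softmax(torch.bmm(q, k), dim=-1)   # (B, HW, HW), each row is a distribution
    # Row-wise entropy: a low value means the query attends to only a few keys,
    # which is taken as a sign of a domain-relevant feature.
    entropy = -(attn * torch.log(attn + 1e-8)).sum(dim=-1)   # (B, HW)
    idx = entropy.argsort(dim=-1)[:, :num_selected]          # smallest entropy first
    # Keep only the selected rows to form the query-selected attention matrix.
    attn_qs = torch.gather(attn, 1, idx.unsqueeze(-1).expand(-1, -1, H * W))
    return attn_qs, idx

In this sketch, the returned indices can then be used to pick the corresponding anchors, positives and negatives for Lcon, and attn_qs can be used to route the value features, which is the role the query-selected attention matrix plays in the translation pipeline.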
Unsupervised Multi-Modal Medical Image Registration via Query-Selected Attention and Decoupled Contrastive Learning (IEEE Xplore). Authors: Z. Huang, B. Chen. Abstract: The translation-based unsupervised deformable image registration method has become one of the classic methods for multi-modal ...