Multi-modal contrastive mutual learning and pseudo-label re-learning for semi-supervised medical image segmentation. Keywords: multi-modal contrastive learning; pseudo-label re-learning; semi-supervised learning. 2022. Semi-supervised learning has great potential in medical image segmentation... S. Zhang, J. Zhang, B. Tian...
mscho@postech.ac.kr. Ildoo Kim, Kakao Brain, ildoo.kim@kakaobrain.com. Wook-Shin Han*, POSTECH, wshan@postech.ac.kr. Abstract: Consistency regularization on label predictions has become a fundamental technique in semi-supervised learning, but it still requires a large number of trainin...
In semi-supervised learning, combining a small amount of labeled data with abundant unlabeled data allows the model to learn a more generalized feature representation. Transfer learning enables the model to quickly adapt to new RS image datasets with scarce labels by utilizing pre-trained ...
Conditional Neural Processes. Zesheng Ye, Lina Yao, The University of New South Wales, zesheng.ye@unsw.edu.au, lina.yao@unsw.edu.au. Abstract: Conditional Neural Processes (CNPs) bridge neural networks with probabilistic inference to approximate functions of Stochastic Processes under meta-learning ...
Despite being in its early stages, self-supervised contrastive learning has shown great potential for overcoming the need for expert-created annotations in research on medical time series. Keywords: self-supervised learning; medical time series; deep learning; healthcare; pretext tasks; contrastive...