This paper investigates the transferability of zero-shot captioning to out-of-domain images. As illustrated in the figure, we demonstrate that pre-trained vision-language models and large language models are susceptible to modality bias induced by the language model when adapted to image-to-text...
To address the lack of labeled samples for training supervised classifiers on the target classes, we propose transferring samples from source classes with assigned pseudo labels, where the transferred samples are selected based on their transferability and diversity. Moreover, to account for the ...
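The selection criterion above can be sketched as a greedy procedure that scores source samples by similarity to the target classes (transferability) while penalizing redundancy with already-selected samples (diversity). The scoring rule, the cosine-similarity measure, and the trade-off weight `lam` here are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def select_transfer_samples(features, target_prototypes, k=100, lam=0.5):
    """Greedily pick k source samples, balancing transferability and diversity.

    features:          (n, d) source-sample embeddings
    target_prototypes: (c, d) class prototypes of the target classes

    Each step picks the sample maximizing
        transferability (max cosine similarity to any target prototype)
        minus lam * redundancy (max similarity to already-selected samples).
    Returns selected indices and their pseudo labels (nearest prototype).
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = target_prototypes / np.linalg.norm(target_prototypes, axis=1, keepdims=True)
    sim = f @ p.T                      # (n, c) similarity to each target class
    transfer_score = sim.max(axis=1)   # transferability of each sample
    pseudo_labels = sim.argmax(axis=1) # pseudo label = closest target class

    selected = []
    redundancy = np.zeros(len(f))
    for _ in range(min(k, len(f))):
        score = transfer_score - lam * redundancy
        score[selected] = -np.inf      # never re-pick a sample
        i = int(score.argmax())
        selected.append(i)
        # diversity penalty: similarity to the closest selected sample so far
        redundancy = np.maximum(redundancy, f @ f[i])
    return selected, pseudo_labels[selected]
```

In practice the prototypes could be attribute embeddings or class-name text embeddings; the greedy max-marginal-relevance form is just one simple way to trade the two criteria off.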
Our study aims to fill this gap by providing a detailed analysis of Cross-Lingual Multi-Transferability (many-to-many transfer learning) on recent IE corpora that cover a diverse set of languages. Specifically, we first determine the correlation between single-transfer performance and a ...
This approach deconstructs textual semantic expression features and performs fine-grained modeling to enhance feature transferability, facilitating zero-shot stance detection. We introduce a multi-expert collaborative feature learning framework designed to learn fine-grained features, serving as a ...
(ACM) to weight-fuse the semantic features of seen classes and attribute labels to correct semantic supervisory information; then it uses an adaptive data distribution adjustment strategy to balance the discriminability and transferability of the model and alleviate domain drift problems; finally, it ...
For a more formal discussion of the terminologies parametrization and transfer, see Appendix A. We emphasize that, to ensure transferability of any hyperparameter (such as the learning rate), it is not sufficient to reparametrize only that hyperparameter; rather, we need to identify and correctly ...
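The point about joint reparametrization can be illustrated with a toy width-scaling rule. In the sketch below, hidden- and output-layer learning rates are rescaled with the model width so that a rate tuned at a small base width stays near-optimal at larger widths; the specific 1/width rule is a simplified, μP-inspired assumption, and a real parametrization would also jointly rescale initializations and output multipliers:

```python
def width_scaled_lrs(base_lr, base_width, width):
    """Per-layer learning rates under an assumed width-aware parametrization.

    Illustrative rule only: the input layer's fan-in is fixed by the data
    dimension, so its LR is width-independent, while hidden/output layers
    have fan-in proportional to width, so their LRs scale as base_width/width.
    Under this rule, a base_lr tuned at base_width transfers across widths.
    """
    ratio = base_width / width
    return {
        "input":  base_lr,          # fan-in set by the data, not the width
        "hidden": base_lr * ratio,  # fan-in grows linearly with width
        "output": base_lr * ratio,
    }
```

Tuning once at `base_width` and reusing the returned rates at any `width` is the transfer workflow the text alludes to; reparametrizing only one of these groups would break the transfer.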
With detailed analysis, we find interesting patterns showing that RNN-based architectures can transfer well for languages that are close to English, while self-attentive models have better cross-lingual transferability across a wide range of languages.
More recently, research has explored forward transfer in continual learning, on the premise that, as knowledge accumulates, higher transferability to the next task, as measured by zero-shot assessment, should be attained. Their evaluation space either includes the next task...
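The zero-shot assessment described above can be sketched as follows; `eval_fn` is a hypothetical callback, and the metric shape (evaluate on task t+1 after training through task t, with no training on t+1) is one common convention rather than any specific paper's protocol:

```python
def zero_shot_forward_transfer(eval_fn, num_tasks):
    """Zero-shot forward-transfer scores for a continual-learning run.

    eval_fn(trained_through, eval_task) is an assumed callback returning a
    score (e.g. accuracy) on eval_task for a model trained on tasks
    0..trained_through, with NO training on eval_task itself. The list of
    scores should rise if accumulated knowledge transfers forward.
    """
    return [eval_fn(t, t + 1) for t in range(num_tasks - 1)]
```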