Paper 1: MULTI-TASK SEQUENCE TO SEQUENCE LEARNING. Paper 2: A Knowledge-Grounded Neural Conversation Model. I list this article's takeaways and the important papers up front, so that you have a rough picture before reading the body; the article is also organized along the logical thread of those takeaways. 1. What is multi-task learning? Multi-task learning (MTL) is rather abstract when explained in words alone, so here we first ...
Sequence to sequence learning has recently emerged as a new paradigm in supervised learning. To date, most of its applications have focused on only one task, and not much work has explored this framework for multiple tasks. This paper examines three multi-task learning (MTL) settings for sequence to sequence models: the one-to-many setting, where an encoder is shared across several tasks; the many-to-one setting, where only a decoder can be shared; and the many-to-many setting, where multiple encoders and decoders are shared.
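To make the one-to-many setting concrete, here is a minimal PyTorch sketch (not the authors' code): one shared encoder feeds task-specific decoders, and each update step samples a task to train on. The vocabulary sizes, hidden dimensions, the 0.9/0.1 mixing ratio, and the random toy batches are all assumptions made for the sake of a runnable example.

```python
# Minimal one-to-many MTL seq2seq sketch: shared encoder, per-task decoders.
import random
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, vocab, emb=256, hid=512):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hid, batch_first=True)

    def forward(self, src):
        _, state = self.lstm(self.embed(src))
        return state                       # (h, c), handed to every decoder

class TaskDecoder(nn.Module):
    def __init__(self, vocab, emb=256, hid=512):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)

    def forward(self, tgt, state):
        h, _ = self.lstm(self.embed(tgt), state)
        return self.out(h)                 # per-step vocabulary logits

encoder = SharedEncoder(vocab=10000)       # e.g. English source text
decoders = nn.ModuleDict({
    "translation": TaskDecoder(vocab=12000),   # e.g. German targets
    "parsing": TaskDecoder(vocab=300),         # e.g. linearized parse trees
})
mixing = {"translation": 0.9, "parsing": 0.1}  # per-task update probabilities
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoders.parameters()))

def toy_batch(tgt_vocab):
    src = torch.randint(0, 10000, (8, 20))
    tgt = torch.randint(0, tgt_vocab, (8, 15))
    return src, tgt

for step in range(10):
    # Pick which task this update trains, proportional to its mixing ratio.
    task = random.choices(list(mixing), weights=list(mixing.values()))[0]
    src, tgt = toy_batch(decoders[task].out.out_features)
    logits = decoders[task](tgt[:, :-1], encoder(src))
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

Sampling tasks by a mixing ratio, rather than summing all losses in one step, mirrors the alternating-update flavor of training described in the paper; the ratio controls how often each task gets to move the shared encoder.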
Despite the popularity of multi-task learning and sequence to sequence learning, there has been little work in combining MTL with seq2seq learning. To the best of our knowledge, there is only one recent publication by Dong et al. (2015), which applies seq2seq models to machine translation, where...
Background: a model that focuses on a single task in isolation can miss latent information in related tasks that could improve the target task; sharing some of the parameters across tasks may help the original task generalize better. Broadly speaking, any training setup with more than one loss counts as MTL; it also goes under the names joint learning, learning to learn, and learning with auxiliary tasks.
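As a concrete reading of "more than one loss counts as MTL", below is a minimal hard-parameter-sharing sketch: a shared trunk, two task-specific heads, and a weighted sum of the two losses driving one optimizer. The trunk/head shapes, the 1.0/0.5 weights, and the toy data are illustrative assumptions, not values from either paper.

```python
import torch
import torch.nn as nn

trunk = nn.Sequential(nn.Linear(32, 64), nn.ReLU())   # shared parameters
head_a = nn.Linear(64, 5)                             # task A: 5-way classification
head_b = nn.Linear(64, 1)                             # task B: scalar regression

opt = torch.optim.Adam(list(trunk.parameters())
                       + list(head_a.parameters())
                       + list(head_b.parameters()))

x = torch.randn(16, 32)                               # toy shared inputs
y_a = torch.randint(0, 5, (16,))                      # toy labels, task A
y_b = torch.randn(16, 1)                              # toy targets, task B

features = trunk(x)                                   # one shared representation
loss_a = nn.functional.cross_entropy(head_a(features), y_a)
loss_b = nn.functional.mse_loss(head_b(features), y_b)
loss = 1.0 * loss_a + 0.5 * loss_b                    # weighted sum of >1 losses = MTL

opt.zero_grad()
loss.backward()
opt.step()
```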
What is popular at the moment is learning what to share (reported to outperform hard parameter sharing); learning over a hierarchy of tasks is also useful when the tasks involve factors at multiple granularities. 3. Auxiliary tasks. Here we care only about the main-task objective, but hope to profit from other, well-chosen auxiliary tasks! Current ways of choosing auxiliary tasks (see the sketch after this list): Related task: the conventional choice (autonomous driving + road-sign recognition; query classification + web search...
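Here is a minimal sketch of the auxiliary-task recipe under the same illustrative assumptions (the head sizes, the weight lam, and the toy data are placeholders): the auxiliary head adds a weighted term to the training loss and is simply ignored at inference, so it can only help by shaping the shared representation.

```python
import torch
import torch.nn as nn

shared = nn.Sequential(nn.Linear(128, 256), nn.ReLU())  # representation both tasks share
main_head = nn.Linear(256, 10)                          # main task (what we actually deploy)
aux_head = nn.Linear(256, 4)                            # auxiliary task, train-time only

opt = torch.optim.Adam(list(shared.parameters())
                       + list(main_head.parameters())
                       + list(aux_head.parameters()))
lam = 0.3                                               # auxiliary weight; a tuning assumption

x = torch.randn(32, 128)                                # toy inputs
y_main = torch.randint(0, 10, (32,))                    # toy main-task labels
y_aux = torch.randint(0, 4, (32,))                      # toy auxiliary labels

z = shared(x)
loss = (nn.functional.cross_entropy(main_head(z), y_main)
        + lam * nn.functional.cross_entropy(aux_head(z), y_aux))
opt.zero_grad()
loss.backward()
opt.step()

# At inference only the main head is read out; aux_head is discarded.
with torch.no_grad():
    pred = main_head(shared(x)).argmax(dim=-1)
```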
References:
Luong, M.-T., Le, Q.V., Sutskever, I., Vinyals, O., Kaiser, L.: Multi-task sequence to sequence learning. ICLR 2016. arXiv:1511.06114
Dong, D., Wu, H., He, W., Yu, D., Wang, H.: Multi-task learning for multiple language translation. ACL 2015.
Hashimoto, K., Xiong, C., Tsuruoka, Y., Socher, R.: A joint many-task model: growing a neural network for multiple NLP tasks. arXiv (2016)