Machine translation is one of the biggest applications of NLP. Learn about neural machine translation and its implementation in Python using Keras.
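For orientation, here is a minimal character-level encoder-decoder sketch in Keras. It is only an assumed setup, not the tutorial's actual code; the vocabulary sizes and layer dimensions are placeholders.

```python
# Minimal character-level encoder-decoder sketch in Keras (assumed setup,
# not the tutorial's exact code). Vocabulary sizes and dimensions are placeholders.
from tensorflow import keras
from tensorflow.keras import layers

num_encoder_tokens = 70   # source-side vocabulary size (placeholder)
num_decoder_tokens = 90   # target-side vocabulary size (placeholder)
latent_dim = 256          # size of the LSTM hidden state

# Encoder: reads the source sequence and keeps only its final states.
encoder_inputs = keras.Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: generates the target sequence, initialised with the encoder states.
decoder_inputs = keras.Input(shape=(None, num_decoder_tokens))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_outputs = layers.Dense(num_decoder_tokens, activation="softmax")(decoder_outputs)

model = keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
model.summary()
```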
NLP paper notes: "Neural Machine Translation by Jointly Learning to Align and Translate", neural machine translation by jointly learning alignment and translation (Part 1).
Dual Learning for Machine Translation. Source code: https://github.com/yistLin/pytorch-dual-learning. The authors propose dual learning as a way to exploit monolingual data more efficiently: with dual learning, monolingual data can play the same role as parallel corpora, reducing the dependence on parallel data during training. Concretely, dual learning for translation models can be described as the following game between two agents. The first...
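To make the two-agent game concrete, below is a toy sketch of one round of the dual-learning loop. It is my own illustration, not code from the linked repository: the translation models, language model, and reconstruction score are trivial stand-ins, and a real setup would use warm-started NMT models with a policy-gradient update driven by the same reward.

```python
# Toy illustration of the two-agent dual-learning game (an illustration only,
# not the repo's code). All model functions below are placeholders.

def translate_xy(sentence):            # agent 1's model f: X -> Y (placeholder)
    return sentence.upper()

def translate_yx(sentence):            # agent 2's model g: Y -> X (placeholder)
    return sentence.lower()

def lm_y(sentence):                    # language model of language Y (placeholder score)
    return -len(sentence) * 0.01       # stand-in log-probability

def reconstruction_logprob(original, reconstructed):
    # Stand-in for log P(original | reconstructed; g): rewards exact reconstruction.
    return 0.0 if original == reconstructed else -1.0

alpha = 0.5                            # trade-off between the two rewards

def dual_game(monolingual_x):
    """One round of the game on a monolingual sentence from language X."""
    mid = translate_xy(monolingual_x)                  # agent 1 translates X -> Y
    r_communication = lm_y(mid)                        # agent 2 judges fluency with LM_Y
    back = translate_yx(mid)                           # agent 2 translates back Y -> X
    r_reconstruction = reconstruction_logprob(monolingual_x, back)
    total = alpha * r_communication + (1 - alpha) * r_reconstruction
    # In the real algorithm this reward drives policy-gradient updates of f and g.
    return total

print(dual_game("monolingual data can supervise itself"))
```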
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Paper link: Neural Machine Translation in Linear Time. Title: Neural Machine Translation in Linear Time. Source: arXiv. Authors: DeepMind. Problem: the paper proposes ByteNet, a new source-target network architecture built by stacking two dilated convolutional networks; it handles the machine translation task while keeping the time complexity linear. Related work: 1. Dilated convolution...
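As a rough illustration of the building block, the Keras sketch below stacks causal dilated 1D convolutions with doubling dilation rates, so the receptive field grows exponentially while the per-layer cost stays linear in sequence length. This is not the ByteNet reference implementation, and all sizes are placeholders.

```python
# Sketch of a stack of causal dilated 1D convolutions in Keras (an illustration
# of the building block, not the actual ByteNet implementation).
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, embed_dim, channels = 256, 128, 128   # placeholder sizes

inputs = keras.Input(shape=(None,), dtype="int32")
x = layers.Embedding(vocab_size, embed_dim)(inputs)

# Each layer doubles the dilation rate, so the receptive field grows
# exponentially while the per-layer cost stays linear in sequence length.
for dilation in (1, 2, 4, 8):
    x = layers.Conv1D(channels, kernel_size=3, padding="causal",
                      dilation_rate=dilation, activation="relu")(x)

outputs = layers.Dense(vocab_size, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.summary()
```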
marian-nmt/marian: Fast Neural Machine Translation in C++.
This paper was the first work to use the attention mechanism in NLP. Translation is a classic seq2seq problem. What is a seq2seq problem? Simply put: given an input sequence X, generate an output sequence Y, where the sequence lengths are not fixed. When X and Y are in different languages, this is machine translation; when X is a question and Y is an answer, this is question answering or dialogue. Depending on the input and output...
These notes record a landmark paper published in 2014 (now cited more than 5,000 times). It was the first to introduce the attention mechanism into NLP; the notes are meant to help beginners get started quickly and to serve as a review for myself. Paper link: https://arxiv.org/pdf/1409.0473.pdf. Outline: Abstract, Core idea, Summary. --- Part One: Abstract --- 1.1 Paper abstract: In recent years, neural-network-based machine translation models...
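The core of the paper is the additive attention score e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j), normalised with a softmax and used to form a context vector. The NumPy sketch below computes the weights and the context vector for one decoder step; the dimensions and random parameters are placeholders for illustration only.

```python
# NumPy sketch of the additive attention from Bahdanau et al. (2014):
#   e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j),  alpha_ij = softmax_j(e_ij),
#   c_i  = sum_j alpha_ij * h_j
# Dimensions and the random parameters are placeholders for illustration.
import numpy as np

rng = np.random.default_rng(0)
T, enc_dim, dec_dim, attn_dim = 6, 8, 8, 10   # source length and layer sizes

h = rng.normal(size=(T, enc_dim))       # encoder annotations h_1..h_T
s_prev = rng.normal(size=(dec_dim,))    # previous decoder state s_{i-1}

W_a = rng.normal(size=(attn_dim, dec_dim))
U_a = rng.normal(size=(attn_dim, enc_dim))
v_a = rng.normal(size=(attn_dim,))

scores = np.tanh(W_a @ s_prev + h @ U_a.T) @ v_a      # e_i1 .. e_iT
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                                   # attention weights alpha_ij
context = alpha @ h                                    # context vector c_i

print(alpha.round(3), context.shape)
```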
(NLP) tasks in translation. Part Three focuses on the role of data in both human and machine learning processes. It proposes that a translator’s unique value lies in the capability to create, manage, and leverage language data in different ML tasks in the translation process. It outlines ...