Abstract
Neural machine translation (NMT) models typically consist of an encoder and a decoder.
Encoder: extracts a fixed-length representation from a variable-length input sequence.
Decoder: generates a correct translation from that fixed-length representation.
Models studied in this paper: the RNN Encoder–Decoder and a newly proposed gated recursive convolutional neural network (grConv).
The grConv is additionally found to learn a grammatical structure of a sentence automatically.
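Below is a minimal sketch of this encoder-decoder pipeline, assuming PyTorch; the class name, embedding/hidden dimensions, and use of `nn.GRU` are illustrative choices, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Minimal RNN Encoder-Decoder sketch: the encoder compresses a variable-length
    source sentence into one fixed-length vector; the decoder generates the target
    sentence conditioned on that vector."""
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encoder: the final hidden state is the fixed-length representation.
        _, context = self.encoder(self.src_emb(src_ids))            # (1, B, H)
        # Decoder: generate the target sequence conditioned on the context vector.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), context)   # (B, T, H)
        return self.out(dec_out)                                    # logits over target vocab
```

During training the decoder is fed the gold target prefix (teacher forcing); at test time tokens are generated step by step from the context vector.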
Introduction
Problem:
it is crucial to understand the properties and behavior of this new neural machine translation approach in order to determine future research directions. Also, understanding the weaknesses and strengths of neural machine translation might lead to better ways of integrating SMT and neural machine translation systems
Neural networks for variable-length sequences:
recurrent neural networks and the gated recursive convolutional neural network (grConv).
RNN update:
An RNN can effectively learn a probability distribution over a sequence:
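Filling in the equations these two notes point at (standard formulation, consistent with the paper's setup): the hidden state is updated from the previous state and the current input, and the joint probability of a sequence factorizes into per-step conditionals computed from the hidden state.

$$h_t = f(h_{t-1}, x_t)$$

$$p(x_1, \ldots, x_T) = \prod_{t=1}^{T} p(x_t \mid x_1, \ldots, x_{t-1})$$

Here $f$ is the (gated) recurrent activation, and each conditional $p(x_t \mid x_{<t})$ is computed from the hidden state, e.g. through a softmax output layer.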
Gated Recursive Convolutional Neural Network
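As a reading note, the core of the grConv (my paraphrase of the paper's formulation; gate parameterization details omitted) is a binary-tree-style recursion in which each node is a gated convex combination of its left child, its right child, and a fresh candidate activation:

$$h_j^{(t)} = \omega_c \tilde{h}_j^{(t)} + \omega_l h_{j-1}^{(t-1)} + \omega_r h_j^{(t-1)}, \qquad \omega_c + \omega_l + \omega_r = 1$$

$$\tilde{h}_j^{(t)} = \tanh\!\left(W_l h_{j-1}^{(t-1)} + W_r h_j^{(t-1)}\right)$$

The gating coefficients are computed from the two children (via a softmax), which is what allows the network to pick up a grammatical structure of the sentence.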
Purely Neural Machine Translation
Encoder–Decoder Approaches
Experiments
Dataset: English-to-French translation.
Models:
Common to both models: an RNN with gated hidden units as the decoder.
Optimization: minibatch stochastic gradient descent with AdaDelta.
Beam search is used to find a translation that maximizes the conditional probability.
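A minimal beam-search sketch in Python; the `step` callback, beam width, and token IDs are illustrative assumptions (a real decoder would also handle length normalization and batched scoring):

```python
def beam_search(step, bos_id, eos_id, beam_width=5, max_len=50):
    """step(prefix) -> list of (token_id, log_prob) candidates for the next token.
    Keeps the `beam_width` highest-scoring prefixes at each step and returns the
    best complete hypothesis as (token sequence, total log-probability)."""
    beams = [([bos_id], 0.0)]              # (token prefix, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            for tok, logp in step(prefix):
                cand = (prefix + [tok], score + logp)
                (finished if tok == eos_id else candidates).append(cand)
        if not candidates:
            break
        # Prune to the top `beam_width` partial hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    finished.extend(beams)                 # fall back to unfinished beams if needed
    return max(finished, key=lambda c: c[1])
```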
Results and Analysis
* The BLEU score is used as the evaluation metric.
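For reference, corpus-level BLEU can be computed with an off-the-shelf tool such as sacrebleu (not the paper's tooling; the example sentences below are made up and only the call pattern matters):

```python
import sacrebleu

hypotheses = ["the cat sits on the mat"]
references = [["the cat is sitting on the mat"]]  # one reference stream, parallel to hypotheses

# corpus_bleu takes a list of hypothesis strings and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(bleu.score)  # BLEU on a 0-100 scale
```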
Key finding: the fixed-length vector representation does not have enough capacity to encode a long sentence with complicated structure and meaning.
Reflections
I need to thoroughly work through the conditional probability formulation and the mathematical derivation behind the RNN. When reading a paper closely, pay attention to its algorithmic principles and its evaluation metrics.