Abstract
Sequence-to-sequence models are a powerful workhorse of NLP. Most variants employ a softmax transformation in both their attention mechanism and output …