Publication

Neural Associative Memory for Dual-Sequence Modeling

Dirk Weißenborn
In: Proceedings of the 1st Workshop on Representation Learning for NLP (RepL4NLP-2016), Berlin, Germany, August 11, 2016. ACL.

Abstract

Many important NLP problems can be posed as dual-sequence or sequence-to-sequence modeling tasks. Recent advances in building end-to-end neural architectures have been highly successful in solving such tasks. In this work we propose a new architecture for dual-sequence modeling that is based on associative memory. We derive AM-RNNs, a recurrent associative memory (AM) which augments generic recurrent neural networks (RNN). This architecture is extended to the Dual AM-RNN which operates on two AMs at once. Our models achieve very competitive results on textual entailment. A qualitative analysis demonstrates that long-range dependencies between the source and target sequence can be bridged effectively using Dual AM-RNNs. However, an initial experiment on auto-encoding reveals that these benefits are not exploited by the system when learning to solve sequence-to-sequence tasks, which indicates that additional supervision or regularization is needed.
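The abstract does not spell out the memory mechanism itself, but associative memories of this kind are commonly built on holographic reduced representations, in which a value is bound to a key by circular convolution and later retrieved by circular correlation. The following is a minimal NumPy sketch of that write/read cycle under those assumptions; the function names, vector dimensionality, and initialization are illustrative choices, not the paper's implementation.

import numpy as np

def circular_convolution(a, b):
    # Binding: (a * b)_k = sum_i a_i * b_{(k - i) mod n}, computed via FFT.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def circular_correlation(a, b):
    # Approximate unbinding: correlate the key with the memory trace.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

rng = np.random.default_rng(0)
n = 1024
# Random key and value; HRR-style memories assume roughly N(0, 1/n) entries.
key = rng.normal(0.0, 1.0 / np.sqrt(n), n)
value = rng.normal(0.0, 1.0 / np.sqrt(n), n)

# Write: superimpose the bound key-value pair onto the memory trace.
memory = np.zeros(n)
memory += circular_convolution(key, value)

# Read: retrieval returns the value plus noise.
retrieved = circular_correlation(key, memory)
cosine = retrieved @ value / (np.linalg.norm(retrieved) * np.linalg.norm(value))
print(f"cosine(retrieved, value) = {cosine:.3f}")  # well above chance, but noisy

Retrieval is only approximate: each additional pair stored in the trace adds noise to every read, which is why such memories are usually combined with redundant storage or a clean-up step before being embedded into a recurrent architecture.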
