An Empirical Exploration of Recurrent Network Architectures

Pages: 2342 - 2350
Published: Jul 6, 2015
Abstract
The Recurrent Neural Network (RNN) is an extremely powerful sequence model that is often difficult to train. The Long Short-Term Memory (LSTM) is a specific RNN architecture whose design makes it much easier to train. While wildly successful in practice, the LSTM's architecture appears to be ad hoc, so it is not clear whether it is optimal, and the significance of its individual components is unclear. In this work, we aim to determine whether the...
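For context, the "individual components" the abstract refers to are the LSTM's gates. A minimal sketch of one step of a standard LSTM cell is below; the weight layout and variable names are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a standard LSTM cell.

    W has shape (4*hidden, input+hidden): stacked weights for the
    input gate i, forget gate f, cell candidate g, and output gate o.
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * hidden:1 * hidden])   # input gate
    f = sigmoid(z[1 * hidden:2 * hidden])   # forget gate
    g = np.tanh(z[2 * hidden:3 * hidden])   # candidate cell update
    o = sigmoid(z[3 * hidden:4 * hidden])   # output gate
    c = f * c_prev + i * g                  # new cell state
    h = o * np.tanh(c)                      # new hidden state
    return h, c

# Toy dimensions for a single forward step.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.standard_normal(n_in),
                 np.zeros(n_hid), np.zeros(n_hid), W, b)
```

The additive cell update `c = f * c_prev + i * g` is what makes gradients easier to propagate than in a vanilla RNN, and it is variations of these gates that the paper evaluates empirically.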