Dropout: a simple way to prevent neural networks from overfitting

Journal of Machine Learning Research, Volume 15, Issue 1, Pages 1929-1958
Published: 2014
Abstract
Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much.
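
The mechanism the abstract describes can be illustrated with a short NumPy sketch. This is a minimal illustration, not the paper's implementation: the function names dropout_train and dropout_test and the retention probability p_retain=0.5 are assumptions made for the example. Training multiplies a layer's activations by a Bernoulli mask; at test time the full ("unthinned") network is used with outputs scaled by the retention probability, matching the abstract's "smaller weights" approximation.

    import numpy as np

    def dropout_train(activations, p_retain=0.5, rng=None):
        """Training-time dropout: keep each unit with probability p_retain,
        zeroing it (and hence its outgoing connections) otherwise."""
        rng = rng if rng is not None else np.random.default_rng()
        mask = rng.random(activations.shape) < p_retain  # Bernoulli(p_retain) mask
        return activations * mask

    def dropout_test(activations, p_retain=0.5):
        """Test time: keep the full network and scale outputs by p_retain,
        so each unit's expected output matches training."""
        return activations * p_retain

    # Toy hidden-layer output, purely for illustration
    h = np.array([0.9, 0.1, 0.4, 0.7])
    print(dropout_train(h))  # random units zeroed out
    print(dropout_test(h))   # all units kept, scaled down

Most modern frameworks implement the equivalent "inverted" variant, scaling by 1/p_retain during training so that no scaling is needed at test time.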