The Stochastic Delta Rule: Faster and More Accurate Deep Learning Through Adaptive Weight Noise

Volume: 32, Issue: 5, Pages: 1018 - 1032
Published: May 1, 2020
Abstract
Multilayer neural networks have led to remarkable performance on many kinds of benchmark tasks in text, speech, and image processing. Nonlinear parameter estimation in hierarchical models is known to be subject to overfitting and misspecification. One approach to these estimation and related problems (e.g., saddle points, collinearity, feature discovery) is called Dropout. The Dropout algorithm removes hidden units according to a binomial random...
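The abstract is truncated above, but it contrasts two ideas: Dropout, which zeroes hidden units with a binomial (Bernoulli) mask, and the stochastic delta rule, which treats each weight as a random variable with its own mean and standard deviation so that noise is injected adaptively at the weight level. The sketch below is a minimal NumPy illustration of those two mechanisms; the class names, the fixed annealing schedule, and the omission of the paper's actual update rules for the weight mean and standard deviation are all simplifying assumptions, not the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(h, p_drop=0.5, train=True):
    """Dropout: zero each hidden unit with probability p_drop (binomial mask),
    rescaling survivors so the expected activation is unchanged (inverted dropout)."""
    if not train:
        return h
    mask = rng.binomial(1, 1.0 - p_drop, size=h.shape)
    return h * mask / (1.0 - p_drop)

class NoisyWeightLinear:
    """Illustrative linear layer with per-weight Gaussian noise.

    Each weight carries a mean `mu` and standard deviation `sigma`; a fresh
    sample is drawn on every forward pass during training, and `sigma` is
    annealed toward zero so the layer converges to deterministic mean weights.
    (Sketch only; the paper's gradient-based updates for mu and sigma are
    not reproduced here.)"""

    def __init__(self, n_in, n_out, init_sigma=0.05, anneal=0.999):
        self.mu = rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_out))
        self.sigma = np.full((n_in, n_out), init_sigma)
        self.anneal = anneal

    def forward(self, x, train=True):
        if train:
            w = self.mu + self.sigma * rng.standard_normal(self.mu.shape)
            self.sigma *= self.anneal  # shrink the injected noise over training
        else:
            w = self.mu               # use mean weights at test time
        return x @ w

# Tiny usage example on random data.
x = rng.standard_normal((4, 8))
layer = NoisyWeightLinear(8, 3)
h = dropout_forward(np.maximum(layer.forward(x), 0.0), p_drop=0.5)
```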