The Stochastic Delta Rule: Faster and More Accurate Deep Learning Through Adaptive Weight Noise
Abstract
Multilayer neural networks have led to remarkable performance on many kinds of benchmark tasks in text, speech, and image processing. Nonlinear parameter estimation in hierarchical models is known to be subject to overfitting and misspecification. One approach to these estimation and related problems (e.g., saddle points, collinearity, feature discovery) is called Dropout. The Dropout algorithm removes hidden units according to a binomial random...
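The truncated abstract describes Dropout as removing hidden units according to a binomial random variable. A minimal sketch of that masking step is shown below; this is an illustration of standard inverted Dropout, not code from the paper, and the function name `dropout` is our own.

```python
import numpy as np

def dropout(activations, p_drop=0.5, rng=None):
    """Inverted dropout: zero each hidden unit with probability p_drop
    and rescale survivors so the expected activation is unchanged.
    (Illustrative sketch, not the paper's implementation.)"""
    rng = np.random.default_rng() if rng is None else rng
    # Each unit is kept by an independent Bernoulli draw -- the
    # binomial random variable the abstract refers to.
    keep = rng.random(activations.shape) >= p_drop
    return activations * keep / (1.0 - p_drop)

h = np.ones((4, 3))  # toy hidden-layer activations
out = dropout(h, p_drop=0.5, rng=np.random.default_rng(0))
```

At test time the mask is simply omitted; the rescaling during training keeps the expected layer output consistent between the two regimes.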
Paper Details
Title: The Stochastic Delta Rule: Faster and More Accurate Deep Learning Through Adaptive Weight Noise
Published Date: May 1, 2020
Journal: Volume 32, Issue 5, pp. 1018–1032