Model Compression for ResNet via Layer Erasure and Re-training

Volume: 35, Issue: 3, Pages: C - 10
Published: May 1, 2020
Abstract
Residual Networks with convolutional layers are widely used in machine learning. Because stacking many layers lets them extract effective features from input data, they achieve high accuracy in many applications. However, stacking many layers also raises their computation cost. To address this problem, we propose Network Implosion, which erases multiple layers from Residual Networks without degrading accuracy. Our key idea is...
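The abstract's premise can be illustrated with a minimal sketch (not the authors' code, and the names below are illustrative): each residual block computes y = x + f(x), so erasing a block simply leaves the identity path y = x intact, and the remaining network still produces valid activations that re-training can then adapt.

```python
def residual_block(x, weight):
    """Toy residual block: identity plus a simple transform f(x) = weight * x."""
    return x + weight * x

def forward(x, weights, erased=frozenset()):
    """Run a stack of toy residual blocks, skipping any erased block indices."""
    for i, w in enumerate(weights):
        if i in erased:
            continue  # erased block: the identity skip passes x through unchanged
        x = residual_block(x, w)
    return x

# Full network vs. the same network with block 1 erased: the compressed
# forward pass still runs end to end because of the identity connections.
full = forward(1.0, [0.1, 0.2, 0.3])
compressed = forward(1.0, [0.1, 0.2, 0.3], erased={1})
```

In a real ResNet, f would be a convolutional subnetwork rather than a scalar multiply, but the structural point is the same: the skip connection makes layer erasure a well-defined operation, after which re-training recovers the lost accuracy.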