Exploiting potential of deep neural networks by layer-wise fine-grained parallelism

Volume: 102, Pages: 210 - 221
Published: Jan 1, 2020
Abstract
Deep neural networks (DNNs) have become increasingly important for big data analysis. They usually use data parallelism or model parallelism for extreme-scale computing. However, both approaches achieve their performance improvements mainly through coarse-grained parallelization schemes. Neither can fully exploit the potential parallelism of many-core systems (such as GPUs) for neural network models. Here, a new fine-grained...
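The two coarse-grained schemes the abstract contrasts can be sketched with a toy two-layer network: data parallelism splits the *batch* across workers, each running the full model, while model parallelism splits the *layers* across workers, with activations passed between them. This is a minimal NumPy illustration; the network shape, weights, and worker split are assumptions for illustration only, not the paper's method.

```python
import numpy as np

# Toy 2-layer MLP; the weights and sizes here are illustrative assumptions.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))

def forward(x):
    """Full model: two dense layers with a ReLU in between."""
    return np.maximum(x @ W1, 0) @ W2

x = rng.standard_normal((6, 4))  # a batch of 6 samples

# Data parallelism: split the batch across workers;
# each worker runs the whole model on its shard.
shards = np.array_split(x, 2)
y_data_parallel = np.concatenate([forward(s) for s in shards])

# Model parallelism: split the layers across workers;
# activations flow from one worker to the next.
def worker1(x):
    return np.maximum(x @ W1, 0)  # holds layer 1

def worker2(h):
    return h @ W2                 # holds layer 2

y_model_parallel = worker2(worker1(x))

# Both coarse-grained schemes compute the same result as the full model.
assert np.allclose(y_data_parallel, forward(x))
assert np.allclose(y_model_parallel, forward(x))
```

Both schemes parallelize at a coarse granularity (whole batches or whole layers); the paper's point is that finer-grained, layer-wise parallelism can expose more of a GPU's concurrency than either.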