Accelerated Distributed Nesterov Gradient Descent for smooth and strongly convex functions

Published: Sep 1, 2016
Abstract
This paper considers the distributed optimization problem over a network, where the objective is to minimize a global function formed by a sum of local functions, using only local computation and communication. We develop an Accelerated Distributed Nesterov Gradient Descent (Acc-DNGD) method for smooth and strongly convex functions. We show that it achieves a linear convergence rate and analyze how the convergence rate depends on the condition...
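The setting described in the abstract (each agent holds a local objective, and agents cooperate through local communication and Nesterov-style gradient steps) can be illustrated with a toy sketch. This is not the paper's exact Acc-DNGD update (which also maintains a gradient-estimation sequence); the mixing matrix, local quadratics, and step-size/momentum values below are all illustrative assumptions, and with a constant step size this simple scheme only converges to a neighborhood of the global minimizer rather than achieving the paper's exact linear rate.

```python
import numpy as np

# Toy distributed Nesterov-style gradient method (illustrative only, not the
# paper's Acc-DNGD). Agent i holds a local quadratic f_i(x) = 0.5*a_i*(x-b_i)^2;
# the global objective is the sum of the f_i.
n = 4
a = np.array([1.0, 2.0, 3.0, 4.0])   # local curvatures (assumed)
b = np.array([1.0, -1.0, 2.0, 0.0])  # local minimizers (assumed)
x_star = (a * b).sum() / a.sum()     # global minimizer of sum_i f_i

# Doubly stochastic mixing matrix for a 4-agent ring (assumed topology).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

eta, beta = 0.02, 0.6  # step size and momentum, tuned by hand for this toy
x = np.zeros(n)        # each agent's current estimate
y = np.zeros(n)        # Nesterov "lookahead" sequence

for _ in range(500):
    grad = a * (y - b)              # local gradients, evaluated at y
    x_new = W @ y - eta * grad      # consensus mixing + local gradient step
    y = x_new + beta * (x_new - x)  # Nesterov extrapolation
    x = x_new

# All agents end up clustered near x_star (up to an O(eta) consensus bias).
print(x, x_star)
```

Because the step size is constant and there is no gradient tracking, the agents agree only approximately and sit at a small distance from `x_star`; shrinking `eta` shrinks this bias at the cost of slower convergence, which is one motivation for the corrected/accelerated updates studied in the paper.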