The Kullback-Leibler Divergence Used in Machine Learning Algorithms for Health Care Applications and Hypertension Prediction: A Literature Review

Volume: 141, Pages: 448 - 453
Published: Jan 1, 2018
Abstract
The Kullback-Leibler divergence, or relative entropy, is a special case of a broader class of divergences. It measures how one probability distribution diverges from a second, expected probability distribution. The Kullback-Leibler divergence has many recent applications. Although the field of medicine continues to advance, it still requires statistical analysis to support emerging requirements....
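The definition in the abstract corresponds to the standard discrete formula D_KL(P ‖ Q) = Σ_i P(i) log(P(i) / Q(i)). The following minimal Python sketch illustrates that formula; it is an illustration of the general concept, not code from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes with P(i) > 0; D_KL is undefined (infinite)
    # when Q(i) == 0 at an outcome where P(i) > 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: two distributions over three outcomes (hypothetical values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value; equals 0 only if P == Q
```

Note that D_KL(P ‖ Q) is asymmetric: swapping `p` and `q` generally gives a different value, which is why it is a divergence rather than a distance metric.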