The Kullback-Leibler Divergence Used in Machine Learning Algorithms for Health Care Applications and Hypertension Prediction: A Literature Review
Abstract
Kullback-Leibler divergence, also known as relative entropy, is a special case of a broader class of divergences. It measures how one probability distribution diverges from a second, expected probability distribution. Kullback-Leibler divergence has a considerable number of current applications. Although the medical field continues to advance, it still requires statistical analysis to support its growing requirements....
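As a minimal illustration of the quantity the abstract describes, the following sketch computes the discrete KL divergence D(P || Q) between two example probability distributions (the distributions and function name here are illustrative, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(P || Q) = sum_i p_i * log(p_i / q_i).

    Terms with p_i == 0 contribute 0 by convention; q_i must be > 0
    wherever p_i > 0 for the divergence to be finite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two hypothetical distributions over three outcomes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # small positive value
print(kl_divergence(p, p))  # 0.0 -- the divergence vanishes only when P == Q
```

Note that KL divergence is not symmetric: D(P || Q) generally differs from D(Q || P), which is why the abstract speaks of one distribution diverging *from* another.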
Paper Details
Title
The Kullback-Leibler Divergence Used in Machine Learning Algorithms for Health Care Applications and Hypertension Prediction: A Literature Review
Published Date
Jan 1, 2018
Journal
Volume
141
Pages
448 - 453