Description
Information divergence measures have proved exceptionally valuable in a variety of disciplines within information theory, such as estimation of probability distributions, decision making, pattern recognition, analysis of contingency tables, turbulent flow, medical sciences, fuzzy sciences, etc. In this book, the author extends the work on a new functional divergence with new information inequalities, bounds, and their applications, presented in a simple way. This functional divergence was introduced by Jain and Saraswat in 2012. The summary of the book is as follows:
Chapter 1 introduces the historical background of Entropy and Divergence Measures.
Chapter 2 introduces several new information inequalities on the new functional divergence, together with their applications and numerical verification.
Chapter 3 explains new divergence measures of Csiszar's class (sketched after the chapter overview), their bounds, and their applications.
Chapter 4 characterizes a new series of divergences, their interrelations, and their applications.
Chapter 5 establishes several important and interesting relations among the new divergences and well-known divergences.
Chapter 6 introduces a new generalized parametric divergence for comparing a finite number of discrete probability distributions.
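For readers unfamiliar with the class of measures discussed above, the following is a minimal sketch of the general Csiszar f-divergence, C_f(P, Q) = sum_i q_i * f(p_i / q_i), instantiated with f(t) = t log t, which recovers the Kullback-Leibler divergence. The function names and the example distributions are illustrative assumptions and are not taken from the book or from Jain and Saraswat's papers.

```python
# Minimal sketch of a Csiszar f-divergence between two finite discrete
# probability distributions. Names and example values are illustrative.
import math
from typing import Callable, Sequence


def csiszar_divergence(p: Sequence[float], q: Sequence[float],
                       f: Callable[[float], float]) -> float:
    """Compute sum_i q_i * f(p_i / q_i) for strictly positive q_i."""
    if len(p) != len(q):
        raise ValueError("P and Q must be defined on the same support")
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))


# The generator f(t) = t*log(t) yields the Kullback-Leibler divergence KL(P || Q).
kl_generator = lambda t: t * math.log(t)

P = [0.5, 0.3, 0.2]   # example discrete probability distribution
Q = [0.4, 0.4, 0.2]   # second distribution on the same support

print(csiszar_divergence(P, Q, kl_generator))  # KL(P || Q) ≈ 0.0253
```

Other well-known divergences of the class are obtained by swapping in a different convex generator f; the new functional divergences studied in the book are built along similar lines.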