Kernel density estimation of Tsallis entropy with applications in adaptive system training

Leena Chawla, Vijay Kumar, Arti Saxena

Abstract


Information-theoretic learning plays an important role in adaptive learning systems, and many non-parametric entropy estimators have been proposed in the literature. This work explores a kernel density estimate of Tsallis entropy. First, it is proved that the Tsallis estimator is consistent for the underlying probability density function when the samples are linearly independent, and that it attains its minimum when all samples are equal. It is also shown that the Tsallis estimator is smooth whenever the kernel function is differentiable, symmetric, and unimodal. Further, important properties of the Tsallis estimator, such as scaling and invariance, are proved for both single and joint entropy estimation. The objective of this work is to clarify the mathematics behind the underlying concept.
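The abstract refers to a kernel-density (plug-in) estimate of Tsallis entropy. As an illustration only, the sketch below shows one common form of such an estimator, assuming a one-dimensional Gaussian kernel, Silverman's rule-of-thumb bandwidth, and the resubstitution approximation S_q ≈ (1 − (1/N) Σ_i f̂(x_i)^(q−1)) / (q − 1) for q ≠ 1. The function names, bandwidth rule, and kernel choice here are illustrative assumptions and are not taken from the paper.

import numpy as np

def gaussian_kernel(u):
    # Standard Gaussian kernel: differentiable, symmetric, and unimodal.
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kde(x_eval, samples, bandwidth):
    # Kernel density estimate: f_hat(x) = (1 / (N*h)) * sum_j K((x - x_j) / h).
    u = (x_eval[:, None] - samples[None, :]) / bandwidth
    return gaussian_kernel(u).mean(axis=1) / bandwidth

def tsallis_entropy_kde(samples, q=2.0, bandwidth=None):
    # Plug-in (resubstitution) estimate of Tsallis entropy,
    #   S_q = (1 - E[f(X)^(q-1)]) / (q - 1),  q != 1,
    # with f replaced by a Gaussian KDE evaluated at the sample points.
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    if bandwidth is None:
        # Silverman's rule of thumb for a 1-D Gaussian kernel (an assumed default).
        bandwidth = 1.06 * samples.std(ddof=1) * n ** (-1.0 / 5.0)
    f_hat = kde(samples, samples, bandwidth)
    return (1.0 - np.mean(f_hat ** (q - 1.0))) / (q - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    # For a standard normal and q = 2, the true value is 1 - 1/(2*sqrt(pi)) ≈ 0.718.
    print(tsallis_entropy_kde(x, q=2.0))

With a smooth kernel such as the Gaussian, the estimator is a differentiable function of the samples, which is what makes it usable as a training criterion in adaptive systems; the bandwidth choice shown is only one standard heuristic.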

Keywords


Entropy estimator; Information theoretic measure; Machine learning; Non-parametric estimator; Tsallis entropy


DOI: http://doi.org/10.11591/ijai.v13.i2.pp2247-2253


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938 
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
