Paper: MLSP-L3.5
Session: Learning Theory II
Time: Thursday, May 18, 11:20 - 11:40
Presentation: Lecture
Topic: Machine Learning for Signal Processing: Information-theoretic learning
Title: A Normalized Minimum Error Entropy Stochastic Algorithm
Authors: Seungju Han, Sudhir Rao, Kyu-Hwa Jeong, Jose Principe, University of Florida, United States
Abstract:
In this paper we propose the normalized Minimum Error Entropy (NMEE) algorithm. Following the same rationale that led to the normalized LMS, the weight update of the minimum error entropy (MEE) algorithm is constrained by the principle of minimum disturbance. Unexpectedly, we obtained an algorithm that is not only insensitive to the power of the input, but is also faster than MEE for the same misadjustment and less sensitive to the kernel size. We explain these results analytically and through system identification simulations.
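The abstract describes constraining the MEE weight update by the minimum-disturbance principle, by analogy with how NLMS is obtained from LMS. As a rough illustration of that idea only, and not the paper's actual derivation, the sketch below implements a Gaussian-kernel stochastic MEE gradient over a window of L pairwise error differences and normalizes the step by the input power, NLMS-style; the function name nmee_sketch, the specific normalization term, and all parameter values are assumptions made for illustration.

import numpy as np

def nmee_sketch(x, d, num_taps=4, L=10, sigma=1.0, mu=0.5, eps=1e-8):
    # Hypothetical sketch: stochastic MEE (Gaussian-kernel information
    # potential over the last L error differences) with an NLMS-style
    # normalization by the input power. The exact NMEE update derived in
    # the paper via the minimum-disturbance principle may differ.
    x = np.asarray(x, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.zeros(num_taps)
    err = np.zeros(len(x))
    for n in range(num_taps, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]           # current input vector
        en = d[n] - w @ xn                              # current error
        err[n] = en
        grad = np.zeros(num_taps)
        J = min(L, n - num_taps)                        # number of error pairs in the window
        for j in range(1, J + 1):
            xj = x[n - j - num_taps + 1:n - j + 1][::-1]
            ej = d[n - j] - w @ xj                      # past error, re-evaluated with current weights
            de = en - ej                                # pairwise error difference
            k = np.exp(-de ** 2 / (2 * sigma ** 2))     # Gaussian kernel of the difference
            grad += (k * de / sigma ** 2) * (xn - xj)   # stochastic information-potential gradient
        if J > 0:
            grad /= J
        # NLMS-style step normalization by the input power (assumed form):
        w = w + mu * grad / (xn @ xn + eps)
    return w, err

# Example use (illustrative system identification of a short FIR channel):
rng = np.random.default_rng(0)
h = np.array([0.5, -0.4, 0.3, 0.2])
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, _ = nmee_sketch(x, d, num_taps=4)

Dividing the step by the input power, as NLMS does, is what makes this toy update insensitive to input scaling; the paper's NMEE obtains its normalization from the minimum-disturbance constraint itself, so the resulting denominator should be taken from the paper rather than from this sketch.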