Paper: MLSP-L1.3
Session: Learning Theory I
Time: Wednesday, May 17, 10:40 - 11:00
Presentation: Lecture
Topic: Machine Learning for Signal Processing: Learning Theory and Modeling
Title: HIDDEN MARKOV MODEL FRAMEWORK USING INDEPENDENT COMPONENT ANALYSIS MIXTURE MODEL
Authors: Jian Zhou, Pixelworks Inc., Canada; Xiao-Ping Zhang, Ryerson University, Canada
Abstract: This paper describes a novel method for the analysis of sequential data that exhibits strong non-Gaussianity. In particular, we extend the classical continuous hidden Markov model (HMM) by modeling the observation densities as a mixture of non-Gaussian distributions. To obtain a parametric representation of these densities, we apply the independent component analysis (ICA) mixture model to the observations, so that each non-Gaussian mixture component is associated with a standard ICA. Under this new framework, we develop re-estimation formulas for the three fundamental HMM problems: likelihood computation, state sequence estimation, and model parameter learning. Simulations validate the theoretical results.
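
For illustration only (the paper's own re-estimation formulas are not reproduced here), the sketch below in Python/NumPy works out the first of the three problems, likelihood computation, using a log-space forward algorithm whose per-state emission density is an ICA mixture with an assumed super-Gaussian (sech) source prior. All function and parameter names (ica_mixture_logpdf, forward_log_likelihood, emission_params, and so on) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def logsumexp(a):
    """Numerically stable log(sum(exp(a)))."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def ica_mixture_logpdf(x, weights, unmixing, biases):
    """Log-density of one observation x under an ICA mixture model.

    Component k is a standard ICA: s = W_k (x - b_k), with an assumed
    super-Gaussian source prior p(s_i) = 1 / (pi * cosh(s_i)), so that
    p(x | k) = |det W_k| * prod_i p(s_i) and p(x) = sum_k weights[k] p(x | k).
    """
    log_comp = []
    for w_k, W_k, b_k in zip(weights, unmixing, biases):
        s = W_k @ (x - b_k)                               # recovered sources
        log_src = np.sum(-np.log(np.pi) - np.log(np.cosh(s)))
        log_det = np.log(np.abs(np.linalg.det(W_k)))      # change-of-variables term
        log_comp.append(np.log(w_k) + log_det + log_src)
    return logsumexp(np.array(log_comp))

def forward_log_likelihood(obs, log_pi, log_A, emission_params):
    """Likelihood computation (HMM problem 1) via a log-space forward pass.

    obs: (T, d) observation sequence; log_pi: (N,) initial-state log-probs;
    log_A: (N, N) transition log-probs; emission_params[j] is the tuple
    (weights, unmixing, biases) of the ICA mixture attached to state j.
    """
    T, N = len(obs), len(log_pi)
    log_b = np.array([[ica_mixture_logpdf(obs[t], *emission_params[j])
                       for j in range(N)] for t in range(T)])
    log_alpha = log_pi + log_b[0]                         # initialization
    for t in range(1, T):                                 # recursion
        log_alpha = np.array([logsumexp(log_alpha + log_A[:, j])
                              for j in range(N)]) + log_b[t]
    return logsumexp(log_alpha)                           # termination
```

Working in log space avoids the underflow that otherwise forces a scaled forward recursion on long sequences; the other two problems (Viterbi state estimation and EM-style parameter re-estimation) would reuse the same emission log-densities.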