A Nonstationary Hidden Markov Model With Approximately Infinitely-Long Time-Dependencies

Title: A Nonstationary Hidden Markov Model With Approximately Infinitely-Long Time-Dependencies
Publication Type: Conference Proceedings
Year of Conference: 2014
Authors: Chatzis, S., Kosmopoulos, D., Papadourakis, G.
Conference Name: International Symposium on Visual Computing
Volume: II
Edition: LNCS, Advances in Visual Computing
Pagination: 51-62
Abstract

Hidden Markov models (HMMs) are a popular approach for modeling sequential data, typically based on the assumption of a first-order Markov chain. In other words, only one-step-back dependencies are modeled, which is a rather unrealistic assumption in most applications. In this paper, we propose a method for postulating HMMs with approximately infinitely-long time-dependencies. Our approach considers the whole history of model states in the postulated dependencies, by making use of a recently proposed nonparametric Bayesian method for modeling label sequences with infinitely-long time-dependencies, namely the sequence memoizer. We manage to derive training and inference algorithms for our model with computational costs identical to those of simple first-order HMMs, despite its entailed infinitely-long time-dependencies, by employing a mean-field-like approximation. The efficacy of our proposed model is experimentally demonstrated.
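To make the contrast in the abstract concrete, the two factorizations of the hidden state sequence can be written as below. This is generic notation, not taken from the paper: s_t denotes the hidden state at time t and T the sequence length; the second line only illustrates the kind of full-history dependence that sequence-memoizer-style priors are designed to capture, not the authors' exact formulation.

% First-order HMM: each state depends only on its immediate predecessor.
p(s_1, \dots, s_T) = p(s_1) \prod_{t=2}^{T} p(s_t \mid s_{t-1})

% Full-history dependence: each state is conditioned on the entire preceding
% state sequence, which the paper approximates at first-order cost.
p(s_1, \dots, s_T) = p(s_1) \prod_{t=2}^{T} p(s_t \mid s_1, \dots, s_{t-1})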