Wednesday, 8 November 2017

Assignment 28: Reading 24 - Rabiner HMM

Bibliography:
Lawrence R. Rabiner. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE, Vol. 77, No. 2. pp. 257-286. 1989.

Summary:
This paper discusses the application of hidden Markov models (HMMs) to speech recognition. Signal models let us learn a great deal about real-world signal sources that are hard or expensive to capture and measure directly. These models broadly fall into two categories, deterministic and stochastic; the paper focuses on the applications of a particular type of stochastic model, the hidden Markov model.

A Markov model is a type of Bayesian model in which the current state depends only on a finite set of previous states. This allows us to define a state machine with transition probabilities between each of the states. A hidden Markov model extends this idea: the observation is a probabilistic function of the state, so the state sequence itself is not directly visible. The author explains this with "a simple coin tossing model" and "the urn and ball model".
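To make the coin-tossing idea concrete, here is a minimal sketch of a two-coin HMM in Python: the hidden state is which coin is being tossed, and only heads/tails is observed. The parameter values are made up for illustration, not taken from Rabiner's paper:

```python
import numpy as np

# Two hidden states (coin 1, coin 2); two observations (H=0, T=1).
# All numbers below are illustrative, not from the paper.
A = np.array([[0.7, 0.3],   # A[i, j] = P(next state j | current state i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],   # B[i, k] = P(observing k | state i)
              [0.2, 0.8]])  # coin 2 is biased toward tails
pi = np.array([0.5, 0.5])   # initial state distribution

# Simulate: the state sequence is "hidden"; only obs would be seen.
rng = np.random.default_rng(0)
state = rng.choice(2, p=pi)
obs = []
for _ in range(5):
    obs.append(int(rng.choice(2, p=B[state])))  # emit H/T from current coin
    state = rng.choice(2, p=A[state])           # transition to next coin
print(obs)
```

The triple (A, B, pi) is exactly the model λ that the paper's three problems are stated in terms of.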

HMMs help us solve three types of problems. 1) Evaluation: given an observation sequence and a model, how do we efficiently compute the probability of the observations given the model? 2) Decoding: how do we choose a state sequence that optimally explains the observations? 3) Learning: how do we adjust the model parameters to maximize the probability of the observations given the model?
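Problem 1 is solved efficiently by the forward algorithm, which the paper describes; a short sketch (with illustrative parameter values, not the paper's) might look like this:

```python
import numpy as np

def forward(obs, A, B, pi):
    """Forward algorithm: P(observation sequence | model), in O(N^2 * T)
    instead of the O(N^T) cost of enumerating every state sequence."""
    alpha = pi * B[:, obs[0]]            # initialization: alpha_1(i)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # induction: sum over previous states
    return alpha.sum()                   # termination: sum over final states

# Illustrative two-state model (same shape as the coin-tossing example).
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
p = forward([0, 0, 1], A, B, pi)
print(p)
```

Each step folds one observation into the running state distribution, which is why the cost grows only linearly in the sequence length.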

Discussion:
A Markov process is a nice way to simplify Bayesian systems (or the theorem of total probability) when the current state depends only on a finite set of past states. It's interesting to see that HMMs take this one step further and let us predict the states when we can only observe some 'effect' of being in a state. This also shows the importance of correctly identifying the direction of cause and effect before modelling our system as an HMM.
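That prediction of hidden states from observed effects is the paper's problem 2, solved by the Viterbi algorithm. A rough sketch, again with made-up two-state parameters rather than anything from the paper:

```python
import numpy as np

def viterbi(obs, A, B, pi):
    """Return the single most likely hidden-state sequence for obs.
    Works in log space to avoid underflow on long sequences."""
    T = len(obs)
    delta = np.log(pi) + np.log(B[:, obs[0]])   # best log-prob ending in each state
    psi = np.zeros((T, len(pi)), dtype=int)     # backpointers
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)     # scores[i, j]: come from i, go to j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    # Backtrack from the best final state.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Illustrative model: state 0 favours heads (0), state 1 favours tails (1).
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
path = viterbi([0, 0, 1, 1, 1], A, B, pi)
print(path)
```

The recovered path is the single best "cause" explanation for the observed "effects", which is exactly the cause-and-effect direction the HMM formulation relies on.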
