Hidden Markov Model: Tutorial

This is a tutorial paper on the Hidden Markov Model (HMM). First, we briefly review the background on Expectation Maximization (EM), the Lagrange multiplier, factor graphs, the sum-product algorithm, the max-product algorithm, and belief propagation by the forward-backward procedure. Then, we introduce probabilistic graphical models, including the Markov random field and the Bayesian network. The Markov property and the Discrete Time Markov Chain (DTMC) are also introduced. We then explain likelihood estimation and EM in HMM in technical detail. We cover evaluation in HMM, explaining both direct calculation and forward-backward belief propagation. Afterwards, estimation in HMM is covered, where both the greedy approach and the Viterbi algorithm are detailed. Then, we explain how to train HMM using EM and the Baum-Welch algorithm. We also explain how to use HMM in applications such as speech and action recognition.
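
As a minimal illustration of two of the steps the abstract mentions, evaluation via the forward recursion and state estimation via the Viterbi (max-product) algorithm, here is a short sketch for a discrete-observation HMM. The two-state parameters (A, B, pi) and the toy observation sequence are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of HMM evaluation (forward algorithm) and decoding (Viterbi).
# The parameters below describe a hypothetical 2-state HMM and are only for
# illustration; they are not from the tutorial paper.
import numpy as np

A = np.array([[0.7, 0.3],          # state transition probabilities a_ij
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],     # emission probabilities b_i(o) per state
              [0.6, 0.3, 0.1]])
pi = np.array([0.6, 0.4])          # initial state distribution
obs = [0, 1, 2]                    # observed symbol sequence

def forward(obs, A, B, pi):
    """Evaluation: P(obs | model) via the forward recursion (sum-product)."""
    alpha = pi * B[:, obs[0]]                 # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]         # alpha_t(j) = sum_i alpha_{t-1}(i) a_ij b_j(o_t)
    return alpha.sum()

def viterbi(obs, A, B, pi):
    """Estimation: most probable hidden state path via the max-product recursion."""
    delta = pi * B[:, obs[0]]
    backptr = []
    for o in obs[1:]:
        trans = delta[:, None] * A            # delta_{t-1}(i) * a_ij
        backptr.append(trans.argmax(axis=0))  # best predecessor for each state
        delta = trans.max(axis=0) * B[:, o]
    # Backtrack from the most probable final state.
    path = [int(delta.argmax())]
    for bp in reversed(backptr):
        path.append(int(bp[path[-1]]))
    return list(reversed(path)), float(delta.max())

print("P(obs) =", forward(obs, A, B, pi))
print("Viterbi path, prob =", viterbi(obs, A, B, pi))
```

Both routines are instances of belief propagation on the chain-structured graphical model of the HMM: the forward pass sums over hidden states, while Viterbi replaces the sums with maxima and keeps back-pointers to recover the state sequence.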
