Other Titles in Applied Mathematics

Hidden Markov Models and Dynamical Systems

3. Variants and Generalizations

    pp.47 - 57

      HMMs are special cases of discrete-time state space models characterized by a state transition probability function and an observation probability function, i.e.,

          P(S_{n+1} | S_n),    (3.1a)
          P(Y_n | S_n).        (3.1b)

      In Chapter 2, I described algorithms for fitting and using the probability distributions specified in (3.1) if both the set of possible states S and the set of possible observations Y have an unstructured finite number of discrete values. However, in many applications the measurements, and perhaps the states also, are thought of as being drawn from continuous vector spaces.
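As a concrete illustration of (3.1), here is a minimal sketch of drawing a state and observation sequence from a discrete HMM. The two-state, three-symbol matrices A and B are invented for illustration; they are not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model (not from the text):
#   A[i, j] = P(S_{n+1} = j | S_n = i)   -- equation (3.1a)
#   B[i, k] = P(Y_n = k | S_n = i)       -- equation (3.1b)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])

def sample_hmm(A, B, n_steps, s0=0):
    """Draw (states, observations) from the model specified in (3.1)."""
    states, obs = [], []
    s = s0
    for _ in range(n_steps):
        states.append(s)
        obs.append(rng.choice(B.shape[1], p=B[s]))   # Y_n ~ P(Y_n | S_n)
        s = rng.choice(A.shape[0], p=A[s])           # S_{n+1} ~ P(S_{n+1} | S_n)
    return np.array(states), np.array(obs)

states, obs = sample_hmm(A, B, 10)
```

With Gaussian observations, the matrix B would be replaced by a per-state mean and variance, which is the generalization developed in this chapter.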

      Since most experimental observations are measured and recorded digitally, one could argue that discrete approximations are adequate and attempt to use the algorithms of Chapter 2 anyway. That approach is disastrous because it precludes exploiting either the metric or topological properties of the space of measurements. Consider the histogram of the first 600 samples of Tang's laser data in Fig. 3.1. Neither 5 nor 93 occurs, but it seems more plausible that 93 will occur in the remainder of the samples than that 5 will, because there are 14 occurrences between 90 and 96 and none between 2 and 8. To make more effective use of measured data, one usually approximates the probabilities by functions with a small number of free parameters. For many such families of parametric models one can use the algorithms of Chapter 2 with minor modifications. For a practitioner, the challenge is to find or develop both a parametric family that closely matches the measured system and algorithms for fitting and using the models.
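The point can be made numerically. The sketch below uses synthetic samples as a stand-in for the laser data (the actual values are not reproduced here): an unstructured discrete model assigns probability zero to any unseen value, while a Gaussian fitted by maximum likelihood, using the metric structure of the measurement space, rates a value near the observed mass as far more probable than one far from it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the measurements: 600 samples clustered near 93,
# none anywhere near 5.  Illustrative only; not Tang's laser data.
samples = rng.normal(loc=93.0, scale=2.0, size=600)

# Unstructured discrete model: relative frequency of each integer value.
# Any value that never occurred, e.g. 5, gets probability exactly zero.
counts = np.bincount(np.round(samples).astype(int), minlength=101)
p_emp = counts / counts.sum()

# Parametric model: a single Gaussian fit by maximum likelihood
# (sample mean and standard deviation).
mu, sigma = samples.mean(), samples.std()

def gauss_pdf(y):
    """Density of the fitted Gaussian at y."""
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
```

Here `p_emp[5]` is exactly zero, whereas `gauss_pdf(93.0)` is many orders of magnitude larger than `gauss_pdf(5.0)`, matching the intuition drawn from the histogram.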

      In this chapter, I will describe some model families with Gaussian observations. I will use the failure of the maximum likelihood approach with such models to motivate and develop regularization. Also, I will touch on the relationships between HMM model families and other kinds of models.