Fundamental insight. When reassigning two-dimensional data points to cluster centroids c and d in k-means clustering (left), the hyperrectangles obtained from kd-trees reduce the computational effort: a claim about all points in a hyperrectangle can be made based on its vertices alone; consider, for example, the rightmost hyperrectangle. For sequences (middle), there is no overlap in the y-direction, and decisions about the most likely state can be made per block by considering the means μ₋, μ and μ₊ of the Gaussians of a three-state HMM (right). Note that at any given block, only a decision between the two states with the closest means is necessary if one assumes comparable variances. Decision boundaries are shown dashed.
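The vertex argument for the left panel can be sketched as follows. Since the set of points closer to centroid c than to centroid d is a halfspace (bounded by the perpendicular bisector of c and d), and a halfspace is convex, it contains an axis-aligned hyperrectangle whenever it contains all of the rectangle's vertices. The function name below and the brute-force enumeration of all 2^k vertices are illustrative only; practical kd-tree k-means implementations use a cheaper per-dimension test.

```python
from itertools import product


def closer_to_c_everywhere(lo, hi, c, d):
    """Return True if every point of the axis-aligned hyperrectangle
    spanned by corners lo and hi is closer to centroid c than to d.

    Because {x : |x - c| < |x - d|} is a halfspace (hence convex),
    checking the 2^k vertices of the hyperrectangle suffices.
    """
    for vertex in product(*zip(lo, hi)):  # all 2^k corner points
        dist_c = sum((v - ci) ** 2 for v, ci in zip(vertex, c))
        dist_d = sum((v - di) ** 2 for v, di in zip(vertex, d))
        if dist_c >= dist_d:
            return False  # some corner (hence some point) favors d
    return True


# The unit square is entirely closer to (0,0) than to (10,10),
# so all its points can be assigned to that centroid in one step.
print(closer_to_c_everywhere((0.0, 0.0), (1.0, 1.0),
                             (0.0, 0.0), (10.0, 10.0)))
```

When the test returns False, the hyperrectangle straddles the decision boundary and the points inside must be examined individually (or the kd-tree node is split further).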