Learning parameterized models of image motion

(with David Fleet, Yaser Yacoob, and Allan Jepson)

A framework for learning parameterized models of optical flow from image sequences is presented. A class of motions is represented by a set of orthogonal basis flow fields that are computed from a training set using principal component analysis. Many complex image motions can be represented by a linear combination of a small number of these basis flows. The learned motion models may be used for optical flow estimation and for model-based recognition. For optical flow estimation we describe a robust, multi-resolution scheme for directly computing the parameters of the learned flow models from image derivatives. As examples we consider learning motion discontinuities, non-rigid motion of human mouths, and articulated human motion.
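
The basis flow fields come from standard principal component analysis applied to vectorized example flows. A minimal sketch of that step, assuming the training flows are stored as a NumPy array of per-pixel (u, v) vectors (the function names and array layout below are illustrative, not the authors' code):

    import numpy as np

    def learn_basis_flows(training_flows, num_basis):
        """Learn orthogonal basis flow fields from example flows via PCA.

        training_flows : array of shape (N, H, W, 2) holding N example
                         optical flow fields (u and v components).
        Returns the mean flow and the first num_basis basis flow fields.
        """
        n, h, w, _ = training_flows.shape
        X = training_flows.reshape(n, -1)      # each flow as one row vector
        mean_flow = X.mean(axis=0)
        # SVD of the centered data matrix yields the principal components.
        _, _, vt = np.linalg.svd(X - mean_flow, full_matrices=False)
        basis = vt[:num_basis]                 # orthonormal rows
        return mean_flow.reshape(h, w, 2), basis.reshape(num_basis, h, w, 2)

    def reconstruct_flow(mean_flow, basis, coeffs):
        """A flow in the learned class is the mean flow plus a linear
        combination of the basis flow fields."""
        return mean_flow + np.tensordot(coeffs, basis, axes=1)

Because a few principal components capture most of the variation in the training set, a complex motion is summarized by the short coefficient vector passed to reconstruct_flow.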

Consider the problem of "learning" a model of a motion discontinuity:
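
The model is learned from example flow fields that contain discontinuities. One purely illustrative way such training examples could be generated synthetically (the actual construction of the training set is described in the papers below) is to split the image along an oriented line and assign different translations to the two sides:

    import numpy as np

    def synthetic_discontinuity_flow(h, w, angle, v_fg, v_bg):
        """One example flow with a step discontinuity: pixels on one side of
        an oriented line through the image center move with velocity v_fg,
        the rest with v_bg. (Hypothetical generator, for illustration.)"""
        ys, xs = np.mgrid[0:h, 0:w]
        d = (xs - w / 2) * np.cos(angle) + (ys - h / 2) * np.sin(angle)
        return np.where(d[..., None] > 0, v_fg, v_bg).astype(float)

    # A training set spans many boundary orientations and velocity pairs.
    rng = np.random.default_rng(0)
    training_flows = np.stack([
        synthetic_discontinuity_flow(32, 32,
                                     angle=rng.uniform(0, np.pi),
                                     v_fg=rng.uniform(-2, 2, size=2),
                                     v_bg=rng.uniform(-2, 2, size=2))
        for _ in range(200)])

Feeding such a set to learn_basis_flows above yields a small number of basis flows whose linear combinations approximate discontinuous flows over a range of boundary orientations and velocities.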

Example of a learned motion discontinuity model applied to the "Flower Garden" sequence:

One goal of this work is to detect and recognize motion features such as discontinuities:
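
As noted in the abstract, the model coefficients are computed directly from image derivatives. A single-scale, non-robust sketch of the core idea, assuming spatial and temporal derivatives Ix, Iy, It and the learned basis from above (the papers use a robust, coarse-to-fine estimator, and the thresholding test at the end is a simplified, hypothetical detection criterion rather than the authors' procedure):

    import numpy as np

    def estimate_coefficients(Ix, Iy, It, basis):
        """Least-squares estimate of the model coefficients a from image
        derivatives: linearized brightness constancy gives
        Ix*u + Iy*v + It = 0 at each pixel, and (u, v) = sum_i a_i * B_i
        makes this linear in a. Single-scale, non-robust illustration."""
        k = basis.shape[0]
        # One equation per pixel: G a = -It
        G = (Ix[None] * basis[..., 0]
             + Iy[None] * basis[..., 1]).reshape(k, -1).T
        a, *_ = np.linalg.lstsq(G, -It.ravel(), rcond=None)
        return a

    def looks_like_discontinuity(a, disc_indices, threshold):
        """Hypothetical test: flag a candidate motion discontinuity when the
        coefficients of the discontinuity-specific basis flows are large."""
        return np.linalg.norm(a[disc_indices]) > threshold

The estimated coefficients give a compact description of the motion that can then be used for model-based recognition, as mentioned in the abstract.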

Related Publications

D. J. Fleet, M. J. Black, Y. Yacoob, and A. D. Jepson, Design and use of linear models for image motion analysis, Int. J. of Computer Vision, 36(3), pp. 171-193, 2000. (pdf).

M. J. Black, Y. Yacoob, A. D. Jepson, and D. J. Fleet, Learning parameterized models of image motion, IEEE Conf. on Computer Vision and Pattern Recognition, CVPR-97, Puerto Rico, June 1997, pp. 561-567. (postscript)