We propose a new framework that allows simultaneous modelling
and tracking of articulated objects in real time. We introduce a non-probabilistic
graphical model and a new type of message that propagates explicit motion information
for realignment of feature constellations across frames. These messages
are weighted according to the rigidity of the relations between the source and
destination features. We also present a method for learning these weights as well
as the spatial relations between connected feature points, automatically identifying
deformable and rigid object parts. Our method is extremely fast and allows
simultaneous learning and tracking of nonrigid models containing hundreds of
feature points with negligible computational overhead.
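The core mechanism — motion messages between connected features, weighted by how rigidly each pair moves together — can be sketched as follows. This is a minimal illustrative implementation, not the authors' actual algorithm: the functions `rigidity_weights` and `propagate_motion`, the inverse-variance weighting, and the equal blending factor are all assumptions made for the sketch.

```python
import numpy as np

def rigidity_weights(rel_offsets):
    """Assumed rigidity measure: an edge whose relative offset between
    its two features is stable across frames (low variance) is treated
    as rigid and gets a weight near 1; deformable edges get low weight.
    rel_offsets has shape (n_frames, n_edges, 2)."""
    var = np.var(rel_offsets, axis=0).sum(axis=-1)  # scalar variance per edge
    return 1.0 / (1.0 + var)

def propagate_motion(positions, edges, weights, observed_motion, n_iters=5):
    """Pass motion messages over the feature graph: each feature blends
    its own observed motion with the rigidity-weighted average motion of
    its neighbours, then all features are realigned by the result."""
    motion = observed_motion.copy()
    for _ in range(n_iters):
        acc = np.zeros_like(motion)           # weighted neighbour motion
        wsum = np.zeros(len(motion))          # total incoming edge weight
        for (i, j), w in zip(edges, weights):
            acc[i] += w * motion[j]; wsum[i] += w
            acc[j] += w * motion[i]; wsum[j] += w
        new = motion.copy()
        mask = wsum > 0
        # blend own estimate with neighbour consensus (50/50 is arbitrary)
        new[mask] = 0.5 * motion[mask] + 0.5 * acc[mask] / wsum[mask, None]
        motion = new
    return positions + motion                 # realigned feature positions
```

Under this weighting, a rigid pair of features (constant relative offset across frames) receives an edge weight of 1, so their motions reinforce each other, while a highly deformable pair contributes almost nothing to each other's realignment.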