We address the problem of estimating human pose and body shape from 3D scans
over time. Reliable estimation of 3D body shape is necessary for many
applications including virtual try-on, health monitoring, and avatar creation
for virtual reality. Scanning bodies in minimal clothing, however, presents a
practical barrier to these applications. We address this problem by estimating
body shape under clothing from a sequence of 3D scans. Previous methods that
exploit body models produce overly smooth shapes that lack personalized detail.
We contribute a new approach that recovers a personalized shape: the estimated
shape deviates from a parametric body model to better fit the 3D scans. We
demonstrate the method using high quality 4D data as well as sequences of
visual hulls extracted from multi-view images. We also release BUFF, a new 4D
dataset that enables quantitative evaluation
(http://buff.is.tue.mpg.de). Our method outperforms the state of the art in
both pose estimation and shape estimation, qualitatively and quantitatively.