Analyzing Clothing Layer Deformation Statistics of 3D Human Motions
Abstract
Recent capture technologies and methods make it possible not only to retrieve 3D model sequences of moving people in clothing, but also to extract the underlying body geometry and motion component and to separate the clothing as a geometric layer. So far this clothing layer has only been used as raw offsets for individual applications, such as retargeting the clothing layer of one capture sequence onto a different body capture sequence, with limited scope, e.g. requiring identical or similar motions. The structured, semantic, and motion-correlated nature of the information contained in this layer has yet to be fully understood and exploited. To this end we propose a comprehensive analysis of the statistics of this layer with a simple two-component model, based on a PCA subspace reduction of the layer information on one hand, and on a generic parameter regression model using neural networks on the other hand, designed to regress from any semantic parameter whose variation is observed in a training set to the layer parameterization space. We show that this model not only reproduces previous motion retargeting work, but also generalizes the data generation capabilities of the method to other semantic parameters, such as clothing variation and size, or physical material parameters with synthetically generated training sequences, paving the way for many kinds of capture-data-driven creation and augmentation applications.
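To make the two-component model described above concrete, the sketch below pairs a PCA subspace reduction of per-frame clothing-layer offsets with a small neural-network regressor mapping semantic parameters to PCA coefficients. This is a minimal illustration under assumed conventions, not the authors' implementation: the tensor shapes, the vertex-offset representation of the layer, the placeholder random data, and the LayerRegressor network are all hypothetical.

import numpy as np
import torch
import torch.nn as nn

# Hypothetical data: N frames of a clothing layer, each a (V, 3) offset field over
# V template vertices, plus a semantic parameter vector per frame (e.g. pose,
# clothing size, material coefficients). All shapes are assumptions.
N, V, P, K = 500, 1000, 12, 50                            # frames, vertices, semantic params, PCA dims
layers = np.random.randn(N, V * 3).astype(np.float32)     # flattened offsets (placeholder data)
params = np.random.randn(N, P).astype(np.float32)         # semantic parameters (placeholder data)

# Component 1: PCA subspace reduction of the layer information.
mean = layers.mean(axis=0)
U, S, Vt = np.linalg.svd(layers - mean, full_matrices=False)
basis = Vt[:K]                                             # (K, V*3) principal directions
coeffs = (layers - mean) @ basis.T                         # (N, K) per-frame layer coefficients

# Component 2: neural-network regression from semantic parameters to layer coefficients.
class LayerRegressor(nn.Module):
    def __init__(self, in_dim, out_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = LayerRegressor(P, K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.from_numpy(params)
y = torch.from_numpy(coeffs.astype(np.float32))
for epoch in range(200):                                   # simple full-batch training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

# Synthesis: regress coefficients for new semantic parameters, then reconstruct the layer.
with torch.no_grad():
    new_coeffs = model(torch.randn(1, P)).numpy()
reconstructed_layer = (new_coeffs @ basis + mean).reshape(V, 3)

In this sketch, varying the input semantic parameters (motion, clothing size, material) and decoding the regressed coefficients through the PCA basis is what would drive the data generation and augmentation applications mentioned in the abstract.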
Main file: clothesModeling.pdf (10.25 MB)
cameraReadyVideo.mp4 (89.46 MB), format: video
halImg.png (133.21 KB), format: figure/image
Origin: files produced by the author(s)