GAN-based Garment Generation Using Sewing Pattern Images

ECCV 2020 · Yu Shen, Junbang Liang, Ming C. Lin

The generation of realistic apparel models has become increasingly popular as a result of the rapid pace of change in fashion trends and the growing need for garment models in applications such as virtual try-on. Such applications require a general cloth model that can represent a diverse set of garments. Previous studies often make restrictive assumptions about the garment, such as its topology or the body shape it fits. We propose a unified method based on a generative network. Our model is applicable to different garment topologies with different sewing patterns and fabric materials. We also develop a novel image representation of garment models and a reliable mapping algorithm between the general garment model and this image representation, which regularizes the data representation of the cloth. Using this intermediate image representation, the generated garment model can be easily retargeted to another body, enabling garment customization. In addition, a large garment appearance dataset is provided for use in garment reconstruction, garment capturing, and other applications. We demonstrate that our generative model achieves high reconstruction accuracy and provides rich variations of virtual garments.
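
To make the pipeline described in the abstract concrete, below is a minimal sketch (not the authors' released code) of a conditional image-to-image generator in PyTorch that maps a sewing-pattern image plus a latent noise vector to a garment image representation (e.g. a per-pixel geometry/displacement map). The layer sizes, channel counts, noise dimension, and the 128x128 resolution are all illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: conditional generator from sewing-pattern image to garment image
# representation. Architecture details below are assumptions for illustration.
import torch
import torch.nn as nn


class PatternToGarmentGenerator(nn.Module):
    def __init__(self, pattern_channels=1, out_channels=3, noise_dim=64):
        super().__init__()
        self.noise_dim = noise_dim
        # Encoder: downsample the sewing-pattern image to a spatial feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(pattern_channels, 32, 4, stride=2, padding=1),  # 128 -> 64
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),                # 64 -> 32
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),               # 32 -> 16
            nn.LeakyReLU(0.2, inplace=True),
        )
        # Decoder: the noise vector is broadcast spatially, concatenated with the
        # encoded features, and upsampled back to image resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128 + noise_dim, 64, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),               # 32 -> 64
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, out_channels, 4, stride=2, padding=1),     # 64 -> 128
            nn.Tanh(),  # values normalized to [-1, 1], e.g. a displacement map
        )

    def forward(self, pattern, noise):
        feat = self.encoder(pattern)                      # (B, 128, 16, 16)
        b, _, h, w = feat.shape
        z = noise.view(b, self.noise_dim, 1, 1).expand(b, self.noise_dim, h, w)
        return self.decoder(torch.cat([feat, z], dim=1))  # (B, out_channels, 128, 128)


if __name__ == "__main__":
    gen = PatternToGarmentGenerator()
    pattern = torch.rand(2, 1, 128, 128)  # batch of sewing-pattern images
    z = torch.randn(2, 64)                # latent codes controlling garment variation
    garment_image = gen(pattern, z)
    print(garment_image.shape)            # torch.Size([2, 3, 128, 128])
```

In this reading, the intermediate image representation is what allows the generated garment to be decoded back to a mesh and retargeted to a new body; the decoding and retargeting steps are not shown here.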
