LGVTON: A Landmark Guided Approach to Virtual Try-On

1 Apr 2020 · Debapriya Roy, Sanchayan Santra, Bhabatosh Chanda

In this paper, we propose a Landmark Guided Virtual Try-On (LGVTON) method for clothes, which aims to solve the problem of clothing trials on e-commerce websites. Given images of two people, a person and a model, it generates a rendition of the person wearing the clothes of the model. This is useful because standalone images of clothes are usually not available on most e-commerce websites. We follow a three-stage approach to achieve our objective. In the first stage, LGVTON warps the clothes of the model using a Thin-Plate Spline (TPS) based transformation to fit the person. Unlike previous TPS-based methods, we compute the TPS transformation from landmarks (of both the human and the clothes). This lets the warping work independently of the complex patterns, such as stripes, florals, and textures, present on the clothes. However, the computed warp may not always be precise. We therefore refine it in the subsequent stages with the help of a mask generator (Stage 2) and an image synthesizer (Stage 3). The mask generator improves the fit of the warped clothes, and the image synthesizer ensures a realistic output. To tackle the lack of paired training data, we resort to a self-supervised training strategy; here, paired data refers to an image pair of a model and a person wearing the same cloth. We compare LGVTON with four existing methods on two popular fashion datasets, MPV and DeepFashion, using two performance measures, FID (Fréchet Inception Distance) and SSIM (Structural Similarity Index). The proposed method outperforms the state-of-the-art methods in most cases.
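To make the Stage 1 idea concrete, the sketch below fits a TPS transformation from a set of person landmarks to the corresponding cloth landmarks and backward-warps the cloth image onto the person's image grid. This is a minimal NumPy/SciPy sketch of landmark-driven TPS warping in general, not the paper's implementation: the function names (fit_tps, warp_cloth), the landmark format, and the assumption that both images share the same resolution are our own illustrative choices.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def tps_kernel(r2):
    # TPS radial basis U(r) = r^2 log(r^2), with U(0) defined as 0.
    out = np.zeros_like(r2)
    mask = r2 > 0
    out[mask] = r2[mask] * np.log(r2[mask])
    return out

def fit_tps(src_pts, dst_pts):
    # Solve for TPS coefficients mapping src_pts -> dst_pts.
    # src_pts, dst_pts: (N, 2) arrays of (x, y) landmark coordinates.
    n = src_pts.shape[0]
    d2 = np.sum((src_pts[:, None, :] - src_pts[None, :, :]) ** 2, axis=-1)
    K = tps_kernel(d2)                          # (N, N) kernel matrix
    P = np.hstack([np.ones((n, 1)), src_pts])   # (N, 3) affine part
    L = np.zeros((n + 3, n + 3))
    L[:n, :n] = K
    L[:n, n:] = P
    L[n:, :n] = P.T
    rhs = np.zeros((n + 3, 2))
    rhs[:n] = dst_pts
    return np.linalg.solve(L, rhs)              # (N+3, 2) coefficients

def apply_tps(params, src_pts, query):
    # Evaluate the fitted TPS at query points of shape (M, 2).
    n = src_pts.shape[0]
    d2 = np.sum((query[:, None, :] - src_pts[None, :, :]) ** 2, axis=-1)
    U = tps_kernel(d2)                          # (M, N)
    P = np.hstack([np.ones((query.shape[0], 1)), query])
    return U @ params[:n] + P @ params[n:]

def warp_cloth(cloth, cloth_landmarks, person_landmarks):
    # Backward-warp the cloth (H, W, 3) so its landmarks align with the
    # person's: fit the TPS from person landmarks to cloth landmarks, so
    # each output pixel knows where to sample in the cloth image.
    h, w = cloth.shape[:2]
    params = fit_tps(person_landmarks, cloth_landmarks)
    ys, xs = np.mgrid[0:h, 0:w]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    src_xy = apply_tps(params, person_landmarks, grid)
    coords = np.stack([src_xy[:, 1].reshape(h, w),   # rows (y)
                       src_xy[:, 0].reshape(h, w)])  # cols (x)
    channels = [map_coordinates(cloth[..., c], coords, order=1, mode='nearest')
                for c in range(cloth.shape[-1])]
    return np.stack(channels, axis=-1)
```

In practice the landmark correspondences would come from a human pose estimator and a fashion landmark detector; because the fit uses only these sparse point pairs, the warp is unaffected by the stripes, florals, or textures on the garment itself, which is the property the abstract highlights.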
