Arbitrary Handwriting Image Style Transfer

14 Jan 2022 · Kai Yang, Xiaoman Liang, Huihuang Zhao

This paper proposes a method for imitating handwriting style via style transfer. We propose a neural network model based on conditional generative adversarial networks (cGAN) for handwriting style transfer and improve the loss function on the basis of the standard GAN objective. Compared with other handwriting imitation methods, both the quality and the efficiency of handwriting style transfer are significantly improved. Experiments show that the generated Chinese characters have clear shapes, and analysis of the experimental data indicates that the generative adversarial network performs well on handwriting style transfer: the generated text images are closer to real handwriting and achieve better performance in terms of handwriting imitation.
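The abstract does not reproduce the model architecture or the improved loss itself. As a rough illustration of the kind of cGAN objective such a model typically builds on, the sketch below shows a generic pix2pix-style conditional GAN training step (adversarial loss plus an L1 reconstruction term). The generator/discriminator interfaces, the L1 weight, and the exact loss combination are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch of a conditional GAN (cGAN) training step for
# image-to-image handwriting style transfer. This is a generic
# pix2pix-style formulation; the generator/discriminator signatures
# and the L1 weight lambda_l1 are assumptions for illustration and
# are NOT the architecture or improved loss proposed in the paper.
import torch
import torch.nn as nn

def cgan_train_step(generator, discriminator, opt_g, opt_d,
                    content_img, style_target, lambda_l1=100.0):
    adv_loss = nn.BCEWithLogitsLoss()
    l1_loss = nn.L1Loss()

    # Discriminator update: real (content, target) pairs vs. generated fakes.
    fake = generator(content_img)
    d_real = discriminator(content_img, style_target)
    d_fake = discriminator(content_img, fake.detach())
    loss_d = 0.5 * (adv_loss(d_real, torch.ones_like(d_real)) +
                    adv_loss(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator update: fool the discriminator while staying close
    # to the target handwriting image (L1 term).
    d_fake = discriminator(content_img, fake)
    loss_g = (adv_loss(d_fake, torch.ones_like(d_fake)) +
              lambda_l1 * l1_loss(fake, style_target))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()
```

In this generic formulation the discriminator is conditioned on the input content image, which is what distinguishes a cGAN from a plain GAN for this kind of paired style-transfer setup.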
