Journal of Jilin University Science Edition ›› 2021, Vol. 59 ›› Issue (3): 577-586.


Controllable Multi-texture Extended Synthesis and Transfer

LI Erqiang, CHEN Kaijian, ZHOU Yang   

  1. College of Computer Science & Software Engineering, Shenzhen University, Shenzhen 518060, Guangdong Province, China
  • Received: 2020-09-02  Online: 2021-05-26  Published: 2021-05-23

Abstract: Aiming at the problems of mode collapse when a single-texture expansion model is applied to multi-texture expansion, and of users being unable to control the output texture mode and style, we proposed a novel network for controllable multi-texture extended synthesis and transfer. Firstly, by adding classification training to the discriminator training of the generative adversarial network (GAN), the discriminator could not only distinguish generated data from real data, but also correctly identify which training image an input texture came from, thereby alleviating mode collapse. Secondly, to give the user control over the texture mode in texture transfer, we modified the generator to take a two-stream input, in which one stream provided structure-guiding features and the other provided texture mode features; the final texture image was generated by decoding the fused features. The experimental results show that the proposed model can not only correctly learn the texture modes of multi-texture images with a single network, but also provide better and more controllable texture transfer.
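The sketch below is not the authors' code; it is a minimal PyTorch illustration, under assumed layer sizes and names, of the two ideas described in the abstract: a discriminator with an auxiliary classification head (real/fake plus which training texture the input belongs to), and a two-stream generator that fuses structure-guiding features with texture-mode features before decoding.

```python
# Hypothetical sketch of the abstract's two components; all layer widths,
# class names, and the fusion strategy (channel concatenation) are assumptions.
import torch
import torch.nn as nn

class AuxClassifierDiscriminator(nn.Module):
    """Shared conv trunk with two heads: a real/fake logit and texture-class logits."""
    def __init__(self, num_textures: int, base_channels: int = 64):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, base_channels, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels * 2, base_channels * 4, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.adv_head = nn.Linear(base_channels * 4, 1)             # real vs. generated
        self.cls_head = nn.Linear(base_channels * 4, num_textures)  # which training texture

    def forward(self, x):
        feat = self.trunk(x)
        return self.adv_head(feat), self.cls_head(feat)

class TwoStreamGenerator(nn.Module):
    """Encodes a structure-guide image and a texture-mode exemplar separately,
    fuses the two feature maps, and decodes the fused features into a texture image."""
    def __init__(self, base_channels: int = 64):
        super().__init__()
        def encoder():
            return nn.Sequential(
                nn.Conv2d(3, base_channels, 4, stride=2, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(base_channels, base_channels * 2, 4, stride=2, padding=1),
                nn.ReLU(inplace=True),
            )
        self.structure_enc = encoder()   # stream 1: structure-guiding features
        self.texture_enc = encoder()     # stream 2: texture-mode features
        self.decoder = nn.Sequential(
            nn.Conv2d(base_channels * 4, base_channels * 2, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_channels * 2, base_channels, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_channels, 3, 4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, structure_img, texture_img):
        fused = torch.cat([self.structure_enc(structure_img),
                           self.texture_enc(texture_img)], dim=1)  # channel-wise fusion
        return self.decoder(fused)

# Example forward pass: three training textures, 128x128 inputs.
G = TwoStreamGenerator()
D = AuxClassifierDiscriminator(num_textures=3)
structure = torch.randn(1, 3, 128, 128)   # structure-guide input
exemplar = torch.randn(1, 3, 128, 128)    # texture-mode exemplar
fake = G(structure, exemplar)
adv_logit, cls_logits = D(fake)
```

In such a setup, the classification head would typically be trained with a cross-entropy loss over the texture labels alongside the usual adversarial loss, which is the mechanism the abstract credits with mitigating mode collapse; the actual losses, resolutions, and fusion details of the paper may differ.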

Key words: texture synthesis, control synthesis, texture transfer, generative adversarial network (GAN)

CLC Number: 

  • TP391.41