Iterative cost function and variable parameter generative adversarial networks
Chen Yao¹, Song Xiaoning¹, Yu Dongjun²
1. School of IoT Engineering, Jiangnan University, Wuxi 214122, China; 2. School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
Keywords: generative adversarial networks; iterative cost function method; variable parameter; distribution distance
Abstract: To address the difficulty of training generative adversarial networks, this paper proposes an iterative cost function and variable parameter generative adversarial network based on the Wasserstein GAN (WGAN) method. To improve the penalty term of the original WGAN, an iterative method is used to apply the penalty instead of the original random sampling. For the fixed penalty-term hyper-parameter in the WGAN cost function, a strategy of varying the hyper-parameter is put forward, where the change is driven by the distance between the imitation distribution and the real distribution. Experiments conducted on the MNIST handwritten digit dataset and the CelebA face dataset show the effectiveness of the proposed method compared with the traditional WGAN, significantly improving the convergence speed of the generator.
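The two ideas in the abstract can be sketched in code. The sketch below is a minimal NumPy illustration, not the authors' implementation: it assumes the penalty is the standard WGAN-GP term (||∇f(x̂)|| − 1)², averages it over several evenly spaced interpolation points between real and fake samples (the "iterative" penalty, in place of one random point), and scales the penalty weight λ with the current estimate of the distance between the imitation and real distributions (one plausible variable-parameter rule; the function names and the scaling formula are assumptions). A toy linear critic is used so its gradient is known in closed form.

```python
import numpy as np

def gradient_penalty_iterative(critic_grad, real, fake, n_points=5):
    """Average the (||grad|| - 1)^2 penalty over n_points evenly spaced
    interpolation points between real and fake samples, instead of a
    single randomly chosen point (illustrative sketch)."""
    penalties = []
    for eps in np.linspace(0.0, 1.0, n_points):
        x_hat = eps * real + (1.0 - eps) * fake   # point on the real-fake line
        g = critic_grad(x_hat)                    # critic gradient at x_hat
        norms = np.linalg.norm(g, axis=1)
        penalties.append(np.mean((norms - 1.0) ** 2))
    return float(np.mean(penalties))

def variable_lambda(base_lambda, w_distance, scale=1.0):
    """Hypothetical variable-parameter rule: grow the penalty weight with
    the current estimate of the distance between the imitation and real
    distributions (larger distance -> larger penalty weight)."""
    return base_lambda * (1.0 + scale * abs(w_distance))

# Toy linear critic f(x) = x @ w, whose gradient is w everywhere.
w = np.array([0.6, 0.8])                          # ||w|| = 1, so the penalty is ~0
critic_grad = lambda x: np.tile(w, (x.shape[0], 1))

real = np.random.randn(4, 2)
fake = np.random.randn(4, 2)
gp = gradient_penalty_iterative(critic_grad, real, fake)
lam = variable_lambda(10.0, w_distance=0.3)       # 10.0 is the usual WGAN-GP default
loss_penalty = lam * gp                           # added to the critic loss
```

In a real training loop the critic gradient would come from automatic differentiation, and `w_distance` would be the critic's running estimate of the Wasserstein distance; both are stubbed here to keep the sketch self-contained.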


[1] Qi Yong, Hu Jun, Yu Dongjun. Incremental learning algorithm based on self-organizing map and probabilistic neural network[J]. Journal of Nanjing University of Science and Technology, 2013, 37(1): 1-6.
[2] Goodfellow I J, Pouget-Abadie J, Mirza M, et al. Generative adversarial networks[J]. Advances in Neural Information Processing Systems, 2014, 3: 2672-2680.
[3] Goodfellow I. NIPS 2016 tutorial: Generative adversarial networks[J]. arXiv preprint arXiv:1701.00160, 2016.
[4] Tang Chaohui, Zhu Qingxin, Hong Chaoqun, et al. Multi-label feature selection with autoencoders and hypergraph learning[J]. Acta Automatica Sinica, 2016, 42(7): 1014-1021.
[5] Arjovsky M, Chintala S, Bottou L. Wasserstein GAN[J]. arXiv preprint arXiv:1701.07875, 2017.
[6] Arjovsky M, Bottou L. Towards principled methods for training generative adversarial networks[J]. arXiv preprint arXiv:1701.04862, 2017.
[7] Gulrajani I, Ahmed F, Arjovsky M, et al. Improved training of Wasserstein GANs[C]//Advances in Neural Information Processing Systems. Long Beach, CA, USA: NIPS, 2017: 5767-5777.
[8] Radford A, Metz L, Chintala S. Unsupervised representation learning with deep convolutional generative adversarial networks[J]. arXiv preprint arXiv:1511.06434, 2015.
[9] Zhao J, Mathieu M, LeCun Y. Energy-based generative adversarial network[J]. arXiv preprint arXiv:1609.03126, 2016.
[10] Chen X, Duan Y, Houthooft R, et al. InfoGAN: Interpretable representation learning by information maximizing generative adversarial nets[C]//Advances in Neural Information Processing Systems. Barcelona, Spain: NIPS, 2016: 2172-2180.
[11] Ding Xuxing, Zhu Rihong, Li Jianxin. Image quality assessment index for still image[J]. Journal of Nanjing University of Science and Technology, 2004, 28(5): 507-510.
[12] Zhang L, Zhang L, Mou X. RFSIM: A feature based image quality assessment metric using Riesz transforms[C]//2010 IEEE International Conference on Image Processing. Hong Kong, China: IEEE, 2010: 321-324.


Last Update: 2019-02-28