This paper is published in Volume-5, Issue-2, 2019
Area
Neural Networks
Author
P. Mounica, A. Nagaratnam, Mohammad Farzana, K. Muralidhar
Org/Univ
Anil Neerukonda Institute of Technology and Sciences, Visakhapatnam, Andhra Pradesh, India
Keywords
Style transfer, Artistic style, Content loss, Style loss, Gram matrix, VGG16, CNN, Texture transfer, Image synthesis, Imagenet
Citations
IEEE
P. Mounica, A. Nagaratnam, Mohammad Farzana, and K. Muralidhar, "Artistic style transfer using deep learning," International Journal of Advance Research, Ideas and Innovations in Technology, vol. 5, no. 2, 2019, www.IJARIIT.com.
APA
P. Mounica, A. Nagaratnam, Mohammad Farzana, & K. Muralidhar (2019). Artistic style transfer using deep learning. International Journal of Advance Research, Ideas and Innovations in Technology, 5(2). www.IJARIIT.com.
MLA
P. Mounica, A. Nagaratnam, Mohammad Farzana, K. Muralidhar. "Artistic style transfer using deep learning." International Journal of Advance Research, Ideas and Innovations in Technology 5.2 (2019). www.IJARIIT.com.
Abstract
In this paper, we implement style transfer using convolutional neural networks. Style transfer extracts the style and texture of a style image and applies them to the content extracted from another image. Our work is based on the approach proposed by L. A. Gatys et al. We use a pre-trained VGG16 model. The method reconstructs content from the content image and style from the style image, and then merges the two so that both the content features and the style features are retained.
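The abstract describes the optimization-based formulation: a pre-trained VGG16 supplies feature maps, a content loss compares activations of the generated image with the content image, and a style loss compares Gram matrices with those of the style image. The following is a minimal sketch of that loop, assuming PyTorch/torchvision; the layer indices, loss weights, and step count are illustrative assumptions, not the paper's exact settings.

import torch
import torch.nn.functional as F
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pre-trained VGG16 used only as a fixed feature extractor (weights frozen).
vgg = models.vgg16(pretrained=True).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYERS = {15}            # relu3_3 (assumed choice)
STYLE_LAYERS = {3, 8, 15, 22}    # relu1_2, relu2_2, relu3_3, relu4_3 (assumed)

def extract_features(x):
    """Run x through VGG16 and collect activations at the chosen layers."""
    content_feats, style_feats = {}, {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in CONTENT_LAYERS:
            content_feats[i] = x
        if i in STYLE_LAYERS:
            style_feats[i] = x
    return content_feats, style_feats

def gram_matrix(feat):
    """Gram matrix of a feature map: channel-wise correlations capture texture/style."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_transfer(content_img, style_img, steps=300,
                   content_weight=1.0, style_weight=1e5):
    """content_img, style_img: normalized tensors of shape (1, 3, H, W)."""
    target_content, _ = extract_features(content_img)
    _, style_feats = extract_features(style_img)
    target_grams = {i: gram_matrix(f) for i, f in style_feats.items()}

    # Start the synthesized image from the content image and optimize its pixels.
    generated = content_img.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([generated], lr=0.02)

    for _ in range(steps):
        optimizer.zero_grad()
        c_feats, s_feats = extract_features(generated)
        content_loss = sum(F.mse_loss(c_feats[i], target_content[i])
                           for i in CONTENT_LAYERS)
        style_loss = sum(F.mse_loss(gram_matrix(s_feats[i]), target_grams[i])
                         for i in STYLE_LAYERS)
        loss = content_weight * content_loss + style_weight * style_loss
        loss.backward()
        optimizer.step()
    return generated.detach()

In practice the generated tensor is de-normalized and clamped to the valid pixel range before saving; the balance between content_weight and style_weight controls how strongly the artistic texture dominates the reconstructed content.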