Landscape Painting Style Transfer and Feature Extraction Model based on Convolutional Neural Networks

Authors

  • Rui Bian

Keywords

Style Transfer; Feature Extraction; Landscape Painting; Convolutional Neural Networks; Image Processing

Abstract

In computer-based painting research, style transfer and feature extraction are challenging subjects. Most existing studies rely on manual segmentation to select local regions, which yields inefficient feature extraction and fails to adequately capture the artist's aesthetic approach. To address this problem, this paper proposes a two-channel VGG network, guided by a novel style loss and content loss, for landscape painting style transfer and feature extraction. The input image is convolved, and the feature maps from the third to the fifth layer of the two-channel network are used. A content and style loss function is built from the higher layers down to the lower layers; after each layer is matched, the result is decoded to the next layer by a decoder until the final synthesized image is obtained, achieving feature extraction, feature constraint addition, and parameter control for local style transfer of traditional Chinese paintings. Compared with other styles of painting, feature information such as lines, textures, and frequencies in landscape paintings differs significantly from that of other images, so this information can be extracted, constrained, and learned through the style loss and content loss. The experimental results show that the proposed method improves image style transfer and feature extraction for landscape paintings to some extent.
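The core mechanism the abstract relies on is a content loss and a style loss computed on VGG feature maps. As a rough illustration only, the sketch below (PyTorch, with hypothetical layer indices chosen to approximate the third-to-fifth convolutional blocks) shows how such losses are commonly built from feature differences and Gram matrices; it is not the paper's exact two-channel network or decoder.

```python
# Minimal sketch of VGG-based content and style losses, assuming a standard
# pretrained VGG-19 backbone; layer indices and weights are illustrative only.
import torch
import torch.nn.functional as F
from torchvision import models

vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYERS = {21}          # conv4_2 (assumed content layer)
STYLE_LAYERS = {10, 19, 28}    # conv3_1, conv4_1, conv5_1 (assumed style layers)

def extract_features(x, layers):
    """Run x through VGG and collect feature maps at the requested indices."""
    feats = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layers:
            feats[i] = x
    return feats

def gram_matrix(feat):
    """Channel-wise Gram matrix used by the style loss."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_content_loss(synthesized, content_img, style_img, style_weight=1e5):
    """Weighted sum of content loss (feature MSE) and style loss (Gram MSE)."""
    all_layers = CONTENT_LAYERS | STYLE_LAYERS
    syn = extract_features(synthesized, all_layers)
    con = extract_features(content_img, CONTENT_LAYERS)
    sty = extract_features(style_img, STYLE_LAYERS)

    content_loss = sum(F.mse_loss(syn[i], con[i]) for i in CONTENT_LAYERS)
    style_loss = sum(F.mse_loss(gram_matrix(syn[i]), gram_matrix(sty[i]))
                     for i in STYLE_LAYERS)
    return content_loss + style_weight * style_loss
```

In a typical optimization-based setup, the synthesized image is treated as a trainable tensor and this loss is minimized with gradient descent; the paper's method instead decodes matched features layer by layer, but the same loss terms constrain the lines, textures, and frequencies described above.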

Published

2024-08-26

How to Cite

Rui Bian. (2024). Landscape Painting Style Transfer and Feature Extraction Model based on Convolutional Neural Networks. The International Journal of Multiphysics, 18(2), 581-599. Retrieved from https://themultiphysicsjournal.com/index.php/ijm/article/view/1355

Issue

Vol. 18 No. 2 (2024)

Section

Articles