Printed Texture Guided Color Feature Fusion for Impressionism Style Rendering of Oil Paintings
Abstract
1. Introduction
- We propose an unsupervised printed texture guided color fusion framework, namely PTGCF, for impressionist oil painting style transfer, which fully considers the tonal characteristics and the representative colors of the original oil painting.
- We propose an effective fusion strategy that adaptively transfers the texture information from the oil painting image to the natural image in the frequency domain of the dominant color component.
2. Related Work
3. Materials and Methods
3.1. Data Collection and Pre-Processing
3.2. Workflow of the Proposed PTGCF Model
3.3. Color Matching
- Input a source image and an oil painting image. In Figure 5, the landscape image on the left is the source image and the middle image is the oil painting image;
- Convert both the source image and the oil painting image from the RGB color space to the L*a*b* color space;
- Calculate the mean value and standard deviation of each channel for the source and oil painting images in the L*a*b* color space, denoted as (μL*, μa*, μb*) and (σL*, σa*, σb*) for each image;
- Subtract the mean value of each channel (i.e., μL*, μa*, μb* of the oil painting image) from the corresponding channel of the oil painting image in the L*a*b* color space;
- Scale each mean-subtracted channel of the oil painting image by the ratio of the standard deviation of the oil painting image to that of the source image for that channel;
- Add the mean value of the corresponding channel of the source image to the result of the previous step;
- Convert the result of the previous step from the L*a*b* color space to the RGB color space (a code sketch of these steps is given after this list).
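The procedure above is essentially a Reinhard-style statistics transfer in the L*a*b* space. The snippet below is a minimal sketch of how the listed steps could be implemented with OpenCV and NumPy; the function names (`channel_stats`, `color_match`) and the small epsilon guard are illustrative additions and are not taken from the paper's code.

```python
# Minimal sketch of statistics-based color matching in the L*a*b* space,
# following the steps listed above. Variable and function names are illustrative.
import cv2
import numpy as np

def channel_stats(img_lab):
    """Per-channel mean and standard deviation of an L*a*b* image."""
    return [(c.mean(), c.std()) for c in cv2.split(img_lab)]

def color_match(source_bgr, painting_bgr):
    # Steps 2-3: convert both images to L*a*b* and gather channel statistics.
    src_lab = cv2.cvtColor(source_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    oil_lab = cv2.cvtColor(painting_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    src_stats = channel_stats(src_lab)
    oil_stats = channel_stats(oil_lab)

    # Steps 4-6: subtract the oil painting means, rescale by the standard
    # deviation ratio, then add the source means, channel by channel.
    matched = []
    for ch, (oil_mu, oil_sigma), (src_mu, src_sigma) in zip(
            cv2.split(oil_lab), oil_stats, src_stats):
        ch = (ch - oil_mu) * (oil_sigma / (src_sigma + 1e-6)) + src_mu
        matched.append(np.clip(ch, 0, 255))

    # Step 7: merge the channels and convert back for display.
    out_lab = cv2.merge(matched).astype(np.uint8)
    return cv2.cvtColor(out_lab, cv2.COLOR_LAB2BGR)
```

Note that OpenCV loads images in BGR channel order, hence the `COLOR_BGR2LAB` and `COLOR_LAB2BGR` conversion codes in the sketch.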
3.4. Feature Fusion
3.4.1. Edge Enhancement
3.4.2. Adaptive Selection of Brush Stroke
3.4.3. Feature Fusion in the Fourier Domain
1. Convert the original photograph into the frequency domain.
2. Convert the brush texture patch into the frequency domain to create a filter.
3. Blur the original photograph.
4. Enhance the boundaries of the blurred image from step 3.
5. Acquire the gradient direction information from the output of step 4.
6. Update the original photograph by adaptively shifting the brush stroke in the FFT domain, as shown in Figure 10b.
7. Convert the result of step 6 back into the spatial domain via the inverse Fourier transform.
8. Concatenate the result of step 7 with the original a* and b* channels. At this point, the color and texture features of the digital image and the brush stroke have been fused.
9. Finally, transfer the fused result of step 8 from the L*a*b* to the RGB color space to display the optimised rendering effects (a simplified code sketch follows this list).
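As a rough illustration of the frequency-domain skeleton (steps 1, 2 and 7-9), the sketch below converts the L* channel and a brush texture patch into the Fourier domain, applies the texture spectrum as a simple multiplicative filter, and transforms back before recombining with the original a* and b* channels. The adaptive, gradient-guided shifting of the brush stroke in steps 3-6 is specific to the proposed method and is not reproduced here; the filter construction and the `alpha` weight are illustrative assumptions.

```python
# Simplified sketch of the frequency-domain fusion skeleton (steps 1, 2, 7-9).
# The gradient-guided adaptive shifting of the brush stroke is NOT reproduced;
# the texture spectrum is simply used as a multiplicative filter.
import cv2
import numpy as np

def fuse_texture(photo_bgr, stroke_gray, alpha=0.15):
    lab = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    L, a, b = cv2.split(lab)

    # Step 1: luminance channel into the frequency domain.
    F_photo = np.fft.fft2(L)

    # Step 2: resize the brush texture patch to the image size and take its
    # spectrum to build a filter (normalised so it mostly preserves energy).
    stroke = cv2.resize(stroke_gray.astype(np.float32), (L.shape[1], L.shape[0]))
    F_stroke = np.fft.fft2(stroke)
    filt = 1.0 + alpha * np.abs(F_stroke) / (np.abs(F_stroke).max() + 1e-6)

    # Steps 6-7 (simplified): filter in the FFT domain, then inverse transform.
    L_fused = np.clip(np.real(np.fft.ifft2(F_photo * filt)), 0, 255)

    # Steps 8-9: recombine with the original a* and b* channels, return to RGB.
    out = cv2.merge([L_fused, a, b]).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_LAB2BGR)
```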
3.4.4. Selection of Stroke Texture
4. Results
4.1. Landscape Painting
4.2. Portrait Painting
4.3. Key Parameter Analysis
4.4. Objective Comparison
5. Conclusions
Author Contributions
Funding
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
| Method | SSIM↑ | PSNR↑ |
|---|---|---|
| USM | 0.9954 | 60.23 |
| Gamma | 0.9914 | 58.84 |
| Laplacian | 0.9930 | 58.82 |
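For reference, unsharp masking (USM), the best-performing operator in the table above, can be realised as a Gaussian blur followed by a weighted subtraction. The snippet below is a generic USM sketch with OpenCV; the `sigma` and `amount` values are illustrative and are not the paper's settings.

```python
import cv2

def unsharp_mask(img, sigma=2.0, amount=1.0):
    """Generic unsharp masking: sharpened = img + amount * (img - blurred)."""
    blurred = cv2.GaussianBlur(img, (0, 0), sigma)
    return cv2.addWeighted(img, 1.0 + amount, blurred, -amount, 0)
```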
| Size (Pixels) | SSIM↑ | PSNR↑ |
|---|---|---|
| 128 | 0.9954 | 60.23 |
| 192 | 0.9944 | 60.09 |
| 256 | 0.9944 | 60.11 |
| Method | SSIM↑ | PSNR↑ | LPIPS↓ |
|---|---|---|---|
| PTGCF | 0.9954 | 60.23 | 0.0306 |
| FT | 0.9914 | 58.84 | 0.0312 |
| PLST | 0.9930 | 58.82 | 0.0327 |
| ASNN | 0.9911 | 58.93 | 0.0656 |
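The SSIM and PSNR values in the tables can be computed with scikit-image, and LPIPS with the `lpips` package. The sketch below assumes two aligned uint8 RGB images of equal size; the paper's exact evaluation protocol (resizing, colour handling, LPIPS backbone) is not specified here.

```python
import numpy as np
import torch
import lpips
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def evaluate(ref_rgb: np.ndarray, out_rgb: np.ndarray):
    """Return (SSIM, PSNR, LPIPS) for two aligned uint8 RGB images."""
    # channel_axis requires scikit-image >= 0.19 (older versions use multichannel=True).
    ssim = structural_similarity(ref_rgb, out_rgb, channel_axis=-1, data_range=255)
    psnr = peak_signal_noise_ratio(ref_rgb, out_rgb, data_range=255)

    # LPIPS expects float tensors in [-1, 1] with shape (N, 3, H, W).
    to_tensor = lambda x: torch.from_numpy(x).permute(2, 0, 1)[None].float() / 127.5 - 1.0
    loss_fn = lpips.LPIPS(net='alex')
    dist = loss_fn(to_tensor(ref_rgb), to_tensor(out_rgb)).item()
    return ssim, psnr, dist
```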
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).