Developing an AI-Based Digital Biophilic Art Curation to Enhance Mental Health in Intelligent Buildings
Abstract
1. Introduction
1.1. Nature Disconnectedness and Intelligent Biophilic Buildings
1.2. Inequality in Access to the Arts
1.3. Developing AI-Based Automated Classification and Recommendation Systems for Biophilic Arts Curation
2. Materials and Methods
- The first stage is to develop metrics for Biophilic art attributes and emotional responses.
- The second stage involves collating an art dataset from a public repository and then conducting a public survey to assign labels to the paintings.
- The third stage is to develop an ML algorithm to automate the categorisation process and then build a recommendation system for users (a minimal data-structure sketch follows this list).
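To make the data flow across the three stages concrete, the sketch below shows one possible record structure for a labelled artwork. The class name, field names, and consensus logic are assumptions for illustration only and do not reflect the authors' actual data schema.

```python
from dataclasses import dataclass, field

# Illustrative only: field names below are assumptions, not the study's schema.
@dataclass
class ArtworkRecord:
    image_id: str                                                   # identifier in the public art repository
    biophilic_votes: dict[str, int] = field(default_factory=dict)   # e.g. {"Presence of water": 7, ...}
    emotion_votes: dict[str, int] = field(default_factory=dict)     # PANAS-style labels, e.g. {"Relaxed, Calm": 5, ...}

    def dominant_biophilic_label(self) -> str | None:
        """Return the Biophilic attribute with the most survey votes, or None if no votes."""
        if not self.biophilic_votes:
            return None
        return max(self.biophilic_votes, key=self.biophilic_votes.get)
```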
2.1. Stage 1: Developing Metrics to Categorise Biophilic Attributes and Emotional Responses
2.2. Stage 2: Developing the Dataset and Survey to Establish Relationships Between Biophilic Arts and Emotional Responses
2.3. Stage 3: Machine Learning Techniques for Classification and Recommendation
- Dominant Biophilic labels: Identify images for which there is a clear consensus among participants on the dominant Biophilic characteristic.
- Proportion of emotions: For the images selected in the previous step, compute the proportion of each emotion expressed by participants.
- Aggregated emotional proportions: Finally, aggregate the emotion proportions across all images sharing the same Biophilic class (a minimal aggregation sketch follows this list).
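A minimal pandas sketch of this three-step aggregation is given below. The long-format survey table, its column names (`image_id`, `biophilic_label`, `emotion_label`), and the consensus threshold are all illustrative assumptions, not the study's actual data or settings.

```python
import pandas as pd

# Hypothetical long-format survey table: one row per (participant, image) response.
responses = pd.DataFrame({
    "image_id": ["img1", "img1", "img1", "img2", "img2", "img2"],
    "biophilic_label": ["Presence of water", "Presence of water", "Presence of water",
                        "Mystery", "Mystery", "Refuge"],
    "emotion_label": ["Relaxed, Calm", "Relaxed, Calm", "Inspired, Amazed",
                      "Attentive, Concentrating", "Afraid, Frightened",
                      "Attentive, Concentrating"],
})

CONSENSUS = 0.6  # assumed consensus threshold; the study's exact cut-off may differ

# Step 1: dominant Biophilic label per image, kept only where consensus is clear.
def dominant_label(labels: pd.Series):
    counts = labels.value_counts(normalize=True)
    return counts.index[0] if counts.iloc[0] >= CONSENSUS else None

dominant = (responses.groupby("image_id")["biophilic_label"]
            .apply(dominant_label)
            .dropna()
            .rename("dominant_biophilic")
            .reset_index())

# Step 2: keep only responses for the retained images.
labelled = responses.merge(dominant, on="image_id")

# Step 3: aggregate emotion proportions per dominant Biophilic class.
aggregated = (labelled.groupby("dominant_biophilic")["emotion_label"]
              .value_counts(normalize=True)
              .unstack(fill_value=0.0))
print(aggregated.round(3))
```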
3. Results and Discussion
3.1. Metrics of Biophilic Attributes and Emotional Responses
3.2. Data Repository and Public Survey Results Analysis
3.3. ML Classification Results and Recommendation Systems
- Null hypothesis (H0): The difference between the observations is not statistically significant, i.e., the observations are random coincidences.
- Alternative hypothesis (H1): The difference between the observations is statistically significant, i.e., the observations are not random coincidences. (A minimal paired-test sketch follows this list.)
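The p-values reported below for the before/after comparisons could be obtained with a paired significance test. The sketch uses SciPy's Wilcoxon signed-rank test on made-up paired scores; the choice of test and the data are assumptions for illustration, since this extract does not state which test the authors applied.

```python
from scipy import stats

# Hypothetical paired before/after scores (one pair per participant) for a single
# emotional label, e.g. "Relaxed, Calm". Values are made up for illustration.
before = [2, 3, 2, 1, 3, 2, 4, 2, 3, 2]
after  = [3, 3, 4, 2, 3, 3, 4, 3, 4, 3]

# Wilcoxon signed-rank test on the paired differences (assumed test, not
# necessarily the one reported in the study).
stat, p_value = stats.wilcoxon(before, after)
print(f"W = {stat:.2f}, p = {p_value:.4f}")  # reject H0 at the 5% level if p < 0.05
```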
4. Conclusions
4.1. Summary of Key Findings
4.2. Implications of the Findings
4.3. Limitations of the Study and Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Intercept | Value | Num DF | Den DF | F Value | Pr > F |
|---|---|---|---|---|---|
| Wilks’ lambda | 0.00 | 15.00 | 844.00 | 4,445,658,579,532,924.50 | 0.00 |
| Pillai’s trace | 1.00 | 15.00 | 844.00 | 4,445,658,579,532,924.50 | 0.00 |
| Hotelling–Lawley trace | 79,010,519,778,428.75 | 15.00 | 844.00 | 4,445,658,579,532,924.50 | 0.00 |
| Roy’s greatest root | 79,010,519,778,428.75 | 15.00 | 844.00 | 4,445,658,579,532,924.50 | 0.00 |
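The four statistics in the table above (Wilks’ lambda, Pillai’s trace, Hotelling–Lawley trace, Roy’s greatest root) are the standard multivariate test set reported by a MANOVA. The sketch below shows how such output can be produced with statsmodels on synthetic data and hypothetical column names; it does not reproduce the study's data or the values above.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical data: each row is one rated image, with proportions for three emotion
# labels (illustrative subset, not the full 15-label set) and its dominant Biophilic
# class as the grouping factor.
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "relaxed":   rng.random(n),
    "attentive": rng.random(n),
    "afraid":    rng.random(n),
    "biophilic": rng.choice(["Mystery", "Presence of water", "Risk"], size=n),
})

# Multivariate test of whether emotion proportions differ across Biophilic classes.
fit = MANOVA.from_formula("relaxed + attentive + afraid ~ biophilic", data=df)
print(fit.mv_test())  # reports Wilks' lambda, Pillai's trace, Hotelling-Lawley, Roy's root
```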
| Emotional Label | Prevalence (%) |
|---|---|
| Hostile, Angry | 0.8 |
| Shy, Bashful | 0.5 |
| Ashamed, Guilty | 0.2 |
| Safe, Cosy | 1.3 |
| Determined, Confident | 1.1 |
| Energised, Excited | 2.1 |
| Happy, Cheerful | 3.5 |
| Nourished, Fulfilled | 3.5 |
| Afraid, Frightened | 4.5 |
| Upset, Distressed | 5.3 |
| Sad, Downhearted | 7.7 |
| Proud, Grand | 8.8 |
| Inspired, Amazed | 9.8 |
| Attentive, Concentrating | 24.5 |
| Relaxed, Calm | 26.3 |
| Model | Accuracy (%) |
|---|---|
| ResNet50 | 69.30 |
| DeiT Transformer | 68.40 |
| Swin Transformer | 68.80 |
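For context on how such classifiers are typically trained, the sketch below fine-tunes an ImageNet-pretrained ResNet50 for a 14-class Biophilic attribute taxonomy. The hyperparameters, training step, and class count are assumptions for illustration rather than the study's reported configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 14  # assumed: one class per Biophilic attribute in the taxonomy

# ImageNet-pretrained backbone with the classification head replaced.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimisation step on a batch of (N, 3, 224, 224) images and integer labels."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```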
| Biophilic Label | Relaxed, Calm | Proud, Grand | Nourished, Fulfilled | Attentive, Concentrating | Sad, Downhearted | Afraid, Frightened | Upset, Distressed | Inspired, Amazed | Energised, Excited | Happy, Cheerful | Determined, Confident | Safe, Cosy | Ashamed, Guilty | Shy, Bashful | Hostile, Angry |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Mystery | 0.144 | 0.095 | 0.023 | 0.225 * | 0.107 | 0.062 | 0.055 | 0.081 | 0.022 | 0.039 | 0.033 | 0.034 | 0.014 † | 0.028 | 0.04 |
| Varying light | 0.192 * | 0.092 | 0.029 | 0.163 | 0.092 | 0.061 | 0.047 | 0.113 | 0.037 | 0.053 | 0.037 | 0.021 | 0.018 † | 0.024 | 0.021 |
| Presence of water | 0.349 * | 0.069 | 0.069 | 0.102 | 0.043 | 0.026 | 0.013 | 0.117 | 0.061 | 0.066 | 0.02 | 0.041 | 0.005 † | 0.008 | 0.01 |
| Connection with nature | 0.189 * | 0.047 | 0.134 | 0.103 | 0.037 | 0.037 | 0.045 | 0.137 | 0.047 | 0.063 | 0.026 | 0.066 | 0.024 | 0.026 | 0.018 † |
| Presence of animals | 0.116 | 0.092 | 0.048 | 0.126 * | 0.072 | 0.085 | 0.116 | 0.075 | 0.072 | 0.061 | 0.031 | 0.031 | 0.017 | 0.014 † | 0.044 |
| Biomorphic shapes | 0.121 | 0.13 | 0.037 | 0.195 * | 0.098 | 0.07 | 0.065 | 0.126 | 0.028 | 0.019 | 0.023 | 0.023 | 0.009 † | 0.028 | 0.028 |
| Natural materials | 0.112 | 0.158 * | 0.102 | 0.153 | 0.092 | 0.066 | 0.026 | 0.138 | 0.015 | 0.026 | 0.026 | 0.056 | 0 † | 0.015 | 0.015 |
| Unimpeded views | 0.349 * | 0.038 | 0.123 | 0.057 | 0.047 | 0 † | 0.019 | 0.132 | 0.094 | 0.047 | 0 † | 0.047 | 0.019 | 0.009 | 0.019 |
| Awe | 0.106 | 0.068 | 0.061 | 0.129 | 0.129 | 0.068 | 0.053 | 0.167 * | 0.015 † | 0.045 | 0.015 | 0.038 | 0.03 | 0.015 † | 0.061 |
| Refuge | 0.065 | 0.087 | 0.022 | 0.217 * | 0.174 | 0.13 | 0 † | 0.087 | 0.065 | 0.022 | 0.022 | 0.043 | 0.022 | 0.022 | 0.022 |
| Natural organisation | 0.055 | 0.073 | 0.055 | 0.127 | 0.127 | 0.055 | 0.073 | 0.182 * | 0.091 | 0.036 | 0 † | 0.055 | 0 † | 0.018 | 0.055 |
| Presence of plants or fungi | 0.278 * | 0.022 | 0.183 | 0.106 | 0.044 | 0.006 | 0.006 | 0.039 | 0.044 | 0.128 | 0 † | 0.133 | 0.006 | 0.006 | 0 † |
| Risk | 0.037 | 0.074 | 0.007 † | 0.074 | 0.132 | 0.206 * | 0.162 | 0.059 | 0.022 | 0.015 | 0.044 | 0.015 | 0.022 | 0.015 | 0.118 |
| Complexity in order | 0.078 | 0.167 | 0.011 † | 0.333 * | 0.089 | 0.022 | 0.067 | 0.089 | 0.033 | 0.022 | 0.033 | 0.022 | 0.011 † | 0.011 † | 0.011 † |
| Emotional Label | Recommendation System (Before) | Recommendation System (After) | Random Recommendations (Before) | Random Recommendations (After) |
|---|---|---|---|---|
| Relaxed, Calm | 2.44 * | 2.69 * | 2.4 † | 2.36 † |
| Proud, Grand | 1.42 † | 1.38 † | 1.23 | 1.23 |
| Nourished, Fulfilled | 1.87 † | 1.81 † | 1.80 † | 1.73 † |
| Attentive, Concentrating | 2.87 † | 2.75 † | 2.53 * | 2.63 * |
| Inspired, Amazed | 1.48 * | 2.02 * | 0.96 * | 1.56 * |
| Energised, Excited | 1.57 * | 1.81 * | 1.06 * | 1.26 * |
| Happy, Cheerful | 1.89 * | 2.18 * | 1.6 † | 1.53 † |
| Determined, Confident | 2.02 † | 1.87 † | 1.8 † | 1.63 † |
| Safe, Cosy | 2.40 † | 2.38 † | 2.3 † | 2.13 † |
| Sad, Downhearted | 0.81 † | 0.48 † | 0.9 † | 0.73 † |
| Afraid, Frightened | 0.55 † | 0.34 † | 0.4 | 0.4 |
| Upset, Distressed | 0.44 † | 0.34 † | 0.3 * | 0.4 * |
| Ashamed, Guilty | 0.35 * | 0.44 * | 0.43 * | 0.46 * |
| Shy, Bashful | 0.71 † | 0.34 † | 0.7 † | 0.53 † |
| Hostile, Angry | 0.34 † | 0.20 † | 0.26 * | 0.33 * |
| Emotional Label | p-Value (Recommendation System) | p-Value (Random Recommendation) |
|---|---|---|
| Relaxed, Calm | 0.039 * | 0.796 |
| Proud, Grand | 0.673 | 0.902 |
| Nourished, Fulfilled | 0.532 | 0.648 |
| Attentive, Concentrating | 0.201 | 0.405 |
| Sad, Downhearted | 0.007 * | 0.268 |
| Afraid, Frightened | 0.108 | 1.0 |
| Upset, Distressed | 0.346 | 0.47 |
| Inspired, Amazed | 0.0002 * | 0.013 * |
| Energised, Excited | 0.053 | 0.318 |
| Happy, Cheerful | 0.022 * | 0.694 |
| Determined, Confident | 0.252 | 0.272 |
| Safe, Cosy | 0.957 | 0.368 |
| Ashamed, Guilty | 0.601 | 0.748 |
| Shy, Bashful | 0.011 * | 0.371 |
| Hostile, Angry | 0.114 | 0.414 |