Review

Emotions Matter: A Systematic Review and Meta-Analysis of the Detection and Classification of Students’ Emotions in STEM during Online Learning

by Aamir Anwar 1, Ikram Ur Rehman 1,*, Moustafa M. Nasralla 2, Sohaib Bin Altaf Khattak 2 and Nasrullah Khilji 1
1 School of Computing and Engineering, University of West London, London W5 5RF, UK
2 Smart Systems Engineering Laboratory, Department of Communications and Networks Engineering, Prince Sultan University, Riyadh 66833, Saudi Arabia
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(9), 914; https://doi.org/10.3390/educsci13090914
Submission received: 19 July 2023 / Revised: 1 September 2023 / Accepted: 4 September 2023 / Published: 8 September 2023
(This article belongs to the Special Issue Educational Equity Gap and Artificial Intelligence in STEM Subjects)

Abstract:
In recent years, the rapid growth of online learning has highlighted the need for effective methods to monitor and improve student experiences. Emotions play a crucial role in shaping students’ engagement, motivation, and satisfaction in online learning environments, particularly in complex STEM subjects. In this context, sentiment analysis has emerged as a promising tool to detect and classify emotions expressed in textual and visual forms. This study offers an extensive literature review, using the preferred reporting items for systematic reviews and meta-analyses (PRISMA) technique, on the role of sentiment analysis in student satisfaction and online learning in STEM subjects. The review analyses the applicability, challenges, and limitations of text- and facial-based sentiment analysis techniques in educational settings by examining 57 of the 236 peer-reviewed research articles, published between 2015 and 2023, that were initially identified through a comprehensive search strategy. These articles were selected, through an extensive search and scrutiny process, based on their relevance and contribution to the topic. The review’s findings indicate that sentiment analysis holds significant potential for improving student experiences, encouraging personalised learning, and promoting satisfaction in the online learning environment. Educators and administrators can gain valuable insights into students’ emotions and perceptions by employing computational techniques to analyse and interpret emotions expressed in text and facial expressions. However, the review also identifies several challenges and limitations associated with sentiment analysis in educational settings. These include the need for accurate emotion detection and interpretation, cultural and linguistic variations, data privacy and ethics, and a reliance on high-quality data sources. Despite these challenges, the review highlights the immense potential of sentiment analysis in transforming online learning experiences in STEM subjects and recommends further research and development in this area.

1. Introduction

The global online learning market is expected to surpass USD 848.12 billion by 2030 [1]. Online learning has grown in popularity, especially in response to the COVID-19 pandemic. As a result, educators have had to adapt their teaching methods to accommodate the needs of online learners. One of the most challenging problems for instructors is keeping learners engaged in the learning process. User engagement is a user experience attribute that assesses a person’s cognitive, affective, and behavioural investment when interacting with a digital system [2]. In contrast, learner engagement refers to students’ level of attention, interest, optimism, and passion when learning or being taught [3]. Engagement is critical for learning because it increases learners’ likelihood of retaining information and completing the course [4,5]. Keeping students motivated in an online learning environment can be challenging because of distractions, such as noise from family members or pets and technology issues, as well as demands on self-discipline and time management. To ensure students’ success, educators must develop innovative ways to keep them engaged and motivated. Interactive technology is one technique for increasing learner engagement in online learning: by providing a more immersive and dynamic learning environment, it helps keep learners interested [6]. For example, online discussion forums and chat rooms can be used to increase student collaboration and promote active learning, while online quizzes and games can make studying more interesting and engaging while providing students with instant feedback on their progress [7]. Virtual reality (VR) and augmented reality (AR) can also be used to create more engaging and immersive learning experiences. By introducing these technologies into their online courses, educators can help learners stay engaged and motivated in their studies.
Sentiment analysis (SA) is becoming increasingly significant in many fields, including marketing, healthcare, and education, since it enables the capture and evaluation of emotions expressed through various modalities, such as text, facial gestures, electroencephalogram (EEG) signals, and voices [8]. EEG signals are particularly valuable for capturing emotions, as they directly measure brain activity associated with emotional states [9]. Such analyses can provide insights into people’s feelings, attitudes, and opinions, which can inform decision making and improve the quality of services and goods. For example, sentiment analysis of customer feedback can assist businesses in improving their products or services, whereas sentiment analysis of patient feedback can assist healthcare practitioners in improving patient care. Furthermore, sentiment analysis can be used to monitor online conversations for potential threats such as cyberbullying or hate speech, and, in a classroom setting, to analyse students’ facial expressions and infer their emotional states during learning activities. As sentiment analysis continues to develop and expand, it can revolutionise how we understand and interact with each other, both online and offline.
Based on sentiment analysis, it is evident that educators and learners consistently emphasise and value the importance of student engagement in online learning [10]. Sentiment analysis of online conversations and social media activity can reveal students’ views and opinions about online learning and the effectiveness of various engagement tactics such as interactive multimedia, gamification, and personalised learning. In general, sentiment analysis has revealed that students are more likely to be interested and motivated in interactive, collaborative, and immersive online learning settings [11]. Educators can help create a more supportive and effective online learning experience for all students by assessing their feelings towards online learning and using this knowledge to guide their teaching practices. Furthermore, sentiment analysis has also demonstrated that learner engagement positively correlates with academic achievement [11]. When learners actively engage in learning, they are more likely to retain information, apply it to real-world situations, and achieve higher grades. In contrast, disengaged learners are more likely to struggle with motivation and may have lower levels of academic achievement [12]. This highlights the importance of implementing effective engagement strategies in online learning, particularly during the COVID-19 pandemic, which forced many learners to study remotely. By leveraging sentiment analysis to identify the most effective strategies for promoting engagement, educators can help ensure their learners are successful and prepared for future academic and professional pursuits. Figure 1 shows an illustration of text-based and facial sentiment analysis in online learning.
Although sentiment analysis has grown in popularity in recent years to assess people’s attitudes and opinions on various topics, including online learning, several issues are associated with using sentiment analysis to analyse chat, face gestures, and videos. One of the most challenging problems is that different modes of communication can be complicated and multifaceted, making it difficult to capture and understand sentiment effectively [13]. Individuals, for example, might use sarcasm or irony in written or verbal discussions, which sentiment analysis algorithms may find difficult to distinguish [14]. Similarly, facial expressions and body language can convey a wide range of emotions and attitudes, which can be difficult to categorise using automated sentiment analysis approaches [15]. Other challenges with sentiment analysis are the possibility of data bias, language barriers, cultural variations, and individual writing styles, which may all impact chat data.
In light of the issues associated with chat-based and facial gesture sentiment analysis in the context of online and e-learning, there is a clear need for further research to improve the accuracy and reliability of these modalities. Developing more advanced techniques can address the challenges in accurately interpreting text-based and facial expressions and improve the overall accuracy of sentiment analysis. Furthermore, a systematic review study is necessary to analyse the current datasets, techniques, and performance of text-based and facial gesture sentiment analysis to make recommendations to improve sentiment analysis in online learning.
Furthermore, this paper presents four research questions, which are given in Table 1. It is important to note that these research questions are answered in the Results and Analysis section (i.e., Section 5), based on the review and evaluation carried out in our study.
Sentiment analysis is a widely applied technique in various domains, including education, where it has attracted considerable attention from researchers. However, comprehensive and systematic literature reviews on sentiment analysis in education remain scarce. For instance, Oghu et al. [16] conducted a systematic literature review on the techniques, resources, and benefits of sentiment analysis in education; however, they included only a limited number of papers in their study. Similarly, Dolianiti et al. [17] reviewed the current state of sentiment analysis in education, but they did not examine the experimental processes and methods of sentiment analysis in education, nor did they consider the sentiment categories, research objectives, and key findings of the studies. Another study, by Zhou et al. [18], attempted to fill this gap by conducting a systematic literature review on sentiment analysis in education from 2010 to 2020, focusing on the research purposes, sentiment categories, data sources, methods, and findings of the studies. However, this study also had some limitations, such as excluding relevant papers that did not match the search terms, reviewing only journal publications, and not performing a meta-analysis of the included studies. Moreover, most of the existing literature reviews on sentiment analysis in education focus on text-based data, while other forms of data, such as EEG signals, audio, and video, remain under-addressed. For example, Kastrati et al. [19] reviewed the approaches/techniques and solutions for text-based sentiment analysis systems, as well as the evaluation metrics and datasets used for assessing their performance. They discussed the use of natural language processing (NLP) techniques, such as lexicon- and dictionary-based approaches, for text-based SA.
However, they did not address the challenges and opportunities of applying sentiment analysis to other types of data that are increasingly available in educational settings. Wankhade et al. [20] performed a comprehensive and systematic literature review on the methods and evaluation of sentiment analysis tasks and their applications in various domains. They evaluated the strengths and limitations of the methods and identified future research challenges of sentiment analysis regarding both the methods and the data types. However, their survey does not discuss sentiment analysis in the educational and online learning domains.
Therefore, there is a need for a more comprehensive and updated literature review on sentiment analysis in education that covers both text-based and facial-based data sources along with different methods and applications in various educational contexts. This paper provides a holistic overview of the current state of the art and future directions of sentiment analysis in education.
Furthermore, our study emphasises the vast potential of sentiment analysis in reshaping online learning experiences within STEM subjects. In addition, it calls for additional research and advancements in this domain to unlock further potential.

2. Theoretical Background

This section provides an essential foundation for understanding the key concepts and techniques related to sentiment analysis. It encompasses two main subsections, i.e., text-based sentiment analysis and facial sentiment analysis. The former is divided into four levels and the latter is divided into two broad levels, as shown in Figure 2. We further explore these levels of sentiment analysis and investigate the techniques employed in both text and facial sentiment analysis. This section aims to establish a basis for the investigations and findings presented in this paper by comprehensively addressing both text-based and facial sentiment analysis.

2.1. Text-Based Sentiment Analysis

Text-based sentiment analysis uses NLP techniques to extract useful information from a text and classify that information into sentiment classes such as happiness, sadness, stress, anger, and other emotions [21]. Its aim is to extract the required information and meaningful insights from text, allowing educators to make informed decisions. Text-based sentiment analysis can be used in areas such as emotion detection from Twitter posts, detecting learners’ emotions about teaching methods and content using student responses or surveys, and detecting consumers’ emotions when they use a product or service. For example, the authors in [22] used sentiment analysis on Twitter data to gather comments on movies before they were released. This study forecast box-office revenues for upcoming movies and discovered a significant association between the amount of attention an upcoming movie receives on Twitter and its future ranking. Another study used sentiment analysis on mobile app reviews to select highly rated apps for autism spectrum disorder (ASD) [23]. Similarly, the authors in [24] used sentiment analysis in the educational domain, utilising students’ chat data during online learning sessions. This study explored the effectiveness of different tutoring strategies and teaching content by classifying students’ responses into positive and negative emotions.
After building our basic understanding of text-based sentiment analysis, we further explore it by investigating two key aspects: the levels of analysis and the range of techniques utilised in this field.

2.1.1. Text-Based Sentiment Analysis Levels

Text-based sentiment analysis can be applied on four levels of granularity, i.e., (1) document-level analysis, (2) sentence-level analysis, (3) phrase-level analysis, and (4) aspect-level analysis. The different levels of text-based sentiment analysis are further discussed below:
(a)
Document-Level Sentiment Analysis: Document-level sentiment analysis analyses an entire document and assigns a single polarity to the document as a whole [20]. While this approach is not frequently used, it can be helpful in categorising chapters or pages of a book as positive, negative, or neutral. For example, an article on “using technology in the educational domain to enhance student learning” could be analysed at the document level to determine the overall sentiment expressed about the impact of technology integration on student learning outcomes. Both supervised and unsupervised learning methods can be used to classify the document [25]. One of the most significant challenges at this level is cross-domain and cross-language sentiment analysis [25]. For instance, domain-specific sentiment analysis can achieve high accuracy while maintaining domain sensitivity, but such tasks require a restricted set of domain-specific words to be used as the feature vector.
(b)
Sentence-Level Sentiment Analysis: Sentence-level sentiment analysis evaluates the emotions of individual sentences within a text. For example, for the sentence “The teacher explained this topic very well”, sentence-level sentiment analysis would classify the sentence as positive. Compared to document-level sentiment analysis, this approach provides a more granular analysis. For sentence-level sentiment analysis, machine learning methods such as naive Bayes (NB) and support vector machines (SVMs), as well as deep learning models, e.g., recurrent neural networks (RNNs) and convolutional neural networks (CNNs), can be utilised [26]. Handling negation, sarcasm, and irony, and dealing with domain-specific language and jargon are typical challenges at this level. Despite these challenges, sentence-level sentiment analysis has applications in various domains, including customer feedback analysis, social media monitoring, and product review analysis. By analysing the sentiment of individual sentences, this approach provides a more thorough understanding of the sentiment expressed in a text, which can be used to improve decision making and consumer satisfaction.
(c)
Phrase-Level Sentiment Analysis: At the phrase level, the sentiment of individual phrases or expressions within a sentence or text is analysed. For example, in “The teaching style and content of the topic helped me in understanding this complex information”, the phrase “helped me in understanding this complex information” indicates a positive sentiment, as it suggests that the teaching style and content effectively facilitated the comprehension of complex information. Compared to sentence-level sentiment analysis, this approach provides an even more granular analysis. For phrase-level sentiment analysis, machine learning algorithms such as SVMs, decision trees, and neural networks, as well as rule-based techniques, are often utilised [27]. Dealing with context-dependent phrases, idiomatic expressions, and ambiguous words are the most common challenges at this level. Nevertheless, phrase-level sentiment analysis has many applications, including customer review analysis, social media monitoring, and opinion mining. By analysing the sentiment of individual phrases, this approach provides a more sophisticated understanding of the sentiment expressed in a text, which can aid decision making and consumer satisfaction.
(d)
Aspect-Level Sentiment Analysis: The sentiment of specific aspects or features of a product or service described in a text is analysed using aspect-level sentiment analysis. This method is beneficial for businesses as it reveals which components of their product or service customers most appreciate or dislike. Machine learning algorithms such as SVM, RNN, and CNN are commonly used for aspect-level sentiment analysis [28]. In a massive open online course (MOOC) review, for example, aspect-level sentiment analysis would analyse the sentiment of specific factors such as learning material, teaching quality, and instructor experience, rather than just giving an overall positive or negative rating.
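To make the aspect-level approach concrete, the following minimal rule-based sketch scores the words surrounding each aspect term in a MOOC-style review. It is our own illustration: the lexicon, aspect terms, and context window are invented for this example and are not taken from any cited study.

```python
# Minimal rule-based aspect-level sentiment sketch; the lexicon, aspect
# terms, and context window below are illustrative assumptions only.
LEXICON = {"excellent": 2, "clear": 1, "helpful": 1,
           "outdated": -1, "confusing": -2, "poor": -2}
ASPECTS = {"content": "learning material", "instructor": "teaching quality"}

def aspect_sentiment(review, window=2):
    """Score each aspect by summing lexicon scores of nearby words."""
    tokens = review.lower().replace(".", " ").split()
    scores = {}
    for i, tok in enumerate(tokens):
        if tok in ASPECTS:
            nearby = tokens[max(0, i - window): i + window + 1]
            scores[ASPECTS[tok]] = sum(LEXICON.get(w, 0) for w in nearby)
    return scores

review = "The content was confusing and outdated but the instructor was clear and helpful"
print(aspect_sentiment(review))
# {'learning material': -2, 'teaching quality': 1}
```

A real system would replace the hand-built lexicon and fixed window with a learned model such as the SVM-, RNN-, or CNN-based classifiers cited above, but the sketch shows why aspect-level analysis yields per-factor scores rather than a single overall rating.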

2.1.2. Text-Based Sentiment Analysis Techniques

Text-based sentiment analysis is a prominent and widely used form of sentiment analysis, so various techniques have been developed to perform sentiment analysis. Lexicon-based, machine-learning-based, and hybrid methods are some of the most commonly used techniques in text-based sentiment analysis. Figure 3 provides an overview of different sentiment analysis techniques for emotion detection and classification. This section focuses on two prominent approaches: the lexicon-based and machine learning approaches. It will provide the necessary understanding of the methodologies employed to extract sentiment information from textual data, highlighting their distinctive methodologies and practical applications.
(a)
Lexicon-Based Approach: Lexicon-based sentiment analysis is an established method that determines the sentiment of a text by using pre-defined dictionaries of terms and their associated sentiment scores. Each word in the text is scored based on its polarity, i.e., positive, negative, or neutral. The sum or average of the scores of the words in the text is then used to calculate the overall sentiment of the text [30]. Lexicon-based techniques have the advantage of being reasonably simple to implement and requiring minimal annotated data for training. However, the coverage and quality of the lexicons used may limit their accuracy [31]. To address this limitation, researchers have developed various lexicons specific to particular domains or languages [32]. Furthermore, lexicon-based approaches can be combined with other techniques, such as part-of-speech tagging and syntactic parsing, to improve the accuracy of the sentiment analysis [33]. Some of the prominent techniques of lexicon-based sentiment analysis used in the literature are given in Table 2.
(b)
Machine Learning Approach: Machine-learning-based approaches have been widely used in sentiment analysis due to their ability to learn complex patterns and relationships in data automatically. One prominent strategy is using supervised learning algorithms such as SVM, NB, and decision trees to classify text as positive, negative, or neutral based on labelled training data [49]. Unsupervised learning is another strategy, which involves grouping similar documents based on their sentiment using techniques such as k-means clustering or latent Dirichlet allocation (LDA) [50].
Deep learning approaches, such as RNNs and CNNs, have also been successfully used in sentiment analysis tasks [51]. These algorithms require large amounts of training data and computational resources, yet they can achieve excellent accuracy and generalise across domains and languages. Overall, machine-learning-based techniques provide a robust and adaptable method for sentiment analysis on text data. A list of the most commonly used machine learning techniques for sentiment analysis is given in Table 3.
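As a hedged illustration of the supervised approach described above, the sketch below implements a multinomial naive Bayes classifier from scratch on a handful of invented student-feedback sentences. The training data and whitespace tokenisation are deliberately toy-sized assumptions; real systems train on large labelled corpora with proper preprocessing.

```python
import math
from collections import Counter, defaultdict

# Toy labelled training data -- invented student feedback, not a real dataset.
train = [
    ("the lecture was clear and engaging", "positive"),
    ("i really enjoyed the interactive quizzes", "positive"),
    ("great explanations and helpful examples", "positive"),
    ("the session was boring and confusing", "negative"),
    ("i struggled to follow the poor audio", "negative"),
    ("too fast and very frustrating", "negative"),
]

def fit(data):
    """Estimate class priors and per-class word counts (multinomial NB)."""
    priors, counts, vocab = Counter(), defaultdict(Counter), set()
    for text, label in data:
        priors[label] += 1
        for w in text.split():
            counts[label][w] += 1
            vocab.add(w)
    return priors, counts, vocab

def predict(text, priors, counts, vocab):
    """Pick the class maximising log P(class) + sum of log P(word|class),
    using add-one (Laplace) smoothing for unseen words."""
    best, best_score = None, -math.inf
    total_docs = sum(priors.values())
    for label in priors:
        score = math.log(priors[label] / total_docs)
        denom = sum(counts[label].values()) + len(vocab)
        for w in text.split():
            score += math.log((counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

priors, counts, vocab = fit(train)
print(predict("the quizzes were clear and engaging", priors, counts, vocab))
```

The same pipeline shape (fit on labelled text, predict a polarity label) underlies the SVM and deep learning classifiers discussed above; only the model in the middle changes.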

2.2. Facial Sentiment Analysis

Facial sentiment analysis (FSA) is an emerging field that has gained much attention recently due to its potential applications in various domains, including healthcare, education, marketing, and entertainment. FSA involves computer vision techniques to recognise facial expressions and automatically extract individuals’ emotional states. The main goal of FSA is to understand human emotions and behaviour by analysing facial features such as eyebrows, eyes, nose, and mouth. FSA has been utilised in many applications, such as emotion recognition, stress detection, and deception detection. Facial expression recognition to gauge students’ emotional responses offers clear benefits over relying solely on self-reported data. Self-reports can be influenced by subjectivity and dishonesty, as students may not accurately convey their true feelings. Facial expressions, on the other hand, provide an objective, measurable indicator of emotions that is not affected by conscious biases. The automatic nature of facial recognition also leads to more natural, instinctual reactions from students rather than ones filtered through deliberation. We anticipate that real-time feedback promotes the generation of novel solutions, collaboration, and student engagement, which have a strong pedagogical impact on learning and teaching methodologies. Deep learning algorithms have significantly improved FSA performance in recent years, allowing for more accurate and reliable analysis.
Several studies have explored the potential of FSA in various domains. For instance, FSA has been used in healthcare to recognise emotions in patients with depression, anxiety, and bipolar disorder. A study [66] developed a deep-learning-based FSA model to identify depression in patients by analysing their facial expressions. The model achieved an accuracy of 93.3%, demonstrating FSA’s potential in detecting mental disorders. In the education domain, FSA has been utilised to monitor students’ engagement and attention levels during online learning [67]. This section will explore the different levels at which facial expressions are analysed to determine emotional states and investigate various methodologies employed to detect and interpret emotions from facial expressions.

2.2.1. Levels of Facial Sentiment Analysis

FSA can be performed at different levels depending on the level of granularity required to extract emotional information from facial expressions. The three primary levels of FSA are (1) face-level analysis, (2) region-level analysis, and (3) landmark-level analysis, which are further discussed below:
(a)
Face-Level Analysis: At the face level, FSA involves recognising emotions shown by individuals through facial expressions. This type of analysis is critical in fields including psychology, marketing, and human–computer interaction, where understanding emotions and their effects is vital. Several approaches to performing face-level sentiment analysis have been proposed, including rule-based, feature-based, and deep-learning-based methods. A recent study proposed a deep-learning-based approach for facial sentiment analysis that outperformed traditional methods [68]. The proposed model utilised a CNN for feature extraction and a long short-term memory (LSTM) network for sequence modelling. Another study proposed a method for facial sentiment analysis that utilised a set of hand-crafted features, including facial action units and their combinations, to train an SVM classifier [69]. The proposed method achieved a high accuracy of 89.5% on the AffectNet dataset [70]. These studies highlight the effectiveness of deep-learning-based and feature-based approaches for face-level sentiment analysis.
(b)
Region-Level Analysis: Region-level facial sentiment analysis involves analysing the emotional expressions of specific regions of the face, such as the eyes, mouth, or eyebrows. This approach allows for a more fine-grained analysis of emotional expression and can provide insights into the emotions conveyed. Various region-level facial sentiment analysis techniques have been proposed, including deep-learning-based methods such as CNNs and RNNs [71]. These techniques have been shown to achieve high accuracy in detecting emotions from specific regions of the face, such as the eyes or mouth. Other approaches include the use of geometric features and hand-crafted features, such as local binary patterns (LBPs) and histograms of oriented gradients (HOGs), which have also been shown to be effective in region-level sentiment analysis [72]. However, region-level analysis is still challenging due to occlusion and variations in expression intensity, which can impact emotion recognition accuracy [73]. Overall, region-level facial sentiment analysis has shown promise in improving the accuracy and granularity of emotion recognition from facial expressions. Further research is needed to address the remaining challenges, such as subtle emotion recognition, subjectivity, individual variance, and contextual understanding.
(c)
Landmark-Level Analysis: Facial landmark detection is another approach used for FSA. It is based on extracting facial landmarks, defined as critical points on the face such as the corners of the mouth, nose, and eyes. These landmarks are then used to extract features for emotion recognition. This technique has been used in recent studies of FSA. For example, one study used facial landmarks to recognise emotions in YouTube videos and developed an effective feature selection algorithm to determine the optimal features for further improving the performance of multimodal sentiment analysis [74]. Another study utilised facial landmarks for emotion recognition in the context of social robotics [75]. The landmark-level approach is considered more accurate than the face- or region-level approaches, as it captures subtle changes in facial expressions that can be missed at the higher levels. However, it requires more computational resources and may not be suitable for real-time applications such as driver monitoring systems and emotion recognition in video conferencing.
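The geometric features derived at the landmark level can be illustrated with a small sketch. The landmark coordinates below are made-up values standing in for the output of a real detector (e.g., dlib or MediaPipe); the mouth aspect ratio computed here is one simple feature that a downstream emotion classifier might consume.

```python
import math

# Hypothetical (x, y) landmark coordinates; in practice these come from a
# landmark detector such as dlib or MediaPipe. The values are invented.
landmarks = {
    "mouth_left":  (48.0, 92.0),
    "mouth_right": (78.0, 92.0),
    "lip_top":     (63.0, 88.0),
    "lip_bottom":  (63.0, 97.0),
}

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mouth_aspect_ratio(lm):
    """Ratio of mouth opening (vertical) to mouth width (horizontal).
    Higher values suggest an open mouth; geometric ratios like this are
    typical inputs to a landmark-level emotion classifier."""
    width = dist(lm["mouth_left"], lm["mouth_right"])
    opening = dist(lm["lip_top"], lm["lip_bottom"])
    return opening / width

print(round(mouth_aspect_ratio(landmarks), 3))  # 0.3
```

Because such ratios are scale-invariant, they tolerate faces at different distances from the camera, which is part of why landmark-level features capture subtle expression changes that coarser levels miss.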

2.2.2. Facial Sentiment Analysis Techniques

Computer vision and machine learning algorithms are typically employed in FSA approaches to analyse facial expressions and extract emotional information from human faces. Various methodologies are applied in these techniques, such as facial landmark identification to locate significant facial characteristics, facial expression recognition to classify emotions, and intensity estimation to quantify the strength of displayed emotions. Deep learning models, such as CNNs, RNNs, or hybrid architectures, are commonly used in these methodologies. These algorithms are trained on massive datasets of labelled facial expressions to learn patterns and generate reliable predictions. Additionally, some techniques analyse other facial characteristics, such as eye movements, head pose, or micro-expressions, to improve sentiment analysis accuracy. With advances in computer vision and machine learning, FSA techniques are evolving, enabling applications in domains as diverse as market research, customer service, education, and human–computer interaction. Several techniques can be applied at each level of FSA; some of the commonly used ones are given below in Table 4.

2.3. Other Modalities

While our work primarily focuses on text-based and facial sentiment analysis, it is indeed important to acknowledge that other physiological modalities, such as EEG and heart rate monitoring, have also been utilised to assess students’ cognitive and emotional states [99,100]. These modalities can also provide direct and objective indicators of students’ emotional and mental processes and complement the information obtained from facial expressions and text. Therefore, it is important to consider the potential benefits of using multiple physiological modalities for sentiment analysis and engagement detection in education.
EEG and heart rate monitoring-based methods help measure students’ emotions and engagement because they can capture the changes in brain activity and physiological arousal associated with different emotional states and levels of cognitive load. The EEG contains a broader range of information about a subject’s mental state compared to other biosignals [9,101]. It measures various parameters that reflect the cognitive and emotional engagement of students. Similarly, heart rate monitoring can measure variations in students’ heartbeats that indicate stress, excitement, boredom, or frustration. When combined with facial expressions and text, these methods can enrich data that might otherwise be affected by students’ self-reporting bias, social desirability, or cultural differences. Moreover, these methods can offer teachers and students more continuous, real-time feedback, which can help them adjust their teaching and learning strategies accordingly. Therefore, EEG and heart rate monitoring-based methods can benefit sentiment analysis and engagement detection in education.
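As a simple illustration of the kind of feature derived from heart rate monitoring, the sketch below computes RMSSD (root mean square of successive differences), a standard time-domain heart-rate-variability measure, from invented RR intervals; the interval values are illustrative, not real recordings.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals.
    RMSSD is a standard time-domain heart-rate-variability measure;
    lower values are often associated with higher stress or arousal."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR intervals (milliseconds between successive heartbeats).
rr = [812, 798, 805, 790, 820, 808]
print(round(rmssd(rr), 2))  # 17.4
```

Features of this kind, computed over a sliding window, could be fed alongside textual and facial sentiment scores into the kind of multimodal engagement models discussed above.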
Our work complements the existing literature by providing a comprehensive overview of sentiment analysis in online STEM learning. While focusing on textual and facial sentiment analysis, we recognise the value of integrating physiological measures. Future research could combine sentiment analysis with EEG data to better understand emotional and cognitive engagement processes. By encouraging interdisciplinary exploration, these approaches promise richer insights into student experiences, enhancing online learning environments and enriching discussions on improving STEM engagement.

3. Method

We conducted a systematic literature review using the preferred reporting items for systematic reviews and meta-analyses (PRISMA) technique [102]. PRISMA is a widely recognised and acknowledged guideline for comprehensive reporting of systematic reviews and meta-analyses. Following PRISMA guidelines, the systematic literature review consists of two main stages: (i) the planning stage, which includes search and inclusion/exclusion strategies of research articles; and (ii) data extraction and analysis. These approaches provide a holistic view of the research landscape through different perspectives and form relationships between the diverse layers of multifaceted research questions provided in Section 1.

3.1. Search Strategy

A search was conducted using the terms “Sentiment Analysis”, “Text-Based Sentiment Analysis”, “Facial Sentiment Analysis”, and “Sentiment Analysis in Online Learning/Education” to retrieve different studies on the use of sentiment analysis in online learning. A thorough search was performed across various databases, including ACM, ScienceDirect, IEEE Xplore, Medline, Scopus, and Google Scholar, resulting in time-relevant sources from 2015 to 2023. The logical operators “OR” and “AND” were used to combine search terms, and duplicate records were subsequently removed. The search was based on the metadata (e.g., title, abstract, and keywords). We also tried certain synonyms to cover all possible content.

3.2. Inclusion/Exclusion Criteria

The authors scrutinised the returned articles and excluded surveys, short papers lacking sufficient detail, non-peer-reviewed articles, pre-prints, and articles not written in English. The keywords listed above returned articles and technical reports from various publication databases, generating a total of 236 articles. After initial scrutiny based on the articles’ relevance and the state-of-the-art methodologies used, 57 articles were selected and categorised into three categories: sentence-based sentiment analysis, aspect-based sentiment analysis in online learning/education, and facial sentiment analysis in online learning. Figure 4 depicts the inclusion/exclusion criteria based on the PRISMA technique.

4. Results and Analysis

This section explores, analyses, and answers the study’s research questions outlined in Section 1, Table 1. The answers provided are based on the data available in the current studies and the authors’ analysis of each research question. The studies included in the investigation utilised diverse datasets collected from various sources, such as social media posts, online forums, and learning management systems (LMSs). These datasets encompass a range of educational subjects and grade levels. However, the studies have yet to specifically examine the generalisation of sentiment analysis algorithms to different academic domains and grade levels. Most research has focused on performance within a particular subject area or grade level. The ability of sentiment analysis models to maintain accuracy when applied to new subjects and grade groups remains an open question that could be explored in future work. The section is structured into four distinct subsections, addressing each research question. We start by focusing on text-based sentiment analysis, showcasing sentence-based and aspect-based sentiment analysis studies and highlighting the datasets, employed techniques, and resultant findings. In the second subsection, the challenges and limitations encountered when capturing learners’ emotions through text-based sentiment analysis are addressed, and a summary is provided in the form of a table. Afterwards, the utilisation of FSA in the educational domain is addressed in the third subsection, with an overview of the relevant studies presenting the methodology, extracted emotion classes, and resultant findings. Finally, the fourth subsection discusses the challenges and limitations associated with capturing learners’ emotions using FSA. Different aspects, including cultural differences in facial expressions and dependence on lighting and camera quality, are also explored.
Overall, this section will provide readers with insights from the authors about text-based SA and FSA in the educational domain.

4.1. RQ1: Use of Text-Based Sentiment Analysis in Educational Domain

Text-based sentiment analysis techniques have played a significant role in online learning environments, providing valuable insights into students’ emotions and sentiments expressed through text. NLP and machine learning algorithms are used in these techniques to analyse and categorise text documents based on their sentiment, which can be positive, negative, or neutral. To extract sentiment information from online learning materials, discussion forums, social media platforms, and student feedback, approaches such as bag-of-words and sentiment lexicons, along with machine learning models such as NB, SVM, and RNN, have been used.
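To make the bag-of-words plus classifier pipeline concrete, the following minimal from-scratch sketch trains a multinomial Naive Bayes sentiment classifier on toy student-feedback sentences. The data, vocabulary, and labels are hypothetical; the reviewed studies use far larger corpora and established libraries rather than this hand-rolled version.

```python
# Minimal bag-of-words + multinomial Naive Bayes with Laplace smoothing.
import math
from collections import Counter, defaultdict

def train_nb(texts, labels):
    word_counts = defaultdict(Counter)   # class -> word frequencies
    class_counts = Counter(labels)
    for text, label in zip(texts, labels):
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, class_counts, vocab

def predict_nb(model, text):
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label in class_counts:
        lp = math.log(class_counts[label] / total)        # class prior
        n_tokens = sum(word_counts[label].values())
        for w in text.lower().split():
            if w in vocab:                                # skip unseen words
                lp += math.log((word_counts[label][w] + 1)
                               / (n_tokens + len(vocab)))  # smoothed likelihood
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical student-feedback training data.
texts = ["the lecture was clear and helpful",
         "great tutorial with helpful examples",
         "the class was boring and confusing",
         "confusing assignment badly explained"]
labels = ["positive", "positive", "negative", "negative"]
model = train_nb(texts, labels)
assert predict_nb(model, "clear helpful tutorial") == "positive"
assert predict_nb(model, "boring and confusing") == "negative"
```

The same interface extends naturally to a third “neutral” class by adding neutral training sentences.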
To answer this research question, the authors analysed the current literature on text-based emotion classification of learners’ responses. Moreover, we identified the datasets and techniques used for text-based sentiment analysis in the educational domain. Between 2015 and 2023, 37 papers were published on sentiment analysis of text. Of these, 25 papers focused on analysing sentiment at the sentence level, while the remaining 12 concentrated on the aspect level. Table 5 illustrates the studies on sentence-based sentiment analysis, including the datasets utilised, techniques employed, extracted emotion classes, and the resultant findings.
Table 6 shows the aspect-based sentiment analysis studies published using educational datasets while using different techniques.

4.2. RQ2: Challenges/Limitations to Capture Learners’ Emotions Using Text-Based Sentiment Analysis

There has been growing interest in recent years in using sentiment analysis to capture learners’ emotions and improve the efficacy of educational systems. Text-based sentiment analysis, which can automatically analyse enormous volumes of data from multiple sources such as social media, discussion forums, and chat logs, is a promising strategy for achieving this goal. However, accurately detecting and interpreting learners’ emotions from textual data poses significant challenges and limitations.
The authors investigated some of the challenges and limitations characterised in recent studies and provided suggestions for possible solutions. A comprehensive review of the relevant literature, including recent articles on sentiment analysis in education, supports our findings. Some of the significant challenges are shown in Table 7 and briefly discussed below:

4.2.1. Classification of Students’ Textual Utterances

Student utterances are spontaneous words, such as “Wow! that topic was interesting”, produced while performing a task or providing feedback about a specific event [140]. Student utterances are considered one of the effective ways of recording student feedback about a pedagogical activity and help in understanding students’ emotions [152]. However, recording and classifying students’ utterances are challenging tasks, as these utterances are usually very short and convey little factual information. For textual utterances, chat data from online learning platforms has been analysed with text-based sentiment analysis techniques to detect emotion [153]. Several studies used automatic chatbots to record and analyse student chat data using lexicon-based approaches (e.g., corpus- and dictionary-based) and machine learning techniques (rule-based and linear classifiers) [154,155]. These sentiment analysis techniques provide substantial results in emotion detection; however, they require further enhancement to provide an error-free classification of utterances into emotion classes (i.e., happy, sad, angry).
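The lexicon-based approach mentioned above can be sketched with a tiny hand-made dictionary; the word lists and negation rule here are illustrative only, whereas real systems use large resources such as SentiWordNet and handle negation and intensifiers far more robustly.

```python
# Toy lexicon-based scorer for short student utterances.
LEXICON = {"interesting": 1, "wow": 1, "great": 1, "love": 1,
           "boring": -1, "confusing": -1, "hate": -1, "hard": -1}
NEGATORS = {"not", "never", "no"}

def score_utterance(text):
    tokens = [t.strip("!?.,'\"").lower() for t in text.split()]
    score, negate = 0, False
    for tok in tokens:
        if tok in NEGATORS:
            negate = True                     # flip the next sentiment word
        elif tok in LEXICON:
            score += -LEXICON[tok] if negate else LEXICON[tok]
            negate = False
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

assert score_utterance("Wow! that topic was interesting") == "positive"
assert score_utterance("This lecture is not interesting") == "negative"
```

The sketch also shows why short utterances are hard to classify: any word outside the lexicon contributes nothing, so a brief message can easily score as neutral.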

4.2.2. Emotion Classes Overlapping

Class imbalance and class overlap are well-known classification challenges in machine learning that have occupied researchers for over a decade. The class overlap problem usually occurs due to ambiguous meanings/regions in the data, where a sample qualifies for more than one class with reasonable probability [156]. Minor differences between samples of two different classes are typically challenging to capture solely using data characteristics suggested by domain experts. Imbalanced learning has recently received extensive attention from the scientific community, resulting in several dedicated approaches and algorithms, such as the synthetic minority oversampling technique (SMOTE) and other over- and under-sampling techniques [157]. The research community previously neglected the issue of imbalanced data in the context of sentiment classification, since most datasets analysed were deliberately designed to contain equal numbers of positive and negative reviews [158]. However, where positive ratings often outnumber negative ones, the balanced data assumption does not apply. The authors in [159] were among the first to recognise the issue and demonstrate that utilising realistic imbalanced datasets produced classifiers that performed significantly better in practice. SMOTE was recently applied to over-sample text representations built by a recursive neural tensor network in order to utilise an imbalanced dataset for emotion classification [160]. Class overlap between different emotions makes it challenging to classify learner emotions accurately: a learner’s sentence such as “Oh, I made a mistake” exhibits characteristics of both the “sad” and “angry” classes, which makes it difficult for a machine learning algorithm to classify.
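The core idea of SMOTE can be sketched in a few lines: synthetic minority samples are created by interpolating between a minority point and another minority point. This simplified version picks any other minority sample rather than one of the k nearest neighbours, so it is an assumption-laden illustration; practical work should use imbalanced-learn’s SMOTE implementation.

```python
# Simplified SMOTE-style oversampling: interpolate between minority samples.
import numpy as np

def smote_like(X_minority, n_new, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    X = np.asarray(X_minority, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i, j = rng.choice(len(X), size=2, replace=False)  # a point and a partner
        lam = rng.random()                                # interpolation factor
        synthetic.append(X[i] + lam * (X[j] - X[i]))      # point on the segment
    return np.array(synthetic)

# Hypothetical 2-D feature vectors of an under-represented emotion class.
minority = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
new_points = smote_like(minority, n_new=5)
assert new_points.shape == (5, 2)
# Interpolated points stay within the coordinate range of the originals.
assert (new_points >= 0).all() and (new_points <= 1).all()
```

For text, the same interpolation is applied to dense representations (e.g., embeddings), as in the recursive neural tensor network study cited above.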

4.2.3. Dealing with Bipolar Words

Sentiment analysis is a field of classification that extracts polarity from text and classifies it into different sentiment classes (i.e., happy, sad, angry, etc.). When two words with conflicting polarities are combined, they produce an overall contradiction that can exhibit both positive and negative behaviour; such combinations are known as bipolar phrases. Consider the phrase “passing marks”, which generally has a positive polarity but may also be negative in the context of student progress. Salas-Zárate et al. [161] worked in the health domain and used N-grams to find the polarity of aspects through aspect-level sentiment analysis techniques. Applying N-grams to Twitter data, they achieved 81.93% precision in classifying sentiment into different classes. Another study used “SentiWordNet” to handle bipolar words in sentiment analysis, analysing statistical techniques for affective computing [162]. Bipolarity is a common problem in learners’ sentiment analysis, as learner feedback often contains double-meaning words. For example, the feedback sentence “The exam was quite easy, but I was short of time to attempt all questions” is bipolar, and conventional machine learning algorithms underperform when classifying the emotions in such sentences. Another problem in learner sentiment classification is confusing sentences, as learners express their feelings about any event or topic of discussion. Therefore, we expect an unstructured answer/response containing certain bipolar words, such as “challenging”, that must be adequately addressed before classification into emotion classes.
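One simple way to handle bipolar words, sketched below, is to make polarity conditional on the aspect the word modifies, in the spirit of the aspect-level approaches cited above. The word–aspect pairs and their signs are entirely hypothetical and serve only to show the mechanism.

```python
# Context-dependent polarity: the same word can flip sign by aspect.
CONTEXT_POLARITY = {
    ("challenging", "course"): +1,    # intellectually stimulating
    ("challenging", "deadline"): -1,  # a source of stress
    ("easy", "exam"): +1,
    ("easy", "grading"): -1,          # may signal low standards
}

def polarity(word, aspect):
    """Return +1/-1 for known (word, aspect) pairs, 0 (neutral) otherwise."""
    return CONTEXT_POLARITY.get((word.lower(), aspect.lower()), 0)

assert polarity("challenging", "course") == 1
assert polarity("challenging", "deadline") == -1
assert polarity("challenging", "weather") == 0  # unknown pair -> neutral
```

A lexicon of this shape can back off to a general-purpose resource such as SentiWordNet when no aspect-specific entry exists.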

4.2.4. Fake Comments/Responses

One of the limitations of existing student response systems (SRSs) is fake responses from learners during course/teaching evaluation. Most SRSs are predesigned with objective questions, such as “rate the instructor’s teaching methodology (1 to 5)”, about the teaching content and methods and the learners’ satisfaction with both. Learners often select the first available option due to a lack of interest and motivation, yet educational institutes rely on these responses to evaluate a course or instructor’s performance. Several studies have worked on identifying fake responses in the form of fake news and fabricated product reviews. E. Tacchini et al. [163] used a dataset of 15,500 posts from 909,236 users to detect fake news on Facebook. They used logistic regression (LR) and the Boolean label crowdsourcing (BLC) algorithm for fake news detection, achieving 94.3% and 90% detection accuracy, respectively. S. Aphiwongsophon [164] discussed the impact of fake news in today’s technologically advanced society, noting that detecting fake news is a critical task. Their study aimed to identify fake news using three machine learning classifiers: NN, SVM, and NB. NB detected fake news with an accuracy of 96.08%, while the SVM and the NN achieved an accuracy of 99.90%. All of the above studies focused on fake news detection; however, detecting fake responses from learners providing feedback still needs substantial work.
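A first line of defence against careless or fake SRS responses, not taken from the cited studies but a common survey-quality heuristic, is to flag “straight-lining”: respondents who pick the same option for every item. The sketch below implements this check on hypothetical 1–5 Likert responses.

```python
# Heuristic flagging of likely careless SRS responses (straight-lining).
def is_straight_lined(ratings, min_items=3):
    """True if every rating is identical across at least `min_items` items."""
    return len(ratings) >= min_items and len(set(ratings)) == 1

# Hypothetical responses to a five-item course evaluation.
responses = {
    "student_a": [1, 1, 1, 1, 1],   # picked the first option throughout
    "student_b": [4, 5, 3, 4, 4],
}
flagged = [s for s, r in responses.items() if is_straight_lined(r)]
assert flagged == ["student_a"]
```

In a real deployment, such flags would only down-weight responses or trigger review, since a genuinely uniform opinion can also produce identical ratings.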

4.2.5. Lack of Reliable Data for Training and Evaluation

Researchers have pointed out certain challenges that sentiment analysis models face regarding chat-based data. One of the critical challenges they identified is the scarcity of labelled data, which can significantly impact the accuracy of the models. Labelled data is essential for training supervised machine learning models to classify text data into sentiment categories accurately. However, in chat-based scenarios, acquiring labelled data can be challenging due to the dynamic nature of conversations and the need for human annotators to label the data manually. Moreover, the quality of the labelled data can also impact the performance of the models. Patil and Nandi [147] emphasised that labelled data should be diverse, representative of the target population, and annotated by experts to ensure accuracy and reliability.
There is a growing need to collect and utilise real-time utterance data from learners to address these challenges. Real-time data can capture the context of conversations, the dynamic nature of the interactions, and changes in sentiment over time. This data can then be used to train sentiment analysis models that are more accurate and robust. However, the collection and analysis of real-time data also pose significant ethical and privacy concerns, which must be addressed appropriately. In conclusion, the scarcity and quality of labelled data remain a significant challenge for sentiment analysis models, especially in chat-based scenarios. Collecting real-time utterance data from learners can help to overcome these challenges and improve the accuracy and effectiveness of sentiment analysis models in educational contexts.

4.2.6. Identification of Sarcasm and Irony in Chat Data

Student chat can often include sarcasm and irony, which pose significant challenges for sentiment analysis. As noted by the authors of [148], online discussions frequently employ linguistic devices such as acronyms, abbreviations, and emoticons, and failing to account for them can lead to inaccurate sentiment analysis results. The difficulty is compounded by the absence of contextual information in chat-based data, making it challenging to identify when a message is sarcastic or ironic. This lack of context can result in sentiment analysis models misclassifying messages and failing to capture students’ nuanced emotions and attitudes in educational settings. Therefore, sentiment analysis models need to account for the impact of sarcasm and irony in student chat to improve the accuracy of sentiment analysis in educational contexts. Several studies, such as [165], have explored the impact of sarcasm and irony on sentiment analysis and consider it one of the main challenges in text-based sentiment analysis.

4.2.7. Unstructured Data

Online learning platforms create massive amounts of unstructured data, such as student essays, forum posts, and social media comments. Incorporating these data sources into sentiment analysis models has the potential to provide a more comprehensive understanding of student emotions. However, as the authors in [149] pointed out, this approach faces various obstacles, including data cleansing and integration. The unstructured nature of the data can also make it challenging to distinguish emotions and attitudes appropriately.
Other studies have pointed out the drawbacks of using unstructured data for sentiment analysis. For example, the authors in [150] stated that the variety of language use and context complexity might lead to sentiment analysis models delivering inaccurate or unreliable results. Furthermore, the authors of [151] stated that the subjective nature of sentiment analysis makes establishing a criterion for evaluating model accuracy challenging.
Despite these challenges, sentiment analysis of unstructured data in educational contexts remains a promising area of research.

4.3. RQ3: Use of Facial Sentiment Analysis in Educational Domain

FSA techniques have emerged as a beneficial tool in online learning environments, enabling the analysis of students’ emotions and sentiments based on facial expressions. Using computer vision and machine learning algorithms, these techniques identify and analyse facial cues, such as changes in facial muscle movements, to infer emotional states such as happiness, sadness, confusion, or engagement. To analyse video recordings or real-time video interactions in online learning platforms, methods such as facial landmark detection, facial emotion recognition, and deep learning models have been used. Using FSA, educators and researchers can gather insights about students’ emotional responses, level of engagement, and comprehension. These data can be used to improve teaching tactics, provide personalised assistance, and create more inclusive and effective online learning environments. As technology advances, FSA algorithms continue to evolve, opening new avenues for improving online learning environments and student outcomes.
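To illustrate the landmark-based idea, the toy sketch below derives a coarse emotional label from a geometric ratio between facial landmarks (2-D pixel coordinates). The coordinates, ratio, and threshold are all hypothetical; production systems locate landmarks with detectors such as dlib or MediaPipe and classify emotions with trained CNNs rather than hand-set rules.

```python
# Toy landmark-geometry heuristic for a coarse emotional label.
import math

def mouth_aspect_ratio(left, right, top, bottom):
    """Mouth height divided by mouth width, from four lip landmarks."""
    return math.dist(top, bottom) / math.dist(left, right)

def coarse_emotion(mar, open_threshold=0.6):
    # Assumption: a wide-open mouth (high MAR) loosely signals surprise.
    return "surprised" if mar > open_threshold else "neutral"

# Hypothetical landmark coordinates (pixels) for an open-mouthed face:
# mouth is 60 px wide and 45 px tall, so MAR = 0.75.
mar = mouth_aspect_ratio(left=(100, 200), right=(160, 200),
                         top=(130, 180), bottom=(130, 225))
assert coarse_emotion(mar) == "surprised"
```

Real FSA replaces the single threshold with a classifier trained on many such geometric and appearance features, which is what makes the cultural and lighting issues discussed below so consequential.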
The authors comprehensively reviewed existing literature on classifying learners’ facial emotions in this research question. Several research studies have been conducted to investigate FSA in educational settings. Our findings revealed that nine papers were published between 2015 and 2023, focusing on sentiment analysis of facial expressions. Table 8 provides an overview of these studies, presenting the techniques utilised, the emotion classes extracted, and the resulting findings.

4.4. RQ4: Challenges/Limitations to Capture Learners’ Emotions Using Facial Sentiment Analysis

The utilisation of FSA has emerged as a highly promising method for capturing the emotions of learners. This approach, however, has several major challenges. One major challenge is the accurate recognition and interpretation of facial expressions, as emotions can be subtle and context-dependent [176]. Additionally, variations in cultural norms and individual differences in facial expressions pose challenges to universal emotion recognition [177]. Furthermore, the presence of confounding factors such as occlusions, variations in lighting conditions, and the complexity of dynamic facial expressions can significantly impact the accuracy of FSA models [178].
Through a systematic literature analysis, several challenges have been identified in capturing learners’ emotions. These challenges encompass various aspects of the process and highlight the complexities of accurately capturing and interpreting learners’ emotional states. Firstly, we examine the limitations in accurately identifying emotions through facial analysis. We then delve into the challenge posed by cultural differences in interpreting emotions. The subsequent point highlights the limited effectiveness of FSA in detecting subtle emotions. Additionally, we address the influence of factors such as lighting conditions and camera quality on the quality of the analysis. Lastly, we explore the challenges of real-time emotion analysis, which arise primarily from computational and processing limitations. These challenges are summarised in Table 9 and discussed in subsequent paragraphs.

4.4.1. Limited Accuracy in Identifying Emotions

Due to the complexity and subjectivity of emotions, face-gesture-based sentiment analysis models face significant challenges in accurately recognising emotions. Emotions are multifaceted and can be influenced by various external and internal factors, challenging their interpretation through facial expressions. Facial expressions do not always adequately reflect an individual’s true feelings [179]. Students, for example, may present facial expressions that may not correspond to their actual emotional states or exhibit emotions that differ from the expected facial expressions generally associated with those emotions.
Further research revealed new challenges in face-gesture-based sentiment analysis. Researchers [180] investigated the shortcomings of facial expression detection algorithms in recognising subtle emotional nuances. They discovered that these algorithms might struggle to detect subtle changes in facial expressions, potentially leading to errors in emotion recognition. In a study by Kaminska et al. [181], the authors evaluated the impact of cultural variances on the recognition of facial emotions. Their findings revealed that variations in facial expression across cultures could reduce the effectiveness of emotion identification models when applied across different cultural contexts.

4.4.2. Cultural Differences in Facial Expressions

Face-gesture-based sentiment analysis models encounter challenges in accounting for cultural differences in facial expressions, leading to inaccurate analysis of student emotions. Cultural differences in facial expressions have been identified as a critical factor influencing the perception and recognition of emotions. Individuals from diverse cultural origins may express emotions via different facial expressions, making it essential to consider the cultural context in sentiment analysis. For example, in some cultures, moving the head from side to side is employed to show agreement or affirmation, similar to saying “yes”, while in Western societies the same movement usually indicates disagreement or negation, equivalent to saying “no”.
Li Y. et al. [182] evaluated the impact of culture on facial expression recognition and discovered substantial differences across cultures. Their findings emphasised the significance of combining cultural knowledge and context when developing accurate and culturally sensitive emotion identification systems. In another study, the authors explored the function of cultural background in emotion perception. They emphasised the importance of considering cultural differences to improve the reliability and validity of emotion analysis models [183].
Furthermore, Zhang et al. [184] explored cross-cultural facial emotion detection and discovered that cultural factors considerably influence the perception of facial emotions. Their findings highlighted the importance of tailoring FSA models to various cultural contexts to produce more accurate and reliable emotion recognition results.
In addition to cultural differences, other studies have identified challenges related to the dynamic nature of facial expressions. Wang et al. [185] highlighted the limitations of static facial analysis and emphasised the importance of considering temporal dynamics in emotion recognition models. They proposed incorporating temporal information to capture the evolving nature of facial expressions and improve the accuracy of sentiment analysis. Overall, these studies highlight the challenges FSA models face in accounting for cultural differences in facial expressions and effectively analysing student emotions.

4.4.3. Limited Effectiveness in Identifying Subtle Emotions

Face-gesture-based sentiment analysis algorithms encounter difficulties in recognising subtle emotions, particularly those common in online learning, such as boredom or confusion. These emotions are often conveyed through subtle facial expressions that might be challenging for automated systems to identify and analyse. Traditional facial expression recognition methods struggle to effectively capture and analyse these subtle emotions [186]. Studies conducted between 2015 and 2023 revealed new challenges in detecting subtle emotions using face-gesture-based sentiment analysis. For example, Wang Y. et al. [187] investigated the limitations of facial expression recognition algorithms in capturing micro-expressions, which are brief and transient facial expressions that reflect hidden emotions, such as a narrowing of the eyes or a slight wrinkling of the nose indicating anger or frustration. Their findings highlighted the need for more sophisticated techniques to detect and analyse these subtle emotional cues.
Furthermore, another study focused on the difficulties in recognising emotions such as confusion and frustration in the educational context [188]. The authors discovered that these emotions are frequently accompanied by slight changes in facial expressions, making them difficult to identify solely from visual clues. The findings emphasise the necessity of multimodal techniques integrating facial expressions alongside other modalities, such as voice, text, or physiological data, to improve emotion recognition accuracy.
In addition, another study investigated the effect of facial occlusions, such as face masks or partial obstructions, on emotion recognition [189]. They discovered that occlusions could considerably impact the performance of FSA models, especially when learners’ faces are not completely visible. Overcoming these obstacles requires the development of robust systems capable of handling occlusions while still effectively capturing and interpreting facial expressions.
In conclusion, face-gesture-based sentiment analysis algorithms encounter difficulties reliably recognising subtle emotions, especially those observed in online learning contexts. The complexity associated with effectively analysing these emotions is further complicated by limitations in capturing micro-expressions, recognising subtle changes in facial expressions, and accounting for facial occlusions. To address these issues, multimodal methodologies and more sophisticated ways of gathering and analysing subtle emotional cues must be investigated.

4.4.4. Dependence on Lighting and Camera Quality

The accuracy of face-gesture-based sentiment analysis models can be affected by factors such as lighting conditions and the quality of the camera used to capture facial expressions. According to a study, inadequate lighting or low-quality cameras can substantially impact the analysis of facial expressions and emotions, resulting in incorrect findings [190].
Further research examined the effect of lighting and camera quality on face-gesture-based sentiment analysis. For example, a study evaluated the impacts of varying lighting conditions on facial expression recognition [191]. The authors discovered that lighting variations could affect the visibility and intensity of facial characteristics, affecting the accuracy of emotion detection.
Furthermore, a research study investigated the impact of the camera quality on facial expression analysis [192]. The study discovered that low-resolution or noisy images from low-quality cameras might affect accurate emotion recognition. They emphasised the need for high-quality cameras to capture facial expressions to produce consistent and accurate outcomes.
Another study evaluated the effect of head positioning changes on facial emotion detection, lighting, and camera quality. They discovered that abnormal head positions could make capturing facial expressions difficult, resulting in lower performance of the sentiment analysis model [193].
These studies highlight the importance of considering lighting, camera quality, and head pose variations when designing and deploying face-gesture-based sentiment analysis models.

4.4.5. Lack of Real-Time Analysis

Face-gesture-based sentiment analysis models may face challenges in providing real-time analysis of student emotions and sentiment due to the time required for image processing and analysis. The computational demands of analysing facial expressions might delay results, restricting these models’ ability to provide real-time feedback to students and instructors in online learning environments. Extensive research has explored the issue of real-time analysis in face-gesture-based sentiment analysis. Deshmukh et al. [194] evaluated the real-time performance of facial expression recognition algorithms. They discovered that computational demands can slow down real-time analysis, especially when dealing with massive datasets or complicated algorithms. Furthermore, another study investigated real-time emotion recognition using deep learning approaches [195]. Due to the enormous processing requirements of deep learning models, their research emphasised the difficulties in achieving real-time analysis. These studies highlight the difficulties that face-gesture-based sentiment analysis algorithms encounter in delivering real-time analysis of student emotions. While advances in hardware and algorithms continue to improve analysis speed, computational complexity remains a crucial barrier to providing rapid feedback in online learning environments.
Considering the text-based and facial sentiment analysis challenges discussed above, combining facial expressions, voice, written words, and physiological signals, including EEG signals, could help us better understand an individual’s emotions. These various perspectives on emotions can be combined to detect subtle sentiments such as irritation, boredom, or lack of interest. Combining different modalities is challenging; however, multimodal analysis better reflects the complexity of human emotions. Better emotion-aware technology could make educational resources more understandable and beneficial, providing personalised help tailored to each individual’s emotional and cognitive needs.

5. Conclusions and Future Directions

This PRISMA-based systematic literature review and meta-analysis provided valuable insights into applying text-based sentiment analysis and FSA in education, specifically STEM subjects, to assist learners and educators. Our review addressed four primary research questions regarding the application and challenges of sentiment analysis in online learning by reviewing time-relevant articles published between 2015 and 2023.
Regarding RQ1, the review revealed that text-based sentiment analysis has been frequently used in education to improve students’ learning experiences. Several research studies have shown that sentiment analysis can help with student engagement, feedback generation, and personalised learning. Textual data analysis allows instructors to better understand their students’ emotions, perceptions, and learning requirements, enabling more targeted interventions and assistance.
Regarding RQ2, the review identified several issues and limitations related to text-based sentiment analysis in the context of online learning. Significant challenges were identified, including the lack of labelled data, the impact of sarcasm and irony in text, and the requirement for real-time analysis of learners’ utterances. These difficulties demand further research and the development of new techniques to improve the accuracy and efficacy of text-based sentiment analysis in online learning environments.
Moving on to RQ3, the review shed light on the growing popularity of FSA in education and online learning. Studies have shown that analysing facial expressions can help us understand learners’ emotions, engagement levels, and cognitive processes. FSA provides valuable insights for adaptive learning systems, learner modelling, and affective computing, resulting in more personalised and successful educational experiences.
Finally, RQ4 investigated the challenges and limitations of FSA in online learning. The review highlighted concerns about the accurate identification and interpretation of facial expressions, the impact of cultural variations, the detection of subtle emotions, and the influence of lighting and camera quality. These challenges underline the importance of robust algorithms, standardised datasets, and cultural diversity considerations when implementing FSA.
Overall, this systematic literature review provides an in-depth understanding of text-based and FSA applications, challenges, and limitations in the educational domain. The findings demonstrate the potential of sentiment analysis approaches to improve learning experiences and identify areas that require further study and development to address the identified limitations. By addressing these issues, sentiment analysis can continue to play a vital part in improving online learning environments and increasing student engagement, satisfaction, and academic performance.
Despite the insights presented by this review, several important research gaps regarding sentiment analysis in online educational environments remain. To date, no studies have explored real-time sentiment analysis of chat data to provide interventions for learners. Most sentiment analysis models rely on post hoc analysis of textual data, whereas real-time analysis of chat could allow instruction and support to adapt dynamically. Integrating real-time facial expression or gesture recognition with chat sentiment analysis may further enrich the understanding of learner states; however, developing accurate real-time models that respond appropriately remains an open challenge. More research is required to create larger annotated datasets that cover various educational contexts and can be used to train more robust sentiment analysis models. Further work is also needed on multimodal sentiment analysis, which combines textual, visual, and auditory inputs to achieve greater precision.
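To make the real-time chat gap concrete, the sketch below monitors a stream of chat messages with a rolling sentiment window and flags a possible intervention point. The tiny lexicon and the alert threshold are invented for illustration; a real system would use an established tool such as VADER or a fine-tuned transformer classifier.

```python
# Illustrative only: toy lexicon and threshold, not a production model.
LEXICON = {"confused": -1.0, "lost": -0.8, "stuck": -0.8,
           "great": 0.8, "clear": 0.6, "thanks": 0.5}


def score_message(text):
    """Average lexicon score over the words in one chat message."""
    words = text.lower().split()
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0


def monitor(messages, window=3, threshold=-0.3):
    """Yield each message at which the rolling mean sentiment over the
    last `window` messages drops to `threshold` or below -- the point
    where a tutor or adaptive system might intervene."""
    recent = []
    for msg in messages:
        recent.append(score_message(msg))
        recent = recent[-window:]
        if sum(recent) / len(recent) <= threshold:
            yield msg


chat = ["thanks that was clear", "now I am stuck", "totally lost here"]
alerts = list(monitor(chat))  # alert fires on the third message
```

The key difference from post hoc analysis is that the decision is made per message as it arrives, so support can be offered while the learner is still in the session.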
Furthermore, further research should investigate advanced neural network techniques, such as transformers, and how they might improve sentiment classification accuracy for informal textual data. Research should also explore the cultural nuances of expressing and interpreting sentiment, to improve the contextual understanding of sentiment analysis technologies. Such studies can provide more evidence on how these tools affect key outcomes such as learner motivation, engagement, and academic achievement. Ultimately, addressing these research gaps through interdisciplinary efforts will be vital to unlocking the full potential of sentiment analysis for providing personalised and emotionally aware learning at scale.
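The reason transformer models such as BERT handle informal text well is self-attention: every token weighs every other token, so context (including negation and slang) influences each word's representation. The toy, pure-Python sketch below computes scaled dot-product attention for a single query vector; the vectors are invented and far smaller than anything a real model uses.

```python
import math


def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]


def attention(query, keys, values):
    """Scaled dot-product attention for one query vector."""
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted mix of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]


# The query attends most to the key it is most similar to, so the
# output is pulled towards that key's value vector.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]],
                [[1.0, 0.0], [0.0, 1.0]])
```

Stacking many such attention layers (with learned projections) is what lets transformers resolve context-dependent sentiment that lexicon methods miss.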

Author Contributions

Study conception and design: A.A., I.U.R., M.M.N., S.B.A.K. and N.K.; data collection: A.A., I.U.R. and S.B.A.K.; analysis and interpretation of results: A.A., I.U.R., M.M.N. and S.B.A.K.; draft manuscript preparation: A.A., I.U.R., M.M.N., S.B.A.K. and N.K. All authors reviewed the results and approved the final version of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This manuscript is funded by the UK–Saudi Challenge Fund grant from the British Council’s Going Global Partnerships programme. The project title is “IntelliStudent—A Cutting-edge Computer-aided Learning Platform to Augment Online Teaching and Learning Pedagogies: A UK-Saudi Partnership Project”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No private or withheld data were used in this study. All materials and data are reported in the manuscript and can be accessed through the references provided in the bibliography.

Acknowledgments

This paper is a part of “IntelliStudent—A Cutting-edge Computer-aided Learning Platform to Augment Online Teaching and Learning Pedagogies: A UK-Saudi Partnership Project”, which is funded and supported by a UK–Saudi Challenge Fund grant from the British Council’s Going Global Partnerships programme. The programme builds stronger, more inclusive, internationally connected higher education and TVET systems. The authors would also like to acknowledge the School of Computing and Engineering, University of West London (UWL), and the Smart Systems Engineering Laboratory, Prince Sultan University (PSU), for their valuable support.

Conflicts of Interest

The authors declare no conflicts of interest regarding the present study.

Abbreviations

The following abbreviations are used in this manuscript:
PRISMA: Preferred reporting items for systematic reviews and meta-analyses
STEM: Science, technology, engineering, and maths
MOOC: Massive open online course
FSA: Facial sentiment analysis
VR: Virtual reality
AR: Augmented reality
EEG: Electroencephalogram
SA: Sentiment analysis
RQ: Research question
NLP: Natural language processing
ASD: Autism spectrum disorder
NB: Naive Bayes
SVM: Support vector machine
CNNs: Convolutional neural networks
RNNs: Recurrent neural networks
VADER: Valence-aware dictionary for sentiment reasoning
LDA: Latent Dirichlet allocation
BERT: Bidirectional encoder representations from transformers

References

1. Factors, F. E-Learning Market Is Projected to Hit USD 848.12 Billion at a CAGR of 17.54% by 2030—Report by Facts & Factors (FnF). Available online: https://www.globenewswire.com/news-release/2023/02/02/2600283/0/en/E-Learning-Market-is-Projected-to-Hit-USD-848-12-Billion-at-a-CAGR-of-17-54-by-2030-Report-by-Facts-Factors-FnF.html (accessed on 27 May 2023).
2. Acharya, S.; Reza, M. Real-time emotion engagement tracking of students using human biometric emotion intensities. In Machine Learning for Biometrics; Elsevier: Amsterdam, The Netherlands, 2022; pp. 143–153.
3. Castiblanco Jimenez, I.A.; Gomez Acevedo, J.S.; Marcolin, F.; Vezzetti, E.; Moos, S. Towards an integrated framework to measure user engagement with interactive or physical products. Int. J. Interact. Des. Manuf. (IJIDeM) 2023, 17, 45–67.
4. Slavich, G.M.; Zimbardo, P.G. Transformational teaching: Theoretical underpinnings, basic principles, and core methods. Educ. Psychol. Rev. 2012, 24, 569–608.
5. Sobnath, D.; Kaduk, T.; Rehman, I.U.; Isiaq, O. Feature selection for UK disabled students’ engagement post higher education: A machine learning approach for a predictive employment model. IEEE Access 2020, 8, 159530–159541.
6. Haq, I.U.; Anwar, A.; Rehman, I.U.; Asif, W.; Sobnath, D.; Sherazi, H.H.R.; Nasralla, M.M. Dynamic group formation with intelligent tutor collaborative learning: A novel approach for next generation collaboration. IEEE Access 2021, 9, 143406–143422.
7. Plump, C.M.; LaRosa, J. Using Kahoot! in the classroom to create engagement and active learning: A game-based technology solution for eLearning novices. Manag. Teach. Rev. 2017, 2, 151–158.
8. Marechal, C.; Mikolajewski, D.; Tyburek, K.; Prokopowicz, P.; Bougueroua, L.; Ancourt, C.; Wegrzyn-Wolska, K. Survey on AI-Based Multimodal Methods for Emotion Detection. High-Perform. Model. Simul. Big Data Appl. 2019, 11400, 307–324.
9. Fraiwan, M.; Alafeef, M.; Almomani, F. Gauging human visual interest using multiscale entropy analysis of EEG signals. J. Ambient. Intell. Humaniz. Comput. 2021, 12, 2435–2447.
10. Ferri, F.; Grifoni, P.; Guzzo, T. Online learning and emergency remote teaching: Opportunities and challenges in emergency situations. Societies 2020, 10, 86.
11. Feidakis, M.; Daradoumis, T.; Caballã, S.; Conesa, J. Embedding emotion awareness into e-learning environments. Int. J. Emerg. Technol. Learn. (iJET) 2014, 9, 39–46.
12. Maxey, K.; Norman, D. Blended Learning in Tennessee: How Secondary Teachers’ Perceptions Impact Levels of Implementation, Student Engagement, and Student Achievement. Ph.D. Thesis, Lipscomb University, Nashville, TN, USA, 2019.
13. Nazir, A.; Rao, Y.; Wu, L.; Sun, L. Issues and challenges of aspect-based sentiment analysis: A comprehensive survey. IEEE Trans. Affect. Comput. 2020, 13, 845–863.
14. Filatova, E. Irony and Sarcasm: Corpus Generation and Analysis Using Crowdsourcing. In Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC’12), Istanbul, Turkey, May 2012; pp. 392–398.
15. Fei, Z.; Yang, E.; Li, D.D.U.; Butler, S.; Ijomah, W.; Li, X.; Zhou, H. Deep convolution network based emotion analysis towards mental health care. Neurocomputing 2020, 388, 212–227.
16. Oghu, E.; Ogbuju, E.; Abiodun, T.; Oladipo, F. A review of sentiment analysis approaches for quality assurance in teaching and learning. Bull. Soc. Inform. Theory Appl. 2022, 6, 177–188.
17. Dolianiti, F.S.; Iakovakis, D.; Dias, S.B.; Hadjileontiadou, S.; Diniz, J.A.; Hadjileontiadis, L. Sentiment analysis techniques and applications in education: A survey. In Proceedings of the International Conference on Technology and Innovation in Learning, Teaching and Education, Thessaloniki, Greece, 20–22 June 2018; pp. 412–427.
18. Zhou, J.; Ye, J.M. Sentiment analysis in education research: A review of journal publications. Interact. Learn. Environ. 2023, 31, 1252–1264.
19. Kastrati, Z.; Dalipi, F.; Imran, A.S.; Pireva Nuci, K.; Wani, M.A. Sentiment analysis of students’ feedback with NLP and deep learning: A systematic mapping study. Appl. Sci. 2021, 11, 3986.
20. Wankhade, M.; Rao, A.C.S.; Kulkarni, C. A survey on sentiment analysis methods, applications, and challenges. Artif. Intell. Rev. 2022, 55, 5731–5780.
21. Babu, N.V.; Kanaga, E.G.M. Sentiment analysis in social media data for depression detection using artificial intelligence: A review. SN Comput. Sci. 2022, 3, 1–20.
22. Gaikar, D.D.; Marakarkandy, B.; Dasgupta, C. Using Twitter data to predict the performance of Bollywood movies. Ind. Manag. Data Syst. 2015, 115, 1604–1621.
23. Rehman, I.U.; Sobnath, D.; Nasralla, M.M.; Winnett, M.; Anwar, A.; Asif, W.; Sherazi, H.H.R. Features of mobile apps for people with autism in a post COVID-19 scenario: Current status and recommendations for apps using AI. Diagnostics 2021, 11, 1923.
24. Anwar, A.; Rehman, I.U.; Husamaldin, L.; Ijaz-ul-Haq. Smart Education for People with Disabilities (PwDs): Conceptual Framework for PwDs Emotions Classification from Student Utterances (SUs) during Online Learning. In Proceedings of the 2022 IEEE International Smart Cities Conference (ISC2), Pafos, Cyprus, 26–29 September 2022; pp. 1–7.
25. Behdenna, S.; Barigou, F.; Belalem, G. Document level sentiment analysis: A survey. EAI Endorsed Trans.-Context-Aware Syst. Appl. 2018, 4, e2.
26. Fawzy, M.; Fakhr, M.W.; Rizka, M.A. Word Embeddings and Neural Network Architectures for Arabic Sentiment Analysis. In Proceedings of the 2020 16th International Computer Engineering Conference (ICENCO), Cairo, Egypt, 29–30 December 2020; pp. 92–96.
27. Tembhurne, J.V.; Diwan, T. Sentiment analysis in textual, visual and multimodal inputs using recurrent neural networks. Multimed. Tools Appl. 2021, 80, 6871–6910.
28. Dang, N.C.; Moreno-García, M.N.; De la Prieta, F. Sentiment analysis based on deep learning: A comparative study. Electronics 2020, 9, 483.
29. Shaik, T.; Tao, X.; Li, Y.; Dann, C.; McDonald, J.; Redmond, P.; Galligan, L. A review of the trends and challenges in adopting natural language processing methods for education feedback analysis. IEEE Access 2022, 10, 56720–56739.
30. Yu, L.C.; Lee, L.H.; Hao, S.; Wang, J.; He, Y.; Hu, J.; Lai, K.R.; Zhang, X. Building Chinese affective resources in valence-arousal dimensions. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, CA, USA, 12–17 June 2016; pp. 540–545.
31. Asghar, M.Z.; Khan, A.; Ahmad, S.; Qasim, M.; Khan, I.A. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme. PLoS ONE 2017, 12, e0171649.
32. Deng, S.; Sinha, A.P.; Zhao, H. Adapting sentiment lexicons to domain-specific social media texts. Decis. Support Syst. 2017, 94, 65–76.
33. Mathapati, S.; Manjula, S.; Venugopal, K. Sentiment analysis and opinion mining from social media: A review. Glob. J. Comput. Sci. Technol. 2016, 6, 2017.
34. Denecke, K. Using sentiwordnet for multilingual sentiment analysis. In Proceedings of the 2008 IEEE 24th International Conference on Data Engineering Workshop, Cancun, Mexico, 7–12 April 2008; pp. 507–512.
35. Guerini, M.; Gatti, L.; Turchi, M. Sentiment analysis: How to derive prior polarities from SentiWordNet. arXiv 2013, arXiv:1309.5843.
36. Khan, F.H.; Qamar, U.; Bashir, S. A semi-supervised approach to sentiment analysis using revised sentiment strength based on SentiWordNet. Knowl. Inf. Syst. 2017, 51, 851–872.
37. Elbagir, S.; Yang, J. Twitter sentiment analysis using natural language toolkit and VADER sentiment. In Proceedings of the International Multiconference of Engineers and Computer Scientists, Hong Kong, China, 13–15 March 2019; Volume 122, p. 16.
38. Tymann, K.; Lutz, M.; Palsbröker, P.; Gips, C. GerVADER-A German Adaptation of the VADER Sentiment Analysis Tool for Social Media Texts. In Proceedings of the LWDA, Berlin, Germany, 30 September–2 October 2019; pp. 178–189.
39. Bisio, F.; Meda, C.; Gastaldo, P.; Zunino, R.; Cambria, E. Concept-level sentiment analysis with SenticNet. In A Practical Guide to Sentiment Analysis; Springer: Berlin/Heidelberg, Germany, 2017; pp. 173–188.
40. Hung, C.; Wu, W.R.; Chou, H.M. Improvement of sentiment analysis via re-evaluation of objective words in SenticNet for hotel reviews. Lang. Resour. Eval. 2021, 55, 585–595.
41. Hossain, M.S.; Rahman, M.F. Customer sentiment analysis and prediction of insurance products’ reviews using machine learning approaches. FIIB Bus. Rev. 2022, 23197145221115793.
42. Bakar, N.S.A.A.; Rahmat, R.A.; Othman, U.F. Polarity classification tool for sentiment analysis in Malay language. IAES Int. J. Artif. Intell. 2019, 8, 259.
43. Mohammad, S.M.; Turney, P.D. Nrc emotion lexicon. Natl. Res. Counc. 2013, 2, 234.
44. Tabak, F.S.; Evrim, V. Comparison of emotion lexicons. In Proceedings of the 2016 HONET-ICT, Nicosia, Cyprus, 13–14 October 2016; pp. 154–158.
45. De Smedt, T.; Daelemans, W. Pattern for python. J. Mach. Learn. Res. 2012, 13, 2063–2067.
46. Gatti, L.; van Stegeren, J. Improving Dutch sentiment analysis in Pattern. Comput. Linguist. Neth. J. 2020, 10, 73–89.
47. Gujjar, J.P.; Kumar, H.P. Sentiment analysis: Textblob for decision making. Int. J. Sci. Res. Eng. Trends 2021, 7, 1097–1099.
48. Bose, R.; Aithal, P.; Roy, S. Sentiment analysis on the basis of tweeter comments of application of drugs by customary language toolkit and textblob opinions of distinct countries. Int. J. 2020, 8, 3684–3696.
49. Wan, Y.; Gao, Q. An ensemble sentiment classification system of twitter data for airline services analysis. In Proceedings of the 2015 IEEE International Conference on Data Mining Workshop (ICDMW), Atlantic City, NJ, USA, 14–17 November 2015; pp. 1318–1325.
50. Alsayat, A. Customer decision-making analysis based on big social data using machine learning: A case study of hotels in Mecca. Neural Comput. Appl. 2023, 35, 4701–4722.
51. Tang, D.; Qin, B.; Liu, T. Deep learning for sentiment analysis: Successful approaches and future challenges. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2015, 5, 292–303.
52. Dey, L.; Chakraborty, S.; Biswas, A.; Bose, B.; Tiwari, S. Sentiment analysis of review datasets using naive bayes and k-nn classifier. arXiv 2016, arXiv:1610.09982.
53. Goel, A.; Gautam, J.; Kumar, S. Real time sentiment analysis of tweets using Naive Bayes. In Proceedings of the 2016 2nd International Conference on Next Generation Computing Technologies (NGCT), Dehradun, India, 14–16 October 2016; pp. 257–261.
54. Ahmad, M.; Aftab, S.; Ali, I. Sentiment analysis of tweets using svm. Int. J. Comput. Appl. 2017, 177, 25–29.
55. Ahmad, M.; Aftab, S.; Bashir, M.S.; Hameed, N.; Ali, I.; Nawaz, Z. SVM optimization for sentiment analysis. Int. J. Adv. Comput. Sci. Appl. 2018, 9, 393–398.
56. Al Amrani, Y.; Lazaar, M.; El Kadiri, K.E. Random forest and support vector machine based hybrid approach to sentiment analysis. Procedia Comput. Sci. 2018, 127, 511–520.
57. Fauzi, M.A. Random Forest Approach fo Sentiment Analysis in Indonesian. Indones. J. Electr. Eng. Comput. Sci. 2018, 12, 46–50.
58. Liao, S.; Wang, J.; Yu, R.; Sato, K.; Cheng, Z. CNN for situations understanding based on sentiment analysis of twitter data. Procedia Comput. Sci. 2017, 111, 376–381.
59. Feng, Y.; Cheng, Y. Short text sentiment analysis based on multi-channel CNN with multi-head attention mechanism. IEEE Access 2021, 9, 19854–19863.
60. Murthy, G.; Allu, S.R.; Andhavarapu, B.; Bagadi, M.; Belusonti, M. Text based sentiment analysis using LSTM. Int. J. Eng. Res. Tech. Res. 2020, 9.
61. Long, F.; Zhou, K.; Ou, W. Sentiment analysis of text based on bidirectional LSTM with multi-head attention. IEEE Access 2019, 7, 141960–141969.
62. Borele, P.; Borikar, D.A. An approach to sentiment analysis using artificial neural network with comparative analysis of different techniques. IOSR J. Comput. Eng. (IOSR-JCE) 2016, 18, 64–69.
63. Sharma, D.; Sabharwal, M.; Goyal, V.; Vij, M. Sentiment analysis techniques for social media data: A review. In Proceedings of the First International Conference on Sustainable Technologies for Computational Intelligence: Proceedings of ICTSCI 2019, Jaipur, India, 29–30 March 2019; pp. 75–90.
64. Hoang, M.; Bihorac, O.A.; Rouces, J. Aspect-based sentiment analysis using bert. In Proceedings of the 22nd Nordic Conference on Computational Linguistics, Turku, Finland, 30 September–2 October 2019; pp. 187–196.
65. Zhao, L.; Li, L.; Zheng, X.; Zhang, J. A BERT based sentiment analysis and key entity detection approach for online financial texts. In Proceedings of the 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD), Dalian, China, 5–7 May 2021; pp. 1233–1238.
66. Mohan, M.; Abhinav, A.; Ashok, A.; Akhil, A.; Achinth, P. Depression Detection using Facial Expression and Sentiment Analysis. In Proceedings of the 2021 Asian Conference on Innovation in Technology (ASIANCON), Pune, India, 27–29 August 2021; pp. 1–6.
67. Altuwairqi, K.; Jarraya, S.K.; Allinjawi, A.; Hammami, M. Student behavior analysis to measure engagement levels in online learning environments. Signal Image Video Process. 2021, 15, 1387–1395.
68. Ming, Y.; Qian, H.; Guangyuan, L. CNN-LSTM Facial Expression Recognition Method Fused with Two-Layer Attention Mechanism. Comput. Intell. Neurosci. 2022, 2022, 7450637.
69. Guo, W.; Xu, Z.; Guo, Z.; Mao, L.; Hou, Y.; Huang, Z. Pain Assessment Using Facial Action Units and Bayesian Network. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; pp. 4665–4670.
70. Mollahosseini, A.; Hasani, B.; Mahoor, M.H. Affectnet: A database for facial expression, valence, and arousal computing in the wild. IEEE Trans. Affect. Comput. 2017, 10, 18–31.
71. Adnan, M.M.; Rahim, M.S.M.; Rehman, A.; Mehmood, Z.; Saba, T.; Naqvi, R.A. Automatic image annotation based on deep learning models: A systematic review and future challenges. IEEE Access 2021, 9, 50253–50264.
72. Angadi, S.; Reddy, V.S. Multimodal sentiment analysis using reliefF feature selection and random forest classifier. Int. J. Comput. Appl. 2021, 43, 931–939.
73. Zhou, L.; Li, W.; Du, Y.; Lei, B.; Liang, S. Adaptive illumination-invariant face recognition via local nonlinear multi-layer contrast feature. J. Vis. Commun. Image Represent. 2019, 64, 102641.
74. Podder, T.; Bhattacharya, D.; Majumdar, A. Time efficient real time facial expression recognition with CNN and transfer learning. Sādhanā 2022, 47, 177.
75. Heredia, J.; Cardinale, Y.; Dongo, I.; Díaz-Amado, J. A multi-modal visual emotion recognition method to instantiate an ontology. In Proceedings of the 16th International Conference on Software Technologies, online, 6–8 July 2021; SCITEPRESS-Science and Technology Publications: Setubal, Portugal, 2021; pp. 453–464.
76. Kirana, K.C.; Wibawanto, S.; Herwanto, H.W. Facial emotion recognition based on viola-jones algorithm in the learning environment. In Proceedings of the 2018 International Seminar on Application for Technology of Information and Communication, Semarang, Indonesia, 21–22 September 2018; pp. 406–410.
77. Paul, T.; Shammi, U.A.; Ahmed, M.U.; Rahman, R.; Kobashi, S.; Ahad, M.A.R. A study on face detection using Viola-Jones algorithm in various backgrounds, angles and distances. Int. J. Biomed. Soft Comput. Hum. Sci. Off. J. Biomed. Fuzzy Syst. Assoc. 2018, 23, 27–36.
78. Hirzi, M.F.; Efendi, S.; Sembiring, R.W. Literature Study of Face Recognition using The Viola-Jones Algorithm. In Proceedings of the 2021 International Conference on Artificial Intelligence and Mechatronics Systems (AIMS), Bandung, Indonesia, 28–30 April 2021; pp. 1–6.
79. Suhaimin, M.S.M.; Hijazi, M.H.A.; Kheau, C.S.; On, C.K. Real-time mask detection and face recognition using eigenfaces and local binary pattern histogram for attendance system. Bull. Electr. Eng. Inform. 2021, 10, 1105–1113.
80. Mukhopadhyay, S.; Sharma, S. Real time facial expression and emotion recognition using eigen faces, LBPH and fisher algorithms. In Proceedings of the 2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence), Noida, India, 29–31 January 2020; pp. 212–220.
81. Happy, S.; George, A.; Routray, A. A real time facial expression classification system using local binary patterns. In Proceedings of the 2012 4th International Conference on Intelligent Human Computer Interaction (IHCI), Kharagpur, India, 27–29 December 2012; pp. 1–5.
82. Shan, C.; Gong, S.; McOwan, P.W. Facial expression recognition based on local binary patterns: A comprehensive study. Image Vis. Comput. 2009, 27, 803–816.
83. Zhao, X.; Zhang, S. Facial expression recognition using local binary patterns and discriminant kernel locally linear embedding. EURASIP J. Adv. Signal Process. 2012, 2012, 20.
84. Hegde, N.; Preetha, S.; Bhagwat, S. Facial Expression Classifier Using Better Technique: FisherFace Algorithm. In Proceedings of the 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Bangalore, India, 19–22 September 2018; pp. 604–610.
85. Kandhro, I.A.; Uddin, M.; Hussain, S.; Chaudhery, T.J.; Shorfuzzaman, M.; Meshref, H.; Albalhaq, M.; Alsaqour, R.; Khalaf, O.I. Impact of Activation, Optimization, and Regularization Methods on the Facial Expression Model Using CNN. Comput. Intell. Neurosci. 2022, 2022, 3098604.
86. Refat, M.A.R.; Sarker, S.; Kaushal, C.; Kaur, A.; Islam, M.K. WhyMyFace: A Novel Approach to Recognize Facial Expressions Using CNN and Data Augmentations. In Emerging Technologies in Data Mining and Information Security: Proceedings of IEMIS 2022, Volume 3; Springer: Berlin/Heidelberg, Germany, 2022; pp. 553–563.
87. Jang, G.J.; Park, J.S.; Jo, A.; Kim, J.H. Facial emotion recognition using active shape models and statistical pattern recognizers. In Proceedings of the 2014 Ninth International Conference on Broadband and Wireless Computing, Communication and Applications, Guangdong, China, 8–10 November 2014; pp. 514–517.
88. Shbib, R.; Zhou, S. Facial expression analysis using active shape model. Int. J. Signal Process. Image Process. Pattern Recognit. 2015, 8, 9–22.
89. Ratliff, M.S.; Patterson, E. Emotion recognition using facial expressions with active appearance models. In Proceedings of the Third IASTED International Conference on Human Computer Interaction, Innsbruck, Austria, 17–19 March 2008.
90. Martin, C.; Werner, U.; Gross, H.M. A real-time facial expression recognition system based on active appearance models using gray images and edge images. In Proceedings of the 2008 8th IEEE International Conference on Automatic Face & Gesture Recognition, Amsterdam, The Netherlands, 17–19 September 2008; pp. 1–6.
91. Zadeh, A.; Chong Lim, Y.; Baltrusaitis, T.; Morency, L.P. Convolutional experts constrained local model for 3d facial landmark detection. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 2519–2528.
92. Hamm, J.; Kohler, C.G.; Gur, R.C.; Verma, R. Automated facial action coding system for dynamic analysis of facial expressions in neuropsychiatric disorders. J. Neurosci. Methods 2011, 200, 237–256.
93. Skiendziel, T.; Rösch, A.G.; Schultheiss, O.C. Assessing the convergent validity between the automated emotion recognition software Noldus FaceReader 7 and Facial Action Coding System Scoring. PLoS ONE 2019, 14, e0223905.
94. Carcagnì, P.; Del Coco, M.; Leo, M.; Distante, C. Facial expression recognition and histograms of oriented gradients: A comprehensive study. SpringerPlus 2015, 4, 1–25.
95. Azeem, A.; Sharif, M.; Shah, J.; Raza, M. Hexagonal scale invariant feature transform (H-SIFT) for facial feature extraction. J. Appl. Res. Technol. 2015, 13, 402–408.
96. Karthikeyan, C.; Jabber, B.; Deepak, V.; Vamsidhar, E. Image Processing based Improved Face Recognition for Mobile Devices by using Scale-Invariant Feature Transform. In Proceedings of the 2020 International Conference on Inventive Computation Technologies (ICICT), Coimbatore, India, 26–28 February 2020; pp. 716–722.
97. Kallipolitis, A.; Galliakis, M.; Menychtas, A.; Maglogiannis, I. Emotion analysis in hospital bedside infotainment platforms using speeded up robust features. In Proceedings of the Artificial Intelligence Applications and Innovations: 15th IFIP WG 12.5 International Conference, AIAI 2019, Hersonissos, Crete, Greece, 24–26 May 2019; pp. 127–138.
98. Madupu, R.K.; Kothapalli, C.; Yarra, V.; Harika, S.; Basha, C.Z. Automatic human emotion recognition system using facial expressions with convolution neural network. In Proceedings of the 2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA), Coimbatore, India, 5–7 November 2020; pp. 1179–1183.
99. Apicella, A.; Arpaia, P.; Frosolone, M.; Improta, G.; Moccaldi, N.; Pollastro, A. EEG-based measurement system for monitoring student engagement in learning 4.0. Sci. Rep. 2022, 12, 5857.
100. Ladino Nocua, A.C.; Cruz Gonzalez, J.P.; Castiblanco Jimenez, I.A.; Gomez Acevedo, J.S.; Marcolin, F.; Vezzetti, E. Assessment of cognitive student engagement using heart rate data in distance learning during COVID-19. Educ. Sci. 2021, 11, 540.
101. Shah, S.M.A.; Usman, S.M.; Khalid, S.; Rehman, I.U.; Anwar, A.; Hussain, S.; Ullah, S.S.; Elmannai, H.; Algarni, A.D.; Manzoor, W. An ensemble model for consumer emotion prediction using EEG signals for neuromarketing applications. Sensors 2022, 22, 9744.
102. Selçuk, A.A. A guide for systematic reviews: PRISMA. Turk. Arch. Otorhinolaryngol. 2019, 57, 57.
103. Nguyen, V.D.; Van Nguyen, K.; Nguyen, N.L.T. Variants of long short-term memory for sentiment analysis on Vietnamese students’ feedback corpus. In Proceedings of the 2018 10th International Conference on Knowledge and Systems Engineering (KSE), Ho Chi Minh City, Vietnam, 1–3 November 2018; pp. 306–311.
104. Estrada, M.L.B.; Cabada, R.Z.; Bustillos, R.O.; Graff, M. Opinion mining and emotion recognition applied to learning environments. Expert Syst. Appl. 2020, 150, 113265.
105. Hew, K.F.; Hu, X.; Qiao, C.; Tang, Y. What predicts student satisfaction with MOOCs: A gradient boosting trees supervised machine learning and sentiment analysis approach. Comput. Educ. 2020, 145, 103724.
106. Onan, A. Sentiment analysis on massive open online course evaluations: A text mining and deep learning approach. Comput. Appl. Eng. Educ. 2021, 29, 572–589.
107. Nkomo, L.M.; Ndukwe, I.G.; Daniel, B.K. Social network and sentiment analysis: Investigation of students’ perspectives on lecture recording. IEEE Access 2020, 8, 228693–228701.
108. Rääf, S.A.; Knöös, J.; Dalipi, F.; Kastrati, Z. Investigating learning experience of MOOCs learners using topic modeling and sentiment analysis. In Proceedings of the 2021 19th International Conference on Information Technology Based Higher Education and Training (ITHET), Sydney, Australia, 4–6 November 2021; pp. 1–7.
109. Mujahid, M.; Lee, E.; Rustam, F.; Washington, P.B.; Ullah, S.; Reshi, A.A.; Ashraf, I. Sentiment analysis and topic modeling on tweets about online education during COVID-19. Appl. Sci. 2021, 11, 8438.
110. Osmanoğlu, U.Ö.; Atak, O.N.; Çağlar, K.; Kayhan, H.; Talat, C. Sentiment analysis for distance education course materials: A machine learning approach. J. Educ. Technol. Online Learn. 2020, 3, 31–48.
111. Vedavathi, N.; KM, A.K. E-learning course recommendation based on sentiment analysis using hybrid Elman similarity. Knowl.-Based Syst. 2023, 259, 110086.
112. Tan, L.; Tan, O.K.; Sze, C.C.; Goh, W.W.B. Emotional Variance Analysis: A new sentiment analysis feature set for Artificial Intelligence and Machine Learning applications. PLoS ONE 2023, 18, e0274299.
113. Sadigov, R.; Yıldırım, E.; Kocaçınar, B.; Patlar Akbulut, F.; Catal, C. Deep learning-based user experience evaluation in distance learning. Clust. Comput. 2023, 26, 1–13.
114. Kathuria, A.; Gupta, A.; Singla, R. AOH-Senti: Aspect-Oriented Hybrid Approach to Sentiment Analysis of Students’ Feedback. SN Comput. Sci. 2023, 4, 152.
115. Crossley, S.; McNamara, D.S.; Baker, R.; Wang, Y.; Paquette, L.; Barnes, T.; Bergner, Y. Language to Completion: Success in an Educational Data Mining Massive Open Online Class. In Proceedings of the 7th Annual Conference on Educational Data Mining, EDM2015, Madrid, Spain, 26–29 June 2015.
116. Kagklis, V.; Karatrantou, A.; Tantoula, M.; Panagiotakopoulos, C.T.; Verykios, V.S. A learning analytics methodology for detecting sentiment in student fora: A Case Study in Distance Education. Eur. J. Open Distance e-Learn. 2015, 18, 74–94.
117. Fatima, N.; Rahman, M.A. An Enhanced Decision Support System through Mining of Teachers Online Chat Data. Int. J. Electron. Eng. 2018, 10, 426–436.
118. Almalki, J. A machine learning-based approach for sentiment analysis on distance learning from Arabic Tweets. PeerJ Comput. Sci. 2022, 8, e1047.
119. Asmita, S.; Anuja, T.; Ash, D. Analysis of student feedback using deep learning. Int. J. Comput. Appl. Technol. Res. 2019, 8, 161–164.
120. Rajput, Q.; Haider, S.; Ghani, S. Lexicon-based sentiment analysis of teachers’ evaluation. Appl. Comput. Intell. Soft Comput. 2016, 2016, 2385429.
121. Nasim, Z.; Rajput, Q.; Haider, S. Sentiment analysis of student feedback using machine learning and lexicon based approaches. In Proceedings of the 2017 International Conference On Research And Innovation In Information Systems (ICRIIS), Langkawi, Malaysia, 16–17 July 2017; pp. 1–6.
  122. Santos, C.L.; Rita, P.; Guerreiro, J. Improving international attractiveness of higher education institutions based on text mining and sentiment analysis. Int. J. Educ. Manag. 2018, 32, 431–447. [Google Scholar] [CrossRef]
  123. Kandhro, I.A.; Chhajro, M.A.; Kumar, K.; Lashari, H.N.; Khan, U. Student feedback sentiment analysis model using various machine learning schemes: A review. Indian J. Sci. Technol. 2019, 12, 1–9. [Google Scholar] [CrossRef]
  124. Jena, R. Sentiment mining in a collaborative learning environment: Capitalising on big data. Behav. Inf. Technol. 2019, 38, 986–1001. [Google Scholar] [CrossRef]
  125. Shen, C.W.; Kuo, C.J. Learning in massive open online courses: Evidence from social media mining. Comput. Hum. Behav. 2015, 51, 568–577. [Google Scholar] [CrossRef]
  126. Aung, K.Z.; Myo, N.N. Sentiment analysis of students’ comment using lexicon based approach. In Proceedings of the 2017 IEEE/ACIS 16th International Conference on Computer and Information Science (ICIS), Wuhan, China, 24–26 May 2017; pp. 149–154. [Google Scholar]
  127. Tanwani, N.; Kumar, S.; Jalbani, A.H.; Soomro, S.; Channa, M.I.; Nizamani, Z. Student opinion mining regarding educational system using facebook group. In Proceedings of the 2017 First International Conference on Latest trends in Electrical Engineering and Computing Technologies (INTELLECT), Karachi, Pakistan, 15–16 November 2017; pp. 1–5. [Google Scholar]
  128. Sindhu, I.; Daudpota, S.M.; Badar, K.; Bakhtyar, M.; Baber, J.; Nurunnabi, M. Aspect-based opinion mining on student’s feedback for faculty teaching performance evaluation. IEEE Access 2019, 7, 108729–108741. [Google Scholar] [CrossRef]
  129. Kastrati, Z.; Imran, A.S.; Kurti, A. Weakly supervised framework for aspect-based sentiment analysis on students’ reviews of MOOCs. IEEE Access 2020, 8, 106799–106810. [Google Scholar] [CrossRef]
  130. Alassaf, M.; Qamar, A.M. Aspect-based sentiment analysis of Arabic tweets in the education sector using a hybrid feature selection method. In Proceedings of the 2020 14th International conference on innovations in information technology (IIT), Al Ain, United Arab Emirates, 17–18 November 2020; pp. 178–185. [Google Scholar]
  131. Zhai, G.; Yang, Y.; Wang, H.; Du, S. Multi-attention fusion modeling for sentiment analysis of educational big data. Big Data Min. Anal. 2020, 3, 311–319. [Google Scholar] [CrossRef]
  132. Kastrati, Z.; Arifaj, B.; Lubishtani, A.; Gashi, F.; Nishliu, E. Aspect-Based Opinion Mining of Students’ Reviews on Online Courses. In Proceedings of the 2020 6th International Conference on Computing and Artificial Intelligence, Tianjin, China, 25–27 May 2022; pp. 510–514. [Google Scholar]
  133. Balachandran, L.; Kirupananda, A. Online reviews evaluation system for higher education institution: An aspect based sentiment analysis tool. In Proceedings of the 2017 11th International Conference on Software, Knowledge, Information Management and Applications (SKIMA), Malabe, Sri Lanka, 6–8 December 2017; pp. 1–7. [Google Scholar]
  134. Heryadi, Y.; Wijanarko, B.D.; Murad, D.F.; Tho, C.; Hashimoto, K. Aspect-based Sentiment Analysis for Improving Online Learning Program Based on Student Feedback. In Proceedings of the 2022 IEEE International Conference on Cybernetics and Computational Intelligence (CyberneticsCom), Malang, Indonesia, 16–18 June 2022; pp. 505–509. [Google Scholar]
  135. Zhao, A.; Yu, Y. Knowledge-enabled BERT for aspect-based sentiment analysis. Knowl.-Based Syst. 2021, 227, 107220. [Google Scholar] [CrossRef]
  136. Schurig, T.; Zambach, S.; Mukkamala, R.R.; Petry, M. Aspect-Based Sentiment Analysis for University Teaching Analytics. ECIS 2022 Research Papers, 2022. Available online: https://aisel.aisnet.org/ecis2022_rp/13 (accessed on 18 July 2023).
  137. Ramesh, A.; Kumar, S.H.; Foulds, J.; Getoor, L. Weakly supervised models of aspect-sentiment for online course discussion forums. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Beijing, China, 26–31 July 2015; pp. 74–83. [Google Scholar]
  138. Sivakumar, M.; Reddy, U.S. Aspect based sentiment analysis of students opinion using machine learning techniques. In Proceedings of the 2017 International Conference on Inventive Computing and Informatics (ICICI), Coimbatore, India, 23–24 November 2017; pp. 726–731. [Google Scholar]
  139. Ismail, H.; Khalil, A.; Hussein, N.; Elabyad, R. Triggers and Tweets: Implicit Aspect-Based Sentiment and Emotion Analysis of Community Chatter Relevant to Education Post-COVID-19. Big Data Cogn. Comput. 2022, 6, 99. [Google Scholar] [CrossRef]
  140. Phillips, T.; Saleh, A.; Glazewski, K.D.; Hmelosilver, C.E.; Lee, S.; Mott, B.; Lester, J.C. Comparing Natural Language Processing Methods for Text Classification of Small Educational Data. In Proceedings of the Companion Proceedings 11th International Conference on Learning Analytics & Knowledge, Irvine, CA, USA, 12–16 April 2021. [Google Scholar]
  141. Cao, Y.; Zhang, P.; Xiong, A. Sentiment analysis based on expanded aspect and polarity-ambiguous word lexicon. Int. J. Adv. Comput. Sci. Appl. 2015, 6, 97–103. [Google Scholar] [CrossRef]
  142. Piryani, R.; Madhavi, D.; Singh, V.K. Analytical mapping of opinion mining and sentiment analysis research during 2000–2015. Inf. Process. Manag. 2017, 53, 122–150. [Google Scholar] [CrossRef]
  143. Bhargava, S.; Choudhary, S. Behavioral analysis of depressed sentimental over twitter: Based on supervised machine learning approach. In Proceedings of the 3rd International Conference on Internet of Things and Connected Technologies (ICIoTCT), Jaipur, India, 26–27 March 2018; pp. 26–27. [Google Scholar]
  144. Catal, C.; Nangir, M. A sentiment classification model based on multiple classifiers. Appl. Soft Comput. 2017, 50, 135–141. [Google Scholar] [CrossRef]
  145. Shu, K.; Sliva, A.; Wang, S.; Tang, J.; Liu, H. Fake news detection on social media: A data mining perspective. ACM SIGKDD Explor. Newsl. 2017, 19, 22–36. [Google Scholar] [CrossRef]
  146. Chen, T.; Li, X.; Yin, H.; Zhang, J. Call attention to rumors: Deep attention based recurrent neural networks for early rumor detection. In Proceedings of the Trends and Applications in Knowledge Discovery and Data Mining: PAKDD 2018 Workshops, BDASC, BDM, ML4Cyber, PAISI, DaMEMO, Melbourne, VIC, Australia, 3 June 2018; Revised Selected Papers 22. pp. 40–52. [Google Scholar]
  147. Sayyad, S.; Kumar, S.; Bongale, A.; Kamat, P.; Patil, S.; Kotecha, K. Data-driven remaining useful life estimation for milling process: Sensors, algorithms, datasets, and future directions. IEEE Access 2021, 9, 110255–110286. [Google Scholar] [CrossRef]
  148. Varol, O.; Ferrara, E.; Davis, C.; Menczer, F.; Flammini, A. Online human-bot interactions: Detection, estimation, and characterization. In Proceedings of the International AAAI Conference on Web and Social Media, Montreal, QC, Canada, 15–18 May 2017; Volume 11, pp. 280–289. [Google Scholar]
  149. Arora, M.; Kansal, V. Character level embedding with deep convolutional neural network for text normalization of unstructured data for Twitter sentiment analysis. Soc. Netw. Anal. Min. 2019, 9, 1–14. [Google Scholar] [CrossRef]
  150. Al-Natour, S.; Turetken, O. A comparative assessment of sentiment analysis and star ratings for consumer reviews. Int. J. Inf. Manag. 2020, 54, 102132. [Google Scholar] [CrossRef]
  151. Ghiassi, M.; Skinner, J.; Zimbra, D. Twitter brand sentiment analysis: A hybrid system using n-gram analysis and dynamic artificial neural network. Expert Syst. Appl. 2013, 40, 6266–6282. [Google Scholar] [CrossRef]
  152. Sedova, K.; Sedlacek, M.; Svaricek, R.; Majcik, M.; Navratilova, J.; Drexlerova, A.; Kychler, J.; Salamounova, Z. Do those who talk more learn more? The relationship between student classroom talk and student achievement. Learn. Instr. 2019, 63, 101217. [Google Scholar] [CrossRef]
  153. Hinkle, C.M.; Koretsky, M.D. Toward professional practice: Student learning opportunities through participation in engineering clubs. Eur. J. Eng. Educ. 2019, 44, 906–922. [Google Scholar] [CrossRef]
  154. Lakshmi, K.N.; Reddy, Y.K.; Kireeti, M.; Swathi, T.; Ismail, M. Design and implementation of student chat bot using AIML and LSA. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 1742–1746. [Google Scholar]
  155. Chang, C.Y.; Hwang, G.J.; Gau, M.L. Promoting students’ learning achievement and self-efficacy: A mobile chatbot approach for nursing training. Br. J. Educ. Technol. 2022, 53, 171–188. [Google Scholar] [CrossRef]
  156. Das, B.; Krishnan, N.C.; Cook, D.J. Handling class overlap and imbalance to detect prompt situations in smart homes. In Proceedings of the 2013 IEEE 13th International Conference on Data Mining Workshops, Dallas, TX, USA, 7–10 December 2013; pp. 266–273. [Google Scholar]
  157. He, H.; Garcia, E.A. Learning from imbalanced data. IEEE Trans. Knowl. Data Eng. 2009, 21, 1263–1284. [Google Scholar]
  158. Blitzer, J.; Dredze, M.; Pereira, F. Biographies, bollywood, boom-boxes and blenders: Domain adaptation for sentiment classification. In Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics, Prague, Czech Republic, 25–26 June 2007; pp. 440–447. [Google Scholar]
  159. Burns, N.; Bi, Y.; Wang, H.; Anderson, T. Sentiment analysis of customer reviews: Balanced versus unbalanced datasets. In Proceedings of the Knowledge-Based and Intelligent Information and Engineering Systems: 15th International Conference, KES 2011, Kaiserslautern, Germany, 12–14 September 2011; Proceedings, Part I 15. Springer: Berlin/Heidelberg, Germany, 2011; pp. 161–170. [Google Scholar]
  160. Amjad, A.; Qaiser, S.; Anwar, A.; Ijaz-ul-Haq; Ali, R. Analysing public sentiments regarding COVID-19 vaccines: A sentiment analysis approach. In Proceedings of the 2021 IEEE International Smart Cities Conference (ISC2), Manchester, UK, 7–10 September 2021; pp. 1–7. [Google Scholar]
  161. Salas-Zárate, M.d.P.; Medina-Moreira, J.; Lagos-Ortiz, K.; Luna-Aveiga, H.; Rodriguez-Garcia, M.A.; Valencia-Garcia, R. Sentiment analysis on tweets about diabetes: An aspect-level approach. Comput. Math. Methods Med. 2017, 2017, 5140631. [Google Scholar] [CrossRef]
  162. Flekova, L.; Preoţiuc-Pietro, D.; Ruppert, E. Analysing domain suitability of a sentiment lexicon by identifying distributionally bipolar words. In Proceedings of the 6th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, Lisbon, Portugal, 17 September 2015; pp. 77–84. [Google Scholar]
  163. Tacchini, E.; Ballarin, G.; Della Vedova, M.L.; Moret, S.; De Alfaro, L. Some like it hoax: Automated fake news detection in social networks. arXiv 2017, arXiv:1704.07506. [Google Scholar]
  164. Fang, K. Deep Learning Techniques for Fake News Detection. Mach. Learn. 2022, 16, 511–518. [Google Scholar] [CrossRef]
  165. Sykora, M.; Elayan, S.; Jackson, T.W. A qualitative analysis of sarcasm, irony and related #hashtags on Twitter. Big Data Soc. 2020, 7, 2053951720972735. [Google Scholar]
  166. Megahed, M.; Mohammed, A. Modeling adaptive E-learning environment using facial expressions and fuzzy logic. Expert Syst. Appl. 2020, 157, 113460. [Google Scholar] [CrossRef]
  167. Ayvaz, U.; Gürüler, H.; Devrim, M.O. Use of facial emotion recognition in e-learning systems. Inf. Technol. Learn. Tools 2017, 60. [Google Scholar] [CrossRef]
  168. Ashwin, T.; Jose, J.; Raghu, G.; Reddy, G.R.M. An e-learning system with multifacial emotion recognition using supervised machine learning. In Proceedings of the 2015 IEEE Seventh International Conference on Technology for Education (T4E), Warangal, India, 10–12 December 2015; pp. 23–26. [Google Scholar]
  169. Al-Alwani, A. Mood extraction using facial features to improve learning curves of students in e-learning systems. Int. J. Adv. Comput. Sci. Appl. 2016, 7, 444–453. [Google Scholar] [CrossRef]
  170. Tabassum, T.; Allen, A.A.; De, P. Non-intrusive identification of student attentiveness and finding their correlation with detectable facial emotions. In Proceedings of the 2020 ACM Southeast Conference, Tampa, FL, USA, 2–4 April 2020; pp. 127–134. [Google Scholar]
  171. Lopes, A.T.; De Aguiar, E.; De Souza, A.F.; Oliveira-Santos, T. Facial expression recognition with convolutional neural networks: Coping with few data and the training sample order. Pattern Recognit. 2017, 61, 610–628. [Google Scholar] [CrossRef]
  172. Ramakrishnan, A.; Ottmar, E.; LoCasale-Crouch, J.; Whitehill, J. Toward automated classroom observation: Predicting positive and negative climate. In Proceedings of the 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019), Lille, France, 14–18 May 2019; pp. 1–8. [Google Scholar]
  173. Ozdemir, D.; Ugur, M.E. Model proposal on the determination of student attendance in distance education with face recognition technology. Turk. Online J. Distance Educ. 2021, 22, 19–32. [Google Scholar] [CrossRef]
  174. Salamh, A.B.S.; Akyüz, H.I. A New Deep Learning Model for Face Recognition and Registration in Distance Learning. Int. J. Emerg. Technol. Learn. (Online) 2022, 17, 29. [Google Scholar] [CrossRef]
  175. TS, A.; Guddeti, R.M.R. Automatic detection of students’ affective states in classroom environment using hybrid convolutional neural networks. Educ. Inf. Technol. 2020, 25, 1387–1415. [Google Scholar]
  176. Gandhi, A.; Adhvaryu, K.; Poria, S.; Cambria, E.; Hussain, A. Multimodal sentiment analysis: A systematic review of history, datasets, multimodal fusion methods, applications, challenges and future directions. Inf. Fusion 2022, 91, 424–444. [Google Scholar] [CrossRef]
  177. Wang, Z.; Zeng, F.; Liu, S.; Zeng, B. OAENet: Oriented attention ensemble for accurate facial expression recognition. Pattern Recognit. 2021, 112, 107694. [Google Scholar] [CrossRef]
  178. Bodini, M. A review of facial landmark extraction in 2D images and videos using deep learning. Big Data Cogn. Comput. 2019, 3, 14. [Google Scholar] [CrossRef]
  179. Nestor, M.S.; Fischer, D.; Arnold, D. “Masking” our emotions: Botulinum toxin, facial expression, and well-being in the age of COVID-19. J. Cosmet. Dermatol. 2020, 19, 2154–2160. [Google Scholar] [CrossRef]
  180. Karnati, M.; Seal, A.; Bhattacharjee, D.; Yazidi, A.; Krejcar, O. Understanding Deep Learning Techniques for Recognition of Human Emotions using Facial Expressions: A Comprehensive Survey. IEEE Trans. Instrum. Meas. 2023, 72, 5006631. [Google Scholar] [CrossRef]
  181. Kamińska, D.; Aktas, K.; Rizhinashvili, D.; Kuklyanov, D.; Sham, A.H.; Escalera, S.; Nasrollahi, K.; Moeslund, T.B.; Anbarjafari, G. Two-stage recognition and beyond for compound facial emotion recognition. Electronics 2021, 10, 2847. [Google Scholar] [CrossRef]
  182. Li, Y.; Tao, J.; Schuller, B.; Shan, S.; Jiang, D.; Jia, J. MEC 2016: The multimodal emotion recognition challenge of CCPR 2016. In Proceedings of the Pattern Recognition: 7th Chinese Conference, CCPR 2016, Chengdu, China, 5–7 November 2016; Proceedings, Part II 7. pp. 667–678. [Google Scholar]
  183. Schirmer, A.; Adolphs, R. Emotion perception from face, voice, and touch: Comparisons and convergence. Trends Cogn. Sci. 2017, 21, 216–228. [Google Scholar] [CrossRef] [PubMed]
  184. Zhang, M.; Chen, Y.; Lin, Y.; Ding, H.; Zhang, Y. Multichannel perception of emotion in speech, voice, facial expression, and gesture in individuals with autism: A scoping review. J. Speech Lang. Hear. Res. 2022, 65, 1435–1449. [Google Scholar] [CrossRef] [PubMed]
  185. Song, B.; Li, K.; Zong, Y.; Zhu, J.; Zheng, W.; Shi, J.; Zhao, L. Recognizing spontaneous micro-expression using a three-stream convolutional neural network. IEEE Access 2019, 7, 184537–184551. [Google Scholar] [CrossRef]
  186. Cai, W.; Gao, M.; Liu, R.; Mao, J. MIFAD-net: Multi-layer interactive feature fusion network with angular distance loss for face emotion recognition. Front. Psychol. 2021, 12, 4707. [Google Scholar] [CrossRef]
  187. Wang, Y.; See, J.; Oh, Y.H.; Phan, R.C.W.; Rahulamathavan, Y.; Ling, H.C.; Tan, S.W.; Li, X. Effective recognition of facial micro-expressions with video motion magnification. Multimed. Tools Appl. 2017, 76, 21665–21690. [Google Scholar] [CrossRef]
  188. Al Chanti, D.; Caplier, A. Deep learning for spatio-temporal modeling of dynamic spontaneous emotions. IEEE Trans. Affect. Comput. 2018, 12, 363–376. [Google Scholar] [CrossRef]
  189. Huang, B.; Wang, Z.; Jiang, K.; Zou, Q.; Tian, X.; Lu, T.; Han, Z. Joint segmentation and identification feature learning for occlusion face recognition. IEEE Trans. Neural Netw. Learn. Syst. 2022. [Google Scholar] [CrossRef]
  190. Yang, K.; Wang, C.; Sarsenbayeva, Z.; Tag, B.; Dingler, T.; Wadley, G.; Goncalves, J. Benchmarking commercial emotion detection systems using realistic distortions of facial image datasets. Vis. Comput. 2021, 37, 1447–1466. [Google Scholar] [CrossRef]
  191. Meng, Z.; Liu, P.; Cai, J.; Han, S.; Tong, Y. Identity-aware convolutional neural network for facial expression recognition. In Proceedings of the 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017; pp. 558–565. [Google Scholar]
  192. Li, S.; Deng, W. Deep facial expression recognition: A survey. IEEE Trans. Affect. Comput. 2020, 13, 1195–1215. [Google Scholar] [CrossRef]
  193. Samadiani, N.; Huang, G.; Cai, B.; Luo, W.; Chi, C.H.; Xiang, Y.; He, J. A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors 2019, 19, 1863. [Google Scholar] [CrossRef] [PubMed]
  194. Deshmukh, S.; Patwardhan, M.; Mahajan, A. Survey on real-time facial expression recognition techniques. Iet Biom. 2016, 5, 155–163. [Google Scholar] [CrossRef]
  195. Hassouneh, A.; Mutawa, A.; Murugappan, M. Development of a real-time emotion recognition system using facial expressions and EEG based on machine learning and deep neural network methods. Inform. Med. Unlocked 2020, 20, 100372. [Google Scholar] [CrossRef]
Figure 1. Sentiment analysis and emotion classification in online learning.
Figure 2. Taxonomy of sentiment analysis.
Figure 3. Text-based sentiment analysis techniques [29].
Figure 4. Papers selection criteria based on PRISMA technique.
Table 1. Research questions.
| Question No. | Research Question |
| --- | --- |
| RQ1 | How has text-based sentiment analysis been used in the educational domain to facilitate learners? |
| RQ2 | What are the challenges/limitations of text-based sentiment analysis in online learning? |
| RQ3 | How has facial sentiment analysis been used in the educational domain to facilitate learners? |
| RQ4 | What are the challenges/limitations of facial sentiment analysis in online learning? |
Table 2. Lexicon-based sentiment analysis techniques.
| Technique | Description | Reference(s) |
| --- | --- | --- |
| SentiWordNet | A publicly available lexical resource for opinion mining which assigns each WordNet synset three sentiment scores: positivity, negativity, and objectivity. | [34,35,36] |
| VADER | A rule-based sentiment analysis tool that uses a lexicon of words and their intensity scores, together with grammatical rules, to determine the polarity of a given text. | [37,38] |
| SenticNet | A concept-level sentiment analysis framework that assigns sentiment scores to concepts based on their semantic orientation, conceptual polarity, and semantic relatedness to other concepts. | [39,40] |
| AFINN | A list of English words rated for valence with an integer between minus five (negative) and plus five (positive). | [41,42] |
| NRC Emotion Lexicon | A list of English words and their associations with eight basic emotions (anger, fear, anticipation, trust, surprise, sadness, joy, and disgust) and two sentiments (negative and positive). | [43,44] |
| Pattern | A Python package that includes a sentiment analysis module based on a lexicon of sentiment words and a rule-based classifier. It can handle negations, idioms, and slang, and can also be trained on custom data. | [45,46] |
| TextBlob | A Python library that includes a sentiment analysis module based on the pattern analyser. It also includes a naive Bayes classifier that can be trained on custom data. | [47,48] |
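The lexicon-based tools in Table 2 share a common core: look each token up in a valence dictionary, apply simple rules such as negation handling, and aggregate the scores into a polarity. A minimal Python sketch of that idea follows; the toy lexicon and single negation rule are illustrative stand-ins, not the actual AFINN or VADER resources.

```python
# Toy lexicon-based sentiment scorer in the spirit of AFINN/VADER.
# The word valences and negation rule are illustrative only, not the
# real AFINN word list or VADER's full rule set.
TOY_LEXICON = {
    "good": 3, "great": 4, "excellent": 5, "helpful": 2,
    "bad": -3, "boring": -2, "confusing": -2, "terrible": -5,
}
NEGATORS = {"not", "never", "no"}

def lexicon_score(text: str) -> float:
    """Sum word valences; flip the sign of a word preceded by a negator."""
    tokens = text.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        valence = TOY_LEXICON.get(tok.strip(".,!?"))
        if valence is None:
            continue  # word not in the lexicon: contributes nothing
        if i > 0 and tokens[i - 1] in NEGATORS:
            valence = -valence  # simple negation rule: "not good" -> negative
        score += valence
    return score

def polarity(text: str) -> str:
    s = lexicon_score(text)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"
```

The same aggregate-and-threshold structure underlies several of the studies in Table 5 that report per-comment sentiment scores from lexicon lookups.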
Table 3. Machine learning sentiment analysis techniques.
| Technique | Description | Reference(s) |
| --- | --- | --- |
| Naive Bayes | A probabilistic algorithm that uses Bayes' theorem to classify text as positive, negative, or neutral. | [52,53] |
| Support Vector Machine (SVM) | A supervised learning algorithm that separates data into different classes using a hyperplane. | [54,55] |
| Random Forest | An ensemble learning algorithm that constructs multiple decision trees to classify data. | [56,57] |
| Convolutional Neural Network (CNN) | A type of neural network that uses convolutional layers to automatically learn features from input data. | [58,59] |
| Long Short-Term Memory (LSTM) | A type of recurrent neural network that is capable of capturing long-term dependencies in input data. | [60,61] |
| Artificial Neural Networks (ANNs) | A set of algorithms that attempt to recognise underlying relationships in a data set through a process that mimics how the human brain operates. | [62,63] |
| BERT | A pre-trained language model that uses deep neural networks to generate contextualised word embeddings. BERT has been shown to achieve state-of-the-art results in a wide range of natural language processing tasks, including sentiment analysis. | [64,65] |
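Of the machine learning techniques in Table 3, naive Bayes is the simplest to state end-to-end: a class prior multiplied by Laplace-smoothed word likelihoods. A self-contained toy implementation is sketched below, trained on a few invented feedback snippets purely for illustration; production work would use an established library rather than this sketch.

```python
import math
from collections import Counter, defaultdict

class TinyNaiveBayes:
    """Minimal multinomial naive Bayes with Laplace smoothing,
    illustrating the probabilistic classifier described in Table 3."""

    def fit(self, texts, labels):
        self.class_counts = Counter(labels)          # documents per class
        self.word_counts = defaultdict(Counter)      # word counts per class
        self.vocab = set()
        for text, label in zip(texts, labels):
            for word in text.lower().split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)
        return self

    def predict(self, text):
        total_docs = sum(self.class_counts.values())
        best_label, best_logp = None, float("-inf")
        for label, n_docs in self.class_counts.items():
            logp = math.log(n_docs / total_docs)     # log prior P(label)
            n_words = sum(self.word_counts[label].values())
            for word in text.lower().split():
                # Laplace-smoothed log likelihood log P(word | label)
                count = self.word_counts[label][word]
                logp += math.log((count + 1) / (n_words + len(self.vocab)))
            if logp > best_logp:
                best_label, best_logp = label, logp
        return best_label

# Invented feedback snippets, for illustration only.
clf = TinyNaiveBayes().fit(
    ["great clear lectures", "very helpful examples",
     "boring confusing slides", "terrible pacing and boring content"],
    ["positive", "positive", "negative", "negative"],
)
```

The studies in Tables 5 and 6 that report naive Bayes baselines (e.g., MNB classifiers on student feedback) follow this same prior-times-likelihood structure, typically on top of TF-IDF or n-gram features rather than raw word counts.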
Table 4. Facial sentiment analysis techniques used at different levels.
| Technique | Description | Level | Reference(s) |
| --- | --- | --- | --- |
| Viola–Jones Algorithm | Detects faces using Haar-like features | Face level | [76,77,78] |
| Eigenfaces | Projects face images into a lower-dimensional space and uses Principal Component Analysis (PCA) to classify emotions | Face level | [79,80] |
| Local Binary Patterns (LBPs) | Texture descriptor that extracts information about local patterns of pixel intensities | Face level | [81,82,83] |
| Fisherfaces | Projects face images into a lower-dimensional space and uses Fisher discriminant analysis (FDA) to classify emotions | Face level | [84] |
| Convolutional Neural Networks (CNNs) | Multi-layer neural networks that can automatically learn features for classifying emotions | Face level | [85,86] |
| Active Shape Models (ASMs) | Statistical models of the shape and appearance of objects, used to detect facial features | Landmark level | [87,88] |
| Active Appearance Models (AAMs) | Extension of ASMs that also models texture information to track facial expression changes | Landmark level | [89,90] |
| Constrained Local Models (CLMs) | Combines an ASM with a texture model to track facial expressions and improve accuracy | Landmark level | [91] |
| Facial Action Coding System (FACS) | System for analysing and describing facial expressions based on the activation of individual muscles | Landmark level | [92,93] |
| Histograms of Oriented Gradients (HOGs) | Descriptor that extracts information about the distribution of gradient directions in an image | Region level | [94] |
| Scale-Invariant Feature Transform (SIFT) | Descriptor that extracts features invariant to scaling, rotation and translation | Region level | [95,96] |
| Speeded Up Robust Features (SURF) | Descriptor similar to SIFT but faster and more robust to changes in image scale and orientation | Region level | [97,98] |
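Of the face-level descriptors in Table 4, Local Binary Patterns are the easiest to illustrate concretely: each of the eight neighbours of a pixel is thresholded against the centre value, and the resulting bits are packed into an 8-bit code that characterises the local texture. A minimal sketch for a single 3 × 3 patch follows; the clockwise bit ordering used here is one common convention, not a fixed standard, and real systems histogram these codes over image regions rather than using single pixels.

```python
def lbp_code(patch):
    """Basic 3x3 Local Binary Pattern: threshold the 8 neighbours of the
    centre pixel against the centre value and pack the resulting bits,
    clockwise from the top-left neighbour, into one 8-bit code."""
    centre = patch[1][1]
    # Neighbour coordinates, clockwise starting at the top-left corner.
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, c) in enumerate(coords):
        if patch[r][c] >= centre:      # neighbour at least as bright as centre
            code |= 1 << (7 - bit)     # set the corresponding bit
    return code
```

Because the code depends only on intensity comparisons, LBP is invariant to monotonic illumination changes, which is one reason it remains a common baseline in the facial expression studies surveyed here.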
Table 5. Sentence-based sentiment analysis studies in educational domain.
Table 5. Sentence-based sentiment analysis studies in educational domain.
S. NoArticle TitlePublished YearArticle TypeDataset/Sample SizeStudy MethodologyEmotion ClassesResults/Findings
1Variants of Long Short-Term Memory for Sentiment Analysis on Vietnamese Students’ Feedback Corpus [103]2018Conf.16,175 sentences from students’ feedbackLSTM, Dependency Tree-LSTM (DT-LSTM), L-SVM, D-SVM, and LD-SVM, NBPositive, negative, neutralLD-SVM: Negative—92.52, Neutral—43.37, Positive—93.06, Accuracy—90.20, F1 score—90.74
2Opinion mining and emotion recognition applied to learning environments [104]2020JournalEvoMSA, Multinomial NB, KNN, BERT, SVC, Linear SVCsentiTEXT: (positive and negative), eduSERE (engaged, excited, bored, and frustrated)Accuracy: 93 percent sentiTEXT, and 84 percent eduSERE
3What predicts student satisfaction with MOOCs: A gradient boosting trees supervised machine learning and sentiment analysis approach [105]2020Journal249 randomly sampled MOOCs and 6393 students’ perceptions of these MOOCsBoosting tree model with TextBlob3Positive, negative, neutralF1 score: structure (0.7780), video (0.8832), instructor (0.8570), content and resources (0.7625), interaction and support (0.8375), assignment and assessment (0.8138)
4Sentiment analysis on massive open online course evaluations: A text mining and deep learning approach [106]2020Journal66,000 MOOC reviewsMachine learning, ensemble learning methods, and deep learning methods with Word2Vec embeddingPositive, negativeAccuracy of 95.80 percent
5Social Network and Sentiment Analysis: Investigation of Students’ Perspectives on Lecture Recording [107]2020Journal1435 students reacted to Facebook question via emojis, 220 likes and 65 comments were generated from 150 unique studentsGoogle Natural Language APIPositive, negative, neutralSentiment score: positive (39.4 percent), negative (33.3 percent), neutral (27.3 percent)
6Investigating Learning Experience of MOOCs Learners Using Topic Modeling and Sentiment Analysis [108]2021Conf.8281 reviews scraped from five courses within the field of data science are analysed from CourseraTopic modelling (LDA) with VADER for sentiment analysisPositive, negative, neutralSentiment score: positive, 67.9; negative, 17.4; neutral, 14.7
7Sentiment Analysis and Topic Modeling on Tweets about Online Education during COVID-19 [109]2021JournalTwitter dataset containing 17,155 tweets about e-learningTextBlob, VADER, and SentiWordNet—For comparison: SVM, LR, DT, RF, SGD, KNN, GNB, CNN, LSTM, CNN-LSTM, and Bi-LSTMPositive, negative, neutralSVM achieves 0.95 accuracy using TF-IDF with SMOTE
8Sentiment Analysis for Distance Education Course Materials: A Machine Learning Approach [110]2020Journal6059 feedbacksMachine learning techniques (decision tree, MLP, XGB, SVC, multinomial logistic regression, Gaussian NB, and k-neighbours)Positive, negative, neutralHighest accuracy: LR (0.775)
9E-learning course recommendation based on sentiment analysis using hybrid Elman similarity [111]2023Journal10,000 tweets, short texts, and comments from social websitesFeature extraction: TF-IDF, Word2Vec, hybrid N-gram Classification: Elman minimal redundancy maximum relevance model and enhanced aquila optimisation (EMRMR_EAO) modelPositive, negative, neutralAccuracy: 99.98 percent
10Emotional Variance Analysis: A new sentiment analysis feature set for Artificial Intelligence and Machine Learning applications [112]2023Journal37 individual DEEP students journalsEmotional variance analysisPositive, negative, neutralAccuracy: 88.7 percent
11Deep-learning-based user experience evaluation in distance learning [113]2022Journal160,000 tweetsLSTM with Word2Vec embeddingPositive, negative, neutralAccuracy: 76 percent
12AOH-Senti: Aspect-Oriented Hybrid Approach to Sentiment Analysis of Students’ Feedback [114]2023Journal—————SVM, MNB, LR, RFC, DTC, and KNNPositive, negative, neutral98.7 percent aggregate accuracy using the RFC algorithm
| 13 | Language to Completion: Success in an Educational Data Mining Massive Open Online Class [115] | 2015 | Conf. | 320 students, 50 words in discussion | NLP tools (WAT, TAALES, TAAS) | Positive, negative | Accuracy: 67.8 percent; F1 score: 0.650 |
| 14 | A Learning Analytics Methodology for Detecting Sentiment in Student Fora: A Case Study in Distance Education [116] | 2015 | Journal | 64 students, 371 messages | NioSto opinion word extraction algorithm | Positive, negative, neutral | 27.27 percent positive, 55.56 percent neutral, and 17.17 percent negative |
| 15 | An Enhanced Decision Support System through Mining of Teachers Online Chat Data [117] | 2018 | Journal | 6650 in-service K-12 teachers in China, 17,624 distinct posts | Single-label naïve Bayes classification rule | Positive, negative, neutral | Classified: technical description (961), technical analysis (1638), technical critique (2235), personal description (613), personal analysis (5875), and personal critique (1166) |
| 16 | A machine-learning-based approach for sentiment analysis on distance learning from Arabic Tweets [118] | 2022 | Journal | Twitter dataset, 14,000 tweets | Logistic regression model | Positive, negative, neutral | Accuracy, F1 score, precision, and recall of 91 percent, 90 percent, 90 percent, and 89 percent, respectively |
| 17 | Analysis of Student Feedback using Deep Learning [119] | 2019 | Journal | — | CNN, SVM with Word2Vec | Positive, negative, neutral | — |
| 18 | Lexicon-Based Sentiment Analysis of Teachers’ Evaluation [120] | 2016 | Journal | 1748 students’ feedback | Knime | Positive, negative, neutral | Accuracy: 91.2 percent |
| 19 | Sentiment Analysis of Student Feedback Using Machine Learning and Lexicon Based Approaches [121] | 2017 | Conf. | 1230 comments extracted from the institute’s educational portal | TF-IDF, N-grams with SVM and RF | Positive, negative, neutral | Accuracy: 0.93; F-measure: 0.92 |
| 20 | Improving international attractiveness of higher education institutions based on text mining and sentiment analysis [122] | 2018 | Journal | 1938 reviews from 65 different business schools | NLP | Positive, negative, neutral | Student satisfaction with HE institutions varies significantly and depends on the topic discussed in the opinions shared online |
| 21 | Student Feedback Sentiment Analysis Model using Various Machine Learning Schemes: A Review [123] | 2019 | Journal | 950 posts | Multinomial naive Bayes (MNB), stochastic gradient descent (SGD), SVM, random forest, and multilayer perceptron (MLP) | Positive, negative, neutral | 83 percent, 79 percent, 80 percent, 72 percent, and 83 percent for MNB, SGD, SVM, random forest, and MLP, respectively |
| 22 | Sentiment mining in a collaborative learning environment: capitalising on big data [124] | 2019 | Journal | 12,300 tweets, 10,500 Facebook comments, and 8450 Moodle feedback messages | NB and SVM | Positive, negative, neutral | Sentiment mining approaches can be used to understand students’ sentiment in a collaborative learning environment |
| 23 | Learning in massive open online courses: Evidence from social media mining [125] | 2015 | Journal | 402,812 tweets | OpinionFinder tool and social media mining approaches | Positive, negative, neutral | Social media sentiment analysis provides a comprehensive understanding of MOOC learning trends |
| 24 | Sentiment Analysis of Students’ Comment Using Lexicon Based Approach [126] | 2017 | Conf. | Sentiment word database: 745 words | Lexicon-based approach | Strongly/moderately/weakly positive, strongly/moderately/weakly negative, or neutral | Overall score: comment scores summed, e.g., 8.5 + (−2.5) + 6 = 12, then divided by the total number of opinion words in all comments (9), giving 12/9 ≈ 1.33 |
| 25 | Student Opinion Mining regarding Educational System using Facebook group [127] | 2017 | Conf. | 250 comments of master’s students from a Facebook academic group | Bayesian network probabilistic model | Positive, negative, neutral | 56 percent positive, 32 percent neutral, and 12 percent negative comments |
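The lexicon-based scoring illustrated in entry 24 above (sum the polarities of the opinion words found, then divide by their count) can be sketched in plain Python. The six-word lexicon and the sample comments below are invented for illustration; the cited study uses a 745-word sentiment database and seven polarity grades.

```python
# Minimal lexicon-based sentiment scorer. The lexicon below is an
# illustrative placeholder, not the 745-word database of the cited study.
LEXICON = {
    "excellent": 2.0, "good": 1.0, "clear": 1.0,
    "boring": -1.0, "confusing": -1.5, "terrible": -2.0,
}

def score_comments(comments):
    """Sum opinion-word polarities across all comments and divide by the
    total number of opinion words found, as in the table's worked example."""
    total, n_opinion_words = 0.0, 0
    for comment in comments:
        for token in comment.lower().split():
            token = token.strip(".,!?")
            if token in LEXICON:
                total += LEXICON[token]
                n_opinion_words += 1
    return total / n_opinion_words if n_opinion_words else 0.0

comments = ["The lectures were excellent and clear.",
            "Assignments were confusing.",
            "Good course overall!"]
print(round(score_comments(comments), 3))
```

This mirrors the worked example in the table: summed comment scores divided by the total number of opinion words give the overall polarity of the feedback set.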
Table 6. Aspect-based sentiment analysis studies in educational domain.
| S. No | Article Title | Published Year | Article Type | Sample Size | Study Methodology | Word Embedding | Aspects Extracted | Results/Findings |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Aspect-Based Opinion Mining on Student’s Feedback for Faculty Teaching Performance Evaluation [128] | 2019 | Journal | Dataset constructed from the last five years of students’ comments at Sukkur IBA University, plus the standard SemEval-2014 dataset | Two-layered LSTM model | Academic Domain, OpinRank, Glove.6B.100D | Teaching pedagogy, behaviour, knowledge, assessment, experience, general | Aspect extraction (91 percent) and sentiment polarity detection (93 percent) |
| 2 | Weakly Supervised Framework for Aspect-Based Sentiment Analysis on Students’ Reviews of MOOCs [129] | 2020 | Journal | 105 k students’ reviews collected from Coursera and a dataset comprising 5989 students’ feedback | LSTM, CNN | FastText, GloVe, Word2Vec, MOOC | Course, instructor, assessment, technology | F1 score: weakly supervised LSTM (domain embedding: 92.5, GloVe: 93.3); weakly supervised CNN (domain embedding: 90.1, GloVe: 91.5) |
| 3 | Aspect-Based Sentiment Analysis of Arabic Tweets in the Education Sector Using a Hybrid Feature Selection Method [130] | 2020 | Conf. | 7943 Arabic tweets related to Qassim University in KSA | SVM | One-way ANOVA | Teaching, environment, electronic services, staff affairs, academic affairs, activities, student affairs, higher education, miscellaneous | F-score: aspect detection 60 percent (0.76) |
| 4 | Multi-Attention Fusion Modeling for Sentiment Analysis of Educational Big Data [131] | 2020 | Journal | Education dataset: 5052; course dataset: 705 | TD-LSTM, AE-LSTM, ATAE-LSTM, IAN, RAM | — | Difficulty, content, practicality, and teacher | Education dataset: Multi-AFM 94.6; course dataset: Multi-AFM 81.4 |
| 5 | Aspect-Based Opinion Mining of Students’ Reviews on Online Courses [132] | 2020 | Conf. | 21 thousand manually annotated student reviews collected from Coursera | 1D-CNN, decision tree, naïve Bayes, SVM, boosting | FastText, GloVe, Word2Vec, own dataset | Instructor, structure, content, design, general | FastText: precision 86.78, recall 89.52, F1 88.13; Word2Vec: precision 87.08, recall 89.34, F1 88.20; GloVe: precision 86.75, recall 88.89, F1 87.81; own dataset: precision 86.70, recall 89.54, F1 88.10 |
| 6 | Online Reviews Evaluation System for Higher Education Institution: An Aspect Based Sentiment Analysis Tool [133] | 2017 | Conf. | Twitter and Facebook data | Apache OpenNLP, Stanford NLP library | POS (part-of-speech) tags | Many intrinsic aspects and qualities of the institution mentioned in opinions, each analysed separately | Accuracy of 72.56 percent |
| 7 | Aspect-based Sentiment Analysis for Improving Online Learning Program Based on Student Feedback [134] | 2022 | Conf. | 162 new graduates from BINUS’s (Bina Nusantara University) online program | Stanford NLTK library | AFINN standard polarity of English words for each token with noun POS tagging | Cheating punishment, class facilities, college management, learning material, learning guide, education system benefit, happy learning experiences | Successful classification of aspects into positive, negative, and neutral sentiment classes |
| 8 | Knowledge-enabled BERT for aspect-based sentiment analysis [135] | 2021 | Journal | 9123 posts by 7590 different online learners in advanced language programming courses on two Chinese university MOOC platforms | KNEE, CG-BERT, R-GAT+BERT, BERT+Liner | SKG | — | BERT + SKG model outperforms all baseline methods in accuracy and macro-F1 (accuracy < 0.80, macro-F1 0.75) |
| 9 | Aspect-based Sentiment Analysis for University Teaching Analytics [136] | 2022 | Journal | Two surveys: (i) COVID-19-specific student survey (1805); (ii) semester-based student course evaluations (9348) | TextBlob, NLTK, spaCy package and flair | — | Flexibility, teaching, pace, misc, technology, motivation, information | Students disliked online teaching due to insufficient information and unadjusted teaching methods, but liked its flexibility and the possibility to learn at an individual pace |
| 10 | Weakly Supervised Models of Aspect-Sentiment for Online Course Discussion Forums [137] | 2015 | Conf. | MOOC dataset of m different disciplines (business, technology, history, and the sciences) | Joint probabilistic model (PSL-Joint) | Seed words and weighted logical rules | Lecture, quiz, certificate, social | 3–5 times improvement in F1 score in most cases over a system using only seeded LDA |
| 11 | Aspect based sentiment analysis of students opinion using machine learning techniques [138] | 2017 | Conf. | 2000 tweets | Naive Bayes (NB), complementary naive Bayes (CNB), and PART algorithm | — | Teaching, placement, facilities, fees, sports, organising events, transport | PART algorithm: precision Pos (1), Neg (1); recall Pos (1), Neg (0.994); F-measure Pos (1), Neg (0.997) |
| 12 | Triggers and Tweets: Implicit Aspect-Based Sentiment and Emotion Analysis of Community Chatter Relevant to Education Post-COVID-19 [139] | 2022 | Journal | Twitter chat data | Linear support vector classifier (SVC), logistic regression, multinomial naïve Bayes, random forest | TF-IDF-BoW | Safety, education quality and educational rights, financial security | ABSA: logistic regression (81 percent); overall SA: linear SVC (91 percent) |
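Most aspect-based studies above combine two steps: detect which aspect a sentence mentions, then assign that mention a sentiment label. That pipeline can be sketched with simple keyword matching; the aspect terms and polarity lexicon below are invented placeholders, and the cited works use trained LSTM, SVM, or BERT models rather than keyword lookup.

```python
# Toy aspect-based sentiment analysis: tag each sentence with the aspects
# whose keywords it mentions, then label the sentence with a tiny polarity
# lexicon. All terms here are illustrative, not from the cited studies.
ASPECTS = {
    "instructor": {"instructor", "teacher", "lecturer"},
    "content":    {"content", "material", "slides"},
    "assessment": {"quiz", "assignment", "exam"},
}
POLARITY = {"great": 1, "helpful": 1, "clear": 1,
            "unfair": -1, "outdated": -1, "hard": -1}

def aspect_sentiment(review):
    """Return {aspect: 'positive'|'negative'|'neutral'} for each aspect
    mentioned in the review, scored sentence by sentence."""
    results = {}
    for sentence in review.lower().replace("!", ".").split("."):
        tokens = [t.strip(",;:") for t in sentence.split()]
        score = sum(POLARITY.get(t, 0) for t in tokens)
        label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
        for aspect, keywords in ASPECTS.items():
            if keywords & set(tokens):
                results[aspect] = label
    return results

print(aspect_sentiment("The instructor was great. The quiz felt unfair."))
```

Real ABSA systems replace the keyword lookup with learned aspect extractors and the lexicon sum with a trained classifier, but the input/output contract is the same: a review in, a sentiment label per mentioned aspect out.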
Table 7. Text-based sentiment analysis challenges in capturing learners’ emotions.
| S. No | Challenge(s) | Reference(s) |
| --- | --- | --- |
| 1 | Classification of students’ textual utterances | [140] |
| 2 | Overlapping emotion classes (class ambiguity) | [141,142,143] |
| 3 | Dealing with bipolar words | [144] |
| 4 | Fake comments/responses | [145,146] |
| 5 | Lack of reliable ground-truth data for training and evaluation | [147] |
| 6 | Difficulty of accurately identifying and interpreting sarcasm and irony in chat-based data | [148] |
| 7 | Unstructured data | [149,150,151] |
Table 8. Facial sentiment analysis studies in educational domain.
| S. No | Article Title | Published Year | Article Type | Dataset/Sample Size | Study Methodology | Emotion Classes | Results/Findings |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Modeling adaptive E-Learning environment using facial expressions and fuzzy logic [166] | 2020 | Journal | Corpora of 12 learners containing 72 learning activities and 1735 data points of distinct emotional states | CNN | Anger, disgust, fear, happiness, sadness, surprise, neutral | Proposed approach provides adaptive learning flows that match the learning capabilities of all learners in a group |
| 2 | Use of facial emotion recognition in E-learning systems [167] | 2017 | Journal | Frontal face images of participants recorded through Skype, size of 11,680 × 10 | kNN, random forest, CART, SVM | Happiness, fear, sadness, anger, surprise, and disgust | Highest accuracy: SVM, 98.24% |
| 3 | An E-learning System With Multifacial Emotion Recognition Using Supervised Machine Learning [168] | 2015 | Conf. | Yale Face Database (YFD) for training; Face Detection Dataset and Benchmark (FDDB) and Labelled Faces in the Wild (LFW) for evaluation | SVM | 7 major emotions | Accuracy of 89–100% across the different datasets |
| 4 | Mood Extraction Using Facial Features to Improve Learning Curves of Students in E-Learning Systems [169] | 2016 | Journal | Cohn–Kanade AU-coded facial expression database of 486 sequences from 97 faces | Radial basis function NN algorithm, SVM | Happy, sad, confused, disturbed, surprised | Proposed algorithm showed a success rate of over 70% in assessing the student’s mood |
| 5 | Non-intrusive Identification of Student Attentiveness and Finding Their Correlation with Detectable Facial Emotions [170] | 2020 | Conf. | 3500 raw images | CNN | Attentiveness, calm, confused, disgusted, fear, happy, sad, surprised | 93% accuracy |
| 6 | Facial expression recognition with Convolutional Neural Networks: Coping with few data and the training sample order [171] | 2017 | Journal | Extended Cohn–Kanade (CK+), Japanese Female Facial Expressions (JAFFE), and Binghamton University 3D Facial Expression (BU-3DFE) databases | CNN | Angry, disgust, fear, happy, sad, and surprise | 96.76% accuracy on the CK+ database |
| 7 | Toward Automated Classroom Observation: Predicting Positive and Negative Climate [172] | 2019 | Conf. | 241 class-labelled videos | CNN and Bi-LSTM | Positive climate and negative climate | Accuracy: 0.40 and 0.51, respectively |
| 8 | Model Proposal on The Determination of Student Attendance in Distance Education with Face Recognition Technology [173] | 2021 | Journal | Face gestures captured through the LMS camera | Eigenfaces recognition algorithm with Gaussian filters | — | More than 80% accuracy achieved |
| 9 | A New Deep Learning Model for Face Recognition and Registration in Distance Learning [174] | 2022 | Journal | Faces94, Faces95, Faces96, and Grimace datasets containing 7873 images | CNN | Eigenfaces emotions | Accuracy of 100% for Faces94 and Grimace, 99.86% for Faces95, and 99.54% for Faces96 |
| 10 | Automatic detection of students’ affective states in classroom environment using hybrid convolutional neural networks [175] | 2019 | Journal | 8000 single-face image frames and 12,000 multi-face image frames | Hybrid CNN | Engaged, boredom, and neutral | Accuracy of 86% and 70% for posed and spontaneous affective states of classroom data, respectively |
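Several CNN pipelines in Table 8 share the same skeleton: convolution and a nonlinearity extract facial features, pooling condenses them, and a softmax layer yields probabilities over the emotion classes. A forward pass of that skeleton can be sketched in NumPy with random, untrained weights; the 48 × 48 input size and four filters are illustrative choices, not parameters taken from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    """Valid 2-D convolution of an (H, W) image with (K, kh, kw) kernels."""
    K, kh, kw = kernels.shape
    H, W = image.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernels[k])
    return out

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

# Random, untrained parameters: the shapes are the point, not the values.
kernels = rng.normal(size=(4, 3, 3))           # 4 feature maps, 3x3 filters
dense_w = rng.normal(size=(len(EMOTIONS), 4))  # 7 classes from pooled features

def predict(face):
    feats = np.maximum(conv2d(face, kernels), 0)  # convolution + ReLU
    pooled = feats.mean(axis=(1, 2))              # global average pooling
    return softmax(dense_w @ pooled)              # class probabilities

face = rng.random((48, 48))  # a 48x48 grayscale face crop
probs = predict(face)
print(EMOTIONS[int(probs.argmax())])
```

Training (e.g., on CK+ or JAFFE, as in studies 4 and 6) would fit `kernels` and `dense_w` by backpropagation; only the shape of the computation is shown here.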
Table 9. Facial sentiment analysis challenges in capturing learners’ emotions.
| S. No | Challenge(s) | Reference(s) |
| --- | --- | --- |
| 1 | Limited accuracy in identifying emotions | [179,180,181] |
| 2 | Cultural differences in facial expressions | [182,183,184,185] |
| 3 | Limited effectiveness in identifying subtle emotions | [186,187,188,189] |
| 4 | Dependence on lighting and camera quality | [190,191,192,193] |
| 5 | Lack of real-time analysis | [194,195] |
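Challenge 4 above (dependence on lighting and camera quality) is commonly mitigated with photometric normalisation before recognition. A histogram-equalisation sketch in NumPy follows as one illustrative preprocessing step; the cited works do not prescribe this particular method.

```python
import numpy as np

def equalize_histogram(gray):
    """Spread an 8-bit grayscale image's intensities over the full 0-255
    range, reducing the effect of uniformly dim or bright lighting."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # cumulative count at the darkest present level
    # Map each intensity through the normalised cumulative distribution.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

# A synthetic "dim" face crop: all intensities squeezed into 40..80.
rng = np.random.default_rng(1)
dim = rng.integers(40, 81, size=(48, 48)).astype(np.uint8)
eq = equalize_histogram(dim)
print(dim.min(), dim.max(), "->", eq.min(), eq.max())
```

After equalisation the darkest and brightest present intensities map to 0 and 255, so downstream recognisers see comparable contrast regardless of the original exposure.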
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Anwar, A.; Rehman, I.U.; Nasralla, M.M.; Khattak, S.B.A.; Khilji, N. Emotions Matter: A Systematic Review and Meta-Analysis of the Detection and Classification of Students’ Emotions in STEM during Online Learning. Educ. Sci. 2023, 13, 914. https://doi.org/10.3390/educsci13090914

