Article

Has the COVID-19 Pandemic Affected Mathematics Achievement? A Case Study of University Students in Social Sciences

1 Faculty of Organizational Sciences, University of Maribor, SI-4000 Kranj, Slovenia
2 Institute of Mathematics, Physics and Mechanics, SI-1000 Ljubljana, Slovenia
3 Faculty of Natural Sciences and Mathematics, University of Maribor, SI-2000 Maribor, Slovenia
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(13), 2314; https://doi.org/10.3390/math10132314
Submission received: 21 April 2022 / Revised: 27 June 2022 / Accepted: 29 June 2022 / Published: 1 July 2022

Abstract:
This study examines the effects of COVID-19-related measures on the mathematics achievement of university students in social sciences in Slovenia. Our particular concern was to compare two student populations (pre-pandemic and pandemic) in terms of factors affecting student performance in mathematics courses. Data were collected over nine consecutive academic years (from 2013–2014 to 2020–2021) and analyzed using two-stage structural equation modelling (SEM). The analyses confirmed that the conceptual model developed before the pandemic was applicable during the pandemic period. For both populations (pre-pandemic and pandemic), mathematics confidence, perceived level of math anxiety, background knowledge from secondary school, and self-engagement in mathematics courses at university were confirmed as factors influencing mathematics achievement. Moreover, both populations perceived the effects of the factors in the same way, and the magnitude of the effects is comparable. The rather high values of the coefficient of determination for mathematics achievement (greater than 0.66 for both student populations) indicate that the variables “Perceived Level of Math Anxiety” and “Self-Engagement in Mathematics Course at University” together explain a significant proportion of the total variance before and during the pandemic. Consequently, the results of our case study indicated that pandemic measures did not have a significant impact on our students’ mathematics achievement. Although a more in-depth study of a broader sample of academic courses would be needed to confirm our findings, our experience indicates that mathematics courses at the tertiary level of education can be successfully delivered online.

1. Introduction

Since the spring of 2020, the literature addressing adjustments to and consequences of COVID-19-related school closures at different educational levels has expanded rapidly. The sudden transition to distance learning and the period of its implementation affected all aspects of students’ lives. They faced social, psychological, and educational challenges [1,2,3,4,5].
In this paper, we will look at the impact of the COVID-19 pandemic on educational outcomes, or, more specifically, student achievement in mathematics. A review by Hammerstein et al. [6] provides a selection of studies at the primary and secondary education levels with comparative analyses of student achievement before and after COVID-19-related school closures. Among numerous studies published up to the end of April 2021, the authors chose only those (11 studies) that included applied statistical analyses rather than solely reporting percentages and analyzed them in detail. Most of these studies revealed negative effects of COVID-19-related school closures on students’ achievement, and seven of them reported a negative effect on mathematics achievement. Positive effects on mathematics achievement were less frequent and occurred only in connection with the more intensive use of online learning software already familiar to students.
A similar systematic review of school performance among children and adolescents during the COVID-19 pandemic was prepared by Panagouli [7], in which, in addition to student achievement, parents’ and teachers’ observations were also analyzed. The final selection covered 42 papers published by July 2021. Most of the students involved in the research showed a significant decrease in mathematics results compared to previous years; only a few maintained or improved their mathematics performance.
Recently, another survey on the implications of primary to upper-secondary school closures was prepared by Svaleryd et al. [8]. They provided an overview of virus transmission in schools, student learning, and mental health among students. This survey mostly confirmed learning losses and more pronounced learning inequalities. The available evidence identified the distance learning period as more challenging for younger students.
Since our study is oriented toward university students, we focused mainly on research dealing with parallels between populations of students pre-pandemic and during the pandemic and their achievements at the tertiary level. After reviewing the available literature, we found a poorer representation of publications at the tertiary level than at the primary and secondary levels. This agrees with the findings of Soesanto et al. [9]. In our opinion, there are several reasons for this. Certainly, one reason is the specificity of the study areas and courses, making the findings more difficult to compare and generalize. University students are perceived as autonomous learners responsible for choosing their own learning techniques and monitoring their learning process [10]. In addition, students entering tertiary education are expected to have much more experience and coping skills than students at lower education levels. Coping strategies and adaptation skills are key to successfully adjusting to university challenges [11]. One might therefore expect that there would be fewer effects on academic achievement at the university level compared to the primary and secondary education levels. However, studies dealing with the effects of general COVID-19 confinement on students’ performance in higher education reported both negative [12,13,14] and positive [15,16,17] effects.
Studies analyzing the impact of COVID-19, and consequently distance learning, on mathematics achievement at the tertiary educational level are very rare. Perhaps surprisingly, positive or no effects of distance learning on mathematics achievement at the tertiary level have been reported to a greater extent than at lower education levels [16,17,18].
After reviewing the representative literature (see Section 2), we found that much of the current comparative research on mathematics performance between pre-pandemic and pandemic generations of tertiary education students is mainly based on the final course grade. Hence, it is essential to examine also the extent to which the previously identified primary factors influencing students’ mathematics performance changed during the pandemic [17]. Our study compares the pre-pandemic and pandemic generations of students from this perspective.
The primary goal of the present study was to investigate how COVID-19-related measures affected the mathematical performance of university students in social sciences. For this purpose, we performed a case study at the Faculty of Organizational Sciences, University of Maribor, Slovenia, and compared in detail two populations of students, pre-pandemic and during the pandemic, from the perspective of their achievement in mathematics. Such comparison requires systematic in-depth analysis. To comprehensively understand how student performance has changed due to COVID-19-related school closures, it is important to investigate factors that potentially contributed to the changes [17].
The results of our previous research [19] showed that the mathematics achievement of university students in social sciences before the COVID-19 pandemic depended on the following factors: mathematics confidence, math anxiety, background knowledge from secondary school, and engagement in university mathematics courses. It is thus necessary to determine whether these factors are also relevant to the population of students during the pandemic. In other words, we need to verify whether our conceptual model developed in the pre-pandemic period using structural equation modeling (SEM) [19] is applicable to the pandemic period. Such verification requires the use of advanced statistical methods that include verification of both the measurement and the structural model (see Section 4.2). Only after the applicability of the model has been confirmed can the two student populations be compared. For this purpose multi-group structural equation modeling (MG-SEM) [20,21] is used.
The remainder of the paper is organized as follows: first, the results of a relevant literature review are presented. Then, the methodology of our empirical study is explained, and the results are systematically presented and discussed. Finally, the conclusions based on research implications are presented, and the limitations of the study and future research directions are highlighted.

2. Review of Related Literature

Findings on the academic performance of students before and after COVID-19-related primary and secondary school closures were outlined in the Introduction. The present study compares the academic performance of tertiary students pre-pandemic and during the pandemic. Accordingly, a review of the representative literature on the tertiary educational level was prepared. The results of the recently published bibliometric analysis [22] show that the number of scientific papers on online learning in higher education has risen exponentially. Qualitative methodologies, particularly case study analysis, dominate the field. Structural equation modelling (SEM) was the most frequently used quantitative method, and questionnaires were the most widely used instruments of data gathering [23]. Partial least squares structural equation modelling (PLS-SEM) was used to analyze a questionnaire related to the performance of mathematics teaching during the COVID-19 era in [24]. In line with the topic of the present study, special attention was paid to comparative studies (pre-pandemic vs. pandemic) of mathematical achievements.
Alangari [25] systematically reviewed current articles on STEM (science, technology, engineering, and mathematics) courses in higher education during COVID-19. Results indicated that faculty members considered the transition from face-to-face to distance learning to be effective even though they faced several challenges. Further investigation showed that the application of STEM online learning has helped to increase students’ creativity and the rate of STEM research by both students and faculty members. The findings of [26] indicate that for many mathematics lecturers in higher education, the move to online teaching has been a beneficial experience that will impact their future teaching.
De Paola et al. [12] applied a difference-in-differences identification strategy to compare the academic performance of students in different cohorts (affected or not affected by the health emergency) in several academic areas (science, humanities, social sciences) offered at an Italian public university and confirmed a negative impact of online teaching, mainly on students with a tendency to procrastinate. The negative effect was reflected in the students’ final examination grades. The findings also showed that the effects were particularly detrimental for first-year students, while almost no effect on master’s degree students was found. Kofoed et al. [13] reported lower final grades in the introductory economics course at the United States Military Academy for students in online classes compared to those in in-person classes, especially students with lower ability. Negative effects of online teaching on student performance were also confirmed by Orlov et al. [14] for students enrolled in economics courses at four R1 PhD granting institutions. At the same time, they found that student characteristics, including gender, race, and first-year status, were not significantly associated with the decline in performance during the pandemic semester. Gonzalez et al. [15] compared assessments of students in three subject areas at Universidad Autónoma de Madrid (Spain) before and after COVID-19 confinement. They found that confinement changed the students’ learning strategies to a more continuous habit, improving their efficiency. Therefore, the significant positive effect of COVID-19 confinement on student performance was not surprising.
Extensive research on virtual mathematics teaching during the COVID-19 pandemic was conducted among mathematics students at California State University, Fullerton [18]. The impact of virtual instruction on student course outcomes, including completion rates, passing rates, and course grades, was studied. A comparison of student outcomes between fall 2019 and fall 2020 indicated that the new modality of virtual classes had no impact on aggregated grades for students completing their courses. The course outcome data for all students enrolled in a mathematics course showed that the overall grade point average and success rates for both semesters were essentially the same. Pócsová et al. [16] found that distance learning did not affect higher-achieving students and reduced the number of students who did not pass the Mathematics 1 course at the Technical University of Košice. Student evaluations from 2020–2021 were compared to overall assessments from the last six academic years using descriptive statistics. The evaluations in 2020–2021 turned out to be significantly better than those in the previous years, which can be attributed to the ample online support for distance education. Specifically, the online study materials for the Mathematics 1 course had been active since 2008 and were completed just before the pandemic outbreak. When the request came to change the teaching method from face-to-face to distance learning, the university was therefore well prepared for an immediate transfer to the online space. Tomal et al. [17] empirically measured the impact of online teaching on students’ academic performance in 11 science, technology, engineering, and mathematics courses using a Bayesian linear mixed-effects model fitted to longitudinal data.
They observed an increase in overall average marks in courses requiring lower-level cognitive skills (including first-year mathematics courses) according to Bloom’s taxonomy of knowledge [27] and a decrease in marks in courses requiring higher-level cognitive skills (those with hands-on lab, coding, and programming components). Rosillo et al. [28] studied the impact of a new teaching experience called “Escape Room” on academic performance in mathematics of pharmacy and nursing students. The escape room activity has not shown significant differences in the results compared with the two previous years.
In addition to the already described negative influence on mathematical background knowledge from secondary school, recent pandemic literature also reports increased levels of math anxiety among university students [29,30,31] and highlights a general positive attitude toward learning, with information and communications technology (ICT) as a crucial element influencing students’ coping with learning mathematics during the pandemic [32].

3. Research Objectives

The main objective of our research was to compare populations of university social science students pre-pandemic and during the pandemic in terms of their achievement in mathematics and the factors influencing this achievement.
In our previous study [19], conducted before the outbreak of COVID-19, we attempted to identify the factors that influence the mathematics achievement of social science students. For this purpose, we developed a conceptual model with seven interdependent factors selected according to the literature review, which are listed in Table 1.
To determine which of the seven proposed factors are influential for predicting mathematics achievement in the populations of students under consideration, seven hypotheses were formulated. Table 2 shows that five of them were supported and two were rejected. The results indicate that the constructs Behavioral Engagement (BE), Perceived Usefulness of Technology in Learning Mathematics (PUTLM) (and consequently Confidence with Technology (CT)) had no significant influence on the mathematical achievement of the student population pre-pandemic. In contrast, the constructs Mathematics Confidence (MC), Perceived Level of Math Anxiety (PLMA), Background Knowledge from Secondary School (BKSS), and Self-Engagement in Mathematics Course at University (SEMCU) were found to have a significant impact and therefore should be considered when planning improvements to educational processes.
The idea of the current study was to apply our pre-pandemic conceptual model to the population of university social science students during the pandemic and try to find parallels or identify possible divergences between the two populations, if any. In order to pursue the research objectives, the following research questions were formulated:
RQ1: Is the conceptual model of factors influencing mathematics achievement (developed in [19]) applicable to the population of university social science students during the pandemic?
Answering this question was a basic and preliminary activity of our study. Indeed, the occurrence of the pandemic forced us to face a new reality, requiring us to conduct a comprehensive evaluation of the existing model in terms of its applicability to the new pandemic circumstances. Such an evaluation includes verifying that the understanding of the model constructs remains unchanged in both populations of students. Only if we determine that the model is applicable to the population of students during the pandemic can we proceed with further research questions, such as:
RQ2: Which factors of mathematics achievement considered in our pre-pandemic research [19] are important for the population of students during the pandemic, and how strong is their influence? Are there parallels/divergences between the two populations?
In the context of this question, we wanted to examine the extent to which the factors that were found to influence the mathematics achievement of the pre-pandemic population of students also had an impact during the pandemic. We wanted to find out if there were differences in terms of the significance and magnitude of factors between the two populations of students.

4. Materials and Methods

4.1. Measurement Instrument and Data Collection

Our study was aimed at investigating the population of full-time students (bachelor’s degree) at the Faculty of Organizational Sciences, University of Maribor. To collect data, we used the three-part online questionnaire presented in [19].
The first part collected the students’ sociodemographic data (gender, age, year of study, and course of study) and mathematical background knowledge from secondary school (BKSS). The second part focused on the perceived level of math anxiety (PLMA), and the third part was designed to measure students’ attitudes toward mathematics (MC and BE), information technology (CT), and attitudes toward involving technology in learning mathematics (PUTLM).
Data were collected systematically for nine consecutive academic years (from 2013–2014 to 2020–2021). The questionnaire was made available at the beginning of the mathematics course through Moodle (Modular Object-Oriented Dynamic Learning Environment). Participation in the survey was completely voluntary, and the students did not receive any benefits. At the end of a given academic year, the survey data were supplemented with data on students’ self-engagement (SEMCU) and their final achievement of the course. After combining the two data sources, the data were anonymized.
The pre-pandemic population comprised students in the first six academic years (from 2013–2014 to 2018–2019) who took courses through traditional face-to-face sessions. However, due to the COVID-19 pandemic outbreak in the spring of 2020, all class activities for the last two years (2019–2020 and 2020–2021) were moved to distance learning. These students represent the pandemic population. To capture their perceptions of the new reality, students in the pandemic population were asked to rate a series of COVID-19-related stressors in terms of depressive symptoms, anxiety, stress, and loneliness [52]. In order to determine any change in the students’ mental health, they were specifically asked whether they felt more or less stressed by particular factors since the COVID-19 crisis began.

4.2. Statistical Methods

The present study is a continuation of our research [19], in which a conceptual SEM (structural equation modelling) model was developed to investigate the main factors influencing mathematics achievement of selected student populations. SEM is an advanced statistical method widely used in behavioral sciences to test causal and correlational relationships between observed and unobserved (latent) variables simultaneously (see e.g., [20,53,54]).
As defined in Section 3, the main objective of the current research was to compare two populations of students, pre-pandemic and during the pandemic, in terms of their achievement in mathematics and the factors influencing this achievement. Therefore, in order to make a comprehensive comparison, the same methodological framework should be used. The data were again analyzed using the two-stage SEM approach. The first stage aimed to evaluate the measurement model (in order to answer RQ1), while the second stage aimed to analyze the structural relationships among the model constructs (i.e., latent variables) (in order to answer RQ2).

4.2.1. Measurement Model Evaluation

The basis for the analyses was the conceptual model developed in [19] and described in Table 1. In order to answer RQ1, we needed to comprehensively evaluate the applicability of the research model for the purpose of the current study. This was a relatively complex task that involved the following consecutive steps:
  • Step 1: model overall fit evaluation,
  • Step 2: confirmatory factor analysis,
  • Step 3: multi-group confirmatory factor analysis.
The implementation of a particular step is described in the following subsections. At each step, certain conditions must be met. Otherwise, the research cannot continue, which means that the predefined research goal cannot be achieved.

Step 1: Model Overall Fit Evaluation

The overall fit of the measurement and structural models was assessed with a set of widely used fit indices and compared to the proposed thresholds, as follows:
  • comparative fit index (CFI), CFI > 0.9 [55];
  • root mean square error of approximation (RMSEA), RMSEA < 0.08 [56];
  • standardized root mean square residual (SRMR), SRMR < 0.08 [57];
  • ratio of Chi-square to degrees of freedom (χ2/df), χ2/df < 3 [58].
The evaluation of the overall fit of the model was performed in three ways: first, the entire sample was considered, and then each subsample (pre-pandemic and pandemic) was analyzed separately. In all cases, the fit indices should be satisfied. However, Cheung and Rensvold [59] emphasized that some fit indices may be affected by the complexity of the model. Therefore, we can be somewhat more lenient in evaluating the model fit of relatively complex models. The process of overall fit evaluation of the model is illustrated in Figure 1.
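The four thresholds listed above lend themselves to a simple automated check. The following Python snippet is purely illustrative (the study’s own analyses were run in R, see Section 5); the example values are those reported for the whole sample in Section 5.3.1.

```python
# Illustrative check of the overall-fit criteria listed above:
# CFI > 0.9, RMSEA < 0.08, SRMR < 0.08, chi-square/df < 3.

THRESHOLDS = {
    "cfi":      lambda v: v > 0.90,
    "rmsea":    lambda v: v < 0.08,
    "srmr":     lambda v: v < 0.08,
    "chisq_df": lambda v: v < 3.0,
}

def evaluate_fit(indices):
    """Return a pass/fail verdict for each reported fit index."""
    return {name: THRESHOLDS[name](value) for name, value in indices.items()}

# Values reported for the whole sample in Section 5.3.1
fit = {"cfi": 0.911, "rmsea": 0.060, "srmr": 0.065, "chisq_df": 2.44}
print(evaluate_fit(fit))  # every criterion is satisfied (all True)
```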

Step 2: Confirmatory Factor Analysis

Within this step, the measurement model should be validated on the entire sample of data for convergent and discriminant validity. For this purpose, confirmatory factor analysis (CFA) can be used by checking the following indices and their threshold requirements [55,60]:
  • convergent validity:
    standardized factor loadings (SFL), SFL > 0.5 for all questionnaire items;
  • discriminant validity:
    composite reliability (CR), CR > 0.7
    average variance extracted (AVE), AVE > 0.5 for all latent variables (i.e., model constructs).
The process of performing CFA is illustrated in Figure 2.
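The two reliability/validity indices used in Step 2 have closed-form definitions in terms of the standardized factor loadings λ of a construct: CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) and AVE = (Σλ²) / k, where k is the number of items. A minimal Python sketch with hypothetical loadings (not values from the study):

```python
# Composite reliability (CR) and average variance extracted (AVE)
# computed from standardized factor loadings, as checked in Step 2.
# The loadings below are hypothetical.

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    error = sum(1 - l**2 for l in loadings)
    return s**2 / (s**2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of squared standardized loadings."""
    return sum(l**2 for l in loadings) / len(loadings)

loadings = [0.72, 0.68, 0.81, 0.75]
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f} (> 0.7: {cr > 0.7}), AVE = {ave:.3f} (> 0.5: {ave > 0.5})")
```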

Step 3: Multi-Group Confirmatory Factor Analysis

While CFA examines whether the hypothesized measurement model fits the entire data sample well, within this step, we need to precisely compare the measurement model across both subsamples (pre-pandemic, pandemic). For this purpose, a multi-group confirmatory factor analysis (MG-CFA) (see e.g., [20,21]) can be used. The aim of MG-CFA is to examine whether respondents in different groups understand the conceptual model equally and ascribe similar importance to questionnaire items [61,62].
MG-CFA requires a sequential comparison of nested models that differ in terms of the item parameters constrained equally between groups to identify (non)invariant parameters [63].
In order to assess the psychometric equivalence of a construct between the groups [64], we performed the measurement invariance (MInv) test [65]. MInv is accepted if the difference in model parameters between groups is so small that it is attributable to chance. If the model is statistically invariant between groups, then it can be argued that any differences in factor scores are attributable to characteristics of the groups rather than to any deficiencies of the statistical model or inventory [66], while non-invariance in the measurement indicates that a construct has different structures and/or meanings for respondents in different groups.
MInv is typically tested in the following order: configural, weak, and strong invariance, sometimes followed by strict invariance [21].
Since χ2 tends to be oversensitive to small, unimportant deviations from a perfect model in large samples [59,67], the results of each invariance test can be evaluated using the changes in several alternative fit indices (AFI). Therefore, changes in CFI (ΔCFI), SRMR (ΔSRMR), and RMSEA (ΔRMSEA) should be examined. In [67], the author suggested that a criterion of −0.01 for ΔCFI should be paired with a ΔRMSEA of 0.015 and a ΔSRMR of 0.03 (for metric invariance) or 0.015 (for scalar or residual invariance).
The process of MG-CFA is illustrated in Figure 3.
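One reading of the ΔAFI criterion from [67] can be expressed as a simple decision rule. The Python snippet below is illustrative only (the study used R); the example deltas are the weak-invariance values reported later in Section 5.3.3.

```python
# Decision rule based on the delta-AFI criterion described above:
# invariance is questioned when CFI drops by more than 0.01
# together with RMSEA increasing by at least 0.015 or SRMR by at
# least 0.03 (0.015 for scalar/residual invariance). This is one
# possible operationalization, given for illustration.

def invariance_supported(d_cfi, d_rmsea, d_srmr, level="metric"):
    """Return True if the changes in fit indices support invariance."""
    srmr_cut = 0.030 if level == "metric" else 0.015
    violated = d_cfi <= -0.01 and (d_rmsea >= 0.015 or d_srmr >= srmr_cut)
    return not violated

# Deltas reported for the weak-invariance test in Section 5.3.3
print(invariance_supported(-0.0031, -0.001, 0.001))  # True
```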

4.2.2. Structural Model Analysis

In order to answer RQ2, the structural relationships among the model constructs (i.e., latent variables) were analyzed. In applications involving more than one sample (as is the case in the present study), it is of key importance to determine whether or not the components of the measurement model and/or structural model are equivalent (i.e., invariant) between particular groups of interest. For this purpose, we used multi-group structural equation modeling (MG-SEM) [20,21]. The evaluation of the structural model was performed by analyzing the relationships between the latent variables of the model, reporting the path coefficients (unstandardized b, and standardized β) and corresponding z-values. The path coefficients in both groups were examined in terms of whether they differed or not.
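A common way to examine whether an unstandardized path coefficient b differs between two independent groups is a z-test on the difference of the group estimates. The sketch below uses hypothetical coefficients and standard errors and is not necessarily the exact procedure applied in this study.

```python
import math

# z-test for the difference of an unstandardized path coefficient b
# between two independent groups. All numbers are hypothetical.

def path_difference_z(b1, se1, b2, se2):
    """z statistic for H0: the path coefficient is equal in both groups."""
    return (b1 - b2) / math.sqrt(se1**2 + se2**2)

z = path_difference_z(0.45, 0.08, 0.38, 0.11)
significant = abs(z) > 1.96  # two-sided 5% critical value
print(f"z = {z:.2f}, significant difference: {significant}")
```

With these illustrative numbers the difference is far from significant, which mirrors the kind of group comparison reported in Section 5.4 of the paper.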

5. Results

All analyses were performed using the R-package lavaan [68] and semTools [69]. The results are presented in the following subsections.

5.1. Sample Characteristics

A total of 477 students participated in the study, 321 (67.30%) from the pre-pandemic period and 156 (32.70%) from the pandemic period; 44.2% were men and 55.8% were women. The average age of the participants was 20.1 years (with a standard deviation of 1.61 years), with a range of 18 to 32 years.

5.2. Descriptive Statistics

Table 3 presents descriptive statistics for all model constructs (i.e., latent variables). Results are reported for both groups of respondents, pre-pandemic and during the pandemic. For better transparency, the mean values of questionnaire items that were measured on a 5-point Likert-type scale (items within the constructs MC, BE, MTA, NTA, MCA, CT, and PUTLM) are presented graphically in Figure 4.

5.3. Measurement Model Evaluation

5.3.1. Step 1: Model Overall Fit Evaluation

To evaluate the overall fit of the measurement model we followed the process illustrated in Figure 1. First, the whole sample of respondents was considered. The calculated χ2 value was 1392.42 at 570 degrees of freedom, and the χ2/df ratio was 2.44 < 3. Both CFI and SRMR indicate good model fit (CFI = 0.911 > 0.9; SRMR = 0.065 < 0.08). RMSEA is 0.060, and the upper bound of its 90% confidence interval (0.055, 0.065) is less than 0.08, suggesting an appropriate model fit [56].
The same calculations were performed for both subsamples of respondents (pre-pandemic and pandemic). It was found that only CFI for the pandemic group is slightly below the desired threshold of 0.9 (probably due to smaller subsample size), while other fit indices fulfill the criteria (χ2/df = 1.57 < 3; SRMR = 0.079 < 0.08; upper bound of 90% confidence interval of RMSEA = 0.062 < 0.08). Since the recommendations for interpreting fit indices suggest using a set of indices instead of just one value, we conclude that the measurement model fits the overall sample as well as both subsamples. The results of the overall fit evaluation are presented in Table 4. It can be seen that all conditions to proceed with Step 2 are fulfilled.

5.3.2. Step 2: Confirmatory Factor Analysis

The analysis followed the process illustrated in Figure 2. As can be seen in Table A1 in Appendix A, the standardized factor loadings (SFLs) for all items exceed the threshold of 0.5; therefore, the measurement model achieved overall convergent validity.
Table 5 shows the values of composite reliability (CR), average variance extracted (AVE), square root of AVE (on the diagonal), and correlations between model constructs. It can be seen that the AVE value for MCA is slightly below the threshold (0.498 < 0.5), while the value for all other constructs exceeds the threshold of 0.5. Since CR of MCA is above 0.6 (0.798), construct validity is still adequate [60], meaning that all included items are significantly related to the corresponding construct. When we compare the square root of AVE for each construct with the correlations between the other constructs, we can see that the square roots are larger than the correlation coefficients, thus also demonstrating the discriminant validity of the entire measurement model.
All the results obtained within this step prove that we can proceed with Step 3.
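The Fornell–Larcker comparison used above (the square root of each construct’s AVE must exceed its correlations with all other constructs) can be sketched as follows. The AVE values and correlations below are hypothetical, chosen only to illustrate the check.

```python
import math

# Fornell-Larcker check for discriminant validity: a construct passes
# when sqrt(AVE) exceeds the absolute value of its correlation with
# every other construct. AVEs and correlations are hypothetical.

def fornell_larcker(ave, corr):
    """Return a per-construct verdict (True = discriminant validity holds)."""
    result = {}
    for construct, a in ave.items():
        others = [r for (x, y), r in corr.items() if construct in (x, y)]
        result[construct] = all(math.sqrt(a) > abs(r) for r in others)
    return result

ave = {"MC": 0.61, "PLMA": 0.55, "BKSS": 0.67}
corr = {("MC", "PLMA"): -0.52, ("MC", "BKSS"): 0.41, ("PLMA", "BKSS"): -0.33}
print(fornell_larcker(ave, corr))  # all constructs pass in this example
```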

5.3.3. Step 3: Multi-Group Confirmatory Factor Analysis

The analysis followed the process illustrated in Figure 3. The results of testing MInv between groups are shown in Table 6. MInv tests were used in hierarchical order of the nested models, with configural invariance assessed first, followed by weak, strong, and strict invariance.
It can be seen from Table 6 that the first iteration (M2) indicates good model fit (CFI = 0.910 > 0.9; SRMR = 0.063 < 0.08; RMSEA = 0.059 < 0.08), suggesting that the structure of the constructs is the same between the two groups, thus supporting configural invariance.
In the next iteration, we performed a test for weak invariance, and the factor loadings were constrained to be equal between the groups. Comparing models M3 and M2, we can see that the p-value of the Chi-square test is not significant at the 5% level (p = 0.4173 > 0.05), indicating that weak invariance is supported. In addition, the differences in alternative fit indices (ΔAFI) between the configural and weak models are small (ΔCFI = −0.0031; ΔRMSEA = −0.001; ΔSRMR = 0.001), supporting weak invariance.
In testing strong invariance, in addition to factor loadings, the intercepts were also constrained to be equal between the groups. Comparing M4 and M3, we can see that the p-value of the Chi-square test is significant at the 5% level (p = 0.0132 < 0.05), indicating that strong invariance is not supported. Therefore, intercepts are not completely invariant between the two considered groups.
In the next two sequential steps (models M4a and M4b), the intercepts of the measured items “additional points” and “final grade in high school” were freely estimated between the groups, and partial strong invariance (M4b) was established, since the p-value of the Chi-square test between M4b and M3 is not significant at the 5% level (p = 0.1895 > 0.05).
The first variable (additional points), for which intercepts were set to be freely estimated between groups, belongs to the construct self-engagement in mathematics course at university (SEMCU), while the second one (final grade in high school) belongs to the construct background knowledge from secondary school (BKSS). As can be seen in Table 3, the pre-pandemic students gained on average 4.28 additional points, while the pandemic students gained on average 7.55. Furthermore, the boxplot and histogram in Figure 5 show that the fraction of students who did not gain any additional points was much greater pre-pandemic than during the pandemic (29.9% vs. 5.1%), while 14.2% of the pandemic population gained the maximum number of additional points (13), compared with 7.2% of the pre-pandemic population. Based on the results in Table 3 and Figure 6, we can also conclude that pandemic students, on average, achieved a slightly higher final grade in high school: the average final grade was 3.66 for pandemic students, compared with 3.35 for pre-pandemic students.
To test strict invariance, the residual (or error) variances were additionally constrained to be equal between the groups. There was a significant change (Chi-square p = 0.0151 < 0.05) between the partial strong model (M4b) and the strict model (M5), indicating a lack of fit when error variances are also constrained to be invariant across the two groups. According to [64], strict measurement invariance is not obligatory, so we proceeded to the evaluation of the structural model (analysis of RQ2).

5.4. Structural Model Analysis

5.4.1. Structural Model Evaluation

Since we confirmed RQ1, we proceeded with evaluating the structural model fit. We adopted two variants of the structural model, SM1 and SM2, and performed MInv tests. In both variants, the measurement part follows the partial strong invariance model established above (loadings and intercepts constrained to be equal across groups, except for the freely estimated intercepts of additional points and final grade in high school). In SM1, the structural coefficients (paths) are freely estimated between the groups, while in SM2 they are constrained to be equal between the groups.
The results of the tests are given in Table 7. It can be seen that the fit of both models, SM1 and SM2, was good. The fit of SM2 was as follows: χ2 = 2481.29; df = 1289; CFI = 0.894; and RMSEA = 0.061 (90% CI: 0.057, 0.064). The χ2 test of the two nested models and the small or negligible differences in the fit indices show that SM1 and SM2 do not differ significantly at the 5% level. This implies that SM2, the simpler model with paths constrained equal across groups, is preferred. Moreover, it implies that both groups of respondents (pre-pandemic and pandemic) perceived the impact of the factors within the conceptual model from [19] in the same way. Thus, from this point of view, the two populations of students are fully comparable, which provides part of the answer to RQ2.
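As a quick plausibility check on the reported absolute fit of SM2, the normed chi-square (χ²/df) can be computed from the values above; a ratio below the commonly cited cutoff of 3 (cf. the χ²/df = 2.44 reported for the measurement model in Table A1) indicates acceptable fit. This is our illustrative check, not part of the original analysis.

```python
# Fit of SM2 as reported in Table 7
chi_sq, df = 2481.29, 1289
normed_chi_sq = chi_sq / df
print(round(normed_chi_sq, 3))  # 1.925, below the conventional cutoff of 3
```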

5.4.2. Hypotheses Testing

The next step is to evaluate the structural paths and test the hypotheses from Table 2. The results are given in Table 8. The unstandardized coefficient (b) and the standardized coefficient (β), which reflect the relationships among the model constructs (i.e., latent variables) in terms of magnitude and statistical significance, are listed for both groups (pre-pandemic and pandemic). Corresponding z-values and p-values are also provided.
As can be seen in Table 8, five out of seven hypotheses were confirmed, and two (H1b and H6) were rejected at the 5% significance level.

5.4.3. Final Model

To find the most parsimonious final model, the nested models approach was applied, in which insignificant paths were iteratively eliminated from the model.
In the first step, the non-significant path PUTLM → MA (H6) was eliminated. The new model was compared to SM2, and a non-significant Chi-squared difference (Δχ2 = 0.892; Δdf = 1; p = 0.3448) justified the elimination. Consequently, the path CT → PUTLM (H5) was also eliminated, implying that the constructs CT and PUTLM could be decoupled from the rest of the model.
In the next step, the non-significant path BE → PLMA (H1b) was eliminated. A non-significant Chi-squared difference (Δχ2 = 2.708; ∆df = 1; p = 0.0998) again justified the elimination and indicated acceptance of the simpler model.
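The p-values of these single-degree-of-freedom Δχ² tests can be reproduced with the chi-square survival function; for df = 1 it reduces to erfc(√(x/2)), so only the Python standard library is needed (scipy.stats.chi2.sf covers the general case). The values below match those reported for the two eliminations.

```python
import math

def chi2_sf_df1(x):
    """Survival function of the chi-square distribution with 1 degree of
    freedom: P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

# Delta-chi-square values from the nested-model eliminations above
print(round(chi2_sf_df1(0.892), 4))  # ~0.3449 (reported p = 0.3448)
print(round(chi2_sf_df1(2.708), 4))  # ~0.0998 (reported p = 0.0998)
```

Both p-values exceed 0.05, which is what justifies accepting the simpler model at each step.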
Finally, four significant paths (H1a, H2, H3, and H4) remained in the final structural model. The fit of the final model was good, with index values as follows: χ2 = 1584.16, df = 678, CFI = 0.874, and RMSEA = 0.078 (90% CI: 0.073, 0.083).
The structural paths of the final model were then evaluated, and the results are presented in Figure 7.
The results confirmed that all four hypotheses (H1a, H2, H3, and H4) were supported at the 0.1% significance level (H1a: βpp = −0.736, βp = −0.773, z = −7.332, p = 0.000; H2: βpp = −0.225, βp = −0.199, z = −4.165, p = 0.000; H3: βpp = 0.331, βp = 0.374, z = 3.793, p = 0.000; H4: βpp = 0.873, βp = 0.759, z = 9.645, p = 0.000). The coefficient of determination for MA is R²pp = 0.854 for the pre-pandemic population of students, while the corresponding value for the pandemic population is R²p = 0.661.
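The unexplained-variance figure discussed in the conclusions follows directly from these coefficients of determination; a one-line check, included here only to make the arithmetic explicit:

```python
# Coefficients of determination for MA reported for the final model
r2_pre, r2_pandemic = 0.854, 0.661
# Proportion of variance in mathematics achievement left unexplained (%)
print(round((1 - r2_pre) * 100, 1), round((1 - r2_pandemic) * 100, 1))  # 14.6 33.9
```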

6. Discussion and Conclusions

Due to the global crisis caused by the COVID-19 pandemic in 2020, universities all over the world underwent a complete transition to online education. Faculty members with little to no experience in online teaching quickly found themselves teaching all of their courses entirely online [18]. This transition was especially challenging in university mathematics education, where face-to-face lectures combined with tutorials are well established and commonly used forms of instruction (Bergsten (2007), Rach and Heinze (2017) as cited in [32]). Even before the pandemic, it was known that running courses entirely online was more likely to cause difficulties for students in mathematics than in other disciplines [70].
Several studies have sought to explain the impact of the rapid transition to online mathematics teaching during the pandemic on students’ academic development [12,13,14,15,16,17,24,26,28]. In April 2021, Soesanto et al. [9] encouraged researchers to bridge the still-large research gap regarding the causal effects of implementing online mathematics learning in times of crisis, in order to contribute to developing pedagogical strategies that improve university students’ mathematics performance and attitudes.
To provide complementary evidence on the impact of online mathematics teaching at the tertiary level of education, the present research compares pre-pandemic and pandemic populations of university students in social sciences in Slovenia in terms of their mathematics achievement and the factors influencing it. The results of two-stage structural equation modeling confirm that the conceptual model developed in our previous study [19] was also applicable to the pandemic population of students (RQ1). The same factors relevant to mathematics achievement in the pre-pandemic population (mathematics confidence, perceived level of math anxiety, background knowledge from secondary school, and self-engagement in mathematics courses at university) also proved relevant for the pandemic population (RQ2). Moreover, both populations perceived the impact of the factors within the conceptual model in the same way, and the magnitudes of the effects of the given factors are comparable. The structural paths of the final model are listed below:
H1a: 
Mathematics confidence negatively affects the perceived level of math anxiety.
H2: 
Perceived level of math anxiety negatively affects the achievement in mathematics.
H3: 
Background knowledge from secondary school positively affects self-engagement and motivation to fulfil obligations during the mathematics course at university.
H4: 
Self-engagement in mathematics course at university positively affects the achievement in mathematics.
The nested models approach confirmed that all four hypotheses (H1a, H2, H3, and H4) could be supported at the 0.1% significance level (see Section 5.4.3). The statistical analysis showed that the causal and correlational relationships between mathematics achievement factors remained unchanged; from this point of view, no statistically significant differences between the populations were confirmed. For both populations (pre-pandemic and pandemic), mathematics confidence, perceived level of math anxiety, background knowledge from secondary school, and self-engagement in mathematics courses at university were confirmed as influential. Similar values of the standardized path coefficients β indicate that the magnitudes of the effects of the individual factors are also comparable. Our results therefore show that no increased negative impact on achievement in mathematics was found for any of the factors. These results support previous studies reporting only minor or no negative effects of distance learning on mathematics achievement [16,17,18].
In addition, the results showed a fairly high value of the coefficient of determination for mathematics achievement for both groups of students (0.854 and 0.661), suggesting that the variables “Perceived Level of Math Anxiety” and “Self-Engagement in Mathematics Course at University” together explain a substantial proportion of the total variance. However, given that the proportion of unexplained influences for the “pandemic” students is 33.9%, this likely indicates that their learning outcomes are also influenced by other factors that were not considered in the present model. We intend to focus on identifying these factors in our future research.
Given the general belief in society and commonly expressed assumptions about the negative effects of COVID-19-related measures on learning outcomes, we were quite surprised to find no significant differences in mathematics achievement between the two groups. The following is a reflection on the possible reasons for this outcome. We believe that the results reflect students’ and teachers’ extraordinary efforts to successfully implement online mathematics teaching. For the pandemic population, all courses were redesigned for distance education using the Moodle e-learning environment and the MS Teams video conferencing system. We see the most important reasons for the successful outcome of the online learning process as follows:
  • familiarity with the Moodle learning platform. Even before the pandemic, parts of some courses at our faculty were conducted online via Moodle. Therefore, the professors had at least basic knowledge of and some experience with preparing lessons, quizzes, and other e-activities available in this environment;
  • well-adjusted study materials. We strived to construct an authentic and engaging online learning experience. A number of online study materials were made available, including recorded classes and tutorials. In the regular annual student survey at the end of the course, these recordings were highlighted as extremely helpful and important in enriching the learning experience. This is consistent with the findings of Busto et al. [71]. Similarly, in [18], recordings were reported as a useful resource for re-watching portions of a lecture, especially in mathematics courses, where lessons are often broken down by concepts using specific examples;
  • a positive attitude of students. We agree with Takács et al. [11], who found that students were capable of effectively adapting to the virtual teaching modality. They recently explored the characteristics of, and changes in, the coping skills of university students in three age groups, finding that pre-pandemic and pandemic students did not differ in coping skills, although changes over the past 20 years were confirmed. The younger students were found to be fast at processing information but less socially efficient compared to older students. Students of generation Z (born between 1995 and 2012) generally see new situations as a positive challenge and deal with them creatively. Vergara [72] also found that students had a positive attitude toward learning mathematics and strong persistence despite the challenges they encountered;
  • accessibility and responsiveness of professors. The lack of communication between students (most of the first-year students did not know each other) led to an increased volume of messages and questions addressed directly to the professors. Although it was very time-consuming, we tried to respond in a timely manner. Timely feedback on student work is highly encouraged so that they will maintain a positive outlook in terms of their capabilities [73].
In summary, the results of our research show that with proper planning, mathematics lectures and tutorials can be successfully conducted online. At the same time, we are aware that it is impossible to ensure the same conditions for all students, especially in providing adequate study space, sufficient technical equipment, and a reliable internet connection [16,73]. In addition, COVID-19 despair and the social, psychological, and emotional stress that comes with it can contribute significantly to poor student performance [31]. The impact of the pandemic on stress, mental health, and coping behavior of university students has been well documented from various perspectives [31,52,74,75,76]. In our case, students’ responses to COVID-19-related stressors as described in [52] show that students during the pandemic felt more socially isolated, felt as if they were missing out, and were more worried about their future career, family, and friends compared to the pre-pandemic time. To expand the mathematics achievement model with social, psychological, and emotional stressors, established psychological scales could be used in future research to more accurately assess the impact of the pandemic on students’ mental health and its correlation with mathematics achievement.
There are also some limitations to our study that should be noted. First, all data were collected from a single faculty of a Slovenian university; consequently, the extent to which the results presented here generalize to other types of institutions is unclear. Second, our sampling relied on voluntary participation, which may have introduced selection bias. In addition, student achievement in mathematics could be linked to the modified format of assessment tools or to cheating on online exams during the pandemic. However, it should be emphasized that only a small portion of the assessment (around 20%) was undertaken remotely, so we believe these aspects are likely to have played a minor role in our setting.

Author Contributions

Conceptualization, A.B. and A.Ž.; methodology, A.Ž. and J.J.; data curation, G.R. and A.Ž.; formal analysis, A.Ž., J.J. and G.R.; writing—original draft preparation, A.B., J.J. and A.Ž.; writing—review and editing, A.B., J.J., G.R. and A.Ž. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

This research was supported by the Slovenian Research Agency, program no. P5-0018, Decision Support Systems in Digital Business, and program no. P5-0433, Digital Restructuring of Deficit Occupations for Society 5.0 (Industry 4.0).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Parameter estimates, error terms, and z-values for measurement model.
| Construct | Item | Unst. Factor Loading | Error Term | z-Value | Stand. Factor Loading |
|---|---|---|---|---|---|
| Mathematics Confidence (MC) | MC1 | 1.000 | - a | - a | 0.563 |
| | MC2 | 1.607 | 0.146 | 11.013 | 0.806 |
| | MC3 | 1.374 | 0.130 | 10.603 | 0.757 |
| | MC4 | 2.002 | 0.173 | 11.558 | 0.866 |
| Behavioral Engagement (BE) | BE3 | 1.000 | - a | - a | 0.728 |
| | BE4 | 1.237 | 0.113 | 10.941 | 0.896 |
| Mathematics Test Anxiety (MTA) | MTA1 | 1.000 | - a | - a | 0.735 |
| | MTA2 | 0.928 | 0.053 | 17.359 | 0.758 |
| | MTA3 | 1.018 | 0.047 | 21.758 | 0.811 |
| | MTA4 | 1.031 | 0.042 | 24.366 | 0.811 |
| | MTA5 | 0.943 | 0.070 | 13.516 | 0.687 |
| | MTA6 | 1.090 | 0.070 | 15.490 | 0.806 |
| | MTA7 | 1.084 | 0.068 | 15.906 | 0.767 |
| | MTA9 | 0.858 | 0.058 | 14.831 | 0.639 |
| | MTA10 | 0.815 | 0.068 | 11.986 | 0.580 |
| Numerical Task Anxiety (NTA) | NTA2 | 1.000 | - a | - a | 0.920 |
| | NTA3 | 0.991 | 0.037 | 26.950 | 0.931 |
| | NTA4 | 0.986 | 0.033 | 30.322 | 0.925 |
| | NTA5 | 1.000 | 0.050 | 19.970 | 0.857 |
| Mathematics Course Anxiety (MCA) | MCA1 | 1.000 | - a | - a | 0.660 |
| | MCA2 | 1.330 | 0.159 | 8.365 | 0.730 |
| | MCA3 | 1.156 | 0.089 | 13.048 | 0.724 |
| | MCA4 | 1.146 | 0.099 | 11.597 | 0.695 |
| Perceived Level of Mathematics Anxiety (PLMA) | MTA | 1.000 | - a | - a | 0.814 |
| | NTA | 0.401 | 0.088 | 4.533 | 0.379 |
| | MCA | 0.735 | 0.097 | 7.613 | 0.801 |
| Background Knowledge from Secondary School (BKSS) | Grade in mathematics in final year | 1.000 | - a | - a | 0.936 |
| | Grade in mathematics at matura | 0.727 | 0.082 | 8.896 | 0.649 |
| | Final grade in high school | 0.533 | 0.049 | 10.954 | 0.594 |
| Self-Engagement in Mathematics Course at University (SEMCU) | e-Activities | 1.000 | - a | - a | 0.783 |
| | Additional points | 0.268 | 0.039 | 6.776 | 0.544 |
| Confidence with Technology (CT) | CT1 | 1.000 | - a | - a | 0.880 |
| | CT2 | 0.689 | 0.044 | 15.834 | 0.721 |
| | CT3 | 1.115 | 0.056 | 19.849 | 0.848 |
| | CT4 | 0.782 | 0.053 | 14.650 | 0.658 |
| Perceived Usefulness of Technology in Learning Mathematics (PUTLM) | PUTLM1 | 1.000 | - a | - a | 0.880 |
| | PUTLM2 | 0.689 | 0.044 | 15.834 | 0.721 |
| | PUTLM3 | 1.115 | 0.056 | 19.849 | 0.848 |
| | PUTLM4 | 0.782 | 0.053 | 14.650 | 0.658 |
Note: - a Indicates parameter fixed at 1 in original solution. Fit indices: χ2 = 1392.42, df = 570, χ2/df = 2.44, SRMR = 0.064, CFI = 0.906, RMSEA = 0.055, 90% CI for RMSEA: (0.052, 0.069).

References

  1. Chaturvedi, K.; Vishwakarma, D.K.; Singh, N. COVID-19 and its impact on education, social life and mental health of students: A survey. Child. Youth Serv. Rev. 2021, 121, 105866.
  2. Al-Maskari, A.; Al-Riyami, T.; Kunjumuhammed, S.K. Students academic and social concerns during COVID-19 Pandemic. Educ. Inf. Technol. 2022, 27, 1–21.
  3. Pokhrel, S.; Chhetri, R. A literature review on impact of COVID-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141.
  4. Clark, A.E.; Nong, H.; Zhu, H.; Zhu, R. Compensating for academic loss: Online learning and student performance during the COVID-19 pandemic. China Econ. Rev. 2021, 68, 101629.
  5. Camacho-Zuñiga, C.; Pego, L.; Escamilla, J.; Hosseini, S. The impact of the COVID-19 pandemic on students’ feelings at high school, undergraduate, and postgraduate levels. Heliyon 2021, 7, e06465.
  6. Hammerstein, S.; König, C.; Dreisörner, T.; Frey, A. Effects of COVID-19-related school closures on student achievement—A systematic review. Front. Psychol. 2021, 12, 746289.
  7. Panagouli, E.; Stavridou, A.; Savvidi, C.; Kourti, A.; Sergentanis, T.; Tsitsika, A. School performance among children and adolescents during COVID-19 pandemic: A Systematic review. Children 2021, 8, 1134.
  8. Svaleryd, H.; Vlachos, J. COVID-19 and School Closures; GLO Discussion Paper Series; Global Labor Organization (GLO): Geneva, Switzerland, 2022.
  9. Soesanto, R.H.; Dirgantoro, K.P.S. Commemorating one-year of the COVID-19 pandemic: Indonesian and international issues of secondary and tertiary mathematics learning. Int. J. Stud. Educ. Sci. 2021, 2, 18–35.
  10. Al Ghazali, F. Challenges and opportunities of fostering learner autonomy and self-access learning during the COVID-19 pandemic. Stud. Self-Access Learn. J. 2020, 11, 114–127.
  11. Takács, R.; Takács, S.; Kárász, J.T.; Horváth, Z.; Oláh, A. Exploring coping strategies of different generations of students starting university. Front. Psychol. 2021, 12, 740569.
  12. De Paola, M.; Gioia, F.; Scoppa, V. Online Teaching, Procrastination and Students’ Achievement: Evidence from COVID-19 Induced Remote Learning; IZA Discussion Papers; Institute of Labor Economics (IZA): Bonn, Germany, 2022.
  13. Kofoed, M.S.; Gebhart, L.; Gilmore, D.; Moschitto, R. Zooming to Class?: Experimental Evidence on College Students’ Online Learning during COVID-19; IZA Discussion Papers; Institute of Labor Economics (IZA): Bonn, Germany, 2021.
  14. Orlov, G.; McKee, D.; Berry, J.; Boyle, A.; DiCiccio, T.; Ransom, T.; Rees-Jones, A.; Stoye, J. Learning during the COVID-19 Pandemic: It is not who you teach, but how you teach. Econ. Lett. 2021, 202, 109812.
  15. Gonzalez, T.; De la Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 confinement on students’ performance in higher education. PLoS ONE 2020, 15, e0239490.
  16. Pócsová, J.; Mojžišová, A.; Takáč, M.; Klein, D. The impact of the COVID-19 pandemic on teaching mathematics and students’ knowledge, skills, and grades. Educ. Sci. 2021, 11, 225.
  17. Tomal, J.; Rahmati, S.; Boroushaki, S.; Jin, L.; Ahmed, E. The impact of COVID-19 on students’ marks: A Bayesian hierarchical modeling approach. METRON 2021, 79, 57–91.
  18. Bonsangue, M.V.; Clinkenbeard, J.E. A Comparison of American student and faculty experiences in mathematics courses during the COVID-19 pandemic. Int. J. Educ. Res. Open 2021, 2, 100075.
  19. Brezavšček, A.; Jerebic, J.; Rus, G.; Žnidaršič, A. Factors influencing mathematics achievement of university students of social sciences. Mathematics 2020, 8, 2134.
  20. Kline, R.B. Convergence of Structural Equation Modeling and Multilevel Modeling. In The SAGE Handbook of Innovation in Social Research Methods; SAGE Publications Ltd.: London, UK, 2011; pp. 562–589.
  21. Beaujean, A.A. Latent Variable Modeling Using R: A Step-by-Step Guide; Routledge: New York, NY, USA, 2014.
  22. Gao, Y.; Wong, S.L.; Khambari, M.N.; Noordin, N. A Bibliometric analysis of the scientific production of e-learning in higher education (1998–2020). Int. J. Inf. Educ. Technol. 2022, 12, 390–399.
  23. Paiva, J.; Abreu, A.; Costa, E. Distance learning in higher education during the COVID-19 pandemic: A systematic literature review. Res. Bull. Cad. Investig. Master E-Bus. 2021, 1, 1–12.
  24. Del Carmen Valls Martínez, M.; Martín-Cervantes, P.A.; Sánchez Pérez, A.M.; del Carmen Martínez Victoria, M. Learning mathematics of financial operations during the COVID-19 Era: An assessment with partial least squares structural equation modeling. Mathematics 2021, 9, 2120.
  25. Alangari, T.S. Online STEM education during COVID-19 period: A systematic review of perceptions in higher education. Eurasia J. Math. Sci. Technol. Educ. 2022, 18, em2105.
  26. Ní Fhloinn, E.; Fitzmaurice, O. Challenges and opportunities: Experiences of mathematics lecturers engaged in emergency remote teaching during the COVID-19 pandemic. Mathematics 2021, 9, 2303.
  27. Bloom, B.S.; Englehart, M.D.; Furst, E.J.; Hill, W.H.; Krathwohl, D.R. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain; David McKay Co. Inc.: New York, NY, USA, 1956.
  28. Rosillo, N.; Montes, N. Escape room dual mode approach to teach maths during the COVID-19 era. Mathematics 2021, 9, 2602.
  29. Mendoza Velazco, D.; Cejas, M.; Rivas, G.; Varguillas, C. Anxiety as a prevailing factor of performance of university mathematics students during the COVID-19 pandemic. Educ. Sci. J. 2021, 23, 94–113.
  30. Lanius, M.; Jones, T.F.; Kao, S.; Lazarus, T.; Farrell, A. Unmotivated, depressed, anxious: Impact of the COVID-19 emergency transition to remote learning on undergraduates’ math anxiety. J. Humanist Math. 2022, 12, 148–171.
  31. Ludwig, J. Poor performance in undergraduate math: Can we blame it on COVID-19 Despair? Int. J. Innov. Sci. Math. 2021, 9, 31–40.
  32. Reinhold, F.; Schons, C.; Scheuerer, S.; Gritzmann, P.; Richter-Gebert, J.; Reiss, K. Students’ coping with the self-regulatory demand of crisis-driven digitalization in university mathematics instruction: Do motivational and emotional orientations make a difference? Comput. Hum. Behav. 2021, 120, 106732.
  33. Kargar, M.; Tarmizi, R.A.; Bayat, S. Relationship between mathematical thinking, mathematics anxiety and mathematics attitudes among university students. Procedia—Soc. Behav. Sci. 2010, 8, 537–542.
  34. Enu, J.; Agyman, O.K.; Nkum, D. Factors influencing students’ mathematics performance in some selected colleges of education in Ghana. Int. J. Educ. Learn. Dev. 2015, 3, 68–74.
  35. Khedhiri, S. The determinants of mathematics and statistics achievement in higher education. Mod. Appl. Sci. 2016, 10, 60.
  36. Nunez-Pena, M.I.; Suarez-Pellicioni, M.; Bono, R. Effects of math anxiety on student success in higher education. Int. J. Educ. Res. 2013, 58, 36–43.
  37. Pierce, R.; Stacey, K.; Barkatsas, A. A scale for monitoring students’ attitudes to learning mathematics with technology. Comput. Educ. 2007, 48, 285–300.
  38. Alexander, L.; Martray, C.R. The development of an abbreviated version of the mathematics anxiety rating scale. Meas. Eval. Couns. Dev. 1989, 22, 143–150.
  39. Baloğlu, M.; Zelhart, P.F. Psychometric properties of the revised mathematics anxiety rating scale. Psychol. Rec. 2007, 57, 593–611.
  40. Eng, T.H.; Li, V.L.; Julaihi, N.H. The relationships between students’ underachievement in mathematics courses and influencing factors. Procedia—Soc. Behav. Sci. 2010, 8, 134–141.
  41. Nicholas, J.; Poladian, L.; Mack, J.; Wilson, R. Mathematics preparation for university: Entry, pathways and impact on performance in first year science and mathematics subjects. Int. J. Innov. Sci. Math. Educ. 2015, 23, 37–51.
  42. Joyce, C.; Hine, G.; Anderton, R. The association between secondary mathematics and first year university performance in health sciences. Issues Educ. Res. 2017, 27, 770–783.
  43. Anderton, R.; Hine, G.; Joyce, C. Secondary school mathematics and science matters: Predicting academic success for secondary students transitioning into university allied health and science courses. Int. J. Innov. Sci. Math. Educ. 2017, 25, 34–47.
  44. McMillan, J.; Edwards, D. Performance in first year mathematics and science subjects in Australian universities: Does senior secondary mathematics background matter? Final Report. High. Educ. Res. 2019.
  45. Hailikari, T.; Nevgi, A.; Komulainen, E. Academic self-beliefs and prior knowledge as predictors of student achievement in mathematics: A structural model. Educ. Psychol. 2008, 28, 59–71.
  46. Faulkner, F.; Hannigan, A.; Fitzmaurice, O. The role of prior mathematical experience in predicting mathematics performance in higher education. Int. J. Math. Educ. Sci. Technol. 2014, 45, 648–667.
  47. Warwick, J. Mathematical self-efficacy and student engagement in the mathematics classroom. MSOR Connect. 2008, 8, 31–37.
  48. Linnenbrink, E.A.; Pintrich, P.R. The role of self-efficacy beliefs in student engagement and learning in the classroom. Read. Writ. Q. 2003, 19, 119–137.
  49. Li, Q.; Ma, X. A Meta-analysis of the effects of computer technology on school students’ mathematics learning. Educ. Psychol. Rev. 2010, 22, 215–243.
  50. Attard, C.; Holmes, K. “It Gives You That Sense of Hope”: An exploration of technology use to mediate student engagement with mathematics. Heliyon 2020, 6, e02945.
  51. Cheung, A.C.; Slavin, R.E. The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educ. Res. Rev. 2013, 9, 88–113.
  52. Elmer, T.; Mepham, K.; Stadtfeld, C. Students under lockdown: Comparisons of students’ social networks and mental health before and during the COVID-19 Crisis in Switzerland. PLoS ONE 2020, 15, e0236337.
  53. Schumacker, E.; Lomax, G. A Beginner’s Guide to Structural Equation Modelling, 4th ed.; Routledge: London, UK, 2016.
  54. Kline, R.B. Principles and Practice of Structural Equation Modeling; Guilford Publications: New York, NY, USA, 2015.
  55. Koufteros, X.A. Testing a model of pull production: A paradigm for manufacturing research using structural equation modeling. J. Oper. Manag. 1999, 17, 467–488.
  56. MacCallum, R.; Browne, M.; Sugawara, H.M. Power analysis and determination of sample size for covariance structure modeling. Psychol. Methods 1996, 1, 130–149.
  57. Hu, L.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. 1999, 6, 1–55.
  58. Teo, T.; Zhou, M. Explaining the intention to use technology among university students: A structural equation modeling approach. J. Comput. High. Educ. 2014, 26, 124–142.
  59. Cheung, G.W.; Rensvold, R.B. Evaluating goodness-of-fit indexes for testing measurement invariance. Struct. Equ. Model. 2002, 9, 233–255.
  60. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
  61. Cheung, G.W.; Lau, R.S. A direct comparison approach for testing measurement invariance. Organ. Res. Methods 2012, 15, 167–198.
  62. Tracey, T.J.; Xu, H. Use of multi-group confirmatory factor analysis in examining measurement invariance in counseling psychology research. Eur. J. Couns. Psychol. 2017, 6, 75–82.
  63. Immekus, J.C. Multigroup CFA and alignment approaches for testing measurement invariance and factor score estimation: Illustration with the schoolwork-related anxiety survey across countries and gender. Methodology 2021, 17, 22–38.
  64. Putnick, D.L.; Bornstein, M.H. Measurement invariance conventions and reporting: The state of the art and future directions for psychological research. Dev. Rev. 2016, 41, 71–90.
  65. Miceli, G.N.; Barbaranelli, C. Structural Equations Modeling: Theory and Applications in Strategic Management. In Research Methods for Strategic Management; Routledge: London, UK, 2015.
  66. Brown, G.T.L.; Harris, L.R.; O’Quin, C.; Lane, K.E. Using multi-group confirmatory factor analysis to evaluate cross-cultural research: Identifying and understanding non-invariance. Int. J. Res. Method Educ. 2017, 40, 66–90.
  67. Chen, F.F. Sensitivity of goodness of fit indexes to lack of measurement invariance. Struct. Equ. Model. 2007, 14, 464–504.
  68. Rosseel, Y.; Jorgensen, T.D.; Rockwood, N.; Oberski, D.; Byrnes, J.; Vanbrabant, L.; Savalei, V.; Merkle, E.; Hallquist, M.; Rhemtulla, M.; et al. Lavaan: Latent Variable Analysis. [R package]. 2022. Available online: https://cran.r-project.org/web/packages/lavaan/index.html (accessed on 15 February 2022).
  69. Jorgensen, T.D.; Pornprasertmanit, S.; Schoemann, A.M.; Rosseel, Y.; Miller, P.; Quick, C.; Garnier-Villarreal, M.; Selig, J.; Boulton, A.; Preacher, K.; et al. SemTools: Useful Tools for Structural Equation Modeling. [Computer Software]. 2022. Available online: https://CRAN.R-project.org/package=semTools (accessed on 15 February 2022).
  70. Trenholm, S.; Peschke, J.; Chinnappan, M. A review of fully online undergraduate mathematics instruction through the lens of large-scale research (2000–2015). PRIMUS 2019, 29, 1080–1100.
  71. Busto, S.; Dumbser, M.; Gaburro, E. A Simple but efficient concept of blended teaching of mathematics for engineering students during the COVID-19 pandemic. Educ. Sci. 2021, 11, 56.
  72. Vergara, C.R. Mathematics resilience and achievement goals: Exploring the role of non-cognitive factors to mathematics performance of university students amidst of Pandemic. Open Access Libr. J. 2021, 8, 1–10.
  73. Bringula, R.; Reguyal, J.J.; Tan, D.D.; Ulfa, S. Mathematics self-concept and challenges of learners in an online learning environment during COVID-19 pandemic. Smart Learn. Environ. 2021, 8, 22.
  74. Baltà-Salvador, R.; Olmedo-Torre, N.; Peña, M.; Renta-Davids, A.I. Academic and emotional effects of online learning during the COVID-19 pandemic on engineering students. Educ. Inf. Technol. 2021, 26, 7407–7434.
  75. Brooks, S.K.; Webster, R.K.; Smith, L.E.; Woodland, L.; Wessely, S.; Greenberg, N.; Rubin, G.J. The psychological impact of quarantine and how to reduce it: Rapid review of the evidence. Lancet 2020, 395, 912–920. [Google Scholar] [CrossRef] [Green Version]
  76. Voltmer, E.; Köslich-Strumann, S.; Walther, A.; Kasem, M.; Obst, K.; Kötter, T. The Impact of the COVID-19 pandemic on stress, mental health and coping behavior in German university students—A longitudinal study before and after the onset of the pandemic. BMC Public Health 2021, 21, 1385. [Google Scholar] [CrossRef]
Figure 1. Step 1: Model overall fit evaluation [19,59].
Mathematics 10 02314 g001
Figure 2. Step 2: Confirmatory Factor Analysis [60].
Mathematics 10 02314 g002
Figure 3. Step 3: Multi-Group Confirmatory Factor Analysis.
Mathematics 10 02314 g003
Figure 4. Mean values of questionnaire items measured on a 5-point Likert-type scale.
Mathematics 10 02314 g004
Figure 5. Histograms and boxplots for the number of additional points in both groups of respondents, pre-pandemic (left) and pandemic (right). The asterisks (*) indicate the average.
Mathematics 10 02314 g005
Figure 6. Histogram and boxplot for the final grade in high school for both groups of respondents, pre-pandemic (left) and pandemic (right). The asterisks (*) indicate the average.
Mathematics 10 02314 g006
Figure 7. Relationships among constructs of the final model. Note: l—Unstandardized loading, λ—Standardized loading, b—Unstandardized path coefficient, β—Standardized path coefficient, R2—Coefficient of determination, pp—Pre-pandemic, p—Pandemic, *** p < 0.001.
Mathematics 10 02314 g007
Table 1. Constructs of conceptual model developed in [19]: pre-pandemic approach.
| Model Construct | Subscale / Measured Item | No. of Items a | Rating Scale | References |
|---|---|---|---|---|
| Mathematics Confidence (MC) | | 4 | 5-point Likert-type scale: 1 (“I do not agree at all”) to 5 (“I agree completely”) | [33,34,35,36] |
| Behavioral Engagement (BE) | | 2 | 5-point Likert-type scale: 1 (“I do not agree at all”) to 5 (“I agree completely”) | [37] |
| Perceived Level of Math Anxiety (PLMA) | Mathematics Test Anxiety (MTA) | 9 | 5-point Likert-type scale: 1 (“no anxiety”) to 5 (“high anxiety”) | [36,38,39] |
| | Numerical Task Anxiety (NTA) | 4 | | |
| | Math. Course Anxiety (MCA) | 4 | | |
| Background Knowledge from Secondary School (BKSS) | Grade in math in final year | 1 | Achieved grade from 2 (sufficient) to 5 (excellent) | [35,40,41,42,43,44,45,46] |
| | Grade in math at matura b | 1 | | |
| | Final grade in high school | 1 | | |
| Self-Engagement in Math. Course at Univ. (SEMCU) | e-Activities | 1 | % of points earned (0–100%) | [46,47,48] |
| | Additional points | 1 | Number of points earned (0–13) | |
| Confidence with Technology (CT) | | 4 | 5-point Likert-type scale: 1 (“I do not agree at all”) to 5 (“I agree completely”) | [37] |
| Perceived Usefulness of Technology in Learning Mathematics (PUTLM) | | 4 | 5-point Likert-type scale: 1 (“I do not agree at all”) to 5 (“I agree completely”) | [37,49,50,51] |
| Mathematics Achievement (MA) | | 1 | % of points achieved on final exam | |

a Five measured items (BE1, BE2, MTA8, NTA1, and MCA1) were omitted from the original questionnaire due to factor loadings below 0.5 (see [19] for details). b Matura is the final national school-leaving exam in Slovenia.
Table 2. Summary of hypotheses testing for structural model: pre-pandemic approach [19].
| Hypothesis | Path | Expected Sign | Hypothesis Supported? |
|---|---|---|---|
| H1a | MC → PLMA | − | Yes |
| H1b | BE → PLMA | − | No |
| H2 | PLMA → MA | − | Yes |
| H3 | BKSS → SEMCU | + | Yes |
| H4 | SEMCU → MA | + | Yes |
| H5 | CT → PUTLM | + | Yes |
| H6 | PUTLM → MA | + | No |
Table 3. Descriptive statistics of questionnaire items.
| Model Construct | Questionnaire Item | Pre-Pandemic M | Pre-Pandemic SD | Pandemic M | Pandemic SD |
|---|---|---|---|---|---|
| Mathematics Confidence (MC) | I have a mathematical mind. (MC1) | 3.92 | 0.808 | 3.81 | 0.878 |
| | I can get good results in mathematics. (MC2) | 3.71 | 0.908 | 3.58 | 0.983 |
| | I know I can handle difficulties in mathematics. (MC3) | 3.94 | 0.821 | 3.78 | 0.904 |
| | I am confident with mathematics. (MC4) | 3.17 | 1.079 | 3.10 | 1.094 |
| Behavioral Engagement (BE) | If I make mistakes, I work until I have corrected them. (BE3) | 3.50 | 0.926 | 3.74 | 0.976 |
| | If I cannot do a problem, I keep trying different ideas. (BE4) | 3.47 | 0.939 | 3.77 | 0.956 |
| Mathematics Test Anxiety (MTA) | Studying for a math test. (MTA1) | 3.13 | 1.264 | 3.42 | 1.186 |
| | Taking the math section of the college entrance exam. (MTA2) | 2.70 | 1.114 | 2.87 | 1.131 |
| | Taking an exam (quiz) in a math course. (MTA3) | 2.88 | 1.138 | 3.12 | 1.158 |
| | Taking an exam (final) in a math course. (MTA4) | 3.36 | 1.146 | 3.57 | 1.192 |
| | Thinking about an upcoming math test one week before. (MTA5) | 2.84 | 1.239 | 2.89 | 1.293 |
| | Thinking about an upcoming math test one day before. (MTA6) | 3.36 | 1.238 | 3.55 | 1.236 |
| | Thinking about an upcoming math test one hour before. (MTA7) | 3.63 | 1.281 | 3.83 | 1.314 |
| | Receiving your final math grade in the mail. (MTA9) | 2.88 | 1.219 | 3.21 | 1.223 |
| | Being given a “pop” quiz in a math class. (MTA10) | 3.79 | 1.292 | 3.93 | 1.271 |
| Numerical Task Anxiety (NTA) | Being given a set of numerical problems involving addition to solve on paper. (NTA2) | 1.56 | 0.907 | 1.46 | 0.740 |
| | Being given a set of subtraction problems to solve. (NTA3) | 1.56 | 0.882 | 1.46 | 0.740 |
| | Being given a set of multiplication problems to solve. (NTA4) | 1.60 | 0.879 | 1.49 | 0.749 |
| | Being given a set of division problems to solve. (NTA5) | 1.73 | 0.968 | 1.63 | 0.805 |
| Mathematics Course Anxiety (MCA) | Watching a teacher work on an algebraic equation on the blackboard. (MCA2) | 1.86 | 1.037 | 1.80 | 1.037 |
| | Signing up for a math course. (MCA3) | 2.55 | 1.209 | 2.72 | 1.313 |
| | Listening to another student explain a mathematical formula. (MCA4) | 2.07 | 1.095 | 2.05 | 1.088 |
| | Walking into a math class. (MCA5) | 1.82 | 1.100 | 1.99 | 1.180 |
| Confidence with Technology (CT) | I am good at using computers. (CT1) | 3.92 | 0.967 | 4.02 | 0.898 |
| | I am good at using things like VCRs, DVDs, MP3s, and mobile phones. (CT2) | 4.28 | 0.800 | 4.26 | 0.786 |
| | I can fix a lot of computer problems. (CT3) | 3.51 | 1.143 | 3.58 | 0.984 |
| | I can master any computer program needed for school. (CT4) | 3.57 | 1.004 | 3.85 | 0.928 |
| Perceived Usefulness of Technology in Learning Mathematics (PUTLM) | I like using computers for mathematics. (PUTLM1) | 3.47 | 1.151 | 3.71 | 1.029 |
| | Using computers in mathematics is worth the extra effort. (PUTLM2) | 3.25 | 1.120 | 3.42 | 1.119 |
| | Mathematics is more interesting when using computers. (PUTLM3) | 3.15 | 1.225 | 3.37 | 1.108 |
| | Computers help me learn mathematics better. (PUTLM4) | 3.22 | 1.222 | 3.40 | 1.190 |
| Background Knowledge from Secondary School (BKSS) | Grade in mathematics in the final year | 3.11 | 0.882 | 3.28 | 0.935 |
| | Grade in mathematics at matura | 3.20 | 0.934 | 3.25 | 0.988 |
| | Final grade in high school | 3.36 | 0.749 | 3.66 | 0.741 |
| Self-Engagement in Math. Course at Univ. (SEMCU) | e-Activities | 73.57 | 11.988 | 78.57 | 9.363 |
| | Additional points | 4.28 | 4.184 | 7.55 | 4.010 |
| Mathematics Achievement (MA) | % of points achieved on final exam | 65.93 | 22.801 | 68.62 | 24.385 |
Table 4. Results of overall fit of measurement model.
| Sample | χ2 | df | χ2/df | CFI | SRMR | RMSEA | RMSEA 90% CI |
|---|---|---|---|---|---|---|---|
| Entire sample | 1392.42 | 570 | 2.44 | 0.911 | 0.064 | 0.058 | 0.054, 0.061 |
| Subsample: pre-pandemic | 1173.11 | 570 | 2.06 | 0.911 | 0.065 | 0.060 | 0.055, 0.065 |
| Subsample: pandemic | 893.88 | 570 | 1.57 | 0.886 | 0.079 | 0.062 | 0.053, 0.068 |
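The overall-fit evaluation in Table 4 rests on conventional cutoffs for these indices [56,57]: χ2/df below 3, CFI of at least 0.90, SRMR of at most 0.08, and RMSEA of at most 0.06. A minimal sketch of that decision rule applied to the values reported above (the helper function and its thresholds are our illustration of the cited conventions, not code from the study):

```python
def fit_ok(chi2, df, cfi, srmr, rmsea):
    """Apply conventional SEM overall-fit cutoffs (cf. Hu & Bentler, 1999)."""
    return {
        "chi2/df < 3":   chi2 / df < 3,
        "CFI >= 0.90":   cfi >= 0.90,
        "SRMR <= 0.08":  srmr <= 0.08,
        "RMSEA <= 0.06": rmsea <= 0.06,
    }

# Entire-sample values from Table 4: every criterion is satisfied.
entire = fit_ok(1392.42, 570, 0.911, 0.064, 0.058)
print(all(entire.values()))  # True

# Pandemic subsample: CFI (0.886) and RMSEA (0.062) fall just short
# of the strict cutoffs, which is why the fit is judged only acceptable.
pandemic = fit_ok(893.88, 570, 0.886, 0.079, 0.062)
```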
Table 5. Composite reliability (CR), average variance extracted (AVE), square root of AVE (on the diagonal), and correlations among latent variables.
| Model Construct | CR | AVE | MC | BE | MTA | NTA | MCA | BKSS | SEMCU | CT | PUTLM |
|---|---|---|---|---|---|---|---|---|---|---|---|
| MC | 0.851 | 0.602 | 0.776 a | | | | | | | | |
| BE | 0.799 | 0.667 | 0.550 | 0.817 a | | | | | | | |
| MTA | 0.912 | 0.538 | −0.649 | −0.402 | 0.733 a | | | | | | |
| NTA | 0.949 | 0.822 | −0.227 | −0.153 | 0.205 | 0.907 a | | | | | |
| MCA | 0.798 | 0.498 | −0.523 | −0.375 | 0.629 | 0.497 | 0.706 a | | | | |
| BKSS | 0.787 | 0.568 | 0.417 | 0.324 | −0.328 | −0.028 | −0.248 | 0.646 a | | | |
| SEMCU | 0.658 | 0.560 | 0.527 | 0.375 | −0.384 | −0.320 | −0.384 | 0.317 | 0.748 a | | |
| CT | 0.864 | 0.622 | 0.247 | 0.197 | −0.155 | −0.187 | −0.144 | 0.113 | 0.232 | 0.789 a | |
| PUTLM | 0.917 | 0.735 | 0.213 | 0.205 | −0.073 | −0.088 | −0.071 | 0.020 | 0.193 | 0.455 | 0.857 a |

a Square root of AVE.
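Discriminant validity in Table 5 follows the Fornell–Larcker criterion [60]: each construct's square root of AVE (the diagonal) should exceed its correlations with every other construct. A small sketch of that comparison for two constructs, using the AVE values and correlation rows reported above (the construct selection and function name are ours, for illustration):

```python
import math

# AVE values and off-diagonal correlations for two constructs from Table 5
ave = {"MC": 0.602, "PUTLM": 0.735}
corr = {
    "MC":    [0.550, -0.649, -0.227, -0.523, 0.417, 0.527, 0.247, 0.213],
    "PUTLM": [0.213, 0.205, -0.073, -0.088, -0.071, 0.020, 0.193, 0.455],
}

def fornell_larcker_ok(construct):
    """sqrt(AVE) must exceed the largest absolute inter-construct correlation."""
    return math.sqrt(ave[construct]) > max(abs(r) for r in corr[construct])

# MC: sqrt(0.602) ≈ 0.776 > 0.649; PUTLM: sqrt(0.735) ≈ 0.857 > 0.455
print(fornell_larcker_ok("MC"), fornell_larcker_ok("PUTLM"))  # True True
```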
Table 6. Testing measurement invariance between groups.
| Model (Model Comparison) | χ2 (Δχ2) | df (Δdf) | CFI (ΔCFI) | SRMR (ΔSRMR) | RMSEA (ΔRMSEA) | RMSEA 90% CI | p (Chi-Square Test) |
|---|---|---|---|---|---|---|---|
| M2 Configural invariance | 1996.256 | 1116 | 0.910 | 0.063 | 0.059 | 0.055, 0.063 | / |
| M3 Weak invariance (M2) | 2020.728 (27.878) | 1143 (27) | 0.909 (−0.001) | 0.064 (0.001) | 0.058 (−0.001) | 0.054, 0.062 | 0.4173 |
| M4 Strong invariance (M3) | 2080.724 (60.883) | 1170 (27) | 0.906 (−0.003) | 0.066 (0.002) | 0.059 (0.001) | 0.055, 0.063 | 0.0002 |
| M4a Partial strong invariance (M3) | 2065.358 (44.545) | 1169 (26) | 0.908 (−0.001) | 0.065 (0.001) | 0.058 (0.000) | 0.054, 0.062 | 0.0132 |
| M4b Partial strong invariance (M3) | 2052.452 (30.985) | 1168 (25) | 0.909 (0.000) | 0.064 (0.000) | 0.058 (0.000) | 0.054, 0.062 | 0.1895 |
| M5 Strict invariance (M4b) | 2104.861 (56.786) | 1204 (36) | 0.905 (−0.004) | 0.065 (0.001) | 0.058 (0.000) | 0.054, 0.062 | 0.0151 |
Note: In model M4a an intercept of Additional points is set as free between groups; in model M4b intercepts of Additional points and Final grade in high school are set as free between groups.
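The invariance decisions in Table 6 combine the χ2 difference test with change-in-fit criteria: a drop in CFI of no more than 0.01 between nested models, and a ΔRMSEA within 0.015, are commonly taken to support invariance [59,67]. A minimal sketch of that rule applied to the comparisons above (the cutoffs follow the cited conventions; the function itself is our illustration):

```python
def invariance_supported(delta_cfi, delta_rmsea):
    """Change-in-fit rule of thumb for nested invariance models
    (Cheung & Rensvold, 2002; Chen, 2007)."""
    return abs(delta_cfi) <= 0.01 and abs(delta_rmsea) <= 0.015

# M3 weak vs. M2 configural (Table 6): ΔCFI = -0.001, ΔRMSEA = -0.001
print(invariance_supported(-0.001, -0.001))  # True: weak invariance supported

# M4 strong vs. M3 weak: ΔCFI = -0.003 also passes this rule, but the
# significant chi-square difference (p = 0.0002) motivated the partial
# strong-invariance models M4a and M4b.
print(invariance_supported(-0.003, 0.001))  # True
```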
Table 7. Testing measurement invariance of structural coefficients.
| Structural Model (Model Comparison) | χ2 (Δχ2) | df (Δdf) | p | CFI (ΔCFI) | SRMR (ΔSRMR) | RMSEA (ΔRMSEA) | RMSEA 90% CI |
|---|---|---|---|---|---|---|---|
| SM1 Partial strong invariance | 2472.28 | 1282 | / | 0.894 | 0.090 | 0.061 | 0.057, 0.065 |
| SM2 Structural coefficients (SM1) | 2365.56 (−106.72) | 1289 (7) | 0.2976 | 0.894 (0.000) | 0.094 (0.004) | 0.061 (0.000) | 0.057, 0.064 |
Table 8. Summary of research hypotheses testing.
| Hypothesis / Path | Group | b | β | z | p | Hypothesis Supported? |
|---|---|---|---|---|---|---|
| H1a: MC → PLMA (expected sign: −) | Pre-pandemic | −1.088 | −0.678 *** | −6.909 | 0.000 | Yes |
| | Pandemic | | −0.698 *** | | | |
| H1b: BE → PLMA (expected sign: −) | Pre-pandemic | −0.120 | −0.107 * | −1.812 | 0.070 | No |
| | Pandemic | | −0.116 * | | | |
| H2: PLMA → MA (expected sign: −) | Pre-pandemic | −6.379 | −0.221 *** | −4.134 | 0.000 | Yes |
| | Pandemic | | −0.199 *** | | | |
| H3: BKSS → SEMCU (expected sign: +) | Pre-pandemic | 2.936 | 0.331 *** | 3.789 | 0.000 | Yes |
| | Pandemic | | 0.375 *** | | | |
| H4: SEMCU → MA (expected sign: +) | Pre-pandemic | 2.766 | 0.875 *** | 9.700 | 0.000 | Yes |
| | Pandemic | | 0.758 *** | | | |
| H5: CT → PUTLM (expected sign: +) | Pre-pandemic | 0.512 | 0.460 *** | 8.939 | 0.000 | Yes |
| | Pandemic | | 0.452 *** | | | |
| H6: PUTLM → MA (expected sign: +) | Pre-pandemic | −0.253 | −0.011 * | −0.320 | 0.749 | No |
| | Pandemic | | −0.009 * | | | |
Note: Structural coefficient invariance was confirmed, which means unstandardized coefficients and z-values are the same between groups, while there may be small changes in standardized coefficients. *** p < 0.001, * p < 0.05.

Žnidaršič, A.; Brezavšček, A.; Rus, G.; Jerebic, J. Has the COVID-19 Pandemic Affected Mathematics Achievement? A Case Study of University Students in Social Sciences. Mathematics 2022, 10, 2314. https://doi.org/10.3390/math10132314
