Article

Examining Master’s Students’ Success at a Hispanic-Serving Institution

by Kenneth John Tobin 1,*, Jacinto De La Cruz Hernandez 1, José R. Palma 2, Marvin Bennett 1 and Nandita Chaudhuri 3

1 Center for Earth and Environmental Studies, Texas A&M International University, Laredo, TX 78041, USA
2 Institute for Early Childhood Development & Education, Texas A&M University, College Station, TX 77843, USA
3 Public Policy Research Institute, Texas A&M University, College Station, TX 77845, USA
* Author to whom correspondence should be addressed.
Trends High. Educ. 2025, 4(1), 5; https://doi.org/10.3390/higheredu4010005
Submission received: 2 August 2024 / Revised: 7 January 2025 / Accepted: 9 January 2025 / Published: 15 January 2025

Abstract
This work examines the indicators of master’s students’ persistence from 2014 to 2021 at a Hispanic-Serving Institution (HSI) in the southern United States. Demographic and academic variables were used in a logistic regression model to predict students’ successful completion across sixteen master’s programs. In this two-fold study, first, we examined the impact of COVID-19 on students enrolled in twelve face-to-face (F2F) programs and evaluated their performance against a pre-pandemic baseline period. Second, we compared student performance in four accelerated online programs against a pre-accelerated baseline. Most demographic variables were insignificant, while all academic variables were significant across program types. However, GPA became an insignificant variable when the F2F programs were forced to move online during the COVID-19 pandemic. During this period, GPA also increased for students who had discontinued their studies. The accelerated online programs recorded a significant decrease in terms enrolled (Term Count) compared to the pre-accelerated baseline. These results add to the limited literature on student success at the master’s level in HSIs, thus filling a vital knowledge gap. This study provides two case studies focusing on how the pandemic and the accelerated online learning model impacted academic persistence at the master’s level at an HSI.

1. Introduction

Extensive research has focused on undergraduate student success, e.g., studies [1,2,3,4]. In contrast, the literature focusing on master’s students’ success is more limited [5,6]. The metrics of student success applied to undergraduate degree programs, such as first-year retention and six-year graduation rates, are difficult to apply to master’s programs. For master’s students, there is no standardized format for official reporting [7]. This is not the case for undergraduate students, whose retention and graduation rates are uniformly reported to the Integrated Postsecondary Education Data System (IPEDS) organized by the US National Center for Education Statistics. Since master’s students more often take pauses in their studies, applying the 150% of normal time metric for degree length (i.e., the 6-year graduation rate) to graduate programs is problematic [5], and outcomes need to be calibrated against the propensity for adult learners to take breaks in their studies due to family and career obligations. Another complication is that master’s programs vary in length, with diverse modalities spanning face-to-face (F2F), hybrid, online, and accelerated online formats. Undergraduate retention and graduation rates were developed to track the progress of traditional university students who took classes in an F2F setting [8]. Another issue lies in the fall cohort model used for tracking undergraduate students; this model is untenable for graduate students, who can enter programs year-round. Haydarov et al. [5] argue that these systemic issues result in the under-reporting of successful outcomes for master’s level programs and suggest that alternative student success indicators, such as time to degree, time to dropout, course completion ratio, and average number of semesters taken, are more relevant success measures for graduate students.
The purpose of this paper is to better understand master’s students’ persistence at a Hispanic-Serving Institution (HSI), which is a regional university in the southern United States. The US Department of Education defines an HSI as having a minimum 25% Latinx student population. This study’s definition of a Hispanic person conforms to the ethnicity designation utilized by IPEDS, which is a person of Cuban, Mexican, Puerto Rican, South or Central American, or other Spanish culture or origin, regardless of race. In fall 2023, the subject HSI had an enrollment of 8489 students, who were 61.9% female, 38.1% male, and 89.0% Hispanic. The service area of the subject HSI has a low per capita income. For example, the county where the subject HSI is located has a per capita income that is 68% of the statewide average [9].
The master’s student population at the subject HSI was 17.6% in fall 2023. At the subject HSI, master’s programs are vital in that they help build capacity by closing educational disparities and bolstering economic growth within the communities that they serve. In particular, accelerated online programs serve a vital niche in that they can quickly equip students with skills needed in the workforce, ensuring that students graduate with the skills that employers demand in the targeted geography. In this context, master’s programs grant credentials that are of value to regional employers and provide a significant financial return for students who complete these programs [10].
Master’s students face unique challenges in that they must balance family and full-time work obligations, may be financially strained, and are less supported by first-generation student services programs that are geared more toward undergraduates [11]. These students, especially older adult learners who have been out of school for a while, might struggle with the need for self-directed learning and the accelerated pace of online programs. Understanding the factors that impact master’s students’ persistence is crucial in better serving this important and rapidly growing demographic.
Persistence in this study was examined in the context of four master’s program types between fall 2014 and fall 2021. During the fall 2018 semester, four programs (Business Administration, Criminal Justice, Curriculum and Instruction, and Educational Administration) at the subject HSI transitioned to an accelerated online modality. Research question 1 focused on four academic variables (GPA, Term Count, Course Drop Rate, and Course Load) before and after this transition. The accelerated online model should, by design, reduce Term Count; the impact of this transition on the other three variables is unclear and is explored in depth in this work. Of the four accelerated online programs, only Business Administration experienced a decline in GPA when transitioning to the accelerated online format, and possible explanations for this decline are explored. Research question 2 examines how COVID-19 impacted the academic variables during the 2020 to 2021 pandemic period for the twelve F2F programs forced to move online by the pandemic. Given the disruptions of the pandemic, we speculated that time to degree, or Term Count, may have been extended, with adverse impacts on GPA and Course Drop Rate.
This research identifies the significant variables controlling student persistence for all four program types. Further, the examination of discontinued and graduate student populations within these programs adds additional nuance to these findings. Overarchingly, this study provides insights into the impacts of COVID-19 and the accelerated online model of program delivery at the master’s level, filling a vital gap in the literature.

2. Literature Review

The prevailing theories for why students drop out of their studies are well discussed in the literature. Tinto’s [12,13,14,15] groundbreaking studies developed the initial theoretical framework to explain why students persist in college. Tinto advocated for the importance of academic and social integration during the first year of study, and he suggested that a poor fit with college culture was the main reason why students abandoned their studies. It should be noted that Tinto’s theory was based on traditional undergraduate programs undertaken in an F2F modality. An offshoot of Tinto’s work, by Bean and Metzner [16], focused on nontraditional, commuter undergraduate students enrolled in on-campus education. This population better matches the twelve F2F master’s programs examined in this study. Bean and Metzner’s model de-emphasized the importance of social integration; external environmental factors and student characteristics were of greater importance as predictors of student success in this population. The environmental factors examined included finances, hours of employment per week, family/work support, family responsibilities, and outside encouragement, all of which are highly relevant for graduate students. Rovai [17] focused on distance education and developed a composite persistence model that incorporated the theories of Tinto [12,13,14,15] and Bean and Metzner [16] while considering the skills online students need to succeed and the special considerations imposed by the distance learning setting. This framework also incorporated the need to harmonize student learning with effective teaching styles. While these models were not developed for graduate students, some studies have adapted them for this population. For example, Coleman [18] extended the above theories to predict dropout risk for students enrolled in graduate online programs based on survival analysis.
All of these theoretical frameworks stress the importance of academic programs and GPA as key factors that control whether or not a student persists. Rotem et al. [6] found that only academic performance and not background demographic variables were useful in predicting dropouts at the master’s level. These results are in line with prior studies that have stressed post-enrollment academic performance (e.g., first-year GPA, overall GPA, and course failure) as a strong predictor of persistence [4,19,20].

3. Graduate Program Characteristics

A total of twelve F2F master’s programs were examined: Accounting; Biology; Communication; Counseling Psychology; English; History and Political Thought; Language, Literature and Translation; Mathematics; Nurse Practitioner; Psychology; School Counseling; and Sociology. At the end of March 2020, all F2F classes at the subject HSI were forced to move online. During the fall 2020 and spring 2021 semesters, classes were taught in a flexible mode, where students could opt to attend in person or online. Most students (>95%) chose to attend classes online. For this reason, between spring 2020 and summer 2021, instruction occurred mostly online. In the fall 2021 semester, the subject HSI offered F2F classes with social distancing restrictions, although some instructors allowed graduate students to continue their studies in an online environment to some extent during this semester. All social distancing restrictions were lifted in the spring 2022 semester, when normal instruction resumed. The baseline period for F2F programs was fall 2014 to 2019. The pandemic period examined, spring 2020 to fall 2021, was defined as the COVID-19 forced online period.
Master’s programs in Business Administration, Criminal Justice, Curriculum and Instruction, and Educational Administration at the subject HSI transitioned to an accelerated, online (7-week) cohort model during the fall 2018 semester. Under the new model, the expected time for completion was 12 to 18 months, unlike the 2+ years during the baseline, pre-accelerated period where these programs were taught in a mix of modalities. Thus, the accelerated online period spanned fall 2018 to 2021. The pre-accelerated period was about equal in length, fall 2014 to summer 2018, forming a baseline for comparison. During the pandemic, online education at the subject HSI continued without interruption, not affecting course delivery within these programs, unlike with the F2F programs. However, to isolate the potential impacts of the pandemic on the accelerated online programs, this program type was divided into pre-COVID-19 (fall 2018 to fall 2019) and COVID-19 (spring 2020 to fall 2021) periods of analysis.

4. Methods

4.1. Student Data

Student data were derived from the HSI’s student information system (Banner) and included only students who started a graduate program at any point between the fall 2014 semester and the fall 2021 semester. A cipher key was used to anonymize these data. Students were divided into four datasets based on program type, as defined in the prior section. The data were gathered when the student’s enrollment ended; therefore, only students who completed and graduated from their program or failed to re-enroll during the observed period were considered for model training and evaluation. For simplicity, students with missing variables were omitted. Another complication was that some students were coded in the pre-accelerated programs beyond the fall 2018 semester. These students were not considered in our analysis. A total of 2324 students were included in the logistic regression models, 84.8% of the enrolled students during the study period. Table 1 presents enrollment and demographic data for the four program types.

4.2. Variable Definitions and Support from the Literature

The seven variables were selected based on their level of support in the literature. In addition, each program was treated as a variable during exploratory analysis with logistic regression (described in Section 4.3). The three self-reported demographic variables were ethnicity, sex, and the student’s age at first enrollment. Ethnicity was defined using the IPEDS first designated ethnicity: Hispanic or Latino versus Not Hispanic or Latino. The one-hot encoding method was used to set a baseline (i.e., the value equal to 0) for each categorical variable. The baseline categories were Non-Hispanic for ethnicity, Male for sex, the Mathematics program for the F2F and COVID-19 forced online program types, and the Criminal Justice program for the pre-accelerated and accelerated online program types.
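The one-hot scheme with designated baseline categories can be sketched as follows. This is an illustrative assumption using pandas, not the study’s actual pipeline; the column names and category values are invented for the example.

```python
import pandas as pd

# Hypothetical student records for illustration only.
df = pd.DataFrame({
    "ethnicity": ["Hispanic", "Non-Hispanic", "Hispanic"],
    "sex": ["Female", "Male", "Male"],
})

def one_hot_with_baseline(frame, baselines):
    """Dummy-code each column, dropping the designated baseline level so
    that the baseline category is represented by all-zero dummy columns."""
    out = pd.get_dummies(frame, dtype=int)
    for col, base in baselines.items():
        out = out.drop(columns=f"{col}_{base}")
    return out

encoded = one_hot_with_baseline(df, {"ethnicity": "Non-Hispanic", "sex": "Male"})
# A baseline student (Non-Hispanic male) is encoded as all zeros.
```

Dropping one level per categorical variable avoids the dummy-variable trap and makes each logistic regression coefficient interpretable relative to the stated baseline category.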
The above demographic variables have been linked to graduate student success in the literature. Ethnicity was a predictor used in a study on graduate education by [21]. Sheridan and Pyke [22] examined gender, registration status (full-time or part-time), citizenship, and discipline area. Malone et al. [23] conducted a study on graduate student success, although at the doctoral level, that investigated student age at the start of their program and final graduate grade point average (GPA) as predictors of persistence.
The four academic variables were (1) the student’s average overall GPA; (2) the Course Load, or average number of semester credit hours (SCHs) attempted per semester, which reflects whether a student is part-time or full-time; (3) the Term Count, or number of terms in which the student enrolled in classes up to the last record, reflecting the time spent in a program; and (4) the student’s average Course Drop Rate between the first and last terms. Support for these variables exists in the literature. Many studies have examined the importance of GPA as a predictor of persistence [4,6,23,24,25,26,27,28]. Maslov Kruzicevic et al. [28] and Mendoza-Sanchez et al. [29] identified both GPA and study duration (Term Count) as significant factors in medical and biochemistry Ph.D. programs, respectively. Sheridan and Pyke [22] examined registration status (full-time or part-time), which is accounted for by the Course Load variable in this study. Several studies also point to the importance of GPA, course failure, and drop rate in student dropout prediction, e.g., [4,19,20].
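A minimal sketch of how the four academic variables could be derived from per-term enrollment records. The record layout and column names are assumptions for illustration, not the subject HSI’s Banner schema.

```python
import pandas as pd

# Hypothetical per-term records; two invented students, "A" and "B".
records = pd.DataFrame({
    "student":         ["A", "A", "A", "B", "B"],
    "term_gpa":        [3.5, 3.7, 3.9, 2.8, 3.0],
    "sch_attempted":   [6, 6, 3, 9, 6],
    "courses_dropped": [0, 1, 0, 1, 0],
    "courses_enrolled":[2, 2, 1, 3, 2],
})

per_student = records.groupby("student").agg(
    gpa=("term_gpa", "mean"),               # average overall GPA
    course_load=("sch_attempted", "mean"),  # average SCHs per semester
    term_count=("term_gpa", "size"),        # number of enrolled terms
)
# Drop rate: total courses dropped over total courses enrolled.
grouped = records.groupby("student")
per_student["course_drop_rate"] = (
    grouped["courses_dropped"].sum() / grouped["courses_enrolled"].sum()
)
```

Aggregating to one row per student also satisfies the independence assumption stated in Section 4.3, since each observation then corresponds to a single student.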
Undergraduate metrics such as graduation rates have also been used to quantify graduate persistence (Mayer et al. [3]). However, Haydarov et al. [5] suggested that alternative indicators of student success should be used for graduate programs. These include time to degree (Term Count for graduated students), time to dropout, course completion ratio (Course Drop Rate), average number of semesters (Course Load), and other relevant measures of graduate student success.

4.3. Logistic Regression

Using the seven defined variables and academic program, a logistic regression classifier was trained for each program type (F2F, COVID-19 forced online, pre-accelerated, and accelerated online) to place students into a binary Graduated/Discontinued (0/1) outcome category. Students who graduated from their program during the study period were labeled as graduated, while students who stopped enrolling during the study period were labeled as discontinued. The logistic regression modeling was carried out in Python 3.9.20 using the Scikit-Learn package. Many researchers have used logistic regression to provide predictors of persistence at the undergraduate and graduate level, e.g., [4,6,24,29,30,31]. The models determined how each variable impacted the binary outcome. Pertinent to Research Questions 1 and 2, the logistic regression model tested whether there was a relationship between the chosen academic variables and the binary outcome of graduation vs. the discontinuation of a graduate student’s studies for the four student populations, allowing us to identify the variables that were relevant to graduate student success at the subject HSI.
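The classification step described above can be sketched with Scikit-Learn, the package the study names. The data below are synthetic, and the coefficients used to generate the outcome are invented for illustration; only the variable roles mirror the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 400
# Synthetic academic variables, loosely mimicking plausible ranges.
X = np.column_stack([
    rng.normal(3.2, 0.5, n),    # GPA
    rng.normal(6.0, 2.0, n),    # Course Load (SCHs/term)
    rng.integers(2, 12, n),     # Term Count
    rng.uniform(0.0, 0.3, n),   # Course Drop Rate
])
# Invented outcome model: 0 = graduated, 1 = discontinued; lower GPA and
# higher drop rates raise the odds of discontinuation.
logits = -4.0 * (X[:, 0] - 3.2) + 8.0 * X[:, 3] - 1.0
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
accuracy = accuracy_score(y, model.predict(X))  # the metric defined below
```

The fitted coefficients (`model.coef_`) indicate how each variable shifts the log-odds of discontinuation, which is the sense in which the models "determined how each variable impacted the binary outcome."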
Variable significance was determined by the Wald test, using p = 0.01 to 0.05 as significant and p < 0.01 as highly significant. The model’s accuracy, defined below, measured each model’s performance:
Accuracy = (Correctly Categorized Students) / (Total Students in the Set)
To monitor multicollinearity, the variance inflation factors (VIFs) of the selected regression variables were compared. Following common practice [32], variables with a VIF higher than 5 were considered multicollinear. Observations were assumed to be independent, as each observation corresponded to an individual student with no repeated records. In addition, there was no overlap in records between the different student groups; the four datasets were pairwise disjoint.
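The VIF screen can be illustrated with a small NumPy sketch. The implementation below is the standard textbook formulation (regress each predictor on the others and take 1/(1 − R²)), applied to synthetic data rather than the study’s dataset.

```python
import numpy as np

def vif(X, i):
    """VIF of column i: regress it on the remaining columns (plus an
    intercept) and return 1 / (1 - R^2)."""
    y = X[:, i]
    others = np.column_stack([np.ones(len(X)), np.delete(X, i, axis=1)])
    beta, *_ = np.linalg.lstsq(others, y, rcond=None)
    resid = y - others @ beta
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

# Synthetic predictors; column 3 is mildly correlated with column 0,
# echoing the GPA / Course Drop Rate correlation reported in the Results.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
X[:, 3] = 0.4 * X[:, 0] + rng.normal(scale=1.0, size=200)

vifs = [vif(X, i) for i in range(X.shape[1])]
flagged = [i for i, v in enumerate(vifs) if v > 5]  # VIF > 5 rule [32]
```

With only mild correlation, every VIF stays near 1 and nothing is flagged, which is the pattern the study reports for its four models.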

4.4. Statistical Analysis

Descriptive statistics summarized how the logistic regression variables changed between periods. Two-tailed t-tests assuming unequal variance (Welch’s t-tests) were used to evaluate the significance of changes in variable values between the F2F and COVID-19 forced online periods and between the pre-accelerated and accelerated online periods. The significance threshold was set at p = 0.05, with p < 0.01 considered highly significant. When performing multiple tests on the same pairs of data, multiple hypothesis testing was addressed using the Bonferroni correction [33], chosen for its conservative nature. Additionally, Cohen’s d was used to measure the effect size of changes in the predictive variables [34,35]. All student sets were disaggregated by outcome (graduated and discontinued). In addition, the accelerated online Business Administration program was disaggregated between students with an undergraduate degree in business and those who lacked this degree.
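The statistical pipeline just described (Welch’s t-test, Bonferroni adjustment, Cohen’s d) can be sketched as follows. The two samples are synthetic stand-ins, and n_tests = 8 mirrors the eight-test families used in the Results; none of the numbers are from the study.

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) +
                      (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

rng = np.random.default_rng(2)
baseline = rng.normal(3.0, 0.4, 120)   # e.g., GPA, baseline period
pandemic = rng.normal(3.2, 0.4, 120)   # e.g., GPA, pandemic period

# Two-tailed t-test with unequal variances (Welch's t-test).
t_stat, p_value = stats.ttest_ind(baseline, pandemic, equal_var=False)

n_tests = 8                              # tests in the same family
p_adjusted = min(p_value * n_tests, 1.0) # Bonferroni correction
d = abs(cohens_d(baseline, pandemic))    # effect size
```

A change is then reported as significant only if the Bonferroni-adjusted p-value clears the threshold, with |d| around 0.4 or above treated as a sizable impact, as in Tables 4 through 7.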
This methodology allowed us to test the hypothesis that a significant change in the academic variables that predict graduate student success was brought on by the respective changes in modality that these programs experienced against the null hypothesis that these changes in modality had no effect on the relevant academic variables.

5. Results

Enrollment for the accelerated programs doubled compared with the pre-accelerated period (Table 1). The recruiting efforts of an external consulting firm resulted in an influx of students from outside the HSI service area. This impacted student demographics by reducing the percentage of Hispanic students and increasing the percentage of female students in the accelerated online programs compared with the pre-accelerated baseline (Table 1). For the twelve F2F programs, the demographics did not experience a shift between periods, with Hispanic students being over 90% and female students around 70%. The dominant group in this study was Hispanic females, and the small size of other intersectional groups limited the ability to analyze and compare other groups (e.g., non-Hispanic, male).
All logistic regression models exhibited a robust ability to predict student outcomes with an accuracy of 92.95% or above. The student’s major was not a significant variable for F2F and COVID-19 forced online programs. For the pre-accelerated program type, Business and Educational Administration were significant, and for accelerated online, only Educational Administration was significant. For the pre-accelerated and accelerated online programs, all four academic variables (GPA, Course Load, Term Count, and Course Drop Rate) were significant to highly significant in predicting student persistence (Table 2). Ethnicity was also a highly significant variable for pre-accelerated but insignificant for other program types. For the F2F programs, all academic variables were also significant to highly significant (Table 2). However, when these programs shifted online during the pandemic, only Term Count and Course Drop Rate remained significant. GPA and Course Load became insignificant.
No multicollinearity was observed among the predictive variables (Table 3). GPA and Course Drop Rate showed the highest correlation values (between −0.5201 for the COVID-19 forced online set and −0.8079 for the accelerated online set). However, no predictor showed a VIF of 5 or higher in any of the four logistic regression models, supporting the reliability of the logistic regression results in assessing the significance of the predictive variables.
Table 4 compares the F2F and COVID-19 forced online program types. These eight t-tests were adjusted using the Bonferroni correction. For discontinued students from these programs, only GPA differed at a highly significant level, with a Cohen’s d near the 0.4 threshold; Term Count also increased at a significant level. For graduated students, the increase in Term Count was highly significant, with a Cohen’s d greater than 0.4, indicating a sizable impact. Table 5 examines how the variable averages differ between student populations for the pre-accelerated and accelerated online programs. Again, these eight tests were corrected using the Bonferroni correction. For discontinued students, there was no significant difference between the pre-accelerated and accelerated online programs. Comparing graduated students between the pre-accelerated and accelerated online programs yielded some significant differences: Term Count was significantly lower, and both Term Count and Course Load had a Cohen’s d greater than 0.4, indicating that the transition in program type had a sizable impact on these variables.
Table 6 focuses on the accelerated online program type only, comparing the pre-COVID-19 and COVID-19 periods. For this comparison, the values were adjusted using the Bonferroni correction (n = 8). Neither GPA nor Course Drop Rate was significantly different between the pre-COVID-19 and COVID-19 periods for either student population. Term Count significantly increased, and Course Load decreased, during the COVID-19 period for both discontinued and graduated students. Table 7 explores the accelerated online Business Administration program, the only program of this type that experienced a decline in GPA during the accelerated online period. These tests were adjusted using the Bonferroni correction (n = 4), as this comparison was unrelated to the tests in Table 5; no students from the pre-accelerated period were included. Students with an undergraduate business major had a highly significantly higher GPA and a significantly lower Course Drop Rate compared with their peers who lacked an undergraduate business degree. Given the smaller number of students in this individual program, the results were not disaggregated by discontinued or graduated populations.

6. Discussion

HSIs play a critical role in advancing equity and educational outcomes for Hispanic and other underserved students through efforts such as increasing accessibility, fostering culturally compatible mentorship, improving student experiences, and adopting diverse program modalities [36]. As HSIs increasingly adopt diverse program modalities, it is important to understand how these changes impact student outcomes. In our study, for three of the four program types, all demographic variables were non-significant (Table 2); only for the pre-accelerated programs was ethnicity highly significant (p = 0.001, Table 2). However, a model for the pre-accelerated programs run without demographic variables yielded a similar significance for the four academic variables as the model that included the demographic information (Table 2). These results alleviate concerns that algorithmic bias [37,38,39] might have skewed this study. The subsequent discussion focuses on how the academic variables were impacted by the transitions from F2F to COVID-19 forced online and from pre-accelerated to accelerated online master’s programs.

6.1. Impact on Students in F2F Program Forced to Move Online by COVID-19

The pandemic negatively impacted student learning experiences, as documented by [40,41]. Bryant [42], in a study of doctoral students, found that students of color were disproportionately impacted by COVID-19; this population took fewer courses due to financial and other issues, resulting in a lengthened time to degree completion. One year into the pandemic, Klebs et al. [43] conducted a survey that revealed an eleven-percentage-point increase (47% vs. 36%) in Latinx college students who required an extra academic year to complete their studies. For the F2F programs forced online by COVID-19, Term Count significantly increased for both student populations (Table 4), with a Cohen’s d near or above the impact threshold of 0.4. These results underscore the impact that the pandemic had in delaying traditional master’s students’ progress in their fields of study at the subject HSI.
The literature suggests that there was significant grade inflation during the pandemic. Karadag [44] speculated that grade inflation was the result of instructors being thrust into teaching online; in this setting, grading criteria were positively skewed to compensate for the impacts of a sudden shift in modality. Al-Maqbali and Hussain [45] discuss the challenges associated with online assessment that could contribute to grade inflation. These issues include academic dishonesty and impersonation due to the difficulty of monitoring students in an online setting, increased plagiarism, and blatant cheating that went unrecognized by instructors new to online platforms. Issues with academic dishonesty in remote learning environments were also highlighted by [46,47]. Another contributing factor to consider is the resilience that students exhibited during the pandemic by using active coping strategies [48]. For the F2F programs, the GPA for discontinued students increased during the COVID-19 period by 0.43 (Table 4), with a Cohen’s d near the 0.4 impact threshold. Interestingly, the GPA for graduated students showed no change between the baseline and pandemic periods. This observation supports the premise that, while discontinued students benefited from pandemic grade inflation, this effect did not translate into an improved academic outcome, as this population did not graduate.

6.2. Impact of the Transition to the Accelerated Online Format

The accelerated online model has been adopted by many universities, as it provides adult learners with graduate-level courses in a compressed time frame [49]. At the subject HSI, these courses are 100% online and 7 weeks in duration. Programs that utilize this model are typically oriented toward professional studies and include business administration, education [50,51,52], and nursing [53]. Not surprisingly, all four accelerated online programs had a significantly lower Term Count (Cohen’s d = 0.5) for graduated students compared with the baseline (Table 5). Interestingly, the accelerated online programs also saw an increase in Course Load, or SCHs attempted (Cohen’s d = 0.5), for graduated students, although this increase was not statistically significant. Unlike during the pre-accelerated period, the accelerated online programs have a prescribed sequence of required courses from which students are advised not to deviate, explaining the increased Course Load. This finding reflects the commitment of the subject HSI to removing barriers, increasing the accessibility of courses, and promoting timely program completion.
A common critique of the accelerated academic model is that such programs sacrifice academic quality and are inferior to traditional, in-person programs. The results of this study do not bear out this concern. GPA was lower for discontinued students but higher for graduated students (Table 5). For both student populations, the Course Drop Rate was higher, but the magnitude of these differences was marginal and insignificant (Cohen’s d values < 0.2), indicating that the transition to the accelerated online format did not adversely impact GPA or Course Drop Rate.
The impact of COVID-19 on the accelerated online program type was not statistically significant for GPA or Course Drop Rate for any student population (Table 6). Interestingly, trends for Term Count and Course Load mirrored those seen in the COVID-19 forced online programs for both discontinued and graduated students. However, given the relatively low sample size of these comparison groups, caution should be exercised regarding the drawing of any broad conclusions from these findings.

6.3. Decreased GPA for Accelerated Online Business Administration Program Students

Braunstein [54] found that MBA students without an undergraduate business degree performed better than those who had this degree. Gump [55] asserts that students lacking an undergraduate business degree work harder than their peers who have one, to compensate for their gaps in disciplinary knowledge. In addition, students lacking an undergraduate business degree are required to take leveling courses and have therefore learned content knowledge more recently than peers who obtained an undergraduate business degree at an earlier time. However, students in the Business Administration program at the subject HSI who lacked an undergraduate degree in business had a GPA 0.40 points lower than that of their peers who had this degree; this difference was highly significant (Table 7; adjusted p = 0.006). These students also had a statistically significantly higher Course Drop Rate (Table 7; adjusted p = 0.038). Recruitment by an external consulting firm produced an influx of students who lacked an undergraduate degree in business during the accelerated online period. These results contradict Braunstein’s study, which did not specify the modality of instruction but, given the study period (late 1990s to early 2000s), was likely F2F. The accelerated online modality diverges greatly from traditional graduate programs, perhaps accounting for these conflicting findings.

6.4. Study Limitations and Future Research Directions

This study should not be considered an endorsement or refutation of the accelerated online model, since it was based on only four master's programs at one HSI. A major limitation of this work is that only three years of data were examined for the graduate programs transformed by the accelerated online model; additional years of data would allow greater validation of these results. In terms of the impact of COVID-19 on F2F programs, the shift in modality examined was short in duration (spring 2020 to fall 2021). Future research should expand the scope to include other master's programs, extended time windows, and additional HSIs. For instance, Núñez et al. [56] identified six types of HSIs based on institution type, public or private control, enrollment, and geographic location.
Another limitation of this work is its focus on a strict academic definition of student success or persistence. Other dimensions of student success have been advocated for in the literature [57] to examine the success of people of color more holistically. These other measures include student attitudes, intention to remain in a STEM field, number of publications, self-efficacy, identity formation, and sense of belonging. Additionally, this study was conducted at a regional HSI university, and the current findings may not extend to non-HSI institutions or other institutions with different characteristics and programs.
It should be noted that a single logistic regression model was fitted to each student group, and thus the observed differences in statistical significance for the predictor variables may not be replicated in models fitted on new data. This study should be viewed as a baseline that can serve as support for follow-up research that more fully explores the multiple dimensions of student success, as indicated above. Future research should examine frameworks that integrate student experiences at HSIs to better understand how cultural, social, and institutional factors influence student educational outcomes across different program modalities and important student characteristics such as gender and socioeconomic status.

6.5. Policy Implications

At public institutions, state leaders have pressured universities to offer programs in formats that reduce time to completion and student cost. Accelerated online graduate programs have achieved these goals, but concerns linger about whether the quality of these academic programs has been impaired. This study demonstrated that accelerated online programs at an HSI can reduce time to completion without a significant impact on student success. A major caveat is that when students lack the undergraduate credentials a program requires, academic success suffers. To maximize student success, policies must address equity, quality assurance, faculty support, and cultural compatibility while ensuring that accelerated online programs align with workforce needs and are sustainable in the long term. By addressing these policy realms, HSIs can create online learning environments that not only attract students but also ensure their success and help close the higher education attainment gap. Master's students at HSIs face unique challenges. Key mitigation strategies could include flexible course designs, increased access to technology, financial aid resources, and enhanced mental health and academic support services, ensuring that these students have the tools and resources to succeed.
This work also documented the pandemic's impacts. Karakose [58] indicated a need for instructors and university administrators to monitor the online education environment and develop strategies to mitigate negative outcomes. This study revealed that such outcomes include increased time to degree and grade inflation, the latter most affecting at-risk students who discontinued their studies. The pandemic uncovered an array of issues related to online learning that policymakers need to be aware of: negative effects on student socialization, a lack of suitable distance education in some disciplines, communication issues, negative student attitudes towards distance education, motivational problems, technological gaps among economically disadvantaged students, and technical difficulties.

7. Conclusions

The results of this study largely confirm the importance of academic variables as predictors of master's students' persistence [5,6] across diverse program types. Term Count increased during the pandemic period for F2F programs forced online. Despite the disruption of the pandemic, Term Count was lower for accelerated online programs than for their pre-accelerated predecessors from before fall 2018. Interestingly, Course Load also significantly increased for the accelerated online programs, possibly reflecting their more regimented nature. This result is significant for stakeholders interested in maximizing student enrollment. No significant decrease in GPA or increase in Course Drop Rate occurred when the programs transitioned to an accelerated online format. These findings have important implications given the growing reliance on online programs, implying that efficiency need not come at the expense of academic rigor and quality.
All theoretical frameworks stress the importance of GPA as a predictor of persistence. The increased GPA for discontinued students in the F2F programs forced online was the result of the unique conditions of the pandemic. The reduced GPA for students in the Business Administration program lacking an undergraduate degree in this field is contrary to previous findings in the literature and possibly due to the accelerated online format of this program at the subject HSI. The focus of this work was strictly on academics, and future studies should examine other student success dimensions in the understudied and rapidly growing Hispanic student population.

Author Contributions

Conceptualization, K.J.T. and J.D.L.C.H.; methodology, J.D.L.C.H.; validation, J.D.L.C.H.; formal analysis, J.D.L.C.H.; investigation, K.J.T., J.R.P., N.C., M.B. and J.D.L.C.H.; resources, K.J.T.; data curation, J.D.L.C.H.; writing—original draft preparation, K.J.T.; writing—review and editing, M.B.; supervision, K.J.T.; project administration, K.J.T.; funding acquisition, K.J.T. All authors have read and agreed to the published version of the manuscript.

Funding

US Department of Education: P031S190304.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used in this study were institutional records. Our university's Institutional Review Board authorized this study on the condition that the data were anonymized, as student data are protected under the US FERPA law. Deidentified data that support the findings of this paper are not publicly available but can be shared with the journal by the submission team if necessary.

Conflicts of Interest

The authors declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this study.

References

  1. Hoops, L.D.; Yu, S.L.; Burridge, A.B.; Wolters, C.A. Impact of a Student Success Course on Undergraduate Academic Outcomes. J. Coll. Read. Learn. 2015, 45, 123–146. [Google Scholar] [CrossRef]
  2. Manyanga, F.; Sithole, A.; Hanson, S.M. Comparison of Student Retention Models in Undergraduate Education from the Past Eight Decades. J. Appl. Learn. High. Educ. 2017, 7, 30–39. [Google Scholar] [CrossRef]
  3. Mayer, J.; Dineen, R.; Rockwell, A.; Blodgett, J. Undergraduate Student Success and Library Use: A Multimethod Approach. Coll. Res. Libr. 2020, 81, 378–398. [Google Scholar] [CrossRef]
  4. Attewell, P.; Maggio, C.; Tucker, F.; Brooks, J.; Giani, M.S.; Hu, X.; Massa, T.; Raoking, F.; Walling, D.; Wilson, N. Early Indicators of Student Success: A Multi-State Analysis. J. Postsecond. Stud. Success 2022, 1, 35–53. [Google Scholar] [CrossRef]
  5. Haydarov, R.; Moxley, V.; Anderson, D. Counting Chickens Before They Are Hatched: An Examination of Student Retention, Graduation, Attrition, and Dropout Measurement Validity in an Online Master’s Environment. J. Coll. Stud. Retent. Res. Theory Pract. 2013, 14, 429–449. [Google Scholar] [CrossRef]
  6. Rotem, N.; Yair, G.; Shustak, E. Dropping Out of Master’s Degrees: Objective Predictors and Subjective Reasons. High. Educ. Res. Dev. 2021, 40, 1070–1084. [Google Scholar] [CrossRef]
  7. Howell, S.; Laws, D.; Lindsay, N. Reevaluating Course Completion in Distance Education—Avoiding the Comparison Between Apples and Oranges. Q. Rev. Distance Educ. 2004, 5, 243–252. [Google Scholar]
  8. Hagedorn, L. How to Define Retention: A New Look at an Old Problem; Transfer and Retention of Urban Community College Students Project (TRUCCS); Rossier School of Education, University of Southern California: Los Angeles, CA, USA, 2006. Available online: https://eric.ed.gov/?id=ED493674 (accessed on 30 July 2024).
  9. U.S. Census Bureau’s 2022 American Community Survey. Available online: https://www.census.gov/programs-surveys/acs (accessed on 30 July 2024).
  10. Gándara, D.; Toutkoushian, R.K. Updated Estimates of the Average Financial Return on Master’s Degree Programs in the United States. J. Educ. Financ. 2017, 43, 21–44. [Google Scholar] [CrossRef]
  11. Hutson, J.; Nasser, R.; Edele, S.; Parrish, G.; Rodgers, C.; Richmond, S.; Marzano, M.; Curtis, R. Predictors of Persistence, Retention, & Completion for First-Generation Graduate Students. J. Organ. Psychol. 2022, 22, 99–114. [Google Scholar]
  12. Tinto, V. Dropout From Higher Education: A Theoretical Synthesis of Recent Research. Rev. Educ. Res. 1975, 45, 89–125. [Google Scholar] [CrossRef]
  13. Tinto, V. Stages of Student Departure: Reflections on the Longitudinal Character of Student Leaving. J. High. Educ. 1988, 59, 438–455. [Google Scholar] [CrossRef]
  14. Tinto, V. Leaving College: Rethinking the Causes and Cures of Student Attrition; University of Chicago Press: Chicago, IL, USA, 1994. [Google Scholar]
  15. Tinto, V. Completing College: Rethinking Institutional Action; University of Chicago Press: Chicago, IL, USA, 2012. [Google Scholar]
  16. Bean, J.P.; Metzner, B.S. A Conceptual Model of Nontraditional Undergraduate Student Attrition. Rev. Educ. Res. 1985, 55, 485–540. [Google Scholar] [CrossRef]
  17. Rovai, A.P. In Search of Higher Persistence Rates in Distance Education Online Programs. Internet High. Educ. 2003, 6, 1–16. [Google Scholar] [CrossRef]
  18. Coleman, S.L. Predicting Student Dropout Risk in Online Graduate Programs: A Survival Analysis. Ph.D. Dissertation, University at Buffalo, SUNY, New York, NY, USA, 2019. Available online: https://go.openathens.net/redirector/tamiu.edu?url=https://www.proquest.com/dissertations-theses/predicting-student-dropout-risk-online-graduate/docview/2244414024/se-2 (accessed on 30 July 2024).
  19. Nora, A.; Barlow, E.; Crisp, G. Student Persistence and Degree Attainment Beyond the First Year in College. In College Student Retention: Formula for Student Success; Seidman, A., Ed.; Rowman and Littlefield Publishers: Lanham, MD, USA, 2012; pp. 163–175. [Google Scholar]
  20. Raju, D.; Schumacker, R. Exploring Student Characteristics of Retention that Lead to Graduation in Higher Education Using Data Mining Models. J. Coll. Stud. Retent. Res. Theory Pract. 2015, 16, 563–591. [Google Scholar] [CrossRef]
  21. Franklin, S.L.; Slate, J.R. First-Generation Student Enrollment and Attainment Beyond the Baccalaureate. J. Educ. Res. 2012, 6, 175–186. [Google Scholar]
  22. Sheridan, P.M.; Pyke, S.W. Predictors of Time to Completion of Graduate Degrees. Can. J. High. Educ. 1994, 24, 68–88. [Google Scholar] [CrossRef]
  23. Malone, B.G.; Nelson, J.S.; Van Nelson, C. Academic and Affective Factors Contributing to Degree Completion of Doctoral Students in Educational Administration. Teach. Educ. 2004, 40, 33–55. [Google Scholar] [CrossRef]
  24. Wilson, R.L.; Hardgrave, B.C. Predicting Graduate Student Success in an MBA Program: Regression Versus Classification. Educ. Psychol. Meas. 1995, 55, 186–195. [Google Scholar] [CrossRef]
  25. Miller, L.S.; Garcia, E.E. Better Informing Efforts to Increase Latino Student Success in Higher Education. Educ. Urban Soc. 2004, 36, 189–204. [Google Scholar] [CrossRef]
  26. Patron, H.; Lopez, S. Student Effort, Consistency, and Online Performance. J. Educ. Online 2011, 8, 1–11. Available online: http://www.thejeo.com/Archives/Volume8Number2/PatronandLopezpaper.pdf (accessed on 30 July 2024). [CrossRef]
  27. Baashar, Y.; Hamed, Y.; Alkawsi, G.; Capretz, L.F.; Alhussian, H.; Alwadain, A.; Al-amri, R. Evaluation of Postgraduate Academic Performance Using Artificial Intelligence Models. Alex. Eng. J. 2022, 61, 9867–9878. [Google Scholar] [CrossRef]
  28. Maslov Kruzicevic, S.; Barisic, K.J.; Banozic, A.; Esteban, C.D.; Sapunar, D.; Puljak, L. Predictors of Attrition and Academic Success of Medical Students: A 30-Year Retrospective Study. PLoS ONE 2012, 7, e39144. [Google Scholar] [CrossRef] [PubMed]
  29. Mendoza-Sanchez, I.; deGruyter, J.N.; Savage, N.T.; Polymenis, M. Undergraduate GPA Predicts Biochemistry PhD Completion and Is Associated with Time To Degree. CBE—Life Sci. Educ. 2022, 21, ar19. [Google Scholar] [CrossRef]
  30. Burrus, S.W.M.; Fiore, T.D.; Shaw, M.E. Predictors of Online Doctoral Student Success: A Quantitative Study. Online J. Distance Learn. Adm. 2019, 22, 61–66. Available online: https://www.westga.edu/~distance/ojdla/winter224/burrusfioreshaw224.html (accessed on 30 July 2024).
  31. Osborne, J.B.; Lang, A.S. Predictive Identification of At-Risk Students: Using Learning Management System Data. J. Postsecond. Stud. Success 2023, 2, 108–126. [Google Scholar] [CrossRef]
  32. Kim, J.H. Multicollinearity and Misleading Statistical Results. Korean J. Anesthesiol. 2019, 72, 558–569. [Google Scholar] [CrossRef]
  33. Armstrong, R.A. When to use the Bonferroni correction. Ophthalmic Physiol. Opt. 2014, 34, 502–508. [Google Scholar] [CrossRef]
  34. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Routledge: London, UK, 1988. [Google Scholar]
  35. Hattie, J. Visible Learning: A Synthesis of 800+ Meta-Analyses on Achievement; Routledge: London, UK, 2009. [Google Scholar]
  36. Garcia, G.A.; Núñez, A.-M.; Sansone, V.A. Toward a multidimensional conceptual framework for understanding “servingness” in Hispanic-Serving Institutions: A synthesis of the research. Rev. Educ. Res. 2019, 89, 745–784. [Google Scholar] [CrossRef]
  37. Noble, S.U. Algorithms of Oppression: How Search Engines Reinforce Racism; New York University Press: New York, NY, USA, 2018. [Google Scholar]
  38. Baer, T. Understand, Manage, and Prevent Algorithmic Bias: A Guide for Business Users and Data Scientists; Apress: New York, NY, USA, 2019. [Google Scholar]
  39. Baker, R.S.; Hawn, A. Algorithmic Bias in Education. EdArXiv 2021. preprint. [Google Scholar] [CrossRef]
  40. Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The Difference Between Emergency Remote Teaching and Online Learning; EDUCAUSE Review; EDUCAUSE: Louisville, CO, USA, 2020; Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (accessed on 30 July 2024).
  41. Stewart, W.H. A Global Crash-Course in Teaching and Learning Online: A Thematic Review of Empirical Emergency Remote Teaching (ERT) Studies in Higher Education During Year 1 of COVID-19. Open Prax. 2021, 13, 89–102. [Google Scholar] [CrossRef]
  42. Bryant, J. Doctoral Students’ Experiences of the Pandemic and Their Perceptions of Grit. Ph.D. Dissertation, Murray State University, Murray, KY, USA, 2023. Available online: https://digitalcommons.murraystate.edu/etd/307 (accessed on 30 July 2024).
  43. Klebs, S.; Fishman, R.; Nguyen, S.; Hiler, T. One Year Later: COVID-19's Impact on Current and Future College Students; Third Way: Washington, DC, USA, 2021; Available online: https://www.thirdway.org/memo/one-year-later-covid-19s-impact-on-current-and-future-college-students (accessed on 30 July 2024).
  44. Karadag, E. Effect of COVID-19 Pandemic on Grade Inflation in Higher Education in Turkey. PLoS ONE 2021, 16, e0256688. [Google Scholar] [CrossRef]
  45. Al-Maqbali, A.H.; Hussain, R.M.R. The Impact of Online Assessment Challenges on Assessment Principles During COVID-19 in Oman. J. Univ. Teach. Learn. Pract. 2022, 19, 73–92. [Google Scholar] [CrossRef]
  46. Kirk-Jenkins, A.J.; Hughey, A.W. Abrupt Adaption: A Review of the Impact of the COVID-19 Pandemic on Faculty in Higher Education. J. Profr. 2021, 12, 104–121. [Google Scholar]
  47. Mucci-Ferris, M.; Grabsch, D.K.; Bobo, A. Positives, Negatives, and Opportunities Arising in the Undergraduate Experience During the COVID-19 Pandemic. J. Coll. Stud. Dev. 2021, 62, 203–218. [Google Scholar] [CrossRef]
  48. Lee, J.; Solomon, M.; Stead, T.; Kwon, B.; Ganti, L. Impact of COVID-19 on the mental health of US college students. BMC Psychol. 2021, 9, 95. [Google Scholar] [CrossRef] [PubMed]
  49. Wlodkowski, R.J.; Ginsberg, M.B. Teaching Intensive and Accelerated Courses: Instruction That Motivates Learning; John Wiley & Sons: Hoboken, NJ, USA, 2010. [Google Scholar]
  50. Soles, N.; Maduli-Williams, D. Student Perceptions of an Accelerated Online Master’s in Education Administration Program Through the Lens of Social Presence. Educ. Leadersh. Adm. Teach. Program Dev. 2019, 30, 56–82. [Google Scholar]
  51. Hernandez, R.; Garcia, A. Graduate Student’s Perceptions of Academic Coaches in Accelerated Online Courses. Q. Rev. Distance Educ. 2022, 23, 99–117. [Google Scholar]
  52. Lowenthal, P.R.; Trespalacios, J. Classroom Community and Time: Comparing Student Perceptions in Traditional vs. Accelerated Online Courses. Online Learn. J. 2022, 26, 59–77. [Google Scholar] [CrossRef]
  53. Zajac, L.K.; Lane, A.J. Student Perceptions of Faculty Presence and Caring in Accelerated Online Courses. Q. Rev. Distance Educ. 2020, 21, 67–78. [Google Scholar]
  54. Braunstein, A.W. MBA Academic Performance and Type of Undergraduate Degree Possessed. Coll. Stud. J. 2006, 40, 685–690. [Google Scholar]
  55. Gump, S.E. Defining Distinction: Characteristics of Top MBA Students at Cardiff Business School. Int. Educ. 2003, 32, 63–84. [Google Scholar]
  56. Núñez, A.-M.; Crisp, G.; Elizondo, D. Mapping Hispanic-serving institutions: A typology of institutional diversity. J. High. Educ. 2016, 87, 55–83. [Google Scholar] [CrossRef]
  57. Weatherton, M.; Schussler, E.E. Success for All? A Call to Re-Examine How Student Success Is Defined in Higher Education. CBE—Life Sci. Educ. 2021, 20, es3. [Google Scholar] [CrossRef] [PubMed]
  58. Karakose, T. The Impact of the COVID-19 Epidemic on Higher Education: Opportunities and Implications for Policy and Practice. Educ. Process Int. J. 2021, 10, 7–12. [Google Scholar] [CrossRef]
Table 1. Headcount and demographics by program type.

| Program Type | Total Student Headcount | Students Used in Models | % Hispanic | % Female | % from HSI City |
|---|---|---|---|---|---|
| Pre-Accelerated | 545 | 486 | 78.2% | 53.3% | 68.3% |
| Accelerated Online | 1184 | 1024 | 64.6% | 67.9% | 34.6% |
| Baseline Face-to-Face (F2F) | 676 | 502 | 91.8% | 69.3% | 84.3% |
| COVID-19 Forced Online | 335 | 312 | 94.9% | 66% | 79.8% |
Table 2. Logistic regression odds ratios (standard errors) for the four program types.

| Variable | Pre-Accelerated | Accelerated Online | F2F | COVID-19 Forced Online |
|---|---|---|---|---|
| Accuracy | 93.0% | 95.61% | 94.42% | 92.95% |
| Age | 0.8357 (0.7351) | 1.3014 (0.526) | 2.3467 (0.7064) | 1.3987 (0.7228) |
| Ethnicity | 0.4264 (0.3229) ** | 1.1158 (0.2156) | 1.4382 (0.4599) | 0.7054 (0.7234) |
| Sex | 1.3429 (0.2675) | 1.1831 (0.2206) | 0.8892 (0.2816) | 1.0943 (0.3113) |
| Term Count | 0.0018 (0.7642) ** | 0.0003 (0.6189) ** | 0.0019 (0.6278) ** | 0.0037 (0.7402) ** |
| Course Load | 0.1252 (0.6669) ** | 0.0198 (0.4642) ** | 0.0442 (0.7017) ** | 0.1261 (1.4834) |
| Course Drop Rate | 29.4326 (1.0199) ** | 90.3667 (0.7843) ** | 16.587 (1.2538) * | 31.7229 (0.8556) ** |
| GPA | 0.0753 (1.0186) * | 0.0239 (0.8057) ** | 0.0602 (0.9236) ** | 0.371 (1.1454) |

**: Variable is highly significant with p < 0.01. *: Variable is significant with p = 0.010 to 0.050.
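The odds ratios in Table 2 are the exponentiated coefficients of the fitted logistic regression models. A minimal gradient-descent sketch on synthetic data — not the study's model or data — illustrates the relationship between a fitted coefficient and its odds ratio:

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=4000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by batch gradient descent; returns (b0, b1)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

random.seed(0)
# Synthetic data in which higher GPA raises the odds of graduating (true slope = 2)
gpas = [random.uniform(2.0, 4.0) for _ in range(300)]
grads = [1 if random.random() < 1 / (1 + math.exp(-(-6 + 2 * g))) else 0 for g in gpas]
b0, b1 = fit_logistic(gpas, grads)
print("odds ratio per GPA point:", round(math.exp(b1), 2))
```

An odds ratio above 1 means each one-unit increase in the predictor multiplies the odds of the outcome; below 1, it divides them. How a given table value reads therefore depends on how the outcome variable was coded.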
Table 3. Variance Inflation Factor (VIF) of logistic regression predictors for the four program types.

| Variable | Pre-Accelerated | Accelerated Online | F2F | COVID-19 Forced Online |
|---|---|---|---|---|
| Age | 1.063 | 1.064 | 1.075 | 1.167 |
| Ethnicity | 1.046 | 1.073 | 1.06 | 1.044 |
| Sex | 1.079 | 1.034 | 1.044 | 1.022 |
| Term Count | 1.221 | 1.265 | 1.218 | 1.075 |
| Course Load | 1.037 | 1.067 | 1.056 | 1.109 |
| Course Drop Rate | 2.639 | 2.946 | 2.134 | 1.432 |
| GPA | 2.677 | 3.010 | 2.259 | 1.417 |
Table 4. Comparison of variable averages based on the student populations in the F2F and COVID-19 forced online program types.

| Group | n | GPA | Term Count | Course Load | Course Drop Rate |
|---|---|---|---|---|---|
| F2F (Discontinued) | 224 | 2.87 | 2.30 | 5.45 | 17.2% |
| COVID-19 Forced Online (Discontinued) | 109 | 3.30 | 3.05 | 5.19 | 24.5% |
| Adjusted p-Value | | 0.003 ** | 0.014 * | 0.247 | 0.077 |
| Cohen's d | | 0.3842 | 0.3518 | −0.1453 | 0.2486 |
| F2F (Graduated) | 278 | 3.73 | 6.32 | 6.61 | 0.9% |
| COVID-19 Forced Online (Graduated) | 203 | 3.73 | 7.07 | 6.29 | 2.1% |
| Adjusted p-Value | | 0.990 | <0.001 ** | 0.1462 | 0.160 |
| Cohen's d | | 0.0012 | 0.4285 | −0.1608 | 0.1597 |

**: Variable is highly significant with p < 0.01. *: Variable is significant with p = 0.01 to 0.05.
Table 5. Comparison of variable averages based on the student populations in the pre-accelerated and accelerated online program types.

| Group | n | GPA | Term Count | Course Load | Course Drop Rate |
|---|---|---|---|---|---|
| Pre-Accelerated (Discontinued) | 215 | 2.71 | 2.00 | 5.44 | 26.3% |
| Accelerated Online (Discontinued) | 451 | 2.56 | 1.97 | 5.71 | 32.2% |
| Adjusted p-Value | | 0.261 | 0.794 | 0.191 | 0.155 |
| Cohen's d | | −0.1041 | −0.023 | 0.1189 | 0.1637 |
| Pre-Accelerated (Graduated) | 271 | 3.75 | 5.04 | 6.16 | 0.9% |
| Accelerated Online (Graduated) | 573 | 3.78 | 4.37 | 7.37 | 1.1% |
| Adjusted p-Value | | 0.2094 | <0.001 ** | 0.1914 | 0.7662 |
| Cohen's d | | 0.1099 | −0.4733 | 0.5398 | 0.0298 |

**: Variable is highly significant with p < 0.01.
Table 6. Comparison of variable averages based on the student populations in the accelerated online program, divided based on the pre-COVID-19 and COVID-19 periods.

| Group | n | GPA | Term Count | Course Load | Course Drop Rate |
|---|---|---|---|---|---|
| Pre-COVID-19 (Discontinued) | 96 | 2.21 | 1.35 | 6.59 | 37.5% |
| COVID-19 (Discontinued) | 355 | 2.65 | 2.13 | 5.47 | 30.7% |
| Adjusted p-Value | | 0.111 | <0.001 ** | 0.010 * | 1.000 |
| Cohen's d | | 0.2923 | 0.6785 | −0.4583 | −0.1819 |
| Pre-COVID-19 (Graduated) | 50 | 3.81 | 3.18 | 9.43 | 0.2% |
| COVID-19 (Graduated) | 523 | 3.78 | 4.48 | 7.17 | 1.1% |
| Adjusted p-Value | | 1.000 | <0.001 ** | <0.001 ** | 0.086 |
| Cohen's d | | −0.0933 | 1.0355 | −0.9645 | 0.1690 |

**: Variable is highly significant with p < 0.01. *: Variable is significant with p = 0.01 to 0.05.
Table 7. Two-tailed t-tests of academic variables within the Business Administration program.

| Variable | Non-Business Undergraduate Major (n = 253) | Business Undergraduate Major (n = 125) | Adjusted p-Value |
|---|---|---|---|
| GPA | 2.98 | 3.38 | 0.006 ** |
| Term Count | 3.64 | 3.55 | 0.655 |
| Course Load | 5.78 | 6.25 | 0.053 |
| Course Drop Rate | 16.9% | 9.8% | 0.038 * |

**: Difference between undergraduate major groups is highly significant with p < 0.01. *: Difference is significant with p = 0.01 to 0.05.
