Article

AI in the Classroom: Insights from Educators on Usage, Challenges, and Mental Health

by Julie A. Delello 1,*, Woonhee Sung 1, Kouider Mokhtari 1, Julie Hebert 2, Amy Bronson 2 and Tonia De Giuseppe 3

1 School of Education, The University of Texas at Tyler, Tyler, TX 75799, USA
2 Physician Assistant Program, West Coast University, Richardson, TX 75080, USA
3 Department of Didactics, Special Pedagogy, and Educational Research, Giustino Fortunato University of Benevento, 82100 Benevento, Italy
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(2), 113; https://doi.org/10.3390/educsci15020113
Submission received: 7 December 2024 / Revised: 14 January 2025 / Accepted: 17 January 2025 / Published: 21 January 2025

Abstract

This study examines educators’ perceptions of artificial intelligence (AI) in educational settings, focusing on their familiarity with AI tools, integration into teaching practices, professional development needs, the influence of institutional policies, and impacts on mental health. Survey responses from 353 educators across various levels and countries revealed that 92% of respondents are familiar with AI, utilizing it to enhance teaching efficiency and streamline administrative tasks. Notably, many educators reported students using AI tools like ChatGPT for assignments, prompting adaptations in teaching methods to promote critical thinking and reduce dependency. Some educators saw AI’s potential to reduce stress through automation, but others raised concerns about increased anxiety and social isolation from reduced interpersonal interactions. This study highlights a gap in institutional AI policies, leading some educators to establish their own guidelines, particularly for matters such as data privacy and plagiarism. Furthermore, respondents identified a significant need for professional development focused on AI literacy and ethical considerations. This study’s findings suggest the necessity for longitudinal studies to explore the long-term effects of AI on educational outcomes and mental health and underscore the importance of incorporating student perspectives for a thorough understanding of AI’s role in education.

1. Introduction

Artificial intelligence (AI) has been a topic of research for more than seven decades, with roots tracing back to pivotal contributions from both Alan Turing and John McCarthy. Turing’s 1950 paper, “Computing Machinery and Intelligence”, posed foundational questions about machine thinking, marking the conceptual origins of AI (Turing, 1950). Building on these ideas, John McCarthy formalized AI as a field at the pioneering 1956 workshop at Dartmouth College, where he coined the term artificial intelligence, stating that “every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it” (Moor, 2006, p. 87).
While the concepts and definitions of AI have been around for decades, recent years have seen a surge of interest in its potential, especially in education. Furthermore, the emergence of large language models (LLMs) like ChatGPT has accelerated AI’s integration into the classroom, offering educators new tools to enhance instructional time, streamline administrative tasks, analyze data, support student engagement, and provide automated feedback. However, despite these promising opportunities for enhancing learning experiences, significant gaps remain in understanding how to effectively integrate AI and address the barriers to its implementation into educational practices (J. Delello et al., 2024a). Furthermore, studies indicate a concerning increase in mental health challenges among students, attributed to factors such as internet addiction (Shen et al., 2020), social media use, social isolation (Chen et al., 2024), and online multitasking (J. A. Delello et al., 2016). Understanding the relationship between these issues and the potential impact of AI on mental health is crucial for informing effective intervention strategies and ethical guidelines in educational settings.

2. Literature Review

2.1. Benefits of AI in Educational Settings

AI technologies are increasingly being integrated into educational settings, with promises to revolutionize teaching and learning. By enhancing areas such as efficiency (Alexander et al., 2019; J. A. Delello et al., 2024b; Farahani & Ghasmi, 2024; Pedro et al., 2019; Kannan & Zapata-Rivera, 2022), personalized learning (Ahmad et al., 2021; Zawacki-Richter et al., 2019), intelligent tutoring (Alam, 2021; Owoc et al., 2021), student motivation (Kamalov et al., 2023), and data-driven insights (Kannan & Zapata-Rivera, 2022), AI holds the promise to reshape traditional educational practices, elevating both teaching and learning experiences.

2.1.1. Educational Efficiency

The integration of AI significantly streamlines both administrative duties and management tasks within educational settings, while also revolutionizing how educational content is delivered (Khan et al., 2025; Milberg, 2024). For example, AI-driven applications enable educators to dramatically reduce the time needed to develop dynamic and engaging learning materials (Farahani & Ghasmi, 2024; Pedro et al., 2019), provide timely feedback (Yuan et al., 2020), simplify the evaluation process, and customize teaching strategies to accommodate the varied learning styles of students (J. Delello et al., 2024a). Furthermore, institutions are adopting learning analytics dashboards (LADs) equipped with AI, which utilize machine learning (ML) algorithms to sort through large datasets to identify intricate patterns and forecast educational outcomes (Kannan & Zapata-Rivera, 2022). The Educause Horizon Report suggested that “AI’s ability to personalize experiences, reduce workloads, and assist with the analysis of large and complex datasets recommends it to educational applications” (Alexander et al., 2019, p. 27).

2.1.2. Personalized Learning

AI is reshaping educational practices by providing tailored learning experiences through adaptive platforms and intelligent tutoring systems (ITSs). At the core of this personalized approach, ML algorithms analyze individual student performance data to identify progress patterns and areas where additional support is needed (J. Delello et al., 2024a). This allows AI tutors to offer one-to-one support that mirrors individualized instruction, adapting guidance in real time to each student’s progress, mastery levels, and learning styles, fostering more effective learning outcomes (Ahmad et al., 2021). By offering immediate feedback, AI tutors allow students to address misunderstandings right away, building both comprehension and confidence. For example, Duolingo, a language-learning application, leverages AI to detect areas where users struggle and then customizes follow-up lessons to focus on those specific words and phrases (Yin, 2024).
ITSs like those described by Zawacki-Richter et al. (2019) “can make decisions about the learning path of an individual student and the content to select, provide cognitive scaffolding and help, to engage the student in dialogue” (p. 4). These AI tutors continuously assess student performance, recommending personalized exercises, readings, or simulations to support specific needs. Such technologies enable both students and educators to monitor learning pathways dynamically, creating a responsive educational environment (Alam, 2021; Owoc et al., 2021). For example, the AI-adapted learning tool Mindspark was implemented in classrooms in Rajasthan, India, to address student learning gaps. The findings showed that students increased learning in both math and Hindi by 0.2 standard deviations compared to the control group (BW Online Bureau, 2019). This individualized approach curates materials that align with each student’s pace and learning style, offering sustained support for comprehension and retention (Zawacki-Richter et al., 2019). Additionally, a study by Huesca et al. (2024) demonstrated significant improvements in normalized learning gains among undergraduate engineering students who used ChatGPT in a flipped classroom setting compared to those who followed traditional methods. This underscores the potential of AI to not only support but actively improve the learning process, adapting to the needs of students in diverse educational contexts.

2.1.3. Student Motivation

Kamalov et al. (2023) highlighted the potential of AI to increase student engagement and motivation by making learning more immersive and interactive. Also, Wu et al. (2023) suggested that the use of AI in video games increases visual quality, improves gameplay, and creates more life-like and immersive worlds for players. LLMs like ChatGPT can be considered “game-like” in the way they simulate interactivity and encourage exploration through engaging and dynamic experiences. ChatGPT can present students with problem-solving tasks, quizzes, or scenarios, mimicking the challenge–response dynamic seen in games. Further, integrating game-like features with an LLM like ChatGPT in educational settings may also boost student engagement and enhance learning by providing interactive, tailored, and dynamic experiences (Tulsiani, 2024).
AI-powered tools also enable students to engage in simulations and scenarios that allow them to apply knowledge in real-world situations, enriching the learning experience. These technologies have been especially effective in areas such as medical education, where AI systems simulate real-world scenarios, allowing students to practice critical decision-making in a controlled environment (Masters, 2019).

2.1.4. Educational Equity

AI technologies are transforming educational spaces by promoting equity and accessibility across diverse learning needs. Features like text-to-speech and voice typing assist students with disabilities, creating a more supportive learning environment (see J. A. Delello et al., 2024b). Educators can use AI tools like MagicSchool.ai or Diffit.me to create individualized lesson plans to meet the needs of diverse students (Brazeau, 2024). Also, AI can expand access to resources, particularly in underserved or remote regions (Ahmad et al., 2021). For example, Saavedra et al. (2024) found that the AI tutor Rori provided effective, personalized math support, improving student achievement by adapting to individual learning needs in Ghana, where educational resources were limited. In addition, the United Nations International Children’s Emergency Fund (UNICEF) is leveraging AI to transform traditional textbooks, integrating features such as narration, sign-language videos, interactive elements, text-to-speech capabilities, and other tools designed to accommodate students’ needs and accessibility requirements (World Economic Forum, 2024).
However, while AI has great potential to enhance educational equity, it also presents challenges that must be addressed. The reliance on advanced technologies may exacerbate existing inequities due to disparities in access to reliable internet, devices, or technological literacy, particularly in under-resourced communities.

2.2. Concerns of AI in Education

Despite its benefits, the implementation of AI in education raises several ethical and safety concerns. One significant risk is the potential for bias in AI algorithms, which may perpetuate existing inequalities (Miller et al., 2018). Additionally, inadequate sampling and improper training of generative AI models can result in biased predictions, further amplifying these disparities (Megahed et al., 2023). Alongside these risks, AI presents other challenges, including issues related to student privacy, ethics, inaccuracies, plagiarism, over-reliance on technology tools, and the need to maintain human interaction in educational settings (J. Delello et al., 2023).

2.2.1. Academic Integrity

AI tools can enable students to generate content, such as essays, without them fully understanding or engaging with the material. This over-reliance on AI may lead to academic dishonesty and hinder the development of critical thinking skills, as students bypass deeper cognitive engagement. In addition, AI-generated content often contains inaccuracies, making it difficult for students to discern valid information. Educators are particularly worried about AI tools like ChatGPT being used to plagiarize or cheat on assignments, further compromising academic integrity (Mittal, 2024; Moorhouse et al., 2023).

2.2.2. Bias and Equity

AI algorithms have the potential to heighten racial disparities in education, largely because the historical data used to train these technologies frequently carry existing biases, which can be inadvertently reproduced in the algorithm’s outcomes (Pham et al., 2024). Moreover, AI bias has also affected non-native English speakers, as a Stanford University study revealed that AI detectors may incorrectly flag their written work as AI-generated, leading to unjust accusations of academic dishonesty (Liang et al., 2023). Barriers to implementation present further challenges: ensuring equitable access to AI technologies for all students, especially those from underserved communities, remains critical.

2.2.3. Privacy and Data Security

Alongside concerns about bias and equity, the rise of AI in education raises significant privacy and data security issues. AI technologies frequently require large volumes of personal data, sparking concerns about data storage and usage practices. The potential for data breaches or misuse of sensitive information underscores the need for robust policies in educational institutions to safeguard student privacy and ensure ethical data handling (Owoc et al., 2021).

2.2.4. Student and Teacher Relationships

The impersonal nature of AI-driven learning may weaken teacher–student relationships, reducing social interaction and empathy in educational environments (Guilherme, 2019). The increased use of AI technology can distance teachers and students, making the relationship more impersonal and negatively impacting student engagement and connection. Moreover, some students may anthropomorphize AI tools, such as chatbots, by attributing human-like emotions or thoughts to them, a phenomenon known as the Eliza Effect (Dillon, 2020). This could lead students to over-rely on AI outputs and view them as genuine or factual, further distancing themselves from human interactions and the critical evaluation process (Mullaney, 2024).

2.2.5. Lack of Training and Support

Training programs that foster AI literacy among teachers are crucial for addressing both ethical concerns and the technical limitations of AI (Paek & Kim, 2021). Professional development initiatives that provide practical tools and ongoing support help educators navigate the complexities of AI-enhanced learning environments (Ahmad et al., 2021). Institutions that invest in both AI training and technical support will be better equipped to address these challenges and leverage AI’s potential in education (Gocen & Aydemir, 2020). However, recent research from the Center on Reinventing Public Education (CRPE) shows that many teacher preparation programs are slow to include AI training (Weiner et al., 2024). Furthermore, faculty often focus on issues like plagiarism rather than teaching future educators how to effectively use AI in classrooms, limiting new teachers’ readiness for AI’s role in education (Weiner et al., 2024).

2.3. AI and Mental Health

AI technologies hold both promise and challenges in addressing mental health among students and educators. As educational institutions increasingly adopt AI tools for personalized learning and administrative efficiency, understanding the impact of these technologies on the mental well-being of both educators and students becomes essential. While AI offers sophisticated tools for enhancing learning experiences and supporting mental health interventions, it also raises concerns about stress, privacy, and the quality of human interactions.

2.3.1. Benefits of AI in Supporting Mental Health

AI technologies offer solutions to mental health challenges in educational settings. For example, personalized AI-driven learning experiences adapt to individual learner needs, potentially reducing academic stress (H. Li et al., 2023) and improving well-being. AI tutoring systems provide immediate feedback, alleviating feelings of frustration and inadequacy, increasing students’ self-efficacy and confidence (Yang & Xia, 2023). Further, AI has brought significant advancements in mental health support, providing tools such as automated feedback, virtual tutoring, and real-time assessments. Additionally, educators benefit from AI’s capacity to streamline administrative tasks, improving productivity and efficiency (Poth, 2023). Institutions can further reduce technostress by offering tailored workshops and resources to train educators in AI applications while providing emotional support for students navigating these technologies (Karan & Angadi, 2023).
AI’s capabilities in leveraging large-scale data analytics, ML, and natural language processing (NLP) may also enhance the detection and treatment of mental health conditions (D’Alfonso, 2020; Small et al., 2020). One of AI’s strengths is its ability to identify patterns in mental health data that may be difficult for human practitioners to detect. For instance, ML algorithms can analyze language changes to predict mental health issues, offering a precise and scalable approach to diagnosis and intervention (Olawade et al., 2024). In fact, AI models have shown promise in predicting conditions like depression and anxiety by analyzing electronic health records (EHRs) (Nemesure et al., 2021) and social media data (Santos et al., 2024). Ettman and Galea (2023) emphasized AI’s role in increasing access to mental health care through web-based programs, such as mindfulness-based cognitive therapy, which have demonstrated effectiveness in improving depression outcomes (Small et al., 2020). Furthermore, Ecological Momentary Interventions (EMIs), delivered via personal mobile devices, provide personalized psychological prompts that can help prevent mental health issues from escalating (Heron & Smyth, 2010). Moreover, AI tools, such as chatbots, are designed to supplement, rather than replace, human therapists, offering scalable interventions that reduce barriers like cost and stigma. As a result, AI holds the potential to expand access to mental health care for those who might otherwise lack resources, including students and educators in underserved schools, by offering scalable and adaptable solutions.
By enabling timely responses before conditions worsen, AI may provide schools with tools to proactively address mental health issues, especially in underserved regions where resources may be limited (Molli, 2024; Sharma et al., 2023). However, the effectiveness of these tools depends on the quality and diversity of training data, which often exclude underrepresented groups, highlighting the need for equitable AI practices (Small et al., 2020).

2.3.2. Challenges and Adverse Effects of AI on Mental Health

Despite its numerous benefits, AI integration is not without challenges that can adversely affect mental health. The AI tools that support learning may also introduce adverse psychological effects, creating new stressors such as the need to learn and adapt to complex technologies, which can lead to technostress and burnout (Chang et al., 2024). Educators face comparable challenges, with anxieties often stemming from uncertainties about how to effectively integrate AI into the classroom and feeling overwhelmed with its technical demands (Karan & Angadi, 2023). For example, a recent study by The Upwork Research Institute found that 77% of employees using AI tools reported diminished productivity and an increase in their workload (Monahan & Burlacu, 2024). According to Chang et al. (2024), “The increasing integration of artificial intelligence (AI) within enterprises generates significant technostress among employees, potentially influencing their intention to adopt AI” (p. 413). This technostress among educators can hinder their ability to engage students meaningfully, potentially impacting the quality of mentorship and emotional support they provide. Excessive reliance on AI or perceptions of constant monitoring in classrooms may contribute to educator anxiety and stress (TeachFlow, 2023).
In technology-mediated environments, students often feel disconnected despite digital connectivity, a common scenario in college educational settings (Chen et al., 2024). Additionally, AI tools like virtual assistants and automated grading systems may reduce face-to-face interactions between students and educators, promoting feelings of social isolation (Guilherme, 2019). Research suggests that increased use of AI in educational settings may unintentionally contribute to social isolation, a known risk factor for mental health issues (Small et al., 2020).
AI’s potential to replace educators raises concerns about diminishing meaningful interactions, thereby weakening the relationship between students and teachers (Karan & Angadi, 2023). Furthermore, the connection between internet addiction and mental health challenges among students has been linked to the rise in AI-powered platforms. Many AI tools promote constant connectivity, which can foster addictive behaviors. Multitasking with technology like AI tools may lead to cognitive overload and stress, impairing students’ focus and cognitive performance (Small et al., 2020; J. A. Delello et al., 2016).

2.3.3. Ethical Considerations

Ensuring the responsible use of AI in mental health care is crucial to mitigating potential risks for those receiving support, particularly in educational settings where students’ and educators’ well-being is central to success. Ethical challenges include making sure that AI algorithms are designed with fairness, autonomy, beneficence, and justice in mind (Graham et al., 2019). Standards must be developed to guide the selection, testing, and evaluation of AI systems in schools, with careful attention to bias in data and predictions. As AI becomes more integrated into educational environments to support mental health, safeguards are needed to prevent harm and ensure the accuracy of AI-driven interventions (Ettman & Galea, 2023).
Privacy and confidentiality are paramount, especially in handling sensitive mental health data within educational institutions (Espejo et al., 2023). The human connection remains vital in mental health care, particularly in schools, where counselors and educators provide critical empathy and support; AI must be used as a tool to complement, rather than replace, these essential human interactions.
While AI holds transformative potential for mental health care in education, more research is needed to ensure its effective, ethical, and equitable use in improving the well-being of students, educators, and school communities. Studies should explore not only the outcomes of AI-assisted interventions but also their impact on fostering connection and relationships in schools, ensuring that AI complements rather than replaces traditional support models. Special emphasis should also be placed on underserved populations in order to ensure equitable access to AI-driven care solutions (Olawade et al., 2024).
In conclusion, AI has transformative potential in addressing mental health challenges in educational settings. However, its promise can only be realized through ethical design, inclusive data practices, and a commitment to augment, not replace, the human touch in mental health care. Ultimately, the synergy between AI innovation and human expertise will determine its success in fostering well-being and creating supportive learning environments.

2.4. Theoretical Framework

The theoretical framework for this study is primarily based on the Technology Acceptance Model (TAM) (Davis, 1986). The TAM suggests that there are two primary factors influencing the acceptance and adoption of new technologies: perceived usefulness (PU) and perceived ease of use (PEOU) (Davis, 1989). According to Davis et al. (1989), “Perceived usefulness (U) is defined as the prospective user’s subjective probability that using a specific application system will increase his or her job performance within an organizational context” (p. 985). PEOU, in turn, is defined as the degree to which an individual believes that using a particular technology would be free of effort (Davis, 1989). In fact, He et al. (2018) reported that when technology is perceived as easy to use, users’ confidence and competence in adopting it increase, which also enhances self-efficacy. In an exploratory study on factors influencing engagement in AI education in K-12 settings, W. Li et al. (2024) reported that PU and PEOU significantly influenced users’ attitudes and intentions to adopt new technologies. Further, Zhang et al. (2023) suggested that PU and PEOU are the two main factors affecting pre-service teachers’ intentions to use AI technology, with PU having a greater impact than PEOU.
In addition to the TAM, this study integrates Sociotechnical Systems Theory (STS) to better understand both the social and technological factors that influence the use and impact of emerging technologies in educational environments. Sociotechnical Systems Theory, initially developed by Emery and Trist (1960), suggests that effective organizational performance results from the design and integration of both the social system (including people, work processes, and organizational structures) and the technical system (tools, technology, and work environments). This theory promotes creating work systems that incorporate both human (social) elements and technological components to enhance overall efficiency, performance, and the well-being of employees (Cuofano, 2024). However, Pasmore et al. (2018) emphasized that “the technology used to perform work will constantly evolve, requiring ongoing adjustments rather than designing a social system around a fixed technology” (p. 45). This underscores the importance for educational systems to continually adapt and align new tools such as AI with evolving educational practices and the changing needs of educators.

2.5. Research Aim and Questions

As AI continues to permeate educational settings, it becomes imperative to explore its potential ramifications on student and educator well-being. While AI offers advantages such as personalized learning and efficiency, it also presents risks, including concerns about data privacy, algorithmic bias, and its influence on mental health. Therefore, interdisciplinary collaboration is crucial to navigate these complexities, develop ethical guidelines, and ensure the responsible and equitable use of AI across all educational domains (Chiu et al., 2023).
The objective of this research was to explore educators’ perceptions regarding the use of AI in education, focusing on their concerns, perceived benefits, and ethical challenges. This study examined AI’s impact on teaching and learning processes, including its potential to personalize learning, improve efficiency, and pose risks to privacy and mental health. By analyzing how educators integrate AI tools into their teaching, adjust methodologies, and navigate institutional policies, this research contributes to the understanding of AI’s emerging role in education. Specifically, the research questions are as follows:
  • How familiar are educators with AI, and how frequently do they use AI tools at home and in the classroom?
  • How do educators adjust their teaching methods in response to students using AI tools for assignments?
  • What training and resources do educators need to effectively integrate AI tools into their teaching?
  • How do institutional policies affect educators’ ability to adopt and use AI tools in their teaching?
  • What is the perceived impact of AI on educators’ and students’ mental health, and what strategies do educators suggest to mitigate any negative effects?

3. Methodology

This study utilized a mixed-methods approach, combining both quantitative and qualitative data. Data were gathered through an online Qualtrics survey (Qualtrics.com) that included both open- and closed-ended questions. Seven of the survey questions examined demographics such as gender, ethnicity, country, level of education, years of experience, type of school, and occupation. The survey also included six multiple-choice and five open-ended questions covering educators’ familiarity with AI, the extent to which they perceived students to use AI, AI use in work and home settings, specific tools used in the classroom, the level of agreement on AI tools and platforms, school policies regarding AI, the impact of AI on mental health, training needed, curriculum adjustments, and additional suggestions. For recruitment purposes, a script was posted on educational listservs and social media pages and emailed to educators. This study was based on a snowball sampling approach. This non-probabilistic method was selected for its effectiveness in leveraging professional networks to reach a broad and varied group of educators. By relying on referrals, this approach allowed us to extend the survey’s reach through the interconnected networks of participating educators, thereby capturing insights from a wide array of educational contexts. The research study was approved by the Institutional Review Board at the researchers’ university.

Participant Demographics

A total of 422 participants accessed the survey, and 388 provided online consent to participate. After excluding 54 responses with incomplete demographic information, the final sample consisted of 334 educators. Participant demographic details are presented in Table 1. In terms of current residency, 99 respondents (29.64%) were from the United States, while the majority (n = 235, 70.36%) resided in various international locations, including the United Kingdom, Italy, Australia, Germany, the Philippines, Nigeria, South Korea, India, Sweden, Brazil, Azerbaijan, Sudan, and Malaysia.
The educators represented a diverse range of educational backgrounds and teaching experiences. Nearly half of the participants (n = 162, 48.50%) held a professional degree, while 94 (28.14%) had earned a doctorate. Participants also varied in their teaching experience (see Table 1). In terms of professional roles, the majority (n = 236, 70.66%) reported being employed in educational settings. The remaining 98 participants (29.34%) held other roles, such as substitute teachers, special education teachers, medical educators, researchers, adjunct instructors, lecturers, counselors, instructional librarians, and postdoctoral fellows.
Educators worked across various educational settings. The largest group (n = 102, 30.54%) was employed at public four-year institutions, followed by 55 participants (16.47%) at public pre-kindergarten to 12th grade (PK-12) schools. Additionally, 35 participants (10.48%) worked at private four-year institutions, while 27 (8.08%) were employed at public two-year institutions. A smaller percentage (2.99%) worked at private two-year institutions, and three participants (0.90%) were employed at private PK-12 schools. Another 102 participants (30.54%) worked in other educational environments, such as vocational institutions and other colleges. Among the participants from Italy, 9 educators worked in lower secondary (middle school) settings, and 10 were employed in upper secondary (high school) settings.

4. Data Analysis

This study involved both quantitative and qualitative data. The quantitative data were descriptively analyzed to show participants’ demographic information, familiarity with AI, students’ uses of AI, uses of AI in the workforce, and the level of agreement regarding AI tools. Independent t-tests were utilized to analyze differences between demographic groups on selected variables, such as AI familiarity by region. Additionally, the qualitative responses were analyzed using an inductive and comparative approach. Researchers independently developed initial codes, categorized the data, and refined these categories into broader themes. To enhance reliability, intercoder agreement was achieved through collaborative comparison of coding results. This process ensured a comprehensive understanding of educators’ perspectives on AI integration, including their recommendations for improving AI use in educational settings.
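The intercoder agreement described above was reached through collaborative comparison rather than a formal reliability statistic. As a purely hypothetical illustration of how such agreement between two coders could be quantified, the sketch below computes Cohen’s kappa; the `cohens_kappa` function, the theme labels, and the example data are illustrative assumptions, not taken from this study:

```python
# Hypothetical sketch: quantifying intercoder agreement with Cohen's kappa.
# The labels and data below are invented for illustration only.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same set of responses."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of responses where the two coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance-expected agreement from each coder's label frequencies
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

# Invented theme codes assigned by two researchers to eight open-ended responses
a = ["redesign", "ethics", "ethics", "writing", "redesign", "none", "ethics", "writing"]
b = ["redesign", "ethics", "writing", "writing", "redesign", "none", "ethics", "none"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

By common convention, kappa values above roughly 0.6 are often interpreted as substantial agreement, which is one way a coding team could check reliability before refining categories into themes.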

5. Findings

5.1. Educators’ Familiarity with AI and Students’ Uses of AI

In terms of familiarity with AI, a 4-point scale was utilized (extremely knowledgeable = 4, not knowledgeable at all = 1), and the overall average indicated slight to moderate knowledge of the term AI (M = 2.67, Mdn = 3.00, SD = 0.77, n = 334). A total of 219 educators (65.57%) reported being moderately to extremely knowledgeable about AI, with 184 (55.09%) identifying as moderately knowledgeable and 35 (10.48%) as extremely knowledgeable. Meanwhile, 86 educators (25.75%) considered themselves slightly knowledgeable, and 29 (8.68%) admitted to having no knowledge of AI at all. An independent t-test was used to compare levels of familiarity between U.S. participants (M = 2.99, SD = 0.69) and non-U.S. participants (M = 2.54, SD = 0.77). The test showed significant differences, with higher familiarity among U.S. participants (t = 4.99, df = 332, p < 0.001, Cohen’s d = 0.598).
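The reported test can be approximately reproduced from the published summary statistics alone. The sketch below recomputes the pooled-variance t statistic and Cohen’s d from the group means, standard deviations, and sizes given above; minor differences from the reported t = 4.99 and d = 0.598 are expected because the published summary values are rounded:

```python
import math

# Reported summary statistics for AI familiarity (Section 5.1)
m_us, sd_us, n_us = 2.99, 0.69, 99          # U.S. participants
m_intl, sd_intl, n_intl = 2.54, 0.77, 235   # non-U.S. participants

# Pooled standard deviation for an equal-variance (Student's) t-test
df = n_us + n_intl - 2
sp = math.sqrt(((n_us - 1) * sd_us**2 + (n_intl - 1) * sd_intl**2) / df)

# Test statistic and Cohen's d effect size
t = (m_us - m_intl) / (sp * math.sqrt(1 / n_us + 1 / n_intl))
d = (m_us - m_intl) / sp

# Approx t = 5.03, d = 0.602 (article reports t = 4.99, d = 0.598)
print(f"t({df}) = {t:.2f}, Cohen's d = {d:.3f}")
```

For reference, `scipy.stats.ttest_ind_from_stats` performs the same computation from summary statistics and additionally returns a p-value.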
The survey asked participating educators about their students’ AI usage frequencies for completing assignments. Among 329 responses, 136 educators (41.3%) pointed out that their students use AI often (26–75% of the time), and 132 (40.1%) said their students sometimes use AI (1–25% of the time). Only 27 educators (8.2%) reported that their students use AI almost always (76–100% of the time) for completing assignments. Meanwhile, 34 respondents (10.3%) believed that their students never use AI tools for task completion.

5.2. Use of AI at Home or in the Workforce

The survey asked participants whether they used specific AI platforms at home or at work, allowing for multiple selections to capture the variety of platform usage. Figure 1 shows the counts of participants using these tools in either home or educational settings.
For OpenAI platforms, such as the conversational AI chatbot ChatGPT, 137 educators (34%) reported using them in the workplace, while 189 (46.9%) reported using them at home. The use of OpenAI tools was similarly high in both home and educational settings. AI-powered simulations, inclusive software (e.g., text-to-speech or read-aloud tools), and research-assisting AI tools were also used with similar frequencies at home and in educational contexts.
Additionally, language learning software was highly adopted at home (n = 173, 45.53%) and in educational settings (n = 115, 30.26%). However, virtual voice assistants like Siri, Alexa, and Google Assistant were more frequently used at home (n = 235, 66.76%) compared to educational settings (n = 54, 15.34%). Three AI platforms were more commonly used in educational settings, though also frequently used at home: AI-enhanced LMSs such as Canvas or Moodle (n = 211, 55.53%), content generators like MagicSchool or Canva (n = 192, 49.61%), and AI-driven quiz and assessment tools like Kahoot and Quizizz (n = 213, 59.66%).
Virtual writing assistants like QuillBot and ProWritingAid, intelligent tutoring systems like Carnegie Learning, and personalized learning platforms such as Khan Academy recorded lower usage both at home and in educational settings compared to other AI-supported platforms.

5.3. Level of Agreement Regarding AI Tools

When educators were asked to indicate their level of agreement with various statements about the use of AI tools in their educational practice, they responded using a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree; 0 indicating “not applicable”). A total of 320 participants responded, and the overall average agreement was M = 3.25 (SD = 0.74), indicating a roughly neutral to slightly positive sentiment toward the statements. Regarding the ability of AI automation to allow more focus on teaching and interacting with students, 32 educators (10.06%) strongly agreed and 110 (34.59%) agreed, indicating that nearly half of the educators (44.65%) saw this as a benefit. In contrast, 43 educators (13.52%) disagreed, 13 (4.09%) strongly disagreed, and 100 (31.45%) were neutral.
When asked about the use of AI tools for handling repetitive administrative tasks, such as grading and attendance tracking, 38 educators (12.03%) strongly agreed and 135 (42.72%) agreed, making a total of 54.75% of educators. In contrast, 24 (7.59%) disagreed, 16 (5.06%) strongly disagreed, and 79 (25%) remained neutral. Additionally, a large percentage of educators (63.61%) agreed or strongly agreed that AI tools assist in content creation, reducing the time spent on preparation and planning. Of these, 48 educators (15.19%) strongly agreed, while 153 (48.42%) agreed. Conversely, 33 educators (10.44%) disagreed, and 11 (3.48%) strongly disagreed.
Regarding the ability of AI applications to provide personalized professional development opportunities, 39 educators (12.34%) strongly agreed and 143 (45.25%) agreed, with a total of 57.59% of educators acknowledging this benefit. Meanwhile, 27 (8.54%) educators disagreed, and 14 (4.43%) strongly disagreed. In terms of AI’s role in enhancing student engagement through gamification and virtual simulations, 49 (15.71%) educators strongly agreed, and 147 (47.12%) agreed, with 62.83% seeing this as a positive impact. However, 28 educators (8.97%) disagreed and 11 (3.53%) strongly disagreed, while 62 (19.87%) were neutral on this matter.

5.4. How AI Tools Are Used

When educators were asked how AI tools were used in their teaching, 200 responses were analyzed (see Table 2). There was notable reliance on AI for writing and editing tasks, with 23 educators (11.5%) using tools like Grammarly to enhance text quality. Content generation through platforms like ChatGPT was highlighted in 12 responses (6%) for a variety of uses, such as creating lectures, emails, and quiz questions. Image generation and design also played a role, cited in four responses (2%), and creating captions for lessons was noted in three responses (1.5%). The platform Canva was specifically mentioned by eight (4%) educators for preparing presentations. Additionally, AI’s application in virtual reality and simulations was reported in two answers (1%).
Educators also utilized AI for grading and assessment, with nine responses (4.5%) noting assistance in streamlining evaluation processes like rubric creation. AI was used to create quizzes, with platforms like Kahoot and Quizizz mentioned in 19 (9.5%) responses. One major theme was the use of AI in lesson preparation and delivery, credited in 32 (16%) of the answers for reducing the workload involved in creating and presenting lesson plans. AI also supported differentiation and inclusion strategies for 14 (7%) of the educators and was used to provide professional development in six responses (3%).
AI enhanced research activities in 11 responses (5.5%), showing its applicability in academic settings. Further, learning management systems (LMSs) such as Google Classroom, Moodle, Brightspace, and Canvas were noted by 12 (6%) of the educators for their support of AI in integrating tools and content. Math, coding, and programming tools like GeoGebra, Scratch, and Desmos were mentioned in four responses (2%), along with AI’s role in creating engaging and motivating content through interactive platforms, as showcased in 11 responses (5.5%). An additional three educators (1.5%) reported teaching students about ethical practices with AI, underscoring the importance of responsible usage.
Finally, AI usage in daily life or at home was reported by seven (3.5%) educators, indicating its application beyond professional settings. Despite its widespread use, 10 educators (5%) reported not utilizing AI in their professional practices at all.

5.5. Curriculum Adjustments

When educators were asked how they might consider adjusting their curriculum in response to students using AI tools for completing assignments, their responses to this open-ended question were categorized into six themes: curriculum redesign, writing assignments, critical thinking and application, ethical use and citation of AI, support or no adjustments necessary, and undecided or against adjustments (see Table 3).
The data indicated that 34.72% of educators redesigned their curricula to counteract potential AI misuse. For example, many educators reported “AI-proofing” assessments by focusing on creating more specific, classroom content-driven questions, thus minimizing the ability of AI tools to provide comprehensive answers. One educator noted, “Lessons need to be less about recall and more about using information in new ways, meaning applying knowledge”. Writing assignments were restructured by 8.33% of educators to foster originality and engagement, with some using version tracking platforms like OneDrive or Google Docs to monitor student progress. The move towards more personalized essay prompts and oral presentations was noted as a shift from traditional online exams. In addition to adjustments to curriculum and writing, critical thinking and application were emphasized by 7.64% of educators, highlighting the use of AI as a tool for starting rather than completing projects, with one educator remarking, “AI is just a starting point. We need to then critically evaluate the responses”.
Some educators (13.89%) also noted the ethical use and citation of AI, with several reporting the inclusion of syllabus statements that outline acceptable AI use and when it constitutes plagiarism. One educator shared, “I have written a syllabus statement outlining acceptable and unacceptable uses of AI in writing assignments,” demonstrating the need for clear guidelines on AI’s role in academic work. Furthermore, educators reported asking students to cite AI sources when used and to reflect on how these tools aided their work. A participant stated, “Citing when they’ve used AI… writing a response about how they critically thought about the information given by AI,” showing the focus on ethical usage and reflection.
However, not all educators felt the need to modify their practices. In fact, 17.36% of educators felt no adjustments were necessary, often citing the nature of their subjects, such as fine arts, where AI’s impact is minimal. Some of this group valued AI as a tool but maintained traditional teaching methods where AI does not disrupt the learning objectives. The remaining 18.06% were either undecided or against making any curriculum adjustments related to AI use, expressing concerns about AI’s potential to undermine fundamental learning processes or questioning its educational value altogether.

5.6. Training and Resources for the Implementation of AI

When educators were asked what training programs and resources they believed were necessary to effectively implement AI tools in the classroom, there were 137 responses, which highlighted a variety of needs (see Table 4). The themes included general and specific training, hands-on and practical workshops, exposure to specific AI tools, addressing AI use and bias, training modalities, independent learning, and uncertainty or lack of awareness.
General training/awareness was the most frequently mentioned category, with 42 (30.66%) educator responses highlighting a preference for broad, comprehensive training on AI. For example, one educator stated, “We need actual training in the use of AI. I only recently started using ChatGPT to help me with my lectures”.
Discipline- or content-specific training garnered 17 statements (12.41%), reflecting the need for training on specific content or in subject areas. Hands-on and/or practical experience was noted by six (4.38%) educators, underscoring the need for experiential learning through direct engagement with AI tools. One educator requested “Hands-on workshops in how to use prompts and how to set assignments”. The modality of training was noted by four (2.92%) educators who emphasized the importance of offering both online and in-person workshops to accommodate various learning preferences and needs.
Exposure to AI tools or specific platforms was reported by seven (5.11%) of the educators. Tools included ChatGPT, the LMS Canvas, Kahoot, and Eduboom. One of the educators also stated, “I need time to fully explore what is available so that I know how to make it a successful tool for student use”. Student learning and critical thinking issues were raised by six (4.38%) respondents, pointing to the need to ensure AI tools enhance rather than detract from educational outcomes. “New training is definitely necessary to ensure students are not utilizing the tools as an escape from learning,” noted an educator.
Addressing ethical AI use and bias was a concern for 19 (13.87%) educators, illustrating the need for training focused on the responsible use of AI in educational settings. Discussions around the ethical implications of AI, such as bias and privacy, were stressed as critical areas for educator training. For example, one educator wrote “training on academically honest use of AI and the importance of teaching students’ ethical ways to use these tools. If we do not teach them how to use them appropriately, they will use them incorrectly”.
Policies and guidelines were discussed by four (2.92%) educators who suggested the need for clear institutional policies to govern the use of AI tools effectively and ethically within educational institutions. One educator responded with “…we need concrete policies that outline expectations and resources for teaching students to use it as a tool…”.
Independent learning was documented by five (3.65%) educators, underscoring a trend where educators are advocating for self-directed learning pathways in AI education. One educator noted, “I haven’t really taken any trainings. It’s mostly been practice and self-learning”. Finally, 26 (18.98%) educators expressed uncertainty or a lack of awareness about what specific training would be beneficial; these responses consisted entirely of statements such as “I do not know” or “unsure”.

5.7. Policies on AI

Regarding educators’ policies on the use of AI tools in their classrooms, a diverse range of approaches was observed among 353 educators. A total of 74 (20.96%) educators reported that campus policies on AI are implemented and assessed as written, ensuring adherence to formal guidelines. However, 29 (8.22%) educators noted that while policies exist, they are not consistently followed. A significant portion (n = 127, 35.98%) of respondents indicated that no formal policy on AI usage exists on their campus. Additionally, 86 educators (24.36%) established their own individual policies for their classrooms or courses. The remaining 37 respondents (10.48%) categorized their responses as other.

5.8. Educators’ Perceptions of AI’s Impact on Mental Health and Mitigating Strategies

When educators were asked how they perceived AI to impact the mental health of students or educators, there were 124 unique responses to the open-ended question (see Table 5). From the data, six themes emerged. The most prevalent theme was social isolation or interaction with others, which comprised 24 (19.5%) of the responses. One educator wrote, “AI helps me to see another perspective and consider things I might not have considered, but it also can lead to less real-world social interaction and increased feelings of isolation”. Anxiety and stress were identified in 16 (13.8%) responses, highlighting both benefits and concerns. In fact, six of the educators noted that AI could provide stress-relieving benefits as suggested in the following sentiment: “If anything, I think it would help alleviate stress. It’s nice to just plug in the topic and get a starting point”. However, ten of the educators perceived AI to actually increase anxiety or stress in schools. For example, an educator pointed out “I think teachers can be really stressed out about it, and I think the concerns and lack of clarity around its ethical use is also very stressful for students”.
The theme creativity and critical thinking emerged in 26 (21.1%) of the educators’ responses, predominantly highlighting the negative impacts of AI on student learning and development. Statements ranged from decreased innovation in the classroom to a loss of problem-solving and creativity. One educator wrote, “Students’ use of GenAI can decrease their ability to develop content-based and critical thinking skills, which won’t actually help them in the workplace, and could lead to them failing, which wouldn’t be great for their mental health”.
The theme of workload or the impact of AI on time resonated in nine (7.26%) of the responses. For some educators, AI helped manage their workload; they noted that it could significantly enhance productivity by managing time and resources more effectively. For example, an educator stated, “AI gives me more time, so I am better mentally”. Fifteen (12.2%) of the educators conveyed that AI had the potential to lead to student dependency or addiction, as captured in the following excerpt: “Students increasingly rely on AI, which may lead to a dependency that diminishes their own learning efforts, while also finding it helpful in managing tasks”. Furthermore, some educators (n = 5, 4.1%) even noted that students might have a lack of competency or motivation, as highlighted in the following statement: “[AI] can give the student a false sense of their personal abilities, making them ‘feel’ as if they know more than they actually do”. However, 27 (22%) of the responses indicated no observed impact or an overall unfamiliarity with AI’s effects on mental health, as stated in the following: “I haven’t noticed any effect either positively or negatively”. Finally, there was a single response (0.8%), which stated that AI was “not useful”.

5.9. Additional Suggestions

The final question of the survey invited educators to share additional thoughts and suggestions on the use of AI in education (see Table 6). There were 110 responses, with the largest group expressing no specific suggestions or uncertainty about AI applications in their educational contexts. Specifically, 53 (34.7%) educators responded with statements such as “No, I don’t” and “None at this time”. Twenty (13.89%) responses focused on the need for training and increased awareness about AI. These respondents advocated for educational programs that prepare both teachers and students to use AI responsibly and effectively, highlighting the importance of understanding artificial intelligence technologies. For example, one educator suggested, “Teachers should be trained so that AI can be used as a school tool for teaching”.
Concerns and caution about the implications of AI in education were raised by 19 (12.41%) of the educators. They voiced apprehension about AI replacing traditional methods, potential ethical issues, and broader social implications, including social inequalities and environmental impacts. A respondent pointed out, “It’s frustrating that AI/GenAI is heralded as the Fixer of All Things”. A smaller segment of four (2.76%) educators emphasized the need for a balance between AI tools and human effort. They suggested that AI should enhance rather than replace the educational process, supporting a hybrid approach where AI assists teachers. “AI should not replace human efforts, but rather, enhance them,” shared one educator.
The potential benefits of AI were recognized in 16 (11.03%) of the responses, which pointed out how AI could facilitate personalized learning, assist with administrative tasks, and foster innovation and creativity in educational settings. For instance, one educator wrote, “AI could be used to assist in grading assignments like a teaching assistant”. Resources and expertise in AI were mentioned by one respondent, highlighting the importance of consulting specialists to effectively integrate AI into educational practices: “Check out my buddy Matt Miller. He’s the AI guru”. It is interesting to note that two of the excerpts regarding the benefits of AI were attributed to “GPT4” rather than the actual participant, indicating that the content was AI-generated.
Finally, ethical considerations in the use of AI were discussed by four (3.64%) of the educators. They called for the development of ethical guidelines and best practices to ensure that AI is integrated into educational settings in ways that are fair, transparent, and beneficial to all students. One educator advised, “It’s crucial to address ethical issues such as bias, privacy, and transparency”.

6. Discussion

The findings of this study highlight the increasing integration of AI in educational settings, emphasizing the complex relationship between technology and social contexts. This relationship is framed by the TAM, which posits that two key factors, namely the PU and PEU, drive the acceptance and adoption of new technologies. Additionally, the STS theory offers a valuable perspective for understanding how AI adoption affects educators’ workloads and mental well-being while also addressing the ethical considerations surrounding its use. It is important to note that the TAM focuses on how users’ perceptions of technology, specifically its usefulness (PU) and ease of use (PEU), influence their adoption. In contrast, STS highlights the dynamic interplay between social (e.g., educators’ perceptions, institutional policies) and technical (e.g., AI tools) systems in shaping how technologies are integrated and used.
In this study, educators’ high level of familiarity with AI (92%) significantly influenced their perceived ease of use, both in the classroom and in personal settings. They shared various ways they integrated AI into their work, ranging from grading papers to planning and creating lessons. More than half (54.75%) of the participants reported that AI was particularly useful for managing repetitive administrative tasks, aligning with the TAM, which suggests that perceived usefulness drives technology adoption. As educators receive more professional development, their readiness to adopt and adapt AI tools is likely to increase, further enhancing both the ease of use and perceived usefulness of these technologies.
With 92% of educators reporting familiarity with AI technologies, it is evident that these tools have become a significant component of contemporary teaching and learning environments. Educators reported utilizing AI for various purposes, including administrative tasks, lesson planning, and enhancing student engagement. This shift toward more efficient teaching practices aligns with the STS focus on optimizing both social and technological elements within educational systems. However, while AI can streamline tasks, its integration in classrooms introduces challenges, particularly in fostering critical thinking among students.
An interesting finding from this study was the prevalence of students using AI tools like ChatGPT to complete assignments. In our survey of 329 educators, 41.3% reported students use AI often (26–75% of the time), 40.1% said sometimes (1–25%), 8.2% said almost always (76–100%), and 10.3% believed students never use AI for assignments. This trend prompted educators to rethink their instructional strategies, often incorporating reflective writing and critical thinking activities to counter the potential over-reliance on AI. However, this trend of AI-assisted assignments also raises critical questions about academic integrity, the authenticity of student work, and the potential for AI to further widen the digital divide, especially among students who may lack access to such technologies outside of school.
The impact of AI on mental health emerged as a critical concern. While some educators specified that AI may reduce stress by automating mundane tasks, others mentioned risks such as increased anxiety and social isolation among students. This duality as both support and potential source of distress requires careful examination of AI’s role in diverse educational contexts. AI-driven tools, such as real-time interventions and conversational agents, hold significant potential for enhancing mental health support by providing scalable, accessible, and timely resources for students. For example, in a recent article titled “AI Chatbot Friendships: Potential Harms and Benefits for Students”, Ofgang (2024) explored the growing trend of students forming “friendships” with AI chatbots, examining both the potential benefits and harms. Ofgang argued that AI chatbots can provide companionship and emotional support, but raised concerns about the negative effects, particularly for young people who may increasingly rely on AI for emotional comfort rather than seeking real human connections. While research on AI’s impact on mental health remains limited, Ofgang pointed out that mental health experts see the potential for AI to play a positive role, particularly in developing programs designed to educate students about mental health and emotional awareness. Additionally, the early identification of mental health challenges is especially critical in educational settings, where AI-driven tools could empower educators and counselors to intervene proactively and support students in managing stress and developing resilience within supportive environments. However, further research is needed to ensure these tools are both effective and ethically tailored for education, prioritizing student well-being and fostering meaningful connections within the school community.
Another significant finding was the gap in institutional policies regarding AI usage. While some educators developed their own guidelines for issues like data privacy and plagiarism, this inconsistency highlights the need for concrete institutional policies. Such policies should ideally address ethical considerations, promote responsible use, and provide clear frameworks for evaluating AI’s role in education. Without a comprehensive and clear institutional AI strategy, the risk of misuse or misunderstanding of AI tools remains high.
This study also illustrated the need for targeted professional development focused on AI literacy. Educators reported a desire for professional learning that not only covers the basics of AI technology but also addresses ethical implications, disciplinary applications, and strategies for promoting meaningful engagement with AI tools. This proactive approach reflects a broader recognition that as AI continues to evolve, so too must educators’ capabilities to leverage these technologies effectively and responsibly. Professional development should not only address the technical skills needed to use AI tools but also include discussions on how to critically assess and incorporate these tools into pedagogical practices. For example, educators should be trained to recognize when AI might undermine student engagement and how to balance its use with strategies that promote active learning and independent problem-solving.
It is important to acknowledge a few limitations of this study. While the diverse sample of educators offers a broad perspective on AI integration in education, the reliance on self-reported perceptions may introduce social desirability bias, particularly given the growing public interest in AI in education. This could limit the generalizability of the findings. Additionally, the sample may not fully represent educators from all demographic backgrounds, especially those in under-resourced schools, and the use of snowball sampling may introduce biases, as it relies on referral chains that may not be representative of the broader educator population. Future research should explore the effects of AI in a variety of educational contexts, including rural schools and those in developing countries, to gain a more comprehensive understanding of AI’s global impact. Longitudinal studies examining the sustained, long-term effects of AI on both educational outcomes and mental health should be a priority. Moreover, there is a need for research focused on developing comprehensive policies and best practices for professional development to ensure that AI is used effectively in educational settings. Exploring how AI tools influence teaching practices can also help identify which platforms are most beneficial or potentially disruptive. Finally, while this study captures educators’ perceptions of AI usage, gathering data from students would provide a more holistic understanding of AI’s role in the classroom.
In summary, as AI continues to reshape not just education but also industries like health care and business, preparing students for a world where these technologies are ubiquitous requires more than just technical fluency. It demands the cultivation of critical thinking, ethical awareness, and emotional intelligence. By focusing on these broader competencies, educators can ensure that AI serves as a tool for positive change, rather than one that exacerbates existing inequalities.

Author Contributions

Conceptualization, J.A.D.; data collection, all authors; methodology, J.A.D. and W.S.; analysis, J.A.D. and W.S.; writing—original draft preparation, all authors; writing—review and editing, all authors; supervision, J.A.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was approved by the Institutional Review Board at The University of Texas at Tyler (protocol code 2024-106; date of approval: 26 July 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ahmad, S. F., Rahmat, M. K., Mubarik, M. S., Alam, M. M., & Hyder, S. I. (2021). Artificial intelligence and its role in education. Sustainability, 13(22), 12902. [Google Scholar] [CrossRef]
  2. Alam, A. (2021, December 3–4). Should robots replace teachers? Mobilisation of AI and learning analytics in education [Conference session]. 2021 International Conference on Advances in Computing, Communication, and Control (ICAC3) (pp. 1–12), Mumbai, India. [Google Scholar] [CrossRef]
  3. Alexander, B., Ashford-Rowe, K., Barajas-Murphy, N., Dobbin, G., Knott, J., McCormack, M., Pomerantz, J., Seilhamer, R., & Weber, N. (2019). Educause horizon report: 2019 higher education edition. EDUCAUSE. Available online: https://library.educause.edu/resources/2019/4/2019-horizon-report (accessed on 12 September 2024).
  4. Brazeau, B. (2024). Creating inclusive learning environments: Supporting the diverse needs of all learners. American Consortium for Equity in Education. Available online: https://www.ace-ed.org/creating-inclusive-learning-environments-supporting-the-diverse-needs-of-all-learners/ (accessed on 12 September 2024).
  5. BW Online Bureau. (2019, November 6). Personalized adaptive learning tools to improve learning outcomes. BW Education. Available online: https://bweducation.businessworld.in/article/Personalized-Adaptive-Learning-Tools-To-Improve-Learning-Outcomes/06-11-2019-178606/ (accessed on 4 January 2025).
  6. Chang, P. C., Zhang, W., Cai, Q., & Guo, H. (2024). Does AI-driven technostress promote or hinder employees’ artificial intelligence adoption intention? A moderated mediation model of affective reactions and technical self-efficacy. Psychology Research and Behavior Management, 17, 413–427. [Google Scholar] [CrossRef] [PubMed]
  7. Chen, J., Yuan, D., Dong, R., Cai, J., Ai, Z., Zhou, S., & Dong, R. (2024). Artificial intelligence significantly facilitates development in the mental health of college students: A bibliometric analysis. Frontiers in Psychology, 15. [Google Scholar] [CrossRef] [PubMed]
  8. Chiu, T. K. F., Xia, Q., Zhou, X., Chai, C. S., & Cheng, M. (2023). Systematic literature review on opportunities, challenges, and future research recommendations of artificial intelligence in education. Computers and Education: Artificial Intelligence, 4, 100118. [Google Scholar] [CrossRef]
  9. Cuofano, G. (2024). What is sociotechnical systems theory? FourWeekMBA. Available online: https://fourweekmba.com/sociotechnical-systems-theory/ (accessed on 15 October 2024).
  10. D’Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117. [Google Scholar] [CrossRef] [PubMed]
  11. Davis, F. D. (1986). A technology acceptance model for empirically testing new end-user information systems: Theory and results [Doctoral dissertation, MIT Sloan School of Management]. [Google Scholar]
  12. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 319–340. [Google Scholar] [CrossRef]
  13. Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003. [Google Scholar] [CrossRef]
  14. Delello, J. A., Mokhtari, K., & Reichard, C. (2016). Multitasking among college students: Are freshmen more distracted? International Journal of Cyber Behavior, Psychology, and Learning, 6(4), 1–12. [Google Scholar] [CrossRef]
  15. Delello, J., Sung, W., Mokhtari, K., & De Giuseppe, T. (2023). Exploring college students’ awareness of AI and ChatGPT: Unveiling perceived benefits and risks. Journal of Inclusive Methodology and Technology in Learning and Teaching, 3(4), 1–25. Available online: https://inclusiveteaching.it/index.php/inclusiveteaching/article/view/132/132 (accessed on 12 September 2024).
  16. Delello, J., Sung, W., Mokhtari, K., & De Giuseppe, T. (2024a). Are K-16 educators prepared to address the educational and ethical ramifications of artificial intelligence software? In K. Arai (Ed.), Advances in information and communication: FICC 2024 (pp. 406–432). Lecture Notes in Networks and Systems; Springer. [Google Scholar] [CrossRef]
  17. Delello, J. A., Watters, J. B., & Garcia-Lopez, A. (2024b). Artificial intelligence in education: Transforming learning and teaching. In J. A. Delello, & R. R. McWhorter (Eds.), Disruptive technologies in education and workforce development (pp. 1–26). IGI Global. [Google Scholar] [CrossRef]
  18. Dillon, S. (2020). The Eliza effect and its dangers: From demystification to gender critique. Journal for Cultural Research, 24(1), 1–15. [Google Scholar] [CrossRef]
  19. Emery, F. E., & Trist, E. L. (1960). Socio-technical systems. In C. W. Churchman, & M. Verhulst (Eds.), Management science models and techniques (Vol. 2, pp. 83–97). Pergamon. [Google Scholar]
  20. Espejo, G., Reiner, W., & Wenzinger, M. (2023). Exploring the role of artificial intelligence in mental healthcare: Progress, pitfalls, and promises. Cureus, 15(9), e44748. [Google Scholar] [CrossRef]
  21. Ettman, C. K., & Galea, S. (2023). The potential influence of AI on population mental health. JMIR Mental Health, 10, e49936. [Google Scholar] [CrossRef]
  22. Farahani, M. S., & Ghasmi, G. (2024). Artificial intelligence in education: A comprehensive study. Forum for Education Studies, 2(3), 1379. [Google Scholar] [CrossRef]
  23. Gocen, A., & Aydemir, F. (2020). Artificial intelligence in education and schools. Research on Education and Media, 12(1), 13–21. [Google Scholar] [CrossRef]
  24. Graham, S., Depp, C., Lee, E. E., Nebeker, C., Tu, X., Kim, H. -C., & Jeste, D. V. (2019). Artificial intelligence for mental health and mental illnesses: An overview. Current Psychiatry Reports, 21(11), 116. [Google Scholar] [CrossRef]
  25. Guilherme, A. (2019). AI and education: The importance of teacher and student relations. AI & Society, 34, 47–54. [Google Scholar] [CrossRef]
  26. He, Y., Chen, Q., Kitkuakul, S., & Wright, L. T. (2018). Regulatory focus and technology acceptance: Perceived ease of use and usefulness as efficacy. Cogent Business & Management, 5(1), 1459006. [Google Scholar] [CrossRef]
  27. Heron, K. E., & Smyth, J. M. (2010). Ecological momentary interventions: Incorporating mobile technology into psychosocial and health behaviour treatments. British Journal of Health Psychology, 15(1), 1–39. [Google Scholar] [CrossRef]
  28. Huesca, G., Martínez-Treviño, Y., Molina-Espinosa, J. M., Sanromán-Calleros, A. R., Martínez-Román, R., Cendejas-Castro, E. A., & Bustos, R. (2024). Effectiveness of using ChatGPT as a tool to strengthen benefits of the flipped learning strategy. Education Sciences, 14(6), 660. [Google Scholar] [CrossRef]
  29. Khan, S., Mazhar, T., Shahzad, T., Khan, M. A., Rehman, A. U., Saeed, M. M., & Hamam, H. (2025). Harnessing AI for sustainable higher education: Ethical considerations, operational efficiency, and future directions. Discover Sustainability, 6(1), 23. [Google Scholar] [CrossRef]
  30. Kamalov, F., Santandreu Calonge, D., & Gurrib, I. (2023). New era of artificial intelligence in education: Towards a sustainable multifaceted revolution. Sustainability, 15(16), 12451. [Google Scholar] [CrossRef]
  31. Kannan, P., & Zapata-Rivera, D. (2022). Facilitating the use of data from multiple sources for formative learning in the context of digital assessments: Informing the design and development of learning analytic dashboards. Frontiers in Education, 7, 913594. [Google Scholar] [CrossRef]
  32. Karan, B., & Angadi, G. R. (2023). Potential risks of artificial intelligence integration into school education: A systematic review. Bulletin of Science, Technology & Society, 43(3–4), 67–85. [Google Scholar] [CrossRef]
  33. Li, H., Zhang, R., Lee, Y. C., Kraut, R. E., & Mohr, D. C. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. NPJ Digital Medicine, 6(1), 236. [Google Scholar] [CrossRef] [PubMed]
  34. Li, W., Zhang, X., Li, J., Yang, X., Li, D., & Liu, Y. (2024). An explanatory study of factors influencing engagement in AI education at the K-12 level: An extension of the classic TAM model. Scientific Reports, 14, 13922. [Google Scholar] [CrossRef] [PubMed]
  35. Liang, W., Yuksekgonul, M., Mao, Y., Wu, E., & Zou, J. (2023). GPT detectors are biased against non-native English writers. Patterns, 4(7), 100779. [Google Scholar] [CrossRef]
  36. Masters, K. (2019). Artificial intelligence in medical education. Medical Teacher, 41(9), 976–980. [Google Scholar] [CrossRef] [PubMed]
  37. Megahed, F. M., Chen, Y. -J., Ferris, J. A., Knoth, S., & Jones-Farmer, L. A. (2023). How generative AI models such as ChatGPT can be (mis)used in SPC practice, education, and research? An exploratory study. Quality Engineering, 36(2), 287–315. [Google Scholar] [CrossRef]
  38. Milberg, T. (2024). The future of learning: How AI is revolutionizing education 4.0. World Economic Forum. Available online: https://www.weforum.org/agenda/2024/04/future-learning-ai-revolutionizing-education-4-0/ (accessed on 12 September 2024).
  39. Miller, F. A., Katz, J. H., & Gans, R. (2018). The OD imperative to add inclusion to the algorithms of artificial intelligence. OD Practitioner, 5(1), 6–12. [Google Scholar]
  40. Mittal, A. (2024). The plagiarism problem: How generative AI models reproduce copyrighted content. Unite.AI. Available online: https://www.unite.ai/the-plagiarism-problem-how-generative-ai-models-reproduce-copyrighted-content/ (accessed on 16 January 2025).
  41. Molli, V. L. P. (2024). Enhancing healthcare equity through ai-powered decision support systems: Addressing disparities in access and treatment outcomes. International Journal of Sustainable Development Through AI, ML and IoT, 3(1), 1–12. Available online: https://ijsdai.com/index.php/IJSDAI/article/view/49 (accessed on 15 October 2024).
  42. Monahan, K., & Burlacu, G. (2024). From burnout to balance: AI-enhanced work models. Upwork Research Institute. Available online: https://www.upwork.com/research/ai-enhanced-work-models (accessed on 3 December 2024).
  43. Moor, J. (2006). The Dartmouth College artificial intelligence conference: The next fifty years. AI Magazine, 27(4), 87–91. [Google Scholar]
  44. Moorhouse, B. L., Yeo, M. A., & Wan, Y. (2023). Generative AI tools and assessment: Guidelines of the world’s top-ranking universities. Computers and Education Open, 5, 100151. [Google Scholar] [CrossRef]
  45. Mullaney, T. (2024). Pedagogy and the AI guest speaker or what teachers should know about the eliza effect. Available online: https://tommullaney.com/2024/02/20/pedagogy-the-eliza-effect/?fbclid=IwAR0vDdDrcZ7HqaAm2ahf56hsE3To2VjWtLWvYAzt9Z44SGULcEnmqWWXdwY (accessed on 12 September 2024).
  46. Nemesure, M. D., Heinz, M. V., Huang, R., & Jacobson, N. C. (2021). Predictive modeling of depression and anxiety using electronic health records and a novel machine learning approach with artificial intelligence. Scientific Reports, 11. [Google Scholar] [CrossRef] [PubMed]
  47. Ofgang, E. (2024). AI chatbot friendships: Potential harms and benefits for students. EdTech Magazine. Available online: https://www.techlearning.com/news/ai-chatbot-friendships-potential-harms-and-benefits-for-students (accessed on 1 December 2024).
  48. Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing mental health with artificial intelligence: Current trends and future prospects. Journal of Medicine, Surgery, and Public Health, 3, 100099. [Google Scholar] [CrossRef]
  49. Owoc, M. L., Sawicka, A., & Weichbroth, P. (2021). Artificial intelligence technologies in education: Benefits, challenges and strategies of implementation. In IFIP international workshop on artificial intelligence for knowledge management (pp. 37–58). Springer. [Google Scholar] [CrossRef]
  50. Paek, S., & Kim, N. (2021). Analysis of worldwide research trends on the impact of artificial intelligence in education. Sustainability, 13(14), 7941. [Google Scholar] [CrossRef]
  51. Pasmore, W., Winby, S., Mohrman, S. A., & Vanasse, R. (2018). Reflections: Sociotechnical systems design and organization change. Journal of Change Management, 19(2), 67–85. [Google Scholar] [CrossRef]
  52. Pedro, F., Subosa, M., Rivas, A., & Valverde, P. (2019). Artificial intelligence in education: Challenges and opportunities for sustainable development. UNESCO. [Google Scholar]
  53. Pham, H., Kohli, T., Olick Llano, E., Nokuri, I., & Weinstock, A. (2024). How will AI impact racial disparities in education? Stanford Center for Racial Justice. Available online: https://law.stanford.edu/2024/06/29/how-will-ai-impact-racial-disparities-in-education/ (accessed on 12 September 2024).
  54. Poth, R. D. (2023). 7 AI tools that help teachers work more efficiently. Edutopia. Available online: https://www.edutopia.org/article/7-ai-tools-that-help-teachers-work-more-efficiently (accessed on 15 October 2024).
  55. Saavedra, J., Burbano, V., Anderson, M., & Larroucau, T. (2024). Effective and scalable math support: Evidence on the impact of an AI-tutor on math achievement in Ghana. arXiv, arXiv:2402.09809v2. [Google Scholar] [CrossRef]
  56. Santos, W. R. d., de Oliveira, R. L., & Paraboni, I. (2024). SetembroBR: A social media corpus for depression and anxiety disorder prediction. Language Resources & Evaluation, 58(1), 273–300. [Google Scholar] [CrossRef]
  57. Sharma, S., Rawal, R., & Shah, D. (2023). Addressing the challenges of AI-based telemedicine: Best practices and lessons learned. Journal of Education and Health Promotion, 12(1), 338. [Google Scholar] [CrossRef]
  58. Shen, Y., Meng, F., Xu, H., Li, X., Zhang, Y., Huang, C., Luo, X., & Zhang, X. Y. (2020). Internet addiction among college students in a Chinese population: Prevalence, correlates, and its relationship with suicide attempts. Depression and Anxiety, 37(8), 812–821. [Google Scholar] [CrossRef]
  59. Small, G. W., Lee, J., Kaufman, A., Jalil, J., Siddarth, P., Gaddipati, H., Moody, T. D., & Bookheimer, S. Y. (2020). Brain health consequences of digital technology use. Dialogues in Clinical Neuroscience, 22(2), 179–187. [Google Scholar] [CrossRef] [PubMed]
  60. TeachFlow. (2023). The impact of AI on teacher well-being and burnout. Available online: https://teachflow.ai/the-impact-of-ai-on-teacher-well-being-and-burnout/ (accessed on 12 September 2024).
  61. Tulsiani, R. (2024). The art of ChatGPT-driven gamification. eLearning Industry. Available online: https://elearningindustry.com/the-art-of-chatgpt-driven-gamification (accessed on 1 October 2024).
  62. Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–450. [Google Scholar] [CrossRef]
  63. Weiner, S., Lake, R., & Rosner, J. (2024). AI is evolving, but teacher prep is lagging: A first look at teacher preparation program responses to AI. Center on Reinventing Public Education. Available online: https://crpe.org/wp-content/uploads/Teacher-Prep-AI-2024.pdf (accessed on 12 September 2024).
  64. World Economic Forum. (2024). Shaping the future of learning: The role of AI in Education 4.0. Available online: https://www.weforum.org/publications/shaping-the-future-of-learning-the-role-of-ai-in-education-4-0/ (accessed on 4 January 2025).
  65. Wu, Y., Yi, A., Ma, C., & Chen, L. (2023). Artificial intelligence for video game visualization: Advancements, benefits, and challenges. Mathematical Biosciences and Engineering, 20, 15345–15373. [Google Scholar] [CrossRef] [PubMed]
  66. Yang, Y., & Xia, N. (2023). Enhancing students’ metacognition via AI-driven educational support systems. International Journal of Emerging Technologies in Learning, 18, 133–148. [Google Scholar] [CrossRef]
  67. Yin, W. J. (2024). Will our educational system keep pace with AI? A student’s perspective on AI and learning. EDUCAUSE Review. Available online: https://www.educause.edu (accessed on 12 September 2024).
  68. Yuan, S., He, T., Huang, H., Hou, R., & Wang, M. (2020). Automated Chinese essay scoring based on deep learning. CMC-Computers Materials & Continua, 65(1), 817–833. [Google Scholar] [CrossRef]
  69. Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education: Where are the educators? International Journal of Educational Technology in Higher Education, 16(1), 39. [Google Scholar] [CrossRef]
  70. Zhang, C., Schießl, J., Plößl, L., Hofmann, F., & Gläser-Zikuda, M. (2023). Acceptance of artificial intelligence among pre-service teachers: A multigroup analysis. International Journal of Educational Technology in Higher Education, 20, 49. [Google Scholar] [CrossRef]
Figure 1. The use of AI tools at home or in educational settings.
Table 1. The demographic information of participants.
Variables    N (%)
Gender
     Male    70 (21.0%)
     Female    260 (77.8%)
     Other    2 (1.2%)
Ethnicity
     Caucasian    259 (77.5%)
     Hispanic or Latino    39 (11.7%)
     Black or African American    8 (2.4%)
     Asian, Asian Indian, or Pacific Islander    10 (3.0%)
     Other    18 (5.4%)
Region
     United States    99 (29.6%)
     International other than United States    235 (70.4%)
Level of Education
     Less than high school    6 (1.8%)
     High school graduate    14 (4.2%)
     Some college    13 (3.9%)
     2-year degree    1 (0.3%)
     4-year degree    44 (13.2%)
     Professional degree    162 (48.5%)
     Doctorate    94 (28.1%)
Teaching Experience
     0 to 5 years    104 (31.1%)
     6 to 10 years    76 (22.8%)
     11 to 15 years    53 (15.9%)
     16 to 20 years    32 (9.6%)
     21 to 25 years    23 (6.9%)
     26 to 30 years    19 (5.7%)
     Over 30 years    27 (8.1%)
Professional Roles
     Assistant professor    45 (13.5%)
     Associate professor    26 (7.8%)
     Full professor    73 (21.9%)
     University administrator    3 (0.9%)
     PreK–5th grade teacher    15 (4.5%)
     6th–8th grade teacher    17 (5.1%)
     9th–12th grade teacher    50 (15.0%)
     Instructional coach    2 (0.6%)
     Curriculum director    2 (0.6%)
     Principal or Assistant principal    2 (0.6%)
     Registrar    1 (0.3%)
     Other (including not employed in education)    98 (29.3%)
N = 334
Table 2. How AI is used in teaching.
Tool, Platform, Activity    Representative Excerpts
Writing/Grammar/Editing
  • I primarily use AI tools to correct spelling, grammar, and improve the quality of written documents.
  • I use AI tools, and I encourage students to do the same, when writing. Grammarly is an amazing tool for spotting errors and increasing the “readability” of work.
  • First, I want to mention that I teach nursing. I use Grammarly to succulently write content in my teaching materials.
Image Generation/Design
  • Post generation, image generation.
  • Generate designs for PowerPoints.
  • To generate pictures to better communicate content.
Captions
  • I edit auto-generated captions for my videos.
  • Students utilize read-aloud.
ChatGPT
  • I use ChatGPT to help write lectures and test questions for medical students.
  • Chat GPT: Create exam questions, survey questions, to help me reword grading feedback to students, to create case studies.
  • Students are allowed to use ChatGPT to brainstorm ideas for paper topics.
  • ChatGPT for email assist.
Virtual Reality (VR) and Simulations
  • Virtual reality.
  • Use virtual assessment tools for simulations.
Grading and Assessment/Evaluation
  • Assistance with generating grading rubrics.
  • Additionally, I’ve used AI to assist with grading. I upload the rubric I developed for the assignment and ask AI to grade the submissions. Initially, I graded the assignments myself before using AI, then compared the results. I found that AI consistently awarded points within 2 to 3 points of my ratings.
Quizzes/Quiz Platforms (e.g., Kahoot, Quizizz)
  • I have used Kahoot and Quizlet in the classroom as a teacher for test reviews and vocabulary study tools.
  • I also use Poll Everywhere to allow students the opportunity to be a part of classroom discussions and questions in a way that is non-threatening.
  • I use AI to see how AI will answer certain questions before assigning them to students. This helps me write questions AI can’t answer, but my students can.
Differentiation/Inclusion
  • I find the use of inclusive software to promote learning and reading or writing activities with disabled children very useful as I am a special needs teacher.
  • I use artificial intelligence to create slides for teaching, create summaries to facilitate topics for students with special educational needs.
  • We utilize personalized learning platforms in the form of iReady which tutors students in math and reading. If a student is strong in a concept, it calibrates the lessons to give the student more challenging work to meet them at their individual levels. Each student’s coursework is personalized in this way to promote closing the gaps. Many of our students with accommodations also use assistive technology such as text-to-speech and spelling assistance.
Brainstorming/Revision
  • Brainstorming, critiquing writing, discussion of ethics and how to use for home/school programs. Evaluating the responses.
  • Brainstorming potential question students may have, brainstorming research topics, summarizing chapters for students.
  • In my English classes, I show students how to use generative AI for brainstorming and revision.
Case Studies
  • Create case studies.
  • To generate case studies.
  • I have thought about using AI to help me create case studies but have not implemented this method yet.
Professional Development
  • I teach teachers how to utilize these tools in writing IEPs, in designing lessons, providing specially designed instruction, and to assist in appropriate communication with parents and peers.
  • I’m a teacher educator, so I show them how to use AI tools/platforms for different aspects of their future jobs (e.g., creating lesson plans, generating potential letters to families, differentiating in the classroom).
  • I use AI to support instructional design.
Research
  • Currently use it to help edit their writing, to do research, to generate new ideas. One assignment involves using AI to generate an essay response to a prompt, and students then substantiate each empirical claim with reference to peer-reviewed research.
  • My upper-level students are taught how to use Quillbot and Citation Machine, or EasyBib in essay or research writing.
  • All students are encouraged to do research for projects and other class activities. It’s usually a little obvious when a student uses AI to cheat when writing answers to a prompt, but it does occur.
Misinformation/Ethics
  • I have always given direct instruction on how to use AI and other tools to help with study, as well as on honest and ethical use of such tools to check and refine your work without crossing over into plagiarism.
  • So far, I have only used them to demonstrate to the medical students the answers they might get if they ask AI medical questions. I show them that some answers are correct, and some are incorrect. I do this to emphasize the importance of double checking the results they get. That AI can be a great tool in medicine, but they need to know what is right and what isn’t right or at least is suspect when they get answers back from AI.
Learning Management Systems (LMSs)
  • Google Classrooms-Teams.
  • I use online platforms for teaching (Brightspace).
  • Just Canvas right now.
  • I use Blackboard ULTRA for teaching, and I use other AI-enhanced platforms such as Mentimeter, pallet, adobe express, and Canvas for designing lesson plans, slides, and graphics.
Math/Coding/Programming Tools
  • Coding.
  • Use of scratches for coding.
  • Platforms like DESMOS and GeoGebra.
  • I’ve given my students some exposure to GitHub Copilot—a tool that has use in computer programming. It’s an open question whether or not this tool is suitable. I’m feeling it out.
Canva
  • I use Canva or other similar apps to produce teaching materials.
  • Use of Canva with students or to prepare presentations.
  • Canva helps me create visually appealing presentations, flyers, and infographics.
Engaging/Motivating Content
  • To catch students’ attention.
  • I use these tools to make learning more stimulating and meaningful, for assessment and for cooperative learning activities.
  • lessons with interactive presentations.
  • I use AI tools to make the lesson more engaging.
Lesson Creation/Delivery
  • To produce didactic material (slides, maps).
  • I also leverage AI to generate additional ideas, suggest alternative resources, and propose different ways to teach a concept.
  • I can use AI to help write lesson plans, warmups and exit tickets, activities, writing prompts, discussion topics, research, creating presentations, and creating unique images to demonstrate topics/content. Just yesterday Google Gemini helped me research the most effective teaching strategies and methods to use when covering classroom “rules” on the first days of school.
  • I themed my Rhetoric 101 courses on this semester.
Daily/Home Usage/Other
  • Rarely. Mostly at home with background apps like voice to text.
  • To this point, most of my use of AI has been personal. I have been exploring platforms this summer, and plan to use AI tools more often in the coming school year.
  • Some other tools as Alexa at home.
  • Many more ways I use AI. I use it daily and multiple times a day.
Do Not Use/Other
  • I don’t use AI in my teaching.
  • I rarely use electronic platforms in my teaching.
  • I do not currently use any AI tools or platforms in teaching.
Table 3. Curriculum adjustment.
Theme    Representative Excerpts
Curriculum Redesign/Modification
  • Lessons need to be less about recall and more about using information in new ways, meaning applying knowledge.
  • Rethinking my assessments, so that it cannot be a simple AI search. It requires application and personal connections.
  • I have created formats/rubrics that must be utilized for submissions and tightened parameters to ensure critical thinking questions are asked during in person encounters which will allow students to study before labs.
Writing Assignments
  • I have started requiring them to write in platforms where I can see version history, such as OneDrive or Google Docs.
  • I’ve already tried to do this by creating more authentic assessments that incorporate self-reflection and writing my own quiz-style assessment questions that are not multiple choice (e.g., a lot of multi-select and matching questions).
  • Our program personalizes essay questions to take into account personal perspectives and increase the difficulty of using AI to generate responses.
Critical Thinking and Application
  • I have included AI as a tool for brainstorming, or finding information about topical areas, then evaluating it. AI is just a starting point. We need to then critically evaluate the responses.
  • Critical thinking opportunities—evaluate merit of AI responses against current evidence.
  • We allow use of AI tools for assignments to assist with wording, grammar, etc. Typically, if students attempt to use AI for more advanced use (critical thinking), the answers do not address the assignment accurately.
Ethical Use and Citation of AI
  • I have included an AI usage statement on my syllabi beginning Fall 2024. Additionally, prior to any writing heavy or essay assignments, I will explicitly instruct my students on how they may or may not use AI to aid their answers.
  • I try to design prompts where students need to include their personal voice and opinion. I include lessons on how to use generative AI in useful and ethical ways.
  • You have to be proactive and learn as much as you can about the platforms so you can recognize and counter behaviors that are not conducive to intellectual or academic learning.
Support or No Adjustments Necessary
  • As long as AI is being used as a tool, no adjustments are needed. Learning to work with technology will only benefit students. When it comes time for them to graduate from high school and get out on their own AI will still be accessible.
  • AI can be used in several ways for teaching and learning. Not for completing assignments.
  • Get a Better understanding of how AI works.
Against or Undecided
  • I don’t believe students use AI to complete my courses, as I teach Painting and Drawing, and I do not accept digital work for my assignments.
  • This is an unsolved question, but up to now, I believe, the Motivation for using AI is not big enough among my students. Only the best ones (less than 5% I believe) are willing to receive such a help and then, they earn good grades.
  • I don’t like it as I feel they are not doing the learning as they are not completing the work on their own.
Table 4. Training and resources.
Theme    Representative Excerpts
General Training/Awareness
  • Extensive training on all types of AI software that can be used in the classroom.
  • An educational workshop to demonstrate all the AI and ways that the instructor could allow the use.
  • Professional Development in the use of technological formats so we know what the students have access to, what it looks like in use, and how it might be used.
  • Everything from general concepts to prompt development.
Time
  • I need time to fully explore what is available so that I know how to make it a successful tool for student use.
Discipline/Content Specific
  • Specific training courses.
  • AI could be used to improve and correct English pronunciation.
  • How to incorporate AI into your specific discipline.
  • University workshops, prompt engineering training.
Hands-on/Practical
  • Theoretical knowledge should be accompanied by a practical part of guided experimentation.
  • Hands on workshops in how to use prompts and how to set assignments.
  • I am an advocate of exploration and open experimentation. I think that any choice for a given protocol and process of “training” is premature at this juncture—too much may be missed in such a process. I would say that “training” should be engaged through “sharing”.
Modality
  • Regular 1 h online/in-person workshops on specific tools.
  • An online asynchronous training course specific to PA educators would be welcomed.
  • They have been rather scarce; however, I did participate in a very helpful online training on some AI platforms a few weeks ago. It was quite helpful.
Exposure to AI Tools/Platforms
  • Canva, Kahoot, Eduboom.
  • Curriculum re-design practice of using AI Tools.
  • Concept maps, laboratories.
Student Learning/Critical Thinking
  • Ways to use AI that will not decrease the students learning.
  • Exposure on current trends and tools to aid teachers in making their discussions such more interactive, fun and engaging.
  • More seminars to discuss how to make a student use their own mind instead of searching the internet for answers, especially in medicine.
Addressing Ethical AI Use and Bias
  • Safely and accurately use them. Also trust but verify. Be able to use only certain sources to access information.
  • General training on platforms conversations about pitfalls conversations about ethical use of AI in education clear direction for APA citation of AI sources how to utilize AI gaming in the classroom.
  • A reliable AI detector. Training on how OTs are currently utilizing AI in practice, education on how to determine and communicate expectations at various levels of education.
Policies and Guidelines
  • Frameworks for setting expectations for student use. Using AI in the classroom opens the door for students to explore more, but we need concrete policies that outline expectations and resources for teaching students to use it as a tool and not overly on it to make the product.
  • Workshops would be good, and links to other syllabus statements and articles about AI in the classroom.
  • Institutional policies and procedures as to acceptable use of AI. Statements saying they cannot be used is not sufficient to support instructors.
Not Aware/Uncertain
  • I don’t know any.
  • I don’t know at the moment.
  • I don’t know… the specialization course was so useful to me that with ICT it created the right foundations for me to move on this new educational field.
Self-taught/Other
  • I can Provide context. Before introducing an AI tool, offer a brief overview of how it works and why it will benefit students’ learning. Highlight the importance of data privacy. Educate your students on generative AI data privacy practices. Distribute readings or resources that delve into data privacy in AI. Consider sharing articles from reputable business journals or case studies that discuss real-world implications of data privacy breaches. Offer alternatives. Always provide students with an alternative if they’re uncomfortable with sharing their data. This could be another tool, a different assignment, or a manual approach to achieve the same learning outcome.
  • I haven’t really taken any trainings. It’s mostly been practice and self-learning.
  • Educators should be able to follow their own noses and figure out AI tools. I don’t think any particular programs or resources are needed. Trying to introduce programs and resources smells like administrative bloat to me.
Table 5. Perception of AI on mental health.
Theme | Representative Excerpts
Social Isolation/Interaction
  • Students would rather talk to a computer than a real person. They would also prefer the computer do the work so they can mindlessly scroll TikTok.
  • It may increase mental health issues due to decreased interaction and self-initiative.
  • Increase the risk of isolation due to lack of engagement with the real world.
Anxiety and Stress
  • Could lower stress levels by helping educators create documents and scenarios and be more efficient.
  • It is stressful for educators to detect plagiarism.
  • I have seen mostly the positive side of AI, of how it helps people to write better and more appropriate forms of communication, especially for those who might not be as strong with effective and diplomatic communication. I think AI has the potential to help students and thereby enhance their learning and relieve stress.
Creativity and Critical Thinking
  • Disrupting all aspects of human life, natural ways of thinking, learning and so on.
  • Excessive use of artificial intelligence deprives children of creativity and development of critical thinking.
  • Opportunities to think, reflect, hypothesize, synthesize, analyze, and make personal decisions are greatly discouraged. Students don’t exercise their reasoning sense and will eventually become dormant and unproductive. They have to be exposed to a critical orientation on the appropriate and meaningful use of AI.
No Observed Impact/Unfamiliar
  • I don’t know much about this, nor have I seen it.
  • I don’t really see it as a mental health issue to be honest. I’m not sure I see the connection.
  • I have not witnessed any specific effects of AI on the mental health of students.
Workload/Time
  • If AI is used as a method for effectively managing time and resources, it can positively impact student and educator workload. For me, the use of AI has provided more opportunities to work on more cognitively heavy tasks, which allows me to finish my work in a more appropriate time frame.
  • It helps fill gaps, simplifies and summarizes the learning contents.
  • It has the potential to go either way, so it is important for educators to learn and be on the front end of steering where it goes. It has the potential to improve mental health by streamlining productivity to reduce administrative burden, allowing students and educators to focus on what they need to do and learn.
Student Dependency/Addiction
  • Addiction, physical fixity, loss of personality and originality.
  • I fear that overreliance on AI will be present very soon.
  • Both educators and students seem more fixated on the things they do.
  • I believe that the use of AI has an impact on the mental health of students, as it requires increased exposure to backlit screens and reduces a person’s capacity for critical thinking. For us teachers, it is difficult to manage the speed of changes in the digital field.
Sense of Competency/Motivation
  • It makes students think that they aren’t good enough or smart enough.
  • As for students, my concern is the potential for students to be easily demotivated in performing thorough research on assignments. AI at times does not pull information from the most reliable sources.
  • It can give the student a false sense of their personal abilities, making them “feel” as if they know more than they actually do. When they run headlong into reality, this will be a shock.
  • AI discourages students.
Not Useful
  • In my opinion, it is not useful for student[s].
Table 6. Additional suggestions.
Theme | Representative Excerpts
No Suggestions/Unsure
  • No.
  • None currently.
  • Not at this time.
Training and Awareness
  • In my opinion, there should be awareness of the conscious use of artificial technologies and AI.
  • I would love to see a training course on the ethical use of AI as a requirement for students and educators.
  • We need to understand how to implement AI and where realistic and helpful boundaries lie.
Caution/Concerns
  • We should not completely trust in AI.
  • It’s frustrating that AI/GenAI is heralded as the Fixer of All Things, particularly by the tech industry, when we’ve seen this pattern of rising and falling technology in the past. (In its current form) AI is clearly, quantifiably, bad for the environment and replicates existing structural oppression. I’d love it if that weren’t the case, but it seems like, particularly for GenAI, all the applications are meant to replace creative enterprises instead of menial tasks that would free people to be more creative themselves.
  • My greatest concern is people who will maliciously use AI. I am hopeful that society will use AI responsibly.
Balance AI and Human Effort
  • Need a balance.
  • AI should not replace human efforts, but rather, enhance them.
  • It is a good thing as long as it is not allowed to replace humans. As an educator, I can see AI teaching, but it cannot replace a teacher in the classroom. We get to know our students and the various idiosyncrasies they face daily. A computer cannot FEEL.
Benefits of AI
  • Teachers can create personalized study plans for each student, taking into account their abilities, learning styles, and interests.
  • AI could be used to assist in grading assignments like a teaching assistant.
  • Innovation and Creativity: AI can drive innovation by enabling new forms of creativity and problem-solving. For instance, it can generate new designs, suggest novel research paths, or create art (GPT4).
  • Accessibility and Inclusion: Ensuring that AI technologies are accessible and beneficial to all, including marginalized and underserved communities, is crucial for equitable progress (GPT4).
Resources/Expertise
  • Check out my buddy Matt Miller. He’s the AI guru.
Ethical Use and Collaboration
  • We are very early on an exponential curve. Primarily I think what’s needed is a reality check in academia, not “best practices”, not denial, not ethical contemplations by people who are untrained in ethics issues, not social bias audits by people without any real expertise in social inequality, and definitely not insights from the dreary world of our academic education departments.
  • Ethical Considerations: It’s crucial to address ethical issues such as bias, privacy, and transparency. AI systems should be designed and implemented in ways that minimize discrimination and protect users’ personal information.
  • There should be ethical rules for AI, and rules about its use in schools.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Delello, J.A.; Sung, W.; Mokhtari, K.; Hebert, J.; Bronson, A.; De Giuseppe, T. AI in the Classroom: Insights from Educators on Usage, Challenges, and Mental Health. Educ. Sci. 2025, 15, 113. https://doi.org/10.3390/educsci15020113


