Article

Analysis of MOOC Quality Requirements for Landscape Architecture Based on the KANO Model in the Context of the COVID-19 Epidemic

1 School of Horticulture and Landscape Architecture, Henan Institute of Science and Technology, Xinxiang 453003, China
2 Henan Province Engineering Center of Horticultural Plant Resource Utilization and Germplasm Enhancement, Xinxiang 453003, China
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(23), 15775; https://doi.org/10.3390/su142315775
Submission received: 20 October 2022 / Revised: 21 November 2022 / Accepted: 24 November 2022 / Published: 27 November 2022

Abstract
COVID-19 has had a severe impact on higher education worldwide, and Massive Open Online Courses (MOOCs) have become the best solution for reducing the impact of COVID-19 on student learning. In order to improve the quality of MOOCs for landscape architecture, it is essential to fully understand the psychological needs of students learning online. A total of 119 undergraduates and postgraduates majoring in landscape architecture were selected as the research subjects, and 18 indicators falling into 5 functions, including course organization, course resources, learning environment, learning experience, and learning support, were screened. Questionnaires based on the KANO model were prepared at wjx.cn and distributed for investigation through WeChat. Attributes were classified according to the traditional KANO model and the KANO model based on Better-Worse coefficients. The research showed that, based on the classification results of the traditional KANO model, 17 of the 18 indicators were attractive quality factors and the remaining one was a must-be quality factor. After reclassification using the KANO model based on Better-Worse coefficients, 4 of the 18 indicators were must-be quality factors, 6 were one-dimensional quality factors, 4 were attractive quality factors, and the remaining 4 were indifferent quality factors. Compared with the traditional KANO model, the KANO model based on Better-Worse coefficients offers better discrimination in classifying quality elements. Based on the KANO analysis, appropriate development strategies should be adopted for the indicators in each of the four types of quality requirements. The research can provide a basis for the development and optimization of MOOCs for landscape architecture so as to better meet the learning needs of students and achieve better learning effects.

1. Introduction

COVID-19 has had a significant impact on social, political, economic and cultural exchanges across the world, and higher education has also been greatly affected [1]. How to ensure the continuity of teaching has become a topic of common concern. Education departments of various countries have taken numerous countermeasures, and online education has become the best solution for reducing the impact of COVID-19 on student learning, which has resulted in explosive growth in the construction of online courses [2,3]. Improving the quality of online open courses and meeting learning needs has therefore become the focus of future efforts.
Higher education is a significant part of the national innovation system, and it has always been an important force for boosting changes in production, government efficiency and social progress. Countries around the world are constantly reforming and innovating higher education in terms of management, systems, models, and courses. China is the world’s largest developing country with a rapidly growing economy, and its higher education has developed considerably in recent years. At the same time, China has been constantly reforming higher education and has achieved remarkable results. On 29 April 2019, China’s Ministry of Education launched the training program for “six types of outstanding talents and top students in fundamental subjects” to comprehensively promote the construction of new engineering, new medicine, new agriculture and new liberal arts and to improve the ability of colleges and universities to serve social and economic development. The core of this higher education reform is manifested in three aspects: professional development, course development and fundamental subject development. Among these, course development requires raising the level, innovativeness and challenge of courses, vigorously developing “online courses” and “combined online and offline courses” nationwide, and making proper use of modern information technology to boost the development of online education.
Online education has grown rapidly since the beginning of the 21st century, characterized by great flexibility, learner autonomy and widespread use of digital technology [4], thanks to the rapid development of information technology and innovation in education philosophy. The COVID-19 pandemic has accelerated the development of MOOCs, making them an essential resource for ensuring educational continuity. However, MOOCs face both development opportunities and challenges, as educational institutions, teachers and learners are not yet fully ready [5,6]. COVID-19 has accelerated the transformation of digital education, and blended collaborative education is an important development direction. Through collaborative learning, learners can jointly create knowledge and acquire 21st-century skills such as communication, critical thinking, decision-making, leadership and conflict management [7]. Online learning is a major change in the way of learning, allowing learners across the world to access the knowledge and skills they need through the Internet and breaking the limits of time and space [8]. Online education also drives the development of the flipped classroom, where the classroom is no longer just a place for teaching theory but also a place for interaction between teachers and students, enhancing the effectiveness of learning. Online education has also greatly boosted the development of training and education: many well-known companies have conducted vocational training through online education, which not only improves training efficiency but also saves training costs [9]. COVID-19 has had a huge worldwide impact on higher education, and many educational institutions have moved their offline courses online, which has played an important role in safeguarding teaching [10].
The COVID-19 pandemic has forced teachers and students to rapidly adapt to the online education mode; without this option, many students would not be able to access higher education [11,12]. The process from passive acceptance to active adaptation requires the joint effort of the education community. MOOC construction needs to be student-centered, applying new teaching methods and tools to create more realistic learning environments and experiences while developing active and reflective learning, which requires understanding students’ psychological demands for MOOCs [13]. The psychological needs of students learning online are affected by many factors, such as employment, competitiveness, the educational environment, the learning environment, course content, and teaching methods [14]. Clear goals, clear navigation, interactive communication, homework evaluation standards, access to tutorials, independent learning, learning from multiple channels, and association with work are considered significant influencing factors for MOOCs [15,16]. The main drivers of MOOC learning persistence include satisfaction, effort expectation, engagement, behavioral intention, employer encouragement, convenience, and performance expectations [17], and students appreciate programs that are flexible enough to meet individual scheduling needs [18]. The effectiveness of online education is obvious to all, but it is not perfect. Due to the low participation of students, MOOCs have been deemed disappointing and unlikely to replace regular face-to-face education [19]. The research results of some courses show that students may perform well but lack academic practices such as in-depth reading and essay writing. In addition, some teachers have reported declining quality of education and increased workload [20]. Online teaching can replace face-to-face teaching to some extent [21], and some studies find no significant difference between offline and online learning [22]. However, MOOCs face greater challenges in science, technology, engineering and mathematics, since these courses usually require more practice and live demonstrations [23]. Some medical courses are also difficult to teach online, since they require laboratories, expensive equipment and instruction [24]. Compared with offline education, students face issues such as digital opportunities and threats, time pressure, lack of motivation, limited teamwork, and anxiety [25,26,27]. It also seems difficult for online education to provide the social interaction and experience that are regarded as an important part of college education [28].
How to evaluate the quality of MOOCs has drawn extensive attention, and various tools have been developed for evaluating MOOC quality. Different from offline education, online education offers a completely different learning experience. The United States was the first country in the world to launch online education and has mature experience in online education evaluation. E-learning certification standards mainly evaluate MOOCs from three aspects: practicability, technicality, and instruction. Quality on the Line is a standard developed by the Institute for Higher Education Policy in the United States, which mainly consists of architecture, course development, teaching/learning, course structure, the student support system, the faculty support system, and the evaluation and assessment system. Research by Chinese scholars mainly pays attention to the design of the course website, course content, learning interaction, network support, teaching resources, knowledge acquisition, ability acquisition, value cultivation and other indicators [29].
MOOCs can be evaluated both qualitatively and quantitatively: qualitative evaluation mainly adopts interviews, while quantitative evaluation mostly adopts the analytic hierarchy process, fuzzy evaluation and other methods. The Indicators of Engaged Learning Online framework can be used as a tool for instructional designers and teachers to evaluate online courses [30]. The KANO model, service quality, quality function deployment and other quality tools have been widely used for designing and improving services [31]. Many service improvement models acquire customers’ needs based on scales and confirm customer satisfaction according to intensity grading, reflecting a linear relationship between product performance and user satisfaction. The relationship of such satisfaction is one-dimensional: users will be satisfied when a product provides more functions or services; conversely, when functions or services are insufficient, users will be dissatisfied [32]. However, this is not always the case, and a two-dimensional model may exist, because satisfaction theory research has found that not all factors have a one-dimensional impact on user satisfaction. According to the two-dimensional model, the provision of some factors may not achieve user satisfaction and may sometimes even cause dissatisfaction. In addition, whether or not certain factors are provided, the user may perceive no difference at all. That is the two-dimensional model of satisfaction.
Herzberg proposed the famous motivation-hygiene theory, arguing that satisfaction and dissatisfaction are totally separate rather than coexisting on a single continuum [33]. Based on this theory, the KANO model was put forward and widely applied in product quality evaluation as a method for guiding developers in making informed decisions on improving product quality based on the prediction of customer acceptance [34]. This model, put forward by Professor Kano of the Tokyo Institute of Technology, is mainly used to classify and prioritize user needs, and it reflects the nonlinear relationship between product performance and user satisfaction (Figure 1) [35].
Since its development, the KANO model has been widely applied in industrial product design [36], tourism products, medical services [37], sports products, educational services [38], traffic quality, banking services [39] and other fields, playing a vital role in improving product quality and services. Understanding consumer preferences and improving consumer loyalty through the model has become a common concern of enterprises. KANO model analysis has also been applied in the field of education. Fujs et al. proposed a method to evaluate the features of remote conference tools from the perspective of teachers and students based on the KANO model [40]. Seo and Um applied service fairness and service quality to predict satisfaction and dissatisfaction based on the Stimulus-Organism-Response theory and the KANO model [41].
Online education has formed a huge commercial market, and surveys have shown that 61% of students have used at least one MOOC [42]. Especially since the spread of COVID-19 in 2019, there has been a surge in MOOC learning [43]. A MOOC is essentially an educational product, a service that delivers traditional offline education to teachers and students through the Internet. For such an educational product, students’ psychological needs play a crucial role in the smooth running of courses, and understanding students’ learning needs and enhancing their learning experience is an important approach to improving the quality of MOOCs [44]. Therefore, the KANO model can be used to identify user needs for MOOCs, and the quality of MOOCs can be further improved according to the types of needs so as to better serve teaching. Cost-benefit is one of the important principles of product development and has an important influence on decision-making in curriculum construction. KANO analysis can provide scientific evidence for curriculum optimization, so that curriculum construction delivers greater value or benefit for a given cost investment and results in a better learning experience.

2. Materials and Methods

The research is divided into five steps. The first step is to understand students’ learning demands and to screen questionnaire indicators by combining the curriculum evaluation criteria of China and other countries. The second step is to determine the subjects to be investigated, produce the questionnaire using the Wenjuanxing software and distribute it through WeChat. The third step is to collect the data and conduct reliability and validity tests. The fourth step is to carry out indicator classification using the classic KANO model and analyze the discrimination of the results. The fifth step is to carry out indicator classification using the KANO model based on Better-Worse coefficients and to discuss methods and measures for MOOC improvement according to the classification results.

2.1. Questionnaire Design and Survey

2.1.1. Questionnaire Design

There are many factors affecting online learning, and countries such as the United States began research on MOOC quality evaluation early. China’s MOOCs have undergone explosive growth in recent years. In order to regulate MOOC development, the Ministry of Education and other departments have issued several standards or documents [45,46,47,48], for instance, the Notice on the Identification of National Quality Online Open Courses in 2019, the Opinions on the Implementation of Construction of First-class Undergraduate Courses, the Guidance for the Construction and Application of MOOCs in Schools of Higher Education, and the Quality Assurance System of UOOCs and MOOCs. The main indicators mentioned in these documents include the course team, course objectives, teaching design, course content, teaching organization, teacher guidance, etc. This research integrated MOOC quality evaluation factors from China, Europe and the United States and selected 18 indicators falling into 5 functions, including course organization, course resources, learning environment, learning experience, and learning support (Table 1).
Following the KANO questionnaire format, the 18 indicators were formulated as 18 pairs of forward and reverse questions. Each pair of forward and reverse questions uses a five-point Likert scale corresponding to “I like it”, “it must be”, “I am neutral”, “I can live with it” and “I dislike it”, respectively. The questions in the questionnaire are described in words (Table 2) to help respondents understand them and express their true attitudes.

2.1.2. Survey

In KANO research, the sample size is generally required to be 5–10 times the number of items on the scale. The research subjects are students majoring in landscape architecture at the School of Horticulture and Landscape Architecture, Henan Institute of Science and Technology. The online questionnaire, created on wjx.cn, consists of two parts: the basic information of the subjects and the forward and reverse questions for the 18 indicators. Questionnaires were distributed to undergraduates and graduate students by the undergraduate counselor and the graduate secretary, respectively, via WeChat and were completed between 30 and 31 May 2022. Eventually, 135 questionnaires were recovered. After excluding invalid questionnaires with the same answer to the forward or reverse questions, 119 valid questionnaires were obtained, for a valid response rate of 88.1%. Basic information on the research subjects is shown in Table 3.
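As a rough illustration of this screening step, the sketch below filters straight-lined questionnaires with pandas. It assumes that an “invalid” questionnaire is one in which a respondent chose the identical option for every forward question or every reverse question; the file name and column names (F1–F18, R1–R18) are hypothetical.

import pandas as pd

# Hypothetical export from wjx.cn: columns F1..F18 hold forward answers,
# R1..R18 hold reverse answers, each coded 1-5 on the five-point Likert scale.
df = pd.read_excel("kano_raw_responses.xlsx")

forward_cols = [f"F{i}" for i in range(1, 19)]
reverse_cols = [f"R{i}" for i in range(1, 19)]

# Treat a questionnaire as invalid (straight-lined) if the respondent gave the
# same answer to all forward questions or to all reverse questions.
straight_forward = df[forward_cols].nunique(axis=1) == 1
straight_reverse = df[reverse_cols].nunique(axis=1) == 1
valid = df[~(straight_forward | straight_reverse)]

print(f"recovered: {len(df)}, valid: {len(valid)}")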
After the survey, Excel 2016 was used to organize and analyze the questionnaire results. Data analysis was performed using SPSS 26 software (Chicago, IL, USA).

2.2. Methods

2.2.1. Reliability and Validity Test

Reliability analysis aims to test the reliability of the questionnaire. Generally, if the Cronbach’s alpha is above 0.9, the reliability of the questionnaire is excellent; if it is above 0.7, the reliability is good; if it is above 0.6, the reliability is acceptable; if it is below 0.6, the scale shall be redesigned. In the reliability test of the questionnaire, the Cronbach’s alpha of the forward and reverse questions exceeds 0.9 on the whole, and the Cronbach’s alpha of the forward and reverse questions in most functional modules is above 0.6, which is acceptable [49] (Table 4).
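For readers reproducing this check outside SPSS, Cronbach’s alpha can be computed directly from the item responses. The following sketch applies the standard formula to a respondents-by-items matrix of Likert scores; the demo matrix is synthetic, not the study data.

import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 119 respondents x 18 forward questions coded 1-5 (synthetic data)
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(119, 18)).astype(float)
print(round(cronbach_alpha(demo), 3))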
Validity analysis aims to test the authenticity and accuracy of the questionnaire. Generally, if the KMO is above 0.9, the validity of the questionnaire is good; if it is between 0.8 and 0.9, the validity is appropriate; if it is between 0.7 and 0.8, the validity is acceptable; if it is less than 0.6, the validity is poor. The KMO of the questionnaire as a whole is 0.887, the KMO of the forward questionnaire is 0.896, and that of the reverse questionnaire is 0.934; the significance of Bartlett’s Test of Sphericity is 0.000, less than 0.01, which is within the range of good validity (Table 5).
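If the validity checks are reproduced in Python rather than SPSS, the factor_analyzer package (an assumption; the study itself used SPSS 26) provides both statistics. Here, valid is the respondents-by-items DataFrame of Likert scores, and the file name is hypothetical.

import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

valid = pd.read_excel("kano_valid_responses.xlsx")   # hypothetical cleaned data file

kmo_per_item, kmo_overall = calculate_kmo(valid)
chi_square, p_value = calculate_bartlett_sphericity(valid)

print(f"Overall KMO: {kmo_overall:.3f}")                            # >0.8 suggests appropriate validity
print(f"Bartlett chi-square: {chi_square:.1f}, p = {p_value:.4f}")  # p < 0.01 supports factorability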

2.2.2. Analytical Methods

In traditional KANO analysis, each respondent’s pair of answers for an indicator is mapped against the quality attribute classification table, the attribute category with the highest frequency or percentage is counted for that indicator, and indicators are thereby classified into five types: attractive quality factors, one-dimensional quality factors, must-be quality factors, indifferent quality factors, and reverse quality factors [50] (Table 6).
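To make the counting procedure concrete, the sketch below classifies one indicator using the commonly cited KANO evaluation table (the exact cell layout of Table 6 is assumed to follow this standard form): each respondent’s forward/reverse answer pair is mapped to A, O, M, I, R or Q, and the modal category becomes the indicator’s attribute.

from collections import Counter

# Standard KANO evaluation table (assumed to match Table 6):
# rows = forward (functional) answer, columns = reverse (dysfunctional) answer.
EVAL_TABLE = {
    "like":      {"like": "Q", "must-be": "A", "neutral": "A", "live-with": "A", "dislike": "O"},
    "must-be":   {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "neutral":   {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "live-with": {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "dislike":   {"like": "R", "must-be": "R", "neutral": "R", "live-with": "R", "dislike": "Q"},
}

def classify_indicator(answer_pairs):
    """answer_pairs: list of (forward, reverse) Likert answers for one indicator."""
    counts = Counter(EVAL_TABLE[f][r] for f, r in answer_pairs)
    category, _ = counts.most_common(1)[0]   # modal (highest-frequency) category
    return category, counts

# Example with four hypothetical respondents for one indicator
pairs = [("like", "dislike"), ("like", "neutral"), ("must-be", "dislike"), ("like", "dislike")]
print(classify_indicator(pairs))   # ('O', Counter({'O': 2, 'A': 1, 'M': 1}))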
In the traditional KANO model, the classification cannot determine which attribute category an indicator belongs to when several quality factors have similarly high percentages, or when other quality factors have percentages very close to that of the highest-ranking one. Moreover, the traditional KANO model can only assess customer satisfaction qualitatively, and this limitation prevents quantitative assessment of the degree of customer satisfaction [51]. Later, to make up for these shortcomings, American scholars proposed the Better-Worse coefficients, expressed using the Satisfaction Index (SI) and the Dissatisfaction Index (DSI) [52]. SI is the satisfaction index when the corresponding feature is provided, with a value ranging between 0 and 1; the higher the value, the greater the impact of the indicator on satisfying public demand. DSI is the dissatisfaction index when the feature is eliminated [53], with a value ranging between −1 and 0; the lower the value, the greater the impact of the indicator on public dissatisfaction. The minus sign in the DSI formula emphasizes that failing to meet a requirement or to incorporate a feature in the product design has a negative impact on user satisfaction.
Better coefficient: SI = (A + O) / (A + O + I + M)    (1)
Worse coefficient: DSI = −1 × (O + M) / (A + O + I + M)    (2)
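Continuing the classification sketch above, the Better-Worse coefficients for an indicator follow directly from its A, O, M and I counts (R and Q responses are excluded, as in Equations (1) and (2)); the counts in the example are hypothetical.

def better_worse(counts):
    """SI and DSI for one indicator from its A/O/M/I frequency counts."""
    a, o, m, i = (counts.get(k, 0) for k in ("A", "O", "M", "I"))
    total = a + o + m + i
    si = (a + o) / total           # Better coefficient, Equation (1), range 0..1
    dsi = -1 * (o + m) / total     # Worse coefficient, Equation (2), range -1..0
    return si, dsi

print(better_worse({"A": 40, "O": 35, "M": 25, "I": 19}))   # (0.630..., -0.504...)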

3. Results and Analysis

3.1. Results of Traditional KANO Analysis

According to Table 7, 17 of the 18 indicators are attractive quality factors. The traditional KANO model classification tends to produce a large number of indicators classified as attractive quality factors, and the MOOC indicators are no exception [54]. These indicators satisfy learners when they are adequately provided but do not cause dissatisfaction when they are not. In other similar cases, products or services with attractive quality factors are clearly more attractive to consumers and more likely to form a competitive advantage.
MOOCs have undergone explosive growth worldwide since the beginning of the 21st century and have become a new type of teaching product. Since 2011, three MOOC platforms, namely Coursera, edX and Udacity, have sparked a boom in MOOC education [55]. MOOCs expand the time and space of learning and improve the flexibility of learning and the freedom of choice. However, there are also some problems, such as a lack of social interaction, which makes online learners feel lonely and has a negative impact on online learning [56], and uneven course quality, as some courses merely move the offline learning mode onto the Internet and teach all learners in a one-way, cramming style without any differentiation. Similarly, there are also many problems in the application of MOOCs in China. Currently, in addition to national and provincial MOOCs organized by education departments, a large number of MOOCs have been established by colleges and universities in China. MOOC construction is costly and requires careful organization and production, but due to a lack of funds and experience, many courses are of poor quality, which affects students’ enthusiasm for learning to some extent [57]. Research shows that students have a low participation rate and low willingness to continue learning, and MOOC completion rates are usually low, at 5–10% [58]. Students’ enthusiasm for learning is not very high, and many of them participate in MOOC learning passively, so they pay little attention to the course. On the other hand, some high-quality courses, such as national MOOCs, are truly excellent in course architecture, course organization, content presentation, courseware production, and teacher-student interaction, attracting numerous learners. When these quality services are provided, MOOCs will greatly increase satisfaction.

3.2. Results of KANO Analysis Based on Better-Worse Coefficients

According to Equations (1) and (2), SI and DSI are calculated, respectively. The SI of the course indicators ranges between 0.5470 and 0.7479, with a mean value of 0.6474; the absolute value of DSI of the course indicators ranges between 0.0683 and 0.4118, with a mean value of 0.2385 (Table 8). With the means of SI and of the absolute value of DSI as the origin, a quadrant diagram is drawn (Figure 2). This data processing method makes up for the deficiency of the traditional KANO method, which relies solely on the maximum frequency to determine the attribute classification of each indicator.
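A quadrant diagram like Figure 2 can be reproduced from the SI and |DSI| values; the matplotlib sketch below uses the two means as reference lines, with SI on the x-axis and |DSI| on the y-axis to match the quadrant assignments described next. The indicator values shown are illustrative placeholders, not the values in Table 8.

import matplotlib.pyplot as plt

# Illustrative values only; the study's SI/|DSI| values are listed in Table 8.
indicators = {"A6": (0.70, 0.30), "B1": (0.72, 0.15), "A2": (0.58, 0.35), "C1": (0.56, 0.10)}

si = [v[0] for v in indicators.values()]
dsi_abs = [v[1] for v in indicators.values()]

fig, ax = plt.subplots()
ax.scatter(si, dsi_abs)
for name, (s, d) in indicators.items():
    ax.annotate(name, (s, d))

# The means of SI and |DSI| serve as the origin of the four quadrants.
ax.axvline(sum(si) / len(si), linestyle="--")
ax.axhline(sum(dsi_abs) / len(dsi_abs), linestyle="--")
ax.set_xlabel("SI (Better coefficient)")
ax.set_ylabel("|DSI| (Worse coefficient)")
plt.show()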
The indicators in the first quadrant are one-dimensional quality factors, and there are six indicators, including A6, C2, D1, D4, E1, and E2. The absolute values of SI and DSI are both higher than the mean value, indicating that the provision of such services can not only enhance learner satisfaction but also prevent the dissatisfaction of learners, that is, factors in this quadrant require enough attention.
The indicators in the second quadrant are must-be quality factors, and there are four such indicators: A2, A3, A5, and D3. Their SI values are lower than the mean, indicating that providing the corresponding services has little effect on enhancing satisfaction; the lower the SI value, the more learners take the services for granted. The absolute values of their DSI are higher than the mean, indicating that dissatisfaction would increase if such services were not provided; the higher the absolute value of DSI, the more dissatisfied learners are with the lack of such services. This suggests that although these indicators cannot greatly enhance learner satisfaction, their absence will make learners dissatisfied, so providing these factors can effectively reduce learner dissatisfaction.
The indicators in the third quadrant are indifferent quality factors, and there are four such indicators: A1, A4, C1, and E3. Both the SI value and the absolute value of DSI are lower than the mean, suggesting that the related services do not particularly affect learner satisfaction, although future service improvements may transform such needs into higher-level ones. The presence of these indicators is relatively unimportant to learners; for instance, learners pay little attention to these factors, or these factors currently fail to attract learners’ attention.
The indicators in the fourth quadrant are attractive quality factors, and there are four indicators, including B1, B2, B3, and D2. The SI value is higher than the mean value, suggesting that the provision of such services can enhance learner satisfaction. The higher the SI value, the higher the satisfaction; the absolute value of DSI is lower than the mean value, suggesting that learner satisfaction would not decrease if such services were not provided.
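Putting the four quadrants together, the reclassification rule can be written as a small helper; this is a sketch, with the mean SI and mean |DSI| taken from the values reported above.

def kano_quadrant(si, dsi, mean_si=0.6474, mean_dsi_abs=0.2385):
    """Reclassify an indicator from its SI and DSI using the Better-Worse quadrants."""
    high_si = si >= mean_si
    high_dsi = abs(dsi) >= mean_dsi_abs
    if high_si and high_dsi:
        return "one-dimensional"   # quadrant I
    if not high_si and high_dsi:
        return "must-be"           # quadrant II
    if not high_si and not high_dsi:
        return "indifferent"       # quadrant III
    return "attractive"            # quadrant IV

print(kano_quadrant(0.7479, -0.4118))   # one-dimensional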

4. Discussion

Students are the center of teaching and learning, and MOOCs should not merely take the perspective of “teaching” and provide the so-called “services that are supposed to be provided” but should fully consider the match between the supply of educational services and the needs of their recipients. Students are discerning consumers. If students’ learning needs are met, MOOCs will significantly enhance the learning effect, which requires reasonably adjusting course indicators according to students’ learning needs so that they better match expectations [59].

4.1. Emphasizing Must-Be Quality Factors and Meeting the Basic Requirements of Course Quality

Course teaching design means making orderly arrangements of teaching factors and determining appropriate teaching plans according to the requirements of course standards and the characteristics of the teaching objects. It generally includes teaching objectives, teaching difficulties and emphases, teaching methods, teaching steps and time allocation. Different from offline courses, the teaching design of MOOCs has its own particularities, and it is necessary to reorganize the teaching links rather than completely copy the offline mode.
Course content determines the knowledge and abilities students acquire, so it is one of the basic quality requirements of MOOCs. Low student attendance and low willingness to continue learning have restricted the development of online education in China. For an informal learning environment, course content is essential for MOOCs, which should provide pertinent essential knowledge for building a common knowledge base [60,61]. Research shows that effort expectancy, content quality, perceived cost, and performance expectancy are important factors that affect continuous learning, and content quality is the most important of these [62].
The course schedule mainly refers to the time control of a course. Currently, many MOOCs are released according to the time nodes of offline course organization, which goes against the original intention of MOOCs to a certain extent. Students should be allowed to choose the place and time of course learning more independently in order to enhance learning efficiency.
Course emotional value development is an important task of offline courses, but it is easily overlooked in MOOCs. Emotions, attitudes, and values are the results of a person’s experience in practice; they depend not only on the positive or negative experiences generated directly through participation in practice but also on the experiences generated from the positive or negative evaluations given by teachers, parents and classmates. An excellent MOOC can cultivate both the intellectual and non-intellectual factors of students, so that students can acquire basic knowledge and skills and develop correct values.

4.2. Improving One-Dimensional Quality Factors and Enhancing Course Satisfaction

The course teaching team is the basis for the activities carried out in the course, and a high-quality teaching team will enhance the success of knowledge transfer. Teachers’ teaching experience, teaching skills, and learning experience have a significant impact on online teaching capability, and by providing teachers with flexible interactive support, teaching designers can bridge this gap between theoretical knowledge and practical skills [63,64].
The learning of MOOCs is mainly based on multimedia. However, significant gaps in funding, video quality, image organization and presentation, audio effects and other aspects have become one of the factors affecting students’ interest in MOOC learning. Digital video plays an increasingly important role in the learning process, but for teachers who are not good at making videos, video production is a time-consuming activity [65]. Research shows that incorporating digital music can help improve MOOC teaching effectiveness and satisfaction [66].
Course knowledge development mainly refers to the transfer of course knowledge. Therefore, the matching of knowledge and course, knowledge innovation and richness of knowledge would affect the quality of MOOCs. Even two courses with the same name differ greatly from each other in the organization of knowledge. Therefore, a good knowledge architecture becomes an essential factor for MOOCs.
Teachers and students can communicate face to face in offline courses, but in MOOCs there is a lack of opportunity for interaction between teachers and students or among students. Critics argue that the asynchronous interaction of MOOCs is not engaging and rigorous enough for higher education, and that a balanced online environment should provide both asynchronous and synchronous opportunities to facilitate communication and collaboration between teachers and students [67]. For some students participating in MOOCs, receiving tutoring can be a key factor in their success, and even if students perform well in the end, they may feel frustrated due to technical difficulties and a lack of tutor support [68]. Therefore, course interaction becomes one of the factors most desired by students [69]. Research shows that group learning can significantly reduce anxiety and cognitive load and enhance the effect of course learning [70]. Consequently, a collaborative system can be designed to encourage students to develop trust and teamwork in cross-cultural online learning environments [71]. Research also shows that online students are less motivated to re-enroll in courses because of the lack of direct contact with teachers. In addition, there is a lack of student support services, such as financial aid, academic advisors, or counseling services, and MOOCs also fail to provide opportunities to participate in student groups and organizations, guest speakers and campus events [72]. Blended courses, rather than pure MOOCs, can increase effective interaction and have become a highly prized model [73]. Online-Merge-Offline learning attempts to implement offline classroom and online platform teaching simultaneously. This learning mode relies on hybrid infrastructure and open educational practices and combines online and offline (physical classroom) learning spaces in real time so as to provide a more open and immersive learning experience [74].
Online education is divided into synchronous and asynchronous remote learning. In synchronous learning, communication takes place in a shared virtual environment; in asynchronous learning, the interaction between teachers and learners takes place via email, recordings, videos or texts [75]. Research on technology guarantees for online education has found a great number of obstacles and restrictions in implementing modern technology for synchronous or asynchronous learning, such as insufficient access to the Internet and digital technology, low levels of computer literacy, or technological restrictions, which exert negative effects on MOOC education to a certain extent [76]. Online learning requires full accessibility, with a platform that is quick and easy to access and smooth in operation. At present, the MOOC platforms developed by some universities are usually complicated and greatly affected by server stability, while those developed by professional companies are usually smooth to access and run, which is an important guarantee for teaching and learning.

4.3. Adjusting Indifferent Quality Factors and Improving Course Development Strategies

Course teaching objectives are the level of knowledge, skills, emotions and attitudes that students are expected to achieve after taking the course. In fact, whether or not a clear teaching objective is stated, it will be reflected in the content of the course. At present, compared with offline education, the teaching methods of MOOCs are relatively simple, mainly recorded lectures, online tests, and interactive discussions. Most MOOCs are developed and run by professional companies that share similar visual styles, so students pay less attention to the course image. According to the theory behind the KANO model, users’ needs vary, and their perception of indifferent quality demands also changes constantly, from I to A, O and then M [77]. Therefore, it is unscientific to completely ignore indifferent quality demands, and course developers should track these indifferent-demand indicators and adjust the relevant strategies as appropriate.

4.4. Highlighting Course Features by Centering on Attractive Quality Factors

MOOC resources are an important factor in attracting students and can improve students’ academic performance, and students with higher grades generally participate more actively in online learning activities [78,79]. Course resources are divided into core resources, which reflect the course teaching ideas, teaching content, teaching methods, and teaching process; necessary resources, which include the course introduction, syllabus, teaching calendar, lesson plans or presentations, assignments, reference materials, and teaching videos throughout the course; and additional resources, which can expand students’ knowledge, skills, and emotions. The richness of course resources means that courses can provide students with more choices for learning, which is conducive to satisfying students’ learning interests and hobbies; the coverage of course resources means that the resources cover all the content and links of the course; and the openness of course resources means that the resources are available free of charge and in full.
It is relatively easy for MOOCs to cultivate both knowledge and emotions, but it is difficult for them to cultivate abilities. Similar to medicine, landscape architecture requires the cultivation of numerous operational skills, which are difficult to acquire through online learning [80]. Research shows that many students complain in course feedback that they have fewer opportunities to come into contact with effective teaching practices, which should draw enough attention [81].

5. Conclusions

According to students’ psychological learning needs, this research selects course teaching objectives, course teaching design, course content organization, course teaching methods, course schedule, course teaching team capability, richness of course resources, coverage of course resources, openness of course resources, course image design, course multimedia quality, course knowledge development, course competence development, course emotional value development, course interaction, platform access, platform running, and platform updating as MOOC quality indicators, which are classified by attribute using the KANO method.
The traditional KANO model has poor discrimination in classifying the quality elements of online courses for landscape architecture majors, while the KANO model based on Better-Worse coefficients has relatively good discrimination for MOOC quality elements. After classification, the elements contained in the four quality types can be clearly identified. According to the attribute classification of each quality element, the attribute positioning of MOOC quality elements can be determined, and appropriate teaching strategies can be formulated more specifically to adapt to continuously changing learning demands and to improve students’ learning satisfaction.
MOOC is an educational product with students as the main consumer group, while the KANO model is mainly used to study consumer satisfaction with the product. Therefore, the KANO model is very suitable for evaluating the satisfaction of various educational products, and product quality can be improved according to the results.

6. Limitations and Future Research

Although the sample size of this research can meet the requirements, there is no doubt that a larger sample size can make the research results more representative. The samples are taken from the Henan Institute of Science and Technology. Future confirmatory studies may increase the sample size and expand the coverage of study participants.
In future research, some other indicators should also be taken into consideration [82]. The first is personalized learning: different students have different learning needs, and how to meet personalized needs is also a factor worthy of attention in MOOC construction. The second is the cultivation of creativity by MOOCs [83]. Research shows that college students believe teachers are capable of integrating technologies into courses but hope that teachers will continue to introduce innovative practices into the educational environment so that students can gain experience and be prepared to adapt to a complex society [84]. The third is the learning needs of physiologically vulnerable groups, such as how to address the LMS (learning management system) barriers, course content and material barriers, and communication barriers that deaf students face in their learning. Finally, with the development of modern technology, there are more and more types of learning terminals, which can meet students’ needs to learn anytime and anywhere [85]. Therefore, MOOCs should be able to adapt to various learning terminals and provide a stable learning environment [86]. For example, the development of intelligent tutoring systems enables online learning terminals to identify learners’ emotions, which can be recognized by asking the user, tracking implicit parameters, voice recognition, facial expression recognition, vital signals and gesture recognition [87]. Artificial Intelligence technology has profoundly changed the ways of production and life and can also play a great role in education. Applying AI technology to MOOC teaching will improve the learning experience and teaching efficiency [88]. There is a serious phenomenon of “emotional loss” in online learning, that is, teachers cannot perceive the learning growth and emotions of online learners in real time. A facial expression recognition system can identify seven learning emotions, namely confusion, curiosity, distraction, enjoyment, fatigue, depression, and neutrality, enabling teachers to perceive students’ learning emotions in a timely manner and adjust teaching strategies in real time [89].
According to Kano’s life cycle theory, quality requirements change as customer perceptions change. Quality factors that are seen as indifferent can become attractive, one-dimensional, or must-be. Therefore, future research should continue to verify and improve the assessment of the quality requirements of MOOCs, so as to provide a basis for continuous improvement of MOOCs.

Author Contributions

Conceptualization, L.Q. and Y.Z.; methodology, L.Q.; software, Y.Z.; validation, L.Q.; formal analysis, L.Q.; investigation, Y.Z.; resources, Y.Z.; data curation, Y.Z.; writing—original draft preparation, L.Q.; writing—review and editing, Y.Z.; visualization, Y.Z.; supervision, Y.Z.; project administration, Y.Z.; funding acquisition, Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the 2020 New Agricultural Science Research and Practice Reform Project of Henan Province, China (project title: Research on the construction standards of first-class curriculum in landscape architecture under the background of new agricultural science), grant number 2020JGLX134. This project is supported by the Education Department of Henan Province, China. The APC was funded by the School of Horticulture and Landscape Architecture, Henan Institute of Science and Technology.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data supporting the study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alzahrani, L.; Seth, K.P. Factors influencing students’ satisfaction with continuous use of learning management systems during the COVID-19 pandemic: An empirical study. Educ. Inf. Technol. 2021, 26, 6787–6805. [Google Scholar] [CrossRef] [PubMed]
  2. Hu, Y.H. Effects of the COVID-19 pandemic on the online learning behaviors of university students in Taiwan. Educ. Inf. Technol. 2022, 27, 469–491. [Google Scholar] [CrossRef] [PubMed]
  3. Misirli, O.; Ergulec, F. Emergency remote teaching during the COVID-19 pandemic: Parents experiences and perspectives. Educ. Inf. Technol. 2021, 26, 6699–6718. [Google Scholar] [CrossRef] [PubMed]
  4. Moreno-Marcos, P.M.; Alario-Hoyos, C.; Munoz-Merino, P.J.; Kloos, C.D. Prediction in MOOCs: A review and future research directions. IEEE Trans. Learn. Technol. 2019, 12, 384–401. [Google Scholar] [CrossRef]
  5. Swanson, B.A.; Valdois, A. Acceptance of online education in China: A reassessment in light of changed circumstances due to the COVID-19 pandemic. Int. J. Educ. Res. Open 2022, 3, 100214. [Google Scholar] [CrossRef]
  6. Coman, C.; Țîru, L.G.; Meseșan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online teaching and learning in higher education during the coronavirus pandemic: Students’ perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  7. Kalmar, E.; Aarts, T.; Bosman, E.; Ford, C.; de Kluijver, L.; Beets, J.; Veldkamp, L.; Timmers, P.; Besseling, D.; Koopman, J.; et al. The COVID-19 paradox of online collaborative education: When you cannot physically meet, you need more social interactions. Heliyon 2022, 8, e08823. [Google Scholar] [CrossRef] [PubMed]
  8. Vlachopoulos, D.; Makri, A. Online communication and interaction in distance higher education: A framework study of good practice. Int. Rev. Educ. 2019, 65, 605–632. [Google Scholar] [CrossRef]
  9. Burd, E.L.; Smith, S.P.; Reisman, S. Exploring business models for MOOCs in higher education. Innov. High Educ. 2015, 40, 37–49. [Google Scholar] [CrossRef]
  10. Cavanaugh, J.; Jacquemin, S.J.; Junker, C.R. Variation in student perceptions of higher education course quality and difficulty as a result of widespread implementation of online education during the COVID-19 pandemic. Tech. Know. Learn. 2022. [Google Scholar] [CrossRef]
  11. Neuwirth, L.S.; Jovic, S.; Mukherji, B.R. Reimagining higher education during and post-COVID-19: Challenges and opportunities. J. Adult Cont. Educ. 2021, 27, 141–156. [Google Scholar] [CrossRef]
  12. Racovita-Szilagyi, L.; Carbonero, D.; Diaconu, M. Challenges and opportunities to eLearning in social work education: Perspectives from Spain and the United States. Eur. J. Soc. Work 2018, 21, 836–849. [Google Scholar] [CrossRef]
  13. Fernández-Batanero, J.M.; Montenegro-Rueda, M.; Fernández-Cerero, J.; Tadeu, P. Online education in higher education: Emerging solutions in crisis times. Heliyon 2022, 8, e10139. [Google Scholar] [CrossRef] [PubMed]
  14. Alemán de la Garza, L.Y.; Sancho-Vinuesa, T.; Gómez Zermeño, M.G. Indicators of pedagogical quality for the design of a Massive Open Online Course for teacher training. Int. J. Educ. Technol. High Educ. 2015, 12, 104–118. [Google Scholar] [CrossRef] [Green Version]
  15. Baldwin, S.; Ching, Y.H.; Hsu, Y.C. Online course design in higher education: A review of national and statewide evaluation instruments. TechTrends 2018, 62, 46–57. [Google Scholar] [CrossRef] [Green Version]
  16. Hollebrands, K.F.; Lee, H.S. Effective design of massive open online courses for mathematics teachers to support their professional learning. ZDM-Math. Educ. 2020, 52, 859–875. [Google Scholar] [CrossRef] [Green Version]
  17. Lakhal, S.; Khechine, H.; Mukamurera, J. Explaining persistence in online courses in higher education: A difference-in-differences analysis. Int. J. Educ. Technol. High Educ. 2021, 18, 19. [Google Scholar] [CrossRef]
  18. Daniel, E.L. A review of time-shortened courses across disciplines. Coll. Stud. J. 2000, 34, 298–309. [Google Scholar]
  19. Han, H.P.; Lien, D.; Lien, J.W.; Zheng, J. Online or face-to-face? Competition among MOOC and regular education providers. Int. Rev. Econ. Financ. 2022, 80, 857–881. [Google Scholar] [CrossRef]
  20. Holzweiss, P.C.; Polnick, B.; Lunenburg, F.C. Online in half the time: A case study with online compressed courses. Innov. High Educ. 2019, 44, 299–315. [Google Scholar] [CrossRef]
  21. Olmes, G.L.; Zimmermann, J.S.M.; Stotz, L.; Takacs, F.Z.; Hamza, A.; Radosa, M.P.; Findeklee, S.; Solomayer, E.F.; Radosa, J.C. Students’ attitudes toward digital learning during the COVID-19 pandemic: A survey conducted following an online course in gynecology and obstetrics. Arch. Gynecol. Obstet. 2021, 304, 957–963. [Google Scholar] [CrossRef] [PubMed]
  22. Faulconer, E.K.; Griffith, J.; Wood, B.; Roberts, D. A comparison of online, video synchronous, and traditional learning modes for an introductory undergraduate physics course. J. Sci. Educ. Technol. 2018, 27, 404–411. [Google Scholar] [CrossRef]
  23. Yang, D. Instructional strategies and course design for teaching statistics online: Perspectives from online students. I. J. STEM Ed. 2017, 4, 34. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Yen, P.Y.; Hollar, M.R.; Griffy, H.; Lee, L.M.J. Students’ expectations of an online histology course: A qualitative study. Med. Sci. Educ. 2014, 24, 75–82. [Google Scholar] [CrossRef]
  25. Bruggeman, B.; Garone, A.; Struyven, K.; Pynoo, B.; Tondeur, J. Exploring university teachers’ online education during COVID-19: Tensions between enthusiasm and stress. Comput. Educ. Open 2022, 3, 100095. [Google Scholar] [CrossRef]
  26. Siah, C.J.R.; Huang, C.M.; Poon, Y.S.R.; Koh, S.L.S. Nursing students’ perceptions of online learning and its impact on knowledge level. Nurse Educ. Today 2022, 112, 105327. [Google Scholar] [CrossRef]
  27. Khan, M. The impact of COVID-19 on UK higher education students: Experiences, observations, and suggestions for the way forward. Corp. Gov-Int. J. Bus. Soc. 2021, 21, 1172–1193. [Google Scholar] [CrossRef]
  28. McCullogh, N.; Allen, G.; Boocock, E.; Peart, D.J.; Hayman, R. Online learning in higher education in the UK: Exploring the experiences of sports students and staff. J. Hosp. Leis. Sport. Tour. Educ. 2022, 31, 100398. [Google Scholar] [CrossRef]
  29. Zhang, W.Y.; Wang, L.X. The development of bench mark for assessing online teaching environments. Distance Educ. China 2003, 17, 34–39+78–79. [Google Scholar]
  30. Bigatel, P.M.; Edel-Malizia, S. Using the “indicators of engaged learning online” framework to evaluate online course quality. TechTrends 2018, 62, 58–70. [Google Scholar] [CrossRef]
  31. Lizarelli, F.L.; Osiro, L.; Ganga, G.M.D.; Mendes, G.H.S.; Paz, G.R. Integration of SERVQUAL, Analytical Kano, and QFD using fuzzy approaches to support improvement decisions in an entrepreneurial education service. Appl. Soft Comput. 2021, 112, 107786. [Google Scholar] [CrossRef]
  32. Agyeiwaah, E.; Badu Baiden, F.; Gamor, E.; Hsu, F.C. Determining the attributes that influence students’online learning satisfaction during COVID-19 pandemic. J. Hosp. Leis. Sport. Tour. Educ. 2022, 30, 100364. [Google Scholar] [CrossRef]
  33. Alrawahi, S.; Sellgren, S.F.; Altouby, S.; Alwahaibi, N.; Brommels, M. The application of Herzberg’s two-factor theory of motivation to job satisfaction in clinical laboratories in Omani hospitals. Heliyon 2020, 6, e04829. [Google Scholar] [CrossRef] [PubMed]
  34. Bhardwaj, J.; Yadav, A.; Chauhan, M.S.; Chauhan, A.S. Kano model analysis for enhancing customer satisfaction of an automotive product for Indian market. Mater. Today Proc. 2021, 46, 10996–11001. [Google Scholar] [CrossRef]
  35. Thipwong, P.; Wong, W.K.; Huang, W.T. Kano model analysis for five-star hotels in Chiang Mai, Thailand. J. Manag. Inf. Decis. Sci. 2020, 23, 1–15. [Google Scholar]
  36. Lin, F.H.; Tsai, S.B.; Lee, Y.C.; Hsiao, C.F.; Zhou, J.; Wang, J.; Shang, Z.W. Empirical research on Kano’s model and customer satisfaction. PLoS ONE 2017, 12, e0183888. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Chen, M.C.; Hsu, C.L.; Lee, L.H. Investigating pharmaceutical logistics service quality with refined Kano’s model. J. Retail. Consum. Serv. 2020, 57, 102231. [Google Scholar] [CrossRef]
  38. Kinker, P.; Swarnakar, V.; Singh, A.R.; Jain, R. Prioritizing NBA quality parameters for service quality enhancement of polytechnic education institutes–A fuzzy Kano-QFD approach. Mater. Today Proc. 2021, 47, 5788–5793. [Google Scholar] [CrossRef]
  39. Pakizehkar, H.; Sadrabadi, M.M.; Mehrjardi, R.Z.; Eshaghieh, A.E. The Application of integration of Kano’s Model, AHP technique and QFD matrix in prioritizing the bank’s substructions. Procedia Soc. Behav. Sci. 2016, 230, 159–166. [Google Scholar] [CrossRef] [Green Version]
  40. Fujs, D.; Vrhovec, S.; Žvanut, B.; Vavpotič, D. Improving the efficiency of remote conference tool use for distance learning in higher education: A kano based approach. Comput. Educ. 2022, 181, 104448. [Google Scholar] [CrossRef]
  41. Seo, Y.J.; Um, K.H. The asymmetric effect of fairness and quality dimensions on satisfaction and dissatisfaction: An application of the Kano model to the interdisciplinary college program evaluation. Stud. Educ. Eval. 2019, 61, 183–195. [Google Scholar] [CrossRef]
  42. Chen, X.; Geng, W. Enroll now, pay later: Optimal pricing and nudge efforts for massive-online-open-courses providers. Electron. Mark. 2021, 32, 1003–1018. [Google Scholar] [CrossRef]
  43. Yu, L.; Lan, M.; Xie, M. The survey about live broadcast teaching in Chinese middle schools during the COVID-19 Pandemic. Educ. Inf. Technol. 2021, 26, 7435–7449. [Google Scholar] [CrossRef] [PubMed]
  44. Oyelere, S.S.; Olaleye, S.A.; Balogun, O.S.; Tomczyk, Ł. Do teamwork experience and self-regulated learning determine the performance of students in an online educational technology course? Educ. Inf. Technol. 2021, 26, 5311–5335. [Google Scholar] [CrossRef]
  45. Notice on the Identification of National Quality Online Open Courses in 2019; Higher Education Division of Ministry of Education: Beijing, China, 2019.
  46. Opinions on the Implementation of Construction of First-class Undergraduate Courses; Ministry of Education: Beijing, China, 2019.
  47. Guidance for the Construction and Application of MOOCs in Schools of Higher Education; Innovation and Guidance Committee of Teaching Informatization and Teaching Methods of Higher Education of the Ministry of Education: Beijing, China, 2020.
  48. Quality Assurance System of UOOCs and MOOCs; University Open Online Courses: Shenzhen, China, 2018.
  49. Hossain, G. Rethinking self-reported measure in subjective evaluation of assistive technology. Hum. Cent. Comput. Inf. Sci. 2017, 7, 23. [Google Scholar] [CrossRef] [Green Version]
Figure 1. KANO model.
Figure 2. Quadrant diagram of Better-Worse coefficients (Note: M means “Must-be Quality”, O means “One-dimensional Quality”, I means “Indifferent Quality”, A means “Attractive Quality”).
Table 1. Functions and indicators of MOOC quality evaluation for landscape architecture.

Function Number | Function | Indicator Number | Indicator
A | Course organization | A1 | Course teaching objectives
A | Course organization | A2 | Course teaching design
A | Course organization | A3 | Course content organization
A | Course organization | A4 | Course teaching methods
A | Course organization | A5 | Course schedule
A | Course organization | A6 | Course teaching team capacity
B | Course resources | B1 | Richness of course resources
B | Course resources | B2 | Coverage of course resources
B | Course resources | B3 | Openness of course resources
C | Learning environment | C1 | Course image design
C | Learning environment | C2 | Course multimedia quality
D | Learning experience | D1 | Course knowledge development
D | Learning experience | D2 | Course competence development
D | Learning experience | D3 | Course emotional value development
D | Learning experience | D4 | Course interaction
E | Learning support | E1 | Platform access
E | Learning support | E2 | Platform running
E | Learning support | E3 | Platform update
Table 2. Sample questionnaire.

Forward question: In your MOOC study, how would you feel if the teaching team is competent?
Answer options: I like it / It must be / I am neutral / I can live with it / I dislike it
Reverse question: In your MOOC study, how would you feel if the teaching team is incompetent?
Answer options: I like it / It must be / I am neutral / I can live with it / I dislike it
Table 3. Basic information of research subjects.

Category | Group | Frequency | Proportion (%)
Gender | Male | 43 | 36.13
Gender | Female | 76 | 63.87
Educational background | Undergraduate | 90 | 75.63
Educational background | Postgraduate | 29 | 24.37
Table 4. Reliability test.

Dimension | Questionnaire Question No. | Reliability of Forward Questions | Reliability of Reverse Questions
Overall | 1–18 | 0.954 | 0.944
Course organization | 1–6 | 0.924 | 0.919
Course resources | 7–9 | 0.871 | 0.838
Learning environment | 10–11 | 0.902 | 0.854
Learning experience | 12–15 | 0.859 | 0.690
Learning support | 16–18 | 0.700 | 0.617
Table 5. Validity test.

Statistic | Overall Questionnaire | Forward Questionnaire | Reverse Questionnaire
Kaiser–Meyer–Olkin Measure of Sampling Adequacy | 0.887 | 0.896 | 0.934
Bartlett’s Test of Sphericity: Approx. Chi-Square | 3946.577 | 2020.020 | 1648.667
Bartlett’s Test of Sphericity: df | 630 | 153 | 153
Bartlett’s Test of Sphericity: Sig. | 0.000 | 0.000 | 0.000
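For readers who wish to run comparable reliability and validity checks on their own questionnaire data, a minimal sketch is given below. It assumes the 18 forward-question responses are scored on a five-point scale and stored in a pandas DataFrame named forward_df (a hypothetical variable, 119 respondents by 18 items), and it uses the factor_analyzer package for the KMO and Bartlett statistics. This is an illustration of the general procedure, not the authors’ exact computation.

    import pandas as pd
    from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a respondents-by-items DataFrame."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # forward_df: hypothetical DataFrame of forward-question answers (1-5 scale).
    alpha_overall = cronbach_alpha(forward_df)                  # cf. 0.954 in Table 4
    alpha_course_org = cronbach_alpha(forward_df.iloc[:, 0:6])  # questions 1-6, cf. 0.924

    kmo_per_item, kmo_total = calculate_kmo(forward_df)         # cf. 0.896 in Table 5
    chi_square, p_value = calculate_bartlett_sphericity(forward_df)

The same calls applied to the reverse-question answers would reproduce the remaining columns of Tables 4 and 5.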
Table 6. Classification of quality attributes. Rows give the answer to the forward question (if the product has this function); columns give the answer to the reverse question (if the product does not have this function).

Forward \ Reverse | I Like It | It Must Be | I Am Neutral | I Can Live with It | I Dislike It
I like it | Q | A | A | A | O
It must be | R | I | I | I | M
I am neutral | R | I | I | I | M
I can live with it | R | I | I | I | M
I dislike it | R | R | R | R | Q
Note: A means “Attractive Quality”, M means “Must-be Quality”, R means “Reverse Quality”, O means “One-dimensional Quality”, Q means “Questionable or contradictory answer”, I means “Indifferent Quality”.
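As an illustration only, the evaluation table above can be expressed as a simple lookup. The sketch below assumes both answers are given on the five-point scale used in Table 2; it is not part of the original study’s tooling.

    # KANO evaluation table (Table 6): rows = forward answer, columns = reverse answer.
    # Scale order: "I like it", "It must be", "I am neutral", "I can live with it", "I dislike it".
    ANSWERS = ["like", "must_be", "neutral", "live_with", "dislike"]
    KANO_TABLE = [
        # reverse:  like   must_be  neutral  live_with  dislike
        ["Q", "A", "A", "A", "O"],  # forward: like
        ["R", "I", "I", "I", "M"],  # forward: must_be
        ["R", "I", "I", "I", "M"],  # forward: neutral
        ["R", "I", "I", "I", "M"],  # forward: live_with
        ["R", "R", "R", "R", "Q"],  # forward: dislike
    ]

    def classify(forward: str, reverse: str) -> str:
        """Return the KANO category (A, O, M, I, R or Q) for one respondent and one indicator."""
        return KANO_TABLE[ANSWERS.index(forward)][ANSWERS.index(reverse)]

    # Example: liking the presence of a competent teaching team and disliking its
    # absence marks the indicator as One-dimensional for that respondent.
    assert classify("like", "dislike") == "O"

Aggregating these per-respondent categories over all 119 participants yields the percentages reported in Table 7.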
Table 7. Attribute statistics of traditional KANO model.

Function Number | Indicator Number | A (%) | O (%) | M (%) | I (%) | R (%) | Q (%) | Classification
A | A1 | 40.34 | 14.29 | 2.52 | 39.50 | 0.84 | 2.52 | A
A | A2 | 42.02 | 17.65 | 9.24 | 29.41 | 1.68 | 0.00 | A
A | A3 | 37.82 | 22.69 | 5.88 | 31.93 | 0.84 | 0.84 | A
A | A4 | 47.06 | 10.92 | 3.36 | 38.66 | 0.00 | 0.00 | A
A | A5 | 38.66 | 19.33 | 5.88 | 33.61 | 2.52 | 0.00 | A
A | A6 | 51.26 | 17.65 | 5.88 | 24.37 | 0.84 | 0.00 | A
B | B1 | 53.78 | 15.97 | 1.68 | 27.73 | 0.84 | 0.00 | A
B | B2 | 49.58 | 16.81 | 5.04 | 27.73 | 0.84 | 0.00 | A
B | B3 | 49.58 | 16.81 | 5.04 | 28.57 | 0.00 | 0.00 | A
C | C1 | 49.58 | 14.29 | 5.88 | 29.41 | 0.84 | 0.00 | A
C | C2 | 42.86 | 24.37 | 3.36 | 28.57 | 0.84 | 0.00 | A
D | D1 | 37.82 | 28.57 | 5.88 | 26.05 | 0.84 | 0.84 | A
D | D2 | 56.30 | 10.08 | 1.68 | 27.73 | 0.84 | 3.36 | A
D | D3 | 42.86 | 20.17 | 4.20 | 31.93 | 0.00 | 0.84 | A
D | D4 | 43.70 | 20.17 | 5.88 | 28.57 | 0.84 | 0.84 | A
E | E1 | 39.50 | 29.41 | 5.88 | 25.21 | 0.00 | 0.00 | A
E | E2 | 36.97 | 37.82 | 3.36 | 21.85 | 0.00 | 0.00 | O
E | E3 | 48.74 | 5.04 | 1.68 | 42.86 | 0.00 | 1.68 | A
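The Better (SI) and Worse (DSI) coefficients reported in Table 8 are conventionally derived from the category shares in Table 7 as SI = (A + O) / (A + O + M + I) and DSI = −(O + M) / (A + O + M + I), with R and Q answers excluded. The sketch below illustrates this calculation for indicator A1; the formula is the standard Better-Worse definition and is consistent with the values in Table 8, but it is offered as an assumption about the exact computation rather than the authors’ code.

    # Better-Worse coefficients from the per-indicator category shares in Table 7.
    # SI  (Better) = (A + O) / (A + O + M + I)
    # DSI (Worse)  = -(O + M) / (A + O + M + I)
    def better_worse(a: float, o: float, m: float, i: float) -> tuple[float, float]:
        valid = a + o + m + i          # R and Q answers are excluded from the denominator
        si = (a + o) / valid
        dsi = -(o + m) / valid
        return si, dsi

    # Indicator A1 from Table 7: A = 40.34, O = 14.29, M = 2.52, I = 39.50 (percentages).
    si_a1, dsi_a1 = better_worse(40.34, 14.29, 2.52, 39.50)
    print(round(si_a1, 4), round(dsi_a1, 4))   # 0.5652 -0.1739, matching Table 8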
Table 8. Attribute statistics of KANO model based on Better-Worse coefficients.

Function Number | Indicator Number | SI | Ranking | DSI | Ranking | Classification
A | A1 | 0.5652 | 17 | −0.1739 | 4 | I
A | A2 | 0.6069 | 14 | −0.2735 | 13 | O
A | A3 | 0.6154 | 13 | −0.2906 | 15 | O
A | A4 | 0.5798 | 16 | −0.1428 | 3 | I
A | A5 | 0.5949 | 15 | −0.2586 | 11 | O
A | A6 | 0.6949 | 3 | −0.2373 | 9 | M
B | B1 | 0.7034 | 2 | −0.1780 | 5 | A
B | B2 | 0.6695 | 8 | −0.2204 | 8 | A
B | B3 | 0.6639 | 9 | −0.2185 | 7 | A
C | C1 | 0.6441 | 11 | −0.2034 | 6 | I
C | C2 | 0.6780 | 6 | −0.2796 | 14 | M
D | D1 | 0.6752 | 7 | −0.3504 | 16 | M
D | D2 | 0.6930 | 4 | −0.1228 | 2 | A
D | D3 | 0.6356 | 12 | −0.2458 | 10 | O
D | D4 | 0.6496 | 10 | −0.2650 | 12 | M
E | E1 | 0.6891 | 5 | −0.3529 | 17 | M
E | E2 | 0.7479 | 1 | −0.4118 | 18 | M
E | E3 | 0.5470 | 18 | −0.0683 | 1 | I
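A common way to reclassify indicators from these coefficients is a quadrant split of the (SI, |DSI|) plane, as depicted in Figure 2. The sketch below shows that general procedure under one conventional labelling of the quadrants; the exact thresholds (for example, the mean SI and mean |DSI|) and the orientation of the quadrant labels should be read off the authors’ Figure 2, so this is an illustration rather than a reproduction of the study’s classification.

    # One common Better-Worse quadrant split; thresholds and quadrant labelling
    # follow a conventional layout and are assumptions for illustration (cf. Figure 2).
    def quadrant(si: float, dsi: float, si_threshold: float, worse_threshold: float) -> str:
        high_better = si >= si_threshold
        high_worse = abs(dsi) >= worse_threshold
        if high_better and high_worse:
            return "O"   # One-dimensional
        if high_better:
            return "A"   # Attractive
        if high_worse:
            return "M"   # Must-be
        return "I"       # Indifferent

    # Illustrative call with hypothetical thresholds (e.g. near the mean SI and mean |DSI|):
    print(quadrant(0.7034, -0.1780, si_threshold=0.65, worse_threshold=0.24))  # "A", cf. B1 in Table 8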