In this section, we first describe the generic course design for the complete set of courses and areas and then recount the chronological development of the innovations. This analysis evaluates the collaborative performance of the teachers in the cluster; further analysis could provide complementary information about each of the three semesters in the study period. We then report the outcomes and their immediate analysis, addressing the research questions and objectives established beforehand. Finally, a deeper discussion in the next section attends to specific factors and considerations related to each objective.
4.1. Course Design and the Implementation Enrichment Imposed by the COVID-19 Confinement
In March 2020, every course went through a transition: a redesign process that imposed the video class as the primary means of contact and the institutional CANVAS learning management system for plan adaptations. Each course remained unique according to its content requirements.
The designs implemented by each teacher included a teaching lab that was continuously modified [36] and followed the CCLT and Online Learning framework as previously established [17,28], while also considering some design elements from BLT and HLT [14,15]. To address our first research objective, we classified the design decisions, particularly comparing the disruptive features of the hybrid version adopted during the COVID-19 confinement against traditional face-to-face teaching.
The first transition at the beginning of the confinement in March 2020 showed that the design categories to be considered had to be extended in comparison with the corresponding traditional face-to-face course, even though the latter already had some BL elements. Thus, during the suspension week, all the classes became hybrid to a greater or lesser degree, depending on the electronic resources and activities available [51] and on the videoconferencing presence to which the face-to-face component shifted.
Figure 1 shows graphs exhibiting the transition (left to right), moving from few elements (red lines) in each category (green lines) for most of the courses in the traditional face-to-face design (left graph) to the more complex design (right graph). The transition involved additional constructions, including evaluation, mandatory technologies, video classes, and other critical course elements. In agreement with CCLT, methodologies explicitly had to introduce stimulating variations during the class sessions [17,28].
Finally, designs needed to incorporate contact among teachers and students to facilitate collaboration and social learning [15,28]. Thus, new emphases were placed on teaching, stimulus, presence, accompaniment, and assessment compared with face-to-face versions [8] or with hybrid versions retaining a face-to-face component (instead of the hybrid approach conducted through videoconferencing).
Compared with the face-to-face designs, additional constructions had to be considered (Figure 1, right). For instance, evaluation was extended, with some assessments incorporated as playful evaluations within the teaching plan. Question banks had to be prepared to prevent fraud, together with some required flexibility, instead of directly transferring a face-to-face evaluation into a videoconference setting. Particularly in Math and Physics, the need for fluid mathematical writing led to technologies involving tablets and electronic pencils as substitutes for the blackboard. Accessible tools, such as Excel, Mathematica, and Matlab, made it possible to construct demonstrations and visualizations, or even simulations in the more specialized cases. After the first week of videoconferencing, most of the teachers realized that flat classes based on writing, as in the face-to-face approach, were not very effective. Thus, they introduced learning methodologies, some of them supported by technology.
Another important aspect was the need to promote interactions not only between teacher and student but also among students as partners. Social learning was probably one of the most endangered elements during the pandemic [39], and thus additional effort had to be made to avoid such a loss, particularly in courses where complexity and diversity became extreme [52].
Thus, despite a faculty well prepared in online technologies [30], a sustained online class delivered by videoconference required new distinctions in the design. First, as previously mentioned, for further analysis, courses were conveniently grouped by affinity into the three academic areas. A layered design, in agreement with the outstanding dimensions of CCLT, allowed most of the elements involved to be classified. The first layer considers the learning networks in each course, including contact as well as activities fostering social learning among partners. The second layer considers adequate learning methodologies fulfilling the needs of the first layer and the concrete course requirements. Such methodologies then implemented social learning, but directed toward the needs of each course.
Finally, the third layer took into account the technologies and tools required to scaffold the first two layers, as well as the course orientation and the specific technologies required by professional practice. That structure is shown in Figure 2, where each main technology points to the design elements it supports, each one allocated in transverse layers. In addition, each academic area had differentiated elements according to its own needs [35].
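As an illustration only (the course name and entries below are hypothetical, not taken from the actual course records), such a layered structure could be captured for a single course as a simple data structure:

```python
# Illustrative sketch of the three-layer design described above for one
# hypothetical course; names and entries are examples, not the real records.
course_design = {
    "course": "Physics I",                              # hypothetical course
    "area": "PE",
    "layer_1_learning_networks": {
        "contact": ["Zoom sessions", "Remind announcements"],
        "social_learning": ["team projects", "peer discussion boards"],
    },
    "layer_2_methodologies": ["Flipped classroom", "Active learning"],
    "layer_3_technologies": ["CANVAS", "Geogebra", "Physics Toolbox Suite"],
}
```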
At the end of the first semester, the cluster noticed that additional complexity elements had appeared. In a first approximation based on an Online Learning framework [28], little consideration had been given to extended and sustained learning through videoconferencing.
Behavioural issues were not originally considered or assumed in the didactic design. Soon, collateral relations among some elements (not shown in the graphs of Figure 1) became evident. Some of them are shown in the graphs depicted in Figure 3 (gray edges). For instance, the Zoom videoconference system could promote teamwork and participation through some of its functionalities. Constructing a virtual classroom in each course was convenient for gathering, in a single LMS document, the navigation among activities, class plans, homework, and supporting resources. This was not only convenient; it also provided the flexibility and affordability of the entire course to support each student. Likewise, some other tools or technologies could play multiple roles in the course: visualization, Active Learning, etc. [36,53].
With those design distinctions, which were common classification elements inside the cluster regardless of the academic area to which a course belonged, each course evolved through the three periods, adopting a growing or improved set of methodologies and technologies [53]. In the following subsection, a quantitative analysis of the general implementations, recovered from the daily course documentation, is given. This is followed by an analysis of the students' outcomes compared with their perceptions of the teaching-learning plan.
4.2. Implementation Chronology, Development, and Categorization by Academic Area and Didactic Group
To fulfil the first objective of the research, we begin our analysis in the current subsection by depicting the global process of inclusion of methodologies and technologies. At the beginning of the lockdown, some methodologies and technologies were suggested by the technology training program, approximately in the terms of Figure 1 and in agreement with the activity records in CANVAS. In general terms, apart from Zoom and CANVAS, other technologies were suggested for sciences and engineering: Remind as a contact medium and software such as Mathematica, Matlab, and Geogebra for visualization.
As the semester advanced, other technologies and methodologies were shared and introduced by the teachers. In the teachers' cluster depicted here, close collaboration in terms of procedures, outcomes, needs, and implementation recommendations was permanently sustained, thus forming a learning community. Through the three semesters, different methodologies appeared in the different courses to fulfil the requirements demanded by each one in the different academic areas, providing increased opportunities for success in learning.
Figure 4 shows a detailed analysis and depiction of those methodologies and technologies as they were introduced for the first time in each area. Each technology then permeated into other courses, and even other areas, through dissemination in the teachers' cluster. The plot covers the months from February 2020 (when the first period of lockdown began) to June 2021 (the end of the third period, before the writing of the current report and the advent of the New Normal for education in Mexico).
Orange lines mark the end of each semester, while black lines mark the beginnings. Each academic area is depicted with a different colour: MS in blue, NM in red, and PE in green. Technologies implemented for the first time are remarked above the timeline and methodologies below it. The methodologies and technologies are grouped by affinity and point to the academic area in which they were first introduced.
The numbers in circles report the estimated total number of students impacted by each group of methodologies and technologies (such estimations were consistently obtained from the posterior register of each methodology or technology in CANVAS). The lower plot summarizes the smoothed, continuous implementation process with a gradual model increasing by monthly periods, which provides a perspective of the real progressive impact.
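As an aside, a minimal sketch of how such a gradual monthly model could be tallied is shown below; it assumes a hypothetical table first_introductions.csv with columns month and students_impacted, which are illustrative names rather than the authors' actual data files.

```python
# Cumulative monthly impact of newly introduced methodologies/technologies
# (hypothetical columns; the real estimation came from the CANVAS registers).
import pandas as pd

intro = pd.read_csv("first_introductions.csv", parse_dates=["month"])
monthly = (intro.groupby(pd.Grouper(key="month", freq="MS"))["students_impacted"]
                .sum()       # students impacted by items first introduced that month
                .cumsum())   # gradual model of increase by monthly periods
print(monthly)
```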
MS required a fluent way to present and share mathematical writing and diagrams, which were neither easy nor convenient to present on pre-built slides. Thus, tools such as Padlet or the i-Pencil were gradually integrated; however, they did not arrive before the teachers went through more traditional experiences during the first weeks of videoconferencing, such as physical writing on traditional whiteboards, handwriting, or inverted writing on transparent glass panels (some of them affordable in Zoom).
In addition, Screencasting resources made it possible to extend the number of exercises seen in class. Tools such as Excel, R, Mathematica, and Geogebra were also included to support visualization in Math. Individual and continuous practice was provided through tools such as Kahoot and Socrative, additionally promoting social learning and motivation. Those tools mainly supported the Flipped classroom methodology, a technique that allows a more recurrent social practice of learning during class.
For NM, although some courses by their nature already had a greater digital development (due to their BL focus) before the existence of the Hyperflex model versions, the concept of the virtual classroom was still implemented as in the other areas (the concentrator map of activities guiding all the course activities and their navigation). The students appreciated this type of space, which made it easier to understand the procedural and methodological aspects expected in the transition.
There, tools such as Python, Mathematica, and Matlab provided the basic support, and slideshows facilitated the concentration of the specific theoretical developments before each numerical or technical method was reviewed in the courses. Support for the programming component continued to be delivered through Screencasting in addition to the videoclass, which already existed in many of the previous courses.
PE represents most of the courses analysed in the current report. They faced two aspects: experimentation and complexity. For the first aspect, some experiments with homemade materials and directions given through Screencasting were implemented. Additionally, different simulation tools, such as Geogebra, Physics Studio, Tinkercad, and Verve, were used, and even programming, to connect the real experience of the phenomena with the theoretical components of the courses [54].
Tools such as the Physics Toolbox Suite allowed experimental measurements of motion, acceleration, magnetic field, etc., using the students' mobile phones [55]. This app substituted for some measuring devices or provided alternatives for measurement. The complexity aspect was partly addressed first with planned visualization using Mathematica and Matlab, and then with the realization of a simulation project. The Mathlab calculator, a useful mobile app, made it possible to effectively recover the use of the scientific calculator with editing capabilities, thus supporting students in its use with a minimum of mistakes and wasted time during the videoclass.
Methodologies such as the Flipped classroom, Active learning, and Exercise solving were implemented through tools such as Socrative to generate playful spaces of collaboration in which social learning was achieved. This practice was useful for bringing the students closer to the formal summative evaluations. As in MS, Screencasting also made it possible to extend the number of exercises seen in class, a practice inherited from experiences in NM. The use of Storytelling, combined with animated slideshows or demonstrative videos, was useful for capturing attention in those courses.
As a general practice, in some courses the advisory sessions were recorded to increase flexibility and to broadcast them to other students. A large bank of questions, to sustain remote evaluation allowing several attempts, was implemented in tools such as ClassMarker and Canvas. This allowed students to complete their learning while obtaining improved grades. Table 4 summarizes the specific technologies and methodologies used in each academic area in terms of their main use.
4.3. Learning Impact Measured through Final Grades Compared with the Student Opinion Recommendation
The second research objective deals with the impact of the resources and methodologies under the Hyperflex model. As previously stated, such impact should be measured not only through the summarized final grade but also in terms of how engaging the learning was for students, so a contrasting perspective is considered. Due to the diversity and different foci of the set of courses analysed, such an analysis should rely on the most comparable indicators available. One of those is the opinion of students, which strongly measures engagement.
In our institution, at the end of each semester, a uniform exit survey is applied to every student to evaluate the services and, particularly, the performance of each course design and teacher. For each course, a set of questions is applied as a function of its characteristics (theoretical, experimental, entrepreneurship, etc.). In any case, a global final question is always included: the recommendation of the teacher, covering the mastery, accompaniment, support, design, and methodology applied in the course delivered.
That question is common to all courses and teachers, and it is expected to be answered as a recapitulation of the different components assumed for each one. The survey is sometimes criticized because its result is supposedly related to the ease and flexibility of the course. However, in the implementations analysed here, flexibility was precisely one of the aspects that could be exhibited as something positive. In addition, the survey was expected to reflect the utility and value of the course delivered. Moreover, previous analyses of institutional exit surveys have shown that the outcomes mainly point to what education ought to be; thus, ease is punished, as is unnecessary complexity [56,57].
With this purpose, we propose to contrast the outcome of the final common question in the survey mainly with the average of the final grades, together with its standard deviation and other parameters. Other factors could be important in distinguishing the data set, for instance, the academic area, the period, and the main innovation considered in the course, revealing hidden behaviours during the lockdown.
The last aspect has been classified into three tracks according to what the teacher considered the main aspect innovated in the course: (a) introduction of supporting technology, (b) introduction of a disruptive methodology, and (c) design based on constructive and flexible evaluation. The corresponding data for the 66 groups considered in the study are synthetically represented in Figure 5, in agreement with the data shown in Table A2, Table A3 and Table A4.
The plot exhibits each group considered in a dispersion plot of the final grade average on the horizontal axis (the common scale is 0–100, but here it has been rescaled to 0–10 for easier comparison) against the average of the students' recommendation for the course (on a 0–10 scale, where 10 is the best recommendation). Each dot represents a group and is surrounded by a circle whose radius reflects the corresponding standard deviation of the final grades in that course.
Each dot is coloured according to its academic area: black (MS), yellow (NM), and orange (PE). In addition, the circles are coloured in agreement with the type of innovation: red (Technology), green (Methodology), and blue (Evaluation), and their edges are dotted, dashed, or solid according to the period to which each one corresponds. All corresponding scale legends are included on the left as a visual reference. The diagonal brown line marks the identity between the two main indicators on the axes.
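A minimal plotting sketch in the style of this dispersion plot is given below; it is not the code used to produce the published figure and assumes a hypothetical file course_groups.csv with columns grade_mean (already rescaled to 0–10), grade_std, recommendation, and area.

```python
# Sketch of a dispersion plot in the style described above (hypothetical data).
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("course_groups.csv")            # hypothetical file and columns
area_colour = {"MS": "black", "NM": "gold", "PE": "orange"}

fig, ax = plt.subplots()
for _, row in df.iterrows():
    x, y = row["grade_mean"], row["recommendation"]
    ax.scatter(x, y, color=area_colour[row["area"]], zorder=3)
    # Circle radius proportional to the standard deviation of the final grades.
    ax.add_patch(plt.Circle((x, y), row["grade_std"] / 10, fill=False, alpha=0.5))
ax.plot([0, 10], [0, 10], color="brown")         # identity line between indicators
ax.set_xlabel("Final grade average (rescaled to 0-10)")
ax.set_ylabel("Average student recommendation (0-10)")
ax.set_xlim(0, 10); ax.set_ylim(0, 10)
plt.show()
```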
The dispersion plot in Figure 5 indicates some possible trends in the data [58]. At first glance, several aspects become evident. First, the distribution shows an average recommendation that is higher than the average final grade value by at least one point. In general, recommendations also became higher during the lockdown period than before, by around half a point at least with respect to the historic registers (comparing the 2019 average recommendation for the courses in the Science department with the average for the groups reported between 2020 and 2021).
Most of the courses stand out due to their Methodology (green) rather than the other innovations, with Technology (red) as the second most recurrent. In addition, courses proposing Methodology as the innovation appear better recommended than those centred on Technology or Evaluation (despite the few data points for the latter). Differences among periods are not evident, or are not very clear in terms of significance (as in the case of MS).
A particular behaviour of those courses was the extended dispersion of the final grades, which apparently appeared together with the largest recommendation values (although other courses with a large dispersion still did not exhibit such behaviour). We analyse those aspects more deeply below. The aspects related to the significant dependence of the students' recommendation are analysed in Figure 6.
We are now interested in which variables significantly explain the students' recommendation, considering: the period (the semester in which the class was delivered), the number of students (S), the final grade average, its standard deviation, the type of innovation (Technology, Methodology, or Evaluation), and the academic area. One-factor ANOVA tests were then performed for each variable, obtaining the corresponding p-values, which are synthetically plotted in the radar plot of Figure 6a (note that, for better readability, the zero value of p is not at the origin); the blue dashed circle marks the significance level of the tests.
Taking a p-value below this level as denoting a sensible correlation [59], we noticed that only the number of students S, the final grade average, and its standard deviation were significant in explaining the variation of the students' recommendation. For some outliers, it will be interesting to know the specific situations behind them; such cases are discussed concretely in the next section.
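A minimal sketch of this kind of p-value screening is shown below; it assumes a hypothetical DataFrame with one row per course group and illustrative column names (recommendation, period, area, innovation, n_students, grade_mean, grade_std), and it bins the continuous variables into quartiles, which is an assumption rather than the authors' documented procedure.

```python
# One-factor ANOVA screening of the recommendation against each candidate variable.
# Continuous factors are binned into quartiles here; this is an assumption about
# how such factors could be handled, not necessarily the original procedure.
import pandas as pd
from scipy import stats

df = pd.read_csv("course_groups.csv")            # hypothetical data file

def anova_pvalue(data, factor, response="recommendation", bins=4):
    col = data[factor]
    if pd.api.types.is_numeric_dtype(col):
        col = pd.qcut(col, q=bins, duplicates="drop")
    groups = [g[response].values for _, g in data.groupby(col) if len(g) > 1]
    return stats.f_oneway(*groups).pvalue

factors = ["period", "n_students", "grade_mean", "grade_std", "innovation", "area"]
p_values = {f: anova_pvalue(df, f) for f in factors}
significant = [f for f, p in p_values.items() if p < 0.05]   # threshold assumed
```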
Figure 6a–c shows the corresponding dispersion plots of those significant variables against the students' recommendation values. The plots include red lines for the linear correlation under a single linear explanation model, revealing the kind of dependence (positive or negative). This analysis is useful for understanding the main trend behind the functional behaviour of each independent aspect. Although the academic area is not significant and its values are not discriminated in the analysis, we coloured the dots as a reference, in agreement with their colours in Figure 5.
Some immediate outcomes show that the students' recommendation depended positively on the final grade of each course and negatively on its standard deviation. Although the first is an expected outcome, note that students did not require the highest grades in order to assign the best evaluation to the teacher. However, they seemed to expect a narrower dispersion, probably because, in the circumstances, the effort of each student appeared relatively similar to that of their partners.
Another interesting, though not so strange, outcome is that larger groups tended to decrease the recommendation value given to teachers. Fitting a linear model between the students' recommendation and the significant variables, we obtained the corresponding Pearson correlation coefficient [59] (taking all the variables involved barely raises it, confirming the low significance of the remaining variables). Nevertheless, a more optimal second-order model in the three significant variables (number of students S, final grade average, and standard deviation of the final grades) gives a notably improved coefficient, roughly double the linear one (a similar second-order model with all variables raises it only slightly further).
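The comparison between the first-order and second-order models can be sketched as follows, again with the hypothetical column names used above; here the coefficient is computed between the observed and fitted recommendations, which is one reasonable reading of the text rather than the authors' exact procedure.

```python
# First-order versus second-order model of the recommendation in terms of the
# three significant variables (hypothetical column names).
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

df = pd.read_csv("course_groups.csv")                     # hypothetical data file
X = df[["n_students", "grade_mean", "grade_std"]].values
y = df["recommendation"].values

# Linear model: Pearson correlation between observed and fitted values.
linear = LinearRegression().fit(X, y)
r_linear = np.corrcoef(y, linear.predict(X))[0, 1]

# Second-order model: squared and interaction terms added before fitting.
X2 = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
quadratic = LinearRegression().fit(X2, y)
r_quadratic = np.corrcoef(y, quadratic.predict(X2))[0, 1]

print(f"R (linear) = {r_linear:.2f}, R (second order) = {r_quadratic:.2f}")
```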
In this analysis, we have considered only academic or demographic factors as causal factors for the students' recommendation evaluations of the courses as an indicator of effectiveness. Nevertheless, several external elements in the emotional scope were identified as important factors in the academic success of the students. Those factors are diverse, and they affected the students with different strengths, thus producing many long-term consequences [60]. We discuss some findings in our student community related to those aspects in the following subsection.
4.4. New Opportunities and Main Losses around Academic Life during the COVID-19 Lockdown
As part of our research, the third objective was motivated by the diversity of the emotional component among the student community, which changed, improved, conditioned, or altered their academic lives. Although certain analyses during the period have pointed to digital competencies as the key factor of successful learning, for teachers as much as for students [61], secondary factors arose in communities with sufficient command of educational techniques and technologies, and students could still benefit from other opportunities opened by the different conditions.
Then, at the end of the three-semester period, a sample of our students between the first and fourth semester, within the courses of the teachers' cluster, was randomly asked about the new opportunities that the changes boosted by the COVID-19 lockdown had produced for them. They were also asked about the incidents that mainly affected their academic lives negatively. Note that both answers were mandatory for each student, so both positive and negative facts were evaluated. The outcomes were then classified and reported proportionally and synthetically in Figure 7a,b, respectively.
For the positive aspects, the students mainly recognized the inclusion of more meaningful challenges or projects in their courses, a greater depth in their learning due to the time released by the confinement, and an increased sense of effort and discipline to fulfil their academic duties; these made up the largest share of the responses. The negative aspects were more diversified, mainly including failed courses, low grades, difficulties in adapting to teaching styles, and the complexity of the course challenges or projects.
Those aspects accounted for most of the responses, while the remainder appeared equally distributed among academic, technological, and family issues. Note how some of these appear as opposite facts, clearly corresponding to responses from different parts of the group of students surveyed.
In the following section, we discuss these findings by objective. We relate and extend some of the facts presented here, thus giving a closer and more concise interpretation of the outcomes and of the quantitative analysis developed above.