Article

Course Evaluation for Low Pass Rate Improvement in Engineering Education

Department of Electrical and Electronic Engineering Science, University of Johannesburg, Johannesburg 2092, South Africa
Educ. Sci. 2019, 9(2), 122; https://doi.org/10.3390/educsci9020122
Submission received: 3 April 2019 / Revised: 12 May 2019 / Accepted: 20 May 2019 / Published: 2 June 2019
(This article belongs to the Special Issue Towards Excellence in Engineering Education)

Abstract

Course evaluation is a process that comprises the evaluation of lecturers’ teaching performance and the moderation of their course material. These two procedures are usually implemented, whether officially by the faculty of engineering or on lecturers’ own initiative, to help identify lecturers’ strengths and weaknesses and the ways forward to improve their performance and quality of teaching. This paper presents different ways of implementing these two criteria from students’ and professionals’ perspectives. Official questionnaires from the faculty of engineering, personal questionnaires using Google surveys, Moodle, and specially designed forms have been used for moderation and evaluation. The evaluation process is the core of the feedback procedure followed by universities to monitor the teaching quality of their staff. Satisfactory results show that such a process can improve lecturers’ teaching performance, the quality of course material, students’ satisfaction and performance, and ultimately the pass rate of the class.

1. Introduction

Engineers play different contextual roles in industry and academia. In the latter, they not only teach students, but are also regarded as mentors and expected to extend open door policies to their students.
The teaching practice should be informed by the lecturers’ working environment, namely the Faculty of Engineering, and their professional statuses as educators in the 21st century. Lecturers should be motivated by the importance of providing students with authentic learning experiences which are relevant to contemporary concerns, and place high value on developing responsible engineers who are insightful, can work independently, have good problem solving skills, and can apply and adapt their knowledge to unexpected and new situations.
It is known that the evaluation [1] of lecturers’ teaching quality is usually conducted for two reasons: the improvement of practice, since experience can be built up from the feedback received; and promotion within the faculty of engineering, since university staff-promotion policies usually require proof of teaching evaluation. The evaluation can be conducted by students or by professionals (either colleagues or visiting experts appointed by the faculty of engineering). In the case of students’ evaluations, the most important benefit lecturers can gain is feedback to help them refine their courses and teaching practices to provide students with better learning experiences [2,3,4].
The question here is: how important is evaluation in improving the low pass rates of the offered courses [5]? Although the evaluation’s impact on student success cannot be demonstrated directly, it can help lecturers improve their teaching style and upgrade their course material, which usually has a direct impact on students’ performance. Evaluation processes can thus accurately identify lecturers’ strengths as well as the areas in which they need to improve.
Usually, a course can be divided into four major parts: the prescribed textbooks, lecture notes, tutorials, and practicals. Another item that makes a course easier to follow and more comprehensive is the study guide. Study guides are very important in the organization of the course: they are its road map and can be seen as a contract between lecturers and their students.
Prescribed textbooks are usually the official books for the course. Lecturers, after obtaining faculty approval, list these books as essential for reading and reference. These books will help students to focus better and supplement lecture notes, creating better chances of student success in the course.
Lecturers are advised to avoid recommending classic textbooks because of their outdated content and applications. These books were basically designed for students who had very limited access to computers and digital information. Modern engineering textbooks should be user-friendly, with new and modern applications inspired by the modern engineering world. Tutorials in these textbooks ought to be designed to solve real-life problems using pedagogical approaches that help students understand the course through their own studies and revision.
In order to give students greater variety and inspire them to think out-of-the-box and not to rely on what is prescribed to them, lecturers could also recommend textbooks written by different authors and prescribed by other universities. The recommended textbooks will not replace lecturer-prescribed textbooks but give a chance to students to expand their horizons through exposure to something different. Usually, these textbooks are provided to students in the form of e-books for no extra expense.
Good design and presentation of lecture notes and course slides, despite the brevity of the latter, play an important role in making lectures easy to understand and comprehensive. Succinct, well-summarized lecture notes help lecturers cope with the limited time allocated per session and facilitate revision for students.
Lecture notes should be designed and prepared in an attractive manner and the layout of the slides helps students psychologically follow the content when lecturers are busy presenting. The slides should have an easy logical flow and should be judiciously interspersed with some proverbs, photographs, or cartoons in line with the context of the lecture.
This added entertainment aspect could also include information related to the content of the course gleaned from famous researchers or well-known scientists, which adds quality to the design of the slides. The extra information inspires students and helps them see the course from a real-world perspective.
There is a famous quote from Socrates: “I cannot teach anybody anything, I can only make them think.” This quote describes the philosophy behind tutorials at engineering faculties. The main purpose of a tutorial is to give students a chance to develop their individual capacities to think deeply about engineering problems and thereby build their confidence.
Tutorials also encourage teamwork among students when they meet in small groups and discuss specific topics related to the subject matter of the course. Engineering tutorials which involve group work are appealing since they provide opportunities for students to practice and develop collaborative skills [6]. In a tutorial class, the lecturer will encourage interaction and participation in the discussion.
As tutorials are very important in mediating the course by helping students grasp the unclear concepts, the lecturer should link the problems given in tutorials to the theory in the lecture notes. Solutions to the most important problems should also be made available to students to enhance their understanding of the lecture material.
The overall goal of engineering education is to prepare students to practice engineering [7]. Therefore, practicals in engineering education play an important role in developing skills that will assist students to be ready for the professional engineering environment.
Engineering faculties consider laboratories an essential part of undergraduate programs. Laboratory work is an established part of engineering education courses, intended to produce skilled and highly competent engineers for industry, and it enables students to integrate easily and quickly into industry.
Practicals enable students to link theoretical concepts learned in class to real-life applications. For example, practicals designed for the Signals and Systems and Telecommunications course constitute either software programming projects or hardware build projects or a combination of both.
The architecture of the proposed procedure for improvement of low class pass rate is depicted in Figure 1. Feedback is the core of the system. The teaching and course material evaluations are means to provide feedback about the course quality and thus help lecturers to improve and upgrade what is necessary in order to provide students with a better educational environment which will lead to better class pass rates.
Google’s survey service simplifies communication between lecturers and students and summarizes the collected data based on students’ opinions and views. Figure 2 is one of the surveys [8,9,10] administered to students regarding teaching expertise and course material. Lecturers also use Moodle [11,12] to reach decisions on meeting and test dates. This helps create a sort of cloud community that maintains open and flexible access and communication between lecturers and students, even beyond the physical university.
The paper is organized as follows. In Section 2, different evaluation processes are presented. Section 3 covers the process of moderation of the course material from the students’ perspective and professional academics’ perspective. Finally, a conclusion summarizing the achievements which led to the improvement of the class pass rate is presented in Section 4.

2. Teaching Evaluation

Teaching evaluations are usually conducted for two reasons: to improve practice and to assist in the faculty promotion process. However, the most important benefit lecturers can gain is feedback to help them refine their courses and teaching practices to provide students with better learning experiences [13]. Teaching evaluation is important in refining the teaching excellence of any lecturer. Another important reason for teaching evaluation is the improvement of the class pass rate and thus the faculty’s throughput. Although evaluation cannot be directly linked to throughput, it can assist lecturers in improving their teaching styles and upgrading course materials, which invariably impacts students’ performance.
Taking this into consideration, lecturers should conduct three types of evaluations, namely students’ evaluations, peer evaluations by colleagues, and evaluations by international guests and experts.

2.1. Students’ Evaluation

Students benefit equally from proper classes and from research supervision, and lecturers should be well prepared on both fronts. Both types of evaluation are described below.

2.1.1. Students Observe Lecturers (SOL)

Students Observe Lecturers (SOL) is an application designed and developed by the author to give students the ability to observe the lecture flow and send live comments to the lecturer, who can then adjust the lecture’s speed and flow. The App is installed on both the lecturer’s and the students’ computers. Student numbers and computer IP addresses are registered in the App to secure the communication between students and lecturer within the class session. Many students are shy by nature and lack the courage to interrupt a lecturer in the middle of a lecture to ask a question. From his teaching experience, the author has also realized that some students lose focus in the middle of a lecture due to disruption or the pace the lecturer follows. To give students the chance to slow down or catch up with the lecture, the application lets each student click on one of several options to evaluate the flow of the lecture. Results appear instantaneously on the screen of the lecturer’s computer, which the lecturer should check from time to time to gauge the flow and the students’ response. Figure 3 and Figure 4 show samples of the proposed SOL App.
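The core tallying logic of a SOL-style tool can be sketched as follows. This is a minimal, hypothetical model only (the class name, option labels, and interface are our own, not taken from the actual App): each registered student may press one option, pressing again replaces the earlier vote, and unregistered identifiers are rejected, mirroring the App’s restriction to known student numbers.

```python
from collections import Counter

class FeedbackTally:
    """Tally live lecture-pace feedback, one current vote per student.

    Hypothetical sketch of the SOL idea: only registered student numbers
    may vote, and a student's newest vote replaces their previous one.
    """

    OPTIONS = ("too fast", "okay", "too slow")

    def __init__(self, registered_students):
        self.registered = set(registered_students)
        self.votes = {}  # student number -> current option

    def vote(self, student_no, option):
        # Reject unregistered students and unknown options.
        if student_no not in self.registered or option not in self.OPTIONS:
            return False
        self.votes[student_no] = option
        return True

    def summary(self):
        """The counts a lecturer would glance at on screen."""
        counts = Counter(self.votes.values())
        return {opt: counts.get(opt, 0) for opt in self.OPTIONS}

tally = FeedbackTally(["s1", "s2", "s3"])
tally.vote("s1", "too fast")
tally.vote("s2", "too fast")
tally.vote("s2", "okay")      # s2 changes their mind; old vote replaced
tally.vote("s9", "too slow")  # unregistered: ignored
print(tally.summary())        # {'too fast': 1, 'okay': 1, 'too slow': 0}
```

In a real deployment this logic would sit behind the networked App, with the per-student vote replacement preventing any one student from skewing the on-screen counts.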

2.1.2. Course Evaluation

Teaching evaluations from students have two major goals. The first is to assess the performance and teaching quality of lecturers and to provide them with insight on what they are doing well and how they need to improve. The second is to develop students’ responsibilities towards their faculty through taking part in the evaluation process in order to improve teaching quality in the faculty.
Considering the above, lecturers should conduct surveys in which students participate anonymously to evaluate their courses. In this paper, the author, who lectures a third-year course on signal processing (SIG3B01), conducted a survey based on a questionnaire that elicited students’ overall opinions of the course, and specifically of the related course material such as slides, tutorials, practicals, and prescribed textbooks. Other questions concerned their opinions of the types of assessments and practicals on offer and of the way in which their marks are calculated. Figure 5 shows the students’ evaluation of the courses that he teaches.

2.1.3. Research Evaluation

In the author’s experience, effective teaching is predicated on guidance and research; students need to be trained in both research methods and problem solving to be considered properly educated. The following are examples from different categories of students whom the author has had the pleasure of supervising in undergraduate, masters, and doctoral degrees. Figure 6 and Figure 7 are samples of their personal evaluations of his research supervision.

2.2. Peer-Teaching Evaluation

A lecturer-to-lecturer evaluation is a means of obtaining accurate information about a colleague, based on the premise that an evaluation from someone with experience in the same field, who knows the lecturer’s work ethic and behavior, results in an ultimately more useful and accurate evaluation. It also has the potential to develop lecturers’ working practices and help them understand their colleagues’ points of view [14,15,16].
Another benefit of lecturer-to-lecturer evaluation is the building of good working relationships between colleagues, which creates a best-practice environment throughout the university.
A few colleagues from different departments, universities, and different countries were invited to evaluate the author’s teaching, as shown in Table 1.

2.2.1. Local-Teaching Evaluation

Considering the benefits of peer teaching evaluation mentioned above, the author asked colleagues from different schools in his university and other colleagues from other universities to evaluate his teaching performance and to provide feedback.
The author asked the heads of department of the Department of Electrical and Electronic Engineering Technology and the Department of Civil Engineering Science at his university, the University of Johannesburg, to attend his lectures and evaluate his teaching style and course material. A questionnaire evaluating the quality of his teaching was given to them. A similar questionnaire was given to a colleague from the School of Electrical Engineering at the University of the Witwatersrand, another university in South Africa.
The reasons behind his choices were simple. Firstly, he needed feedback from a colleague in the same field of expertise and same school who is familiar with his curriculum and internal policies, as was the case with the Department of Electrical and Electronic Engineering Technology. Secondly, he also needed the opinion of someone who was from the same faculty but from a different school with different curricula, as was the case with the Department of Civil Engineering Science.
Thirdly, he needed an evaluation by a colleague from another university with different curricula and engineering programmes but from the same field of expertise, as was the case with the School of Electrical Engineering at the University of Witwatersrand. A sample of the questionnaire and evaluation form is presented in Figure 8.

2.2.2. International Teaching Evaluation

In line with the earlier explanation, lecturers should try to get an international evaluation regarding their course materials and lecturing skills. In this context, the author took the opportunity to invite two international professors on two different occasions to attend his lectures to get their opinions about his teaching style, course material, and the teaching environment he provides to his students.
The author had the opportunity to invite a professor from Armenia. A sample of the questionnaire and evaluation is presented in Figure 9.

2.2.3. Self-Teaching Evaluation

As academics, lecturers should acquire benefits from overseas institutions either via research collaborations or by attending conferences. These are very important opportunities to gain exposure to different professional environments and to develop communication skills through interactions with academics from different parts of the world. Exploring opportunities to lecture at international universities and to get feedback from their students and staff is also very important in the professional career of an academic. The author had the opportunity to teach a course in 2014 as a visiting researcher at the University of Duisburg-Essen in Germany. The faculty of engineering approached him to design and lecture a Master’s degree course on Information Theory. Feedback from the University of Duisburg-Essen about his teaching experience is presented in Figure 10.

3. Course Material

This section discusses the university’s moderation philosophy and presents evidence of its practice. Universities usually stipulate a formal moderation procedure. In this paper, the author has extended this policy by approaching students and local and international colleagues to evaluate his courses for the sake of enhancing his teaching expertise.
In his case, the author’s faculty curriculum comprises two types of modules or courses, namely core/fundamental modules and exit-level modules. The Engineering Council of South Africa (ECSA), the watchdog for engineering curriculum quality, calls final-degree courses exit-level outcomes (ELOs). Exit-level modules are assessed and moderated twice, both internally and externally by someone from another university. This entails proper moderation of all exam papers, module content, and answer sheets. Non-exit-level modules can be moderated internally by colleagues, as is the case with the author’s own modules, which are third-year modules. Files with all course material evidence, called ECSA files, are prepared by lecturers, and moderators are expected to inspect these for evidence of course teaching.

3.1. Students’ Moderation

3.1.1. Textbooks

The recommended textbook is a further benefit, providing an alternative for students who cannot afford to buy the prescribed textbook in the first place. Although prescribed textbooks are available in the university’s libraries, many students are unable to access them because of the limited number of copies relative to the number of students registered for the course.
From experience, another way in which lecturers can help students, especially those who are struggling financially and cannot afford very expensive engineering textbooks, is to design their own textbooks. In this regard, the author designed his own textbook and made it available to his students on Blackboard, a course management website. This e-book was inspired by the recommended textbook and was intended to build a strong link between the slides of his lecture notes and the prescribed textbook.
The author makes a direct link between the content of the slides used in class and the corresponding content in the prescribed textbook. The results were very positive: students liked the idea, and his e-book became more widely read and accessible than the other available textbooks.
The author believes lecturers should always obtain feedback by getting their students to evaluate lecturers’ initiatives and ideas. In this regard, he conducted a survey among his students to gauge their responses to the prescribed textbook, the recommended textbook, and his own textbook. Figure 11, Figure 12 and Figure 13 show the reaction of his students to these aspects.

3.1.2. Lecture Notes

From his teaching experience, the author always designs his notes to be structured in a manner that promotes a logical, elucidatory flow, helping students who “get lost” during the lecture to catch up. This is achieved by inserting short problems between subsections for students to solve. These questions help students consolidate the content of the preceding slides and give others a chance to catch up during the time reserved for such applications. At the end of each lecture, an example ought to be given that summarizes all sections in the slides and promotes clear understanding of the lecture.
Figure 14 shows the results of a survey conducted among the author’s students: their reactions, opinions, and ratings of his lecture notes and slides.

3.1.3. Tutorials

From the author’s lecturing experience, the best way to teach students is to create a competitive atmosphere among them and give prizes. Students love competing with each other, especially when rewards are on offer. In order to help students arrive at the correct answers, the lecturer should ask relevant questions. Brain-storming challenging questions makes students feel that they are in class not just to take notes and leave, but to take part in finding solutions and being proud of that achievement. Students should know that active participation in class and tutorials will ease their revision at home.
A survey was conducted among the author’s students to gauge their opinions and rating of his tutorials. The results are shown in Figure 15.

3.1.4. Practicals

Lecturers should pay attention to the capacity of students to handle practicals [17,18,19].
It is very important to recognize that students come from different backgrounds and possess different skills as a result of their secondary educational experiences. Many students come from underprivileged areas where access to computers is difficult or even impossible, so programming skills differ from one student to another. The same applies to hardware practicals.
Based on the above, a practical that caters for different types of students will be fair to all of them. In this regard, a survey was conducted among the author’s students to gauge their opinions on the quality of his practicals and their preferred type of practical: hardware, software, or a combination of both. Interestingly, although programming was the least preferred choice, students accepted it alongside hardware implementation because they know that engineers ought to improve their programming skills to be ready for industry.
Figure 16 and Figure 17 respectively show, firstly, the ratings of his students regarding the quality of his practicals and, secondly, their preferences regarding the three different types of practicals and their best choices.

3.1.5. Class Tests

Reduced student workloads and a stress-free assessment atmosphere help to improve throughput rates. This can be achieved by designing assessment schemes that allow both students and lecturers more time and flexibility to prepare for courses.
The department of Electrical and Electronic Engineering Science has changed its assessment strategy from the traditional summative assessment model consisting of semester tests and exams to a more fine-tuned outcomes-based continuous assessment model [20]. Each module is divided into a set of outcomes which encompass key knowledge areas and can be regarded as a chapter with a common theme. This system is considered to offer optimum efficiency for knowledge acquisition and serves to demonstrate our students’ capabilities to pass all knowledge areas in each module. During the course of the semester, students are given three small formative assessments for each module outcome. To pass a module outcome, a student has to achieve a 50% mark in two of the assessments or a 70% mark in one of the assessments.
The philosophy is that a student can fail one opportunity and use the experience gained to pass subsequent assessments. The 70% threshold was instituted to allow students who have mastered given outcomes the opportunity to demonstrate their knowledge once and then be able to focus on the remaining work.
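The outcome rule described above can be expressed as a small check. This is a minimal sketch: the 50% and 70% thresholds and the two-of-three condition are taken directly from the text, while the function and parameter names are our own.

```python
def outcome_passed(marks, pass_mark=50, mastery_mark=70):
    """Departmental rule as described: a module outcome is passed with
    at least 50% in two of the three assessments, or at least 70% in a
    single assessment."""
    return (sum(m >= pass_mark for m in marks) >= 2
            or any(m >= mastery_mark for m in marks))

print(outcome_passed([40, 55, 60]))  # True: two marks at or above 50%
print(outcome_passed([72]))          # True: mastery demonstrated once
print(outcome_passed([45, 49, 30]))  # False: outcome not yet passed
```

The second case illustrates the 70% provision: a student who demonstrates mastery in the first assessment need not sit the remaining opportunities for that outcome and can focus on the remaining work.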
The author has been using outcome-based assessments since 2011. In 2015, he had the opportunity to lecture three courses to third-year students, namely Signal and Systems (SST3A11) in the first semester, and Digital Signal Processing (SIG3B01) and Telecommunications (TEL3B01) in the second semester. After almost five years of using outcome-based assessments, the author decided in 2015 to evaluate this assessment scheme and to develop an assessment scheme using different assessment styles. This was done to avoid the heavy load caused by the outcome-based assessment and the types of questions given to students.
In his first semester course, Signal and Systems, the author applied the departmental assessment module, treating the practicals as outcomes on their own. The scheme for calculating student marks is depicted in Table 2.
In the second semester, for the SIG3B01 module, the author applied a different assessment scheme from the one used with SST3A11: he retained the three assessment opportunities to meet the ECSA requirements but treated the practicals as one of the assessments. The reason for reducing the number of assessments was that they create a heavy load on students and affect their results, and therefore the throughput rate. Another modification was to offer a greater variety of assessments rather than focusing only on problem solving and derivation. He introduced a multiple-choice type of assessment to cater for students who are not comfortable with problem solving as the only type of assessment. Since different types of questions require different time allocations, he adjusted the percentage each assessment contributes to the final mark. The final assessment weights are shown in Table 3.
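A weighted scheme of this kind reduces to a simple weighted average. The sketch below uses purely illustrative weights (the actual percentages are those listed in Table 3 and are not reproduced here), with a multiple-choice component carrying a smaller share to reflect its shorter time allocation.

```python
def final_mark(components, weights):
    """Weighted final mark from per-assessment marks (out of 100).

    `weights` must sum to 1.0; the values used below are hypothetical,
    not the actual Table 3 percentages.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(components[name] * w for name, w in weights.items())

# Hypothetical weights for a scheme with two written assessments,
# a multiple-choice assessment, and the practicals:
weights = {"assessment1": 0.3, "assessment2": 0.3, "mcq": 0.2, "practicals": 0.2}
marks = {"assessment1": 60, "assessment2": 70, "mcq": 80, "practicals": 90}
print(round(final_mark(marks, weights), 1))  # 73.0
```

Adjusting the weight dictionary is all that distinguishes the SST3A11, SIG3B01, and TEL3B01 schemes in this formulation.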
In the case of the TEL3B01 module, the author kept the same assessment style as for SIG3B01 but moved the practicals on their own—not as in the case of SST3A11—into a small project which contributed a certain percentage to the final mark of the module. The idea behind this was to give students a chance to do a separate project for submission at the end of the semester while taking advantage of the practical allocated times to do revision or homework. This new model, to be consistent with the ECSA’s assessment requirement, needed a third assessment. The author thus introduced a quiz which carried a lower percentage to accommodate the rest of the assessment types. The final assessment weights are shown in Table 4.
The author conducted a survey among his students to gauge their preferences and to assess how comfortable they were with each of the assessment types. He preferred not to rely on the results only but wanted them to express their views on this matter to assist in improving the proposed assessment schemes.
Figure 18, Figure 19 and Figure 20 provide information on student choices and the type of scheme that helped to improve their marks. Figure 18 illustrates that students prefer the assessments tool used for SIG3B01 as the scheme that best suits an outcomes-based approach. From Figure 19, it is clear that the assessment tool used with SIG3B01 is the one they feel comfortable with. Figure 20 shows that multiple-choice assessment is the best tool to help students improve their marks.

3.2. Professional Moderation

This section offers an explanation of the author’s moderation philosophy in the context of continuous evaluation of his course content by local as well as national and international colleagues, at his request. Table 5 gives an overview of his approach, and the statistics that are offered as evidence of his moderation.
The author approached colleagues and academics from different departments, universities, and countries to moderate his course material, as shown in Table 5, in order to obtain a broad range of opinions from academics with different backgrounds. He asked colleagues from different departments of his faculty of engineering, and also sought a line-management perspective from his head of department. He approached colleagues from the University of the Witwatersrand, which has very strong ties with industry, and the University of Pretoria, which has a longer history than his own university. At the international level, he approached academics from North Africa (Tunisia) and from Europe and Asia (Italy and Oman).
The author personally prepared a special moderation form, which was sent to the moderators together with his course material: the study guide, practical guide, lecture notes, tutorials, and his textbook. Figure 21, Figure 22 and Figure 23 are examples of moderation reports by academics from local and international universities. The moderation forms were designed by the author, with questions covering the course material as explained earlier.
This kind of moderation is informed by his belief that lecturers should challenge themselves to induce innovative thinking. It also builds self-confidence to know that your course material has been subjected to expert evaluation and acknowledged by colleagues whom you respect both locally and internationally.

4. Discussion

In this section, we present the benefits of course moderation and teaching evaluation and their effect on the students’ performance and class pass rate. The feedback obtained through students and professionals is the key to improvement of teaching quality and class pass rate.

4.1. Course Evaluation

Students’ teaching evaluations have two major goals. The first is to evaluate the performance and teaching quality of their lecturers and to provide them with insight into what they are doing well and where they need to improve. The second is to build within students a sense of responsibility towards their university by taking part in the drive to improve teaching quality for future enrolled students.
Taking into consideration the feedback from the different surveys presented in the previous sections, the author conducted a survey among students to evaluate overall the three courses he lectured: Signals and Systems (SST3A11), Digital Signal Processing (SIG3B01), and Telecommunications (TEL3B01).
Figure 24 and Figure 25 show the students’ evaluations of the analog and digital signals and systems courses, respectively. The content of both courses focuses mainly on transforms and filters. It is clear from the figures that students’ ratings are close for both courses, at higher than 90%. With regard to the TEL3B01 module, which deals with analog modulation, the extensive theory makes the course less attractive than the Signals and Systems courses; Figure 26 shows a student rating of more than 80%. These results show how important it is for lecturers to take into consideration the feedback from students and colleagues in order to improve their courses.

4.2. Pass Rate

Asking students and colleagues to evaluate one's course helps improve it, because preparing a high-quality, friendly educational environment for students is one of the most important factors in improving the class pass rate. Students' opinions and interaction with professionals from local and international universities help make a course more fruitful. Providing more consultation time and revision sessions to help students catch up before examinations, and helping underprivileged students with free hard copies of the lecture notes, play significant roles in making the course accessible to students.
Figure 27 presents comparative results for the author's modules SST3A11, SIG3B01, and TEL3B01 lectured in 2015, summarizing the benefit of the author's teaching philosophy in improving the class pass rates.
Figure 28 presents comparative results for the author's modules SIG3B01 and TEL3B01 lectured since 2010, summarizing the benefit of improving the author's teaching skills and updating his teaching philosophy on the class pass rates.

5. Conclusions

The improvement of the class pass rate was the result of the most important procedure any lecturer should follow: receiving feedback. Feedback in education, whether from students or professionals, helps tremendously in improving the teaching and learning skills of any lecturer. Feedback on the lecturer's teaching style and skills, on the teaching philosophy followed, and on the quality of the course material contributes enormously to improving the quality of the course and of the teaching environment, and thus to improving classes with low pass rates.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

Figure 1. System architecture of the proposed procedure.
Figure 2. Google Survey for teaching excellence evaluation.
Figure 3. Students observe lecturers (SOL) feedback to the lecturer.
Figure 4. SOL students' comments.
Figure 5. Students' evaluation of the SIG3B01 module.
Figure 6. Research evaluation by undergraduate final year project student.
Figure 7. Research evaluation by Master's student.
Figure 8. Peer evaluation from the University of the Witwatersrand.
Figure 9. Peer evaluation from VMware, Armenia.
Figure 10. Evaluation letter from Duisburg-Essen University, Germany.
Figure 11. Students' evaluation of the prescribed textbook.
Figure 12. Students' evaluation of the recommended textbook.
Figure 13. Students' evaluation of the author's proposed e-Book.
Figure 14. Students' evaluation of lecture notes.
Figure 15. Student evaluation of tutorials.
Figure 16. Students' evaluation of the practicals.
Figure 17. Students' preferences for different types of practicals.
Figure 18. Student evaluation of module assessment schemes.
Figure 19. Student preferences for the assessment schemes.
Figure 20. Student preferences for different types of assessment.
Figure 21. TEL3B01 module moderation report by a colleague from the School of Electrical Engineering at Wits University.
Figure 22. SIG3B module moderation report by the HOD.
Figure 23. SIG3B module moderation report by a colleague from ENIS, Sfax University, Tunisia.
Figure 24. Student evaluation of the SST3A11 module.
Figure 25. Student evaluation of the SIG3B01 module.
Figure 26. Students' evaluation of the TEL3B01 module.
Figure 27. Class pass rates of different modules.
Figure 28. Class pass rate from 2010 to 2018.
Table 1. Peer-teaching evaluators.

| Course | South African Universities: Author's University | South African Universities: Local University | International University |
|---|---|---|---|
| SIG3B01 | A colleague from another department, Civil Engineering Science, University of Johannesburg, was invited to evaluate the author's teaching performance. | A colleague from another university, School of Electrical Engineering, University of the Witwatersrand, Johannesburg, was invited to evaluate the author's teaching. | A colleague from an international university, Duisburg-Essen University, Germany, was invited to evaluate the author's teaching. |
Table 2. Scheme used for assessment of the SST3A11 module.

| Assessments | Kind of Assessment | Assessment Details | Assessment Weight | Outcome Weight |
|---|---|---|---|---|
| Outcome A | | | | 25% |
| Assessment 1 | Writing assessment | Problem Solving and Derivation | 70% Exemption; 0.7 Max 1 + 0.3 Max 2 | |
| Assessment 2 | Practical | Problem Solving and Derivation | | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | | |
| Outcome B | | | | 25% |
| Assessment 1 | Writing assessment | Problem Solving and Derivation | 70% Exemption; 0.7 Max 1 + 0.3 Max 2 | |
| Assessment 2 | Practical | Problem Solving and Derivation | | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | | |
| Outcome C | | | | 25% |
| Assessment 1 | Writing assessment | Problem Solving and Derivation | 70% Exemption; 0.7 Max 1 + 0.3 Max 2 | |
| Assessment 2 | Practical | Problem Solving and Derivation | | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | | |
| Outcome D | Practicals: reports and Matlab programming | | | 25% |
| Final Mark | Average (Outcome A + Outcome B + Outcome C + Outcome D) | | | 100% |
Table 3. Structure used for assessments of the SIG3B01 module.

| Assessments | Kind of Assessment | Assessment Details | Assessment Weight | Outcome Weight |
|---|---|---|---|---|
| Outcome A | | | | 33% |
| Assessment 1 | Writing assessment | Multiple-Choice + Theory | 30% | |
| Assessment 2 | Practical | Report + Demonstration | 30% | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | 40% | |
| Outcome B | | | | 33% |
| Assessment 1 | Writing assessment | Multiple-Choice + Theory | 30% | |
| Assessment 2 | Practical | Report + Demonstration | 30% | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | 40% | |
| Outcome C | | | | 33% |
| Assessment 1 | Writing assessment | Multiple-Choice + Theory | 30% | |
| Assessment 2 | Practical | Report + Demonstration | 30% | |
| Assessment 3 | Writing assessment | Problem Solving and Derivation | 40% | |
| Final Mark | Average (Outcome A + Outcome B + Outcome C) | | | 100% |
Table 4. Scheme used for the TEL3B01 module assessment.

| Assessments | Kind of Assessment | Assessment Details | Assessment Weight | Outcome Weight |
|---|---|---|---|---|
| Outcome A | | | | 35% |
| Assessment 1 | Test | Quiz | 20% | |
| Assessment 2 | Test | Multiple-Choice + Theory | 30% | |
| Assessment 3 | Test | Problem Solving and Derivation | 50% | |
| Outcome B | | | | 35% |
| Assessment 1 | Test | Quiz | 20% | |
| Assessment 2 | Test | Multiple-Choice + Theory | 30% | |
| Assessment 3 | Test | Problem Solving and Derivation | 50% | |
| Practical | | | | 30% |
| Practical | Project | Report | 30% | |
| | | Hardware implementation | 70% | |
| Final Mark | 0.35 × Outcome A + 0.35 × Outcome B + 0.3 × Practical | | | 100% |
Table 5. Peer-module moderation.

| Courses | Course Material | Course Moderation: Author's University | Course Moderation: Local University (South Africa) | Course Moderation: International University |
|---|---|---|---|---|
| SIG3B01 | Lecture notes, tutorials, practicals, textbook | Colleagues from other departments were invited to evaluate the lecturer's course material. The line manager, as the head of the department (HOD), took part in this process. | Colleagues from other universities were invited to evaluate the lecturer's course material. | Colleagues from other countries and international universities were invited to evaluate the lecturer's course material. |

Share and Cite

MDPI and ACS Style

Ouahada, K. Course Evaluation for Low Pass Rate Improvement in Engineering Education. Educ. Sci. 2019, 9, 122. https://doi.org/10.3390/educsci9020122
