MEIoT 2D-CACSET: IoT Two-Dimensional Cartesian Coordinate System Educational Toolkit Align with Educational Mechatronics Framework
Abstract
1. Introduction
2. Related Work
3. Educational Mechatronics Conceptual Framework (EMCF) for Virtual Scenarios
4. Design and Implementation of the MEIoT 2D-CACSET in the OBNiSE Architecture
- The application and device layers allow the integration of several solutions from different areas to propose comprehensive solutions in education, smart cities, and healthcare, among others. The educational mechatronics application in this paper follows the macro-process of the EMCF-LCM presented in Figure 1, integrated with the OBNiSE architecture.
- The network layer connects people to the network and sets profiles, permissions, and the availability of services. This layer is configured through tools, user profiles, and data accessibility.
- The processing layer is responsible for processing the information and organizing the data to be visualized.
- The cloud layer ensures the data availability of users, systems, and applications.
- Finally, the security layer manages the data encryption of all the layers and secures the information in the system.
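As an illustrative sketch only (not the authors' implementation), the layered flow described above can be modeled as a chain of handlers that each enrich a message; all function and field names here are hypothetical.

```python
# Hypothetical sketch of the layered message flow; layer names follow the
# text, the code structure is illustrative only.
def network_layer(msg):
    # attach the requesting user's profile and permissions (assumed fields)
    return {**msg, "user": "educator", "authorized": True}

def processing_layer(msg):
    # organize the data for visualization
    msg["coords"] = sorted(msg["coords"])
    return msg

def cloud_layer(msg):
    # stand-in for persistence so users, systems, and applications can reach it
    msg["stored"] = True
    return msg

def security_layer(msg):
    # stand-in for the encryption the security layer applies across all layers
    msg["encrypted"] = True
    return msg

def publish(msg):
    for layer in (network_layer, processing_layer, cloud_layer, security_layer):
        msg = layer(msg)
    return msg
```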
1. Web System. The web system is stored in a cloud system at the OBNiSE facilities, which only educators can access. It performs the connection with the Raspberry Pi and displays the Mindstorms EV3 mobile robot coordinates in real time.
2. Training Web. This training web is part of the primary system, with limitations for participants. Here, participants are only allowed to modify the coordinates of the LEGO Mindstorms EV3 mobile robot in , , , , and visualize the movements that the robot executes to reach them in real time. The platform displays up to 10 robot movements at a time and includes a “clean” button for deleting the displayed coordinates at any time, so participants do not have to wait until the movements are completed before viewing other coordinates.
3. MEIoT 2D-CACSET. This is an IoT device that integrates a Raspberry Pi for robot control, a Bluetooth module, an encoder, and ultrawide band (UWB) tag, anchor, and listener modules. The MEIoT 2D-CACSET establishes communication between the training web and the LEGO Mindstorms EV3 mobile robot via Wi-Fi, capturing the coordinates sent from the training website and executing them on the robot.
4. Users. These are divided into two categories.
   - Educators. They have access to all the data and settings of the MEIoT 2D-CACSET system and have complete knowledge of its configuration. Likewise, they allow or deny access to participants registered in the system. Educators can also modify the dashboard, data, update interval, colors, and other functionality.
   - Participants. These are primarily students in the mechatronics education course, interns, and social service students. They require registration on the platform to access the system and view the MEIoT 2D-CACSET. Participants have limited access and can only modify the parameters that the educator allows, for a defined time.
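The educator/participant split above is essentially a role-based permission policy. A minimal sketch, assuming hypothetical action names (the real system's permission model is not specified):

```python
# Hypothetical role/permission table mirroring the educator/participant
# roles described above; action names are illustrative only.
PERMISSIONS = {
    "educator": {"configure_dashboard", "manage_users", "move_robot", "view"},
    "participant": {"move_robot", "view"},  # limited, educator-granted access
}

def can(role, action):
    # unknown roles (unregistered users) get no permissions
    return action in PERMISSIONS.get(role, set())
```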
5. MEIoT 2D-CACSET: Two-Dimensional Cartesian Coordinate System Educational Toolkit
5.1. Raspberry Pi Gateway
- Raspberry Pi–web system unit test: This unit test verifies the correct functionality and communication between the Raspberry Pi and the web system by exchanging data; the web sends commands to the Raspberry Pi to verify the arrival of the information, and vice versa, to monitor the status of the web (available, idle, or off).
- Camera–Raspberry Pi unit test: Connection tests were carried out between the camera and the Raspberry Pi to verify correct operation and image visualization and to set the image coordinates of the Cartesian plane, the image quality, the position, and the status of the camera.
- Raspberry Pi–EV3IA unit test: This unit test verifies the Bluetooth communication between the Raspberry Pi and the EV3IA by exchanging mailboxes. To confirm the arrival of the data at the devices, a hyperterminal is implemented on the Raspberry Pi using a serial port, and for the EV3IA, an SSH connection is used.
- OBS–Raspberry Pi unit test: These tests verify the proper operation of the practices using the OpenCV library, adding the corresponding figures to the 2D board according to the selected practice. Moreover, this unit test verifies the video streaming using OBS embedded in the web.
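The round-trip idea behind the first unit test (send a command, check the acknowledgment) can be sketched against a fake channel; the `FakeLink` class below is a stand-in for the real Raspberry Pi–web connection, not the authors' test harness.

```python
# Sketch of the command/acknowledgment round-trip checked by the
# Raspberry Pi-web unit test; FakeLink is a hypothetical stand-in channel.
class FakeLink:
    def __init__(self):
        self.inbox = []

    def send(self, cmd):
        # the far end acknowledges every command it receives
        self.inbox.append(("ack", cmd))

    def receive(self):
        return self.inbox.pop(0)

def unit_test_roundtrip(link):
    # passes when the sent command comes back acknowledged intact
    link.send("STATUS?")
    kind, payload = link.receive()
    return kind == "ack" and payload == "STATUS?"
```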
- Practice 1. The EV3IA, from its initial position, moves towards the coordinates that were sent through the system web. Once it passes through each of the coordinates in its path, it illuminates them until it reaches its destination. The points remain until the car stops.
- Practice 2. The EV3IA leaves its initial position, touring the Cartesian plane; when it passes any of the coordinates on its way, it draws a point that disappears once the EV3IA leaves the coordinate; if any of the coordinates are within a quadrant, the quadrant lights up in red and its number appears in white.
- Practice 3. The EV3IA moves from its initial position and draws points on the coordinates it visits on its way; as it continues moving, the point disappears and reappears at the new coordinate. Points stop being drawn when it reaches its final coordinates (see Figure 9).
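Drawing the practice figures on the video with OpenCV presupposes a mapping from board coordinates to image pixels. A pure-Python sketch, assuming a hypothetical calibration (origin pixel and pixels-per-cell values are illustrative, not the toolkit's real calibration):

```python
# Map a 2D-CACSET board coordinate to a video-frame pixel.
# Both constants are assumed calibration values for illustration.
ORIGIN_PX = (320, 240)   # pixel of the board origin in the frame
CELL_PX = 40             # pixels per board unit

def board_to_pixel(x, y):
    # image y grows downward while board y grows upward, hence the sign flip
    return (ORIGIN_PX[0] + x * CELL_PX, ORIGIN_PX[1] - y * CELL_PX)
```

With OpenCV, each visited coordinate could then be drawn with something like `cv2.circle(frame, board_to_pixel(x, y), 8, (255, 0, 0), -1)`.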
5.2. GUI MEIoT 2D-CACSET
- EV3 Reference Position: The user can enter the desired coordinates to be reached by the EV3IA; pressing the “MOVE CAR” button sends the coordinate data to the Raspberry Pi to request a movement of the EV3IA. Coordinate data entry is restricted according to the 2D-CACSET workspace area.
- Configuration Check Box: The user can select between three interactive practices by selecting a checkbox; by selecting one of the checkboxes, interactive symbols and figures are enabled that appear in the video according to the chosen practice; this is described in depth in Section 6.
- Direction Buttons (Left/Up/Right/Down): The direction buttons allow the user to control the motion of the EV3IA dynamically; movements are performed one unit at a time according to the pressed button. For example, if the EV3IA is at the origin of the 2D-CACSET and the down button is pressed, the EV3 should move to the position (0,−1).
- Video Streaming: The video is transmitted by using the OBS; this way, it enables the user to visualize the 2D-CACSET workspace remotely. Moreover, OpenCV is used to add figures according to the selected practice.
- Route Log Table: The table records all the movements made by the EV3IA. The user can select one of the registered coordinate positions, and a point will appear in the video transmission showing the chosen coordinate point.
- Current Car Position: Shows the current coordinates position of EV3IA.
- Status Indicator: The status indicator turns red, denoting a busy state, and green for an available state; these depend on the user’s interaction with the GUI. An example is when the user presses a button; at that moment, the status indicator will turn red, displaying that the system is attending to the request; other operations cannot be performed until the current one concludes.
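The direction-button behavior, workspace restriction, and busy/available status described above can be sketched together; the workspace bound (±2) is an assumption for illustration, and in the real system the move is served asynchronously while here it completes immediately.

```python
# Sketch of the GUI button logic: unit moves, clamped to an assumed
# workspace bound, with the red/green status flag from the text.
BOUND = 2
MOVES = {"UP": (0, 1), "DOWN": (0, -1), "LEFT": (-1, 0), "RIGHT": (1, 0)}

class Gui:
    def __init__(self):
        self.pos = (0, 0)
        self.busy = False   # status indicator: red when True, green when False

    def press(self, button):
        if self.busy:
            return self.pos  # requests are ignored while one is in progress
        self.busy = True     # indicator turns red while serving the request
        dx, dy = MOVES[button]
        x = max(-BOUND, min(BOUND, self.pos[0] + dx))
        y = max(-BOUND, min(BOUND, self.pos[1] + dy))
        self.pos = (x, y)
        self.busy = False    # request served, indicator back to green
        return self.pos
```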
1. Every participant must be previously registered on the training web and log in using their username and password. The server then verifies the user to establish the connection with the EV3IA mobile robot (called the robot in this section to simplify the explanation).
2. Once on the platform, the participant can see the current state of the robot, the coordinates, and their visualization on the screen (see the right screen in Figure 11).
3. The participant can modify the robot coordinates, which are configured in a range of in the X and Y axes. Once the coordinates have been entered, the participant presses the “Move Car” button.
4. Every coordinate or movement can also be set individually on the camera by pressing the arrows around the video. The available movements are UP (Y+), DOWN (Y−), LEFT (X−), and RIGHT (X+).
5. The coordinates are sent through the network to the server, and the server forwards them to the MEIoT 2D-CACSET device.
6. The participant can then visualize the robot’s movement on the platform in real time; the robot stops once it has reached the coordinates.
7. The platform also contains a set of configurations. A participant can select any of these configurations (practice 1, practice 2, and practice 3). Every practice performs a set of actions that users can see. The configurations can be selected one at a time or several at once.
8. Finally, the participant can also watch the robot’s movements through the live camera, which is transmitted to one of the observatory’s screens (see the left screen in Figure 11).
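Step 5 sends the entered coordinates through the network to the server. A minimal sketch of what such a request payload might look like as JSON; the field names and command string are assumptions, not the system's real protocol.

```python
# Hypothetical JSON payload for the coordinate request of step 5;
# field names are illustrative, not the actual MEIoT 2D-CACSET protocol.
import json

def build_move_request(user, x, y):
    return json.dumps({"user": user, "cmd": "MOVE_CAR", "x": x, "y": y})
```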
5.3. LEGO Mindstorms EV3—Interactive Actuator (EV3IA)
6. Instructional Design for Mechatronic Concept Two-Dimensional Cartesian Coordinates Using the IoT 2D-CACSET
- Virtual Concrete Level: This level requires the student to experiment with virtual-world objects; the MEIoT 2D-CACSET provides a way to achieve this. The bidimensional CCS is a widely used way to locate objects and is therefore the best choice for starting with basic spatial concepts. For this purpose, the MEIoT 2D-CACSET includes a mobile robot equipped with a tag that reports its current position with respect to the board representing the Cartesian plane. It is worth mentioning that even though the mobile robot is an actual artifact, it is considered a virtual object, since the participant interacts with it through the GUI MEIoT 2D-CACSET from a remote location in a virtual manner. The participant can interact and command the movement of the mobile robot with the tag, which represents a point in the plane, through the control buttons “RIGHT”, “LEFT”, “UP”, and “DOWN”, which use instructions in colloquial language. It is essential to mention that the GUI MEIoT 2D-CACSET graphically shows the reached point cloud, using a series of interconnected filled circles, as the mobile robot moves on the board.
- Virtual Graphic Level: In this second level, the participant must relate the skills acquired in the first level to symbolic elements. In this case, the participant can see all the circles representing the points that the mobile robot with the tag reaches. Moreover, a CSV file containing the tag’s raw position data can be downloaded for analysis using Matlab to reinforce this level.
- Virtual Abstract Level: Abstract logical thinking is acquired at the stage of formal operations. Therefore, students can develop a mathematical understanding to enhance cognitive processes without relying on manipulating an object. For this purpose, the current x and y position values of the tag are displayed in the table to the right of the GUI MEIoT 2D-CACSET video. Every time the tag reaches a new position, it is added to the table. The participant can then relate these values to the filled circles. Moreover, the CSV file mentioned in the previous level can be used for a better understanding of the variables.
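The text analyzes the downloaded CSV in Matlab; as a sketch of the same idea with the Python standard library (the column names `x` and `y` are assumptions about the file layout):

```python
# Load the tag's raw position CSV into (x, y) tuples; column names assumed.
import csv
import io

SAMPLE = "x,y\n0,0\n1,0\n1,1\n"  # illustrative file contents

def load_positions(text):
    return [(float(row["x"]), float(row["y"]))
            for row in csv.DictReader(io.StringIO(text))]
```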
6.1. Practice 1: Cartesian Coordinate System
- The set of instructions for the participants in the virtual concrete level are as follows:
1. Move the mobile robot from its home position in the right direction by clicking once on the corresponding button; the robot moves one position and stops. Click the right button once more; the robot moves one position and stops. Now click the left button once; the robot moves one position and stops. Finally, click the left button once more; the robot moves one position and stops.
2. Move the mobile robot from its home position in the left direction by clicking once on the corresponding button; the robot moves one position and stops. Click the left button once more; the robot moves one position and stops. Now click the right button once; the robot moves one position and stops. Finally, click the right button once more; the robot moves one position and stops.
3. Move the mobile robot from its home position in the upward direction by clicking once on the corresponding button; the robot moves one position and stops. Click the up button once more; the robot moves one position and stops. Now click the down button once; the robot moves one position and stops. Finally, click the down button once more; the robot moves one position and stops.
4. Move the mobile robot from its home position in the downward direction by clicking once on the corresponding button; the robot moves one position and stops. Click the down button once more; the robot moves one position and stops. Now click the up button once; the robot moves one position and stops. Finally, click the up button once more; the robot moves one position and stops. Figure 15 shows the 2D board in the GUI once the concrete level instructions have been performed.
- The set of instructions for the participants in the virtual graphic level are as follows:
5. Change the color of the filled circle from blue to gray by clicking on the circle where the mobile robot ends after the four steps of the concrete level. This is the origin of the system O.
6. Change the color of the filled circles from blue to red by clicking on the circles the robot passed through when the first instruction of the concrete level was performed, excluding the home position. Then, assign to this movement the symbol .
7. Change the color of the filled circles from blue to green by clicking on the circles the robot passed through when the second instruction of the concrete level was performed, excluding the home position. Then, assign to this movement the symbol .
8. Change the color of the filled circles from blue to purple by clicking on the circles the robot passed through when the third instruction of the concrete level was performed, excluding the home position. Then, assign to this movement the symbol .
9. Change the color of the filled circles from blue to magenta by clicking on the circles the robot passed through when the fourth instruction of the concrete level was performed, excluding the home position. Then, assign to this movement the symbol . Finally, the participant obtains a plus-shaped drawing (see Figure 16). The table containing the points reached by the robot can be seen on the right side of the GUI.
- The set of instructions for the participants in the virtual abstract level are as follows:
10. The plus shape is the representation of the 2D CCS. The horizontal lines from the origin to the farthest points to the right and left are called the and axes, respectively. The complete horizontal line is known as the X axis, containing the set of real numbers . Moreover, the vertical lines from the origin to the farthest points at the top and bottom are called the and axes, respectively. The complete vertical line is known as the Y axis, containing the set of real numbers . In addition, the intersection of the horizontal and vertical lines is known as the origin, with representation in 2D coordinates ; the first coordinate corresponds to the X axis and the second to the Y axis.
11. Using the data in the GUI, build a table containing all the 2D coordinates for the , , , and (see Table 1).
12. This knowledge extends to assigning a pair of coordinates to any filled circle on the MEIoT 2D-CACSET board to represent a point in the 2D CCS. If any coordinate in the table is clicked, its font size increases and the corresponding filled circle on the 2D board is highlighted.
13. Finally, click on the point in the table of the GUI. Note that the font size of the 2D coordinate increases and the corresponding filled circle on the 2D board is highlighted. This extends to any pair of coordinates in the GUI table (see Figure 17).
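Replaying the concrete-level button sequence of Practice 1 shows why the reached point cloud forms a plus shape. This is a plain simulation sketch, not the toolkit's code:

```python
# Simulate the four concrete-level instructions of Practice 1 and collect
# every board position the robot visits, including the home position.
STEPS = {"R": (1, 0), "L": (-1, 0), "U": (0, 1), "D": (0, -1)}
SEQUENCE = "RRLL" "LLRR" "UUDD" "DDUU"   # the four instructions above

def visited_points(seq, start=(0, 0)):
    pos, seen = start, {start}
    for step in seq:
        dx, dy = STEPS[step]
        pos = (pos[0] + dx, pos[1] + dy)
        seen.add(pos)
    return seen
```

The nine distinct points reached are exactly the arms of the plus shape shown in Figure 16.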
6.2. Practice 2: Quadrants
- The set of instructions for the participants in the virtual concrete level:
1. Move the mobile robot from its home position in the right direction by clicking twice on the corresponding button; the robot moves two positions and stops. Now click the up button twice; the robot moves two positions and stops. Note that the mobile robot reached position .
2. Move the mobile robot from its current position in the left direction by clicking the corresponding button four times; the robot moves four positions and stops. Note that the mobile robot reached position .
3. Move the mobile robot from its current position in the downward direction by clicking the corresponding button four times; the robot moves four positions and stops. Note that the mobile robot reached position .
4. Move the mobile robot from its current position in the right direction by clicking the corresponding button four times; the robot moves four positions and stops. Note that the mobile robot reached position .
5. Finally, to return the mobile robot to the home position, click the up button twice; the robot moves two positions and stops. Then, click the left button twice; the robot moves two positions and stops. Note that the mobile robot reached position .
- The set of instructions for the participants in the virtual graphic level:
6. Insert a purple square by clicking on the endpoint of instruction 1 of the virtual concrete level, the purple point at position . Then, assign to this area the Roman numeral I.
7. Insert a green square by clicking on the endpoint of instruction 2 of the virtual concrete level, the green point at position . Then, assign to this area the Roman numeral II.
8. Insert a magenta square by clicking on the endpoint of instruction 3 of the virtual concrete level, the magenta point at position . Then, assign to this area the Roman numeral III.
9. Insert a red square by clicking on the endpoint of instruction 4 of the virtual concrete level, the red point at position . Then, assign to this area the Roman numeral IV.
- The set of instructions for the participants in the virtual abstract level:
10. The first quadrant of the 2D plane, called quadrant I, is the region labeled with the Roman numeral I. Note that the coordinates in this region are , respectively.
11. The second quadrant of the 2D plane, called quadrant II, is the region labeled with the Roman numeral II. Note that the coordinates in this region are , respectively.
12. The third quadrant of the 2D plane, called quadrant III, is the region labeled with the Roman numeral III. Note that the coordinates in this region are , respectively.
13. The fourth quadrant of the 2D plane, called quadrant IV, is the region labeled with the Roman numeral IV. Note that the coordinates in this region are , respectively.
14. Using the data in the GUI, build a table containing all the reached 2D coordinates corresponding to quadrants I, II, III, and IV (see Table 2).
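The quadrant rule established in the abstract level, that the signs of (x, y) determine quadrants I through IV, can be written as a compact sketch:

```python
# Classify a 2D point into quadrant I-IV by the signs of its coordinates;
# points on an axis or at the origin belong to no quadrant.
def quadrant(x, y):
    if x > 0 and y > 0:
        return "I"
    if x < 0 and y > 0:
        return "II"
    if x < 0 and y < 0:
        return "III"
    if x > 0 and y < 0:
        return "IV"
    return "axis/origin"
```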
6.3. Practice 3: Point in Cartesian Coordinate System
- The set of instructions for the participants in the virtual concrete level:
1. In the car reference position section of the GUI, enter the reference position: and coordinates. Now click the MOVE CAR button; the mobile robot moves to the reference position and stops. Note that the mobile robot reached position .
- The set of instructions for the participants in the virtual graphic level:
2. Change the color of the filled circle from blue to purple by clicking on the circle where the mobile robot ends after the virtual concrete level is performed.
- The set of instructions for the participants in the virtual abstract level:
3. The circle you filled with purple is very small, almost dimensionless; it is perceptible only by its color contrast and is called a point. A point's coordinates are defined by an ordered pair, two numbers written together in parentheses. The x and y coordinates are commonly named the abscissa and ordinate, respectively. The coordinates of point P are . Click on the point in the table of the GUI. Note that the font size of the 2D coordinate increases and the corresponding filled circle on the 2D board is highlighted. Table 3 shows the data collected from the tag, representing the position values of the coordinates.
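The ordered-pair notion above, with the abscissa/ordinate naming, can be captured by a minimal illustrative type (not the toolkit's actual data model):

```python
# A point as an ordered pair; "abscissa" is x and "ordinate" is y,
# matching the naming introduced in the abstract level.
from collections import namedtuple

Point = namedtuple("Point", ["abscissa", "ordinate"])

def describe(p):
    return f"P = ({p.abscissa}, {p.ordinate})"
```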
7. Discussion
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
2D-CACSET | Two-Dimensional Cartesian Coordinate System Educational Toolkit |
IoT | Internet of Things |
CCS | Cartesian Coordinate System |
COVID-19 | Coronavirus Disease 19 |
CSV | Comma Separated Values |
DWM | Decawave Module |
EMCF | Educational Mechatronics Conceptual Framework |
EMCF-LCM | EMCF Learning Construction Methodology |
GUI | Graphical User Interface |
RF | Radio Frequency |
RFID | Radio Frequency Identification |
RTLS | Real-Time Location Systems |
SPI | Serial Peripheral Interface |
STEM | Science, Technology, Engineering, and Mathematics |
TDoA | Time Difference of Arrival |
TWR | Two-Way Ranging |
UART | Universal Asynchronous Receiver–Transmitter |
UWB | Ultrawide Band |
VR | Virtual Reality |
WAN | Wide Area Network |
Wi-Fi | Wireless Fidelity |
OBS | Open Broadcaster Software |
RCX | Robotic Command eXplorer |
NXT | The Next Generation |
EV3 | The Evolution |
EV3IA | EV3 Interactive Actuator |
HOL | Hands-On Learning |
ME | Educational Mechatronics (for its acronym in Spanish) |
MEIoT | Educational Mechatronics and Internet of Things |
MEIoT-2D-CACSET | Educational Mechatronics and Internet of Things 2D-CACSET |
References
Quadrant I | Quadrant II | Quadrant III | Quadrant IV |
---|---|---|---|
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Carrasco-Navarro, R.; Luque-Vega, L.F.; Nava-Pintor, J.A.; Guerrero-Osuna, H.A.; Carlos-Mancilla, M.A.; Castañeda-Miranda, C.L. MEIoT 2D-CACSET: IoT Two-Dimensional Cartesian Coordinate System Educational Toolkit Align with Educational Mechatronics Framework. Sensors 2022, 22, 4802. https://doi.org/10.3390/s22134802