SENS+: A Co-Existing Fabrication System for a Smart DFA Environment Based on Energy Fusion Information
Abstract
1. Introduction
1.1. Background
1.2. Motivation and Approach
2. Related Research
2.1. Virtual Environment Energy with the Co-Fabrication Space
2.2. Sensing the Physical Environment with the Smart Factory
2.3. Human Interaction and Behaviors with the Co-Fabrication Space
3. Co-Existing Space for the Smart Factory
3.1. The System
3.2. The Virtual Environment with the Agent and Co-Fabrication Communication Framework
3.3. The Physical Environment with a Smart Factory Architecture
- The remote terminal layer allows users to view system information through a variety of front ends, such as IoT Explorer, MR, XR, and VR clients, apps, and mobile devices, making the data easy to understand.
- The master control layer issues control commands and collects data from the entire system using hardware such as the NXP RT1062, the Raspberry Pi 3B+, and iBeacon. The NXP RT1062 development board connects to Tencent IoT Explorer through an ESP8266 WiFi module, uploading sensing data and receiving remote control commands over the MQTT protocol stack (a minimal connection sketch follows this list). In addition, the Raspberry Pi 3B+ development board communicates with a radar over its network port to obtain point cloud data, which is processed and then forwarded to the NXP RT1062 board.
- The communication layer serves as the interface between the master control layer and the sensing and driver layers, providing several communication interfaces, including WiFi, LoRa, ZigBee, NB-IoT, and Ethernet.
- The sensing layer contains three types of sensors: user gesture sensors, environmental state sensors, and energy sensors.
- The driver layer receives commands from the master control layer and carries out motion control for the user, local machines, and devices during the prototyping and design phases.
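To make the uplink concrete, the short Python sketch below mimics, in principle, what the master control layer does: it packages sensor readings as JSON, publishes them over MQTT, and listens on a command topic. The broker address, topic names, and payload fields are illustrative assumptions rather than the project's actual Tencent IoT Explorer configuration, and the real firmware runs on the NXP RT1062 with an ESP8266 module; this is only a minimal sketch of the data path.

```python
# Minimal sketch of the master-control-layer data path (illustrative only).
# Broker host, topics, and payload fields are assumed placeholders, not the
# project's actual Tencent IoT Explorer configuration. Requires paho-mqtt >= 2.0.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "iot-broker.example.com"   # placeholder for the cloud endpoint
UPLINK_TOPIC = "sens/device01/up"        # hypothetical sensing-data topic
DOWNLINK_TOPIC = "sens/device01/down"    # hypothetical remote-command topic


def read_sensors() -> dict:
    """Stand-in for the gesture, environmental-state, and energy sensors."""
    return {"gesture": "idle", "temperature_c": 24.5,
            "power_w": 132.0, "timestamp": time.time()}


def on_command(client, userdata, message):
    """Handle a remote control command pushed down from the cloud."""
    print("received command:", json.loads(message.payload))


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_command
client.connect(BROKER_HOST, 1883)
client.subscribe(DOWNLINK_TOPIC)
client.loop_start()                      # handle network traffic in the background

try:
    while True:
        # Package the latest readings as JSON and publish them upstream.
        client.publish(UPLINK_TOPIC, json.dumps(read_sensors()))
        time.sleep(5)
except KeyboardInterrupt:
    client.loop_stop()
    client.disconnect()
```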
3.4. User Interaction in the MR Device
4. Evaluation of User Behavior in Co-Fabrication Space
4.1. Human Behavior Data
4.2. User Interaction with the Scenario and the User Interface
- After scanning the QR code, the energy consumption, the energy generated at the location, and the reserved items are displayed in the system together with basic location information (Figure 11a).
- When the user approaches the electronic devices in the room, there are two ways to see the current consumption load of a device. First, the system displays the energy consumption information on the user's mobile device. Second, the demand and usage are displayed in MR, with a notification sent to the user. Users can then upload their design to the system, which automatically analyzes and disassembles the components and suggests construction methods (Figure 11b).
- The system then generates three types of documents for the user. The first, an e-design document, is generated from the designer's 3D model and lets designers examine their design immediately (Figure 11c).
- The e-design document is produced before fabrication so that the designer can check the design directly. The second type of archive is the e-fabrication document (Figure 11d).
- This file is intended for the fabricator. It addresses the communication and imagination problems of the traditional digital fabrication process by providing a detailed fabrication procedure and advice on how to break down each component. The last archive, the e-assembly document, mainly provides the assembler with a step-by-step understanding of the assembly (Figure 11e).
- Finally, the designer checks that the files are complete and can choose fabrication methods and machines according to their preferences and habits (Figure 11f); a minimal sketch of this document pipeline follows the list.
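The three archives described above can be thought of as successive views generated from the same uploaded model. The sketch below shows one way such a pipeline could be organized; the class and function names are hypothetical illustrations, not the system's actual implementation.

```python
# Illustrative sketch of the three-document pipeline (e-design, e-fabrication,
# e-assembly). Class and function names are hypothetical, not the system's API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Component:
    name: str
    material: str
    machine: str          # e.g., "laser cutter", "3D printer"


@dataclass
class EDesignDoc:
    """Generated from the designer's 3D model for immediate design review."""
    model_name: str
    components: List[Component]


@dataclass
class EFabricationDoc:
    """Per-component fabrication steps and machine advice for the fabricator."""
    steps: List[str] = field(default_factory=list)


@dataclass
class EAssemblyDoc:
    """Ordered assembly steps for the assembler."""
    steps: List[str] = field(default_factory=list)


def disassemble(model_name: str) -> EDesignDoc:
    """Stand-in for the system's automatic analysis/disassembly of a design."""
    parts = [Component("frame", "plywood", "laser cutter"),
             Component("joint", "PLA", "3D printer")]
    return EDesignDoc(model_name, parts)


def make_fabrication_doc(design: EDesignDoc) -> EFabricationDoc:
    return EFabricationDoc([f"Fabricate {c.name} from {c.material} on the {c.machine}"
                            for c in design.components])


def make_assembly_doc(design: EDesignDoc) -> EAssemblyDoc:
    return EAssemblyDoc([f"Step {i + 1}: attach {c.name}"
                         for i, c in enumerate(design.components)])


if __name__ == "__main__":
    design = disassemble("demo_model")
    print(make_fabrication_doc(design).steps)
    print(make_assembly_doc(design).steps)
```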
5. Conclusions and Future Work
- Fabrication process automation: In the interactive fabrication process, fabricators currently design, manufacture, and assemble the operating components required for some fabrication steps one by one, and only after fabrication is completed can the maker work out the motion path of the fabrication machine. To make digital fabrication easier to operate, design, and manufacture, future development could automate this step-by-step design process through digital computation, so that designers can steer and extend the design development process.
- Customized fabrication process integration: In the current study, the fabrication process provided workflow recommendations that allowed individual machines to process and distribute work, but it lacked a more precise workflow. On this basis, the processing modules could be extended through more precise system calculations so that designers, fabricators, and assemblers participate in the same process through tool integration, achieving a customized, integrated fabrication process.
- Digital twins for integrated method applications: In the future, this study can focus on the fusion of real and virtual coexistence under the concept of digital twins. At present, spatial sensing feeds the state of physical products back to the virtual environment, but it captures only the current state for analysis, which falls short of real-time physical signal transmission. Future work could investigate receiving signals in real time and combining sensors with interactive fabrication tools so that physical feedback occurs continuously, making the finished product more consistent with its virtual model; a conceptual sketch of such a feedback loop follows this list.
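As a rough illustration of the real-time feedback loop envisioned above, the sketch below continuously streams simulated sensor readings into a virtual model state instead of taking a one-off snapshot. All names, the sensor source, and the update rule are assumptions made for illustration and are not part of the present system.

```python
# Conceptual sketch of continuous physical-to-virtual state updates
# (digital-twin style), as opposed to one-off snapshots. Entirely illustrative;
# the sensor source and update rule are assumptions.
import random
import time


class VirtualModel:
    """Minimal stand-in for the virtual counterpart of a physical workpiece."""

    def __init__(self):
        self.state = {"progress": 0.0, "power_w": 0.0}

    def update(self, reading: dict) -> None:
        # Merge the latest physical reading into the virtual state.
        self.state.update(reading)
        print("virtual state:", self.state)


def read_physical_sensors() -> dict:
    """Simulated sensor feed; a real system would read fabrication sensors here."""
    return {"progress": round(random.random(), 2),
            "power_w": round(100 + 50 * random.random(), 1)}


def stream(model: VirtualModel, period_s: float = 1.0, cycles: int = 5) -> None:
    """Push readings into the virtual model at a fixed period."""
    for _ in range(cycles):
        model.update(read_physical_sensors())
        time.sleep(period_s)


if __name__ == "__main__":
    stream(VirtualModel())
```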
Author Contributions
Funding
Conflicts of Interest
References
Space | Machine | A | B | C | D | E | F | G | H | I | J |
---|---|---|---|---|---|---|---|---|---|---|---|
Laser Cutter Room | Laser Cutter | ✓ | ✓ | ✓ | ✓ | | | | | | |
 | Prototyping | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |
 | Cutting Machine | ✓ | ✓ | ✓ | | | | | | | |
 | Computer | ✓ | | | | | | | | | |
 | Laser Cutter Teaching | ✓ | ✓ | | | | | | | | |
3D Printer Room | 3D Printer Class | ✓ | | | | | | | | | |
 | 3D Printer | ✓ | ✓ | ✓ | ✓ | ✓ | | | | | |
 | CNC Machine | ✓ | ✓ | | | | | | | | |
 | Metal Printing | ✓ | | | | | | | | | |
 | Light Curing Printing | ✓ | | | | | | | | | |
 | Resin Printing | | | | | | | | | | |
Teaching Space | Lecture | ✓ | ✓ | ✓ | ✓ | ✓ | | | | | |
 | Administration | ✓ | ✓ | | | | | | | | |
 | Conference | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | |
 | Manager Meeting | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |
 | Teaching Assistant Class | ✓ | ✓ | ✓ | | | | | | | |
Muse Space | Project Discuss | ✓ | ✓ | ✓ | ✓ | | | | | | |
 | Idea Thinking | | | | | | | | | | |
Scrub Room | Polisher Machine | | | | | | | | | | |
 | Exhaust Fan | ✓ | | | | | | | | | |
 | Sandblasting Machine | ✓ | ✓ | | | | | | | | |
 | Bandsaw Machine | ✓ | ✓ | | | | | | | | |
Metalworking Room | Hand Jointer Machine | ✓ | | | | | | | | | |
 | Angle Slot Machine | ✓ | | | | | | | | | |
 | Vertical Flower Planer | ✓ | ✓ | ✓ | | | | | | | |
Model Room | Polisher Machine | ✓ | | | | | | | | | |
 | Hand Jointer Machine | ✓ | ✓ | | | | | | | | |
Robotic Room | Robotic Arm | ✓ | ✓ | ✓ | | | | | | | |
 | Motor | ✓ | ✓ | ✓ | ✓ | ✓ | | | | | |
 | Pressurizer | ✓ | ✓ | ✓ | ✓ | ✓ | | | | | |