Article

Enabling Technologies for Operator 4.0: A Survey

1
MTA-PE “Lendület” Complex Systems Monitoring Research Group, Department of Process Engineering, University of Pannonia, Egyetem u. 10, POB 158, H-8200 Veszprém, Hungary
2
Department of Applied Informatics, Nagykanizsa Campus, University of Pannonia, Zrínyi u. 18, H-8800 Nagykanizsa, Hungary
*
Author to whom correspondence should be addressed.
Appl. Sci. 2018, 8(9), 1650; https://doi.org/10.3390/app8091650
Submission received: 31 July 2018 / Revised: 25 August 2018 / Accepted: 7 September 2018 / Published: 13 September 2018
(This article belongs to the Special Issue Advanced Internet of Things for Smart Infrastructure System)

Abstract

The fast development of smart sensors and wearable devices has provided the opportunity to develop intelligent operator workspaces. The resultant Human-Cyber-Physical Systems (H-CPS) integrate the operators into flexible and multi-purpose manufacturing processes. The primary enabling factor of the resultant Operator 4.0 paradigm is the integration of advanced sensor and actuator technologies and communications solutions. This work provides an extensive overview of these technologies and highlights that the design of future workplaces should be based on the concept of intelligent space.

1. Introduction

The continuous innovations of Cyber-Physical Systems (CPS), the Internet of Things (IoT), the Internet of Services (IoS), robotics, big data, cloud and cognitive computing and augmented reality (AR) result in significant changes in production systems [1,2]. As these technologies revolutionize industrial production, the high-tech strategy launched by the German government to promote the computerization of manufacturing was named the fourth industrial revolution (Industry 4.0). China developed its own initiative: Made-in-China 2025 is a strategic plan announced in 2015 to increase competitiveness in cutting-edge industries, including the manufacturing sector [3,4,5]. The approach of China is also based on state-of-the-art IT technologies [6] that are used not only to improve the efficiency of production, but also to share manufacturing capacity and support cooperation [7]. The U.S. has introduced “reindustrialization” policies to reinvigorate its manufacturing industry. By releasing the “New Robot Strategy,” Japan is attempting to accelerate the development of cooperative robots and unmanned plants in order to revolutionize the robot industry, cope with the aggravation of Japanese social and economic issues and enhance international competitiveness. The “New Industrial France”, the “high-value manufacturing” strategy of the U.K. and the “advanced innovators’ strategy” of South Korea have similar CPS-based focus points [8]. The common goal of these developments is to integrate the supply chain. Industry 4.0 and additive manufacturing, when combined, can help enable the creation of products that are first-to-market and fully customized. Thanks to the benefits of additive manufacturing, not only can consumers find more customized products and services, but manufacturers also have the chance to create a more efficient and scalable production flow [9].
All in all, these novel manufacturing technologies appear to herald a future in which value chains are shorter, more collaborative and offer significant sustainability benefits.
Organizations should be prepared for the introduction of Industry 4.0-based complex production systems. Recently developed maturity and readiness models are mainly technology focused [10,11] and assess the Industry 4.0 maturity of industrial enterprises in the domain of discrete manufacturing [12]. Thanks to fast and flexible communication between CPSs, smart sensors and actuators, real-time and self-controlled operations can be realized [6,13]. New smart IoT devices make it possible to design mobile machines that replace human minds [14]. Researchers at Oxford University estimated that approximately 47% of all U.S. employment will be at high risk of computerization by the early 2030s [15]. A survey conducted by PricewaterhouseCoopers (PwC) found that 37% of employees were worried about the possibility of redundancy due to automation [15,16].
Although the increase in the degree of automation reduces costs and improves productivity [17], human operators are still essential elements of manufacturing systems [18,19]. An increasing degree of automation also does not necessarily lead to enhanced operator performance [20]. Handling human factors is a challenging problem concerning both cellular manufacturing [21] and human-robot interaction [22]. For example, smart factories have to take aging operators and apprentices into consideration, using advanced technologies to help people integrate into the modern manufacturing workforce [23].
Industry 4.0 (especially IoT devices and CPS) allows new types of interactions between operators and machines [24]. These interactions will generate a new intelligent workforce and have significant effects on the nature of work. The integration of workers into an Industry 4.0 system consisting of different skills, educational levels and cultural backgrounds is a significant challenge. The new concept of Operator 4.0 was created for the integrated analysis of these challenges. The concept of Operator 4.0 is based on the so-called Human-Cyber-Physical Systems (H-CPSs) designed to facilitate cooperation between humans and machines [25].
Although the state of the art in the area of Industry 4.0 has been reviewed recently [3] and systematic literature reviews are frequently published [26,27,28], there is a need to study how the fourth industrial revolution will not entirely replace operators, but will instead use sensors, smart devices, mobile IoT assets and related technologies to design systems for operator support.
This paper focuses on the elements of this infrastructure and proposes an intelligent space-based design methodology for the design of Operator 4.0 solutions. According to this goal, the development and application of advanced Internet of Things technologies with regard to smart sensing technologies, IoT architectures, services and applications will be discussed by following the types of Operator 4.0 solutions proposed by Romero et al. [23,25].
The paper is structured as follows. The elements of Operator 4.0 solutions are presented, and a novel design methodology based on the concept of intelligent space is proposed in Section 2. The required infrastructural background is presented in the remaining sections. IoT solutions for tracking operator activities are introduced in Section 3, while IoT-based solutions developed to support operator activities by providing feedback are summarized in Section 4. Conclusions and recommendations based on the review are proposed in Section 5.

2. Framework of Operator 4.0 Solutions

The concepts of Operator 4.0, cyber-physical systems and intelligent space are introduced, and the connections between these methodologies are discussed in this section.

2.1. The Operator 4.0 Concept and Human-Cyber-Physical Systems

The Operator 4.0 typology depicts how the technologies of the fourth industrial revolution will assist the work of operators [25]. Operator 1.0 is defined as humans conducting manual work. The Operator 2.0 generation represents a human entity whose work is supported by tools, e.g., by the Computer Numerical Control (CNC) of machine tools. In the third generation, humans are involved in cooperative work with robots and computer tools, also known as human-robot collaboration. Human-robot collaboration in the industrial environment is a fascinating field with a specific focus on physical and cognitive interaction [29]. However, the new set of solutions is based on even more intensive cooperation between operators and production systems. This new Operator 4.0 concept represents the future of workplaces [25] (see Figure 1).
The main elements of the Operator 4.0 methodology are explained in Table 1. Analytical operator-type solutions utilize big data analytics to collect, organize and analyze large datasets [23]. Augmented Reality (AR) can be considered as a critical enabling technology for improving the transfer of information from the digital to the physical world of the smart operator. The collaborative operator works together with Collaborative robots (CoBots). Healthy operator solutions measure and store exercise activity, stress, heart rate and other health-related metrics, as well as GPS location and other personal data. Smarter operators interact with machines, computers, databases and other information systems, as well as receive useful information to support their work. Social operators use mobile and social collaborative methods to connect to smart factory resources. Super-strength operators increase the strength of human operators to be able to conduct manual tasks without effort using wearable exoskeletons, while virtual operators interact with the computer mapping of design, assembly or manufacturing environments.
With regards to the development of Operator 4.0-based automation systems, attention has to be paid to the design principles of Industry 4.0 solutions, which are decentralization, virtualization, reconfiguration and adaptability [45,46,47]. How these principles should be applied during the development process is presented in Table 2.
The Operator 4.0 concept aims to create Human-Cyber-Physical Production Systems (H-CPPS) that improve the abilities of the operators [23]. The allocation of tasks to machines and operators requires a complex semantic model of the H-CPS. Operator instructions can be programmed into a machine, but handling uncertainty and stochastic behavior is difficult. Adaptive systems are suitable for handling these problems with the help of more frequent monitoring and model adaptation functions [56,57,58,59]. Real-time operator support and performance monitoring require accurate information concerning the activities of operators, which means that all data related to operator activities should be measured, converted, analyzed, transformed into actionable knowledge and fed back to the operators. Based on this requirement, the operator should be connected from the bottom (connection) to the top (configuration) levels of the cyber-physical system [60]. To support this goal, an overview of the elements of CPSs from the perspective of operators is given in Table 3, and the levels of CPSs, with a description of their functions and tasks, are presented in Figure 2.
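The measurement-to-feedback loop described above (measure, convert, analyze, feed back) can be sketched as a minimal pipeline. All function names, the dropout handling and the threshold rule below are illustrative assumptions for this survey, not part of any cited CPS architecture.

```python
def connect(raw_samples):
    """Connection level: acquire raw sensor readings, dropping lost samples."""
    return [s for s in raw_samples if s is not None]

def convert(samples):
    """Conversion level: turn raw readings into meaningful features."""
    return {"mean": sum(samples) / len(samples), "max": max(samples)}

def analyze(features, threshold):
    """Cyber/cognition levels: derive actionable knowledge from features."""
    return "alert" if features["max"] > threshold else "normal"

def configure(decision):
    """Configuration level: feed the decision back to the operator."""
    return f"operator feedback: {decision}"

samples = [0.9, 1.1, None, 2.7, 1.0]  # hypothetical sensor stream with one dropout
print(configure(analyze(convert(connect(samples)), threshold=2.0)))
# -> operator feedback: alert
```

The point of the sketch is only the layered flow: each CPS level consumes the output of the level below it, and the top level closes the loop back to the operator.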
As tasks should be transformed into a form that computers can understand, task analysis is becoming more and more crucial due to the difficulties of externalizing the tacit knowledge of the operators [61]. Tacit knowledge contains all of the cognitive skills and technical know-how that are challenging to articulate [62,63]. Without eliciting tacit knowledge, the chance of losing critical information and best practices is very high [64]. Hierarchical task analysis extended with the ‘skill, rule and knowledge’ framework can capture tacit knowledge [65], an approach which has been proven to be useful in manufacturing [66]. Sensor technologies are essential for eliciting tacit knowledge; for example, the tacit knowledge of an operator can be captured by a ‘sensorized’ hand-held belt grinder and a 3D scanner to generate a program for a robot that can replace the operator [67]. The modeling of the physical reality and realizing it in the CPS are critical tasks [68,69,70,71].
These examples illustrate that Operator 4.0 solutions should be based on contextual task analysis, which requires precise chronological time-synchronization of the operator actions, sensory data and psycho-physiological signals to infer the cognitive states [72] and emotions [73] associated with the decisions and operator actions.
Sensors and feedback technologies of the interactive intelligent space can be used not only for improving the abilities of the operators, but also for the extraction of their tacit knowledge. In the following section, these technologies will be detailed.

2.2. The Operator 4.0 Concept and Intelligent Space

In the previous section, the key functions of Operator 4.0 solutions were shown to be related to the monitoring and support of operator activities. The most significant trend is related to the development of human-machine interfaces that embrace interaction in a set of novel ways [100]. As the operator performs tasks, real-time information is provided about the production system and real-time support is received from it. Interactive human-machine systems had already been introduced in the Hashimoto Laboratory at the University of Tokyo [101], where an Intelligent Space (iSpace) system was designed for the virtual and physical support of people and mobile robots [102]. An intelligent interaction space supports the operators in completing their work with high efficiency, a high success rate and a low burden [103]. The iSpace framework is shown in Figure 3.
The events within iSpace are continuously monitored by Distributed Intelligent Network Devices (DINDs) consisting of various networked sensors, e.g., indoor positioning systems and cameras for localization. DINDs interpret events in the physical space and provide services (feedback) to operators using physical devices, e.g., microphones, displays, etc. According to the horizontal integration concept, the proposed iSpace is also connected to suppliers and customers. This concept highlights that iSpace should rely on the CloudThings architecture, which integrates the Internet of Things (IoT) and cloud computing [104], as cloud computing enables convenient, on-demand and scalable network access to a shared pool of configurable computing resources.
Resources, users and tasks are the three core elements of intelligent interaction space (see Figure 4). The user-resource-task model supports the design of interaction among these components [103], the interactions of which should handle how resources trigger the tasks and how the tasks are assigned to the operators based on their availability, performance and competence.
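As a rough illustration of the user-resource-task interaction described above, the sketch below assigns a task to an available operator by competence and historical performance. The field names, scores and selection rule are hypothetical simplifications, not the model of the cited work.

```python
def assign_task(task, operators):
    """Pick the best-suited operator: available, competent, highest performance."""
    candidates = [
        op for op in operators
        if op["available"] and task["skill"] in op["competences"]
    ]
    if not candidates:
        return None  # the task stays queued until a suitable operator is free
    return max(candidates, key=lambda op: op["performance"])

operators = [
    {"name": "A", "available": True,  "competences": {"welding"},             "performance": 0.72},
    {"name": "B", "available": True,  "competences": {"welding", "assembly"}, "performance": 0.91},
    {"name": "C", "available": False, "competences": {"assembly"},            "performance": 0.95},
]
best = assign_task({"id": 17, "skill": "welding"}, operators)
print(best["name"])  # -> B (C is more skilled at assembly but unavailable)
```

In a full user-resource-task model, resources would additionally trigger tasks (e.g., a machine requesting intervention), which this sketch omits.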
Intelligent space should respond to requests from people, so the activities of the operators must be identified by cameras, indoor positioning systems or voice signals, and these multi-sensory data should be processed by artificial intelligence and machine learning solutions [102]. The acquired information is transmitted via a wireless network and processed by dedicated computers, so any event involving, or change in, the monitored parameters inside the space is carefully analyzed and processed [105].
This section highlighted that the development of H-CPSs requires an appropriate design concept. According to the concept of intelligent space, the architecture must be modular, scalable and integrated, which results in low installation and maintenance costs and easy configuration [106].

3. IoT-Based Solutions for Operator Activity Tracking

From the viewpoint of operators, connection and conversion are the most critical levels of cyber-physical systems as these two levels are responsible for interaction. As smart sensors are key components of solutions for Cyber-Physical Production Systems (CPPS) [13], it is necessary to overview what kinds of tools are available for monitoring the activity of the operators.
Usually, operator activity is monitored by Radio Frequency IDentification (RFID)-based object tracking [107]. This technology can collect real-time data about the activities of workers (operators) and machines, as well as the movements of materials [108] and workpieces [109,110]. Multi-agent supported RFID systems realize location-sensing systems [111] and intelligent-guided view systems [112]. RFID systems for human-activity monitoring provide an excellent opportunity to observe the work of the operators [113]. With the help of these devices, the whole production process, as well as production and waiting times, becomes measurable online. Based on this information, Shop Floor Control (SFC) and optimization can also be realized. When the RFID readers are placed such that the duration of the tasks can be estimated, both the balance of the production line and the effect of product changes can be evaluated, and real-time data for OEE (Overall Equipment Effectiveness) calculations can be provided [114].
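The OEE calculation mentioned above can be illustrated with the standard decomposition OEE = availability × performance × quality. The shift figures below are invented for the example; in an RFID-based system, the run time and piece counts would be derived from tag read events.

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall Equipment Effectiveness from shift-level quantities.

    planned_time, run_time: seconds; ideal_cycle_time: seconds per piece.
    """
    availability = run_time / planned_time                      # uptime share
    performance = (ideal_cycle_time * total_count) / run_time   # speed vs. ideal
    quality = good_count / total_count                          # first-pass yield
    return availability * performance * quality

# Hypothetical 8 h shift: 7 h of actual run time, 30 s ideal cycle time,
# 720 pieces produced, of which 648 passed inspection.
value = oee(planned_time=480 * 60, run_time=420 * 60,
            ideal_cycle_time=30, total_count=720, good_count=648)
print(round(value, 3))  # -> 0.675
```

Here availability is 0.875, performance is about 0.857 and quality is 0.9, so even individually reasonable losses compound to an OEE well below any single factor.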
The tracking of production can be significantly improved by Indoor Positioning Systems (IPS) utilized for localizing the positions of products and operators [95]. The applications of IPS and its potential benefits in terms of process development are compiled in Table 4.
Context-aware systems require unobtrusive sensors to track each step of the performed task [115]. As wearable sensors are becoming more common, their utilization is also becoming more attractive [116]. However, hand motion-based activity recognition is still challenging [117] and requires the application of advanced machine learning algorithms [118]. Tracking operator activity is a challenging and highly infrastructure-demanding task, which should utilize information stream fusion approaches to improve the robustness of the algorithms [119]. How all these smart sensor-based IoT technologies can be used to design Operator 4.0-type solutions is compiled in Table 5.
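As a deliberately simplified illustration of wearable-sensor activity recognition, the sketch below reduces a window of accelerometer magnitudes to (mean, variance) features and classifies it with a nearest-centroid rule. The activity labels, centroids and readings are synthetic assumptions; real systems use far richer features, trained models and the multi-stream fusion mentioned above.

```python
import math

def features(window):
    """Reduce a window of accelerometer magnitudes to (mean, variance)."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return (mean, var)

def nearest_centroid(sample, centroids):
    """Return the activity label whose centroid is closest to the sample."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Hypothetical per-activity centroids, as if learned from labeled recordings.
centroids = {
    "idle":     (1.0, 0.01),
    "assembly": (1.1, 0.08),
    "grinding": (1.4, 0.30),
}

window = [1.35, 1.5, 0.9, 1.8, 1.45]  # synthetic acceleration magnitudes (g)
print(nearest_centroid(features(window), centroids))  # -> grinding
```

The design choice worth noting is the windowing step: raw samples are too noisy to classify individually, so recognition operates on short feature windows.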

4. IoT-Based Solutions to Support Operator Activities

The operators not only have to provide real-time information about their actions, but at the same time require real-time support in their work. Industrial wearable [93] and communication [138] solutions help to handle this challenge. The previous section showed what kind of techniques exist to collect information from the operator. In this section, potentially applicable feedback technologies will be introduced, which are related to the configuration level of cyber-physical systems [60].
In the early applications, the production activities required to complete orders were scheduled and managed by Shop Floor Control Systems (SFCS). In [139], a hierarchical SFCS (shop, workstation, equipment) was adopted. In [140], a vision-based human-computer interaction system was introduced that interacts with the operator and provides feedback. Complex hardware was installed in intelligent environments, equipped with a steerable projector and spatial sound system, to position the character within the environment [141].
A potential grouping of feedback technologies is the following: fixed-mounted devices (e.g., LED TVs), mobile devices (e.g., tablets and smartphones) and wearable devices (e.g., smart glasses). Intuitive displays can reduce the cost of operator intervention, as the performance of the operator is improved by auditory and visual understanding [142]. Visual collaboration systems can provide appropriate instructions for each step of an assembly task [140]. All three groups can be used effectively, but the novelty of wearable devices compared to ‘simple’ mobile devices is the total freedom of movement and free use of the limbs [143]. So far, some of these devices only provide a human-machine interface (HMI) and need a (mobile) computer (e.g., a smartphone) to operate, but the tendency is that every device will work independently and cooperate with other devices through communication solutions (e.g., LAN/WiFi, Bluetooth). Headsets, VR helmets, smart gloves and smart clothes are examples of the types of devices presented in Table 6. The importance of this area is shown by the increase in sales figures: so far, these kinds of solutions have generated approximately $5.8 billion in business [144].
The connections between the categories of Operator 4.0 solutions and potential feedback technologies are shown in Table 6. Which feedback opportunity is expedient is defined by the task in question. For example, in the case of the super-strength operator, the feedback indicating danger is a critical function. The next step of the design is to select the technology that delivers the information. Danger can be indicated with the help of smart glasses or by a speaker. As soon as the operator hears the warning alarm, the danger can be avoided. In the case of smart glasses, the worker can obtain more detailed information about the type and location of risk. The potential applications of these solutions are summarized in the last column of the table.
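The channel-selection logic described above (more informative channels preferred when available, an audible fallback otherwise) can be sketched as a simple priority dispatch. The device names and their ordering are hypothetical examples, not recommendations from the cited works.

```python
# Most-detailed to least-detailed feedback channels (illustrative ordering).
FEEDBACK_PRIORITY = ["smart_glasses", "smartphone", "speaker"]

def dispatch_warning(message, available_devices):
    """Route a warning to the most informative channel the operator has."""
    for device in FEEDBACK_PRIORITY:
        if device in available_devices:
            return device, message
    return "plant_alarm", message  # fall back to a plant-wide alarm

device, msg = dispatch_warning("robot arm in motion", {"speaker", "smartphone"})
print(device)  # -> smartphone (glasses unavailable, phone beats the speaker)
```

A speaker only signals that danger exists, whereas smart glasses can additionally convey its type and location, which is why the sketch ranks channels by information content rather than by latency or cost.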
Some companies have been testing these innovative technologies in manufacturing processes. In every case where these techniques are used, the production process is complex, the quality management is strict and there is a wide variety of products. The results are impressive because efficiency improves while learning time is reduced in every observed situation. In the following, some of these solutions will be introduced.
Smart glasses-based augmented reality is used in the manufacturing of high-horsepower wheeled tractors with hundreds of variations by the company AGCO [146]. Presently, 100 pairs of glasses are in use to visualize the next manufacturing step and necessary information for the inspection process. The results in numbers are promising:
  • 50% reduction in learning time (in the case of new workers)
  • 30% reduction in inspection time (eliminates paperwork and manual upload)
  • 25% reduction in production time (in the case of complex assemblies and low volumes)
Similar advantages of smart glasses were reported at DHL, which is one of the leading logistics companies in the world [154]. Ten workers who used smart glasses for three weeks managed to distribute 20,000 packages (9000 orders), leading to a 25% increase in the efficiency of the operators and a reduction in errors of 40%.
Quality and reliability are critical in aerospace manufacturing. Boeing, together with Iowa State University, tested Model-Based Instructions (MBI) to support the work of operators. The first solution, a desktop MBI, was designed to show the instructions to the workers; its installation was static, and there were numerous situations in which the operator could not see the instructions during the assembly process. The tablet MBI used the same instructions as the desktop MBI but was mounted on a mobile arm. The tablet AR used the same tablet as the tablet MBI solution; however, the operator could see the real world on the display of the tablet, and the software added virtual elements to the video stream. It was observed that of these three solutions, the AR technology yielded the best results with regard to first-time quality, speed and worker efficiency [164,165].
These benefits are in accordance with what was observed in the introduction of general Industry 4.0 solutions [166]. The examination of 385 published applications shows that the most common benefits of Industry 4.0 are the enhanced efficiency (47%), prevention of errors (33%), reduction of cost (33%), employee support (32%) and minimization of lead time (31%). It is worth noting that the importance of communication (31%), human-machine interfaces (25%) and sensor technology (11%) were also highlighted.
The review of application examples showed clearly that the Operator 4.0 concept works in practice, and the following advantages were observed: (1) classical paper-based administration is eliminated; (2) operators can use their arms freely and receive real-time feedback about the manufacturing process; (3) the duration of worker training decreases; and (4) the efficiency of production increases while the number of errors decreases simultaneously in all cases. In summary, operators will be more efficient in smart workplaces, where new opportunities will be available to safeguard their activities and ensure alertness. Production systems will become safer, more controllable and more manageable than ever before. A win-win situation will develop in which humans remain an important element. Operator 4.0 technologies can only bring about these benefits when the manufacturing process is complex and the variety of products is wide. Of course, some advantages can be observed in cases of traditional mass production as well, but it is difficult to compensate for the high investment and development costs of these technologies.

5. Conclusions

This paper provided an overview of what kind of Industrial Internet of Things-based infrastructure should be developed to improve the efficiency of operators in production systems. By following the Operator 4.0 concept proposed by Romero et al. [23,25], the literature survey demonstrated that smart sensors and wearable devices provide the opportunity to integrate operators into the concept of smart factories.
It was highlighted that integrated workspaces should have a modular and integrated architecture, and the development should be based on the concepts of human-in-the-loop cyber-physical systems and intelligent space to ensure low installation and maintenance costs.
In this work, the architecture and infrastructure of Operator 4.0 technologies were surveyed. Monitoring and data-driven analytics are the key to process development [26,138]. There are several exciting model- and algorithm-based aspects of these solutions, e.g., big data, sensor fusion and optimization and machine learning, whose review would also be timely as significant added value and reductions in cost can be achieved by the model-based monitoring, control and optimization of the presented production support systems.

Author Contributions

T.H., S.J. and T.R. prepared the original draft; J.A. created the concept and reviewed and edited the manuscript.

Funding

This research was supported by the National Research, Development and Innovation Office (NKFIH) through the project OTKA-116674 (Process mining and deep learning in the natural sciences and process development) and Széchenyi 2020 under EFOP-3.6.1-16-2016-00015 Smart Specialization Strategy (S3) Comprehensive Institutional Development Program.

Conflicts of Interest

The authors declare that there is no conflict of interest with regard to the publication of this paper.

Abbreviations

The following abbreviations are used in this manuscript:
5S	Workplace organization methodology
ABC	Activity-Based Costing
AI	Artificial Intelligence
AM	Additive Manufacturing
AR	Augmented Reality
BPR	Business Process Reengineering
CNC	Computer Numerical Control
CoBot	Collaborative Robot
CPS	Cyber-Physical System
CPPS	Cyber-Physical Production System
CS	Computer Science
DIND	Distributed Intelligent Network Device
E-SNS	Enterprise Social Networking Service
H-CPS	Human-Cyber-Physical System
H-CPPS	Human-Cyber-Physical Production System
HITL	Human-In-The-Loop
HMI	Human Machine Interface
ICT	Information and Communication Technologies
IoT	Internet of Things
IoS	Internet of Services
IPA	Intelligent Personal Assistant
IPS	Indoor Positioning System
iSpace	Intelligent Space
KPI	Key Performance Indicator
MaaS	Manufacturing as a Service
MBI	Model-Based Instructions
MES	Manufacturing Execution System
MEMS	Micro-Electro Mechanical System
MST	Manufacturing Science and Technology
OEE	Overall Equipment Effectiveness
PaaS	Product-as-a-Service
PwC	PricewaterhouseCoopers
RFID	Radio Frequency IDentification
SFC	Shop Floor Control
SFCS	Shop Floor Control System
UWB	Ultra-Wideband
VR	Virtual Reality

References

  1. Schmidt, R.; Möhring, M.; Härting, R.C.; Reichstein, C.; Neumaier, P.; Jozinović, P. Industry 4.0-potentials for creating smart products: Empirical research results. In International Conference on Business Information Systems; Springer: Cham, Germany, 2015; Volume 12, pp. 16–27.
  2. Pereira, A.; Romero, F. A review of the meanings and the implications of the Industry 4.0 concept. Procedia Manuf. 2017, 13, 1206–1214.
  3. Xu, L.D.; Xu, E.L.; Li, L. Industry 4.0: State of the art and future trends. Int. J. Prod. Res. 2018, 56, 2941–2962.
  4. Martin, K. Innovative Competition with Chinese Characteristics. The Case of ’Made in China 2025’ in Relation to the German Industry. Bachelor’s Thesis, 2018. Available online: https://openaccess.leidenuniv.nl/handle/1887/63733 (accessed on 7 September 2018).
  5. Shubin, T.; Zhi, P. “Made in China 2025” and “Industrie 4.0”—In Motion Together. In The Internet of Things; Springer: New York, NY, USA, 2018; pp. 87–113.
  6. Wang, S.; Wan, J.; Zhang, D.; Li, D.; Zhang, C. Towards smart factory for industry 4.0: A self-organized multi-agent system with big data based feedback and coordination. Comput. Netw. 2016, 101, 158–168.
  7. Chai, X.; Hou, B.; Zou, P.; Zeng, J.; Zhou, J. INDICS: An Industrial Internet Platform. In Proceedings of the IEEE International Conference on Cloud and Big Data Computing, Guangzhou, China, 8–12 October 2018.
  8. Huimin, M.; Wu, X.; Yan, L.; Huang, H.; Wu, H.; Xiong, J.; Zhang, J. Strategic Plan of “Made in China 2025” and Its Implementation. In Analyzing the Impacts of Industry 4.0 in Modern Business Environments; IGI Global: Derry Township, PA, USA, 2018; Volume 23, pp. 1–23.
  9. Ford, S.; Despeisse, M. Additive manufacturing and sustainability: An exploratory study of the advantages and challenges. J. Clean. Prod. 2016, 137, 1573–1587.
  10. Lanza, G.; Nyhuis, P.; Ansari, S.M.; Kuprat, T.; Liebrecht, C. Befähigungs-und Einführungsstrategien für Industrie 4.0. ZWF Zeitschrift Wirtschaftlichen Fabrikbetrieb 2016, 111, 76–79.
  11. Lictblau, K.; Stich, V.; Bertenrath, R.; Blum, M.; Bleider, M.; Millack, A.; Schmitt, K.; Schmitz, E.; Schroter, M. Industrie 4.0 Readiness. Impuls-Stiftung des VDMA Aachen-Köln 2015, 52, 1–77.
  12. Schumacher, A.; Erol, S.; Sihn, W. A maturity model for assessing industry 4.0 readiness and maturity of manufacturing enterprises. Procedia CIRP 2016, 52, 161–166.
  13. Wang, S.; Wan, J.; Li, D.; Zhang, C. Implementing Smart Factory of Industrie 4.0: An Outlook. Int. J. Distrib. Sens. Netw. 2016, 12, 1–12.
  14. Hawksworth, J.; Berriman, R.; Goel, S. Will Robots Really Steal Our Jobs? An International Analysis of the Potential Long Term Impact of Automation. Available online: http://pwc.blogs.com/economics_in_business/2018/02/will-robots-really-steal-our-jobs.html (accessed on 20 July 2018).
  15. Frey, C.B.; Osborne, M. The Future of Employment: How Susceptible Are Jobs to Computerisation. Available online: https://www.oxfordmartin.ox.ac.uk/publications/view/1314 (accessed on 20 July 2018).
  16. PricewaterhouseCoopers. Workforce of the Future: The Competing Forces Shaping 2030. Available online: https://www.pwc.com/gx/en/services/people-organisation/publications/workforce-of-the-future.html (accessed on 20 July 2018).
  17. Parasuraman, R.; Sheridan, T.B.; Wickens, C.D. A model of types and levels of human interaction with automation. IEEE Trans. Syst. Man Cybern. 2000, 30, 286–297.
  18. Munir, S.; Stankovic, J.A.; Liang, C.J.M.; Lin, S. Cyber Physical System Challenges for Human-in-the-Loop Control. In Proceedings of the 8th International Workshop on Feedback Computing, USENIX, San Jose, CA, USA, 24–28 June 2013; Volume 4, pp. 1–4.
  19. Hancock, P.A.; Jagacinski, R.J.; Parasuraman, R.; Wickens, C.D.; Wilson, G.F.; Kaber, D.B. Human-automation interaction research: Past, present, and future. Ergon. Des. 2013, 21, 9–14.
  20. Wickens, C.; Li, H.; Santamaria, A.; Sebok, A.; Sarter, N. Stages and levels of automation: An integrated meta-analysis. In Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting, San Francisco, CA, USA, 1–3 September 2010; Volume 305, pp. 89–393.
  21. Bidanda, B.; Ariyawongrat, P.; Needy, K.L.; Norman, B.A.; Tharmmaphornphilas, W. Human related issues in manufacturing cell design, implementation, and operation: A review and survey. Comput. Ind. Eng. 2005, 48, 507–523.
  22. Roitberg, A.; Perzylo, A.; Somani, N.; Giuliani, M.; Rickert, M.; Knoll, A. Human activity recognition in the context of industrial human-robot interaction. In Proceedings of the Signal and Information Processing Association Annual Summit and Conference (APSIPA), Siem Reap, Cambodia, 9–12 December 2014; Volume 10, pp. 1–10.
  23. Romero, D.; Stahre, J.; Wuest, T.; Noran, O.; Bernus, P.; Fast-Berglund, Å.; Gorecky, D. Towards an Operator 4.0 Typology: A Human-Centric Perspective on the Fourth Industrial Revolution Technologies. In Proceedings of the International Conference on Computers and Industrial Engineering (CIE46), Tianjin, China, 29–31 October 2016; Volume 11, pp. 1–11.
  24. Lorenz, M.; Ruessmann, M.; Strack, R.; Lueth, K.L.; Bolle, M. Man and Machine in Industry 4.0: How Will Technology Transform the Industrial Workforce Through 2025. Available online: https://www.bcg.com/publications/2015/technology-business-transformation-engineered-products-infrastructure-man-machine-industry-4.aspx (accessed on 6 July 2018).
  25. Romero, D.; Bernus, P.; Noran, O.; Stahre, J.; Fast-Berglund, Å. The Operator 4.0: Human Cyber-Physical Systems & Adaptive Automation Towards Human-Automation Symbiosis Work Systems. In Advances in Production Management Systems; Initiatives for a Sustainable World; Springer: New York, NY, USA, 2016; Volume 10, pp. 677–686.
  26. Liao, Y.; Deschamps, F.; de Freitas Rocha Loures, E.; Ramos, L.F.P. Past, present and future of Industry 4.0—A systematic literature review and research agenda proposal. Int. J. Prod. Res. 2017, 55, 3609–3629.
  27. Monostori, L. Cyber-physical production systems: Roots, expectations and R&D challenges. Procedia CIRP 2014, 17, 9–13. [Google Scholar]
  28. Zhou, K.; Liu, T.; Zhou, L. Industry 4.0: Towards future industrial opportunities and challenges. In Proceedings of the 2015 12th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), Zhangjiajie, China, 15–17 August 2015; pp. 2147–2152. [Google Scholar] [CrossRef]
  29. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 19, 1–19. [Google Scholar] [CrossRef]
  30. Russom, P. Big data analytics. TDWI Best Practices Report. Fourth Quart. 2011, 19, 1–34. [Google Scholar]
  31. Chen, H.; Chiang, R.H.; Storey, V.C. Business intelligence and analytics: From big data to big impact. MIS Q. 2012, 24, 1165–1188. [Google Scholar]
  32. Chou, T.L.; ChanLin, L.J. Augmented reality smartphone environment orientation application: A case study of the Fu-Jen University mobile campus touring system. Procedia-Soc. Behav. Sci. 2012, 46, 410–416. [Google Scholar] [CrossRef]
  33. Penttila, K.; Pere, N.; Sioni, M.; Sydanheimo, L.; Kivikoski, M. Use and interface definition of mobile RFID reader integrated in a smart phone. In Proceedings of the Ninth International Symposium on Consumer Electronics, Macau, 14–16 June 2005; pp. 353–358. [Google Scholar] [CrossRef]
  34. Davis, J.; Edgar, T.; Porter, J.; Bernaden, J.; Sarli, M. Smart manufacturing, manufacturing intelligence and demand-dynamic performance. Comput. Chem. Eng. 2012, 47, 145–156. [Google Scholar] [CrossRef]
  35. Zhou, J.; Lee, I.; Thomas, B.; Menassa, R.; Farrant, A.; Sansome, A. In-Situ Support for Automotive Manufacturing Using Spatial Augmented Reality. Int. J. Virtual Real. 2012, 11, 33–41. [Google Scholar]
  36. Olwal, A.; Gustafsson, J.; Lindfors, C. Spatial augmented reality on industrial CNC-machines. The Engineering Reality of Virtual Reality 2008. Int. Soc. Opt. Photonics 2008, 9, 1–9. [Google Scholar] [CrossRef]
  37. Nee, A.Y.; Ong, S.; Chryssolouris, G.; Mourtzis, D. Augmented reality applications in design and manufacturing. CIRP Ann. Manuf. Technol. 2012, 61, 657–679. [Google Scholar] [CrossRef]
  38. Baxter & Sawyer. Rethink Robotics. Available online: http://www.rethinkrobotics.com/sawyer-intera-3/ (accessed on 7 September 2018).
  39. Myers, K.; Berry, P.; Blythe, J.; Conley, K.; Gervasio, M.; McGuinness, D.L.; Morley, D.; Pfeffer, A.; Pollack, M.; Tambe, M. An intelligent personal assistant for task and time management. AI Mag. 2007, 28, 1–27. [Google Scholar]
  40. Wuest, T.; Hribernik, K.; Thoben, K.D. Can a Product Have a Facebook? A New Perspective on Product Avatars in Product Lifecycle Management. In Towards Knowledge-Rich Enterprises; Springer: New York, NY, USA, 2012; pp. 400–410. [Google Scholar]
  41. Sylla, N.; Bonnet, V.; Colledani, F.; Fraisse, P. Ergonomic contribution of ABLE exoskeleton in automotive industry. Int. J. Ind. Ergon. 2014, 7, 475–481. [Google Scholar] [CrossRef]
  42. Mujber, T.S.; Szecsi, T.; Hashmi, M.S. Virtual reality applications in manufacturing process simulation. J. Mater. Process. Technol. 2004, 155-156, 1834–1838. [Google Scholar] [CrossRef]
  43. Darter, B.J.; Wilken, J.M. Gait Training With Virtual Reality-Based Real-Time Feedback: Improving Gait Performance Following Transfemoral Amputation. Phys. Ther. 2011, 91, 1385–1394. [Google Scholar] [CrossRef] [PubMed] [Green Version]
44. Horváth, L.; Rudas, I.J. Role of information content in multipurpose virtual engineering space. In Proceedings of the 2017 IEEE 15th International Symposium on Applied Machine Intelligence and Informatics (SAMI), Herľany, Slovakia, 26–28 January 2017; pp. 99–104. [Google Scholar] [CrossRef]
  45. Silva, R.M.; Benítez-Pina, I.F.; Blos, M.F.; Filho, D.J.S.; Miyagi, P.E. Modeling of reconfigurable distributed manufacturing control systems. IFAC-PapersOnLine 2015, 48, 1284–1289. [Google Scholar] [CrossRef]
  46. Shafiq, S.I.; Sanin, C.; Szczerbicki, E.; Toro, C. Virtual Engineering Factory: Creating Experience Base for Industry 4.0. Cybern. Syst. 2016, 47, 32–47. [Google Scholar] [CrossRef]
  47. Ghobakhloo, M. The future of manufacturing industry: A strategic roadmap toward Industry 4.0. J. Manuf. Technol. Manag. 2018, 29, 910–936. [Google Scholar] [CrossRef]
  48. Posada, J.; Toro, C.; Barandiaran, I.; Oyarzun, D.; Stricker, D.; de Amicis, R.; Pinto, E.B.; Eisert, P.; Döllner, J.; Vallarino, I. Visual Computing as a Key Enabling Technology for Industrie 4.0 and Industrial Internet. IEEE Comput. Graph. Appl. 2015, 35, 26–40. [Google Scholar] [CrossRef] [PubMed]
  49. Rüßmann, M.; Lorenz, M.; Gerbert, P.; Waldner, M.; Justus, J.; Engel, P.; Harnisch, M. Industry 4.0: The Future of Productivity and Growth in Manufacturing Industries. Available online: https://www.bcg.com/publications/2015/engineered_products_project_business_industry_4_future_productivity_growth_manufacturing_industries.aspx (accessed on 20 July 2018).
  50. Gilchrist, A. Industry 4.0: The Industrial Internet of Things; Apress: Berkeley, CA, USA, 2016. [Google Scholar]
  51. Ghobakhloo, M.; Azar, A. Business excellence via advanced manufacturing technology and lean-agile manufacturing. J. Manuf. Technol. Manag. 2018, 29, 2–24. [Google Scholar] [CrossRef]
  52. Pérez Perales, D.; Alarcón, F.; Boza, A. Industry 4.0: A Classification Scheme. In Closing the Gap between Practice and Research in Industrial Engineering; Springer International Publishing: New York, NY, USA, 2018; pp. 343–350. [Google Scholar]
  53. Jiang, P.; Ding, K.; Leng, J. Towards a cyber-physical-social-connected and service-oriented manufacturing paradigm: Social Manufacturing. Manuf. Lett. 2016, 7, 15–21. [Google Scholar] [CrossRef]
  54. MPDV. Industry 4.0: MES Supports Decentralization. Available online: https://www.mpdv.com/media/company/company_magazine/NEWS_International_2015.pdf (accessed on 20 July 2018).
  55. Moreno, A.; Velez, G.; Ardanza, A.; Barandiaran, I.; de Infante, Á.R.; Chopitea, R. Virtualisation process of a sheet metal punching machine within the Industry 4.0 vision. Int. J. Interact. Des. Manuf. (IJIDeM) 2017, 11, 365–373. [Google Scholar] [CrossRef]
  56. Dai, W.; Dubinin, V.N.; Christensen, J.H.; Vyatkin, V.; Guan, X. Toward Self-Manageable and Adaptive Industrial Cyber-Physical Systems with Knowledge-Driven Autonomic Service Management. IEEE Trans. Ind. Inf. 2017, 13, 725–736. [Google Scholar] [CrossRef]
  57. Adamson, G.; Wang, L.; Moore, P. Feature-based control and information framework for adaptive and distributed manufacturing in cyber physical systems. J. Manuf. Syst. 2017, 43, 305–315. [Google Scholar] [CrossRef]
  58. Jin, X.; Haddad, W.M.; Yucelen, T. An Adaptive Control Architecture for Mitigating Sensor and Actuator Attacks in Cyber-Physical Systems. IEEE Trans. Autom. Control 2017, 62, 6058–6064. [Google Scholar] [CrossRef]
  59. Jin, X.; Haddad, W.M.; Hayakawa, T. An adaptive control architecture for cyber-physical system security in the face of sensor and actuator attacks and exogenous stochastic disturbances. Cyber-Phys. Syst. 2018, 4, 39–56. [Google Scholar] [CrossRef]
  60. Lee, J.; Bagheri, B.; Kao, H.A. A Cyber-Physical Systems architecture for Industry 4.0-based manufacturing systems. Manuf. Lett. 2015, 3, 18–23. [Google Scholar] [CrossRef]
  61. Rao, S.S.; Nayak, A. Enterprise Ontology Model for Tacit Knowledge Externalization in Socio-Technical Enterprises. Interdisciplin. J. Inf. Knowl. Manag. 2017, 12, 99–124. [Google Scholar]
  62. Polanyi, M. The Tacit Dimension; University of Chicago Press: Chicago, IL, USA, 1992. [Google Scholar]
  63. Smith, E.A. The role of tacit and explicit knowledge in the workplace. J. Knowl. Manag. 2001, 5, 311–321. [Google Scholar] [CrossRef]
  64. Johnson, T.; Fletcher, S.; Baker, W.; Charles, R. How and why we need to capture tacit knowledge in manufacturing: Case studies of visual inspection. Appl. Ergon. 2019, 74, 1–9. [Google Scholar] [CrossRef]
65. Phipps, D.L.; Meakin, G.H.; Beatty, P.C. Extending hierarchical task analysis to identify cognitive demands and information design requirements. Appl. Ergon. 2011, 42, 741–748. [Google Scholar] [CrossRef] [PubMed]
  66. Everitt, J.; Fletcher, S.; Caird-Daley, A. Task analysis of discrete and continuous skills: A dual methodology approach to human skills capture for automation. Theor. Issues Ergon. Sci. 2015, 16, 513–532. [Google Scholar] [CrossRef]
  67. Ng, W.X.; Chan, H.K.; Teo, W.K.; Chen, I. Programming a Robot for Conformance Grinding of Complex Shapes by Capturing the Tacit Knowledge of a Skilled Operator. IEEE Trans. Autom. Sci. Eng. 2017, 14, 1020–1030. [Google Scholar] [CrossRef]
  68. Tomov, M.; Kuzinovski, M.; Cichosz, P. Development of mathematical models for surface roughness parameter prediction in turning depending on the process condition. Int. J. Mech. Sci. 2016, 113, 120–132. [Google Scholar] [CrossRef]
  69. Lee, W.; Cheung, C. A dynamic surface topography model for the prediction of nano-surface generation in ultra-precision machining. Int. J. Mech. Sci. 2001, 43, 961–991. [Google Scholar] [CrossRef]
  70. Lu, X.; Zhang, H.; Jia, Z.; Feng, Y.; Liang, S.Y. Floor surface roughness model considering tool vibration in the process of micro-milling. Int. J. Adv. Manuf. Technol. 2018, 94, 4415–4425. [Google Scholar] [CrossRef]
  71. Urbikain, G.; de Lacalle, L.L. Modelling of surface roughness in inclined milling operations with circle-segment end mills. Simul. Model. Pract. Theory 2018, 84, 161–176. [Google Scholar] [CrossRef]
  72. Vicente, K.J.; Mumaw, R.J.; Roth, E.M. Operator monitoring in a complex dynamic work environment: A qualitative cognitive model based on field observations. Theor. Issues Ergon. Sci. 2004, 5, 359–384. [Google Scholar] [CrossRef]
  73. Nasoz, F.; Alvarez, K.; Lisetti, C.L.; Finkelstein, N. Emotion recognition from physiological signals using wireless sensors for presence technologies. Cogn. Technol. Work 2004, 6, 4–14. [Google Scholar] [CrossRef]
  74. Rao, P.K.; Liu, J.P.; Roberson, D.; Kong, Z.J.; Williams, C. Online real-time quality monitoring in additive manufacturing processes using heterogeneous sensors. J. Manuf. Sci. Eng. 2015, 6, 1–6. [Google Scholar] [CrossRef]
  75. Almagrabi, H.; Malibari, A.; McNaught, J. A Survey of Quality Prediction of Product Reviews. Int. J. Adv. Comput. Sci. Appl. 2015, 10, 49–58. [Google Scholar] [CrossRef]
76. Bejczy, A. Virtual reality in manufacturing. In Re-engineering for Sustainable Industrial Production; Springer: New York, NY, USA, 1997; Volume 13, pp. 48–60. [Google Scholar]
  77. Shiratuddin, M.F.; Zulkifli, A.N. Virtual reality in manufacturing. In Proceedings of the Management Education for the 21st Century, Ho Chi Minh City, Vietnam, 12–14 September 2001. [Google Scholar]
  78. Kopácsi, S.; Sárközy, F. Virtual Reality in Manufacturing. Available online: http://old.sztaki.hu/~kopacsi/vr/vr_main.htm (accessed on 20 July 2018).
  79. Azuma, R.T. A survey of augmented reality. Presence-Teleoper. Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  80. Caudell, T.P.; Mizell, D.W. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Kauai, HI, USA, 7–10 January 1992; Volume 2, pp. 659–669. [Google Scholar]
  81. Nee, A.Y.; Ong, S.K. Virtual and Augmented Reality Applications in Manufacturing. IFAC Proc. Vol. 2013, 46, 15–26. [Google Scholar] [CrossRef] [Green Version]
  82. Tuegel, E.J.; Ingraffea, A.R.; Eason, T.G.; Spottswood, S.M. Reengineering aircraft structural life prediction using a digital twin. Int. J. Aerosp. Eng. 2011, 15, 1–15. [Google Scholar] [CrossRef]
  83. Tao, F.; Cheng, J.; Qi, Q.; Zhang, M.; Zhang, H.; Sui, F. Digital twin-driven product design, manufacturing and service with big data. Int. J. Adv. Manuf. Technol. 2018, 4, 3563–3576. [Google Scholar] [CrossRef]
84. Boschert, S.; Rosen, R. Digital Twin—The Simulation Aspect. In Mechatronic Futures: Challenges and Solutions for Mechatronic Systems and their Designers; Springer: New York, NY, USA, 2016; Volume 16, pp. 59–74. [Google Scholar]
  85. Xiaobo, Z.; Ohno, K. Algorithms for sequencing mixed models on an assembly line in a JIT production system. Comput. Ind. Eng. 1997, 32, 47–56. [Google Scholar] [CrossRef]
  86. Xiaobo, Z.; Ohno, K.; Lau, H.S. A Balancing Problem for Mixed Model Assembly Lines with a Paced Moving Conveyor. Naval Res. Logist. 2004, 19, 446–464. [Google Scholar] [CrossRef]
  87. Xiaobo, Z.; Ohno, K. Properties of a sequencing problem for a mixed model assembly line with conveyor stoppages. Eur. J. Oper. Res. 2000, 11, 560–570. [Google Scholar] [CrossRef]
  88. Xiaobo, Z.; Zhou, Z.; Asres, A. Note on Toyota’s goal of sequencing mixed models on an assembly line. Comput. Ind. Eng. 1999, 9, 57–65. [Google Scholar] [CrossRef]
  89. Barrett, J.C. A Monte Carlo simulation of human reproduction. Genus 1969, 22, 1–22. [Google Scholar]
  90. Raychaudhuri, S. Introduction to Monte Carlo simulation. In Proceedings of the 2008 Winter Simulation Conference, Miami, FL, USA, 7–10 December 2008; pp. 91–100. [Google Scholar]
  91. Migliaccio, G.C.; Cheng, T.; Gatti, U.C.; Teizer, J. Data Fusion of Real-Time Location Sensing (RTLS) and Physiological Status Monitoring (PSM) for Ergonomics Analysis of Construction Workers. In Proceedings of the 19th Triennial CIB World Building Congress, Brisbane, Australia, 5–9 May 2013; Volume 12, pp. 1–12. [Google Scholar]
  92. Petriu, E.M.; Georganas, N.D.; Petriu, D.C.; Makrakis, D.; Groza, V.Z. Sensor-based information appliances. IEEE Instrum. Meas. Mag. 2000, 3, 31–35. [Google Scholar]
  93. Kong, X.T.R.; Luo, H.; Huang, G.Q.; Yang, X. Industrial wearable system: The human-centric empowering technology in Industry 4.0. J. Intell. Manuf. 2018, 17, 1–17. [Google Scholar] [CrossRef]
  94. Kong, X.T.; Yang, X.; Huang, G.Q.; Luo, H. The impact of industrial wearable system on industry 4.0. In Proceedings of the 2018 IEEE 15th International Conference on Networking, Sensing and Control (ICNSC), Zhuhai, China, 27–29 March 2018; pp. 1–6. [Google Scholar]
  95. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of wireless indoor positioning techniques and systems. IEEE Trans. Syst. Man Cybern. Part C 2007, 14, 1067–1080. [Google Scholar] [CrossRef]
  96. Gu, Y.; Lo, A.; Niemegeers, I. A survey of indoor positioning systems for wireless personal networks. IEEE Commun. Surv. Tutor. 2009, 20, 13–32. [Google Scholar] [CrossRef]
  97. Mautz, R.; Tilch, S. Survey of optical indoor positioning systems. In Proceedings of the 2011 International Conference on Indoor Positioning and Indoor Navigation, Guimaraes, Portugal, 21–23 September 2011; pp. 1–7. [Google Scholar]
  98. Saab, S.S.; Nakad, Z.S. A standalone RFID indoor positioning system using passive tags. IEEE Trans. Ind. Electron. 2011, 10, 1961–1970. [Google Scholar] [CrossRef]
  99. Mautz, R. Indoor positioning technologies. ETH Zurich 2012, 129, 1–129. [Google Scholar]
  100. Kagermann, H.; Helbig, J.; Hellinger, A.; Wahlster, W. Recommendations for Implementing the Strategic Initiative INDUSTRIE 4.0: Securing the Future of German Manufacturing Industry; Final Report of the Industrie 4.0; Federal Ministry of Education and Research: Berlin, Germany, 2013. [Google Scholar]
  101. Lee, J.H.; Hashimoto, H. Intelligent Space, its past and future. In Proceedings of the Industrial Electronics Society IECON’99, San Jose, CA, USA, 29 November–3 December 1999; Volume 6, pp. 126–131. [Google Scholar]
  102. Hashimoto, H. Intelligent space: Interaction and intelligence. Artif. Life Robot. 2003, 7, 79–85. [Google Scholar] [CrossRef]
  103. Yan, H.; Zhu, K.; Ling, Y. The user-resource-task model in intelligent interaction space. In Proceedings of the 2015 4th International Conference on Computer Science and Network Technology (ICCSNT), Harbin, China, 19–20 December 2015; Volume 1, pp. 768–771. [Google Scholar]
  104. Zhou, J.; Leppanen, T.; Harjula, E.; Ylianttila, M.; Ojala, T.; Yu, C.; Jin, H.; Yang, L.T. Cloudthings: A common architecture for integrating the internet of things with cloud computing. In Proceedings of the Computer Supported Cooperative Work in Design (CSCWD), Hsinchu, Taiwan, 21–23 May 2014; pp. 651–657. [Google Scholar]
  105. Szász, C. Reconfigurable electronics application in intelligent space developments. Int. Rev. Appl. Sci. Eng. 2017, 8, 107–111. [Google Scholar] [CrossRef] [Green Version]
  106. Lee, J.H.; Hashimoto, H. Intelligent space—Concept and contents. Adv. Robot. 2002, 16, 265–280. [Google Scholar] [CrossRef]
  107. Krahnstoever, N.; Rittscher, J.; Tu, P.; Chean, K.; Tomlinson, T. Activity Recognition using Visual Tracking and RFID. In Proceedings of the 2005 Seventh IEEE Workshops on Applications of Computer Vision (WACV/MOTION’05), Breckenridge, CO, USA, 5–7 January 2005; Volume 1, pp. 494–500. [Google Scholar]
  108. Sardroud, J.M. Influence of RFID technology on automated management of construction materials and components. Sci. Iran. 2012, 19, 381–392. [Google Scholar] [CrossRef]
  109. Huang, G.Q.; Zhang, Y.; Jiang, P. RFID-based wireless manufacturing for walking-worker assembly islands with fixed-position layouts. Robot. Comput.-Integr. Manuf. 2007, 9, 469–477. [Google Scholar] [CrossRef]
  110. Huang, G.Q.; Zhang, Y.; Jiang, P. RFID-based wireless manufacturing for real-time management of job shop WIP inventories. Int. J. Adv. Manuf. Technol. 2008, 13, 752–764. [Google Scholar] [CrossRef]
  111. Satoh, I. A mobile agent-based framework for location-based services. In Proceedings of the 2004 IEEE International Conference on Communications (IEEE Cat. No.04CH37577), Paris, France, 20–24 June 2004; Volume 3, pp. 1355–1359. [Google Scholar]
  112. Chao, H. The non-specific intelligent guided-view system based on RFID technology. In Proceedings of the 19th International Conference on Advanced Information Networking and Applications (AINA’05), Washington, DC, USA, 25–30 March 2005; Volume 2, pp. 580–585. [Google Scholar]
  113. Smith, J.R.; Fishkin, K.P.; Jiang, B.; Mamishev, A.; Philipose, M.; Rea, A.D.; Roy, S.; Sundara-Rajan, K. RFID-based techniques for human-activity detection. Commun. ACM 2005, 6, 39–44. [Google Scholar] [CrossRef]
  114. Leitold, D.; Vathy-Fogarassy, A.; Varga, K.; Abonyi, J. RFID-based task time analysis for shop floor optimization. In Proceedings of the 2018 IEEE International Conference on Future IoT Technologies (Future IoT), Eger, Hungary, 18–19 January 2018; pp. 1–6. [Google Scholar]
  115. Stiefmeier, T.; Roggen, D.; Ogris, G.; Lukowicz, P.; Tröster, G. Wearable activity tracking in car manufacturing. IEEE Pervasive Comput. 2008, 7, 1–7. [Google Scholar] [CrossRef]
  116. Koskimaki, H.; Huikari, V.; Siirtola, P.; Laurinen, P.; Roning, J. Activity recognition using a wrist-worn inertial measurement unit: A case study for industrial assembly lines. In Proceedings of the 2009 17th Mediterranean Conference on Control and Automation, Thessaloniki, Greece, 24–26 June 2009; pp. 401–405. [Google Scholar]
  117. Ward, J.A.; Lukowicz, P.; Troster, G.; Starner, T.E. Activity recognition of assembly tasks using body-worn microphones and accelerometers. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 1553–1567. [Google Scholar] [CrossRef] [PubMed]
118. Huikari, V.; Koskimäki, H.; Siirtola, P.; Röning, J. User-independent activity recognition for industrial assembly lines—Feature vs. instance selection. In Proceedings of the 5th International Conference on Pervasive Computing and Applications, Maribor, Slovenia, 1–3 December 2010; pp. 307–312. [Google Scholar]
  119. Voulodimos, A.S.; Doulamis, N.D.; Kosmopoulos, D.I.; Varvarigou, T.A. Improving multi-camera activity recognition by employing neural network based readjustment. Appl. Artif. Intell. 2012, 26, 97–118. [Google Scholar] [CrossRef]
120. Kim, S.C.; Jeong, Y.S.; Park, S.O. RFID-based indoor location tracking to ensure the safety of the elderly in smart home environments. Pers. Ubiquitous Comput. 2013, 17, 1699–1707. [Google Scholar] [CrossRef]
  121. Gladysz, B.; Santarek, K.; Lysiak, C. Dynamic Spaghetti Diagrams. A Case Study of Pilot RTLS Implementation. In Intelligent Systems in Production Engineering and Maintenance—ISPEM 2017; Springer: New York, NY, USA, 2018; pp. 238–248. [Google Scholar]
  122. Yang, Z.; Zhang, P.; Chen, L. RFID-enabled indoor positioning method for a real-time manufacturing execution system using OS-ELM. Neurocomputing 2016, 14, 121–133. [Google Scholar] [CrossRef]
  123. Blum, M.; Schuh, G. Towards a Data-oriented Optimization of Manufacturing Processes. In Proceedings of the 19th International Conference on Enterprise Information Systems, Porto, Portugal, 26–29 April 2017; Volume 8, pp. 257–264. [Google Scholar]
124. Hodgins, D.; Simmonds, D. The electronic nose and its application to the manufacture of food products. J. Anal. Methods Chem. 1995, 7, 179–185. [Google Scholar] [CrossRef] [PubMed]
  125. Appenzeller, G.; Lee, J.H.; Hashimoto, H. Building topological maps by looking at people: An example of cooperation between intelligent spaces and robots. In Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robot and Systems Innovative Robotics for Real-World Applications, Grenoble, France, 24–28 September 1997; Volume 3, pp. 1326–1333. [Google Scholar]
  126. Agin, G.J. Computer Vision Systems for Industrial Inspection and Assembly. Computer 1980, 13, 11–20. [Google Scholar] [CrossRef]
  127. Haralick, R.M.; Shapiro, L.G. Computer and Robot Vision, 1st ed.; Addison-Wesley Reading: Boston, MA, USA, 1992. [Google Scholar]
  128. Chen, S.E. QuickTime VR: An Image-based Approach to Virtual Environment Navigation. In Proceedings of the 22nd Annual Conference on Computer Graphics and Interactive Techniques, San Francisco, CA, USA, 22–26 July 1995; pp. 29–38. [Google Scholar]
  129. Maggioni, C. A novel gestural input device for virtual reality. In Proceedings of the IEEE Virtual Reality Annual International Symposium, Seattle, WA, USA, 18–22 September 1993; pp. 118–124. [Google Scholar]
  130. Fleck, S.; Strasser, W. Adaptive Probabilistic Tracking Embedded in a Smart Camera. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05)—Workshops, San Diego, CA, USA, 20–25 June 2005; p. 134. [Google Scholar]
  131. Zhai, C.; Zou, Z.; Zhou, Q.; Mao, J.; Chen, Q.; Tenhunen, H.; Zheng, L.; Xu, L. A 2.4-GHz ISM RF and UWB hybrid RFID real-time locating system for industrial enterprise Internet of Things. Enterp. Inf. Syst. 2017, 11, 909–926. [Google Scholar] [CrossRef]
  132. Hahnel, D.; Burgard, W.; Fox, D.; Fishkin, K.; Philipose, M. Mapping and localization with RFID technology. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; pp. 1015–1020. [Google Scholar]
  133. Yang, P.; Wu, W.; Moniri, M.; Chibelushi, C.C. Efficient object localization using sparsely distributed passive RFID tags. IEEE Trans. Ind. Electron. 2013, 11, 5914–5924. [Google Scholar] [CrossRef]
  134. Yang, P.; Wu, W. Efficient particle filter localization algorithm in dense passive RFID tag environment. IEEE Trans. Ind. Electron. 2014, 11, 5641–5651. [Google Scholar] [CrossRef]
  135. Hanssens, N.; Kulkarni, A.; Tuchida, R.; Horton, T. Building Agent-Based Intelligent Workspaces. In Proceedings of the International Conference on Internet Computing, Las Vegas, NV, USA, 24–27 June 2002; pp. 675–681. [Google Scholar]
  136. Morganti, E.; Angelini, L.; Adami, A.; Lalanne, D.; Lorenzelli, L.; Mugellini, E. A smart watch with embedded sensors to recognize objects, grasps and forearm gestures. Procedia Eng. 2012, 7, 1169–1175. [Google Scholar] [CrossRef]
  137. Chouhan, T.; Panse, A.; Voona, A.K.; Sameer, S. Smart glove with gesture recognition ability for the hearing and speech impaired. In Proceedings of the 2014 IEEE Global Humanitarian Technology Conference—South Asia Satellite (GHTC-SAS), San Jose, CA, USA, 18–21 October 2014; pp. 105–110. [Google Scholar]
  138. Li, X.; Li, D.; Wan, J.; Vasilakos, A.V.; Lai, C.F.; Wang, S. A review of industrial wireless networks in the context of Industry 4.0. Wirel. Netw. 2017, 23, 23–41. [Google Scholar] [CrossRef]
  139. Cho, H.; Wysk, R.A. Intelligent workstation controller for computer-integrated manufacturing: Problems and models. J. Manuf. Syst. 1994, 165, 1–165. [Google Scholar] [CrossRef]
  140. Ryoo, M.S.; Grauman, K.; Aggarwal, J.K. A task-driven intelligent workspace system to provide guidance feedback. Comput. Vision Image Understand. 2010, 15, 520–534. [Google Scholar] [CrossRef]
  141. Kruppa, M.; Spassova, L.; Schmitz, M. The virtual room inhabitant–intuitive interaction with intelligent environments. In Australasian Joint Conference on Artificial Intelligence; Springer: New York, NY, USA, 2005; Volume 10, pp. 225–234. [Google Scholar]
142. Kaber, D.B.; Perry, C.M.; Segall, N.; McClernon, C.K.; Prinzel, L.J., III. Situation awareness implications of adaptive automation for information processing in an air traffic control-related task. Int. J. Ind. Ergon. 2006, 16, 447–462. [Google Scholar] [CrossRef]
  143. Perera, C.; Liu, C.H.; Jayawardena, S. The Emerging Internet of Things Marketplace From an Industrial Perspective: A Survey. IEEE Trans. Emerg. Top. Comput. 2015, 14, 585–598. [Google Scholar] [CrossRef]
  144. Sun, J.; Gao, M.; Wang, Q.; Jiang, M.; Zhang, X.; Schmitt, R. Smart services for enhancing personal competence in industrie 4.0 digital factory. Logforum 2018, 8, 51–57. [Google Scholar] [CrossRef]
145. Obitko, M.; Jirkovský, V. Big Data Semantics in Industry 4.0. In Industrial Applications of Holonic and Multi-Agent Systems; Springer International Publishing: New York, NY, USA, 2015; pp. 217–229. [Google Scholar]
  146. AGCO. AGCO Innovations in Manufacturing with Glass. Available online: https://news.agcocorp.com/topics/agco-innovations-in-manufacturing-with-glass (accessed on 6 July 2018).
  147. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 4, 61–64. [Google Scholar] [CrossRef]
  148. Chan, M.; Estève, D.; Fourniols, J.Y.; Escriba, C.; Campo, E. Smart wearable systems: Current status and future challenges. Artif. Intell. Med. 2012, 20, 137–156. [Google Scholar] [CrossRef] [PubMed]
  149. Appelboom, G.; Camacho, E.; Abraham, M.E.; Bruce, S.S.; Dumont, E.L.; Zacharia, B.E.; D’Amico, R.; Slomian, J.; Reginster, J.Y.; Bruyère, O.; et al. Smart wearable body sensors for patient self-assessment and monitoring. Arch. Public Health 2014, 72, 28–37. [Google Scholar] [CrossRef] [PubMed]
  150. Manogaran, G.; Thota, C.; Lopez, D.; Sundarasekar, R. Big Data Security Intelligence for Healthcare Industry 4.0. In Cybersecurity for Industry 4.0: Analysis for Design and Manufacturing; Thames, L., Schaefer, D., Eds.; Springer International Publishing: New York, NY, USA, 2017; Volume 24, pp. 103–126. [Google Scholar]
151. Caldarola, E.G.; Modoni, G.E.; Sacco, M. A Knowledge-based Approach to Enhance the Workforce Skills and Competences within the Industry 4.0. In eKNOW 2018: The Tenth International Conference on Information, Process, and Knowledge Management; IARIA XPS Press: Rome, Italy, 2018. [Google Scholar]
  152. Miller, S. AI: Augmentation, more so than automation. Asian Manag. Insight 2018, 20, 1–20. [Google Scholar]
153. Klinker, K.; Berkemeier, L.; Zobel, B.; Wüller, H.; Huck-Fries, V.; Wiesche, M.; Remmers, H.; Thomas, O.; Krcmar, H. Structure for innovations: A use case taxonomy for smart glasses in service processes. In Proceedings of the Multikonferenz Wirtschaftsinformatik, Lüneburg, Germany, 6–9 March 2018; Volume 12, pp. 1599–1610.
154. DHL. Augmented Reality in Logistics. Available online: http://www.dhl.com/content/dam/downloads/g0/about_us/logistics_insights/csi_augmented_reality_report_290414.pdf (accessed on 20 July 2018).
155. Spitzer, M.; Nanic, I.; Ebner, M. Distance Learning and Assistance Using Smart Glasses. Educ. Sci. 2018, 8, 21.
156. Hao, Y.; Helo, P. The role of wearable devices in meeting the needs of cloud manufacturing: A case study. Robot. Comput.-Integr. Manuf. 2017, 45, 168–179.
157. Mejia Orozco, E.I.; Luciano, C.J. Introduction to Haptics. In Comprehensive Healthcare Simulation: Neurosurgery; Springer: New York, NY, USA, 2018; pp. 141–151.
158. HaptX. Available online: https://www.roadtovr.com/haptx-vr-glove-micro-pneumatic-haptics-force-feedback-axonvr/ (accessed on 20 July 2018).
159. VRgluv. Available online: https://vrgluv.com/ (accessed on 10 July 2018).
160. Garrec, P.; Friconneau, J.P.; Measson, Y.; Perrot, Y. ABLE, an innovative transparent exoskeleton for the upper-limb. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 1483–1488.
161. van der Vorm, J.; Nugent, R.; O’Sullivan, L. Safety and Risk Management in Designing for the Lifecycle of an Exoskeleton: A Novel Process Developed in the Robo-Mate Project. Procedia Manuf. 2015, 3, 1410–1417.
162. Seth, A.; Vance, J.M.; Oliver, J.H. Virtual reality for assembly methods prototyping: A review. Virtual Real. 2011, 15, 5–20.
163. Leitão, P.; Colombo, A.W.; Karnouskos, S. Industrial automation based on cyber-physical systems technologies: Prototype implementations and challenges. Comput. Ind. 2016, 81, 11–25.
164. Richardson, T.; Gilbert, S.B.; Holub, J.; Thompson, F.; MacAllister, A.; Radkowski, R.; Winer, E.; Boeing Company. Fusing Self-Reported and Sensor Data from Mixed-Reality Training. In Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), Orlando, FL, USA, 1–4 December 2014.
165. Frigo, M.A.; da Silva, E.C.C.; Barbosa, G.F. Augmented Reality in Aerospace Manufacturing: A Review. J. Ind. Intell. Inf. 2016, 4, 125–130.
166. Bauer, W.; Schlund, S.; Hornung, T.; Schuler, S. Digitalization of Industrial Value Chains—A Review and Evaluation of Existing Use Cases of Industry 4.0 in Germany. Sci. J. Logist. 2018, 14, 331–340.
Figure 1. (R)evolution of the tasks of operators in manufacturing systems.
Figure 2. Architecture of cyber-physical systems.
Figure 3. Intelligent Space (iSpace)-based integrated sensor signals can be used to monitor the work of the operators, extract their tacit knowledge, synchronize activities and provide contextualized information.
Figure 4. The design of connections between resources, users and tasks is the key to the design of intelligent interaction space.
Table 1. Elements of the Operator 4.0 methodology according to [23,25].
| Type of Operator 4.0 | Description | Examples |
|---|---|---|
| Analytical operator | The application of big data analytics in real-time smart manufacturing. | Discovering useful information and predicting relevant events [30,31]. |
| Augmented operator | Augmented Reality (AR)-based enrichment of the factory environment. AR improves information transfer from the digital to the physical world. | Smartphones or tablets used as Radio Frequency IDentification (RFID) readers can become key tools of smart manufacturing [32,33,34]. Spatial AR projectors support automotive manufacturing [35,36,37]. |
| Collaborative operator | Collaborative robots (CoBots) are designed to work in direct cooperation with operators to perform repetitive and non-ergonomic tasks. | Rethink Robotics promises low-cost and easy-to-use collaborative robots with Baxter and Sawyer [38]. |
| Healthy operator | Wearable trackers are designed to measure activity, stress, heart rate and other health-related metrics, as well as GPS location and other personal data. | Apple Watch, Fitbit and Android Wear-based solutions have already been developed [23]. Military applications can predict potentially problematic situations before they arise [23]. |
| Smarter operator | Intelligent Personal Assistant (IPA)-based solutions that utilize artificial intelligence. | Help the operator to interact with machines, computers, databases and other information systems [39]. |
| Social operator | Enterprise Social Networking Services (E-SNS) focus on the use of mobile and social collaborative methods to connect smart operators on the shop floor with smart factory resources. | The Social Internet of Industrial Things interacts, shares and creates information to support decision-making [40]. |
| Super-strength operator | Powered exoskeletons are wearable, lightweight and flexible biomechanical systems. | Powered mechanics increase the strength of a human operator for effortless manual functions [41]. |
| Virtual operator | Virtual Reality (VR) is an immersive, interactive, computer-simulated reality that can digitally replicate a design, assembly or manufacturing environment and allow the operator to interact with any presence within it. | Provide users with an environment to explore the outcomes of their decisions without putting themselves or the environment at risk [42]. A VR-based gait training program provides real-time feedback [43]. Multi-purpose virtual engineering space [44]. |
Table 2. Design principles of Industry 4.0 applied to Operator 4.0 solutions.
| Design Principle | Description | Application |
|---|---|---|
| System integration | Combines subsystems into one system. Vertical integration connects manufacturing systems and technologies [48]; horizontal integration connects functions and data across the value chain [49]. | Analytical operator |
| Modularity | Important for the ability of the manufacturing system to adapt to continuous changes [50,51,52]. | Augmented operator |
| Interoperability | Allows human resources, smart products and smart factories to connect, communicate and operate together [50]. The standardization of data is critical for interoperability because the components have to understand each other. | Collaborative operator |
| Product personalization | The system has to adapt to frequent product changes [53]. | Smarter operator |
| Decentralization | Based on a distributed approach in which the system consists of autonomous parts that can act independently [50]. It simplifies the structure of the system, which simplifies the planning and coordination of processes and increases reliability [54]. | |
| Corporate social responsibility | Involves environmental and labor regulations. | Social operator |
| Virtualization | Uses a digital twin, i.e., all data from the physical world are presented in a cyber-physical model [55]. | Virtual operator |
Table 3. Levels of cyber-physical systems from the perspective of operators.
| Level | Function | Example |
|---|---|---|
| Configuration | Self-optimize, self-adjust, self-configure | Prediction and online feedback with regard to quality issues [74,75] |
| Cognition | Collaborative diagnostic and decision-making | Virtual Reality (VR) [76,77,78] |
| Cognition | Remote visualization for humans | Augmented Reality (AR) [79,80,81] |
| Cyber | Digital twin | Decision-making based on a digital twin [82,83,84] |
| Cyber | Model of the operator | Worker-movement diagram [85,86,87,88]; Monte Carlo simulation of a stochastic process model [89,90] |
| Conversion | Smart analytics; degradation and performance prediction | Online performance monitoring based on sensor fusion [91,92] |
| Connection | Sensor network | Wearable tracker [93,94]; indoor positioning system [95,96,97,98,99] |
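Table 3 lists Monte Carlo simulation of a stochastic process model as one way to build the cyber-level model of the operator. As a minimal illustrative sketch, not taken from any of the surveyed works, the following estimates the expected completion time of a serial manual process whose task durations are assumed to be exponentially distributed; the function name and parameters are hypothetical:

```python
import random

def simulate_throughput(task_means, n_runs=10_000, seed=42):
    """Monte Carlo estimate of the mean completion time of a serial
    process; each task duration is drawn from an exponential
    distribution with the given mean (a simplifying assumption)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        # One simulated run: sum the sampled durations of all tasks.
        total += sum(rng.expovariate(1.0 / m) for m in task_means)
    return total / n_runs

# For independent tasks, the estimate converges to the sum of the means:
est = simulate_throughput([5.0, 3.0, 2.0])  # close to 10.0
```

Replacing the exponential draws with empirically fitted duration distributions (e.g., from indoor-positioning logs) turns the same loop into a data-driven operator model.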
Table 4. Applications of indoor positioning systems in production management.
| Application Area | Description | Examples |
|---|---|---|
| Performance monitoring | Measure the effects of process development and Business Process Reengineering (BPR). | Analyze the moving and staying time of operators [120]. |
| Movement analysis | Spaghetti diagram of operator movement to reduce unnecessary movement and optimize the layout and supply chain. | Reduce the duration of material handling [121]. Reduce the number of unnecessary operator movements [120]. Support real-time Manufacturing Execution Systems (MES) [122]. |
| Support of 5S workplace organization projects | Track tools and optimize their place of application and storage. | Decrease stock and scrap; improve activity times [120]. |
| Digital twin | Process on-line information directly inside process-simulation tools; provide the real-time architecture for the digital twin method. | The main elements of the real-time architecture are the digital twin and the IPS [123]. |
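At its core, the moving- and staying-time analysis in Table 4 reduces to segmenting a timestamped position track by speed. A minimal sketch under assumed conditions (2D positions in meters, a hypothetical speed threshold, and noise-free samples):

```python
import math

def split_moving_staying(track, speed_threshold=0.2):
    """Split a timestamped (t, x, y) track into moving vs. staying time.
    Segments whose average speed (m/s) exceeds the threshold count as
    moving; the threshold value is an illustrative assumption."""
    moving = staying = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip out-of-order or duplicated samples
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        if speed > speed_threshold:
            moving += dt
        else:
            staying += dt
    return moving, staying

# Operator walks for 2 s, then stands still for 3 s:
track = [(0, 0.0, 0.0), (1, 1.0, 0.0), (2, 2.0, 0.0), (5, 2.0, 0.0)]
moving, staying = split_moving_staying(track)  # → (2.0, 3.0)
```

Real IPS data would additionally need filtering of positioning noise before such a speed-based split is reliable.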
Table 5. Sensors of Operator 4.0 solutions.
| Type of Operator 4.0 | Type of Sensor | Examples |
|---|---|---|
| Analytical operator | Infra-red sensors | Discover and predict events [102] |
| Analytical operator | Olfactory sensors | Electronic nose [124] |
| Analytical operator | Microphones | Capturing voices and the location of speakers [125] |
| Augmented operator | Visual sensors | Machine vision systems for quality inspection [126,127] |
| Virtual operator | Visual sensors | Image processing, e.g., panoramic images [128], and creating virtual-reality environments [129]; smart camera for probabilistic tracking [130] |
| Collaborative operator | Localization sensors | IPS in manufacturing [95] and hybrid locating systems [131]; mapping and localization using RFID technology [132]; efficient object localization using passive RFID tags [133,134] |
| Social operator | | Smart and social factories based on the connection between machines, products and humans [135] |
| Smarter and healthy operator | Wearable sensors | Smart watch with embedded sensors to recognize objects [136]; smart glove that maps the orientation of the hand and fingers with the help of bend sensors [137] |
Table 6. Feedback technologies for Operator 4.0 solutions.
| Operator 4.0 | Feedback | Technologies | Examples |
|---|---|---|---|
| Analytical operator | Report/potential danger | Smart glasses, smartphones, tablets and personal displays | Big data-based development of a manufacturing process [145]. |
| Augmented operator | Each possible feedback | Smart glasses | AR for tractor manufacturing [146]. Smart glasses [23,26]. |
| Collaborative operator | Waiting for interaction/technical problem | Smart glasses, smartphones, tablets, personal displays, headsets and smartwatches | Collaborative operator workspace [147]. |
| Healthy operator | Need rest; change activity; need a medical test | Smart glasses, smartphones, tablets, personal displays and headsets | Measurement of physiological parameters [148,149]. Security issues [150]. |
| Smarter operator | Answer to a question; notice about an event; process | Smart glasses, smartphones, tablets, personal displays and headsets | Chatbot [151] and AI-based support for operators [152]. |
| Social operator | Emergency; process; manufacturing; technical information | Smart glasses, smartphones, tablets, personal displays and headsets | Facebook-based product avatar [40] and Social Manufacturing (SocialM) [53]. |
| Super-strength operator | Optimal route/targeting/training | Smart glasses, tablets and smartphones | Navigation [153,154] and targeting [154,155,156]. |
| Super-strength operator | Force feedback on the hand or whole arm | Smart gloves and special exoskeletons | HaptX [157,158], VRgluv [159] and the ABLE project [41,160] are such technologies. |
| Super-strength operator | Danger indicator | Smart glasses and speakers | Safety and risk management related to exoskeleton technology [161]. |
| Virtual operator | Collision/weight/pressure | Smart clothes/smart gloves | VR technology in prototyping and testing [162]. This kind of technology becomes more efficient with every wearable feedback device (e.g., smart gloves [163]) that uses (secondary) human senses directly. |

Share and Cite

MDPI and ACS Style

Ruppert, T.; Jaskó, S.; Holczinger, T.; Abonyi, J. Enabling Technologies for Operator 4.0: A Survey. Appl. Sci. 2018, 8, 1650. https://doi.org/10.3390/app8091650

