1. Introduction
The emerging “Industry 4.0” concept is an umbrella term for a new industrial paradigm that embraces a set of future industrial developments, including cyber-physical systems (CPS), the Internet of things (IoT), the Internet of services (IoS), robotics, big data, cloud manufacturing and augmented reality [1]. Industrial processes need most tasks to be conducted locally due to time delays and security constraints, while structured data needs to be communicated over the Internet. Fog computing is a promising intermediate software layer for various industrial scenarios. It can reduce and refine high-volume industrial data locally before sending it to the cloud, and it can provide local processing support with acceptable latency for actuators and robots in a manufacturing plant [2]. The lack of interoperability between devices in the Industrial Internet of things (IIoT) considerably increases the complexity and cost of IIoT implementation and integration. The search for seamless interoperability is further complicated by the long lifetime of typical industrial equipment, which requires costly upgrades or replacements to work with the newest technologies [3].
One of the novelties of autonomous robots applied to Industry 4.0 is the generalized use of drones (unmanned aerial vehicles, UAVs) to carry out a multitude of inspection and data collection tasks. In this paper, the focus is on the construction industry, since it is one of the sectors that has traditionally applied less advanced technology and is therefore well suited to the new technologies of Industry 4.0 [4]. Construction companies have mostly been using UAVs for real-time jobsite monitoring and to provide high-definition (HD) videos and images for identifying changes and solving or preventing many issues [5]. They are also used for inspection and maintenance tasks that are inaccessible, dangerous, or costly from the ground [6].
Integrating UAVs into the IoT represents an interoperability challenge, as every IoT system has its own communications protocol. Moreover, a small error or delay beyond the tolerated limit could result in disaster in applications such as UAV and aircraft manufacture and monitoring. While the IoT provides Internet access to any ‘thing’, UAVs can also be part of these connected things and send their on-board data to the cloud [7]. The off-board base station gives them higher computational capacity and the ability to carry out more complex actions using high-level programming languages, or to leverage computer vision tools for acquiring, processing, analyzing and understanding digital images in real-time. Computing capabilities can be extended to the cloud, taking advantage of the services offered and saving the cost and energy consumption of an embedded UAV system. There is a growing trend towards the three-layer IIoT architecture with fog computing, with a convergence network of interconnected and distributed intelligent gateways.
Fog computing is a distributed computing paradigm that empowers network devices at different hierarchical levels with various degrees of computational and storage capacity [8]. In this context, fog computing is considered not only for computation and storage but also as a way of integrating the different new systems capable of interconnecting urgent and complex processing tasks. The fog can be responsible for technical assistance between humans and machines, information transparency, interoperability, decentralized decision-making, information security, and data analysis. Its notable benefits include minimizing human error, reducing human health risks, improving operational efficiency, reducing costs, improving productivity, and maintaining quality and customer satisfaction [2].
Here, we propose a UAV-based IIoT monitoring and control system integrated into a traditional industrial control architecture by harnessing the power of fog middleware and cloud computing. The main aim of the work was to present an innovative concept and an open three-layer architecture, including a UAV, to enhance quality and reduce waste by introducing visual supervision through cloud services as part of the three-layer IIoT architecture with fog computing and a control system. We also analyzed the fog computing layer and the IoT gateways to verify compliance with interoperability and latency requirements. We developed a theoretical model to mathematically represent the end-to-end latency in a UAV-based Industry 4.0 architecture. We provide a comparative study of a fog computing system across different platforms and analyze the impact of these platforms on network performance. We also describe a case study in a bulk concrete production plant using a drone-borne camera and IBM Watson’s image recognition service in the cloud. The study involved monitoring the materials carried on conveyor belts and controlling the production process. This operation proved cost- and time-effective and reduced the concrete batch production time.
The paper’s main contributions are as follows:
A proposal for an IIoT-based UAV architecture for monitoring and improving a production process using cloud computing services for visual recognition.
An assessment of the latency of the three-layer architecture.
Practical implementation and validation of the proposed architecture.
2. Related Works
The Industry 4.0 concept was born to apply the ideas of cyber-physical systems (CPSs) and the IoT to industrial automation and to create smart products, smart production, and smart services [9]. It involves cyber-physical systems, the Internet of things, cognitive computing and cloud computing and supports what has been termed a “smart factory”. In 2011, Germany adopted the idea to develop its economy in the context of an industrial revolution with new technologies compatible with old systems [10]. Industry now faces the challenge of making the IT network compatible with its machines, including interoperability, fog/cloud computing, security, latency, and quality of service. One of the proposed solutions is smarter IoT gateways [11], which are the bridges between the traditional network and sensor networks [12]. An IoT gateway is a physical device with software programs and protocols that act as intermediaries between sensors, controllers, intelligent devices, and the cloud. The IoT gateway provides the necessary connectivity, security, and manageability, since some existing devices cannot share data with the cloud directly [13].
EtherCAT, CANopen, Modbus/Modbus TCP, EtherNet/IP, PROFIBUS, PROFINET, DeviceNet, IEEE 802.11, ISA100.11a, and WirelessHART are the most frequently used industrial protocols [14]. Due to the incompatible information models for the data and services of the different protocols, interoperability between systems with different protocols is always difficult. Until only a few years ago, communication systems for industrial automation aimed only at real-time performance suitable for industry and at maintainability based on international standards [15]. The Industry 4.0 concept has the flexibility to achieve interoperability between the different industrial engineering systems. To connect the different industrial equipment and systems, the same standards and safety levels are required. Open Platform Communications Unified Architecture (OPC UA) is a machine-to-machine (M2M) communications protocol developed to create interoperable and reliable communications and is now generally accepted as the standard in industrial plant communications [16]. OPC UA is a platform-independent service-oriented architecture that integrates all the functionality of the individual OPC Classic specifications into one extensible framework [17]. Girbea et al. [18] designed a service-oriented architecture for the optimization of industrial applications, using OPC UA to connect sub-manufacturing systems and ensure real-time communication between devices.
OPC UA can allocate all manufacturing resources, including embedded systems, to specific areas and extensible computing nodes through the address space and a pre-defined model. It solves the problem of unified access to the information of different systems [19]. Infrastructure protocols have been proposed in many studies; for example, the authors of [19,20] developed an edge IoT gateway to extend the connectivity of MODBUS devices to the IoT by storing the scanned data from MODBUS devices locally and then transferring the changes via an MQTT publisher to MQTT clients through a broker. In [21], MQTT was adopted for machine-to-machine (M2M) communications to complement MODBUS TCP operations in an IIoT environment. This environment integrates the MQTT event-based message-oriented protocol with the MODBUS TCP polling-based request–response protocol for industrial applications. The authors of [22] designed and implemented a web-based real-time data monitoring system that uses MODBUS TCP communications, in which all the data are displayed in a real-time chart in an Internet browser refreshed at regular intervals using HTTP polling. The success of the IIoT initiative depends on communication protocols able to ensure effective, timely and ubiquitous data aggregation [23].
Implementing an Industry 4.0 architecture requires integration of the latest technologies, for example, IIoT, cyber-physical systems, additive manufacturing, big data and data analytics, cyber-security, cloud and edge computing, augmented and virtual reality, as well as autonomous robots and vehicles [24]. The cloud robotics architecture is based on two elements: the cloud platform with its associated equipment, and the bottom facility. Bottom facilities usually encompass all kinds of mobile robots, unmanned aerial vehicles, machines, and other equipment [25]. The next generation of robots will include interconnected industrial robots [26], cobots [27] and automated guided vehicles (AGVs) [28]. Cobots support human workers in various tasks, while robots can carry out specific tasks, such as looking for objects or transporting tools. UAVs and drones are among the emerging robot technologies that leverage the power of perception science and are now the preferred remote sensing system for gathering data over long distances in difficult-to-access environments [29]. Drone cameras can collect remotely sensed images from different areas safely and efficiently.
UAVs can save time and money in different sectors, such as agriculture, public safety, inspection and maintenance, transportation and autonomous delivery systems. This technological revolution was conceived to make people’s lives easier and to provide machine-to-machine communications without human intervention [30]. Many industries use drones or unmanned aerial vehicles to increase sensing and manipulation capabilities, autonomy and efficiency, and to reduce production costs. In the construction sector, drones play a significant role on industrial sites; they can fly over and monitor an area by acquiring photos and videos. They can be used to check a given installation or production area, transmit data, monitor construction processes, and detect anomalies.
As mentioned in [4], many applications have already been implemented in the construction and infrastructure fields. The net market value of deploying UAVs in support of construction and infrastructure inspection applications accounts for about 45% of the overall UAV market. UAVs are also used for the real-time inspection of power lines. In [31], the authors deployed drones to detect trees and buildings close to power lines. They can also be deployed to monitor oil, gas and water pipelines. Industrial SkyWorks [32] employs drones for building inspections and oil and gas inspections in North America, using its BlueVu machine learning algorithm to process the collected data. They provide asset inspection and data acquisition, advanced data processing with 2D and 3D images, and detailed reports on the property inspected.
Crack assessment systems for concrete structures are constantly improving thanks to computer vision technologies and UAVs. UAVs combined with digital image processing have been applied to crack assessment as a cost-effective and time-effective alternative to visual observation [33]. Image processing has become a significant asset for UAV systems, and not only in industry. Capturing footage and videos generates a huge amount of data, for which cloud computing is vital. Image recognition technology has great potential in various industries and has been improved by deep learning and machine learning image recognition systems (e.g., TensorFlow and MATLAB) and by image processing techniques such as computer algorithms for digital image processing. In [34], machine learning techniques were used to estimate nitrogen nutrition levels in corn crops (Zea mays). The work described in [35] introduced a real-time drone surveillance system to identify violent individuals in public areas using a ScatterNet hybrid deep learning (SHDL) network. In [36], the images from a drone camera were processed by the bag-of-words algorithm to detect crops, soils and flooded areas, with the feature extraction algorithm programmed in MATLAB. In [37], a solution was proposed to detect a final target using the drone’s camera. The system implemented image processing algorithms using the open-source computer vision library OpenCV; the main goal was to resolve the energy constraint without any wired connections or human intervention. Cloud solutions like Google AI, Amazon Web Services, and IBM Watson offer on-demand access to their image recognition services, which can connect with other systems over the Internet. The authors of [38] proposed moving computationally demanding object recognition to a remote computing cloud, instead of implementing it on the drone itself, by means of a cloud-based approach that allows real-time performance with hundreds of object categories. Other cloud-based platforms, e.g., SenseFly [39], Skycatch [40], and DroneDeploy [41], offer their own end-to-end solutions incorporating mission control, flight planning, and post-processing. These solutions provide image analysis through a direct connection with the main application.
The aforementioned studies show the significant advantages in different sectors of cost-effective and time-effective UAVs integrated with big data technology and machine learning. However, as far as we know, no studies have so far been published on the integration of UAVs into a complete industrial production system. We therefore propose an industrial real-time monitoring system with UAVs, fog computing and deep learning in the cloud (Figure 1). The proposed IIoT-based UAVs collect photos from an industrial plant, while the cloud processing platform analyzes them and sends the results to a control system.
3. Industrial IoT Monitoring and Control Platform
Industry is taking advantage of ever more complex and sophisticated systems. Systems not designed to communicate across production lines often require integration with pre-existing devices. The challenge of interoperability is thus one of the main concerns in designing intelligent human-to-machine and machine-to-machine cooperation. Ensuring systems-of-systems communications involves blending robotics, interconnected devices/sensors, actuators, heterogeneous systems, and a convergent hybrid infrastructure with IIoT and CPS systems, including fog/edge computing and cloud services. Our aim was to design a drone-based monitoring system able to interact in real-time with industrial sensors, PLCs, and the cloud automatically via an IoT gateway as middleware, and to transmit data between the different systems securely. We validated the proposed architecture in a case study at an industrial concrete production plant to improve production and reduce costs.
3.1. Proposed Platform/Architecture
A UAV monitoring system was developed as part of an industrial control system to reduce inspection time and costs. An overview of the approach can be seen in Figure 1. The proposed IIoT architecture is divided into three layers, with the UAVs in the data generation layer. The first layer consists of an industrial control system connected to a central collection point, the IoT gateway. The second layer is the fog computing layer for computation, storage, and communications. The last layer is a cloud back-end with image processing techniques. The fog layer connects the industrial control layer to the UAV system, the UAV system to the cloud, and finally the cloud to the industrial control system.
The control system receives data from remote or connected sensors that measure the process variables (PVs), which are compared with their setpoints (SPs). When the system detects a trend change between PV and SP, the change is routed to the programmable logic controllers (PLCs) and the central point (the IoT gateway) to trigger the UAV system’s reaction. In this case, the human operator is replaced by a remote cloud calculation algorithm and a UAV system, in the sense that the UAV’s front camera serves as an additional surveillance sensor whose images are processed in the cloud to imitate an operator’s visual inspection. The drone flies to a specific point to supervise the process using the front camera. The UAV system is triggered automatically in response to the sensor data from the industrial control system and the data analyzed in the IoT gateway. The IoT gateway receives the captured photos and sends them to the cloud, which applies deep learning techniques to analyze them and sends the results back to the IoT gateway and the control system to confirm the anomaly.
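As a minimal illustration of this trigger logic, the following sketch shows how it could be expressed in a Node-RED function node; the message fields and the 5% tolerance band are illustrative assumptions, not part of the deployed system:

    // Node-RED function node: compare process variable (PV) with setpoint (SP)
    // and trigger the UAV mission when the deviation exceeds a tolerance.
    // Field names and the 5% tolerance are illustrative assumptions.
    const pv = msg.payload.pv;       // e.g., a process variable read from the PLC
    const sp = msg.payload.sp;       // setpoint configured by the operator
    const tolerance = 0.05 * sp;     // assumed tolerance band

    if (Math.abs(pv - sp) > tolerance) {
        // Route the message onward to launch the drone mission
        return { payload: { launchMission: true, pv: pv, sp: sp } };
    }
    return null; // within tolerance: drop the message, nothing to do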
3.2. IoT Gateway Capabilities
The IoT gateway connects the sensor network to the cloud computing infrastructure, performs edge and fog computing, and serves as a bridge between sensor networks and cloud services. Experiments were carried out using Node-RED and the Ar.Drone library [42] to connect to the industrial control system, the cloud, and the UAV. Node-RED is a programming tool for wiring together hardware devices, APIs, and online services using the JavaScript runtime Node.js and a browser-based editor in which flows are designed and managed graphically. Node-RED provides a sample set of nodes for communications between different protocols and platforms. Node.js is considered one of the best platforms for building real-time, asynchronous and event-driven applications [15,43,44]. The Ar.Drone library [42] is an application also developed in Node.js that implements the networking protocols used by the Parrot AR Drone 2.0. This library provides a high-level client API that supports all drone features and enables developers to write autonomous programs. Using this library, the drone can be controlled via Wi-Fi and automatically moved to a given target. It is also possible to describe the path, height, and direction the drone must follow to take the required photos.
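For instance, a minimal Ar.Drone client script of the kind used here might look as follows; the durations, climb speed, and output file name are illustrative values only:

    // Minimal ar-drone sketch: take off, climb, grab one camera frame, land.
    var arDrone = require("ar-drone");
    var fs = require("fs");

    var client = arDrone.createClient(); // connects to the drone over Wi-Fi

    // Save a single front-camera frame when the PNG stream delivers one
    client.getPngStream().once("data", function (frame) {
      fs.writeFileSync("snapshot.png", frame);
    });

    client.takeoff();
    client
      .after(5000, function () { this.up(0.3); })  // start climbing after 5 s
      .after(3000, function () { this.stop(); })   // stop climbing and hover
      .after(2000, function () { this.land(); });  // land after hovering 2 s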
3.3. The UAV-IIoT Architecture Development
This section describes the development of the proposed IIoT-UAV control system and its network protocols. It contains three layers, namely the industrial control system and UAVs, the IoT gateway, and the cloud. In the first layer, the industrial sensors of the control system are connected to a PLC that acts as the OPC UA server, which routes the sensor data to the IoT gateway, where an OPC UA client is installed in Node-RED. With the OPC UA client-server model, data communication is independent of any particular operating platform. The central layer of the architecture augments the processing and communication abilities of the IoT gateway by connecting to the control system and cloud services. This part is considered fog computing and depends on the sensor data retrieved from the sensors and driven to the OPC UA client node.
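Outside Node-RED, the same OPC UA read can be sketched directly in Node.js, for example with the node-opcua library; the endpoint URL and node identifier below are hypothetical:

    // Minimal OPC UA read sketch using the node-opcua package (assumed dependency).
    const { OPCUAClient, AttributeIds } = require("node-opcua");

    async function readSensor() {
      const client = OPCUAClient.create({ endpointMustExist: false });
      await client.connect("opc.tcp://192.168.0.10:4840"); // hypothetical PLC endpoint
      const session = await client.createSession();

      // Hypothetical node id of a sensor value exposed by the PLC's OPC UA server
      const dataValue = await session.read({
        nodeId: 'ns=3;s="DB1"."SensorValue"',
        attributeId: AttributeIds.Value,
      });
      console.log("sensor value:", dataValue.value.value);

      await session.close();
      await client.disconnect();
    }

    readSensor().catch(console.error);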
The fog layer is responsible for communications between all the other layers; it takes decisions automatically based on the results and data received and conveys the output to the other layers or applications. The fog layer is presented in Figure 2 as an IoT gateway, which can support all the necessary tools and protocols to ensure communication, storage and computing. Node-RED is the key programming tool for wiring together the industrial control system, the UAV applications, and the cloud, and makes it easy to assemble flows using a wide range of nodes.
The main nodes in this case study are the visual recognition node, the OPC UA client node, the Cloudant node and the Exec node. In Figure 2, Node-RED is connected to the other systems and applications. Node-RED can invoke the Node.js Ar.Drone library in the IoT gateway using the Exec node. While carrying out the mission triggered from Node-RED, the drone takes the necessary photos and sends them to the IoT gateway, where Node-RED passes them to the Watson Visual Recognition (WVR) node, which uses Watson Visual Recognition in the IBM cloud. The WVR node identifies the types of material transported on the conveyor belts and classifies the images according to the trained custom model. The photos are then sent to the IBM cloud by the Cloudant node, which is connected to the Cloudant database in the IBM cloud. These photos can also be requested at any time by the Cloudant node in Node-RED (Figure 2).
By implementing an MQTT client library in Node.js, MQTT messages can be used to send commands to the drone through an MQTT broker installed in the cloud, and also to request navigation data (NavData) from the drone, such as battery life, wind speed, and velocity. MQTT can also be used as an alternative or supplement to the OPC UA protocol in the industrial control system. The focus of the present paper is to evaluate the proposed approach using only the OPC UA protocol.
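A minimal sketch of such an MQTT client, using the widely used mqtt package for Node.js, is shown below; the broker address, topic names, and message format are illustrative assumptions:

    // Minimal MQTT sketch: publish a drone command and subscribe to NavData.
    // Broker URL, topics, and payload schema are hypothetical.
    const mqtt = require("mqtt");
    const client = mqtt.connect("mqtt://cloud-broker.example.com");

    client.on("connect", () => {
      client.subscribe("drone/navdata");                              // telemetry topic
      client.publish("drone/cmd", JSON.stringify({ action: "takeoff" }));
    });

    client.on("message", (topic, message) => {
      const navdata = JSON.parse(message.toString());
      console.log("battery level:", navdata.batteryPercentage); // assumed field name
    });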
Figure 3 details the communication process between the different parts of the proposed approach, including the data flows between the different nodes, the industrial control system, the UAVs and the cloud. Two main applications are installed in the IoT gateway: the Node-RED application and the Node.js application. The former handles communications, while the latter controls the drone. Node-RED checks the flow by reading the data from the OPC UA node, which is connected to the automation control system. If a problem is confirmed by the PLC, Node-RED triggers the drone mission executed by Node.js. The drone mission (Figure 4) is split into three phases: planning the mission, taking photos, and returning to the starting point. The Watson visual recognition node and the Cloudant node receive the images and send them to the IBM cloud for processing and storage. The visual recognition node then forwards the results to the plant control system.
Figure 5 shows the flows used in Node-RED in the IoT gateway. The OPC UA node is responsible for reading the updated data from the PLC and sending the results to the Exec node to launch the UAV mission. After the mission, the drone photos are saved in a folder on the IoT gateway; the watch node monitors this folder for new photos and sends them to Watson’s visual recognition node for processing. The cloud visual recognition service analyzes the photos and classifies them into two classes. Each WVR result is provided as a score between 0.0 and 1.0 for each image and each trained class. The IoT gateway then receives the classification scores via the Watson VR node, the image scores are compared by the function node, and the results are forwarded to the industrial control system and the PLC via the OPC UA write node for decision making.
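The comparison step can be sketched as a Node-RED function node along the following lines; the exact structure of the result message depends on the Watson node version, so the fields and the OPC UA node identifier shown here are simplified assumptions:

    // Node-RED function node: pick the highest-scoring class from the
    // visual recognition result and prepare a value for the OPC UA write node.
    // The result structure and node id below are simplified assumptions.
    const classes = msg.result.images[0].classifiers[0].classes;
    // e.g., [{ class: "normal", score: 0.93 }, { class: "mixed", score: 0.21 }]
    const best = classes.reduce((a, b) => (a.score > b.score ? a : b));

    msg.payload = best.class === "mixed" ? 1 : 0; // flag written to the PLC
    msg.topic = 'ns=3;s="DB1"."MixedAggregate"';  // hypothetical OPC UA node id
    return msg;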
3.4. UAV Mission Planning
The drone takes off at position (x,y), climbs to a certain altitude, hovers, returns to the start, and lands. The autonomous flight library was based on the AR.Drone library [42], an implementation of the networking protocols for the Parrot AR Drone 2.0. This library has four features: an extended Kalman filter; camera projection and back-projection to estimate the distance to an object; a PID controller to control the drone position; and VSLAM to improve the drone position estimates [45,46].
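Such a mission can be scripted compactly; the sketch below uses the ardrone-autonomy mission API built on the AR.Drone library, with illustrative altitude and waypoint values:

    // Mission sketch with ardrone-autonomy (assumed dependency): take off,
    // climb, hover to take photos, return to the start point, and land.
    var autonomy = require("ardrone-autonomy");
    var mission = autonomy.createMission();

    mission.takeoff()
           .zero()             // use the current position as the origin
           .altitude(2)        // climb to 2 m (illustrative)
           .forward(5)         // fly to the supervision point (illustrative)
           .hover(3000)        // hover while the front camera takes photos
           .go({ x: 0, y: 0 }) // return to the starting point
           .land();

    mission.run(function (err) {
      if (err) {
        // Abort safely if anything goes wrong during the mission
        mission.client().stop();
        mission.client().land();
      }
    });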
The AR.Drone 2.0 is equipped with sensors for precise control and automatic stabilization: two cameras (a 60 fps vertical QVGA camera for measuring ground speed and a 1280 × 720, 30 fps front camera with a 92° diagonal field of view), ultrasound sensors to measure height, a three-axis accelerometer with ±50 mg precision, a three-axis gyroscope with 2000°/s precision, a three-axis magnetometer with 6° precision, and a pressure sensor with ±10 Pa precision. These sensors allow the drone to monitor its own position and perform simultaneous localization and mapping (SLAM) with robust control.
3.5. Case Study
Concrete batching plants form part of the construction sector. Their many important components include cement and aggregate bins, aggregate batchers, mixers, heaters, conveyors, cement silos, control panels, and dust collectors. Concrete plants involve human–machine interaction between the control system and the operator. The operator introduces the concrete formula by selecting the quantities of materials to be mixed, and these data are processed by the control system so that the correct amount of material is conveyed to the mixer (Figure 6). The materials used in the concrete plant are aggregates, cement, admixtures, and water. The quality and uniformity of the concrete depend on the water-cement ratio, slump value, air content, and homogeneity.
Traditionally, to control concrete quality, microwave sensors are used in aggregate bins to measure the aggregate water content and then adjust the formula as required. Aggregates of different sizes are stored in bins for different formulas. Due to certain errors during the discharge and filtering process, these materials are sometimes mixed together incorrectly, affecting concrete quality and consistency.
The UAV camera and the IBM WVR service in the cloud can identify the state of the aggregate materials transported on the conveyor belts so that adjustments can be made to the production process.
We use the cloud service to classify normal and mixed aggregates. The role of the drone in this case is to take pictures when the materials are being transported on the belts before they reach the mixer. The cloud classifies each image and returns the results to the IoT gateway as a score between 0.0 and 1.0 for each class. This result is sent to the PLC via the IoT gateway. Using these scores, any excess quantity of a material can be measured, and the required adjustments can be made to achieve the final formula. This operation eliminates wasted time and achieves the desired formula before the final mixing.
Drones are flexible, easy to deploy, can quickly change their position in a time-sensitive situation, and can be quickly configured. Incorporating them in a control system speeds up the production line by responding in real-time to the different requirements of the control system using cloud services. The proposed approach is thus a cost-effective solution that replaces unnecessary repeated operator checks and traditional monitoring and control systems.
4. Delay Assessment in the Proposed Platform
One of the important challenges to overcome is the problem of high-latency and unreliable links between the cloud and the IIoT terminals. Fog computing extends computing and storage to the network edge and is considered not only for computation and storage, but also as a way of integrating new systems capable of interconnecting urgent and complex processing systems. However, each fog and edge application may have different latency requirements and may generate different types of data and network traffic [47]. In this section, we focus on the latency between the data generation layer and the data communication layer (Figure 1). The data generation layer is composed of the UAV system and the industrial control system.
4.1. Industrial Control System Architecture
Figure 7 shows the proposed system for data collection and first-layer control of the sensors in the concrete plant. The sensors are connected to the PLC S7-1214, and all their information is sent from the PLC S7-1214 to the PLC S7-1512 using the industrial communication standard PROFINET over Ethernet. The PLC S7-1512 supports OPC UA, which adopts a client-server architecture. The OPC UA client is installed in the IoT gateway using the Node-RED OPC UA node. UaExpert is used in this case to check connectivity with the server. All the incoming information is handled by Node-RED.
In this first part of the delay analysis, our focus will be only on the OPC UA communications between the IoT gateway and the PLC with the OPC UA server.
4.2. Latency between Two Terminals
Latency is the time by which network traffic is delayed by system processing, or the total time needed to send a network packet from the application on one server to the application on another server, through the network interface controller (NIC), the network (cable, Wi-Fi, etc.), the second NIC, and into the application on the other server (or client). To assess the latency between two terminals, most approaches use the round-trip delay time (RTD) or the one-way delay (OWD). Latency in the networking context comprises the time spent in propagation through the network medium and the adapter hardware, as well as the software execution times (application and OS) (Figure 8).
The hardware latency inside switches and on wires can be easily identified from the switch specifications, the length of the wires, and the maximal transmission data rates, while the software latency imposed by processing a packet in the software stack is harder to evaluate. Several parameters, such as the system workload, the operating system and the executed application, influence software latency.
Equation (1) defines the RTD between two terminals in a network, where tA and tB are the software latencies of terminals A and B, respectively, and tH is the hardware latency of the switches and wires connecting terminals A and B:

tRTD = tA + tB + 2tH  (1)
To accurately calculate OWD (by dividing the round-trip time by two), the configuration of the test systems must be perfectly symmetrical, meaning they must be running the same software, using the same settings, and have equal network and system performance.
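For illustration, assume symmetric terminals with software latencies tA = tB = 50 µs and a total hardware latency tH = 10 µs; Equation (1) then gives tRTD = 50 + 50 + 2 × 10 = 120 µs, and the symmetric OWD estimate is 120/2 = 60 µs. These figures are assumed values chosen only to show the calculation.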
4.3. Latency in OPC UA Network
In this section, we analyze the delays involved in client-server OPC UA communications in a switched Ethernet network. This model serves to define in detail the non-deterministic sources of end-to-end delay. The proposed model is based on the time delays defined in [48,49] for an Ethernet-based network. Figure 9 shows the round-trip data path from an OPC UA server in the PLC to an OPC UA client on the IoT gateway, and the corresponding hardware OWD.
We consider the end-to-end network delay in the switches and wires from the client request to the server, which can be divided into three categories: the frame transmission delay (dt), the time required to transmit all of the packet’s bits onto the link; the propagation delay (dl), the time for one bit to propagate from source to destination at the propagation speed of the link; and the switching delay (ds), which depends on the route through the network to the server.
The transmission delay depends on the length L of the packet and the capacity C of the link, while the propagation delay is related to the distance D between two switches and the propagation speed S of the link:

dt = L/C,  dl = D/S
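As an illustrative calculation with assumed values, a 1500-byte frame (12,000 bits) on a 100 Mbit/s link gives dt = 12,000/100,000,000 s = 120 µs, and a 100 m cable with a propagation speed S ≈ 2 × 10^8 m/s gives dl = 100/(2 × 10^8) s = 0.5 µs.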
The switch delay is defined as the time for one bit to traverse from the switch input port to the switch output port. It is divided into four delays. The first is the switch input delay (dSin), the delay of the switch ingress port, including the reception PHY and MAC latency. The second is the switch output delay (dSout), the delay of the switch egress port, including the transmission PHY and MAC latency. The third is the switch queuing delay (dSq), the time a frame waits in the egress port of a switch before transmission onto the link begins. The last is the switch processing delay (dSp), the time required to examine the packet’s header and determine where to direct the packet.
The hardware end-to-end delay dCS for a request from an endpoint server S to the destination endpoint in a client C can be expressed as the sum of the delays of all the switches and links in the path, n being the number of links and n − 1 the number of switches along the path:

dCS = Σi=1..n (dt,i + dl,i) + Σj=1..n−1 (dSin,j + dSout,j + dSq,j + dSp,j)
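For example (assumed values), with n = 2 links and a single switch between the IoT gateway and the PLC, and using the figures above (120 µs transmission and 0.5 µs propagation per link) plus a few microseconds of switch input, output, processing, and queuing delay, dCS is on the order of 250 µs one way, dominated by the transmission delay.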
Figure 10 shows the architecture of the OPC UA server. The server application is the code that implements the server function. Real objects are physical or software objects that are accessible by the OPC UA server or internally maintained by it, such as physical devices and diagnostic counters. OPC UA servers use particular objects, called Nodes, to represent real objects, their definitions and references; the set of all nodes is called the AddressSpace. Nodes are accessible by clients using OPC UA services (interfaces and methods) [50].
In the case of m requests from clients to the nodes in the OPC UA server, the overall hardware end-to-end delay of the OPC UA client-server communication over an Ethernet network is:

dCS,total = Σk=1..m dCS,k
Analyzing all the aforementioned hardware delays, we find that the end-to-end delay on an Ethernet network is deterministic, except for the delay in the switch queue, which depends on link utilization: the packet queuing delay increases on a heavily used link.
Investigating the hardware delays of an OPC UA client/server communication in an Ethernet network, we conclude that it is hard to define the hardware delay exactly, on account of the queuing delay. Consequently, when it comes to complex processes with real-time requirements, OPC UA reaches its limits. There are different ways of bounding this delay, for example QoS techniques such as weighted fair queuing (WFQ) or strict priority [14]; however, there is always a certain delay and jitter that limits real-time performance. Time-sensitive networking (TSN) provides mechanisms for the transmission of time-sensitive data over Ethernet networks, and the adoption of OPC UA over TSN will also drive this paradigm in the world of deterministic, real-time machine-to-machine communications. Given Ethernet’s limitations in terms of traffic prioritization, the TSN working group has developed the time-aware scheduler (TAS), defined in IEEE 802.1Qbv [51]. TAS is based on TDMA, which solves the problem of synchronization and traffic priority in Ethernet. Using this technique, the queuing delay can be completely eliminated, so the end-to-end latency becomes deterministic. Bruckner et al. [52] adopted this method to evaluate OPC UA performance on TSN against the most commonly used communication technologies.
4.4. UAV System Delay
There are several sources of latency in a drone’s video compression and transmission system. The end-to-end delay in the system can be divided into seven categories (Figure 11): Tcap is the capture time; Tenc is the time required to encode; Ttx is the resulting transmission delay; Tnw is the network delay when the drone is connected to the remote ground station via a network; Trx arises because the ground station is also wirelessly connected to a network; Tdec is the decoding delay at the receiving station; and Tdisp is the display latency. The overall end-to-end delay is the sum of these terms:

Tend-to-end = Tcap + Tenc + Ttx + Tnw + Trx + Tdec + Tdisp
Note that when the drone communicates directly with the ground station, no network is involved and there is only a single transmission delay (Tnw = 0 and Trx = 0).
In the H.264 system, each video frame is organized into slices, which are in turn divided into non-overlapping blocks and macro-blocks (two-dimensional units of a video frame). Every slice is independently encoded and can be decoded without reference to any other slice. The main advantage of this system is that there is no need to wait for the entire frame to be captured before starting to encode: as soon as one slice is captured, the encoding process can start and slice transmission can begin. This technique has a considerable effect on the overall latency, as it influences all the system latencies from encoding to display.
Theoretically, the overall latency is reduced in proportion to the number of slices N, although in practice this gain may not be fully achieved due to the overhead of setting up and processing the individual slices.
To transmit efficiently and minimize the required bandwidth, it is important to use video compression techniques, although the slicing technique also affects the compression ratio. The higher the number of slices, the faster they can be encoded and transmitted; however, as this number increases, the number of bits needed per slice and the effective slice transmission time also increase.
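As a rough illustration with assumed numbers: at 30 fps, capturing a full frame takes about 33 ms; with N = 4 slices, encoding can start after the first slice is captured, i.e., after roughly 33/4 ≈ 8 ms, and each pipelined stage similarly contributes on the order of 1/N of its full-frame time, plus the per-slice overhead discussed above.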
Other types of delay also affect the overall latency. Some factors can be adjusted when a UAV system is used. For example, Tcap depends on the frame rate of the UAV camera: the higher the frame rate, the shorter the capture time. Ttx depends on the available data bandwidth of the transmission channel, while Tdisp depends on the refresh rate of the display.