Perspective

GNSS-Assisted Low-Cost Vision-Based Observation System for Deformation Monitoring

1 Department of Geomatics, Czech Technical University in Prague, 16636 Prague, Czech Republic
2 Department of Applied Geodesy, Kyiv National University of Construction and Architecture, 03037 Kyiv, Ukraine
3 Department of Surveying and Geodesy, Satbayev University, Almaty 050013, Kazakhstan
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(5), 2813; https://doi.org/10.3390/app13052813
Submission received: 12 December 2022 / Revised: 14 February 2023 / Accepted: 17 February 2023 / Published: 22 February 2023

Abstract

This paper considers an approach to solving the structure monitoring problem using an integrated GNSS system and non-metric cameras with QR-coded targets. The system is defined as a GNSS-assisted low-cost vision-based observation system, and its primary application is the monitoring of various engineering structures, including high-rise buildings. The proposed workflow makes it possible to determine the change in the structure's geometric parameters under the impact of external factors or loads and, subsequently, to predict the displacements at a given observation epoch. The approach is based on the principle of relative measurements, implemented to find the displacements between pairs of images from non-metric cameras organized in a system of interconnected chains. It is proposed to determine the displacement between images from different epochs using the phase correlation algorithm, which provides a high-speed solution and reliable results. An experimental test bench was prepared, and a series of measurements were performed to simulate the operation of one chain of the vision-based observation system. A program for processing the image sequences with the phase correlation algorithm was implemented in the MatLab programming environment. The analysis of the experimental results allowed us to conclude that the suggested approach can be implemented in compliance with the requirements for monitoring accuracy. The operation of the vision-based observation system was simulated with an accuracy estimation, and the simulation results proved the high efficiency of the suggested system.

1. Introduction

The state-of-the-art development of materials production and construction technologies, their automation, and the growth of land prices in large cities have led to a new way of thinking about constructing and assembling complex geometric structures, especially high-rise buildings. This fact complicates the building monitoring task due to the necessity of creating a network of monitoring sensors, applying various types of measuring equipment, and observing at high frequency to acquire and study the kinematic and dynamic properties of the structure. A monitoring system that operates remotely, without time-consuming and laborious field work or expensive equipment, is therefore preferable. Among the different kinds of building monitoring, the observation of geometric parameters, in other words, deformation monitoring, plays an important role. A change in the building geometry reduces its functionality and causes crack emergence and propagation, which may lead to the structure's collapse. External loads such as wind, snow, ice, solar radiation, an unstable foundation, etc., lead to oscillations and torsion of structures, their bending, and their roll. These parameters may change their values daily and seasonally [1] and can cause spatial displacements at the level of tens of centimeters. The geometric deformation parameters that need to be determined during deformation monitoring of a high-rise building are given in Figure 1.
Evidently, the deformation parameters presented above can be monitored by different geospatial methods and technologies, as they are functions of linear and angular displacements. Such monitoring is carried out using various methods, particularly geodetic ones. Today, global navigation satellite systems (GNSS) are the most widespread element of any monitoring system. GNSS provides reliable and high-frequency information about changes in the monitoring target coordinates. However, because it is impossible to measure coordinates along the structure axis from the ground to the top, GNSS data reveal only the total displacement Δ of the structure. This displacement may portray the structure's top displacement without reference to the ground floor. Thus, the reason for this displacement is unknown; it may be a simple spatial displacement, a combination of displacement and bending, or a combination of displacement and roll (see Figure 2).
It has become clear that GNSS observations alone cannot accurately depict the deformation process. GNSS may therefore be used primarily as a complementary data source and to help detect the structure's vertical displacement, but additional measurements along the structure are needed to establish the reasons for the displacement and to obtain an accurate picture of the deformation process. The simplest way to overcome this restriction is the integration of GNSS with other geodetic or non-geodetic equipment and measurements.
Geodetic science has developed many methods to measure structure deformation in given directions. Today, we deal with different terrestrial geodetic measurements, satellite measurements, and photogrammetric technologies and measurements. Terrestrial geodetic measurements are the most widespread [2,3]. There is no point in discussing these methods in detail, as their descriptions can be found in the geodetic literature. Moreover, these methods were the first applied to deformation monitoring and, consequently, are well studied. The primary geodetic methods are spirit and hydrostatic/dynamic leveling for vertical displacements, and total stations, including image-assisted total stations [4,5], for spatial displacement determination. Other kinds of terrestrial geodetic methods are terrestrial laser scanning [6,7], depth cameras [8], and ground-based radar interferometry [9]. The papers [10,11] further develop InSAR technology. Publication [8] focuses on comparing a depth camera and a terrestrial laser scanner for estimating structural deflection. A comprehensive analysis of ground-based radar interferometry is given in [11], where structural monitoring and damage assessment are considered as the GIS integration of differential InSAR measurements, geological investigation, historical surveys, and 3D modeling. However, those technologies are laborious, require skilled personnel, and are hard to automate. Especially critical for geodetic methods is the question of observation frequency: leveling, terrestrial laser scanning, and ground-based radar interferometry cannot provide more than one observation epoch per day. That is why purely geodetic methods are used as an additional information source in combination with other methods. Notable success has been achieved in the joint application of GNSS with terrestrial geodetic measurements and sensors. GNSS was first applied to monitoring tasks more than twenty years ago [12,13,14,15]. Despite the high observation frequency and relatively high accuracy, the GNSS application is restricted by the number of observation points (i.e., it is impossible to install the necessary number of GNSS antennas on a structure), by observation sites (i.e., the necessity of a relatively open sky for satellites), and by sophisticated processing algorithms. To overcome these shortcomings and detect displacements at many points, GNSS has been combined with other sensors, e.g., accelerometers, hydrostatic levels, inclinometers, etc. [16,17,18,19]. The study [17] presents a good case of GNSS integration with low-cost accelerometers, but the solution is not a monitoring system. Paper [19] presents a different approach to integrating GNSS with other devices: GNSS aids non-overlapping images from two cameras to determine the three-dimensional displacements of high-rise buildings. This case is an example of vision-based technology assistance. The main branch of such technologies is photogrammetry, where the image is the primary data source. Here and below, only close-range photogrammetry is considered.
Close-range photogrammetry can be separated into terrestrial and aerial cases. Terrestrial photogrammetry has been applied since its invention, whereas close-range aerial photogrammetry took its place with the deployment of effective and cheap unmanned aerial vehicles (UAVs); this is why the latter is often called UAV/UAS photogrammetry. On the one hand, the traditional concept applies close-range photogrammetry to structure deformation monitoring. On the other hand, new achievements in computer vision technologies and digital image processing [20,21,22,23,24] have transformed classical photogrammetry into digital photogrammetry, where the opportunities for measurement automation have risen significantly. Among the applications of traditional terrestrial close-range photogrammetry for monitoring, it is worth mentioning recently published studies: [25] proposes a kind of one-image photogrammetry and its integration with geodetic alignment measurements; [26,27] explore photogrammetric deformation monitoring using low-cost cameras, particularly for bridges; [28,29] explore photogrammetric deformation monitoring using a target tracking approach (one step toward measurement automation); [30] presents a pan–tilt–zoom camera-based displacement measurement system for the detection of building destruction; and [31] is a case of monitoring using the roving camera technique. The main advantage of photogrammetric technologies is the opportunity to measure as many points on the structure as necessary. However, even the simple monitoring of a 100 m building creates an unsolvable problem for terrestrial photogrammetry due to the impossibility of capturing the building surface from the ground. Even if it were possible, the errors due to perspective distortion and resolution would downgrade the overall accuracy to an unacceptable level.
Unlike terrestrial photogrammetry, UAV photogrammetry, thanks to its higher mobility, permits the collection of more data and, consequently, the reconstruction of a more detailed building model. However, technically, UAV photogrammetry presents the same case as terrestrial photogrammetry but with higher data redundancy [31]. Today, UAV photogrammetry has versatile applications in structural monitoring. A wide range of publications similar to [32] have appeared, e.g., paper [33], where UAV photogrammetry was applied to vibration monitoring, and [34], a lab-scale test for a six-story building model where displacement determination was carried out using UAV image correlation. Once more, the automation of UAV data processing is not simple. Besides spatial displacement determination, crack monitoring is a popular application of close-range photogrammetry, primarily due to the simplicity of identifying cracks in images and measuring them. A comprehensive review of crack detection is given in [35,36]. A feature of crack measurement is that a single image is sufficient; the data for crack detection are also easy to process and automate. The papers [37,38] present monitoring solutions for automated crack detection using machine learning, based on computer vision principles. The photogrammetric principles for crack monitoring are deployed in [38,39]. Besides terrestrial-based monitoring of cracks, UAV-based technologies have recently become very popular [38,40,41]. The paper [40] studies a new computationally efficient vision-based crack inspection method implemented on a low-cost UAV with a new algorithm designed to extract useful information from the images. The disadvantage of UAV-based observations is the low accuracy, which does not satisfy current requirements. UAV data can therefore serve only as a supplementary data source for structure deformation monitoring.
Since we are developing a vision-based system, let us pay more attention to photogrammetric methods and approaches. Considering terrestrial close-range photogrammetry, it is necessary to mention the classic books [42,43] that provide the most comprehensive review of close-range photogrammetry. Despite the versatility of the presented photogrammetric instances, they are all based on a standard algorithm and mathematical background. This means that, regardless of the structure, any study comprises the creation of a geodetic network or the assignment of a reference base (coordinate system), target marking and coordination, and compliance with the general requirements of the photogrammetric survey [43]. The knowledge that came from computer vision treats and handles images differently than classical photogrammetry. The computer vision approach focuses chiefly on different image refinement methods, digital correlation between images, etc. Its mathematical models for geometric information extraction are less strict but more robust. Computer vision approaches have become very popular thanks to their high robustness and automation [44,45,46,47,48,49,50,51,52,53,54,55,56,57,58]. Let us analyze some of them with unique features regarding the study goal. The study [44] considers computer vision methods tested on a four-story steel frame in lab conditions; these methods comprise optical flow with the Lucas–Kanade method, digital image correlation with bilinear interpolation, and phase-based motion magnification using the Riesz pyramid. The authors of [46,47] developed a novel sensor for displacement measurements using one camera and suggested an advanced template-matching algorithm. Paper [48] compares two noncontact measurement methods: a vision-based method underpinned by image correlation and a radar interferometer; the vision-based system uses one or two cameras mounted on a tripod. The paper [51] proposes an approach based on measurements of retro-reflective targets, while [52] offers the same approach free of targets. In [53,54], the vision-based monitoring task is considered as a problem of finding the best methods and algorithms for image compression and processing under dark–light conditions. It is worth noting that the computer vision approach can also be used for vibration measurements, as claimed in [55]; that study was conducted in a lab environment. By virtue of its high level of automation, the computer vision approach can be fused with other sensors, e.g., a vision-based system with accelerometers. Another example of integration is [58], where computer vision technology is integrated with terrestrial laser scanning. In all the considered cases, the measurements are based on one or two cameras without external reference to a stable basis, so the detected displacements are relative and do not present the total (global) structure deformation.
The discussion about monitoring would not be comprehensive without mentioning small sensors. These sensors have recently become one of the main elements integrated with GNSS; they supplement GNSS and ensure good results for monitoring high-rise buildings. Among the different measuring sensors, it is worth mentioning the following ones that determine structure deformation: inclinometers [59,60] to detect inclinations in a particular direction, high-resolution lasers [61,62], tilt meters [63] to measure plane inclination, low-cost radars [64], and 3-D inclinometers [65] to determine a spatial rotation. A set of papers and reports gives an overall review of the use of various sensors [66,67,68,69,70]. Whichever sensor is used, it provides superior accuracy. Still, the main drawback is the need to organize the sensors into one system and reference this system to some external coordinate system, because any sensor provides only relative displacements.
Despite the importance of deformation monitoring, this task is just a small unit of a more significant problem. The technologies mentioned above have become part of a vast branch named structural health monitoring (SHM), which has attracted widespread attention recently. The problem is highly complex and comprises many methods and technologies to monitor a massive range of building parameters. Therefore, any vision-based low-cost video observation system should become an integrated part of an SHM system. Many papers regarding this problem have been published recently [71,72,73,74,75,76,77,78,79]. A subject of SHM can be temperature variation inside and outside a structure, air conditioning, humidity inside the structure or in the underlying soils, and the status of various structure elements, e.g., cracks, damage, and so on. One of the essential applications of SHM is structure deformation monitoring; the list [71,72,73,74,75,76,77,78,79] concerns the monitoring of geometric parameters only. Thanks to the development of digital technologies and computer science, it is possible to use different small sensors and combine them into one system that may operate automatically. Modern SHM systems may also include satellite-based interferometry [80,81,82,83], UAV technologies [84], and terrestrial and/or aerial laser scanners [6]. Moreover, the SHM system is itself a part of another, more complex system named the building information model (BIM) [85,86,87,88]. BIM comprises all possible building life-cycle stages and, consequently, the monitoring steps. The paper [88] outlines the liaison between monitoring problems and BIM. Thus, the creation and operation of any monitoring system should be considered an inseparable part of the building life cycle and must be embedded into BIM. This premise imposes the conditions of easy installation, repair, operation, relocation, and renovation of the monitoring system. Such complicated requirements can be fulfilled by a system that consists of low-cost sensors. On the other hand, the system design must be as simple as possible and reliable.
Based on the given analysis, it was suggested to deploy low-cost digital cameras operating in an automated mode and organized in a system of interconnected chains. GNSS is supposed to be used as a supplementary data source that provides external reference and control. The system is assumed to be installed inside the building and integrated into BIM. Such a system is easy to install and use, has high reliability due to the vast redundancy of measurements, and does not need professional users. This study aimed to introduce the GNSS-assisted low-cost vision-based observation system (VOS) concept and demonstrate the system's preliminary analysis results. The system combines ideas and approaches from close-range photogrammetry to calibrate and orient images; computer vision to process digital images; geodesy to assign coordinate systems and external control (including the possible application of total stations and GNSS); and adjustment calculus to process and analyze the measurement results. The suggested approach provides complex information about structure deformation and reduces error accumulation and the effect of external errors, increasing the resulting accuracy of the determined displacement values. The studies [89,90,91] have shown that, to carry out complex deformation research, it is recommended to locate the system chains along the principal axes of the structure.
A few papers demonstrate tentative approaches with a similar idea. In [92], relative displacement measurements from the inside using an in-room camera are presented; the paper considers just a particular case of observations using one camera from one point. A more detailed review is outlined in [93], with a good analysis and ideas somewhat similar to our study; however, that work did not consider combining a set of cameras into a network. The significant contribution of that paper is the analysis of the measuring accuracy ensured by computer vision monitoring systems and the examination of target tracking algorithms. The study [94] demonstrates a concept close to the VOS idea, but it is mostly about camera calibration and does not consider cameras organized into a system without external control.
Therefore, the existing approaches and methods have some similarities to the VOS concept that will be presented and studied in what follows. The following stages have to be examined to achieve the primary goal of the study:
  • General idea and concept description.
  • Design of the VOS. At this stage, the effect of the distance between the VOS elements is considered based on the camera's technical capabilities and the geometric parameters of the test structure.
  • Determination of displacements between VOS elements for a single chain. In-field simulation of the displacement measurements for a single chain. A phase correlation algorithm is suggested as a primary processing strategy.
  • Preliminary analysis of the VOS accuracy for the test structure. The investigation is carried out using statistical simulation and results from stage 3.
  • Determination of the monitoring parameters for the actual structure. Relative displacements of the VOS elements are used to build the structure frame model and compare it with the design model of the structure.
  • Prediction model. Based on the structure frame model (values of monitoring parameters), a prediction model is built for a given point in time.
In this article, the features of the first four implementation stages will be described and studied in detail, and the simulation and experimental measurement results will be presented. The paper is structured as follows. Section 2 outlines the general concept of the VOS, its design, and its displacement determination approach. Section 3 describes the results of experimental studies and simulation of the VOS for a high-rise building. A comprehensive analysis of the results after the simulation and discussion are presented in Section 4. Section 5 presents the conclusions.

2. General Concept and Approach

2.1. Design Concept of the VOS

Following the idea mentioned above, it is proposed to place the sensors of the VOS along the mutually perpendicular axes and planes of the structure to be monitored. Such a configuration will allow, at the data analysis stage, the separation of the effects of torsion from those of deflection and roll, which is a tricky issue in high-rise building monitoring [1]. The VOS design concept supposes the determination of both relative and absolute displacements. An external coordinate system has to be established to monitor the absolute displacements. This coordinate system is fixed by a system of targets placed on surrounding objects considered stable and via GNSS observations at the building top. The coordinates of these targets are determined using total station measurements or, in some cases, GNSS measurements. The design concept of the GNSS-assisted VOS is presented in Figure 3.
Each particular sensor of the VOS consists of a set of elements. These elements may be organized in a different manner depending on the position of the sensor. However, in the general case, the sensor contains the components depicted in Figure 4.
In addition to standard modules, the camera must contain a module for data transmission, allowing the quick transmission of the target images from all the sensors. It is proposed to make the target a QR code with embedded LEDs. The QR code makes it possible to encode the necessary information, for example, the target ID, coordinates, etc., while the LEDs increase its visibility. Two sensors can be combined into a chain. There are two ways the sensors may be organized in a chain (Figure 5).
Therefore, the VOS can be installed inside the building along its principal axes. There are different methods of VOS installation. Two of them are given in Figure 6. Both cases demonstrate the VOS sensors’ placement in a structure’s vertical plane. For a horizontal plane, the scheme will be the same; the difference is only in the directions of the coordinate axes.
According to the presented schemes, each sensor–target pair is considered a chain regardless of the placement of the target (on the camera or separately). In the first scheme, the observations are performed from sensor to sensor, where each one is equipped with a QR target. This scheme is valid for structures of relatively small size; otherwise, a more complicated observation scheme is suggested, in which the sensors are interconnected through a system of two-sided QR targets. In any case, for the first observation epoch, both the sensors and targets must be aligned in the horizontal and vertical directions. The installation of the VOS is possible in different ways: the VOS can be embedded inside the building communication lines or attached outside and adequately covered.
It is necessary to study the technical characteristics of the optical system to determine the effect of the distance between two sensors or between a sensor and the QR target. Since low-cost cameras are to be used, the main issue is the size of the QR target in the image. The target must be recognizable in the image, which is why the resolution plays a significant role. The conventional scheme of image acquisition in a simple camera is given in Figure 7.
In a simplified form, the visible area of size a × b is formed on an m × n matrix. The size of the visible area depends on the camera field of view γ and the focal length f and is described by the following relationship:
$\gamma = 2\,\mathrm{arctg}\,\dfrac{d\,(S - f)}{2\,S\,f}$,  (1)
where d is the frame size.
Using such a parameter is incorrect for an image with a rectangular shape: the geometry of the initially square pixel will be distorted, or its size will change if the average value of the matrix resolution is applied. Consequently, this will affect the quality of digital image processing. Instead of the camera's field of view, the angle of the visible area along the image side φ is recommended:
$\varphi_a = \mathrm{arctg}\,(a/S), \qquad \varphi_b = \mathrm{arctg}\,(b/S)$,  (2)
where a is the size of the visible image area along the a axis, b is the size of the visible image area along the b axis, and S is the distance from the camera (sensor) to the target.
This approach preserves the uniqueness of determining the resolution of the camera, which is denoted by the value c:
$c = a/n = b/m$,  (3)
where n and m are the sizes of the camera matrix.
The camera's resolution was determined using expressions (2) and (3). The necessary parameters were obtained from the typical camera specification and by surveying a calibration stand from a fixed distance. The CMOS matrix size is 4320 × 3240 px (6.17 mm × 4.55 mm), and the f-number equals 3.9. The results for the camera, a General Electric G100 with f = 72 mm, k = 5.62, and a surveying distance S = 1 m, are presented in Table 1.
As expected, the resolution for the angle γ along each of the axes of the image takes different values, while the average value of the camera resolution is reduced.
There is a serious flaw in this concept: such a resolution describes the ideal case when the lens resolution is perfect. In real life, the actual resolution is the combination of the sensor resolution and the lens resolution, and the lens also downgrades the final image resolution. An approximate resolution can be obtained under the premise that a standard lens allows an object to be discerned when it spans at least 3 pixels. Thus, the resolution $\varphi_{a,b}$ for a distance of 1 m equals 0.058 mm. This result allows calculating the resolution error $m_\gamma$ for different distances:
$m_\gamma = \dfrac{\varphi_{a,b}\, S}{f}$.  (4)
The results for expression (4) are given in Figure 8.
It is worth mentioning that these values can be decreased using sub-pixel processing algorithms. Thanks to the long focal length, the resolution error does not affect the measurement accuracy significantly.
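To make the numbers above easy to reproduce, the following MatLab sketch recomputes expressions (2)–(4) for the stated G100 parameters. The reading of $\varphi_{a,b}$ as the 3-pixel discernibility limit in the image plane, projected to object space by the factor S/f, is our assumption; it matches the quoted 0.058 mm at 1 m.

```matlab
% A minimal sketch of expressions (2)-(4), values as stated in the text.
f   = 72;                   % focal length, mm
L   = [6.17 4.55];          % physical CMOS size, mm
px  = [4320 3240];          % CMOS size, pixels
S   = 1000;                 % camera-target distance, mm (1 m)

ab    = L .* S ./ f;        % visible area a x b at 1 m, mm
phi   = atan(ab ./ S);      % angles of the visible area (2), rad
c     = ab ./ px;           % camera resolution c = a/n = b/m (3), mm/px
pitch = L ./ px;            % pixel size on the sensor, mm/px

phi3 = 3 * pitch(1);        % 3-pixel limit in the image plane, mm
m_gamma_1m = phi3 * S / f;  % resolution error (4) at 1 m: ~0.058 mm

Srange  = (5:5:35) * 1000;          % 5...35 m, in mm
m_gamma = phi3 .* Srange ./ f;      % growth with distance, cf. Figure 8
```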
The other factor that must be accounted for to understand the real accuracy achievable from the camera images is the defocusing error. The defocusing error leads to image unsharpness, which is described via the blur circle.
The defocusing error is defined through the aperture, which is presented as the relationship
$D = \dfrac{f}{N}$,
where N is the well-known f-number; for the selected camera, $N = f/D = 3.9$.
To calculate the defocusing error, let us use the main optics equation:
$\dfrac{1}{f} = \dfrac{1}{S} + \dfrac{1}{s}$.  (5)
The designations are clear from Figure 8. From Equation (5), we obtain the defocusing error δ:
$\delta = \dfrac{N\,\Delta f}{f}, \qquad \Delta f = s - f, \qquad s = \dfrac{S\,f}{S - f}$.  (6)
The actual error due to the defocusing error is determined as
$m_\delta = \dfrac{\delta\, S}{f}$.  (7)
The defocusing error for the previous camera parameters was calculated and is presented in Figure 9.
The resulting error is determined via expression (8):
$m = \sqrt{m_\delta^2 + m_\gamma^2}$.  (8)
With expression (8), the following error distribution was obtained (see Figure 10).
The obtained errors will be used for the comparison analysis in Section 3.
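A short MatLab continuation, under the same assumptions as the previous sketch, reproduces this error budget; note that for S much larger than f, the defocusing term tends to the constant N (in mm), which is what Figure 9 should reflect if our reading of (5)–(8) is correct.

```matlab
% A sketch of the defocusing and resulting errors (5)-(8); same camera.
f = 72;  N = 3.9;                    % focal length (mm) and f-number
phi3 = 3 * 6.17/4320;                % 3-pixel limit in the image plane, mm
S = (5:5:35) * 1000;                 % distances, mm

s   = S .* f ./ (S - f);             % image distance from 1/f = 1/S + 1/s
df  = s - f;                         % focus shift Delta f
dlt = N .* df ./ f;                  % blur circle diameter delta, mm
m_delta = dlt .* S ./ f;             % defocusing error in object space, mm
m_gamma = phi3 .* S ./ f;            % resolution error (4), mm
m_total = sqrt(m_delta.^2 + m_gamma.^2);   % resulting error (8), cf. Figure 10
```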
With the camera resolution, it is possible to calculate the ground coverage to find the ranges of displacements that a single measuring chain can measure. The relationship between the camera geometric parameters and the ground coverage is presented in Figure 11. The size of the ground coverage can be calculated by (9):
$\delta_x,\ \delta_y = \dfrac{d_{x,y}\cdot S \cdot L_{x,y}}{(m,n)\cdot f}$,  (9)
where $d_{x,y}$ is the half-size of the CMOS matrix in pixels, S is the distance in mm, $L_{x,y}$ is the physical size of the CMOS matrix in mm, (m, n) is the size of the CMOS matrix in pixels, and f is the focal distance.
If one supposes that the VOS sensors were aligned during the installation, then every single chain has a ground coverage from ±1000 mm up to ±3000 mm (Figure 12). However, these figures determine the total size of the field of view; actually, the range of detectable displacements is restricted by the QR target size. Expression (9) is also used for the QR target size calculation. If we want the QR target to occupy at least 200 × 200 px in the image, then the necessary size of the QR target can be retrieved from Figure 13.
Therefore, the single chain needs a QR target size from 60 mm to 140 mm for distances 15–35 m. These values determine the size of the possible detectable displacements for different distances.
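The target sizes quoted above follow directly from expression (9); a short MatLab check (our own, with the G100 parameters assumed as before):

```matlab
% QR target sizing by expression (9): the target should occupy at least
% 200 x 200 px in the image (camera parameters as stated above).
f  = 72;   Lx = 6.17;   n = 4320;      % mm, mm, px
S  = [15 25 35] * 1000;                % chain lengths, mm
qr = 200 .* S .* Lx ./ (n * f);        % required target size, mm
% qr = [59.5  99.2  138.9] -> the 60-140 mm range quoted in the text
```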

2.2. Displacement Determination between Sensors

Under the effect of the structure's bending R or torsion θ, each VOS sensor will be subject to vertical and horizontal displacement (Figure 14). A VOS chain I in the vertical plane determines two displacement components, depending on the chain orientation: $(\delta_{x_i}^{I}$ or $\delta_{y_i}^{I})$ and $\delta_{z_i}^{I}$, whereas in the horizontal plane they are $\delta_{x_i}^{I}$ and $\delta_{y_i}^{I}$. The final deformation relative to the ground floor is the sum of the partial ones:
$\delta_x^{I} = \sum_i \delta_{x_i}^{I}, \qquad \delta_y^{I} = \sum_i \delta_{y_i}^{I}, \qquad \delta_z^{I} = \sum_i \delta_{z_i}^{I}$.  (10)
These displacements are relative. The total displacement of the structure Δ, including vertical displacement Δz and its roll ΔL(H), are absolute quantities. Their values are yielded by referencing the VOS to the external targets (Figure 15).
Therefore, the displacements that VOS measurements can establish have the following form:
$\Delta_x^{I} = \Delta_x + \Delta L_x + \sum_i \delta_{x_i}^{I}, \qquad \Delta_y^{I} = \Delta_y + \Delta L_y + \sum_i \delta_{y_i}^{I}, \qquad \Delta_z^{I} = \Delta_z + \sum_i \delta_{z_i}^{I}$.  (11)
According to Figure 14, displacement in the horizontal plane for a chain sensor to sensor will be determined as shown in Figure 16.
It is proposed to determine the VOS sensor displacement in the horizontal plane using an image comparison algorithm based on the phase correlation method [95]. Considering a pair of images, we take one of them as the reference, denoted A, and the second as the target, denoted B. Let $f_A(x,y)$ and $f_B(x,y)$ be images, one of which is shifted by $(x_0, y_0)$ relative to the other, and let $F_A(u,v)$ and $F_B(u,v)$ be their Fourier transforms; then:
$f_A(x,y) = f_B(x - x_0,\, y - y_0)$,  (12)
$F_A(u,v) = e^{-j 2\pi (u x_0 + v y_0)}\, F_B(u,v)$,  (13)
$R = \dfrac{F_B(u,v)\, F_A^{*}(u,v)}{\left| F_B(u,v)\, F_A^{*}(u,v) \right|}$,  (14)
where R is the cross-spectrum and $F_A^{*}$ is the complex conjugate of $F_A$. Calculating the inverse Fourier transformation of the cross-spectrum, we obtain the impulse function:
$F^{-1}\{R\} = \delta(x - x_0,\, y - y_0)$.  (15)
Having found the maximum of this function, we determine the required displacement.
Now let us find the rotation angle $\theta_0$ under the premise of a displacement $(x_0, y_0)$, using polar coordinates:
$f_A(x,y) = f_B(x \cos\theta_0 - y \sin\theta_0 - x_0,\; x \sin\theta_0 + y \cos\theta_0 - y_0)$,  (16)
$F_A(u,v) = e^{-j 2\pi (u x_0 + v y_0)}\, F_B(u \cos\theta_0 - v \sin\theta_0,\; u \sin\theta_0 + v \cos\theta_0)$,  (17)
$F_A(\rho, \theta) = F_B(\rho,\, \theta - \theta_0)$.  (18)
It is possible to determine both the target and the camera displacements using this algorithm. To do so, the first image from the camera is taken as the reference (A), all subsequent ones are considered input images (B), and the image center $(n_0, m_0)$ is searched for. The shift $(\delta_n, \delta_m)$ of each new image relative to the reference describes the camera movement (Figure 17a).
To find the target displacement, each new image from the camera is taken as the reference image (A), and the target template is used as the input one (B). Therefore, by finding the target center $(n_0, m_0)$ in the first and subsequent images, it is possible to determine the displacement of the target in the image coordinate system (Figure 17b). To obtain data independent of the camera displacement, the displacement has to be transformed with respect to the center of each image in the image set. In both cases, knowing the image resolution, the displacements in pixels $(\delta_n, \delta_m)$ can be transformed into displacements in millimeters $(\delta_a, \delta_b)$.
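For illustration, a minimal MatLab sketch of the shift estimate (12)–(15) is given below. It is a generic phase correlation implementation, not the authors' production code, and the file names are hypothetical.

```matlab
% Minimal sketch of the shift estimate by phase correlation (12)-(15).
% Assumption: A (reference) and B (input) are equal-size grayscale images.
A = double(imread('epoch0.png'));     % hypothetical file names
B = double(imread('epoch1.png'));

FA = fft2(A);
FB = fft2(B);
C  = FB .* conj(FA);
R  = C ./ (abs(C) + eps);             % normalized cross-spectrum (14)
r  = real(ifft2(R));                  % impulse function (15)

[~, k]   = max(r(:));                 % the peak marks the sought shift
[y0, x0] = ind2sub(size(r), k);
shift    = [y0 x0] - 1;               % zero-based shift, px

sz  = size(r);                        % wrap shifts past half the image size
idx = shift > sz/2;
shift(idx) = shift(idx) - sz(idx);
% with the resolution c (mm/px), the displacement in mm is shift * c
```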

3. Results

3.1. In-Field Experimental Study of the Displacement Measurements for a Single Chain

It was decided to prepare and perform experimental measurements for a single chain of VOS sensors to verify the above theoretical considerations. As a sensor of the VOS, a General Electric G100 digital camera was used. The images were collected under the conditions of Section 2.1 with an f-number of 3.9. To simulate the target movements, a solid plate with the QR target and a test bench, which allowed moving the target in a horizontal plane by small given values $(\delta_{a_i}, \delta_{b_i})$, were manufactured and applied. The operation of the VOS sensor chain in a horizontal plane was simulated for different distances between the target and the camera (Figure 18).
The test bench (Figure 19a) was developed at the Department of Applied Geodesy, Kyiv National University of Construction and Architecture. The test bench allows the precise movement of any attached device (here, the QR target) in space with an accuracy of ±0.1 mm. Initially, this device was deployed for precise alignment measurements and setting-out but was revamped for our study purposes. The simulation of the single chain was performed at the Department of Applied Geodesy (Figure 19b).
Since the primary element of the VOS is a camera, it must be calibrated before usage. The calibration was accomplished with PhotoModeler software; however, any software that calculates the necessary parameters can be used for the calibration procedure. The calibration parameters were: focal length −68.02 mm; image format size 6.39 mm × 4.79 mm; principal point coordinates 3.83 mm, 2.39 mm; and lens distortion coefficients K1 = 0.000152 and K2 = 4.18 × 10−7. The overall calibration accuracy equals 0.28 px, with a maximum residual of 1.38 px. The calibration error distribution for different photos is presented in Figure 20.
Three simulation tests were performed for different camera (sensor)–target distances. The camera was focused to infinity, and the standard parameters for daylight image capturing were set. No additional light sources were necessary. Image processing was performed with the phase correlation algorithm in MatLab software.
The image resolution parameter for each distance was determined according to the previous propositions (Table 2).
In the first step, the stability of the camera position during the experiment was checked. The results showed that the camera was stable during image capturing; this result is due to setting the camera to automatic mode for image capture during the surveying time. The images were taken automatically with a time interval of 30 s. The ISO speed values changed from ISO-100 to ISO-800. The results are similar between the series, and the data are given in Table 3.
In the course of the displacement study, the images were taken in automatic mode. The interval between consecutive images was set to 60 s. Every minute, the QR target displacement was changed manually to the values presented in Table 4, Table 5 and Table 6. When determining the target displacement, the target image was cut from the first image and assigned as the input image. The target template of the same size was used as the reference image (Figure 21c).
The QR target displacements were obtained by processing the image series. The displacement determination accuracy was found as the difference between the measured and the manually set displacements. The results for each test are presented in Table 4, Table 5 and Table 6.
The displacement determination errors allow calculating the root mean square (RMS) errors (19) of the displacement determination for each test measurement and tentatively estimating the accuracy of the phase correlation algorithm:
$m_a = \sqrt{\dfrac{\sum_i \Delta a_i^2}{n}}, \qquad m_b = \sqrt{\dfrac{\sum_i \Delta b_i^2}{n}}$,  (19)
where $\Delta a_i$ and $\Delta b_i$ are the displacement determination errors and n is the number of measurements.
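In code, expression (19) is a one-liner; a MatLab helper (ours, not taken from the paper) applied to the residual vectors behind Tables 4–6 would be:

```matlab
function [m_a, m_b] = rms_errors(da, db)
% RMS errors (19); da, db are vectors of differences between the
% measured and the manually set displacements, mm
m_a = sqrt(sum(da.^2) / numel(da));
m_b = sqrt(sum(db.^2) / numel(db));
end
```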
The results of the accuracy estimation are summarized in Table 7.
The results in Table 7 are quite close to the values in Figure 10, which describes the total measurement error for a single chain. It was found that the larger the displacement, the larger the error of its determination. However, this dependency must not mislead: for shorter distances, we manually assigned larger displacements. In general, we may accept an RMS error of 6.5 mm and use this value for the simulation of a VOS that contains multiple chains with interrelated measurements.

3.2. Preliminary Analysis of the VOS Accuracy for the Test Structure

The results obtained in the previous subsection permit determining the accuracy of VOS operation for a typical building. Insofar as this is a preliminary accuracy estimation step, it is possible to simulate the VOS operation for a building with simple geometry. It is supposed that the sensors are placed 30 m apart from each other. The simulation was performed for two buildings with different heights: 90 m and 420 m. The first building is presented in Figure 22. The VOS scheme was implemented according to Figure 5a. As the test building height was small, it was decided not to use GNSS measurements for this test analysis.
In Figure 22, the blue points are the places of sensor installation, and the arrows indicate the measurement directions. Points 10, 11, 12, and 13 are the targets on the surrounding objects that determine the external coordinate system. These points are accepted as errorless, but the accuracy of their determination was also estimated. The root mean square errors from Table 7 were used as descriptive statistics for the statistical simulation of the measurement errors. The simulation was performed by the Monte-Carlo method for a normal (Gaussian) distribution. The measurement errors were considered random with zero mean, so there were no systematic or gross errors (blunders). The simulation results are presented in Figure 23 and summarized in Appendix A, Table A1 and Table A2.
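The error generation behind such a simulation can be sketched in a few MatLab lines. The network adjustment with interrelated measurements is omitted here, so this is only a conceptual, worst-case illustration under our own simplifying assumptions:

```matlab
% Conceptual Monte-Carlo sketch: zero-mean Gaussian errors with the
% 6.5 mm RMS accepted in Section 3.1, accumulated along k stacked chains.
% No adjustment and no interrelated measurements are modeled here.
sigma  = 6.5;               % single-chain RMS, mm
k      = 14;                % number of chains, e.g., 420 m / 30 m
trials = 1e5;               % Monte-Carlo sample size

err = sigma * randn(trials, k);   % per-chain errors for each trial
top = sum(err, 2);                % accumulated error at the building top
rms_top = std(top);               % approx. sigma*sqrt(k), i.e., ~24 mm
```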
In the figures throughout the text, the error ellipses are given for the confidence level of 95% for better presentation. The correspondence with the tables below can be found using the t-coefficient equal to 2 for the probability of 95%.
In Figure 23a, the spatial error ellipses for the sensor–target system are given, while in Figure 23b, the relative spatial error ellipses are presented. These ellipses describe the relative accuracy between two sensors. The sense of the spatial ellipse elements is clear from Figure 24.
The numerical values for the ellipses in Figure 23 are presented in Appendix A, Table A1 and Table A2.
Such an approach allows determining the whole set of monitoring parameters: the vertical displacement of the building, roll, bending, and torsion. Suppose one needs to find the relative displacements of the building elements. In that case, it is required to exclude the external control targets and simulate the VOS output in the internal coordinate system. The simulation was carried out for the same test building as in Figure 22. The simulation results are presented in Figure 25.
The numerical results are summarized in Appendix A, Table A3 and Table A4. The values in Table A1, Table A2, Table A3 and Table A4 allow assessing the probable accuracy of the monitoring parameter determination in the external and internal coordinate systems.
The second simulation analysis deals with a tall building (420 m). The simulation parameters were the same as in the previous case. However, this case allows us to check the effectiveness of the combined solution, namely, the integration of the VOS and GNSS. Thus, the GNSS accuracy for the top of the building was integrated with the accuracy of the simulated VOS. The simulation results are presented in Appendix B: in Figure A1, where the absolute accuracy is given, and in Figure A2 for the relative observations. The accuracy of the GNSS measurements was assigned using standard values, e.g., ±2–3 mm along the horizontal axes and ±5 mm for elevation. For the first case (Figure A1a), the GNSS measurements are considered non-fixed; in other words, these measurements are included in the adjustment procedure. Two GNSS antennas on the building top are suggested as the most widespread case. The second case (Figure A1b) presents the measurements without GNSS observations. Therefore, the errors of the VOS are not restricted by GNSS measurements and propagate with the building height.
The numerical results are summarized in Figure 26 and Figure 27 as the preliminary accuracy of the massive number of points is better portrayed in the graphs. Figure 26 describes the RMS errors of coordinate determination along the coordinate axis for the case of absolute measurements accompanied by GNSS, i.e., the measurements referenced to some external coordinate system.
Figure 27 shows the simulation results for measurements without GNSS support.
The results outlined in Table A1, Table A2, Table A3 and Table A4 and Figure 26 and Figure 27 are the subjects of the discussion and analysis that follow.

4. Discussion

The data were analyzed using two approaches: analysis of the experimental studies and analysis of the simulation. So far, the obtained results have demonstrated the capabilities of the VOS. However, what about the achieved accuracy? Is it enough to monitor various engineering structures, especially high-rise buildings? First, let us analyze the results of the experimental studies accomplished in Section 3.1. To do that, it is necessary to propagate the accuracy of one chain to the multi-chain case. This question is essential when strict requirements are imposed on the accuracy of monitoring parameter determination. Whereas the requirements for the accuracy of vertical displacement determination are not severe, the demands for roll or bending measurements are quite tight. The most widespread condition for roll and bending determination is based on the requirement of an allowable deviation δ from the building's vertical axis during construction:
$\delta = 0.167 \cdot H$,  (20)
where H is the building height in meters, and δ is in millimeters.
The allowable deviation δ is turned into monitoring accuracy using the expression:
$m = \dfrac{\delta}{t}$,  (21)
where t is the Laplace coefficient that depends on the probability level. Typically, t equals 2 or 2.5, corresponding to the 95% and 99% probability values. However, in monitoring practice, it is sometimes suggested to use t equal to 5 to increase the reliability of the measurement results. Let us suppose that the accuracy along the x and y axes is equal to m. The resulting accuracy will depend on the number of chains k used for the measurements. Under this premise, the final accuracy M can be determined as:
$m_x = m\sqrt{k}, \qquad m_y = m\sqrt{k}, \qquad M = \sqrt{m_x^2 + m_y^2}$.  (22)
Expressions (22) permit us to compute the accuracy of a multi-chain VOS and compare these values with the allowable values from (20). The calculations for various heights have been performed (Table 8), taking the figures from Table 7. The accuracy calculations were carried out for a VOS installed along the entire height of the structure with a step of 17, 25, and 33 m.
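The arithmetic behind Table 8 can be reproduced with a few lines of MatLab; the height grid below is our own assumption for illustration, and the pass/fail threshold clearly depends on the chosen t and chain step:

```matlab
% A sketch of the Table 8 comparison: allowable deviation (20), required
% accuracy (21), and multi-chain error propagation (22).
H    = [60 90 150 210 300 420 500];   % assumed height grid, m
step = 33;                            % chain length, m (one of 17/25/33)
t    = 2;                             % Laplace coefficient, 95%
m1   = 6.5;                           % single-chain RMS, mm (Table 7)

delta = 0.167 .* H;                   % allowable deviation (20), mm
m_req = delta ./ t;                   % required accuracy (21), mm
k     = H ./ step;                    % number of chains
M     = sqrt(2) .* m1 .* sqrt(k);     % resulting accuracy M (22), mm
ok    = M <= m_req;                   % heights where a stand-alone VOS fits
```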
Figure 28 is a graphic summary of the results from Table 8. The horizontal axis describes a building’s height, while the vertical axis highlights the accuracy propagation.
The experimental results and further calculations yielded some interesting findings. The calculations by (22) prove the impossibility of leveraging the suggested VOS alone for monitoring. The general picture emerging from the results is that the multi-chain VOS can probably ensure the necessary accuracy only for buildings higher than 500 m. The principal point is the rising efficiency of the VOS with building height. The inclusion of GNSS measurements changes the final distribution of the RMS errors. However, these findings are not generalizable because the calculation approach suggested above does not account for the effect of the interrelated measurements seen in Figure 22.
Thus, it is essential to simulate the VOS measurements to account for the redundancy of measurements. Therefore, the second step is the analysis of the simulation results in Section 3.2.
Let us summarize the results presented in Table A1, Table A2, Table A3 and Table A4. The accuracy at each block was averaged, and the mean accuracy value was accepted as final for analysis. These values were compared with allowable values (20). Moreover, the simulation results allow one to estimate two measurement modes: relative and absolute.
As seen in Figure 29, the simulation results provide a more lifelike picture of the VOS accuracy. The final accuracy has improved thanks to accounting for the measurement redundancy. The VOS therefore provides a reliable determination of the monitoring parameters starting from a height of 90 m for absolute measurements. Obviously, the installation of the VOS for such a small building is pointless, as conventional geodetic methods provide the necessary accuracy and are well studied. Things get much more complicated for higher buildings.
To estimate the efficiency of the VOS for tall structures with the inclusion of GNSS measurements, the simulation was performed for the 420 m building. Let us analyze the results in Figure 26 and Figure 27. Again, the analysis is better presented graphically. To do so, the RMS errors over each floor were averaged and compared in Figure 30.
The results of the GNSS-assisted VOS simulation look different from the VOS-only simulation, and one may infer a couple of interesting findings. First, thanks to the GNSS observations, the accuracy of the VOS remains at almost the same level for the whole structure. This effect grows with the structure height insofar as the GNSS restricts the error propagation in the VOS. Secondly, as expected, the GNSS-assisted VOS can ensure the necessary accuracy starting from 60 m. We obtained a weighted accuracy value for high structures thanks to the combined adjustment. Therefore, the simulation results proved the high capability of the developed GNSS-assisted low-cost vision-based observation system for deformation monitoring.
The specific structure of the VOS puts forward some restrictions on the application of this system. These restrictions are defined by the geometry and construction technology of the monitored objects. Considering the geometry, one needs to pay attention to the VOS scheme: for the chains, we need straight lines between the sensors, so curvilinear structures require modification of the single-chain construction. Moreover, the measurement processing is not straightforward. As an example, let us consider the simplest case, namely, the VOS for horizontal monitoring of a curvilinear structure (e.g., dams, tunnels, shells, etc.) between two reference points (Figure 31). The QR targets are placed perpendicularly to the sensors but at angles (α, φ) to each other. The measured displacements must be converted with respect to the coordinate axes. The manufacturing and installation of the system become complicated. Therefore, the idea of the VOS has to be developed and studied in the future for the case of curvilinear structures. So far, the considered and examined scheme applies to high-rise buildings.
The second condition is the material of the monitored structure. This condition especially matters when we deal with temperature deformation: temperature influence leads to structure bending. In the Introduction, it was pointed out that bending is one of the primary issues of monitoring, and the VOS is a solution to this problem. The bending values will be different for different heights (Figure 32). In the simplest case, the bending due to temperature is described by
$R = \dfrac{\alpha\, \Delta t\, H^2}{2 D}$,  (23)
where α is the linear extension coefficient of the material (α = 12.1 × 10−6 1/°C for structures made of steel, α = 10.8 × 10−6 1/°C for structures made of concrete), Δt is the temperature difference between different sides of the structure, H is the height, and D is the mean structure size in plan. For a structure with D = 100 m and H = 420 m, we obtained the values given in Table 9.
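For orientation, a MatLab evaluation of (23) with an assumed temperature difference (Δt = 20 °C is our illustrative choice, not a value taken from Table 9):

```matlab
% Thermal bending by (23) for the 420 m test case; dt is assumed.
alpha = [12.1e-6 10.8e-6];    % 1/degC: steel, concrete
dt    = 20;                   % assumed temperature difference, degC
H     = 420;  D = 100;        % height and mean plan size, m

R_mm = alpha .* dt .* H^2 ./ (2*D) * 1000;  % ~213 mm (steel), ~190 mm (concrete)
```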
The VOS measurement range for displacements was taken from Figure 13. Regardless of the material, the VOS measurement range covers the possible deformation almost three times over. So, in this case, there are no special requirements or restrictions on the VOS application.

5. Conclusions

This study proposed a new approach for monitoring high-rise buildings using a GNSS-assisted low-cost non-metric camera system. The accuracy examination of a VOS single chain on a test bench confirmed the possibility of ensuring the necessary monitoring accuracy. The suggested method for determining the displacement between a pair of images based on the phase correlation algorithm showed stable results in a series of field experiments. Adequate distances between the sensor and target, providing reasonable accuracy, were studied and determined based on the experimental results. The simulation of the VOS was performed for two cases: GNSS-free for low-rise buildings, and the VOS with additional GNSS observations. The simulation showed the necessary accuracy for deformation monitoring in the case of the GNSS-assisted VOS. The results presented in this paper were mainly limited to a simulation study; therefore, the findings are not fully generalizable to actual VOS operation. However, they give clues to further research directions. Future studies will focus on researching the effects of camera calibration, changes in target illumination, and optical beam distortion due to refraction on the resulting accuracy. We must address the issues of determining the monitoring parameters for a real structure and deploying the prediction model. Future research will also have to assess the extent to which the VOS is applicable to structures of different geometry. The complete implementation of the VOS may become an indispensable part of smart building solutions to detect unallowable displacements in automatic mode.

Author Contributions

Conceptualization, R.S.; methodology, R.S.; software, A.O. and A.A.; validation, A.O., Y.M. and R.S.; formal analysis, Y.M. and A.A.; investigation, R.S., A.A. and A.O.; resources, Y.M.; data curation, R.S. and Y.M.; writing—original draft preparation, Y.M. and A.O.; writing—review and editing, R.S., A.A., Y.M. and A.O.; visualization, A.A., A.O. and Y.M.; supervision, R.S.; project administration, R.S. and A.A.; funding acquisition, A.A., Y.M. and A.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study did not require ethical approval.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are unavailable due to privacy restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. The accuracy of the point coordinate determination.

Floor         Point  Semi-Major Axis mX (m)  Semi-Minor Axis mY (m)  Elevation mZ (m)
Ground floor  10     0.0010                  0.0010                  0.0010
              11     0.0010                  0.0010                  0.0010
              12     0.0010                  0.0010                  0.0010
              1000   0.0030                  0.0024                  0.0031
              1001   0.0029                  0.0017                  0.0019
              1002   0.0039                  0.0039                  0.0044
              1003   0.0041                  0.0037                  0.0046
              M01    0.0035                  0.0045                  0.0060
              M02    0.0029                  0.0028                  0.0033
1st block     2000   0.0035                  0.0030                  0.0046
              2001   0.0040                  0.0033                  0.0042
              2002   0.0039                  0.0037                  0.0042
              2003   0.0040                  0.0035                  0.0042
              M21    0.0030                  0.0041                  0.0050
              M22    0.0034                  0.0038                  0.0049
2nd block     13     0.0010                  0.0010                  0.0010
              3000   0.0038                  0.0033                  0.0056
              3001   0.0046                  0.0040                  0.0053
              3002   0.0040                  0.0038                  0.0044
              3003   0.0040                  0.0033                  0.0041
              M31    0.0030                  0.0028                  0.0032
              M32    0.0036                  0.0041                  0.0055
Top floor     4000   0.0041                  0.0037                  0.0065
              4001   0.0051                  0.0045                  0.0064
              4002   0.0043                  0.0041                  0.0049
              4003   0.0043                  0.0037                  0.0046
              M41    0.0030                  0.0048                  0.0054
              M42    0.0038                  0.0050                  0.0063
Table A2. The relative accuracy of the point coordinates (95% confidence level).

From  To    Semi-Major Axis mX (m)  Semi-Minor Axis mY (m)  Elevation mZ (m)
10    1000  0.0074                  0.0058                  0.0060
12    M02   0.0072                  0.0067                  0.0062
13    M31   0.0076                  0.0064                  0.0061
1000  1001  0.0082                  0.0051                  0.0068
1000  1003  0.0115                  0.0074                  0.0100
1000  2000  0.0085                  0.0001                  0.0076
1000  M01   0.0113                  0.0069                  0.0126
1001  11    0.0072                  0.0038                  0.0034
1001  1002  0.0104                  0.0092                  0.0093
1001  2001  0.0103                  0.0004                  0.0075
1001  M02   0.0073                  0.0066                  0.0073
1002  1003  0.0091                  0.0057                  0.0068
1002  2002  0.0088                  0.0001                  0.0068
1002  M02   0.0081                  0.0070                  0.0084
1003  2003  0.0094                  0.0001                  0.0067
1003  M01   0.0101                  0.0067                  0.0099
2000  2001  0.0114                  0.0061                  0.0117
2000  2003  0.0115                  0.0070                  0.0106
2000  3000  0.0086                  0.0000                  0.0078
2000  M21   0.0104                  0.0066                  0.0122
2001  2002  0.0127                  0.0076                  0.0102
2001  3001  0.0101                  0.0000                  0.0078
2001  M22   0.0101                  0.0065                  0.0117
2002  2003  0.0088                  0.0056                  0.0070
2002  3002  0.0083                  0.0000                  0.0065
2002  M22   0.0091                  0.0070                  0.0085
2003  3003  0.0087                  0.0000                  0.0065
2003  M21   0.0092                  0.0066                  0.0086
3000  3001  0.0123                  0.0062                  0.0145
3000  3003  0.0115                  0.0068                  0.0115
3000  4000  0.0092                  0.0001                  0.0080
3000  M31   0.0090                  0.0068                  0.0118
3001  3002  0.0138                  0.0072                  0.0115
3001  4001  0.0103                  0.0001                  0.0080
3001  M32   0.0103                  0.0066                  0.0135
3002  3003  0.0089                  0.0056                  0.0070
3002  4002  0.0087                  0.0001                  0.0067
3002  M32   0.0092                  0.0069                  0.0087
3003  4003  0.0091                  0.0001                  0.0067
3003  M31   0.0089                  0.0057                  0.0076
4000  4001  0.0144                  0.0063                  0.0172
4000  4003  0.0134                  0.0070                  0.0134
4000  M41   0.0129                  0.0077                  0.0153
4001  4002  0.0156                  0.0076                  0.0134
4001  M42   0.0132                  0.0070                  0.0158
4002  4003  0.0102                  0.0056                  0.0080
4002  M42   0.0107                  0.0076                  0.0099
4003  M41   0.0112                  0.0076                  0.0097
M01   M02   0.0103                  0.0073                  0.0124
M01   M21   0.0092                  0.0000                  0.0096
M02   M22   0.0099                  0.0001                  0.0088
M21   M22   0.0104                  0.0072                  0.0119
M21   M31   0.0087                  0.0000                  0.0089
M22   M32   0.0090                  0.0000                  0.0092
M31   M32   0.0096                  0.0066                  0.0111
M31   M41   0.0105                  0.0000                  0.0095
M32   M42   0.0103                  0.0000                  0.0097
Table A3. The accuracy of the point coordinate determination.

Floor         Point  Semi-Major Axis mX (m)  Semi-Minor Axis mY (m)  Elevation mZ (m)
Ground floor  1000   0.0030                  0.0023                  0.0030
              1001   0.0029                  0.0016                  0.0017
              1002   0.0038                  0.0038                  0.0044
              1003   0.0040                  0.0037                  0.0046
              M01    0.0034                  0.0044                  0.0060
              M02    0.0029                  0.0027                  0.0032
1st block     2000   0.0034                  0.0030                  0.0046
              2001   0.0040                  0.0032                  0.0041
              2002   0.0039                  0.0037                  0.0041
              2003   0.0039                  0.0034                  0.0042
              M21    0.0029                  0.0041                  0.0050
              M22    0.0034                  0.0038                  0.0048
2nd block     3000   0.0037                  0.0033                  0.0056
              3001   0.0045                  0.0039                  0.0053
              3002   0.0039                  0.0038                  0.0043
              3003   0.0039                  0.0032                  0.0040
              M31    0.0029                  0.0027                  0.0031
              M32    0.0035                  0.0040                  0.0055
Top floor     4000   0.0040                  0.0036                  0.0065
              4001   0.0050                  0.0045                  0.0063
              4002   0.0042                  0.0040                  0.0048
              4003   0.0042                  0.0037                  0.0046
              M41    0.0029                  0.0048                  0.0054
              M42    0.0037                  0.0050                  0.0063
Table A4. The relative accuracy of the point coordinates (95% confidence level). mX: semi-major axis; mY: semi-minor axis; mZ: elevation (all in meters).

| From | To | mX (m) | mY (m) | mZ (m) | From | To | mX (m) | mY (m) | mZ (m) |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 1000 | 0.0072 | 0.0057 | 0.0059 | 12 | M02 | 0.0071 | 0.0066 | 0.0062 |
| 13 | M31 | 0.0075 | 0.0062 | 0.0061 | 1000 | 1001 | 0.0081 | 0.0049 | 0.0065 |
| 1000 | 1003 | 0.0114 | 0.0073 | 0.0099 | 1000 | 2000 | 0.0084 | 0.0001 | 0.0076 |
| 1000 | M01 | 0.0113 | 0.0068 | 0.0126 | 1001 | 11 | 0.0070 | 0.0038 | 0.0034 |
| 1001 | 1002 | 0.0102 | 0.0092 | 0.0091 | 1001 | 2001 | 0.0102 | 0.0000 | 0.0075 |
| 1001 | M02 | 0.0070 | 0.0065 | 0.0070 | 1002 | 1003 | 0.0091 | 0.0056 | 0.0079 |
| 1002 | 2002 | 0.0088 | 0.0001 | 0.0068 | 1002 | M02 | 0.0081 | 0.0070 | 0.0084 |
| 1003 | 2003 | 0.0094 | 0.0001 | 0.0067 | 1003 | M01 | 0.0101 | 0.0066 | 0.0098 |
| 2000 | 2001 | 0.0113 | 0.0060 | 0.0116 | 2000 | 2003 | 0.0113 | 0.0069 | 0.0105 |
| 2000 | 3000 | 0.0086 | 0.0000 | 0.0078 | 2000 | M21 | 0.0104 | 0.0066 | 0.0121 |
| 2001 | 2002 | 0.0126 | 0.0076 | 0.0101 | 2001 | 3001 | 0.0101 | 0.0000 | 0.0078 |
| 2001 | M22 | 0.0101 | 0.0064 | 0.0116 | 2002 | 2003 | 0.0088 | 0.0056 | 0.0070 |
| 2002 | 3002 | 0.0083 | 0.0000 | 0.0065 | 2002 | M22 | 0.0091 | 0.0069 | 0.0085 |
| 2003 | 3003 | 0.0086 | 0.0000 | 0.0065 | 2003 | M21 | 0.0092 | 0.0065 | 0.0086 |
| 3000 | 3001 | 0.0123 | 0.0061 | 0.0144 | 3000 | 3003 | 0.0113 | 0.0067 | 0.0115 |
| 3000 | 4000 | 0.0092 | 0.0001 | 0.0080 | 3000 | M31 | 0.0089 | 0.0067 | 0.0117 |
| 3001 | 3002 | 0.0137 | 0.0072 | 0.0114 | 3001 | 4001 | 0.0103 | 0.0001 | 0.0080 |
| 3001 | M32 | 0.0103 | 0.0065 | 0.0134 | 3002 | 3003 | 0.0089 | 0.0056 | 0.0070 |
| 3002 | 4002 | 0.0087 | 0.0001 | 0.0067 | 3002 | M32 | 0.0092 | 0.0069 | 0.0087 |
| 3003 | 4003 | 0.0091 | 0.0001 | 0.0067 | 3003 | M31 | 0.0088 | 0.0056 | 0.0075 |
| 4000 | 4001 | 0.0144 | 0.0061 | 0.0172 | 4000 | 4003 | 0.0133 | 0.0069 | 0.0134 |
| 4000 | M41 | 0.0129 | 0.0076 | 0.0153 | 4001 | 4002 | 0.0155 | 0.0075 | 0.0134 |
| 4001 | M42 | 0.0131 | 0.0069 | 0.0158 | 4002 | 4003 | 0.0102 | 0.0055 | 0.0080 |
| 4002 | M42 | 0.0107 | 0.0075 | 0.0099 | 4003 | M41 | 0.0112 | 0.0075 | 0.0096 |
| M01 | M02 | 0.0102 | 0.0072 | 0.0123 | M01 | M21 | 0.0092 | 0.0000 | 0.0096 |
| M02 | M22 | 0.0098 | 0.0001 | 0.0088 | M21 | M22 | 0.0104 | 0.0071 | 0.0119 |
| M21 | M31 | 0.0087 | 0.0000 | 0.0089 | M22 | M32 | 0.0090 | 0.0000 | 0.0092 |
| M31 | M32 | 0.0096 | 0.0065 | 0.0111 | M31 | M41 | 0.0105 | 0.0000 | 0.0095 |
| M32 | M42 | 0.0103 | 0.0000 | 0.0097 | | | | | |

Appendix B

Figure A1. The absolute error ellipses of coordinate determination (a) with additional GNSS observations; (b) without additional GNSS observations.
Figure A2. The relative error ellipses of coordinate determination (a) with additional GNSS observations; (b) without additional GNSS observations.

References

1. Shults, R.; Annenkov, A.; Bilous, M.; Kovtun, V. Interpretation of geodetic observations of the high-rise buildings displacements. J. Geod. Cartogr. 2016, 42, 39–46.
2. Shults, R.; Roshchyn, O. Preliminary determination of spatial geodetic monitoring accuracy for free station method. Geod. List 2016, 70, 355–370. Available online: https://hrcak.srce.hr/178883 (accessed on 19 February 2023).
3. Hostinová, A.; Kopáčik, A. Monitoring of high-rise building. In Proceedings of the Integrating Generations FIG Working Week, Stockholm, Sweden, 14–19 June 2008.
4. Zschiesche, K. Image Assisted Total Stations for Structural Health Monitoring—A Review. Geomatics 2022, 2, 1.
5. Paar, R.; Roić, M.; Marendić, A.; Miletić, S. Technological Development and Application of Photo and Video Theodolites. Appl. Sci. 2021, 11, 3893.
6. Kaartinen, E.; Dunphy, K.; Sadhu, A. LiDAR-Based Structural Health Monitoring: Applications in Civil Infrastructure Systems. Sensors 2022, 22, 4610.
7. Zhou, H.; Xu, C.; Tang, X.; Wang, S.; Zhang, Z. A Review of Vision-Laser-Based Civil Infrastructure Inspection and Monitoring. Sensors 2022, 22, 5882.
8. Maru, M.B.; Lee, D.; Tola, K.D.; Park, S. Comparison of Depth Camera and Terrestrial Laser Scanner in Monitoring Structural Deflections. Sensors 2021, 21, 201.
9. Huang, Q.; Wang, Y.; Luzi, G.; Crosetto, M.; Monserrat, O.; Jiang, J.; Zhao, H.; Ding, Y. Ground-Based Radar Interferometry for Monitoring the Dynamic Performance of a Multitrack Steel Truss High-Speed Railway Bridge. Remote Sens. 2020, 12, 2594.
10. Budillon, A.; Schirinzi, G. Remote Monitoring of Civil Infrastructure Based on TomoSAR. Infrastructures 2022, 7, 52.
11. Miano, A.; Di Carlo, F.; Mele, A.; Giannetti, I.; Nappo, N.; Rompato, M.; Striano, P.; Bonano, M.; Bozzano, F.; Lanari, R.; et al. GIS Integration of DInSAR Measurements, Geological Investigation and Historical Surveys for the Structural Monitoring of Buildings and Infrastructures: An Application to the Valco San Paolo Urban Area of Rome. Infrastructures 2022, 7, 89.
12. Ogaja, C.; Li, X.; Rizos, C. Advances in structural monitoring with global positioning system technology: 1997–2006. J. Appl. Geod. 2008, 1, 171–179.
13. Yi, T.-H.; Li, H.-N.; Gu, M. Recent research and applications of GPS-based monitoring technology for high-rise structures. Struct. Control Health Monit. 2013, 20, 649–670.
14. Khoo, V.H.S.; Tor, Y.K.; Ong, G. Monitoring of high rise building using real-time differential GPS. In Proceedings of the FIG Congress 2010 Facing the Challenges—Building the Capacity, Sydney, Australia, 11–16 April 2010.
15. Cinque, D.; Saccone, M.; Capua, R.; Spina, D.; Falcolini, C.; Gabriele, S. Experimental Validation of a High Precision GNSS System for Monitoring of Civil Infrastructures. Sustainability 2022, 14, 10984.
16. Ozer, C.Y.; Li, X.; Inal, C.; Ge, L.; Yetkin, M.; Arslan, H.M. Analysis of wind-induced response of tall reinforced concrete building based on data collected by GPS and precise inclination sensor. In Proceedings of the FIG Congress 2010 Facing the Challenges—Building the Capacity, Sydney, Australia, 11–16 April 2010.
17. Lăpădat, A.M.; Tiberius, C.C.J.M.; Teunissen, P.J.G. Experimental Evaluation of Smartphone Accelerometer and Low-Cost Dual Frequency GNSS Sensors for Deformation Monitoring. Sensors 2021, 21, 7946.
18. Meier, E.; Geiger, A.; Ingensand, H.; Licht, H.; Limpach, P.; Steiger, A.; Zwyssig, R. Hydrostatic levelling systems: Measuring at the system limits. J. Appl. Geod. 2010, 4, 91–102.
19. Zhang, D.; Yu, Z.; Xu, Y.; Ding, L.; Ding, H.; Yu, Q.; Su, Z. GNSS Aided Long-Range 3D Displacement Sensing for High-Rise Structures with Two Non-Overlapping Cameras. Remote Sens. 2022, 14, 379.
20. McCormick, N.; Lord, J. Digital image correlation for structural measurements. Proc. Inst. Civ. Eng. Civ. Eng. 2012, 165, 185–190.
21. Mousa, M.A.; Yussof, M.M.; Udi, U.J.; Nazri, F.M.; Kamarudin, M.K.; Parke, G.A.R.; Assi, L.N.; Ghahari, S.A. Application of Digital Image Correlation in Structural Health Monitoring of Bridge Infrastructures: A Review. Infrastructures 2021, 6, 176.
22. Szeliski, R. Computer Vision: Algorithms and Applications, 2nd ed.; Springer: London, UK, 2022; p. 1232.
23. Schreier, H.; Orteu, J.-J.; Sutton, M.A. Image Correlation for Shape, Motion and Deformation Measurements. Basic Concepts, Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2009; p. 320.
24. Forsyth, D.; Ponce, J. Computer Vision: A Modern Approach, 2nd ed.; Prentice Hall: Boston, MA, USA, 2012; p. 793.
25. Aliansyah, Z.; Shimasaki, K.; Senoo, T.; Ishii, I.; Umemoto, S. Single-Camera-Based Bridge Structural Displacement Measurement with Traffic Counting. Sensors 2021, 21, 4517.
26. Fradelos, Y.; Thalla, O.; Biliani, I.; Stiros, S. Study of Lateral Displacements and the Natural Frequency of a Pedestrian Bridge Using Low-Cost Cameras. Sensors 2020, 20, 3217.
27. Feng, D.; Feng, M.Q. Computer vision for SHM of civil infrastructure: From dynamic response measurement to damage detection—A review. Eng. Struct. 2018, 156, 105–117.
28. Ye, X.W.; Dong, C.Z.; Liu, T. A review of machine vision-based structural health monitoring: Methodologies and applications. J. Sens. 2016, 2016, 7103039.
29. Wang, J.; Li, G. Study on Bridge Displacement Monitoring Algorithms Based on Multi-Targets Tracking. Future Internet 2020, 12, 9.
30. Jeong, Y.; Park, D.; Park, K.H. PTZ Camera-Based Displacement Sensor System with Perspective Distortion Correction Unit for Early Detection of Building Destruction. Sensors 2017, 17, 430.
31. Lydon, D.; Lydon, M.; Kromanis, R.; Dong, C.-Z.; Catbas, N.; Taylor, S. Bridge Damage Detection Approach Using a Roving Camera Technique. Sensors 2021, 21, 1246.
32. Hoskere, V.; Park, J.W.; Yoon, H.; Spencer, B.F., Jr. Vision-based modal survey of civil infrastructure using unmanned aerial vehicles. J. Struct. Eng. 2019, 145, 04019062.
33. Carroll, S.; Satme, J.; Alkharusi, S.; Vitzilaios, N.; Downey, A.; Rizos, D. Drone-Based Vibration Monitoring and Assessment of Structures. Appl. Sci. 2021, 11, 8560.
34. Yoon, H.; Hoskere, V.; Park, J.-W.; Spencer, B.F., Jr. Cross-Correlation-Based Structural System Identification Using Unmanned Aerial Vehicles. Sensors 2017, 17, 2075.
35. Dong, C.-Z.; Catbas, F.N. A review of computer vision–based structural health monitoring at local and global levels. Struct. Health Monit. 2021, 20, 692–743.
36. Zheng, P. Crack Detection and Measurement Utilizing Image-Based Reconstruction; Project and Report; Virginia Tech: Blacksburg, VA, USA; 41p.
37. Parente, L.; Falvo, E.; Castagnetti, C.; Grassi, F.; Mancini, F.; Rossi, P.; Capra, A. Image-Based Monitoring of Cracks: Effectiveness Analysis of an Open-Source Machine Learning-Assisted Procedure. J. Imaging 2022, 8, 22.
38. Spencer, B.F.; Hoskere, V.; Narazaki, Y. Advances in computer vision-based civil infrastructure inspection and monitoring. Engineering 2019, 5, 199–222.
39. Wojnarowski, A.E.; Leonteva, A.B.; Tyurin, S.V.; Tikhonov, S.G.; Artemeva, O.V. Photogrammetric Technology for Remote High-Precision 3D Monitoring of Cracks and Deformation Joints of Buildings and Constructions. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2019, XLII-5/W2, 95–101 (Measurement, Visualisation and Processing in BIM for Design and Construction Management, Prague, Czech Republic, 24–25 September 2019).
40. Lei, B.; Ren, Y.; Wang, N.; Huo, L.; Song, G. Design of a new low-cost unmanned aerial vehicle and vision-based concrete crack inspection method. Struct. Health Monit. 2020, 19, 1871–1883.
41. Choi, D.; Bell, W.; Kim, D.; Kim, J. UAV-Driven Structural Crack Detection and Location Determination Using Convolutional Neural Networks. Sensors 2021, 21, 2650.
42. Mitchell, H.; Fryer, J.; Chandler, J. Applications of 3D Measurement from Images; Whittles Publishing: Dunbeath, UK, 2007; p. 336.
43. Luhmann, T.; Robson, S.; Kyle, S.; Böhm, J. Close-Range Photogrammetry and 3D Imaging, 3rd ed.; De Gruyter: Berlin, Germany, 2019; p. 822.
44. Chou, J.-Y.; Chang, C.-M. Image Motion Extraction of Structures Using Computer Vision Techniques: A Comparative Study. Sensors 2021, 21, 6248.
45. Deng, G.; Zhou, Z.; Shao, S.; Chu, X.; Jian, C. A Novel Dense Full-Field Displacement Monitoring Method Based on Image Sequences and Optical Flow Algorithm. Appl. Sci. 2020, 10, 2118.
46. Feng, D.; Feng, M.Q. Vision-based multipoint displacement measurement for structural health monitoring. Struct. Control Health Monit. 2016, 23, 876–890.
47. Feng, D.; Feng, M.Q.; Ozer, E.; Fukuda, Y. A vision-based sensor for noncontact structural displacement measurement. Sensors 2015, 15, 16557–16575.
48. Kohut, P.; Holak, K.; Uhl, T.; Ortyl, Ł.; Owerko, T.; Kuras, P.; Kocierz, R. Monitoring of a civil structure's state based on noncontact measurements. Struct. Health Monit. 2013, 12, 411–429.
49. Narazaki, Y.; Gomez, F.; Hoskere, V.; Smith, M.D.; Spencer, B.F. Efficient development of vision-based dense three-dimensional displacement measurement algorithms using physics-based graphics models. Struct. Control Health Monit. 2021, 20, 1841–1863.
50. Sangirardi, M.; Altomare, V.; De Santis, S.; de Felice, G. Detecting Damage Evolution of Masonry Structures through Computer-Vision-Based Monitoring Methods. Buildings 2022, 12, 831.
51. Taghavikish, S.; Elhabiby, M. Target Based 2D Digital Image Correlation Deflection Monitoring to Analyze the Environmental Effect on Variations of Deflection on Structures. Geomatics 2021, 1, 192–205.
52. Yoon, H.; Elanwar, H.; Choi, H.J.; Golparvar-Fard, M.; Spencer, B.F., Jr. Target-free approach for vision-based structural system identification using consumer-grade cameras. Struct. Control Health Monit. 2016, 23, 1405–1416.
53. Ngeljaratan, L.; Moustafa, M.A. Implementation and Evaluation of Vision-Based Sensor Image Compression for Close-Range Photogrammetry and Structural Health Monitoring. Sensors 2020, 20, 6844.
54. Ngeljaratan, L.; Moustafa, M.A. Underexposed Vision-Based Sensors' Image Enhancement for Feature Identification in Close-Range Photogrammetry and Structural Health Monitoring. Appl. Sci. 2021, 11, 11086.
55. Kalybek, M.; Bocian, M.; Nikitas, N. Performance of Optical Structural Vibration Monitoring Systems in Experimental Modal Analysis. Sensors 2021, 21, 1239.
56. Kim, S.-W.; Jeon, B.-G.; Kim, N.-S.; Park, J.-C. Vision-based monitoring system for evaluating cable tensile forces on a cable-stayed bridge. Struct. Health Monit. 2013, 12, 440–456.
57. Park, J.-W.; Moon, D.-S.; Yoon, H.; Gomez, F.; Spencer, B.F., Jr.; Kim, J.R. Visual–inertial displacement sensing using data fusion of vision-based displacement with acceleration. Struct. Control Health Monit. 2018, 25, e2122.
58. Qiu, Z.; Li, H.; Hu, W.; Wang, C.; Liu, J.; Sun, Q. Real-Time Tunnel Deformation Monitoring Technology Based on Laser and Machine Vision. Appl. Sci. 2018, 8, 2579.
59. Inclinometers. Available online: http://www.geo-observations.com/inclinometers (accessed on 30 June 2021).
60. Komarizadehasl, S.; Komary, M.; Alahmad, A.; Lozano-Galant, J.A.; Ramos, G.; Turmo, J. A Novel Wireless Low-Cost Inclinometer Made from Combining the Measurements of Multiple MEMS Gyroscopes and Accelerometers. Sensors 2022, 22, 5605.
61. Ozbek, M. Smart Maintenance and Health Monitoring of Buildings and Infrastructure Using High-Resolution Laser Scanners. Buildings 2022, 12, 454.
62. Park, H.S.; Son, S.; Choi, S.W.; Kim, Y. Wireless Laser Range Finder System for Vertical Displacement Monitoring of Mega-Trusses during Construction. Sensors 2013, 13, 5796.
63. Wireless Tilt Meter—Introduction, Application, Features & Operating Principle. Available online: https://www.encardio.com/blog/wireless-tilt-meter-introduction-application-features-operating-principle/ (accessed on 13 May 2021).
64. Rodrigues, D.V.Q.; Li, C. A Review on Low-Cost Microwave Doppler Radar Systems for Structural Health Monitoring. Sensors 2021, 21, 2612.
65. In-Place 3D Inclinometer/Settlement (IPIS) System—Features, Applications & Working. Available online: https://www.encardio.com/blog/in-place-3d-inclinometer-settlement-ipis-system-features-applications-working/ (accessed on 13 May 2021).
66. A Guide on Geotechnical Instruments: Types & Application. Available online: https://www.encardio.com/blog/a-guide-on-geotechnical-instruments-types-application/ (accessed on 13 May 2021).
67. Fraden, J. Handbook of Modern Sensors: Physics, Designs, and Applications, 4th ed.; Springer: New York, NY, USA, 2010.
68. Guidelines for Instrumentation of Large Dams; Doc. No. CDSO_GUD_DS_02_v1.0; Central Water Commission, Ministry of Water Resources, River Development & Ganga Rejuvenation, Government of India: New Delhi, India, 2017.
69. Transnational Model of Sustainable Protection and Conservation of Historic Ruins. Best Practices Handbook. Publication within the project "RUINS: Sustainable Re-Use, Preservation and Modern Management of Historical Ruins in Central Europe—Elaboration of Integrated Model and Guidelines Based on the Synthesis of the Best European Experiences", supported by the Interreg CENTRAL EUROPE Programme funded under the European Regional Development Fund. Available online: https://www.venetiancluster.eu/wp-content/uploads/2020/03/D.T1.4.3-Best-practice-handbook-transnational-model-of-sustainable-protection-and-conservation-of-ruins.pdf (accessed on 19 February 2023).
70. Transportation Research Circular E-C129: Use of Inclinometers for Geotechnical Instrumentation on Transportation Projects. State of the Practice; Transportation Research Board: Washington, DC, USA, 2008.
71. Wu, R.-T.; Jahanshahi, M.R. Data fusion approaches for structural health monitoring and system identification: Past, present, and future. Struct. Health Monit. 2020, 19, 552–586.
72. Kot, P.; Muradov, M.; Gkantou, M.; Kamaris, G.S.; Hashim, K.; Yeboah, D. Recent Advancements in Non-Destructive Testing Techniques for Structural Health Monitoring. Appl. Sci. 2021, 11, 2750.
73. Palma, P.; Steiger, R. Structural health monitoring of timber structures—Review of available methods and case studies. Constr. Build. Mater. 2020, 248, 118528.
74. A Guide on Structural Health Monitoring (SHM). Available online: https://encardio.medium.com/encardio-rite-a-guide-on-structural-health-monitoring-shm-eb39fd02fe9a (accessed on 13 May 2021).
75. Caballero-Russi, D.; Ortiz, A.R.; Guzmán, A.; Canchila, C. Design and Validation of a Low-Cost Structural Health Monitoring System for Dynamic Characterization of Structures. Appl. Sci. 2022, 12, 2807.
76. Meng, X.; Nguyen, D.T.; Xie, Y.; Owen, J.S.; Psimoulis, P.; Ince, S.; Chen, Q.; Ye, J.; Bhatia, P. Design and Implementation of a New System for Large Bridge Monitoring—GeoSHM. Sensors 2018, 18, 775.
77. Bezas, K.; Komianos, V.; Koufoudakis, G.; Tsoumanis, G.; Kabassi, K.; Oikonomou, K. Structural Health Monitoring in Historical Buildings: A Network Approach. Heritage 2020, 3, 796–818.
78. Barthorpe, R.J.; Worden, K. Emerging Trends in Optimal Structural Health Monitoring System Design: From Sensor Placement to System Evaluation. J. Sens. Actuator Netw. 2020, 9, 31.
79. Mufti, A.; Bakht, B.; Humar, J.; Jalali, J.; Newhook, J.; Rahman, S. Guidelines for Structural Health Monitoring—Design Manual No. 2; ISIS Canada, Intelligent Sensing for Innovative Structures: Winnipeg, MB, Canada, 2001.
80. Delo, G.; Civera, M.; Lenticchia, E.; Miraglia, G.; Surace, C.; Ceravolo, R. Interferometric Satellite Data in Structural Health Monitoring: An Application to the Effects of the Construction of a Subway Line in the Urban Area of Rome. Appl. Sci. 2022, 12, 1658.
81. Entezami, A.; Arslan, A.N.; De Michele, C.; Behkamal, B. Online Hybrid Learning Methods for Real-Time Structural Health Monitoring Using Remote Sensing and Small Displacement Data. Remote Sens. 2022, 14, 3357.
82. Baumann-Ouyang, A.; Butt, J.A.; Salido-Monzú, D.; Wieser, A. MIMO-SAR Interferometric Measurements for Structural Monitoring: Accuracy and Limitations. Remote Sens. 2021, 13, 4290.
83. Ponzo, F.C.; Iacovino, C.; Ditommaso, R.; Bonano, M.; Lanari, R.; Soldovieri, F.; Cuomo, V.; Bozzano, F.; Ciampi, P.; Rompato, M. Transport Infrastructure SHM Using Integrated SAR Data and On-Site Vibrational Acquisitions: "Ponte Della Musica–Armando Trovajoli" Case Study. Appl. Sci. 2021, 11, 6504.
84. Marchewka, A.; Ziółkowski, P.; Aguilar-Vidal, V. Framework for Structural Health Monitoring of Steel Bridges by Computer Vision. Sensors 2020, 20, 700.
85. Ciotta, V.; Asprone, D.; Manfredi, G.; Cosenza, E. Building Information Modelling in Structural Engineering: A Qualitative Literature Review. CivilEng 2021, 2, 765–793.
86. O'Shea, M.; Murphy, J. Design of a BIM Integrated Structural Health Monitoring System for a Historic Offshore Lighthouse. Buildings 2020, 10, 131.
87. Panah, R.S.; Kioumarsi, M. Application of Building Information Modelling (BIM) in the Health Monitoring and Maintenance Process: A Systematic Review. Sensors 2021, 21, 837.
88. Shults, R. Geospatial monitoring of engineering structures as a part of BIM. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2022, XLVI-5/W1-2022, 225–230.
89. Mustapha, S.; Lu, Y.; Ng, C.-T.; Malinowski, P. Sensor Networks for Structures Health Monitoring: Placement, Implementations, and Challenges—A Review. Vibration 2021, 4, 551–585.
90. Ručevskis, S.; Rogala, T.; Katunin, A. Optimal Sensor Placement for Modal-Based Health Monitoring of a Composite Structure. Sensors 2022, 22, 3867.
91. Sivasuriyan, A.; Vijayan, D.S.; Górski, W.; Wodzyński, Ł.; Vaverková, M.D.; Koda, E. Practical Implementation of Structural Health Monitoring in Multi-Story Buildings. Buildings 2021, 11, 263.
92. Yang, Y.-S.; Xue, Q.; Chen, P.-Y.; Weng, J.-H.; Li, C.-H.; Liu, C.-C.; Chen, J.-S.; Chen, C.-T. Image Analysis Applications for Building Inter-Story Drift Monitoring. Appl. Sci. 2020, 10, 7304.
93. Zhuang, Y.; Chen, W.; Jin, T.; Chen, B.; Zhang, H.; Zhang, W. A Review of Computer Vision-Based Structural Deformation Monitoring in Field Environments. Sensors 2022, 22, 3789.
94. Zhang, X.; Zeinali, Y.; Story, B.A.; Rajan, D. Measurement of Three-Dimensional Structural Displacement Using a Hybrid Inertial Vision-Based System. Sensors 2019, 19, 4083.
95. Sarvaiya, J.; Patnaik, S.; Kothari, K. Image registration using log polar transform and phase correlation to recover higher scale. J. Pattern Recognit. Res. 2012, 7, 90–105.
Figure 1. Monitoring parameters: (a) vertical displacements; (b) roll; (c) bending; (d) torsion.
Figure 2. The structure displacement caused by either roll or bending in combination with spatial displacement.
Figure 3. The general concept of the VOS and its installation.
Figure 4. VOS sensor and target.
Figure 5. VOS single chain: (a) sensor to sensor; (b) sensor–target–sensor.
Figure 6. VOS installation schemes: (a) the scheme with sensor-to-sensor observations; (b) the scheme with sensor–target–sensor observations.
Figure 7. Image acquisition via system camera CCD matrix.
Figure 8. The graph of the resolution error.
Figure 9. The graph of the defocusing error.
Figure 10. The resulting error graph.
Figure 11. The relationship between camera parameters and size of ground coverage.
Figure 12. The relationship between camera parameters and size of ground coverage.
Figure 13. The QR target size at the different distances.
Figure 14. Displacements along vertical chains and monitoring parameters (top view) in the horizontal plane.
Figure 15. Total displacements of the structure and its roll.
Figure 16. Displacement geometry in the horizontal plane.
Figure 17. The scheme of the displacement determination (\(V_1=\sqrt{\delta n_1^2+\delta m_1^2}\); \(V_n=\sqrt{\delta n_n^2+\delta m_n^2}\)): (a) for camera; (b) for target.
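For illustration, substituting the measured Test 1 shifts from Table 4 (\(\delta n = 9\) px, \(\delta m = -3\) px) into the caption's formula gives \(V=\sqrt{9^2+3^2}\approx 9.5\) px, i.e., roughly 18.2 mm at the 1.92 mm/px resolution of Table 2.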
Figure 18. The VOS single-chain simulation scheme.
Figure 18. The VOS single-chain simulation scheme.
Applsci 13 02813 g018
Figure 19. The equipment for the single-chain simulation: (a) mechanical equipment on a tripod for the precise displacement of QR target; (b) place for the test bench setup and single-chain test; (c) the sampling images from the testing site (left image: ISO-800, right image: ISO-100).
Figure 19. The equipment for the single-chain simulation: (a) mechanical equipment on a tripod for the precise displacement of QR target; (b) place for the test bench setup and single-chain test; (c) the sampling images from the testing site (left image: ISO-800, right image: ISO-100).
Applsci 13 02813 g019aApplsci 13 02813 g019b
Figure 20. Calibration errors for different photos.
Figure 20. Calibration errors for different photos.
Applsci 13 02813 g020
Figure 21. Finding the target center based on the reference and input images: (a) image acquisition; (b) input image of 100 × 100 pixels; (c) reference image.
Figure 21. Finding the target center based on the reference and input images: (a) image acquisition; (b) input image of 100 × 100 pixels; (c) reference image.
Applsci 13 02813 g021
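The target-center search illustrated in Figure 21 relies on the phase correlation algorithm. The MATLAB sketch below (MATLAB being the paper's processing environment) is a minimal illustration of such a matcher, not the authors' code: it assumes the reference and input images are equally sized grayscale crops around the QR target, that the motion is a pure integer-pixel translation, and the file names are placeholders.

```matlab
% Minimal phase-correlation sketch (illustrative only, not the authors' code).
% 'reference.png' and 'input.png' are placeholder names for equally sized crops.
ref = im2double(im2gray(imread('reference.png')));
inp = im2double(im2gray(imread('input.png')));

% Cross-power spectrum, normalized to unit magnitude so only phase remains
F1 = fft2(ref);
F2 = fft2(inp);
R  = F1 .* conj(F2);
R  = R ./ max(abs(R), eps);

% The peak of the inverse transform marks the integer pixel shift
c = abs(ifft2(R));
[~, idx] = max(c(:));
[row, col] = ind2sub(size(c), idx);

% Convert the peak position to signed shifts (rows -> delta_m, columns -> delta_n)
dm = row - 1; if dm > size(c,1)/2, dm = dm - size(c,1); end
dn = col - 1; if dn > size(c,2)/2, dn = dn - size(c,2); end
fprintf('pixel shift: delta_n = %d, delta_m = %d\n', dn, dm);
```

Because the correlation surface is computed in the frequency domain, the cost is dominated by two FFTs per pair, which is what makes the method fast enough for processing long image sequences.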
Figure 22. The test building geometry, sensor and target placement, and measurement directions.
Figure 23. The error ellipses of coordinate determination: (a) the accuracy ellipses of the point coordinate determination; (b) the relative accuracy ellipses of the point coordinates.
Figure 24. The error ellipse elements.
Figure 25. The error ellipses of coordinate determination for relative observations: (a) the accuracy ellipses of the point coordinate determination; (b) the relative accuracy ellipses of the point coordinates.
Figure 26. Accuracy of absolute coordinate determination with additional GNSS observations.
Figure 27. Accuracy of absolute coordinate determination without additional GNSS observations.
Figure 28. VOS accuracy propagation.
Figure 29. Simulation results comparison.
Figure 30. Simulation results comparison for GNSS-accompanied case.
Figure 31. VOS installation scheme for curvilinear structure.
Figure 32. Bending variation.
Table 1. The results of resolution determination.

| Parameter | Value | Distance S (m) | Size on the Ground (mm) | Pixels | Resolution (mm/px) |
|---|---|---|---|---|---|
| φa | 4°53′52.7″ | 1 | 85.694 | 4320 | 0.020 |
| φb | 3°36′57.5″ | 1 | 63.194 | 3240 | 0.020 |
Table 2. Resolution values for different distances (the last two columns give the resolution c along the a and b directions).

| Test | S (m) | a (m) | b (m) | m (px) | n (px) | ca (mm/px) | cb (mm/px) |
|---|---|---|---|---|---|---|---|
| Test 1 | 33 | 2.83 | 2.08 | 4320 | 3240 | 1.92 | 1.92 |
| Test 2 | 25 | 2.14 | 1.58 | 4320 | 3240 | 1.45 | 1.45 |
| Test 3 | 17 | 1.46 | 1.07 | 4320 | 3240 | 0.99 | 0.99 |
Table 3. The results of camera shift determination during image capturing. no and mo are the nth image center coordinates (px); δn, δm and δa, δb are the camera shifts between image capturing series in pixels and millimeters, respectively.

| Image Pair (A-B) | no | mo | δn (px) | δm (px) | δa (mm) | δb (mm) |
|---|---|---|---|---|---|---|
| 1-1 | 2160 | 1620 | 0 | 0 | 0 | 0 |
| 1-2 | 2160 | 1620 | 0 | 0 | 0 | 0 |
| 1-3 | 2160 | 1620 | 0 | 0 | 0 | 0 |
Table 4. Displacement determination for QR target at distance S = 33 m. no, mo: target center (px); n, m: relative target center (px); δn, δm: target displacement between observation epochs (px); δa, δb: the same displacement (mm); δa1, δb1: manual displacement (mm); Δa, Δb: displacement error (mm).

| Target | no | mo | n | m | δn | δm | δa | δb | δa1 | δb1 | Δa | Δb |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Input | 2159 | 1583 | −1 | −37 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2158 | 1582 | −2 | −38 | −1 | −1 | −1.9 | −1.9 | −1 | 0 | −0.9 | −1.9 |
| | 2168 | 1580 | 8 | −40 | 9 | −3 | 17.3 | −5.8 | 9 | 0 | 8.3 | −5.8 |
| Reference | 2159 | 1583 | −1 | −37 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2158 | 1582 | −2 | −38 | −1 | −1 | −1.9 | −1.9 | −1 | 0 | −0.9 | −0.8 |
| | 2167 | 1580 | 7 | −40 | 8 | −3 | 15.4 | −5.8 | 9 | 0 | 6.4 | −5.8 |
Table 5. Displacement determination for QR target at distance S = 25 m. Column meanings as in Table 4 (manual displacement columns δa2, δb2).

| Target | no | mo | n | m | δn | δm | δa | δb | δa2 | δb2 | Δa | Δb |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Input | 2210 | 1528 | 50 | −92 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2209 | 1529 | 49 | −91 | −1 | 1 | −1.4 | 1.4 | −1 | 0 | 0.4 | 1.4 |
| | 2195 | 1529 | 35 | −91 | −15 | 1 | −21.8 | 1.4 | −10 | 0 | 11.8 | 1.4 |
| Reference | 2211 | 1528 | 51 | −92 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2210 | 1528 | 50 | −92 | −1 | 0 | −1.4 | 0.0 | −1 | 0 | −0.4 | 0.0 |
| | 2196 | 1529 | 36 | −91 | −15 | 1 | −21.8 | 1.4 | −10 | 0 | −11.8 | 1.4 |
Table 6. Displacement determination for QR target at distance S = 17 m. Column meanings as in Table 4 (manual displacement columns δa3, δb3).

| Target | no | mo | n | m | δn | δm | δa | δb | δa3 | δb3 | Δa | Δb |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Input | 2151 | 1537 | −9 | −83 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2148 | 1537 | −12 | −83 | −3 | 0 | −3 | 0.0 | −1 | 0 | −2 | 0.0 |
| | 2128 | 1535 | −32 | −85 | −23 | −2 | −22.8 | −2 | −10 | 0 | −12.8 | −2.0 |
| Reference | 2151 | 1536 | −9 | −84 | 0 | 0 | 0.0 | 0.0 | 0 | 0 | 0.0 | 0.0 |
| | 2148 | 1537 | −12 | −83 | −3 | 1 | −3.0 | 1.0 | −1 | 0 | −2.0 | −1.0 |
| | 2127 | 1538 | −33 | −82 | −24 | 2 | −23.8 | 2.0 | −10 | 0 | −13.8 | 2.0 |
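The arithmetic behind Tables 4–6 can be made explicit with a short MATLAB fragment. The sketch below reproduces the third epoch of Test 1 (S = 33 m), taking the resolution c from Table 2 and the pixel shifts from Table 4; the variable names are illustrative only.

```matlab
% Pixel-to-millimeter bookkeeping for one epoch of Test 1 (S = 33 m).
c   = 1.92;           % ground resolution from Table 2 (mm/px)
dn  = 9;  dm = -3;    % pixel shifts of the target center (Table 4)
da  = dn * c;         % displacement along a: ~17.3 mm (Table 4 reports 17.3)
db  = dm * c;         % displacement along b: ~-5.8 mm
da1 = 9;  db1 = 0;    % manually introduced displacement (mm)
errA = da - da1;      % displacement error: ~8.3 mm, as in Table 4
errB = db - db1;      % displacement error: ~-5.8 mm
```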
Table 7. Accuracy of displacement determination.

| Test | S (m) | ma (mm) | mb (mm) | Total RMS (mm) |
|---|---|---|---|---|
| Test 1 | 33 | 4.3 | 3.4 | 5.5 |
| Test 2 | 25 | 6.8 | 1.0 | 6.9 |
| Test 3 | 17 | 7.8 | 1.2 | 7.9 |
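The last column of Table 7 is consistent with combining the two components in quadrature, \(M=\sqrt{m_a^2+m_b^2}\); for Test 1, \(\sqrt{4.3^2+3.4^2}\approx 5.5\) mm.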
Table 8. Accuracy dependency between the chain’s length and building height.
Table 8. Accuracy dependency between the chain’s length and building height.
Sensor-to-Target Distance (m)Building Height (m)
50100150200250300350400450
Displacement Accuracy M (mm)
1713.519.223.527.130.333.235.838.340.6
259.813.816.919.521.823.925.827.629.3
337.310.312.614.516.217.819.220.521.8
Allowable value3.36.710.013.416.720.023.426.730.0
Table 9. Bending values depending on structure material and height. Columns give the building height (m); cell values are the bending R (mm).

| Structure | 50 | 100 | 150 | 200 | 250 | 300 | 350 | 400 | 450 |
|---|---|---|---|---|---|---|---|---|---|
| Steel, Δt = 20 °C | 3 | 12 | 27 | 48 | 76 | 109 | 148 | 194 | 245 |
| Steel, Δt = 30 °C | 5 | 18 | 41 | 73 | 113 | 163 | 222 | 290 | 368 |
| Concrete, Δt = 20 °C | 3 | 11 | 24 | 43 | 68 | 97 | 132 | 173 | 219 |
| Concrete, Δt = 30 °C | 4 | 16 | 36 | 65 | 101 | 146 | 198 | 259 | 328 |
| VOS measurement range | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
