Article

Attitude Estimation Method for Target Ships Based on LiDAR Point Clouds via An Improved RANSAC

Shengzhe Wei, Yuminghao Xiao, Xinde Yang and Hongdong Wang *

Key Laboratory of Marine Intelligent Equipment and System of Ministry of Education, Shanghai Jiao Tong University, Shanghai 200240, China
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2023, 11(9), 1755; https://doi.org/10.3390/jmse11091755
Submission received: 4 August 2023 / Revised: 3 September 2023 / Accepted: 6 September 2023 / Published: 8 September 2023
(This article belongs to the Special Issue Maritime Autonomous Surface Ships)

Abstract

The accurate attitude estimation of target ships plays a vital role in ensuring the safety of marine transportation, especially for tugs. A Light Detection and Ranging (LiDAR) system can generate 3D point clouds to describe the target ship’s geometric features that possess attitude information. In this work, the authors put forward a new attitude-estimation framework that first extracts the geometric features (i.e., the board-side plane of a ship) using point clouds from shipborne LiDAR and then computes the attitude that is of interest (i.e., yaw and roll in this paper). To extract the board-side plane accurately on a moving ship with sparse point clouds, an improved Random Sample Consensus (RANSAC) algorithm with a pre-processing normal vector-based filter was designed to exclude noise points. A real water-pool experiment and two numerical tests were carried out to demonstrate the accuracy and general applicability of the target-ship attitude estimation enabled by the improved RANSAC and the estimation framework. The experimental results show that the average mean absolute errors of the angle and angular-rate estimation are 0.4879 deg and 4.2197 deg/s, respectively, which are 92.93% and 75.36% more accurate than the estimation based on standard RANSAC.

1. Introduction

The attitude estimation of surrounding ships is of great importance, as it lays the foundation for collision avoidance [1,2] by helping in the prediction of the target ship’s stability in complex and close-range scenes, such as towing operations for tugs [3], cargo transfer between ships [4], and marine replenishment [5]. In towing operations for tugs, attitude observation is especially necessary because the tugs need to maintain the sailing state of the target ship within the operating range, which is sometimes even less than 5 m [6]. Currently, most ships rely on radar images, an Automatic Identification System (AIS), or a human lookout to obtain the attitude information of other ships [7,8,9,10], whose accuracy is easily affected by environmental disturbances. In addition, limited by the dimensions of states that can be perceived by these methods, only a few components of the ship attitude can be deduced, which is not sufficient for towing tasks. For example, attitude estimation based on radar images can only provide a yaw angle and suffers from inaccuracies brought by electromagnetic interference.
To solve the above problem, the 3D Light Detection and Ranging (LiDAR) system is a promising and powerful piece of equipment to accurately and thoroughly estimate the attitude of the target ship [11]. In recent years, 3D LiDAR has been used in various perception systems thanks to its advantages of high measurement accuracy and timely response [12] for tasks such as object segmentation and mapping [13], obstacle detection [14], target recognition [15], and self-state estimation [16]. H. Wang et al. [16] applied LiDAR and a registration method to estimate the self-state. Nocerino et al. [17] applied LiDAR to attitude estimation for uncooperative space targets. Both their methods involve the multi-frame registration of the point clouds and require several steady frames. Their methods can suffer from a large self-rolling motion, which is the typical situation in shipborne attitude-estimation systems. The traditional LiDAR-based attitude-estimation algorithm for target ships is the bounding-box method [18], which can only provide a rapid estimation of a ship’s yaw angle. This method neglects the rolling motion and lacks the estimation of other attitude states, limiting its application. Therefore, a new attitude-perception method based on LiDAR is valuable for ships carrying out missions such as towing or berthing.
Attitude estimation can be realized by recognizing the geometric features attached to ships and calculating the attitude according to the obtained features. One promising feature-extraction paradigm used for this purpose is Random Sample Consensus (RANSAC), which is widely used in feature extraction from point clouds [19]. The key idea of the standard RANSAC is the extraction of a predefined geometric model from the point clouds by randomly selecting a minimal set of data points and using these data points for the construction of a candidate model [20]. Due to the uncertainty introduced by the random sampling process, standard RANSAC can derive a false geometric model if irrelevant points are picked for model fitting. Moreover, LiDAR consistently generates noise points that affect RANSAC. To address this problem, researchers have put forward a series of algorithms to improve the standard RANSAC. For example, B. Wang et al. [21] proposed an improved RANSAC that can extract the ground plane from the point cloud of a vehicle-borne LiDAR. The research adopted a post-processing method that analyzed the normal vector of the extracted plane to decrease the false extraction rate of the static ground plane. Nevertheless, for a geometric model fixed on a moving ship, such a post-processing treatment may mistakenly exclude the data points on the target geometric features since the locomotion can change the attitude. Yang et al. [22] proposed an improved RANSAC with weighted principal component analysis-based normal estimation and angular clustering before the fitting process to improve efficiency. However, their method requires a dense point cloud that is hard to obtain with a shipborne LiDAR.
To deduce the attitude of ships under rolling motion, this study puts forward an attitude-estimation method for target ships using 3D point clouds from shipborne LiDAR. The estimation algorithm first calibrates the point cloud using the Inertial Measurement Unit (IMU) to deal with rolling motion. Then, we extract the feature plane fixed on a ship from the point cloud and calculate the attitude that is of interest (we take yaw, roll, yaw rate, and roll rate as examples in this paper). To realize accurate geometric-feature extraction on a moving ship from a sparse point cloud with irrelevant points, this study adds a pre-processing normal vector-based filter to the standard RANSAC. The main contributions of this paper are summarized as follows:
  • The authors propose a target ship attitude-perception framework under self-rolling motion based on an estimation of the attitude of a feature plane fixed on the target ship. The self-rolling motion is dealt with by calibrating the point cloud using IMU.
  • The authors improve the standard RANSAC by adding a normal vector-based filter in the extraction process, which can accurately determine the feature plane from the point cloud under unknown noises.
  • The authors conduct real water-pool experiments and several numerical simulations to verify the filtering ability, high accuracy, and general applicability of the proposed attitude-perception framework and the improved RANSAC. Remarkably, we demonstrate its filtering ability when facing unknown reflections and the practical applicability of our method in real water-pool experiments.
The remainder of this paper is organized as follows. In Section 2, we first define the mathematical problem of attitude estimation for ships, along with the necessary assumptions. Then, we formulate the overall framework of the proposed attitude-estimation method. After that, we explain the improved RANSAC in detail, with special emphasis on the normal vector-based filter and plane fitting. In Section 3, we perform a real perception experiment in a water pool based on an unmanned surface vehicle (USV) and several numerical simulations based on two typical ship models, the container ship and the yacht, to verify the improved accuracy of our framework. In Section 4, we discuss the results, and in Section 5, we briefly conclude the whole paper.

2. Attitude Estimation Method for Target Ship

2.1. Problem Statement

In fine weather without fog or rain, LiDAR can generate precise and sufficient point clouds of the target ship whose moving frequency is lower than the working frequency of LiDAR (typically 10 Hz). In most cases, the moving frequency of ships can satisfy the LiDAR requirement [23]. As shown in Figure 1, a sufficient ship point cloud contains a deck, board side, and other plane surfaces fixed to the ship and can reflect the attitude of the ship. In this paper, we adopt the board-side plane as the feature plane and transfer the attitude-estimation problem to the feature-plane-extraction problem. After the plane-fitting process, the normal vector of the board-side plane can be calculated, and then we estimate the ship attitude using the geometric relation between the board-side plane and the attitude angles of the ship. We remark that different geometric features can be employed to estimate the attitude of different types of vessels through the framework proposed here. To extract the feature plane from the point cloud and distinguish it from other planes, we designed an improved RANSAC algorithm by adding a normal vector-based filter, which can remove interfering planes.
One target ship can reflect thousands of points in a single scanning cycle of LiDAR, and this amount of data leads to a long calculation time. In this paper, a voxel filter is utilized to accelerate the calculation process without loss of generality; its main idea is to substitute the points within each rectangular voxel with their centroid. We note that the voxel filter can simultaneously preserve the shape feature and reduce the number of points in the point cloud [24]. In Figure 2, we exhibit the original point cloud of a container ship, which consists of 2561 points, in (a), and the point cloud after voxel filtering, consisting of only 286 points, in (b). Compared with the original point cloud, the filtered point cloud’s point number is significantly reduced while the shape feature is retained.
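To make the down-sampling step concrete, the following minimal sketch shows one way such a voxel-grid filter can be implemented with NumPy; the function name and the single leaf-size parameter are illustrative choices, not the exact implementation used in this work.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, leaf_size: float) -> np.ndarray:
    """Replace all points that fall into the same cubic voxel with their centroid.

    points: (N, 3) array of x, y, z coordinates; leaf_size: voxel edge length
    (e.g., 3 cm in the water-pool experiment, 1 m in the container ship test).
    """
    # Integer voxel index of every point.
    voxel_idx = np.floor(points / leaf_size).astype(np.int64)
    # Group points that share a voxel index and average each group.
    _, inverse, counts = np.unique(voxel_idx, axis=0,
                                   return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)   # sum the points of each voxel
    return centroids / counts[:, None]      # divide by each group size
```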
The coordinates are defined as follows. The North-East-Down frame (NED) is marked as “On–XnYnZn” (as shown in Figure 3), which is also called the world coordinate system. The body-fixed reference frame (BODY) of the own ship is marked as “Ob–XbYbZb”, and the BODY of the target ship is marked as “Ot–XtYtZt”. Considering the self-rolling motion, we use the IMU to calibrate the point cloud from the BODY frame to the NED frame. In the transformation process, the position of the LiDAR is chosen as the origin of the BODY frame of the own ship, which reduces the time spent on coordinate transformations. $(\psi_t, \varphi_t, \phi_t)$ represents the yaw angle, roll angle, and pitch angle of the target ship, and $(\psi_s, \varphi_s, \phi_s)$ represents the yaw angle, roll angle, and pitch angle of the own ship, which are collected by the IMU. These angles can be used to calculate the rotation transformation matrix between the BODY frame of the own ship and the world coordinate system using
$$
\begin{bmatrix} \tilde{x} \\ \tilde{y} \\ \tilde{z} \end{bmatrix}
= R_z R_y R_x \begin{bmatrix} x \\ y \\ z \end{bmatrix},
$$
with
$$
R_z = \begin{bmatrix} \cos\psi_s & -\sin\psi_s & 0 \\ \sin\psi_s & \cos\psi_s & 0 \\ 0 & 0 & 1 \end{bmatrix},\quad
R_y = \begin{bmatrix} \cos\phi_s & 0 & \sin\phi_s \\ 0 & 1 & 0 \\ -\sin\phi_s & 0 & \cos\phi_s \end{bmatrix},\quad
R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\varphi_s & -\sin\varphi_s \\ 0 & \sin\varphi_s & \cos\varphi_s \end{bmatrix}.
$$
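As a concrete illustration of this calibration step, the sketch below builds the rotation $R_z R_y R_x$ from the own ship’s IMU angles and rotates the LiDAR points from the BODY frame to the NED frame; the function name and the assumption that the angles are already given in radians are ours.

```python
import numpy as np

def body_to_ned(points: np.ndarray, yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Rotate (N, 3) LiDAR points from the own ship's BODY frame to the NED frame
    using the IMU angles (radians), i.e., x_ned = Rz @ Ry @ Rx @ x_body."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    R = Rz @ Ry @ Rx
    return points @ R.T        # apply the rotation to every point (row)
```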

2.2. Overall Workflow of the Framework

As shown in Figure 4, the ship attitude-estimation method proposed in this paper can be divided into three steps:
  • Step 1: The first step finishes the voxel-grid filter-based down-sampling process according to the given sample size, as demonstrated in Section 2.1.
  • Step 2: The second step is the feature extraction from the point clouds by the improved RANSAC, which will be elaborated in Section 2.3. In brief, the point cloud is transformed into the same coordinate system as the own ship. Then, the down-sampled point cloud is preprocessed by a normal vector-based filter, which estimates the normal vector of each point (defined later in Section 2.3) and filters the irrelevant points according to the tolerance. After that, the plane-fitting process can provide the optimal plane function of the feature plane in accordance with the given tolerance.
  • Step 3: After obtaining the feature plane, the last step calculates the target ship’s yaw angle, roll angle, yaw rate, and roll rate using the normal vector of the feature plane. Specifically, as shown in Figure 5, we assume that the feature plane of the ship is approximately parallel to the $X_tO_tZ_t$ plane in the BODY of the target ship; hence, the included angle $\psi_p$ between the normal vector of the feature plane and the $Y_nO_nZ_n$ plane of the NED frame equals $\psi_t$ (shown in Figure 5a). The roll angle equals the included angle $\varphi_p$ between the normal vector of the feature plane and the $X_tO_tY_t$ plane of the target ship BODY minus the board-side inclination angle $\theta_s$ (shown in Figure 5b). Based on the above observation, the target ship’s $\psi_t$, $\varphi_t$, $\dot{\psi}_t$, and $\dot{\varphi}_t$ can be calculated using Equations (3) and (4), where the vector $(A_p, B_p, C_p)$ represents the normal vector of the feature plane, which is calculated in Section 2.3.2.
$$
\psi_t = \psi_p = \operatorname{acos}\frac{B_p}{\sqrt{A_p^2 + B_p^2}},\qquad
\varphi_t = \varphi_p - \theta_s = \operatorname{atan}\frac{C_p}{\sqrt{A_p^2 + B_p^2}} - \theta_s,
\tag{3}
$$
$$
\dot{\psi}_t = \frac{\Delta\psi_t}{\Delta t},\qquad
\dot{\varphi}_t = \frac{\Delta\varphi_t}{\Delta t}.
\tag{4}
$$
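A minimal sketch of Step 3 follows, assuming the feature-plane normal $(A_p, B_p, C_p)$ has already been obtained in the NED frame and that the board-side inclination angle $\theta_s$ is known for the target ship; the helper names are hypothetical.

```python
import numpy as np

def attitude_from_normal(Ap: float, Bp: float, Cp: float, theta_s: float):
    """Yaw and roll of the target ship from the feature-plane normal,
    following Equation (3); theta_s is the board-side inclination angle (rad)."""
    horiz = np.hypot(Ap, Bp)                 # length of the horizontal projection
    yaw = np.arccos(Bp / horiz)              # psi_t = acos(B_p / sqrt(A_p^2 + B_p^2))
    roll = np.arctan(Cp / horiz) - theta_s   # phi_t = atan(C_p / sqrt(...)) - theta_s
    return yaw, roll

def attitude_rates(angles: np.ndarray, dt: float) -> np.ndarray:
    """Finite-difference angular rates, Equation (4), for a sampled angle series."""
    return np.diff(angles) / dt              # one rate value per frame interval
```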

2.3. The Improved RANSAC Algorithm

As mentioned in the Introduction, the standard RANSAC randomly chooses three points in the raw data to construct a candidate plane and judges whether enough points lie close to that plane. In a ship point cloud, this procedure might extract other planes and lead to the failure of the estimation. To address such a problem, we construct a normal vector-based filter as a preprocessing algorithm.

2.3.1. Normal Estimation and Normal Vector-Based Filter

To ensure the extraction of the board-side plane, this study introduces a normal filter that can reinforce the board-side plane feature by removing irrelevant points based on the normal vectors. The normal vector of a discrete point is defined as the normal vector of the plane fitted by the specific point and its neighbor points [22] (an illustrative example is given in Figure 6). We note that, using this definition, the normal vector of a discrete point can describe the plane feature near the point.
Mathematically, the computation of the normal vector $(\tilde{A}_i, \tilde{B}_i, \tilde{C}_i)$ is realized by a standard least-squares plane fit, and the plane function is
$$
A_i x + B_i y + D_i = z.
$$
For each point $(x_i, y_i, z_i)$ in the point cloud, the normal vector is calculated through its neighbor points $(x_j, y_j, z_j)$, defined by their Euclidean distance as
$$
(x_j, y_j, z_j) \in \left\{ (x_k, y_k, z_k) \right\}_{k=1}^{n} \;\;\text{with}\;\;
\sqrt{(x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2} \le \epsilon,
$$
where the distance tolerance value $\epsilon$ is obtained by trial and error. Then, the local plane can be fitted by the least-squares method as
$$
\begin{bmatrix}
x_i^2 + \sum_{j=1}^{n} x_j^2 & x_i y_i + \sum_{j=1}^{n} x_j y_j & x_i + \sum_{j=1}^{n} x_j \\
x_i y_i + \sum_{j=1}^{n} x_j y_j & y_i^2 + \sum_{j=1}^{n} y_j^2 & y_i + \sum_{j=1}^{n} y_j \\
x_i + \sum_{j=1}^{n} x_j & y_i + \sum_{j=1}^{n} y_j & n + 1
\end{bmatrix}
\begin{bmatrix} A_i \\ B_i \\ D_i \end{bmatrix}
=
\begin{bmatrix}
x_i z_i + \sum_{j=1}^{n} x_j z_j \\
y_i z_i + \sum_{j=1}^{n} y_j z_j \\
z_i + \sum_{j=1}^{n} z_j
\end{bmatrix},
$$
where the parameters $A_i$, $B_i$, and $D_i$ can be used to compute the normal vector $(\tilde{A}_i, \tilde{B}_i, \tilde{C}_i)$ using
$$
\tilde{A}_i = A_i / D_i,\qquad \tilde{B}_i = B_i / D_i,\qquad \tilde{C}_i = -1 / D_i.
$$
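The per-point normal estimation above can be sketched as follows; the brute-force neighbour search and the normalisation to unit length (rather than by $D_i$) are simplifications we introduce for clarity, since only the direction of the normal matters for the angular comparison used by the filter.

```python
import numpy as np

def point_normals(points: np.ndarray, eps: float) -> np.ndarray:
    """Estimate a unit normal for every point by least-squares fitting the plane
    A*x + B*y + D = z through the point and its neighbours within distance eps."""
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        # Neighbours within the Euclidean distance tolerance (a k-d tree would
        # normally replace this brute-force search).
        nbrs = points[np.linalg.norm(points - p, axis=1) <= eps]
        x, y, z = nbrs[:, 0], nbrs[:, 1], nbrs[:, 2]
        # Least-squares solution for A_i, B_i, D_i of the local plane.
        M = np.column_stack([x, y, np.ones_like(x)])
        (A, B, D), *_ = np.linalg.lstsq(M, z, rcond=None)
        n = np.array([A, B, -1.0])           # normal of A*x + B*y - z + D = 0
        normals[i] = n / np.linalg.norm(n)
    return normals
```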
After the normal vectors of the points are calculated, the normal filter is adopted to rule out most of the points that are irrelevant to the feature plane. Algorithm 1 provides the corresponding workflow, which first randomly picks a point in the point cloud and places it in a point set. To be concrete, if a specific point in the point cloud has a normal vector similar to other points in the existing point set, the point will be put into that point set. Otherwise, if no point set shares a similar normal vector for that specific point, a new point set will be created to include it. The difference between the normal vectors of two points is defined as
$$
\Delta d(q_i, q_j) = \operatorname{acos}\frac{\tilde{A}_i \tilde{A}_j + \tilde{B}_i \tilde{B}_j + \tilde{C}_i \tilde{C}_j}
{\sqrt{\tilde{A}_i^2 + \tilde{B}_i^2 + \tilde{C}_i^2}\,\sqrt{\tilde{A}_j^2 + \tilde{B}_j^2 + \tilde{C}_j^2}},
$$
where $q_i$ and $q_j$ denote the normal vectors $(\tilde{A}_i, \tilde{B}_i, \tilde{C}_i)$ and $(\tilde{A}_j, \tilde{B}_j, \tilde{C}_j)$. The above process is looped until all points in the point cloud are divided into different point sets. Then, we apply the point number of each point set as the filtering index, and the largest point set is considered the major component and preserved. After the angular filtering, the same algorithm is applied using the Euclidean distance. We note that this filtering method can preserve the major plane and remove other irrelevant minor planes in the point cloud. Finally, the filtered point cloud is constructed from the preserved point set. In Figure 7, we show the input point cloud in (a) and the filtered point cloud in (b), which illustrates that our normal vector-based filter can rule out irrelevant points.
Algorithm 1: Normal vector-based filter algorithm
Input: point cloud with normal vector set Q, threshold ξ
Output: subset C
1:  for each q_i in Q:
2:      C_i ← {q_i}, Q ← Q − {q_i}
3:      num ← 0
4:      while num ≠ |C_i|:
5:          num ← |C_i|
6:          for each c_j in C_i:
7:              for each q_j in Q:
8:                  if Δd(c_j, q_j) ≤ ξ:
9:                      C_i ← C_i + {q_j}, Q ← Q − {q_j}
10:                 end if
11:             end for
12:         end for
13:     end while
14: end for
15: index ← argmax_i |C_i|
16: C ← C_index
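A runnable sketch of Algorithm 1 is given below. It grows point sets by the angular difference Δd between unit normals and keeps only the largest set; the quadratic-time region growing and the function names are our simplifications, and the follow-up grouping by Euclidean distance mentioned in the text is omitted.

```python
import numpy as np

def delta_d(n1: np.ndarray, n2: np.ndarray) -> float:
    """Angular difference between two normal vectors (the Delta d measure above)."""
    c = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def normal_filter(points: np.ndarray, normals: np.ndarray, xi: float) -> np.ndarray:
    """Group points whose normals differ by at most xi radians from some member of
    the group (Algorithm 1) and return the points of the largest group."""
    remaining = list(range(len(points)))
    clusters = []
    while remaining:
        cluster = [remaining.pop(0)]        # seed a new point set C_i
        grew = True
        while grew:                         # loop until C_i stops growing
            grew = False
            for idx in remaining[:]:
                if any(delta_d(normals[idx], normals[c]) <= xi for c in cluster):
                    cluster.append(idx)
                    remaining.remove(idx)
                    grew = True
        clusters.append(cluster)
    largest = max(clusters, key=len)        # index <- argmax_i |C_i|
    return points[largest]
```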

2.3.2. The Improved RANSAC with Preprocessing

After ruling out irrelevant points with the normal vector-based filtering algorithm, the improved RANSAC then randomly picks three points $(\hat{x}_1, \hat{y}_1, \hat{z}_1)$, $(\hat{x}_2, \hat{y}_2, \hat{z}_2)$, and $(\hat{x}_3, \hat{y}_3, \hat{z}_3)$ in the filtered point cloud and calculates the candidate feature-plane function using
$$
\begin{vmatrix}
x - \hat{x}_1 & y - \hat{y}_1 & z - \hat{z}_1 \\
\hat{x}_2 - \hat{x}_1 & \hat{y}_2 - \hat{y}_1 & \hat{z}_2 - \hat{z}_1 \\
\hat{x}_3 - \hat{x}_1 & \hat{y}_3 - \hat{y}_1 & \hat{z}_3 - \hat{z}_1
\end{vmatrix} = 0.
\tag{11}
$$
The plane function of the candidate plane can then be obtained according to
$$
Ax + By + D = z,
$$
which is deduced from Formula (11). Then, we calculate the score $m$ of the plane as the number of points whose distance to the plane is within the tolerance $\varepsilon$ (we refer to the points within the tolerance as feature points). Mathematically, it is defined as
$$
m := \operatorname{card}\left\{ (\check{x}_j, \check{y}_j, \check{z}_j) \in \left\{ (x_j, y_j, z_j) \right\}_{j=1}^{k} \;\middle|\;
\frac{\left| A\check{x}_j + B\check{y}_j - \check{z}_j + D \right|}{\sqrt{A^2 + B^2 + 1}} \le \varepsilon \right\}.
$$
The above process is repeated for a predefined number of cycles, and the plane function with the highest score is selected as the feature-plane function. Then, the normal vector of the feature plane $(A_p, B_p, C_p)$ is calculated using
$$
A_p = A_m / D_m,\qquad B_p = B_m / D_m,\qquad C_p = -1 / D_m,
$$
where $A_m$, $B_m$, and $D_m$ are the parameters of the selected feature-plane function. Figure 8 provides an illustrative example of the fitting process.
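The candidate construction and scoring described above can be sketched as follows, assuming the input is the filtered point cloud; the iteration count, the random-number handling, and the degenerate-sample guard are illustrative choices rather than the exact settings used in this study.

```python
import numpy as np

def ransac_plane(points: np.ndarray, eps: float, n_iter: int = 500, rng=None):
    """Fit the feature plane z = A*x + B*y + D by repeatedly sampling three points
    and scoring every candidate by the number of points within distance eps."""
    rng = np.random.default_rng() if rng is None else rng
    best_score, best = -1, None
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)      # normal of the plane through the samples
        if abs(n[2]) < 1e-9:
            continue                         # plane cannot be written as z = A*x + B*y + D
        A, B = -n[0] / n[2], -n[1] / n[2]
        D = p1[2] - A * p1[0] - B * p1[1]
        # Point-to-plane distances and the score m (number of feature points).
        dist = np.abs(A * points[:, 0] + B * points[:, 1] - points[:, 2] + D)
        dist /= np.sqrt(A ** 2 + B ** 2 + 1.0)
        score = int(np.sum(dist <= eps))
        if score > best_score:
            best_score, best = score, (A, B, D)
    if best is None:
        raise ValueError("no valid candidate plane was found")
    A_m, B_m, D_m = best
    normal = np.array([A_m / D_m, B_m / D_m, -1.0 / D_m])   # (A_p, B_p, C_p)
    return normal, best_score
```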

3. Experiments and Results

In this study, we applied both real water-pool experiments and numerical tests to validate our method. The methods compared in the experiments are “Improved RANSAC with Voxel Filter” (hereafter “Improved RANSAC with VF”), which is the proposed framework; “Standard RANSAC with Voxel Filter” (hereafter “Standard RANSAC with VF”); and “Improved RANSAC without Voxel Filter” (hereafter “Improved RANSAC without VF”). The authors apply the Root Mean Square Error (RMSE) and the Mean Absolute Error (MAE) as the error indicators, following [25]. The calculation formulas are as follows:
$$
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2},
$$
$$
\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|x_i - \hat{x}_i\right|,
$$
where $x_i$ is the estimated value, $\hat{x}_i$ is the reference value, and $n$ is the number of estimation results. The above indexes are referred to as the evaluation indexes of the estimation results.
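For completeness, these two indexes can be computed directly with NumPy, as in the short sketch below.

```python
import numpy as np

def rmse(est: np.ndarray, ref: np.ndarray) -> float:
    """Root mean square error between the estimated and reference attitude series."""
    return float(np.sqrt(np.mean((est - ref) ** 2)))

def mae(est: np.ndarray, ref: np.ndarray) -> float:
    """Mean absolute error between the estimated and reference attitude series."""
    return float(np.mean(np.abs(est - ref)))
```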

3.1. Water-Pool Experiments

To verify the filtering ability of our normal vector-based filter under real environmental noise and verify our attitude-estimation framework on a real ship target, we carried out the experiment in an outdoor water pool.

3.1.1. Experimental Environment and Configuration

In the water-pool experiments, we adopted the “LS-M1” solid-state LiDAR, whose detailed technical information is given in Table 1. We equipped the experimental USV, named “Hong Dong No. 1”, with an IMU system to measure the attitude state; its dimensional information is shown in Table 2. The experimental scene is shown in Figure 9. In the experiment, we gave the USV an initial speed and created some waves in the pool to keep the USV moving freely for ten seconds. The motion was recorded by the IMU mounted on the USV and simultaneously estimated from the LiDAR point clouds. As shown in Table 3, the parameters of each method were set consistently. The leaf size of the voxel filter is 3 cm.

3.1.2. Experimental Results

The snapshots of the experiment are shown in Figure 10 to provide visual aids. The actual point clouds collected in the experiments are shown in Figure 11a. The red rectangle highlights the noise points caused by unknown reflections, and the green points belong to the feature plane that we want to extract. The number of points returned around the USV is about 15,000, while the board-side plane reflects only about 1000 points, which shows that the collected point clouds in the experiment are severely affected by unknown reflections around the target USV. It can be seen from Figure 11b that, after applying the proposed normal vector-based filter, the number of irrelevant points is significantly reduced.
After applying the proposed filtering algorithm, we then computed the feature plane. The feature-plane-extraction results are shown in Figure 12. Figure 12a shows the extraction result of the standard RANSAC whose input point cloud is obtained directly from LiDAR without normal vector-based filtering, while Figure 12b shows the extraction result of the improved RANSAC with pre-processing. It can be seen from Figure 11a and Figure 12a that the feature points that standard RANSAC extracts are mainly noise points, while Figure 11a and Figure 12b show that our improved RANSAC can extract feature points belonging to the board-side plane.
After computing the feature planes, we applied the estimation methods to estimate the attitude. The estimation results and the corresponding IMU data are shown in Figure 13 and Figure 14. The roll angle estimated by standard RANSAC is far from the IMU data and suffers from sudden changes; its yaw angle follows the trend of the IMU data, but a gap remains between the two. Due to the errors in the angle-estimation stage, the rate estimation also contains large errors. The roll angle estimated by the improved RANSAC is close to the IMU data, and its yaw angle estimation follows the trend of the IMU data despite small fluctuations. In addition, the yaw-angle discrepancy between the improved RANSAC with and without VF is evident in Figure 13. The reason might be that the leaf size of the voxel filter is not negligible compared with the small main size of the USV, so the point clouds might lose some minor details due to the large ratio of the leaf size to the main size of the USV. We remark that this discrepancy is not a major concern and will become unobtrusive as the ratio becomes smaller. Furthermore, it can be observed in Figure 13 that the standard RANSAC exhibits a drift in both roll angle and yaw angle after the 7-s mark. This is caused by the changes in the environmental noise points after 7 s, which negatively affect the standard RANSAC. Since our method puts much effort into reducing noise points, this drift is filtered out. The evaluation indexes of each method are calculated in Table 4.

3.2. Numerical Tests

To verify the general applicability of our method to different ship types, we applied two typical ship models in the numerical tests.

3.2.1. Numerical Simulation Preparations and Configurations

The numerical motion simulations are based on the LiDAR computation model, the target ship models, and the input attitude data. As shown in Table 5, the LiDAR computation model is based on the VLP-32 LiDAR, which is widely used in various studies [26,27] and whose detection range can satisfy the needs of towing operations. The LiDAR computation model reproduces the vertical laser beams, spinning speed, and laser fire frequency of the real VLP-32 LiDAR. Because the sea surface only serves as an interface dividing the underwater and above-water parts of the ship model, an accurate sea-surface model provides only a small improvement over a single-plane sea-surface model for the proposed algorithm [28]. To simplify the problem, a single plane parallel to the sea level is adopted, and its position is set to the designed draught water line of the ship model. In this case, the simulation system only produces reflection points above the designed draught water line, which is in line with the actual situation.
The first model is a typical container ship with large board-side planes, whose dimensional information is given in Table 6. As shown in Figure 15, the test scene consists of the target container ship and an observing boat. We equip the observing boat with LiDAR to collect point clouds and use IMU to calibrate the LiDAR. The observing boat is placed 70 m from the starboard of the target container ship and can move freely. The input attitude of the container ship model is set in accordance with the output of a 4-DOF motion model of a container ship [29,30]. The movement of the container ship in the test is set as follows: the ship first turns to trace the target course of 22°; after the course angle reaches 22°, the ship turns to −17° and maintains that course. As shown in Table 7, the parameters of each method in this test are set consistently. The leaf size of the voxel filter is 1 m.
The second model is a typical yacht with a smaller board-side plane than the container ship, whose dimensional information is given in Table 8. As shown in Figure 16, the test scene consists of a target yacht and an observing boat. The observing boat was placed 30 m from the starboard of the target yacht. The input attitude data of the yacht were taken from data collected by a GNSS/IMU system on a real yacht whose basic dimensions were roughly the same as those of the adopted model. As shown in Table 9, the parameters of each method were the same. The leaf size of the voxel filter is 10 cm.

3.2.2. Test Results of the Container Ship Model Motion Simulation

First, we calibrate the point cloud using the IMU data of the observing boat and extract the feature plane. As shown in Figure 17a, the standard RANSAC faces interference from irrelevant points on the upper structure of the container ship, while Figure 17b shows that the improved RANSAC can extract the board-side plane of the container ship.
After extracting the feature planes, we then calculate the attitude states. The estimation results and the corresponding 4-DOF model output (the true value in this test) are shown in Figure 18 and Figure 19. As shown in Figure 18, the roll angle estimated by the improved RANSAC can follow the trend of the true value, while the roll angle estimated by the standard RANSAC method is unsteady and far from the true value. All three methods estimate the yaw angle close to the true value; from the enlarged figure, we can see that the lines of the improved RANSAC are closer to the true-value line. As shown in Figure 19, the rate lines of the standard RANSAC can follow the trend of the true value but fluctuate and contain many turning points, while the improved RANSAC estimation lines are steady and close to the true-value line. We calculate the evaluation indexes of each method in Table 10.

3.2.3. Test Results of the Yacht Model Motion Simulation

First, we calibrate the point cloud using the IMU data of the observing boat and extract the feature plane. As shown in Figure 20a, the standard RANSAC faces interference from irrelevant points on other parts of the yacht, while Figure 20b shows that the improved RANSAC can extract the board-side plane of the yacht.
Then, we calculate the attitude state based on the extraction results. The estimation results of each method and the GNSS/IMU system data (the true value in this test) are shown in Figure 21 and Figure 22. In Figure 21, the roll angle line of the standard RANSAC fluctuates strongly and is far from the true value, while its yaw angle line can follow the trend of the true value despite some fluctuations. Both the yaw angle and roll angle lines of the improved RANSAC are steady and close to the true value. In Figure 22, the roll rate and yaw rate lines of the standard RANSAC can follow the trend of the true value but suffer from large fluctuations, while the rate lines of the improved RANSAC are smooth and close to the true value. As shown in Table 11, we calculate the evaluation indexes of each method.

4. Discussion

4.1. Feature-Plane-Extraction Ability of the Methods

From the comparisons of Figure 12, Figure 17 and Figure 20, we can observe that the feature-plane-extraction results of the improved RANSAC are more accurate than those of the standard RANSAC. We note that the noise points in the three tests mainly consist of unknown reflection points and irrelevant points from other parts of the target ship. In Figure 11, we can see that the normal vector-based filter of the improved RANSAC can rule out both kinds of noise points effectively. The filtering process reduces the possibility of false extraction of the feature plane and thus improves the extraction ability of RANSAC. Accurate feature-plane extraction can therefore be achieved by improving RANSAC with our normal vector-based filtering algorithm.

4.2. Attitude-Estimation Accuracy of the Methods

Our estimation framework can provide an accurate target attitude state when the feature plane is properly extracted. We can infer from Table 4, Table 10 and Table 11 that both improved RANSAC with VF and improved RANSAC without VF can achieve better estimation results than standard RANSAC regarding both RMSE and MAE. In Table 12, we calculate the average evaluation indexes of each method to further discuss the accuracy.
Both methods using improved RANSAC show great improvements compared with standard RANSAC in terms of roll angle, yaw angle, and roll rate, while the yaw rate estimation results are slightly improved. One possible reason for this is that the irrelevant points from other parts of the target ship might be static and have little influence on yaw rate estimation.
To show the improvement in accuracy in more detail, we calculate the angle and angular rate average evaluation indexes of improved RANSAC with VF and standard RANSAC with VF in Table 13. The angle reduction percentages of RMSE and MAE are 91.62% and 92.93%, respectively. The angular rate reduction percentages of RMSE and MAE are 71.81% and 75.36%, respectively. The above results show that our method can estimate the attitude state more accurately than standard RANSAC.

4.3. Density Sensitivity of the Improved RANSAC

To investigate the density sensitivity of our method, we applied both the improved RANSAC with VF and the improved RANSAC without VF to estimate the attitude. The voxel filter applied in this framework reduces the density of the point cloud and helps reduce time consumption. Given that the proposed method mainly involves the extraction of major geometric features of a ship model and that the voxel filter usually affects only minor details, the drawback of the voxel filter should not be a concern. The experimental results in Table 4, Table 10 and Table 11 indicate that the two methods achieve similar estimation results, which supports the application of a voxel filter. We note that the adopted voxel filter and leaf size can preserve the features in the point cloud and, as long as the feature is clearly scanned, the density of the point cloud has little influence on the estimation process.

4.4. General Applicability of the Improved RANSAC

Our method requires the extraction of the feature plane and the estimation of the attitude. Different ship types may have different feature planes. In this paper, we used a real USV, a container ship model, and a yacht model as targets to verify our method. These targets have varied shapes and sizes, and our method achieved satisfactory results on all of them (as indicated in Table 4, Table 10 and Table 11), which demonstrates that our method can be applied to different ship types.
The drawback is that the parameters of our method for different ship targets are different. We comment that this problem can be solved by constructing a parameter database for typical ship types and applying existing classification methods [31,32,33] to acquire the type of target ship.

4.5. Single LiDAR versus Multiple LiDARs

In this study, we applied a single LiDAR to observe the target ship and developed an attitude-estimation framework. The working mechanism of LiDAR makes the occlusion problem of a single LiDAR inevitable. When the relative scanning angle of the LiDAR decreases, occlusion of the ship’s bow or stern will invalidate the method, which is an important issue to resolve in the future. A promising direction is combining multiple LiDARs with the attitude-estimation framework proposed in this paper to cover the blind area of a single LiDAR.

5. Conclusions

In this study, we proposed an attitude estimation framework that extracts the board-side plane through an improved RANSAC method and computes the roll angle, yaw angle, roll rate, and yaw rate of the target ship. We added a normal vector-based filter to the standard RANSAC method to rule out the noise points in a point cloud and improve the accuracy of the feature-extraction process. Specifically, we first prepared the point cloud through down-sampling and calibration transformation. Next, we estimated the normal vectors of the points and used our filter algorithm to rule out noise points. Then, the RANSAC process provided the optimal plane function, which was adopted to calculate attitude information according to geometric relation. Next, we conducted a real experiment via a USV and two numerical tests using two typical ship models (container ship and yacht) to show the accuracy and general applicability of the proposed method.
The experimental results reveal that the average mean absolute errors of the estimated angle and angular rate are 0.4879 deg and 4.2197 deg/s, respectively, which are 92.93% and 75.36% more accurate than the estimation based on standard RANSAC. Remarkably, we demonstrated the filtering ability and the practical applicability of our method in real water-pool experiments under real environmental noise. We also investigated the density sensitivity of our method and found that the density of the input point clouds has little influence as long as the feature is clearly scanned.
However, there are some defects in this study owing to the limitations of our experimental resources. The classification of target ships and auto-adaptation of the parameters could be achieved using more abundant data sets, such as real target ships of more types. In addition, the occlusion problem of a single LiDAR can be solved by merging multiple LiDARs, which will be the focus of our future study.

Author Contributions

Conceptualization, S.W.; methodology, S.W.; software, S.W. and X.Y.; validation, S.W. and Y.X.; formal analysis, S.W. and Y.X.; investigation, S.W.; resources, S.W., X.Y. and H.W.; data curation, S.W.; writing—original draft preparation, S.W.; writing—review and editing, Y.X. and S.W.; visualization, S.W. and Y.X.; supervision, H.W.; project administration, H.W.; funding acquisition, H.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work is sponsored by the National Key R&D Program of China, grant number 2022YFE0125200; the National Natural Science Foundation of China, grant number 52271348.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We thank each reviewer for their valuable comments and suggestions, which improved the quality of this paper. We also appreciate the work of Lei Dong, who provided the valuable data used in the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Waskito, K.T.; Sasa, K.; Chen, C.; Kitagawa, Y.; Lee, S.-W. Comparative Study of Realistic Ship Motion Simulation for Optimal Ship Routing of a Bulk Carrier in Rough Seas. Ocean. Eng. 2022, 260, 111731. [Google Scholar] [CrossRef]
  2. Zhuang, J.; Zhang, L.; Zhao, S.; Cao, J.; Wang, B.; Sun, H. Radar-Based Collision Avoidance for Unmanned Surface Vehicles. China Ocean Eng. 2016, 30, 867–883. [Google Scholar] [CrossRef]
  3. Zhang, L.; Du, Z.; Valdez Banda, O.A.; Goerlandt, F.; Du, L.; Li, X. Collision Prevention of Ship Towing Operation under Environmental Disturbance. Ocean. Eng. 2022, 266, 112870. [Google Scholar] [CrossRef]
  4. Huang, L.-M.; Duan, W.-Y.; Han, Y.; Chen, Y.-S. A Review of Short-Term Prediction Techniques for Ship Motions in Seaway. Chuan Bo Li Xue/J. Ship Mech. 2014, 18, 1534–1542. [Google Scholar] [CrossRef]
  5. He, Y.; Liu, X.; Zhang, K.; Mou, J.; Liang, Y.; Zhao, X.; Wang, B.; Huang, L. Dynamic Adaptive Intelligent Navigation Decision Making Method for Multi-Object Situation in Open Water. Ocean. Eng. 2022, 253, 111238. [Google Scholar] [CrossRef]
  6. Du, Z.; Reppa, V.; Negenborn, R.R. Cooperative Control of Autonomous Tugs for Ship Towing. IFAC-PapersOnLine 2020, 53, 14470–14475. [Google Scholar] [CrossRef]
  7. Naus, K.; Wąż, M.; Szymak, P.; Gucma, L.; Gucma, M. Assessment of Ship Position Estimation Accuracy Based on Radar Navigation Mark Echoes Identified in an Electronic Navigational Chart. Measurement 2021, 169, 108630. [Google Scholar] [CrossRef]
  8. Wu, Y.; Chu, X.; Deng, L.; Lei, J.; He, W.; Królczyk, G.; Li, Z. A New Multi-Sensor Fusion Approach for Integrated Ship Motion Perception in Inland Waterways. Measurement 2022, 200, 111630. [Google Scholar] [CrossRef]
  9. Yan, Z.; Cheng, L.; He, R.; Yang, H. Extracting Ship Stopping Information from AIS Data. Ocean. Eng. 2022, 250, 111004. [Google Scholar] [CrossRef]
  10. Liu, J.; Yan, X.; Liu, C.; Fan, A.; Ma, F. Developments and Applications of Green and Intelligent Inland Vessels in China. J. Mar. Sci. Eng. 2023, 11, 318. [Google Scholar] [CrossRef]
  11. Lu, X.; Li, Y. Motion Pose Estimation of Inshore Ships Based on Point Cloud. Measurement 2022, 205, 112189. [Google Scholar] [CrossRef]
  12. Zhang, X.; Wang, C.; Jiang, L.; An, L.; Yang, R. Collision-Avoidance Navigation Systems for Maritime Autonomous Surface Ships: A State of the Art Survey. Ocean. Eng. 2021, 235, 109380. [Google Scholar] [CrossRef]
  13. Thompson, D.; Coyle, E.; Brown, J. Efficient LiDAR-Based Object Segmentation and Mapping for Maritime Environments. IEEE J. Ocean. Eng. 2019, 44, 352–362. [Google Scholar] [CrossRef]
  14. Wang, J.; Ma, F. Obstacle Recognition Method for Ship Based on 3D Lidar. In Proceedings of the 2021 6th International Conference on Transportation Information and Safety (ICTIS), Wuhan, China, 22–24 October 2021; pp. 588–593. [Google Scholar]
  15. Chen, C.; Li, Y. Ship Berthing Information Extraction System Using Three-Dimensional Light Detection and Ranging Data. J. Mar. Sci. Eng. 2021, 9, 747. [Google Scholar] [CrossRef]
  16. Wang, H.; Yin, Y.; Jing, Q. Comparative Analysis of 3D LiDAR Scan-Matching Methods for State Estimation of Autonomous Surface Vessel. JMSE 2023, 11, 840. [Google Scholar] [CrossRef]
  17. Nocerino, A.; Opromolla, R.; Fasano, G.; Grassi, M.; Fontdegloria Balaguer, P.; John, S.; Cho, H.; Bevilacqua, R. Experimental Validation of Inertia Parameters and Attitude Estimation of Uncooperative Space Targets Using Solid State LIDAR. Acta Astronaut. 2023, 210, 428–436. [Google Scholar] [CrossRef]
  18. Ma, R.; Chen, C.; Yang, B.; Li, D.; Wang, H.; Cong, Y.; Hu, Z. CG-SSD: Corner Guided Single Stage 3D Object Detection from LiDAR Point Cloud. ISPRS J. Photogramm. Remote Sens. 2022, 191, 33–48. [Google Scholar] [CrossRef]
  19. Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. In Readings in Computer Vision; Fischler, M.A., Firschein, O., Eds.; Morgan Kaufmann: San Francisco, CA, USA, 1987; pp. 726–740. ISBN 978-0-08-051581-6. [Google Scholar]
  20. Grilli, E.; Menna, F.; Remondino, F. A Review of Point Clouds Segmentation and Classification Algorithms. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 339–344. [Google Scholar] [CrossRef]
  21. Wang, B.; Lan, J.; Gao, J. LiDAR Filtering in 3D Object Detection Based on Improved RANSAC. Remote Sens. 2022, 14, 2110. [Google Scholar] [CrossRef]
  22. Yang, L.; Li, Y.; Li, X.; Meng, Z.; Luo, H. Efficient Plane Extraction Using Normal Estimation and RANSAC from 3D Point Cloud. Comput. Stand. Interfaces 2022, 82, 103608. [Google Scholar] [CrossRef]
  23. Dong, G.; Yan, M.; Zheng, Z.; Ma, X.; Sun, Z.; Gao, J. Experimental Investigation on the Hydrodynamic Response of a Moored Ship to Low-Frequency Harbor Oscillations. Ocean. Eng. 2022, 262, 112261. [Google Scholar] [CrossRef]
  24. Wang, S.; Hu, Q.; Xiao, D.; He, L.; Liu, R.; Xiang, B.; Kong, Q. A New Point Cloud Simplification Method with Feature and Integrity Preservation by Partition Strategy. Measurement 2022, 197, 111173. [Google Scholar] [CrossRef]
  25. Sun, Q.; Tang, Z.; Gao, J.; Zhang, G. Short-Term Ship Motion Attitude Prediction Based on LSTM and GPR. Appl. Ocean. Res. 2022, 118, 102927. [Google Scholar] [CrossRef]
  26. Chekakta, Z.; Zenati, A.; Aouf, N.; Dubois-Matra, O. Robust Deep Learning LiDAR-Based Pose Estimation for Autonomous Space Landers. Acta Astronaut. 2022, 201, 59–74. [Google Scholar] [CrossRef]
  27. Soilán, M.; González-Aguilera, D.; del-Campo-Sánchez, A.; Hernández-López, D.; Del Pozo, S. Road Marking Degradation Analysis Using 3D Point Cloud Data Acquired with a Low-Cost Mobile Mapping System. Autom. Constr. 2022, 141, 104446. [Google Scholar] [CrossRef]
  28. Wijaya, A.P.; Naaijen, P.; Andonowati; van Groesen, E. Reconstruction and Future Prediction of the Sea Surface from Radar Observations. Ocean. Eng. 2015, 106, 261–270. [Google Scholar] [CrossRef]
  29. Guo, H.; Zou, Z. System-Based Investigation on 4-DOF Ship Maneuvering with Hydrodynamic Derivatives Determined by RANS Simulation of Captive Model Tests. Appl. Ocean. Res. 2017, 68, 11–25. [Google Scholar] [CrossRef]
  30. Zhang, G.; Zhang, X.; Pang, H. Multi-Innovation Auto-Constructed Least Squares Identification for 4 DOF Ship Manoeuvring Modelling with Full-Scale Trial Data. ISA Trans. 2015, 58, 186–195. [Google Scholar] [CrossRef]
  31. Aljumaily, H.; Laefer, D.F.; Cuadra, D.; Velasco, M. Point Cloud Voxel Classification of Aerial Urban LiDAR Using Voxel Attributes and Random Forest Approach. Int. J. Appl. Earth Obs. Geoinf. 2023, 118, 103208. [Google Scholar] [CrossRef]
  32. Fang, L.; You, Z.; Shen, G.; Chen, Y.; Li, J. A Joint Deep Learning Network of Point Clouds and Multiple Views for Roadside Object Classification from Lidar Point Clouds. ISPRS J. Photogramm. Remote Sens. 2022, 193, 115–136. [Google Scholar] [CrossRef]
  33. Xu, Z.; Liu, K.; Chen, K.; Ding, C.; Wang, Y.; Jia, K. Classification of Single-View Object Point Clouds. Pattern Recognit. 2023, 135, 109137. [Google Scholar] [CrossRef]
Figure 1. Typical sufficient point cloud of a container ship: (a) represents the container ship; (b) represents the point cloud of the container ship.
Figure 2. Point cloud voxel filtering: (a) represents the original point cloud; (b) represents after voxel filtering.
Figure 3. The NED and BODY coordinate systems. Blue coordinates are the BODY frame and the black coordinate is the NED frame.
Figure 4. Overall flow chart of our framework.
Figure 5. The geometric relations between the feature plane and the target ship’s attitude: (a) represents the included angle ψ p in the NED frame; (b) represents the included angle φ p and board-side inclination angle θ s in the BODY frame.
Figure 6. The normal vector of a discrete point. The neighbor points and the normal vector are in red. The fitted plane is shown in light blue.
Figure 7. Point cloud normal vector-based filtering: (a) represents the input point cloud; (b) represents after filtering.
Figure 8. The plane-fitting of RANSAC. The red points are the chosen points, the blue points are near the constructed plane, and the black points are the noise points.
Figure 9. The water-pool experimental scene. The USV in the upper part is “Hong Dong No. 1” and the LiDAR in the lower part is the “LS-M1” LiDAR.
Figure 10. The snapshots of the “Hong Dong No. 1” USV in the water-pool experiments. (a–f) is the sequence of each snapshot.
Figure 11. The comparison of the actual raw point cloud and after filtering in the water-pool experiments: (a) represents the originally collected point cloud; (b) highlights the point cloud after our normal vector-based filter in blue.
Figure 12. The comparison of the feature-plane-extraction results of standard RANSAC and improved RANSAC in the water-pool experiments: (a) shows the feature-plane-extraction result of standard RANSAC; (b) shows the feature-plane-extraction result of our improved RANSAC.
Figure 13. The roll angle and yaw angle results in the water-pool experiments.
Figure 14. The roll rate and yaw rate results in the water-pool experiments.
Figure 15. The test scene of the container ship motion simulation. The coordinate arrows show the installation position of LiDAR on the observing boat.
Figure 16. The test scene of the yacht motion simulation. The coordinate arrows show the installation position of LiDAR on the observing boat.
Figure 17. The comparison of the feature-plane-extraction results of standard RANSAC and improved RANSAC in the container ship motion simulations: (a) shows the feature-plane-extraction result of standard RANSAC; (b) shows the feature-plane-extraction result of our improved RANSAC.
Figure 18. The roll angle and yaw angle results in the container ship simulations.
Figure 19. The roll rate and yaw rate results in the container ship simulations.
Figure 20. The comparison of the feature-plane-extraction results of standard RANSAC and improved RANSAC in the yacht motion simulations: (a) shows the feature-plane-extraction result of standard RANSAC; (b) shows the feature-plane-extraction result of our improved RANSAC.
Figure 21. The roll angle and yaw angle results in the yacht simulations.
Figure 22. The roll rate and yaw rate results in the yacht simulations.
Table 1. Technical information of the “LS-M1” LiDAR.
Information | Value
maximum range | 350 m
laser fire frequency | 18,000 Hz
vertical field of view | 35 deg
vertical angular resolution | 0.03 deg
horizontal field of view | 120 deg
horizontal angular resolution | 0.06 deg
Table 2. Dimensional information of the “Hong Dong No. 1” USV.
Information | Value
main size | 1.5 m
molded breadth | 0.74 m
molded depth | 0.6 m
designed draught | 0.2 m
board-side inclination angle | 0.1 deg
Table 3. The parameter settings of each method in the water-pool experiment.
Method | Parameter | Meaning | Value
Improved RANSAC with VF | ε_a | Max distance of neighbor points | 0.035 m
 | ξ_a | Threshold of normal vector angle | 0.02 rad
 | ϵ_a | Distance tolerance of point to plane | 0.05 m
Standard RANSAC with VF | ϵ_b | Distance tolerance of point to plane | 0.05 m
Improved RANSAC without VF | ε_c | Max distance of neighbor points | 0.035 m
 | ξ_c | Threshold of normal vector angle | 0.02 rad
 | ϵ_c | Distance tolerance of point to plane | 0.05 m
Table 4. The evaluation indexes of each method in the water-pool experiments.
Method | Attitude State | RMSE | MAE
Improved RANSAC with VF | Roll Angle | 2.0371° | 1.6907°
 | Yaw Angle | 1.3204° | 1.1092°
 | Roll Rate | 19.1547°/s | 15.8893°/s
 | Yaw Rate | 12.6331°/s | 8.9665°/s
Improved RANSAC without VF | Roll Angle | 1.9879° | 1.6301°
 | Yaw Angle | 1.2956° | 1.1068°
 | Roll Rate | 18.2154°/s | 14.3370°/s
 | Yaw Rate | 12.0684°/s | 8.8624°/s
Standard RANSAC with VF | Roll Angle | 14.5339° | 13.8871°
 | Yaw Angle | 8.4089° | 8.2139°
 | Roll Rate | 140.2946°/s | 75.8283°/s
 | Yaw Rate | 10.8218°/s | 7.6778°/s
Table 5. Technical information of the VLP-32 LiDAR.
Information | Value
laser beam | 32
spinning speed | 300 RPM
maximum range | 200 m
laser fire frequency | 21,700 Hz
vertical field of view | 40 deg
horizontal field of view | 360 deg
horizontal angular resolution | 0.08 deg
Table 6. Dimensional information of the container model.
Information | Value
main size | 125 m
molded breadth | 25 m
molded depth | 13 m
designed draught | 8 m
board-side inclination angle | 0.05 deg
Table 7. The parameter settings of each method in the container ship motion simulation.
Method | Parameter | Meaning | Value
Improved RANSAC with VF | ε_a | Max distance of neighbor points | 10 m
 | ξ_a | Threshold of normal vector angle | 0.003 rad
 | ϵ_a | Distance tolerance of point to plane | 0.05 m
Standard RANSAC with VF | ϵ_b | Distance tolerance of point to plane | 0.05 m
Improved RANSAC without VF | ε_c | Max distance of neighbor points | 10 m
 | ξ_c | Threshold of normal vector angle | 0.003 rad
 | ϵ_c | Distance tolerance of point to plane | 0.05 m
Table 8. Dimensional information of the yacht model.
Information | Value
main size | 20 m
molded breadth | 6 m
molded depth | 3 m
designed draught | 1 m
board-side inclination angle | 0.1 deg
Table 9. The parameter settings of each method in the yacht motion simulation.
Method | Parameter | Meaning | Value
Improved RANSAC with VF | ε_a | Max distance of neighbor points | 3 m
 | ξ_a | Threshold of normal vector angle | 0.04 rad
 | ϵ_a | Distance tolerance of point to plane | 0.0075 m
Standard RANSAC with VF | ϵ_b | Distance tolerance of point to plane | 0.0075 m
Improved RANSAC without VF | ε_c | Max distance of neighbor points | 3 m
 | ξ_c | Threshold of normal vector angle | 0.04 rad
 | ϵ_c | Distance tolerance of point to plane | 0.0075 m
Table 10. The evaluation indexes of each method in the container ship motion simulations.
Method | Attitude State | RMSE | MAE
Improved RANSAC with VF | Roll Angle | 0.0373° | 0.0316°
 | Yaw Angle | 0.0350° | 0.0294°
 | Roll Rate | 0.1201°/s | 0.0859°/s
 | Yaw Rate | 0.1433°/s | 0.1163°/s
Improved RANSAC without VF | Roll Angle | 0.0377° | 0.0307°
 | Yaw Angle | 0.0349° | 0.0293°
 | Roll Rate | 0.1088°/s | 0.0766°/s
 | Yaw Rate | 0.1430°/s | 0.1162°/s
Standard RANSAC with VF | Roll Angle | 13.9727° | 13.5472°
 | Yaw Angle | 0.2845° | 0.2383°
 | Roll Rate | 13.0475°/s | 10.0682°/s
 | Yaw Rate | 0.8129°/s | 0.6323°/s
Table 11. The evaluation indexes of each method in the yacht motion simulations.
Method | Attitude State | RMSE | MAE
Improved RANSAC with VF | Roll Angle | 0.0466° | 0.0368°
 | Yaw Angle | 0.0365° | 0.0295°
 | Roll Rate | 0.1942°/s | 0.1513°/s
 | Yaw Rate | 0.1362°/s | 0.1090°/s
Improved RANSAC without VF | Roll Angle | 0.0440° | 0.0346°
 | Yaw Angle | 0.0363° | 0.0292°
 | Roll Rate | 0.1843°/s | 0.1449°/s
 | Yaw Rate | 0.1354°/s | 0.1083°/s
Standard RANSAC with VF | Roll Angle | 12.7396° | 12.2805°
 | Yaw Angle | 0.8079° | 0.6695°
 | Roll Rate | 11.5327°/s | 6.9814°/s
 | Yaw Rate | 2.1816°/s | 1.5662°/s
Table 12. The average evaluation indexes 1 of each method.
Attitude State | Method | Average RMSE | Average MAE
Roll Angle (°) | Improved RANSAC with VF | 0.7070 | 0.5864
 | Improved RANSAC without VF | 0.6899 | 0.5651
 | Standard RANSAC with VF | 13.7487 | 13.2383
Yaw Angle (°) | Improved RANSAC with VF | 0.4640 | 0.3894
 | Improved RANSAC without VF | 0.4556 | 0.3884
 | Standard RANSAC with VF | 2.9247 | 3.0406
Roll Rate (°/s) | Improved RANSAC with VF | 6.4897 | 5.3755
 | Improved RANSAC without VF | 6.1695 | 4.8528
 | Standard RANSAC with VF | 54.9583 | 30.9593
Yaw Rate (°/s) | Improved RANSAC with VF | 4.3042 | 3.0639
 | Improved RANSAC without VF | 4.1156 | 3.0290
 | Standard RANSAC with VF | 4.6054 | 3.2921
1 Average indexes of each method are computed by averaging its indexes of the three tests.
Table 13. The angle and angular rate average evaluation indexes 1 and reduction percentage.
Attitude State | Method | Average RMSE | Average MAE
Angle | Improved RANSAC with VF | 0.5855° | 0.4879°
 | Standard RANSAC with VF | 8.3367° | 8.1395°
 | Reduction Percentage 2 | 91.62% | 92.93%
Angular Rate | Improved RANSAC with VF | 5.3970°/s | 4.2197°/s
 | Standard RANSAC with VF | 29.7819°/s | 17.1257°/s
 | Reduction Percentage 2 | 71.81% | 75.36%
1 Angle average indexes are computed by averaging the data from roll angle and yaw angle. Angular rate average indexes are computed by averaging the data from roll rate and yaw rate. 2 Reduction percentage is calculated by subtracting the division of “improved RANSAC with VF” by “standard RANSAC with VF” from one.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
