Article

Omni-Directional Scanning Localization Method of a Mobile Robot Based on Ultrasonic Sensors

1. Department of Mechanical and Precision Instrument Engineering, Xi'an University of Technology, Xi'an 710048, China
2. Department of Art and Design, Xi'an University of Technology, Xi'an 710048, China
* Author to whom correspondence should be addressed.
Sensors 2016, 16(12), 2189; https://doi.org/10.3390/s16122189
Submission received: 28 October 2016 / Revised: 8 December 2016 / Accepted: 14 December 2016 / Published: 20 December 2016
(This article belongs to the Special Issue Ultrasonic Sensors)

Abstract

Improved ranging accuracy is obtained through the development of a novel ultrasonic sensor ranging algorithm that, unlike conventional ranging algorithms, considers the divergence angle and the incidence angle of the ultrasonic sensor simultaneously. An ultrasonic sensor scanning method is developed based on this algorithm for the recognition of an inclined plate and for the localization of the ultrasonic sensor relative to the inclined plate reference frame. The scanning method is then leveraged for the omni-directional localization of a mobile robot: the ultrasonic sensors are installed on a mobile robot and follow the spin of the robot, the inclined plate is recognized, and the position and posture of the robot are acquired with respect to the coordinate system of the inclined plate, realizing the localization of the robot. Finally, the localization method is implemented in an omni-directional scanning localization experiment with the independently researched and developed mobile robot. Localization accuracies of up to ±3.33 mm for the front, ±6.21 mm for the lateral and ±0.20° for the posture are obtained, verifying the correctness and effectiveness of the proposed localization method.

1. Introduction

The types of sensors used in the localization of mobile robots include laser sensors [1,2,3], vision sensors [4,5,6], infrared sensors [7], RFID (Radio Frequency Identification Devices) [8] and ultrasonic sensors [9,10]; among these, the ultrasonic sensor is the most robust and low-cost distance detection device [11]. The sound waves emitted by an ultrasonic sensor cover a fan-shaped area, the angle of which is defined as the divergence angle, and any object that falls within this region can be detected. The ranging accuracy of the ultrasonic sensor may be limited by failure to consider the divergence angle and the incidence angle, the latter referring to the angle between the cross-section of the ultrasonic sensor and the plane of the object being detected.
Song and Tang [12] reduced the impact of the divergence angle on the localization accuracy of a mobile robot by applying external and independent Kalman filtering and using two ultrasonic sensors and a CCD (Charge-Coupled Device) vision sensor. Noykov and Roumenin [11] experimentally outlined an orientational probability graph for the divergence angle of an ultrasonic sensor and proposed an ultrasonic sensor edge detection method based on Polaroid ultrasonic sensors. Kim and Kim [10] put forward a dual-ultrasonic-sensor overlapping-area distance detection method, which effectively decreased the influence of the divergence angle on the ranging accuracy of ultrasonic sensors, allowing for the precise localization of the posture of a car-like robot. Liang et al. [13] brought forward a lateral localization method that employs two ultrasonic sensors installed on one side of the robot and considers the incidence angle. Wijk and Christensen [14] proposed the use of information fusion technology for indoor robot localization through the recognition of fixed objects. Carinena et al. [15] applied the novel paradigm of fuzzy temporal rules to detect doors using the information of ultrasonic sensors. Hwang et al. [16] introduced a simple GPS-like system for the indoor localization of a mobile robot consisting of one transmitter with ultrasonic and RF (Radio Frequency) modules and two receivers. Li et al. [17] developed an ultrasonic sensor array heuristic controller system with group-sensor firing intervals, which was used to obtain the posture of a mobile robot in a parking space, to ensure the ability to withstand collision and to guarantee safe parking. Kim and Kim [18] presented the optimal arrangement of an ultrasonic sensor ring with beam overlap for high-resolution obstacle detection and minimal position uncertainty of a mobile robot. Hsu et al. [19] proposed a localization method based on omni-directional ultrasonic sensing, in which a mobile robot carries an omni-directional ultrasonic device as a transmitter and several ultrasonic sensors located at the vertices of a square environment serve as receivers. Lim et al. [20] proposed a novel control architecture that enables a robot to navigate indoor environments while avoiding obstacles and localizing its current position, using a smartphone as its brain to handle the heavy-duty and rotating ultrasonic sensors, reducing the number of sensors needed as well as the time required for distance measurements. Currently, research in the field of mobile robot localization is typically limited to uni-directional localization, for instance forward or lateral localization, and research on simultaneous forward, lateral and posture localization is rare.
Previous studies to reduce the effects of the divergence angle and the incident angle of ultrasonic sensors for the localization of a mobile robot have mainly focused on filtering or compensation methods. In this paper, the improved ultrasonic ranging calculation expression is presented, which is based on the original distance detection model for ultrasonic sensors. The ultrasonic sensor scanning method described herein can be used to recognize the inclined plate and acquire the position and posture of the ultrasonic sensor relative to the framework of the inclined plate. Finally, the omni-directional scanning localization method of a mobile robot is put forth based on this scanning method.
The organization of this paper is as follows: Section 2 analyzes the effects of the divergence angle and the incidence angle on the ranging accuracy of the ultrasonic sensor and derives the improved algorithm. Section 3 introduces the methodology of edge detection and recognition of the inclined plate, which is extended to the omni-directional scanning localization method in Section 4. In Section 5, experiments on threshold identification and the actual localization of a mobile robot are implemented, and the results verify the proposed localization methodology. Finally, Section 6 offers brief concluding remarks.

2. The Divergence Angle and the Incidence Angle of an Ultrasonic Sensor

In order to analyze the impact of the divergence angle and the incidence angle on the measurement accuracy of an ultrasonic sensor, a geometric model is established as shown in Figure 1, where $U_{ti}$ represents the ultrasonic sensor, $i = 1, 2$ is the number of the ultrasonic sensor, $\alpha_i$ represents the divergence angle of $U_{ti}$ (in degrees), AB is the reference plate, $d_i$ represents the direct distance measurement of $U_{ti}$ (in mm), $D_{ci}$ is the actual distance (in mm) between $U_{ti}$ and the reference plate AB in the y-direction, $L$ is the distance (in mm) between the two ultrasonic sensors in the x-direction and $r$ is the diameter (in mm) of the ultrasonic sensor $U_{ti}$. The incidence angle, $\theta$, is defined as the angle (in degrees) between the cross-section of $U_{ti}$ and the plane of the reference plate AB.
From Figure 1, it can be concluded that the actual distance value, D c i , between the ultrasonic sensor, U t i , and the reference plate, AB, in the y-direction can be expressed as:
$$D_{ci} = f(d_i, \alpha_i, \theta) = d_i C(\alpha_i) + \left[ d_i S(\alpha_i) + \frac{r}{2} \right] T(\theta), \tag{1}$$
where $C(\alpha_i)$ is the cosine function $\cos(\alpha_i)$, $S(\alpha_i)$ is the sine function $\sin(\alpha_i)$, $T(\theta)$ is the tangent function $\tan(\theta)$, and $i = 1, 2$ is the number of the ultrasonic sensor. Conventionally, the direct distance $d_i$ is taken as the actual distance $D_{ci}$. However, from Equation (1) it is apparent that the actual distance $D_{ci}$ differs from $d_i$ and is influenced by the divergence angle $\alpha_i$ and the incidence angle $\theta$; if these angles are ignored, the real ranging accuracy of the ultrasonic sensor is affected. The divergence angle $\alpha$ is an intrinsic property of a given ultrasonic sensor, invariant for a specified sensor but varying between different sensors, and can be obtained through experiments or from the factory manual. Usually, the incidence angle $\theta$ is set to zero when a single ultrasonic sensor is used or in non-positioning and non-obstacle-avoidance situations. However, when multiple ultrasonic sensors are used simultaneously and accurate localization is required, as in the application presented in Figure 2, where two ultrasonic sensors are applied, the incidence angle $\theta$ can be expressed as shown in Equation (2):
$$\theta = g(D_{c1}, D_{c2}) = \tan^{-1}\left(\frac{D_{c1} - D_{c2}}{L}\right). \tag{2}$$
From Equations (1) and (2), the incidence angle, θ , can be described as:
$$\theta = \begin{cases} \tan^{-1}\left[\dfrac{d_1 C(\alpha_1) - d_2 C(\alpha_2)}{L - d_1 S(\alpha_1) + d_2 S(\alpha_2)}\right] & \text{if } d_1 > d_2 \\ 0 & \text{if } d_1 = d_2 \\ -\tan^{-1}\left[\dfrac{d_2 C(\alpha_2) - d_1 C(\alpha_1)}{L + d_1 S(\alpha_1) - d_2 S(\alpha_2)}\right] & \text{if } d_1 < d_2 \end{cases} \tag{3}$$
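As a concrete illustration, the corrected range of Equation (1) and the piecewise incidence angle of Equation (3) can be sketched as follows. This is a minimal Python sketch: the function names are our own, and the angles are taken in radians.

```python
import math

def corrected_range(d, alpha, theta, r):
    """Actual distance D_ci of Eq. (1): the direct reading d corrected for the
    divergence angle alpha and the incidence angle theta; r is the sensor diameter."""
    return d * math.cos(alpha) + (d * math.sin(alpha) + r / 2.0) * math.tan(theta)

def incidence_angle(d1, d2, alpha1, alpha2, L):
    """Incidence angle theta of Eq. (3), from the direct readings d1, d2 of two
    sensors with divergence angles alpha1, alpha2 and baseline L."""
    if d1 > d2:
        return math.atan((d1 * math.cos(alpha1) - d2 * math.cos(alpha2)) /
                         (L - d1 * math.sin(alpha1) + d2 * math.sin(alpha2)))
    if d1 < d2:
        return -math.atan((d2 * math.cos(alpha2) - d1 * math.cos(alpha1)) /
                          (L + d1 * math.sin(alpha1) - d2 * math.sin(alpha2)))
    return 0.0  # d1 == d2: sensor cross-section parallel to the plate
```

For a parallel plate ($\theta = 0$) the corrected range reduces to $d\,C(\alpha)$, and for a vanishing divergence angle Equation (3) collapses to the simple form $\tan^{-1}((d_1 - d_2)/L)$ of Equation (2).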

3. Edge Detection and Recognition of the Inclined Plate

Earlier in this work, the effects of the divergence angle and the incidence angle on the distance measurement accuracy of an ultrasonic sensor were analyzed, and a novel ranging algorithm considering both angles was established. This result is now applied to the edge detection of an inclined plate. In order to enable object identification in a given environment, Zhong et al. [21] determined the range, bearing angle and shape (edge or plane) of objects from a single measurement of a robot using a single transmitter and multiple receivers of ultrasonic sensors. Ohtani and Baba [22] designed a prototype system for shape recognition and the position and posture measurement of an object, using an ultrasonic sensor array made up of multiple ultrasonic transmitters and receivers arranged in the same plane, a processing unit and a neural network. Although both of these studies could recognize objects and detect the edge of an object, the transmitter and the receiver of each ultrasonic sensor are separate, with the transmitter irradiating the measured object with ultrasonic waves and the receiver picking up the reflected waves. This arrangement requires multiple positions for the transmitters and the receivers, which needs more space and is inflexible. It is therefore worthwhile to integrate the transmitter and the receiver of an ultrasonic sensor and to achieve object identification by controlling the scanning of the sensor.
The ultrasonic sensors used in this paper are all integrated ones, with time synchronization (avoiding mutual interference between different ultrasonic sensors) and temperature compensation functions. The minimal detection distance of an ultrasonic sensor (dead zone) is denoted as $D_{\min}$ and the maximal as $D_{\max}$; the actual distance between the object and the ultrasonic sensor is $D$, where $D \in [D_{\min}, D_{\max}]$ ensures that the distance value $D$ is applicable and reliable. As the direction of the ultrasonic sensor relative to the inclined plate changes continuously during scanning, the measured value changes gradually, whereas at the edge of the inclined plate the measured value oscillates irregularly. Thus, a threshold $\xi$, a small positive real number, is defined to determine whether the edge of the inclined plate has been detected. Let $d_i$ and $d_{i+1}$ be two successive measurement records of a specific edge, with $i = 1, 2, \ldots, n$, where $n$ is the total number of record groups. If $\Delta d$ satisfies Equation (4), the edge of the inclined plate is considered detected and the distance from the ultrasonic sensor to the edge of the inclined plate is $d_i$:
$$\Delta d = |d_i - d_{i+1}| < \xi. \tag{4}$$
The edge detection model is shown in Figure 3, where $\alpha$ is the divergence angle of the ultrasonic sensor $U_t$, $C'$ is the coordinate system of $U_t$, $L_{AB}$ is the actual length of the inclined plate AB, and $A'$ and $B'$ are the coordinate systems at points A and B, respectively. $\lambda$ is the angle between the inclined plate AB and the horizontal x-axis, and $D_P = f(d_P, \alpha, \theta)$ is the actual distance from the ultrasonic sensor $U_t$ to the point P on the inclined plate. Correspondingly, $d_P$ is the direct distance measurement of point P, while $d_A$ and $d_B$ are the directly measured values of points A and B obtained while scanning the plate AB, all of which can be confirmed from Equation (4). $\omega_A$ and $\omega_B$ are the rotation angles of the ultrasonic sensor scanning from point P to point A counterclockwise and from point P to point B clockwise, respectively, where $\omega = \omega_A + \omega_B$. The theoretical length of plate AB, $l_{AB}$, is obtained from $d_A$, $d_B$ and $\omega$ in the triangle ABC:
$$l_{AB} = \sqrt{(d_A)^2 + (d_B)^2 - 2 d_A d_B C(\omega)}, \tag{5}$$
$$\Delta L = |l_{AB} - L_{AB}| < \delta. \tag{6}$$
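The edge criterion of Equation (4) and the length check of Equations (5) and (6) can be sketched together. This is a minimal Python illustration; the function names and the scan-record layout are our own:

```python
import math

def edge_distance(records, xi):
    """Eq. (4): the edge is taken as detected once two successive records of it
    agree to within the threshold xi; returns that distance d_i, or None."""
    for d_i, d_next in zip(records, records[1:]):
        if abs(d_i - d_next) < xi:
            return d_i
    return None  # edge not confirmed: re-scan

def plate_recognized(d_A, d_B, omega, L_AB, delta):
    """Eqs. (5)-(6): the law of cosines in triangle ABC gives the theoretical
    length l_AB; the plate is recognized if |l_AB - L_AB| < delta."""
    l_AB = math.sqrt(d_A ** 2 + d_B ** 2 - 2.0 * d_A * d_B * math.cos(omega))
    return abs(l_AB - L_AB) < delta
```

For instance, a sensor 1000 mm from the midpoint of a 500 mm plate sees both edges at $\sqrt{1000^2 + 250^2} \approx 1030.8$ mm over a scan angle $\omega = 2\tan^{-1}(0.25)$, and the recovered $l_{AB}$ is exactly 500 mm.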
In Equation (6), $\delta$ is a positive number given as the length recognition threshold of the inclined plate. The values $d_A$, $d_B$, $\omega_A$ and $\omega_B$ acquired from Equation (4) are considered correct if $l_{AB}$ satisfies Equation (6); otherwise, scanning is repeated until Equation (6) is satisfied and the plate is recognized. Let $\theta_A$ be the angle between the straight line AC and the y-direction of the reference frame $A'$, $\theta_P$ the angle between the straight line PC and the y-direction of the reference frame $P'$, and $\theta_B$ the angle between the straight line BC and the y-direction of the reference frame $B'$. According to the homogeneous coordinate transformation methodology, the position and posture, $T_P^A$ and $T_P^B$, of system $P'$ relative to $A'$ and $B'$, respectively, are obtained as:
$$T_P^A = \begin{bmatrix} E & \begin{matrix} d_A S(\theta_A) \\ d_A C(\theta_A) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} R_z(\theta_A - \omega_A - \alpha - \theta_P) & \begin{matrix} d_A S(\theta_A) \\ d_A C(\theta_A) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{7}$$
$$T_P^B = \begin{bmatrix} E & \begin{matrix} d_B S(\theta_B) \\ d_B C(\theta_B) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} R_z(\omega_B - \theta_B + \alpha - \theta_P) & \begin{matrix} d_B S(\theta_B) \\ d_B C(\theta_B) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{8}$$
where $E$ is the $3 \times 3$ unit matrix and $R_z(\theta)$ is the rotation matrix shown in Equation (10). The angles $\theta_A$ and $\theta_B$ in Equations (7) and (8) are given in Equation (9):
$$\begin{cases} \theta_A = \dfrac{\pi}{2} - \sin^{-1}\left(\dfrac{d_B}{L_{AB}} S(\omega)\right) + \lambda \\ \theta_B = \dfrac{\pi}{2} - \sin^{-1}\left(\dfrac{d_A}{L_{AB}} S(\omega)\right) - \lambda \end{cases} \tag{9}$$
$$R_z(\theta) = \begin{bmatrix} C(\theta) & -S(\theta) & 0 \\ S(\theta) & C(\theta) & 0 \\ 0 & 0 & 1 \end{bmatrix}. \tag{10}$$
According to the earlier definition, the incidence angles between the ultrasonic sensor $U_t$ and the inclined plate AB at point P, relative to the reference frames $A'$ and $B'$, are $\theta_P^A$ and $\theta_P^B$, respectively:
$$\begin{cases} \theta_P^A = \omega_A + \alpha - \theta_A + \lambda \\ \theta_P^B = \theta_B - \alpha - \omega_B + \lambda \end{cases} \tag{11}$$
Through the above analysis, the position (Equation (7) or (8)) and posture (Equation (11)) of an ultrasonic sensor relative to a fixed plate can be obtained. Similarly, if the ultrasonic sensor is installed on a mobile robot, the scanning of the sensor is accomplished by the rotation of the robot, and the position and posture of the mobile robot can likewise be acquired.
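The block structure of Equations (7) and (10) can be assembled numerically. The sketch below is a minimal Python illustration: the helper names are our own, angles are in radians, and the translation signs follow Equation (7) as printed:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix R_z(theta) of Eq. (10)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pose_P_in_A(d_A, theta_A, omega_A, alpha, theta_P):
    """Homogeneous pose T_P^A of Eq. (7): rotation R_z(theta_A - omega_A -
    alpha - theta_P) with translation (d_A S(theta_A), d_A C(theta_A), 0)."""
    T = np.eye(4)
    T[:3, :3] = rot_z(theta_A - omega_A - alpha - theta_P)
    T[0, 3] = d_A * np.sin(theta_A)
    T[1, 3] = d_A * np.cos(theta_A)
    return T
```

Because the rotation block stays orthonormal, chaining or inverting these poses with ordinary matrix products behaves as expected for homogeneous coordinate transformations.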

4. Omni-Directional Scanning Localization Method

The above section introduced the ultrasonic sensor scanning recognition method of a fixed inclined plate and deduced the position and posture of the ultrasonic sensor relative to the plate. However, in most practical applications, the ultrasonic sensor is installed on a mobile robot, the rotation center of the sensor is the center of the robot and the ultrasonic sensor scans the objects along with the spin of the robot. To further research the localization method of a mobile robot with an ultrasonic sensor, a model is established as shown in Figure 4.
In Figure 4, coordinate systems parallel to the reference frame $O'$ are established at points A, B, C, D and P as $A'$, $B'$, $C'$, $D'$ and $P'$, as is the robot base reference frame $R'$ at the center of the mobile robot, whose axes are parallel to the outlines of the robot. AB and CD are two inclined plates fixed on the localization worksite or on the pallet to be carried away by the robot; the angle between plate AB and the horizontal x-axis is $\lambda$, and the angle between plate CD and the horizontal x-axis is $-\lambda$. R is the center of the mobile robot body. The two ultrasonic sensors, $U_{t1}$ and $U_{t2}$, are installed at the two points E and F on the front of the robot body, characterized by the divergence angles $\alpha_1$ and $\alpha_2$, respectively. The distance between $U_{t1}$ and $U_{t2}$, parallel to the transverse of the robot, is $W$. $d_E$ is the directly measured value of $U_{t1}$ when the robot is located at position R, and $D_{cE}$ is the actual one; similarly, $d_F$ is the direct measurement of $U_{t2}$ and $D_{cF}$ is the actual one. P is the reflection point at the present position and posture of $U_{t1}$.
The desired distance between the ultrasonic sensors $U_{t1}$ and $U_{t2}$ and the plates AB and CD is denoted as $D$, and the posture of the robot is $\theta_O = 0°$ when the robot is at the reference position O. The inclined plates AB and CD are arranged symmetrically about the y-axis of the frame $O'$. The position and posture error of the robot at point R relative to the reference frame $O'$ is given as $\Delta\varepsilon = [\Delta\varepsilon_x, \Delta\varepsilon_y, \Delta\varepsilon_\theta]^T$. The position of point E in the frame $R'$ is $P_E^R = [E_x, E_y]^T$. $L_{AD}$ is the distance from point A to point D in the horizontal x-direction, and $L_{BC}$ is the distance from point B to point C in the horizontal x-direction.
First, the robot is moved to point O manually, ensuring that $D_{cE} = D_{cF} = D$. Then, the robot is rotated around its center and, using the scanning recognition method introduced in Section 3, the counterclockwise and clockwise rotation angles of the robot from point P to point A and from point P to point B, $\omega_{AO}$ and $\omega_{BO}$, respectively, are recorded. Finally, the homogeneous coordinate transformations $T_O^A$ and $T_O^B$ from the coordinate systems $A'$ and $B'$ to the coordinate system $P'$ are obtained as follows:
$$T_O^A = \begin{bmatrix} R_z(\theta_{AO} - \omega_{AO} - \alpha_1) & P_O^A \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} E & \begin{matrix} L_{AD}/2 \\ E_y + D - \frac{L_{AD} - W}{2} T(\lambda) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{12}$$
$$T_O^B = \begin{bmatrix} R_z(-\theta_{BO} + \omega_{BO} + \alpha_1) & P_O^B \\ \mathbf{0} & 1 \end{bmatrix} = \begin{bmatrix} E & \begin{matrix} L_{BC}/2 \\ E_y + D + \frac{L_{BC} - W}{2} T(\lambda) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{13}$$
where $\theta_{AO}$ is the angle between the straight line AO and the y-axis of the reference frame $A'$, and $\theta_{BO}$ is the angle between the straight line BO and the y-axis of the reference frame $B'$. From Equations (12) and (13), the distances from point O to points A and B, $L_{AO}$ and $L_{BO}$, are as follows:
$$L_{AO} = \sqrt{(P_{Ox}^A)^2 + (P_{Oy}^A)^2}, \tag{14}$$
$$L_{BO} = \sqrt{(P_{Ox}^B)^2 + (P_{Oy}^B)^2}, \tag{15}$$
$$\omega_O = \omega_{AO} + \omega_{BO}. \tag{16}$$
The theoretical length $l_{AB}$ of the plate AB in the triangle ABO is calculated from the values of $L_{AO}$, $L_{BO}$ and $\omega_O$:
$$l_{AB} = \sqrt{(L_{AO})^2 + (L_{BO})^2 - 2 L_{AO} L_{BO} C(\omega_O)}. \tag{17}$$
In accordance with the definition of the incidence angle, the incidence angle $\theta$ relative to the reference frames $A'$ and $B'$ is:
$$\theta = \theta_P^A = \theta_P^B = \lambda. \tag{18}$$
Similarly, the transformations of the robot at position R relative to $A'$ and $B'$ are $T_R^A$ and $T_R^B$, respectively:
$$T_R^A = \begin{bmatrix} R_z(\theta_{AR} - \omega_A - \alpha_1) & \begin{matrix} L_{AR} S(\theta_{AR}) \\ L_{AR} C(\theta_{AR}) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{19}$$
$$T_R^B = \begin{bmatrix} R_z(-\theta_{BR} + \omega_B + \alpha_1) & \begin{matrix} L_{BR} S(\theta_{BR}) \\ L_{BR} C(\theta_{BR}) \\ 0 \end{matrix} \\ \mathbf{0} & 1 \end{bmatrix}, \tag{20}$$
where $\theta_{AR}$ is the angle between the straight line AR and the y-axis of the reference frame $A'$, and $\theta_{BR}$ is the angle between the straight line BR and the y-axis of the reference frame $B'$:
$$\begin{cases} \theta_{AR} = \dfrac{\pi}{2} - \sin^{-1}\left(\dfrac{L_{BR}}{L_{AB}} S(\omega)\right) + \lambda \\ \theta_{BR} = \dfrac{\pi}{2} + \sin^{-1}\left(\dfrac{L_{AR}}{L_{AB}} S(\omega)\right) + \lambda \end{cases} \tag{21}$$
$$\begin{cases} L_{AR} = \sqrt{(L_{AG})^2 + (L_{GR})^2 - 2 L_{AG} L_{GR} S(\alpha_1)} \\ L_{BR} = \sqrt{(L_{AH})^2 + (L_{HR})^2 - 2 L_{AH} L_{HR} S(\alpha_1)} \end{cases} \tag{22}$$
$$\begin{cases} L_{AG} = d_A + \dfrac{P_{Ey}^R}{C(\alpha_1)} \\ L_{GR} = P_{Ey}^R T(\alpha_1) + P_{Ex}^R \\ L_{AH} = d_B + \dfrac{P_{Ey}^R}{C(\alpha_1)} \\ L_{HR} = L_{AH} \end{cases} \tag{23}$$
The theoretical length of the plate AB, $l_{AB}$, can be deduced from the triangle ABR as:
$$l_{AB} = \sqrt{(L_{AR})^2 + (L_{BR})^2 - 2 L_{AR} L_{BR} C(\omega)}. \tag{24}$$
The incidence angles, $\theta_P^A$ and $\theta_P^B$, of the ultrasonic sensor are listed in Equation (25):
$$\begin{cases} \theta_P^A = \omega_A + \alpha_1 - \theta_{AR} + \lambda \\ \theta_P^B = \lambda - \theta_{BR} - \omega_B - \alpha_1 \end{cases} \tag{25}$$
The position and posture errors of the robot at position R relative to position O, referred to the frames $A'$ and $B'$, are $\Delta_O^A \varepsilon_R$ and $\Delta_O^B \varepsilon_R$:
$$\Delta_O^A \varepsilon_R = \begin{bmatrix} \Delta_O^A \varepsilon_{Rx} \\ \Delta_O^A \varepsilon_{Ry} \\ \Delta_O^A \varepsilon_{R\theta} \end{bmatrix} = \begin{bmatrix} P_{Rx}^A - P_{Ox}^A \\ P_{Ry}^A - P_{Oy}^A \\ R_{Rx}^A - R_{Ox}^A \end{bmatrix}, \tag{26}$$
$$\Delta_O^B \varepsilon_R = \begin{bmatrix} \Delta_O^B \varepsilon_{Rx} \\ \Delta_O^B \varepsilon_{Ry} \\ \Delta_O^B \varepsilon_{R\theta} \end{bmatrix} = \begin{bmatrix} P_{Rx}^B - P_{Ox}^B \\ P_{Ry}^B - P_{Oy}^B \\ R_{Rx}^B - R_{Ox}^B \end{bmatrix}. \tag{27}$$
Considering Equations (12)–(15), (19), (20), (26) and (27), the errors $\Delta_O^A \varepsilon_R$ and $\Delta_O^B \varepsilon_R$ are calculated as follows:
$$\Delta_O^A \varepsilon_R = \frac{1}{2} \begin{bmatrix} -2 L_{AR} S(\theta_{AR}) + L_{AD} - (L_{AD} - W) T(\lambda) \\ 2 \left( L_{AR} C(\theta_{AR}) - E_y - D \right) \\ 2 (\theta_{AR} - \omega_A - \alpha_1) \end{bmatrix}, \tag{28}$$
$$\Delta_O^B \varepsilon_R = \frac{1}{2} \begin{bmatrix} -2 L_{BR} S(\theta_{BR}) + L_{BC} - (L_{BC} - W) T(\lambda) \\ 2 \left( L_{BR} C(\theta_{BR}) - E_y - D \right) \\ 2 (-\theta_{BR} + \omega_B + \alpha_1) \end{bmatrix}. \tag{29}$$
The position and posture error thresholds of the robot at position R are set to $\varepsilon = [\varepsilon_x, \varepsilon_y, \varepsilon_\theta]^T$. The robot meets the localization requirement if the error $\Delta\varepsilon$ satisfies Equation (30):
$$|\Delta\varepsilon| = \begin{bmatrix} |\Delta\varepsilon_x| \\ |\Delta\varepsilon_y| \\ |\Delta\varepsilon_\theta| \end{bmatrix} < \begin{bmatrix} \varepsilon_x \\ \varepsilon_y \\ \varepsilon_\theta \end{bmatrix} = \varepsilon. \tag{30}$$
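The check of Equation (30) amounts to a componentwise comparison of both error vectors against the thresholds. A minimal sketch (function name and vector layout are our own):

```python
import numpy as np

def localization_satisfied(delta_eps_A, delta_eps_B, eps):
    """Eq. (30): the robot meets the localization requirement when every
    component (x, y, theta) of both pose errors is below its threshold."""
    eps = np.asarray(eps, dtype=float)
    return bool(np.all(np.abs(delta_eps_A) < eps) and
                np.all(np.abs(delta_eps_B) < eps))
```

With the thresholds adopted later in Section 5, $\varepsilon = [10\ \text{mm}, 5\ \text{mm}, 1°]^T$, error vectors such as $[6.21, 3.33, 0.20]$ pass the check, while any single out-of-tolerance component causes it to fail.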
Up to this stage, the position and posture of the mobile robot, and its error relative to the reference frames $A'$ and $B'$ at points A and B, have been acquired. In an actual application, the localization of the robot proceeds according to the following steps.
Step 1
Preparation. First, the positions of the inclined plates AB and CD are set up, and their lengths and poses are specified. Then, the threshold values $\xi$, $\delta$ and $\varepsilon$ are assigned. Next, the measured distance value $D$ of the ultrasonic sensor at the reference pose O is verified, and the transforms $T_O^A$ and $T_O^B$ are calculated.
Step 2
Satisfaction of the pre-localization condition. The robot moves to the control point R, which can be anywhere, and it is judged whether the actual distance measurements $D_{cE}$ and $D_{cF}$ satisfy $|D_{cE} - D| < \rho_E$ and $|D_{cF} - D| < \rho_F$, with $\rho_E$ and $\rho_F$ being positive real numbers. If these prerequisites are met, the directly measured distances $d_A$ and $d_B$ of the ultrasonic sensor $U_{t1}$ and the rotation angles $\omega_A$ and $\omega_B$ of the robot are recorded, and it is possible to proceed to Step 3. Otherwise, the process is repeated until the pre-localization conditions are met.
Step 3
Edge detection of the inclined plate. Along with the spin of the robot around its center, the ultrasonic sensor $U_{t1}$ scans the inclined plate AB, and the distances $d_A$ and $d_B$ to the edges of plate AB are measured. If $d_A$ and $d_B$ satisfy Equation (4), it is possible to go to the subsequent step; if not, re-scanning is necessary.
Step 4
Verification of the length of the inclined plate. The theoretical length $l_{AB}$ of plate AB is calculated through Equation (24). If Equation (6) is satisfied, proceed to Step 5; if not, return to Step 2 and repeat Steps 2–4.
Step 5
Calculation of the position and posture of the robot. The position and posture, $T_R^A$ and $T_R^B$, of the mobile robot relative to the reference frames $A'$ and $B'$ are calculated through Equations (19) and (20), respectively.
Step 6
Satisfaction of the localization requirement. The position and posture errors, $\Delta_O^A \varepsilon_R$ and $\Delta_O^B \varepsilon_R$, of the robot are computed. If they satisfy Equation (30), the localization requirement is achieved; otherwise, return to Step 2 and repeat Steps 2–6.
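The six steps above can be condensed into a control loop. The sketch below is illustrative only: `robot` and its methods (`measure_front_pair`, `move_toward_plates`, `scan_edges`, `plate_length_ok`, `pose_errors`) are hypothetical callbacks standing in for the sensing and motion routines described in the steps:

```python
def localize(robot, D, rho_E, rho_F, xi, delta, eps, max_tries=10):
    """Omni-directional scanning localization loop (Steps 2-6).
    `robot` is an assumed interface providing ranging, rotation and
    pose-error routines; eps is the threshold vector of Eq. (30)."""
    for _ in range(max_tries):
        # Step 2: pre-localization condition at the control point R.
        D_cE, D_cF = robot.measure_front_pair()
        if abs(D_cE - D) >= rho_E or abs(D_cF - D) >= rho_F:
            robot.move_toward_plates()
            continue
        # Step 3: spin and detect both edges of plate AB (Eq. 4).
        d_A, d_B, omega_A, omega_B = robot.scan_edges(xi)
        if d_A is None or d_B is None:
            continue  # edge not confirmed: re-scan
        # Step 4: verify the plate length (Eqs. 24 and 6).
        if not robot.plate_length_ok(d_A, d_B, omega_A + omega_B, delta):
            continue
        # Step 5: pose errors relative to frames A' and B' (Eqs. 19, 20, 28, 29).
        err_A, err_B = robot.pose_errors(d_A, d_B, omega_A, omega_B)
        # Step 6: localization requirement (Eq. 30).
        if all(abs(e) < t for e, t in zip(err_A, eps)) and \
           all(abs(e) < t for e, t in zip(err_B, eps)):
            return err_A, err_B
    return None  # localization not achieved within max_tries
```

The loop mirrors the retry structure of the steps: failure at any stage sends the robot back to the pre-localization condition rather than aborting.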

5. Threshold and Experiments

5.1. Threshold Identification Experiment

The omni-directional scanning localization method of a mobile robot and its application steps have been introduced. However, the values chosen for the different thresholds defined in the method affect the accuracy of the localization.
The most important one is the plate edge detection threshold, $\xi$, which determines the success or failure of the edge detection of the plate and thus affects the correctness of the incidence angle of the ultrasonic sensor and of the theoretical length of the plate. Therefore, it is necessary to obtain a precise value of $\xi$.
The experiment performed to confirm the edge detection threshold, $\xi$, is shown in Figure 5, where the ultrasonic sensor is mounted on the c-axis of the independently researched and developed precision five-axis machine; the x-axis of the machine is assembled on top of the y-axis and below the c-axis. The scanning movement of the ultrasonic sensor is driven by the CNC (Computer Numerical Control) programming of the x-axis, the y-axis and the rotation of the c-axis of the machine. The plate edge data measured by the ultrasonic sensor are recorded by the DAQ (Data Acquisition) system SIRISI-8A (DEWESoft, Kumberg, Austria), as shown in Figure 6.
In Figure 6, the blue line is the recorded data of the ultrasonic sensor, the red line is the rotation angle of the c-axis of the machine, points A and B are the detected edges of the inclined plate, their vertical coordinates, $d_A$ and $d_B$, are the detected distances to the two sides of the plate, and $\omega$ is the rotation angle of the c-axis from point A to point B. The theoretical length of the plate, $l_{AB}$, can be calculated using Equation (5) and the length error $\Delta L$ from Equation (6). After many experiments and extensive analysis, it was found that the error $\Delta L$ increases sharply when $\xi < 0.008$ and at a faster rate when $\xi > 0.010$. Therefore, $\xi = 0.009$ is adopted as the edge detection threshold and $\delta = 3$ as the length recognition threshold. The ranging thresholds are $\rho_E = 100$ and $\rho_F = 100$. The position and posture error threshold $\varepsilon$ is determined empirically as $\varepsilon = [10\ \text{mm}, 5\ \text{mm}, 1°]^T$.

5.2. Localization Experiment

The localization experiment was implemented on the independently developed latent and towing mobile robot shown in Figure 7. The mobile robot is a differential-drive robot, with two driving wheels mounted coaxially and symmetrically on the left and right sides of the robot and four universal wheels distributed at its four corners. To ensure that all six wheels of the robot remain in good contact with the ground while driving, the universal wheels are designed with an elastic suspension structure. The maximum speed of the robot is 0.5 m/s, the maximum loading capacity is 2 tons, and its length, width and height are 1560 mm, 900 mm and 300 mm, respectively.
Part of the experimental localization data is listed in Table 1. $D_{c1}$ and $D_{c2}$ are the actual distances measured by the ultrasonic sensors $U_{t1}$ and $U_{t2}$, respectively. $\Delta\varepsilon_x$, $\Delta\varepsilon_y$ and $\Delta\varepsilon_\theta$ are the localization errors of the robot relative to the x, y and c axes of the reference frame $O'$.
As can be seen from Table 1, the localization accuracy of the robot is $\Delta\varepsilon_y \le \pm 3.33$ mm on the y-axis (the front of the robot), $\Delta\varepsilon_x \le \pm 6.21$ mm on the x-axis (the lateral of the robot) and $\Delta\varepsilon_\theta \le \pm 0.20°$ on the c-axis (the posture of the robot). The localization accuracy $\Delta\varepsilon$ satisfies the localization error threshold $\varepsilon$, and these results verify the efficacy of the omni-directional scanning localization method.
At this point, the methodology of edge detection and recognition of an inclined plate and the omni-directional scanning localization of a robot have been verified by the above experiments, and the thresholds of the localization method have been confirmed through experiments and experience. In the process, the novel ranging algorithm of the ultrasonic sensor has also been validated. The localization accuracy of the proposed omni-directional scanning method is suitable for a variety of applications, several of which are described in the next section.

5.3. Discussion and Application

The proposed localization method, an omni-directional method that realizes the forward, lateral and posture localization of a robot simultaneously, differs from the localization application shown in Figure 2, which can only achieve the lateral (or forward) and posture positioning of the robot. The method takes up little space and is more convenient in localization applications such as those shown in Figure 8: Figure 8a shows a transit task in a busy factory (a case from a textile enterprise), in which many shelves, arranged according to certain rules, wait to be carried away; Figure 8b shows another localization application (a case from a rice winery), where the robot moves from anywhere to the tight localization site in a continuous path and confirms its position and posture.
The proposed localization method can also be leveraged for other types of robots that satisfy the localization conditions, such as the omni-directional mobile robot shown in Figure 8c, on which each of the four sides carries two ultrasonic sensors to enable localization on any side of the robot. The robot has four Mecanum wheels, with which it can move smoothly in the x, y and c directions [23]. Further research and application of this technology, such as multi-robot localization in dynamic environments, is ongoing.

6. Conclusions

First, a novel ranging algorithm for an ultrasonic sensor is established by simultaneously considering both the divergence angle and the incidence angle, which improves the measurement accuracy of the ultrasonic sensor.
Second, the edge detection and recognition of an inclined plate is introduced by using the proposed ranging algorithm, based on which the position and posture of an ultrasonic sensor relative to the plate are obtained.
Third, the ultrasonic sensor is installed on a mobile robot, and the positioning method of the ultrasonic sensor is extended to the omni-directional scanning localization of a mobile robot to achieve the forward, lateral and posture localizations synchronously. Details of the localization methodology are introduced and discussed, and the application steps are summarized.
Fourth, the thresholds of the localization method are confirmed through experiments and experience, and the omni-directional scanning localization method is verified by the localization experiment of a differential-drive robot. The application of the method to other types of mobile robots is discussed and several real applications are given.
Finally, the main concern of the proposed localization method is the local localization of a mobile robot at the worksite. Further research and application of this technology is ongoing, including the combination of local and global localization of a robot and the application of the proposed localization method to multiple robots with multi-sensor information fusion technology in dynamic circumstances.

Acknowledgments

The authors would like to express their sincere thanks to the reviewers for their invaluable comments and suggestions. The research leading to these results has received funding from the National Key Scientific and Technological Innovation Projects (multi-functional cutting robot, 99BK445), the National 973 Project (2009CB724406), and funding from the Key Laboratory Research Project of the Education Department of Shaanxi Province (14JS062), the Research Project of the Education Department of Shaanxi Province (11JS073), the Key Scientific Research Project of the Education Department of Shaanxi Province (2010JS81) and the Special Research Program Project of the Education Department of Shaanxi Province (16JK1517).

Author Contributions

Wei-Yi Mu and Yu-Mei Huang developed the methodology and wrote the paper; Wei-Yi Mu and Hong-Yan Liu designed and performed the experiments under the supervision of Guang-Peng Zhang and Xin-Gang Yang; Guang-Peng Zhang and Wen Yan analyzed the data. The authors approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Loevsky, I.; Shimshoni, I. Reliable and efficient landmark-based localization for mobile robots. Robot. Auton. Syst. 2010, 58, 520–528. [Google Scholar] [CrossRef]
  2. Mirkhani, M.; Forsati, R.; Shahri, A.M.; Moayedikia, A. A novel efficient algorithm for mobile robot localization. Robot. Auton. Syst. 2013, 61, 920–931. [Google Scholar] [CrossRef]
  3. Martín, F.; Moreno, L.; Blanco, D.; Muñoz, M.L. Kullback–leibler divergence-based global localization for mobile robots. Robot. Auton. Syst. 2014, 62, 120–130. [Google Scholar] [CrossRef]
  4. Muñoz-Salinas, R.; Aguirre, E.; García-Silvente, M. Detection of doors using a genetic visual fuzzy system for mobile robots. Auton. Robots 2006, 21, 123–141. [Google Scholar] [CrossRef]
  5. Zhu, X.; Cao, Q. A special unique solution case of the perspective-three-point problem for external parameter calibration of an omnidirectional camera. Int. J. Adv. Robot. Syst. 2012, 9, 217. [Google Scholar] [CrossRef]
  6. Chao, C.T.; Chung, M.H.; Chiou, J.S.; Wang, C.J. A simple interface for 3D position estimation of a mobile robot with single camera. Sensors 2016, 16, 435. [Google Scholar] [CrossRef] [PubMed]
  7. Gorostiza, E.M.; Lázaro Galilea, J.L.; Meca Meca, F.J.; Salido Monzú, D.; Espinosa Zapata, F.; Pallarés Puerto, L. Infrared sensor system for mobile-robot positioning in intelligent spaces. Sensors 2011, 11, 5416–5438. [Google Scholar] [CrossRef] [PubMed]
  8. Mi, J.; Yasutake, T. Design of an HF-band RFID system with multiple readers and passive tags for indoor mobile robot self-localization. Sensors 2016, 16, 1200. [Google Scholar] [CrossRef] [PubMed]
  9. Ko, N.Y.; Kuc, T.Y. Fusing range measurements from ultrasonic beacons and a laser range finder for localization of a mobile robot. Sensors 2015, 15, 11050–11075. [Google Scholar] [CrossRef] [PubMed]
  10. Kim, S.; Kim, H. Optimally overlapped ultrasonic sensor ring design for minimal positional uncertainty in obstacle detection. Int. J. Control Autom. Syst. 2010, 8, 1280–1287. [Google Scholar] [CrossRef]
  11. Noykov, S.; Roumenin, C. Calibration and interface of a Polaroid ultrasonic sensor for mobile robots. Sens. Actuators A Phys. 2007, 135, 169–178. [Google Scholar] [CrossRef]
  12. Song, K.T.; Tang, W.H. Environment perception for a mobile robot using double ultrasonic sensors and a CCD camera. IEEE Trans. Ind. Electron. 1996, 43, 372–379. [Google Scholar] [CrossRef]
  13. Ma, B.; Huang, Y.; Shi, E.; Cai, T. Experimental study of lateral position based on algebra neural networks information fusion. China Mech. Eng. 2008, 19, 2102–2107. [Google Scholar]
  14. Wijk, O.; Christensen, H.I. Triangulation-based fusion of sonar data with application in robot pose tracking. IEEE Trans. Robot. Autom. 2001, 16, 740–752. [Google Scholar] [CrossRef]
  15. Carinena, P.; Regueiro, C.V.; Otero, A.; Bugarin, A.J. Landmark detection in mobile robotics using fuzzy temporal rules. IEEE Trans. Fuzzy Syst. 2004, 12, 423–435. [Google Scholar] [CrossRef]
  16. Hwang, K.H.; Kim, D.E.; Lee, D.H.; Kuc, T.Y. A simple ultrasonic GPS system for indoor mobile robot system using Kalman filtering. In Proceedings of the 2006 SICE-ICASE International Joint Conference, Busan, Korea, 18–21 October 2006; pp. 2915–2918.
  17. Li, T.S.; Yeh, Y.C.; Wu, J.D.; Hsiao, M.Y. Multifunctional intelligent autonomous parking controllers for carlike mobile robots. IEEE Trans. Ind. Electron. 2010, 57, 1687–1700. [Google Scholar] [CrossRef]
  18. Kim, S.; Kim, H. Optimal ultrasonic sensor ring with beam overlap for high resolution obstacle detection. In Proceedings of the IECON 37th Annual Conference on IEEE Industrial Electronics Society, Melbourne, Australia, 7–10 November 2011; pp. 240–245.
  19. Hsu, C.C.; Lai, C.Y.; Kanamori, C.; Aoyama, H. Localization of mobile robots based on omni-directional ultrasonic sensing. In Proceedings of the SICE Annual Conference, Tokyo, Japan, 13–18 September 2011; pp. 1972–1975.
  20. Lim, J.; Lee, S.J.; Tewolde, G.; Kwon, J. Indoor localization and navigation for a mobile robot equipped with rotating ultrasonic sensors using a smartphone as the robot’s brain. In Proceedings of the 2015 IEEE International Conference on Electro/Information Technology (EIT), Dekalb, IL, USA, 21–23 May 2015; pp. 1–11.
  21. Zhong, S.L.; Kwon, S.T.; Joo, M.G. Multi-object identification for mobile robot using ultrasonic sensors. Int. J. Control Autom. Syst. 2012, 10, 589–593. [Google Scholar]
  22. Ohtani, K.; Baba, M. Shape recognition and position and posture measurement of an object based on ultrasonic pressure distributions. IEEJ Trans. Electron. Inf. Syst. 2010, 130, 2013–2020. [Google Scholar] [CrossRef]
  23. Muir, P.F.; Neuman, C.P. Kinematic modeling of wheeled mobile robots. Robot. Syst. 1987, 4, 281–340. [Google Scholar] [CrossRef]
Figure 1. The divergence angle and the incidence angle.
Figure 2. Lateral localization of a mobile robot. (a) The mathematical model of the localization application; (b) the practical application of lateral localization.
Figure 3. Edge detection of the inclined plate.
Figure 4. Localization model of the mobile robot.
Figure 5. Threshold of edge detection experiment.
Figure 6. Data of edge detection.
Figure 7. The localization experiment.
Figure 8. Applications of the omni-directional scanning localization method. (a) The transit tasks application; (b) the localization application; (c) the application to an omni-directional mobile robot.
Table 1. The localization data of the robot.
D_c1 (mm)    D_c2 (mm)    Δε_x (mm)    Δε_y (mm)    Δε_θ (°)
298.78       300.42       −4.65        −0.40        −0.16
300.19       300.57       −1.08         0.38        −0.04
301.14       301.18       −0.11         1.16        −0.00
300.28       300.64       −1.02         0.46        −0.03
302.31       304.35       −5.78         3.33        −0.19
298.79       300.98       −6.21        −0.12        −0.20
300.84       299.50        3.80         0.17         0.13
297.98       297.41        1.62        −2.31         0.05
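The posture column Δε_θ in Table 1 is geometrically consistent with a pair of corrected front distances D_c1 and D_c2 measured by two sensors a fixed baseline apart. The sketch below shows only this basic relation; the baseline value, sign convention and function names are assumptions, not values from the paper, whose algorithm additionally corrects for the divergence and incidence angles.

```python
import math

def pose_from_pair(d1_mm, d2_mm, baseline_mm):
    """Posture angle (degrees) and mean forward distance (mm) relative to
    the plate, from two corrected distance readings taken by sensors
    separated by baseline_mm. A geometric sketch only; the names, sign
    convention and baseline are illustrative assumptions.
    """
    theta_deg = math.degrees(math.atan2(d2_mm - d1_mm, baseline_mm))
    forward_mm = (d1_mm + d2_mm) / 2.0
    return theta_deg, forward_mm
```

For the first row of Table 1 (D_c1 = 298.78 mm, D_c2 = 300.42 mm) and an assumed 600 mm baseline, this yields a posture angle of about 0.16° in magnitude, comparable to that row's |Δε_θ|.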

Share and Cite

MDPI and ACS Style

Mu, W.-Y.; Zhang, G.-P.; Huang, Y.-M.; Yang, X.-G.; Liu, H.-Y.; Yan, W. Omni-Directional Scanning Localization Method of a Mobile Robot Based on Ultrasonic Sensors. Sensors 2016, 16, 2189. https://doi.org/10.3390/s16122189

