Article

Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

Department of Electrical Engineering, National Chin-Yi University of Technology, Taichung 41170, Taiwan
*
Author to whom correspondence should be addressed.
Sensors 2012, 12(10), 13947-13963; https://doi.org/10.3390/s121013947
Submission received: 27 July 2012 / Revised: 8 October 2012 / Accepted: 10 October 2012 / Published: 16 October 2012
(This article belongs to the Section Physical Sensors)

Abstract

This paper presents a following robot with omni-directional wheels that is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviations encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model, and various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of the fuzzy path tracker as well as the extensible collision-avoidance system.

1. Introduction

The development of omni-directional wheel systems has made it possible to build robots that can move laterally without needing to rotate. Several researchers have employed omni-directional wheels in their robot designs. A dynamic model and nonlinear motion-control issues are explored in [1–3], where an omni-directional vehicle is equipped with up to three motor sets. Using a Field Programmable Gate Array (FPGA) as the control core, a multi-robot system [4] was developed by integrating a robot arm into an omni-directional mobile robot, enabling better interaction between the robot and users.

Many factors can cause path deviation in a robot, such as variation in motor mechanical tolerances, power output, the weight borne by the wheels, and even the ground surface; thus, path deviation is unavoidable. This study employs a compensation approach based on the motor encoders and fuzzy logic to resolve the problem of straight-path deviation [5,6]. Motor revolutions are evaluated from the feedback pulses dispatched by the motor encoders at specific intervals in order to tune the Pulse Width Modulation (PWM) value, so that the specified motor revolutions are maintained.

Several approaches have been employed to avoid obstacles. These include lasers and infrared [7,8], vision systems [9,10], and wall following using ultrasonic sensors [11–13]. Extension theory was proposed in 1983 to solve contradiction and incompatibility problems. It consists of two parts, the matter-element model and extended set theory, and can be applied to classification or fault diagnosis [14]. Fuzzy theory, proposed by L. A. Zadeh in 1965, uses fuzzy rules and fuzzy inference to replace complicated equations; it is widely used in robot control [15]. In this paper, the obstacle-avoidance system is modeled as multi-dimensional obstacle-avoidance matter-elements, where the number of extension matter-elements corresponds to the number of obstacle-avoidance modes. The proposed approach utilizes ultrasound to complete the task and to implement the matter-element extension model. In this way, obstacles can be modeled and an optimal tracking approach is implemented.

This paper is organized as follows: Section 2 describes the hardware configuration of the robots. Section 3 introduces the omni-directional wheeled mobile robot designed using fuzzy logic theory. Section 4 presents an extension theory based on an obstacle avoidance system. Section 5 gives a discussion of experimental results and Section 6 presents our conclusions.

2. Hardware Design

The omni-directional mobile autonomous following robot control system is composed of an industrial motherboard (SBC86850) along with a Peripheral Interface Controller (PIC) microcontroller (DSPIC30F6010A) that serves as the control core and commands the peripheral hardware. The peripheral hardware itself consists of a motor (3863A024C), a motor driver (MD03), a motor encoder (HEDS-5500 A12), an ultrasonic distance sensor (the PING™ Ultrasonic Range Finder), and a Bluetooth controller. Figure 1 shows a diagram of the omni-directional mobile robot. Figure 2 shows the three parts of the robot tracking system, namely the user interface, a workstation, and the robots.

There are two user modes available: manual and autonomous tracking. In manual mode, we use a Wii controller to control the robot's direction of movement and speed. In the autonomous tracking mode, the operator is provided with an infrared emitter module, as shown in Figure 3, which emits infrared signals to enable the robot to track and follow the operator autonomously.

The camera is connected to the workstation and fitted with an infrared filter so that an infrared image is displayed on the screen. The camera is a Logitech QuickCam Pro 5000, which captures the infrared light source carried by the operator. The industrial motherboard comprises the follower-tracking component of the system and analyzes the infrared data, which is converted to physical coordinates giving the relative position of the user and the robot. These data are then used to control the direction and speed of the robot.

The robot employs two PIC microcontroller boards, one serving as a signal capture board and the other as a motor control board. This decentralized setup improves the processing efficiency of the PIC microcontrollers. The signal capture board receives the commands issued by the workstation and the Wii controller signal. The motor control board handles the motor control functions and receives information from the ultrasonic distance sensors.

3. An Omni-Motion Control System Based Upon Fuzzy Theory

The robot implemented in this research is capable of translational movement along the x and y axes and rotational movement about the z axis. The signals dispatched from the motor encoders are used to calculate the motor revolutions, and the PWM values are then compensated by applying fuzzy logic theory to correct path deviation. The flowchart of the proposed omni-motion control system is shown in Figure 4.

3.1. Building a Kinematic Equation

Figure 5 shows the three motors positioned at a distance R from the origin (i.e., the base center) and placed at equal angles to each other, where the angle between the wheel axes is 120 degrees. The angles θ1, θ2, and θ3 are the angles of the wheels measured in the x–y plane. φ is the rotational angle of the robot; V1, V2, and V3 represent the three wheel speeds, and Vm is the target movement direction. The experimental setup of the robot is shown in Figure 6.

Vm is the intended movement direction of the robot and can be represented as Vm = (Vmx, Vmy) in terms of its x and y components. φ̇ is the robot's rotational angular velocity. The equations of motion for V1, V2, V3 are expressed as:

\[
\begin{bmatrix} V_1 \\ V_2 \\ V_3 \end{bmatrix} =
\begin{bmatrix}
\cos\theta_1 & \sin\theta_1 & R \\
\cos\theta_2 & \sin\theta_2 & R \\
\cos\theta_3 & \sin\theta_3 & R
\end{bmatrix}
\begin{bmatrix} V_{mx} \\ V_{my} \\ \dot{\phi} \end{bmatrix}
\]

A movement rule base can be designed from the kinematic equation.
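To make the mapping concrete, the following minimal Python sketch evaluates this kinematic matrix; the wheel mounting angles and the value of R are illustrative placeholders rather than the actual parameters of the robot.

```python
import math

# Illustrative parameters (assumptions): the wheel mounting angles and base
# radius R are placeholders, not the actual values of the robot in Figure 5.
THETAS = (math.radians(90), math.radians(210), math.radians(330))
R = 0.15  # metres

def wheel_speeds(vmx, vmy, phi_dot):
    """Map the commanded body velocity (vmx, vmy) and angular velocity phi_dot
    to the three wheel speeds V1, V2, V3 via the kinematic matrix above."""
    return [math.cos(t) * vmx + math.sin(t) * vmy + R * phi_dot for t in THETAS]

# Example: lateral translation along +x with no rotation.
v1, v2, v3 = wheel_speeds(0.2, 0.0, 0.0)
```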

3.2. Fuzzy Controller Design from Motor Encoders

Robots in real environments are easily affected by several factors. For example, the output power of the three groups of motors may be uneven, ground friction may vary, or the weight balance may be uneven. All these factors can cause the path of the robot to deviate.

As shown in Figure 7, the output uk is the sum of uk−1 and Δu, where Δu, computed from the motor revolution error and the error difference, represents the amount of PWM adjustment needed. The inputs to the fuzzy controller are e and de, where e is the motor speed error and de is the error variation. In this way, the straight-path deviation is compensated by keeping the motor revolutions uniform.

Figure 8 shows the membership functions of the input variables e and de and the output Δu, where a triangular function is used for each membership function. The fuzzy sets are composed of the following values: negative big (NB), negative medium (NM), negative small (NS), zero (ZO), positive small (PS), positive medium (PM) and positive big (PB).

By developing rules of thumb based upon several measurements, the premises and consequences are deduced and the tuned ranges of the membership functions are determined accordingly. The rule base, consisting of 21 rules for motor encoder compensation, is given in Table 1; an example rule is:

If e is NM and de is ZO, then Δu is PM.

That is, if the motor revolution error e is NM and the revolution error difference de is ZO, then the PWM adjustment Δu is PM. This situation represents an actual motor revolution that is less than the expected value, with the error within the tolerance range; therefore, the motor PWM is increased by the amount PM.

We perform fuzzy inference using a minimum inference engine. The controller output is obtained by center-of-gravity defuzzification. The input and output relationship curve of the fuzzy controller is shown in Figure 9.
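The sketch below illustrates this controller in Python: triangular membership functions, min inference over the 21 rules of Table 1, a singleton approximation of the centre-of-gravity defuzzification, and the incremental update u_k = u_{k−1} + Δu clamped to the 0–127 PWM range. The membership-function centres, widths, and output singleton values are assumptions for illustration only; the actual tuned ranges are those of Figure 8.

```python
E_LABELS  = ["NB", "NM", "NS", "ZO", "PS", "PM", "PB"]
DE_LABELS = ["NB", "ZO", "PB"]

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x, labels, centers, half_width):
    """Membership degree of x in each labelled set (evenly spaced triangles)."""
    return {lab: tri(x, c - half_width, c, c + half_width)
            for lab, c in zip(labels, centers)}

# Rule base of Table 1: one row per de label, columns follow E_LABELS order.
RULES = {"NB": ["PB", "PB", "PM", "ZO", "NM", "NM", "NB"],
         "ZO": ["PB", "PM", "PS", "ZO", "NS", "NM", "NB"],
         "PB": ["PM", "PS", "ZO", "ZO", "ZO", "NS", "NM"]}

# Illustrative universes (assumptions): e and de in rpm, output singletons in PWM counts.
E_CENTERS  = [-30, -20, -10, 0, 10, 20, 30]
DE_CENTERS = [-10, 0, 10]
DU_CENTERS = {"NB": -6, "NM": -4, "NS": -2, "ZO": 0, "PS": 2, "PM": 4, "PB": 6}

def fuzzy_delta_u(e, de):
    """Min inference over the 21 rules, then a singleton (centre-of-gravity)
    defuzzification of the PWM adjustment."""
    mu_e  = fuzzify(e,  E_LABELS,  E_CENTERS,  10.0)
    mu_de = fuzzify(de, DE_LABELS, DE_CENTERS, 10.0)
    num = den = 0.0
    for de_lab, row in RULES.items():
        for e_lab, du_lab in zip(E_LABELS, row):
            w = min(mu_e[e_lab], mu_de[de_lab])   # rule firing strength (min inference)
            num += w * DU_CENTERS[du_lab]
            den += w
    return num / den if den else 0.0

def pwm_step(u_prev, e_prev, target_rpm, measured_rpm):
    """One control step: u_k = u_{k-1} + delta_u, clamped to the 0-127 PWM range.
    e is negative when the motor runs slower than commanded, matching the rule example."""
    e  = measured_rpm - target_rpm
    de = e - e_prev
    u  = max(0.0, min(127.0, u_prev + fuzzy_delta_u(e, de)))
    return u, e
```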

The motor PWM range is divided into 128 equal levels, i.e., 0–127. As shown in Figure 10(a), the initial PWM values driving the three motors are set to 80. A straight-path deviation is observed because of the non-uniform feedback signals. Figure 10(b) shows the activation of the fuzzy controller at 15 ms to equalize the three motor revolutions. It should be noted that the three motor sets settle at different PWM levels due to variation in the load carried by the robot, motor efficiency, and other factors.

4. An Extensible Obstacle-Avoidance System Design

Extension theory is used to describe the inference process of obstacle avoidance, which allows us to transform a complex real-world problem into one expressed through matter-elements. As shown in Figure 11 and in Equation (3), nine ultrasonic distance sensors are installed and grouped into the left, front-left, front-right, and right sides. An appropriate movement path can be selected by applying extension theory after converting the analogue distance signals into digital values:

\[
\begin{cases}
\text{Left side distance} = \{\text{Ultrasonic 1}, \text{Ultrasonic 2}\} \\
\text{Left front distance} = \{\text{Ultrasonic 3}, \text{Ultrasonic 4}, \text{Ultrasonic 5}\} \\
\text{Right front distance} = \{\text{Ultrasonic 6}, \text{Ultrasonic 7}\} \\
\text{Right side distance} = \{\text{Ultrasonic 8}, \text{Ultrasonic 9}\}
\end{cases}
\tag{3}
\]
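As a sketch of how the nine readings collapse into the four aspect distances of Equation (3), the Python fragment below takes the minimum reading of each group; the choice of the minimum (rather than, say, an average) is an assumption, since the paper only specifies which sensors belong to which aspect.

```python
def aspect_distances(u):
    """Collapse the nine ultrasonic readings u[1..9] (cm, 1-indexed) into the four
    aspect distances of Equation (3), ordered (left, left front, right front, right).
    Taking the minimum of each group is an assumption."""
    return (min(u[1], u[2]),          # left side
            min(u[3], u[4], u[5]),    # left front
            min(u[6], u[7]),          # right front
            min(u[8], u[9]))          # right side
```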

4.1. Matter-Element Extension Set

In this section we quantify the characteristics of the extension set mathematically. A matter-element is described by three basic elements: a name (Name, N), a characteristic (Characteristic, C), and a characteristic value (Value, V).

In this work, the obstacle-avoidance system is modeled as multi-dimensional obstacle-avoidance matter-elements, where the number of extension matter-elements corresponds to the number of obstacle-avoidance modes. The ultrasonic distance sensors provide the four characteristics. The various motion approaches, based on the principle of the multi-dimensional extension matter-element model, are expressed in Equation (4) and are associated with the corresponding motion strategies:

\[
\begin{cases}
R_{1,i_1} = [\,N_{1,i_1},\, C_{1,i_1,j},\, \langle a_{1,i_1,j},\, b_{1,i_1,j}\rangle\,], & i_1 = 1, 2, \ldots, 7,\; j = 1, 2, 3, 4 \\
R_{2,i_2} = [\,N_{2,i_2},\, C_{2,i_2,j},\, \langle a_{2,i_2,j},\, b_{2,i_2,j}\rangle\,], & i_2 = 1, 2, \ldots, 6,\; j = 1, 2, 3, 4 \\
R_{3,i_3} = [\,N_{3,i_3},\, C_{3,i_3,j},\, \langle a_{3,i_3,j},\, b_{3,i_3,j}\rangle\,], & i_3 = 1, 2, \ldots, 6,\; j = 1, 2, 3, 4
\end{cases}
\tag{4}
\]

R1,i1, R2,i2, and R3,i3 represent the various matter-element models, N1,i1, N2,i2, and N3,i3 are the names of the various obstacle modes, and C1,i1,j, C2,i2,j, and C3,i3,j are the distances to the obstacles in each aspect. The terms <a1,i1,j, b1,i1,j>, <a2,i2,j, b2,i2,j>, and <a3,i3,j, b3,i3,j> are the scopes of the classical domains defined for the various aspects. This work comprises three sets of models representing the various motion approaches, and strategies are built to avoid obstacles in the various aspect directions. Up to seven motion strategies are specified for forward motion, and six each for left-forward and right-forward motion, as shown in Tables 2–4.
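The forward-motion matter-elements of Table 2 can be written down directly as classical domains, one interval per aspect distance. The Python sketch below does this using the aspect order (left, left front, right front, right) inferred from the tables; the dictionary keys are illustrative names, not identifiers from the paper.

```python
# Classical domains (cm) from Table 2, one interval per aspect distance, in the
# order (left, left front, right front, right).
# NEAR = <0, 15> marks an aspect blocked by an obstacle; FAR = <15, 200> marks it clear.
NEAR, FAR = (0.0, 15.0), (15.0, 200.0)

# Forward-motion matter-element models R11..R17 (keys are illustrative names).
FORWARD_MODES = {
    "move_forward":       (FAR,  FAR,  FAR,  FAR),   # 1: no obstacle
    "move_right":         (FAR,  NEAR, FAR,  FAR),   # 2: obstacle left forward
    "move_left":          (FAR,  FAR,  NEAR, FAR),   # 3: obstacle right forward
    "move_left_or_right": (FAR,  NEAR, NEAR, FAR),   # 4: both forward aspects blocked
    "move_left_b":        (FAR,  NEAR, NEAR, NEAR),  # 5: forward aspects and right blocked
    "move_right_b":       (NEAR, NEAR, NEAR, FAR),   # 6: forward aspects and left blocked
    "move_backward":      (NEAR, NEAR, NEAR, NEAR),  # 7: blocked on all aspects
}
```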

4.2. Correlation Function

In Figure 12 a classical domain and a neighborhood domain are defined on the intervals X0 = <a, b> and X = <c, d>, respectively. The neighborhood domain is defined as X = <0, 200>, with X0 ⊂ X and no common end points. The primary correlation function can be defined as:

\[
K(x) = \frac{\rho(x, X_0)}{D(x, X_0, X)}
\]
where D(x, X0, X) is a point position value, and the relationship between a point and two different ranges is defined as:
\[
D(x, X_0, X) =
\begin{cases}
\rho(x, X) - \rho(x, X_0), & x \notin X_0 \\
-1, & x \in X_0
\end{cases}
\]

The relationship between x and X0 is defined as:

\[
\rho(x, X_0) = \left| x - \frac{a+b}{2} \right| - \frac{1}{2}(b - a)
\]
where the point x is related to the range X as:
\[
\rho(x, X) = \left| x - \frac{c+d}{2} \right| - \frac{1}{2}(d - c)
\]
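A minimal Python implementation of these formulas is sketched below; it assumes the neighborhood domain <0, 200> used throughout this work.

```python
def rho(x, interval):
    """rho(x, <a, b>): extension distance between the point x and the interval <a, b>."""
    a, b = interval
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def correlation(x, classical, neighborhood=(0.0, 200.0)):
    """Correlation degree K(x) of a measured distance x with respect to a classical
    domain <a, b> contained in the neighborhood domain <0, 200>."""
    r0 = rho(x, classical)
    if r0 <= 0:                                   # x lies inside the classical domain
        return r0 / -1.0
    return r0 / (rho(x, neighborhood) - r0)       # x lies outside the classical domain

# Example: a reading of 10 cm fits the "near" domain <0, 15> better than "far" <15, 200>.
print(correlation(10.0, (0.0, 15.0)), correlation(10.0, (15.0, 200.0)))
```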

4.3. Evaluation Method and the Best Strategy of Obstacle Avoidance

Figure 13 describes the procedure for determining the optimal evaluation strategy in order to find the best obstacle-avoidance strategy. The optimal obstacle-avoidance strategy corresponds to the maximum correlation degree, obtained by means of the following procedure:

  • We first define the evaluation conditions. The correlation sets K1,i1, K2,i2, K3,i3 (i.e., the obstacle-avoidance strategies), formed by the three mobile modes and the four sets of distance sensors mounted at various angles, are expressed as:

    \[
    \begin{cases}
    K_{1,i_1} = \{K_{1,i_1,j}\}, & i_1 = 1, 2, \ldots, 7,\; j = 1, 2, 3, 4 \\
    K_{2,i_2} = \{K_{2,i_2,j}\}, & i_2 = 1, 2, \ldots, 6,\; j = 1, 2, 3, 4 \\
    K_{3,i_3} = \{K_{3,i_3,j}\}, & i_3 = 1, 2, \ldots, 6,\; j = 1, 2, 3, 4
    \end{cases}
    \]

  • The weightings W1,i1,j, W2,i2,j, W3,i3,j of the four sets of distance sensors used to detect obstacles are each assigned the same value of 1/4. The weighted correlation over the sensed distances is expressed as:

    \[
    \begin{cases}
    \bar{K}_{1,i_1} = \sum_{j=1}^{4} W_{1,i_1,j}\, K_{1,i_1,j}, & i_1 = 1, 2, \ldots, 7 \\
    \bar{K}_{2,i_2} = \sum_{j=1}^{4} W_{2,i_2,j}\, K_{2,i_2,j}, & i_2 = 1, 2, \ldots, 6 \\
    \bar{K}_{3,i_3} = \sum_{j=1}^{4} W_{3,i_3,j}\, K_{3,i_3,j}, & i_3 = 1, 2, \ldots, 6
    \end{cases}
    \]

  • The maximum degree of correlation in each mobile mode, corresponding to the optimal obstacle-avoidance strategy for that mode, is found by comparing the evaluation degrees among K̄1,i1, K̄2,i2 and K̄3,i3, as:

    \[
    \begin{cases}
    \bar{K}_{1\max} = \max_{i_1 = 1, 2, \ldots, 7} \bar{K}_{1,i_1} \\
    \bar{K}_{2\max} = \max_{i_2 = 1, 2, \ldots, 6} \bar{K}_{2,i_2} \\
    \bar{K}_{3\max} = \max_{i_3 = 1, 2, \ldots, 6} \bar{K}_{3,i_3}
    \end{cases}
    \]

  • The obstacle-avoidance mode of the robot is chosen according to the degree of correlation within the set of multi-dimensional obstacle-avoidance matter-element modes, which is then translated into the optimal strategy for avoiding obstacles.

In Equation (12) the directions representing the mobile modes are given. If direction equals 1, it denotes forward motion; 2 represents forward-left motion; and any other value represents forward-right motion. The obstacle direction representing the optimal obstacle-avoidance strategy is sent to the robot, where K1_max represents the optimal strategy for forward motion, K2_max for forward-left motion, and K3_max for forward-right motion:

if (direction == 1)      obstacle_direction = K1_max;
else if (direction == 2) obstacle_direction = K2_max;
else                     obstacle_direction = K3_max;    (12)
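Putting the pieces together, the sketch below (reusing `correlation`, `aspect_distances`, and `FORWARD_MODES` from the earlier fragments) computes the weighted correlation degree of every matter-element in a mode table, picks the maximum, and dispatches on the direction code as in Equation (12). The table names `LEFT_FORWARD_MODES` and `RIGHT_FORWARD_MODES` are hypothetical and would be built from Tables 3 and 4 in the same way as `FORWARD_MODES`.

```python
def best_strategy(aspect_dist, modes, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted correlation degree of every matter-element in one mobile mode;
    returns the name of the strategy with the maximum degree."""
    scores = {
        name: sum(w * correlation(x, dom)
                  for w, x, dom in zip(weights, aspect_dist, domains))
        for name, domains in modes.items()
    }
    return max(scores, key=scores.get)

def obstacle_direction(direction, readings):
    """Equation (12): choose the rule table matching the current motion direction,
    then evaluate it on the four aspect distances."""
    aspect_dist = aspect_distances(readings)
    if direction == 1:
        return best_strategy(aspect_dist, FORWARD_MODES)
    elif direction == 2:
        return best_strategy(aspect_dist, LEFT_FORWARD_MODES)   # hypothetical, from Table 3
    return best_strategy(aspect_dist, RIGHT_FORWARD_MODES)      # hypothetical, from Table 4

# Example: moving forward with an obstacle detected at the left front.
readings = {1: 120, 2: 110, 3: 12, 4: 14, 5: 90, 6: 130, 7: 140, 8: 95, 9: 100}
print(obstacle_direction(1, readings))    # expected to pick "move_right"
```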

5. Experimental Results and Analysis

The interface design of the autonomous mobile robot controller is shown in Figure 14. A data link to the robot over an RS232 serial connection carries the control signals sent from the workstation to the microcontroller; this link is used mainly in manual mode. An operational test of obstacle avoidance by the omni-directional robot was conducted, as shown in Figures 15 and 16.

The omni-directional mobile robot exhibited high mobility in a complex environment. We also tested the obstacle-avoidance capability. To correct path deviation, the aspect angle of movement and the target speed are transmitted to the three motor sets, which are driven by the fuzzy logic controller providing motor encoder compensation. The robot was able to avoid all obstacles, demonstrating the effectiveness of the proposed system. Thus, the feasibility and effectiveness of the robot were validated.

6. Conclusions

In this study, omni-directional wheels were used to develop a robot capable of omni-directional movement. The omni-directional wheel design improves mobility by allowing lateral movement without rotation. The robot was tested in various mobile modes in a complex environment and was able to compensate for path deviations through motor encoder compensation based on fuzzy logic theory. The robot was also able to avoid all obstacles in its path autonomously by employing ultrasonic distance sensors with the obstacle-avoidance algorithm. The aim of implementing omni-directional motion control for a three-wheeled autonomous robot was achieved. The robot offers high mobility, motion path correction, and obstacle-avoidance capability, making the system suitable for libraries, supermarkets, airports, hospitals and similar scenarios.

Acknowledgments

This work was funded by the National Science Council (grant number NSC-100-2221-E-167-004).

References

  1. Hung, N.; Kim, D.H.; Kim, H.K.; Kim, S.B. Tracking Controller Design of Omnidirectional Mobile Manipulator System. Proceedings of the 2009 ICROS-SICE International Joint Conference, Fukuoka, Japan, 18–21 August 2009; pp. 539–544.
  2. Zhao, D.; Deng, X.; Yi, J. Motion and internal force control for omnidirectional wheeled mobile robots. IEEE/ASME Trans. Mechatron. 2009, 14, 382–387. [Google Scholar]
  3. Ye, C.; Ma, S. Development of an Omnidirectional Mobile Platform. Proceedings of the IEEE 2009 International Conference on Mechatronics and Automation (ICMA 2009), Changchun, China, 9–12 August 2009; pp. 1111–1115.
  4. Huang, H.C.; Tsai, C.C. FPGA Implementation of an embedded robust adaptive controller for autonomous omnidirectional mobile platform. IEEE Trans. Ind. Electron. 2009, 56, 1604–1616. [Google Scholar]
  5. Meng, J.E.; Chang, D. Online tuning of fuzzy inference systems using dynamic fuzzy Q-learning. IEEE Trans. Syst. Man Cybern. B Cybern. 2004, 34, 1478–1489. [Google Scholar]
  6. Chiu, C.S.; Lian, K.Y. Hybrid fuzzy model-based control of nonholonomic systems: A unified viewpoint. IEEE Trans. Fuzzy Syst. 2008, 16, 85–96. [Google Scholar]
  7. Chi, K.H.; Lee, M.R. Obstacle Avoidance in Mobile Robot Using Neural Network. Proceedings of the International Conference on Consumer Electronics, Communications and Networks (CECNet), Xianning, China, 11–13 March 2011; pp. 5082–5085.
  8. You, B.; Qiu, J.; Li, D. A Novel Obstacle Avoidance Method for Low-Cost Household Mobile Robot. Proceedings of the 2008 IEEE International Conference on Automation and Logistics (ICAL), Qingdao, China, 1–3 September 2008; pp. 111–116.
  9. Green, W.E.; Oh, P.Y. Optic-flow-based collision avoidance. IEEE Robot. Autom. Mag. 2008, 15, 96–103. [Google Scholar]
  10. Wei, Z.; Lee, D.J.; Nelson, B.E.; Archibald, J.K. Hardware-friendly vision algorithms for embedded obstacle detection applications. IEEE Trans. Circ. Syst. Video Tech. 2010, 20, 1577–1589. [Google Scholar]
  11. Kim, P.G.; Park, C.G.; Jong, Y.H.; Yun, J.H.; Mo, E.J.; Kim, C.S.; Jie, M.S.; Hwang, S.C.; Lee, K.W. Obstacle avoidance of a mobile robot using vision system and ultrasonic sensor. Lect. Notes Comput. Sci. 2007, 4681, 545–553. [Google Scholar]
  12. Juang, C.F.; Hsu, C.H. Reinforcement ant optimized fuzzy controller for mobile robot wall following control. IEEE Trans. Ind. Electron. 2009, 56, 3931–3940. [Google Scholar]
  13. Chen, C.Y.; Shih, B.Y.; Chou, W.C.; Li, Y.J.; Chen, Y.H. Obstacle avoidance design for a humanoid intelligent robot with ultrasonic sensors. J. Vib. Control 2011, 17, 1798–1804. [Google Scholar]
  14. Wang, M.-H. Application of extension theory to vibration fault diagnosis of generator sets. IEE Proc.-Gener. Transm. Distrib. 2004, 151, 503–508. [Google Scholar]
  15. David, H.; Humberto, M. Fuzzy Mobile-robot positioning in intelligent spaces using wireless sensor networks. Sensors 2011, 11, 10820–10839. [Google Scholar]
Figure 1. Omni-directional mobile robot.
Figure 2. The robot system hardware link.
Figure 3. The IR source carried by the user.
Figure 4. Proposed system flowchart.
Figure 5. Configuration of the omni-directional mobile robot.
Figure 6. The experimental setup of the omni-directional mobile robot.
Figure 7. The control system block diagram.
Figure 8. Input and output membership functions.
Figure 9. The fuzzy input and output relationship graph.
Figure 10. Experimental results of the motor revolution using fuzzy theory. (a) PWM tuning curves and (b) encoder feedback curves for the three motor sets.
Figure 11. Ultrasonic distance sensors shown on the robot periphery.
Figure 12. An extension correlation function.
Figure 13. The procedure for determining an optimal evaluation strategy.
Figure 14. The data link control interface.
Figure 15. A set of omni-directional mobile experiments. (a) Forward, (b) turn right, (c) turn left, and (d) rotate back.
Figure 16. A set of obstacle-avoidance experiments. (a–d) The robot passing the first obstacle and (e–h) the robot passing the second obstacle.
Table 1. A motor encoder compensation rule table.

de \ e | NB | NM | NS | ZO | PS | PM | PB
NB     | PB | PB | PM | ZO | NM | NM | NB
ZO     | PB | PM | PS | ZO | NS | NM | NB
PB     | PM | PS | ZO | ZO | ZO | NS | NM
Table 2. Rule table for forward motion in the extension-element model.

Number | Obstacle Location | Extension Element Model | Approach
1 | No obstacle | R11 = [N11; C111, <15, 200>; C112, <15, 200>; C113, <15, 200>; C114, <15, 200>] | Move forward
2 | Left forward | R12 = [N12; C121, <15, 200>; C122, <0, 15>; C123, <15, 200>; C124, <15, 200>] | Move right
3 | Right forward | R13 = [N13; C131, <15, 200>; C132, <15, 200>; C133, <0, 15>; C134, <15, 200>] | Move left
4 | Left forward, Right forward | R14 = [N14; C141, <15, 200>; C142, <0, 15>; C143, <0, 15>; C144, <15, 200>] | Move left or right
5 | Left forward, Right forward, Right | R15 = [N15; C151, <15, 200>; C152, <0, 15>; C153, <0, 15>; C154, <0, 15>] | Move left
6 | Left forward, Right forward, Left | R16 = [N16; C161, <0, 15>; C162, <0, 15>; C163, <0, 15>; C164, <15, 200>] | Move right
7 | All | R17 = [N17; C171, <0, 15>; C172, <0, 15>; C173, <0, 15>; C174, <0, 15>] | Move backward
Table 3. Rule table for left forward motion in the extension-element model.

Number | Obstacle Location | Extension Element Model | Approach
1 | No obstacle | R21 = [N21; C211, <15, 200>; C212, <15, 200>; C213, <15, 200>; C214, <15, 200>] | Move left forward
2 | Left | R22 = [N22; C221, <0, 15>; C222, <15, 200>; C223, <15, 200>; C224, <15, 200>] | Move forward
3 | Left, Left forward | R23 = [N23; C231, <0, 15>; C232, <0, 15>; C233, <15, 200>; C234, <15, 200>] | Move right
4 | Left, Right forward | R24 = [N24; C241, <0, 15>; C242, <15, 200>; C243, <0, 15>; C244, <15, 200>] | Move right
5 | Left, Left forward, Right forward | R25 = [N25; C251, <0, 15>; C252, <0, 15>; C253, <0, 15>; C254, <15, 200>] | Move right
6 | Upper left corner | R26 = [N26; C261, <15, 200>; C262, <0, 15>; C263, <15, 200>; C264, <15, 200>] | Move right forward
Table 4. Rule table for right forward motion in the extension-element model.

Number | Obstacle Location | Extension Element Model | Approach
1 | No obstacle | R31 = [N31; C311, <15, 200>; C312, <15, 200>; C313, <15, 200>; C314, <15, 200>] | Move right forward
2 | Right | R32 = [N32; C321, <15, 200>; C322, <15, 200>; C323, <15, 200>; C324, <0, 15>] | Move forward
3 | Left forward, Right | R33 = [N33; C331, <15, 200>; C332, <0, 15>; C333, <15, 200>; C334, <0, 15>] | Move left
4 | Right forward, Right | R34 = [N34; C341, <15, 200>; C342, <15, 200>; C343, <0, 15>; C344, <0, 15>] | Move left
5 | Left forward, Right forward, Right | R35 = [N35; C351, <15, 200>; C352, <0, 15>; C353, <0, 15>; C354, <0, 15>] | Move left
6 | Upper right corner | R36 = [N36; C361, <15, 200>; C362, <15, 200>; C363, <0, 15>; C364, <15, 200>] | Move left forward
