Cloud Incubator Car: A Reliable Platform for Autonomous Driving
Abstract
1. Introduction
Platforms for Autonomous Driving Development
2. Cloud Incubator Car Platform
2.1. Modification of the Steering System
2.2. Modification of the Braking System
2.3. Throttle and Gearbox Modifications
2.4. Sensor Holders
2.5. Electrical and Communications Infrastructure
- (1) A serial Controller Area Network (CAN bus), because of its suitability for high-speed applications using short messages, its robustness, and its reliability [17].
- (2)
- (3) A point-to-point set of communications, such as RS232 and USB.
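As an illustration of how an actuator set-point might travel over the CAN bus mentioned in item (1), the following is a minimal sketch using the python-can library. The channel name, arbitration ID, and payload layout are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch: publish a steering set-point on a CAN bus with python-can.
# The channel name, arbitration ID and payload encoding are illustrative assumptions.
import struct
import can

def send_steering_setpoint(bus: can.BusABC, angle_deg: float) -> None:
    """Pack a steering angle (degrees) as a float32 and send it as a short CAN frame."""
    payload = struct.pack("<f", angle_deg)          # 4-byte little-endian float
    msg = can.Message(arbitration_id=0x120,         # hypothetical ID for the steering ECU
                      data=payload,
                      is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # SocketCAN interface name is an assumption (e.g., a USB-CAN adapter exposed as can0).
    with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
        send_steering_setpoint(bus, 5.0)
```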
3. High-Level CICar Architecture
3.1. Control System
3.2. Perception System
3.3. Decision-Making System
Path-Planning System (PPS)
Global Planner
- (1) It must be able to create global routes using digital map sources.
- (2) It should allow an easy introduction of traffic restrictions or unplanned events.
- (3) It should be executed in an agile manner and produce results in an acceptable time.
- (1) It has an easily accessible Application Programming Interface (API) and is multiplatform.
- (2) The maps can be downloaded with different layers of information.
- (3) It allows georeferencing of the map points.
- (4) It is constantly updated and is available for free.
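The four properties above describe the digital map source required by the global planner; the extracted text does not name the service here, but they match an open service such as OpenStreetMap. Purely as an illustration (not the paper's implementation), the sketch below pulls the georeferenced road network inside a bounding box from the public Overpass API; the bounding box coordinates and the query are assumptions.

```python
# Illustrative sketch: download georeferenced road data for a bounding box
# from the OpenStreetMap Overpass API. The coordinates and query are assumptions.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

def fetch_roads(south: float, west: float, north: float, east: float) -> dict:
    """Return OSM ways tagged as highways inside the bounding box, as JSON."""
    query = f"""
    [out:json][timeout:25];
    way["highway"]({south},{west},{north},{east});
    out geom;
    """
    response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Hypothetical bounding box around a campus area.
    roads = fetch_roads(37.600, -0.990, 37.610, -0.975)
    print(f"Downloaded {len(roads['elements'])} road segments")
```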
- A search direction is chosen (‘N’-North, ‘S’-South, ‘E’-East, ‘W’-West) according to the angle between the positions pg and p0. Equation (1) shows how the algorithm obtains the search direction as a function of this angle.
- A clear path from p0 to the goal pg is sought. A route is considered clear if all the points on the line that joins p0 and pg have a value equal to ‘1’.
- 2.1. If the path is not free:
- The SCP algorithm searches for the first pair of cross points [CPi; CPi+1] according to the search direction set as shown in Equation (1). To achieve this, the algorithm goes through the map, starting from p0 according to an angle θ1, until it finds a value equal to ‘0’. The point found is labelled as the first cross point CPi. Starting from CPi, and according to the search angle θ2, the algorithm searches for the next point with a value of ‘0’. This new point is labelled as the second cross point CPi+1. The angles θ1, θ2 are calculated according to the search direction and the table of search angles given below. During the search for the cross points [CPi; CPi+1], the candidate points pi(xi, yi) are calculated by alternating the values of θ = θ1 and θ = θ2, based on Equation (2). An illustrative code sketch of this search loop is given after this list.
- The Euclidean distance between two consecutive cross points CPi and CPi+1 is calculated.
- If the distance is 0, the search direction is changed by evaluating the position of pg with respect to the last CPi+1 that was calculated, p0 is set to p0 = CPi+1, and the algorithm goes to step 2.
- If the distance is not equal to 0, p0 is set to p0 = CPi+1, and the algorithm goes to step 2.
- 2.2. If there is a clear route, the algorithm finishes, and the path is composed of the points [p0, pm1, …, pmn−1; pg], with pm1, …, pmn−1 being the midpoints obtained from the pairs of points of the cross-point vector [p0, CP1, CP2, …, CPn; pg].
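The following Python sketch illustrates the cross-point search loop described in the steps above, under stated assumptions: the map is a binary grid where ‘1’ means free space, the direction-to-angle mapping follows the table of search angles reported later in this document, and Equations (1) and (2), which are not reproduced in this excerpt, are approximated by a simple quadrant test and unit-length steps along the current search angle. It is a minimal illustration, not the authors' implementation.

```python
# Illustrative sketch of the cross-point (SCP) search; not the authors' code.
# Assumptions: 2D numpy grid of 1s (free) and 0s (occupied); (theta1, theta2) taken
# from the direction/angle table; Equations (1) and (2) approximated as noted above.
import math
import numpy as np

def angles_for(direction: str, theta0: float) -> tuple[float, float]:
    """(theta1, theta2) per search direction, following the table of search angles."""
    table = {"N": (-180 + theta0, -theta0),
             "S": (theta0, 180 - theta0),
             "E": (theta0, -theta0),
             "W": (-180 + theta0, 180 - theta0)}
    return table[direction]

def choose_direction(p0, pg) -> str:
    """Quadrant test standing in for Equation (1): pick N/S/E/W from the vector p0 -> pg."""
    dx, dy = pg[0] - p0[0], pg[1] - p0[1]
    if abs(dx) >= abs(dy):
        return "E" if dx >= 0 else "W"
    return "N" if dy >= 0 else "S"

def is_clear(grid: np.ndarray, p0, pg) -> bool:
    """A route is clear if every sampled cell on the segment p0-pg has value 1 (free)."""
    n = int(max(abs(pg[0] - p0[0]), abs(pg[1] - p0[1]))) + 1
    for t in np.linspace(0.0, 1.0, n + 1):
        x = int(round(p0[0] + t * (pg[0] - p0[0])))
        y = int(round(p0[1] + t * (pg[1] - p0[1])))
        if grid[y, x] == 0:
            return False
    return True

def next_cross_point(grid: np.ndarray, start, theta_deg: float):
    """Step from `start` along theta_deg (unit steps standing in for Equation (2))
    until an occupied cell (value 0) is found; return it, or None if the map is left."""
    dx, dy = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    x, y = float(start[0]) + dx, float(start[1]) + dy   # step off the starting cell
    h, w = grid.shape
    while 0 <= int(round(x)) < w and 0 <= int(round(y)) < h:
        if grid[int(round(y)), int(round(x))] == 0:
            return (int(round(x)), int(round(y)))
        x, y = x + dx, y + dy
    return None

def scp_path(grid: np.ndarray, p0, pg, theta0: float):
    """Collect cross points until the segment to pg is clear, then return the start point,
    the midpoints of consecutive cross-point pairs, and the goal (steps 2, 2.1, 2.2)."""
    cps = [tuple(p0)]
    for _ in range(1000):                       # guard against non-terminating searches
        if is_clear(grid, p0, pg):
            break
        th1, th2 = angles_for(choose_direction(p0, pg), theta0)
        cp_i = next_cross_point(grid, p0, th1)
        cp_j = next_cross_point(grid, cp_i, th2) if cp_i else None
        if cp_j is None:
            break
        cps.extend([cp_i, cp_j])
        p0 = cp_j                               # continue the search from the last cross point
    mids = [((a[0] + b[0]) / 2, (a[1] + b[1]) / 2) for a, b in zip(cps[:-1], cps[1:])]
    return [cps[0], *mids, tuple(pg)]
```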
Local Planner
- Distance filter. All XYZ points beyond a maximum distance (dmax) will be rejected.
- Height filter. All points with a Z value higher than a maximum value (Zmax) will be discarded.
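As a minimal illustration of these two filters (not the authors' implementation), the following numpy sketch drops points farther than dmax from the sensor and points higher than Zmax; the threshold values are placeholders, not the ones used on the CICar platform.

```python
# Minimal sketch of the distance and height filters applied to an N x 3 point cloud.
# Threshold values are placeholders.
import numpy as np

def filter_cloud(points: np.ndarray, dmax: float = 30.0, zmax: float = 2.5) -> np.ndarray:
    """Keep points with Euclidean distance <= dmax from the sensor and height Z <= zmax."""
    dist = np.linalg.norm(points, axis=1)          # distance of each XYZ point from the origin
    keep = (dist <= dmax) & (points[:, 2] <= zmax)
    return points[keep]

if __name__ == "__main__":
    cloud = np.random.uniform(-50, 50, size=(1000, 3))   # synthetic XYZ points
    filtered = filter_cloud(cloud)
    print(f"{len(filtered)} of {len(cloud)} points kept")
```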
Algorithm 1. Trajectory generator algorithm.

  Point P0(x0, y0), P1(x1, y1), P2(x2, y2), WPi, WP(i+1)
  Point KPi[]                      % array: set of keypoints
  Trajectories (set of points): B(t)
  Trajectories: Ti[]               % array: set of trajectories
  INTEGER: cnt = 0, nCurves        % for the WHILE iteration
  REAL: distance, safety_dist      % safety distance
  % Initial assignments:
  INPUT nIter, safety_dist, KPi[]
  P0 = WPi, P2 = WP(i+1)
  P1 = ((WPi.x + WP(i+1).x) / 2, (WPi.y + WP(i+1).y) / 2)
  P1.x = P1.x - nCurves * 3.0 / 2
  % Process:
  BEGIN
  1. WHILE cnt < nCurves
    1.1. IF P1 is in ROI
      1.1.1. B(t) = fB(P0, P1, P2)
      1.1.2. P1 = fP1(KP)
      1.1.3. cnt = cnt + 1
      1.1.4. IF distance >= safety_dist AND all B(t) points are in ROI
        1.1.4.1. Ti[] = B(t) is accepted as candidate trajectory
      END IF   % (1.1.4)
    END IF     % (1.1)
    1.2. P1.x = P1.x + 3.0
    1.3. nCurves = nCurves + 1
  END WHILE    % (1)
  END
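Read literally, Algorithm 1 sweeps the middle control point P1 sideways and keeps every curve B(t) that stays inside the region of interest (ROI) and respects the safety distance. The sketch below is a minimal Python rendering of that idea under stated assumptions: fB is taken to be a quadratic Bézier curve (matching the three control points in the listing), the ROI is an axis-aligned rectangle, the safety distance is measured to a set of keypoints (obstacle points), and a fixed number of candidate curves replaces the nIter/nCurves bookkeeping. None of these details are taken verbatim from the paper.

```python
# Minimal sketch of the trajectory-generator idea in Algorithm 1 (not the authors' code).
# Assumptions: fB is a quadratic Bezier curve, the ROI is an axis-aligned rectangle,
# and the safety distance is checked against a list of keypoints (obstacle points).
import numpy as np

def bezier(p0, p1, p2, n: int = 50) -> np.ndarray:
    """Sample a quadratic Bezier curve B(t) defined by control points p0, p1, p2."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    p0, p1, p2 = map(np.asarray, (p0, p1, p2))
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def in_roi(points: np.ndarray, roi) -> bool:
    """True if every sampled point lies inside the rectangular ROI (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = roi
    return bool(np.all((points[:, 0] >= xmin) & (points[:, 0] <= xmax) &
                       (points[:, 1] >= ymin) & (points[:, 1] <= ymax)))

def min_distance(points: np.ndarray, keypoints: np.ndarray) -> float:
    """Smallest distance between any curve sample and any keypoint."""
    if len(keypoints) == 0:
        return np.inf
    diffs = points[:, None, :] - keypoints[None, :, :]
    return float(np.min(np.linalg.norm(diffs, axis=2)))

def candidate_trajectories(wp_i, wp_next, keypoints, roi,
                           safety_dist: float = 1.5, n_curves: int = 5, step: float = 3.0):
    """Sweep the middle control point sideways and keep curves that stay in the ROI
    and keep at least safety_dist to every keypoint (cf. steps 1.1-1.3 of Algorithm 1)."""
    wp_i, wp_next = np.asarray(wp_i, float), np.asarray(wp_next, float)
    keypoints = np.asarray(keypoints, float)
    p1 = (wp_i + wp_next) / 2.0
    p1[0] -= n_curves * step / 2.0          # start offset, as in the initial assignments
    trajectories = []
    for _ in range(n_curves):
        curve = bezier(wp_i, p1, wp_next)
        if in_roi(curve, roi) and min_distance(curve, keypoints) >= safety_dist:
            trajectories.append(curve)      # accepted as a candidate trajectory
        p1[0] += step                       # shift P1 for the next candidate (step 1.2)
    return trajectories

if __name__ == "__main__":
    roi = (-10.0, -1.0, 10.0, 12.0)                     # hypothetical region of interest
    obstacles = np.array([[1.0, 5.0]])
    cands = candidate_trajectories((0.0, 0.0), (0.0, 10.0), obstacles, roi)
    print(f"{len(cands)} candidate trajectories accepted")
```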
4. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Kala, R. On-Road Intelligent Vehicles: Motion Planning for Intelligent Transportation Systems; Elsevier Science: Amsterdam, The Netherlands, 2016.
- Flemming, B.; Gill, V.; Godsmark, P.; Kirk, B. Automated Vehicles: The Coming of the Next Disruptive Technology; The Conference Board of Canada: Ottawa, ON, Canada, 2015.
- Corben, B.; Logan, D.; Fanciulli, L.; Farley, R.; Cameron, I. Strengthening road safety strategy development “Towards Zero” 2008–2020 – Western Australia’s experience scientific research on road safety management SWOV workshop 16 and 17 November 2009. Saf. Sci. 2010, 48, 1085–1097.
- Engelbrecht, J.; Booysen, M.J.; Bruwer, F.J.; van Rooyen, G.-J. Survey of smartphone-based sensing in vehicles for intelligent transportation system applications. IET Intell. Transp. Syst. 2015, 9, 924–935.
- Thrun, S. Toward robotic cars. Commun. ACM 2010, 53, 99.
- Ibañez-Guzmán, J.; Laugier, C.; Yoder, J.-D.; Thrun, S. Autonomous Driving: Context and State-of-the-Art. In Handbook of Intelligent Vehicles; Springer: London, UK, 2012; pp. 1271–1310.
- Urmson, C.; Anhalt, J.; Bagnell, D.; Baker, C.; Bittner, R.; Clark, M.N.; Dolan, J.; Duggins, D.; Galatali, T.; Geyer, C.; et al. Autonomous Driving in Urban Environments: Boss and the Urban Challenge. J. Field Robot. 2009, 25, 1–59.
- NHTSA Federal Automated Vehicles Policy—September 2016 | US Department of Transportation. Available online: https://www.transportation.gov/AV/federal-automated-vehicles-policy-september-2016 (accessed on 29 November 2017).
- SAE J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems - SAE International. Available online: http://standards.sae.org/j3016_201401/ (accessed on 29 November 2017).
- Wei, J.; Snider, J.M.; Kim, J.; Dolan, J.M.; Rajkumar, R.; Litkouhi, B. Towards a viable autonomous driving research platform. In Proceedings of the IEEE Intelligent Vehicles Symposium, Gold Coast, QLD, Australia, 2013; pp. 763–770.
- Milanés, V.; Llorca, D.F.; Vinagre, B.M.; González, C.; Sotelo, M.A. Clavileño: Evolution of an autonomous car. In Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC), Funchal, Portugal, 19–22 September 2010; pp. 1129–1134.
- Grisleri, P.; Fedriga, I. The BRAiVE Autonomous Ground Vehicle Platform. IFAC Proc. Vol. 2010, 43, 497–502.
- Levinson, J.; Askeland, J.; Becker, J.; Dolson, J.; Held, D.; Kammel, S.; Kolter, J.Z.; Langer, D.; Pink, O.; Pratt, V.; et al. Towards fully autonomous driving: Systems and algorithms. In Proceedings of the IEEE Intelligent Vehicles Symposium, Baden-Baden, Germany, 2011; pp. 163–168.
- Liu, S.; Tang, J.; Wang, C.; Wang, Q.; Gaudiot, J.L. A Unified Cloud Platform for Autonomous Driving. Computer 2017, 50, 42–49.
- Liu, S.; Tang, J.; Wang, C.; Wang, Q.; Gaudiot, J.-L. Implementing a Cloud Platform for Autonomous Driving. arXiv 2017, arXiv:1704.02696.
- NI White Papers. The LabVIEW RIO Architecture: A Foundation for Innovation. Available online: http://www.ni.com/white-paper/10894/en/ (accessed on 14 February 2018).
- Hu, J.; Xiong, C. Study on the embedded CAN bus control system in the vehicle. In Proceedings of the 2012 International Conference on Computer Science and Electronics Engineering (ICCSEE 2012), Hangzhou, China, 23–25 March 2012; Volume 2, pp. 440–442.
- Petrovskaya, A.; Thrun, S. Model based vehicle detection and tracking for autonomous urban driving. Auton. Robots 2009, 26, 123–139.
- Navarro, P.; Fernández, C.; Borraz, R.; Alonso, D. A Machine Learning Approach to Pedestrian Detection for Autonomous Vehicles Using High-Definition 3D Range Data. Sensors 2016, 17, 18.
- Samek, M. Practical UML Statecharts in C/C++: Event-Driven Programming for Embedded Systems; Newnes/Elsevier: Burlington, MA, USA, 2009.
- Li, L. Time-of-Flight Camera–An Introduction; Technical White Paper; Texas Instruments: Dallas, TX, USA, 2014; p. 10.
- Lange, R.; Seitz, P. Solid-state time-of-flight range camera. IEEE J. Quantum Electron. 2001, 37, 390–397.
- Piñana-Díaz, C.; Toledo-Moreo, R.; Toledo-Moreo, F.; Skarmeta, A. A Two-Layers Based Approach of an Enhanced-Map for Urban Positioning Support. Sensors 2012, 12, 14508–14524.
- Piñana-Diaz, C.; Toledo-Moreo, R.; Bétaille, D.; Gómez-Skarmeta, A.F. GPS multipath detection and exclusion with elevation-enhanced maps. In Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC), Washington, DC, USA, 5–7 October 2011; pp. 19–24.
- Czubenko, M.; Kowalczuk, Z.; Ordys, A. Autonomous Driver Based on an Intelligent System of Decision-Making. Cognit. Comput. 2015, 7, 569–581.
- Chen, B.; Cheng, H.H. A Review of the Applications of Agent Technology in Traffic and Transportation Systems. IEEE Trans. Intell. Transp. Syst. 2010, 11, 485–497.
- Abdullah, R.; Hussain, A.; Warwick, K.; Zayed, A. Autonomous intelligent cruise control using a novel multiple-controller framework incorporating fuzzy-logic-based switching and tuning. Neurocomputing 2008, 71, 2727–2741.
- Belker, T.; Beetz, M.; Cremers, A.B. Learning action models for the improved execution of navigation plans. Rob. Auton. Syst. 2002, 38, 137–148.
- Chakraborty, D.; Vaz, W.; Nandi, A.K. Optimal driving during electric vehicle acceleration using evolutionary algorithms. Appl. Soft Comput. 2015, 34, 217–235.
- Michalos, G.; Fysikopoulos, A.; Makris, S.; Mourtzis, D.; Chryssolouris, G. Multi criteria assembly line design and configuration – An automotive case study. CIRP J. Manuf. Sci. Technol. 2015, 9, 69–87.
- Cunningham, A.G.; Galceran, E.; Eustice, R.M.; Olson, E. MPDM: Multipolicy decision-making in dynamic, uncertain environments for autonomous driving. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 1670–1677.
- Chu, K.; Lee, M.; Sunwoo, M. Local path planning for off-road autonomous driving with avoidance of static obstacles. IEEE Trans. Intell. Transp. Syst. 2012, 13, 1599–1616.
- Biswas, S.; Lovell, B.C. B-Splines and Its Applications. In Bézier and Splines in Image Processing and Machine Vision; Springer: London, UK; pp. 3–31.
- Hart, P.; Nilsson, N.; Raphael, B. A Formal Basis for the Heuristic Determination of Minimum Cost Paths. IEEE Trans. Syst. Sci. Cybern. 1968, 4, 100–107.
- Dijkstra, E.W. A Note on Two Problems in Connexion with Graphs. In Numerische Mathematik; Stichting Mathematisch Centrum: Amsterdam, The Netherlands, 1959; Volume 1, pp. 269–271.
- Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the Fourth Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988.
Capacity | Means to Achieve That Capacity |
---|---|
Acquisition of information about the environment (at different distances) | Onboard sensors (self-positioning, obstacle detection, driver monitoring); network connection for long-distance information; vehicular interconnection |
Processing of environment information | Onboard software for critical analysis; cloud processing for route planning, traffic control, etc. |
Making driving decisions | Cognitive systems; agent systems; evolutionary algorithms |
Category | Main Features | Fault Correction Relies On | Automated Driving Modes Covered |
---|---|---|---|
Level 3. Conditional Automation | All dynamic driving tasks are automated. The human driver is expected to act correctly when required. | Human driver | Some driving modes: parking maneuvers, low-speed traffic jam |
Level 4. High Automation | All dynamic driving tasks are automated. The human driver is not expected to act correctly when required. | System | Some driving modes: high-speed cruising |
Level 5. Full Automation | All dynamic driving tasks are automated for any type of road and environmental conditions usually managed by a human driver. | System | All driving modes, including urban driving |
PLATFORM | Vehicle Basis | Automated Systems | Control System | Sensors | Performances | Navigation |
---|---|---|---|---|---|---|
BOSS (DARPA) | Chevrolet Tahoe | Steering, brake, throttle | Proprietary (based on multiprocessor system) | 1 HD 3D LIDAR, 4 cameras, 6 radars, 8 2D LIDAR, 1 inertial GPS navigation system | Obstacle avoidance, lane keeping, crossing detection | Anytime D* algorithm |
JUNIOR (DARPA) | VW Passat wagon | Steering, brake, throttle | Proprietary (based on multiprocessor system) | 1 HD 3D LIDAR, 4 cameras, 6 radars, 2 2D LIDAR, 1 inertial GPS navigation system | Object detection, pedestrian detection | Moving frame |
BRAiVE (VisLab) | VW Passat | Steering, brake, throttle | Proprietary dSpace | 10 cameras, 4 lasers, 16 laser beams, 1 radar, 1 inertial GPS navigation system | Closed-loop manoeuvring | NA |
AUTOPIA | CITROËN C3 Pluriel | Steering, brake, throttle | Proprietary ORBEX (based on fuzzy coprocessor) | 2 front cameras, 1 RTK-DGPS + IMU | Following a leading vehicle, pedestrian detection | Fuzzy logic |
Sensor | Characteristics and Configurations |
---|---|
SICK LASER scanner 2D TIM551 | Operating range: 0.05 m ... 4 m, Aperture angle: 270° |
LIDAR VELODYNE HDL64 scanner 3D | 120 m range, 1.3 Million Points per Second, 26.9° Vertical FOV |
Prosilica GT1290 cam | 1.2 Megapixel Ethernet gigabit, RJ45 Ethernet connector, colour |
DGPS | GPS aided AHRS |
ToF Sentis3D-M420Kit cam | Range Indoor: 7 m Outdoor: 4 m, Horizontal FOV: 90° |
IMU Moog Crossbow NAV440CA-202 | Pitch and roll accuracy of < 0.4°, Position Accuracy < 0.3 m |
EMLID RTK GNSS Receiver | 7 mm positioning precision |
Directions | Angle θ1 | Angle θ2 |
---|---|---|
N | −180 + θ0 | −θ0 |
S | θ0 | 180 − θ0 |
E | θ0 | −θ0 |
W | −180 + θ0 | 180 − θ0 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).