SPARTAN: Developing a Vision System for Future Autonomous Space Exploration Robots
Article first published online: 10 OCT 2013
© 2013 Wiley Periodicals, Inc.
Journal of Field Robotics
Special Issue: Special Issue on Space Robotics, Part 2
Volume 31, Issue 1, pages 107–140, January/February 2014
How to Cite
Kostavelis, I., Nalpantidis, L., Boukas, E., Rodrigalvarez, M. A., Stamoulias, I., Lentaris, G., Diamantopoulos, D., Siozios, K., Soudris, D. and Gasteratos, A. (2014), SPARTAN: Developing a Vision System for Future Autonomous Space Exploration Robots. J. Field Robotics, 31: 107–140. doi: 10.1002/rob.21484
- Manuscript Received: 7 DEC 2012
- Manuscript Accepted: 13 AUG 2013
- Issue published online: 18 DEC 2013
- European Space Agency (ESA) project “SPAring Robotics Technologies for Autonomous Navigation” (SPARTAN)
Mars exploration is expected to remain a focus of the scientific community in the years to come. A Mars rover should be highly autonomous, both because communication between the rover and the terrestrial operation center is difficult and because the vehicle should spend as much of its traverse time as possible moving. Such autonomy implies a vision system that provides a wide view for navigation and three-dimensional (3D) reconstruction and, at the same time, a close-up view that ensures safety and supplies reliable odometry data. The European Space Agency-funded project “SPAring Robotics Technologies for Autonomous Navigation” (SPARTAN) aimed to develop an efficient vision system covering all these aspects of autonomous exploratory rovers. This paper presents the development of such a system, from the requirements up to the testing of a working prototype. The vision system was designed to be efficient, low-cost, and accurate, and to be implemented using custom-designed vectorial processing on field-programmable gate arrays (FPGAs). A prototype of the complete vision system was developed, mounted on a basic mobile robot platform, and tested. Results on both real-world Mars-like and long-range simulated data are presented in terms of 3D reconstruction and visual odometry accuracy, as well as execution speed. The developed system is found to fulfill the set requirements.
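As background to the 3D reconstruction task the abstract mentions, a minimal sketch of the standard depth-from-disparity relation for a rectified stereo pair is given below. This is illustrative only and not taken from the paper; the focal length and baseline values are hypothetical examples, not SPARTAN camera parameters.

```python
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a scene point from a rectified stereo pair: Z = f * B / d.

    f_px        -- focal length in pixels
    baseline_m  -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel offset of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px

# Hypothetical example: f = 800 px, baseline = 0.12 m, disparity = 16 px
print(depth_from_disparity(800.0, 0.12, 16.0))  # -> 6.0 (meters)
```

Note the inverse relation: nearby terrain yields large disparities and accurate depth, while distant terrain yields small disparities and coarse depth, which is one reason a rover benefits from both wide-view and close-up cameras.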