The Canadian Planetary Emulation Terrain Energy-Aware Rover Navigation Dataset
We have collected a unique energy-aware navigation dataset at the Canadian Space Agency’s Mars Emulation Terrain (MET) in Saint-Hubert, Quebec, Canada. This dataset consists of raw and post-processed sensor measurements collected by our rover in addition to georeferenced aerial maps of the MET (colour mosaic, elevation model, slope and aspect maps). The data are available for download below in human-readable format and rosbag (.bag) format.
The official journal article describing this dataset in greater detail will be linked here shortly. Python data fetching and plotting scripts and ROS-based visualization tools are available in the dataset’s repository.
The rover used for data collection was a Clearpath Husky UGV, a four-wheeled, skid-steered mobile robot with its own battery and two motors (for the wheels on the left and right sides of the vehicle). The main computer supporting control and data logging was powered by a separate 12 V deep cycle rechargeable battery. Battery operation was preferred over a gasoline generator to ensure the rover’s weight did not change with time, avoiding variations in the interactions with the terrain surface.
The rover carried a suite of sensors mounted on an aluminum mast near the front of the vehicle. Sensors on board included an Occam Vision Group omnidirectional stereo camera (composed of 10 individual RGB cameras, each with a resolution of 752 x 480 pixels), a monochrome Point Grey Flea3 camera with a resolution of 1280 x 1024 pixels, and an Apogee E-804-SP-420 pyranometer. The Flea3 camera was tilted slightly downwards to enable imaging of the terrain directly in front of the rover. The pyranometer was installed on top of the omnidirectional camera, making it possible to measure solar irradiance at all tilt angles (and to further estimate solar power generation). A LORD MicroStrain 3DM-GX3-25 inertial measurement unit (IMU) was installed near the base of the sensor mast. Positioning information was provided by a NovAtel Smart6-L GPS receiver at the rear of the Husky platform.
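Since the pyranometer reports solar irradiance in W/m², a rough solar power estimate follows from the usual photovoltaic relation power = irradiance × panel area × conversion efficiency. The sketch below is illustrative only and is not the estimation method used for this dataset; the panel area and efficiency values are assumptions.

```python
def estimate_solar_power(irradiance_w_m2, panel_area_m2, efficiency):
    """Rough electrical power estimate (W) for a photovoltaic panel,
    given a pyranometer irradiance reading (W/m^2), the panel area (m^2),
    and its conversion efficiency (0-1). Ignores temperature and angle-of-
    incidence effects beyond what the irradiance reading already captures."""
    return irradiance_w_m2 * panel_area_m2 * efficiency

# Example (assumed values): 800 W/m^2 on a 0.5 m^2 panel at 20 % efficiency
power_w = estimate_solar_power(800.0, 0.5, 0.20)  # -> 80.0 W
```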
The entire dataset is separated into six runs, each covering a different section of the MET at a different time. All data were collected on September 4, 2018, between 17:00 and 19:00 (Eastern Daylight Time).
To avoid extremely large files, the rosbag data of every run was broken down into two parts: “runX_clouds_only.bag” and “runX_base.bag”. The former contains only the point clouds generated from the omnidirectional camera raw images after data collection, while the latter contains all the raw data and the remainder of the post-processed data. Both rosbags share consistent timestamps and can be merged with a tool such as bagedit. A similar breakdown was followed for the human-readable data.
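Because the two bags of a run share a common time base, recombining them amounts to interleaving two time-sorted message streams by timestamp. A minimal sketch of that merge logic in pure Python (the topic names and timestamps below are purely illustrative, not actual dataset contents):

```python
import heapq

def merge_streams(*streams):
    """Merge several time-stamped message streams into one ordered stream.
    Each stream is an iterable of (timestamp, message) tuples that is
    already sorted by timestamp, as rosbag messages are when read back.
    heapq.merge interleaves them lazily without loading everything at once."""
    return list(heapq.merge(*streams, key=lambda item: item[0]))

# Illustrative streams standing in for runX_base.bag and runX_clouds_only.bag
base = [(0.00, "imu"), (0.10, "gps"), (0.30, "imu")]
clouds = [(0.05, "cloud"), (0.25, "cloud")]
merged = merge_streams(base, clouds)
# merged messages in time order: imu, cloud, gps, cloud, imu
```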
Aside from point clouds, the post-processed data of every run includes a blended cylindrical panorama made from the omnidirectional sensor images, planar rover velocity estimates from wheel encoder data and an estimated global trajectory obtained by fusing GPS and stereo imagery coming from cameras 0 and 1 of the omnidirectional sensor using VINS-Fusion.
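For a skid-steered platform like the Husky, planar velocity can be approximated from the left- and right-side wheel speeds with the standard differential-drive model; wheel slip during skid-steering makes this an estimate rather than ground truth. A hedged sketch (the track width below is an assumed value, not a dataset parameter):

```python
def planar_velocity(v_left, v_right, track_width):
    """Approximate planar body velocity of a skid-steered rover from
    left- and right-side wheel speeds (m/s) via the differential-drive
    model. track_width (m) is the effective lateral wheel separation.
    Returns (linear velocity in m/s, angular velocity in rad/s)."""
    v = (v_left + v_right) / 2.0          # forward speed: average of sides
    omega = (v_right - v_left) / track_width  # yaw rate: speed difference
    return v, omega

# Example with an assumed 0.55 m effective track width
v, w = planar_velocity(0.4, 0.6, 0.55)
```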
Data Download Links
Aerial maps (UTM coordinates, zone 18T)
Aerial maps download (2.6 MB)
Run 1
Length: 196 m
Start local time (EDT): 17:19:41
Duration: 526 s (8 minutes 46 seconds)
Run 2
Length: 205 m
Start local time (EDT): 17:31:45
Duration: 490 s (8 minutes 10 seconds)
Run 3
Length: 143 m
Start local time (EDT): 17:43:25
Duration: 367 s (6 minutes 7 seconds)
Run 4
Length: 168 m
Start local time (EDT): 18:14:32
Duration: 457 s (7 minutes 37 seconds)
Run 5
Length: 257 m
Start local time (EDT): 18:23:08
Duration: 693 s (11 minutes 33 seconds)
Run 6
Length: 260 m
Start local time (EDT): 18:34:59
Duration: 708 s (11 minutes 48 seconds)
If you encounter any problems using the scripts in our GitHub repository, please open an issue. Any other questions or feedback can be sent to Olivier Lamarre at firstname.lastname@example.org.
About the Authors
Olivier Lamarre is a Ph.D. student with the STARS Laboratory; he is working on energy-aware planning for rovers using surface and orbital data in cooperation with NASA JPL.
Oliver Limoyo is also a Ph.D. student with the STARS Laboratory and is working on enabling mobile manipulators to learn how to interact with unstructured and dynamic environments.
Filip Marić is a Ph.D. student with the LAMoR group at the University of Zagreb and the STARS Laboratory. He is exploring the connection between estimation and planning.
Dr. Jonathan Kelly is the Director of the STARS Laboratory, and has research interests in 3D computer vision, probabilistic modelling, estimation theory, and machine learning.