One of the key challenges in any dual-spacecraft mission involving rendezvous and proximity operations is the onboard determination of the pose (i.e., relative position and orientation) of a target object with respect to a robotic chaser spacecraft equipped with computer vision sensors. The relative pose of the target is crucial information upon which real-time guidance, trajectory control, and capture actions are planned and executed. Beyond relative pose determination, other computer vision tasks such as 3D model reconstruction, component recognition, and foreground/background segmentation are often required.
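For illustration only, the sketch below shows one common way a relative pose could be represented in software: a translation vector plus a unit quaternion, assembled into a homogeneous transform. The function name, frame convention, and example values are assumptions for this sketch, not part of any of the datasets or algorithms described here.

```python
# Minimal sketch (not flight or lab code): a relative pose as a translation
# vector plus a unit quaternion, assembled into a 4x4 homogeneous transform
# mapping target-frame points into the chaser camera frame (assumed convention).
import numpy as np

def pose_to_transform(t_xyz, q_wxyz):
    """Build a 4x4 homogeneous transform from a position and a unit quaternion."""
    w, x, y, z = q_wxyz / np.linalg.norm(q_wxyz)  # re-normalize to guard against drift
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    T = np.eye(4)
    T[:3, :3] = R      # relative orientation
    T[:3, 3] = t_xyz   # relative position
    return T

# Hypothetical example: target 10 m ahead of the chaser, rotated 90 deg about its z-axis.
T = pose_to_transform(np.array([10.0, 0.0, 0.0]),
                      np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)]))
```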
Before deploying any computer vision algorithms in operational scenarios, it is critical to ensure they will perform as expected, regardless of the harsh lighting conditions encountered in low-Earth orbit and even when applied to an unknown, noncooperative target spacecraft. To address the need to train and/or test computer vision algorithms across a wide range of scenarios, conditions, target spacecraft, and visual features, the Spacecraft Robotics and Control Laboratory has developed labeled datasets of images of various target spacecraft, ranging from actual on-orbit imagery to synthetic rendered models.
Satellite Segmentation (SATSEG) Dataset
The SATellite SEGmentation (SATSEG) dataset is used to benchmark segmentation methods applied to space-based applications. SATSEG consists of 100 color and grayscale images of actual spacecraft and laboratory mockups, captured by visual and thermal cameras. Spacecraft in this dataset include Cygnus, Dragon, the ISS, the Space Shuttle, CubeSats, Hubble, Orbital Express, and Radarsat. Also included in SATSEG is manually produced ground-truth data that provides a binary mask for each image (foreground = 255, background = 0).
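As a hedged sketch of how the binary ground-truth masks could be used to score a segmentation method against SATSEG, the snippet below loads a mask, binarizes it, and computes intersection-over-union. The directory layout and file names are hypothetical, not the dataset's actual structure.

```python
# Sketch only: score a predicted segmentation against a SATSEG ground-truth mask.
# File paths below are placeholders, not the dataset's real layout.
import numpy as np
from PIL import Image

def load_mask(path):
    """Read a mask image and binarize it (foreground = 255 maps to True)."""
    return np.array(Image.open(path).convert("L")) >= 128

def iou(pred, truth):
    """Intersection-over-union between two boolean masks of equal shape."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

# Hypothetical usage:
truth = load_mask("satseg/masks/cygnus_001.png")
pred = load_mask("my_method_output/cygnus_001.png")
print(f"IoU = {iou(pred, truth):.3f}")
```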
Shi, J.-F., and Ulrich, S., “Satellite Segmentation (SATSEG) Dataset,” Borealis, 2018, https://doi.org/10.5683/SP3/VDAN02.
Related Publications
Shi, J.-F., and Ulrich, S., “Uncooperative Spacecraft Pose Estimation using Monocular Monochromatic Images,” AIAA Journal of Spacecraft and Rockets, Vol. 58, No. 2, 2021, pp. 284–301.
AIAA SPACE and Astronautics Forum and Exposition, AIAA Paper 2018-5281, Orlando, FL, 2018.