{"id":1603,"date":"2025-02-06T08:49:05","date_gmt":"2025-02-06T13:49:05","guid":{"rendered":"https:\/\/carleton.ca\/spacecraft\/?page_id=1603"},"modified":"2025-12-04T00:34:55","modified_gmt":"2025-12-04T05:34:55","slug":"datasets","status":"publish","type":"page","link":"https:\/\/carleton.ca\/spacecraft\/datasets\/","title":{"rendered":"Computer Vision Datasets"},"content":{"rendered":"<p>One of the key challenges in any dual-spacecraft mission involving rendezvous and proximity operations maneuvers is the onboard determination of the pose (i.e., relative position and orientation) of a target object with respect to the robotic chaser spacecraft equipped with computer vision sensors. The relative pose of a target represents crucial information upon which real-time guidance, trajectory control, and docking\/capture maneuvers are planned and executed. Besides relative pose determination, other computer vision tasks such as detection, 3D model reconstruction, component identification, and foreground\/background segmentation are often required.<\/p>\n<p>Before deploying any computer vision algorithms in operational scenarios, it is important to ensure they will perform as expected, regardless of the harsh lighting conditions encountered in low-Earth orbit and even when applied to unknown uncooperative targets. To address the need to train and\/or test computer vision algorithms across a wide range of scenarios, conditions, target spacecraft, and visual features, the Spacecraft Robotics Laboratory has developed labeled datasets of images of various target spacecraft, ranging from actual on-orbit imagery to synthetic rendered models.<\/p>\n<p><strong>EVent-based Observation of Spacecraft (EVOS) Dataset<\/strong><\/p>\n<p>The EVOS dataset is designed to support research in event-based vision for autonomous on-orbit inspection and space debris removal. 
The dataset was collected at Carleton University&#8217;s Spacecraft Proximity Operations Testbed using an IniVation DVXplorer Micro event camera mounted on a stationary chaser spacecraft platform that observes a moving target spacecraft. The observed target is covered in multi-layer insulation and is equipped with a solar panel and a docking cone. The dataset contains 15 unique experiments, each approximately 300 seconds long, featuring distinct trajectories under different lighting conditions. Time-synchronized ground-truth position and velocity data are provided via a PhaseSpace motion capture system with sub-millimeter accuracy.<\/p>\n<p>Crain, A., and Ulrich, S., &#8220;EVent-based Observation of Spacecraft (EVOS) Dataset,&#8221; Federated Research Data Repository, 2025. <a href=\"https:\/\/doi.org\/10.20383\/103.01538\">https:\/\/doi.org\/10.20383\/103.01538<\/a>.<\/p>\n<p><span style=\"text-decoration: underline;\">Related Publications<\/span><\/p>\n<p>Crain, A., and Ulrich, S., &#8220;Event-Based Spacecraft Representation Using Inter-Event-Interval Adaptive Time Surfaces,&#8221; <i>36th AIAA\/AAS Space Flight Mechanics Meeting, <\/i>Orlando, FL, 12-16 Jan, 2026.<\/p>\n<p><strong>Spacecraft Thermal Infrared (STIR) Dataset<\/strong><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\" wp-image-1700 alignleft\" src=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/data_01014-240x180.png\" alt=\"\" width=\"173\" height=\"130\" srcset=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/data_01014-240x180.png 240w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/data_01014-160x120.png 160w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/data_01014.png 320w\" sizes=\"(max-width: 173px) 100vw, 173px\" \/>The Spacecraft Thermal Infrared (STIR) Dataset is used primarily to train machine learning-based pose determination methods applied to spacecraft proximity operations. 
STIR consists of 16,641 thermal infrared images of a free-floating spacecraft target platform, captured at Carleton University&#8217;s Spacecraft Proximity Operations Testbed by an ICI-9320 thermal camera installed on the chaser spacecraft platform. For each frame, the relative planar three-degree-of-freedom pose of the target with respect to the chaser is included (x and y positions, and yaw angle). This ground-truth data was acquired by a PhaseSpace motion capture system with sub-millimeter accuracy.<\/p>\n<p>Budhkar, A., and Ulrich, S., &#8220;Spacecraft Thermal Infrared (STIR) Dataset,&#8221; Federated Research Data Repository, 2025. <a href=\"https:\/\/doi.org\/10.20383\/103.01333\">https:\/\/doi.org\/10.20383\/103.01333<\/a>.<\/p>\n<p><span style=\"text-decoration: underline;\">Related Publications<\/span><\/p>\n<p>Budhkar, A., and Ulrich, S., &#8220;Neural Network-Based Spacecraft Pose Determination Using Thermal Imagery,&#8221; <i>AAS\/AIAA Space Flight Mechanics Meeting, <\/i>Kaua&#8217;i, HI, 19-23 Jan, 2025.<\/p>\n<p><strong>Satellite Segmentation (SATSEG) Dataset<\/strong><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\" wp-image-1607 alignleft\" src=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-240x324.png\" alt=\"\" width=\"173\" height=\"234\" srcset=\"https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-240x324.png 240w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-400x540.png 400w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-160x216.png 160w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-768x1037.png 768w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-1137x1536.png 1137w, https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-1517x2048.png 1517w, 
https:\/\/carleton.ca\/spacecraft\/wp-content\/uploads\/SATSEG-360x486.png 360w\" sizes=\"(max-width: 173px) 100vw, 173px\" \/>The SATellite SEGmentation (SATSEG) dataset is used to benchmark segmentation methods for space-based applications. SATSEG consists of 100 color and grayscale images of actual spacecraft and laboratory mockups, captured by visual and thermal cameras. Spacecraft in this dataset include Cygnus, Dragon, the ISS, the Space Shuttle, CubeSats, Hubble, Orbital Express, and Radarsat. Also included in SATSEG is manually produced ground-truth data that provides a binary mask for each image (foreground = 255 \/ background = 0).<\/p>\n<p>Shi, J.-F., and Ulrich, S., &#8220;Satellite Segmentation (SATSEG) Dataset,&#8221; Borealis, 2018. <a href=\"https:\/\/doi.org\/10.5683\/SP3\/VDAN02\" target=\"_blank\" rel=\"noopener\">https:\/\/doi.org\/10.5683\/SP3\/VDAN02<\/a>.<\/p>\n<p><span style=\"text-decoration: underline;\">Related Publications<\/span><\/p>\n<p>Shi, J.-F., and Ulrich, S., &#8220;Uncooperative Spacecraft Pose Estimation Using Monocular Monochromatic Images,&#8221; <em>AIAA Journal of Spacecraft and Rockets,<\/em> Vol. 58, No. 2, 2021, pp. 
284\u2013301.<\/p>\n<p>Shi, J.-F., Ulrich, S., and Ruel, S., &#8220;Regional Method for Monocular Infrared Image Spacecraft Pose Estimation,&#8221; <i>AIAA SPACE and Astronautics Forum and Exposition, <\/i>AIAA Paper 2018-5281, Orlando, FL, 2018.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations maneuvers is the onboard determination of the pose (i.e., relative position and orientation) of a target object with respect to the robotic chaser spacecraft equipped with computer vision sensors. The relative pose of a target represents crucial information upon which [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_relevanssi_hide_post":"","_relevanssi_hide_content":"","_relevanssi_pin_for_all":"","_relevanssi_pin_keywords":"","_relevanssi_unpin_keywords":"","_relevanssi_related_keywords":"","_relevanssi_related_include_ids":"","_relevanssi_related_exclude_ids":"","_relevanssi_related_no_append":"","_relevanssi_related_not_related":"","_relevanssi_related_posts":"","_relevanssi_noindex_reason":"","_mi_skip_tracking":false,"_exactmetrics_sitenote_active":false,"_exactmetrics_sitenote_note":"","_exactmetrics_sitenote_category":0,"footnotes":"","_links_to":"","_links_to_target":""},"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v21.2 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Computer Vision Datasets - Spacecraft Robotics Laboratory<\/title>\n<meta name=\"description\" content=\"One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations 
maneuvers is the onboard determination of\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/carleton.ca\/spacecraft\/datasets\/\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/carleton.ca\/spacecraft\/datasets\/\",\"url\":\"https:\/\/carleton.ca\/spacecraft\/datasets\/\",\"name\":\"Computer Vision Datasets - Spacecraft Robotics Laboratory\",\"isPartOf\":{\"@id\":\"https:\/\/carleton.ca\/spacecraft\/#website\"},\"datePublished\":\"2025-02-06T13:49:05+00:00\",\"dateModified\":\"2025-12-04T05:34:55+00:00\",\"description\":\"One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations maneuvers is the onboard determination of\",\"breadcrumb\":{\"@id\":\"https:\/\/carleton.ca\/spacecraft\/datasets\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/carleton.ca\/spacecraft\/datasets\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/carleton.ca\/spacecraft\/datasets\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/carleton.ca\/spacecraft\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Computer Vision Datasets\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/carleton.ca\/spacecraft\/#website\",\"url\":\"https:\/\/carleton.ca\/spacecraft\/\",\"name\":\"Spacecraft Robotics Laboratory\",\"description\":\"Carleton University\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/carleton.ca\/spacecraft\/?s={search_term_string}\"},\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Computer Vision Datasets - Spacecraft Robotics Laboratory","description":"One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations maneuvers is the onboard determination of","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/carleton.ca\/spacecraft\/datasets\/","twitter_misc":{"Est. reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/carleton.ca\/spacecraft\/datasets\/","url":"https:\/\/carleton.ca\/spacecraft\/datasets\/","name":"Computer Vision Datasets - Spacecraft Robotics Laboratory","isPartOf":{"@id":"https:\/\/carleton.ca\/spacecraft\/#website"},"datePublished":"2025-02-06T13:49:05+00:00","dateModified":"2025-12-04T05:34:55+00:00","description":"One of the key challenges related to any dual-spacecraft missions involving rendezvous and proximity operations maneuvers is the onboard determination of","breadcrumb":{"@id":"https:\/\/carleton.ca\/spacecraft\/datasets\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/carleton.ca\/spacecraft\/datasets\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/carleton.ca\/spacecraft\/datasets\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/carleton.ca\/spacecraft\/"},{"@type":"ListItem","position":2,"name":"Computer Vision Datasets"}]},{"@type":"WebSite","@id":"https:\/\/carleton.ca\/spacecraft\/#website","url":"https:\/\/carleton.ca\/spacecraft\/","name":"Spacecraft Robotics Laboratory","description":"Carleton 
University","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/carleton.ca\/spacecraft\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"}]}},"acf":{"banner_image_type":"none","banner_button":"no"},"_links":{"self":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603"}],"collection":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/comments?post=1603"}],"version-history":[{"count":3,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603\/revisions"}],"predecessor-version":[{"id":1730,"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/pages\/1603\/revisions\/1730"}],"wp:attachment":[{"href":"https:\/\/carleton.ca\/spacecraft\/wp-json\/wp\/v2\/media?parent=1603"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}