{"id":4559,"date":"2016-12-16T15:36:57","date_gmt":"2016-12-16T20:36:57","guid":{"rendered":"https:\/\/newsroom.carleton.ca\/?post_type=cu_story&#038;p=4559"},"modified":"2025-10-15T10:42:42","modified_gmt":"2025-10-15T14:42:42","slug":"robot-ethics-jason-millar","status":"publish","type":"cu_story","link":"https:\/\/carleton.ca\/news\/story\/robot-ethics-jason-millar\/","title":{"rendered":"Robot Ethics"},"content":{"rendered":"\n<section class=\"w-screen px-6 cu-section cu-section--white ml-offset-center md:px-8 lg:px-14\">\n    <div class=\"space-y-6 cu-max-w-child-max  md:space-y-10 cu-prose-first-last\">\n\n        \n        \n        \n            \n    <div class=\"cu-wideimage relative flex items-center justify-center mx-auto px-8 overflow-hidden md:px-16 rounded-xl not-prose  my-6 md:my-12 first:mt-0 bg-cu-black-50 pt-10 pb-12\" style=\"\">\n\n        \n        <div class=\"relative z-[2] max-w-4xl w-full flex flex-col items-center gap-2 cu-wideimage-image cu-zero-first-last\">\n            <header class=\"mx-auto mb-6 text-center text-cu-black-800 cu-pageheader cu-component-updated cu-pageheader--center md:mb-12\">\n\n                                    <h1 class=\"cu-prose-first-last font-semibold mb-2 text-3xl md:text-4xl lg:text-5xl lg:leading-[3.5rem] cu-pageheader--center text-center mx-auto after:left-px\">\n                        Robot Ethics\n                    <\/h1>\n                \n                            <\/header>\n        <\/div>\n\n                    <svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"absolute bottom-0 w-full z-[1]\" fill=\"none\" viewbox=\"0 0 1280 312\">\n                <path fill=\"#fff\" d=\"M26.412 315.608c-.602-.268-6.655-2.412-13.524-4.769a1943.84 1943.84 0 0 1-14.682-5.144l-2.276-.858v-5.358c0-4.876.086-5.358.773-5.09 1.674.643 21.38 5.84 34.646 9.109 14.682 3.59 28.935 6.858 45.936 10.449l9.874 2.089H57.322c-16.4 0-30.31-.16-30.91-.428ZM460.019 315.233c42.974-10.074 75.602-19.88 132.443-39.867 
76.16-26.791 152.063-57.709 222.385-90.663 16.7-7.823 21.336-10.074 44.262-21.273 85.004-41.688 134.719-64.193 195.291-88.413 66.55-26.577 145.2-53.584 194.27-66.765C1258.5 5.626 1281.34 0 1282.24 0c.17 0 .34 27.596.34 61.3v61.299l-2.23.375c-84.7 13.718-165.93 35.955-310.736 84.931-46.494 15.753-65.427 22.076-96.166 32.15-9.102 3-24.814 8.198-34.989 11.574-107.543 35.954-153.008 50.422-196.626 62.639l-6.74 1.876-89.126-.054c-78.135-.054-88.782-.161-85.948-.857ZM729.628 312.875c33.229-10.985 69.248-23.523 127.506-44.207 118.705-42.223 164.596-57.709 217.446-73.302 2.62-.75 8.29-2.465 12.67-3.751 56.19-16.772 126.94-33.597 184.17-43.671 5.07-.91 9.66-1.768 10.22-1.875l.94-.161v170.236l-281.28-.054H719.968l9.66-3.215ZM246.864 313.411c-65.041-2.251-143.047-12.11-208.432-26.256-18.375-3.965-41.73-9.538-42.202-10.074-.171-.214-.257-21.38-.214-47.046l.129-46.618 6.654 3.697c57.313 32.043 118.491 56.531 197.699 79.143 40.313 11.521 83.459 18.058 138.669 21.059 15.584.857 65.685.857 81.14 0 33.744-1.876 61.306-4.93 88.396-9.806 6.396-1.126 11.634-1.983 11.722-1.929.255.375-20.48 7.769-30.999 11.038-28.592 8.948-59.288 15.646-91.873 20.147-26.36 3.59-50.015 5.627-78.35 6.698-15.584.59-55.209.59-72.339-.053Z\"><\/path>\n                <path fill=\"#fff\" d=\"M-3.066 295.067 32.06 304.1v9.033H-3.066v-18.066Z\"><\/path>\n            <\/svg>\n            <\/div>\n\n    \n\n    <\/div>\n<\/section>\n\n<p>Jason Millar is bridging the divide that separates engineering programs and the humanities.<\/p>\n\n\n\n<p>A former engineer, Millar is now a <a href=\"https:\/\/carleton.ca\/philosophy\/\" target=\"_blank\">philosophy<\/a> instructor at Carleton University while also completing post-doctoral research in philosophy at the University of Ottawa\u2019s faculty of law.<\/p>\n\n\n\n<p>Millar\u2019s research examines the ethical implications of advances in robotics and artificial intelligence, sometimes known as robot ethics.<\/p>\n\n\n\n<p>\u201cWe are spinning this web of 
technology, how do we do that and maintain our humanity?\u201d Millar asks.<\/p>\n\n\n<div class=\"not-prose cu-quote cu-component-spacing\">\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cWhat are the questions that would lead us to think that maybe we could lose our humanity in the first place?\u201d<\/p>\n<\/blockquote>\n<\/div>\n\n\n<p>His research focuses on driverless cars, social robotics, and weapons systems\u2014all artificial intelligence technologies that blur the boundary between human and machine. Each of these technologies raises myriad social and ethical questions.<\/p>\n\n\n<p>[wide-image image=&#8221;4573&#8243;]<\/p>\n\n\n\n<h2 id=\"blurred-boundary-betweenhuman-and-machine\" class=\"wp-block-heading\">Blurred Boundary Between<br>\nHuman and Machine<\/h2>\n\n\n\n<p>Early in his career as an engineer for a large aerospace engineering firm, Millar helped design electronics assemblies for commercial and military aircraft, and the <a href=\"https:\/\/www.nasa.gov\/mission_pages\/station\/main\/index.html\" target=\"_blank\">International Space Station<\/a>. He received one that stood out from the rest\u2014a \u201cbig green donut\u201d that was unlike any of the traditional rectangular assemblies he had encountered. It was part of a guided bomb unit and it brought him up short. 
It was the first time the ethical implications of his engineering work caused him to question how we think about, design, and use technology.<\/p>\n\n\n\n<p>\u201cMy engineering education didn\u2019t provide me with any kind of \u2018intellectual toolkit\u2019 to deal with that question,\u201d says Millar.<\/p>\n\n\n\n<figure class=\"wp-block-image alignnone size-full wp-image-4572\"><img loading=\"lazy\" decoding=\"async\" width=\"1200\" height=\"680\" src=\"https:\/\/newsroom.carleton.ca\/wp-content\/uploads\/robot_ethics_1200w_4.jpg\" alt=\"Jason Millar is recognized and identified through the eyes of a robot.\" class=\"wp-image-4572\" srcset=\"https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4.jpg 1200w, https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4-300x170.jpg 300w, https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4-400x227.jpg 400w, https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4-768x435.jpg 768w, https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4-700x397.jpg 700w, https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/robot_ethics_1200w_4-200x113.jpg 200w\" sizes=\"auto, (max-width: 1200px) 100vw, 1200px\" \/><figcaption class=\"wp-element-caption\">Jason Millar is recognized and identified through the eyes of a robot.<\/figcaption><\/figure>\n\n\n\n<p>A seemingly inconsequential political philosophy course taken as an elective during his engineering program became the lynchpin linking his design work to the ethical questions it raised. 
This one course became a catalyst for a career change when Millar left his engineering work to start over\u2014pursuing undergraduate studies and eventually a PhD in philosophy.<\/p>\n\n\n\n<h2 id=\"friend-or-foe\" class=\"wp-block-heading\">Friend or Foe<\/h2>\n\n\n\n<p>The dizzying speed at which technology has infiltrated nearly every aspect of our lives requires closer scrutiny as new advances such as Deep Learning confound our expectations of the devices we have come to rely on.<\/p>\n\n\n\n<p>A relatively new advance in artificial intelligence, Deep Learning goes beyond merely programming devices such as driverless cars to drive. Instead, borrowing from the architecture of the human brain, it teaches them how to \u201clearn\u201d to drive, akin to how a human would.<\/p>\n\n\n\n<p>\u201cWe are starting to see that the systems surprise their designers,\u201d Millar says.<\/p>\n\n\n<div class=\"not-prose cu-quote cu-component-spacing\">\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThey\u2019ll do things that are highly effective in achieving goals that the designers never anticipated.\u201d<\/p>\n<\/blockquote>\n<\/div>\n\n\n<p>He cites the computer program <a href=\"https:\/\/en.wikipedia.org\/wiki\/AlphaGo\" target=\"_blank\">AlphaGo<\/a> as one example. Developed by Google\u2019s DeepMind, AlphaGo confounded and out-maneuvered human players in the highly complex game of Go.<\/p>\n\n\n\n<p>Millar says this increased complexity correlates with unpredictability and raises novel questions about responsibility.<\/p>\n\n\n\n<p>Millar\u2019s research untangles thorny ethical questions of responsibility and trust. Who is responsible when a driverless car has an accident\u2014the occupant, the engineer who designed the car, or the car itself? The answer has far-reaching implications, not only for engineers, but for insurance companies, lawyers, manufacturers, policy-makers, and the public. 
The driverless car is just one of the rapidly evolving technologies that are transforming how humans think about and use technology.<\/p>\n\n\n<p>[wide-image image=&#8221;4575&#8243;]<\/p>\n\n\n\n<h2 id=\"robot-ethicstrusting-the-machine\" class=\"wp-block-heading\">Robot Ethics:<br>\nTrusting the Machine<\/h2>\n\n\n\n<p>Millar examines our psychological predilection to trust others and to anthropomorphize, or assign human characteristics to, the technologies we interact with.<\/p>\n\n\n\n<p>In his research, he often encounters \u201cdesign literature\u201d that addresses \u201chow to make people trust systems\u201d without seriously questioning whether the systems themselves are \u201ctrustworthy.\u201d<\/p>\n\n\n<div class=\"not-prose cu-quote cu-component-spacing\">\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cPhilosophy can help articulate what counts as trustworthy in these different kinds of systems.\u201d<\/p>\n<\/blockquote>\n<\/div>\n\n\n<p>This is crucial to ensure that the needs of the designers and the end users of technology are balanced and that our innate tendency to trust is not abused.<\/p>\n\n\n\n<p>Nowhere is this trust more prevalent than in our use of social media platforms. When we post on Facebook or any other social medium, we have certain expectations of the technology. To some degree we place our trust in it, expecting that the privacy settings available to us will provide some measure of protection. Until they don\u2019t.<\/p>\n\n\n\n<p>Millar coined the term \u201csocially awkward technology\u201d to describe our interaction with the technologies that facilitate our social lives.<\/p>\n\n\n\n<p>\u201cFacebook is a little socially awkward. It doesn&#8217;t really understand what my expectations are. It violates the norms that I would attach to a really trustworthy mediator,\u201d says Millar.<\/p>\n\n\n\n<p>Social robotics will amplify this problem. 
\u201cBecause the robots are designed to hook us emotionally, to smile at us and make us feel good, it\u2019s imperative that they\u2019re trustworthy by design.\u201d<\/p>\n\n\n<p>[wide-image image=&#8221;4576&#8243;]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Jason Millar is bridging the divide that separates engineering programs and the humanities. A former engineer, Millar is now a philosophy instructor at Carleton University while also completing post-doctoral research in philosophy at the University of Ottawa\u2019s faculty of law. Millar\u2019s research examines the ethical implications of advances in robotics and artificial intelligence, sometimes known [&hellip;]<\/p>\n","protected":false},"author":410,"featured_media":0,"template":"","meta":{"_acf_changed":false,"footnotes":"","_links_to":"","_links_to_target":""},"cu_story_type":[13,19],"cu_story_tag":[1918],"class_list":["post-4559","cu_story","type-cu_story","status-publish","hentry","cu_story_type-research-discovery","cu_story_type-technology-innovation","cu_story_tag-faculty-of-engineering-and-design"],"acf":{"cu_post_thumbnail":false},"_links":{"self":[{"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story\/4559","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story"}],"about":[{"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/types\/cu_story"}],"author":[{"embeddable":true,"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/users\/410"}],"version-history":[{"count":1,"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story\/4559\/revisions"}],"predecessor-version":[{"id":97222,"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story\/4559\/revisions\/97222"}],"wp:attachment":[{"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/media?parent=4559"}],"wp:term":[{"taxonomy":"cu_story_type","embeddable":true,"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story_type?post=4559"},{"taxonomy":"cu_stor
y_tag","embeddable":true,"href":"https:\/\/carleton.ca\/news\/wp-json\/wp\/v2\/cu_story_tag?post=4559"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}