{"id":76346,"date":"2021-05-16T09:00:47","date_gmt":"2021-05-16T13:00:47","guid":{"rendered":"https:\/\/newsroom.carleton.ca\/?post_type=cu_story&#038;p=76346"},"modified":"2025-08-19T09:37:14","modified_gmt":"2025-08-19T13:37:14","slug":"biased-algorithms-moderation-censoring-activists","status":"publish","type":"cu_story","link":"https:\/\/carleton.ca\/news\/story\/biased-algorithms-moderation-censoring-activists\/","title":{"rendered":"Beyond a technical bug: Biased algorithms and moderation are censoring activists on social media"},"content":{"rendered":"\n<section class=\"w-screen px-6 cu-section cu-section--white ml-offset-center md:px-8 lg:px-14\">\n    <div class=\"space-y-6 cu-max-w-child-max  md:space-y-10 cu-prose-first-last\">\n\n        \n                    \n                    \n            \n    <div class=\"cu-wideimage relative flex items-center justify-center mx-auto px-8 overflow-hidden md:px-16 rounded-xl not-prose  my-6 md:my-12 first:mt-0 bg-opacity-50 bg-cover bg-cu-black-50 pt-24 pb-32 md:pt-28 md:pb-44 lg:pt-36 lg:pb-60 xl:pt-48 xl:pb-72\" style=\"background-image: url(https:\/\/carleton.ca\/news\/wp-content\/uploads\/sites\/162\/conversation-social-media-algorithms-1200w-1.jpg); background-position: 50% 50%;\">\n\n                    <div class=\"absolute top-0 w-full h-screen\" style=\"background-color:rgba(0,0,0,0.600);\"><\/div>\n        \n        <div class=\"relative z-[2] max-w-4xl w-full flex flex-col items-center gap-2 cu-wideimage-image cu-zero-first-last\">\n            <header class=\"mx-auto mb-6 text-center text-white cu-pageheader cu-component-updated cu-pageheader--center md:mb-12\">\n\n                                    <h1 class=\"cu-prose-first-last font-semibold mb-2 text-3xl md:text-4xl lg:text-5xl lg:leading-[3.5rem] cu-pageheader--center text-center mx-auto after:left-px\">\n                        Beyond a technical bug: Biased algorithms and moderation are censoring activists on social media\n                    <\/h1>\n                \n                            <\/header>\n        <\/div>\n\n                    <svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"absolute bottom-0 w-full z-[1]\" fill=\"none\" viewbox=\"0 0 1280 312\">\n                <path fill=\"#fff\" d=\"M26.412 315.608c-.602-.268-6.655-2.412-13.524-4.769a1943.84 1943.84 0 0 1-14.682-5.144l-2.276-.858v-5.358c0-4.876.086-5.358.773-5.09 1.674.643 21.38 5.84 34.646 9.109 14.682 3.59 28.935 6.858 45.936 10.449l9.874 2.089H57.322c-16.4 0-30.31-.16-30.91-.428ZM460.019 315.233c42.974-10.074 75.602-19.88 132.443-39.867 76.16-26.791 152.063-57.709 222.385-90.663 16.7-7.823 21.336-10.074 44.262-21.273 85.004-41.688 134.719-64.193 195.291-88.413 66.55-26.577 145.2-53.584 194.27-66.765C1258.5 5.626 1281.34 0 1282.24 0c.17 0 .34 27.596.34 61.3v61.299l-2.23.375c-84.7 13.718-165.93 35.955-310.736 84.931-46.494 15.753-65.427 22.076-96.166 32.15-9.102 3-24.814 8.198-34.989 11.574-107.543 35.954-153.008 50.422-196.626 62.639l-6.74 1.876-89.126-.054c-78.135-.054-88.782-.161-85.948-.857ZM729.628 312.875c33.229-10.985 69.248-23.523 127.506-44.207 118.705-42.223 164.596-57.709 217.446-73.302 2.62-.75 8.29-2.465 12.67-3.751 56.19-16.772 126.94-33.597 184.17-43.671 5.07-.91 9.66-1.768 10.22-1.875l.94-.161v170.236l-281.28-.054H719.968l9.66-3.215ZM246.864 313.411c-65.041-2.251-143.047-12.11-208.432-26.256-18.375-3.965-41.73-9.538-42.202-10.074-.171-.214-.257-21.38-.214-47.046l.129-46.618 6.654 3.697c57.313 32.043 118.491 56.531 197.699 79.143 40.313 11.521 83.459 18.058 138.669 21.059 
Following Red Dress Day on May 5, a day aimed at raising awareness of Missing and Murdered Indigenous Women and Girls (MMIWG), Indigenous activists and supporters of the campaign found [posts about MMIWG had disappeared](https://www.cbc.ca/news/indigenous/instagram-stories-vanish-mmiwg-red-dress-day-1.6017113) from their Instagram accounts. In response, Instagram released a tweet saying that this was "[a widespread global technical issue not related to any particular topic](https://twitter.com/InstagramComms/status/1390376354332487681)," followed by an apology explaining that the platform "[experienced a technical bug, which impacted millions of people's stories, highlights and archives around the world](https://twitter.com/mosseri/status/1390764509019607040)."

Creators, however, said that not all stories were affected.

Nor is this the first time social media platforms have come under scrutiny for erroneously censoring grassroots activists and racial minorities.

Many [Black Lives Matter](https://www.buzzfeednews.com/article/craigsilverman/facebook-silencing-black-lives-matter-activists) (BLM) activists were similarly frustrated when Facebook flagged their accounts but didn't do enough to [stop racism and hate speech](https://www.motherjones.com/politics/2020/07/facebook-black-lives-matter/) against Black people on its platform.

So were these really just technical glitches? Or did they result from the platforms' discriminatory and biased policies and practices? The answer lies somewhere in between.

## Towards automated content moderation

Every time an activist's post is wrongly removed, there are at least three possible scenarios.

First, sometimes the platform deliberately takes down activists' posts and accounts, usually at the request of, or in co-ordination with, a government. This happened when [Facebook and Instagram removed posts and accounts of Iranians](https://www.washingtonpost.com/news/powerpost/paloma/the-technology-202/2020/01/13/the-technology-202-instagram-faces-backlash-for-removing-posts-praising-soleimani/5e1b7f1788e0fa2262dcbc72/) who expressed support for the Iranian general Qassem Soleimani.
In some countries and disputed territories, such as [Kashmir](https://www.aljazeera.com/opinions/2021/3/27/why-is-twitter-silencing-kashmiri-voices), [Crimea, Western Sahara and the Palestinian territories](https://www.propublica.org/article/facebook-hate-speech-censorship-internal-documents-algorithms), platforms have censored activists and journalists, allegedly to [maintain their market access or to protect themselves from legal liabilities](https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons).

Second, a post can be removed through a user-reporting mechanism. To handle unlawful or prohibited communication, social media platforms have [primarily relied on user reporting](https://yalebooks.yale.edu/book/9780300173130/custodians-internet).

Applying community standards developed by the platform, content moderators review reported content and determine whether a violation has occurred. If it has, the content is removed and, in the case of serious or repeat infringements, the user may be temporarily suspended or permanently banned.
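The basic shape of this report-review-escalate loop is easy to express in code. Below is a minimal sketch in Python; the `STRIKE_LIMIT` threshold, the `Account` fields and the `violates_standards` stub are illustrative assumptions, not any platform's actual implementation:

```python
from dataclasses import dataclass

STRIKE_LIMIT = 3  # hypothetical: this many violations leads to a permanent ban


@dataclass
class Account:
    handle: str
    strikes: int = 0
    suspended: bool = False
    banned: bool = False


def violates_standards(content: str) -> bool:
    """Stand-in for a moderator's judgment against community standards.

    This is the step the article argues is error-prone: a crude check like
    this cannot distinguish reclaimed or activist uses of a term from
    genuinely abusive ones.
    """
    blocked_terms = {"slur_a", "slur_b"}  # placeholder terms
    return any(term in content.lower() for term in blocked_terms)


def handle_report(account: Account, content: str) -> str:
    """Review one user report and apply the escalation ladder."""
    if not violates_standards(content):
        return "kept"
    account.strikes += 1
    if account.strikes >= STRIKE_LIMIT:
        account.banned = True
        return "removed; account permanently banned"
    account.suspended = True
    return "removed; account temporarily suspended"


acct = Account("activist_example")
# A post reclaiming a slur to raise awareness is wrongly removed:
print(handle_report(acct, "reclaiming slur_a to raise awareness"))
```

Even this toy version shows the failure mode described next: the review step sees only the flagged text, never the intent or context behind it.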
This mechanism is problematic. Due to the sheer volume of reports received daily, there are simply not enough moderators to review each report adequately, and the complexities and subtleties of language pose real challenges. Meanwhile, marginalized groups reclaiming abusive terms to raise public awareness, as in the MMIWG and BLM campaigns, can be misinterpreted as being abusive themselves.

Further, in flagging content, users tend to rely on [partisanship and ideology](https://www.wired.com/2017/03/google-and-facebook-cant-just-make-fake-news-disappear/). The user-reporting approach is driven by the popular opinion of a platform's users, while potentially repressing the right to unpopular speech.

Such an approach also emboldens a "[freedom to hate](https://doi.org/10.1080/14672715.2017.1341188)," where users exercise their right to voice their opinions while actively silencing others. A notable example is Facebook's removal of "[Freedom for Palestine](https://forward.com/schmooze/201639/coldplay-slammed-for-freedom-for-palestine-faceboo/)," a multi-artist collaboration posted by Coldplay, after a number of users reported the song as "abusive."

Third, platforms are increasingly using artificial intelligence (AI) to help identify and remove prohibited content. The idea is that complex algorithms using natural language processing can flag racist or violent content faster and better than humans possibly can. During the COVID-19 pandemic, social media companies have relied on AI even more heavily, to cover for the tens of thousands of human moderators who [were sent home](https://www.brookings.edu/techstream/covid-19-is-triggering-a-massive-experiment-in-algorithmic-content-moderation/). Now, more than ever, algorithms decide what users can and cannot post online.
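In practice, "using natural language processing to flag content" usually means a text classifier trained on human-labeled examples. The sketch below shows the general pattern with scikit-learn; the tiny training set, the `FLAG_THRESHOLD` operating point and the model choice are illustrative assumptions, vastly simpler than anything a platform actually deploys:

```python
# A minimal sketch of NLP-based content flagging: TF-IDF features feeding a
# logistic-regression classifier trained on human-labeled posts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set; real systems learn from millions of labeled posts.
posts = [
    "we stand together for justice",       # 0 = acceptable
    "raising awareness for MMIWG today",   # 0
    "I will hurt you, watch your back",    # 1 = violates policy
    "get out of our country, you people",  # 1
]
labels = [0, 0, 1, 1]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

FLAG_THRESHOLD = 0.5  # hypothetical operating point


def flag(post: str) -> bool:
    """Return True if the model would remove or down-rank the post."""
    p_violation = model.predict_proba([post])[0, 1]
    return p_violation >= FLAG_THRESHOLD


print(flag("marching for justice this weekend"))
```

Note where bias enters: the labels come from human annotators and the features come only from the surface text, so any systematic labeling error is learned by the model and then applied at scale.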
## Algorithmic biases

There is an inherent belief that AI systems are less biased and can scale better than human beings. In practice, however, they are prone to error and can impose bias on a colossal, systemic scale.

In two 2019 computational linguistics studies, researchers discovered that AI intended to identify hate speech may actually end up amplifying racial bias.

In [one study](https://homes.cs.washington.edu/%7Emsap/pdfs/sap2019risk.pdf), researchers found that tweets written in African American English, commonly spoken by Black Americans, were up to twice as likely to be flagged as offensive compared to others. Using a dataset of 155,800 tweets, [another study](https://arxiv.org/pdf/1905.12516.pdf) found a similarly widespread racial bias against Black speech.

What is considered offensive is bound to social context: terms that are slurs in some settings may not be in others. Algorithmic systems lack the ability to capture such nuance and contextual particularity, and that nuance can equally be lost on the human moderators who label the data used to train these algorithms. This means that natural language processing, often perceived as an objective tool for identifying offensive content, can amplify the same biases human beings hold.
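At root, the disparity these studies documented is a difference in error rates across speaker groups: benign posts from one group are wrongly flagged more often than benign posts from another. A minimal sketch of that kind of disparity audit, using entirely hypothetical records rather than the studies' data, might look like this:

```python
from collections import defaultdict

# Hypothetical audit records: (dialect group, actually offensive?, flagged by model?)
records = [
    ("African American English", False, True),
    ("African American English", False, True),
    ("African American English", False, False),
    ("African American English", True, True),
    ("Standard American English", False, False),
    ("Standard American English", False, True),
    ("Standard American English", False, False),
    ("Standard American English", True, True),
]

# False-positive rate per group: how often benign posts are wrongly flagged.
false_positives = defaultdict(int)
benign_total = defaultdict(int)
for group, offensive, flagged in records:
    if not offensive:
        benign_total[group] += 1
        if flagged:
            false_positives[group] += 1

for group in benign_total:
    fpr = false_positives[group] / benign_total[group]
    print(f"{group}: {fpr:.0%} of benign posts wrongly flagged")
```

A gap of exactly this kind, with benign posts in African American English flagged roughly twice as often, is what revealed the bias in the studies above.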
Algorithmic bias may jeopardize people who are already at risk by wrongly categorizing them as offensive, criminal or even terrorist. In mid-2020, Facebook deleted at least 35 accounts of [Syrian journalists and activists](https://www.nbcnews.com/tech/tech-news/facebook-doesn-t-care-activists-say-accounts-removed-despite-zuckerberg-n1231110) on the pretext of terrorism when, in reality, they were campaigning against violence and terrorism.

*Protesters gather in Winnipeg the day after the jury delivered a not-guilty verdict in the second-degree murder trial of Raymond Cormier, the man accused of killing Tina Fontaine. MMIWG activists had their posts removed from Instagram on a day meant to raise awareness. (THE CANADIAN PRESS/John Woods)*

The MMIWG, BLM and Syrian cases exemplify the dynamic of "[algorithms of oppression](https://nyupress.org/9781479837243/algorithms-of-oppression/)," in which algorithms reinforce older oppressive social relations and install new modes of racism and discrimination.

While AI is celebrated as an autonomous technology that can develop away from human intervention, it is inherently biased. The inequalities that underpin bias already exist in society and influence who gets the opportunity to build algorithms and their databases, and for what purpose. As such, algorithms do not intrinsically offer marginalized people a way to escape discrimination; instead, they reproduce new forms of inequality along social, racial and political lines.

Despite these apparent problems, algorithms are here to stay. There is no silver bullet, but there are steps that can minimize bias. The first is to recognize that there is a problem; the next is to commit firmly to rooting out algorithmic bias. Bias can infiltrate the process at any point in the design of an algorithm.

The inclusion of more people from diverse backgrounds in that process, including Indigenous people, racial minorities, women and other historically marginalized groups, is one of the important steps toward mitigating the bias. In the meantime, it is important to push platforms to allow for as much transparency and public oversight as possible.

---

This article is republished from [The Conversation](https://theconversation.com/institutions/carleton-university-900) under a Creative Commons license. Carleton University is a member of this unique digital journalism platform that launched in June 2017 to boost visibility of Canada's academic faculty and researchers. Interested in writing a piece? Please contact [Steven Reid](mailto:steven.reid3@carleton.ca) or [sign up to become an author](https://theconversation.com/become-an-author).

*All photos provided by The Conversation from various sources.*

[Carleton Newsroom](https://newsroom.carleton.ca/)