{"id":153,"date":"2022-02-16T11:13:18","date_gmt":"2022-02-16T16:13:18","guid":{"rendered":"https:\/\/carleton.ca\/populistpublics\/?p=153"},"modified":"2026-03-24T09:12:08","modified_gmt":"2026-03-24T13:12:08","slug":"who-watches-the-watchmen-digital-moderation-in-the-age-of-misinformation","status":"publish","type":"post","link":"https:\/\/carleton.ca\/populistpublics\/2022\/who-watches-the-watchmen-digital-moderation-in-the-age-of-misinformation\/","title":{"rendered":"\u201cWho watches the watchmen?\u201d:  Digital Moderation in the Age of Misinformation"},"content":{"rendered":"\n<section class=\"w-screen px-6 cu-section cu-section--white ml-offset-center md:px-8 lg:px-14\">\n    <div class=\"space-y-6 cu-max-w-child-5xl  md:space-y-10 cu-prose-first-last\">\n\n            <div class=\"cu-textmedia flex flex-col lg:flex-row mx-auto gap-6 md:gap-10 my-6 md:my-12 first:mt-0 max-w-5xl\">\n        <div class=\"justify-start cu-textmedia-content cu-prose-first-last\" style=\"flex: 0 0 100%;\">\n            <header class=\"font-light prose-xl cu-pageheader md:prose-2xl cu-component-updated cu-prose-first-last\">\n                                    <h1 class=\"cu-prose-first-last font-semibold !mt-2 mb-4 md:mb-6 relative after:absolute after:h-px after:bottom-0 after:bg-cu-red after:left-px text-3xl md:text-4xl lg:text-5xl lg:leading-[3.5rem] pb-5 after:w-10 text-cu-black-700 not-prose\">\n                        \u201cWho watches the watchmen?\u201d:  Digital Moderation in the Age of Misinformation\n                    <\/h1>\n                \n                                \n                            <\/header>\n\n                    <\/div>\n\n            <\/div>\n\n    <\/div>\n<\/section>\n\n<p>&#8212; by Nicholas Surges<\/p>\n\n\n\n<figure class=\"wp-block-image alignleft\"><img loading=\"lazy\" decoding=\"async\" width=\"240\" height=\"160\" src=\"https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-240x160.jpg\" 
alt=\"\" class=\"wp-image-154\" srcset=\"https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-240x160.jpg 240w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-600x400.jpg 600w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-300x200.jpg 300w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-160x107.jpg 160w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-768x512.jpg 768w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-400x267.jpg 400w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons-360x240.jpg 360w, https:\/\/carleton.ca\/populistpublics\/wp-content\/uploads\/sites\/223\/Social_media_icons.jpg 960w\" sizes=\"auto, (max-width: 240px) 100vw, 240px\" \/><\/figure>\n\n\n\n<p>Meta (formerly known as Facebook, Inc.) has increasingly come under scrutiny for the role it plays in shaping the social media landscape. As the largest social media company, it has set many precedents in how hate speech, misinformation, and manipulated or inauthentic content are moderated online.<\/p>\n\n\n\n<p>This raises an interesting question. Can social media corporations be trusted to act in the public interest, or will their moderation always serve their primary interest of protecting their shareholders?<\/p>\n\n\n\n<p>On October 4<sup>th<\/sup>, 2021, former Facebook employee Frances Haugen testified before the United States Senate Committee on Commerce, Science, and Transportation to argue the need for greater regulation of social media platforms. 
Haugen had been responsible for leaking internal documents to the Wall Street Journal as part of its ongoing Facebook Files investigation into unethical behaviour by the social media giant.<\/p>\n\n\n\n<p>In her testimony, Haugen defended her decision to speak out against her former employer. As she stated: \u201cThe company\u2019s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world. The documents I have provided prove that Facebook has repeatedly misled us about what its own research reveals about the safety of children, its role in spreading hateful and polarizing messages, and so much more.\u201d Haugen went on to argue that Meta\u2019s lack of transparency makes it difficult to hold the company accountable for unethical behaviour.<a href=\"#_ftn1\" name=\"_ftnref1\">[1]<\/a><\/p>\n\n\n\n<p>According to Meta\u2019s policy rationale, much of Facebook &amp; Instagram\u2019s moderation is automated, using artificial intelligence and machine learning to remove offensive content as it is posted. This is particularly useful for duplicate posts of previously flagged material. Human review teams exist to provide further input in cases where Facebook\u2019s algorithms are unable to determine whether or not a post is offensive. A team of over 15,000 full-time human reviewers reviews cases flagged by machine-based automoderators and makes final rulings.<\/p>\n\n\n\n<p>Of course, beyond what the public has been given through Meta\u2019s transparency centre, the details of its moderation system remain vague. 
While Meta claims that its human reviewers receive over 80 hours of live training, its policy centre offers no information about what credentials are needed to become a moderator or how the company addresses policy gaps related to cultural context, language fluency, and artistic expression.<\/p>\n\n\n\n<p>An October 25, 2021 article by Reuters called attention to the fact that Meta\u2019s moderation has not kept pace with its global expansion: as the giant continues to spread into new markets, the languages spoken in those markets pose a stumbling block to its algorithms\u2019 and human staff\u2019s abilities to flag abusive content.<\/p>\n\n\n\n<p>A notable example is the lack of functionality in Burmese, the language spoken in Myanmar. As the country is still rocked by an ongoing ethnic conflict, Facebook\u2019s inability to moderate content written in Burmese means that it cannot properly address content stoking ethnic hatred.<a href=\"#_ftn2\" name=\"_ftnref2\">[2]<\/a><\/p>\n\n\n\n<p>India is another case study in Facebook\u2019s failure to provide the language support necessary to police problematic content. The country is home to over 300 million Facebook users \u2013 the largest number in the world \u2013 and yet Facebook only provides service in 11 of India\u2019s 22 official languages.<a href=\"#_ftn3\" name=\"_ftnref3\">[3]<\/a> Given the country\u2019s religious tensions between Hindus and Muslims, this seems like a glaring oversight.<\/p>\n\n\n\n<p>These shortcomings are at least ostensibly addressed by the existence of an Oversight Board, an independent body to which appeals can be made regarding rulings by moderation teams. The board is made up of experts in human rights, journalism and freedom of expression, and other relevant policy areas. 
While the members of the board are appointed by the company, they are not accountable to it in the way that full-time moderation staff are.<\/p>\n\n\n\n<p>The Oversight Board allows users to contest rulings by Facebook\u2019s full-time human moderators, who sometimes apply community standards without considering the context of a post. In case 2021-012-FB-UA, a wampum belt titled \u201cKill the Indian\/Save the Man\u201d was deemed hate speech. The Oversight Board would later overturn this ruling, stating that \u201cin context [the use of the] phrase draws attention to and condemns specific acts of hatred and discrimination.\u201d<a href=\"#_ftn4\" name=\"_ftnref4\">[4]<\/a><\/p>\n\n\n\n<p>Similarly, a post containing a quote misattributed to Joseph Goebbels (\u201cArguments must be crude, clear, and forcible, and appeal to emotions and instincts, not the intellect\u201d) was removed because Goebbels is on the company\u2019s list of dangerous individuals. The quote is actually from the foreword to British historian Hugh Redwald Trevor-Roper\u2019s <em>Final Entries, 1945<\/em>, which was based on Goebbels\u2019s recovered diary entries.<a href=\"#_ftn5\" name=\"_ftnref5\">[5]<\/a> While it is thus not one of Goebbels\u2019 quotes, it does serve as an apt encapsulation of the propaganda policies of the Third Reich.<\/p>\n\n\n\n<p>The user posted the quote in order to make a statement about demagoguery, crypto-fascism, and populist appeals to emotion (specifically in reference to Trumpism in the United States), but in its initial decision Facebook ruled that quotes by dangerous individuals cannot be shared unless the user makes it explicitly clear that the intent is to counter hate speech or extremism, or to share the quote for educational or news purposes. 
This requirement is not explicitly stated in Facebook\u2019s public-facing policies.<\/p>\n\n\n\n<p>As the Oversight Board noted when it overturned the decision on the grounds that the quote did not in itself support the Nazi regime or hate speech, there is a gap between what is explicitly permitted or banned in Facebook\u2019s public-facing community standards and the criteria used by human moderators employed by the company.<a href=\"#_ftn6\" name=\"_ftnref6\">[6]<\/a> The case is also interesting because the actual provenance of the quote was never at issue: the fact that Goebbels never actually said it didn\u2019t figure into the ruling, which has troubling implications for historical revisionism and misinformation.<\/p>\n\n\n\n<p>As is probably already evident from the complexity of some of the previously mentioned rulings, this higher system of appeal has proved imperfect. Some of the reasons are purely mechanical: appeals can only be launched by users with an active account, only on posts that have already been reviewed, and must be submitted within 15 days of the initial ruling. This means that users who have already been banned or who have deleted their accounts have no means of submitting an appeal to the board. Launching an appeal also requires knowledge of the Oversight Board, which many users may not even know exists.<\/p>\n\n\n\n<p>After a September 13, 2021 report in the Wall Street Journal called attention to hypocrisy in Facebook\u2019s XCheck program, the board was forced to examine whether or not Facebook was consistently applying its professed standards. This program, which covers high-profile users or organizations deemed \u201cimportant\u201d, \u201cpopular\u201d, or \u201cPR-risky\u201d, includes a whitelist. 
Users or pages added to this list are treated more leniently than ordinary users.<\/p>\n\n\n\n<p>In the subsequent announcement, the Oversight Board concluded that \u201cFacebook has not been fully forthcoming with the board on its \u2018cross-check\u2019 system, which the company uses to review content decisions relating to high-profile users.\u201d<a href=\"#_ftn7\" name=\"_ftnref7\">[7]<\/a> The board also noted that Facebook does not fully comply with its requests for the further information needed to inform rulings, denying some requests for further context as \u201cirrelevant\u201d \u2013 a judgement that should probably be made by the board itself.<\/p>\n\n\n\n<p>Furthermore, the board noted in its report for the third quarter of 2021 that Meta was fully implementing only nine of the board\u2019s 25 recommendations. Four it claimed to be implementing \u201cin part\u201d, five it claimed to be \u201cassessing feasibility\u201d of, five it claimed it was already doing, and two it rejected outright. This illustrates how the board\u2019s recommendations may not always be implemented in full: Meta may take them under advisement but is under no obligation to act upon them.<a href=\"#_ftn8\" name=\"_ftnref8\">[8]<\/a><\/p>\n\n\n\n<p>Taken as a whole, these findings suggest that social media companies cannot, at present, be trusted to act in the public good. 
Is it time to start pushing for greater transparency and accountability from the social media sector?<\/p>\n\n\n\n<p>*Image used courtesy of <a class=\"extiw\" title=\"w:en:Creative Commons\" href=\"https:\/\/en.wikipedia.org\/wiki\/en:Creative_Commons\">Creative Commons<\/a>&nbsp;<a class=\"external text\" href=\"https:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/deed.en\" rel=\"nofollow\">Attribution-Share Alike 4.0 International<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref1\" name=\"_ftn1\">[1]<\/a> <em>Statement of <\/em><em>Frances Haugen, Before the Sub-Committee on Consumer Protection, Product Safety, and Data Security<\/em>, October 4, 2021. <a href=\"https:\/\/www.commerce.senate.gov\/services\/files\/FC8A558E-824E-4914-BEDB-3A7B1190BD49\">https:\/\/www.commerce.senate.gov\/services\/files\/FC8A558E-824E-4914-BEDB-3A7B1190BD49<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref2\" name=\"_ftn2\">[2]<\/a> Elizabeth Culliford and Brad Heath, \u201cFacebook knew about, failed to police, abusive content globally \u2013 documents\u201d, <em>Reuters<\/em>, October 25 2021. <a href=\"https:\/\/www.reuters.com\/technology\/facebook-knew-about-failed-police-abusive-content-globally-documents-2021-10-25\/\">https:\/\/www.reuters.com\/technology\/facebook-knew-about-failed-police-abusive-content-globally-documents-2021-10-25\/<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref3\" name=\"_ftn3\">[3]<\/a> Salimah Shivji, \u201cFacebook has a massive disinformation problem in India. This student learned firsthand how damaging it can be\u201d, <em>CBC News<\/em>, December 9 2021. <a href=\"https:\/\/www.cbc.ca\/news\/world\/india-facebook-disinformation-1.6276857\">https:\/\/www.cbc.ca\/news\/world\/india-facebook-disinformation-1.6276857<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref4\" name=\"_ftn4\">[4]<\/a> <em>Case decision 2021-012-FB-UA<\/em>, Oversight Board, December 9 2021. 
<a href=\"https:\/\/oversightboard.com\/decision\/FB-L1LANIA7\/\">https:\/\/oversightboard.com\/decision\/FB-L1LANIA7\/<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref5\" name=\"_ftn5\">[5]<\/a> Joseph Goebbels and H. R. Trevor-Roper. <em>Final Entries, 1945: the Diaries of Joseph Goebbels<\/em>. Edited and Introduced by Hugh Trevor-Roper. New York: Putnam, 1978.<\/p>\n\n\n\n<p><a href=\"#_ftnref6\" name=\"_ftn6\">[6]<\/a> <em>Case decision 2020-005-FB-UA<\/em>, Oversight Board, January 28, 2021. <a href=\"https:\/\/oversightboard.com\/decision\/FB-L1LANIA7\/\">https:\/\/oversightboard.com\/decision\/FB-L1LANIA7\/<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref7\" name=\"_ftn7\">[7]<\/a> \u201cOversight Board demands more transparency from Facebook\u201d, <em>Oversight Board, <\/em>October 2021. <a href=\"https:\/\/oversightboard.com\/news\/215139350722703-oversight-board-demands-more-transparency-from-facebook\/\">https:\/\/oversightboard.com\/news\/215139350722703-oversight-board-demands-more-transparency-from-facebook\/<\/a><\/p>\n\n\n\n<p><a href=\"#_ftnref8\" name=\"_ftn8\">[8]<\/a> \u201cOversight Board demands more transparency from Facebook\u201d, <em>Oversight Board, <\/em>October 2021.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8212; by Nicholas Surges Meta (formerly known as the Facebook Corporation) has increasingly come under scrutiny for the role it plays in shaping the social media landscape. As the largest social media company, it has set many precedents in how hate speech, misinformation, and manipulated or inauthentic content is moderated online. 
This raises an interesting [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":"","_links_to":"","_links_to_target":""},"categories":[23],"tags":[50,51,48,52,49],"class_list":["post-153","post","type-post","status-publish","format-standard","hentry","category-commentary","tag-community-standards","tag-facebook","tag-meta","tag-misinformation","tag-moderation"],"acf":{"cu_post_thumbnail":""},"_links":{"self":[{"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/posts\/153","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/comments?post=153"}],"version-history":[{"count":3,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/posts\/153\/revisions"}],"predecessor-version":[{"id":160,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/posts\/153\/revisions\/160"}],"wp:attachment":[{"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/media?parent=153"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/categories?post=153"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/carleton.ca\/populistpublics\/wp-json\/wp\/v2\/tags?post=153"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}