Photomontage by Grace Girotte


The photomontage, titled “G(O)ogol”, addresses the interlocking of race, gender, and class while revealing structural inequalities reproduced by Google’s search engine. It aims to draw attention to “algorithmic oppression” and the systemic racism encoded in software and inherent in technology’s predictive analytics and search platforms. The textual and visual design of the photomontage was inspired by the Dada art movement, which aimed to dismantle traditional values in art. G(O)ogol, in a similar sense, aims to dismantle the normative, “traditional” old-media representations of race, gender, and class that have carried over into new media forms.

G(O)ogol features images retrieved from Google searches for high-salary and low-salary jobs. From doctors, lawyers, and professors to care workers, nannies, and retail workers, the search results reveal how structural inequalities are reproduced on the Internet. It has been observed, for instance, that when “nanny” is searched, the image results are predominantly women of color; when “doctor” is searched, the results are primarily white men and women.

It has become increasingly clear that Google’s algorithmic conceptualizations of race, gender, and class reproduce inequalities that exist in society. Safiya Noble (2018) explains that the very presence of these misrepresentations in search results is misunderstood, clouded by dominant narratives that perpetuate the idea that search engines are “authentic” and “lack bias”. Search results reflect the values and norms of Google’s commercial partners and advertisers, which often mirror society’s normalized and extremely profitable values. Google’s algorithmic practice of biasing information toward the interests of neoliberal capital and social elites results in information that appears credible but is actually a reflection of advertising interests.

The example of the Black female nanny and care worker reproduced by search engines such as Google aligns with the historical and damaging stereotype of the mammy. This controlling image represents Black women as nurturing, obedient domestic servants and caretakers, and it is central to the intersecting oppressions of race, gender, and class. Google fails to provide culturally situated knowledge about how Black women and girls have been misrepresented and discriminated against through media, both historically and today. The commodified online status of Black women’s and girls’ bodies is defined by a technological system that does not take into account the broader social, political, and historical significance of racist and sexist representations.

The intersection of race, gender, and class, and its impact, is visually represented by the white male doctor looming in the background of the photomontage, the white female lawyer connected by the coiled copper of an intrauterine device (IUD), and the discolored, hourglass-figured ivory tower. These visual representations are an ode to the Dada art movement. In their nonsensical nature, these visuals represent surveillance, control, and the male pornographic gaze bound to the traditional values of old media. Yet, as Noble argues, the traditional misrepresentations of old media are made real once again online, situated within an authoritative mechanism trusted by the public: Google.

G(O)ogol challenges the presumed authenticity of search engines by drawing attention to the biased algorithmic conceptualizations of race, gender, and class produced by Google Search. The textual component of the photomontage is a screenshot of Google’s form for reporting inappropriate search predictions. While Google has since made structural changes to its algorithm, the system remains deeply flawed, and algorithmic oppression persists within these technological spaces. Even in the face of evidence, the public struggles to hold tech companies accountable for their harmful errors. As Google consumers, we must interrogate the power of algorithms and the human prejudice, misunderstandings, and biases encoded within them.

Grace Girotte is a fourth-year Communication and Media Studies student at Carleton University, minoring in Women’s and Gender Studies and Sociology. Grace is interested in combining media studies with feminist thought to question systems of power and privilege.