Black feminism

Algorithms of Oppression: How Search Engine Results Represent Black Women and Girls

Black Feminist Technology Studies

In her book Algorithms of Oppression, Safiya Noble, an associate professor of Information Studies and African American Studies at UCLA, proposes Black feminist technology studies as an alternative framework for the analysis of racialized and gendered identities on the internet. Through this framework, scholars can examine the way the internet mediates power relations through the matrix of domination.

Noble emphasizes that Black feminist technology studies can offer new narratives around Black people and technology. This approach differs sharply from the digital-divide framework, which presumes technological deficiency among Black internet users. Instead, Black feminist technology studies brings attention to the need for control over one's personal and social identity on the internet.

How the Google Search Engine Represents Black Women and Girls

Through a close reading of results from Google's search engine, Noble uses a Black feminist perspective to explain how algorithms replicate longstanding social inequality. The documentation of Black women and girls in search results shows how technology can adversely affect Black people, thus challenging the colorblind rhetoric of neutrality popular in Silicon Valley. Google as a corporation relies on commercial content moderation to control and profit from the information it organizes. Through mechanisms like AdWords, other corporations invested in generating traffic to their sites exploit the identities of Black women and girls. This led to an information ecosystem in which the most popular search engine results for Black girls were those that led to pornographic websites.

Noble contends that the contemporary tech landscape presumes that the white male gaze dominates the use and creation of information and communication technologies. Under this gaze, pornographic depictions of Black women arise from cultural practices of sexualization rooted in the controlling image of the Jezebel. As a result, Google profits from the sexualized representation of Black women.

How Search Engines Perpetuate Racism

Noble notes that the pitfalls of racial and gender classification are not new to search engines. She argues that search engines are a modernization of the Dewey Decimal System and import its biased classification practices onto the internet. Search engines also reflect how tech companies in Silicon Valley monopolize information through neocolonial projects that exploit the digital divide.

Noble refers to the case of Dylann Roof to address how information on Google is filtered through a white racial frame, drawing on Jessie Daniels' work on cyber racism. This approach uses race as a framework to examine unequal distributions of power related to the internet. Noble asks what would happen if search engines had a classification system that identified sources of information in a manner that helped distinguish white nationalist websites from scholarly discussions of race. Technological racialization results from Google acting as what she describes as a “broker of cultural imperialism” and an information gatekeeper for representations of race mediated by the internet.

Alternative Approaches for Information and the Internet

One way Noble applies Black feminist technology studies is to propose the use of public policy to develop a non-commercial public search engine. Additionally, Noble points out that corporate search engines currently have no obligation to delete or destroy records as a means of protecting individual privacy. Beyond that, they offer little transparency about their methods of data retention. Public policy such as right-to-be-forgotten laws could potentially resolve some of these issues, but no such laws currently exist in the United States.

Overall, Algorithms of Oppression helps us think about the way contemporary neoliberal discourse around the coding gap affects Black women as technology users and producers. The presumption that Black women in tech are uniquely positioned to address algorithmic bias fails to recognize that the engineering curriculum in the United States largely ignores the matrix of domination. As a result, controlling images of Black women continue to be reproduced on the internet.
