
Don’t Google It! How Search Engines Reinforce Racism

Algorithms of Oppression addresses the growing concern about commercial control over information and the harm it does to communities.

Algorithms of Oppression: How Search Engines Reinforce Racism
Safiya Umoja Noble
NYU Press
Feb 2018

Rarely has a single entity come to control so much of our world’s information so swiftly. In September 1998, the Google search engine launched. By June 2006, the Oxford English Dictionary had accepted ‘Google’ as a new verb. By 2012, 83 percent of search engine users were using Google.

Indeed, Google has become a gateway to information for hundreds of millions of people. Don’t know something? Just Google it.

Not so fast, though. What slogans like ‘Just Google it’ ignore is that the information Google chooses to give you is heavily mediated by dominant cultural norms and by commercial and political interests. The technology doesn’t simply reach out into the Web and grab the most objectively useful information for you. It gives you what someone else thinks you ought to be reading in response to your question. Who is that other person? How do they decide what information you ought to receive? What influences and incentives come to bear on them? The people who design Google’s algorithms are, ultimately, deciding what information you’re given, and those decisions are in turn shaped by their own cultural and class backgrounds, their experiences, and their desire for advertising revenue.

There’s been a growing swell of concern in the academic community about the stranglehold that commercial, for-profit search engines have over access to information. Safiya Umoja Noble builds on this body of work in Algorithms of Oppression: How Search Engines Reinforce Racism to demonstrate that search engines, and Google in particular, are not simply imperfect machines but systems designed by humans in ways that replicate the power structures of the Western countries where they are built, complete with all the sexism and racism built into those structures.

Many users operate under the misperception that a Google search returns an objective listing of the most common or popular sites pertaining to their query. In fact, it returns what Google’s commercial advertisers either expect or want a user to be satisfied with. Search engine optimization strategies help ensure that those with money can rig searches to keep their sites at the top, and far from combating this trend, Google and other technology companies have increasingly integrated revenue-generation strategies into their information technology products.
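To make that mechanism concrete, here is a minimal toy sketch of revenue-weighted ranking. It is not Google’s actual algorithm, which is proprietary; every field, weight, and URL below is hypothetical, invented only to illustrate the point that a ranking function which optimizes for revenue can bury relevance.

```python
# A toy illustration of revenue-weighted ranking. This is NOT Google's
# algorithm (which is proprietary); the fields, weights, and URLs are
# hypothetical, chosen only to make the point concrete: when a ranking
# function optimizes for revenue, relevance can lose.

from dataclasses import dataclass

@dataclass
class Page:
    url: str
    relevance: float   # how well the page answers the query (0..1)
    ad_revenue: float  # expected advertising revenue per click (0..1)

def rank(pages, revenue_weight=0.7):
    """Order results by a blend of relevance and expected revenue.

    With revenue_weight > 0.5, a lucrative but barely relevant page
    can outrank a highly relevant, unmonetized one.
    """
    def score(p):
        return (1 - revenue_weight) * p.relevance + revenue_weight * p.ad_revenue
    return sorted(pages, key=score, reverse=True)

results = rank([
    Page("community-resource.org", relevance=0.9, ad_revenue=0.1),
    Page("heavily-monetized.com", relevance=0.3, ad_revenue=0.9),
])
print([p.url for p in results])  # the monetized page ranks first
```

What the book objects to is the design choice of putting any weight at all on revenue, not the particular numbers.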

Moreover, the architecture of search engines is constructed by predominantly white male engineers, without any real understanding of how the world works for anyone unlike them (as recently as 2016, only two percent of Google’s workforce was African American and only three percent Latino). Consequently, the search engines they build are implicitly designed to respond to the needs of people like them and to provide the answers they expect will satisfy users who share those identities.

Noble suggests, ultimately, that search engines and information access on the Internet need to be removed from commercial interests and brought under public control. Only then will it be possible to have an authentic public dialogue about how to construct web systems and search engines that respond to the needs and expectations of a broader diversity of communities and users; that represent communities, and especially marginalized groups of people, in ways those groups find acceptable; and that don’t do active harm to the people using the Internet.

Search Engines Are Important

What Noble explores in her book isn’t just an abstract exercise; it’s important. As she argues, “artificial intelligence will become a major human rights issue in the twenty-first century.”

Noble makes a strong case that present technologies and search engines are not just imperfect but enact actual harm on people and communities. Her book dwells at length on the example of Google searches for “black girls” or “black women”, which at the time of her research invariably produced pornographic sites. What does it mean, and how does it affect a community, when its members search for information about their own identity and are subjected to pages of violent and pornographic objectification? What does that say about how the world perceives, understands, and values them?

In response, some might argue that it’s not the fault of the technology but of its human users: these are simply common associations that people make. Don’t blame the technology for human shortcomings, goes the argument. And besides, technology will eventually surmount this problem.

That’s patently wrong, argues Noble (and not just Noble; she cites a large and growing body of research that reinforces her arguments). Pornographic sites appear for a reason: the user is presumed (or desired, by the lucrative online porn industry) to be a middle-class white man searching for pornography. It’s a matter of a commercial technology product attempting to optimize fiscal return for the companies that advertise on its systems. While the algorithms might be adjusted ever so slightly when results provoke public outrage (as they seem to have been), that doesn’t change the fact that the technology is predicated on an inherently exploitative commercial premise. It doesn’t provide you real, objective information; it provides whatever is profitable for advertisers. Pornography is profitable, and without direct human intervention, algorithms designed to maximize profit will continue to associate black women with porn, assuming (or hoping) that’s what users want to find.

A useful analogy might be that of a library. Imagine if you walked into a library seeking information for your children, asked a librarian what information they had about young black girls, and had a stack of pornographic magazines tossed at you because librarians were paid kickbacks to get people to read porn. The fact is, that wouldn’t happen — because libraries are public institutions whose operations are premised on intellectual integrity and have undergone rigorous public scrutiny and dialogue. Not so search engines.

Google Is a Big Part of the Problem But Not the Only Part

It’s not just Google that’s the problem, although given Google’s dominance over the Internet it’s a central concern. Other web systems replicate the same biases. Noble concludes her book with a searing indictment of Yelp, exploring how it marginalizes businesses owned by African Americans. Not only does Yelp prioritize certain types of reviewers and businesses, but the algorithms it uses to monitor searches and reviews are predicated on white middle-class behaviour. As one of Noble’s research participants observes, African Americans are less likely to engage in some of the behaviours that make for good algorithmic standing. Often targeted by police and monitored by government institutions, they’re less likely to register with sites and share personal information, or to ‘check in’ and allow public monitoring of their whereabouts. Consequently, the businesses they frequent, often owned by other African Americans, suffer in search results even though they might be quite popular with customers.

Thus begins a cycle that hurts African American businesses: they become buried in search results and deprived of new customers in an increasingly digital age. Even when those businesses try to get their customers to support them electronically, African American online behaviour differs sufficiently from middle-class white American behaviour that it often gets picked up by algorithmic monitoring and security software and deleted as an invalid effort to rig the system. Noble’s research participants note that while search engines purport to be free-of-charge aggregators of public information, they in fact pursue increasing layers of payment from businesses (the ‘first fix’ is free) in exchange for a sustained presence in search results. Businesses that can’t pay up disappear from online listings.
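The dynamic Noble’s participants describe can be sketched in miniature. What follows is a hypothetical illustration, not Yelp’s actual filter (which is undisclosed): a review filter that treats registration, check-ins, and a long review history as trust signals will silently discard genuine reviews from users who, for the reasons Noble documents, avoid exactly those behaviours.

```python
# A hypothetical sketch, NOT Yelp's actual filter (which is undisclosed):
# a naive review filter that counts "trust signals" common among one
# demographic. Users who avoid registration or location sharing, for the
# reasons Noble documents, fall below the threshold, and their genuine
# reviews are discarded, hurting the businesses they review.

def looks_legitimate(review: dict, min_signals: int = 2) -> bool:
    """Count trust signals; keep only reviews meeting the threshold."""
    signals = [
        review.get("registered_account", False),  # signed up with real info
        review.get("checked_in", False),          # shared their location
        review.get("review_history", 0) >= 5,     # long public track record
    ]
    return sum(signals) >= min_signals

# A real customer who declines to register or check in is filtered out.
review = {"registered_account": False, "checked_in": False, "review_history": 1}
print(looks_legitimate(review))  # False: a genuine review, silently dropped
```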

The upshot is this: because the algorithms that monitor and rank online behaviour are predicated on white middle- and upper-class norms, other groups, and especially African Americans, find themselves marginalized online as well as off. Because search engines are designed and operated by commercial companies, the results they produce are biased in favour of whatever will bring their parent companies more revenue. Unsuspecting users contribute to this cycle, thinking they’re receiving honest and unbiased information when in fact they’re receiving what commercial and corporate advertisers want them to receive, reinforcing profitable and commercially exploitative gendered and racialized norms in the process.

The point that Noble makes, and one we should remind ourselves of, is that these things matter. It’s not good enough to ignore them and assume that technology will eventually right itself, that the kinks will be worked out and everything will come out equal. That’s never worked in the real world, and it’s just as unlikely to work in the online world, in no small part because the online world is created by the very same people who exist in, and are shaped by, the real world. Technology does not exist in a vacuum, and machines are not objective: they are created by humans and share the faults of the humans who create them.

“[R]acism and sexism are part of the architecture and language of technology,” she writes. “We need a full-on reevaluation of the implications of our information resources being governed by corporate-controlled advertising companies.”

Even if the technology is finessed and ‘improved’, that still doesn’t change the underlying problem of search engines: a small group of people in an office somewhere, who escape public scrutiny and accountability, are deciding how entire communities are defined and represented online. If Google were to try to refine its searches to more accurately reflect the interests and behaviour of Latinos, for instance, who is defining what that behaviour is? A group’s agency to discursively produce, control, and change its own identity and behaviour patterns is smothered by Google’s overpowering ability to drown it out, presenting an entirely different (if algorithmically consistent) representation of that group to the world.

Users Don’t Realize They’re Being Duped

“The very notion that technologies are neutral must be directly challenged as a misnomer,” writes Noble.

The fact is, search engines do not simply produce objective, factual data. They produce data shaped and prioritized by the interests of the advertisers that give Google its revenue, not by any objective measure of what is most useful or relevant to the user conducting a search.

“[T]here are many myths about the Internet, including the notion that what rises to the top of the information pile is strictly what is most popular as indicated by hyperlinking. Were that even true, what is most popular is not necessarily what is most true.”

Most users don’t realize the rigged nature of the data they access through search engines. Noble quotes research conducted by the Pew Internet and American Life Project: 73 percent of Americans use search engines, and 83 percent of search engine users use Google. Of those users, 73 percent say that “most or all the information they find as they use search engines is accurate and trustworthy.” Yet 62 percent of search engine users “are not aware of the difference between paid and unpaid results,” Noble writes.

Indeed, there’s a vicious cycle at work. Many technology companies are unwilling to publicly share or divulge their software and algorithms, for fear that doing so would make it easier for people to find ways to work around them. The resulting lack of public dialogue disenfranchises significant populations from the Internet, or misrepresents those populations and groups in ways that do them grievous harm.

The harm, she notes, is real. While research is still at an early stage in understanding the full scale of social and psychological impact that online behaviour has on the public, what we do know is that “unregulated digital platforms cause serious harm. Trolling is directly linked to harassment offline, to bullying and suicide, to threats and attacks.”

We Need Public Control Over Search

“The more we can make transparent the political dimensions of technology, the more we might be able to intervene in the spaces where algorithms are becoming a substitute for public policy debates over resource distribution – from mortgages to insurance to educational opportunities,” argues Noble. Any company that controls over 80 percent of a market, as Google does with search, is too powerful and needs to be broken up. Moreover, just as library searches operate under publicly funded and publicly scrutinized methods, search engines need to be brought under similarly accountable control. There needs to be public control of, and transparency about, the algorithms and methodologies search engines use, because they are how people gather information in today’s world. That’s too important a role to concede to commercial, profit-seeking enterprises.

“What we find in search engines about people and culture is important,” writes Noble. “Search results can reframe our thinking and deny us the ability to engage deeply with essential information and knowledge we need, knowledge that has traditionally been learned through teachers, books, history, and experience. Search results, in the context of commercial advertising companies, lay the groundwork… for implicit bias: bias that is buttressed by advertising profits. Search engine results also function as a type of personal record and as records of communities, albeit unstable ones. In the context of commercial search, they signal what advertisers think we want, influenced by the kinds of information algorithms programmed to lead to popular and profitable web spaces.”

Search engines do more than give us information, says Noble. They shape contemporary understandings of the world. That means their role is too important to leave to profit-seeking corporations.

“Search does not merely present pages but structures knowledge, and the results retrieved in a commercial search engine create their own particular material reality. Ranking is itself information that also reflects the political, social, and cultural values of the society that search engine companies operate within, a notion that is often obscured in traditional information science studies.”

A growing awareness of the white, patriarchal bias of search engines has led to the development of culturally situated search engines. Examples include Blackbird, The Blackfind, BlackWebPortal, and GatewayBlackPortal, all designed for African American users. Others, such as JGrab, Jewist.net, JewGotIt, Maven Search, and Jewogle, are designed for Jewish users. The premise on which they are based is the same: search engines reflect and reinforce cultural and racialized biases.

But are culturally situated search engines the answer? Not necessarily, argue some scholars: they might circumvent the bias for the users who access them, but they don’t change or challenge the prevailing norms of the web. The better solution, Noble suggests, is to take search out of the hands of corporations and bring it under public accountability.

“What we need are public search engine alternatives, united with public-interest journalism and librarianship, to ensure that the public has access to the highest quality information available.”

What we also need, Noble argues, is accountability. She cites data from July 2016 showing that only two percent of Google’s workforce was African American, and only three percent Latino. But the problem may be more deeply rooted than that. Inherent in the operating philosophy of many of these tech companies is a colonialist belief that technology is an untapped frontier that they have an implicit right to explore and conquer. Using the example of Google Glass, Noble argues perceptively:

“The lack of introspection about the public wanting to be surveilled at the level of intensity that Google Glass provided is part of the problem: centuries-old concepts of conquest and exploration of every landscape, no matter its inhabitants, are seen as emancipatory rather than colonizing and totalizing for people who fall within its gaze. People on the street may not characterize Google Glass as a neocolonial project in the way we do, but they certainly know they do not like seeing it pointed in their direction… The neocolonial trajectories are not just in products such as Google Glass but exist throughout the networked economy, where some people serve as the most exploited workers…”

Noble ends on a pessimistic note; her book went to print as the Trump dark ages were descending upon America. A quick escape from corporate disinformation and exploitation now seems more remote than ever. Yet the imperative is, conversely, more important than ever. If we do not dislodge commercial search engines from the role they play in our society, and bring search under democratic, publicly accountable, inclusive scrutiny and control, we risk allowing them not only to disinform us but to reshape our society.

RATING 9 / 10