Is Google racist, or are we?

E
3 min read · Dec 31, 2020

Who is responsible for this real-world data issue?

Upon reviewing Noble’s (2018) book ‘Algorithms of Oppression: How Search Engines Reinforce Racism’, I found myself frustrated with Google. Is it possible that our search engines are saturated with racial prejudice?

Throughout the book, Noble presents a body of research showing the power algorithms hold within society. According to Noble, the internet is automated to discriminate against minority groups. The book draws on multiple case studies in which algorithms display systemic racism. She argues that Google’s search engine supports cultural inequalities and reinforces data discrimination.

Copyright [Raceforward]

Firstly, in April 2020, Google issued an apology after its automated image-labelling service showed a striking racial disparity. The flaw was noticed in a Twitter experiment (started by Nicolas Kayser-Bril) that displayed a black hand holding a thermometer and a white hand performing the same action. Google’s artificial-intelligence service (Google Cloud Vision) identified the black hand as holding a ‘gun’, whereas the white hand was labelled as holding a ‘monocular’ (as displayed below).

Copyright [Agathe Balayn] https://algorithmwatch.org/en/story/google-vision-racism/

Google later removed the faulty label, but the incident still demonstrates racially biased programming. Bonilla-Silva (2015) believes this is a new wave of racism, one that is more subtle and driven by institutional members of society. White supremacy appears to be integrated into seemingly non-racial formats, particularly within the technological world.

A critique by Agathe Balayn (2020) is that Google’s error might not have been caused by skin tone at all, but by inaccuracy in the computer code when identifying the object displayed.

Nevertheless, this was not the first time Google’s image recognition had been strikingly biased. In 2015, an incident in which Google auto-tagged two Black people as “Gorillas” caused chaos on social media. Google responded immediately by removing any label referring to monkeys, to ensure the inaccuracy could never happen again.

Copyright [WNYC Studios] www.wnycstudios.org/podcasts/notetoself/episodes/deep-problem-deep-learning

“Image labelling technology is still early and unfortunately it’s nowhere near perfect”. — Google (2015)

Despite these examples, Noble fails to recognise that there may be multiple explanations for these racially skewed outputs. Google’s algorithms are believed to work through a ranking system that determines the importance and relevance of a website. Every click, watch, tweet and view feeds into that decision-making process. Algorithms are tools that analyse these patterns and ultimately decide what appears in our search boxes. This leads me to believe that aspects of our technical code could be holding up a mirror to systemic racism within our society.
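To make that feedback loop concrete, here is a toy Python sketch. It is not Google’s actual algorithm — the ranking rule, the click probability and the result names are all invented for illustration — but it shows how an engagement-driven ranking can take a small initial bias in user behaviour and amplify it:

```python
import random

def rank_results(results, clicks):
    """Toy ranking rule: order results by accumulated clicks, most-clicked first."""
    return sorted(results, key=lambda r: clicks[r], reverse=True)

# Two hypothetical results competing for the same query.
results = ["A", "B"]
clicks = {"A": 0, "B": 0}

# Assumed behaviour: users click whatever is already ranked first 90% of the time.
bias_toward_top = 0.9
random.seed(42)

for _ in range(1000):
    ranking = rank_results(results, clicks)
    chosen = ranking[0] if random.random() < bias_toward_top else ranking[1]
    clicks[chosen] += 1  # the click feeds straight back into the ranking

print(clicks)
```

Because the top-ranked result attracts most of the clicks, whichever result gets an early lead keeps it: the final ranking ends up reflecting, and amplifying, the users’ own behaviour rather than any judgement made by an engineer.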

Is it possible that racism is embedded into our computer code because it is still embedded into our society?

Copyright [freshspectrum] https://freshspectrum.com/bridging-the-data-gap/

In regard to artificial intelligence, it is difficult to pinpoint who is responsible for the flaws in our computer systems. Noble is confident that Google’s software engineers are liable; however, algorithms are so complex that it is impossible to rule out that our society is also reflected within these algorithmic patterns.


20-year-old final-year student at Loughborough University studying Communication and Media Studies.