Programming and people: unwritten bias in written code

Here at TTRO, we’re very excited about the future of technology and all that it could mean for education and the workplace. But that doesn’t mean we wear a pair of rose-coloured VR goggles when looking at that bright future; a healthy level of criticism is needed.

In a recent TED Talk, American computer scientist Joy Buolamwini described the unconscious biases she found in modern software code.

Buolamwini highlighted the fact that an open-source facial recognition algorithm she and many others had used in their software was unable to recognise her face because of her ethnicity. She described this as a striking example of an unconscious bias written into the code of the algorithm.
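To make that failure mode concrete, here is a minimal, hypothetical sketch of a face-detection check, written in Python with OpenCV’s stock Haar cascade detector. This is not necessarily the open-source algorithm Buolamwini used; it simply shows how a detector trained on an unrepresentative dataset can quietly report “no face found” for some users while working perfectly for others.

```python
# A minimal, hypothetical sketch of the kind of face-detection check described above.
# It uses OpenCV's stock Haar cascade detector; this is NOT necessarily the algorithm
# Buolamwini worked with, just an illustration of how a biased detector fails silently.
import cv2

def detect_faces(image_path: str):
    """Return the bounding boxes of any faces found in the image."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(f"Could not read image: {image_path}")

    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

faces = detect_faces("portrait.jpg")  # hypothetical input file
if len(faces) == 0:
    # The detector simply reports nothing; to the user it looks like the software
    # ignoring them, with no hint that unrepresentative training data is the cause.
    print("No face detected")
else:
    print(f"Detected {len(faces)} face(s)")
```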

As a new global industry of algorithmic decision-making software starts growing, it’s important to be able to highlight its pitfalls, not only in overt, insensitive moments of human-machine interaction but in subtle, behind-the-scenes outputs as well.

Evaluation of this disruptive new technology needs to increase to ensure that it does not become relevant only to a privileged few; its potential relevance to all humanity is what gives it its value. If it is not relevant to all humanity, it will no longer be valuable.

Objectivity vs subjectivity

We now see many different organisations looking to software to simplify and accelerate their internal processes. But Buolamwini’s experience raises an interesting point about this trend.

This modern technology is imbued with its coders’ biases and prejudices, just as any other cultural object is. In a strange way, these things cannot truly escape our humanity and its implicit subjectivities. Subjectivity dominates reality, objectivity is our ideal, and we need to strive for the latter as much as we can, knowing that we can’t escape the former. Ideals are important to aim for; imagining objectivity in technology and design is useful because it forces us to imagine what other people might want or need. But it’s not always an attainable target. That’s the nature of design ergonomics: designing for the majority, not just for yourself.

And yes, the word ‘majority’ denotes a sliding scale, but we need to be mindful of which direction that scale slides. If access slides up or down according to one’s physical body, language, religion or any other personal attribute, then there are severe ethical and possibly legal consequences to consider.

This is especially true in instances where technology intersects with civil functions. Buolamwini uses the example of policing in the United States, where facial recognition is used in public surveillance.

Exclusionary experiences

Cathy O’Neil, author of Weapons of Math Destruction – a book I am very eager to read – outlines how the technological models being used today are riddled with errors. One reviewer notes that O’Neil shows these models are “opaque, unregulated and incontestable, even when they’re wrong”, and that they can go so far as to financially exclude people by filtering bank loan applicants by their US zip codes rather than their credit histories – a clearly discriminatory exclusion.
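As a deliberately simplified, hypothetical illustration of the kind of proxy rule O’Neil criticises (the zip codes, cut-off and scoring logic below are invented for this example, not taken from the book or any real lender), compare a screen based on where an applicant lives with one based on their own repayment record:

```python
# Hypothetical loan-screening rules, invented purely for illustration.

EXCLUDED_ZIP_CODES = {"60624", "63106"}  # hypothetical "high-risk" areas

def screen_by_zip(applicant: dict) -> bool:
    """Opaque proxy rule: where you live decides the outcome."""
    return applicant["zip_code"] not in EXCLUDED_ZIP_CODES

def screen_by_credit_history(applicant: dict) -> bool:
    """Rule based on the applicant's own repayment record."""
    return applicant["credit_score"] >= 620  # hypothetical cut-off

applicant = {"zip_code": "60624", "credit_score": 710}

# Same person, two very different outcomes:
print(screen_by_zip(applicant))             # False - rejected for their address
print(screen_by_credit_history(applicant))  # True  - approved on their own record
```

The point of the sketch is that the first rule never looks at anything the applicant has actually done; their address alone decides the outcome, which is exactly the kind of opaque, incontestable filter O’Neil warns about.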

Cultural coding

A safe assumption to make is that the programmers writing the code, algorithms and AI are predominantly imagining themselves, or someone very much like themselves, as the average user. That’s problematic. People who don’t fit this average-user persona often cannot even tell that the software has excluded them, unless the failure is as overt as Buolamwini’s face-to-face interaction.
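A small, hypothetical example of what that looks like in practice: a name validator written by a programmer who only pictured names like their own. The pattern and names below are invented for illustration.

```python
import re

# Hypothetical "average user" assumption: names are made of ASCII letters only.
NAME_PATTERN = re.compile(r"^[A-Za-z]+(?: [A-Za-z]+)*$")

def is_valid_name(name: str) -> bool:
    """Accepts the names the programmer imagined; silently rejects the rest."""
    return bool(NAME_PATTERN.match(name))

print(is_valid_name("John Smith"))   # True
print(is_valid_name("Zoë N'Diaye"))  # False - apostrophe and accented letter rejected
print(is_valid_name("李小龙"))        # False - non-Latin script rejected
```

The user on the receiving end just sees an ‘invalid name’ error; nothing tells them that the real problem is an assumption baked into the code.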

For the sake of healthy criticism, and for the sake of increasing its relevance to humanity, algorithmic decision-making software and what it produces should be reframed as a cultural artefact. The term ‘cultural artefact’ may conjure up the world of documentaries and archaeology, but it refers to any physical ‘thing’ created by people that gives information about their culture or society.

But it’s difficult to talk about digital technology this way, as so much of it is not tangible. At best, we can perceive the physical manifestation of an abstract calculation. To put it more simply, we see what the code produces (the apps on your phone, for example), not the code itself (because that would be boring to most people).

Our argument should be in support of gaining knowledge, not of unknowingly supporting ignorance. Every aspect of the creation, implementation and maintenance of decision-making software needs to be examined more closely; otherwise we may find that such software clashes with the law, as Google’s DeepMind already has in the UK. So, in the words of Buolamwini, we need to paint “a richer portrait of humanity”.

 

References
Cathy O’Neil, Weapons of Math Destruction
Joy Buolamwini
Joy Buolamwini’s The Coded Gaze
Google DeepMind broke UK’s NHS privacy laws
Cultural artefact

Author: Simon Pienaar
