Behind the screen, you might say, is an invisible force that dictates what we read, hear and watch. This force is made up of an unknown number of algorithms that monitor what we do and attempt to anticipate what we might like to do next. UTS graduate Hayley Cumming has developed a workshop, ‘Uncanny Algorithms’, as a way of making these invisible forces understandable. As she says…
We live in an algorithmic culture. Computation is everywhere, a ubiquitous part of our lives—look no further than your pocket. Algorithms meticulously curate your social media feeds and determine your search results. The lens through which we view and understand the world has shifted to one that someone else has programmed.
As designers within this algorithmic culture, we are tasked with making digital systems intelligible. We are accustomed to the demands of the ‘user friendly’: simple, digestible and succinct. Yet ‘seamless’ design obscures the mechanics behind your computer screen. How are we to design digital systems if ‘user-friendly’ has become a synonym for ‘ignorance and deceit’?
Despite their ubiquity, algorithms remain inscrutable and black-boxed. In other words, we cannot immediately perceive them. This is the issue at the heart of this project: if digital systems remain obscure and ‘invisible’, how are we to critique them? How can we hold those who control them accountable?
‘Uncanny Algorithms’ is a workshop that attempts to reveal algorithmic processes by making them physical and tangible. In pairs, one participant plays the role of the ‘computer’ while the other plays the ‘human’. Together they complete three activities—each one symbolises an algorithm that influences our daily lives:
In the first activity, the ‘computer’ sorts coloured tiles while the ‘human’ labels them. This process is symbolic of how computer-vision algorithms are trained on human-labelled data—the training behind technologies such as Google Images, driverless cars and facial recognition.
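To make the analogy concrete, here is a toy sketch (purely illustrative, not part of the workshop materials) of the dynamic the activity stages: the ‘human’ supplies labelled examples, and the ‘computer’ uses them to classify tiles it has never seen—a miniature form of supervised learning.

```python
# Toy sketch of the tile-sorting activity as supervised learning.
# The 'human' labels a few example colours; the 'computer' then
# classifies a new tile by finding its closest labelled example.

def nearest_label(sample, labelled_tiles):
    """Classify an RGB colour by its nearest labelled example."""
    def distance(a, b):
        # Squared distance between two RGB triples
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labelled_tiles, key=lambda tile: distance(tile[0], sample))[1]

# Labels provided by the 'human' participant
labelled_tiles = [
    ((255, 0, 0), "red"),
    ((0, 0, 255), "blue"),
    ((250, 250, 60), "yellow"),
]

# The 'computer' now classifies an unseen tile
print(nearest_label((220, 30, 30), labelled_tiles))  # → red
```

The point the activity dramatises sits right here: the ‘computer’ can only ever echo the categories the ‘human’ gave it, biases included.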
In the second activity, the ‘human’ must select coloured tiles to create a design. However, they must choose these tiles from a set recommended to them by the ‘computer’. This mimics personalised software (e.g. YouTube and Facebook) designed to create an illusion of choice when, in fact, the algorithm has already decided for us.
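A hypothetical sketch (invented here, not the workshop’s actual mechanics) of what this activity stages: a crude popularity recommender narrows the palette before the ‘human’ ever chooses, so the ‘choice’ is shaped in advance.

```python
# Toy recommender: rank tiles by how often they were picked before
# and offer only the top k — the rest of the palette is never shown.

ALL_TILES = ["red", "blue", "yellow", "green", "purple", "orange"]

def recommend(history, k=3):
    """Offer the k tiles most frequently chosen in the past."""
    scores = {tile: history.count(tile) for tile in ALL_TILES}
    return sorted(ALL_TILES, key=lambda t: -scores[t])[:k]

history = ["red", "red", "blue"]
options = recommend(history)   # the 'computer' narrows the field
choice = options[0]            # the 'human' picks from what is offered
print(options)  # → ['red', 'blue', 'yellow'] — half the palette never appears
```

Notice the feedback loop: whatever the ‘human’ picked before is what they are shown again, which is precisely the illusion of choice the activity is built to expose.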
All images supplied by Hayley Cumming, taken by Harry Corcoran.
In the final activity, participants record their experiences and opinions on flowcharts. Unbeknownst to them, the flowcharts were designed to make a deliberately presumptuous prediction about their personality. This symbolises alarming modes of classification such as Faception: a company that claims its automated surveillance system can detect terrorists, paedophiles and professional poker players based entirely on their faces.
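A hypothetical sketch (the questions and labels are invented here, not the workshop’s actual flowchart) of how such a flowchart smuggles in its assumptions: every branch terminates in a confident personality label, so a judgement is guaranteed no matter what the participant answers.

```python
# Toy 'rigged flowchart': all four paths end in a verdict.
# There is no branch for "we cannot tell" — the judgement is
# baked into the structure before anyone answers a question.

def flowchart_verdict(likes_routine: bool, answers_quickly: bool) -> str:
    if likes_routine:
        return "conformist" if answers_quickly else "anxious"
    return "impulsive" if answers_quickly else "indecisive"

# Every possible combination of answers produces a label:
for routine in (True, False):
    for quick in (True, False):
        print(routine, quick, "->", flowchart_verdict(routine, quick))
```

The design choice to omit an “inconclusive” branch is the whole critique in miniature: the classifier’s certainty is a property of its structure, not of the person being classified.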
On one level, the workshop aims to help participants understand algorithms by translating their digital, seemingly ‘invisible’ nature into physical, tangible activities. At a deeper level, the interaction between the ‘human’ and ‘computer’ roles opens up possibilities to critique the reciprocal relationship between humans and algorithms. All of a sudden, participants become aware of their ability to influence and manipulate each other’s actions.

Currently, our relationship with algorithms is misleading and divisive—filled with bias, misinformation and exploitation. This project is only the beginning of what needs to be a fundamental shift in how we design digital systems. Until we begin to rethink our technologies and challenge the megacorporations behind their design, we will continue to be entangled within their giant networked web, sticky and inescapable.

The small instances in which participants found ways to collaborate within the constraints of ‘Uncanny Algorithms’ offer some hope. Perhaps we can replicate such moments of collaboration when designing our digital infrastructure. But will that be enough to challenge the megalomaniacal hold of algorithmic influence?
EDIT NOVEMBER 2020: This project is a finalist in the AGDA Student Awards 2020 in the print category.