
‘Trapped in a code’ – the fight over our algorithmic future



Youth protests in front of the Department for Education in London over algorithmic exam grades, August 16


We grudgingly tolerate algorithms that predict our cultural consumption – but applying them to exam results has highlighted our unease about how algorithms override individuality.


"We ain't your target market, and I ain't your target market", sang the band Sisteray on their 2018 song Algorithm Prison. They don't wish, as they put it, to be “trapped in a code". Who would? Yet Sisteray know that we are all exposed to a powerful and judgmentaldata gaze – as the camera zooms out at the end of their official video, the bandmembers all stare blankly at their phones, the words target market etched into the dirt on the back windscreen of their van.

The song reflects how algorithms have risen to fame and notoriety over the last few years, and it is illustrative of a widely held concern about what these algorithms are doing to us. Often depicted as shadowy and constraining structural forces, algorithms are a source of significant anxiety. People often worry that these bits of code have a powerful but unknown sway over their lives. With an algorithmic grading system used to produce the recent A Level results, that anxiety has been magnified in recent days.

There seems to have been an implicit assumption on the part of those awarding the grades that simply invoking the concept of the algorithm would be enough to reassure people of the objectivity and systematic fairness of the results. The calculative logic behind algorithmic decisions is often based on ideals of objectivity, neutrality and accuracy. In assuming that others would also see algorithms in those positive terms, those awarding the grades missed the existing unease and scepticism about such systems.

The backlash against the use of an algorithm to decide people's exam results, and against the unevenness of its outcomes for people from different backgrounds, gives us an insight into the ongoing tussle over our algorithmic future. This standardisation of grades tells us something about what we are willing to tolerate when we are judged by algorithms.

Where we are conscious of their presence, algorithms are mostly tolerated rather than celebrated. There’s often a kind of grudging awareness and acceptance – although feelings toward algorithms already go well beyond uneasiness for those on the sharp end of automated decisions, especially where prejudice, discrimination and inequality are coded into what Safiya Umoja Noble has called ‘algorithms of oppression’.

Perhaps the most widely understood algorithmic processes to date, however, are those associated with tech platforms and social media. It is here that algorithms are most noticeably active in making predictions for us and about us, and where media coverage has tended to focus. But it turns out that those being examined did not wish their results to be calculated in the same way as the selection of their next YouTube clip, Netflix film, TikTok video, Instagram picture feed or Spotify song. The models are, of course, very different, but this is the parallel that all the talk of algorithms was likely to draw. Other people’s data are considered acceptable for predicting cultural consumption, but the same cannot be said for predicting educational attainment.

It seems that there are forms of algorithmic prediction that are considered to be acceptable and others, clearly, that are not. All automation creates tensions, but some decisions or predictions ("here are the grades the algorithm says you would have got") appear to break beyond the tacitly established limits of broad acceptability.

There is a concern, as expressed in that Sisteray song, that algorithms are a kind of trap and that they routinely use our data to lock us into fixed patterns. In his recent book on the long development of such systems, the historian Colin Koopman describes how data-gathering processes have the effect of ‘fastening’. Both the boxes we are put in and the gaps in the forms completed about us hold us in place, he argues – with many of the categories and logics behind data usage having been established a century or more ago. This fastening process is also echoed in Deborah Lupton’s recent notion of Data Selves, in which we have become inseparable from our data. Both Koopman and Lupton point to how data are used to make us up. Recent events could be seen as a rejection of one aspect of that fastening. By reducing each student to a data point within a historic cohort, the grading system produced an overpowering sense that the individual had been lost. The students were not happy about being fastened in place by these particular algorithmic processes.

When you combine the existing scepticism about algorithms with an algorithmic system so overt in its uneven outcomes, this level of reaction was always likely. It seems that the adjusted results will not now stand, but the full impact of what has happened is not yet clear. What is clearer is that the notion of the algorithm will continue to be a site of tension. In such a context, a crude belief in algorithms is unlikely to go unchallenged. It would seem that many of these A Level students agree with Sisteray – they don’t wish to be trapped in a code.

By David Beer

Published 21 August 2020
