Knowing and not knowing about algorithms
Hilary Yerbury, Maureen Henninger
Journal of Documentation, Vol. ahead-of-print, No. ahead-of-print

This paper considers the implications of not knowing, or hypocognition (the lack of a cognitive or linguistic representation of a concept), in relation to algorithms among librarians responsible for information literacy programs in universities in NSW, Australia.

A practice-based study of university librarians and their role in developing algorithmic literacy, using semi-structured interviews and thematic analysis, showed that they had limited socio-technical knowledge of algorithms.

Not knowing led most participants to anthropomorphise algorithms, including those found in search engines such as Google, sometimes explaining them as something mysterious, even though they were aware that the algorithms were gathering data about them and their online interactions. Nonetheless, they delegated responsibility for their online activities. Participants did not describe these online interactions in system terms; rather, their accounts often could be interpreted as examples of Goffman's civil inattention, a social norm used in interactions with strangers, such as fellow passengers. Such an understanding prevented the development of robust algorithmic literacy.

With technologies disrupting social norms, algorithms cannot be considered strangers who understand such civility; instead, metaphorically and practically, they rudely rummage through wallets and phones. Acknowledging the implications of relying on socio-cultural understandings of algorithms, and on anthropomorphic representations to explain online system-based interactions, can open new ways of developing algorithmic literacy.

This study suggests that the links between hypocognition and the anthropomorphising of algorithms can undermine the development of knowledge and skills in information and digital literacies.