CfP: Inequalities and Divides in Digital Cultures (closed)


Abstract Deadline: Friday, 17th August 2018
Edited by: Pablo Abend and Annika Richterich

This issue of Digital Culture & Society addresses inequalities and divides in digital cultures. We are looking for contributions that approach related topics from various (inter-)disciplinary perspectives. Paper proposals (abstracts) may relate to the following three themes – but we likewise welcome contributions that move beyond these:


Inequality of Access

Inequalities have often been addressed as geographical variations in the diffusion of digital technologies. Initially, the discussion of the digital divide focused on accessibility and therefore centred on the question of whether (and where) individuals have access to information and communication technologies. As internet access spread, research moved away from this binary understanding of accessibility (Hargittai 2008). While missing technological infrastructure remains a problem on a global as well as a local scale, inequalities have also been emphasised as more than mere matters of (technical) accessibility. The question is not only whether someone has access to digital technologies but: “Who, with which characteristics, connects how, to what?” (Hilbert 2011, 727).

The term “digital inequalities” aims to facilitate more nuanced approaches (see e.g. Robinson et al. 2015). Digital inequalities also follow from diverging sets of skills and media literacy, unequal access due to political restrictions, economic inequality and other factors. They are intricately connected with “[…] traditional axes of inequality such as race, class, and gender” (Robinson et al. 2015, 569). This theme thus invites submissions on digital inequalities that look beyond inequalities of technological access.

Inequality by Design and Discursive Divides

How are inequalities and divides (re-)produced by the protocols of digital media? Whereas the dictum “to classify is human” (Bowker & Star 1999, 1) still holds true in digital cultures, processes of quantification and ‘datafication’ are increasingly automated. Information is automatically selected, ranked and curated by algorithms, significantly defining the content that users may (or may not) encounter. Data mining and software-sorting create “filter bubbles” (Sunstein 2006; Pariser 2011; Bozdag 2013), presenting users with content condensed and curated by (commercial) filtering techniques.

Ever more data is organised according to the corporate interests of application producers and interface providers. “Data-ification” (Broadbent & Lobet-Maris 2015) transforms individual users into commodities whose value is determined by their digital footprint and who are reduced to their “data fumes” (Thatcher 2014). An “echo chamber effect” (Barberá et al. 2015) is discussed as a related, potential side effect of social media: it refers to networks that act as closed systems in which existing beliefs are amplified or reinforced, while dissenting views are subdued.

This theme invites submissions on how digital experiences are co-shaped and potentially limited by algorithms, data and the protocols of (digital) media, and on how they may construct politically relevant discursive divides. We are also interested in how users may discuss and counter such tendencies.

Inequality by Algorithms

Moreover, big data and statistical models are used to rank and classify citizens according to their alleged skills and abilities, consumer habits, job performance, credit history, or behaviour in public. Mathematical models are used to estimate the probability that individuals will succeed in a job, shop for certain items, pay off a loan, or commit a felony. The unfolding “technological unconscious” (Thrift 2004) is relevant because classifications made by algorithms not only influence the way we perceive and see the world (theme 2), but also have a bearing on how we are seen and treated by others. These classification systems can exert a “material force” (Bowker & Star 1999, 3) which determines our situation quite literally, while their (all too often problematic) underlying assumptions and procedures remain largely invisible.

More recently, Safiya Umoja Noble coined the term “algorithmic oppression” (Noble 2018) to describe how formal classification systems discriminate against people based on race, class, gender, and age – from isolated incidents shrugged off as ‘glitches’ of the system to more structurally persistent forms of marginalisation and discrimination. This theme therefore focuses on the inequalities intensified and/or triggered by algorithms, with regard to how individuals and groups are portrayed, addressed and treated. In addition, we are not only interested in how individuals/groups may be exposed to inequalities and divides, but also in how they may oppose and tackle these, e.g. by engaging in ‘data activism’ (Milan & van der Velden 2016).


Journal Sections

This issue of Digital Culture & Society welcomes empirical studies as well as theoretical and methodological reflections highlighting instances of inequalities and divides in digital cultures. Therefore, when submitting an abstract, please state to which of the following issue sections you would like to submit your paper:

  1. Field Research and Case Studies (full paper: 6.000 – 8.000 words)

We invite articles that discuss empirical findings from studies approaching inequalities and divides in digital cultures. These may include, for example, studies that analyse particular models, introduce case studies in which classification systems account for inequalities, or follow practices/actors within digital culture to gain insights into experienced inequalities and/or divides.

  2. Methodological Reflection (full paper: 6.000 – 8.000 words)

We invite contributions that reflect on the methodologies employed when researching inequalities and divides in digital cultures. These may include, for example, challenges and opportunities faced when qualitatively researching quantifiable data and vice versa; approaches using digital methods; discussions of mobile and circulative methods; and reflections on experimental forms of research.

  3. Conceptual/Theoretical Reflection (full paper: 6.000 – 8.000 words)

We encourage contributions that reflect on the conceptual and/or theoretical dimension of inequalities and divides in digital cultures, and discuss or question how these can be defined, what causes them, and how they can be differentiated. We also invite articles that interrogate terms such as ‘digital’, ‘inequalities’ or ‘divides’.

  4. Entering the Field (2.000 – 3.000 words; experimental formats welcome)

This experimental section presents initial and ongoing empirical work. The editors have created this section to provide a platform for researchers who would like to initiate a discussion concerning their emerging (yet perhaps incomplete) research material and plans as well as methodological insights.


Deadlines and Submission

  • Initial abstracts (max. 300 words) and short biographical note/s (max. 50 words) are due on: Friday, 17th August 2018.
  • Authors will be notified by Monday, 27th August 2018, whether they are invited to submit a full paper.
  • Full papers are due on: Monday, 29th October 2018.
  • Authors will receive the peer review feedback by Monday, 10th December 2018.
  • Final paper versions are due on: Monday, 14th January 2019.


Please send your abstract and (a) short biographical note(s) to Pablo Abend and Annika Richterich. Based on the abstracts, the journal editors will pre-select the authors who will be invited to submit a full paper. All full papers will be double-blind peer reviewed.


Publisher and Open Access

DCS is published by transcript. All articles will be published as open access on our website 12 months after the initial publication. Previous issues are available here: http://digicults.org/issues.


References

Barberá, P., et al. (2015). Tweeting from Left to Right: Is Online Political Communication More than an Echo Chamber? Psychological Science, 26(10), 1531–1542.

Bowker, G.C., & Star, S.L. (1999). Sorting things out. Classification and its consequences. Cambridge/London: The MIT Press.

Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227.

Broadbent, S., & Lobet-Maris, C. (2015). Towards a Grey Ecology. In: Luciano Floridi (ed.). The Onlife Manifesto. Being Human in a Hyperconnected Era (pp. 111–124). Cham: Springer.

Hargittai, E. (2008). The Digital Reproduction of Inequality. In: David Grusky (ed.). Social Stratification (pp. 936–944). Boulder: Westview Press.

Hilbert, M. (2011). The end justifies the definition. The manifold outlooks on the digital divide and their practical usefulness for policy-making. Telecommunications Policy, 35(8), 715–736.

Milan, S., & van der Velden, L. (2016). The Alternative Epistemologies of Data Activism. Digital Culture & Society, 2(2), 57–74.

Noble, S. U. (2018). Algorithms of oppression. How search engines reinforce racism. New York: New York University Press.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. London: Penguin UK.

Robinson, L., et al. (2015). Digital inequalities and why they matter. Information, Communication & Society, 18(5), 569–582.

Sunstein, C. (2006). Preferences, paternalism, and liberty. Royal Institute of Philosophy Supplements, 59, 233–264.

Thatcher, J. (2014). Big Data, Big Questions | Living on Fumes: Digital Footprints, Data Fumes, and the Limitations of Spatial Big Data. International Journal of Communication, 8, 1765–1783.

Thrift, N. (2004). Remembering the Technological Unconscious by Foregrounding Knowledges of Position. Environment and Planning D: Society and Space, 22(1), 175–190.