From: Fenwick Mckelvey <fenwick....@CONCORDIA.CA>
CFP: (un)Stable Diffusions: General-purpose artificial intelligence’s publicities, publics, and publicizations
Edited by Fenwick McKelvey, Joanna Redden, Jonathan Roberge, and Luke Stark (names in alphabetical order)
To be published in the open access Journal of Digital Social Research. Please submit your abstracts here: https://forms.gle/hWQxvuRTWoFhVs3W6
The recent release of so-called “general-purpose artificial intelligence” (GPAI) systems has prompted a public panic that AI-generated fiction is now indistinguishable from fact. General-purpose AI refers to systems “intended by the provider to perform generally applicable functions such as image and speech recognition, audio and video generation, pattern detection, question answering, translation and others” (Artificial Intelligence Act, Council of the European Union, Procedure File: 2021/0106(COD), Title 1, Article 3(1b)). Contemporary technologies referred to as GPAI include OpenAI’s ChatGPT as well as numerous text-to-image generation tools such as DALL-E and Stable Diffusion. GPAI systems are already being deployed to support and entrench existing asymmetries of power and wealth. For instance, the online news outlet CNET recently disclosed that it had been publishing stories written by an AI and edited by humans for months (Main, 2023).
The current concern over ChatGPT is an important moment in AI’s publicity and publicization (Hansen, 2021), or what Noortje Marres refers to as “material things” acting as “crucial tools or props for the performance of public involvement in an issue” (Marres, 2010, p. 179). Amidst countless opinion pieces and hot takes discussing GPAI, this special issue details how scandal, silence, and hype operate to promote and publicize AI. We seek interventions that question AI’s publicity and promotion as well as new strategies of engagement with AI’s powerful social and political influence.
Concern over generative AI, we argue, is limited by publicity around these systems that has been framed by hype, silence, or scandal (Brennen, 2018; Sun et al., 2020). Publicity refers to the relations between affected peoples and matters of shared concern (Barney, 2014; Marres, 2015, 2018). Historically, these relations have been mediated by the press (Schudson, 2008), but GPAI’s rise coincides with uncertainty about journalism’s status and a rise of direct, one-step-flow-like effects, whether citizen-to-citizen or, in the case of ChatGPT, a direct link (Bennett & Manheim, 2006).
Scholars have observed that publicity around AI follows several distinct patterns:
Our special issue seeks interventions focused on:
Timeline:
- Extended abstracts (1000 words) due 1 August 2023.
- Accepted full papers due 1 December 2023.
- Planned publication: Spring 2024.
Submit abstracts here: https://forms.gle/hWQxvuRTWoFhVs3W6