
This review is taken from PN Review 281, Volume 51 Number 3, January - February 2025.

Hal Coase
A Measureless Ouija-Board

Conflicted Copy, Sam Riviere (Faber) £10.99

In poetry, as in love, it’s the non-necessity of what follows that takes our breath away. Anything can happen. Anything can happen, and then whatever does happen could only have happened that way. We’re well trained to pick over the second part of this happenstance. Most features of form can be thought of as allowing choices to seem necessary, before they come to seem either correct or beautiful, audacious or dull. Destiny, compulsion and fetish (like genre, metre and metaphor) are useful when we want to redescribe anything that happens as the only thing that could have happened. We often need them to talk ourselves down from the ledge of infinite possibility. The satisfaction that such redescriptions provide is in proportion to the dizziness and excitement that threaten to upturn them, and which they work to contain. In return, we get back a little kick of agency; a pattern is just a mess that you’re always making.

From December 2020 to January 2021, Sam Riviere composed Conflicted Copy using Generative Pre-trained Transformer 2 (GPT-2). GPT-2, first released a little over five years ago, was a large language model trained on eight million web pages, capable of responding to prompts with text that it generated on the basis of knowing what normally happens next. GPT-2 was a mess. As anyone who played around with it in the months after its launch might remember, it could produce plausible prose given strict parameters, but it was quickly liable to either become repetitive or veer off topic. The effect was a little ...
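For readers who never tried the model, a gloss on what ‘knowing what normally happens next’ means in practice: the model scores every possible next token by likelihood and samples one, over and over. The sketch below is purely illustrative, assuming the publicly released GPT-2 weights and the Hugging Face transformers library; the prompt and sampling settings are invented for the example and say nothing about Riviere’s actual method, which the review does not specify.

```python
# Minimal sketch of prompting GPT-2 via Hugging Face transformers.
# Prompt and sampling parameters are illustrative assumptions,
# not Riviere's settings.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In poetry, as in love,"  # hypothetical prompt
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a continuation token by token. Sampling (rather than
# always taking the single likeliest token) is what keeps the output
# from looping; greedy decoding is one source of the repetitiveness
# the review describes, while a high temperature invites the model
# to veer off topic.
output_ids = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```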

