I was reading a recent interview with the music critic Simon Reynolds — I reviewed his book Retromania years ago — when I stopped short at this:
In one of his blogs, [the late English writer Mark Fisher] talked about going to the countryside and how amazing it was. He said, ‘It really opened my eyes. I now see birds as marvellous little machines’. And I thought, why not see machines as botched little animals?
Yes, I said. Yes.
Ever since the rise, in the seventeenth century, of the mechanistic view of nature, we have been quick to use technological metaphors to explain the phenomenon of life. Declared Descartes, “There is no difference between the machines built by artisans and the diverse bodies that nature alone composes.” The human brain itself — that mysterious maker of metaphors — has through the ages been portrayed as (a) a hydraulic pumping system, (b) a clock, (c) a telephone switching network, (d) a digital computer, and, now, (e) a large language model. In constructing machines, we also construct ways of seeing the world, and ourselves.
Technological metaphors simplify — they put slippery nature in our grip — but they also distort. Maybe, as Reynolds suggests, it would be more illuminating to take the opposite tack: to use the living world as a source of metaphors for our own creations. From this vantage, we see ourselves as imperfect artificers, creators of machines whose inevitable flaws reflect our own shortcomings. We make botched little animals because we’re botched little animals ourselves.
We build all these systems and we complain about them as if they’re out of control, as if they’re controlling us, but we build them. You look at the internet, it’s all boiling with human emotions — mostly, unfortunately, the ugliest kind. All these machineries and technologies and mediation systems are all fueled by humanity, they didn’t just spring out of nowhere.
In the course of writing Superbloom over the last few years, I found myself losing patience with the prevailing critique of social media and the internet, the one that portrays the technology as something imposed on us by an imperious external force (progress or capitalism or Big Tech or what-have-you) and that in turn depicts us as the hapless, helpless victims of that force’s exploitative and manipulative powers. The problem with this line of inquiry is not that it’s invalid — it’s valid, and it’s necessary — but that it’s incomplete. It skirts around our own complicity. It too easily separates our botched little animals from our botched little selves.
One of the central arguments of the book is that the commercial internet, and social media in particular, is a machine fine-tuned to sense our desires and fulfill them. If “the algorithm” manipulates us, it does so by giving us what we want. The machine’s manipulative power is secondary to, and dependent on, the pleasure it provides.
The moral philosopher Alasdair MacIntyre stressed the importance of distinguishing “between what we desire and the choiceworthy” and “between what pleases those others whom we desire to please and the choiceworthy.” Making such distinctions has always been difficult. Digital media, with its hyperactive solicitude and its automation of the act of choosing, makes them more difficult than ever — and more important than ever. Because technology is a repository of human desire, a full critique of any machine needs also to be a critique of human desire. We’re the machine’s makers before we’re its victims.
I agree that who controls (and designs) the machine needs to be a crucial focus — and it has drawn plenty of public attention (though without much actual action, at least in the US). But we also need to look at the role that all of us play as integral components of the machine. We provide the signals that shape the machine's outputs. We can't avoid self-examination, since those signals (and indeed the outputs they generate) are markers of our own desires.
A couple of thoughts on this. The first (and, for me, foundational) one comes from Robert Frost's talk "Education by Poetry," which you can find in full here: https://moodyap.pbworks.com/f/frost.EducationByPoetry.pdf
"What I am pointing out is that unless you are at home in the metaphor, unless you have had your proper poetical education in the metaphor, you are not safe anywhere . Because you are not at ease with figurative values: you don’t know the metaphor in its strength and its weakness. You don’t know how far you may expect to ride it and when it may break down with you. You are not safe with science; you are not safe in history."
Building on that (albeit indirectly), I find Neil Postman's exploration of the breakdown of the machine/mechanism metaphor for biological systems in Technopoly: The Surrender of Culture to Technology to be a helpful anchor, since he is one of the few writers I've read who recognize what Frost meant. Reynolds' insight is deeply useful (and here I'll confess to being a big fan of Reynolds' work overall) in recognizing that the metaphor's fundamental power comes not so much from comparison (the strength of simile) as from juxtaposition. And, as I am always telling my poetry students, juxtaposition isn't opposition.