One of the themes of Superbloom, my new book, is the recent, dramatic, and largely unheralded expansion of the role of machinery in media. If you think of media as a system for the distribution of speech, you can break it down into three basic roles or functions:
Message creation (the speech function)
Message selection (the editorial function)
Message transmission (the transport function)
From media’s origins in the ancient world up until the early morning hours of September 5, 2006, the way these functions were divided between human beings and machines was clear-cut. People handled the speech and editorial functions, the ones involved in the making of meaning. The role of machinery was limited to the transport function.
Claude Shannon, at the outset of his landmark 1948 paper “A Mathematical Theory of Communication,” makes this division of labor explicit:
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
Message creation and selection are semantic functions, functions of meaning-making, and those are the realm of human beings. Humans, as speakers, create messages, using words or pictures or sounds, and humans, as editors, choose which of those messages to send to viewers and listeners. Machines are involved only in the transport function, i.e., “reproducing at one point either exactly or approximately a message selected at another point.” The “engineering problem” of communication can take many forms — designing a postal system, stringing telegraph or telephone lines across a continent, building switching networks for the routing of phone calls, establishing broadcasting networks for the transmission of radio or television signals, creating protocols for compressing signals or instruments for amplifying them — but its scope is limited to carriage and conveyance. Meaning, as Shannon stresses, is “irrelevant” to the operation of the mechanism.
So what happened during those fateful early morning hours of September 5, 2006, to upset this neat and seemingly immutable division of labor? Facebook rolled out its News Feed, which automated the selection of messages for distribution to an audience. The editorial function was no longer the exclusive purview of human beings. It had suddenly become part of the engineering problem, a matter of user-profiling systems, prediction algorithms, and other software routines. Even though the machines — networked digital computers — had no sense of meaning themselves, they were now making decisions about meaning, semantic decisions that determined the content people saw or did not see.
As the News Feed’s commercial benefits became clear — using machines to automatically choose the message most likely to grab an individual audience member’s attention was pure gold for media companies — the mechanization of the editorial function was incorporated into pretty much every other online media system. Although we didn’t see the phenomenon in these terms, it was the expansion of media technology’s role from the transport to the editorial function that set off the subsequent chain reaction of social problems that we’ve been grappling with, ineffectively, ever since. Society, we discovered, was neither prepared for nor capable of addressing the automation of meaning-making.
But that was just a preview. When, sixteen years later, on November 30, 2022, OpenAI released ChatGPT to the public, communication technology’s ambit expanded once again, this time to encompass the role of message creation — the speech function that up to then was seen not just as the exclusive purview of humans but as an essential and singular quality of humanness itself. Machines still don’t understand the meaning of the speech they produce — that may or may not change in the future — but it turns out that doesn’t matter. For computers, understanding is not a prerequisite to speaking.
The creation of meaningful speech, in the form of words and images, songs and motion pictures, is now also part of Shannon’s “engineering problem” — a matter of, among other technical tasks, programming large language models, assembling and cleansing data sets, and crafting prompts for generating outputs. What we are seeing, to continue using Shannon’s terms, is the working out, largely in private, with few semantic considerations beyond the commercial and the legal, of a mathematical theory of meaning.
The automation of content creation, like the automation of editorial decision-making before it, offers internet companies big benefits. They can generate an unlimited supply of customized content in real time at basically zero marginal cost. (As DeepSeek’s arrival suggests, even the capital costs of building AI systems are fated to plummet, further commodifying the content they generate.) Already, we’re seeing the companies rush to capture the benefits in various ways. Google is giving machine-manufactured speech precedence in its search results. X and other social media platforms are offering machine-generated glosses on tweets and posts. Spotify is incorporating machine-generated songs into playlists. Spammers are creating web sites filled entirely with machine-generated “AI slop.” And we’re still only at the very beginning of the exploitation of mechanical speech.
In 1948, the same year Shannon wrote his article, the Swiss historian Sigfried Giedion published a remarkable book called Mechanization Takes Command, an ambitious, panoramic attempt to, as he put it, “understand the effects of mechanization upon the human being.” Examining everything from assembly lines to abattoirs to vacuum cleaners, he sought to identify the point at which mechanization goes from supporting “human values” to undermining them. He ended the book with a warning:
Mechanization is an agent, like water, fire, light. It is blind and without direction of its own. It must be canalized. Like the powers of nature, mechanization depends on man’s capacity to make use of it and to protect himself against its inherent perils. Because mechanization sprang entirely from the mind of man, it is the more dangerous to him.
Humanity today confronts a challenge unlike any it has faced in the past: it has to determine how to divide the labor of meaning-making between people and machines. It’s a challenge of our own making and one we’re still entirely unprepared for.
I hope you’ll consider supporting my work by purchasing Superbloom, through Bookshop.org, Amazon, B&N, or your local bookstore. And thanks for reading New Cartographies.