7 Comments
Katsch

Nice piece. Both metaphors are kinder than the one that often springs to mind for me - livestock.

Nicholas Carr

Not like cattle that get slaughtered (bad for business) but like cows that get milked?

Katsch

Yeah, milking’s probably the closer one. Lure us in, do what they can to keep us there (through compulsion and subtle diversions from the exit), then extract what we used to use to nourish our community.

Feels like there’s been a bait and switch, similar to the process of keeping dairy cows bearing calves and then jamming a machine in where the calf once went. What we still call social media has been almost fully converted into a creator-consumer relationship medium (according to Facebook’s own legal arguments), and I think we’ve fallen out of practice with a lot of the socialising habits that used to sustain us.

The extractive machines they use on dairy cows, along with the cheap feed that they get slopped in troughs, are almost too on the nose!

Jeff Cunningham

The metaphor I think of is Nassim Taleb's turkeys in The Black Swan.

Katsch

I’ll have to look that one up!

Vladimir Supica

Carr correctly identifies that the "Data Mine" metaphor is insufficient because it implies passivity. However, by pivoting to the "Data Factory," he remains trapped in Industrial Revolution semantics. He is trying to describe a hyper-fluid, recursive digital ecosystem using the vocabulary of 19th-century Marxian labor theory and Taylorist scientific management.

Carr’s central fallacy is the reification of data. In a factory, you take raw material (wood), apply labor (carpentry), and produce a final discrete object (a chair). The value is finite and alienated from the worker.

In the AI era, data is not a discrete product; it is signal. When you navigate a map, you are not "manufacturing location data" for Google to sell. You are engaging in a cybernetic feedback loop. You provide signal, the system provides utility (optimization of your route), and the system improves its general model.

To call this "labor" is a linguistic sleight of hand. It is interaction. Carr’s "Factory" metaphor relies on a zero-sum economic model (if they profit, I must be losing). The observational view is positive-sum: the more signal I provide, the more intelligent the tool becomes for me.

Carr invokes Frederick Winslow Taylor (the father of Scientific Management) to argue that tech companies want to "script and regulate" our lives for efficiency.

Taylorism aimed for uniformity: every worker must move the hammer in exactly the same way to maximize output. AI, by contrast, aims for heterogeneity. Carr fears that AI wants to standardize us, but the reality is the opposite: AI algorithms crave variance. If every user behaved identically (like factory workers), the dataset would be useless. The "algorithms" Carr fears actually incentivize hyper-individualization. They don't want you to be a cog; they want you to be you, so they can model you. The "control" he fears is actually a bespoke service that adapts to your idiosyncrasies, not a foreman shouting at you to work faster.

Carr argues that the "Factory" metaphor restores our agency because workers can strike or demand better wages. This is a nostalgic fantasy.

In the AI framework, we are not "workers" or "resources." We are trainers.

The Factory Model: I work, you pay me (or exploit me).

The Alignment Model: I teach the system what I value, and the system aligns with my intent.

When Carr says we "manufacture data through the labor of our mind," he is mistaking cognition for toil. We are not digging coal; we are externalizing our intelligence into a substrate that can preserve and multiply it. To view the digitization of human knowledge as "unpaid labor" is to view the invention of writing as "unpaid ink manufacturing."

Carr asks if we should "escape" the metaphors of mining and manufacturing. The answer is yes, but not to retreat to the "Humanist" past he idealizes. We must move forward to biological metaphors.

We are not mines (passive) or factories (exploited labor). We are nodes in a neural network.

We transmit signals.

The network strengthens connections based on those signals (Hebbian learning).

The "profit" isn't just money extracted by a CEO; it is the emergent intelligence of the network itself.

Carr’s essay is a defense of human dignity against a threat that doesn't exist. He is arming himself against a Foreman, but he is living in an Ecosystem. By insisting on the "Factory" metaphor, he limits human potential to a transactional economic exchange, rather than seeing the AI revolution for what it is: the expansion of the linguistic and cognitive surface area of the human species.

Mark McGuire

Thank you for this. The best questions are always about the formulation of the question.

“Public” used to mean we all owned it; now it means we can all see it—provided we’ve been let into the same mall. “Platforms” were what we put our feet on. “Friend” was a noun, not a verb. “Like” was an expression of genuine preference or affection. To “follow” was to emulate someone we looked up to.

Language is the foundational technology we use to build other tools. They say that if you want to change the world, change the language you use to describe it first—the rest will follow.