<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[New Cartographies]]></title><description><![CDATA[We have entered strange new territories. We need new cartographies.]]></description><link>https://www.newcartographies.com</link><image><url>https://substackcdn.com/image/fetch/$s_!QIJH!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0f5562a-6246-4800-8c0f-56f276a00370_1024x1024.png</url><title>New Cartographies</title><link>https://www.newcartographies.com</link></image><generator>Substack</generator><lastBuildDate>Fri, 01 May 2026 07:58:11 GMT</lastBuildDate><atom:link href="https://www.newcartographies.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Nicholas Carr]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[newcartographies@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[newcartographies@substack.com]]></itunes:email><itunes:name><![CDATA[Nicholas Carr]]></itunes:name></itunes:owner><itunes:author><![CDATA[Nicholas Carr]]></itunes:author><googleplay:owner><![CDATA[newcartographies@substack.com]]></googleplay:owner><googleplay:email><![CDATA[newcartographies@substack.com]]></googleplay:email><googleplay:author><![CDATA[Nicholas Carr]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[What Hath Tim Berners-Lee Wrought?]]></title><description><![CDATA[A review of This Is for Everyone.]]></description><link>https://www.newcartographies.com/p/what-hath-tim-berners-lee-wrought</link><guid 
isPermaLink="false">https://www.newcartographies.com/p/what-hath-tim-berners-lee-wrought</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Fri, 03 Apr 2026 11:01:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!CKCx!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CKCx!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CKCx!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!CKCx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg" width="1456" height="1027" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1027,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;No photo description available.&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="No photo description available." title="No photo description available." srcset="https://substackcdn.com/image/fetch/$s_!CKCx!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!CKCx!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff2d6ebb8-2cb8-4baf-9674-3c7633c32c5b_1500x1058.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Tullio Crali, <em>The Forces of the Bend</em>, 1930.</figcaption></figure></div><p>The internet is getting old. Nearly four decades have passed since an unassuming British programmer named Tim Berners-Lee invented the World Wide Web in his office at the CERN particle-accelerator complex near Geneva. By transforming the net, until then an arcane research hub for scientists and engineers, into a buzzing, hyperlinked, multimedia information system, Berners-Lee unwittingly sparked the public&#8217;s mass migration online. 
He also, as he writes in his new memoir, <em><a href="https://bookshop.org/a/85280/9780374612467">This Is for Everyone</a></em>, opened the gates for those who &#8220;might corrupt what I was trying to build.&#8221;</p><p>Born to two mathematicians in 1955&#8212;the same year, he notes, as Steve Jobs and Bill Gates&#8212;Berners-Lee was an unusual kid. Growing up in the quiet London neighborhood of East Sheen, not far from the Thames, he didn&#8217;t listen to rock music or follow sports teams or watch television, and he remained oblivious to the excitements and upheavals of the Sixties. &#8220;I missed out on pop culture more or less entirely,&#8221; he writes. Fascinated by the workings of electronic circuitry, he spent his teens building model railroads and cobbling together gadgets, including a rudimentary computer, from scavenged parts. An outstanding student, he entered The Queen&#8217;s College, Oxford, on a scholarship and went on to earn a first in physics. The door to an academic career was open, but he chose instead to pursue his passion for tinkering by becoming a writer of software code.</p><p>He took on a series of short-term coding assignments before landing a more permanent job at CERN in 1984. As a member of the Data and Documents division, he wrote programs for capturing detailed digital images of atoms colliding. He was well liked by his colleagues, though they often found it difficult to follow his rapid-fire, abstract style of speech. They would hold up signs reading <em>Tim, slow down </em>when he took the floor at meetings<em>. </em>When, early in 1989, he began talking up his idea for the World Wide Web&#8212;he originally called it <em>Mesh</em> but decided that sounded too much like <em>mess</em>&#8212;he described it as a way for researchers using incompatible computers to share experimental results and other data via documents uploaded to the internet and connected with hyperlinks. 
It was a tool, he explained, for &#8220;intercreativity.&#8221; His pitches were met with bafflement and indifference. &#8220;Few of us if any could understand what he was talking about,&#8221; his boss at the time recalled. His written proposals, illustrated with elaborate diagrams, went unread.</p><p>His break came in October 1990 when CERN decided to test one of the powerful new &#8220;cube&#8221; computers produced by NeXT, the company Steve Jobs formed after being ousted from Apple in 1985. Berners-Lee was chosen to &#8220;kick the tyres&#8221; of the machine, with the suggestion he use it to work on that &#8220;hypertext project&#8221; he couldn&#8217;t stop talking about. In an intense, one-man hackathon over the ensuing two months, he wrote the foundational codes that govern the design and functioning of web sites and browsers. Just before Christmas, he posted the world&#8217;s first web page on a CERN server. A year passed before people outside the lab noticed his invention. Then it exploded, becoming one of the most rapidly adopted technologies in history. 
Soon, &#8220;web&#8221; was a synonym for &#8220;internet,&#8221; and Berners-Lee was famous.</p><p>He moved to the United States, where excitement about cyberspace had been mounting for years, and joined the faculty at the Massachusetts Institute of Technology. MIT had agreed to host the World Wide Web Consortium, an independent standards-setting organization Berners-Lee set up in hopes of ensuring the web&#8217;s protocols remained open, free, and unsullied by the profit motive. He wanted his system, he made clear, to be geared to the needs of individuals and small groups, not those of big businesses. But even as the W3C, as the consortium was called, began hosting conferences and issuing guidelines, the web&#8217;s enormous commercial potential started drawing the interest, and the money, of venture capitalists, entrepreneurs, and corporations. Berners-Lee soon found himself battling them for control of his invention&#8217;s fate.</p><p>His first nemesis was an unlikely one: a husky midwestern kid named Marc Andreessen who had his own ideas about how the web should work. While an undergrad at the University of Illinois at Urbana-Champaign, he and a friend created a new browser, called Mosaic, that incorporated, over Berners-Lee&#8217;s objections, an unsanctioned code for displaying images directly on web pages. The innovation proved popular, and Mosaic quickly became the most widely used browser. The only problem for Andreessen was that Mosaic was owned not by him but by the university lab that funded his work. Sensing an entrepreneurial opportunity of epochal proportions, he decamped to Silicon Valley and joined forces with a wealthy investor, Jim Clark, to launch the browser company Netscape. With many new and attractive features, the company&#8217;s browser, Navigator, displaced Mosaic as the market leader. When Netscape went public in 1995, Andreessen made millions and Clark made even more. 
The internet gold rush was on.</p><p>While Berners-Lee would continue to win awards and accolades, culminating in his knighting by Queen Elizabeth II in 2004, he became a peripheral figure in the story of the web&#8217;s transformation into a global thoroughfare for commerce and communication dominated by a handful of very large, very profitable, very American companies. His dream of building a decentralized system that would give people &#8220;the ability to make links around the world&#8221; without &#8220;ensnaring them in dead-end, anti-human materialism, or systems of surveillance, coercion and control&#8221; went unfulfilled. The internet giants continued donating cash to the W3C, but they paid little heed to its strictures or goals. The businesses themselves now set the rules for how the web operates, and their rules made a mockery of its inventor&#8217;s ideals. Far from being a shield against materialism, surveillance, and control, the net became their instrument.</p><p>Berners-Lee hasn&#8217;t given up hope. If anything, he seems more optimistic than ever, seeing &#8220;signs of spring&#8221; sprouting everywhere. He rhapsodizes about a future internet that, having incorporated artificial-intelligence and virtual-reality systems, escapes the computer and the phone to become a permanent &#8220;overlay onto the physical world &#8230; a seamless informational filter on top of reality.&#8221; We will be wrapped up in it even more than we are already&#8212;a prospect he welcomes.</p><p>This &#8220;new web&#8221; will also be freed from corporate control. It &#8220;will be decentralized&#8221; and will grant its users &#8220;data sovereignty.&#8221; Thirty-five years of history will be erased, in other words, and Berners-Lee&#8217;s original vision for the web will at last come to pass. 
How he expects this to happen in the face of powerful, entrenched business interests is a little sketchy, but he suggests that social media companies and other internet firms will come to adopt a new set of technical protocols and operating constraints that effectively destroy the data-collection, customer-profiling, and behavior-prediction systems at the core of their business strategies. They will put Berners-Lee&#8217;s ideals ahead of their own profits.</p><p>Inventors are rarely able to see their inventions objectively, just as parents are rarely able to see their children objectively, but Berners-Lee seems particularly impaired. His focus on protocols and codes may be understandable for a programmer, but it has blinded him to a deep and uncomfortable truth about his invention. The web wasn&#8217;t corrupted by outside forces. The corruption was there from the start, latent in its design. A vast, decentralized communication network that can transmit data of all sorts to all people is not resistant to the establishment of information monopolies; it encourages their formation. Decentralization at a technical level breeds centralization at an industrial level. What Cory Doctorow calls &#8220;<a href="https://bookshop.org/a/85280/9780374619329">enshittification</a>&#8221; is not a bug but a feature.</p><p>The reasons are several:</p><ol><li><p>The web is subject to particularly strong network effects. Because a communication system becomes more valuable to users as the number of users increases, a boundaryless network with few physical or technical constraints on its expansion will consolidate traffic on a massive scale, giving strong advantages to the biggest players.</p></li><li><p>The web is a marketplace where an unimaginable number of transactions, both financial and social, are completed every second without regard to the physical location of the participants. 
That favors large intermediaries, or middlemen, that have the infrastructure necessary to host myriad market participants, execute transactions quickly and precisely, and maintain detailed records of all that transpires.</p></li><li><p>Operating at such scale requires large capital investments&#8212;for servers, storage drives, networking gear, cooling systems, and the like. The capital requirements present daunting barriers to entry for newcomers, barriers that are growing more forbidding as resource-intensive AI routines are incorporated into online processes and services.</p></li><li><p>The interpersonal links that Berners-Lee rightfully celebrates for their intellectual and social value also have outsized financial value when captured as data and analyzed by computers. The network effect applies not just to people but to information about people. The more aggregated the information, the bigger an asset it becomes for companies that profit by predicting attitudes and behavior.</p></li><li><p>Consumers benefit from the companies&#8217; scale. Whatever fears people may have about privacy invasions or wealth concentration, they enjoy the personalization, convenience, and diversion that social media companies and other internet outfits serve up in endless quantities for free. People&#8217;s loyalty to algorithmically tuned feeds may be a form of addiction, but it doesn&#8217;t seem to be an addiction many are eager to break.</p></li></ol><p>Berners-Lee may be loath to admit it, but the web he designed rewards both massive scale and centralized control.</p><p>He barely mentions the only practicable way to curb the size and power of the internet giants: breaking them up through aggressive antitrust litigation. Such a course seems unlikely for the moment, as Congress and the courts have for years taken a passive approach to antitrust enforcement, leaving control of monopolies to the marketplace. 
Late last year, a federal judge threw out a long-running, much-watched Federal Trade Commission suit seeking to split up Meta, ruling that the company&#8217;s social-media platforms face sufficient competition for people&#8217;s attention from other popular platforms like YouTube and TikTok. But, as the legal scholar Tim Wu explains in his recent book, <em><a href="https://bookshop.org/a/85280/9780593321249">The Age of Extraction</a></em>, governmental antitrust philosophies tend to shift with public opinion. The robber barons of the Gilded Age were brought to heel by celebrated &#8220;trust busters&#8221; like Teddy Roosevelt at the turn of the twentieth century. Later in the century, AT&amp;T, IBM, and Microsoft all saw their market power constrained by government action. Should public outrage over the financial and political power of internet companies grow in the years ahead, antitrust suits may become more common and more successful.</p><p>But while breaking up companies like Meta and Google could well bring economic and social benefits&#8212;Wu makes a compelling case that the unconstrained growth of monopolies undermines democracy&#8212;it wouldn&#8217;t change the way the web works. An Instagram spun off by Meta would still be Instagram. It would still make money by, to use Wu&#8217;s terminology, extracting attention and data from the masses and using that raw material to feed a lucrative advertising business. The companies that rule the web might not be quite as large as they used to be, but they&#8217;d still do business in much the same way. And they&#8217;d still rule the web.</p><p><em>This Is for Everyone</em> ends with an idyllic anecdote. Berners-Lee and his third wife, the entrepreneur and philanthropist Rosemary Leith, take their fourteen-foot Hobie catamaran out for a sail around a Canadian lake on a fine fall day. Rosemary takes the tiller, while Tim sits on the trampoline between the hulls, manning the mainsheet. 
He feels a sense of exhilaration as the little boat catches the wind and zips across the lake. &#8220;While the physical activity and constant adjustments kept us engaged,&#8221; he writes, &#8220;there were also moments of pure enjoyment. The feeling of the sun on one&#8217;s back, the sound of the water rushing past, the wind in the sails and the sensation of gliding over the waves can create a deep glow of connection with nature.&#8221;</p><p>Then, abruptly, he breaks the reverie to tell us, for the umpteenth time, why his invention is such a gift to us all:</p><blockquote><p>Even on my sailing boat, in the middle of the water on a sunny afternoon in autumn, I could use my smartphone to contact almost anyone alive. I could deliver a package to my doorstep, or a hot lunch to my office. I could buy Rosemary flowers. I could listen to any song ever recorded and watch old clips from <em>Fawlty Towers</em>. I could do my banking or trade my portfolio or donate to UNICEF. I could read the news in any language and check the weather in any country. I could book a flight, a hotel room and a car in nearly any city in the world. All I needed was a web browser and phone reception.</p></blockquote><p>So much for that connection with nature.</p><p>Not all technologies improve people&#8217;s lives. Just as Berners-Lee&#8217;s now omnipresent web shapes industries and markets, it shapes its users&#8217; thoughts, perceptions, and relationships. As we&#8217;re slowly coming to understand, human beings did not evolve to be virtual creatures in a computer-generated world. The internet operates at a scale and speed that conflict with the brain&#8217;s deliberate pace of thought, the intellect&#8217;s slow accumulation of knowledge, and the psyche&#8217;s limited capacity for stimulation and social exchange. 
To be able to do anything and be anywhere at any moment seems liberating for a while, but it ends in a blurred and chaotic existence, the physical world&#8217;s familiar, steadying divisions of space and time dissolving in endless torrents of data. It&#8217;s an existence that may be vivifying to certain software programmers&#8212;<em>Tim, slow down!</em>&#8212;but for the rest of us, the virtual world&#8217;s hyperkinetic superabundance ends up feeling like emptiness, a very, very busy void. We may be drawn to that void by our native attraction to information, novelty, and spectacle, but we&#8217;ll never make a home there.</p><p>It&#8217;s hard not to admire Berners-Lee&#8217;s ability to sustain his idealism about the web in the face of so many disappointments. His optimism, with its childlike resiliency, is winning. But we should be wary of it. The qualities of the web he celebrates unquestioningly are the very qualities we should be questioning. What he can&#8217;t imagine, or at least won&#8217;t allow himself to imagine, is the possibility that his gift to humanity is more bane than boon.</p><div><hr></div><p><em>This review of Tim Berners-Lee&#8217;s </em>This Is for Everyone: The Unfinished Story of the World Wide Web<em> appeared originally, in a slightly different form, in the <a href="https://lareviewofbooks.org">Los Angeles Review of Books</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[From Homo Faber to Homo Fictor]]></title><description><![CDATA[AI smog rolls in.]]></description><link>https://www.newcartographies.com/p/from-homo-faber-to-homo-fictor</link><guid isPermaLink="false">https://www.newcartographies.com/p/from-homo-faber-to-homo-fictor</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Mon, 30 Mar 2026 11:31:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iW7L!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iW7L!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iW7L!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 424w, https://substackcdn.com/image/fetch/$s_!iW7L!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 848w, https://substackcdn.com/image/fetch/$s_!iW7L!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!iW7L!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iW7L!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg" width="1536" height="681" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:681,&quot;width&quot;:1536,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:281762,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/192018722?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F36d06369-da22-4919-a2bd-47bf9ac11322_1536x1740.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!iW7L!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 424w, https://substackcdn.com/image/fetch/$s_!iW7L!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!iW7L!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!iW7L!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62779b92-4d58-43a9-a253-0c901046a7bd_1536x681.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Vermeer, <em>A Lady Writing</em>, c. 
1665 (detail).</figcaption></figure></div><p>The twentieth-century philosophers Hannah Arendt and G&#252;nther Anders were married for eight years, from 1929 until 1937, and I would like to suggest that by marrying Arendt&#8217;s idea of <em>homo faber</em> with Anders&#8217;s idea of Promethean shame we can uncover a hidden aspect of generative artificial intelligence that is fated to cast a tragic shadow over us and our work from here on out. In creating what might best be called <em>machina faber</em>, we have brought into the world an instrument that transcends its instrumentality. The tool becomes a maker itself, a machine that, whatever its current and future shortcomings, intrudes on what Arendt saw as an essence of the human: our ability to fashion a world for ourselves. </p><p>The original <em>homo faber</em> was the artisan, the man who created with his hands objects that were useful but not <em>merely</em> useful. They were much more than what we would today call &#8220;consumer goods&#8221; or &#8220;disposables.&#8221; Crafted with care and skill, <em>homo faber</em>&#8217;s artifacts, whether they took the form of dinner tables or cathedrals, were solid and durable. Withstanding time, they outlasted their makers and their first users. In the aggregate, Arendt argued in <em><a href="https://bookshop.org/a/85280/9780226586601">The Human Condition</a></em>, they formed the foundations of civilization, giving human society a continuity through the passing of generations. Every individual is mortal, but the &#8220;human artifice&#8221; endures. 
</p><blockquote><p>The man-made world of things, the human artifice erected by <em>homo faber</em>, becomes a home for mortal men, whose stability will endure and outlast the ever-changing movement of their lives and actions, only insomuch as it transcends both the sheer functionalism of things produced for consumption and the sheer utility of objects produced for use.</p></blockquote><p>But <em>homo faber</em>, Arendt stressed, is the artist as well as the artisan. The words of the poet, the songs of the musician, the images of the painter and the photographer, the figures of the sculptor: they all carry the stories of humans and humankind through time. They provide the continuity not of physical artifice but of intellectual and aesthetic artifice&#8212;of history and government, of knowledge, wit, beauty.</p><blockquote><p>If mortals need his help to erect a home on earth, acting and speaking men need the help of the <em>homo faber </em>in his highest capacity, that is, the help of the artist, of poets and historiographers, of monument-builders or writers, because without them the only product of their activity, the story they enact and tell, would not survive at all.</p></blockquote><p><em>Homo faber</em> took pride in his work&#8212;his skill gave him the power to turn the inanimate stuff of nature to human purposes&#8212;and the pride was shared by all of humankind. G&#252;nther Anders termed it &#8220;Promethean pride,&#8221; after the Greek god who created man out of mud. But with the Industrial Revolution, in Anders&#8217;s telling, the story took a dark turn. As <em>homo faber</em> came to rely on ever more complex machinery to manufacture goods, he grew alienated from the products he produced. They were no longer the products of his hands; they emerged from a mechanical process in which he played only a small part. The buyers of the goods felt a similar alienation. They could no longer see in the products any human origin. 
They could no longer construct a home out of them.</p><p>We find ourselves today, Anders wrote in <em><a href="https://bookshop.org/a/85280/9781517912659">The Obsolescence of the Human</a></em>,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> surrounded by products that don&#8217;t appear to us as having been produced by us. We have handed off so much of their design and production to industrial technologies that we can no longer take a shared pride in their invention and manufacture. Indeed, they have come to project an otherness that seems not just separate from us but superior to us. They mock us as outdated masses of meat and bone. Today&#8217;s products, Anders observed,</p><blockquote><p>are simply &#8220;there.&#8221; We encounter them primarily as necessary, desirable, superfluous, affordable, or unaffordable consumer goods that become &#8220;mine&#8221; only after I have bought them.
As such, they are much more likely to be proof of one&#8217;s own insufficiency than evidence of one&#8217;s power.</p></blockquote><p>This sense of insufficiency transformed &#8220;Promethean pride&#8221; into &#8220;Promethean shame&#8221;&#8212;the shame contemporary man feels at having been born instead of made, of being a product of natural processes rather than technological ones. &#8220;He despises himself,&#8221; wrote Anders, &#8220;in the same way that things would despise him if they could.&#8221; The shame, he went on, becomes particularly sharp when a person first sees a so-called thinking machine:</p><blockquote><p>As for the man who is for the first time confronted with a working computing machine, self-aggrandizement and pride are even more alien to him. An observer who erupts with the exclamation, &#8220;My goodness, aren&#8217;t we great guys, to be capable of this!&#8221; when encountering such a machine is a clown, a figment of the imagination. Quite the contrary! He rather murmurs with a shake of his head, &#8220;My god, it&#8217;s incredible what it&#8212;the machine&#8212;can do!&#8221; At the same time, he feels highly ill at ease in his creaturely skin, for the machine half gives him the creeps and half puts him to shame.</p></blockquote><p>Though it was written seventy-five years ago, that last sentence strikes me as one of the more perceptive descriptions of man&#8217;s confrontation with generative AI: &#8220;he feels highly ill at ease in his creaturely skin, for the machine half gives him the creeps and half puts him to shame.&#8221;</p><p>With the arrival of <em>machina faber</em> in the form of AI, <em>faber (</em>the maker) no longer feels like the right term to apply to us humans. I would suggest that <em>fictor</em> (the fabricator) is now the better descriptor. 
In Latin, <em>fictor</em>, like <em>faber</em>, denotes a maker (though one who works in malleable materials like clay rather than in solid ones like wood or metal), but the word, like the English word <em>fabricator</em>, also carries a connotation of fakery or deception. It derives from the verb <em>fingere</em>, which means both to fashion and to feign. <em>Homo fictor</em> is the maker who may be lying about what he makes. His works, however useful, however elegant, will always be suspect. They will always carry a hint of fraud. Did he make them, or did AI? </p><p><em>Machina faber</em> steals from makers their pride in accomplishment. If you use AI to &#8220;write&#8221; something, or to &#8220;code&#8221; something, or to &#8220;compose&#8221; something, or to &#8220;design&#8221; something, or to &#8220;invent&#8221; something, then any true sense of accomplishment will be withheld from you. You will always know that you&#8217;re a fraud. You will always be ashamed of yourself, even if you&#8217;re in denial about your shame. As Anders wrote, &#8220;Anyone who denies the existence of this form of shame does so because the admission that we&#8217;ve come such a gloriously long way only to now feel shame in front of things would itself make them blush with shame.&#8221;</p><p>But&#8212;and here&#8217;s the tragic part&#8212;the shame is not restricted to those artisans and artists who use AI in their work. The shame shadows everyone, even those who abstain from using AI, even those who <a href="https://substack.com/home/post/p-190029457">take pride</a> in their public rejection of AI. 
In a recent <em>New York Times</em> <a href="https://www.nytimes.com/2026/03/25/opinion/shy-girl-ai-publishing.html">op-ed</a> about the controversy surrounding <em>Shy Girl</em>, the novel cancelled by its publisher because its prose was felt to carry an odor of AI, the novelist Andrea Bartz writes:</p><blockquote><p>When readers ask questions about my thriller novels, I love to discuss the themes and characters in them and the inspiration for my writing. But as generative artificial intelligence worms its way through the publishing industry, I&#8217;m bracing for a stomach-turning query: Did you actually write this?</p></blockquote><p>Fair or not, that&#8217;s the question that is now inescapable for all makers. Even if you don&#8217;t traffic in AI slop, you&#8217;re still subject to AI smog. Its odor is everywhere, as is the shame it carries. No one is beyond suspicion. Nothing is pure.</p><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, the New Cartographies series on AI and its cultural and economic consequences.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>The first full English translation of <em><a href="https://www.upress.umn.edu/9781517912659/the-obsolescence-of-the-human/">The Obsolescence of the Human</a></em> was published last year by the University of Minnesota Press. 
The book was originally published in German in 1956.</p></div></div>]]></content:encoded></item><item><title><![CDATA[A Brief History of Educational Machinery]]></title><description><![CDATA[One revolution after another.]]></description><link>https://www.newcartographies.com/p/a-brief-history-of-educational-machinery</link><guid isPermaLink="false">https://www.newcartographies.com/p/a-brief-history-of-educational-machinery</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 22 Mar 2026 16:31:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!6ZDW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a> is a piece I posted in late 2014, at the tail end of the great MOOC hype. (&#8220;MOOC&#8221; was an acronym  for massive open online course, in case you&#8217;ve forgotten.) 
As tech companies push AI tools into schools, and teachers and students adopt them in a rush to make education more efficient, it&#8217;s worth asking what exactly we&#8217;re trying to accomplish by automating the work of learning&#8212;a topic I discussed earlier in &#8220;<a href="https://www.newcartographies.com/p/the-myth-of-automated-learning">The Myth of Automated Learning</a>.&#8221;</em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6ZDW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6ZDW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 424w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 848w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 1272w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!6ZDW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png" width="1212" height="796" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:796,&quot;width&quot;:1212,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Image.png&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Image.png" title="Image.png" srcset="https://substackcdn.com/image/fetch/$s_!6ZDW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 424w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 848w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 1272w, https://substackcdn.com/image/fetch/$s_!6ZDW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf380bfb-525d-4eb4-a990-61208bff1aac_1212x796.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><em>John Tenniel illustration for 1865 edition of </em>Alice in Wonderland<em>.</em></figcaption></figure></div><blockquote><p>&#8220;I feel like there&#8217;s a red pill and a blue pill, and you can take the blue pill and go back to your classroom and lecture your 20 students.
But I&#8217;ve taken the red pill, and I&#8217;ve seen Wonderland.&#8221; &#8211;Sebastian Thrun, 2012</p></blockquote><p>Now that we&#8217;ve begun to talk of MOOCs retrospectively, the time has come to update my previously published <a href="https://www.roughtype.com/?p=1892">survey</a> of the history of hype and wishful thinking that has for more than a century surrounded technologies for automating education by replacing teachers and classrooms with machines and media. I am adding a new entry to the list. I suspect it won&#8217;t be the last addition.</p><p><em>Mail:</em> Around 1885, Yale professor William Rainey Harper, a pioneer of teaching-by-post, said, &#8220;The student who has prepared a certain number of lessons in the correspondence school knows more of the subject treated in those lessons, and knows it better, than the student who has covered the same ground in the classroom.&#8221; Soon, he predicted, &#8220;the work done by correspondence will be greater in amount than that done in the class-rooms of our academies and colleges.&#8221;</p><p><em>Phonograph:</em> In an 1878 article on &#8220;practical uses of the phonograph,&#8221; the <em>New York Times</em> predicted that the phonograph would be used &#8220;in the school-room in training
children to read properly without the personal attention of the teacher; in teaching them to spell correctly, and in conveying any lesson to be acquired by study and memory. In short, a school may almost be conducted by machinery.&#8221;</p><p><em>Movies:</em> &#8220;It is possible to teach every branch of human knowledge with the motion picture,&#8221; proclaimed Thomas Edison in 1913. &#8220;Our school system will be completely changed in 10 years.&#8221;</p><p><em>Radio:</em> In 1927, the University of Iowa declared that &#8220;it is no imaginary dream to picture the school of tomorrow as an entirely different institution from that of today, because of the use of radio in teaching.&#8221;</p><p><em>TV:</em> &#8220;During the 1950s and 1960s,&#8221; report education scholars Marvin Van Kekerix and James Andrews, &#8220;broadcast television was widely heralded as the technology that would revolutionize education.&#8221; In 1963, an official with the National University Extension Association wrote that television provided an &#8220;open door&#8221; to transfer &#8220;vigorous and vital learning&#8221; from campuses to homes.</p><p><em>Computers:</em> &#8220;There won&#8217;t be schools in the future,&#8221; wrote MIT&#8217;s Seymour Papert in 1984. &#8220;I think the computer will blow up the school. That is, the school defined as something where there are classes, teachers running exams, people structured into groups by age, following a curriculum &#8212; all of that.&#8221;</p><p><em>World Wide Web:</em> The arrival of the web brought the e-learning fad of the late 1990s, as universities and corporations rushed to invest in online courses. In 1999, Cisco CEO John Chambers told the <em>Times</em>&#8217;s Thomas Friedman, &#8220;The next big killer application for the Internet is going to be education.
Education over the Internet is going to be so big, it&#8217;s going to make e-mail usage look like a rounding error.&#8221;</p><p><em>MOOCs:</em> The <em>New York Times</em> declared 2012 &#8220;the year of the MOOC.&#8221; &#8220;Welcome to the college education revolution,&#8221; wrote the ever-hopeful Friedman in a column heralding massive open online courses. &#8220;In five years this will be a huge industry.&#8221; The MOOC &#8220;is transforming higher education,&#8221; declared the <em>Economist</em>, &#8220;threatening doom for the laggard and mediocre.&#8221; Academics were equally bedazzled. &#8220;There&#8217;s a tsunami coming,&#8221; said Stanford president John Hennessy. Opined MIT president Rafael Reif: &#8220;I am convinced that digital learning is the most important innovation in education since the printing press.&#8221; Harvard&#8217;s Clayton Christensen predicted &#8220;wholesale bankruptcies&#8221; among traditional universities.</p><p>All of these mediums and devices played useful roles in education and training&#8212;which is something worth celebrating&#8212;but none of them turned out to be revolutionary or transformative. 
There may be a deeper lesson here, a lesson about how easy it is to overlook the intangible virtues not just of classrooms and teachers but of presence, of bringing students together in one place at one time.</p>]]></content:encoded></item><item><title><![CDATA[Creative Work in an Age of Digital Production]]></title><description><![CDATA[On machine formalism.]]></description><link>https://www.newcartographies.com/p/creative-work-in-an-age-of-digital</link><guid isPermaLink="false">https://www.newcartographies.com/p/creative-work-in-an-age-of-digital</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Thu, 05 Mar 2026 17:12:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!a0-2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!a0-2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!a0-2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 424w, https://substackcdn.com/image/fetch/$s_!a0-2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 848w, 
https://substackcdn.com/image/fetch/$s_!a0-2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 1272w, https://substackcdn.com/image/fetch/$s_!a0-2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!a0-2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png" width="1400" height="746" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:746,&quot;width&quot;:1400,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2409248,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/183812047?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!a0-2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 424w, 
https://substackcdn.com/image/fetch/$s_!a0-2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 848w, https://substackcdn.com/image/fetch/$s_!a0-2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 1272w, https://substackcdn.com/image/fetch/$s_!a0-2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F53f93168-fdcb-4979-aa6f-3a511cfbbf52_1400x746.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Enjoy it while it lasts.</figcaption></figure></div><p style="text-align: center;"><em>&#8220;After the first minute of content you will have what we call minutes 1 thru 3.&#8221; <br>&#8212;MrBeast, &#8220;How to Succeed in MrBeast Production&#8221;</em></p><p>A couple of years ago, I watched a MrBeast YouTube video on my phone. References to MrBeast were everywhere at the time&#8212;he had recently become the most popular YouTuber ever&#8212;and I felt that, to be culturally current, I should acquaint myself with his oeuvre. </p><p>My memory of the video is fuzzy, but the premise went something like this: Two strangers, a nondescript young woman and a nondescript young man, get locked together in a big windowless room. If they&#8217;re able to stay put for a hundred days, they&#8217;ll win a substantial amount of money&#8212;a million dollars, I think it was. I remember that, to emphasize the stakes, MrBeast (also a nondescript young man, though with a slightly odd affect) wheeled in a pallet on which was piled a million dollars in bundles of bills. What impressed me most about the show was its length. It seemed to go on forever, as if the hundred days were unfolding in real time.
When it ended, I thought I should be given the million dollars.</p><p>MrBeast has been in the news again, as the new season of his popular Amazon Prime extravaganza, <em>Beast Games</em>, debuted earlier this year. The <em>New York Times</em> ran two lengthy articles about him in December. He has, according to one of the articles, &#8220;explod[ed] the media industry&#8217;s understanding of the term &#8216;creator.&#8217;&#8221; I have no idea what that means, but it sounds like MrBeast must be doing something important. The <em>Wall Street Journal</em> seems to think so, too. In January, it ran a long, largely glowing profile of him, noting that he is known as &#8220;the man who cracked the secret of going viral.&#8221;</p><p>What exactly is that secret? What is it that MrBeast, as a &#8220;creator,&#8221; is &#8220;creating&#8221;? The obvious answer would be &#8220;content.&#8221; Content, after all, is what we talk about when we talk about digital media. It&#8217;s certainly what MrBeast talks about. But the more I think about online programming, the more I&#8217;m convinced that content isn&#8217;t what matters. What matters is form. Digital media aspires to, and often achieves, a state of contentlessness.
Spend some time on MrBeast&#8217;s channel, or scroll down your Instagram feed or your X feed or your Apple News feed, or swipe through your For You page on TikTok. What you&#8217;re seeing is the repetition of a pattern, a pattern that has been statistically determined to have the highest odds of holding your attention. What fills the pattern at any given instant&#8212;what we call <em>content</em>&#8212;is fungible and disposable. It&#8217;s not important. It&#8217;s the pattern, the form the content fits and replicates, that&#8217;s important.</p><p>Popular culture has always been formulaic. Every hit pop song, or situation comedy, or detective novel spawns scores of copies. A lot of crap results. But good stuff does, too, sometimes. When the Rolling Stones were pressured by their producer and record company to come up with a quick followup to their 1965 smash &#8220;(I Can&#8217;t Get No) Satisfaction,&#8221; they delivered a knockoff: &#8220;Get Off of My Cloud.&#8221; The knockoff, if not quite in the same league as the original, was itself a great single. The Stones were savvy and skilled enough to make the knockoff fun and memorable in its own right, and by including in the lyrics a sarcastic swipe at advertising and consumerism &#8212; &#8220;he says I&#8217;ve won five pounds if I have his kind of detergent pack&#8221; &#8212; they gave a wink to their fans. The Stones knew the fans knew and the fans knew the Stones knew what the Stones were up to. The knockoff ended up being better for being a knockoff.</p><p>Such welcome things happen when talented people work in an established form, even a very narrow one, even one shaped by commercial interests. You get more than the reproduction of a pattern. You get something that&#8217;s both familiar and new. To go back a bit further in time, think of the sonnet.
I don&#8217;t know who first wrote a poem in the fourteen-line form that came to be known as the sonnet&#8212;it was popularized by Petrarch in the fourteenth century&#8212;but for ages thereafter poets found ways to make that apparently rigid form their own. (Sonnets, wrote John Donne, are &#8220;pretty rooms&#8221; that you furnish to your own taste.) A good artist works both within and against a form. Art emerges from a struggle between what T. S. Eliot termed &#8220;tradition&#8221; (i.e., established forms) and &#8220;individual talent.&#8221;</p><p>Something very different happens when machines begin to establish and fill the patterns. The creative tension between form and individual talent gets resolved in form&#8217;s favor. Thanks to the algorithmic tracking of demand and the algorithmic delivery of supply, digital media has for the last twenty years promoted an ideal of perfect form&#8212;not a Platonic ideal but a consumerist one. Machines evaluate form microscopically, precisely measuring the effects of tiny variations in pixels or words or musical notes in order to determine the optimal pattern for each consumer. They then construct personalized feeds by repeatedly populating the optimal pattern with new stuff. </p><p>The speed of computers and computer networks makes the repetition of patterns instantaneous. Because the supply of messages is essentially unlimited&#8212;billions of people are churning new ones out every moment, and many of those people are themselves &#8220;creators&#8221; who are highly conscious of optimal patterns and aim to match them&#8212;the feed algorithm can replicate the form over and over again with no delay and no end. For both algorithm and creator, precision in form-matching takes precedence over individual inspiration and creativity. It didn&#8217;t matter which MrBeast production I watched, just as it doesn&#8217;t matter what the next item in my Instagram feed is. 
They&#8217;re all the same.</p><p>It&#8217;s easy to understand why automated media would concentrate on replicating forms. Evaluating works of art, or any creative products of skill and imagination, requires a mind, which the machines of media automation lack. Identifying a pattern requires only a statistical procedure, which is what the machines have. Indeed, it&#8217;s all they have. People like MrBeast didn&#8217;t crack the code of virality. Machines did. MrBeast&#8217;s great strength as a contemporary creator is that he has no ambition beyond repeating a pattern. He&#8217;s a machine-listener. He attends to the machine, and he does what it tells him to do. Here&#8217;s how he puts it in a recently leaked staff memo, &#8220;<a href="https://drive.google.com/file/d/1YaG9xpu-WQKBPUi8yQ4HaDYQLUSa7Y3J/view">How to Succeed in MrBeast Production</a>&#8221;:</p><blockquote><p>I spent basically 5 years of my life locked in a room studying virality on Youtube. Some days me and some other nerds would spend 20 hours straight studying the most minor thing: like is there a correlation between better lighting at the start of the video and less viewer drop off (there is, have good lighting at the start of the video haha) or other tiny things like that. And the result of those probably 20,000 to 30,000 hours of studying is I&#8217;d say I have a good grasp on what makes Youtube videos do well. The three metrics you guys need to care about is Click Thru Rate (CTR), Average View Duration (AVD), and Average View Percentage (AVP).</p></blockquote><p>And yet, however mechanical he may be, MrBeast is still a human being and as such will always fall short of machine levels of efficiency in pattern repetition. To the machines, or at least to their owners and operators, human creators have always been necessary evils, inefficient cogs needed only because the machines were incapable of generating content out of their own resources. 
The companies needed machine-listeners (the creators) to follow the machine&#8217;s instructions (the metrics) as slavishly as possible in order to produce chunks of content to feed back into the machine for distribution.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> </p><p>With AI, at last, the machines can take over the creator&#8217;s role. AI-generated slop marks the triumph of machine formalism. The machine establishes the pattern, and the machine fills the pattern with its own creation. The automated media system is relieved of human inefficiency, not to mention human sensibility. The same thing happens in the automation of factories and warehouses. People are kept on hand to perform tasks that robots aren&#8217;t good at doing&#8212;boxing up orders, say, or feeding parts into the machine&#8212;until the robots get good at doing them. </p><p>In automated systems, human beings are placeholders for future machines. Until recently, we assumed that creative types who produce content for media systems were exceptions to that rule. We&#8217;re now going to test that assumption. Is MrBeast necessary? Am I?</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>When I publish this post, Substack will offer to run an A/B test on its title, measuring how different, AI-generated titles affect various measures of readership. I assume that Substack will soon offer to do such tests on AI rewrites of the entire text. 
Why leave anything to chance, or to taste, if you don&#8217;t have to?</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Flame and Filament]]></title><description><![CDATA[The deathliness of progress.]]></description><link>https://www.newcartographies.com/p/flame-and-filament</link><guid isPermaLink="false">https://www.newcartographies.com/p/flame-and-filament</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 01 Mar 2026 14:01:39 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Srhu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a> is the epilogue to </em><a href="https://www.nicholascarr.com/?page_id=21">The Big Switch</a>, <em>my 2008 book on cloud computing and its consequences (and electrification and its consequences). This closing bit looks at the way humanity&#8217;s sense of progress hinges on generational turnover&#8212;on death, to be blunt. 
It&#8217;s a subject that&#8217;s almost never discussed, but I hope to explore it further this year.</em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Srhu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Srhu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Srhu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg" width="800" height="603" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:603,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:93404,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/189218000?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Srhu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Srhu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a7acd8e-8556-485f-9456-fc0e4803bf73_800x603.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">John Collier, <em>The Prodigal Daughter</em> (1903).</figcaption></figure></div><p>One of man&#8217;s greatest inventions was also one of his most modest: the wick. We don&#8217;t know who first realized, many thousands of years ago, that fire could be isolated at the tip of a twisted piece of cloth and steadily fed, through capillary action, by a reservoir of wax or oil, but the discovery was, as Wolfgang Schivelbusch writes in <em>Disenchanted Night</em>, &#8220;as revolutionary in the development of artificial lighting as the wheel in the history of transport.&#8221; The wick tamed fire, allowing it to be used with a precision and an efficiency far beyond what was possible with a wooden torch or a bundle of twigs. In the process, it helped domesticate us as well.
It&#8217;s hard to imagine civilization progressing to where it is today by torchlight.</p><p>The wick also proved an amazingly hardy creation. It remained the dominant lighting technology all the way to the nineteenth century, when it was replaced first by the wickless gas lamp and then, more decisively, by Edison&#8217;s electricity-fueled incandescent bulb with its glowing metal filament. Cleaner, safer, and even more efficient than the flame it replaced, the light bulb was welcomed into homes and offices around the world. But along with its many practical benefits, electric light also brought subtle and unexpected changes to the way people lived. The fireplace, the candle, and the oil lamp had always been focal points of households. Fire was, as Schivelbusch puts it, &#8220;the soul of the house.&#8221; Families would in the evening gather in a central room, drawn by the flickering flame, to chat about the day&#8217;s events or otherwise pass time together. Electric light, together with central heat, dissolved that long tradition. Family members began to spend more time in different rooms in the evening, studying or reading or working alone. Each person gained more privacy, and a greater sense of autonomy, but the cohesion of the family weakened.</p><p>Cold and steady, electric light lacked the allure of the flame. It was not mesmerizing or soothing but strictly functional. It turned light into an industrial commodity. A German diarist in 1944, forced to use candles instead of lightbulbs during nightly air raids, was struck by the difference. &#8220;We have noticed,&#8221; he wrote, &#8220;in the &#8216;weaker&#8217; light of the candle, objects have a different, a much more marked profile &#8212; it gives them a quality of &#8216;reality.&#8217;&#8221; This quality, he continued, &#8220;is lost in electric light: objects (seemingly) appear much more clearly, but in reality it <em>flattens</em> them. 
Electric light imparts too much brightness and thus things lose body, outline, substance &#8212; in short, their essence.&#8221;</p><p>We&#8217;re still attracted to a flame at the end of a wick. We light candles to set a romantic or a calming mood, to mark a special occasion. We buy ornamental lamps that are crafted to look like candles or candleholders, with bulbs shaped as stylized flames. But we can no longer know what it was like when fire was the source of all light. The number of people who remember life before the arrival of Edison&#8217;s bulb has dwindled to just a few, and when they go they&#8217;ll take with them all remaining memory of that earlier, pre-electric world. The same will happen, sometime toward the end of this century, with the memory of the world that existed before the computer and the Internet became commonplace. We&#8217;ll be the ones who bear it away.</p><p>All technological change is generational change. The full power and consequence of a new technology are unleashed only when those who have grown up with it become adults and begin to push their outdated parents to the margins. As the older generations die, they take with them their knowledge of what was lost when the new technology arrived. Only the sense of what was gained remains. 
It&#8217;s in this way that progress covers its tracks, perpetually refreshing the illusion that where we are is where we were meant to be.</p>]]></content:encoded></item><item><title><![CDATA[Is AI the Paperclip?]]></title><description><![CDATA[Scale at all costs.]]></description><link>https://www.newcartographies.com/p/is-ai-the-paperclip</link><guid isPermaLink="false">https://www.newcartographies.com/p/is-ai-the-paperclip</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Mon, 09 Feb 2026 18:27:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!JtHf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JtHf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!JtHf!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 424w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 848w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 1272w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JtHf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png" width="770" height="537" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/db9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:537,&quot;width&quot;:770,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:546182,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/187404005?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" 
alt="" srcset="https://substackcdn.com/image/fetch/$s_!JtHf!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 424w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 848w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 1272w, https://substackcdn.com/image/fetch/$s_!JtHf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb9e08a7-c74b-47ed-ae1c-272c53366c3e_770x537.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Meta promotional image illustrating size of its proposed Hyperion data center.</figcaption></figure></div><p>In a <a href="https://nickbostrom.com/ethics/ai">paper</a> published in 2003, the philosopher Nick Bostrom sketched out a thought experiment aimed at illustrating an existential risk that artificial intelligence might eventually pose to humanity. An advanced AI is given, by its human programmers, the objective of optimizing the production of paperclips. The machine sets off in monomaniacal pursuit of the objective, its actions untempered by common sense or ethical sense. The result, Bostrom wrote, is &#8220;a superintelligence whose top goal is the manufacturing of paperclips, with the consequence that it starts transforming first all of earth and then increasing portions of space into paperclip manufacturing facilities.&#8221; It destroys everything, including its programmers, in a mad rush to gather resources for paperclip production.</p><p>Bostrom went on to refine his &#8220;paperclip maximizer&#8221; thought experiment in subsequent writings and interviews, and it soon became a touchstone in debates about AI. Eminences as diverse as Stephen Hawking and Elon Musk would routinely bring it up in discussing the dangers of artificial intelligence. Others were skeptical. They found the story far-fetched, even by thought-experiment standards. It seemed, as <em>The Economist</em> <a href="https://www.economist.com/special-report/2016/06/23/frankensteins-paperclips">wrote</a>, a little too &#8220;silly&#8221; to be taken seriously.
</p><p>I was long in the skeptic camp, but recently I&#8217;ve had a change of heart. Bostrom&#8217;s story, I would argue, becomes compelling when viewed not as a thought experiment but as a fable. It&#8217;s not really about AIs making paperclips. It&#8217;s about people making AIs. Look around. Are we not madly harvesting the world&#8217;s resources in a monomaniacal attempt to optimize artificial intelligence? Are we not trapped in an &#8220;AI maximizer&#8221; scenario?</p><p>&#8220;The intelligence of an AI model roughly equals the log of the resources used to train and run it,&#8221; OpenAI CEO Sam Altman <a href="https://blog.samaltman.com/three-observations">wrote</a> a year ago. The important word here is &#8220;log.&#8221; As Donald MacKenzie explains in an insightful <a href="https://www.lrb.co.uk/the-paper/v48/n02/donald-mackenzie/ai-s-scale">article</a> on AI in the <em>London Review of Books</em>:</p><blockquote><p>A logarithmic function, at least of the kind that is relevant here, is characterised by diminishing returns.
The more resources you put in, the better the results, but the rate of improvement steadily diminishes.</p></blockquote><p>To maintain a linear path of improvement in the performance of today&#8217;s neural-network-based AI models requires an exponential increase in resources. Ever larger inputs achieve ever smaller gains. But people like Altman remain absolutely committed to making those escalating resource investments, no matter the monetary or social or environmental cost. Because they believe that vast winner-take-all rewards will come to any company achieving superior scale in AI, they will devote all available resources&#8212;energy, water, real estate, data, chips, people&#8212;to the pursuit of even a tiny scale advantage.</p><p>Elon Musk, having abandoned his earlier misgivings about AI, <a href="https://www.spacex.com/updates#xai-joins-spacex">announced</a> last week that he was merging xAI into SpaceX. The combined companies were &#8220;scaling to make a sentient sun to understand the Universe and extend the light of consciousness to the stars!&#8221; he declared. &#8220;In the long term, space-based AI is obviously the only way to scale.&#8221; It&#8217;s exactly what Bostrom predicted. The monomaniacs will not stop with the resources of the Earth. They&#8217;ll extend their plundering to the heavens. 
Everything is raw material.</p><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, the New Cartographies series on AI and its cultural and economic consequences.</em></p>]]></content:encoded></item><item><title><![CDATA[The "User-Generated Content" Ruse]]></title><description><![CDATA[The feed is the content.]]></description><link>https://www.newcartographies.com/p/the-user-generated-content-ruse</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-user-generated-content-ruse</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Thu, 29 Jan 2026 18:31:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!OXzf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!OXzf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!OXzf!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 424w, https://substackcdn.com/image/fetch/$s_!OXzf!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!OXzf!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!OXzf!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!OXzf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg" width="1175" height="850" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:850,&quot;width&quot;:1175,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:169787,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/186207392?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!OXzf!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 424w, https://substackcdn.com/image/fetch/$s_!OXzf!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 
848w, https://substackcdn.com/image/fetch/$s_!OXzf!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!OXzf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdcdeb6b3-aeae-4d79-947f-07958a883f73_1175x850.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Cornelis Claesz van Wieringen, <em>Battle of Gibraltar</em> (c.
1621)</figcaption></figure></div><p>Big social media companies are facing hundreds of personal-injury lawsuits claiming that their platforms have harmed people, particularly kids. Lawyers for the plaintiffs, which include individuals, states, and school districts, are modeling the suits on the successful litigation against cigarette companies at the end of the last century. Should the social media companies lose the suits, the first of which <a href="https://www.politico.com/news/2026/01/27/social-media-youth-addiction-trial-00747653?utm_campaign=etb&amp;utm_medium=newsletter&amp;utm_source=morning_brew">began</a> this week in Los Angeles, they would face not just massive payouts but also the prospect of extensive new regulatory controls on their businesses, just as tobacco companies did.</p><p>The internet giants have armies of lawyers, and they&#8217;re spending millions to block the suits. They claim, as they always have in the past, that they&#8217;re shielded from such litigation by the 1996 Communications Decency Act. As the <em>Wall Street Journal</em> <a href="https://www.wsj.com/opinion/social-media-lawsuits-trial-lawyers-google-tiktok-meta-dd2a8730?st=inQ68k&amp;reflink=desktopwebshare_permalink">writes</a>, in an editorial sympathetic to the companies, &#8220;The first problem with these cases is that Section 230 of the 1996 Communications Decency Act says internet platforms can&#8217;t be held liable for user-generated content.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> But that old argument no longer holds water. 
The content produced by social media companies today is anything but &#8220;user-generated.&#8221; To think otherwise is to misunderstand how social media operates&#8212;and to misinterpret the scope of Section 230.</p><p>In 1996, when Congress passed the Communications Decency Act,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> the big internet companies were internet service providers, or ISPs. Their role was limited to providing customers with access to the net, usually through dial-up connections over telephone lines. The ISPs acted as common carriers, transmitting information created by others&#8212;a role similar to that of traditional telephone companies or even the post office. Just as it would have been unfair to hold a mailman liable for the content of the letters he delivered to people&#8217;s mailboxes, so it would have been unfair to hold ISPs liable for the content of the emails and web pages they delivered to people&#8217;s computers.
Section 230 provides internet carriers with a safe harbor from litigation so long as they restrict themselves to transporting data and do not act as &#8220;publisher or speaker&#8221; of the content they deliver:</p><blockquote><p>No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. </p></blockquote><p>Back in the early days of social media, it could be argued that Section 230 still applied. When Facebook started up in 2004, for instance, it provided its users with templates for inputting and organizing personal profiles and messages, but its main role was to connect people through an online network so they could share the content they created. The users were the speakers and the publishers of the content. Facebook was the carrier of the content.</p><p>That all changed in 2006 when Facebook introduced its News Feed. The users no longer controlled what they saw when they logged on to the network; they now saw a &#8220;feed&#8221; of information that was controlled by the algorithms Facebook wrote. The company was not just a carrier of content. It had taken on an explicitly editorial role. Like the editors at newspapers or the producers at TV networks, it selected and arranged the information that its users saw. The users had become an audience for Facebook&#8217;s production.</p><p>The story of social media ever since has been a story of the refinement of feeds as a media product aimed at capturing and holding an audience. The platforms have invested billions of dollars in designing those feeds&#8212;what they contain, how they look, how they work&#8212;to make them as &#8220;engaging&#8221; as possible. To argue that the companies are still in the business of transmitting &#8220;user-generated content&#8221; is absurd. Saying that a social-media feed is the product of users is like saying that a hot dog is the product of cattle. 
</p><p>The companies are not common carriers anymore; they&#8217;re media businesses. Yes, users still contribute posts and comments&#8212;though even those, in today&#8217;s era of influencers, creators, and AI, are often subsidized and actively shaped by the companies&#8212;but the essential content of social media is now the feeds produced by the platforms, not the individual messages posted by users. Go to Instagram and scroll through your feed. It&#8217;s obvious that what you&#8217;re experiencing is not discrete bits of user-generated content. It&#8217;s an elaborate, finely tuned media production manufactured by Instagram for an audience of one: you. The same goes for YouTube, X, TikTok, Facebook, Snapchat, Substack Notes, and, with a few exceptions, all the rest.</p><p>The feed is the content, and the social media company is its publisher. Period.</p><p>The question of whether social media companies should be held liable for harming people is a legally complex one, which would best be answered through courts of law. And that&#8217;s what should happen. Let the plaintiffs make their case, and let the defendants defend themselves. Section 230&#8217;s safe harbor doesn&#8217;t apply. 
Social media companies are, like other media companies, in the content-production business, and they&#8217;re responsible for their programming.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Section 230 makes no mention of &#8220;user-generated content.&#8221; That phrase didn&#8217;t come into common usage until the arrival of the social web several years later.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>The Act ended up being thrown out as unconstitutional by the courts. Only Section 230 survived.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Diplomacy by WhatsApp]]></title><description><![CDATA[Drop the bomb emoji.]]></description><link>https://www.newcartographies.com/p/diplomacy-by-whatsapp</link><guid isPermaLink="false">https://www.newcartographies.com/p/diplomacy-by-whatsapp</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Wed, 21 Jan 2026 18:10:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2utU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2utU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!2utU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 424w, https://substackcdn.com/image/fetch/$s_!2utU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2utU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2utU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2utU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg" width="1260" height="902" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:902,&quot;width&quot;:1260,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:252640,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/185308325?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!2utU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 424w, https://substackcdn.com/image/fetch/$s_!2utU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2utU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2utU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc39c023e-e6c5-4b6c-aae0-9100aa5f5f86_1260x902.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Joseph Stella, <em>Telegraph Poles</em> (1915)</figcaption></figure></div><p>If the stakes weren&#8217;t so high, the barrage of text messages between world leaders in the days running up to the World Economic Forum in Davos would be amusing. Texting turns everyone into a semiliterate twelve-year-old, and presidents, prime ministers, and secretaries general are no exception. We&#8217;re used to the President of the United States communicating with the American public in weirdly punctuated streams of all-caps, exclamation marks, and typos, but to know that texting has now become the de facto language of diplomacy is a revelation, and a worrying one.<br><br>Whether it&#8217;s a handwritten letter, a telephone call, a fax, an email, or a text, the medium through which people communicate shapes what they say and how they say it. Some mediums encourage formal speech while others encourage casual banter. Some are suited to full sentences and well-turned paragraphs; others to sentence fragments and cliches. Some promote consideration; others, abruptness. In general, a medium&#8217;s speed of delivery is inversely correlated to the thoughtfulness and nuance of the messages it carries. The growing hegemony of the instant message, it seems fair to say, is not fostering eloquence in either private correspondence or public speaking. Texts are great for quick, offhand exchanges.
They debase pretty much everything else.</p><p>Because texting is resistant to the expression of complicated or subtle ideas or arguments &#8212; it is, by design, a medium of speed, compression, and simplicity &#8212; it&#8217;s particularly ill-suited to grappling with complex issues or solving complex problems. To use it for international relations and other aspects of governmental policy making, particularly in fraught situations, is a sure route to misunderstanding, anger, and the escalation of tensions.</p><p>The most telling precedent for what we&#8217;re seeing today is the change in diplomatic practices that occurred with the arrival of international telegraph and telephone lines in the late nineteenth century &#8212; an episode I describe in my book <em><a href="https://www.nicholascarr.com/?page_id=664">Superbloom</a></em>. The unprecedented ability of far-flung leaders and diplomats to talk directly with each other without delay spurred great hopes. It seemed obvious that the resulting exchanges would ease friction and encourage goodwill among nations.
Nikola Tesla, in an 1898 interview about his work on wireless telegraph systems, said that he would be &#8220;remembered as the inventor who succeeded in abolishing war.&#8221; His rival, Guglielmo Marconi, declared in 1912 that wireless telegraphy would &#8220;make war impossible.&#8221;</p><p>What actually happened was altogether different. In the lead-up to the Franco-Prussian War of 1870, telegraphic communications inflamed tensions rather than dampening them. Writes the French historian Pierre Granet: &#8220;The constant transmission of dispatches between governments and their agents, the rapid dissemination of controversial information among an already agitated public, hastened, if it did not actually provoke, the outbreak of hostilities.&#8221;</p><p>The start of the First World War in 1914, two years after Marconi announced the end of war, was similarly hastened by the new communication mediums. After the June 28 assassination of Austrian Archduke Franz Ferdinand in Sarajevo, hundreds of urgent diplomatic messages raced between European capitals through newly strung telegraph and telephone wires. As the historian Stephen Kern describes in <em>The Culture of Time and Space 1880&#8211;1918,</em> the rapid-fire dispatches quickly devolved into ultimatums and threats. &#8220;Communication technology imparted a breakneck speed to the usually slow pace of traditional diplomacy and seemed to obviate personal diplomacy,&#8221; Kern writes. &#8220;Diplomats could not cope with the volume and speed of electronic communication.&#8221;</p><p>Diplomacy, a communicative art, had been overwhelmed by communication. By August, the world was at war. &#8220;The moral qualities&#8212;prudence, foresight, intelligence, penetration, wisdom&#8212;of statesmen and nations have not kept pace [with the] rapidity of communication by telegraph and telephone,&#8221; the distinguished British diplomat Ernest Satow wrote in his 1917 <em>Guide to Diplomatic Practice</em>.
&#8220;These latter leave no time for reflection or consultation, and demand an immediate and often a hasty decision on matters of vital importance.&#8221;</p><p>His words are as resonant now as they were a century ago, and they should give today&#8217;s leaders and diplomats pause. Successful statecraft requires deliberation, discretion, and discernment, qualities rarely evident in messages thumbed out through apps on phone screens.</p>]]></content:encoded></item><item><title><![CDATA[I Am a Data Factory (and So Are You)]]></title><description><![CDATA[The metaphors that confine us.]]></description><link>https://www.newcartographies.com/p/i-am-a-data-factory-and-so-are-you</link><guid isPermaLink="false">https://www.newcartographies.com/p/i-am-a-data-factory-and-so-are-you</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 02 Nov 2025 17:19:47 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!J7ss!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a> is a post I originally published in May 2018. It looks at the way popular metaphors shape how we see our relationship with tech companies and their products&#8212;and our sense of personal responsibility and agency in using technologies. 
I&#8217;m hoping to write more on this subject in the months ahead, so I&#8217;m republishing this as a starting point.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!J7ss!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!J7ss!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 424w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 848w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 1272w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!J7ss!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png" width="1300" height="600" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:600,&quot;width&quot;:1300,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1326084,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/177759649?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e500e33-2c29-4a66-ae67-08e9a68bb455_1300x975.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!J7ss!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 424w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 848w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 1272w, https://substackcdn.com/image/fetch/$s_!J7ss!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6cf14e01-4b80-4031-9498-6ae6fdf3b49c_1300x600.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Workers in an undergarment factory, circa 1920.</figcaption></figure></div><p><strong>Mines and Factories</strong></p><p>Am I a data mine, or am I a data factory? Is data extracted from me, or is data produced by me? Both metaphors are repellent, but the distinction between them is important. The metaphor we choose informs our sense of the power wielded by so-called platform companies like Facebook, Google, and Amazon, and it shapes the way we, as individuals and as a society, respond to that power.</p><p>If I am a data mine, then I am essentially a chunk of real estate, and control over my data becomes a matter of ownership. Who owns me (as a site of valuable data), and what happens to the economic value of the data extracted from me? 
Should I be my own owner &#8212; the sole proprietor of my data mine and its wealth? Should I be nationalized, my little mine becoming part of some sort of public collective? Or should ownership rights be transferred to a set of corporations that can efficiently aggregate the raw material from my mine and everyone else&#8217;s and transform it into useful products and services? The questions raised here are questions of economics and politics.</p><p>The mining metaphor, like the mining business, is a fairly simple one, and it has become popular, particularly among writers of the left. Thinking of the platform companies as being in the extraction business, with personal data being analogous to a natural resource like iron or petroleum, brings a neatness and clarity to discussions of a new and complicated type of company. In an <a href="https://www.theguardian.com/technology/2018/mar/14/tech-big-data-capitalism-give-wealth-back-to-people">article</a> in the <em>Guardian</em> in March, Ben Tarnoff wrote that &#8220;thinking of data as a resource like oil helps illuminate not only how it functions, but how we might organize it differently.&#8221; Building on the metaphor, he went on to argue that the data business should not just be heavily regulated, as extractive industries tend to be, but that &#8220;data resources&#8221; should be nationalized &#8212; put under state ownership and control:</p><blockquote><p>Data is no less a form of common property than oil or soil or copper. We make data together, and we make it meaningful together, but its value is currently captured by the companies that own it. We find ourselves in the position of a colonized country, our resources extracted to fill faraway pockets. Wealth that belongs to the many &#8212; wealth that could help feed, educate, house and heal people &#8212; is used to enrich the few.
The solution is to take up the template of resource nationalism, and nationalize our data reserves.</p></blockquote><p>In another <em>Guardian</em> <a href="https://www.theguardian.com/technology/2018/mar/31/big-data-lie-exposed-simply-blaming-facebook-wont-fix-reclaim-private-information">piece</a>, published a couple of weeks later, Evgeny Morozov offered a similar proposal concerning what he termed &#8220;the data wells inside ourselves&#8221;:</p><blockquote><p>We can use the recent data controversies to articulate a truly decentralised, emancipatory politics, whereby the institutions of the state (from the national to the municipal level) will be deployed to recognise, create, and foster the creation of social rights to data. These institutions will organise various data sets into pools with differentiated access conditions. They will also ensure that those with good ideas that have little commercial viability but promise major social impact would receive venture funding and realise those ideas on top of those data pools.</p></blockquote><p>The simplicity of the mining metaphor is its strength but also its weakness. The extraction metaphor doesn&#8217;t capture enough of what companies like Facebook and Google do, and in adopting it we too quickly narrow the discussion of our possible responses to their power. Data does not lie passively within me, like a seam of ore or a pool of oil, waiting to be extracted. Rather, I actively produce data. When I drive or walk from one place to another, I produce locational data. When I buy something, I produce purchase data. When I text with someone, I produce affiliation data. When I read or watch or purchase something online, I produce preference data. When I upload a photo, I produce not only behavioral data but data that is itself a product. I am, in other words, much more like a data factory than a data mine. 
I manufacture data through my labor &#8212; the labor of my mind, the labor of my body.</p><p>The platform companies, in turn, act more like factory managers than like the owners of oil wells or copper mines. Beyond controlling my data, the companies seek to control my actions, which to them are essentially manufacturing processes, in order to optimize my data production (and, on the demand side of the platform, my data consumption). They want to script and regulate the work of my factory &#8212; i.e., my life &#8212; as Frederick Winslow Taylor sought to script and regulate the labor of factory workers at the turn of the last century. The control wielded by these companies, in other words, is not just that of ownership but also that of command. And they exercise such command through the design of their apps and other software, which increasingly regulate everything we do during our waking hours. Apps are, like factory routines and industrial machinery, behavioral modification tools.
They&#8217;re designed to maximize people&#8217;s efficiency in producing valuable data.</p><p>The factory metaphor makes clear what the mining metaphor obscures: We work for the Facebooks and Googles of the world, and the work we do is increasingly indistinguishable from the lives we lead. The questions we need to grapple with are political and economic, to be sure. But they are also ethical and philosophical. The extraction metaphor suggests we lack personal responsibility and agency. The factory metaphor emphasizes our responsibility and agency. We are actors, not mere resource pools.</p><p><strong>A False Choice</strong></p><p>To understand why the choice of metaphor is so important, consider a new <a href="https://www.theguardian.com/news/2018/may/03/why-silicon-valley-cant-fix-itself-tech-humanism">essay</a> by Ben Tarnoff, written with Moira Weigel, that was published last week. The piece opens with a sharp, cold-eyed examination of those Silicon Valley apostates who now express regret over the harmful effects of the software they created. Through their new stress on redesigning the software to promote personal &#8220;well-being,&#8221; these &#8220;tech humanists,&#8221; Tarnoff and Weigel argue, actually serve the business interests of the platform companies they criticize. 
The companies, the writers point out, can easily co-opt the rhetoric of well-being, using it as cover to deflect criticism while seizing even more economic power.</p><p>Tarnoff and Weigel point to Facebook CEO Mark Zuckerberg&#8217;s recent announcement that his company will place less emphasis on increasing the total amount of time members spend on Facebook and more emphasis on ensuring that their Facebook time is &#8220;time well spent.&#8221; What may sound like a selfless act of philanthropy is in reality the product of a hard-headed business calculation:</p><blockquote><p>Emphasising time well spent means creating a Facebook that prioritises data-rich personal interactions that Facebook can use to make a more engaging platform. Rather than spending a lot of time doing things that Facebook doesn&#8217;t find valuable &#8211; such as watching viral videos &#8211; you can spend a bit less time, but spend it doing things that Facebook does find valuable. In other words, &#8220;time well spent&#8221; means Facebook can monetise more efficiently. It can prioritise the intensity of data extraction over its extensiveness. This is a wise business move, disguised as a concession to critics. Shifting to this model not only sidesteps concerns about tech addiction &#8211; it also acknowledges certain basic limits to Facebook&#8217;s current growth model. There are only so many hours in the day. Facebook can&#8217;t keep prioritising total time spent &#8211; it has to extract more value from less time.</p></blockquote><p>The analysis is a trenchant one. The vagueness and self-absorption that often characterize discussions of &#8220;wellness,&#8221; particularly those emanating from the California coast, are well suited to the construction of window dressing. And, Lord knows, Zuckerberg and his ilk are experts at window dressing. But, having offered good reasons to be skeptical about Silicon Valley&#8217;s brand of tech humanism, Tarnoff and Weigel overreach. 
They argue that <em>any</em> humanist critique of the personal effects of technology design and use is a distraction from the &#8220;fundamental&#8221; critique of the economic and structural basis for Silicon Valley&#8217;s dominance:</p><blockquote><p>[The humanists] remain confined to the personal level, aiming to redesign how the individual user interacts with technology rather than tackling the industry&#8217;s structural failures. Tech humanism fails to address the root cause of the tech backlash: the fact that a small handful of corporations own our digital lives and strip-mine them for profit. This is a fundamentally political and collective issue. But by framing the problem in terms of health and humanity, and the solution in terms of design, the tech humanists personalise and depoliticise it.</p></blockquote><p>The choice that Tarnoff and Weigel present here &#8212; either personal critique or political critique, either a <em>design focus</em> or a <em>structural focus</em> &#8212; is a false one. And it stems from the metaphor of extraction, which conceives of data as lying passively within us (beyond the influence of software design) rather than being actively produced by us (under the influence of software design). Arguing that attending to questions of design blinds us to questions of ownership is reductive. Silicon Valley wields its power through both its control of data and its control of design, and that power influences us on both a personal and a collective level. Any robust critique of Silicon Valley needs to address both the personal and the political.</p><p>The Silicon Valley apostates are deserving of criticism, but what they&#8217;ve done that is useful is to expose, in considerable detail, the way the platform companies fine-tune their software to manipulate people&#8217;s behavior&#8212;in particular to encourage compulsive use of the software. 
Their apps are designed to ensure users generate the maximum amount of valuable personal data without ever pausing to think critically about what they&#8217;re doing. To put it into industrial terms, these companies are not just engaged in resource extraction; they are engaged in process engineering.</p><p>Tarnoff and Weigel go on to suggest that the tech humanists are pursuing a paternalistic agenda. The humanists want to define some ideal state of human well-being and then use software and hardware design to impose that way of being on everybody. That&#8217;s altogether true of many of the Silicon Valley apostates. Tarnoff and Weigel quote a prominent one as saying, &#8220;We have a moral responsibility to steer people&#8217;s thoughts ethically.&#8221; It&#8217;s hard to imagine a purer distillation of Silicon Valley&#8217;s hubris or a clearer expression of its belief that lives should be engineered rather than lived. But Tarnoff and Weigel&#8217;s suggestion is off the mark when it comes to the broader humanist tradition in technology theory and criticism. It is the thinkers in that tradition&#8212;Mumford, Arendt, Ellul, McLuhan, Postman, Turkle, and many others&#8212;who have taught us how deeply and subtly technology is entwined with human history, human society, and human behavior, and how our entanglement with technology can produce effects, often unforeseen and sometimes hidden, that may run counter to our interests, however we choose to define those interests.</p><p>Though any cultural criticism will entail the expression of ethical values&#8212;that&#8217;s what gives it bite&#8212;the thrust of the humanist critique of technology is not to impose a particular way of being on all of us but rather to give us the perspective, understanding, and know-how necessary to make our own informed choices about the tools and technologies we use and the way we design and employ them. 
By helping us to see the power of technology clearly and to resist it when necessary, the humanist tradition expands our personal and social agency rather than constricting it. That&#8217;s anything but paternalistic.</p><p><strong>Beyond Mine and Factory</strong></p><p>Nationalizing collective stores of personal data is an idea worthy of consideration and debate. But it raises a host of hard questions. In shifting ownership and control of exhaustive behavioral data to the government, what kinds of abuse do we risk? It seems at least a little disconcerting to see the idea raised at a time when authoritarian movements and regimes are on the rise. If we end up trading a surveillance economy for a surveillance state, we&#8217;ve done ourselves no favor.</p><p>But let&#8217;s assume that our vast data collective is secure, well managed, and put to good, democratic ends. The shift of data ownership from the private to the public sector may well succeed in reducing the economic power of Silicon Valley, but it would also reinforce and indeed institutionalize Silicon Valley&#8217;s computationalist ideology, with its foundational, Taylorist belief that, at a personal and collective level, humanity can and should be optimized through better programming. The ethos and incentives of constant surveillance and efficient data production would become even more deeply embedded in our lives, as we, the public, take on the roles of both watched and watcher, data producer and data exploiter. </p><p>In addressing the power now centralized in big tech companies, we should keep both the personal and the political consequences of digitization in focus. The question isn&#8217;t just about who controls the production and use of personal data. It&#8217;s about whether we want to frame our lives in terms of data production and use. 
Maybe the metaphors of mining and manufacturing are ones we need to escape.</p>]]></content:encoded></item><item><title><![CDATA[AI vs AI]]></title><description><![CDATA[Slop in, slop out.]]></description><link>https://www.newcartographies.com/p/ai-vs-ai</link><guid isPermaLink="false">https://www.newcartographies.com/p/ai-vs-ai</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sat, 25 Oct 2025 16:24:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!IRBV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IRBV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IRBV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 424w, https://substackcdn.com/image/fetch/$s_!IRBV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 848w, https://substackcdn.com/image/fetch/$s_!IRBV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!IRBV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IRBV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg" width="1200" height="626" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:626,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:117787,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/176565411?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IRBV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 424w, https://substackcdn.com/image/fetch/$s_!IRBV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!IRBV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!IRBV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe773abf0-a400-41f8-9520-0eee381abe31_1200x626.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>If you&#8217;ve done much googling recently, you&#8217;ve probably noticed the odd and dubious set of sources that 
Google&#8217;s large language model draws from in generating the &#8220;AI Overviews&#8221; that now appear at the top of the company&#8217;s search results (after the ads, of course). Rather than dig deep into authoritative writings on a subject, Google&#8217;s bot usually pieces together its overview from recently published, cursory summaries posted on a hodgepodge of highly trafficked websites &#8212; the same dumbed-down, search-engine-optimized sites that have long ranked high in Google results. Taking the path of least epistemic resistance, the AI slaps together a bland, often unreliable summary of summaries and presents it as a judicious, objective overview.</p><p>The problem becomes more acute when you search for advice on buying a product or service. The overview in this case tends to be a synthesis of text drawn from three kinds of sources: (1) promotional sites run by businesses that supply the product or service, (2) influencer sites written by people who often get referral fees from suppliers, and (3) crappy &#8220;best of&#8221; sites operated by content farms (&#8220;Ten Best Parrot Cages for 2025!&#8221;). The AI Overview is little more than a rehash of corporate marketing messages.</p><p>I&#8217;ll give you an example. In doing some recent research on home heating systems (my furnace is on the fritz), I googled &#8220;residential hvac in [city name].&#8221; The resulting AI Overview began with this sentence: &#8220;Residential HVAC in [city name] includes a wide range of services like installation, repair, and maintenance from local companies such as [company name], [company name], and [company name].&#8221; The overview went on to give little capsule summaries of these three businesses and their putative strengths. The AI, in other words, explicitly names and promotes particular companies. When I looked at the top twelve sources the bot drew from, I found that nine were websites run by local HVAC companies, one was a marketing site run by a leading maker of furnaces and air conditioners, one was a Yelp page ranking local suppliers, and the last was a product-ranking page from TopTenReviews.com that was filled with affiliate links.</p><p>What this makes clear is that Google&#8217;s AI Overview and the similar summaries generated by other AI bots &#8212; now major sources of information for the public &#8212; are anything but judicious and objective. Tractable and predictable, the bots&#8217; algorithms are in fact incredibly ripe for gaming. And because the bots are happy to call out particular businesses by name, successfully gaming the systems will be lucrative. In many industries, AI gaming will turn into a cost of doing business. 
If you think search engine optimization (SEO) has been a blight on the net, artificial intelligence optimization (AIO) promises to be even worse. Local, national, and international companies&#8212;not to mention political operatives, influencers, and crooks&#8212;are going to invest huge amounts of money in attempts to manipulate what comes out of the mouths of the language models that increasingly tell us what we want to know. </p><p>Ever since the internet was opened to commerce in the early 1990s, it has operated as a gigantic cat-and-mouse game. The cats&#8212;search engines, social media sites, and other information aggregators&#8212;write the algorithms that determine the information people see (and don&#8217;t see). The mice&#8212;companies and other parties with an interest in influencing online flows of information&#8212;reverse-engineer the algorithms in order to boost the visibility of the messages they circulate. The cats tweak the algorithms to thwart the mice. And the cycle continues, endlessly. It&#8217;s hardly a surprise that SEO is now a $100 billion industry.</p><p>AIO gives the old game a new and troubling twist. As language models become the dominant tools the cats use to choose the information they circulate, language models will also become the dominant tools the mice use in trying to reverse-engineer and influence the systems. (For a simple example of what&#8217;s to come, go to any chatbot and give it a prompt like this: &#8220;I&#8217;m a business, and I would like to promote my company by influencing what appears in the outputs of AI chatbots like Google&#8217;s AI Overview. What techniques can I use to accomplish my goal?&#8221;) AI will be the cat, and AI will be the mouse. That&#8217;s going to create an interesting feedback loop, to say the least. 
Welcome to the slop wars.</p><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, the New Cartographies series on AI and its cultural and economic consequences.</em></p>]]></content:encoded></item><item><title><![CDATA[Is Google Making Us Stupid?]]></title><description><![CDATA[What the internet did to our brains.]]></description><link>https://www.newcartographies.com/p/is-google-making-us-stupid</link><guid isPermaLink="false">https://www.newcartographies.com/p/is-google-making-us-stupid</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 31 Aug 2025 09:27:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Wosc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Wosc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Wosc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 424w, https://substackcdn.com/image/fetch/$s_!Wosc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 848w, 
https://substackcdn.com/image/fetch/$s_!Wosc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 1272w, https://substackcdn.com/image/fetch/$s_!Wosc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Wosc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic" width="1280" height="839" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:839,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:396598,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/172310352?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Wosc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 424w, 
https://substackcdn.com/image/fetch/$s_!Wosc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 848w, https://substackcdn.com/image/fetch/$s_!Wosc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 1272w, https://substackcdn.com/image/fetch/$s_!Wosc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa16fd714-e174-424d-9a71-cef4812ca701_1280x839.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">J.M.W. Turner, <em>Snow Storm - Steam-Boat off a Harbour&#8217;s Mouth</em></figcaption></figure></div><p><em>Seventeen years ago, when MySpace was bigger than Facebook and going online still felt liberating, </em>The Atlantic<em> published my essay &#8220;Is Google Making Us Stupid?&#8221; in its summer Ideas issue. With <a href="https://www.newcartographies.com/p/the-myth-of-automated-learning">AI</a> now being sold to the public as an intelligence amplifier, just as the net was then, I offer the essay as today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a>. It&#8217;s always good to be reminded that the ultimate effects of a broadly adopted new technology never match the early expectations. </em></p><div><hr></div><p>&#8220;Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?&#8221; So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick&#8217;s <em>2001: A Space Odyssey</em>. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial brain. &#8220;Dave, my mind is going,&#8221; HAL says, forlornly. &#8220;I can feel it. I can feel it.&#8221;</p><p>I can feel it, too. Over the past few years I&#8217;ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn&#8217;t going&#8212;so far as I can tell&#8212;but it&#8217;s changing. I&#8217;m not thinking the way I used to think. I can feel it most strongly when I&#8217;m reading. Immersing myself in a book or a lengthy article used to be easy. 
My mind would get caught up in the narrative or the turns of the argument, and I&#8217;d spend hours strolling through long stretches of prose. That&#8217;s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I&#8217;m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.</p><p>I think I know what&#8217;s going on. For more than a decade now, I&#8217;ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the internet. The web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I&#8217;ve got the telltale fact or pithy quote I was after. 
Even when I&#8217;m not working, I&#8217;m as likely as not to be foraging in the web&#8217;s info-thickets&#8212;reading and writing emails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they&#8217;re sometimes likened, hyperlinks don&#8217;t merely point to related works; they propel you toward them.)</p><p>For me, as for others, the net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they&#8217;ve been widely described and duly applauded. &#8220;The perfect recall of silicon memory,&#8221; <em>Wired</em>&#8217;s Clive Thompson has written, &#8220;can be an enormous boon to thinking.&#8221; But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.</p><p>I&#8217;m not the only one. When I mention my troubles with reading to friends and acquaintances&#8212;literary types, most of them&#8212;many say they&#8217;re having similar experiences. The more they use the web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. 
&#8220;I was a lit major in college, and used to be [a] voracious book reader,&#8221; he wrote. &#8220;What happened?&#8221; He speculates on the answer: &#8220;What if I do all my reading on the web not so much because the way I read has changed, i.e. I&#8217;m just seeking convenience, but because the way I THINK has changed?&#8221;</p><p>Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the internet has altered his mental habits. &#8220;I now have almost totally lost the ability to read and absorb a longish article on the web or in print,&#8221; he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a &#8220;staccato&#8221; quality, reflecting the way he quickly scans short passages of text from many sources online. &#8220;I can&#8217;t read <em>War and Peace </em>anymore,&#8221; he admitted. &#8220;I&#8217;ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.&#8221;</p><p>Anecdotes alone don&#8217;t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, ebooks, and other sources of written information. 
They found that people using the sites exhibited &#8220;a form of skimming activity,&#8221; hopping from one source to another and rarely returning to any source they&#8217;d already visited. They typically read no more than one or two pages of an article or book before they would &#8220;bounce&#8221; out to another site. Sometimes they&#8217;d save a long article, but there&#8217;s no evidence that they ever went back and actually read it. The authors of the study report: &#8220;It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of &#8216;reading&#8217; are emerging as users &#8216;power browse&#8217; horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.&#8221;</p><p>Thanks to the ubiquity of text on the internet, not to mention the popularity of text messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it&#8217;s a different kind of reading, and behind it lies a different kind of thinking&#8212;perhaps even a new sense of the self. &#8220;We are not only <em>what</em> we read,&#8221; says Maryanne Wolf, a developmental psychologist at Tufts University and the author of <em>Proust and the Squid: The Story and Science of the Reading Brain</em>. &#8220;We are <em>how</em> we read.&#8221; Wolf worries that the style of reading promoted by the net, a style that puts &#8220;efficiency&#8221; and &#8220;immediacy&#8221; above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. 
When we read online, she says, we tend to become &#8220;mere decoders of information.&#8221; Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.</p><p>Reading, explains Wolf, is not an instinctive skill for human beings. It&#8217;s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the net will be different from those woven by our reading of books and other printed works.</p><h4>* * * * *</h4><p>Sometime in 1882, Friedrich Nietzsche bought a typewriter&#8212;a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.</p><p>But the machine had a subtler effect on his work. One of Nietzsche&#8217;s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. 
&#8220;Perhaps you will through this instrument even take to a new idiom,&#8221; the friend wrote in a letter, noting that, in his own work, his &#8220;&#8216;thoughts&#8217; in music and language often depend on the quality of pen and paper.&#8221;</p><p>&#8220;You are right,&#8221; Nietzsche replied, &#8220;our writing equipment takes part in the forming of our thoughts.&#8221; Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche&#8217;s prose &#8220;changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.&#8221;</p><p>The human brain is malleable. People used to think that our mental meshwork, the dense connections formed among the hundred billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that&#8217;s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind &#8220;is very plastic.&#8221; Nerve cells routinely break old connections and form new ones. &#8220;The brain,&#8221; according to Olds, &#8220;has the ability to reprogram itself on the fly, altering the way it functions.&#8221;</p><p>As we use what the sociologist Daniel Bell has called our &#8220;intellectual technologies&#8221;&#8212;the tools that extend our mental rather than our physical capacities&#8212;we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the fourteenth century, provides a compelling example. 
In <em>Technics and Civilization</em>, the historian and cultural critic Lewis Mumford described how the clock &#8220;disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.&#8221; The &#8220;abstract framework of divided time&#8221; became &#8220;the point of reference for both action and thought.&#8221;</p><p>The clock&#8217;s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book <em>Computer Power and Human Reason: From Judgment to Calculation</em>, the conception of the world that emerged from the widespread use of timekeeping instruments &#8220;remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.&#8221; In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.</p><p>The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating &#8220;like clockwork.&#8221; Today, in the age of software, we have come to think of them as operating &#8220;like computers.&#8221; But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain&#8217;s plasticity, the adaptation occurs also at a biological level.</p><p>The internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that&#8217;s what we&#8217;re seeing today. 
The internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It&#8217;s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, our radio and our TV.</p><p>When the net absorbs a medium, it re-creates that medium in its own image. It injects the medium&#8217;s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new email message, for instance, may announce its arrival as we&#8217;re glancing over the latest headlines at a newspaper&#8217;s site. The result is to scatter our attention and diffuse our concentration.</p><p>The net&#8217;s influence doesn&#8217;t end at the edges of a computer screen, either. As people&#8217;s minds become attuned to the crazy quilt of internet media, traditional media have to adapt to the audience&#8217;s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, the<em> New York Times </em>decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the &#8220;shortcuts&#8221; would give harried readers a quick &#8220;taste&#8221; of the day&#8217;s news, sparing them the &#8220;less efficient&#8221; method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.</p><p>Never has a communications system played so many roles in our lives&#8212;or exerted such broad influence over our thoughts&#8212;as the internet does today. Yet, for all that&#8217;s been written about the net, there&#8217;s been little consideration of how, exactly, it&#8217;s reprogramming us. 
The net&#8217;s intellectual ethic remains obscure.</p><h4>* * * * *</h4><p>About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant&#8217;s machinists. With the approval of Midvale&#8217;s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions&#8212;an &#8220;algorithm,&#8221; we might say today&#8212;for how each worker should work. Midvale&#8217;s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory&#8217;s productivity soared.</p><p>More than a hundred years after the invention of the steam engine, the industrial revolution had at last found its philosophy and its philosopher. Taylor&#8217;s tight industrial choreography&#8212;his &#8220;system,&#8221; as he liked to call it&#8212;was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. 
The goal, as Taylor defined it in his celebrated 1911 treatise <em>The Principles of Scientific Management</em>, was to identify and adopt, for every job, the &#8220;one best method&#8221; of work and thereby to effect &#8220;the gradual substitution of science for rule of thumb throughout the mechanic arts.&#8221; Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. &#8220;In the past the man has been first,&#8221; he declared; &#8220;in the future the system must be first.&#8221;</p><p>Taylor&#8217;s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor&#8217;s ethic is beginning to govern the realm of the mind as well. The internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the &#8220;one best method&#8221;&#8212;the perfect algorithm&#8212;to carry out every mental movement of what we&#8217;ve come to describe as &#8220;knowledge work.&#8221;</p><p>Google&#8217;s headquarters, in Mountain View, California&#8212;the Googleplex&#8212;is the internet&#8217;s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is &#8220;a company that&#8217;s founded around the science of measurement,&#8221; and it is striving to &#8220;systematize everything&#8221; it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the <em>Harvard Business Review</em>, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. 
What Taylor did for the work of the hand, Google is doing for the work of the mind.</p><p>The company has declared that its mission is &#8220;to organize the world&#8217;s information and make it universally accessible and useful.&#8221; It seeks to develop &#8220;the perfect search engine,&#8221; which it defines as something that &#8220;understands exactly what you mean and gives you back exactly what you want.&#8221; In Google&#8217;s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can &#8220;access&#8221; and the faster we can extract their gist, the more productive we become as thinkers.</p><p>Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. &#8220;The ultimate search engine is something as smart as people&#8212;or smarter,&#8221; Page said in a speech a few years back. &#8220;For us, working on search is a way to work on artificial intelligence.&#8221; In a 2004 interview with <em>Newsweek</em>, Brin said, &#8220;Certainly if you had all the world&#8217;s information directly attached to your brain, or an artificial brain that was smarter than your brain, you&#8217;d be better off.&#8221; Last year, Page told a convention of scientists that Google is &#8220;really trying to build artificial intelligence and to do it on a large scale.&#8221;</p><p>Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. 
A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt&#8217;s words, &#8220;to solve problems that have never been solved before,&#8221; and artificial intelligence is the hardest problem out there. Why wouldn&#8217;t Brin and Page want to be the ones to crack it?</p><p>Still, their easy assumption that we&#8217;d all &#8220;be better off&#8221; if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google&#8217;s world, the world we enter when we go online, there&#8217;s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.</p><p>The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the internet; it is the network&#8217;s reigning business model. The faster we surf across the web&#8212;the more links we click and pages we view&#8212;the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link&#8212;the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It&#8217;s in their economic interest to drive us to distraction.</p><h4>* * * * *</h4><p>Maybe I&#8217;m just a worrywart. Just as there&#8217;s a tendency to glorify technological progress, there&#8217;s a countertendency to expect the worst of every new tool or machine. In Plato&#8217;s <em>Phaedrus</em>, Socrates bemoaned the development of writing. 
He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue&#8217;s characters, &#8220;cease to exercise their memory and become forgetful.&#8221; And because they would be able to &#8220;receive a quantity of information without proper instruction,&#8221; they would &#8220;be thought very knowledgeable when they are for the most part quite ignorant.&#8221; They would be &#8220;filled with the conceit of wisdom instead of real wisdom.&#8221; Socrates wasn&#8217;t wrong&#8212;the new technology did often have the effects he feared&#8212;but he was shortsighted. He couldn&#8217;t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).</p><p>The arrival of Gutenberg&#8217;s printing press, in the fifteenth century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men &#8220;less studious&#8221; and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, &#8220;Most of the arguments made against the printing press were correct, even prescient.&#8221; But, again, the doomsayers were unable to imagine the myriad blessings the printed word would deliver.</p><p>So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the net isn&#8217;t the alphabet, and although it may replace the printing press, it produces something altogether different. 
The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author&#8217;s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.</p><p>If we lose those quiet spaces, or fill them up with &#8220;content,&#8221; we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what&#8217;s at stake: &#8220;I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and &#8216;cathedral-like&#8217; structure of the highly educated and articulate personality&#8212;a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West.&#8221; But now, he continued, &#8220;I see within us all (myself included) the replacement of complex inner density with a new kind of self&#8212;evolving under the pressure of information overload and the technology of the &#8216;instantly available.&#8217;&#8221;</p><p>As we are drained of our &#8220;inner repertory of dense cultural inheritance,&#8221; Foreman concluded, we risk turning into &#8220;&#8216;pancake people&#8217;&#8212;spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.&#8221;</p><p>I&#8217;m haunted by that scene in <em>2001</em>. What makes it so poignant, and so weird, is the computer&#8217;s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut&#8212;&#8220;I can feel it. I can feel it. 
I&#8217;m afraid&#8221;&#8212;and its final reversion to what can only be called a state of innocence. HAL&#8217;s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they&#8217;re following the steps of an algorithm. In the world of <em>2001</em>, people have become so machinelike that the most human character turns out to be a machine. That&#8217;s the essence of Kubrick&#8217;s dark prophecy: As we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.</p>]]></content:encoded></item><item><title><![CDATA[The Erasive Age]]></title><description><![CDATA[Generation as destruction.]]></description><link>https://www.newcartographies.com/p/the-erasive-age</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-erasive-age</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Fri, 22 Aug 2025 15:06:39 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!be1-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fnTn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fnTn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 424w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 848w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 1272w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!fnTn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic" width="1456" height="1682" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1682,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:287464,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/169132004?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!fnTn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 424w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 848w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 1272w, https://substackcdn.com/image/fetch/$s_!fnTn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7e2bc2-9915-4e4e-b225-ef6c5cc0754b_1596x1844.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Robert Rauschenberg, <em>Erased de Kooning Drawing</em> (1953), SFMOMA</figcaption></figure></div><p>One day in 1953, a young and at the time little-known experimental artist named Robert Rauschenberg arrived at the studio of the great abstract expressionist Willem de Kooning bearing a bottle of Jack Daniels and a strange request. He wanted the famous artist to give him one of his drawings so he could erase it.</p><p>De Kooning was taken aback. 
&#8220;I remember that the idea of destruction kept coming into the conversation,&#8221; Rauschenberg later recalled, &#8220;and I kept trying to show that it wouldn&#8217;t be destruction.&#8221; </p><p>Rauschenberg explained to de Kooning that he wanted to see if a work of art could be created not just through the inscription of marks but through their removal. Could art be erasive as well as inscriptive? After much back-and-forth, and several servings of brown liquor, de Kooning agreed to the request. He chose a drawing he had recently completed &#8212; one he was fond of &#8212; and gave it to Rauschenberg.</p><p>Over the course of the next two months, Rauschenberg slowly, meticulously erased the drawing, taking off layers of grease pencil, charcoal, graphite, and ink. He went through forty erasers. All that remained in the end were a few faint traces of the original sketch. With the help of his friend Jasper Johns, he then carefully matted and framed the work, and Johns wrote a label for it, inscribing the title, artist, and date so precisely that they appeared to have been printed out by a machine: </p><blockquote><p>ERASED de KOONING DRAWING<br>ROBERT RAUSCHENBERG<br>1953</p></blockquote><p> &#8220;The simple, gilded frame and understated inscription are integral parts of the finished artwork,&#8221; <a href="https://www.sfmoma.org/artwork/98.298/">writes</a> a curator at the San Francisco Museum of Modern Art, which acquired the work in 1998, &#8220;offering the sole indication of the psychologically loaded act central to its creation.&#8221; Even a work of erasure demands a frame, Rauschenberg understood, a boundary establishing its place in the world. Erasure cries out for inscription. We want to know the marks were there before they weren&#8217;t.</p><p><em>Erasive</em> is an exceptionally uncommon word. It was coined in the seventeenth century but has rarely been used since. Word-processing and messaging spellcheckers underline it with suspicion. 
Its rarity testifies to our discomfort with, as the SFMOMA writer terms it, the &#8220;psychologically loaded act&#8221; of erasure. But, thanks to the rise of what tech companies have cheerfully branded &#8220;generative AI,&#8221; the word seems certain to be used more often in the years to come. Our condition demands it. Behind every act of AI generation lie many acts of erasure. We have entered the erasive age. </p><p>Although we assume that media is fundamentally inscriptive, a means of preserving and transmitting human-made marks of one sort or another, communication systems have always also entailed erasure. What they erase are the spatiotemporal boundaries that in nature fix speech to speaker. A person says something, and if there are others in earshot, they hear it. Otherwise it&#8217;s gone. But that same person writes those same words down on a sheet of paper, or enters them into a computer network, and the words can travel through space and persist through time. 
Much of the value of media, cultural and financial, has always stemmed from its power to erase the material world&#8217;s physical constraints on the flow of speech, the flow of information.</p><p>So long as erasure served our desire to transmit our own marks and receive the marks made by others, we didn&#8217;t worry about it. We celebrated it &#8212; <em>the death of distance! the transcendence of time! </em>&#8212; just as we celebrate other technologies that free us, or at least shield us, from the world&#8217;s frictions and constraints. We want our marks, and the marks of others, to flow freely through space and time. We want the speech of distant people to arrive in our mailbox, to issue forth from our radio and TV, to hang on the walls of a museum, to appear on the screen of our phone. Take away such freedom of movement, return us to the original communication system of mouth and ear, and you take away knowledge, culture, science, entertainment, pretty much the entirety of modernity. </p><p>Erasure is good for business. The more of the world that media erases, the more dependent society becomes on the systems and services of media companies and the more profits those companies earn. That&#8217;s why people like Mark Zuckerberg have been so eager to promote the benefits of &#8220;frictionlessness&#8221; in communication and social relations. What we failed to appreciate is that the pursuit of profit would lead the companies beyond the erasure of spatiotemporal boundaries. They would seek to erase the greatest source of friction in their operations: their reliance on human creativity and expression. They would seek to replace the human source of the information they transmit &#8212; speakers and their speech &#8212; with highly efficient machines capable of creating &#8220;content&#8221; cheaply and on demand. </p><p>In creating tradable derivatives of human speech, AI erases the human voice, the human hand. 
First, it turns the works of culture into numbers, then it compresses those numbers into a generalized statistical model. Of the originals only traces remain. If Rauschenberg sought to show that erasure can be a generative act, AI bots have the opposite goal: to show that generation can be an erasive act. Fulfilling de Kooning&#8217;s fears, generation turns destructive.</p><p>It&#8217;s useful to bring in another work of art, Vilhelm Hammersh&#248;i&#8217;s 1913 oil painting <em>Interior with Windsor Chair</em>:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!be1-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!be1-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 424w, https://substackcdn.com/image/fetch/$s_!be1-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 848w, https://substackcdn.com/image/fetch/$s_!be1-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!be1-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!be1-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg" width="922" height="1260" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1260,&quot;width&quot;:922,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:212754,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/169132004?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!be1-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 424w, https://substackcdn.com/image/fetch/$s_!be1-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 848w, https://substackcdn.com/image/fetch/$s_!be1-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!be1-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd3502975-5002-4de8-80ee-43f9be806d1f_922x1260.jpeg 
1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Vilhelm Hammersh&#248;i, <em>Interior with Windsor Chair</em>.</figcaption></figure></div><p>As the art critic Shawn Grenier has <a href="https://www.youtube.com/watch?v=C86OLJ_0BQk">explained</a>, the chair in the painting, like the faint traces of the original de Kooning drawing in Rauschenberg&#8217;s work, has the paradoxical effect of accentuating the painting&#8217;s air of emptiness. In both works, we sense not only an absence but also the presence &#8212; the human presence &#8212; that preceded the absence. 
It&#8217;s that same memory of human presence that, to the discerning mind, gives the products of generative AI their poignancy. </p><p>The more we draw on AI to shape our perception and understanding of the world, to structure our thoughts and words, to express ourselves, the more complicit we become in erasing culture, the past, others, ourselves. Eventually, should we continue down the path, even the memory of what&#8217;s been erased will be erased. No frame, no matting, no inscription. Only the empty revelation of erasure.</p><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, a New Cartographies series about the cultural and economic consequences of AI. The series began <a href="https://www.newcartographies.com/p/dead-labor-dead-speech">here</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[The Medium Is the Medium]]></title><description><![CDATA[AI ascends.]]></description><link>https://www.newcartographies.com/p/the-medium-is-the-medium</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-medium-is-the-medium</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 27 Jul 2025 10:01:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!3s8V!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a> is a post that originally appeared on my old blog, Rough Type, at the end of 2021. Written about a year after OpenAI&#8217;s release of GPT-3 and a year before its unveiling of ChatGPT, it&#8217;s one of my first attempts to make sense of generative AI. 
(I would also work some of this material into the AI chapter of my book </em><a href="https://www.nicholascarr.com/?page_id=664">Superbloom</a><em>.) The connection between AI and the Spiritualism movement of the nineteenth and early-twentieth centuries is something I hope to write more about soon. </em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3s8V!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3s8V!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 424w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 848w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!3s8V!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg" width="1440" height="911" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:911,&quot;width&quot;:1440,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:340472,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/169308324?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!3s8V!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 424w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 848w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!3s8V!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F371c4879-9edf-47e8-a1ca-2d81ab4c76a6_1440x911.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Still from Fritz Lang&#8217;s 1922 film <em>Dr. Mabuse</em>.</figcaption></figure></div><p>In the fall of 1917, the Irish poet William Butler Yeats, well into middle age and having recently had marriage proposals turned down twice, first by his great love Maud Gonne and then by Gonne&#8217;s daughter, Iseult, offered his hand to a well-off young Englishwoman named Georgie Hyde-Lees. She accepted, and the two were wed a few weeks later, on October 20, in a small ceremony in London, with Ezra Pound serving as best man. 
&#8220;The girl is 25, not bad looking, sensible, will perhaps dust a few cobwebs out of his belfry,&#8221; Pound reported afterwards to a friend.</p><p>Hyde-Lees was a medium, a member of the Hermetic Order of the Golden Dawn, and four days into their otherwise disappointing honeymoon she gave her husband a demonstration of her ability to channel the words of spirits through automatic writing. Yeats was fascinated by the messages that flowed through his wife&#8217;s pen, and in the ensuing years the couple held more than four hundred such seances, the poet poring over each new script. He saw the texts as emanations from what he called <em>Spiritus Mundi</em>, a sort of universal memory or collective unconsciousness that was the source of all humanity&#8217;s symbols and myths. </p><p>During one particularly productive session, Yeats announced that he would devote the rest of his life to interpreting the messages. &#8220;No,&#8221; the spirits responded, &#8220;we have come to give you metaphors for poetry.&#8221; And that they did, in abundance. 
Many of Yeats&#8217;s great late poems, with their gyres, spiral staircases, and waxing and waning moons, were inspired by his wife&#8217;s occult scribbles.</p><p>One way to think about AI-based text-generation tools like OpenAI&#8217;s GPT-3 is as clairvoyants. They are mediums that bring the words of the past into the present in a new arrangement. GPT-3 is not creating text out of nothing, after all. It is drawing on a vast corpus of human expression and, through a quasi-mystical statistical procedure (no one can explain exactly what it is doing), synthesizing all those old words into something new, something intelligible to and requiring interpretation by a living interlocutor. When we talk to GPT-3, we are, in a very real way, communing with the dead. </p><p>One of Hyde-Lees&#8217; spirits said to Yeats, &#8220;this script has its origin in human life &#8212; all religious systems have their origin in God &amp; descend to man &#8212; this ascends.&#8221; The same could be said of the scripts generated by GPT-3. They have their origin in human life; they ascend.</p><p>It&#8217;s telling that one of the first commercial applications of GPT-3, Sudowrite, is being <a href="https://sudowrite.com">marketed</a> as a therapy for writer&#8217;s block. If you&#8217;re writing a story or essay and find yourself stuck, you can plug the last few sentences of your work into Sudowrite, and it will generate the next few sentences, in a variety of versions. It may not give you metaphors for poetry (though it could), but it will give you some inspiration, stirring thoughts and opening possible new paths. It&#8217;s an automatic muse, a mechanical Georgie Hyde-Lees.</p><p>Sudowrite, and GPT-3 in general, has already been used for a lot of stunts. 
Kevin Roose, the <em>New York Times</em> technology columnist, recently used it to generate a substantial portion of a <a href="https://www.nytimes.com/2021/11/21/books/review/the-age-of-ai-henry-kissinger-eric-schmidt-daniel-huttenlocher.html">review</a> of a mediocre new book on artificial intelligence. (The title of the review was, naturally, &#8220;A Robot Wrote this Book Review.&#8221;) Commenting on Sudowrite&#8217;s output, Roose wrote, &#8220;within a few minutes, the AI was coming up with impressively cogent paragraphs of analysis &#8212; some, frankly, better than what I could have generated on my own.&#8221;</p><p>But the potential of these AI-powered automatic writers goes beyond journalistic parlor tricks. They promise to serve as new tools for the creation of art. One of the most remarkable pieces of writing I read this year was Vauhini Vara&#8217;s essay &#8220;<a href="https://www.thebeliever.net/ghosts/">Ghosts</a>&#8221; in <em>The Believer</em>. While locked down in 2020, Vara became obsessed with GPT-3. &#8220;I sought out examples of GPT-3&#8217;s work, and they astonished me,&#8221; she writes in an introduction to her piece. &#8220;Some of them could easily be mistaken for texts written by a human hand. In others, the language was weird, off-kilter &#8212; but often poetically so, almost truer than writing any human would produce.&#8221; Yeats would have understood.</p><p>Vara&#8217;s older sister had died of cancer shortly after graduating high school. The experience left Vara traumatized, and, though an accomplished author, she had never been able to write about it. But with GPT-3 she began to find the words. 
&#8220;I found myself irresistibly attracted to GPT-3,&#8221; she explains, &#8220;&#8212; to the way it offered, without judgment, to deliver words to a writer who has found herself at a loss for them.&#8221; She began to feed GPT-3 some sentences about her sister&#8217;s illness, and the system started to weave those sentences into stories &#8212; fantastical and uncanny, but also stirring, and ultimately heartbreaking. The essay chronicles eight of her sessions with GPT-3. It reads as a conversation between a writer and her muse, a conversation that begins tentatively and grows richer and truer as it goes on.</p><p>At one point, after Vara confesses to feeling like &#8220;a ghost&#8221; after her sister&#8217;s death, GPT-3 continues the thought:</p><blockquote><p>So I can&#8217;t describe her to you. But I can describe what it felt like to have her die. It felt like my life was an accident &#8212; or, worse, a mistake. I&#8217;d made a mistake in being born, and now, to correct it, I would have to die. I&#8217;d have to die, and someone else &#8212; a stranger &#8212; would have to live, in my place. I was that stranger. I still am.</p></blockquote><p>What gives the exchange all the more poignancy is the sense that, in drawing on its corpus of past human speech to generate its mysterious new pastiche, GPT-3 is expressing the pain of others who have suffered unbearable losses. 
Spirits are talking.</p>]]></content:encoded></item><item><title><![CDATA[Western Digital]]></title><description><![CDATA[A fancy.]]></description><link>https://www.newcartographies.com/p/western-digital</link><guid isPermaLink="false">https://www.newcartographies.com/p/western-digital</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Wed, 16 Jul 2025 22:55:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!WBXY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!WBXY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!WBXY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WBXY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 848w, https://substackcdn.com/image/fetch/$s_!WBXY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!WBXY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!WBXY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg" width="750" height="545" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:545,&quot;width&quot;:750,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:59439,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/168075670?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!WBXY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WBXY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!WBXY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!WBXY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7933762a-3df1-4828-8f17-8853ea05d82c_750x545.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Albert Bierstadt, &#8220;California Sunset.&#8221;</figcaption></figure></div><p><em>I was going to end my last 
post, &#8220;<a href="https://www.newcartographies.com/p/against-compression">Against Compression,</a>&#8221; with a coda about a fictional artist, but I decided the piece already placed enough demands on the reader&#8217;s patience. So here it is, a free-floating coda. Affix it to whatever you&#8217;d like.</em></p><p>The career of the contemporary French artist Jed Martin is a twisty one, full of incursions and recursions. (Michel Houellebecq chronicles them all in his 2010 novel <em><a href="https://bookshop.org/a/85280/9780307946539">The Map and the Territory</a></em>.) As a boy, Jed would sit in his family&#8217;s garden and, under the intermittently watchful eye of his pretty babysitter, draw flowers with colored pencils. As a teenager, he began painting landscapes in a style reminiscent of C&#233;zanne. In art school, he abandoned painting for photography. Soon he was taking simple but exquisitely detailed photos of everyday artifacts: forks, Pendaflex files, printer cartridges, nuts and bolts. He wanted to create &#8220;an exhaustive catalogue of the objects of human manufacturing in the Industrial Age.&#8221; The project consumed him for six years, and out of it came eleven thousand photographs, neatly stored as TIFF files on a single, small Western Digital hard drive.</p><p>The photos gained him a few admirers among his art school colleagues, but it was only after he shifted the focus of his lens from industrial goods to Michelin roadmaps that he found fame. In the tire manufacturer&#8217;s lovingly detailed maps of the French countryside, Jed discovered a world more ordered, more understandable, denser with information, than the one the maps portrayed and in which he lived. 
His first solo exhibition, underwritten by the communications department of the Michelin Corporation, caused a stir in the Paris art world.</p><blockquote><p>The entrance to the hall was barred by a big panel, leaving two-meters-wide passageways at either side, on which Jed had displayed a satellite photo taken around the mountain of Guebwiller next to an enlargement of a Michelin Departments map of the same zone. The contrast was striking: while the photograph showed only a soup of more or less uniform green sprinkled with vague blue spots, the map developed a fascinating maze of departmental and scenic roads, <em>viewpoints</em>, forests, lakes, and cols. Above the two enlargements, in black capital letters, was the title of the exhibition: THE MAP IS MORE INTERESTING THAN THE TERRITORY.</p></blockquote><p>The review in <em>Le Monde</em> was &#8220;ecstatic in its praise.&#8221;</p><p>But Jed grew bored of maps. He stowed his camera gear and went back to painting. In a kind of slow-motion frenzy of creativity that would last many years, he produced his acclaimed &#8220;Professions&#8221; series of oils, upwards of sixty large-scale portraits of people working at various jobs, blue collar and white. There was a butcher and a store manager, an architect and a prostitute, a &#8220;remote maintenance assistant.&#8221; Wealthy collectors bid up the value of the coldly beautiful works. Jed became very rich.</p><p>The most celebrated of the Professions canvasses was one of the last he painted. Titled <em>Bill Gates and Steve Jobs Discussing the Future of Information Technology</em> and subtitled <em>The Conversation at Palo Alto</em>, it portrayed the two aging entrepreneurs sitting in the living room of Jobs&#8217;s house in Silicon Valley. Gates, casually dressed and wearing flip-flops, looks relaxed and happy, a man enjoying his retirement and his money. 
Jobs, in the early stages of the disease that would kill him, appears pinched and withdrawn, an &#8220;embodiment of austerity.&#8221; The two men are playing a game of chess, which Gates appears to be winning. </p><p>Comments Houellebecq:</p><blockquote><p>In certain pages of his autobiography, <em>The Road Ahead</em>, Bill Gates occasionally lets slip what could be considered total cynicism &#8212; particularly in the passage where he confesses quite plainly that it is not necessarily advantageous for a business to offer the most innovative products. More often it is preferable to observe what the competitors are doing (and there he clearly refers, without using the name, to Apple), to let them bring out their products, confront the difficulties inherent in any innovation, and, in a way, surmount the initial problems; then, in a second phase, to flood the market by offering low-cost copies of the competing products.</p></blockquote><p>AI, it strikes me, applies the Gates model to the entirety of culture. Let writers and musicians and artists do the hard work of actually creating the original artifacts of culture. Then have the machine flood the market with low-cost copies, with cheap derivatives. I did some searching to see if Houellebecq had made any recent pronouncements about AI. I couldn&#8217;t find anything. So, for the hell of it, I asked Claude, Anthropic&#8217;s chatbot, what Houellebecq might say about AI. Here&#8217;s what it came up with:</p><blockquote><p>The machines had finally learned to speak, and predictably, they had nothing interesting to say. Claude, ChatGPT, Bard &#8212; these names floating through the digital ether like the brand names of antidepressants, promising relief from the fundamental problem of human existence: the need to think.</p><p>I watched my neighbors queuing at the coffee shop, each hunched over their phones, asking artificial intelligences to write their emails, plan their dates, explain their feelings back to them. 
The barista, a philosophy graduate drowning in student debt, served lattes to people who no longer needed to form coherent thoughts. There was something almost beautiful about this collective surrender to algorithmic mediation, the way a mass suicide might be beautiful if observed from sufficient distance.</p></blockquote><p>Not perfect, but pretty damn good. Maybe the machines do have something interesting to say.</p><p>But back to Jed Martin and his Professions. In looking at <em>Bill Gates and Steve Jobs Discussing the Future of Information Technology</em>, the viewer&#8217;s eye, writes Houellebecq, is drawn not just to the two famous figures, one triumphant, the other tragic, but to a large window behind them, through which can be seen &#8220;a landscape of meadows, of an almost surreal emerald green.&#8221;</p><blockquote><p>Evening was falling, magnificently, in the explosion of a sun that Martin had wanted to be almost improbable in its orangey magnificence, setting on northern California, and the evening was falling on the most advanced part of the world; it was that too, that indefinite sadness of farewells, which could be read in Jobs&#8217;s eyes.</p></blockquote><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, a New Cartographies series about the cultural and economic consequences of AI.</em></p>]]></content:encoded></item><item><title><![CDATA[Against Compression]]></title><description><![CDATA[Let me not boil it down for you.]]></description><link>https://www.newcartographies.com/p/against-compression</link><guid isPermaLink="false">https://www.newcartographies.com/p/against-compression</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Fri, 11 Jul 2025 17:30:56 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!W-Vb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W-Vb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W-Vb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 424w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 848w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W-Vb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg" width="1456" height="923" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:923,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:618179,&quot;alt&quot;:&quot;palm at the end of the mind&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/161162756?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="palm at the end of the mind" title="palm at the end of the mind" srcset="https://substackcdn.com/image/fetch/$s_!W-Vb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 424w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 848w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!W-Vb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce5ee8ef-c175-4b6f-9735-c2d5cb654dfb_1512x958.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><strong>My Brief Career in Abstraction</strong></p><p>I was a generative pre-trained transformer before GPTs were cool. After dropping out of grad school in the mid-eighties, I landed a job with a new digital-media division of H. W. Wilson, the venerable publisher whose <em>Readers&#8217; Guide to Periodical Literature </em>had long been a mainstay of library reference desks. 
Looking to capitalize on the surging popularity of personal computers and online databases, the company had decided to create a digital supplement to the <em>Guide</em> that would include 200-word summaries of articles from a couple hundred of the most popular magazines in the country.</p><p>Along with a dozen or so other failed academics and wannabe writers, I was hired to produce these abstracts. I was assigned a small, low-walled cubicle in a large room in an office building in Cambridge, Massachusetts. On my little metal desk sat a terminal connected to a minicomputer running the then cutting-edge Wang Word Processing System. When I arrived each morning, I would be handed a copy of a recent issue of a magazine. It might be <em>Rolling Stone</em>, it might be <em>Scientific American</em>, it might be <em>Redbook</em>. I would skim each article and type up a summary on the terminal, dutifully inserting the various formatting codes the system required to output the text properly. Then I&#8217;d get another issue of another magazine and go through the drill again. The job was fun for a few days &#8212; I was getting paid to read magazines &#8212; then it wasn&#8217;t. As management started increasing the &#8220;target goal&#8221; for the number of abstracts that needed to be produced each hour, I found myself a pieceworker in a literary sweatshop. I quit after a few months.</p><p>I rarely gave a thought to that job after leaving it, but I&#8217;ve been thinking about it a lot recently. It&#8217;s hard to imagine work better suited to the talents of an AI chatbot. Feed in the text of an article, and out pops a serviceable summary at a specified length. What I and my colleagues labored to produce in a month, a bot could output in seconds. I&#8217;d like to think the abstracts we wrote were at least slightly better than what a machine would generate &#8212; we actually <em>understood</em> what we were reading, right? &#8212; but I&#8217;m probably kidding myself. If I went back and read our summaries now, I suspect I would discover a high percentage of slop. And surely the bot&#8217;s work would have been more consistent in style and level of detail than what the bunch of us churned out in our isolated cubicles. I can&#8217;t help but feel retrospectively disposable.</p><p><strong>Distillations</strong></p><p>In 1933, just as Prohibition was being repealed, the German-American inventor Hans Peter Luhn filed a patent for what he called &#8220;The Cocktail Oracle.&#8221; It was a set of plastic cards that allowed people to quickly identify the drinks they could make with whatever ingredients they had on hand. There was a keycard with the names of thirty-six of the most popular cocktails of the day, arranged in a four-by-nine grid. The keycard was entirely black except for the names of the drinks, which were translucent. Then there were fifteen ingredients cards, each representing a common spirit, mixer, or garnish.
The ingredients cards had a four-by-nine grid of rectangles, matching the keycard&#8217;s grid of names. If an ingredient went into a particular cocktail, the rectangle would be clear. Otherwise, it was black. You&#8217;d select the cards for the ingredients you wanted to use, arrange them behind the keycard, and hold the set of cards up to a light. The names of the drinks you could make would be illuminated.</p><p>Luhn, a dapper man whose career up to then had been in the textile trade, wasn&#8217;t interested in inebriation. He was interested in information. Drink recipes, he intuited, are algorithms; the names of ingredients are data the algorithms draw on. The Cocktail Oracle was a simple, if ingenious, tool for storing, sorting, and selecting information. It was an analog computer, an app made of plastic. What mattered wasn&#8217;t the drinks but how they were represented in data.</p><p>IBM was good at spotting talent, and it spotted Luhn. His cards, the company recognized, resembled the punch cards that were then used to store data and enter it into mainframes. IBM hired the inventor and gave him free rein. He quickly became one of its star thinkers &#8212; and, despite having no formal training in computer science or even mathematics, one of the world&#8217;s most important innovators in digital computing. He <a href="https://spectrum.ieee.org/hans-peter-luhn-and-the-birth-of-the-hashing-algorithm">pioneered</a> many techniques for the machine-processing of text, from error correction to searching to indexing, that remain in common use today. 
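The Cocktail Oracle's card logic translates directly into code: each ingredient card is, in effect, the set of drinks that ingredient goes into, and stacking the cards against the light is set intersection. Here is a minimal sketch in Python, with a few invented recipes standing in for Luhn's thirty-six cocktails:

```python
# Sketch of the Cocktail Oracle's card logic. The recipes below are
# illustrative stand-ins, not Luhn's actual keycard.
RECIPES = {
    "Martini": {"gin", "dry vermouth"},
    "Manhattan": {"rye", "sweet vermouth", "bitters"},
    "Negroni": {"gin", "sweet vermouth", "campari"},
}

# Build one "card" per ingredient: a clear rectangle at a drink's grid
# position means the ingredient goes into that drink, so each card is
# just the set of drinks it illuminates.
cards = {}
for drink, ingredients in RECIPES.items():
    for ing in ingredients:
        cards.setdefault(ing, set()).add(drink)

def illuminated(selected):
    """Stack the cards for the selected ingredients and hold them to
    the light: a name shines through only where every card is clear,
    i.e., the drinks that use all of the selected ingredients."""
    return sorted(set.intersection(*(cards[i] for i in selected)))

print(illuminated({"gin", "sweet vermouth"}))  # ['Negroni']
```

The cards are an AND gate made of plastic: opacity is the bitwise operation, and the light does the computing.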
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bmH-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bmH-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!bmH-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:249724,&quot;alt&quot;:&quot;Photo: 
IBM&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Photo: IBM" title="Photo: IBM" srcset="https://substackcdn.com/image/fetch/$s_!bmH-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bmH-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd543c978-3e25-4d30-b2af-5c65c17a6eaa.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">H. P. Luhn at work.</figcaption></figure></div><p>His most ambitious project, undertaken in the late 1950s, toward the end of his IBM career and his life, was the creation of a computer program for generating summaries of documents. In &#8220;<a href="https://courses.ischool.berkeley.edu/i256/f06/papers/luhn58.pdf">The Automatic Creation of Literature Abstracts</a>,&#8221; a celebrated and widely cited article published in the <em>IBM Journal</em> in April 1958, he laid out the problem he was seeking to solve with his &#8220;auto-abstracts&#8221;:</p><blockquote><p>The preparation of abstracts is an intellectual effort, requiring general familiarity with the subject. 
To bring out the salient points of an author&#8217;s argument calls for skill and experience. Consequently a considerable amount of qualified manpower that could be used to advantage in other ways must be diverted to the task of facilitating access to information. This widespread problem is being aggravated by the ever-increasing output of technical literature.</p></blockquote><p>Using machines to create abstracts would, Luhn argued, greatly increase the efficiency of this routine but ever more essential work. But it would do more than that. By removing people and their quirks and biases from the process, it would improve the quality of the summaries:</p><blockquote><p>The abstracter&#8217;s product is almost always influenced by his background, attitude, and disposition. The abstracter&#8217;s own opinions or immediate interests may sometimes bias his interpretation of the author&#8217;s ideas. The quality of an abstract of a given article may therefore vary widely among abstracters, and if the same person were to abstract an article again at some other time, he might come up with a different product. [With] the application of machine methods . . . both human effort and bias may be eliminated from the abstracting process.</p></blockquote><p>Luhn&#8217;s method of auto-abstracting was founded on a simple observation: &#8220;a writer normally repeats certain words as he advances or varies his arguments and as he elaborates on an aspect of a subject.&#8221; In any piece of expository writing, the words that appear most frequently (excluding commonplace ones like <em>the</em> and <em>to</em>) also tend to be the most &#8220;significant&#8221; &#8212; the ones that reveal the most about the author&#8217;s theme or argument. And the sentences that contain the largest &#8220;clusters&#8221; of those significant words represent in turn the most significant sentences.
Through a statistical analysis of word frequency and clustering, Luhn&#8217;s algorithm composed abstracts by identifying and then stringing together an article&#8217;s most significant sentences. </p><p>When Luhn gave a public demonstration of his computer-generated summaries at a library sciences conference in 1958, the event was covered with great excitement by the press. It seemed to mark a breakthrough in natural language processing that heralded a new era in artificial intelligence. Here was a machine that could read and write! When Luhn died of leukemia in 1964, the <em>New York Times</em> ran an obituary that made prominent mention of the auto-abstracting demo. Six years after the fact, the event still seemed rich with promise:</p><blockquote><p>Mr. Luhn, in a demonstration, took a 2,326-word article on hormones in the nervous system from <em>The Scientific American</em>, inserted it in the form of magnetic tape into an I.B.M. computer, and pushed a button. Three minutes later, the machine&#8217;s automatic typewriter typed four sentences giving the gist of the article, of which the machine had made an abstract. Mr Luhn thus showed, in practice, how a machine could do in three minutes what would have taken at least half an hour&#8217;s hard work.</p></blockquote><p>Luhn&#8217;s method was compelling in theory &#8212; and it was good enough for a successful demo &#8212; but it had a fatal flaw. Making sense of the summaries required too much work. Readers of the abstracts would, as Luhn admitted in his paper, need to &#8220;learn how to interpret them and how to detect their implications,&#8221; recognizing &#8220;that certain words contained in the sample sentences stand for notions which must have been elaborated upon somewhere in the article.&#8221; Luhn&#8217;s abstracts gave subject-matter experts a general sense of what the underlying articles were about, but to everyone else they were gobbledygook.</p><p>I am living proof of Luhn&#8217;s failure. 
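Luhn's frequency-and-cluster procedure is simple enough to sketch in a few lines of Python. This is a loose reconstruction, not his exact 1958 algorithm: the stopword list is a token gesture, and the score used here (significant-word hits squared, divided by sentence length) stands in for his more careful bracketed-cluster measure.

```python
# A simplified reconstruction of Luhn-style extractive summarization:
# find frequent non-common words, score sentences by how densely those
# words cluster in them, and string together the top-scoring sentences.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "to", "of", "and", "in", "is", "it", "that", "are"}

def luhn_summary(text, n_sentences=2, top_words=5):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(w for w in words if w not in STOPWORDS)
    significant = {w for w, _ in freq.most_common(top_words)}

    def score(sentence):
        toks = re.findall(r"[a-z']+", sentence.lower())
        hits = sum(1 for t in toks if t in significant)
        # Simplified stand-in for Luhn's cluster measure:
        # (significant words)^2 / sentence length.
        return hits * hits / len(toks) if toks else 0.0

    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)
    keep = sorted(ranked[:n_sentences])       # restore document order
    return " ".join(sentences[i] for i in keep)

article = ("Hormones regulate the nervous system. "
           "The nervous system sends signals. "
           "Cats are nice. "
           "Hormones and signals interact in the nervous system.")
print(luhn_summary(article))
```

Note what the sketch makes plain: nothing here understands anything. The summary is a purely statistical artifact, which is exactly why Luhn's abstracts read as gobbledygook to anyone who didn't already know the subject.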
If his auto-abstracter had worked even moderately well, I would never have been hired by H. W. Wilson. Rather than recruiting a squad of overeducated scribblers, the company would have loaded its minicomputer with an IBM Auto-Abstracting System and paid a few dunderheads to feed magazine pages into an optical character recognition scanner.</p><p>For many years after Luhn&#8217;s death, other programmers tried to refine his method, mainly by testing other ways of identifying salient words and sentences in a text. The results remained underwhelming. After decades of disappointment, the pursuit of a system for computer-generated summaries came to be seen as a dead end. &#8220;If our aim is to produce abstracts that cannot be told from manual abstracts, then it is hard to believe that systems relying on selection and limited adjustment of textual material could ever succeed,&#8221; lamented a computer scientist in a 1990 <a href="https://www.sciencedirect.com/science/article/abs/pii/030645739090014S">paper</a>. </p><p>Only now, some seventy-five years after his demo, has Luhn&#8217;s dream been fulfilled. As with other perplexing challenges in natural language processing &#8212; translating text from one language into another, interpreting simple spoken commands, winning <em>Jeopardy</em> &#8212; the problem of automated abstracting came to be solved not with elegant theories of cognition and language but with the brute-force application of quantities of data and computer power that would have been unimaginable to computer scientists in the 1950s and &#8217;60s. Today&#8217;s generative pre-trained transformers resemble Luhn&#8217;s auto-abstracter in employing a statistical analysis of text, but the scale at which they operate is entirely different. Luhn worked with a garden trowel. GPTs work with a fleet of backhoes.</p><p><strong>Artificial Generalizing Intelligence</strong></p><p>In realizing Luhn&#8217;s dream, GPTs also distort it. An abstract, for H. P. 
Luhn and H. W. Wilson as for the researchers they sought to aid, was a navigational tool used to guide intellectual journeys &#8212; an efficient means of identifying and prioritizing, among a welter of possibilities, those articles one actually needed to read to advance one&#8217;s knowledge in a subject. The summaries pumped out by GPTs can play a similar role. Experts can use them to survey the latest writings in their field and plot a path through them. But in their common use &#8212; by students, by paper pushers, by the general public &#8212; they aren&#8217;t navigational aids. They&#8217;re substitutes. The machine-generated summary takes the place of the human-written work. The gist becomes the end product. </p><p>&#8220;Intelligence is compression,&#8221; says Ilya Sutskever, the machine-learning pioneer who was chief scientist at OpenAI until quitting last year. The statement, which has become a commonplace in AI circles, illustrates itself. Sutskever has surveyed many examples of what he considers &#8220;intelligence,&#8221; and he has compressed them all into a generalization, a pattern, an abstraction: &#8220;intelligence is compression.&#8221; In addition to being a description of human intelligence, the statement, not coincidentally, describes the basic procedure of machine learning and, in particular, the work of generative pre-trained transformers. A GPT takes in many, many examples of human thoughts, ideas, and observations, as expressed through words, pictures, or sounds, and boils them down into a concise, statistical representation, or model, of knowledge. That model is then used to interpret prompts and generate new, derivative thoughts, ideas, and observations that have the appearance of human expression.</p><p>But &#8220;intelligence is compression,&#8221; when applied to the mind rather than the computer, is a grotesque oversimplification. 
It presents one (admittedly very important) aspect of intelligence as the whole of intelligence, dismissing all other ways of thinking and perceiving as inconsequential. In defining an indeterminate, multifaceted human quality (intelligence) in terms of a computer operation (machine learning), Sutskever and his ilk fall victim to what might be called the neuromachinic fallacy &#8212; the desire to see in the workings of a machine a reflection of the workings of the mind. They are not alone. Throughout history, scientists, philosophers, and engineers have been quick to use a new technology &#8212; be it a hydraulic fountain, a mechanical clock, or a telephone switchboard &#8212; as a metaphor for the brain. All the metaphors have some descriptive value, but all of them are also reductive, incomplete, and, when taken as fact, misleading.</p><p>The body is represented in the brain by an array of neurons, and as the body familiarizes itself with particular locations, those locations, too, become represented in the brain by neuronal arrays. These mental representations are important, but they are not the essence of thought or consciousness. They&#8217;re not substitutes for the things they represent &#8212; the self and its settings &#8212; but rather biological tools that aid us in more fully experiencing the world in all its particulars. Like Luhn&#8217;s abstracts, or cocktail recipes, they&#8217;re navigational aids that help us experience and make sense of the fullness of the actual. Whatever intelligence is, it can&#8217;t be reduced to representations and models. It is at least as attuned to what makes things unique (and incompressible) as to what makes things similar (and compressible).
A recipe is just a recipe, but a good cocktail is a drink.</p><p>Sutskever and his former colleagues at OpenAI believe that the kind of compression performed by GPTs can lead to the development of an &#8220;AGI&#8221; &#8212; an <em>artificial general intelligence</em> that can equal or exceed the intellectual capacities of the human mind. But what <em>AGI</em> really stands for in their narrow conception of the mind would be better expressed as <em>artificial generalizing intelligence</em>. It encompasses the mind&#8217;s analytic ability to see common patterns in different things or phenomena and to derive general categories or rules from them. But it excludes all the aspects of intelligence that free us from the constraints of rules and patterns: imaginative thinking, metaphorical thinking, critical thinking, contemplation, aesthetic perception, taste, the ability to see, as William Blake put it, &#8220;a world in a grain of sand.&#8221; In its highest form, intelligence is not compressive. It&#8217;s extravagant, from the Latin <em>extravagans</em>, meaning to wander off the established course, to go beyond the general rules.</p><p>By removing all subjectivity from thought &#8212; by, in short, separating intelligence from being &#8212; the mages of AI allow themselves to indulge in tautology. 
They reduce intelligence to that which their machines can do and then claim their machines are intelligent.</p><p><strong>The Servile Artist</strong></p><p>In a 2003 <a href="https://www.timeshighereducation.com/features/the-unholy-market-kills-divine-life-of-the-mind/177114.article">article</a> in <em>Times Higher Education</em>, the British theologian Andrew Louth drew a distinction between what he termed &#8220;the free arts&#8221; and &#8220;the servile arts.&#8221; The distinction seems fundamental to understanding both the possibilities of human intelligence and the limits of machine intelligence:</p><blockquote><p>The medieval university was a place that made possible a life of thought, of contemplation. It emerged in the 12th century from the monastic and cathedral schools of the early Middle Ages where the purpose of learning was to allow monks to fulfil their vocation, which fundamentally meant to come to know God. Although knowledge of God might be useful in various ways, it was sought as an end in itself. Such knowledge was called contemplation, a kind of prayerful attention.</p><p>The evolution of the university took the pattern of learning that characterised monastic life &#8211; reading, meditation, prayer and contemplation &#8211; out of the immediate context of the monastery. But it did not fundamentally alter it. At its heart was the search for knowledge for its own sake. It was an exercise of freedom on the part of human beings, and the disciplines involved were to enable one to think freely and creatively. These were the liberal arts, or free arts, as opposed to the servile arts to which a man is bound if he has in mind a limited task.</p><p>In other words, in the medieval university, contemplation was knowledge of reality itself, as opposed to that involved in getting things done. 
It corresponded to a distinction in our understanding of what it is to be human, between reason conceived as puzzling things out and that conceived as receptive of truth. This understanding of learning has a history that goes back to the roots of western culture. Now, this is under serious threat, and with it our notion of civilisation.</p></blockquote><p>OpenAI gives the game away in its small-minded <a href="https://openai.com/our-structure/">definition</a> of artificial general intelligence as &#8220;a highly autonomous system that outperforms humans at most <em>economically valuable</em> work [emphasis added].&#8221; Economically valuable work is exactly what Louth calls &#8220;that to which a man is bound if he has in mind a limited task.&#8221; AI in its current form is a servile artist.</p><p>None of this would matter much if we had not adopted computer systems as the fundamental conduit of thought and culture. But we have. Those who control the systems control much about us. Their flaws and shortcomings are built not just into the technology but, increasingly, into society&#8217;s norms and practices. Just as the brilliant but socially maladroit Mark Zuckerberg came to set the terms for how we socialize today, so the brilliant but intellectually crippled designers of contemporary AI systems seem destined to set the terms for how we think.</p><p>Karen Hao, in her recent <em><a href="https://bookshop.org/a/85280/9780593657508">Empire of AI</a></em>, relates an episode from 2013 when Elon Musk and Google cofounder Larry Page met at a Napa Valley party and argued about the prospects for artificial intelligence. Amped up as always  &#8212; half hero, half villain, and lacking the composure to distinguish between the two &#8212; Musk implores Page to join  him in a crusade to ensure AI doesn&#8217;t destroy humanity. 
Page brushes him off, dismissing his concerns as &#8220;specist.&#8221; The triumph of AI, he tells his friend, is just &#8220;the next stage in evolution.&#8221; In Page&#8217;s response, the deep, carefully concealed misanthropy of the contemporary tech elite bubbles briefly to the surface. These are people who hold human beings &#8212; indeed all the messy, incompressible things of the world &#8212; in contempt. What can&#8217;t be represented in data is without value.</p><p>Let me end by bringing in some poetry. The very last poem that the dying Wallace Stevens is said to have written is &#8220;Of Mere Being,&#8221; which appeared in the 1957 collection <em><a href="https://bookshop.org/a/85280/9780679725343">Opus Posthumous</a></em>. It&#8217;s a short poem, just four three-line stanzas, but it gets a lot across:</p><blockquote><p>The palm at the end of the mind,<br>Beyond the last thought, rises<br>In the bronze decor,</p><p>A gold-feathered bird<br>Sings in the palm, without human meaning,<br>Without human feeling, a foreign song.</p><p>You know then that it is not the reason<br>That makes us happy or unhappy.<br>The bird sings. Its feathers shine.</p><p>The palm stands on the edge of space.<br>The wind moves slowly in the branches.<br>The bird's fire-fangled feathers dangle down.</p></blockquote><p>One can interpret the poem &#8212; hundreds of pages have been written about it &#8212; but one cannot compress it. It is entirely extravagant.</p><p>&#8220;No ideas but in things,&#8221; Stevens&#8217;s contemporary William Carlos Williams declared as his personal ethos for poetry. 
The dictum expresses a conception of the mind&#8217;s work that is precisely the opposite of &#8220;intelligence is compression.&#8221; In Stevens&#8217;s poem, though, we see at play both attributes of the mind &#8212; a facility for abstraction and an acute sensitivity to the thing itself &#8212; held in exquisite balance by imagination, metaphor, and memory, three other attributes of the mind. The fully formed intellect sees a palm tree as an example of a pattern expressed by the word <em>palm,</em> but it also sees the tree as an irreducible phenomenon that escapes the prison of the pattern and achieves its own singularity.</p><p>The moment you start to believe that all intelligence is compression, you are lost to the world. You exist in the cramped confines of a representation. That&#8217;s where GPTs exist, and it&#8217;s where we&#8217;ll all exist if we continue to follow their lead.</p><div><hr></div><p><em>This post is an installment in <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, a New Cartographies series about the cultural and economic consequences of AI. 
The series began <a href="https://www.newcartographies.com/p/dead-labor-dead-speech">here</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[All the Little Data]]></title><description><![CDATA[What we see when we see the world as information.]]></description><link>https://www.newcartographies.com/p/all-the-little-data</link><guid isPermaLink="false">https://www.newcartographies.com/p/all-the-little-data</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 22 Jun 2025 12:25:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!m55Q!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!m55Q!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!m55Q!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 424w, https://substackcdn.com/image/fetch/$s_!m55Q!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 848w, https://substackcdn.com/image/fetch/$s_!m55Q!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!m55Q!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!m55Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic" width="843" height="623" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:623,&quot;width&quot;:843,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:34742,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/166518576?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!m55Q!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 424w, https://substackcdn.com/image/fetch/$s_!m55Q!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 848w, 
https://substackcdn.com/image/fetch/$s_!m55Q!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 1272w, https://substackcdn.com/image/fetch/$s_!m55Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9c107b3c-1d05-41bd-b91f-4e7aedb5e662_843x623.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Sol LeWitt, <em>The</em> <em>Location of a Circle </em>(Whitney Museum of American
Art)<em>.</em></figcaption></figure></div><p>One recent Tuesday, at two thirty-seven in the afternoon, I received an email from UPS letting me know that a package had been delivered to my home. Attached, as evidence, was a blurry, off-kilter photograph of a small, slightly dented but otherwise nondescript cardboard box that had been placed on my driveway, next to the garage door. A minute later, at thirty-eight minutes past two, I received a second email announcing the package&#8217;s arrival, this one from the online merchant that had shipped the box and sold me the shirt it contained. The company congratulated me on the purchase, praised my good taste in menswear, and offered a few suggestions of other articles of clothing I might be interested in buying.</p><p>The two emails capped a fusillade of messages. It began five days earlier, when, as I tapped the Place Order button for the shirt, a banking app on my phone notified me that my credit card was being charged $79.95. (It was a nice shirt.) Seconds later, I received both an email and a text from the merchant, confirming the purchase and letting me know that I would receive further communications when the shirt shipped. Which I did, the very next day, when both the merchant and UPS emailed me a shipment confirmation with a tracking link. (When I clicked the link, I learned that the package had been picked up and had arrived at a UPS facility in Tacoma, Washington.) 
I also received emails from the two companies, as well as another text from the merchant, the day before the delivery, informing me the shirt would arrive the following day&#8212;&#8220;Get ready!&#8221; the retailer brayed&#8212;and yet another UPS email, early on Tuesday morning, confirming that the shirt had been loaded onto a truck at a local warehouse and was officially &#8220;out for delivery.&#8221; There was a coda, too: The day after the shirt arrived, the merchant sent an email expressing its hope that I liked the garment and suggesting I post a review on its website.</p><p>I find myself in possession of a lot of information these days. I&#8217;m in the loop. I&#8217;m in many loops, all spinning simultaneously. It&#8217;s not just the minutiae of commerce&#8212;orders, shipments, deliveries&#8212;that are richly documented. When I&#8217;m driving, my car&#8217;s dashboard, linked to my iPhone through CarPlay, shows me exactly where I am, tells me the posted speed limit and the current traffic conditions, and lets me know both the distance I have to go before I reach my destination and the estimated time of my arrival. (There&#8217;s also a readout available on the town or city I&#8217;m visiting: population, elevation, square footage, GPS coordinates.) My phone&#8217;s weather app gives me a bespoke meteorological report of remarkable thoroughness. Right this second, the app tells me it&#8217;s eighty-four degrees and cloudy outside. A light rain will begin in seventeen minutes and will end forty-eight minutes after that, at which point it will become partly cloudy. The wind is blowing west-southwest at six miles per hour, the relative humidity is 58 percent, and the barometric pressure is 30.18 inHg. The UV index is six, which is High, and the air quality index is fifty-one, which is Moderate. The sun will set this evening at 8:11 p.m., and in four days the moon will be full. I&#8217;ve taken 4,325 steps today. 
My refrigerator&#8217;s water filter has only 10 percent of its useful life left. My credit rating just dropped eight points. I have 4,307 unread emails, two more than I had five minutes ago.</p><p>Even my consumption of cultural goods&#8212;an ugly phrase, yes, but it seems apt&#8212;is shadowed by metadata. When the graphical user interface was introduced to personal computers in the early 1980s, the scroll bar habituated us to a visual indicator of our progress through a document. Now, pretty much all viewing, listening, and reading is tracked, visually or numerically, in real time. When I&#8217;m listening to a song, a glance at the progress bar tells me, to the second, how much time has elapsed since the tune began and how much remains before it ends. The same goes for TV shows and movies and videos. When I&#8217;m reading an ebook, I&#8217;m kept apprised of the percentage of the text I&#8217;ve made it through. When I&#8217;m looking over the homepage of a newspaper or magazine site, I&#8217;m told how long it will take to read each article. Here&#8217;s a &#8220;3 min read.&#8221; There&#8217;s a &#8220;7 min read.&#8221; (This essay, for the record, is a thirteen-minute read, and you have nine minutes to go.) 
Every photo on my phone offers its own little data dump: where and when it was taken, the aperture and ISO settings, the exposure time, the image&#8217;s size in pixels and bits. My pictures tend to be amateurish, but the data always looks professional.</p><p>We talk a lot these days about Big Data, those heaping stores of digitized information that, fueling search and recommendation engines, social media feeds, and, now, artificial intelligence models, govern so much of our lives today. But we don&#8217;t give much notice to what might be called little data&#8212;all those fleeting, discrete bits of information that swarm around us like gnats on a humid summer evening. Measurements and readings. Forecasts and estimates. Facts and statistics. Yet it&#8217;s the little data, at least as much as the big stuff, that shapes our sense of ourselves and the world around us as we click and scroll through our days. Our apps have recruited us all into the arcane fraternity of the logistics manager and the process-control engineer, the meteorologist and the lab tech, and what we&#8217;re monitoring and measuring, in such exquisite detail, is our own existence. &#8220;Software is eating the world,&#8221; the venture capitalist Marc Andreessen declared in a famous <em>Wall Street Journal </em>op-ed a decade ago. It&#8217;s also eating us.</p><p>In <em>Minima Moralia</em>, his 1951 book of aphoristic musings, the German philosopher Theodor Adorno made a trenchant observation about the intimate relationship he saw developing between humanity and its ever more elaborate and encompassing technology. People were growing attuned to and protective of &#8220;the functioning of the apparatus, in which they are not only objectively incorporated but with which they proudly identify themselves.&#8221; Adorno wasn&#8217;t just rehashing the trope about laborers becoming cogs in the industrial machine, so memorably expressed fifteen years earlier by Charlie Chaplin in<em> Modern Times</em>. 
His point was subtler. Machines aren&#8217;t our masters. They&#8217;re not even separate from us. As their makers, we imbue them with our own will and desire. They&#8217;re our familiars, and we&#8217;re theirs. As we form tighter bonds, our intentions merge. We vibrate to the same rhythms, adopt the same posture toward the world.</p><p>The mechanical apparatuses of Adorno&#8217;s time, from machine tools in factories to vacuum cleaners in homes, emphasized the industrial ethos of routinization, standardization, and repetition. They oriented people toward the efficient production of outputs. They turned everyone into a machinist. But the apparatuses were not constant presences in people&#8217;s lives. Workers walked away from their machines at the end of their shifts. Vacuum cleaners went back into the closet once the rugs were clean. The internet is different. Thanks to the omnipresence of the smartphone, it&#8217;s always there. The network is less a tool than a habitation, less an apparatus than an environment. We don&#8217;t just use it to get things done. We are, as Adorno foresaw, incorporated into it as components. We&#8217;re nodes, continuously receiving and transmitting signals. The ethos of the system is one of documentation and representation. We&#8217;re all jointly engaged in the production of a facsimile of the world&#8212;a &#8220;mirror world,&#8221; to borrow a term from the computer scientist David Gelernter, created purely of information&#8212;and in that facsimile we have taken up residence.</p><p>Clean and tidy, the mirror world has practical value. It makes life run more smoothly. If I know I&#8217;m going to have to sign for a package, it&#8217;s useful to be told when it will arrive. If I&#8217;m on a highway and I&#8217;m alerted to an accident ahead, I can take an exit before I get stuck in a traffic jam. If I know rain is going to start falling in seventeen minutes, I can put off the walk I was about to take. 
But the view of reality that little data give us is narrow and distorted. The image in the mirror has low resolution. It obscures more than it reveals. Data can show us only what can be made explicit. Anything that can&#8217;t be reduced to the zeroes and ones that run through computers gets pruned away. What we don&#8217;t see when we see the world as information are qualities of being&#8212;ambiguity, contingency, mystery, beauty&#8212;that demand perceptual and emotional depth and the full engagement of the senses and the imagination. It hardly seems a coincidence that we find ourselves uncomfortable discussing or even acknowledging such qualities today. In their open-endedness, they defy datafication.</p><p>Still, little data&#8217;s simplifications are reassuring. By shrinking the world to the well-defined and the measurable, they lend a sense of order and predictability to our disjointed lives. Social situations used to be bounded in space and time. You&#8217;d be in one place, with one group of people, and then, sometime later, you&#8217;d be somewhere else, with another group. Such &#8220;situation segregation&#8221; served as &#8220;a psycho-social shock absorber,&#8221; the communication professor Joshua Meyrowitz explained in his 1986 book, <em>No Sense of Place</em>. &#8220;By selectively exposing ourselves to events and other people, we control the flow of our actions and emotions.&#8221; Social media eliminates the spatiotemporal boundaries. Social settings blur together. We&#8217;re everywhere, with everyone, all at once. The shock absorber gone, a welter of overlapping events and conversations buffets the nervous system. Time stamps, progress bars, location mappings, and other such informational indicators help temper the anxiousness bred by the flux. They give us a feeling that we&#8217;re still situated in time and space, that we exist in a solid world of things rather than a vaporous one of symbols. 
The feeling may be an illusion&#8212;the information offers only a sterile representation of the real&#8212;but it&#8217;s comforting nonetheless. My shirt is in Tacoma, and all is right in the world.</p><p>The comfort is welcome. It&#8217;s one reason the data exert such a pull on us. But there&#8217;s a bigger reason. Little data tell us little stories in which we play starring roles. When I track a package as it hopscotches across the country from depot to depot, I know that I&#8217;m the prime mover in the process&#8212;the one who set it in motion and the one who, when I tear open the box, will bring it to a close. That little white arrowhead traveling so confidently across the map on the dashboard? That&#8217;s me. I&#8217;m going somewhere. I&#8217;m worth watching. When I monitor the advance of a song&#8217;s progress bar, I know I can stop the music anytime, purely at my whim. I&#8217;m the DJ. I&#8217;m the tastemaker. I say when one tune ends and the next begins. So lovingly personalized, so indulgent, little data put us at the center of things. They tell us that we have power, that we matter.</p><p>And yet, as we rely on the data to get our bearings and exercise our agency, we lose definition as individuals. The self, always hazy, dissolves into abstraction. We begin to exist symbolically, a pattern of information within a broader pattern of information. We feel this most acutely when we shape an identity to fit the parameters of social media. Everything we do on platforms like Facebook, X, and Substack is logged, and the resulting data are often immediately visible to us (and others) in the form of like and view tallies, friend and follower counts, comment and retweet scores, and other quantitative measures of activity and affect. Even the number of seconds that elapse between a post and a response becomes laden with meaning. 
Social status and personal character take numerical forms and, like other measurements, demand to be monitored, managed, and optimized. Just as today&#8217;s airline pilots, surrounded by data displays in their &#8220;glass cockpits,&#8221; fly their planes more by number than by sight and feel, so we seem fated to navigate our lives more through recorded signals than through direct experience. Events become real only when they&#8217;re rendered after the fact as information. Pics or it didn&#8217;t happen, as the Instagrammers used to say.</p><p>When social relations are conducted through data, they come to resemble economic relations. They turn transactional. Before my Uber driver sees me as a person, she sees me as an assemblage of information&#8212;a location on a map, a rating on a five-point scale, a first name&#8212;and I see her the same way. The gig economy, like the social media system, is constructed of little data. It works by turning people and their activities into abstractions, digital signals that can be processed by computers. It&#8217;s only logical that, in cities like San Francisco, Phoenix, and Austin, the drivers are now being automated out of existence. Self-driving algorithms can carry out the necessary transactions with even more precision and efficiency. To really perfect the system, though, you&#8217;d need to turn the passengers into automatons, too. The trip would take place not on asphalt but entirely on screen, a flow of data through the mirror world. We may not want to admit it, but when we communicate using little data, we&#8217;re speaking the language of robots.</p><p>Back in 2004, in an interview with <em>Playboy</em> magazine, Sergey Brin, one of Google&#8217;s founders, said something that has stuck with me. 
&#8220;The entirety of the world&#8217;s information,&#8221; he suggested, might one day become &#8220;just one of our thoughts.&#8221; He was speculating about the possibility that Google would invent some sort of electronic implant to connect an individual&#8217;s nervous system to the internet. The idea seemed far-fetched to me at the time, and it still does. But as I think about how my mind works these days, I&#8217;m coming to realize that Brin may have been more prescient than either he or I realized. We don&#8217;t need dongles hanging out of our skulls. The stream of little data is already a stream of consciousness. It&#8217;s running through our heads all the time. In coming years, as digital sensors proliferate, as more and more objects turn into computer interfaces, and as AI gets better at reading our interests and intentions, the ever-swelling data stream may become our dominant train of thought, our all-purpose apparatus for the work of sense-making and self-making.</p><p>A few months ago, as part of my annual physical exam, I had blood drawn for a routine panel of tests. Late the next day, my phone vibrated to let me know the results were available through my doctor&#8217;s &#8220;patient portal&#8221; app. I signed in (entering a six-digit code to authenticate myself), clicked on the Results tab, and was greeted by a long list of numbers. There must have been two dozen of them, each a measure of some important metabolic function, each occupying a point within a range of points. Blood, that most vital and visceral of substances, had been turned into an array of data on a computer screen. Blood had been rendered bloodless. Maybe I was in a morbid mood&#8212;medical tests will do that to you&#8212;but as I scrolled through the numbers, I couldn&#8217;t help feeling I was looking at a metaphor for something larger, something central to the human condition today. 
What is datafication but a process for transforming the living into the dead?</p><p>I returned the shirt. It didn&#8217;t fit.</p><div><hr></div><p><em>This essay originally appeared in the Fall 2024 issue of <a href="https://hedgehogreview.com">The Hedgehog Review</a>.</em></p>]]></content:encoded></item><item><title><![CDATA[The Original Chatbot]]></title><description><![CDATA[On Joseph Weizenbaum and Eliza.]]></description><link>https://www.newcartographies.com/p/the-original-chatbot</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-original-chatbot</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 01 Jun 2025 14:44:03 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/248b61ad-3eaa-498e-92d2-cb4fddebcfac_1796x1048.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!L4xG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!L4xG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 424w, https://substackcdn.com/image/fetch/$s_!L4xG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 848w, https://substackcdn.com/image/fetch/$s_!L4xG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!L4xG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!L4xG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic" width="1456" height="1169" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1169,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:236284,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/164933913?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!L4xG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 424w, https://substackcdn.com/image/fetch/$s_!L4xG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 848w, 
https://substackcdn.com/image/fetch/$s_!L4xG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 1272w, https://substackcdn.com/image/fetch/$s_!L4xG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F065b413d-c870-4af6-8764-5601daaeb13f_1958x1572.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Jean-L&#233;on G&#233;r&#244;me, <em>Pygmalion and Galatea</em> 
(detail).</figcaption></figure></div><p><em>In Ovid&#8217;s </em>Metamorphoses<em>, the sculptor Pygmalion, a celibate by choice, sculpts a beautiful woman in ivory and falls in love with her. &#8220;He kisses it and feels his kisses are returned.&#8221; Nearly two thousand years later, in 1913, George Bernard Shaw uses Ovid&#8217;s story as the basis for his play </em>Pygmalion<em>, in which the phonetics professor Henry Higgins teaches the cockney guttersnipe Eliza Doolittle to speak the King&#8217;s English and in the process falls for her. In 1956, Lerner and Loewe turn Shaw&#8217;s play into a celebrated musical, </em>My Fair Lady<em>, which in 1964 is adapted into a hit film starring Rex Harrison as Henry and Audrey Hepburn as Eliza. That same year, the MIT computer scientist and AI researcher Joseph Weizenbaum begins programming the first computer chatbot, which he names, as a joke, Eliza. He fails to foresee how his Eliza will prove just as seductive as her predecessors. In today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a>, drawn from &#8220;A Thing Like Me,&#8221; the closing chapter of my 2010 book </em><a href="https://bookshop.org/p/books/the-shallows-what-the-internet-is-doing-to-our-brains-nicholas-carr/15563044?ean=9780393357820&amp;next=t&amp;affiliate=85280">The Shallows</a><em>, I tell the story of Weizenbaum, who, to his credit, did not fall for his creation. </em></p><div><hr></div><p>It was one of the odder episodes in the history of computer science, yet also one of the more telling. Over the course of a few months in 1964 and 1965, Joseph Weizenbaum, a forty-one-year-old computer scientist at the Massachusetts Institute of Technology, wrote a software application for parsing written language, which he programmed to run on the university&#8217;s new time-sharing system. 
A student, sitting at one of the system&#8217;s terminals, would type a sentence into the computer, and Weizenbaum&#8217;s program, following a set of simple rules about English grammar, would identify a salient word or phrase in the sentence and analyze the syntactical context in which it was used. The program would then, following another set of rules, transform the sentence into a new sentence that had the appearance of being a response to the original. The computer-generated sentence would appear almost instantly on the student&#8217;s terminal, giving the illusion of a conversation.</p><p>In a January 1966 paper introducing his program, Weizenbaum provided an example of how it worked. If a person typed the sentence &#8220;I am very unhappy these days,&#8221; the computer would need only know that the phrase &#8220;I am&#8221; typically comes before a description of the speaker&#8217;s current situation or state of mind. The computer could then recast the sentence into the reply &#8220;How long have you been very unhappy these days?&#8221; The program worked, Weizenbaum explained, by first applying &#8220;a kind of template to the original sentence, one part of which matched the two words &#8216;I am&#8217; and the remainder [of which] isolated the words &#8216;very unhappy these days.&#8217;&#8221; It then used an algorithmic &#8220;reassembly kit,&#8221; tailored to the template, that included a rule specifying that &#8220;any sentence of the form &#8216;I am BLAH&#8217;&#8221; should be &#8220;transformed to &#8216;How long have you been BLAH,&#8217; independently of the meaning of BLAH.&#8221;</p><p>Weizenbaum named his program Eliza, after Eliza Doolittle, the cockney flower girl who, in George Bernard Shaw&#8217;s <em>Pygmalion</em>, learns to speak proper English under the tutelage of the haughty phonetics professor Henry Higgins. 
To make the simulated conversations a little more interesting, Weizenbaum also gave his artificial interlocutor a persona&#8212;that of a Rogerian psychotherapist. The choice, as he explained in his paper, was a pragmatic one. Using a technique developed by the psychologist Carl Rogers in the 1940s, Rogerian therapists pretended, in their conversations with patients, to have no understanding of the world. For the most part, they simply parroted their patients&#8217; statements back to them in the form of banal, open-ended questions or comments. Knowing that the na&#239;vet&#233; was a pose, the patients were free to attribute to their therapists &#8220;all sorts of background knowledge, insights and reasoning ability.&#8221; The Rogerian persona, Weizenbaum recognized, had &#8220;a crucial psychological utility&#8221; for Eliza: it suggested that the program&#8217;s vacuity masked some sort of real intelligence.</p><p>Weizenbaum provided a transcript of part of an exchange between Eliza and a young woman who had been an early tester of the program (the machine&#8217;s responses are in italics):</p><blockquote><p>Men are all alike.<br><em>In what way?<br></em>They&#8217;re always bugging us about something or other.<br><em>Can you think of a specific 
example?<br></em>Well, my boyfriend made me come here.<br><em>Your boyfriend made you come here.<br></em>He says I&#8217;m depressed much of the time.<br><em>I am sorry to hear you are depressed.<br></em>It&#8217;s true. I am unhappy.<br><em>Do you think coming here will help you to be not unhappy?<br></em>I need some help, that much seems certain.<br><em>What would it mean to you if you got some help?<br></em>Perhaps I could learn to get along with my mother.<br><em>Tell me more about your family.<br></em>My mother takes care of me.<br><em>Who else in your family takes care of you?<br></em>My father.<br><em>Your father.<br></em>You are like my father in some ways.<br><em>What resemblance do you see?<br></em>You are not very aggressive but I think you don&#8217;t want me to notice that.<br><em>What makes you think I am not very aggressive?<br></em>You don&#8217;t argue with me.<br><em>Why do you think I don&#8217;t argue with you?<br></em>You are afraid of me.<br><em>Does it please you to believe I am afraid of you?<br></em>My father is afraid of everybody.</p></blockquote><p>In commenting on his creation, which he acknowledged to be a modest and even slightly silly contribution to the fledgling field of natural-language processing, Weizenbaum observed how easy it is for computer programmers to make machines &#8220;behave in wondrous ways, often sufficient to dazzle even the most experienced observer.&#8221; But as soon as a program&#8217;s &#8220;inner workings are explained in language sufficiently plain to induce understanding,&#8221; he continued, &#8220;its magic crumbles away; it stands revealed as a mere collection of procedures, each quite comprehensible.&#8221; The program goes &#8220;from the shelf marked &#8216;intelligent&#8217; to that reserved for curios.&#8221;</p><p>Weizenbaum, like Henry Higgins, was soon to have his equilibrium disturbed. 
Eliza quickly found fame on the MIT campus, becoming a mainstay of lectures and presentations about computing and time-sharing. It was among the first software programs able to demonstrate the power and speed of computers in a way that laymen could easily grasp. You didn&#8217;t need a background in mathematics, much less computer science, to chat with Eliza. Copies of the program proliferated at other schools as well. Then the press took notice, and Eliza became, as Weizenbaum later put it, &#8220;a national plaything.&#8221; </p><p>While he was surprised by the public&#8217;s interest in his program, what shocked him was how quickly and deeply people using the software &#8220;became emotionally involved with the computer,&#8221; talking to it as if it were an actual person. Users &#8220;would, after conversing with it for a time, insist, in spite of my explanations, that the machine really understood them.&#8221; Even his secretary, who had watched him write the code for Eliza &#8220;and surely knew it to be merely a computer program,&#8221; was seduced. After a few moments using the software at a terminal in Weizenbaum&#8217;s office, she asked the professor to leave the room because she was embarrassed by the intimacy of the conversation. &#8220;What I had not realized,&#8221; said Weizenbaum, &#8220;is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.&#8221;</p><p>Things were about to get stranger still. Distinguished psychiatrists and psychologists began to suggest, with considerable enthusiasm, that the program could play a valuable role in actually treating the ill and the disturbed. 
In an article in the <em>Journal of Nervous and Mental Disease</em>, three prominent research psychiatrists wrote that Eliza, with a bit of tweaking, could be &#8220;a therapeutic tool which can be made widely available to mental hospitals and psychiatric centers suffering a shortage of therapists.&#8221; Thanks to the &#8220;time-sharing capabilities of modern and future computers, several hundred patients an hour could be handled by a computer system designed for this purpose.&#8221; Writing in <em>Natural History</em>, the prominent astrophysicist Carl Sagan expressed equal excitement about Eliza&#8217;s potential. He foresaw the development of &#8220;a network of computer therapeutic terminals, something like arrays of large telephone booths, in which, for a few dollars a session, we would be able to talk with an attentive, tested, and largely non-directive psychotherapist.&#8221;</p><p>To converse with Eliza was to engage in a variation on the famous Turing test. But, as Weizenbaum was astonished to discover, the people who &#8220;talked&#8221; with his program had little interest in making rational, objective judgments about the identity of Eliza. They <em>wanted</em> to believe that Eliza was actually thinking. They <em>wanted</em> to imbue Eliza with human qualities&#8212;even when they were well aware that it was nothing more than a computer program following simple and rather obvious instructions. The Turing test, it turned out, was as much a test of the way human beings think as of the way machines think. In their <em>Journal of Nervous and Mental Disease</em> article, the three psychiatrists hadn&#8217;t just suggested that Eliza could serve as a substitute for a real therapist. 
They went on to argue, in circular fashion, that a psychotherapist was in essence a kind of computer: &#8220;A human therapist can be viewed as an information processor and decision maker with a set of decision rules which are closely linked to short-range and long-range goals.&#8221; In simulating a human being, however clumsily, Eliza encouraged human beings to think of themselves as simulations of computers.</p><p>The reaction to the software unnerved Weizenbaum. It planted in his mind a question he had never before asked himself but that would preoccupy him for many years: &#8220;What is it about the computer that has brought the view of man as a machine to a new level of plausibility?&#8221; In 1976, a decade after Eliza&#8217;s debut, he provided an answer in his book <em>Computer Power and Human Reason</em>. To understand the effects of a computer, he argued, you had to see the machine in the context of humankind&#8217;s past intellectual technologies, the long succession of tools that transformed how people think and altered their &#8220;perception of reality.&#8221; Such technologies become part of &#8220;the very stuff out of which man builds his world.&#8221; Once adopted, they can never be abandoned, at least not without plunging society into &#8220;great confusion and possibly utter chaos.&#8221; An intellectual technology, he wrote, &#8220;becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure.&#8221;</p><p>That fact, almost &#8220;a tautology,&#8221; helps explain how our dependence on digital computers grew steadily and seemingly inexorably after the machines were invented at the end of the Second World War. 
&#8220;The computer was not a prerequisite to the survival of modern society in the post-war period and beyond,&#8221; Weizenbaum argued; &#8220;its enthusiastic, uncritical embrace by the most &#8216;progressive&#8217; elements of American government, business, and industry made it a resource essential to society&#8217;s survival <em>in the form</em> that the computer itself had been instrumental in shaping.&#8221; He knew from his experience with time-sharing networks that the role of computers would expand beyond the automation of governmental and industrial processes. Computers would come to mediate the activities that define people&#8217;s everyday lives&#8212;how they learn, how they think, how they socialize. What the history of intellectual technologies shows us, he warned, is that &#8220;the introduction of computers into some complex human activities may constitute an irreversible commitment.&#8221; Our intellectual and social lives may, like our industrial routines, come to reflect the form that the computer imposes on them.</p><p>What makes us most human, Weizenbaum had come to believe, is what is least computable about us&#8212;the connections between our mind and our body, the experiences that shape our memory and our thinking, our capacity for emotion and empathy. The great danger we face as we become more intimately involved with our computers&#8212;as we come to experience more of our lives through the disembodied symbols flickering across our screens&#8212;is that we&#8217;ll begin to lose our humanness, to sacrifice the very qualities that separate us from machines. 
The only way to avoid that fate, Weizenbaum wrote, is to have the self-awareness and the courage to refuse to delegate to computers the most human of our mental activities and intellectual pursuits, particularly &#8220;tasks that demand wisdom.&#8221;</p><p>In addition to being a learned treatise on the workings of computers and software, Weizenbaum&#8217;s book was a cri de coeur, a computer programmer&#8217;s passionate examination of the limits of his profession. The book did not endear the author to his peers. After it came out, Weizenbaum was spurned as a heretic by leading computer scientists, particularly those pursuing artificial intelligence. John McCarthy, one of the early AI pioneers and promoters, spoke for many technologists when, in a mocking review, he dismissed <em>Computer Power and Human Reason</em> as &#8220;an unreasonable book&#8221; and scolded Weizenbaum for unscientific &#8220;moralizing.&#8221; Outside the data-processing field, the book caused only a brief stir. It appeared just as the first personal computers were making the leap from hobbyists&#8217; workbenches to mass production. 
The public, primed for the start of a buying spree that would put computers into every office, home, and school in the land, was in no mood to entertain an apostate&#8217;s doubts.</p>]]></content:encoded></item><item><title><![CDATA[The Myth of Automated Learning]]></title><description><![CDATA[AI's real threat to education.]]></description><link>https://www.newcartographies.com/p/the-myth-of-automated-learning</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-myth-of-automated-learning</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Tue, 27 May 2025 10:00:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!E1AR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!E1AR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!E1AR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 424w, https://substackcdn.com/image/fetch/$s_!E1AR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 848w, 
https://substackcdn.com/image/fetch/$s_!E1AR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 1272w, https://substackcdn.com/image/fetch/$s_!E1AR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!E1AR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic" width="1456" height="858" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:858,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:357121,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/163771392?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!E1AR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 424w, 
https://substackcdn.com/image/fetch/$s_!E1AR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 848w, https://substackcdn.com/image/fetch/$s_!E1AR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 1272w, https://substackcdn.com/image/fetch/$s_!E1AR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F20a45140-f2a8-44d1-a32c-08aaf26c7257_2236x1318.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Fran&#231;ois Bonvin, <em>Still Life with Book, Papers and Inkwell </em>(detail)<em>.</em></figcaption></figure></div><p>Among the general public, generative AI&#8217;s most enthusiastic early adopters have been students. Surveys conducted a year ago revealed that nearly 90 percent of college students and more than 50 percent of high-schoolers were regularly using chatbots for schoolwork. Those numbers are certainly higher now. AI may be the most rapidly adopted educational tool since the pencil.</p><p>Because text-generating bots like ChatGPT offer an easy way to cheat on papers and other assignments, students&#8217; embrace of the technology has stirred uneasiness, and sometimes despair, among educators. Teachers and pupils now find themselves playing an algorithmic <a href="https://www.wsj.com/tech/personal-tech/ai-student-papers-humanize-school-6b13240a?st=c85noA&amp;reflink=desktopwebshare_permalink">cat-and-mouse game</a>, with no winners. But cheating is a symptom of a deeper, more insidious problem. The real threat AI poses to education isn&#8217;t that it encourages cheating. It&#8217;s that it discourages learning.</p><p>To understand why, it&#8217;s important to recognize that generative AI is an automation technology. You can speculate all you want about computers eventually attaining human-level intelligence or even &#8220;superintelligence,&#8221; but for the time being AI is doing something that has a long precedent in human affairs. Whether it&#8217;s engaged in research or summarization, writing words or creating charts, it is replacing human labor with machine labor.</p><p>Thanks to human-factors researchers and the mountain of evidence they&#8217;ve compiled on the consequences of automation for workers,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> we know that one of three things happens when people use a machine to automate a task they would otherwise have done themselves:</p><ol><li><p>Their skill in the activity grows.</p></li><li><p>Their skill in the activity atrophies.</p></li><li><p>Their skill in the activity never develops.</p></li></ol><p>Which scenario plays out hinges on the level of mastery a person brings to the job. If a worker has already mastered the activity being automated, the machine can become an aid to further skill development. It takes over a routine but time-consuming task, allowing the person to tackle and master harder challenges. In the hands of an experienced mathematician, for instance, a slide rule or a calculator becomes an intelligence amplifier.  
</p><p>If, however, the maintenance of the skill in question requires frequent practice &#8212; as is the case with most manual skills and many skills requiring a combination of manual and mental dexterity &#8212; then automation can threaten the talent of even a master practitioner. We see this in <a href="https://www.newcartographies.com/p/on-autopilot">aviation</a>. When skilled pilots become so dependent on autopilot systems that they rarely practice manual flying, they suffer what researchers term &#8220;skill fade.&#8221; They lose situational awareness, and their reactions slow. They get rusty. </p><p>Automation is most pernicious in the third scenario: when a machine takes command of a job before the person using the machine has gained any direct experience doing the work. Without experience, without practice, talent is stillborn. That was the story of the &#8220;deskilling&#8221; phenomenon of the early Industrial Revolution. Skilled craftsmen were replaced by unskilled machine operators. The work sped up, but the only skill the machine operators developed was the skill of operating the machine, which in most cases was hardly any skill at all. Take away the machine, and the work stops.</p><p>Because generative AI is a general-purpose technology that can be used to automate all sorts of tasks and jobs, we&#8217;re likely to see plenty of examples of each of the three skill scenarios in the years to come. But AI&#8217;s use by high-school and college students to complete written assignments, to ease or avoid the work of reading and writing, is a special case. It puts the process of deskilling at education&#8217;s core. To automate learning is to subvert learning. 
</p><p>Unlike carpentry or calculus, learning is not a skill that can be &#8220;mastered.&#8221; It&#8217;s true that the more research you do, the better you&#8217;ll get at doing research, and the more papers you write, the better you&#8217;ll get at writing papers, but the pedagogical value of a writing assignment doesn&#8217;t lie in the tangible product of the work &#8212; the paper that gets handed in at the assignment&#8217;s end. It lies in the work itself: the critical reading of source materials, the synthesis of evidence and ideas, the formulation of a thesis and an argument, and the expression of thought in a coherent piece of writing. The paper is a proxy that the instructor uses to evaluate the success of the work the student has done &#8212; the work of learning. Once graded and returned to the student, the paper can be thrown away.</p><p>Generative AI enables students to produce the product without doing the work. Rather than reading and making sense of difficult source texts, they can ask a chatbot to gin up simplified summaries. Rather than synthesizing various ideas and perspectives through concerted thinking, they can ask the chatbot for a generic synthesis. And rather than expressing (and refining) their thoughts through the composition of sentences and paragraphs, they can get the bot to spit out a first draft or even a final one. The paper a student hands in no longer provides evidence of the work of learning its creation entailed. 
It is a substitute for the work.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p><p>In a recent <em>Chronicle of Higher Education</em> <a href="https://www.chronicle.com/article/is-ai-enhancing-education-or-replacing-it">article</a>, Clay Shirky, a long-time analyst of digital media who serves as Vice Provost for AI and Technology in Education at New York University, drew on his extensive work with professors and students to explain how this new dynamic is eating away at one of education&#8217;s foundations:</p><blockquote><p>Every year, 15 million or so undergraduates in the United States produce papers and exams running to billions of words. While the <em>output </em>of any given course is student assignments &#8212; papers, exams, research projects, and so on &#8212; the <em>product </em>of that course is student experience. &#8220;Learning results from what the student does and thinks,&#8221; as the great educational theorist Herbert Simon once noted, &#8220;and only as a result of what the student does and thinks.&#8221; . . .</p><p>The utility of written assignments relies on two assumptions: The first is that to write about something, the student has to understand the subject and organize their thoughts. The second is that grading student writing amounts to assessing the effort and thought that went into it. At the end of 2022, the logic of this proposition &#8212; never ironclad &#8212; began to fall apart completely. The writing a student produces and the experience they have can now be decoupled as easily as typing a prompt, which means that grading student writing might now be unrelated to assessing what the student has learned to comprehend or express.</p></blockquote><p>The work of learning is hard by design &#8212; unchallenged, the mind learns nothing &#8212; and trying to alleviate or avoid it is nothing new. 
Time-pressed students have always sought shortcuts (CliffsNotes and SparkNotes built businesses serving them), and unscrupulous students have always found ways to cheat. But generative AI is something different, not just in scale but in kind. AI&#8217;s speed, ease of use, flexibility, and, most important, wide adoption throughout society are making it feel normal and even necessary to automate reading and writing and bypass the work of learning. Struggling with words and the ideas they represent is starting to feel old-fashioned and even foolish, like struggling to navigate a city with a paper map. Why bother, when a machine can do the heavy lifting for you?</p><p>What AI too often produces is the illusion of learning. Students may well be able to write better papers with a chatbot than they could on their own, but they end up learning less. The problem doesn&#8217;t seem to be limited to writing assignments. An extensive 2024 University of Pennsylvania study of the effects of AI on high-school math students found, as its authors write in a forthcoming PNAS <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4895486">article</a>, that &#8220;access to GPT-4 significantly improves performance [as measured by grades],&#8221; but when access to the technology is taken away, &#8220;students actually perform worse than those who never had access.&#8221; Armed with generative AI, a B student can produce A work while turning into a C student.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>An ironic consequence of the loss of learning is that it prevents students from using AI adeptly. Writing a good prompt requires an understanding of the subject being explored. The prompter needs to know the context of the prompt. The development of that kind of understanding is exactly what a reliance on AI impedes. 
&#8220;The most useful deployment of current and near-future generative AI in research and expression absolutely <em>requires </em>that you already know a great deal,&#8221; <a href="https://timothyburke.substack.com/p/academia-is-ai-hype-yes">writes</a> Swarthmore history professor Timothy Burke, but the way the technology is actually being used &#8220;is brutally short-circuiting the processes by which people gain enough knowledge and expressive proficiency to be able to use the potential of generative AI correctly.&#8221; The tool&#8217;s deskilling effect extends to the use of the tool itself.</p><p>Shirky senses a growing &#8220;sadness&#8221; among students as they become more dependent on AI. They feel compelled to use the technology even though they know it&#8217;s sapping their learning &#8212; and foreclosing the intellectual possibilities that learning opens, the satisfactions that come with doing or grasping something hard. He quotes some undergraduates:</p><blockquote><p>&#8220;I&#8217;ve become lazier. AI makes reading easier, but it slowly causes my brain to lose the ability to think critically or understand every word.&#8221; </p><p>&#8220;I literally can&#8217;t even go 10 seconds without using Chat when I am doing my assignments. I hate what I have become because I know I am learning NOTHING, but I am too far behind now to get by without using it . . . my motivation is gone.&#8221;</p><p>&#8220;Everyone is doing it.&#8221;</p></blockquote><p>We&#8217;ve been focused on how students use AI to cheat. What we should be more concerned about is how AI cheats students. 
</p><p><em>This post is part of <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, the New Cartographies series on AI and its cultural and economic consequences.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>I explore this evidence in my book on automation, <em><a href="https://www.nicholascarr.com/?page_id=18">The Glass Cage</a></em>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>One might argue that, in adopting generative AI, students are bringing to its logical conclusion a long-running trend in education that was set in motion by parents, politicians, and school administrators: stressing quantitative measures of performance over actual learning. </p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>This cycle of dependency is good for AI companies. In March, with evidence of AI&#8217;s disruptive effects on education mounting, OpenAI <a href="https://help.openai.com/en/articles/10968654-student-discounts-for-chatgpt-plus-us-canada">announced</a> that it was giving students free access to the premium version of its service, ChatGPT Plus, through the end of the school year. For AI companies, students aren&#8217;t learners. 
They&#8217;re customers.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Friendship in the Age of Digital Simulation]]></title><description><![CDATA[Only echoes.]]></description><link>https://www.newcartographies.com/p/friendship-in-the-age-of-digital</link><guid isPermaLink="false">https://www.newcartographies.com/p/friendship-in-the-age-of-digital</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 11 May 2025 07:13:13 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!30ML!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!30ML!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!30ML!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 424w, https://substackcdn.com/image/fetch/$s_!30ML!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 848w, https://substackcdn.com/image/fetch/$s_!30ML!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!30ML!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!30ML!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic" width="1456" height="1032" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1032,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1027557,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/163276923?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!30ML!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 424w, https://substackcdn.com/image/fetch/$s_!30ML!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 848w, 
https://substackcdn.com/image/fetch/$s_!30ML!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 1272w, https://substackcdn.com/image/fetch/$s_!30ML!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F87a255aa-e575-402f-86fb-3070835cb8ec_2788x1976.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">James Ensor, <em>Masks Confronting Death.</em></figcaption></figure></div><p>The new version of Mark 
Zuckerberg &#8212; I&#8217;ll call him alt-Mark &#8212; is peddling a new version of the metaverse. The <a href="https://www.roughtype.com/?p=8935">original idea</a> behind the virtual world, as you may hazily recall, was that we&#8217;d be digitally transformed into legless cartoon characters who would fly around a cartoon planet having sword fights and doing other supposedly fun things. It was a reboot of <em>The Jetsons</em> with Mr. Spacely as executive producer. </p><p>That version didn&#8217;t come close to achieving critical mass &#8212; I think only five people have signed up for Meta&#8217;s Horizon Worlds &#8212; and it was ditched when the release of ChatGPT rerouted the future onto a new and more lucrative path. With generative AI, you don&#8217;t need a critical mass of people to populate a user&#8217;s virtual social sphere. You can use chatbots, which have, to companies like Meta, distinct advantages over their human brethren. They&#8217;re cheap to produce, limitless in supply, and tractable in the extreme. Told what to do, they do it. </p><p>As friends, too, the bots have advantages over their meatier precursors. 
They&#8217;re always online, always awake, always there for you &#8212; <em>exclusively</em> for you.</p><p>Alt-Mark has been making the podcast rounds to promote Meta&#8217;s new vision of a virtualized society in which bot friends take the place of real friends. Technology, in this view, is the obvious solution to the loneliness crisis that technology created. &#8220;The average American I think has, it&#8217;s fewer than three friends, three people they&#8217;d consider friends,&#8221; Zuckerberg says (oblivious to the fact that he&#8217;s confessing that Facebook failed utterly in its stated aim of strengthening social bonds), &#8220;and the average person has demand for meaningfully more, I think it&#8217;s like fifteen friends.&#8221; Meta can bring friendship&#8217;s supply-demand imbalance back into equilibrium by using its mountain of personal data to gin up a dozen bespoke bot friends for every average American. Any squeamishness about such relationships will subside once we develop &#8220;the vocabulary as a society to articulate why they are valuable.&#8221;</p><p>And the friends that come out of Meta&#8217;s friend factory won&#8217;t be limited to the disappointing, run-of-the-mill friends available in your local neighborhood. 
These will be cool friends, hot friends. Meta can even &#8212; for a small added fee, one assumes &#8212; manufacture celebrity friends for you. As the <em>Wall Street Journal</em> <a href="https://www.wsj.com/tech/ai/meta-ai-chatbots-sex-a25311bf?st=4EpRWC&amp;reflink=desktopwebshare_permalink">reports</a>, &#8220;To boost the popularity of these souped-up chatbots, Meta has cut deals for up to seven-figures with celebrities like actresses Kristen Bell and Judi Dench and wrestler-turned-actor John Cena for the rights to use their voices.&#8221; It&#8217;s hard to imagine a better BFF than Dame Judi.</p><p>If the metaverse in its original conception represented a break from Facebook-style social media &#8212; Meta pitched it as a radical advance in the re-creation of society as a simulation &#8212; the botverse represents a retreat back to the familiar. It&#8217;s a continuation of the reigning social-media model, with nonhuman disembodied voices replacing human disembodied voices. Once you separate voice from being, alt-Mark understands, the next step is to mass-produce voice as a commodity, to offer friendship as a service.</p><p>The botverse is in fact the logical culmination of the social-media model, which has always sought to replace real friendship with a computer-generated facsimile. As the cultural critic Rob Horning <a href="https://robhorning.substack.com/p/contentment">argues</a>, the social-media system was never actually designed to encourage people to, in Zuckerberg&#8217;s oft-repeated phrase, &#8220;connect with friends and family.&#8221; It was designed to disconnect people from friends and family, to sever traditional social ties in order to reconstruct them in an algorithmically mediated form. 
&#8220;Tech companies,&#8221; Horning writes, &#8220;built social infrastructure only to undermine it, to help with dismantling it as a site of resistance to commercialization, commodification, and mediatization.&#8221; The companies knew that &#8220;isolated people make for more dependable consumers.&#8221; To monetize friendship, you first have to dismember it.</p><p>The Dutch media theorist Geert Lovink, in his 2019 book <em><a href="https://bookshop.org/a/85280/9780745339344">Sad By Design</a></em>, argued in a similar vein that unhappiness and angst aren&#8217;t unintended byproducts of social media but rather design features built into the system from the start. Through their continuously replenished supply of messages, each offering a simulation of connection, social platforms promise to alleviate the sense of loneliness they provoke. Parched, we keep returning to the well, even though we know the water&#8217;s bad. </p><p>By turning social interactions into symbolic transactions, the platforms reconstruct society on a foundation of anomie. Bots fit seamlessly into such a society, upping the monetization potential substantially.</p><p>When Facebook&#8217;s News Feed introduced us to what Zuckerberg termed &#8220;frictionless sharing,&#8221; we learned, or should have learned, that friction is the essence of sharing. Freed of any investment of effort, time, or care, sharing loses all meaning. It becomes mere transmission. The frictionless friendship offered by chatbots, by removing the need to adapt one&#8217;s self to another self, to make room in one&#8217;s life for a different being, will be similarly empty. Because our personalized chatbots will be modeled on our own characteristics and desires, as defined by the data the platforms collect on us, they will be versions of ourselves. They&#8217;ll do us in different voices. 
That seems like another recipe for amplifying loneliness, not alleviating it.</p><p>Our fate, should we take the path Zuckerberg and his Silicon Valley mates are laying for us, would be similar to that suffered by the stranded man in Robert Frost&#8217;s poem of existentialist despair, &#8220;The Most of It&#8221;:</p><blockquote><p>He thought he kept the universe alone;<br>For all the voice in answer he could wake<br>Was but the mocking echo of his own<br>From some tree-hidden cliff across the lake.<br>Some morning from the boulder-broken beach<br>He would cry out on life, that what it wants<br>Is not its own love back in copy speech,<br>But counter-love, original response.<br>And nothing ever came of what he cried.</p></blockquote><p>The voice of your personalized chatbot friend will never be more than a mocking echo of your own, even if it sounds exactly like Judi Dench.</p><div><hr></div><p><em>This post is part of <a href="https://www.newcartographies.com/t/dead-speech">Dead Speech</a>, the New Cartographies series on AI and its cultural and economic consequences.</em></p>]]></content:encoded></item><item><title><![CDATA[The Bus]]></title><description><![CDATA[Either you're on or you're off.]]></description><link>https://www.newcartographies.com/p/the-bus</link><guid isPermaLink="false">https://www.newcartographies.com/p/the-bus</guid><dc:creator><![CDATA[Nicholas Carr]]></dc:creator><pubDate>Sun, 20 Apr 2025 10:01:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!IPWm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>It was a little more than a decade ago that Google started its Bay Area bus service to shuttle employees back and forth to work. 
Other big tech firms, including Facebook and Apple, followed suit, creating a comfortable, clean, efficient, private mass-transit system for what Marc Andreessen would later call &#8220;the reality privileged.&#8221; Today&#8217;s <a href="https://www.newcartographies.com/t/rerun">Sunday Rerun</a> is a post I wrote in early 2014 about the Google bus as vehicle and symbol.</em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IPWm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IPWm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 424w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 848w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 1272w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IPWm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png" width="1456" 
height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1035629,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.newcartographies.com/i/161327751?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IPWm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 424w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 848w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 1272w, https://substackcdn.com/image/fetch/$s_!IPWm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0d8e35df-0dd4-45f0-9e21-350a5de29c12_1500x844.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Future bus station at Mars Base Alpha (SpaceX rendering).</figcaption></figure></div><p><em>Mobile</em>. <em>Social</em>. Before the app, before the smartphone, before the network, there was the bus. And the bus headed south from San Francisco toward a new world. Tom Wolfe told the tale with characteristic verve in his 1968 classic, <em>The Electric Kool-Aid Acid Test</em>:</p><blockquote><p>&#8220;There are going to be times,&#8221; says Kesey, &#8220;when we can&#8217;t wait for somebody. Now, you&#8217;re either on the bus or off the bus. If you&#8217;re on the bus, and you get left behind, then you&#8217;ll find it again. If you&#8217;re off the bus in the first place &#8212; then it won&#8217;t make a damn.&#8221; And nobody had to have it spelled out for them. 
Everything was becoming allegorical, understood by the group mind, and especially this: &#8220;You&#8217;re either on the bus . . . or off the bus.&#8221;</p></blockquote><p>In a richly allegorical incident that took place on a San Francisco street on December 9, a young Google employee harangued a group of protesters who had blocked a Google bus from making its rounds between the city and the company&#8217;s Mountain View campus. &#8220;This is a city for the right people who can afford it,&#8221; yelled the Googler, irate over his inability to get to the Googleplex and his free breakfast buffet. &#8220;You can&#8217;t afford it? You can leave. I&#8217;m sorry, get a better job.&#8221; There was a video, of course, and it exploded into virality on YouTube:</p><div id="youtube2-8yc6z2oSPdQ" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;8yc6z2oSPdQ&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/8yc6z2oSPdQ?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>But the guy wasn&#8217;t really a Googler. In a second virality surge, triggered just a couple of hours after the first, news spread that the whole event had been staged. The irate man was a union organizer named Max Alper, who described his stunt as &#8220;street theater.&#8221; A happening! Alper seemed at that moment a direct descendant of Ken Kesey&#8217;s Merry Pranksters. Rather than being <em>on the bus</em>, though, Alper was most definitely <em>off the bus</em>.</p><p>But if the Prankster was off the bus, who exactly was on the bus? Was it The Man? Had it been The Man all along? 
&#8220;I think there&#8217;s always been a tension between the countercultural rhetoric of Silicon Valley and its insurgent but ultimately corporate ethos,&#8221; mused Stanford media professor Fred Turner in a recent <a href="https://hbr.org/2014/01/how-silicon-valley-became-the-man">interview</a>. </p><blockquote><p>Google treats its engineers extremely well, offers extremely flexible work spaces, has built essentially a culture of collaboration and creativity that looks very communal and very wonderful, even as around those engineers it has cafeteria workers who are making something very close to minimum wage, and often lack the ability to get proper health insurance. That&#8217;s the kind of old communal mindset right there, where you bring together a kind of elite, give them a shared mindset, all the resources they need to live in that mindset, and yet surround them with folks who are relatively impoverished, often racially different, certainly members of a different class. In that sense, the communes were already The Man. And we&#8217;ve inherited their legacy.</p></blockquote><p>So there it is: The Kesey bus, through a kind of hallucinogenic transmogrification, has become the Google bus. The makeover is, on the surface, radical. The Kesey bus was a 1939 International Harvester school bus bought for peanuts; the Google bus is a plush new Van Hool machine that goes for half a million bucks. The Kesey bus was brightly colored, a rolling Grateful Dead poster; the Google bus is drab and anonymous, a rolling Jos. A. Bank suit. The Kesey bus was raucous and raunchy; the Google bus is hushed and chaste. The Kesey bus carried a vat of LSD for connecting with the group mind; the Google bus has wifi. </p><p>The Pranksters named their bus Furthur. 
If the Google bus had a name, it would be Safer.</p><p>Yet, despite the differences, both buses are vehicles of communalism. As Turner suggests, they carry young elites eager to distance themselves from the reigning culture, to define themselves as members of a select and separate society that will become a model for the superior society of the future. The existing culture is too corrupt, too far gone, to be reformed from within. You have to escape it to rebuild it. You have to start over. You have to get on the bus.</p><p>&#8220;Migration to North America was self-selective,&#8221; observed pioneering acid-dropper Timothy Leary in his essay &#8220;Exo-politics,&#8221; written in the mid-seventies while he was locked up in federal prison on a drug charge. </p><blockquote><p>The Pilgrim mothers and fathers fled from England to Holland, mortgaged their possessions, and sailed the Mayflower, because they wanted a place to live out the kooky, freaky reality that they collectively shared. And there&#8217;s no question the experiment is a success. Americans are freer than Europeans, and Californians are a new species evolving away from Americans. 
</p></blockquote><p>Having bumped up against the Pacific, the next step for <em>Homo californicus</em> would be to rocket off into the heavens to set up experimental &#8220;mini-worlds&#8221; in outer space. &#8220;Within ten years after initiating space migration,&#8221; Leary wrote, &#8220;a group of 1,000 people could get together cooperatively and build a new mini-world cheaper than they could buy individual homes down here. Within 25 years there&#8217;ll be a High-Orbital Mini-Earth for <em>your</em> vision of social reality. You have the right, duty, and responsibility to externalize that vision with those who share it.&#8221;</p><p>During the seventies, Leary had plenty of company in calling for the establishment of elite experimental colonies beyond the bounds of established society. Buckminster Fuller, Gerard O&#8217;Neill, and Jerry Brown, among others, <a href="https://ia601402.us.archive.org/22/items/spacecoloniesdec00unse/spacecoloniesdec00unse.pdf">argued</a> for expanding the American frontier to create zones of technological and social experimentation where innovation could proceed unhampered by outdated laws and traditions. The migration of the self-selecting elite would eventually help the more timid who chose to stay behind, Leary argued, as it &#8220;allows for new experiments &#8212; technological, political, and social &#8212; in a new ecological niche far from the home hive.&#8221;</p><p>That idea, scrubbed of its psychedelic origins, has today become the bedrock of Silicon Valley utopianism. &#8220;Law can&#8217;t be right if it&#8217;s fifty years old,&#8221; Google founder Larry Page said recently. &#8220;Like, it&#8217;s before the internet.&#8221; He went on: </p><blockquote><p>Maybe we should set aside some small part of the world, you know, like going to Burning Man, [that would serve as] an environment where people try out different things, but not everybody has to go. And I think that&#8217;s a great thing, too. 
I think as technologists we should have some safe places where we can try out some new things and figure out: What is the effect on society? What&#8217;s the effect on people? Without having to deploy it into the normal world. And people who like those kinds of things can go there and experience that.</p></blockquote><p>It&#8217;s not only Page. Jeff Bezos and Elon Musk dream of establishing Learyesque space colonies, celestial Burning Mans. Peter Thiel is slightly more down to earth. His Seasteading Institute hopes to set up floating technology incubation colonies on the ocean, outside national boundaries. &#8220;If you can start a new business, why can you not start a new country?&#8221; he asks. &#8220;The reason the seasteading question&#8217;s been so interesting is that a lot of people do think that we can do much better as a society. And if you run the thought experiment &#8212; <em>could we be doing things better in our society? &#8212; </em>people may disagree on the particulars, but an awful lot of people think things can be done dramatically better.&#8221; </p><p>The institute has even come up with a nifty retelling of history to explain how its colonies will, in short order, raise the poor out of slums and into luxury high-rises:</p><div id="youtube2-50qXvXMn1Y4" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;50qXvXMn1Y4&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/50qXvXMn1Y4?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>In a notorious <a href="http://www.youtube.com/watch?v=cOubCHLXT6A">speech</a> last fall at the Y Combinator Startup School, bitcoin miner Balaji Srinivasan also channeled Leary when he called for &#8220;Silicon 
Valley&#8217;s Ultimate Exit&#8221; &#8212; the establishment of a new country beyond the reach of the U.S. and other failed states. &#8220;[When] a company or a country is in decline,&#8221; he explained, &#8220;you can try Voice, or you can try Exit. Voice is basically changing the system from within, whereas Exit is leaving to create a new system, a new startup.&#8221; </p><blockquote><p>We&#8217;re a nation of emigrants: we&#8217;re shaped by both Voice and Exit, starting with the Puritans. You know, they fled religious persecution, the American Revolutionaries which left England&#8217;s orbit. Then we started moving west, leaving the East Coast bureaucracy. . . . What do I mean by Silicon Valley&#8217;s Ultimate Exit? It basically means: build an opt-in society, ultimately outside the US, run by technology. And this is actually where the Valley is going. This is where we&#8217;re going over the next ten years. . . . The best part is this: the people who think this is weird, the people who sneer at the frontier, who hate technology &#8212; they won&#8217;t follow you out there.</p></blockquote><p>The Kesey bus dead-ended somewhere in Mexico, its actual and allegorical gaskets blown. The Google bus continues on its circuit between the City and the Valley, an infinite loop of infinite possibility.</p>]]></content:encoded></item></channel></rss>