It’s hard to remember—impossible, if you’re under thirty—but there was an Internet before there was a World Wide Web. Experts at their computer keyboards (phones were something else entirely) chatted and emailed and used Unix protocols called “finger” and “gopher” to probe the darkness for pearls of information. Around 1992 people started talking about an “Information Superhighway,” in part because of a national program championed by then senator Al Gore to link computer networks in universities, government, and industry. A highway was different from a web, though. It took time for everyone to catch on.
The Internet was a messy joint effort, but the web had a single inventor: Tim Berners-Lee, a computer programmer at the CERN particle physics laboratory in Geneva. His big idea boiled down to a single word: links. He thought he could organize a free-form world of information by prioritizing interconnections between documents—which could be text, pictures, sound, or anything at all. Suddenly it seemed everyone was talking about webpages and web browsers; people turned away from their television sets and discovered the thrills of *surfing the web*.
It’s also hard to remember the idealism and ebullience of those days. The world online promised to empower individuals and unleash a wave of creativity. Excitement came in two main varieties. One was a sense of new riches—an abundance, a cornucopia of information goodies. The Library of Congress was “going online” and so was the Louvre. “Click the mouse,” urged the New York Times technology reporter John Markoff:
There’s a NASA weather movie taken from a satellite high over the Pacific Ocean. A few more clicks, and one is reading a speech by President Clinton, as digitally stored at the University of Missouri. Click-click: a sampler of digital music recordings as compiled by MTV. Click again, et voila: a small digital snapshot reveals whether a certain coffee pot in a computer science laboratory at Cambridge University in England is empty or full.
At the same time, the Internet seemed to promise new freedom, a breaking of corporate shackles, a chaotic counterpoint to the uniformity of what was soon to be labeled “old media.” As personal computers acquired modems and their users felt the urge to connect, privately owned information services emerged, like America Online, charging customers for access and offering a fixed menu. But the Internet was different: multifarious, decentralized, and democratic, “like a vast television station without programmers or a newspaper without editors—or rather, with millions of programmers and editors,” as one eager newbie put it in 1994.1 It was meant to be the heyday of the amateur, as celebrated in Amateurs!, an insightful exploration by the British writer and artist Joanna Walsh. Outsiders, unpaid and uncredentialed, came into their own. Peter Steiner’s 1993 New Yorker cartoon, “On the Internet, nobody knows you’re a dog,” became one of the most quoted in the magazine’s history, and that dog was happy.
Berners-Lee took justified pride in his World Wide Web, and still does. “Early web culture was so delightful,” he writes in his engaging memoir, This Is for Everyone. “This organic, emergent structure of the early web was a fragile thing of beauty and I was greatly impressed by it. Unfortunately, it doesn’t much resemble the web of today.”
That is an understatement. We know that any Internet chronicle will take a dark turn. Here are some of the things the optimists failed to foresee: the erosion of privacy and, as Berners-Lee writes, “the industrial-scale harvest of user data.” The emergence of ruthless giant corporations—Google, Meta, Amazon—mightier than nation-states. The creation of a powerful new oligarch class. The collapse of the aforementioned old media; the loss of a consensus reality; the rise of clickbait and deepfakes. “The utopian para-universe of the early net didn’t pan out,” observes Walsh—another understatement.
These authors—Berners-Lee, Walsh, and the blogger, novelist, and activist Cory Doctorow—were more than spectators; they played active parts in the evolution of the online world. Their vantage points differed widely, and naturally their disillusionment comes in different flavors. Doctorow’s is the most pungent; his term enshittification became a buzzword in 2023 and has achieved its own entry in Wikipedia. “All our tech businesses are turning awful, all at once,” he writes. “We remain trapped in their rotting carcasses, unable to escape.” Yet none of these authors has given up hope. Is it too late to salvage some of the Internet’s early promise?
For Tim Berners-Lee, computing was the family business. His parents were mathematicians at the center of the budding British computer industry in the 1950s: “Mum wrote binary code with a tape punch,” he writes; that is, she punched holes in long rolls of paper to represent ones and zeros. Computing was a small world. They got to know the mathematician and code breaker Alan Turing when he was trying to program their company’s first product—a five-ton mainframe computer with four thousand vacuum tubes—to play chess.
With coding in his veins, Berners-Lee attended Oxford University. Computer science was not a recognized subject, so he studied physics instead. In 1980 he took a job at CERN, the great complex of buildings and underground particle accelerators on the border between Switzerland and France. The tunnel that now houses the Large Hadron Collider, the world’s largest, seventeen miles in circumference, was under construction. The staff numbered more than three thousand and hailed from more than twenty countries. By then computers had appeared all through the complex, controlling machinery and storing data: “minicomputers” the size of refrigerators occupied the machine room; others were connected in local networks. A typical terminal displayed twenty-four lines of eighty characters and saved programs to eight-inch floppy disks. Berners-Lee’s division was called Data and Documents, and CERN had digital data and documents in a miscellany of formats and languages, shared among an ever-shifting arrangement of groups and networks. He saw this sprawl as a problem needing new ideas.
Computer scientists, like bureaucrats, tend to think in terms of hierarchical structure: directory trees and organization charts; documents in containers; files stored in folders. This offended Berners-Lee’s intuition about information: that what matters is not objects but relationships. For him the interconnections—links both ways—were paramount. “I was proposing…to free those documents—essentially to dump the files from their folders onto the floor,” he writes. “What you wanted, instead, was to encourage new and unexpected relationships between pieces of information to flourish. And, to do that, you had to let the users make those connections, in any way they saw fit.”
The diagram in his first project proposal, dated March 1989, was labeled “Mesh.” He decided he needed a better name and settled on World Wide Web, because he liked the abbreviation. Then he began proselytizing. Beyond the walls of his organization the global network of networks was taking shape, and it, too, had data and documents. “CERN is a model in miniature of the rest of the world in a few years’ time,” Berners-Lee wrote. “CERN meets now some problems which the rest of the world will have to face soon.” His memo fascinated some of his colleagues and amused others, but the World Wide Web project fit nowhere into Berners-Lee’s actual responsibilities, nor into the mission of a European taxpayer-funded particle physics lab.
Nonetheless his supervisors seem to have tolerated him—an excitable fast-talker, “full of fizz,” with an offbeat passion project. He set about making a system that would have practical utility for his colleagues at their far-flung workstations. He programmed what we now call a web server and a web browser; he created a language for hyperlinks and addresses, URLs, like https://www.nybooks.com. He confronted a chicken-and-egg problem: no one had any reason to share information in the form of webpages, because no one had a web browser; and no one had any reason to use a web browser until there were webpages to visit. The catalyst for mass adoption at CERN—the “killer app”—turned out to be the phone book. The up-to-date laboratory directory resided on a mainframe computer; logging into it was a nuisance. So instead, by 1991 a thousand CERN researchers were using his crude web browser to look up phone numbers. Berners-Lee hosted the world’s first webpage on a PC in his office. It was titled “The World Wide Web project,” and it featured a series of links, including one called Frequently Asked Questions. The computer itself featured a warning notice in red marker: “This machine is a server. DO NOT POWER IT DOWN!!”
He started logging “hits” on his server; as word spread, some of these came across the Internet from outside CERN. By the end of the year he counted a hundred a day. It was a thousand a day before the end of 1992 and ten thousand in 1993, and other people set up web servers of their own; everyone started advertising www.this and www.that, and now more than half the people on earth are users of the World Wide Web.
Berners-Lee gave the online world not just a technology but an attitude. Call it a credo or, as Walsh does in her philosophical exploration (via Kant, Schopenhauer, and Lacan), an aesthetic. It’s in the slogan he uses as his title: This Is for Everyone. Along with other Internet pioneers, he believed that the essential tools—shared protocols and software—should be available to everyone free of charge. No company or government should control the web—that was his vision. In 1993 he persuaded CERN to release all his source code to the public, relinquishing intellectual property rights and ensuring that any user could enjoy it, share it, and modify it.
Walsh is one of those users—part of a generation that could say (as she did in a previous book, Girl Online), “All the good things in my life have come to me through screens.” She, too, celebrates an egalitarian ideal. *We* built Internet culture; it’s ours. “I don’t like books that use ‘we,’ that extend the particular to the general, erasing the subtleties of individual lives,” she writes, but that *we* is essential to her project. She speaks for a presumed cohort of like-minded people, of the right age and class to have a shared experience of the Internet, from then to now. “Online, what we make, and make of ourselves, is experienced not only by whoever’s in front of us, but by anyone we allow to see (and some we don’t),” she says. This is a nice observation. She adds, “Online isn’t an unfamiliar experience any more; it’s where we live.” She means the people who are sometimes called consumers but who, for Internet culture, are also the creators. Her amateurs were liable to use the word aesthetic with particular pleasure and self-consciousness. She celebrates the aesthetic they created, and mourns it, and celebrates it again.
She barely mentions Berners-Lee, but he anticipated her aesthetic of the creative amateur. He, too, liked chaos—“anarchic jumble.” He deplored the supposed rationality of urban planners like Le Corbusier: “‘rational’ cities, which segmented neighborhoods by function and stripped buildings of detail and ornamentation.” His design for the web was an antidesign, refusing to impose particular structures, leaving space for unanticipated uses and possibilities: “I explicitly conceived of the web to be fractal, thumbing my nose at this kind of false ‘rationality.’” It would evolve, making connections, opening portals, and encouraging creativity. Doctorow remembers it as “a wild and woolly internet, a space where people with disfavored views could find one another, offer mutual aid, and organize.”
Berners-Lee’s memoir serves as a genial potted history of the Internet. He seems to have been everywhere and met everyone. Making an early appearance is a college student at the University of Illinois at Urbana-Champaign named Marc Andreessen. In 1993 he was an undergraduate learning to program—he earned $6.85 an hour writing Unix code at the National Center for Supercomputing Applications, on the Illinois campus. With another NCSA programmer, Eric Bina, he wrote a web browser they called Mosaic, intended to be simple and user-friendly, with versions for Windows and Macintosh PCs.
That was exactly what the world needed in this moment, when hundreds of thousands of PC owners discovered all at once, modems squealing, that they could “dial in” to “Internet service providers.” The NCSA, with funding from Al Gore’s program, backed the Mosaic browser with press promotion, and for a while it was so popular that people talked about being “on Mosaic” rather than on the Internet or the web. “Think of it as a map to the buried treasures of the Information Age,” The New York Times gushed. Hardly anyone remembers Mosaic now, the history of the Internet being a history of things that were incredibly hot for an incredibly short time.
Berners-Lee, who recalls a tense meeting with a truculent Andreessen in a campus basement, saw his free-for-all vision being co-opted. In short order, Andreessen graduated, decamped to Silicon Valley, and took the web browser private with his own Mosaic Communications Corporation. He settled an intellectual property lawsuit from the University of Illinois, changed the browser’s name to Netscape, and became one of the first Internet billionaires. He appeared on the cover of Time magazine in 1996 with bare feet and a lupine grin. Thirty years later, Andreessen is one of Silicon Valley’s most powerful venture capitalists, an enthusiastic backer of the current wave of AI and cryptocurrency. He is the quintessential technocrat, a proud captain of what he calls “the techno-capital machine.”2
To its users, the web browser was a lovely tool. To its owners, it was a platform—a means of control, a system that locked users in and monitored their behavior. Microsoft, late to the Internet, caught up and countered Netscape with a browser of its own, Internet Explorer. The period became known as the browser wars. The browser acquired more and more features—for playing games, watching videos, signing forms, and most of all buying stuff, ideally with a single click. There was money to be extracted, data to be harvested.
In the most profound way, Andreessen was Berners-Lee’s nemesis, but it’s not Berners-Lee’s style to get mad. That’s more Cory Doctorow’s thing:
The internet is getting worse, fast. The services we rely on, that we once loved? They’re all turning into piles of shit, all at once. Worse, the digital is merging with the physical, which means that the same forces that are wrecking our platforms are also wrecking our homes and our cars, the places where we work and shop. The world is increasingly made up of computers we put our bodies into, and computers we put into our bodies. And these computers suck.
What Doctorow means is that the bright, shiny objects of the Internet have become spy tools, surreptitiously collecting information about us—our habits, our desires, our health, our political inclinations—and using it to manipulate our behavior. The platforms that appear to serve users hungry for information—and did serve them, at first—now go to extreme lengths to seize attention. Algorithms designed to maximize “engagement” amplify anger and sensationalism at the expense of truth.
Novel platforms emerged and swelled in overlapping sequence: the browser, the search engine (Google), the social network (Facebook, Twitter), the megastore (Amazon). Before all of these, before any dream of the Internet, the proto-platform was the Bell System—the American telephone network, a monopoly operated by the world’s most powerful corporation. The Bell System left nothing to chance and nothing to the user. It owned the wires and the telephones. Customers were captive, and so were the ostensible regulators, for most of a century.
After the breakup of the telephone monopoly, the new platforms could not lock in users so absolutely. They had to resort to cunning. Case number one: Facebook, which Doctorow calls “a service that Mark Zuckerberg started in his dorm room so that he and his creepy pals could nonconsensually rate the fuckability of their fellow Harvard undergrads.”
He’s not wrong. But users loved it. They exchanged personal news and relationship statuses and music preferences and pictures. Zuckerberg’s was not the first social media service; oldsters may vaguely recall Friendster and then MySpace, which by 2006 had been snapped up by Rupert Murdoch. Facebook’s pitch to prospective users, as Doctorow mockingly renders it:
Has it occurred to you that MySpace is owned by an evil, crapulent, senescent Australian billionaire named Rupert Murdoch, and he spies on you with every hour that God sends?
Come to Facebook, where we will never spy on you.
Now, of course, spying on users is the essence of Zuckerberg’s business model. This is what the Harvard business professor Shoshana Zuboff has called surveillance capitalism, a project of behavior control, commodifying individuals’ personal experience and private information to target them with advertising and propaganda.3 In Walsh’s terms, creativity has been replaced by extraction. “Creators are back in the age of the patron,” she writes. Customers become unwitting captives: they have friends and followers, but only by sufferance of the platform; if they want to switch to a different service, they can’t take their network with them.
The ironies are abundant, and chief among them is that the early Internet thrived on cutting out the middleman. If people complained about the markup charged by their brick-and-mortar bookstore, the upstart Amazon promised to eliminate the overhead of shelf space, store rents, and clerk salaries and deliver the merchandise straight to their front door. Or straight to the eyeballs—cut out the printers and paper mills, too. The buzzword was disintermediation. Another master of disintermediation was eBay, connecting buyers and sellers directly, cutting out the antique dealers and flea markets. Napster did the same for music lovers, cutting out the record stores; it began enabling song downloads in 1999, operated for a year and a half, claimed 80 million users, and devastated the recording industry.
And now? The platforms are middlemen par excellence. They squeeze buyers and sellers alike. Music streaming services like Spotify and Apple Music say they aim to connect artists with their fans, helping music lovers find the music they love and helping creators find a livelihood; instead they use their centralized control to pay artists less than ever. Google and Facebook, dominating the global advertising market, have colluded to raise prices for advertisers while minimizing the revenue to websites that publish the ads.
Doctorow’s warning is urgent and his analysis is trenchant. Enshittification, as he sees it, has three stages. First, a platform lures users, providing real value free of charge and taking losses as necessary. Google offered a truly revolutionary search engine, a portal that seemed to fulfill the best of Berners-Lee’s vision. Facebook let users build communities. Twitter, when it began, was playful and fun, Doctorow writes: “It was a party the whole world was invited to.” Stage one, per Doctorow, is “good to users.”
Stage two is “good to business customers.” When Apple had sold enough iPhones, it could offer developers a ready market for its new App Store. The feedback loop of network effects kicked in: every new app in the App Store made the iPhone more attractive to users; every iPhone sold made the App Store more attractive to app developers. On social media, the business customers were those willing to pay to get their message into the feeds of users who had previously been able to control their own information experience. Facebook (“We will *never* spy on you”) monitored its users’ every click and expropriated the content they posted. In economic terms, it clawed back surplus from users and sold it to business customers.
In stage three, the business customers are squeezed in turn; the platform uses its access to their information to claw back surplus for itself. Amazon clones products sold by its merchants and undercuts their prices. It charges merchants fees to appear in searches—$38 billion a year for search placement alone. That, in turn, poisons the user experience. As Doctorow writes:
On average, the stuff at the top of an Amazon search results page is bad. It’s low-quality, high-priced junk…. The top-scoring items with the highest user ratings are often terrible but are garlanded with (paid) rave reviews.
Google, likewise, undermines the quality of its own search engine to prioritize paid results and increase the number of queries. In the final stage, users are stuck in the platform and getting less and less value, while “the merchants who rely on selling to us are stuck there, too, earning less and less from every sale.”
Enshittification represents the fulfillment of a vision laid out by Andreessen in a famous 2011 *Wall Street Journal* essay, still featured on his company website. “Software is eating the world,” he declared proudly. By then he was a major investor in Facebook, Twitter, LinkedIn, Skype, and many others. What he meant by “eating the world” was that Amazon had destroyed Borders, Netflix had destroyed Blockbuster, music-streaming giants were destroying record labels, and Google was “using software to eat the retail marketing industry.” He considered this to be good news.
But software doesn’t eat anything. Tech companies do, when they gain the power to use the levers of the information economy to consolidate and dominate.
Nothing could be further from the early hopes of Tim Berners-Lee; yet he remains an optimist by nature. “The difference between the enjoyable open web and more predatory social media aspects is largely a design issue,” he writes.
In the early days of the web, delight and surprise were everywhere, but today online life is as likely to induce anxiety as joy. By steering people away from algorithmic addiction, I hope we can reclaim that delight.
He urges us to walk away from Facebook and X in favor of “something pro-human”—decentralized platforms like Mastodon, a user-controlled, open-source alternative in the so-called Fediverse. One virtue of the Fediverse, as Doctorow also emphasizes, is that users can move freely from one server or community to another without losing their friends and followers. Berners-Lee suggests that new software protocols can return control of personal information to users. He also imagines, somewhat naively, that the much-touted advances in AI are “signs of spring.”
So far, the rush to AI seems to be embracing the same pathologies: deprecation of workers and creators; secretive closed standards; overheated marketing; and consolidation of power.4 The users are not in control; the strings are held by Google, Andreessen, Elon Musk, Larry Ellison, and Sam Altman.
In spite of everything, Doctorow, too, believes there is a path to a “cure,” a way to resist the rise of technofeudalism and bring back the best of “the old, good internet.” Users need to become aware of the tricks that lock them into platforms, and they need to break free. The government needs to use antitrust law to break the monopolies and regulation to prevent fraud and protect privacy.
The European Union, whose governments are not in thrall to the mostly American tech giants, has provided a plausible blueprint with its Digital Services Act and Digital Markets Act. They define “gatekeeper platforms” as pipelines—bottlenecks—between businesses and users, and they attempt to protect competition and privacy through regulations requiring transparency and accountability. During the Biden administration, the United States began to enforce antitrust law more seriously than it had in decades, due partly to the forward-looking chair of the Federal Trade Commission, Lina Khan, a vigorous antimonopolist who sued and investigated Amazon, Meta, and Microsoft. Biden’s Justice Department sued Google for creating an illegal monopoly in the advertising business and Apple for locking in customers and boxing out competitors.
Before Doctorow could finish his book, however, Donald Trump was elected to a second term. Doctorow had to do some rewriting. He retained some hopes—Trump had promised to clamp down on Google, Facebook, and even Twitter, when they were insufficiently deferential to his agenda. During the campaign, J.D. Vance went out of his way to praise Khan for “trying to go after some of these big tech companies that monopolize what we’re allowed to say in our own country.”
Reality has surely disappointed Doctorow yet again. Trump replaced Khan with a commissioner who is reversing her agenda. The antitrust case against Google ended in September with a whimper: the Biden administration had asked for a forced separation of the company’s browser business from its search business, but Judge Amit P. Mehta, having already declared Google a monopolist, backed down. “Here the court is asked to gaze into a crystal ball and look to the future,” he wrote. “Not exactly a judge’s forte.” And Trump has made his own kind of peace with the tech oligarchs: demanding personal obeisance and dispensing favors. Musk and Andreessen became full-throated and deep-pocketed supporters; Zuckerberg and Jeff Bezos donated to his inaugural festivities; Google and Apple executives have come to the White House as supplicants, bearing flattery and gifts. Amazon, Apple, Google, Microsoft, and Meta all joined the list of donors to Trump’s vanity ballroom project in the now-demolished East Wing of the White House.
We amateurs are going to need a work-around.