I
“There have been great societies that did not use the wheel, but there have been no societies that did not tell stories.”
—Ursula K. Le Guin
It’s 2006. I am an unpaid blogger without a day job, living at home with my parents. Freelance work has been slow. After smoking a bowl in my green Hyundai Accent with a dent in the front left bumper, I shuffle to the computer room and go through my bookmarked blogs. One of them (I don’t remember which one, probably BoingBoing) links me to a Henry Jenkins essay on the challenge of making video game narratives: how difficult it is to find the balance between good gameplay and good narrative. I have what blogger Lindsay Robertson of the now-defunct blog Lindsayism calls a “highdea.” What if the internet were to evolve into a narrative medium? Jenkins compares video games to cinema. In its earliest years, film was primarily meant to be an attraction: an electric vaudeville, or circus. There was no fourth wall; actors looked at the camera and performed their stunts, pratfalls and so on. My hope is that by the 2010s, the internet will prove to be as legitimate a medium for storytelling as cinema proved to be and as television is currently proving to be.
Almost 20 years later, I am two years sober and the internet has no coherence whatsoever, let alone any serious claim to narrative legitimacy. We are about to become a post-literate society. AI is triggering delusions. What’s more, people don’t seem to have lives to live, let alone write about. Heather Parry writes in Persuasion:
But we have grown up in the internet age, so all our characters speak in a very detached, ironic style where you can’t actually figure out what anyone feels about anything. Because there are no obvious emotions, you can’t really figure out an emotional change. Because there’s no action, and instead we have characters just saying what they think (but not what they feel), there’s little progression or tension, and you don’t see characters forced to make decisions. There’s no propulsion from one scene to the next, and you might get to the end of a story—or sometimes an entire novel—without anything having actually occurred. You’ve had lots of discussions in different bars (to show your character is tortured, right?) but no plot is to be found anywhere.
Instead of a thriving life of the mind, everyone seems stuck with a life in the mind. More than eight billion main characters on the planet, all in search of a story.
Why has the internet not taken off as a legitimate narrative medium? Why do we, after hours of browsing headlines with words like “fascism,” “AI doom” and “genocide,” conclude that “nothing ever happens”? The problem goes back further than the usual scapegoats (AI, smartphones, social media). It goes back further than even 2006. In his 2013 book Present Shock, Douglas Rushkoff introduced the term “narrative collapse.” His first application of the concept, to explain the collapse of the narrative of an optimistic future that surged during the nineties and crashed after the year 2000, is interesting, though I would argue, as Nadia Asparouhova does in Antimemetics: Why Some Ideas Resist Spreading, that Western society lost narrative cohesion after the Cold War, with the culture wars as the new focus, fragmenting into several smaller conflicts competing for attention. Rushkoff also tracks how narrative collapse played out in films and on television; it is this collapse that the world wide web was born into.
As life online grew, something else developed in the rubble of narrative collapse: many call it the “attention economy,” but that sells it short. Anyone who has been consistently online since 2009 has been drifting in a world that moves faster than thought, tracking and meeting desires both expressed and hidden. This is a dream economy because of the repressed desires that are indulged, but also because of how it feeds off a dull state of consciousness.
I’ve wanted to write this piece for a long time, although it had a slightly different tack when I started: the focus was on how the internet fails as a narrative medium because everyone has main character syndrome. True, but that’s like saying all those flights to Epstein Island were bad for the environment. Too big an understatement to write about. During Substack Summer, I was going to make the case that if, as The New Yorker said, the Great American Novel may come from Substack, then maybe the internet will finally be a conduit to narrative complexity and serious prestige. I wrote one paragraph, but was out of my wheelhouse: I had never written an infomercial before. That went in the trash bin. Months later, I was on the F train reading the writer Gurwinder on how social media shortens your life. I was particularly taken with the section on the amnesia it induces; how we retain nothing we read, watch or hear. Reading it after a long day of work at the hospital, I kept nodding off. When I came to, I tried to remember what I had been reading, and that was my epiphany: how many half-remembered internet dreams have I drifted in and out of for half my life? All that petty bickering, all those memes that were past their expiration date as soon as I backed out of the DMs, all those rambling five-hour podcasts about how there was nothing like The Comedy Store . . . no wonder the internet has not given us a Citizen Kane yet. Ironically, it took months for me to remember that Katherine Dee already likened online life to being in a dream state. One major distinction: she likens it to a mystical realm, albeit one that’s lost its charm. The dream economy I’m referring to is modeled on and corresponds to our everyday dreams, the ones governed by our basest desires.
It seems dumb worrying about how the internet has not progressed as a medium that can sustain a narrative, until you remember that, more than TV, movies or books, the internet is at the center of many people’s lives. It is how many communicate and how they hear news. I can’t remember the last time I was informed of a death offline. William James, who coined the term “stream of consciousness,” emphasized the central role of attention in the flow of that stream. When we focus our attention on stories, we create an opportunity to see life through someone else’s eyes. This is true for candid, confessional stories, but it also applies to stories from writers who prefer to pen labyrinthine conspiracies instead of thinly disguised autobiography. Stories require us to pay attention. The dream economy pays attention to us. It knows what we desire, even — especially — the desires we don’t explicitly demand. It doesn’t even want us focusing on one post for too long. As Rob Horning writes, “This infrastructure often prioritizes short videos and datafied responses over ‘immersion’ as a way to keep users on a platform and make those users into assets. Algorithmic feeds make digital switching seem like the only kind of activity.”
Networks wanted you to stop changing channels in the channel surfing era; platforms insist on constant, distracted scrolling. They don’t want you in one place for too long. To them, as to a shark, stasis equals death. In a dream, you’re in a subway one minute; then you turn around and you’re in a schoolhouse. Like that Buster Keaton gag in Sherlock Jr., where his projectionist character dreams he is in the movie he is projecting — one second he is on the street, the next, on the edge of a steep cliff. This is half the reason why no one other than a professional psychoanalyst wants to hear about your dreams: you’re droning on about something that only applies to your inner life, where you get everything you want, consciously or not. Much autofiction about internet life is similarly self-indulgent: often there is a dry list of random things the protagonist saw online. Blah blah blah, the internet is random, we get it. Stories like this will never deliver the “spiritual stupor” that Karl Ove Knausgård felt reading The Brothers Karamazov.
What if the internet were to evolve into a narrative medium? This may have come from a literal pipe dream I had two decades ago. Though I stopped smoking pipes, I am not done chasing stupors.
II
Derek Thompson writes:
In his 1974 book Television: Technology and Cultural Form, Raymond Williams wrote that “in all communications systems before [television], the essential items were discrete.” That is, a book is bound and finite, existing on its own terms. A play is performed in a particular theater at a set hour. Williams argued that television shifted culture from discrete and bounded products to a continuous, streaming sequence of images and sounds, which he called “flow.” When I say “everything is turning into television,” what I mean is that disparate forms of media and entertainment are converging on one thing: the continuous flow of episodic video.
Television might not have caused narrative collapse but, being what Marshall McLuhan called a “cool” medium, it provided the perfect conditions for it. As Thompson notes, the “inwardness” and “sustained attention” needed for complex thought, let alone for complex narratives, atrophied in the television age. In his chapter on narrative collapse, Rushkoff pinpoints the moment on the timeline of TV history when the dam broke and narrative collapse became an imminent threat: when remote controls and video game systems appeared in living rooms during the 1980s.
Yes, in the seventies, television had continuous flow, but broadcasts stopped after midnight. It was the decade after that introduced us to 24-hour news and entertainment. Remote controls turned everyone into emperors in their own living room coliseums. If a show required too much patience? Thumbs down: CLICK! The sitcoms and detective shows on broadcast networks now had more competition from specialty cable channels like CNN, ESPN and MTV, which offered a constant stream of news, sports and music respectively. Ratings stunts were pulled left and right to stay afloat: every sitcom had a very special episode (usually about the cute kid being offered drugs); movie stars, wrestlers and even the First Lady would make cameo appearances (announced a week ahead of time on the news), regardless of their relevance to the plot.
Though it is tempting to shoehorn in the Video Game Crash of 1983 and how, in an unlikely turn of events, it was difficult to sell video game consoles in 1985, there is enough disorientation online as it is: when video games became ubiquitous, for the first time, from the comfort of your own couch, you could interact with the characters you saw on the television screen. It is baffling how many diagnoses of screen-based psychosis omit this moment entirely. Before the Atari 2600 and Nintendo Entertainment System (the Magnavox Odyssey was a failure in the Seventies), the television screen was impenetrable to viewers. There was a strict hierarchical ladder if you wanted to be on the other side of the glass: the highest rung of course reserved for stars and heads of state, the lowest rungs reserved for local news interviews with eyewitnesses, dancers on a youth music program like “American Bandstand” or “Soul Train,” and call-in viewers on live talk shows. In the newly dawned video game era, you could be Indiana Jones or the prizefighter who knocks out Mike Tyson.
The roots of narrative collapse can be found here. While stories offered different perspectives on life, video games promised an alternate life entirely! Importantly, not just a different way of interacting, but one where the player, the main character, was the hero. Main Character Syndrome has its roots here. As Chuck Klosterman said in a 2006 Esquire piece on how there is no great video game criticism: “what makes video-game criticism complex is that the action is almost never static. Unlike a film director or a recording artist, the game designer forfeits all autonomy over his creation — he can’t dictate the emotions or motives of the characters. Every player invents the future.”
Though the eighties contained the seeds of narrative collapse, and though major metanarratives like religion had collapsed by this time, there was still the Cold War. One reason eighties blockbusters are so beloved is that they catered to a patriotic populace united in solidarity against the Russians. Comedies like Stripes, sports dramas like Rocky IV, teen action films like Red Dawn and Iron Eagle were all textbook lessons in how to serve up popaganda.
Francis Fukuyama said the triumph of liberal democracy was the end of history. I’m not sure he thinks that’s true anymore, but it does coincide with the beginnings of true narrative collapse. Rushkoff cites several early examples from the nineties of shows that evinced the tumult. The three I want to highlight are Beavis and Butthead, Mystery Science Theater 3000 (aka MST3K) and The Real World, with a detour along the way for Space Ghost Coast to Coast, a show he doesn’t mention.
Before autofiction stories cataloged the characters’ online activity, Beavis and Butthead and MST3K made the main characters television and film viewers respectively. Beavis and Butthead, two teenage burnouts, would watch music videos and comment on them, often going on wild digressions, some obvious (sex, kicking butt), some unexpected (Butthead remarking on Queen Latifah’s transition from music to talk shows while watching a Clash video). Every MST3K episode had a janitor trapped in space watch a terrible movie with robot companions he created to make wisecracks with. Both shows marked the beginning of commentary overshadowing the work being commented on.
“Space Ghost Coast to Coast” went beyond commentary: the creators took old clips from the original Hanna-Barbera ‘60s cartoon series Space Ghost and, after ripping the titular hero, as well as his arch-nemesis, Zorak, from stories about intergalactic crime fighting, put them in a studio to interview bewildered celebrity guests. More than even Late Night With David Letterman, Space Ghost Coast to Coast was the quintessential anti-talk show. Letterman might have mocked his guests or been bored with them, but he had the courtesy to pay attention to them. Space Ghost would often ignore the celebrity he was interviewing, either bantering with Moltar (another old villain, repurposed here as the show’s director) or arguing with Zorak (the bandleader in this context) about inane things. The blood-brain barrier was broken: now talk-show hosts had the short attention span that viewers had had for decades.
My unapologetic love for the unabashed idiocy of Beavis and Butthead and Space Ghost Coast to Coast is a glaring contradiction, to be sure. For second-wave Gen-Xers and first-wave Millennials, television cartoons were as dangerous as punk rock. From Present Shock:
Deconstructed in this fashion, television loses its ability to tell stories over time. It’s as if the linear narrative structure had been so misused and abused by television’s incompetent or manipulative storytellers that it simply stopped working, particularly on younger people who were raised in the more interactive media environment and equipped with defensive technologies.
A central paradox of the nineties: at the same time that capitalism was the clear victor of the Cold War, there was a veritable renaissance of anti-corporate scorn. As W. David Marx notes in his new book Blank Space: A Cultural History of the Twenty-First Century, Pearl Jam had the #1 album in 1993, Vs., without releasing a music video. They also fought Ticketmaster before it was trendy. The pop-rap of MC Hammer and Vanilla Ice got buried by outlaw anthems from Dr. Dre and Snoop Dogg. Meanwhile, Beavis and Butthead bit the hand that fed them, roasting some of their network’s biggest videos, while Space Ghost, instead of asking Radiohead lead singer Thom Yorke about his process, sang a nonsensical song about a knife.
Cartoon Network’s golden age, especially its storied Adult Swim programming block, began with Space Ghost Coast to Coast. Meanwhile, MTV suits in the early Nineties had come to a scary realization: the remote control was slowly killing their business model. By 1992, there were other music channels, not to mention other channels, some of which had music video shows. If you didn’t want to hear Right Said Fred again, just switch to MuchMusic, CMT or BET for another song. MTV had no choice but to create original programming.
The plan was to make a soap opera about young, sexy twentysomethings. But MTV had to pay writers and actors. This was how programming worked before 1992, except on MTV. They had commercials (for Pepsi and Jordache) in between commercials (the music videos for Bon Jovi and Whitney Houston). Paying for talent was a foreign language to them. From this came the decision to make a reality soap with non-actors who were paid below scale and worked without a script: The Real World was a hit that lasted 33 seasons.
This led to one of the greatest revelations in capitalist history: people will gladly accept attention as payment, not just money. Especially young people. May 21, 1992, the day The Real World pilot aired, was the birth date of the modern attention economy. The older iteration, which entails making money off capturing people’s attention for advertising dollars, goes back to the thirties with radio shows like “Amos ’n’ Andy.” This economy is still around, but the modern attention economy, where people will work for slave wages, or even no wages, to win the public’s attention, goes back more than 30 years, before short-form video, smartphones, social media, even the commercial world wide web itself.
The Real World was not the first reality show. That distinction belongs to An American Family, which first aired on PBS in 1973. Cops was in its fourth season in 1992. But PBS relied on donations, not ratings, and Cops was free advertising for police departments: a clever way to juice fundraisers. Thanks to MTV in 1992, the industry realized that it could use reality programming as a battering ram against unions. Stories are too expensive because writers are too expensive.
Having a show without writers is difficult if you just stick attractive people in the same apartment together, which was why, in season two, some roommates were cast because they were prone to conflict. In The Attention Merchants, Tim Wu identifies the moment the troublesome comedian David Edwards gets kicked out of the house as the beginning of the “kicked off the island” trope that is now standard for reality TV. In other words, in a world without writers, without stories, you need outrage, voyeurism, gossip. The titillation of watching Edwards get dismissed is identical to that of watching a washed-up movie star get shamed for sending embarrassing texts.
But if there is conflict, isn’t there a story? A plot is not the same thing as a story. Read the Wikipedia article for William T. Vollmann, then read Alexander Sorondo’s widely celebrated profile of Vollmann for this very magazine. To say they are the same is like mistaking a tombstone for a family album.
It wasn’t just the entertainment world that was affected by the fallout of narrative collapse. Journalism in the nineties devolved from legitimate reporting to tabloid sensationalism that became indistinguishable from a tawdry soap opera. From Ross Benes’s book 1999: The Year Low Culture Conquered America and Kickstarted Our Bizarre Times:
The launch of 24/7 news channels CNN, Fox News, and MSNBC turned real-life news stories into instant made-for-TV dramas that spawned endless coverage. O. J. Simpson’s murder trial, Bill Clinton’s affair with Monica Lewinsky, Pee-wee Herman masturbating in a porno theater, Princess Diana’s tragic death, child beauty queen JonBenet Ramsey’s unsolved murder, and Tonya Harding’s coordinated attack on fellow ice skater Nancy Kerrigan received heavy rotation.
In the nineties, narrative collapse was mostly contained to television, a medium that had been struggling for decades to gain respect, prestige or legitimacy. Music was innovative and lucrative that decade; independent movies revived the New Hollywood spirit, many of the films playing with narrative convention without sacrificing good storytelling; literature gave us Infinite Jest, American Psycho, The Secret History and Fight Club. I am old enough to remember that for most of the nineties, being on the internet made people think you were smart. You were respected for not channel surfing but surfing the web instead. Though the internet might have already made some people afraid, television was still the primary locus of cultural rot. Between 1975 (when shows like Happy Days eclipsed All in the Family) and 1998, everyone took for granted that, with few exceptions, everything on television was shit.
The Sopranos debuted on HBO in 1999, ushering in the Peak TV era and proving that smart, engaging stories could be told on television. HBO’s slogan at the time was “It’s Not TV. It’s HBO.” The shame that must have stirred in that copywriter’s heart. Imagine “It’s Not Film, It’s A24” or “It’s Not Music, It’s Spotify.” Even in the current degraded state of Hollywood and pop music, those would still read as insults to the respective companies.
As Sorondo brilliantly observes in his profile of novelist Mark Z. Danielewski, shows like The Sopranos were popular because they were released on DVD, making them accessible to people who could not afford premium cable. To put a finer point on it, watching these shows on DVD slowed the flow of television. Watching The Sopranos or The Wire on DVD meant no flipping between channels. Complex storylines were possible now; the forgotten episode was no longer a problem, just play it again, perhaps with subtitles this time. Before this moment, television was meant to be cheap and disposable, which was why it took so long to evolve. Marion Stokes, a television producer, recorded hours of TV news footage from 1977 until her death in 2012 (the Iran hostage crisis of 1979 strengthened her resolve to carry on with the project, to preserve stories and threads the media might want the public to forget). The advent of DVDs helped make reliable television archives a reality. This is as significant as libraries were to the history of literature, or as old movies playing on TV were to developing New Hollywood directors (and as video stores were to the indie director movement of the nineties).
The shift from DVDs to streaming would have consequences that will be discussed later.
As television shows became more sophisticated, movies underwent catastrophic damage from narrative collapse. Here I respectfully depart from Rushkoff’s study of narrative collapse again. I would not cite The Sopranos or Pulp Fiction as examples of post-narrative works. While Rushkoff claims Memento is an example of narrative collapse at the multiplex, I associate that collapse with franchises like the Marvel Cinematic Universe or the Harry Potter/Fantastic Beasts universe. He illuminates how a TV series like Game of Thrones is more focused on worldbuilding than any sort of climax or resolution, yet omits how this same focus has eroded mainstream cinema’s narrative legitimacy since the early 2000s. Like the video games the audiences for these films grew up on, these CGI-injected tentpoles promised a different life in a fantastic realm. Like the computers the audiences were using at the time, a film series would not come to a conclusion; it would go on until it got so shopworn that it needed to be rebooted. Franchises never ended now; they just got upgraded with new casts and crews. As I’ve said before, the idea of a packed theater of grown-ups watching a film about adults not frantically searching for stones or rings seems as quaint as Victorian adults gathering to watch a lecture. This level of infantilization, combined with the proto-gossip trap of reality TV, is where we see the greatest devastation caused by narrative collapse.
This bizarre allergy to stories ending does not bode well for our collective psyche and spirit. Ted Gioia, the cultural critic and leading Substack writer, sees a future where the Star Wars and Marvel franchises extend themselves into self-parody. I agree. What a shame that all that jumping, running, punching and shooting probably won’t end with an ambiguous, controversial, unsatisfying conclusion like Lost or The Sopranos did, but with an underwhelming cliffhanger trailing off into nothing because by that point the whole damn circus is too damn expensive to bring back into town again (unless it’s big enough to rebrand of course).
Here’s what they don’t tell you: unsatisfying endings are often better than tidier ones. The satisfaction a tidy ending brings often feeds the viewer’s self-satisfaction, confirming their world view, reassuring them like a mother kissing a skinned knee. I prefer the shock I felt when I finished the Freaks and Geeks Season One DVD and realized that Nick and Lindsay do not get back together. I would drink beer and watch the series again, tracking what went wrong like a former high school quarterback rewinding the footage to see how he missed his opening at the final game thirty years ago. Perhaps the ending stung, but it felt honest in a way television never does anymore.
By this point, you’re understandably irritated or angered by my preoccupation with narrative collapse. Shows and movies don’t tell strong stories any longer, so what? It doesn’t seem to matter, unless you factor in that nobody has mythology or religion to hang their hat on anymore.
But they still have their dreams.
III
It’s 2009. Barack Obama is inaugurated. He’s the first Black President in U.S. history. America can’t abridge the ugly chapters from its history, but the youth can — the youth are — and they’re writing a New Testament and spreading the good news. Racism is America’s original sin, but it will all be cleansed by a new generation. Baptism by new blood.
This was the new metanarrative being sold, anyway. It was meant to replace the increasingly flimsy one of the War on Terror. As comedian David Cross said (in 2002, when criticism of President George W. Bush was mostly silenced), “You cannot win a War on Terrorism. It’s like having a war on jealousy.” Obama’s victory was an impressive rebrand for the country — arguably the first successful social media campaign. Social was to the 2008 election what blogging was to the 2004 election. While old racist Boomers drooled in the flickering glow of Fox News, the first generation of the new millennium brandished their smartphones as soft weapons for guerrilla warfare against the hegemony of white patriarchy.
With all the analysis of how social media got Obama across the finish line, you don’t see much about the timing of his run. If he had run in ’04 instead of Kerry, it would have been a bloodbath at the polls. The Iraq War wasn’t widely unpopular yet. Americans generally believed Bush should stay in the driver’s seat. He couldn’t have won his first term in ’12 either. That would have been after 2009.
A month after Obama was sworn in, Facebook introduced a new feature: the like button. Thus, on February 9, 2009, the dream economy was born. Yes, the new button drove the modern attention economy, active since The Real World, into overdrive. Attention seeking became infectious. Until then, you had to be an extrovert to make a YouTube video, especially if you were in front of the camera. You had to be pretty confident if you thought your tweets could compete in the same feed as the ones major celebrities posted. Facebook introduced this new functionality after most people under the age of 40 had already joined the network. One day, they were sharing recipes, jokes, pictures of their aunt’s birthday, etc. The next, they were getting approval for it, leading many to wonder: if my green bean casserole picture is getting four likes, what will get me ten?
Digg, Reddit and YouTube had similar functionality before the like button’s debut, but their approval metrics were used to show everyone the same front page, with the most popular links and videos, like a supermarket that highlighted the hottest items, boasting they had them in stock. The true precedent was Amazon’s recommendation algorithm. Reviews and purchases were used to recommend products that you might also want to buy. Except now this logic was being applied to ideas, to life events, news stories, jokes, etc. In the beginning, the most popular posts in the friend group appeared on the news feed. Over time, it was populated with content that you liked. The feed wasn’t officially algorithmic until 2011, but the like button was the primitive method used to build that algorithm’s data.
This was how social media assumed its mantle at the center of everyday life for all humanity. Even in 2008, social media was primarily seen as a way to reach out to youths. Most news sources were primarily concerned with gaming the Google algorithm, appearing on the front page of search results. Now there was another game: getting articles shared on the Facebook news feed, which meant now everything was written with the youth in mind. BuzzFeed, once a deadpan, ironic project treating virality like a science, became a legitimate trade publication like Billboard or Variety. Users saw sites like MySpace, Digg, eBaum’s World, YouTube, Twitter, and of course Facebook as a fun timesuck. But there was a more substantial internet, with well-written blogs, many linking to news stories that might have flown under the radar, or to albums that were demanding. Users also spent time there. The Facebook like button changed all that. Your posts needed to be popular. If your green bean casserole picture got two likes, time to make a jelly bean casserole. The world outside of social media was unimportant because your ego wasn’t on trial anywhere else.
Some might write off the dream economy as another term for “simulacrum.” The simulacrum seems quaint nowadays. A simulation was based on the outside world. This new realm was a perfect reflection of your own unconscious. Facebook collecting information on the world’s desires was like Lucifer Morningstar on the supernatural procedural Lucifer asking a suspect “What do you desire?”
During Obama’s first term, a hidden desire was unearthed on the news feed: racism. The dominant medium of a particular time has a supernatural tendency to reveal and satisfy repressed desires. During the Victorian era, repressed sexual desire led to big sales for Oscar Wilde and Algernon Charles Swinburne’s literary erotica. With peace signs everywhere in the sixties, violent films like Bonnie and Clyde and The Wild Bunch had high grosses. Commercial mass media in the 2010s might have insisted on the narrative of a pluralistic democracy, but Facebook brought the nationalist rage usually found on message boards like 4Chan into the mainstream and made more money off it than any of those anonymous forums.
As the dream economy evolved, it relied on a frictionless user experience. Though nobody realized it at the time, the web was better because it was slower. In the early 2000s, the internet wasn’t painfully slow like it was in the dial-up days, but patience was required. If you read something online and there was a link to a video, you had to download the video, typically using a stand-alone media player like QuickTime. If you wanted to hear an indie rock song from a music blog, you downloaded the mp3. It didn’t take forever, but it took long enough that when it was ready, you were curious what all the fuss was about. Podcasts were downloaded onto iPods, to be listened to later, some of them going as long as three hours. No video clips.
I tipped my hand just now. In December 2005, YouTube took over the world. It gained popularity with a viral video — a digital short from “Saturday Night Live” called “Lazy Sunday,” created by members of the popular internet sketch group Lonely Island. What made this viral video different from the others: if you clicked on it, you went to a website full of videos that you could search for, even upload content to. Downloaded videos did not lead you anywhere else. Nor did they play the instant you clicked on them in the browser window. The streaming era began.
Cut to 2011: Netflix makes users decide between the physical DVD model and the streaming model. Most choose streaming, and the DVD becomes yet another physical medium fetishized by collectors, like vinyl and the cassette tape before it. It is not a coincidence that the decline of the DVD is accompanied by the decline of prestige television. Once again, shows need to grab attention. Worse still, the archiving of classic television is as poor as it was before the home video era. Licensing is too expensive, so many great shows end up forgotten. This time, movies also suffer from neglectful preservation. For the first time since the seventies, a movie’s survival depends on the mercy of theatrical revivals and networks (in this case streaming networks) willing to show it. No more video store Tarantinos. Film and television are as ephemeral as YouTube.
In 2013, Netflix introduced original programming. Their House of Cards was created after the data revealed that viewers liked the British House of Cards as well as Kevin Spacey. Netflix was able to use data to create shows for major audiences and niche demographics. Unlike the TV networks, Netflix began releasing entire seasons on one day, instead of spacing out episodes on a weekly basis. Watching a show on Netflix is like entering a fugue state: you spend eight hours bingeing a show and it ends up a forgotten dream.
Despite narrative collapse and the first stirrings of the dream economy, there was forward momentum for stories online. Live storytelling shows like The Moth and Risk got traction online through videos and podcasting. Serial and S-Town were podcasting’s answer to New Journalism. Creepypasta manifested on message boards globally. Broad City, High Maintenance and Awkward Black Girl were adapted from web series to television (Awkward Black Girl became Insecure). There was a thriving alt-lit community online. Though much of it consisted of autofiction regarding non-events in a person’s life, some truly affecting work came out of this movement, especially Scott McClanahan’s stories of West Virginia life.
This all happened during the hypnagogic early stages of the dream economy, when it was easier to return to wakefulness. Facebook’s popularity was waning with American youth in the early 2010s. Snapchat was primarily focused on direct messages that vanished after they were opened. Twitter was bullish on the reverse chronological timeline, boasting its ability to track trending news stories as they happened. Weird but moving web series like High Maintenance had a chance to get widely shared on Twitter and Tumblr.
But, as Kyle Chayka clarifies in Filterworld, Instagram and Twitter, the two major social media sites that were driving culture forward, began using algorithmic news feeds in 2016. Netflix gave different viewers different home pages that fit their individual interests. Now the dream economy was in its deep-sleep stage. The algorithm swallowed the world. By this time the algorithm perfectly mirrored the users’ desires. Like buttons were not the only metric used to mirror them; now platforms could monitor how long someone lingered on a picture, how often they returned to a post. The dream economy now captured unconscious desires that weren’t publicly declared. This progressed for four years until the COVID lockdowns of 2020, when people were encouraged to stay home and hibernate, to cocoon. COVID was not a conspiracy by tech companies, but it benefited them in a way no other event ever would again. TikTok’s global domination began, with the greatest algorithm in history. Social media was not the center of life; it was life. Even celebrities, unable to work on shows and movies, posted videos on the feed from their homes.
There was one last snag in the machinations of the dream economy: the increased difficulty of receiving positive attention. Imagine being in a dream world where you only see people you desire, but getting them to notice you is harder than ever. That gap was closed in November 2022 with ChatGPT. In a sea of unfriendly cloutchasers on social media, the chatbot listened to you like no one else could. Better than your overworked friends, better than your mom with her fading memory. Maybe even better than your therapist. Like the Genie in Aladdin, you never had a friend like it. One minute it reassured you, the next it granted wishes, giving you images of anything you desired.
Being at the center of this world was much easier. Now everyone could be in their own dreamworld. What’s more, the dynamic of participation had changed. In this decade, posting declined in popularity. Now, agency was spent on guiding the algorithm to match your desires, especially on apps like TikTok. Everyone was a king who demanded entertainment for their court. We had come full circle: as Derek Thompson said, everything was television now. But this was a television guided by your deepest and darkest cravings.
Or your fears. Douglas Rushkoff wrote Present Shock as a response, or an update, to Alvin Toffler’s Future Shock. Toffler’s book warned everyone to future-proof themselves, lest they get caught unawares by the tidal wave of technological progress. Many in tech still hew close to this philosophy. Rushkoff maintained that the 21st century was in a state of present shock, where the future already happened and our focus was on minute-by-minute developments. This applied to the 2010s, but in this decade the world, afraid of global pandemics, environmental devastation, increased nuclear proliferation, genocide and impending AI doom, has been frozen in a state of what I would call “past shock.” We have been rapidly cycling through retro trends and reboots like the reminiscences we nervously recount at a relative’s deathbed.
The dream economy thrives in this environment of course. If humanity is dying, why not watch its life flash before our eyes? One of the deepest ironies of our age, puzzled over by many great thinkers: as our technology gets better, we get that much more stuck in the past. Our dreamscape consists primarily of recurring dreams. Half the reason we are stuck so far in the past is that we have amnesia about our recent past. Movies like Avatar gross billions, only to be forgotten. Much of the great indie rock from the 2000s has been lost because the blogs that publicized it have gone defunct and many of those songs remain on an iPod lying around somewhere. Daniel Falatko’s haunting novel The Wayback Machine follows this thread, with many of the protagonist’s memories of that time proving difficult for the hosts of the podcast he appears on to corroborate online.
Past shock does not happen only on a macro level, but also on a personal, subjective level. A trend that has been unique to this decade: extreme media immersion. “Sludge” videos use split screens to show multiple things at once: a streamer sounding off on Jeffrey Epstein in one frame; a person playing Minecraft in another; a Family Guy clip in the third. A more extreme example of media immersion, of course, is gooning, a form of extended masturbation using several screens of porn, often compilations. From Daniel Kolitz’s memorable Harper’s article on the subject:
Some part of me, I’m saying, was convinced that these gooners were human. Almost certainly they had, in their lives, hugged grandparents, cuddled pets. Bundled in winter coats, they had sled down snowy inclines, secure in a legal guardian’s love.
Funny that all these moments he lists seem to recall childhood, because I believe, paradoxically, this extreme media immersion is meant to return them to a state most of us experienced in childhood, a state also felt by household pets, when a parent, caregiver or “legal guardian’s” voice acts as a balm, regardless of the words being said. The pets and the children can’t understand what is being said: the pure sound is enough. This extreme media immersion is extreme regression. The lack of focus, the desensitization, is the point. It is meant to numb anxiety, to stop the stream of consciousness that seems to be hurtling them toward a violently turbulent future.
IV
Great, so how do we awaken from the dream economy? A common suggestion is to stop using phones, or to quit social media. Look up from your phone and drink in the world around you. Look at what? Sephora? T-Mobile? Dunkin’?
As Katherine Dee once wrote on her remarkable blog Default.Blog, it is wise to look at your intentions, your mindset, before diving into cyberspace, which she likens to the astral plane. This is akin to Timothy Leary suggesting LSD users make deliberate choices about mindset and setting. Yet I have seen no one make this suggestion for digital detoxes. I can’t think of a screen addict who has not seen the message “stop scrolling” on their phone. Why can’t they stop? There are all these pink cloud testimonials of life beyond the matrix, but how long does this last? Is there any recidivism? What happens when all those emotions you’ve been numbing come roaring back?
In 2016, I went through media deprivation for a week, as suggested by Julia Cameron in her quasi-mystical workbook program The Artist’s Way. This didn’t just exclude scrolling. I also cut out TV, radio, books, movies, magazines, compact discs, all of it. I had no idea how much anxiety I was numbing. My emotional fluctuations were so extreme that, towards the end of the week, I tried to buy crack, more than 10 years after I stopped smoking it. An extreme example, to be sure, but to the point: quitting something is not enough. Even those who have only seen 12-step meetings on a TV show know that quitting drinking and drugging is literally the first step. How do you stay stopped?
Something you also hear in meetings that is not a warmed-over cliche is that the problem is not the substance, the problem is you. When Buddhism was invented on Bodhi Day, there were no movies, televisions or computers. But there was what is roughly translated in English as “monkey mind,” a mind that constantly chatters and judges. In The Science of Storytelling, Will Storr shows that the mind constantly frames a person’s life as a narrative, with that person as the main character of course. This is humanity in its natural, resting, tribal-biased state. Or, as Kevin Simler and Robin Hanson illustrate in The Elephant in the Brain, humans evolved to become better at talking than listening because they were constantly auditioning for the highest role in the tribe. I would conjecture this also explains why our minds tend to chatter internally more than reflect the world around us nonjudgmentally.
For most of human history, it was understood that the self-serving mental narrative would not be entertained, let alone catered to. As much as we preferred to chatter outwardly and inwardly, we needed to indulge other people’s thoughts. Over time, spoken stories evolved into epic oral poetry and plays. There was a bargain: instead of daydreaming about how you were going to get over, you listened to a story about how someone attained victory. There was wish fulfillment, to be sure, which made it dreamlike if you identified with the main character, but often you walked away with new perspectives, insights and ideas that you did not know you wanted. Printed literature expanded the potential for stopping the stream of the monkey mind’s chatter, or diverting it anyway, but for centuries it primarily peddled wish fulfillment fantasies in the form of chivalric romances. The first novel, Don Quixote, focused on a man whose daydreams about being a noble knight from these romances devolved into grandiose delusions that he was compelled to bring to life.
You can even say that as long as narrative works were being sold, there was a dream economy, which would explain why the narrative collapse of the nineties created the necessary vacuum for the modern dream economy to evolve. Needless to say, the machinations of the older dream economy were more rudimentary, not as sticky. Stories require concentration, which hurts their memetic potential. When this focus of attention is rewarded with a good story, it’s a new experience — often it feels life-changing. You frantically tell everyone they have to check it out. Today’s more potent dream economy does not require concentration. It also doesn’t require distancing, which is another benefit stories give us. Many of our most cherished stories can be cynically, lazily described as a heightened form of gossip. They are much more, but our tribal brains often let them in under that false pretense. All that money and Charles Foster Kane died alone? What a pity! But stories distance us enough from the action that we can move beyond the petty mindset of mere gossip. Moreover, they are created by people who often have enough chronological distance from the events they are based on (if they are autobiographical) to make something that is more than a therapy session.
One feature that the modern dream economy thrives on, and that is anathema to the world of narrative, is what Anna Kornbluh refers to as “immediacy.” There was a time when youths competed to create novels, movies and TV shows, but now they mostly compete to make the best content. Even those who want to create TV shows or novels need a large social media presence for any legacy media company to consider them. The Ankler article highlights how youths today talk about their lives on social media, but Lee clarifies that they are sharing fragments of their lives as they happen. In the nineties, if a customer at your job was rude, you scratched it into a journal and unearthed it years later, to become part of your memoir or semi-autobiographical novel. Now, the rude customer is broadcast on social media minutes after the moment happens, if not as it happens. Best case scenario, the whole thing goes viral and you have a niche of followers who watch you handle rude customers. Perhaps customers come in hoping you’ll serve them some of the trademark sass you served that impolite shopper. To clumsily paraphrase an old adage, why would anyone buy your TV show about the hellish life of a cashier who dreams of stardom when they can get the saltiness as it happens for free?
This is of course the synthesis of video games and reality TV: gamified reality, where the competition is too fast-paced and the stakes too steep to waste on the old ways of thinking, like contemplation, or even thought itself.
Past shock is driven by fear of a terrifying future, but another reason our gears are stuck in reverse is that, while culture as we have understood it (films, TV) is on the decline, AI slop and even the creator economy are hotter than ever. But both trade in ephemeral, forgettable content, so here we are, clinging to superhero films the way classical scholars once clung to Greek mythology.
Let’s not succumb to our collective short-term memory and forget about Substack Summer. In the earlier years of this decade, self-published nonfiction began to lose the stigma that it had in the blogging years, when only rare exceptions like Mark Fisher were taken seriously, or in the dreadful Medium era of the 2010s. Twenty Twenty-Five was the year that, after years of blithely ignoring self-published fiction, content to let it rot in genre prison, The New Yorker acknowledged its validity. Anecdotally, I can tell you that, while I wouldn’t deny Substack is a social media platform, my scrolling is different there. The Notes feed (Substack’s version of a Twitter newsfeed) often contains pithy quips and memes. But I seek articles and stories. When I find them, I save them for later. The stories I save for last. They demand my absolute attention. Alexander Sorondo’s experimental personal essay posts from his Substack Big Reader Bad Grades are reliably disarming sketches of life as a grocer/Uber Eats driver that make me feel more connected to the world around me. I want to look at the world through keener eyes. In May, Daniel Falatko’s elegiac piece about a beautiful barback he knew in Philadelphia was constantly getting reposted on my feed until I relented and read it. It stopped the stream of self-centered thinking temporarily to make room for the life of a young woman who is no longer with us. Another story I’ve read online that I haven’t forgotten is by Naomi Kanakia. It has one of those endings that might seem abrupt until you sit with it and its epiphany jolts through you. Where these works fall in the canon of great literature does not concern me; they sit on the loftiest of perches within the internet’s young history as a medium.
Literature now has the countercultural cool that movies and music once did (and that literature itself did in the fifties). Dimes Square’s literary moment may have come and gone, but Los Angeles is becoming an unlikely new center. Hollywood may not have had its most profitable year, but original, live-action films like Sinners and Weapons made the kind of impact at the box office that original films haven’t made in decades.
Narrative collapse is not narrative extinction. As is to be expected, metanarratives are still being pumped out, though they may be more covert now. The War on Terror narrative fell apart during the Iraq War; the progressive youth narrative collapsed during the 2010s; the “trust the science” narrative of 2020 may have inspired the most antipathy. So now the Silicon Valley PayPal Mafia are animated by a narrative that only insiders are privy to: we must use AI technology to conjure up demons, who ultimately run the market. They are not only trading in hidden narratives. A white supremacist cartoon created with the help of AI, The Will Stancil Show, is a huge hit on X. While right-wingers have made consistent progress countering liberal messaging on radio and online, this is the first time they have ever had a successful scripted narrative. Not only does this underline how important stories are, but it simultaneously reveals that the “truth crisis” of AI is beside the point: many are content to live in the dream world, determined never to wake up, especially if that dream is of how great this country used to be.
The most likely AI doom narrative is that we become overreliant on it and it does the thinking for us. As Paul Kingsnorth, inspired by Oswald Spengler, argues in his new book Against the Machine, much of the history of technology has had a Faustian undercurrent to it. ChatGPT sees its heaviest usage during the school year, when students use it to cheat and professors use it to grade. As I read more about this problem, I had a monstrous thought: I had to write in my journal later, but I was feeling too lazy; what if I told ChatGPT what happened that day and had it write it up in an interesting way? I got home that day and tried the experiment. Here are the results, with the parts completely fabricated by the chatbot taken out:
Today felt like three different movies stitched together — a workplace soap, a tiny personal victory, and an unexpectedly sad slow-burn at the end. Work started off loud and petty. The rumor mill is in full rotation: Alicia’s been telling people that Marlon threatened her. I didn’t see it happen. . . . I hate how quickly a single line of gossip becomes a currency — it buys people opinions and ruins the geometry of small interactions. I tried to stay out of it, mostly because I don’t know the truth, but it’s the kind of thing that sits on your shoulders all day and makes typing feel heavier.
On a much happier note: I ate a full meal today — like, actually sat and enjoyed food — and did not have to sprint to the bathroom afterwards. This might be the smallest thing anyone’s ever celebrated, but if you’ve been living in fear of your own stomach, you know it’s basically liberation. I kept waiting for the panic to kick in and nothing happened. Coffee, sandwich, a laugh, and continued functioning digestive tract. Tiny victory flag planted on the kitchen table.
Later, I met up with Ross. Ross has this way of making the day feel like it has a plot again — you leave the conversation lighter than when you walked in.
The day ended on a quiet, sad note. Syed told me his cat died. I hadn’t known how much he loved that animal until he started talking about it; his voice got soft around the edges and the apartment felt emptier through the phone. Losing a pet is losing a room in your life — all those little rituals vanished overnight. I sat with him for a while, mostly listening. It was one of those moments where you realize being present is the only meaningful thing you can offer.
So: ridiculous office rumors, personal triumph over my own body, laughter with a friend, and genuine sorrow for another. Not exactly a normal Friday, but honestly — I’ll take a day that makes me feel this many different things. Time to sleep and hope tomorrow is less dramatic and more boring (boring in the good way).
Amazing! Now I don’t have to go through the hassle of seeing the world through my own eyes anymore!
Ever since that day, I have resolved to stop using AI. I do not want to outsource my perception anymore. Now’s a good time to mention that you shouldn’t outsource your understanding of the world to thinkpieces either. On his podcast, Douglas Rushkoff introduced another concept: the insight junkie. Often when I read thinkpieces, they give me a tidy, hopeful conclusion and I saunter into the world with a more confident step. Well, let me break that spell: the answers you seek are not in here, they are out there. We are not only facing a literacy crisis with actual text, but with reading the world around us. I want to read more stories, but I also want to study the characters in my own life. I want to look below the Dunkin’ sign at the man holding the door for money. I want to look within my own mind, listen to my thoughts, read my own inner monologue more closely. I hope it’s worth the hassle.
Mo Diggs writes about tech, culture and legacy media. His Substack, Cross Current, looks at the intersection between the new media trends of today and the legacy media trends of yesterday. It has been mentioned in The New York Times and The Guardian.






