
How I Built a Bot to Write Full House Episodes


In late February, 13 new episodes of Full House appeared on Netflix, as Fuller House. Some saw it as excessive. Consider that the original series ran for eight seasons, from 1987 to 1995, totaling 192 episodes. Watching them all front-to-back would take roughly 80 hours, plus the years of subsequent therapy.

But personally, I don’t think it’s nearly enough Full House. So I made more. My buddy Sam Kronick and I got to work creating a program that would generate new Full House scripts, one per day, forever.

We’re not the first people to turn to America’s greatest renewable resource — the sitcom — as inspiration for our weird internet art-humor project. Over the past couple of years, old sitcoms — particularly those that aired in the ‘90s — have been a fairly common, reliable target for comedy writers, programmers, and artists. A number of weird works, from parody Twitter accounts to script-generating bots, borrow their aesthetic or use them as springboards for bizarre experimental art.

To name a few: the brilliant performance-art project Seinfeld2000 (2013); the Friends script generator Infinifriends (2014); The Jerry Seinfeld Program (2010); this mock-up for a Frasier SNES game (2015); the 15-minute opening-credit sequence Too Many Cooks (2014); and Nothing (2014), the sinister super-cut of empty establishing shots on Seinfeld.

These works were all produced independently of one another, but draw from the same well of canned laughter. It’s not a coincidence. As it turns out, the formulaic nature of sitcoms is an ideal source for bot makers. Each episode of a show is essentially the same day with the same characters repeating the same catchphrases, infinitely.

Tom Armitage, the U.K.-based designer responsible for Infinifriends (as well as other generative works, The Literary Operator and Markov Chocolates), backs this up. “The technique I was using — simple Markov chains — needed a decent-sized corpus to draw from,” he explained. “So a show with many episodes, ideally transcribed online, would be best.” Markov chains are a simple language model behind many contemporary bots. Essentially, they build new sentences one word at a time, picking each next word according to how often it follows the preceding few words in a source text. The richer that source text, the better.
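To give a rough sense of what Armitage means, here is a toy word-level Markov chain in Python. It’s a sketch of the general technique, not his actual Infinifriends code, and the file name scripts.txt is just a stand-in for whatever transcript corpus you have on hand:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map every run of `order` words to the words that follow it in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=50):
    """Start from a random key, then repeatedly append one of its observed successors."""
    key = random.choice(list(chain.keys()))
    output = list(key)
    for _ in range(length):
        successors = chain.get(tuple(output[-len(key):]))
        if not successors:
            break
        output.append(random.choice(successors))
    return " ".join(output)

# Hypothetical usage: scripts.txt stands in for your transcript corpus.
with open("scripts.txt") as f:
    print(generate(build_chain(f.read())))
```

The bigger the corpus, the more plausible the recombinations get, which is exactly why a show with 192 transcribed episodes makes such good raw material.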

And what’s richer than the original run of Full House? To make Fullest House, we took all of the show’s scripts, helpfully transcribed by the appropriately named Full House Forever (over at full-house.org, by far my favorite organization). The transcriptions had a few spelling errors, but we figured this just added verisimilitude. Sam combined all the scripts in one big text file, and fed that into an artificial neural network algorithm.
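The preprocessing step is about as unglamorous as it sounds. Something like the following sketch does the job, assuming a hypothetical transcripts/ folder with one episode per text file:

```python
from pathlib import Path

# Hypothetical layout: one transcript per episode in a transcripts/ folder.
episodes = sorted(Path("transcripts").glob("*.txt"))

# Concatenate everything into one big training file.
with open("input.txt", "w") as out:
    for episode in episodes:
        out.write(episode.read_text())
        out.write("\n\n")  # blank line between episodes
```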

At the risk of oversimplifying, artificial neural networks (ANN) are a powerful form of machine learning, modeled on the neural networks in our brains. They can be used to defeat humans at Go, or help determine one’s eligibility for a loan, or automatically label photos. Depending on the pool of data they work from, they can also be racist, or at least reflect deep institutional biases. Remember all those upsetting magic eye-style images that went viral in 2015? They were made using Google’s DeepDream neural network. Recurrent Neural Networks (RNN, a form of ANN) can generate weird versions of Shakespeare or Donald Trump speeches.

To generate text, this particular RNN is programmed to read an input (for example, Full House scripts), then generate an output based on patterns it notices. It compares its output with the original, noting the hits and misses, and adjusts itself accordingly (hence: machine learning), before creating yet another new version. It repeats this process over and over, thousands of times, until it starts creating fairly accurate approximations of the original. Unlike Markov chains, which work one word at a time with only a few words of memory, RNNs can work on a character level, actively learn, and consider fancy things like paragraph structure and formatting.
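We used an off-the-shelf character-level network, but the training loop looks roughly like the PyTorch sketch below. This is an illustration of the general technique rather than the exact code we ran: input.txt is the combined-scripts file from above, and the model size and step count are placeholder values.

```python
import torch
import torch.nn as nn

# A minimal character-level RNN: predict the next character given the ones so far.
class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.out(h), state

# Build a character vocabulary from the combined scripts (input.txt is hypothetical).
text = open("input.txt").read()
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

model = CharRNN(vocab_size=len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)
loss_fn = nn.CrossEntropyLoss()
seq_len, batch_size = 100, 32

for step in range(5000):
    # Grab random slices of the corpus; the target is the same slice shifted by one character.
    starts = torch.randint(0, len(data) - seq_len - 1, (batch_size,)).tolist()
    x = torch.stack([data[s:s + seq_len] for s in starts])
    y = torch.stack([data[s + 1:s + seq_len + 1] for s in starts])

    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))

    # The "noting the hits and misses, and adjusting itself" part:
    # backpropagate the error and nudge the weights.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if step % 500 == 0:
        print(step, loss.item())
```

After enough passes, the model’s guesses about which character comes next start to look like dialogue, stage directions, and scene headings.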

The process sounds complicated, but it’s really not that bad. It basically involves training the RNN program on a data set (whatever text file you want to work with), then messing with a few of the parameters. The art here is figuring out the appropriate balance between underfitting (the computer generates nothing but gibberish) and overfitting (you just spent a whole lot of time and resources creating a glorified copy-and-paste machine). This part is totally subjective, and we just clumsily messed with stuff until we were happy with the results. After that, you start sampling to generate new text.
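At sampling time, the knob that matters most is usually called temperature. Continuing the hypothetical sketch above (the model and stoi variables come from it, and the seed line and parameter values are placeholders), low temperatures make the model parrot the corpus, while high temperatures push it toward gibberish:

```python
import torch

@torch.no_grad()
def sample(model, stoi, seed="DANNY:", length=500, temperature=0.8):
    """Generate text one character at a time from a trained CharRNN."""
    itos = {i: c for c, i in stoi.items()}
    x = torch.tensor([[stoi[c] for c in seed]], dtype=torch.long)
    state, out = None, list(seed)
    for _ in range(length):
        logits, state = model(x, state)
        # Divide by temperature before softmax: values below 1.0 sharpen the
        # distribution (safer, more repetitive), values above 1.0 flatten it
        # (weirder, closer to gibberish).
        probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
        idx = torch.multinomial(probs, 1).item()
        out.append(itos[idx])
        x = torch.tensor([[idx]], dtype=torch.long)
    return "".join(out)

print(sample(model, stoi, temperature=0.8))
```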

Unfortunately for us, the first few iterations read less like Full House and more like Finnegans Wake. I’ve always wondered what a sitcom written by James Joyce might look like.

After a few more bizarre early iterations (featuring brief appearances of the new characters “Danky” and “Bendy”) we eventually arrived at a decent-enough version, in which everyone keeps hugging, walking up and down stairs, and apologizing. According to our computer, that’s the essence of Full House. It’s not necessarily wrong.

Sitcom formulas tend to create a set of expectations for the viewer that, when disrupted intentionally or by a computer error, produce a reliable comedic dissonance. “If the computer-generated language falls flat,” Armitage said, “we can fill in the blank with our imaginations of how, say, ’96-era Matthew Perry might deliver a line.”

Simultaneously, there’s an interesting tension at play. The errors suggest the ineffable quality of our sense of humor, and underscore our essential humanness. Maybe there’s some relief to this: If robots can’t yet master something like comedy — even the most formulaic versions of it — maybe they won’t be able to crush humanity in their greasy clamps after all.

Regardless, the easy reference points mean that jokes require less time to set up — ideal for the fast pace and short attention spans online. “I’m more drawn to playing within forms and genres in general,” Casper Kelly, writer/director of Too Many Cooks, told me. “I think as our culture moves faster, and as the sheer amount of work being produced increases, finding those shared memories of forms will become trickier. But for all generations right now, sitcoms are a good shared experience.”

Personally, I think it’s entertaining to push the cynical deluge of TV and film reboots to their logical conclusion. “Why sell a box set when you could sell a machine that infinitely generates new episodes of Friends on your TV?” asks Armitage.

Imagine a dystopian future where computers generate scripts of our most cherished sitcoms, to satiate the relentless demand of a bored, sedentary public. We sit watching for decades, sipping nothing but a nutrient-rich slurry, as the actors age and die on camera, to be swiftly replaced by replicas grown in a warehouse-sized actor breeding lab, our minds trapped in a blissed-out permanent nostalgia, as our own bodies gradually fall apart and decay.

Maybe we don’t quite need to imagine it. The end result of our project is like if Full House took place in a drifter’s heroin-withdrawal nightmare. There’s nothing resembling a plot. Characters wander in and out of scenes, shouting out their lines at the sky. Catchphrases are mangled; at some point “You got it, dude!” lost the “dude,” and Uncle Joey’s “Cut it out” replaced “cut” with random words (e.g., “Do it out!”). Episode titles include classics such as “Divorce Is Not Golden,” “Again (Part 2),” and “Firls Will Be Boys.” Stephanie becomes “Staphanie.” Eventually, Uncle Jesse loses his mind, and starts talking to the couch.

None of it makes any sense. But still, it’s funnier than Fuller House.
