
'Sunspring,' a Short Sci-Fi Film Written by AI, Is as Weird as It Sounds

Thomas Middleditch stars in a bizarre adventure.


Writers, prepare to be made obsolete. An artificial-intelligence program has written a sci-fi screenplay, which real-life actors have turned into a nine-minute film. Sunspring was directed by Oscar Sharp for the annual Sci-Fi London Film Festival, from a script written by a recurrent neural network that called itself “Benjamin.” The film, which debuted on Ars Technica on Thursday, is equal parts bizarre, emotional, and intriguing.

Sunspring stars Thomas Middleditch (from Silicon Valley) as a character called H, pining after Elisabeth Gray, who also plays a character called H. Humphrey Ker seems to be the second H’s partner, and he’s called C, but even that’s not entirely certain. The first H is in a room with them, but then he’s not, and then he’s in the stars and a song with AI-generated lyrics starts playing, and then there are two of him, but then the second one disappears and he’s left alone in a room with a black hole.

“Benjamin” was fed classic sci-fi scripts by his operator, AI researcher Ross Goodwin. The producers and actors then sat around a table, read through the script “Benjamin” produced, and tried to piece it all together. Everything in the screenplay, from the stage directions to the dialogue, was generated by the AI.
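The technique behind Benjamin is a standard one in machine learning: a character-level recurrent network reads a corpus one character at a time, learns to predict the next character, and then generates new text by sampling from its own predictions. The sketch below shows the idea in Python with a Keras LSTM; the corpus file, network size, and sampling temperature are illustrative assumptions, not details of Goodwin’s actual setup.

```python
# Minimal sketch of character-level text generation with an LSTM,
# in the spirit of how "Benjamin" works. The corpus path, model size,
# and training budget are illustrative assumptions.
import numpy as np
from tensorflow import keras

text = open("scifi_scripts.txt").read().lower()  # hypothetical screenplay corpus
chars = sorted(set(text))
char_to_ix = {c: i for i, c in enumerate(chars)}

seq_len, step = 40, 3
# Slice the corpus into fixed-length windows; each window predicts the next character.
windows = [text[i:i + seq_len] for i in range(0, len(text) - seq_len, step)]
targets = [text[i + seq_len] for i in range(0, len(text) - seq_len, step)]

X = np.zeros((len(windows), seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(windows), len(chars)), dtype=np.float32)
for i, window in enumerate(windows):
    for t, c in enumerate(window):
        X[i, t, char_to_ix[c]] = 1.0
    y[i, char_to_ix[targets[i]]] = 1.0

model = keras.Sequential([
    keras.layers.LSTM(128, input_shape=(seq_len, len(chars))),
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)

def sample(preds, temperature=1.0):
    # Higher temperature -> weirder, less predictable "dialogue".
    preds = np.log(np.asarray(preds, dtype=np.float64) + 1e-10) / temperature
    probs = np.exp(preds) / np.sum(np.exp(preds))
    return np.random.choice(len(probs), p=probs)

seed = text[:seq_len]
script = seed
for _ in range(400):  # generate 400 characters of "screenplay"
    x = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
    for t, c in enumerate(seed):
        x[0, t, char_to_ix[c]] = 1.0
    next_ix = sample(model.predict(x, verbose=0)[0], temperature=0.8)
    seed = seed[1:] + chars[next_ix]
    script += chars[next_ix]
print(script)
```

Sampled character by character like this, the output tends to be locally plausible but globally unmoored, which goes a long way toward explaining Sunspring’s drifting plot.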

The end result doesn’t quite resemble a dream, but it flirts with coherence just enough to make you wonder whether the film is secretly a masterpiece whose dialogue you simply fail to understand. Gray’s emotional closing monologue, in which she cries over C, shows how much an actor can wring from a machine-written script.

It’s not the first time AI has tried its hand at human creative work. Google’s Magenta project aims to produce original art and music, building on advances in speech recognition, translation, and image recognition. The team plans to teach its models qualities like surprise and narrative arc, so that the results are compelling rather than merely novel.

Google’s earlier Deep Dream project generated hallucinatory images by taking a photo and amplifying whatever patterns a trained neural network thought it recognized in it. Dogs morphing out of the sky, cats in the lampshades, and eyes in the walls all came out of Google’s project. Princeton University student Ji-Sung Kim applied similar generative ideas to music with his “deepjazz” project.
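Deep Dream itself is conceptually simple: run an image through a pretrained classifier, then adjust the image by gradient ascent so that some intermediate layer fires more strongly, exaggerating whatever that layer already detects. A rough sketch, assuming TensorFlow, a pretrained InceptionV3, and an arbitrarily chosen layer:

```python
# Rough sketch of the Deep Dream idea: nudge an input image, via gradient
# ascent, to amplify whatever patterns an intermediate layer of a trained
# classifier already "sees" in it. Layer name and step settings are
# illustrative assumptions.
import tensorflow as tf

base = tf.keras.applications.InceptionV3(include_top=False, weights="imagenet")
# "mixed3" is a mid-level layer of InceptionV3; deeper layers hallucinate
# complex shapes (eyes, dogs), shallower ones simple textures.
dream = tf.keras.Model(base.input, base.get_layer("mixed3").output)

def dream_step(img, step_size=0.01):
    with tf.GradientTape() as tape:
        tape.watch(img)
        loss = tf.reduce_mean(dream(img[None, ...]))  # how strongly the layer fires
    grad = tape.gradient(loss, img)
    grad /= tf.math.reduce_std(grad) + 1e-8  # normalize for stable steps
    return tf.clip_by_value(img + step_size * grad, -1.0, 1.0)

img = tf.random.uniform((224, 224, 3), -1.0, 1.0)  # stand-in for a real photo
for _ in range(100):
    img = dream_step(img)
```

Run on a real photograph instead of noise, this loop is what pulls dog faces out of clouds: the network is rewarded for seeing more of what it already half-sees.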

Although Benjamin, Magenta, and Deep Dream are impressive, it may be a while before the art industry has to hang up its collective apron and call it quits. On the other hand, when was the last time you got Thomas Middleditch to perform your screenplay?
