Reel Science

40 Years Ago, a Wild Sci-Fi Movie Predicted a Life-Changing Invention

The film plays fast and loose with its reductive portrayal of the brain, and its mind-sharing technology is still far from reality.


Douglas Trumbull is best known as Hollywood’s special effects guru. From 1968’s 2001: A Space Odyssey to 1982’s Blade Runner, he brought the fantastical visions of other writers and directors to life. But in 1983, Trumbull tried making a movie of his own, and stumbled upon a bizarre branch of science that’s just now coming to fruition.

Brainstorm revolves around a pair of scientists, Drs. Michael Brace (Christopher Walken) and Lillian Reynolds (Louise Fletcher), who engineer a revolutionary technology capable of recording someone’s thoughts, emotions, and sensations.

In what would likely raise eyebrows at an institutional review board, the scientists and their lab members test the new technology on themselves. Brace, Reynolds, and others (including silver screen darling Natalie Wood, who plays Brace’s estranged wife Karen) share experiences and past memories, which are captured on something akin to a VHS tape and played back through a headset to the recipient’s brain. Later in Brainstorm, Reynolds suffers a heart attack in the lab but manages to don the headset in the nick of time, leaving behind a recording of the moment she dies, and of the afterlife, for others to experience.

Brainstorm plays fast and loose with its reductive portrayal of how the brain works. Still, the movie’s mind-sharing technology isn’t necessarily far from the truth. Four decades later, with the rise of brain-computer interfaces (or BCIs) melding minds with machines, we may be closer to making thoughts tangible, if not to other people then at least to devices like prosthetic limbs and speech synthesizers.

Reel Science is an Inverse series that reveals the real (and fake) science behind your favorite movies and TV.

Can science decode our thoughts into actions?

In Brainstorm, scientists find a way to record human thoughts and emotions on a VHS tape.

Metro-Goldwyn-Mayer

Over the last 60 years, scientists have been developing BCIs, systems that facilitate direct communication and interaction between the human brain and external devices. BCIs rely on electrodes, either implanted in the brain or placed outside it, that pick up electrical neural activity. A computer then analyzes that activity and translates it into commands that a separate device carries out.
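To make that signal-to-command pipeline concrete, here is a minimal, purely illustrative sketch in Python. Everything in it is an assumption for the sake of the example, not anything from a real BCI system: the 250 Hz sampling rate, the synthetic one-channel signal, the "alpha band power" feature, and the arbitrary threshold that maps the feature to a command.

```python
import numpy as np

# Toy BCI pipeline: signal -> feature -> command (illustrative only).

FS = 250  # sampling rate in Hz (a common EEG rate, assumed here)

def simulate_channel(seconds=2, alpha_amplitude=1.0, rng=None):
    """Simulate one electrode: a 10 Hz 'alpha' rhythm buried in noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.arange(0, seconds, 1 / FS)
    alpha = alpha_amplitude * np.sin(2 * np.pi * 10 * t)
    noise = rng.normal(scale=1.0, size=t.size)
    return alpha + noise

def band_power(signal, low=8, high=12):
    """Estimate power in a frequency band via the FFT (a crude feature)."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / FS)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def decode(feature, threshold=2000.0):
    """Translate the feature into a command for an external device.
    The threshold is an arbitrary toy value chosen for this synthetic data."""
    return "MOVE_CURSOR" if feature > threshold else "REST"

strong = simulate_channel(alpha_amplitude=2.0)
weak = simulate_channel(alpha_amplitude=0.1)
print(decode(band_power(strong)))  # expected: MOVE_CURSOR
print(decode(band_power(weak)))    # expected: REST
```

Real systems record from many electrodes at once and use far more sophisticated decoding, but the shape of the loop is the same: acquire a signal, extract features, and map those features to an action.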

The primary aim of BCIs is to assist individuals with disabilities, such as paralysis, by enabling them to control prosthetic limbs, communication devices, or wheelchairs using their brain signals. Much research and innovation has gone into offering a means of communication for individuals with severe motor impairments, such as those with locked-in syndrome, allowing them to express thoughts and needs.

BCIs use various methods to detect and interpret brain activity. Often, these methods are invasive, since implanted electrodes are what’s required to capture brain activity with high fidelity. They can also be non-invasive, recording activity from outside the skull, though those recordings are noisier and harder to decode than signals from implanted electrodes. Non-invasive methods include electroencephalography (or EEG), which measures electrical signals from the scalp and is seemingly what Brace and Reynolds use in Brainstorm, and functional magnetic resonance imaging (or fMRI).
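As a rough illustration of why scalp recordings are so noisy, the sketch below (again using made-up numbers and a synthetic signal) buries a weak 10 Hz rhythm in broadband noise and applies a standard band-pass filter, the kind of preprocessing step an EEG-based system typically needs before any decoding can happen.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # sampling rate in Hz (assumed)

# Synthetic "scalp" signal: a weak 10 Hz rhythm swamped by noise.
rng = np.random.default_rng(1)
t = np.arange(0, 4, 1 / FS)
clean = 0.5 * np.sin(2 * np.pi * 10 * t)
noisy = clean + rng.normal(scale=2.0, size=t.size)

# 8-12 Hz band-pass filter, a common first step in EEG preprocessing.
b, a = butter(N=4, Wn=[8, 12], btype="bandpass", fs=FS)
filtered = filtfilt(b, a, noisy)

# Correlation with the underlying rhythm, before vs. after filtering.
print(np.corrcoef(noisy, clean)[0, 1])     # low: the rhythm is buried
print(np.corrcoef(filtered, clean)[0, 1])  # higher: filtering recovers some of it
```

Filtering only narrows the problem; separating genuine neural signals from muscle activity, eye blinks, and electrical interference is a large part of why non-invasive BCIs lag behind implanted ones.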

A brain-computer interface for post-stroke rehabilitation at the World Robot Conference 2023 in Beijing.

Xinhua/Getty Images

The collected brain signals are processed through specialized algorithms and software, which analyze and decode the neural data to extract meaningful information. So far, we’ve only made progress in decoding thoughts into speech, recreating snippets of songs, and translating intended muscle movements into action. In an August study published in the journal Nature, researchers at the University of California, San Francisco were able to create a digital avatar through which a patient, outfitted with a neural implant, expressed their emotions via an on-screen animated face.
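In spirit, that decoding step is a supervised learning problem: a system is trained on brain activity recorded while a person attempts known movements or words, then predicts the label for new activity. The sketch below is a heavily simplified stand-in, using synthetic features (nothing like real neural data) and an off-the-shelf linear classifier from scikit-learn, chosen here only to show the train-then-predict pattern.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Synthetic "neural features": two attempted movements with different
# average activity patterns across 16 made-up channels.
rng = np.random.default_rng(2)
n_trials, n_channels = 200, 16
labels = rng.integers(0, 2, size=n_trials)        # 0 = "left", 1 = "right"
patterns = rng.normal(size=(2, n_channels))       # class-specific signature
features = patterns[labels] + rng.normal(scale=1.0, size=(n_trials, n_channels))

# Train a linear decoder on some trials, test it on held-out ones.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)
decoder = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("held-out decoding accuracy:", decoder.score(X_test, y_test))
```

Real decoders for speech or movement work on vastly richer data and models, but the principle of learning a mapping from recorded activity to an intended output is the same.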

That’s the extent of BCIs at the moment. We’re nowhere close to recording whole memories or sensations, let alone afterlife experiences, and certainly not to beaming them into someone else’s head through a headset. That Brainstorm level of complexity may very well be on the horizon, but it will require solving every mystery of how the brain works, building better neural implants and other means of recording brain activity, and developing more sophisticated algorithms to do the decoding legwork. Science fiction, however, may very well motivate us in the right (or ethically dubious) direction.
