While promoting his 2005 book Last Child in the Woods, writer Richard Louv used the biophilia hypothesis to back up his claim that technology has stripped children of the biological imperative to go play outside. A concept developed in 1972 and turned scientific in 1984, this so-called "biophilia hypothesis" links the human desire to be in nature to unspecified genetic traits. It is, rather unsurprisingly, favored by a certain genre of concerned parent — the kind that gifts kids the Tree Finder guide and favors unstructured time. But, from a scientific perspective, biophilia is nothing more than a hypothesis, a less-than-clinical explanation of why so many of us just want to go outside.
“Basically the idea is that biologically we are still hunters and gatherers and we need, at some level we don’t fully understand, direct involvement in nature,” Louv, who’s become the foremost evangelist for the theory, told NPR. “We need to see natural shapes in the horizon. And when we don’t get that, we don’t do so well.”
This notion of evolutionary lag, the implication that our physiology no longer suits our circumstance, makes a natural sort of sense. Our chairs hurt. Our eyes suffer from staring at screens. But it's not just that. It's a sense of kinship with the organic world that becomes borderline transcendental. In 1997, 2003, and 2005, researchers from the University of Illinois and Utah State University conducted questionnaire surveys, asking a total of 200 people whether they considered themselves part of or separate from nature. Some 77 percent said that they felt natural regardless of whether they went outside. The finding was interesting because it implied that a majority of people spend a great deal of time feeling like the one organic element of their inorganic worlds.
Commuters, stuck in cars, still feel connected to the ecosystems they see out their windows even though in many literal senses they are not. What connects them — if anything — is yearning.
But why do people yearn for something that burns, drowns, freezes, and triggers allergies? The answer might have to do with the fact that our engagement with the natural world is now voluntary and, therefore, mostly pleasant. Nature makes us feel good while actually making us healthier. The list of real medical benefits seems almost too good to be true: The New York State Department of Environmental Conservation cites studies claiming time outside boosts immune systems, lowers blood pressure, and accelerates recovery from surgery. According to Harvard Medical School, being outdoors improves concentration, mood, and general happiness. Even looking at trees, much less picnicking beneath them, has been shown to help patients: Studies have found that patients who have a view of trees from their hospital rooms spend less time in the hospital and demonstrate fewer symptoms than patients without a view to the outside.
But the most interesting academic peek into our obsession with sunning ourselves is probably the 2010 cross-collaboration study between American and Canadian universities that found being outside made people feel "more alive" — in essence, nature gave the study participants an increased sense of vitality. The research team conducted five separate experiments on 537 college students, putting them in actual and imagined contexts of nature. Across each study, people felt better when they were in nature, and 90 percent of the subjects said that they felt increased energy when they were outside. One study in particular demonstrated that 20 minutes outside was all people needed to feel invigorated.
"Nature is fuel for the soul," said the study's lead author Richard Ryan in an uncharacteristically philosophical statement. "Often when we feel depleted we reach for a cup of coffee, but research suggests a better way to get energized is to connect with nature."
In another study, published in the June 2015 edition of Landscape and Urban Planning, researchers randomly assigned 60 participants to a 50-minute walk in either a natural or an urban environment around Stanford, California. They found that those who had the "nature experience" reported decreased anxiety and rumination, while also showing cognitive benefits such as better performance on memory tasks. The people who took the urban walk felt few effects. One could blame the blandness of Palo Alto, but there seems to be a broader truth: Now that nature doesn't present an existential threat, it has become the ultimate palliative.
So-called “Spring Fever,” an observable and real psychological phenomenon, is probably the most obvious manifestation of our need to leave our homes and offices. At the spring equinox people report higher energy levels, decreased sleep, and an alleviation of depression. As people experience more daylight, the brain secretes less melatonin, waking us up, while simultaneously releasing serotonin, which makes us giddy. We’re arguably both dumber and tanner for the seasonal shift, but most people welcome the trade-off. In a sense, the way our skin and eyes interact with the sun makes us into outdoor addicts. The yearning can turn into a craving when our body turns a walk in the park into a chemical high.
Which is all to say that Louv's advocacy for time outdoors is not ridiculous: Even as studies show nature makes us healthier, 50 percent of people now live in urban areas with limited access to it. By 2050 that number will hit 70 percent. Urbanization is good for nature, but potentially bad for our relationship with it and therefore bad for us.
Humans are often the sole natural part of urban landscapes, but we don't get the same relief from hanging around each other as from taking a hike through the woods or looking at a waterfall. When we look at ourselves, we see something natural. When we look at each other, we see products of a man-made world. It would seem that in both cases we're partially wrong, and partially right.