PERHAPS YOU RECALL A TIME when you took in more of the world. You were new and the world was new. As a boy, I would go out in the woods and sit under a tree, then lick my thumb and wet each nostril. I had read somewhere that people—perhaps pioneers or American Indians, I don’t remember—did this in order to keen their sense of smell for approaching game or danger. I held perfectly still, my back against rough bark, all of my senses waiting. And slowly, animal life returned. A rabbit appeared under a bush, birds swooped low, an ant went on a walkabout over my knee. I felt intensely alive.
Can we be new again? In 2005, when my book Last Child in the Woods was published, I wasn’t prepared for the movement that would follow, and for the reaction of adults when they considered their own lives.
In the book, I introduced the term nature-deficit disorder—not as a medical diagnosis but as a way to describe the growing gap between children and nature. By its broadest interpretation, nature-deficit disorder is an atrophied awareness, a diminished ability to find meaning in the life that surrounds us. When we think of the nature deficit, we usually think of kids spending too much time indoors plugged into an outlet or computer screen. But after the book’s publication, I heard adults speak with heartfelt emotion, even anger, about their own sense of loss.
One day after a talk in Seattle, a woman literally grabbed my lapels and said, “Listen to me: adults have nature-deficit disorder, too.” She was right, of course. As a species, we are most animated when our days and nights are touched by the natural world. While individuals can find immeasurable joy in a great work of art, or by falling in love, all of life is rooted in nature, and a separation from it desensitizes and diminishes us.
That truth seems obvious to some of us, though it has yet to take root in the wider culture. However, in recent years an emerging body of research has begun to describe the restorative power of time spent in the natural world. Even in small doses, we are learning, exposure to nature can measurably improve our psychological and physical health.
While the study of the relationship between mental acuity, creativity, and time spent outdoors is still a frontier for science, new data suggests that exposure to the living world can even enhance intelligence. At least two factors are involved: first, our senses and sensibilities can be improved by spending time in nature; second, the natural environment seems to stimulate our ability to pay attention, think clearly, and be more creative.
In 2008, for the first time in history, more than half the world’s population lived in towns and cities. The traditional ways that humans have experienced nature are vanishing along with biodiversity. At the same time, our culture’s faith in technological immersion has no limits. We sink ever deeper into a sea of circuitry. We consume breathtaking accounts of the creation of synthetic life, combining bacteria with human DNA; of microscopic machines designed to enter our bodies to fight biological invaders; of computer-augmented reality. We even hear talk of a posthuman era in which people themselves are optimally enhanced by technology. Aren’t we getting a little ahead of ourselves?
By contrast, I believe the future can be shaped by what I call the Nature Principle, which holds that in an age of environmental, economic, and social transformation, the future will belong to the nature-smart—those individuals, businesses, and political leaders who develop a deeper understanding of nature and balance the virtual with the real.
The skeptic will say that this prescription is at best problematic, given the rate at which we’re destroying nature, and the skeptic will be right. This is why the Nature Principle is about conservation but also about restoring ourselves while we restore nature; about bringing back natural habitats where they once existed or creating them where they never were—in our homes, workplaces, cities, and suburbs. It’s about the power of living in nature—not with it but in it.
The more high-tech our lives become, the more nature we need.
MANY OF US DESIRE a fuller life of the senses. We city dwellers marvel at the seemingly superhuman or supernatural abilities of “primitive” peoples like the Australian Aborigines but consider those talents vestigial, like the tailbone. Here’s another view: such senses are in fact latent in all of us, blanketed by noise and faulty assumptions.
Ever wonder why you have two nostrils? Researchers at the University of California at Berkeley did. They fitted undergraduates with taped-over goggles, earmuffs, and work gloves to block other senses, then set them loose in a field. Most of the students could follow a 30-foot-long trail of chocolate perfume and even changed direction precisely where the invisible path took a turn. The subjects were able to smell better with two functioning nostrils, which researchers likened to hearing in stereo. And they found themselves zigzagging, a technique employed by dogs as they track. “We found that not only are humans capable of scent tracking,” said study researcher Noam Sobel, “but they spontaneously mimic the tracking pattern of [other] mammals.”
What else can we do that we’ve forgotten? Scientists who study human perception no longer assume we have only five senses: taste, touch, smell, sight, and hearing. The number now ranges from a conservative 10 to as many as 30, including blood-sugar levels, empty stomach, thirst, and proprioception (awareness of our body’s position in space). In 2009, researchers at Madrid’s University of Alcalá de Henares showed how people, like bats, could identify objects without needing to see them, through the echoes of human tongue clicks. According to the lead researcher, echoes are also perceived through vibrations in ears, tongue, and bones—a refined sense learned through trial and error by some blind people and even sighted individuals. It’s all about hearing a world that exists beyond what we normally mistake for silence.
This brings us to the so-called sixth sense, which to some means intuition, to others ESP, and to still others the ability to unconsciously detect danger. In December 2004, as the devastating Asian tsunami approached, Jarawa tribespeople of India’s Andaman Islands reportedly sensed sounds from the approaching wave, or some other unusual activity, long before the water struck the shore. They fled to higher ground. The Jarawas used tribal knowledge of nature’s warning signs, explained V. R. Rao, director of the Anthropological Survey of India, based in Calcutta. “They got wind of impending danger from biological warning signals, like the cry of birds and change in the behavioral patterns of marine animals.” In the Jarawas’ case, the sixth sense may be the sum of all the other senses combined with their everyday knowledge of nature.
In separate research, the U.S. military has studied how some soldiers seem to be able to use their latent senses to detect roadside bombs and other hazards. The 18-month study of 800 military personnel found that the best bomb spotters were rural people—those who’d grown up in the woods hunting turkey or deer—as well as those from tough urban neighborhoods, where it’s equally important to be alert. “They just seemed to pick up things much better,” reported Army Sergeant Major Todd Burnett, who worked on the study for the Pentagon’s Joint Improvised Explosive Device Defeat Organization. “They know how to look at the entire environment.” And the other enlistees, the ones who’d spent more time with Game Boys or at the mall? They didn’t do so well. As Burnett put it, they were focused on the proverbial “screen rather than the whole surrounding.”
The explanation may be partly physiological. Australian researchers suggest that the troubling increase in nearsightedness is linked to young people spending less time outdoors, where eyes must focus at longer distances. But more is probably going on here. Good vision, acute hearing, an attuned sense of smell, spatial awareness—all of these abilities could be operating simultaneously. This natural advantage offers practical applications. One is an increased ability to learn; another is an enhanced capability to avoid danger. Still another, perhaps the most important, is the measurement-defying ability to more fully engage in life.
BUT LET’S BE REALISTIC. Even if we’re lucky enough to have bonded with nature when we were young, maintaining that bond is no easy thing. Information has infiltrated our every waking minute. Unctuous personalities squawk at us from flat-panel TVs on gas pumps. Billboard companies replace pasted paper with flashing digital displays. Screens pop up in airports, coffeehouses, banks, grocery-store checkout lines, even restrooms. Advertisers hawk DVDs for preschoolers on the paper liners of examination tables in pediatricians’ offices. This info-blitzkrieg has spawned a new field called interruption science and a newly minted condition: continuous partial attention.
There’s no denying the benefits of the Internet. But electronic immersion without a force to balance it creates a hole in the boat, draining our ability to pay attention, think clearly, be productive and creative. To combat these losses, our society seems to look everywhere but the natural domain for the building of better brains, whether through supplements like ginkgo biloba or nootropics—so-called smart drugs—like Ritalin, the amphetamine Adderall, and Provigil. Some people need such medication, of course, but overreliance on these substances remains a massive experiment with long-term side effects that have yet to be determined. And an immediately available, low-cost intelligence-enhancing supplement already exists.
In the 1970s, environmental psychologists Rachel and Stephen Kaplan began foundational work on nature’s healing effect on the mind. Findings from their nine-year study for the U.S. Forest Service and later research suggested that contact with nature can assist with recovery from mental fatigue and can help restore attention. It can also help reboot the brain’s ability to think. The Kaplans and their team followed participants in an Outward Bound–like program, which took people into the wilderness for up to two weeks. During these treks or afterwards, subjects reported experiencing a sense of peace and an ability to think more clearly; they also reported that just being in nature was more restorative than the physical activities, like rock climbing, for which such programs are mainly known.
Over time the Kaplans developed their theory of directed-attention fatigue. Paying conscious attention to something demands voluntary effort, they found, which can erode mental effectiveness and get in the way of forming abstract long-term goals. “A number of symptoms are commonly attributed to this fatigue,” Stephen Kaplan and his colleague Raymond De Young wrote in 2002. “Irritability and impulsivity that results in regrettable choices, impatience that has us making ill-formed decisions, and distractibility that allows the immediate environment to have a magnified effect on our behavioral choices.”
The Kaplans hypothesize that the best antidote to such fatigue is involuntary attention, a kind of “fascination,” which occurs when we are in an environment that fulfills certain criteria: for instance, the setting must transport the person away from their day-to-day routine and allow the opportunity to explore. Furthermore, they found, the natural world is a particularly effective place for the human brain to overcome mental fatigue.
One reason for this might be right beneath our feet. A study conducted by Dorothy Matthews and Susan Jenks at the Sage Colleges in Troy, New York, found that a common soil bacterium given to mice helped them navigate a maze twice as fast as untreated mice. The natural bacterium in question, Mycobacterium vaccae, is usually ingested or inhaled when people spend time in nature. The effect wore off in a few weeks, but, Matthews said, the research suggests that the M. vaccae we come in contact with all the time in nature may “play a role” in learning in mammals. Smart pill, meet smart bug.
Taking this even further, can time in nature nurture genius itself? Creative genius is not the accumulation of knowledge; it’s the ability to see patterns in the universe, to detect hidden links between what is and what could be.
When public-radio commentator John Hockenberry reported in 2008 on research at the University of Michigan that indicated greater mental acuity after a nature walk, he pointed out that Albert Einstein and the mathematician and philosopher Kurt Gödel, “two of the most brilliant people who ever walked the face of the earth, used to famously, every single day, take walks in the woods on the Princeton campus.”
The science here is both incomplete and encouraging; we do know that, because of the brain’s plasticity, moments of growth can happen throughout life. And so can the creation of new neurons, the brain cells that process and transmit information. It’s reasonable to speculate, then, that time spent in the natural world, by both restoring and stimulating the brain, may lead to bursts of new neurons. Nature neurons.
SO DOES THIS MEAN that we should dispense with electronic media entirely? No, and for most of us that would be close to impossible. But we can cultivate a third way.
When my sons were growing up, they spent a lot of time outdoors, but they also played plenty of video games—more than I was comfortable with. Occasionally, they’d try to convince me that members of their generation were making an evolutionary leap; because they spent so much time texting, video-gaming, and so on, they were wired differently. In response I pointed out that my generation said something like that about recreational drugs. That didn’t work out so well.
Gary Small, a professor of psychiatry at the University of California at Los Angeles, suggests that the breakneck pace of technological change is creating what he calls a brain gap between the generations, a divide that has opened within a single generation rather than over evolutionary time.
Small and his colleagues used functional MRI scans to study the dorsolateral area of the prefrontal cortex, which integrates complex information and short-term memory and is instrumental in decision-making. Two groups were tested: experienced, or “savvy,” computer users; and inexperienced, or “naive,” ones. During Web searches, the savvy users’ dorsolateral areas were quite active, while in the naive users the same area stayed quiet. As the Canadian magazine Maclean’s reported, “On day five, the savvy group’s brain looked more or less the same. But in the naive group, something amazing had happened: as they searched, their circuitry sprang to life, flashing and thundering in exactly the same way it did in their tech-trained counterparts.”
Teenagers’ brains are particularly malleable, apt to be shaped by technological experience. Is this a good thing? One view is that people who experience too much technology in their formative years experience stunted development of the frontal lobe, “ultimately freezing them in teen brain mode,” as Maclean’s put it.
More optimistic researchers suggest that all this multitasking is creating the smartest generation yet, freed from limitations of geography, weather, and distance—pesky inconveniences of the physical world. This vision calls to mind the sci-fi speculation of the 1950s and ’60s that people would someday be freed from physical limitations and that, as they evolved, their brains—in fact, their heads—would grow larger and larger, until members of our species or what it becomes (Homo google?) just float around in space. We’re not floaters yet. In his 2008 book The Dumbest Generation, Mark Bauerlein, a professor at Emory University, reels off studies comparing this generation of students with prior generations, finding that “they don’t know any more history or civics, economics or science, literature or current events”—despite all that available information.
But here is a third possibility, and the one I prefer: the hybrid mind. The ultimate multitasking is to live simultaneously in both the digital and physical worlds, using computers to maximize our powers to process intellectual data and natural environments to ignite our senses and accelerate our ability to learn and feel—combining the resurfaced “primitive” powers of our ancestors with the digital speed of our teenagers.
Putting the Nature Principle to use in our lives won’t, of course, be just about neurons and intelligence. A whole river is gathering force, its headwaters fed by science. New branches reach outward, producing exciting career possibilities: biophilic design, reconciliation ecology, green exercise, ecopsychology, place-based learning, slow food, and organic gardening. Generous future historians may someday write that those of us alive today did more than survive or sustain—that we brought nature back to our workplaces, our neighborhoods, and our families.
Few today would question the notion that every person, especially every young person, has a right to access the Internet, whether through a school district, a library, or a city’s public Wi-Fi program. We accept the idea that the divide between the digital haves and have-nots must be closed.
But recently I’ve been asking another question of people: Do we have a right to walk in the woods?