The seemingly trivial activity of mind-wandering is now believed to play a central role in the brain’s “deep learning,” the mind’s sifting through past experiences, imagining future prospects and assessing them with emotional judgments: that flash of shame or pride or anxiety that each scenario elicits.
A growing number of scholars, drawn from a wide swath of disciplines — neuroscience, philosophy, computer science — now argue that this aptitude for cognitive time travel, revealed by the discovery of the default network, may be the defining property of human intelligence. “What best distinguishes our species,” Martin Seligman wrote in a Times Op-Ed with John Tierney, “is an ability that scientists are just beginning to appreciate: We contemplate the future.” He went on: “A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise.”
Today, it seems, mind-wandering is under attack from all sides. It’s a common complaint that our compulsive use of smartphones is destroying our ability to focus. But seen through the lens of Homo prospectus, ubiquitous computing poses a different kind of threat: Having a network-connected supercomputer in your pocket at all times gives you too much to focus on. It cuts into your mind-wandering time. The downtime between cognitively active tasks that once led to REST states can now be filled with Instagram, or Nasdaq updates, or podcasts. We have Twitter timelines instead of time travel.
At the same time, a society-wide vogue for “mindfulness” encourages us to be in the moment, to think of nothing at all instead of letting our thoughts wander. Search YouTube, and there are hundreds of meditation videos teaching you how to stop your mind from doing what it does naturally. The Homo prospectus theory suggests that, if anything, we need to carve out time in our schedules — and perhaps even in our schools — to let our minds drift.
— Steven Johnson, writing in The New York Times