Surviving the brain-dissolving internet

I’ve been a technology journalist for nearly 20 years and a tech devotee even longer. Over that time, I’ve been obsessed with how the digital experience scrambles how we make sense of the real world.  

Technology may have liberated us from the old gatekeepers, but it also created a culture of choose-your-own-fact niches, elevated conspiracy thinking to the center of public consciousness and brought the incessant nightmare of high-school-clique drama to every human endeavor. It also skewed our experience of daily reality. 

Objectively, the world today is better than ever, but the digital world inevitably makes everyone feel worse. It isn’t just the substance of daily news that unmoors you, but also the speed and volume and oversaturated fakery of it all. 

And so, to survive the brain-dissolving internet, I turned to meditation.

The fad is backed by reams of scientific research showing the benefits of mindfulness for your physical and mental health — how even short-term stints improve your attention span and your ability to focus, your memory, and other cognitive functions.

Farhad Manjoo writing in the New York Times

The Victorian Internet

Tom Standage writes in his book The Victorian Internet, “That the telegraph was so widely seen as a panacea is perhaps understandable. The fact that we are still making the same mistake today is less so. The irony is that even though it failed to live up to the utopian claims made about it, the telegraph really did transform the world.”

The Internet, like the telegraph, offers tremendous potential for altering the world in a positive way. But we would be wise to temper our enthusiasm.

As Standage suggests, “Better communication does not necessarily lead to a wider understanding of other points of view: the potential of new technologies to change things for the better is invariably overstated, while the ways in which they will make things worse are usually unforeseen.”

Stephen Goforth

I Used to Be a Human Being

In the last year of my blogging life, my health began to give out. Four bronchial infections in 12 months had become progressively harder to kick. Vacations, such as they were, had become mere opportunities for sleep. My dreams were filled with the snippets of code I used each day to update the site. My friendships had atrophied as my time away from the web dwindled. My doctor, dispensing one more course of antibiotics, finally laid it on the line: “Did you really survive HIV to die of the web?”

But the rewards were many: an audience of up to 100,000 people a day; a new-media business that was actually profitable; a constant stream of things to annoy, enlighten, or infuriate me; a niche in the nerve center of the exploding global conversation; and a way to measure success — in big and beautiful data — that was a constant dopamine bath for the writerly ego. If you had to reinvent yourself as a writer in the internet age, I reassured myself, then I was ahead of the curve. The problem was that I hadn’t been able to reinvent myself as a human being.

Andrew Sullivan writing in New York Magazine

Policing people’s grammar online is never really about grammar

One of the many unexpected side effects of the internet is that it’s shown us just how many people appear to lose the capacity for emotional self-regulation when confronted with a misused semicolon. Scroll through the comments section of any publication or simply sign on to Twitter, and you’ll find plenty of examples of people who treat typos and grammatical errors not just as ordinary mistakes, but as a kind of moral offense.

When a grammar stickler obsesses over the proper placement of an apostrophe in a Facebook status or a blog post, they’re not engaging with the actual content. How many times have we seen an online commenter whose only remark on a post about the author’s struggles with body image is “It’s ‘their,’ not ‘there,’” or a Twitter acquaintance who proudly screenshots a typo in a New York Times article on science education? The instinct to publicly criticize and police linguistic errors is also a way to avoid wading into the muck of other people’s thoughts and feelings, and to redirect the conversation back toward oneself.

Because young or poor or immigrant populations are often among those who may not conform to traditional English grammar and spelling and punctuation usage, focusing on linguistic deviations can reinforce the barriers of privilege.

Sarah Todd writing in Quartz

why Facebook survived

While Facebook was just getting on its feet in 2004, a similar social network called Campus Network (or CU Community) was both ahead of it and more advanced. Slate explains why only one survived.

Why did Facebook succeed where Campus Network failed? The simplest explanation is, well, its simplicity. Yes, Campus Network had advanced features that Facebook was missing. But while Campus Network blitzed first-time users with features right away, Facebook rolled out its features incrementally. Facebook respected the Web's learning curve.

Campus Network did too much too soon. Neither site, of course, can claim to be the first social network—Friendster and MySpace already had large followings in 2003. But both Facebook and Campus Network had the crucial insight that overlaying a virtual community on top of an existing community—a college campus—would cement users' trust and loyalty. Campus Network figured it out first. Facebook just executed it better.

While people want to make their own choices, research shows that too many options create problems. We become overwhelmed. There is no substitute for simplicity and clarity. Whether on purpose or by accident, Facebook was built from the perspective of looking at what users would do with the site rather than built to show off what its creators could do. One approach shows respect for the audience.

Stephen Goforth

The Web Almost Killed Me

For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long.

I realized I had been engaging—like most addicts—in a form of denial. I’d long treated my online life as a supplement to my real life. But then I began to realize, as my health and happiness deteriorated, that this was not a both-and kind of situation. It was either-or. Every hour I spent online was not spent in the physical world.

Andrew Sullivan, I Used to Be a Human Being

Information Overload

“Information overload” is one of the biggest irritations in modern life. Commentators have coined a profusion of phrases to describe the anxiety and anomie caused by too much information: “data asphyxiation” (William van Winkle), “data smog” (David Shenk), “information fatigue syndrome” (David Lewis), “cognitive overload” (Eric Schmidt) and “time famine” (Leslie Perlow). Johann Hari, a British journalist, notes that there is a good reason why “wired” means both “connected to the internet” and “high, frantic, unable to concentrate”.

These worries are exaggerated. Stick-in-the-muds have always complained about new technologies: the Victorians fussed that the telegraph meant that “the businessman of the present day must be continually on the jump.” Yet clearly there is a problem. It is not merely the dizzying increase in the volume of information (the amount of data being stored doubles every 18 months). It is also the combination of omnipresence and fragmentation. Many professionals are welded to their smartphones.

Critics raise three big worries. First, information overload can make people feel anxious and powerless: scientists have discovered that multitaskers produce more stress hormones. Second, overload can reduce creativity. Teresa Amabile of Harvard Business School has spent more than a decade studying the work habits of 238 people, collecting a total of 12,000 diary entries between them. She finds that focus and creativity are connected. People are more likely to be creative if they are allowed to focus on something for some time without interruptions. If constantly interrupted or forced to attend meetings, they are less likely to be creative. Third, overload can also make workers less productive. David Meyer, of the University of Michigan, has shown that people who complete certain tasks in parallel take much longer and make many more errors than people who complete the same tasks in sequence.

What can be done about information overload? One answer is technological: rely on the people who created the fog to invent filters that will clean it up. A second answer involves willpower. Ration your intake. Turn off your mobile phone and internet from time to time.

But such ruses are not enough. Smarter filters cannot stop people from obsessively checking their BlackBerrys. Some do so because it makes them feel important; others because they may be addicted to the “dopamine squirt” they get from receiving messages, as Edward Hallowell and John Ratey, two academics, have argued. And self-discipline can be counter-productive if your company doesn’t embrace it. Some bosses get shirty if their underlings are unreachable even for a few minutes.

Most companies are better at giving employees access to the information superhighway than at teaching them how to drive. This is starting to change. Management consultants have spotted an opportunity. Derek Dean and Caroline Webb of McKinsey urge businesses to embrace three principles to deal with data overload: find time to focus, filter out noise and forget about work when you can. Business leaders are chipping in. David Novak of Yum! Brands urges people to ask themselves whether what they are doing is constructive or a mere “activity”. John Doerr, a venture capitalist, urges people to focus on a narrow range of objectives and filter out everything else. Cristobal Conde of SunGard, an IT firm, preserves “thinking time” in his schedule when he cannot be disturbed. This might sound like common sense. But common sense is rare amid the cacophony of corporate life.

Schumpeter, from The Economist

the dictatorship of data

The dictatorship of data ensnares even the best of them. Google runs everything according to data. That strategy has led to much of its success. But it also trips up the company from time to time. Its cofounders, Larry Page and Sergey Brin, long insisted on knowing all job candidates’ SAT scores and their grade point averages when they graduated from college. In their thinking, the first number measured potential and the second measured achievement. Accomplished managers in their 40s were hounded for the scores, to their outright bafflement. The company even continued to demand the numbers long after its internal studies showed no correlation between the scores and job performance.

Google ought to know better, to resist being seduced by data’s false charms. The measure leaves little room for change in a person’s life. It counts book smarts at the expense of knowledge. And it may not reflect the qualifications of people from the humanities, where know-how may be less quantifiable than in science and engineering. Google’s obsession with such data for HR purposes is especially queer considering that the company’s founders are products of Montessori schools, which emphasize learning, not grades. By Google’s standards, neither Bill Gates nor Mark Zuckerberg nor Steve Jobs would have been hired, since they lack college degrees.

Kenneth Cukier and Viktor Mayer-Schönberger, writing in MIT’s Technology Review

the information riot

The Internet is an interruption system. It seizes our attention only to scramble it. There’s the problem of hypertext and the many different kinds of media coming at us simultaneously. Every time we shift our attention, the brain has to reorient itself, further taxing our mental resources. Many studies have shown that switching between just two tasks can add substantially to our cognitive load, impeding our thinking and increasing the likelihood that we’ll overlook or misinterpret important information.

On the Internet, where we generally juggle several tasks, the switching costs pile ever higher. We willingly accept the loss of concentration and focus, the fragmentation of our attention, and the thinning of our thoughts in return for the wealth of compelling, or at least diverting, information we receive.

Nicholas Carr
The Shallows

Skimming skills and thinking deeply

Imagine filling a bathtub with a thimble; that’s the challenge involved in moving information from working memory into long-term memory. On the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from tap to tap. We transfer only a small jumble of drops from different faucets, not a continuous, coherent stream. When the load exceeds our mind’s ability to process and store it, we’re unable to retain the information or to draw connections with other memories.

The ability to scan and browse is as important as the ability to read deeply and think attentively. The problem is that skimming is becoming our dominant mode of thought. Once a means to an end, a way to identify information for further study, it’s becoming an end in itself—our preferred method of both learning and analysis. Dazzled by the Net’s treasures, we are blind to the damage we may be doing to our intellectual lives and even our culture.

We are evolving from cultivators of personal knowledge into hunters and gatherers in the electronic data forest.

Nicholas Carr, The Shallows