Surviving the brain-dissolving internet

I’ve been a technology journalist for nearly 20 years and a tech devotee even longer. Over that time, I’ve been obsessed with how the digital experience scrambles how we make sense of the real world.  

Technology may have liberated us from the old gatekeepers, but it also created a culture of choose-your-own-fact niches, elevated conspiracy thinking to the center of public consciousness and brought the incessant nightmare of high-school-clique drama to every human endeavor. It also skewed our experience of daily reality. 

Objectively, the world today is better than ever, but the digital world inevitably makes everyone feel worse. It isn’t just the substance of daily news that unmoors you, but also the speed and volume and oversaturated fakery of it all. 

And so, to survive the brain-dissolving internet, I turned to meditation.

The fad is backed by reams of scientific research showing the benefits of mindfulness for your physical and mental health — how even short-term stints improve your attention span and your ability to focus, your memory, and other cognitive functions.

Farhad Manjoo writing in the New York Times

The people inside the machine

In 1770 a chess-playing robot, built by a Hungarian inventor, caused a sensation across Europe. The Mechanical Turk was capable of beating even the best players at chess. 

It eventually transpired that there was a human chess player cleverly concealed in its innards. The apparently intelligent machine depended on a person hidden inside. 

It turns out that something very similar is happening today. Just like the Turk, modern artificial-intelligence (AI) systems rely on help from unseen humans. 

Pretty much everything you do online creates a trail of data that can be used for making systems smarter. As Google, Facebook and others operate their enormous smart machines, we are all helping to power them. A clockwork chess robot from the 1770s thus foreshadowed both the modern debate about artificial intelligence – and a key aspect of making the technology work. The internet is a giant Mechanical Turk: whether we know it or not, we have all become the people inside the machine.

Tom Standage writing in 1843 Magazine 

The First CRISPR baby

Eventually, a CRISPR baby will be born.* The (new gene-editing) technology is too easy. There is no world government to stop its use; many argue no one should do so anyway. At the point that baby emerges, perhaps modified to evade a particular disease or perhaps even to look a particular way, theoretical debates will become real. 

 Jennifer Doudna knows the influence she and her fellow scientists have is diminishing every day. “I would hope this would be used to create cures, to help people,” she says. Even if the technology is not quite there yet, CRISPR could eventually do plenty else besides. Every week a new paper is published finding more genes that influence looks, intelligence, stamina, even sexuality. 

“The dystopic view would be IVF clinics that offer parents a menu of options for kids,” she says. “Nobody has kids by sex anymore. You go to a clinic, pick from a menu, say, ‘I want my kid to be this tall, have this colour of eye, this level of IQ,’ and all those sorts of things. I think that would be terrible.” 

Tom Whipple writing in 1843 Magazine

*Chinese scientists are creating CRISPR babies (MIT Technology Review)

 

The people behind the AI Curtain

“So much of what passes for automation isn’t really automation,” says writer and documentarian Astra Taylor. She describes a moment when she was waiting to pick up her lunch at a cafe, and another customer walked in, awestruck, wondering aloud how the app knew that his order was ready 20 minutes early. The woman behind the counter just looked at him and said, “I just sent you a message.”

“He was so convinced that it was a robot,” Taylor says. “He couldn’t see the human labor right in front of his eyes.”

She calls this process fauxtomation: “Fauxtomation renders invisible human labor to make computers seem smarter than they are.”

“AI” usually relies on a lot of low-paid human labor.

Katharine Schwab writing in Fast Company

When your appliances work as police informants

Suppose police suspect a man of organizing a political protest that turned violent, muses the ACLU’s Nathan Wessler, who argued the Carpenter case (on digital privacy) before the Supreme Court. The suspect’s smart meter and thermostat confirm that a handful of people showed up at his home and stayed there the two nights before the demonstration; the suspect’s smart refrigerator ordered a bunch of soda and snack food on those days, which was all consumed; after someone asked Alexa to play some music in his living room, a voice in the background said, “Tomorrow, we’re going to really show them”; and that night, the suspect’s smart mattress recorded him sleeping fitfully and his heart beating faster than normal. The police arrest the man on conspiracy and other charges. He eventually proves he’s innocent (some old friends visited from out of town and planned a day of sightseeing) but not before a legal nightmare turns his life upside down.

“There’s not a person among us who doesn’t have private aspects of their life that could create difficulty for them if they were exposed,” Wessler says. “And misinterpreted.”

David Henry writing in 1843 Magazine

The Phone Trade-Off

Smartphone photography isn’t making us dumber. It’s shifting the way our minds work, refocusing our attention.

Alixandra Barasch is a cognitive scientist at NYU. In her work, she finds that, yes, incessant smartphone camera use can lead to lapses in memory. But, more importantly, she finds a wrinkle: Cameras can also focus our attention to enhance memory.

She’s run studies similar to the one at Stanford, in which participants either do or don’t take photos while on a museum tour. When instructed to take photos of an exhibit, her participants were more likely to remember visual aspects of their experience (the art and artifacts they saw) than if they didn’t take photos. But there’s a trade-off: The participants snapping photos were less likely to remember information they heard.

Brian Resnick writing in Vox

What Causes Technological Adoption

Popular culture presents consumer technology as a never-ending upward progression that continuously makes things better for everybody. In reality, new tech products usually involve a set of tradeoffs where improvements in areas like usability or design come along with weaknesses in areas like privacy & security. Sometimes new tech is better for one community while making things worse for others. Most importantly, just because a particular technology is “better” in some way doesn’t guarantee it will be widely adopted, or that it will cause other, more popular technologies to improve.

In reality, technological advances are a lot like evolution in the biological world: there are all kinds of dead-ends or regressions or uneven tradeoffs along the way, even if we see broad progress over time.

Anil Dash writing in Medium

 

Tech history is poorly documented and poorly understood

It’s often near impossible to know why certain technologies flourished, or what happened to the ones that didn’t. While we’re still early enough in the computing revolution that many of its pioneers are still alive and working to create technology today, it’s common to find that tech history as recent as a few years ago has already been erased. Why did your favorite app succeed when others didn’t? What failed attempts were made to create such apps before? What problems did those apps encounter — or what problems did they cause? Which creators or innovators got erased from the stories when we created the myths around today’s biggest tech titans?

All of those questions get glossed over, silenced, or sometimes deliberately answered incorrectly, in favor of building a story of sleek, seamless, inevitable progress in the tech world. Now, that’s hardly unique to technology — nearly every industry can point to similar issues. But that ahistorical view of the tech world can have serious consequences when today’s tech creators are unable to learn from those who came before them, even if they want to.

Anil Dash writing in Medium

Most tech doesn’t come from startups

Only about 15% of programmers work at startups, and in many big tech companies, most of the staff aren’t even programmers anyway. So the focus on defining tech by the habits or culture of programmers that work at big-name startups deeply distorts the way that tech is seen in society. Instead, we should consider that the majority of people who create technology work in organizations or institutions that we don’t think of as “tech” at all.

What’s more, there are lots of independent tech companies: little indie shops or mom-and-pop businesses that make websites, apps, or custom software, and a lot of the most talented programmers prefer the culture or challenges of those organizations over the more famous tech titans. We shouldn’t erase the fact that startups are only a tiny part of tech, and we shouldn’t let the extreme culture of many startups distort the way we think about technology overall.

Anil Dash writing in Medium

The multitude of books is a great evil!

Flash back to the year 1455. The German Johannes Gutenberg prints his first book, the Latin Vulgate Bible. As Gutenberg’s press reaches across Europe, the Bible is translated into local languages. Poorly produced copies of the Bible and mediocre literature soon thrive, leading to claims that the printing press must be controlled to avoid chaos and loss of intellectual life. Martin Luther complains, “The multitude of books is a great evil. There is no measure or limit to this fever for writing.”

Comparisons are being made between the effects of the printing press and the advent of the internet.

Stephen Goforth

 

We prefer the Apps

The family that is eating together while simultaneously on their phones is not actually together. They are, in writer Sherry Turkle’s formulation, “alone together.” You are where your attention is. If you’re watching a football game with your son while also texting a friend, you’re not fully with your child — and he knows it. Truly being with another person means being experientially with them, picking up countless tiny signals from the eyes and voice and body language and context, and reacting, often unconsciously, to every nuance. These are our deepest social skills, which have been honed through the aeons. They are what make us distinctively human.

No wonder we prefer the apps. An entire universe of intimate responses is flattened to a single, distant swipe. We hide our vulnerabilities, airbrushing our flaws and quirks; we project our fantasies onto the images before us. Rejection still stings — but less when a new virtual match beckons on the horizon.

Andrew Sullivan writing in New York Magazine

Tech created a global village — and put us at each other’s throats

As we get additional information about others, we place greater stress on the ways those people differ from us than on the ways they resemble us, and this inclination to emphasize dissimilarities over similarities strengthens as the amount of information accumulates. On average, we like strangers best when we know the least about them.

The effect intensifies in the virtual world, where everyone is in everyone else’s business. Social networks like Facebook and messaging apps like Snapchat encourage constant self-disclosure. Because status is measured quantitatively online, in numbers of followers, friends, and likes, people are rewarded for broadcasting endless details about their lives and thoughts through messages and photographs. To shut up, even briefly, is to disappear. One study found that people share four times as much information about themselves when they converse through computers as when they talk in person.

Progress toward a more amicable world will require not technological magic but concrete, painstaking, and altogether human measures: negotiation and compromise, a renewed emphasis on civics and reasoned debate, a citizenry able to appreciate contrary perspectives. At a personal level, we may need less self-expression and more self-examination.

Technology is an amplifier. It magnifies our best traits, and it magnifies our worst.

Nicholas Carr writing in the Boston Globe

There’s a tiger over there!

“Everybody is fighting for your attention, so your only real defense is to make it so that those stimuli don’t come in the door,” says Boston University cognitive neuroscientist David Somers. The idea that your technology should alert you when it thinks you should pay attention is relatively new (push alerts only really became a thing in 2009), and, frankly, it’s a big step backward. To use the earlier metaphor, you’re letting the bushes rustle nonstop, and telling yourself there’s a tiger over there.

“It’s so important that we define where we want to go as opposed to letting technology drive us and we’re just hanging on for dear life,” says author Amy Blankson, who works in the field of positive psychology, specifically on maximizing happiness.

Still, everyone gets a buzz from this high-octane news environment. Literally. Every notification, every tweet, every beep and buzz releases dopamine and other neurochemicals, providing a moment’s elation. As with any drug, your brain gets used to it. Perhaps even craves it. “Even when you’re really on your best behavior and you’re like ‘OK I’m going to close my web browser, I’m going to shut off my phone,’ you still have this internal need for that feedback,” says Somers.

Reclaim control of what you read.

Emily Dreyfuss, Wired

Commercial Culture’s Substitute for Love

Let me toss out the idea that, as our markets discover and respond to what consumers most want, our technology has become extremely adept at creating products that correspond to our fantasy ideal of an erotic relationship, in which the beloved object asks for nothing and gives everything, instantly, and makes us feel all powerful, and doesn’t throw terrible scenes when it’s replaced by an even sexier object and is consigned to a drawer.

To speak more generally, the ultimate goal of technology, the telos of techne, is to replace a natural world that’s indifferent to our wishes — a world of hurricanes and hardships and breakable hearts, a world of resistance — with a world so responsive to our wishes as to be, effectively, a mere extension of the self.

Let me suggest, finally, that the world of techno-consumerism is therefore troubled by real love, and that it has no choice but to trouble love in turn.

Its first line of defense is to commodify its enemy. You can all supply your own favorite, most nauseating examples of the commodification of love. Mine include the wedding industry, TV ads that feature cute young children or the giving of automobiles as Christmas presents, and the particularly grotesque equation of diamond jewelry with everlasting devotion. The message, in each case, is that if you love somebody you should buy stuff.

A related phenomenon is the transformation, courtesy of Facebook, of the verb “to like” from a state of mind to an action that you perform with your computer mouse, from a feeling to an assertion of consumer choice. And liking, in general, is commercial culture’s substitute for loving.

The striking thing about all consumer products — and none more so than electronic devices and applications — is that they’re designed to be immensely likable. This is, in fact, the definition of a consumer product, in contrast to the product that is simply itself and whose makers aren’t fixated on your liking it. (I’m thinking here of jet engines, laboratory equipment, serious art and literature.)

But if you consider this in human terms, and you imagine a person defined by a desperation to be liked, what do you see? You see a person without integrity, without a center. In more pathological cases, you see a narcissist — a person who can’t tolerate the tarnishing of his or her self-image that not being liked represents, and who therefore either withdraws from human contact or goes to extreme, integrity-sacrificing lengths to be likable.

If you dedicate your existence to being likable, however, and if you adopt whatever cool persona is necessary to make it happen, it suggests that you’ve despaired of being loved for who you really are. And if you succeed in manipulating other people into liking you, it will be hard not to feel, at some level, contempt for those people, because they’ve fallen for your shtick.

Consumer technology products would never do anything this unattractive, because they aren’t people. They are, however, great allies and enablers of narcissism. Alongside their built-in eagerness to be liked is a built-in eagerness to reflect well on us. Our lives look a lot more interesting when they’re filtered through the sexy Facebook interface. We star in our own movies, we photograph ourselves incessantly, we click the mouse and a machine confirms our sense of mastery.

And, since our technology is really just an extension of ourselves, we don’t have to have contempt for its manipulability in the way we might with actual people. It’s all one big endless loop. We like the mirror and the mirror likes us. To friend a person is merely to include the person in our private hall of flattering mirrors.

I may be overstating the case, a little bit. Very probably, you’re sick to death of hearing social media disrespected by cranky 51-year-olds. My aim here is mainly to set up a contrast between the narcissistic tendencies of technology and the problem of actual love. My friend Alice Sebold likes to talk about “getting down in the pit and loving somebody.” She has in mind the dirt that love inevitably splatters on the mirror of our self-regard.

The simple fact of the matter is that trying to be perfectly likable is incompatible with loving relationships. Sooner or later, for example, you’re going to find yourself in a hideous, screaming fight, and you’ll hear coming out of your mouth things that you yourself don’t like at all, things that shatter your self-image as a fair, kind, cool, attractive, in-control, funny, likable person. Something realer than likability has come out in you, and suddenly you’re having an actual life.

Suddenly there’s a real choice to be made, not a fake consumer choice between a BlackBerry and an iPhone, but a question: Do I love this person? And, for the other person, does this person love me?

There is no such thing as a person whose real self you like every particle of. This is why a world of liking is ultimately a lie. But there is such a thing as a person whose real self you love every particle of. And this is why love is such an existential threat to the techno-consumerist order: it exposes the lie.

This is not to say that love is only about fighting. Love is about bottomless empathy, born out of the heart’s revelation that another person is every bit as real as you are. And this is why love, as I understand it, is always specific. Trying to love all of humanity may be a worthy endeavor, but, in a funny way, it keeps the focus on the self, on the self’s own moral or spiritual well-being. Whereas, to love a specific person, and to identify with his or her struggles and joys as if they were your own, you have to surrender some of your self.

The big risk here, of course, is rejection. We can all handle being disliked now and then, because there’s such an infinitely big pool of potential likers. But to expose your whole self, not just the likable surface, and to have it rejected, can be catastrophically painful. The prospect of pain generally, the pain of loss, of breakup, of death, is what makes it so tempting to avoid love and stay safely in the world of liking.

And yet pain hurts but it doesn’t kill. When you consider the alternative — an anesthetized dream of self-sufficiency, abetted by technology — pain emerges as the natural product and natural indicator of being alive in a resistant world. To go through a life painlessly is to have not lived. Even just to say to yourself, “Oh, I’ll get to that love and pain stuff later, maybe in my 30s” is to consign yourself to 10 years of merely taking up space on the planet and burning up its resources. Of being (and I mean this in the most damning sense of the word) a consumer.

The fundamental fact about all of us is that we’re alive for a while but will die before long. This fact is the real root cause of all our anger and pain and despair. And you can either run from this fact or, by way of love, you can embrace it.

When you stay in your room and rage or sneer or shrug your shoulders, as I did for many years, the world and its problems are impossibly daunting. But when you go out and put yourself in real relation to real people, or even just real animals, there’s a very real danger that you might love some of them.

And who knows what might happen to you then?

Jonathan Franzen, Kenyon College 2011 Commencement speech

The March of Technology

The designs built into modern technology come from savvy, white-coated engineers working tirelessly in their laboratories to create the most effective devices. Right? Kevin Kelly offers this example of how decisions made by generations past stay with us in unexpected ways:

Ordinary Roman carts were constructed to match the width of Imperial Roman war chariots because it was easier to follow the ruts in the road left by the war chariots. The chariots were sized to accommodate the width of two large war horses, which translates into our English measurement as a width of 4' 8.5". Roads throughout the vast Roman empire were built to this spec. When the legions of Rome marched into Britain, they constructed long distance imperial roads 4' 8.5" wide. When the English started building tramways, they used the same width so the same horse carriages could be used. And when they started building railways with horseless carriages, naturally the rails were 4' 8.5" wide. Imported laborers from the British Isles built the first railways in the Americas using the same tools and jigs they were used to.

Fast forward to the US Space Shuttle, which is built in parts around the country and assembled in Florida. Because the two large solid fuel rocket engines on the side of the Shuttle at launch were sent by railroad from Utah, and that line traversed a tunnel not much wider than the standard track, the rockets themselves could not be much wider than 4' 8.5". As one wag concluded: "So, a major Space Shuttle design feature of what is arguably the world's most advanced transportation system was determined over two thousand years ago by the width of two horses' arse."

Kevin Kelly, What Technology Wants

Ernest was right, but no one listened

As legend has it, Ernest Duchesne was a student at a French military medical school in the 1890s when he noticed that the hospital’s stable boys who tended the horses did something peculiar: They stored their saddles in a damp, dark room so that mold would grow on their undersurfaces. They did this, they explained, because the mold helped heal the horses’ saddle sores. Duchesne was fascinated and conducted an experiment in which he treated sick guinea pigs with a solution made from mold—a rough form of what we’d now call penicillin. The guinea pigs healed completely. Duchesne wrote up his findings in a thesis, but because he was unknown and young—only 23 at the time—the French Institut Pasteur wouldn’t acknowledge it. His research vanished, and Duchesne died 15 years later of tuberculosis (a disease that would someday be treatable with antibiotics). It would take 31 years for the Scottish scientist Alexander Fleming to rediscover penicillin, independently and with no idea that Duchesne had already done it. In those three decades, untold millions of people died of diseases that could have been cured. Failed networks kill ideas.

Clive Thompson, Smarter Than You Think

The day pain died

The date of the first operation under anesthetic, Oct. 16, 1846, ranks among the most iconic in the history of medicine.

Before 1846, the vast majority of religious and medical opinion held that pain was inseparable from sensation in general, and thus from life itself. Though the idea of pain as necessary may seem primitive and brutal to us today, it lingers in certain corners of healthcare, such as obstetrics and childbirth, where epidurals and caesarean sections still carry the taint of moral opprobrium. In the early 19th century, doctors interested in the pain-relieving properties of ether and nitrous oxide were characterized as cranks and profiteers. The case against them was not merely practical, but moral: They were seen as seeking to exploit their patients' base and cowardly instincts. Furthermore, by whipping up the fear of operations, they were frightening others away from surgery and damaging public health.

The "eureka moment" of anesthesia, like the seemingly sudden arrival of many new technologies, was not so much a moment of discovery as a moment of recognition: a tipping point when society decided that old attitudes needed to be overthrown. It was a social revolution as much as a medical one.

Mike Jay, The Atmosphere of Heaven

Filling Up on Digital Junk Food

We’re becoming quite intolerant of letting each other think complicated things. To hear someone else out, you need to be able to be still for a while and pay attention to something other than your immediate needs. So if we’re living in a moment when you can be in seven different places at once… on a phone here, on a laptop. How do we save stillness?

Erik Erikson talks about the need for stillness in order to fully develop and to discover your identity and become who you need to become and think what you need to think. Stillness is one of the great things in jeopardy.

When we’re texting, on the phone, doing e-mail, getting information, the experience is of being filled up. That feels good. And we assume that it is nourishing in the sense of taking us to a place we want to go. And I think that we are going to start to learn that in our enthusiasm and in our fascinations, we can also be flattened and depleted by what perhaps was once nourishing us but which can’t be a steady diet. If all I do is my e-mail, my calendar, and my searches, I feel great; I feel like a master of the universe. And then it’s the end of the day, I’ve been busy all day, and I haven’t thought about anything hard, and I have been consumed by the technologies that were there and that had the power to nourish me. If kids feel that they need to be connected in order to be themselves that’s quite unhealthy. They’ll always feel lonely, because the connections that they’re forming are not going to give them what they seek.

Sherry Turkle, Alone Together

The incessant demands of the inbox

Before email, if you wanted to write to someone, you had to invest some effort in it. You’d sit down with pen and paper, or at a typewriter, and carefully compose a message. There wasn’t anything about the medium that lent itself to dashing off quick notes without giving them much thought, partly because of the ritual involved, and the time it took to write a note, find and address an envelope, add postage, and take the letter to a mailbox. Because the very act of writing a note or letter to someone took this many steps, and was spread out over time, we didn’t go to the trouble unless we had something important to say. Because of email’s immediacy, most of us give little thought to typing up any little thing that pops in our heads and hitting the send button. And email doesn’t cost anything.

Sure, there’s the money you paid for your computer and your internet connection, but there is no incremental cost to sending one more email. Compare this with paper letters. Each one incurred the price of the envelope and the postage stamp, and although this doesn’t represent a lot of money, these were in limited supply – if you ran out of them, you’d have to make a special trip to the stationery store and the post office to buy more, so you didn’t use them frivolously. The sheer ease of sending emails has led to a change in manners, a tendency to be less polite about what we ask of others. Many professionals tell a similar story. One said, “A large proportion of emails I receive are from people I barely know asking me to do something for them that is outside what would normally be considered the scope of my work or my relationship with them. Email somehow apparently makes it OK to ask for things they would never ask by phone, in person, or in snail mail.”

Most emails demand some sort of action: Click on this link to see a video of a baby panda, or answer this query from a co-worker, or make plans for lunch with a friend, or delete this email as spam. All this activity gives us a sense that we’re getting things done – and in some cases we are. But we are sacrificing efficiency and deep concentration when we interrupt our priority activities with email.

Daniel J Levitin writing in The Guardian
