One of the more alarming/inspiring essays I've read of late, in the final weeks of this batshit crazy year, argues that even if our laws someday manage to adequately address technology's invasion of our privacy, the real problem is the lasting impact technology will have on our brains and civilization.
In his Dec. 5 piece, "Our Brains Are No Match for Our Technology," Tristan Harris, co-founder and executive director of the Center for Humane Technology (humanetech.com) and a former design ethicist at Google, argues, persuasively, that the real danger today's technology poses is to our attention spans and, ultimately, our humanity.
Humans' Paleolithic brains, Harris writes, "aren't built for omniscient awareness of the world's suffering" and "aren't wired for truth-seeking." Consequently, "technology has outmatched our brains, diminishing our capacity to address the world's most pressing challenges." And so, the attention economy "has turned us into a civilization maladapted for its own survival."
On the bright side—relatively speaking—people also have the capacity, in theory, to be self-aware and take action, which means we have the ability to reverse some of the existential problems we've created. I would also recommend Harris' "Inconvenient Truth for Tech" talk, available on the center's website. I've personally implemented some of the "take control" tips on the center's site, such as setting my phone to grayscale to remove the positive brain reinforcement provided by the color screen, and removing as many of my phone's app features as possible.
Since my brain is hardwired to fall down a rabbit hole, I then began listening to Harris' podcast, "Your Undivided Attention," where I learned the median attention span right now is approximately 40 seconds, and internet brain addiction is intensifying.
What was I saying?
Oh, yes. Internet brain addiction has become prevalent enough that one of its remedies, dopamine fasting (aka abstaining from technology for periods of time), is already in a 2.0 phase. And it might work, even though (a) it was invented by a venture capitalist and (b) it has nothing to do with dopamine.
"Any one thing you're going to say about dopamine to characterize its function is going to be wrong," Stanford professor Russell Poldrack told The Times last month in an article on the trend. "But it would be hard to find something more wrong than associating it with pleasure, because we know that dopamine has nothing to do with the experience of pleasure, at least directly." Poldrack said technology fasts would be more accurately characterized as "stimulation" fasts (I predict this term will not catch on, as it sounds a bit less techbro, if you ask me).
Back to Harris. Last June, he testified before the United States Senate Committee on Commerce, Science, and Transportation's Subcommittee on Communications, Technology, Innovation, and the Internet about what he calls the use of "persuasive technology" on the internet, saying: "I want to argue today that persuasive technology is a massively underestimated and powerful force shaping the world and that it has taken control of the pen of human history and will drive us to catastrophe if we don't take it back."
Harris' point to Congress was that worrying about when technology would usurp human jobs has overshadowed the moment when technology "hacks human weaknesses" and takes control of society.
At Google, Harris said, he thought about how to ethically wield the control that technology has over people's thoughts and behaviors. In a so-called attention economy, he noted, "it becomes a race to the bottom of the brain stem." Technology hacked people's need for social validation through social media, and their opinions and interests through algorithms, creating an asymmetrical relationship between technology's power and human weakness. All of these various technologies and their impacts, he concluded, "are part of an interconnected system of compounding harms that we call 'human downgrading.'"
"How can we solve the world's most urgent problems if we've downgraded our attention spans, downgraded our capacity for complexity and nuance, downgraded our shared truth, downgraded our beliefs into conspiracy theory, thinking that we can't construct shared agendas to solve our problems?" he said.
These are pretty good questions, perhaps ones to share around the dinner table this holiday. If that doesn't sound like the best-laid plan, visit the center's website for other suggestions on how to get involved in its humane technology agenda, and for a printable list of tips on how to reverse human downgrading.
It could make a nice stocking stuffer.