Technology companies are locked in an arms race to seize your attention, and that race is tearing apart our shared social fabric. In this inaugural podcast from the Center for Humane Technology, hosts Tristan Harris and Aza Raskin will expose the hidden designs that have the power to hijack our attention, manipulate our choices and destabilize our real-world communities. They’ll explore what it means to become sophisticated about human nature by interviewing hypnotists, magicians, and experts on cult dynamics, election hacking and the powers of persuasion. How can we escape this unrelenting race to the bottom of the brain stem? Start by subscribing to our new series, Your Undivided Attention.
The sound of bullies on social media can be deafening, but what about their victims? “They're just sitting there being pummeled and pummeled and pummeled,” says Fadi Quran. As the campaign director of Avaaz, a platform for 62 million activists worldwide, Fadi and his team go to great lengths to figure out exactly how social media is being weaponized against vulnerable communities, including those who have no voice online at all. “They can't report it. They’re not online,” Fadi says. “They can't even have a conversation about it.” But by bringing these voices of survivors to Silicon Valley, Fadi says, tech companies can not only hear the lethal consequences of algorithmic abuse but also start hacking away at a system that he argues was “designed for bullies.”
[This episode originally aired on November 5, 2019] Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must “wake up,” she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
When you’re gripped by anxiety, fear, grief or dread, how do you escape? It can happen in the span of a few breaths, according to meditation experts Jack Kornfield and Trudy Goodman. They have helped thousands of people find their way out of a mental loop by moving deeper into it. It's a journey inward that reveals an important lesson for the architects of the attention economy: you cannot begin to build humane technology for billions of users until you pay careful attention to the course of your own wayward thoughts.
How can we feel empowered to take on global threats? The battle begins in our heads, argues Christiana Figueres. She became the United Nations’ top climate official after watching the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, she began performing an act of emotional aikido on herself, her team and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first universal climate agreement, the Paris Agreement. We explore how a similar shift in Silicon Valley's vision could lead 3 billion people to take action.
How does disinformation spread in the age of COVID-19? It takes an expert like Renée DiResta to trace conspiracy theories back to their source. She’s already exposed how Russian state actors manipulated the 2016 election, but that was just a prelude to what she’s seeing online today: a convergence of state actors and lone individuals, anti-vaxxers and NRA supporters, scam artists and preachers, and the occasional fan of cuddly pandas. What ties all of these disparate actors together is an information ecosystem that’s breaking down before our eyes. We explore what’s going wrong and what we must do to fix it in this interview with Renée DiResta, Research Manager at the Stanford Internet Observatory.
An information system that relies on advertising was not born with the Internet. But social media platforms have taken it to an entirely new level, becoming a major force in how we make sense of ourselves and the world around us. Columbia law professor Tim Wu, author of The Attention Merchants and The Curse of Bigness, takes us from the birth of the eyeball-centric news model and the ensuing boom of yellow journalism to the backlash that rallied journalists and citizens around creating industry ethics and standards. Throughout the 20th century, radio, television and even posters elicited excitement, hope, fear, skepticism and greed, and people worked together to create a patchwork of regulation and behavior that attempted to point those tools in the direction of good. The Internet has brought us to just such a crossroads again, but this time with global consequences that are truly life-and-death.
We agree more than we think we do, but tech platforms distort our perceptions by amplifying the loudest, angriest and most dismissive voices online. In reality, they’re just a noisy faction. This Earth Day, we ask Anthony Leiserowitz, Director of the Yale Program on Climate Change Communication, how he shifts public opinion on climate change. We’ll see how tech platforms could amplify voices of solidarity within our own communities. More importantly, we’ll see how they could empower 2 billion people to act in the face of global threats.
How can tech companies help flatten the curve? First and foremost, they must address the lethal misinformation and disinformation circulating on their platforms. The problem goes much deeper than fake news, according to Claire Wardle, co-founder and executive director of First Draft. She studies the gray zones of information warfare, where bad actors mix facts with falsehoods, news with gossip, and sincerity with satire. “Most of this stuff isn't fake and most of this stuff isn't news,” Claire argues. If these subtler forms of misinformation go unaddressed, tech companies may not only fail to flatten the curve — they could raise it higher.
What difference does a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony before the Energy and Commerce Committee on disinformation in the digital age. With just minutes to answer each lawmaker’s questions, he found that conveying the urgency and complexity of humane technology issues to Committee members is an immense challenge. Still, Tristan returned hopeful: though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding and stronger coalitions.
We are in the middle of a global trust crisis. Neighbors are strangers, and local news sources are becoming scarcer; institutions that used to symbolize prestige, honor and a sense of societal security are ridiculed for being antiquated and out of touch. To fill the void, we turn to sharing-economy companies and social media, which come up short, or worse. Our guest on this episode, academic and business advisor Rachel Botsman, guides us through how we got here, and how to recover. Botsman is the Trust Fellow at Oxford University and the author of two books, including “Who Can You Trust?” The intangibility of trust makes it difficult to pin down, she explains, and she speaks directly to technology leaders about fostering communities and creating products the public is willing to put faith in. “The efficiency of technology is the enemy of trust,” she says.
“You can binge watch an ideology in a weekend,” says Tony McAleer. He should know. A former white supremacist, McAleer was introduced to neo-Nazi ideology through the U.K. punk scene in the 1980s. But after his daughter was born, he embarked on a decades-long journey from hate to compassion. Today’s technology, he says, makes violent ideologies infinitely more accessible and appealing to those who long for acceptance. Social media isolates us and can incubate hate in a highly diffuse structure, making it nearly impossible to stop race-based violence without fanning the flames or driving it further underground. McAleer discusses solutions to this dilemma and the positive actions we can take together.
Brittany Kaiser, a former Cambridge Analytica insider, witnessed a two-day presentation at the company that shocked her and her co-workers. It laid out a new method of campaigning, in which candidates greet voters with a thousand faces and speak in a thousand tongues, automatically generating messages increasingly aimed at an audience of one. She explains how these methods of persuasion have shaped elections worldwide, enabling candidates to sway voters in strange and startling ways.
Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must “wake up,” she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
What causes addiction? Johann Hari, author of Chasing the Scream, traveled some 30,000 miles in search of an answer. He met with researchers and lawmakers, drug dealers and drug makers, those who were struggling with substance abuse and those who had recovered from it, and he came to the conclusion that our whole narrative about addiction is broken. "The opposite of addiction is not sobriety," he argues. "The opposite of addiction is connection." But first, we have to figure out what it really means to connect.
Every 40 seconds, our attention breaks. It takes an act of extreme self-awareness to even notice. That’s why Gloria Mark, a professor in the Department of Informatics at the University of California, Irvine, started measuring the attention spans of office workers with scientific precision. What she has discovered is not simply an explosion of disruptive communications, but a pandemic of stress that has followed workers from their offices to their homes. She shares the latest findings from the “science of interruptions” and how we can stop forfeiting our attention to the next notification, and the next one, ad nauseam.
In the second part of our interview with Renée DiResta, disinformation expert, Mozilla fellow, and co-author of the Senate Intelligence Committee’s Russia investigation, she explains how social media platforms use your sense of identity and your personal relationships to keep you glued to their sites longer, and how those design choices have political consequences. The online tools and tactics of foreign agents can be very precise and deliberate, but they don’t have to be: Renée has seen how deception and uncertainty are powerful agents of distrust, and how easy they are to create. Do we really need the effortless global amplification of information-sharing that social media enables? We don’t want spam in our email inboxes, so why do we tolerate it in our social media feeds? What would happen if we had to copy and paste, or click twice, or three times? Tristan and Aza also brainstorm ways to prevent and control disinformation in the lead-up to elections, particularly the 2020 U.S. election.
Today’s online propaganda has evolved in unforeseeable and seemingly absurd ways; by laughing at or spreading a Kermit the Frog meme, you may be unwittingly advancing the Russian agenda. These campaigns affect our election integrity, public health and relationships. In this episode, the first of two parts, disinformation expert Renée DiResta talks with Tristan and Aza about how these tactics work, how social media platforms’ algorithms and business models allow foreign agents to game the system, and what these messages reveal to us about ourselves. Renée gained unique insight into this issue when, in 2017, Congress asked her to lead a team of investigators analyzing a data set of texts, images and videos from Facebook, Twitter and Google thought to have been created by Russia’s Internet Research Agency. She shares what she learned, and in part two of their conversation, Renée, Tristan and Aza will discuss what steps can be taken to prevent this kind of manipulation in the future.
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume made it his mission to shed light on those hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
Aza sits down with Yael Eisenstat, a former CIA officer and a former advisor at the White House. When Yael noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed that, as danger at home increased, her public-sector experience could help fill a gap in Silicon Valley’s talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn’t go as planned. Yael shares the lessons she learned and her perspective on government’s role in regulating tech, and Aza and Tristan raise questions about our relationships with these companies and the balance of power.
In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are really after much of the time — and it’s not money. It’s the same thing we’re looking for when we mindlessly open up Facebook or Twitter. How can we design products so that we’re not taking advantage of these universal urges and vulnerabilities, but instead using them to help us? Tristan, Aza and Natasha explore ways we could shift our thinking about making and using technology.
Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound in an endless loop of play. She never imagined that the addictive designs she first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, she offers a prescient warning to users and designers alike: How far can the attention economy go toward stealing another moment of your time? Farther than you might imagine.