The Silicon Valley insider who says turn off your phone
This article is from The Weekend Australian Magazine and The Times.
‘Tech is tearing us apart’
The balance between humankind and technology has reached a tipping point, says this Silicon Valley insider – and we all stand to lose.
By BEN HOYLE
From The Weekend Australian Magazine
Reporting can be a scary job. I have had nervous moments with warlords, gangsters and neo-Nazis. I have been shot at and threatened. Once I had to endure, without displaying any outward sign of panic, the whole of Tonight’s the Night, the Rod Stewart musical. But if Tristan Harris is right, the presentation playing now on his phone is the most frightening thing I’ve seen in my life. It’s a road map for the erosion of civilisation as we know it.
Harris, 35, is a former Google insider who has been called “the closest thing Silicon Valley has to a conscience” and a “Silicon Valley apostate”. He believes we’re in the midst of a great social upheaval caused by technology companies that view the world’s 2.7 billion smartphone users as a resource whose attention they can mine for profit. The resulting competition has a very unfortunate side-effect: “attention capitalism” is making us nastier, stupider and much less likely to find common ground with our fellow humans.
We can try to resist, but it is not a fair fight. Whenever you open Facebook, Instagram or YouTube, you switch on what Harris calls “a voodoo doll-like version of you in a supercomputer”. It consists of nearly everything you’ve ever clicked on, liked or watched. That’s how these companies keep you ensnared: they know you better than you know yourself.
Harris’s conclusions are controversial, but his influence is unmistakeable. He has briefed world leaders and is a confidant of some of the most powerful figures in the technology industry. He has testified to the US Congress. His two TED Talks have more than 4 million views.
More is at stake here than children spending too much time on screens, companies selling our data or hackers interfering in elections, Harris argues. What is actually happening is a fundamental rewiring of human brains, leading to behaviour “that is tearing apart our social fabric”, he says.
We’re in San Francisco’s business district, upstairs from the offices of the Centre for Humane Technology, the non-profit organisation Harris co-founded. He is slightly built, with vigilant eyes, and wears an old-school digital watch – a means of freeing himself from checking his phone.
Saving the world sounds exhausting. Harris’s Shortwhale page (a service for winnowing email overload) explains that for his “health and sanity” he minimises his email time. Potential contacts should bear in mind that every week he gets “10+ major media interview requests” and “10+ major speaking engagement inquiries”. Every month sees “10+ film documentary interview requests” and “10+ major inquiries from major governments”.
As soon as Harris starts his pitch, though, he gleams with evangelical purpose. Ten minutes in, he leaps to his feet and sketches a graph on a whiteboard to show the moment when technology will overwhelm humankind’s strengths: when artificial intelligence can do everything better than we can. It looks reassuringly far off. But then he opens up the presentation on his phone and homes in on a much earlier watershed. This, he says, is when the algorithms that churn away in the background of our lives achieve a form of stealth supremacy by hacking our human weaknesses: vulnerabilities such as vanity, social insecurity and susceptibility to information that affirms our existing prejudices. Technology doesn’t have to be so advanced to penetrate this soft underbelly. We’re there already. “The first crossing point was when it overloaded our mental limits, which we feel as information overload,” Harris explains. That probably happened in the early Noughties, he says. Then smartphones arrived and became a portal through which apps such as Facebook could reach “and grab the puppet strings of your self-image and social validations”.
Since then, our relationship with technology has had profound effects. “You get shortening of attention spans, addiction, disinformation, narcissism, outrage, polarisation,” he says. This is measurable. A 2018 study by the Massachusetts Institute of Technology showed that fake news spreads six times faster than accurate news. The same year, FaceApp went viral by offering users a chance to generate plausibly aged images of themselves and share them; thus did its Russian-based designers persuade 150 million people to hand over images of their faces, paired with their names.
The most damaging development is the most recent. Harris calls it “the checkmate”. This is when technology “attacks the foundation of what we trust” via fake news, bots and deepfake videos. Even if you boycott the internet, people around you might be radicalised by YouTube videos or choose not to vaccinate their children because of misinformation spread online. Tech-influenced crises are erupting everywhere. “This is a self-reinforcing system that gets worse as [the problems] feed each other,” Harris says. “We call it ‘human downgrading’. This isn’t the privacy problem. This isn’t the data problem. This is the diagnosis for why all this shit is going wrong at the same time.” He often quotes evolutionary biologist Edward Wilson, who said: “The real problem of humanity is [that] we have Palaeolithic emotions, medieval institutions and godlike technology.”
Belief in truth and facts is slipping away at a time “when it has never been more urgent… [for] the whole world to see our world’s problems the same way very quickly”, Harris says. I feel myself recoiling from the dystopian forecast on his phone. He grins. “I try to stay lighthearted, but that’s why we lose sleep. That’s why we work so hard.”
The goal is to change how technology is built. Last year, the Centre for Humane Technology launched a podcast called Your Undivided Attention in which Harris and his co-founder, Aza Raskin, interview experts who can help demystify human downgrading, including authorities on cults, casino design, addiction, election hacking and methods of persuasion. In one episode, Gloria Mark, a professor of informatics at the University of California, talked about the “science of interruptions”. She has found that when people work on computers, their attention breaks every 40 seconds. Less than two decades ago it was every three minutes. “We are still in the Wild West of tech development,” she said. Tech “is being developed without really thinking about how it fits with human beings”. That’s what Harris and Raskin want to change. Their approach is a pincer movement: they lobby tech leaders discreetly and hold workshops inside tech companies, while mounting a public campaign to increase external pressure.
In April, they gathered several hundred tech heavy-hitters at a San Francisco amphitheatre to introduce the concept of human downgrading. The audience included co-founders of Apple, Craigslist and Pinterest, vice-presidents at Facebook and Google, and venture capitalists. At a dinner for a select group of attendees, Harris flagged a silver lining to the clouds he’d depicted. “Unlike climate change, it only takes about 1000 people to reverse human downgrading,” he said. “In this room, right now, are many of those people.”
The main offenders are obvious. “It’s hard not to look at this and say, essentially, we have at least two of the biggest companies – Facebook (including Instagram and WhatsApp) and Google (including YouTube) – incentivised to create this digital Dark Age, where disinformation outcompetes information.” YouTube has two billion unique monthly users, giving it a footprint roughly the same size as Christianity. Facebook is bigger.
The founders of these companies didn’t set out to build a system to undermine humanity, Harris says, but they are now trapped by the way their businesses are configured. “I found that it’s only been external pressure – from policymakers, shareholders and media – that has changed companies’ behaviour,” he told a Senate hearing in June.
Apple is often praised, rightly, for its privacy standards, but it is also “the company that can change all this, because it’s not bound by the incentives of maximising attention”. It makes its money through sales of devices and through a cut of fees paid for services bought or subscribed to via its App Store. Apple could simply lock out of the App Store all companies that have a business model based on maximising attention, he suggests. That would incentivise change pretty quickly.
Of course, Harris recognises that when a business model is the problem and that model has “one and a half trillion dollars of market value”, companies won’t switch course overnight. But he believes most technologists are idealists. Harris studied computer science at Stanford University; two of his friends from that time, Kevin Systrom and Mike Krieger, later started Instagram, which they sold to Facebook for $1 billion. At Stanford, the three were in a group imagining ways to improve the world through technology, and Harris doubts that either Systrom or Krieger has lost that aspiration. Many tech leaders he knows feel the same way. He’s also close to Jack Dorsey, the boss of Twitter, which perhaps not coincidentally introduced a ban on political adverts in November.
At Stanford, Harris joined the Persuasive Technology Lab run by behavioural psychologist BJ Fogg. It has since gained quasi-mythical status for training a generation of entrepreneurs to use psychological insights to influence users’ actions. Harris co-founded a company, Apture, which was bought by Google in 2011; he ended up working on Gmail, the company’s email service. Anxieties about attention capitalism boiled inside him there for almost a year, he says, “because I saw the situation getting worse and I didn’t see the key product, Gmail, sufficiently attack the problem”.
In 2013, Harris wrote a presentation setting out his thoughts on the “enormous responsibility” borne by designers like him for how “millions of people around the world spend their attention”. It went viral within Google and he ended up discussing it with Larry Page, then the chief executive. Google gave him a new role, “design ethicist”, but ultimately he became frustrated with the company’s failure to reform and in 2016 left to run a non-profit advocacy group. He called it Time Well Spent, to crystallise what he thought a user’s experience of technology should be.
In 2017, tech investor Roger McNamee saw Harris on 60 Minutes talking about how app design made smartphones addictive in the same way that slot machines are. McNamee, who suspected Facebook was “a clear and present danger to democracy”, was intrigued. The two men joined forces and added Jim Steyer, founder of Common Sense Media. They briefed members of Congress investigating interference in the 2016 election and discussed privacy violations with lawyers and politicians; within months, nearly 40 states had opened investigations into Facebook. A Wall Street Journal profile depicted the trio in cowboy garb as “the New Tech Avengers”.
Other aspects of Harris’s message were gaining traction too. In January 2018, Facebook boss Mark Zuckerberg outlined his company’s goals for the year in a post that began, “One of our big focus areas is making sure the time we all spend on Facebook is time well spent.” A few months later, Google and Apple announced initiatives to help users monitor and reduce their screen time. Harris was encouraged, but wary of his ideas being diluted as the companies co-opted his language, so he retreated from the spotlight to brainstorm with Raskin and update his vision. The theory of human downgrading and the presentation in April were the first fruits of that process.
Not everyone has been won over. Andrew Przybylski, director of research at the Oxford Internet Institute, believes Harris means well but lacks scientific evidence for his claims. Dean Eckles, a professor at the Massachusetts Institute of Technology, also questions the evidence for the theory of human downgrading. He is “not sure” that there is proof of “a general erosion in our faith in facts”, or that society is significantly more polarised by social media today than it was by partisan, sensationalist journalism a century ago. However, he stresses that “Tristan has done good” by increasing scrutiny on how tech companies’ business models affect society.
That scrutiny is paying off. Fogg recently renamed the Persuasive Technology Lab the Behaviour Design Lab, with a focus on fostering “good habits”. Recently he made a forecast for the new year. “A movement to be ‘post-digital’ will emerge in 2020,” he tweeted. “We will start to realise that being chained to your mobile phone is a low-status behaviour, similar to smoking.”
In May last year, Chris Hughes, a co-founder of Facebook and adviser to the Centre for Humane Technology, called for Facebook to be broken up. His chief concern was Zuckerberg’s unprecedented power “to monitor, organise and even censor the conversations of two billion people”. Two months later, the Federal Trade Commission fined Facebook $5 billion for violating users’ privacy. In September, it fined Google $170 million for collecting children’s personal information via YouTube.
In October, Zuckerberg was hauled before Congress for a second time in 18 months, a development that would have seemed “crazy” only a few years ago, Harris says. Soon afterwards, it emerged that more than 250 Facebook employees had written a letter to the company’s top team protesting at its refusal to fact-check political ads.
Harris sees these developments as proof that he’s on the right path. The challenge is to press on. “We’re something like eight people in an office in San Francisco with every government, thousands of engineers and media knocking at our door,” he says, looking suddenly weary. “You can imagine how overwhelmed we are and how little of a personal life any of us has, because of how much is at stake and how quickly it needs to change.”
HOW TO MAKE YOUR PHONE LESS ADDICTIVE
Turn off all notifications except those from people
Most notifications are generated by machines, not actual people. They keep our phones vibrating to lure us back into apps we don’t really need. Visit settings > notifications and turn off all notifications, banners and badges, apart from apps where real people want your attention. Or, better still, turn off all your notifications altogether.
Switch your screen to greyscale
Colourful icons give our brains shiny rewards every time we look at our phone. The solution? Set your screen to greyscale. In iOS, go to settings > general > accessibility > accessibility shortcut (bottom) > colour filters. This allows you to triple-tap the home button to toggle greyscale on and off, so you keep colour when you need it. In Android, go to settings > digital wellbeing & parental controls > wind down.
Charge your device outside the bedroom
Get a separate alarm clock and charge your phone in a different room. This way, you can wake up without getting sucked into your phone before getting out of bed.
Keep your home screen for tools only
Do you open apps mindlessly because they are the first thing you see when you unlock your phone? Try limiting your first page to tools – the apps you use for quick in-and-out tasks such as Maps, Camera, Calendar, Notes. Move the rest of your apps, especially mindless choices, off the first page and into folders.
Launch apps by typing their names
Swipe down and type the app you want to open instead of leaving easily accessible bad habits on the home screen. Typing takes just enough effort to make us pause and ask, “Do I really want to do this?” In Android, you can use the search box on your home screen. In iOS, for best results turn off Siri suggestions (settings > Siri & search > Siri suggestions to off).
Better still, remove social media from your phone altogether
If you really want to use your phone less, remove all the major social media apps. It’s the easiest way to cut back, because these apps gobble up so much time. Train yourself to use them from your computer only (if at all). Note: you can delete the Facebook app and still get some specific features, such as Messenger for messages.
Send audio notes or call instead of texting
Text messages are easily misinterpreted, while the voice is rich with tone and far less vulnerable to misreading. Recording a quick voice message is often faster and less stressful than typing one. Plus, it doesn’t require your full visual attention.
Use texting shortcuts
In iOS, press and hold on a text message and you’ll see a menu of quick reactions. It’s faster than crafting a response and can also add some context, giving a taste of the emotion that’s often lost in a text.