
Tristan Harris says tech has taken only ‘baby steps’ toward reform

Over the last six years, Tristan Harris has forced us to think differently about the devices and digital services we use every day.

First as a product manager at Google and later as an outside critic of the tech industry, he’s shone a light on what’s been called the attention economy, the way our phones and apps and web services are constantly diverting and distracting us.

It took years for his critique to spread. But boy has it.

News that Russian-linked provocateurs had hijacked Facebook and other social media sites to spread propaganda during the 2016 election helped boost his profile. So too did reports that social media was leading to a significant uptick in depression among kids. Since then, Harris has found a ready audience ranging from everyday citizens to heads of state wanting to better understand how tech companies are manipulating or being used to manipulate their customers.

Harris, who cofounded the Center for Humane Technology to develop and promote ideas for reforming the tech industry, has already made a mark on the industry. Features such as Apple’s Screen Time, which iPhone owners can use to set limits on how much they use their devices and apps, are a direct result of the criticisms he’s raised.

And more may be on the way. For the first time, policymakers in the US and across the globe, many of whom have consulted with Harris and his colleagues, are seriously considering regulations to reset the relationship between technology companies, their customers, and the wider society. On Monday, for example, the UK’s Information Commissioner’s Office told the BBC it was considering severely restricting the amount of data social networks can collect on children by putting in place a range of measures, including limiting their use of like buttons.

Business Insider spoke with Harris recently about what inspired him to start his movement and what he feels he’s achieved so far. This interview has been edited for length and clarity.

Harris felt the industry was heading in the wrong direction

Troy Wolverton: You’ve been trying to draw attention to and get the tech companies to address the abuses of what you call the attention economy since you put together your presentation in 2013. How do you think you’ve affected the industry or the debate?

Tristan Harris: At the beginning, people didn’t necessarily want to admit that there was a problem. I mean, the slide deck at Google went viral, and [it resonated with] people. But there was no action. It was just lots of denialism, lots of “Oh, people are addicted to lots of things: cigarettes, alcohol. Isn’t this just capitalism?”

And it’s like, guys, we’re creating a very specific form of psychological manipulation and influence that we, the tech industry, are responsible for fixing. And getting people to admit that took a really long time. We had a hard time getting people to just agree that there was a problem that had to be fixed.

And I think now what’s changed is that people do know — because they’ve been forced to know — that there’s a problem. So now people are talking about what to actually do about it.

What I’ve been hearing recently is that, for the first time, executives at Facebook are finding their friends turning on them and saying, “Which side of history do you want to be on?”

And now I think that, because enough of the public has swayed the friends of the people at the tops of these companies, people realize there’s something structural we have to change.

Wolverton: What prompted you to put together the slide deck in 2013?

Harris: I felt like, fundamentally, there was just something wrong about the direction this was headed, which is a really scary thought when you see an entire industry headed in the wrong direction. Because up until then I thought technology was great.

This is not an anti-technology movement. But what I was starting to really wake up to was … what my most talented friends and engineers were increasingly doing was getting better and better at playing tricks on the human mind to keep people hooked on screens.

I just felt that everyone I knew was really not doing the kind of big, creative thinking that people used to do in the ’90s and the early 2000s, and it was becoming instead this race to manipulate the human mind.

Wolverton: But was there some moment that triggered that awareness, some epiphany?

Harris: I had a little bit of an epiphany. I went to the Santa Cruz mountains for a weekend with my friend Aza Raskin, who is now a cofounder of CHT. I came back from that weekend, after reconnecting with nature, and something profound kind of just hit me. I really don’t know what came over me.

I just felt like I had to say something. It felt wrong. It felt like no one else was going to say something.

I’m not the kind of person that starts revolutions or speaks up. This is something that I’ve had to learn how to do.

Heads of state have been knocking on his door

Wolverton: How has your understanding of the scope of the problem changed since you put together your 2013 presentation?

Harris: I had been CEO of a tiny Web 2.0 tech startup called Apture. I had a background, academically, in cognitive science, computer science, linguistics, user-interface design, human-computer interaction, things like that. I was trained to think about building technology products and the human mind.

Since I left Google, and especially since Cambridge Analytica, pairing up with [Silicon Valley venture capitalist] Roger [McNamee], and these issues taking off, my breadth of understanding and the scope of what’s at stake have expanded by multiple orders of magnitude.

The scope of the issue has expanded from the way a product designer would think about attention and notifications and home screens and the economics of app stores — which is how I started — to now playing 12-dimensional geopolitical chess and seeing how Iran, North Korea, Russia, and China use these platforms for global information warfare. [And it goes from there] all the way down to the way these issues affect the day-to-day social pressures and mental life of teenagers.

We have world governments knocking on our door, because they want to understand these issues. Briefing heads of state — I never thought I’d be doing that. This has been wild, and it speaks to the scope and gravity of the issue.


I knew that this issue would affect everything, conceptually, back in 2013. But I didn’t ground that understanding as I have over the last year and a half, where you actually meet the people in the countries whose elections are put at risk by these issues. Or you meet and speak with groups of parents, children, and teachers who wrestle with these issues daily. So, it literally affects everyone and everything. And it’s the choke point for what is holding the pen of human history, which is what I think people underestimate.

Wolverton: If you imagine this process as a curve going from identifying a problem to adequately addressing it, where do you think we are along it?

Harris: Still the opening innings, I think. We’re in the opening innings of a reckoning.

[Companies such as Facebook and YouTube] are going to be looked back on as the fossil fuel companies, because in the attention economy, they drill deeper and deeper in the race to the bottom of the brain stem to get the attention out of people.

[They’re] now, wherever pressure exists on them, trying to correct for the largest of the harms that occur. But that only happens because civil society research groups, usually unpaid or nonprofit-funded, stay up till 3 in the morning, scrape Facebook and YouTube, map the recommendation systems and the disinformation campaigns, and then tell … the New York Times. And then Facebook or YouTube might, if there’s enough pressure, after a congressional letter from a [Rep.] Adam Schiff or a Senator [Mark] Warner or [Sen. Richard] Blumenthal, start to do something about it.

I think, looking backwards, we’re going to say, “Oh my god, we’re so glad that we woke up from that nightmare and started designing, and funding, and structuring our technology in such a way that it’s cooperatively owned by the users and the constituencies that it most affects. It’s not on an infinite growth treadmill. It is designed with humane business models that are considerate of human sensitivities and vulnerabilities.”

Tech companies have taken only baby steps so far

Wolverton: You’ve said that companies like Facebook, Apple, and Google have taken what you call baby steps toward addressing these issues by doing things such as allowing people to set limits on the time they spend on their devices. How important are those?

Harris: They’re celebrated baby steps. I just want to be clear. I’m happy that they’re doing it, because it sets off a race to the top.

I mean, I had one of the executives of a major technology company you would know say, next to me on a stage at a private event, “The whole industry is now in a race to the top for time well spent.” I mean, that’s ridiculous. We were able to flip this around from a race to the bottom — from who can just steal attention by pulling on our paleolithic puppet strings — to now a race to the top. [Companies are now vying to] prove that they care more about … the individual’s well-being and, hopefully in the future, whole-society and civilizational well-being.

But that’s why the baby steps matter. They co-occurred with all the companies starting to race in that direction, and we have to keep that race going.

Wolverton: With all this focus on how devices and apps are demanding our attention, I was wondering how much time you’re spending on your phone these days.

Harris: Well, this is one of the most important issues for the world, for all time, and our organization and I are playing such a big role in it that, unfortunately, I am constantly working on this problem, which means constantly using technology.

I could look at my Screen Time app for you if you want. I now know the answer to that question thanks to the features that now exist in a billion phones.

Let me see. Screen Time, last seven days: the average is 3 hours and 2 minutes per day.
