How much time do you spend on social media? Do you know how many hours of your life are lost scrolling passively through your Facebook feed, watching YouTube videos or thumbing mindlessly through your Instagram stories? And do you really want to know?
It’s about to become more difficult to avoid knowing.
A host of new “digital wellbeing” tools are being rolled out by tech giants, as the industry responds to recent criticisms that its products tap into vulnerabilities in human psychology, and are engineered for addiction.
Last month, Facebook launched a new suite of tools allowing users – though not yet those in Ireland – to track how much time they have spent on Facebook and Instagram, to limit notifications, or set a reminder to switch off.
Google recently began offering similar tools for its users – including a Time Watched feature on YouTube. In June, Apple announced its plans for a new function called Screen Time, which will monitor how often users pick up their iPhone or iPad during the day, and help them manage notifications. The purpose, it says, is to "empower" users who need help "balancing the many things that are important to them".
These new products haven’t come out of the blue. The tech industry is facing something of an existential crisis, as it struggles to reconcile the revenue models of the attention economy with deeper questions about the potentially destructive social and psychological effects of spending too much time online.
Last year, the former Google design ethicist Tristan Harris warned that “all of our minds can be hijacked. Our choices are not as free as we think they are”.
In November, the former Facebook president turned social media "conscientious objector" Sean Parker made headlines when he claimed the company's founders knew they were creating something addictive, and that they knowingly exploited "a vulnerability in human psychology" from the outset.
“It literally changes your relationship with society, with each other. It probably interferes with productivity in weird ways,” he said. “God only knows what it’s doing to our children’s brains.”
“We’ve heard these comments and seen them in the press, but that is not the experience that I’ve seen or that anybody I know has seen at Facebook where that is the orientation,” says David Ginsberg, Facebook’s vice president for user research, who was in Dublin last week for an event discussing the effect of social media on people’s wellbeing.
“The orientation is building products that help people. Every day, we’re coming to work to build a product that has value for people. That being said, there is a larger industry-wide conversation going on about our relationship with technology that I think is actually quite healthy.”
In January, Mark Zuckerberg announced that his personal goal for 2018 was to "fix Facebook". "The world feels anxious and divided, and Facebook has a lot of work to do," he said.
But the public sense of unease about the power held by a handful of Silicon Valley companies hasn't been assuaged by some of the events since. In June, a Channel 4 Dispatches investigation showed trainee moderators at Facebook's Dublin offices being told to leave disturbing content online.
Last week, Facebook's Sheryl Sandberg admitted to US senators that the company had been "too slow" to spot Russian interference in the election. "We were too slow to spot this and too slow to act. That's on us," she said.
As the tech giants try to get ahead of this latest avalanche of criticism, expect to hear a lot more about “digital wellbeing” and “balance”, about the benefits of “intentional” or “active” use of social media over more passive consumption, and about new tools to help us take back control.
In a blogpost published last week, Google said its research showed “a sense of obligation has crept into tech. People want tools to break it.”
Addiction
But Dr Eoin Whelan, a lecturer in business information systems at NUI Galway, who has studied our relationship with technology extensively, says we have to stop looking to technology to solve problems created by technology.
“If the tech industry really wanted to help us change how we use their products, they could start by designing the technology in a way that’s for our benefit. Instead, there’s now a whole new industry built up around trying to solve the problem of our addiction to technology.”
There is limited research on the effectiveness of time management tools, he says, so it’s difficult to measure whether awareness leads to action. There’s a risk, he says, that “tools like these can just lull people into a false sense that the tech will solve the problem”.
There’s a simple answer, but don’t expect to hear it from the tech industry, Whelan adds. “Take personal responsibility, put your phone away, go for a walk, read a book, spend time with the people who matter.”
So should we be worried by these dire warnings about technology hijacking our brains? Or are we in the midst of an overreaction, a clutching at our metaphorical pearls in another version of the same moral panic that has greeted every new technology?
“Certainly a mounting body of whistleblower-type evidence points to sustained attempts to make social technologies as ‘addictive’ as possible. However, there is a difference between ‘addictive-type behaviours’ and true clinical addiction,” points out forensic cyberpsychologist Dr Mary Aiken.
“Smartphone ‘addictive behaviours’ and ‘social media addiction’ have been heavily discussed in the media,” she says. “Causation has not as yet been established, but that is not to say that we will not have clinical conditions or diagnoses going forward.”
Too little, too late?
Facebook’s research shows that wellbeing is not just about the quantity of time we spend online – it’s also about the quality, and what we do when we’re online. “One of the big things we’ve learned from all the research on wellbeing, both externally and research that we’ve done internally, is that when people are interacting with each other on social media that tends to be positive for their wellbeing. They’re more intentional, and it’s about strengthening relationships.
“Whereas [when] people are just passively consuming content, that tends not to be good for their wellbeing,” says Lauren Scissors, Facebook’s user experience research manager for News Feed, the constantly updated content that, according to the company, “shows you the stories that matter the most to you”.
Scissors says Facebook has also made changes to how its News Feed feature works, in response to research that shows passive consumption (scrolling through your feed, clicking on links and liking posts) results in more negative feelings than active interaction (messaging one-on-one, posting and commenting).
It’s worth noting that the thing Facebook has identified as being good for users’ wellbeing also, conveniently, benefits the tech industry: more engaged users are more valuable to advertisers than very passive ones.
There is a question, too, about whether all this is just too little, too late. Facebook’s motto “move fast and break things” hasn’t always left much room for introspection, or for thinking about the long-term costs of new features.
Today, say Ginsberg and Scissors, every new innovation at the company exists to solve a problem and create a better experience for users. “That’s a deep part of almost everything that we do today. This is a company that is incredibly mission focused, and our mission is to give people the power to build community and to bring the world closer together,” says Ginsberg.
One way to measure the effectiveness of these new tools – and to deepen our understanding of the impact of our relationship with technology – would be to open anonymised data to researchers, and not just those working directly with tech companies.
There are privacy concerns to be considered in making even anonymised data available, says Ginsberg. However, earlier this year Facebook launched the Election Research Commission, headed by two academic researchers who act as a gateway to other, independent researchers, who can request access to aggregated, de-personalised datasets on particular topics.
“It’s focused right now just on election integrity issues. We’re piloting it there just as a first start, and if this is a successful model, we’ll scale it to other areas,” he says.
“Those researchers will maintain complete independence because those researchers will never work directly with us. We think if this works, it will be a new model for the entire industry. We’re putting our data into it and we hope to see in time Google and Twitter and other companies will see this as a scalable solution.”
The truth is that, until more independent academic research is done, we can’t know what the long-term effects of any new technology will be. But Dr Aiken – who has said that asking tech companies to self-regulate is like asking drunk drivers to award themselves penalty points – believes we can’t afford to wait and see.
While academic research struggles to keep pace, “kids are engaging with technology, parents and teachers have little or no guidelines, while a whole cohort metaphorically sit on the fence waiting for the results of poorly conceptualised longitudinal studies,” she says.
She points to recent tabloid headlines about the videogame Fortnite – which has 125 million players worldwide, many of them young: “Game Over: Girl 9 in rehab after getting hooked playing Fortnite for 10 hours a day and wets herself to avoid switching off.”
“It does not take a rocket scientist to figure out that something is wrong, and that we need to act now – we need academic first responders.”