There has been a steady stream of articles about and by “reformed techies” coming to terms with the Silicon Valley ‘Frankenstein’ they’ve spawned. With the recently launched Center for Humane Technology, that regret has been transformed into something more missionary.
In this post I want to focus on how the Center has constructed what it perceives as the problem with the digital ecosystem: the attention economy and our addiction to it. I question their framing of the problem in terms of individual addictive behaviour and design, rather than structural failures and gaps, and I examine the challenges of disconnecting from the attention economy. I end, however, with an invitation: that the Center engage with organisations and networks already working to address problems arising out of the attention economy.
Sean Parker and Chamath Palihapitiya, early Facebook executives, are worried about the platform’s effects on society.
The Center for Humane Technology identifies social media – the driver of the attention economy – and the dark arts of persuasive design, or UX, as culprits in the weakening of democracy, children’s well-being, mental health and social relations. Led by Tristan Harris, aka “the conscience of Silicon Valley”, the Center wants to disrupt how we use tech and get us off the very platforms and tools most of its members worked to get us onto in the first place. They define the problem as follows:
“Snapchat turns conversations into streaks, redefining how our children measure friendship. Instagram glorifies the picture-perfect life, eroding our self worth. Facebook segregates us into echo chambers, fragmenting our communities. YouTube autoplays the next video within seconds, even if it eats into our sleep. These are not neutral products. They are part of a system designed to addict us.”
“Pushing lies directly to specific zip codes, races, or religions. Finding people who are already prone to conspiracies or racism, and automatically reaching similar users with “Lookalike” targeting. Delivering messages timed to prey on us when we are most emotionally vulnerable (e.g., Facebook found depressed teens buy more makeup). Creating millions of fake accounts and bots impersonating real people with real-sounding names and photos, fooling millions with the false impression of consensus.”