ESG: why Facebook does not belong

humane technology | responsible tech | social | stock exchange

On May 7 the S&P 500 ESG Index dropped Facebook for the second time. Explaining the current drop, S&P cited deterioration in environmental reporting, operational eco-efficiency and policy influence. That may be true. However, Facebook is also a prime example of the “S” problem of social media: its technology should serve the interests of humanity, not the other way around.

Former Facebook executive Chamath Palihapitiya has been widely quoted saying: “The short-term dopamine-driven feedback loops that we have created are destroying how society works. It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are. I would encourage all of you, as the future leaders of the world to really internalize how important this is. If you feed the beast, that beast will destroy you.” The pandemic created an environment supporting the beast. Dramatic increases in screen time have been reported worldwide.

Designed for addiction

“Behind every screen on your phone, there are generally like literally a thousand engineers that have worked on this thing to try to make it maximally addicting” – BBC Worldwide quotes former Mozilla and Jawbone employee Aza Raskin. Current business models are fundamentally flawed: they are designed to extract as much attention and data from users as possible and sell them on as a ‘product’.

Of course, we all have our own ways of fighting the beast. Some put their phones in airplane mode when they get home to save their family life. Some practice slow tech parenting. Some seek out and join like-minded individuals bent on stopping the spread of addictive technologies. Enter the Silicon Valley-based Center for Humane Technology. Its ambitious goal is realigning technology with humanity’s best interests. But where do we start? Their ‘definition’ of what qualifies as humane technology stands on three pillars:

  1. Humane technology is transparent. People know what the technology does when they use it because it conforms to their expectations. There are no hidden ulterior motives or dark patterns that trick users.
  2. Humane technology lets people opt-out. People are not forced to sign up for services or needlessly give up data when they use humane tech. This includes the right to remove sensitive personal information from products like search engines.
  3. Humane technology has legal language that is easy to understand. People can comprehend the terms of service for tech they use so they can make informed decisions. Contracts and privacy policies are written or explained in the simplest legally-permissible language.

I like the fact that the Center focuses on a positive vision rather than just beating up on the negative. But I do wonder what can really be done to regain control of our real and social media lives. How can consuming technology and ‘meaningful’ content not consume our most important relationships? 

How to humanize technology before it robotizes us? 

I propose seven potential ways of dealing with the rising conflict between users and designers of technology. Some are more realistic than others:

  1. Social Responsibility – developers can choose to stop purposefully designing addictive features for platform technologies.
  2. Self-regulation – industry bodies of technology companies may put in place codes of ethics and bylaws, committing to do away with the intentional design of addictive technology or, for that matter, to develop humane technologies in the future.
  3. Internal rebellion – employees of the producers of addictive technologies protest against their own employers. It’s already happening.
  4. Consumer Activism – groups of consumers can stage protests, voice criticism and try to force designers of addictive technologies to humanize their approach.
  5. Media – the press can play an important role in raising awareness of technology addiction and in naming and shaming the culprits.
  6. Regulation – technologies designed with the purpose of addiction may be labeled, taxed, banned or even criminalized. (Check out proposals by former 2020 presidential candidate Andrew Yang.)
  7. Rise of humane, sustainable technologies – attention brokers may eventually give way to designers of humane technologies with socially sustainable business models targeted to serve the interests of users as they scale and grow.

Alibaba founder Jack Ma said: “Technology should always do something that enables people, not disable people.” “We have the responsibility to have a good heart, and do something good,” he added. His wake-up call has a simple message: technology should serve the needs of humanity and not the other way around.

There is no one-size-fits-all solution to the problem, but time is running out for us to find ways to humanize technology before it robotizes us.