We live in a world where technology competes for our attention, especially on our smartphones. Apps do everything they can to get us to open them and never leave. At least that’s how I feel: with endless newsfeeds, notifications, and autoplay, it’s so easy to just open the phone and get stuck. The feeling is not new, but the thing that pinned it down for me was the book Zucked by Roger McNamee [1]. It pinpointed the reason for the feeling: both why companies do it and what they do. By using the data they hold on their users, companies maximise consumption, whether that is video content on a streaming platform or browsing the newsfeed on social media.
I would argue that there are two kinds of platforms: the ones you pay for with money, and the ones you pay for with data. When paying with data, the user is often the product. The way companies sell that product and make money is advertising. By knowing their users, companies are able to tailor the most appropriate ads for each of them. The more ads the user sees, the more revenue the company earns. It is therefore in the company’s best interest to keep the user engaged and coming back. For social media, it’s profitable to be addictive: the longer users stay on the platform, and the more they interact with it, the more the platform knows about them, and the more and better ads it can show.
As an answer to this exploitation of users and their data, a movement has arisen. Humane technology aims for ethical technology. By focusing on adding value for the user, without exploiting them or their data, it is the polar opposite of where many of the major platforms are heading. The Center for Humane Technology is a great source of both inspiration and knowledge in these areas. They even propose the following principles [2]:
- Obsess over Values; Today there is an obsession with clicks, likes, and other instant-reaction metrics. This promotes clickbait to maximise those metrics. Instead we should use metrics of actual value (fun, creativity, well-being): what did the user get out of this? It is harder to measure, but ensures greater value for all parties.
- Strengthen Existing Brilliance; Technology is moving very fast and entering more and more spaces, but not everything needs a technical adaptation or solution. Some things cannot be replaced by technology. For example, if you feel lonely on a weekend evening after being home alone, the solution might be to invite some friends over for dinner and discussion. Tech could help you set it up, prepare the meal, and so on, but once you are seated at the table, it’s you and your friends who bring each other joy.
- Make the Invisible Visceral; To ensure that we consider every ethical and safety aspect of our product, it can be a good idea to reconsider how we frame the user personas in our design. By imagining a random old lady as a relative of yours, perhaps your grandmother, you might be more cautious about how the product could affect her.
- Enable Wise Choices; By changing the way we frame information, we help readers make a choice. All information carries a bias towards one interpretation. A common example is the statistic that cows are deadlier than sharks, which is biased towards the dangers of cows due to the large difference between the shark and cow populations. This does not, however, mean that a cow is more dangerous than a shark.
- Nurture Mindfulness; To ensure the well-being of the user, it is important to allow for a balanced experience. By nurturing users’ mindfulness, we increase their awareness. When there is no new notification prompting them to engage at every dull moment, they are encouraged to actively seek out whatever they need, and sometimes that is just a calm moment to relax.
- Bind Growth with Responsibility; The number of users should not be the only goal; platforms and other technology should take responsibility for growing ethically. How can we grow without compromising our values? You should not be willing to grow at any cost, but rather find a balance where your ethics are sound and your users happy.
These principles are a great way to start working towards humane technology, but as with the prisoner’s dilemma [3], it is still easy for others not to play nice and to exploit users to gain more influence. I hope that we continue to move in a direction where users gain enough insight to reward nice and ethical behaviour (humane technology) over the exploitation of users.
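To make the prisoner’s dilemma analogy concrete, here is a minimal sketch of the classic payoff matrix applied to two competing platforms. The payoff numbers and the platform framing are my own illustrative assumptions, not from McNamee or the Center for Humane Technology; they simply follow the standard dilemma ordering (temptation > reward > punishment > sucker’s payoff):

```python
# Illustrative prisoner's dilemma between two platforms:
# "cooperate" = build humane technology, "defect" = exploit users.
# Payoff values are hypothetical, chosen to satisfy the classic
# ordering T(5) > R(3) > P(1) > S(0).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # both humane: healthy ecosystem
    ("cooperate", "defect"):    (0, 5),  # the humane platform loses out
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # race to the bottom
}

def best_response(opponent_move):
    """Return the move that maximises our own payoff against a fixed opponent."""
    return max(("cooperate", "defect"),
               key=lambda move: PAYOFFS[(move, opponent_move)][0])

# Exploiting users is the dominant strategy whatever the other platform does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...even though mutual cooperation pays both sides more than mutual defection.
assert PAYOFFS[("cooperate", "cooperate")][0] > PAYOFFS[("defect", "defect")][0]
```

Informed users can change this game: if enough people reward humane behaviour, the payoff for cooperating rises and defection stops being the dominant strategy.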