I’m taking, at my own pace, the free online course “Foundations of Humane Technology” by the Center for Humane Technology (which I first heard of after watching the excellent documentary “The Social Dilemma”). Here are my notes.
Technology vs. human nature
As humans, we inherit evolutionary conditioning that developed over millions of years. — Center for Humane Technology
🤔 Not to freak out entirely just yet, but my brain just leapt to this thought: When I think of the damage done over just ten years of asocial behaviour on so-called social media, I really worry about evolutionary conditioning and its possible acceleration in the face of individualism and instant gratification.
⚠️ Heuristics (shortcuts for making decisions, sometimes fast decisions on incomplete data) may harden into cognitive biases (such as social conformity bias, where our opinions are shaped by the actions of others), which can then be manipulated and exploited for profit or power.
An interesting overconfidence bias is the above-average effect: most people think they’re above average, which of course can’t be true for everyone. Because of that, we believe we cannot easily be manipulated, influenced, and shaped. The truth is that we make decisions based on context and/or emotions, however great we (think we) are.
- What we think is a fair price depends on the first price we hear. This is known as price anchoring.
- If someone has just read a paragraph containing the word “Ocean,” and then you ask them to identify their favorite clothing detergent, they’re more likely to choose “Tide.” This is known as subliminal cueing.
- A nauseating environment inclines us toward harsher moral judgments.
🤔 Personal reflection: hijacked biases
Are there any instances in the last day or two you can think of when your brain’s heuristics may have been hijacked, by technology or anything else?
Yes. At least once a day, often more, I pick up my smartphone to do something specific, and seeing a particular app icon or a notification totally hijacks my focus. Most of the time I don’t even accomplish the thing I had set out to do until after I’ve put the phone down and something else reminds me of it. It makes me wonder how many things I ended up not doing after all 😂
Design to enable wise choices, NOT to exploit human vulnerabilities
Human vulnerabilities and tendencies need to be factored in as we study user behavior. Ignoring values risks trading what’s right for what works.
Good practice: technologies or products should be capable of:
- Receiving information from users about their experience
- Learning how your technology does or doesn’t support user values
- Evolving towards increasing alignment between platform design and user well-being