Welcome back! It has been a little while since we last checked in; with a slew of life things piling on, it was very difficult to focus on the blog. At long last, here is a new entry on why you just can’t quit Candy Crush. Science holds the answers.
When behavioral science meets tech
I read this fascinating article on Medium about the scientists behind the addicting apps we use every day. Behavioral neuroscience is near and dear to my heart; it was my major in college and something of an obsession for most of my career. My first laboratory experience was as a research technician in the Department of Psychology at Northeastern University. In that old maze of a building on Forsyth Avenue, Dr. James R. Stellar taught us and nurtured our young scientific minds.
Side note: he is one of the greatest mentors I’ll ever have the privilege of learning from. Without his guidance I would not be where I am today.
This article hit especially close to home because our lab researched the reward pathways of the brain, specifically the neuroscience of cocaine addiction. The article opens by discussing B.F. Skinner’s box, or operant conditioning chamber (in short: rat presses lever, gets reward, rat learns to press lever until the cows come home). It is a device I am quite familiar with, as we used a modified version in our experiments. As psychology advanced, other theories explaining why we do what we do came to light. But good old B.F. Skinner’s theories about reward-driven behavior never disappeared; they found new life in business and technology. Think of Facebook or Twitter as your very own “press-lever-get-tasty-treat” box.
Second side note: Guess who knew B.F. Skinner personally? That’s right, not me. Dr. Stellar did.
How to make tech and influence people
Behavior design is a discipline that studies how we might influence the decisions that people make every day. B.J. Fogg (the father of behavior design) began spreading the gospel of behavior design in tech during his PhD in the late 90s.
What, asked Fogg, if we could design educational software that persuaded students to study longer, or a financial-management program that encouraged users to save more? Answering such questions, he argued, required applying insights from psychology.
Fogg showed data supporting the idea that our relationship with tech follows the same rules of reciprocity that govern our social lives. That is, we are more likely to engage, and stay engaged, with technology that we feel is giving us something in return: something useful, something that gives back in some way. You can see how this marriage of psychology and technology has been hugely profitable for many Silicon Valley entrepreneurs, but many people had ethical concerns. Those concerns are now shared by Fogg himself, as the applications of his work have gone far beyond the use cases he initially envisioned. There is the obvious problem of people misusing Fogg’s teachings for less-than-moral (or less-than-legal) purposes, but there are also more universal concerns.
Nir Eyal, one of Fogg’s former students, argues that successful tech (Facebook, Instagram) leverages Skinner’s behaviorism to hook users. In his book, ‘Hooked: How to Build Habit-Forming Products’, he makes the case that these platforms don’t just add value by solving technological problems; they also fulfill an emotional need for the user. It seems that at the root of our interactions with technology there is an emotional connection. Our need for social interaction and validation is inexorably entangled with social media; our brains get a jolt of dopamine with every like or positive comment. On its own, that wouldn’t be particularly addicting on a regular schedule (it turns out the rats lost eagerness once they knew exactly when the pellet was coming, and only pressed the bar when hungry). But social media validation is never certain and comes in unpredictable waves. The very nature of human interaction, when applied to technology, keeps us ever searching, ever hopeful, ever hooked.
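If you like seeing the mechanism in numbers, here is a toy simulation (my own sketch, not from Eyal’s book; the ratio of five is an arbitrary choice). A fixed schedule rewards every fifth lever press, while a variable schedule rewards each press with a one-in-five chance. The average payoff rate is the same, but only the variable schedule is unpredictable:

```python
import random
import statistics

random.seed(42)

def fixed_ratio(presses, n=5):
    # Fixed-ratio schedule: reward arrives on every nth press, fully predictable.
    return [1 if (i + 1) % n == 0 else 0 for i in range(presses)]

def variable_ratio(presses, n=5):
    # Variable-ratio schedule: each press pays off with probability 1/n,
    # so the long-run reward rate matches the fixed schedule.
    return [1 if random.random() < 1 / n else 0 for _ in range(presses)]

def gaps_between_rewards(rewards):
    # Count how many presses separate successive rewards.
    gaps, count = [], 0
    for r in rewards:
        count += 1
        if r:
            gaps.append(count)
            count = 0
    return gaps

fixed = gaps_between_rewards(fixed_ratio(10_000))
variable = gaps_between_rewards(variable_ratio(10_000))

# Fixed schedule: every gap is exactly 5 presses, zero variability.
print(statistics.mean(fixed), statistics.pstdev(fixed))  # 5 0.0

# Variable schedule: same average gap (~5), but wildly variable.
print(statistics.mean(variable), statistics.pstdev(variable))
```

Both schedules pay out roughly once every five presses; the difference is entirely in the spread. That spread is the “unpredictable waves” part, and it is the variable version that Skinner found produced the most persistent lever pressing.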
And while it’s tempting to fall into a pit of despair, delete your social media accounts, and binge a season of Black Mirror, I believe there is hope for behavioral design yet.
Opening Pandora’s box
Creating a habit using technology isn’t inherently evil, nor automatically a ploy to turn people into zombie cash cows. Many out there employ behavioral design to create interfaces and experiences that remove user pain points, improve your health, get you more involved in your democracy, or nudge you toward behavior that is better for the environment.
The key, as you may have concluded by now, is the ethical code behind the code. Tristan Harris, another former student of Fogg’s, has made it his mission to inject ethics into these powerful tech platforms. In fact, his passion resulted in a position being created for him at Google: design ethicist and product philosopher. He went on to create the Center for Humane Technology and the Time Well Spent movement in an attempt to “catalyze a rapid, coordinated change among technology companies through public advocacy, the development of ethical design standards, design education and policy recommendations to protect minds from nefarious manipulation.”
Human-centered design doesn’t just mean deeply knowing your users and their pain points and resolving them. If we are truly to practice human-centered design, we need to think about how we are shaping decision-making across the larger ecosystem of humanity. There is nothing inherently wrong with people using Facebook, Instagram or Twitter. But of course, “with great power comes great responsibility.”
- Uncle Ben
- Featured photo by Alexandre Godreau on Unsplash