Predictably Irrational
I’ve now started Predictably Irrational by Dan Ariely, another behavioral economics book. So far, it's been a great read: similar to Thinking, Fast and Slow, but written in a much more conversational style.
The Decoy Effect
The Decoy Effect is a pricing technique where the seller includes a 'decoy': a high-priced option that is clearly worse than the item they actually want to sell.
In the example he uses in the book, the subscriptions to The Economist are:
- A: digital subscription ($65)
- B: print subscription ($125)
- C: digital + print bundle ($125)
The idea here is that people will see option B (the decoy) as clearly the worse version of C. Even though no one buys option B, it will make them more apt to pick C than if option B hadn't even been shown in the first place!
The underlying reason is that people are bad at assessing value in a vacuum, but good at making comparisons. Moreover, comparisons between like options (the decoy and the more expensive option) are easier to make than comparisons between unlike options.
We underestimate many of our natural tendencies
Ariely cites studies around both procrastination, and making decisions under stress.
For procrastination, he ran an experiment across three different classes. Each class required the students to write 10 papers:
- students must turn in a paper every two weeks
- students must turn in all papers by the end of the semester, but there's no set schedule
- students must turn in all papers by the end of the semester, but there's a recommended schedule for turning one in every two weeks (a tool!)
He found that students in the first class (mandated deadlines) got the highest scores, and students in the second class (no schedule) got the lowest. Even though all of the students were aware that they would procrastinate, they still underestimated it. When they had a tool available they did better, but still not as well as with a mandated schedule.
Takeaway: even when we are aware of our tendencies and biases, we still manage to underestimate them
We hate losing options, irrationally so
People tend to hate losing an option for doing something, even if that option isn't something that they really want.
Ariely ran a study where students played a computer game in which they could enter different rooms. They had 100 clicks to spend during the experiment.
Each room paid out between 1 and 10 cents every time the user clicked. Users could move between rooms, but each move cost a click. The goal was to find the higher-paying rooms, and then use clicks there.
When no options were taken away, participants would find an optimal room and stay there (the rational strategy). But when rooms would disappear if left unvisited, something very different happened. Participants would try to hold on to rooms, even low-paying ones, simply because they didn't like losing an option.
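The payoff structure makes it easy to see why settling is rational. Here's a minimal simulation sketch (my own illustration, not Ariely's experiment code; the three rooms, their average payouts, and both strategies are assumptions) comparing a player who settles in the best room against one who keeps rotating to preserve options:

```python
import random

# Illustrative sketch only: assumed room payouts and strategies,
# not the actual parameters of Ariely's experiment.
random.seed(0)

ROOM_MEANS = [3, 6, 9]   # assumed average payout (cents) per room
TOTAL_CLICKS = 100       # total click budget, as in the study

def payout(room):
    """One click inside a room: a random 1-10 cent payout around its mean."""
    return max(1, min(10, round(random.gauss(ROOM_MEANS[room], 1.0))))

def settle_in_best():
    """Rational strategy: sample each room once, then stay in the best one."""
    earnings, clicks, current = 0, TOTAL_CLICKS, 0
    samples = []
    for room in range(len(ROOM_MEANS)):
        if room != current:
            clicks -= 1              # moving between rooms costs a click
            current = room
        p = payout(room)
        earnings += p
        clicks -= 1
        samples.append(p)
    best = samples.index(max(samples))
    if best != current:
        clicks -= 1
        current = best
    while clicks > 0:                # spend the rest in the best room
        earnings += payout(current)
        clicks -= 1
    return earnings

def keep_options_open():
    """Option-preserving strategy: rotate through every room to keep each alive."""
    earnings, clicks, current = 0, TOTAL_CLICKS, 0
    while clicks > 0:
        earnings += payout(current)
        clicks -= 1
        if clicks <= 0:
            break
        clicks -= 1                  # pay the switching cost to visit the next room
        current = (current + 1) % len(ROOM_MEANS)
    return earnings

trials = 1000
print("settle in best room:", sum(settle_in_best() for _ in range(trials)) / trials)
print("keep options open  :", sum(keep_options_open() for _ in range(trials)) / trials)
```

Under these assumed numbers, settling earns roughly two to three times what rotating does: every click spent keeping a room alive is a click not spent earning in the best room, which is exactly the cost participants kept paying.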
Takeaway: we dislike losing the option to do something, even when keeping it open works against our best interest. We weight optionality quite heavily (perhaps something to use in Segment's messaging)
Most people will tend to cheat a little bit... but not when money is involved
The last study I'll share came from an HBS class. Students were asked to take a short multiple-choice trivia quiz (e.g. what is the capital of Guatemala?) and then score their answers under different conditions:
- condition 1 (control): participants answer, and then bring their answer sheet up to the grader
- condition 2 (easier to cheat): participants answer, copy their answers onto a second sheet, self-grade against the answer key, then shred their original answers. they turn in the copied sheet
- condition 3 (easiest to cheat): participants answer, self-grade, and shred all documents. they simply report their score
When cheating was available, scores moved up uniformly. But even as cheating got easier, scores didn't move much more. It seemed everyone had their own internal honor code, so they would only fudge the numbers a little bit.
Interestingly... this did not happen when people were first asked to recall a moral benchmark, like the Ten Commandments, or to agree to an honor code. Even participants who couldn't actually list the Ten Commandments stopped inflating their scores.
Cheating also happens far less when it's actual money being taken. Ariely tried leaving six-packs of Coke in shared dorm fridges, and found they were all gone 72 hours later. When he left dollar bills on a plate in the same fridges, he found that students wouldn't touch them.
Meta impressions
I found this book to be an overall delightful read. Ariely does an incredible job combining study results with anecdotes.
For each study, he starts by telling the story of the participants, the conditions, and a few individual interactions before sharing the overall results. It's a great storytelling technique, and especially fun if you’re familiar with MIT undergrads.
I think this book would fall into the “life-changing” category for me, had I not first seen so many of these concepts in Thinking, Fast and Slow. It's still a great read.