Mind Macros 58: The Pyramid of Choice, Barbell Strategy, and calculated risk
"Make sure that the probability of the unacceptable (i.e., the risk of ruin) is nil." - Ray Dalio
Welcome to another issue of Mind Macros - I hope you find something of value.
Food for Thought
I. The Pyramid of Choice
The Pyramid of Choice is a psychological model that shows how people diverge when making decisions. At the apex of the pyramid rests the sharp contrast between right and wrong, moral and immoral. Our choices act like a catapult, propelling us down either side of the pyramid.
In Mistakes Were Made (But Not By Me), the authors present the example of two identical young men about to take a school exam. Each is 'reasonably honest' and holds the same attitude toward cheating. During the test, both are presented with an opportunity to cheat by reading another student's answers. One succumbs to the temptation; the other does not. Their decisions were a 'hair's breadth apart,' yet they shaped all the mental chatter that followed.
A week after the exam, the student who cheated tells himself, "Hey, everyone cheats. It's no big deal. And I really needed to do this for my future career." The student who did not cheat tells himself: "People who cheat should be permanently expelled from school. We have to make an example of them."
"By the time the students are through with their increasingly intense levels of self-justification, two things have happened: One, they are now very far apart from one another; and two, they have internalized their beliefs and are convinced that they have always felt that way. It is as if they had started off at the top of a pyramid, a millimeter apart; but by the time they have finished justifying their individual actions, they have slid to the bottom and now stand at opposite corners of its base. The one who didn't cheat considers the other to be totally immoral, and the one who cheated thinks the other is hopelessly puritanical. This process illustrates how people who have been sorely tempted, battled temptation, and almost given in to it—but resisted at the eleventh hour—come to dislike, even despise, those who did not succeed in the same effort."
"When the person at the top of the pyramid is uncertain, when there are benefits and costs of both choices, then he or she will feel a particular urgency to justify the choice made. But by the time the person is at the bottom of the pyramid, ambivalence will have morphed into certainty, and he or she will be miles away from anyone who took a different route."
"This process blurs the distinction that people like to draw between 'us good guys' and 'those bad guys.' Often, standing at the top of the pyramid, we are faced not with a black-and-white, go/ no-go decision, but with a gray choice whose consequences are shrouded. The first steps along the path are morally ambiguous, and the right decision is not always clear. We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment—action, justification, further action—that increases our intensity and commitment, and may end up taking us far from our original intentions or principles."
Moving beyond dishonest tactics like cheating on an exam, let's consider the far more severe matter of how people can be induced to harm others. Stanley Milgram's famous obedience experiments, conducted in the early 1960s, produced shocking results that were largely replicated in a follow-up study decades later.
"In Milgram's original version, two-thirds of the participants administered what they thought were life-threatening levels of electric shock to another person, simply because the experimenter kept saying, "The experiment requires that you continue." This experiment is almost always described as a study of obedience to authority. Indeed it is. But it is more than that: It is also a demonstration of long-term results of self-justification. Imagine that a distinguished-looking man in a white lab coat walks up to you and offers you twenty dollars to participate in a scientific experiment. He says, "I want you to inflict 500 volts of incredibly painful shock to another person to help us understand the role of punishment in learning." Chances are you would refuse; the money isn't worth it to harm another person, even for science. Of course, a few people would do it for twenty bucks and some would not do it for twenty thousand, but most would tell the scientist where he could stick his money. Now suppose the scientist lures you along more gradually. Suppose he offers you twenty dollars to administer a minuscule amount of shock, say 10 volts, to a fellow in the adjoining room, to see if this zap will improve the man's ability to learn. The experimenter even tries the 10 volts on you, and you can barely feel it. So you agree. It's harmless and the study seems pretty interesting. (Besides, you've always wanted to know whether spanking your kids will get them to shape up.) You go along for the moment, and now the experimenter tells you that if the learner gets the wrong answer, you must move to the next toggle switch, which delivers a shock of 20 volts. Again, it's a small and harmless jolt. Because you just gave the learner 10, you see no reason why you shouldn't give him 20. And because you just gave him 20, you say to yourself, 30 isn't much more than 20, so I'll go to 30. He makes another mistake, and the scientist says, 'Please administer the next level—40 volts.'
"Where do you draw the line? When do you decide enough is enough? Will you keep going to 450 volts, or even beyond that, to a switch marked XXX DANGER? When people are asked in advance how far they imagine they would go, almost no one says they would go to 450. But when they are actually in the situation, two-thirds of them go all the way to the maximum level they believe is dangerous. They do this by justifying each step as they went along: This small shock doesn't hurt; 20 isn't much worse than 10; if I've given 20, why not 30? As they justified each step, they committed themselves further. By the time people were administering what they believed were strong shocks, most found it difficult to justify a sudden decision to quit. Participants who resisted early in the study, questioning the very validity of the procedure, were less likely to become entrapped by it and more likely to walk out.
"The Milgram experiment shows us how ordinary people can end up doing immoral and harmful things through a chain reaction of behavior and subsequent self-justification. When we, as observers, look at them in puzzlement or dismay, we fail to realize that we are often looking at the end of a long, slow process down that pyramid.
"How do you get an honest man to lose his ethical compass? You get him to take one step at a time, and self-justification will do the rest."
II. The Barbell Strategy
"What do we mean by barbell? The barbell (a bar with weights on both ends that weight lifters use) is meant to illustrate the idea of a combination of extremes kept separate, with avoidance of the middle. In our context it is not necessarily symmetric: it is just composed of two extremes, with nothing in the center. One can also call it, more technically, a bimodal strategy, as it has two distinct modes rather than a single, central one.
"A barbell can be any dual strategy composed of extremes, without the corruption of the middle—somehow they all result in favorable asymmetries.
"Let us use an example from vulgar finance, where it is easiest to explain, but misunderstood the most. If you put 90 percent of your funds in boring cash (assuming you are protected from inflation) or something called a "numeraire repository of value," and 10 percent in very risky, maximally risky, securities, you cannot possibly lose more than 10 percent, while you are exposed to massive upside. Someone with 100 percent in so-called "medium" risk securities has a risk of total ruin from the miscomputation of risks. This barbell technique remedies the problem that risks of rare events are incomputable and fragile to estimation error; here the financial barbell has a maximum known loss."
"With personal risks, you can easily barbell yourself by removing the chances of ruin in any area. I am personally completely paranoid about certain risks, then very aggressive with others. The rules are: no smoking, no sugar (particularly fructose), no motorcycles, no bicycles in town or more generally outside a traffic-free area such as the Sahara desert, no mixing with the Eastern European mafias, and no getting on a plane not flown by a professional pilot (unless there is a co-pilot). Outside of these I can take all manner of professional and personal risks, particularly those in which there is no risk of terminal injury." — From Antifragile by Nassim Nicholas Taleb.
The barbell strategy offers an alternative to the usual dilemma of accepting large losses in pursuit of massive gains. Conventional wisdom holds that to 'win big,' we must take on immense risk. 'Barbelling,' however, lets us hedge our bets: we cap our potential losses while retaining exposure to outsized gains, avoiding the common middle ground of moderate risk.
In finance, we could allocate 90% of our savings to conservative government bonds and the remaining 10% to riskier investments. At most 10% of the portfolio can be lost, while the risky slice still offers the possibility of outsized returns (if the high-risk investments prove successful). Contrast this with a typical mid-risk approach, such as investing 100% of our savings in the S&P 500: a bad year can erase 30% or more of the entire portfolio, while gains remain bounded by the market's average returns. The barbell strategy rules out disastrous bets, as Taleb explains, quoting legendary investor Ray Dalio, who advises:
"Make sure that the probability of the unacceptable (i.e., the risk of ruin) is nil."
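The arithmetic behind the 90/10 split is simple enough to sketch in a few lines. The figures below are hypothetical (not investment advice), and the sketch assumes the 'safe' sleeve simply holds its value while the risky sleeve can swing anywhere from a total wipeout to a multiple of its starting value:

```python
# A numerical sketch of the 90/10 barbell (hypothetical figures).
# Assumes the "safe" sleeve holds its value and the risky sleeve
# can range from -100% (wiped out) to large multiples.

def barbell_value(total, safe_frac, risky_return, safe_return=0.0):
    """Portfolio value after one period under a barbell allocation."""
    safe = total * safe_frac * (1 + safe_return)
    risky = total * (1 - safe_frac) * (1 + risky_return)
    return safe + risky

start = 100_000

# Worst case: the risky 10% is wiped out entirely (-100%).
# The loss is capped at 10% of the portfolio -- ruin is off the table.
worst = barbell_value(start, safe_frac=0.90, risky_return=-1.00)

# Upside case: the risky 10% returns 5x, lifting the whole portfolio 50%.
best = barbell_value(start, safe_frac=0.90, risky_return=5.00)

# Contrast: 100% in "medium-risk" securities during a 30% drawdown.
medium = start * (1 - 0.30)

print(worst, best, medium)  # → 90000.0 150000.0 70000.0
```

The asymmetry is the point: the downside is known and fixed in advance, while the upside is left open.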
Taleb proceeds to share more analogies of eliminating the middle:
"My writing approach is as follows: on one hand a literary essay that can be grasped by anyone and on the other technical papers, nothing in between—such as interviews with journalists or newspaper articles or op-ed pieces, outside of the requirements of publishers.
"More barbells. Do crazy things (break furniture once in a while), like the Greeks during the later stages of a drinking symposium, and stay 'rational' in larger decisions. Trashy gossip magazines and classics or sophisticated works; never middlebrow stuff. Talk to either undergraduate students, cab drivers, and gardeners or the highest caliber scholars; never to middling-but-career-conscious academics."
Quotes to Ponder
I. Albert Camus on friendship:
"Don't walk behind me; I may not lead. Don't walk in front of me; I may not follow. Just walk beside me and be my friend."
II. Socrates (the character in Dan Millman's Way of the Peaceful Warrior) on the cycle of suffering:
"If you don't get what you want, you suffer; if you get what you don't want, you suffer; even when you get exactly what you want, you still suffer because you can't hold on to it forever. Your mind is your predicament. It wants to be free of change. Free of pain, free of the obligations of life and death. But change is law and no amount of pretending will alter that reality."
Did you enjoy this email? Please consider buying me a coffee to caffeinate my reading sessions.
Thank you for reading,