

Mind Macros 52: Skin in the game, humanity's tendency to believe, and the shortness of life
“Truth is not what you want it to be; it is what it is. And you must bend to its power or live a lie.” – Miyamoto Musashi
Hello, friend.
Welcome to another issue of Mind Macros - I hope you find something of value.
Food for Thought
I. Skin in the game
"In traditional societies, a person is only as respectable and as worthy as the downside he (or, more, a lot more, than expected, she) is willing to face for the sake of others. The most courageous, or valorous, occupy the highest rank in their society: knights, generals, commanders. Even mafia dons accept that such rank in the hierarchy makes them the most exposed to be whacked by competitors and the most penalized by the authorities. The same applies to saints, those who abdicate, devote their lives to serve others—to help the weak, the deprived, and the dispossessed.
“So Table 7 presents another Triad: there are those with no skin in the game but who benefit from others, those who neither benefit from nor harm others, and, finally, the grand category of those sacrificial ones who take the harm for the sake of others” – From Skin in the Game by Nassim Nicholas Taleb.
Taleb’s definition of skin in the game proposes that decision-makers should be held accountable for the results of their decisions, facing the same consequences as those affected by them. Making decisions with skin in the game encourages responsibility and consideration, as the downside cannot be transferred to another party. The concept aims to restore symmetry between risk and reward: the decision-maker must have something to lose if things go wrong.
In his book, Skin in the Game, Taleb provides an example from Hammurabi, the sixth Amorite king of the Old Babylonian Empire:
"Hammurabi’s code—now about 3,800 years old—identifies the need to reestablish a symmetry of fragility, spelled out as follows:
"If a builder builds a house and the house collapses and causes the death of the owner of the house—the builder shall be put to death. If it causes the death of the son of the owner of the house, a son of that builder shall be put to death. If it causes the death of a slave of the owner of the house—he shall give to the owner of the house a slave of equal value.
"It looks like they were much more advanced 3,800 years ago than we are today. The entire idea is that the builder knows more, a lot more, than any safety inspector, particularly about what lies hidden in the foundations—making it the best risk management rule ever, as the foundation, with delayed collapse, is the best place to hide risk. Hammurabi and his advisors understood small probabilities.
"Now, clearly the object here is not to punish retrospectively, but to save lives by providing up-front disincentive in case of harm to others during the fulfillment of one’s profession. These asymmetries are particularly severe when it comes to small-probability extreme events, that is, Black Swans—as these are the most misunderstood and their exposure is easiest to hide."
Other examples of having no skin in the game include:
Giving financial advice without any personal investment in the outcome. Taleb's related aphorism is: “Don’t tell me what you think, tell me what you have in your portfolio.”
Those who advise politicians or other public figures without having any stake in the outcome. Similarly, legislators and politicians who create and pass laws that they don't have to abide by.
CEOs who receive hefty bonuses regardless of the company’s performance.
Big banks that speculate on high-risk investments with taxpayer-backed money.
Governments bailing out failing financial institutions.
The last two might sound familiar: they played a part in the 2008 housing market collapse in the United States, whose aftershocks were felt worldwide.
Since banks can expect a government bailout, they can transfer the downside to somebody else: the taxpayers. Taleb describes this as a Bob Rubin trade. If a banker makes a decision that results in a positive outcome, they keep all the upside. If they make a decision that results in a negative outcome, someone else absorbs the downside.
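To make the asymmetry concrete, here is a toy sketch in Python (the numbers are made up for illustration, not taken from Taleb's book): a bet with a negative expected value still looks profitable to the decision-maker once the losses are transferred to someone else.

```python
import random

random.seed(42)

def risky_bet():
    """A bet that usually pays a little, but occasionally blows up."""
    return 10 if random.random() < 0.9 else -200  # 90%: win 10; 10%: lose 200

trials = 100_000
banker_total = 0    # gains kept by the decision-maker
taxpayer_total = 0  # losses transferred via the bailout

for _ in range(trials):
    outcome = risky_bet()
    if outcome >= 0:
        banker_total += outcome    # banker keeps all the upside
    else:
        taxpayer_total += outcome  # taxpayer absorbs all the downside

print(f"True expected value per bet: {0.9 * 10 + 0.1 * -200:+.1f}")   # -11.0: a losing bet
print(f"Banker's average per bet:    {banker_total / trials:+.1f}")   # ~ +9.0
print(f"Taxpayer's average per bet:  {taxpayer_total / trials:+.1f}") # ~ -20.0
```

The bet destroys value overall, yet the banker's ledger shows a steady profit; the loss appears only on someone else's books, which is exactly why the incentive to keep making the bet never goes away.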
The following are a few examples of having skin in the game:
Pilots of commercial passenger planes, whose mistakes expose them to the same consequences as their passengers.
Founders who hold a large stake in their own companies, or business owners who put their personal savings into their businesses.
Soldiers who put their lives at risk in order to uphold their country's ideals.
Parents who are willing to make personal sacrifices to ensure their children have the best opportunities in life.
Scientists who risk their reputations to stand up for evidence-based research.
II. Humans are made to believe
“‘We are hardwired to be duped,’ argues Timothy R. Levine in his book Duped: Truth-Default Theory and the Social Science of Lying and Deception. Levine is the distinguished professor and chair of communication studies at the University of Alabama at Birmingham, and has spent his career studying human lying, with his research being funded by the FBI and the NSA. Levine’s work argues that, despite our obvious capacity and propensity to lie, the default setting for our species is to accept the things we hear as being true, something Levine calls truth-default theory (TDT). 'TDT proposes that the content of incoming communication is usually uncritically accepted as true, and most of the time this is a good thing for us,' he argues. 'The tendency to believe others is an adaptive product of human evolution that enables efficient communication and social coordination.' As a species, humans are both wired for credulity and for telling lies. It’s that combination of traits—this bizarre mismatch between the human ability to lie and spot lies—that makes us a danger to ourselves.
"Humans are unlike other animals when it comes to our capacity for deception. Because we are why specialists, we have minds overflowing with ideas—dead facts—about how the world works, which gives us an infinite number of subjects about which we could lie. We are also in possession of a communication medium—language—that allows us to transform these dead facts into words that slither into the minds of other people with ease. What’s more, we have the capacity to understand that other people have minds in the first place; minds that hold beliefs about how the world is (i.e., what’s true), and thus minds that can be fooled into believing false information. As Levine points out, we’re also particularly bad at spotting false information. This sets up a scenario where, as we will see in this section, being a lying bullshit artist in a world filled with gullible victims can be a path to success, as it was for Russell Oakes. The accepted wisdom is that humans tell, on average, between one and two verbal lies a day. That, however, is an estimated average across the entire population. Six out of ten people claim not to lie at all (which is probably a lie), with most lies being told by a small subset of pathological liars who tell—on average—ten lies a day. We tell fewer lies as we get older, which might have less to do with our maturing sense of morality, and more to do with the cognitive decline that makes it harder to pull off the mental gymnastics needed to keep track of the nonsense we’re spouting. We need to think harder and maintain concentration to produce lies, which is why you often see the TV trope of an onscreen detective asking rapid-fire questions of suspects until they inadvertently blurt out the truth because they can’t think fast enough. It’s the same reason for the phrase in vino veritas (in wine, there is truth): It’s the idea that drinking alcohol works a bit like a truth serum, where people are more likely to reveal their true feelings (and stop lying) when their higher-order thinking has been compromised.
"An even better way to get ahead is to take lying to the next level: bullshitting. The term bullshitting is a legitimate scientific term. It was popularized by the philosopher Harry Frankfurt in his 2005 book, On Bullshit, and is used in the scientific literature today to describe communication intended to impress others without concern for evidence or truth. It is not the same thing as lying, which involves knowingly creating false information with the intention of manipulating others’ behavior. A bullshitter, on the other hand, does not know and does not care whether what they’re saying is accurate. They are more concerned with what Stephen Colbert called truthiness: the quality of seeming or being felt to be true, even if not necessarily true. Bullshitting seems like a negative behavior that would gum up the works of the human social world and sow chaos and confusion. But there is evidence to suggest that bullshitting might be a skill that has been selected for by evolution. A capacity to produce bullshit might be a signal to others that the bullshitter is in fact an intelligent individual. A recent study in the journal Evolutionary Psychology found that participants who were the most skilled at making up plausible (but fake) explanations of concepts they didn’t understand (a bit like the game Balderdash) also scored highest on tests of cognitive ability. So being a better bullshitter is in fact correlated with being smarter. The authors concluded that ‘the ability to produce satisfying bullshit may serve to assist individuals in navigating social systems, both as an energetically efficient strategy for impressing others and as an honest signal of one’s intelligence.’ In other words, the bullshitter has an extra advantage over a non-bullshitter: They don’t waste time worrying about the truth; they can focus all of their energy on being believed instead of being accurate.” – From If Nietzsche Were a Narwhal by Justin Gregg (view my three takeaways).
Levine's work demonstrates that, by default, we accept what we hear as true unless there is evidence of deceit. This is especially so when a story resonates with our own experiences or beliefs. TDT indicates that we follow a system of ‘acceptance unless proven otherwise’ rather than the safer ‘neutral until adequate evidence is presented to inform a decision.’ We've discussed in past issues how this evidence-based method of approaching the world, while the most effective, sets a standard that is impossible to meet: the time and energy required for truth-seeking simply aren't available to us in every domain. To make decisions and process information effectively, we need heuristics with a high rate of accuracy. Cognitive razors are one such solution, acting as shortcuts for processing information. Here are ten of my favorites:
Occam's Razor: Among competing explanations, the one that requires the fewest assumptions is usually the most likely to be true.
Hanlon's Razor: Don't attribute to malice that which can be explained by stupidity.
Machiavelli's Razor: Don't attribute to malice that which can be explained by self-interest.
Hitchens's Razor: What can be asserted without evidence can be dismissed without evidence. The burden of proof is on those who assert claims, not on opposing parties to disprove them.
Russell’s Teapot: If someone claims a teapot orbits the Sun somewhere between Earth and Mars, too small for any telescope to detect, the claim cannot be disproven; that doesn't make it true. The concept highlights the absurdity of believing something on the basis that it cannot be disproven, and places the burden of proof on the person making the claim.
Munger's Rule of Opinions: "I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do." – Charlie Munger
Sagan's Standard: "Extraordinary claims require extraordinary evidence." – Carl Sagan
Alder's Razor: If something cannot be settled by experimentation or observation, it's not worthy of debate. Without empirical evidence, it's simply one assertion pitted against another.
Livingston's Razor: It is impossible to convince someone to change their mind with logical arguments if they do not value logic. "It is difficult to remove by logic an idea not placed there by logic in the first place." – Gordon Livingston
Taleb's Bullshit Detector: "Avoid taking advice from someone who gives advice for a living, unless there is a penalty for their advice." – Nassim Nicholas Taleb
Quotes to Ponder
I. Seneca on the shortness of life:
“It’s not at all that we have too short a time to live, but that we squander a great deal of it. Life is long enough, and it’s given in sufficient measure to do many great things if we spend it well. But when it’s poured down the drain of luxury and neglect, when it’s employed to no good end, we’re finally driven to see that it has passed by before we even recognized it passing. And so it is—we don’t receive a short life, we make it so.”
II. Miyamoto Musashi on the immutability of truth:
“Truth is not what you want it to be; it is what it is. And you must bend to its power or live a lie.”
Thank you for reading,
Matthew Vere