
‘Nobody’s Fool’ Review: Against All Frauds

By Matthew Hutson
July 9, 2023 3:30 pm ET

We tend to believe by default what we see and hear. But even if we can’t be nobody’s fool, we may become a bit less foolish.

Stańczyk (1862) by Jan Matejko.

Photo: Alamy Stock Photo

The latest book by Francesca Gino, a behavioral scientist currently on leave from Harvard University, is titled “Rebel Talent: Why It Pays to Break the Rules at Work and in Life.” I am not here to review that book, published in 2018, except to call its subtitle an ironic masterpiece. Last month, three researchers claimed that four of Ms. Gino’s studies, spanning more than a decade, contain manipulated data.

The book currently up for review, Daniel Simons and Christopher Chabris’s “Nobody’s Fool: Why We Get Taken In and What We Can Do About It,” would surely have mentioned Ms. Gino’s alleged fraud had it been uncovered in time. Instead, the book includes the next-best thing: In 2021, one of Ms. Gino’s collaborators was revealed to have manipulated data. What’s more, the paper on which the two academics had collaborated was one about dishonesty. (The whistleblowers call the pileup a “Clusterfake.”)

“Frauds of many sorts are growing in terms of both dollars stolen and victims scammed,” write Messrs. Simons and Chabris. “But the story goes beyond crime. Businesses have adopted more deceptive techniques as standard operating procedure.” Whatever the metrics behind this asserted trend, we can agree that the world is less honest than we’d like it to be.

Messrs. Simons and Chabris, professors of psychology at the University of Illinois and the Geisinger Health System, respectively, structure their affable and fleet-footed book around four “habits”—common mental vulnerabilities—and four “hooks”—common features of enticing claims—to help us spot and fight trickery. Following their advice won’t come naturally, in part because of a truth bias: We tend to believe by default what we see and hear. But even if we can’t be nobody’s fool, we may become a bit less foolish.

The first habit cited by the authors is the tendency to focus on what’s in front of us, neglecting information that’s absent or not spotlighted. Messrs. Simons and Chabris’s previous book, “The Invisible Gorilla” (2010), is named after an experiment they conducted in which many participants, when asked to track people in black shirts, failed to spot someone dressed in a gorilla costume. In “Nobody’s Fool,” they highlight, for example, the wrong guesses made by psychic mediums—failed forays conveniently discarded to accentuate the hits. They also point to rigged demonstrations made by the startup Theranos before the company collapsed. In evaluating a claim—say, about a stock-picker’s prowess—the authors recommend constructing a two-by-two “possibility grid,” containing not only successful picks, but also unpicked successes, failed picks and unpicked failures. And they recommend asking many questions, even when it’s uncomfortable, as Theranos’s investors should have done.
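The possibility grid the authors recommend is, in effect, a 2x2 contingency table. A minimal sketch of the idea, with purely hypothetical tallies for the stock-picker example (the function name and all numbers are illustrative, not from the book):

```python
def possibility_grid(picked_up, picked_down, unpicked_up, unpicked_down):
    """Compare the hit rate of picked stocks with that of unpicked stocks.

    The four arguments are the four cells of the grid: successful picks,
    failed picks, unpicked successes and unpicked failures.
    """
    picked_rate = picked_up / (picked_up + picked_down)
    unpicked_rate = unpicked_up / (unpicked_up + unpicked_down)
    return picked_rate, unpicked_rate

# Hypothetical tallies: 8 of 12 picks rose, while 40 of 60 unpicked stocks rose.
picked, unpicked = possibility_grid(8, 4, 40, 20)
print(f"picked: {picked:.2f}, unpicked: {unpicked:.2f}")
```

Here both rates work out the same, which is the point of the exercise: a picker's "hits" look impressive only until the unpicked columns of the grid are filled in.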

The authors also point out that we tend not to examine claims that match our expectations. A confirmation bias leads us to seek evidence supporting our hunches. When a predicted outcome occurs, “that’s a good sign that you need to check more, not less,” the authors write. I’m not sure that’s strictly true, but when the stakes are high, one should check either way. Messrs. Simons and Chabris suggest having people serve as a “red team” to poke holes in our reasoning.

A similar habit is to not reconsider our set beliefs. Magicians capitalize on our assumptions about, say, whether a deck of cards has been shuffled. Advertisers also build on our beliefs, for instance touting “natural” products that are actually worse for us and the environment.

Finally, we often try to economize our information-seeking, keeping questions to a minimum. That’s how we miss the hidden costs of maintaining a desktop printer (all those cartridges). It’s also how art collectors miss the signs that a painting was forged (until the same Gauguin ends up at auction simultaneously at Christie’s and Sotheby’s). Messrs. Simons and Chabris write that we should beware of answers that raise red flags (“the originals have been lost”) as well as more subtle brush-offs (“it’s been validated”).

Among the hooks we usually fall for are unrealistically consistent data. People like the comfort of predictability, which is partly why Bernie Madoff’s Ponzi scheme grew so large—his returns weren’t huge but they kept coming. Beware of soothingly smooth performances.

We’re also suckered by familiarity. In the “illusory truth effect,” repeated messages sound truer. It’s worth asking why something sounds familiar—is it from a marketing blitz?—and remembering that sometimes things become popular through sheer randomness.

Precision is another hook. Scientific findings seem firmer when they have lots of decimal places, and homes sell for more when asking prices aren’t round numbers.

Finally, we should question claims that small causes have large effects. “Complex problems usually require multipronged solutions, if they are solvable at all, and rarely yield to the proverbial ‘one simple trick,’ ” the authors write.

Many of the cases in the book verge on inside-baseball territory as they cover research malfeasance; some get into the weeds of detecting fraud by counting how many of certain digits appear in raw data—something that most perusers of scientific headlines won’t do. But the cases are still interesting, they pass quickly, and they’re surrounded by a wide variety of other fascinating examples.
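One common version of the digit-counting check the authors describe compares the leading digits of reported figures against Benford's law, which many naturally occurring datasets approximately follow. A hedged sketch, with hypothetical data (the book's own fraud cases used more specialized checks than this):

```python
import math
from collections import Counter

def leading_digit_counts(values):
    """Tally the leading (first significant) digit of each nonzero value."""
    counts = Counter(str(abs(v)).lstrip("0.")[0] for v in values if v)
    return {d: counts.get(str(d), 0) for d in range(1, 10)}

def benford_expected(n):
    """Expected leading-digit counts for n values under Benford's law."""
    return {d: n * math.log10(1 + 1 / d) for d in range(1, 10)}

# Hypothetical "reported expenses": genuine ledgers tend to be roughly
# Benford-like; fabricated figures often are not.
reported = [132.5, 18.9, 2.04, 1450, 1.7, 310, 12.2, 1.1, 96, 14]
observed = leading_digit_counts(reported)
expected = benford_expected(len(reported))
for d in range(1, 10):
    print(d, observed[d], round(expected[d], 1))
```

A large gap between the observed and expected columns is not proof of fraud, only a flag that the numbers deserve a closer look.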

The authors’ advice starts with “accept less, check more.” It ends on a different note: “Is it worth checking every line on your receipt every time you shop to verify that each price was correct to the penny? Perhaps not.” Research not in the book suggests that people who trust more are more successful. So what to do? Messrs. Simons and Chabris land on the need to find balance in credulity—not an easy task. Getting ahead in work and in life apparently requires being someone’s fool some of the time. It’s the cost of doing business.

Mr. Hutson is the author of “The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane.”
