Outside the Box


Outside the Box was retired on April 25, 2018, to make way for the new and improved premium research service, Over My Shoulder.

If you’re interested in joining John Mauldin, Patrick Watson, and the thousands of Over My Shoulder subscribers as they analyze important research several times a week, please click here to find out how you can subscribe for less than $10 per month.

Question Your Assumptions

March 28, 2018

I always thought I would learn more as I grew older. In fact, the opposite is happening: The older I get, the more I realize how much I don’t know. Maybe that’s why my search for answers seems to widen and intensify each year.


Our questions differ, but we’re all seeking answers. Our digital technologies, led by search, theoretically make it easy to find answers, too – but they aren’t necessarily the right ones. This is a growing problem. Whatever crazy thing you want to believe, a quick internet search will turn up some “expert” to confirm you’re right.

In Niall Ferguson’s latest book, The Square and the Tower, he spends a good part of one chapter documenting the high percentage of people, not just in the US but all over the world, who believe in one form of conspiracy theory or another. That would be funny if it weren’t so sad. One of the things Niall points out is that the dominance of Facebook and Twitter has tended to break us down further into tribes, where we increasingly talk just to our own kind, reinforcing our parochial beliefs and idiosyncrasies.

Worse, we are increasingly overconfident in our own beliefs, even without expert confirmation. That’s no surprise to stock traders; they’ve long known that the crowd is often wrong. But they also know that the crowd can believe itself to be right a lot longer than skeptics think it can. That’s how we get asset bubbles.

Today’s Outside the Box is a short Quartz article by Olivia Goldhill, discussing a new paper by social psychologist David Dunning. Extreme wonks might recognize the name because, with Justin Kruger, he defined the “Dunning-Kruger effect.” It says that people who lack knowledge on a particular topic tend not to recognize their lack. In other words, we don’t know enough to know how little we know.

In his latest research, Dunning says we often make bad decisions, not because other people trick us, but because we trick ourselves. “To fall prey to another person you have to fall prey to your belief that you’re a good judge of character, that you know the situation, that you’re on solid ground as opposed to shifty ground,” he says.

When we were all living in small bands on the savannah, this was actually a good behavioral trait to have. Back then it was pretty easy to see who we could trust and who we couldn’t, because the decisions we were making were basically pretty simple. The world has gotten extraordinarily more complex, and we often end up relying on “experts” to make decisions for us, based on their training and knowledge, when in reality they bring their own biases, assumptions, and agendas – limitations that often they aren’t even aware of. As I write this, I can’t help but think of Fed economists, but the problem is pervasive.

We’re hit with such a constant tidal wave of information that no one person can stay on top of it anymore. So what we are “sure” about is no longer as sure as we once thought it was. In a world of social media, where we are breaking up into tribes that live in their own echo chambers (rather than as one big happy family, which is what the developers of social media thought we would become), it’s harder to know what is right and true, and thus the shout goes up, “Fake news!”

My Mauldin Economics colleague Patrick Watson has recently taken note of research similar to Dunning’s. In last week’s Connecting the Dots, he described a neuroscientific study showing that car dealers – experts on selling cars – have almost no idea what motivates car buyers. The researchers call this “expert blindness.” Our own knowledge can keep us from seeing what’s real.

That’s a pretty deep thought, but it’s an important one that we should all keep in mind. It tells me I should question my assumptions and do more research before I make important decisions. As should we all. So read this Outside the Box and then resolve to question yourself.

On a related note, I’d like to ask your help. It is increasingly clear, given the multiple demands on my time, that I need to streamline my writing schedule. Producing both Thoughts from the Frontline and Outside the Box is not getting any easier, given the seemingly ever-increasing amount of research I have to do to stay on top of my game. And there’s just more – a lot more – going on in my business life than there was five or ten years ago.

We’re exploring some new ideas that will let me continue to deliver the quality information you deserve and even to improve it. I know change is hard (especially for me), and I also know some of you may have ideas I haven’t considered.

That’s why we’ve prepared a short, anonymous survey to gather your input. Please click here and share your thoughts.

I’ll report back in a week or two after I’ve reviewed everyone’s thoughts. I greatly appreciate your help!

Your constantly questioning his assumptions analyst,

John Mauldin, Editor
Outside the Box


The Person Who’s Best at Lying to You Is You

By Olivia Goldhill

Originally published on Quartz, March 18, 2018

In 2008, the psychiatrist Stephen Greenspan published The Annals of Gullibility, a summary of his decades of research into how to avoid being gullible. Two days later, he discovered that his financial advisor, Bernie Madoff, was a fraud who had caused Greenspan to lose a third of his retirement savings.

This anecdote, from a paper by University of Michigan social psychologist David Dunning, due to be presented at the 20th Sydney Symposium of Social Psychology in Visegrád, Hungary in July, highlights an unfortunate but inescapable truth: We are always most gullible to ourselves. As Dunning explains it, Greenspan – despite being the expert on gullibility – fell prey to Madoff’s fraudulent behavior not simply because Madoff was some master manipulator, but because Greenspan had, essentially, tricked himself.

“To fall prey to another person you have to fall prey to your belief that you’re a good judge of character, that you know the situation, that you’re on solid ground as opposed to shifty ground,” says Dunning. Greenspan, Dunning notes, failed to follow his own advice and take appropriate cautionary steps before trusting someone in a field he knew little about. Though he wrote the book on how not to be overly confident of your own judgments, Greenspan went against his own advice when he handed over his savings without properly interrogating both Madoff’s confidence in himself and his own sense of confidence in Madoff. Had he followed his own counsel, Greenspan would have recognized that he knew little about financial investments, and would have done far more research before deciding to hand over his money to Madoff.

Dunning is an expert on the human tendency to overestimate confidence in our own knowledge and beliefs. In 1999, together with social psychologist Justin Kruger, Dunning identified the co-eponymous Dunning-Kruger effect: people who are incompetent and lack knowledge in a field tend to massively overestimate their abilities because, quite simply, they don’t know enough to recognize what they don’t know. So hugely unqualified people erroneously believe that they’re perfectly qualified. (This effect has an unfortunate tendency to create the worst possible bosses. It’s also the opposite of imposter syndrome, which describes when qualified people worry that they aren’t qualified.)

In his latest presentation, Dunning highlights the studies that collectively show how we repeatedly and consistently fool ourselves into thinking we know more than we do, and so convince ourselves that our opinion or choice is right – even when there’s absolutely no evidence to support this. There are dozens of studies supporting this hypothesis, showing, for example, that British prisoners rate themselves as more ethical and moral than typical citizens, and that people mistakenly believe they’re better than others at reaching unbiased conclusions.

People tend to be just as confident in their false beliefs as their accurate ones. In one 2013 study, participants were asked a physics question about the trajectory of a ball after it was shot through a curved tube. Those who said the trajectory would be curved (wrong) were just as confident that their answer was correct as those who correctly stated the ball would have a straight trajectory.

A body of research has also established what scientists call “egocentric discounting”: If participants are asked to give an estimate of a particular fact, such as the unemployment rate or a city’s population, and then shown someone else’s estimate and asked if they’d like to revise their own, they consistently give greater weight to their own view than to others’, even when they’re not remotely knowledgeable in these areas.

Our false confidence in our own beliefs also deters us from asking for advice when appropriate – or even from knowing to whom to turn. “To recognize superior expertise would require people to have already a surfeit of expertise themselves,” notes Dunning.

Gullibility to oneself is not a modern phenomenon. But the effects are exacerbated in the age of social media, when false information spreads rapidly. “We’re living in a world in which we’re awash with information and misinformation,” says Dunning. “We live in a post-truth world.”

The issue is that the current environment convinces people they’re more informed than they actually are. It might, says Dunning, actually be better for people to feel uninformed. “When people are uninformed, they know they don’t know the answer,” he says, and so they will be more open to hearing from others with real expertise. If we think we know enough, however, we’ll just “cobble together what seems to us to be the best response possible to someone asking us our opinion, or a policy, or what we think,” says Dunning. And, he adds, “unfortunately we’re programmed to know enough to cobble together an answer.”

There’s no quick fix to this, but there is a key step we could take to avoid being so willfully misinformed. We need to not only evaluate the evidence behind newly presented facts and stories, but evaluate our own capability of evaluating the evidence.

The same questions we consider when evaluating whether to trust another person should apply to ourselves: “Are you too invested in this thought or belief you have? Are you really giving the conclusions you’re reaching due diligence? Are you in over your head?” says Dunning.

That said, constantly questioning ourselves would be impractical, leading to a constant state of self-doubt and uncertainty. Most effective, says Dunning, would be to focus on situations that are new to us, and where the stakes are high. “Normally those two situations go together,” says Dunning. “We only buy so many cars in our lives, we only invest large sums of money every so often, we only get married every so often.”

Of course, as that last example shows, at some point you have to give up being savvy and just trust your own judgment – both in yourself and others. Dunning quotes novelist Graham Greene: “It is impossible to go through life without trust…that would be to be imprisoned in the worst cell of all, oneself.”

We can, though, learn to be a little more careful and wise. Just as we don’t blindly trust every person we meet, there’s no reason to be utterly trusting and gullible to ourselves.

Discuss This




David Machanick

April 1, 2018, 6:02 p.m.

Amazon - WSJ says the USPS loses money on their packages. Other sources say the USPS is making a killing. If you can find an article that clearly explains this with facts and figures, it would be great. Could be an Outside the Box or Over My Shoulder article.


March 28, 2018, 8:58 p.m.

This discussion reminds me of the old aphorism, “the less a man knows, the more he suspects.”  It also makes me wonder why we continue to believe so much of what the government tells us about certain things, when it has lied to us so often in the past.

John Beeler

March 28, 2018, 8:36 p.m.

I am responding to the paragraph below in your most recent Outside the Box article, in which you state:

I’d like to ask your help. It is increasingly clear, given the multiple demands on my time, that I need to streamline my writing schedule. Producing both Thoughts from the Frontline and Outside the Box is not getting any easier, given the seemingly ever-increasing amount of research I have to do to stay on top of my game. And there’s just more – a lot more – going on in my business life than there was five or ten years ago.

These newsletters are the roots of your success. It’s analogous to people running 3 - 5 miles a day to keep their bodies in top physical shape. Your mental acuity, which seems amazingly high to me, comes primarily from the challenge of writing these letters each week. It is the solid rock foundation on which your outstanding reputation is based. You know all too well great companies that tried to remake themselves and failed. I hope that a career change/alteration is not motivated by a desire to be more influential in the uppermost levels of our society. I agree that at times it is just plain boring doing the same work over and over again. But look at the men and women you choose to surround yourself with – the people who have, in your opinion, become truly successful at what they do, whom you invite to be “SIC” speakers each year. Have they altered their careers significantly, as you are contemplating, or have they continued doing what they do best? Some of the people you have enjoyed a professional relationship with have moved on, for better or worse. Why not start with a list of their failures? Make another list for those who successfully remade themselves, so to speak. Make a list of the good and the bad to help you arrive at a decision you will be happy to live with for the next 20+ years.

Regards, John Beeler


March 28, 2018, 3:54 p.m.

Dean Schulze,

You’re making an assumption that isn’t stated to be an effect in the problem: gravity.  In this sort of thought problem, anything that isn’t specifically stated should be assumed to be insignificant.  The ball will travel in a straight line.

Don Braswell

March 28, 2018, 3:29 p.m.

Want a new idea?  Read an old book.  From 2400 years ago (and another reason to study dead white dudes):  “When told that the Oracle of Delphi had revealed to one of his friends that Socrates was the wisest man in Athens, he responded not by boasting or celebrating, but by trying to prove the Oracle wrong.

So Socrates decided he would try and find out if anyone knew what was truly worthwhile in life, because anyone who knew that would surely be wiser than him. He set about questioning everyone he could find, but no one could give him a satisfactory answer. Instead they all pretended to know something they clearly did not.

Finally he realized the Oracle might be right after all. He was the wisest man in Athens because he alone was prepared to admit his own ignorance rather than pretend to know something he did not.”  from <http://www.pbs.org/empires/thegreeks/characters/socrates_p4.html>

Willis Smith

March 28, 2018, 3:16 p.m.

Overconfidence is a characteristic of narcissists.


March 28, 2018, 2:49 p.m.

I remember being quite impressed with Dunning and Kruger’s original paper in 1999:  “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”.  Of course, Scott Adams sort of beat them to it with his Dilbert Principle, published in 1997 – an extension of the Peter Principle, except that rather than people being promoted until they achieve a level where they are incompetent, the Dilbert Principle states that people are promoted because they are incompetent; hence pointy-haired bosses.

jack goldman

March 28, 2018, 2:15 p.m.

Conspiracy theory. Wall Street and government steal my money but it’s just counterfeit currency anyway. I am a debt slave working for currency units and worthless computer credits, owned on a global slave farm. Please prove I am wrong. Disconnect from machines. Reconnect with nature.

Dean Schulze

March 28, 2018, 2:10 p.m.

“Those who said the trajectory would be curved (wrong) were just as confident that their answer was correct as those who correctly stated the ball would have a straight trajectory.”

The ball will have a curved trajectory - parabolic (ignoring wind friction) - whether it is launched from a straight or curved tube.

L M C Clark

March 28, 2018, 1:48 p.m.

Richard Feynman was one of the greatest physicists in history, and known as the greatest problem solver (see “No Ordinary Genius” and “Quantum Man”).  Hans Bethe referred to him as “a magician”, not an “ordinary genius”.  Judging from his quotes Feynman was both down to earth and a very wise fellow, not assuming conclusions.  My favorite quote (1974 address at Cal Tech), “The first principle is not to fool yourself - and you are the easiest person to fool.”