In Praise of Overthinking

There are not many things that people agree on these days, but one of them might be overthinking.

Everywhere you look, people are sharing how to stop overthinking or celebrating the fact that they are ‘recovering overthinkers’.

The assumption is that overthinking is debilitating and unproductive, holding us back from accomplishing what we want.

Much like the paradox of choice, as variables and possibilities increase, we become paralyzed by the options. If you listen to influencers and creators these days, our choices have led us to an epidemic of overthinking.

It has come to the point where some are shaming people for thinking too much. Some come right out and say it: ‘Stop thinking and do something!’

But, while I share the urge to be productive, I can’t help but question the proposed solution.

Sure, we’re dealing with a surfeit of data and choices these days. But switching off our thinking and forcing action doesn’t seem like the best way to deal with it. In fact, it seems like the exact opposite of what we should do.

Two Kinds of Thinking

Allow me to overthink out loud a bit:

I recently saw a popular meme that epitomizes what might be called the Underthinking Mentality. It read, ‘How much certainty I think I need (100%). How much certainty I actually need (51%).’

The implication is that you just need to be over 50% certain to make a decision; anything beyond that is a waste.

It was a good meme: it was simple, it made you think, and, perhaps most importantly, it was controversial. If a meme is undeniable, there is no conversation, and it doesn’t trigger the algorithm. But if a meme has a little bit of wrong to go with its right, it’ll go viral.

When we see the 51%, we think, ‘Of course! All we need is a majority to tip the scales to the winner.’ As in politics or the World Series, you don’t need to win ’em all, just more than half, and you’re good.

But the meme suggests this applies to all decision making. What if it’s not like the World Series? What if it’s more like the NCAA tournament where you do have to win ’em all?

What should be clear to anyone is that not all decisions are created equal. Some are easy, with zero unknowns; others are difficult, with a slew of unknowns. We are often faced with what Donald Rumsfeld famously called ‘unknown unknowns’.

The easy decisions are like riding a bike over a wide, paved bridge: comfortable and familiar, something most people can do without thinking. The more difficult ones are like walking a tightrope over a gorge: precarious and dangerous, something even the most practiced tightrope artists must concentrate to pull off. At the very least, it is clear that the two kinds of actions are not the same and require different levels of skill and preparation. Suggesting the same approach for both situations would be foolhardy and likely disastrous.

So it is with decision making. Some decisions require little to no thinking at all. But others require heaps of thinking—analyzing data, modeling, comparison, analogy, framing. What would be overkill in the former might not be enough in the latter. 

A straightforward approach would be to distinguish between simple and complex problems. When a decision is simple, such as which shirt to wear to work or which restaurant to eat at, don’t deliberate too much. But when you’re deciding whether to take the job in another city, spend as much time as you can on the decision.

What we’ll find is that overthinking usually consists of applying the second kind of thinking to the first kind of circumstances: deliberating over data and analogies when a quick analysis would suffice. We see folks fretting over minutiae and things outside their control instead of focusing on the big picture and the things they can control. That kind of fretting is unnecessary, but it doesn’t follow that all deliberation is. All thinking isn’t overthinking.

The Return on Information Curve

In a fascinating 1973 study, Oregon Research Institute psychologist Paul Slovic showed the effects of gaining more information on decision-making. In a series of trials, Slovic gave a group of horse handicappers increasing amounts of information to help them predict the winners of races. At first, he gave the handicappers any five pieces of information they asked for. In the next round, he increased it to 10, then 20, and finally 40 pieces of information.

In the first round, the handicappers had a success rate of 17 percent, much better than they would have had if they had selected the winners at random. Their confidence was on par with their success, at 19 percent.

But Slovic found that the handicappers performed no better in any of the subsequent trials, picking winners in 17 percent of the races even when they had 40 pieces of information. What’s most interesting is that their confidence increased with the amount of information given, reaching 34 percent in the last trial. While performance remained steady, confidence doubled.

Slovic’s findings trace a curve of diminishing returns. Having five pieces of information is much better than having none, but having 40 is no better than having five. German electrochemist Kannasoot Kanokkanchana drew up a graph that makes it plain: 1x the work gets you suboptimal results, whereas 3x the work gets you to a satisfactory place. Beyond that, the time required to improve the results outweighs the improvement itself.
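
To make the shape of that curve concrete, here is a minimal sketch in Python. Only the flat 17 percent accuracy and the 19-to-34 percent confidence swing come from the study as described above; the intermediate confidence values are illustrative assumptions.

    # A rough sketch of the return-on-information curve from the handicapper study.
    # Only the first and last confidence figures are reported above; the middle
    # values are interpolated here purely for illustration.
    cues       = [5, 10, 20, 40]            # pieces of information per round
    accuracy   = [0.17, 0.17, 0.17, 0.17]   # hit rate stays flat
    confidence = [0.19, 0.24, 0.29, 0.34]   # roughly doubles (middle values assumed)

    for n, acc, conf in zip(cues, accuracy, confidence):
        print(f"{n:>2} cues: accuracy {acc:.0%}, confidence {conf:.0%}, "
              f"overconfidence {conf - acc:+.0%}")
    # Each extra batch of information buys no accuracy, only confidence:
    # the return curve is flat past the first few cues.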

Many will use these findings to suggest that we think too much in all decisions. But Slovic’s study was based on a particular kind of knowledge used for a particular kind of decision. While it’s clear that more information doesn’t help in the horserace prediction game, it’s not clear that the principle holds for other endeavors.

When Good Enough Is Not Enough

Most of the Underthinking Mentality is based on the fact that decisions have time horizons. If you spend too much time on a given decision, you’ll miss out on potential benefits. This is a prominent factor in today’s world, where the first-mover advantage can make or break a startup and many design processes are built on the Agile principle of learning through iteration. The quicker you can get to market, the quicker you learn, and the quicker you can improve.

In the Army, there is an acronym, ‘OBE’: ‘overtaken by events’. To become OBE is to have waited too long to act; you have become a victim of overthinking. The late four-star general Colin Powell said that in a bureaucratic environment, being OBE is a “felonious offense”. “You blew it. If you took too much time to study the issue, to staff it, or to think about it, you became OBE.”

To combat this, Powell used what has become known as the 40/70 Rule, named for the range of information you should have before making a decision. “I don’t act if I have only enough information to give me less than a 40 percent chance of being right,” Powell said. “And I don’t wait until I have enough facts to be 100 percent sure of being right, because by then it is almost always too late.”

These days, we can all relate to Powell’s sense of impatience. The bureaucracy of large corporations can be as frustrating as that of the government. It is for this reason that so many have gravitated to the 40/70 Rule in finance, marketing, and operations.

And yet, even in highly bureaucratic situations, there are risks to making decisions without all of the relevant knowledge. Powell’s rule is ironic, given that he was one of the central figures in the George W. Bush administration who initiated the second Iraq war based on a lack of information.

There can be no doubt that some decisions can be made with only 40 to 70 percent of the relevant knowledge. But when you are dealing with something where lives are at stake, you would hope for more certainty before jumping into action.

It doesn’t take much overthinking to come up with a few examples where 100 percent is 100 percent necessary:

  1. Aerospace Engineering: Even minor flaws can lead to catastrophic failures in aircraft or spacecraft, as we saw with the space shuttles Challenger and Columbia.
  2. Surgery: Precision and perfection are essential when performing delicate surgical procedures.
  3. Cryptography: Any gap in encryption can lead to security breaches and data leaks.
  4. Pharmaceutical Research: Safety and efficacy in drug development demand 100 percent accuracy to prevent an epidemic of iatrogenic injury.
  5. Art Restoration: Flawless restoration is essential to preserve valuable artworks (see Ecce Homo).
  6. Computer Chip Manufacturing: The tiniest defect in a microchip can render it useless.
  7. Precision Instrument Making: Such as in the construction of telescopes or electron microscopes.
  8. Mathematical Proofs: Mathematical rigor requires logical perfection to be considered valid.
  9. High-End Cuisine: Achieving perfection in fine dining is often the goal for top chefs.
  10. Music Performance: Musicians strive for perfection in their art to create outstanding performances.

My uncle used to have a saying as he would coach me on doing chores like sweeping the driveway—”It ain’t a spaceship.” He actually included a few expletives, but you get the idea—40 to 70 percent would be fine. But while I wasn’t building a spaceship, there are others who are building spaceships. And, for them, 40 to 70 percent is not good enough. You need 100.

Conclusion

What the open-eyed observer will come to realize is that it’s not a matter of overthinking or underthinking—it’s not a matter of quantity at all, but rather of quality. Despite all the hullabaloo regarding overthinking, it’s not how much thinking you’re doing, but whether it’s the right kind of thinking.

If there’s one takeaway from all of this, it is that we should stop referring to procrastination or worrying as overthinking, where the presumed solution is to think less, and instead consider it a matter of wrong thinking, where the solution is clear, precise, and sound thinking. Such is the main project of the Frameworks Mindset.