This page covers every mental model needed for making great decisions.
Picture this: Office workers frantically scurry around a boardroom looking for an elusive signal among all their data to decide how to resolve an emergency situation.
Meanwhile, a lone employee — sitting silently in the back of the room — calmly identifies just the most pertinent data then clears her throat and authoritatively presents a startlingly insightful conclusion.
The others say, "Oh, right..." And now everyone knows what to do.
That zen employee knew about mental models. She learned critical thinking.
"With an hour to save the world, I'd spend 50 minutes defining the problem and 10 minutes solving it."
A year ago, I was fundraising for a startup. In the process, I met a lot of investors. It struck me how uniformly insightful they were. I had not seen anything like it before.
Their wisdom extended beyond domain expertise. They seemed to have a high-level ability to connect dots between disparate concepts with a degree of pragmatism that made me wonder, "Are these people exceptionally bright?"
I didn't have an answer. All I knew was that they didn't seem to be innately smarter.
A couple months later, an investor friend asked me, "Have you heard of mental models?" I hadn't. I looked them up, and everything suddenly clicked: Investors didn't necessarily have greater intelligence — they aren't necessarily capable of being rocket scientists — rather they learned tools for making good decisions.
I didn't even know there was a toolbox to begin with.
I began researching critical thinking. I was dismayed to only find material from university lectures. Expectedly, it was dry and uninspired. Unexpectedly, it was also sparse and disjointed: There wasn't a unified framework to deploy the key 20% of critical thinking concepts into daily conversation and writing.
And none of the examples were illuminating. Or engaging to read.
So I decided to put the work into writing what I believe is the first holistic, layperson's guide to critical thinking.
Hopefully the way I'm presenting fallacies, biases, and models makes their relationships seem self-evident, but these topics aren't historically taught within a singular framework. Nor are they accompanied by prescriptions for refutation.
I had to rigorously think through how everything in this guide connected. And I had to generate original refutations, techniques, and models.
By the way, if you've read this far, you'd be silly to not say hello on Twitter 😜 I'll be posting there when my next handbook is out.
Critical thinking is the act of using valid facts to form sound arguments.
To form an argument, you research evidence:
Researching is finding high-leverage facts among available data. Once found, we use logic (mental models) to mold those facts into arguments, such as:
An argument's logic is sound if it's devoid of biases (preconceptions that taint your research) and fallacies (faulty reasoning).
Mental models are the shortcut to consistently achieving soundness in our arguments.
A mental model is simply a fancy name for a framework we use to methodically assess a situation. We input evidence into it, and it outputs a conclusion for us.
This helps us sidestep the biases and fallacies our intuition would have relied on:
Here's the cool thing: Models are loosely consistent. Given the same inputs, many people will get similar outputs. This gives critical thinkers a means to share conclusions that can be quickly accepted and rallied around by everyone. How cool is that?
Before we learn the best mental models, let's develop our habit of critical thinking.
We've been conditioned to think being smart means being "quick on our feet." That's useful some of the time, but making important decisions is not one of those times; the human brain is not wired to make great long-term decisions through snap judgments.
Learning critical thinking is therefore the discipline of refuting the snap judgments our minds are eager to make.
Consider how we think along a continuum of calculated thought:
The more we transition toward critical thinking, the more energy we must expend and the less egotistical we have to be about the quality of our gut instincts.
Further, the more self-aware you are about how clearly you're thinking, the more often you'll detect when you should be 1) avoiding biases and fallacies and 2) using models.
To develop this awareness, you must recognize when you're researching or arguing.
Daily, you encounter a few big opportunities to leverage critical thinking exercises. They arise whenever you've begun to argue or research:
Consider our stew of daily thoughts:
If so, pause to deflect all snap judgments. In your state of awareness, run the argument in your head through the five mental models (below) to better structure your thoughts and to better navigate your existing knowledge to derive insights.
(All the fallacies and biases are summarized in the cheat sheet at the bottom of this page. So you can use that to memorize them all.)
The models are anticlimactically simple. In fact, if they were complex, I'd tell you to be suspicious of their efficacy.
So read them slowly and identify opportunities in your life where they're applicable.
This model helps you avoid conflating urgency with importance when prioritizing:
To measure the urgency and importance of a task, consider the impact it'll have 10 hours from now, 10 weeks from now, and 10 months from now:
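The 10/10/10 check can be sketched as arithmetic. Here's a minimal Python sketch, assuming a made-up 0-10 impact score at each horizon; the function name, the scale, and the idea of averaging the two longer horizons into an "importance" score are my own illustration, not a prescription from the model:

```python
# Hypothetical sketch of the 10/10/10 check. Scores (0-10) are invented.
# Urgency is what a task demands right now; importance is what still
# matters at the longer horizons, so we average those two.
def ten_ten_ten(impact_10_hours, impact_10_weeks, impact_10_months):
    urgency = impact_10_hours
    importance = (impact_10_weeks + impact_10_months) / 2
    return {"urgency": urgency, "importance": importance}

# An "urgent" inbox fire vs. a non-urgent skill worth learning:
email = ten_ten_ten(8, 1, 0)      # screams now, forgotten in 10 weeks
learning = ten_ten_ten(1, 6, 9)   # quiet now, compounds over 10 months
assert learning["importance"] > email["importance"]
```

The point of the sketch is only to separate the two axes: a task can max out urgency while scoring near zero on importance, and the model asks you to prioritize on the second number.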
This model helps you identify the motivations underlying your decision making.
First, note how there's a set of core values every project or business decision relies on:
Depending on which category you're facing, list the corresponding values one per line. Then write the two projects/decisions you're considering beside the values. Next, draw arrows from the values toward the two choices. Make the arrows one to three units in length depending on how much the decision they're pointing toward fulfills the value:
Finally, mentally project yourself out to age 80 (or your business out to 2 years from now) to determine which decision you would most regret not having made given your prioritization of the above values. That's the choice you should go with.
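The arrow diagram above reduces to simple addition: each arrow's length (one to three units) is how strongly a choice fulfills a value, and the choice collecting the greatest total arrow length wins. A sketch, with values, choices, and scores all invented for illustration:

```python
# Hypothetical sketch of the values/arrows model as arithmetic.
# Each number is an arrow length (1-3): how much the choice fulfills
# that value. The choices and scores here are made up.
arrows = {
    "Startup":     {"autonomy": 3, "income": 1, "learning": 3},
    "Big company": {"autonomy": 1, "income": 3, "learning": 2},
}

# Sum each choice's arrow lengths and pick the largest total.
totals = {choice: sum(scores.values()) for choice, scores in arrows.items()}
best = max(totals, key=totals.get)
print(best)  # prints "Startup" (7 total units vs. 6)
```

The mental projection to age 80 is the gut-check on top of this: if the arithmetic says one choice but the regret test says the other, your arrow lengths probably don't reflect your real priorities.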
Example: In-depth walkthrough of how I used this model to decide my career path.
Often, 80% of outputs originate from just 20% of inputs. So, increase efficiency by allocating most of your resources toward the top 20%.
Whenever you're about to make a critical allocation of time, money, or resources, list all the inputs being allocated toward the objective. Then, rank them by how much output (e.g. revenue, products) each one produces. Narrow in on the top 20%.
That top 20% will shift over time, so re-analyze outputs at fixed intervals.
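The ranking step is mechanical enough to sketch in a few lines. Assume you've measured one output per input (the revenue figures below are invented); the function name and the round-up cutoff are my own choices:

```python
# Sketch of an 80/20 pass: rank inputs by the output each produces,
# then keep the top 20% of inputs. All data here is hypothetical
# (e.g. revenue per customer).
def top_20_percent(outputs_by_input):
    ranked = sorted(outputs_by_input.items(), key=lambda kv: kv[1], reverse=True)
    cutoff = max(1, round(len(ranked) * 0.2))  # at least one input survives
    return [name for name, _ in ranked[:cutoff]]

revenue = {"A": 500, "B": 90, "C": 4000, "D": 60, "E": 30,
           "F": 20, "G": 10, "H": 8, "I": 5, "J": 2}
print(top_20_percent(revenue))  # prints ['C', 'A']: 2 of 10 inputs, ~95% of revenue
```

Re-running this at fixed intervals (as the model prescribes) is just re-measuring `revenue` and calling the function again; the surviving 20% will drift over time.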
Highly disciplined 80/20 thinkers will accomplish in one lifetime what you would in four.
A system is anything composed of interdependent parts. Examples of systems include machines, businesses, and apps. Similarly, a process is anything composed of interdependent steps, e.g. supply chains, customer onboardings, and film productions.
For both systems and processes, the configurations and relationships of their components are usually extrapolations of what we've seen before.
Therefore, to discover insights that drastically increase the efficiency of a system or process, we must consciously reject reasoning by analogy and a blind adherence to convention. (Analogies are shortcuts for assuming one thing is like another. Convention is how people told you things are.) Both feed your anecdotal and familiarity biases.
We do this by breaking a system/process down and clinically assessing its components:
Let's look at an example.
Let's discover insights for a modern car dealership.
Let's see if we can remove or replace these components:
Cars: Cars are central to the objective, so we can't remove or replace them.
Salespeople: Could we sell cars without salespeople? Well, we could place digital kiosks inside the dealership to answer common questions and provide video walkthroughs. Or we could make the kiosks dashboard-mounted iPads located inside the cars so customers can experience the interiors while learning.
We could also create a web-first sales model with online video material so customers only need to come in-store for test drives.
Since removing salespeople could save significant overhead, we could pass along the savings to customers in exchange for our minimalist approach. This could very well be a notable efficiency insight to consider further!
Car lot: If we still wanted to sell cars without paying the high rental costs of car lots, we could operate out of a small office and disburse our vehicles across the metro area. In fact, if we paid $300/mo to keep each vehicle in a garage (and only kept the hottest-selling vehicles on-hand), our total monthly parking costs could be much, much lower than the cost of renting a large car lot.
And, whenever a customer schedules to come in, we just meet them at the car, or we drive it to their home for a luxury experience!
Since consumers now shop online for everything, and they can find videos, images, and 3D walkthroughs of every vehicle, is a lot required to stay in business? Doesn't look like it, and the savings passed onto customers could be significant.
Financing: Removing financing would turn away many buyers because vehicles are commonly financed or leased. And there's no real replacement for financing; it's already about as simple as it can be. Plus, it's usually outsourced to third parties, so it's low-overhead. Given that there's not much overhead reduction to justify the inevitable drop in sales, we'll keep this component.
Our potential insights that pass this two-point test are the ones worth considering:
You should constantly be on the lookout for systems/processes. This model can be applied to every system (e.g. government) and process (e.g. marketing) you encounter.
Use this model to prevent a problem from recurring. For example:
The following process will help you systematically generate high-leverage solutions:
Your ice cream cone melted. Why? You held it in direct sunlight for too long. Why? Because you were distracted. Why? Because you saw a pretty pony. Why? Because the pretty pony was at the fair.
Now we pause because we've reached a cause we couldn't have prevented. (We're not in charge of the fair's attractions.) When this happens, we go one step backwards:
Because you saw a pretty pony.
And then we ask, How could I have prevented this?
By only doing one thing at a time. (I could have waited until I finished my ice cream before I looked at the ponies.)
This begins our new chain of Why's: We continue asking, Why did/didn't I do this?
I failed to do one thing at a time because I was overwhelmed by the fair.
Solution A: When I do something new, I must be cautious not to multitask.
Let's continue asking Why? to get higher-level conclusions.
Why was I overwhelmed? Because I had never been to a fair before.
Solution B: Partake in more social experiences to avoid being out of the loop.
We continue asking Why did/didn't I do this? of our failure to implement the prevention until we reach a cause where a solution would be too broad to be useful.
For example, if we continued further, we might get to this cause:
Why didn't I go to a fair before? Because my parents don't take me places. Solution C: Too broad. Lecturing parents won't truly prevent ice cream spills.
So now we compare Solutions A and B to conclude which is highest leverage (according to 80/20 analysis).
Confused by all this? Don't be. Just look at the graph below then re-read this section. Or, if you're getting bored, just skip to the next section altogether!
This model is purposefully loosely structured. It will produce different answers for different people. There's no right or wrong way to follow it; it's simply a framework for preventing you from overlooking potential solutions due to your biases.
Preventing the Titanic from sinking
We could say it sank because the captain didn't see the iceberg. A level higher, we'd say the captain didn't see it due to poor weather conditions. A level higher, we'd say poor weather occurred due to ecological phenomena.
Now we have to pause, because we can't control the weather. So, we go back to the previous cause of "Not being able to see poor weather" and we ask, What's the most sensible way to have prevented this? We could argue that all-weather iceberg detection technology (e.g. sonar, radar) is a good approach.
So now that we have a reasonable prevention, we have to question why we failed to implement the prevention: We begin asking, Why did/didn't I do this?
We could say the ship's designers lacked the foresight to include iceberg detection technology. Why? Because the designers lacked experience building large ships.
Pause. Now we're at a high-level cause that there's a high-leverage solution to: We could have instituted catastrophe prevention awareness processes into the ship's architectural and technological design phase.
Now we have a potential solution to work with. We could optionally keep asking Why did/didn't I do this? to propose additional solutions.
Note that multiple people can repeat this process to derive alternative high-leverage conclusions to consider. Then we can compare all of them based on 80/20 allocation: Which requires the fewest resources to produce the greatest benefits?
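The whole chain can be written down as plain data, which makes it easy for multiple people to record and compare their versions. A sketch mirroring the ice cream example; every string and the structure itself are my own illustration of the process, not part of the model:

```python
# Sketch of a 5 Whys chain as data: each entry is (effect, cause),
# walking backwards from the failure. Causes we could have controlled
# get a candidate prevention attached; causes we couldn't (the fair's
# attractions) get none. All strings are invented for illustration.
chain = [
    ("ice cream melted", "held in direct sunlight too long"),
    ("held in direct sunlight too long", "I was distracted"),
    ("I was distracted", "I multitasked at a new event"),
    ("I multitasked at a new event", "overwhelmed: first time at a fair"),
]
solutions = {
    "I multitasked at a new event": "Do one thing at a time somewhere new.",
    "overwhelmed: first time at a fair": "Partake in more new experiences.",
}

# Collect every candidate solution the chain surfaces, in order:
findings = [solutions[cause] for _, cause in chain if cause in solutions]
print(findings)  # two candidate solutions, ready for 80/20 comparison
```

The final step from the text (comparing Solutions A and B by 80/20) then operates on `findings`: which prevention costs the least to implement relative to the failures it prevents?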
The only "secret" to thinking critically is being rigorous in your application of it. The difference between you and, say, Elon Musk is his incredible discipline in adhering to models and avoiding bias. He doesn't lazily revert to his instincts for big decisions.
You too must pass all your big decisions through the lens of critical thinking. This means using mental models not just for work decisions, but also for managing your life, choosing hobbies, and navigating the logic of spoken and written arguments.
To put it bluntly, everything you've been learning in this guide comes down to this:
♖ "Your entire life runs on the software (models) in your head. Why wouldn’t you obsess over optimizing it?" — Tim Urban
Part of that optimization is simplification. Models do not need to be complex. Take it from the man who popularized mental models and used them to grow Berkshire Hathaway to a $400B market cap:
"Some of the worst business decisions I’ve ever seen are the consequence of complex calculations and projections. They do that in business schools because, well, they have to do something."– Charlie Munger (Warren Buffett's investment partner)
I've compiled my best mental models and shared them with you. Musk has done the same. Now you too must flesh out your go-to models. Learn critical thinking skills every day of your life.
How? Keep an eye out for the complex, recurring tasks you encounter. Identify commonalities among the solutions to those tasks so you can work backwards to develop a framework to turn data into similar conclusions.