Sequence: Thinking Clearly

Rationality

“Well, pray if you like, only you’d do better to use your judgment.” — Tolstoy, War and Peace

Most people have a rough idea of what rationality means. Most of those ideas are wrong.


The Straw Vulcan

The dominant image of the rational person is Mr. Spock: emotionless, data-obsessed, perpetually baffled by human behaviour. This archetype has a name — the Straw Vulcan — and every one of its defining traits is irrational.

The Straw Vulcan:

  • ignores or suppresses emotion
  • only values what can be measured
  • expects everyone else to be rational, and is baffled when they aren’t
  • refuses to decide until all information is in

Suppressing emotion means discarding a vast store of accumulated wisdom encoded in feeling. Recoiling from a hot stove doesn’t need conscious deliberation — and the quickness of that response doesn’t make it irrational. Only valuing the quantifiable means ignoring uncertainty, randomness, and most of what makes life worth living. Expecting everyone to be rational, after repeated evidence to the contrary, means never updating your model of human behaviour — which is itself a failure of rationality. And waiting for complete information before acting is a strategy that guarantees you’ll never act.

The Straw Vulcan is a straw man. Beating it tells you nothing about rationality.


What Rationality Actually Is

Rationality has two components.

Epistemic rationality is about belief. It’s the systematic effort to hold beliefs that best match reality — to keep updating your model of the world as new evidence arrives. The goal is to make your map match the territory. Not perfectly — that’s impossible — but better, continuously, without flinching when the update is uncomfortable.

Instrumental rationality is about action. Given what you want, what’s the most effective way to get it? It doesn’t tell you what to value — that’s your job. It tells you how to pursue your values without fooling yourself about whether you’re making progress.

Together: rationality is about what is true, and what to do.

The two depend on each other. Bad maps produce bad routes. A mind too busy mapping the territory to ever traverse it isn’t accomplishing much either.


Bounded Rationality

Humans are not logically omniscient. We have limited time, limited information, limited cognitive capacity. The question isn’t whether to be a perfect reasoner — that option isn’t on the table. The question is how to reason well given real constraints.

Herbert Simon, the Nobel laureate who formalized this, called the solution satisficing: searching through alternatives until you find one that meets an acceptability threshold, then stopping. Not the optimal solution; a good enough one, found efficiently.

This sounds like a compromise. It is. But it’s also the only rational approach to a world where information is scarce, decisions are time-pressured, and perfect solutions are theoretical.

Simon framed the trade-off cleanly: you can find an optimal solution for a simplified model of the world, or a satisfactory solution for the real world. Neither dominates the other. Both are valid strategies depending on the stakes and your information.
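Simon's rule can be sketched as a search procedure. The function names and the evaluation counter below are illustrative, not from the text; the point is only that satisficing stops at the first acceptable option, while optimizing must examine them all.

```python
def satisfice(options, threshold, evaluate):
    """Examine options in order; return the first one whose score
    meets the acceptability threshold (Simon's satisficing)."""
    for option in options:
        if evaluate(option) >= threshold:
            return option  # good enough -- stop searching
    return None  # nothing met the threshold

def optimize(options, evaluate):
    """Exhaustive search: score every option, return the best."""
    return max(options, key=evaluate)

# Hypothetical decision: three options with known scores.
scores = {"a": 3, "b": 7, "c": 10}
evaluations = []

def evaluate(option):
    evaluations.append(option)  # track how much work each strategy does
    return scores[option]

best = optimize(["a", "b", "c"], evaluate)          # scores all three
good_enough = satisfice(["a", "b", "c"], 5, evaluate)  # stops at "b"
```

The optimizer always pays the full search cost; the satisficer's cost depends on how early an acceptable option appears, which is exactly the efficiency Simon had in mind.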


Collective Rationality

Rationality scales. Institutions and incentive structures can produce rational outcomes even when the individuals inside them are not being particularly rational.

Markets are the textbook example. Prices aggregate dispersed information across millions of actors, reducing the computation any individual needs to understand supply and demand. The profit-and-loss mechanism rewards accurate models of the world and punishes inaccurate ones. The result is a system that is, in aggregate, more epistemically accurate than any of its participants.

Religion is a more provocative case. It often looks epistemically irrational — its map of the territory is wrong in important ways. But from an instrumental and collective standpoint, it may produce adaptive behaviour: cooperation, deferred gratification, community cohesion. Whether those benefits outweigh the epistemic costs is a genuine question, not a simple one.


The Enemies

With an actual definition in hand, the standard objections to rationality become easier to diagnose.

Postmodernism holds that truth is not discovered but constructed — a tool of power deployed by the dominant. The humble core of this is real: our beliefs are shaped by our circumstances, and we can’t fully step outside them. But the conclusion — that reason itself is a power play — is self-refuting. An argument that claims to reveal a truth about the nature of truth has sawed off the branch it’s sitting on. Epistemic rationality doesn’t promise a view from nowhere. It promises a better view than the one you started with.

The romantic and religious objection runs differently: reason is soul-destroying. More computation, less humanity. If we rationalise love, art, or faith, we drain them of what makes them worth having. This is a more sympathetic worry, but it misidentifies the target. Understanding why you love someone — the neuroscience, the evolutionary pressures, the attachment patterns — doesn’t dissolve the love. A map of a mountain doesn’t flatten it. Instrumental rationality has nothing to say about what you should value. It only asks whether you’re pursuing those values effectively.

Both objections are fighting the Straw Vulcan. Neither touches the real thing.


Rationality is the discipline of making your beliefs match the world, and your actions match your goals.

Nothing soulless about it.