
The Programming Mindset

What programming teaches you about thinking — and why the most important thing to understand about computers is that they are dumb.

“It’s the only job I can think of where I get to be both an engineer and an artist. There’s an incredible, rigorous, technical element to it, which I like because you have to do very precise thinking. On the other hand, it has a wildly creative side where the boundaries of imagination are the only real limitation.”

— Andy Hertzfeld

The first and most important thing to understand about computers is that they are dumb.

Not slow, not limited in the way a human with a narrow specialisation is limited — dumb in a very specific sense: they have no inferential ability. A computer cannot notice that even though you clicked this button, what you probably meant to do was click that one. It cannot infer intent. It executes the instruction it received, not the instruction you meant to give.

This is disorienting at first. Computers feel intelligent — they do remarkable things. But what they do remarkably is execute instructions reliably. The intelligence, such as it is, is in the instructions. The moment you understand this, a lot becomes clear: when a program doesn’t work, the error is always yours. The computer did exactly what it was told.

This is a hard lesson to internalise. When you’ve read the same ten lines of code fifty times and cannot see the problem, the temptation is to conclude that something is wrong with the machine. It almost never is. Something is wrong with your model of what the code is doing.
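A minimal sketch of the kind of bug this describes (the example and names are mine, not the author’s): the function below is plainly meant to sum the first n elements of a list, but the machine executes `range(1, n)` literally — it neither knows nor cares what was meant.

```python
def sum_first_n(numbers, n):
    """Meant to sum the first n elements of numbers."""
    total = 0
    # Bug: range(1, n) yields 1..n-1, so this skips index 0
    # and stops one element early -- exactly what it says,
    # not what the author meant.
    for i in range(1, n):
        total += numbers[i]
    return total

def sum_first_n_fixed(numbers, n):
    total = 0
    for i in range(n):  # range(n) yields 0..n-1: the instruction we meant
        total += numbers[i]
    return total

print(sum_first_n([10, 20, 30, 40], 3))        # 50, not the intended 60
print(sum_first_n_fixed([10, 20, 30, 40], 3))  # 60
```

Reading the buggy version fifty times, the eye supplies the intended meaning; the gap only closes when you compare your model against what `range(1, n)` actually produces.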

Seymour Papert put it well in Mindstorms:

“When you learn to program a computer you almost never get it right the first time. Learning to be a master programmer is learning to become highly skilled at isolating and correcting ‘bugs,’ the parts that keep the program from working. The question to ask about the program is not whether it is right or wrong, but if it is fixable. If this way of looking at intellectual products were generalised to how the larger culture thinks about knowledge and its acquisition, we might all be less intimidated about our fears of ‘being wrong.’”

And: “The process of debugging is a normal part of the process of understanding a program. The programmer is encouraged to study the bug rather than forget the error.”

Programming is a feedback system with unusually high resolution. A program either runs or it doesn’t. When it doesn’t, the error message is precise — sometimes cryptic, but always pointing at something real. You cannot paper over it. You are forced, repeatedly, to locate the gap between what you thought you understood and what is actually true. This is uncomfortable and productive in equal measure.


The dumbness that causes this frustration is also what makes computers extraordinarily useful. No inferential ability means no hidden assumptions, no moods, no drift. A computer that works correctly today works correctly tomorrow. The reliability of automation depends entirely on this: you give it a rule, it follows the rule, every time, without variation.


What programming actually is

Philosophers distinguish between two kinds of knowledge: declarative and imperative. Declarative is knowing-that — facts, states, propositions. Imperative is knowing-how — processes, procedures, methods.

Computation is imperative. A computer doesn’t “know” the square root of 169. It knows how to compute it. When you ask a search engine a question, it doesn’t retrieve a stored answer from a lookup table — it executes a process that derives one. This is true all the way down. Even retrieving data from a database requires imperative knowledge: which database, where it lives, what to filter on, how to sort.
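The square-root example can be made concrete. A sketch of that imperative knowledge, using Newton’s method (one classic procedure among several; the function name and tolerance are my illustrative choices):

```python
def newton_sqrt(x, tolerance=1e-10):
    """Compute sqrt(x) by repeatedly improving a guess.

    The computer stores no table of square roots; it only
    follows this rule until the guess is close enough.
    """
    guess = x / 2 if x > 1 else x  # any positive starting guess works
    while abs(guess * guess - x) > tolerance:
        # Newton's step: average the guess with x / guess
        guess = (guess + x / guess) / 2
    return guess

print(newton_sqrt(169))  # 13.0 (to within the tolerance)
```

Nothing here is a fact about 169; it is a method that, pointed at 169, produces 13. That is the distinction between knowing-that and knowing-how in four lines of loop.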

Programming, then, is the description of computational processes. It is writing the rules by which your corner of the computational universe operates. Things will behave according to what you specify — exactly that, and nothing more.

Allen Downey captures this in Think Python:

“This way of thinking combines some of the best features of mathematics, engineering, and natural science. Like mathematicians, computer scientists use formal languages to denote ideas (specifically computations). Like engineers, they design things, assembling components into systems and evaluating trade-offs among alternatives. Like scientists, they observe the behavior of complex systems, form hypotheses, and test predictions.”

You cannot think only like one of these. At different moments in solving a programming problem, you need all three. That flexibility — the ability to switch between the formal, the practical, and the empirical — is the programming mindset. It is also, not coincidentally, a description of rigorous thinking in general.