Philosophy Sequence: Mind and Society

Tribalism

The problem is not that we disagree. It’s how we disagree — and how tribal instincts corrupt the mechanisms that should resolve disagreement.

“Wild animals, lacking imagination, almost never do disastrously stupid things out of false perceptions of the world about them. But humans create artificial disasters for themselves when their ideology makes them unable to perceive where their own self-interest lies.” — E.T. Jaynes

The world is divided. It always has been, and new problems will always generate new disagreements about how to solve them. This is not a crisis — divergence of opinion is how better answers eventually surface. The problem is not that we disagree. It’s how.


You Can’t Judge Your Evidence Without Comparing It

Consider how you decide whether there’s food in the house. You check the fridge, find it empty, and become fairly confident there’s nothing to eat. You check again — more confident still. But you never looked in the pantry, the fruit bowl, or the freezer. You’ve only confirmed your starting assumption by revisiting the same source.

This is how most people form political and ideological views.

Mill said it better than anyone has since:

“He who knows only his own side of the case knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no ground for preferring either opinion.”

The tribe doesn’t matter — scientific, progressive, conservative, religious. Consistently re-engaging with confirmatory sources inflates confidence without improving accuracy. Your ideas don’t become reliable by staying in the barracks trading stories with allies. They become reliable by surviving contact with people who think you’re wrong.


Why We Fail

Three tendencies undermine this — and all three are standard-issue human.

Shortcuts. The brain defaults to rules of thumb because they’re usually good enough. Defer to experts; go with the group. These heuristics have always been imperfect, but as the world gets more complex, the gap between the shortcut and the right answer widens.

Cognitive biases. Even when we slow down and think, our prior beliefs shape what we see. Present the same data to two people with different starting positions and they’ll often reach different conclusions — not because they’re stupid, but because what we believe changes what we notice. We attend to evidence that confirms what we already think and scrutinise evidence that contradicts it.

Social goals. Holding a view different from your group’s triggers stress. Disagreement with the tribe activates the same neural pathways as other forms of threat. We often believe what we believe, in part, because it’s what the people around us believe — and maintaining that membership feels important.

None of this is unique to your opponents. These are features of the human brain, not defects in the people you disagree with.


Why Attacking Backfires

Stress narrows thinking. S.I. Hayakawa observed that under threat, a person’s perception collapses from multi-valued to two-valued: everything becomes either safe or dangerous, us or them. Nuance disappears not because they’re incapable of it, but because the cognitive conditions for it are gone.

This matters when you consider what happens when you attack someone for their ideology. You don’t dislodge their views — you activate their threat response, which pushes them deeper into group loyalty. Lahti and Weinstein found that members of a group adhere more strongly to its moral framework precisely when the group’s stability is challenged. Attack the tribe, and the tribe closes ranks.

Bad ideas are not defeated by attacking the people who hold them. They’re defeated by acquiring converts — people who defect voluntarily because they found somewhere better to go. You cannot be that destination while you’re also the threat.


Being Right Isn’t Enough

Carnegie understood something that most political discourse ignores: humans are not creatures of logic. They are creatures of emotion who are capable of logic. If you’re only well-reasoned and not reasonable, you’ll be correct in ways that persuade nobody. If you’re only reasonable and not well-reasoned, you become a social mirror — agreeing your way toward nothing.

Both are required. Well-reasoned, so you’re actually right. Reasonable, so you can be heard.

The goal when engaging with someone who disagrees is twofold: to acquire whatever they know that you don’t, and to offer the same in return. You can only do either if the conversation doesn’t become a threat to be survived.


You cannot hold a mirror to your own biases. That’s the point. You need someone standing in a different ideological position to show you what you can’t see from where you’re standing — and you have to be willing to look.

Hate being wrong. Not other people for being wrong.

Besides: if they are wrong, what good does hating them do?