Tag: mental models

This website was archived on July 20, 2019. It is frozen in time on that date.
Exolymph creator Sonya Mann's active website is Sonya, Supposedly.

Wanted: Efficacious Heuristics

A system is an arrangement of interlocking elements or moving parts that all affect each other, sometimes recursively. The world is full of them (and in fact each of those systems is itself an element of the larger system that is the whole universe).

Mathias Lafeldt wrote about how the human mind copes with this:

“Systems are invisible to our eyes. We try to understand them indirectly through mental models and then perform actions based on these models. […] Our built-in pattern detector is able to simplify complexity into manageable decision rules.”

Lafeldt’s explanation reminds me of the saying that the map is not the territory. Reality (which is a system) is the territory. Our mental models are maps that help us navigate that territory.

Artwork by Kyung-Min Chung.

But no map is a 1:1 representation of reality — that would be a duplicate, or a simulation. Rather, our maps give us heuristics for interpreting the lay of the land, so to speak, and rules for how to react to what we encounter. Maps are produced by fallible humans, so they contain inaccuracies. Often they don’t handle edge cases well (or at all).

Nevertheless, I like mental models. They cut through all the epistemological bullshit. Instead of optimizing a mental model to be true, you optimize it to be useful. An effective mental model is one that helps you be, well, more effective.

This is why Murphy’s Law is so popular despite being incorrect much of the time. Some plans do go off without a hitch. But expecting the chaotic worst is a socioeconomically adaptive behavior, so we keep the idea around.

My personal favorite mental model is a simple one: “There are always tradeoffs.” One of the tradeoffs of using mental models at all is that you sacrifice understanding the full complexity of a situation. Mental models, like maps, hide the genuine texture of the ground. In return they give you efficiency.

Guillotineplex

Are you familiar with the killing machine? It does what it sounds like. And unfortunately the machine is indiscriminate — the humans who operate and maintain it choose the machine’s targets, but they don’t always do a good job. So the machine terminates murderers, but even more often its victims are innocent.

Such is the way of a killing machine. It’s just a machine. Objects — or assemblages of objects — can’t be responsible or culpable for their “actions”.

Photo by mel.

I could be talking about a few different machines. I could be talking about US drones in the Middle East, or about the United States Armed Forces as a larger whole. I could be talking about lethal injection setups, or about our entire criminal “justice” system.

In a literal sense machines are different from bureaucracies. But regarding human organizations as machines can be a useful mental model. When we zoom out to that perspective, it becomes obvious how little the good intentions of the participants matter.

A cog in a machine can be very well-made and run smoothly, interacting admirably with the parts next to it. But if the overarching design of the machine is to enable corrupt operators to execute their enemies, well…

© 2019 Exolymph. All rights reserved.
