Good Intentions Don't Scale

December 30, 2020

Could good intentions scale? There are merits to keeping things small. What is it that makes our society today so disconnected, people so misinformed, social mores so callous? Has humanity lost its heart? I wouldn't blame anyone for believing that to be the case. Take one look at how we disregard the people living on the streets of our cities, fail to address climate change, or refuse to wear masks or get vaccinated during a pandemic. Some blame education, some blame greed, some blame ignorance, some blame racism, some blame technology. I think these all actually come from the same place - at some level, people seem unwilling or unable to accommodate, or even consider, the scale, variety, and complexity of the modern world.

In "Why Information Grows" by Cesar Hidalgo, he argues the cause of the world's complexity is simple. He defines the "personbyte" as a unit of information - it is the amount of information one person can comprehend. How much can one person understand about the world? At times, in the past, being a "renaissance man", someone with enough expertise in several fields to push the frontier of human knowledge in all of them simultaneously, was actually possible. As the world became more complex, that of course became more and more difficult. The world we live in today clearly accommodates a level of complexity that is much higher than it once was. But human minds, as remarkable as they are, have not changed at quite the same scale. We are still mostly only as intelligent as we needed to be in a much simpler world. So how is this possible?

Hidalgo's claim, which follows naturally, is that something else had to give - and that something else was the information itself. You see, one incredible thing about information is that it can be encoded. You can take the information about something, say the shape of a car, and use a symbol to refer to it. That single symbol, so long as you know where to look it up, carries all the same information as the original. Of course, this does not scale indefinitely: as soon as you need to store more pieces of information than you have unique symbols, you have to settle for some non-unique mapping. This kind of compression or encoding can be done deliberately - it is what the field of data compression studies - but it also takes place fairly spontaneously. Through communication, through habit, through process, through systems, we naturally introduce shorthands, shortcuts, shortcakes. We specialize, storing different information in different heads, and we define terms so that we can communicate about them as necessary without everyone needing to know everything. Through these processes, very simple creatures can accomplish incredible things. Bees, ants, and termites build structures whose complexity is unfathomable to any individual creature - yet through structure, process, and specialization, no individual needs to fathom the larger picture in order to play its part.
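
To make the encoding idea concrete, here is a minimal sketch in Python - my own illustration, not something from Hidalgo's book - of a symbol standing in for a richer description via a shared lookup table, and of why a fixed supply of unique symbols eventually forces a non-unique mapping.

```python
# A shared lookup table lets a short symbol stand in for a rich description.
# The symbols and descriptions here are invented for illustration.
symbol_table = {
    "sedan": {"wheels": 4, "doors": 4, "body": "three-box"},
    "coupe": {"wheels": 4, "doors": 2, "body": "two-box"},
}

def decode(symbol: str) -> dict:
    """Recover the full description from its symbol, if you know where to look it up."""
    return symbol_table[symbol]

print(decode("sedan"))  # one word carries the whole description

# The catch: a code of fixed length has a fixed number of unique symbols.
# One byte gives at most 2**8 = 256 of them, so naming a 257th distinct
# thing forces some symbol to cover more than one meaning - the mapping
# becomes non-unique, and context (or specialization) has to fill the gap.
print(2 ** 8)  # 256 distinct one-byte symbols
```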

We are unique, however, in that almost all of these changes happen not in our genomes but outside of us, in our shared culture. We've changed our language to accommodate words we do not understand, our education system prioritizes specialization, our society is hierarchical, and our economy cleverly simplifies almost all decisions into simple questions of price. Through these innovations we've accommodated enormous complexity, but at what cost? For most people, that cost is not just under-investigated, it is completely unacknowledged. Instead of seeing the costs and benefits of these systems as deliberate tradeoffs, we find ourselves endlessly polarized and confused. We treat the resulting failings as failings of particular solutions, rather than as inevitable failures of any solution to this problem. In computer science we run up against this all the time: no matter what hard problem you are trying to solve, there are hard boundaries and limitations involved. Good engineering and algorithmic problem solving are as much about working around those limitations as they are about pushing the frontier.
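
One of those cultural encodings - price - is worth making concrete. Here is a toy sketch in Python, entirely my own example, of how collapsing a product into the single number a shopper compares necessarily throws away almost everything else about it.

```python
# Toy illustration: price as a lossy encoding of a product.
# Every field and value below is invented for the example.
product = {
    "price": 0.49,                        # the one field most buyers ever see
    "labor_conditions": "poor",
    "environmental_impact": "high",
    "political_entanglements": "extensive",
}

def encode_for_shopper(p: dict) -> float:
    """Collapse a many-dimensional product into the single number
    the market asks a shopper to compare."""
    return p["price"]

print(encode_for_shopper(product))  # 0.49 - every other dimension is gone
```

The point of the sketch is not that price is useless - it is an astonishingly effective compression - but that the dimensions it discards are exactly the ones good intentions would need in order to act.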

Stepping back - in layman's terms, what is the limitation I am describing, and where does it show up? The way I see it, people are, and will be for the foreseeable future, limited in how much they can understand. When we see the ramifications, we think: wow, capitalism is so short-sighted; technology is isolating and dividing us; misinformation is sowing dissent and distrust; the wealthy are greedy and irresponsible. But beneath all of these complaints lies the same problem: some information or consideration is missing. What we see people do falls short of what they could do, if only they understood. But I believe it is impossible for them to understand. The level of responsibility we would expect of everyone is simply unrealistic. If we rely on individuals making decisions that are considerate of everyone and everything else in the world, we are not setting ourselves up for success. Even in a world of incredibly well-intentioned people, the complexity is far too great. In today's world, buying the wrong banana at the supermarket means supporting a century-old, conflict-inclined, banana-republic-supporting organization that has sowed civil revolutions, installed totalitarian dictators, and indirectly murdered hundreds of thousands of people in order to maintain its market position. Your only direct hint is that the bananas are 30 cents cheaper. Good intentions may be necessary, but they are not sufficient. Good intentions don't scale.

Or could they? Next up, we'll discuss the implications of network theory. Can we design a structure for society where people don't need to understand the global implications in order to make globally beneficial decisions? Could people simply be kind to their neighbors, and that be enough?