On March 30, 2017, a large portion of the Interstate 85 (I-85) highway in Atlanta, GA collapsed after a massive fire raged underneath it.
Because the highway is a key piece of infrastructure carrying thousands of cars every day, experts predicted severe traffic congestion and delays. Yet none of this materialized. People simply changed their behavior; in fact, Atlanta's public transportation system (MARTA) reported a 25% spike in ridership following the incident.
On the other hand, adding a lane to an existing highway can actually make congestion worse, a counterintuitive result known as Braess's Paradox. Traffic congestion is one of those complex problems that simply cannot be solved with the direct solution of building more roads.
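To see how adding a road can hurt, here is a minimal sketch of the standard textbook example of Braess's Paradox (the numbers are from that classic example, not from the I-85 incident): 4000 drivers choose between two routes, each made of one congestion-sensitive link (T/100 minutes, where T is the traffic on it) and one fixed 45-minute link. Adding a "free" shortcut between the two congested links makes every individual driver better off switching, yet leaves everyone worse off at equilibrium.

```python
# Classic Braess's Paradox example: 4000 drivers travel from Start to End.
# Each of the two original routes has one congestion-sensitive link
# (T/100 minutes, where T is the number of drivers on it) and one fixed
# 45-minute link.

DRIVERS = 4000

def route_time(drivers_on_congested_link):
    """Travel time for one original route: congested link + fixed 45-min link."""
    return drivers_on_congested_link / 100 + 45

# Without the shortcut, the equilibrium is an even split between the routes.
before = route_time(DRIVERS / 2)  # 2000/100 + 45 = 65 minutes per driver

# Add a zero-cost shortcut joining the two congested links. The path
# congested -> shortcut -> congested is now the best choice for every
# individual driver, so at equilibrium all 4000 drivers use both
# congested links.
after = DRIVERS / 100 + 0 + DRIVERS / 100  # 40 + 0 + 40 = 80 minutes

print(f"Before the shortcut: {before:.0f} min per driver")
print(f"After the shortcut:  {after:.0f} min per driver")
```

No driver can improve their own time by switching back after the shortcut opens, yet everyone's commute is 15 minutes longer than before — the direct "solution" of adding capacity made the system worse.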
Have there been times when you tried to tackle a problem head-on and failed? Some problems are best tackled indirectly. Why?
To understand why this happens, we first have to understand a few things about complex systems. As explained in my previous article, there are three types of systems, categorized by the level of constraint on both the system and the agents operating in it.
While ordered systems are transparent (simple systems are transparent to everyone, and complicated systems are transparent to experts), complex systems seem transparent but are in fact opaque. We simply cannot know everything that happens in these systems. We think we know, but we usually have a very limited understanding of the complexity inherent in these domains.
John Kay calls this phenomenon Obliquity and explains it in detail in his book by the same title. He writes:
The environment—social, commercial, natural—in which we operate changes over time and as we interact with it. Our knowledge of that complex environment is necessarily piecemeal and imperfect.
The human mind is programmed to look for patterns and to seek causes, and this approach is often valuable. But that programming leads us to see patterns in random events and to attribute intentions where none existed. We believe we observe directness in obliquity.
Because of this, direct solutions almost never work as intended and usually have unforeseen consequences or adverse effects, like the increase in congestion when more roads are opened.
A good example of this is the so-called cobra effect, based on an anecdote about a bounty program in British colonial India: the government tried to fix the problem of venomous cobras by offering a bounty for every dead cobra.
This worked initially, but then people started breeding cobras for income. When the government found out, they scrapped the program, causing the breeders to release their now-worthless cobras and making the problem worse.
It is because of this that I believe the first step in tackling any problem is to get a sense of the type of environment we're dealing with.
If the environment or domain is simple, the solution should be self-evident. We simply sense what's happening, categorize it, prioritize, and solve the problem.
If the domain or system is complicated, like a car's engine or a software system, we hire experts to analyze the issue, get a sense of what the problem is, and solve it.
If we're dealing with a complex domain or environment, we cannot solve the problem by analysis alone. We have to adopt a more experimental, discovery-based approach. We have to try things and see how they work; we have to probe, sense, and then respond accordingly.
You assess the situation quickly, form a hypothesis, design and carry out a small-scale, safe-to-fail experiment, and analyze the results. Then you reassess your hypothesis and figure out whether you've solved the problem. Other times you can leverage what's already there: the things you sense and see that are already working.