There’s a pattern I’ve observed a few times through scientific and computing history. I think of it as “complexity collapse”. It’s probably related to Kuhn’s paradigm shift.

The pattern starts with an approach that worked in the past. Gaps in the approach lead to accretions and additions that restore it to functionality, but at the cost of added complexity.

That added complexity at first appears preferable to rebuilding the approach from the ground up. Eventually, however, the tower of complexity becomes impossible to extend further. At this point, the field is ripe for a complexity collapse and replacement with a fundamentally different approach.

In the realm of science, this complexity collapse has led to the most famous reformulations in history:

- Ptolemaic astronomy piled epicycle upon epicycle to keep matching observations, until the Copernican and Keplerian models swept the whole apparatus away.
- The old quantum theory was a patchwork of ad hoc quantization rules bolted onto classical physics, until quantum mechanics replaced it wholesale.
- In software, early-2000s Java frameworks accumulated sprawling XML-based dependency injection configuration, until convention-over-configuration frameworks collapsed it.

Admittedly, the examples from astronomy and quantum physics are more fundamental to our understanding of the universe than XML-based dependency injection. But all of these examples illustrate the same dynamic: complexity accumulates, a new theory replaces the old one, and the complexity collapses.
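To make the software example concrete, here is a sketch of the kind of Spring-style XML wiring that accumulated in that era. The class and bean names are hypothetical, and the file is trimmed of namespace boilerplate; the point is that every dependency had to be spelled out by hand:

```xml
<!-- Hypothetical Spring-style wiring: each object and each of its
     dependencies declared explicitly, one bean at a time. -->
<beans>
  <bean id="userRepository" class="com.example.JdbcUserRepository">
    <property name="dataSource" ref="dataSource"/>
  </bean>
  <bean id="userService" class="com.example.UserService">
    <property name="userRepository" ref="userRepository"/>
  </bean>
</beans>
```

In a convention-over-configuration framework, roughly the same wiring is inferred from naming conventions, and the configuration file disappears entirely.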

All those examples include a common coda as well: complexity grows again!

Today, the contradictions between quantum mechanics and general relativity lead many physicists to search for a new model: not adjustments, but a new paradigm to sweep away and unify the towers of complexity in both fields.

In the realm of data-backed applications with web-ish interfaces, complexity collapse led many to embrace Ruby on Rails or Node.js. Both ecosystems have had mini-collapses, but no complete replacement yet.