“Never attribute to malice that which is adequately explained by stupidity”
Hanlon’s Razor
Those looking for an explanation for the absurdly dysfunctional state of modern western society usually turn to one of two explanations. Either the problems are caused by individual leaders and their failings, e.g. a senile Biden, a narcissistic Trudeau or a clownish Boris Johnson, or they are a deliberate attempt on the part of some group of masterminds to bring about a new world order; in other words, a conspiracy theory.
Part of the reason why these explanations are preferred is that they allow the possibility of redress. You can vote out an idiot leader and get a better one. And a conspiracy can be unearthed and its perpetrators brought to justice.
The large-scale conspiracy theory is also quite flattering to the human ego. It implies a team of evil super geniuses who are so intelligent that they are able to pull the strings of all nation states simultaneously, turning the leaders of countries into puppets while hiding their nefarious intent from the general public. It’s pleasing to think that we are capable of that level of intelligence and competence. It’s also a comforting thought because the bad guys can be brought to justice. All we have to do is uncover their dastardly plot and bring them before the courts. The German lawyer, Reiner Füllmich, has been playing on this idea right from the start of corona with promises of convictions against those who pushed the “public health” measures. Unless I missed the news, none of his attempts have succeeded.
In this post we’ll sketch out an alternative explanation which is that the system itself is the problem. When the system is the problem it becomes really hard, maybe even impossible, for individuals to make a difference. I have seen such a dynamic with my own eyes in the form of dysfunctional organisations where new management was brought in to fix things. These were intelligent people who knew what was wrong and had a plan to make it right. But organisations are systems and systems have their own dynamic that is independent of any of the individuals involved. A system also has an external context that affects it. An organisation in a dying industry cannot be saved no matter how smart the people who are trying to save it. For these reasons and others, the system-based explanation is less gratifying and therefore less popular. But that doesn’t make it less truthful.
We’ll use the example of modern science to explore this concept because the question of whether science is corrupt or incompetent has become quite urgent in the last two years as we have watched the corona debacle unfold.
There are at least two underlying assumptions in our general culture when it comes to science. The first is the idea that anybody can do science as long as they have access to education. We can represent this graphically as follows:
That is, everybody has the same amount of innate ability to do science and the only thing preventing them from realising that ability is a proper education.
The second assumption is that all science problems are equally solvable which we can represent as follows:
Another way of saying this is that all science problems are equally complex. If you assign equal resources (time, money, people) you will get equal results.
Let’s look at a different model for both of these using the “Zipf curve”, which has been shown to hold across numerous domains. Note that the Zipf curve matches the Marginal Benefit curve used in economics to capture the concept of diminishing returns.
What this graph aims to capture is the idea that the more people you train in “science”, the lower the quality of scientist you get. This can be for reasons of nature or nurture (or a combination of both). The innate talents required to become a high quality scientist are not shared equally in the population and we would expect something like a Zipf curve to represent the distribution of those talents, in the same way that not everybody has the collection of talents required to become a professional athlete.
When it comes to intellectual matters, you might argue that we could make up the difference in talent through education. But even if that were true, it would be more costly to educate the less talented people as they will need more time to develop their skills and knowledge. Once again you would run into a Zipf curve where the Marginal Benefit from education falls because the Marginal Cost rises. If we further assume that the talent pool of science educators is also a Zipf curve, then the quality of education would fall the more people get educated because there aren’t enough good teachers to teach them. Either way, you still end up with a curve of diminishing returns.
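To make the diminishing-returns point concrete, here is a minimal sketch. It assumes the classic Zipf form, with quality falling as 1/rank; the exponent is an illustrative assumption, not a measured fact:

```python
# Minimal sketch of the diminishing-returns argument. Assumes the
# classic Zipf form: the k-th most talented scientist has quality 1/k.
# (The exponent is an illustrative assumption, not a measured fact.)

def zipf_quality(rank: int) -> float:
    """Quality of the scientist at the given talent rank (1 = best)."""
    return 1.0 / rank

def total_quality(n_scientists: int) -> float:
    """Total quality gained by training the n best-ranked people."""
    return sum(zipf_quality(k) for k in range(1, n_scientists + 1))

for n in (10, 100, 1000, 10000):
    print(f"{n:>6} scientists -> total quality {total_quality(n):.1f}")
# Totals: 2.9, 5.2, 7.5, 9.8. A thousandfold increase in trained
# scientists buys roughly a threefold increase in total quality.
```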
But what happens if the domain of “science problems” is also a Zipf curve? This would look as follows.
What this curve describes is the “low hanging fruit” dynamic. The problem domain of science is not equally distributed. There is a set of problems which are simpler and therefore more easily solved, while the majority of problems are more complex.
If we combine these two concepts we get a story about the evolution of science. In a time of resource constraint such as in the 1800s, only the most talented people become scientists (assuming a relatively merit-based system of resource allocation). Those scientists will be working on the relatively simple problems in the field and therefore they produce the most valuable results. Everybody gets excited by the results and as wealth accumulates we throw more resources into the field expecting even more impressive results.
The extra resources would produce more results if the curves for ability and simplicity were flat. But they are not. They are Zipf curves. What happens, therefore, is that less capable scientists are put to work on more complex problems i.e. the intersection of the two diminishing Marginal Benefit curves. We spend more money to get fewer, less valuable results. But even though the Marginal Benefit falls, the overall cost-benefit equation might still be positive.
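A sketch of that intersection, again under illustrative Zipf assumptions for both curves:

```python
# Sketch of the intersection of the two curves: the k-th expansion step
# pairs the k-th best available scientist (quality 1/k) with the k-th
# simplest unsolved problem (tractability 1/k). Both exponents are
# illustrative assumptions.

def marginal_result(step: int) -> float:
    scientist_quality = 1.0 / step       # Zipf curve of talent
    problem_tractability = 1.0 / step    # Zipf curve of problem simplicity
    return scientist_quality * problem_tractability  # falls as 1/step^2

print([round(marginal_result(k), 4) for k in (1, 2, 5, 10, 100)])
# [1.0, 0.25, 0.04, 0.01, 0.0001]: the marginal result collapses far
# faster than either curve alone, which is the double penalty described
# above.
```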
The problem of diminishing returns is exacerbated by a third consideration. More resources means more people are working on “science” and adding more people reduces the quality of communication. The following diagram is often used to summarise this problem.
Communication becomes more difficult as the number of people involved increases even if the quality of the information remains static. But if less talented people are working on more complex problems, we would expect the information quality to degrade leading to a situation where there is more communication of lower quality information. In short, the signal-to-noise ratio goes to hell.
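The diagram in question usually shows every pair of people as a communication link, so links grow quadratically. A minimal sketch of that arithmetic:

```python
# Pairwise communication channels in a group of n people: n * (n - 1) / 2.

def channels(n_people: int) -> int:
    return n_people * (n_people - 1) // 2

for n in (2, 5, 10, 50, 100):
    print(f"{n:>3} people -> {channels(n):>4} channels")
# 2 -> 1, 5 -> 10, 10 -> 45, 50 -> 1225, 100 -> 4950. The hundredth
# person alone adds 99 new channels, each one a chance for noise.
```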
You can see this dynamic even in small groups. Take a musical band, for example. If there are five people in the band, it only takes one person to be “out” for the whole band to be “out”. Similarly, on a small engineering team, if one person doesn’t understand, this affects the overall communication flow because erroneous messaging is introduced and more time needs to be spent correcting the errors. If the person is a line level worker, it’s usually possible to work around them and try to exclude them from communication. But I have been on teams where the person who didn’t understand was the senior manager on the team. It’s a lot harder to move a senior manager out of the way so that things can get done.
A key thing to bear in mind is that a low signal-to-noise ratio won’t appear to be obviously “wrong”. This is true both for the people on the inside doing the work and for external observers. Noisy communication is worse than the case where communication is “wrong”. Wrong communication is almost as useful as right communication. If you know somebody is always wrong, you just invert whatever they say and now you have the truth. You can’t do that with noisy communication. Noisy communication is ambiguous, unclear and confusing. Again, the musical group example is useful here. A somewhat incompetent band doesn’t sound “wrong” but rather “blah” or “meh”. You shrug your shoulders and say something like “it’s not bad but it’s not good either”. This is in contrast to a band like Nickelback, who are technically proficient musicians that happen to make bad music.
When the signal-to-noise ratio is low, it becomes far more difficult to show that something is wrong because there are no clear and obvious errors. There is no smoking gun that will set the record straight and restore order. Rather, there is an accumulation of numerous small errors which are much harder and more time consuming to identify and correct. In a small group such as a band, it’s possible to find the weak link (usually the drummer) and get rid of them. In larger groups it becomes far more difficult and in really large organisations like corporations and government departments it’s as good as impossible.
It’s important to understand that this dynamic of noise accumulation occurs before politics, commercial money and the enormous egos of billionaires and celebrities get involved to make things even more confusing. Corona provides a useful case study. There was never any reason to believe that the mRNA vaccines would work to end a pandemic. The science had not proven the matter one way or another. To put it in terms we have been using, the science had a low signal-to-noise ratio. This meant it was possible to believe that the vaccines “might” work. After all, anything “might” happen. Once upon a time, science was about “laws” and was founded upon hard-nosed cause and effect relationships that had been empirically proven. That’s the kind of science you see at the “simple” end of the Zipf curve. But as complexity increases, the clarity of understanding diminishes and you no longer have “laws” but “guidelines”.
Once the vaccine question became political, the political imperatives took over and politicians had to gloss over the inherent ambiguity in the science. Thus, we were assured the vaccines were “safe and effective”. Meanwhile, corporations which exist to maximise shareholder value were happy to sell a product when governments indemnified them against legal liability.
It’s not a coincidence that the corona event took place in the domain of viral disease as this is arguably one of the more complex scientific domains. I would place it somewhere about here on the graph. In other words, highly complex.
Note that viral disease as an object of study also has a built-in communication problem because it runs across three separate scientific disciplines: virology, epidemiology and medicine, and that’s before you consider the mathematical epidemiologists, the immunologists and other sub-sub-disciplines. Viral disease is firmly in the category of study that the systems thinkers of the 20th century posited was not amenable to reductionist science, which means it cannot be simplified to the point where calculation can be done. The best we can do is assemble cross-disciplinary teams to undertake research aimed at obtaining general principles of action. Those general principles were exactly what constituted the public health guidelines that were the accepted wisdom on how to deal with a pandemic prior to March 2020.
The post-war period has seen huge amounts of resources pumped into science and yet we have ended up with the “reproducibility crisis”. The reproducibility crisis is just another name for the noise generated by the intersection of multiple curves of diminishing returns. No amount of extra education and training and money will solve the problem. The result is not error but noise, and when the noise gets raised to a high enough degree you have a situation where anybody can read into it whatever they like. At that point, science becomes a giant Rorschach Test.
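A toy simulation of that dynamic, with all the numbers (effect size, noise level, significance cutoff) chosen purely for illustration:

```python
import random

# Toy model of the reproducibility crisis: many studies of effects that
# are tiny relative to the measurement noise. The effect size, noise
# level and the 2.0 significance cutoff are all illustrative assumptions.
random.seed(42)

def run_study(true_effect: float, noise: float = 1.0) -> float:
    """One study: the true effect plus Gaussian measurement noise."""
    return true_effect + random.gauss(0.0, noise)

TRUE_EFFECT = 0.05  # a real but tiny effect, dwarfed by the noise

findings = 0
replicated = 0
for _ in range(10_000):
    if run_study(TRUE_EFFECT) > 2.0:        # original "significant" result
        findings += 1
        if run_study(TRUE_EFFECT) > 2.0:    # independent replication attempt
            replicated += 1

print(f"{findings} findings, {replicated} replicated")
# Most "findings" here are noise that happened to clear the cutoff, so
# only a few percent of them survive an independent replication attempt.
```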
The problem of a low signal-to-noise ratio is not limited to science. Most things in the modern world suffer from it. Everything is “blah” and “meh”. It’s the paradox of success. We have huge resources to apply to problems and we invest those resources into new ventures. It works for a little while but the law of diminishing returns means that everything quickly turns to mud and the quality of everything falls sharply. This is true in the consumer economy, in the political sphere, in the media, in the arts and in science and technology. Rather than accept this as a fact of life, we pump more resources in until the returns turn negative and that leads to inflation and the debasement not just of the currency but of political, social and cultural capital. We’re pretty far into that dynamic right now and it’ll probably get worse before it gets better.
It’s partly for this reason that societies and cultures seem to peak when strict resource limits are in place. Without limits, the signal-to-noise ratio falls and everything becomes saturated and over-exposed. The noise floor steadily rises until only those who can shout the loudest get heard. To quote the New Zealand Prime Minister during corona, “we (the government) will be your single source of truth.” These are the words that usher in the age of Caesarism.
I saw a study somewhere purporting to show that scientific discoveries ‘peaked’ in the 1960s. It sounds perfectly plausible, though I can’t find the paper anymore. That, after all, is when the West went to the moon, a dream that had been conceived right at the birth of modern science in the 17th century. But this reminds me of a broader theme discussed somewhere by J. M. Greer: he talked about how epochs or cultures find certain ‘approaches’ to or ‘ways’ of knowing that they dedicate themselves to, which naturally then have a period of initial promise, mature elaboration, and senile decadence. We can see this process playing out with the various branches of ancient philosophy, with medieval theology, and with modern science. No doubt there are a few more nice discoveries to be made in the modern paradigm, but naturally it makes one wonder what the next era will devote itself to…
Very interesting.
I am a conspiracy believer, and in my opinion the conspiratorial actors are inserting noise into the system as part of their tactics.
A lot of things can happen naturally, but it seems to me there was an intentional search for those things that damage society, which were then used deliberately to ruin the system.
The sad thing is that the signal-to-noise problem also exists on the right side, that is, the side that opposes the new world order, though not necessarily for the reasons that you explain.
Interestingly, it did not occur in regard to corona. In this case, those who opposed the system, even when coming at it with different approaches, did not hinder each other. Actually, they intensified each other.
Austin – that sounds about right. I think one of the great hopes was that computers would enhance our calculation abilities and open up new horizons. The problem is somebody has to program the computer, and models of complex domains are notoriously flaky and psychologically dangerous because they make us think we know more than we do while drawing our focus away from the real world. It’s amazing how many scientific papers you read nowadays that are based on nothing more than computer modelling.
Nati – I think conspirators have re-aligned themselves to the new conditions. It used to be the case that you hid a conspiracy and as soon as you were found out the game was over. In the modern world there is so much information (noise) flowing around, you don’t even need to hide anymore. You can run your conspiracy in plain sight.
I see a significant degree of malicious agency in the last two years. Keep in mind, that conspiracies don’t have to be successful to be conspiracies.
But then there’s Grey’s law: “Any sufficiently advanced incompetence is indistinguishable from malice”. Maybe that’s it.
About science: could another factor be that in many disciplines we have simply reached the limit of our cognitive capabilities? Not because things get too complex. Complexity can be broken down into manageable bits. Our monkey brains simply are not designed to understand the concepts. Maybe we have come as far as we can go?
Same as we have with most art forms, social & political arrangements and a few other things.
A Spenglerian view if you want.
Roland – if by conspiracy we mean “agreement to pursue an end” then the Great Reset and WEF are paradigm cases of conspiracy. Then you have multinational corporations pursuing their own interests irrespective of morality, which is what they have always done. Then you have an enormous amount of noise which gets incorrectly attributed to the workings of evil geniuses. All of these are true at the same time. I think the last one doesn’t get enough attention so was trying to explain that dynamic more clearly.
Have you ever read Weinberg’s “Introduction to General Systems Thinking”? He goes into detail about the problems of complexity. His argument is, yes we can break complexity down, but we cannot do computation on it because any modelling has an unknown error rate and small errors render the model useless. What we can do is use heuristics which I think is a form of intuition. But for modern rationalism, intuition is indistinguishable from superstition. So, we keep making models which keep being wrong and we call that “science”.
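A minimal sketch of that error blow-up, using the logistic map in its chaotic regime as a stand-in for any nonlinear model:

```python
# Minimal sketch of how a tiny model error blows up: the logistic map
# x -> r*x*(1-x) in its chaotic regime (r = 4.0), started from two
# states that differ by one part in ten billion.

def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-10
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 15 == 0:
        print(f"step {step}: difference = {abs(a - b):.6f}")
# The gap grows roughly exponentially; within a few dozen iterations the
# two runs bear no relation to each other. The unknown error rate has
# consumed the model.
```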
I did read Weinberg, but that was ages ago. Probably should reread it. That’s basically what was known as chaos theory. Small initial differences add up to large differences in outcome.
But the point I was trying to make is different. The problem goes beyond complexity and chaos.
If you take, for instance, the double slit experiment, there is no complex setup involved. All the components and procedures are simple and easily understandable.
However, the outcome cannot be meaningfully interpreted in concepts we can think in.
This is not for lack of trying. Quantum mechanics has been around for a hundred years and, in terms of power of prediction and applicability, is one of the most successful theories science ever came up with. It just makes no sense to our monkey brain. Feynman’s quip still applies.
This applies to a number of scientific disciplines these days.
I agree about heuristics.
Engineering is basically heuristics with a bit of maths chucked in to connect the bits.
And it works. Bridges don’t collapse, planes actually get off the ground, even computers sometimes do what they are supposed to.
Roland – that’s a good point. The good thing about such paradoxes is that they don’t generate noise in the way that models do. Perhaps the two phenomena are related. Given an inability to resolve paradoxes and make progress, scientists fall back into complexity where you can at least produce the appearance of “progress”. It’s easy to fool yourself with models and the inherent uncertainty in complex domains is an attractor for charlatans and conspirators.
When I studied Physics in University there was a running joke that we know how to solve only two problems: a spring, and the energy levels of a hydrogen atom. But as it turns out, the hydrogen atom problem is solved by reducing it to a spring.
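For reference, the two textbook results behind the joke (standard quantum mechanics, nothing specific to this thread):

```latex
% Harmonic oscillator ("the spring"): evenly spaced energy levels.
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \dots
% Hydrogen atom (bound states): levels falling off as 1/n^2.
E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \dots
```

The reduction in the joke is real: a change of variables turns the radial hydrogen equation into a harmonic oscillator equation.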
> What we can do is use heuristics which I think is a form of intuition.
I would label it as heuristics/intuition being a form of wisdom, versus models which are applied knowledge. It is the elder/expert dichotomy again.
I think Weinberg implicitly understood this, but struggled somewhat to teach it as he tried to come up with a rational basis for the better way to ‘see’ that was obvious to him (contrasted against ‘solving’ the world analytically under our dominant paradigm). But the issue is that wisdom/elder/seeing is not rational or even really conscious, so Weinberg’s teaching is constrained within a framework that is inappropriate for what it wants to teach. It is a bit of a chicken and egg problem in that I found you need an understanding of systems thinking to understand Weinberg’s teachings on systems thinking. Such paradoxes seem to be an inherent part of any wisdom teaching I’ve ever come across.
Bakbook – funny. To a man with a physics degree, every problem is a spring.
Daniel – I’ve noticed that with both Weinberg and Bateson’s writing (both of whom were concerned with systems) they struggle to hold a consistent theme together. So, it’s part critique of reductionist science, part explanation of how to think about systems and part re-discovery of wisdom. I think of intuition as the faculty which “does systems thinking”. That is, it “processes” multiple variables that can’t be computed. The result is just a “feeling” about what is right.
“Never attribute to malice that which is adequately explained in Staplerfahrer Klaus”
Michael – hah. Thanks for that. Hadn’t seen that one before.