When I started thinking about writing this series of essays, I had the intuition that the corona event was very similar to things I had seen in my professional life working in IT. My point of departure was to investigate how we tested for coronavirus, and this led me to the PCR test. This was of semi-professional interest to me because I am a tester in my working life and therefore have an interest in test techniques. It was my disbelief that we were relying on the PCR test exclusively to determine ‘infections’ that led me down the rabbit hole that became these posts. I have touched on the high modernist ideology in other posts in this series, but in this post I want to focus on a lower-level element of that ideology which exists in the medical industry and which I have seen in my professional life: automation.
The automation ideology was a big thing back when I was starting out as a software tester. So big, in fact, that people were predicting that the job of software tester would disappear. This was obviously of great concern to me at the time because I had just started what looked like becoming a career in that field, and so the idea that my job was going to be made redundant was a bit of a worry. So I followed the debate closely. The idea was that testing was going to be automated and, as software developers would be the ones creating the automation, the job of tester would disappear. I was part of a sub-culture in the testing community that was highly critical of this idea because we defined testing as a skilled activity that could not be automated away. That didn’t mean automation wasn’t useful, just that it was a tool to be used by a skilled tester to carry out their work. As it turned out, the job of software tester hasn’t disappeared, although there are now a great many jobs for the role of ‘automation tester’.
The upshot of all this is that I have seen the automation ideology first hand in my professional life. I know how that story gets told and I know the differences between that story and reality. Automation comes at a cost. It requires specialised knowledge and skill in writing code. Once built, automation must be maintained. Very rarely have I been involved in an honest discussion about the costs and benefits of automation. I have tried to start such a discussion a number of times, usually during a job interview. On one occasion I was able to convince an employer to hire me instead of an automation engineer, and that was a small victory on my part. Mostly, however, such discussions don’t go well. I remember one job interview where I asked why they wanted an automation engineer and the people doing the hiring looked at each other with blank faces. It was clear they had never been asked the question before. They spluttered out something about automation being ‘best practice’ before changing the topic. This is the ideology of automation. Within the politics of an organisation, you get rewarded for implementing automation not because it’s a good idea but because it’s ideologically correct.
I could probably fill an entire book with the problems with automation. Let me give the reader just one example to give a flavour for the kinds of issues that exist. Let’s use Facebook because everybody knows it and there was a story about them a while ago that I found amusing. They had replaced their testers with automation. This meant that nobody was actually watching the site diagnostics in real time. Instead, they had a suite of automated tests that would alert somebody when there was a problem. The engineers at Facebook released a new version of the software and some of the performance metrics dropped precipitously. Specifically, the page load time went to almost zero. That sounds like a good thing. If a webpage normally takes 0.5 seconds to load and now it takes 0.01 seconds to load, that’s good. In theory, we just improved the performance of our site by more than an order of magnitude. But nobody had set up an automated test to check for a scenario where the page load time got quicker. They only had tests for when the page took too long to load. An engineer looking at those metrics in real time, however, would know that something was wrong. Page load time doesn’t just drop by that much for no reason. Any reasonably smart engineer would at least investigate why it had happened. That didn’t happen at Facebook because they had replaced those engineers with automation. So their site went down and, of course, when Facebook goes down it makes the news. (The page load time had gone to almost zero because a bug in the newly released software meant the content on the page wasn’t loading at all.)
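The gap in that kind of alerting is easy to show in miniature. The following sketch is purely illustrative, my own hypothetical checks rather than anything Facebook actually ran: a one-sided check that only fires when pages load too slowly, alongside a two-sided check that also treats a suspiciously fast load time as an anomaly.

```python
def one_sided_alert(load_time_ms, max_ms=2000):
    """Fires only when pages load too slowly -- the kind of check in the
    story above, which stays silent when load time collapses to near zero."""
    return load_time_ms > max_ms

def two_sided_alert(load_time_ms, min_ms=100, max_ms=2000):
    """Fires when load time falls outside a plausible band in either
    direction; a near-zero load time is treated as suspicious too."""
    return load_time_ms < min_ms or load_time_ms > max_ms

# A page that "loads" in 10 ms because its content never arrived:
print(one_sided_alert(10))   # False -- the failure goes unnoticed
print(two_sided_alert(10))   # True  -- the anomaly is caught
```

The thresholds are made-up numbers; the point is only that an alert suite encodes exactly the failure modes somebody thought to write down, where a human watching the graph would notice anything out of the ordinary.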
Of course, if Facebook goes down it’s not really a big deal. Nobody is going to die or get seriously injured, and one could actually make a strong argument that it would be good for Facebook to go down more often so that people switch their phones off for a while and go outside and get some fresh air. But imagine the same automation ideology applied to software running mission-critical systems where lives were at stake. In that case, a missed automated check would be very costly indeed and a company which implemented such a practice would probably be sued for negligence. In that scenario, the choice to use automation is an engineering decision that can and should be justified on solid principles, not just a fad among middle managers.
The practical effect of the automation ideology is to remove a person from the equation. In this way, it ties directly to the heroic materialism of the industrial revolution that I outlined in post 11 in this series. The industrial revolution slowly got rid of human labour from manufacturing. In the last few decades, there has begun the attempt to get rid of human labour from other industries. The IT industry is just one. It turns out, the same thing has been happening in the medical industry and this is where the PCR test and the corona event come into the picture.
Normally, if you are sick, you go to a doctor who makes a diagnosis based on your symptoms. That is a skilled activity where the training and experience of the doctor play a crucial role. Of course, doctors don’t always get it right. I have a couple of stories from personal experience and also know of several more dramatic ones from friends and family where a doctor got it wrong. We should hold the doctor to account in cases of actual negligence. But nevertheless, it’s a fact of life in complex domains that people will make mistakes. The human body and human health form an extremely complex system, and we neither can nor should expect doctors to get it right every time. There are too many variables at play. Most of the tests used in medicine are an aid to the doctor. They are like the doctor’s toolkit and, just like any tool, they must be used wisely and their results interpreted. They don’t guarantee a perfect diagnosis every time, but they hopefully enable more accurate diagnoses overall. The idea that the doctor can be replaced by the tool is the automation ideology applied to medicine and, in fact, that is exactly what we have seen during the corona event. Doctors were replaced by the PCR test. To become a confirmed covid ‘case’ did not require a doctor to diagnose symptoms, it just required a test to give a result. In this way, the corona event is a prime example of the automation ideology at work.
In previous posts, I documented the timeline of the PCR test taking over in this way, a process which took place over the last 10-15 years. However, it is not just the PCR test where the automation ideology has been at play in the medical field. There are a number of other medical tests which have been applied in a similar way. One example is the use of regular mammograms, a practice that has been recommended for women of certain age groups for quite a long time now. Note that the idea of a regular screening fits the pattern of automation ideology: you don’t go to see a doctor and have clinical symptoms diagnosed, you just go and get the test done. Thus, there is an implicit faith in the test to give accurate results. However, mammograms, like all biological tests, have a false positive problem and, because the illness in question is cancer, the ramifications of false positives are extremely high. Cancer treatments are highly dangerous and damaging to the body. If a false positive leads you to get treatment when you didn’t need it, that’s a big problem. It means you went through a painful treatment process unnecessarily. In this case, the excessive faith in automation has very real consequences for individuals.
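The arithmetic behind the false positive problem is worth making explicit. When the condition being screened for is rare, even a fairly accurate test produces mostly false positives, a consequence of Bayes’ theorem. The numbers below are purely illustrative, not actual mammography statistics:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive result reflects real disease (Bayes' theorem)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical figures: a test that is right 90% of the time in both
# directions, screening a population where 0.5% actually have the disease.
ppv = positive_predictive_value(0.9, 0.9, 0.005)
print(round(ppv, 3))  # → 0.043: over 95% of positive results are false
```

This is exactly the kind of weighing up that a skilled clinician does implicitly and that a standalone test, treated as final, does not do at all.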
Only in recent years have the costs of the mammogram screening program come to light, and there are now a number of experts who claim that such screening programs do more harm than good. Here is one article that outlines this position. The following quote is pertinent:-
“There are significant harms associated with mammography screening and no reliable evidence of benefit. It is time to discontinue routine mammograms for all healthy women of a particular age. Resources should be shifted toward surveillance of women at higher risk for breast cancer, diagnostic workup for women with a change in their breast that does not go away and for ensuring that women receive timely treatment for a confirmed invasive breast cancer.
Population-based mammography screening has opportunity costs for the health care system, not to mention the social, financial, interpersonal and emotional costs to women and their families.”
Substitute the phrase ‘mammography screening’ with ‘PCR testing’ and change the breast cancer references to flu-like symptoms and the above paragraphs serve equally well as a critique of the approach taken during the corona event. At base, the problem is the automation ideology that drives the whole approach.
There are, of course, good tests and bad tests; good examples of automation and bad ones. In my experience, automation can be very useful as a tool in the hands of a professional who understands in detail what the automation can and cannot do. But every single time I have seen automation implemented as a standalone artifact, the costs have far outweighed the benefits, even in domains where nothing really important is at stake. And that is exactly what has happened with the use of the PCR test. Although humans were involved in processing the test, the process itself is a fixed series of steps which requires no interpretation from humans. Somebody takes your swab, somebody transports it to the lab, somebody at the lab carries out the procedure and then you get the result. That result is final. Doesn’t matter if you have symptoms. Doesn’t matter why you were tested in the first place. There is no weighing up of the probability that the result was a false positive, and no consideration of the ramifications a potential false positive has on your life.
There are, of course, a number of other problems with the PCR test and I was very interested to see news this week of a potential case aimed at proving in court that the PCR test is not fit for purpose. Here is a video where the lawyer in question, Reiner Fuellmich, gives a great summary of the issues – https://www.youtube.com/watch?v=2UQLqWJJ8AY&ab_channel=RubberRing
In my experience, wherever the automation ideology appears, it is almost always accompanied by a mixture of hubris and naivete. The people who promulgate it are usually educated people who have little on-the-ground experience in the real world. When such people take up managerial positions in bureaucracies and corporations, there is a natural disincentive for the negative consequences of automation to be reported back to them and, of course, it is highly unlikely that such people will seek out that feedback themselves. I would be very surprised if the PCR test approach taken by governments can hold up in court. On the other hand, the financial cost of what has happened is so enormous that there will no doubt be every conceivable political pressure placed on such cases not to find the state liable for that damage. Nevertheless, I can’t help but hope that this and similar cases do find governments liable, if for no other reason than to see the automation ideology suffer some real-world consequences for once.
All posts in this series:-